Hi,


Wishes of the day,

Please find the requirement details below.



*Job Title*: Big Data Engineer

*Location*: Seattle/Bellevue, WA (local candidates)

*Duration*: 6 months





*Top Three Skills:*

1. Working with Big Data concepts and Big Data (Hadoop) development

2. ETL/Data Warehousing (Informatica)

3. Great problem solver/critical thinker



*Job Description:*

The client is seeking a software engineer with Data Warehouse and Big Data
experience to support Disney's Data Solutions division. The engineer will
work with a team to help build the Japan Operational Data Store, among other
projects for ESPN and Parks and Resorts.

This person will be responsible for 90% development and 10% architecture.
They will solve business problems using Informatica and design solutions
around the ETL process. This person needs to understand how to implement
rather than simply migrate. They will do both physical and logical data
modeling along with implementation of the design, and will also work with the
clients to understand strategy and initiatives.



The candidate must develop specifications covering data modeling, team
standards, data architecture, and extract/transform/load (ETL) processes. We
are seeking a highly motivated, experienced engineer with a background in
open-source, big data solutions who excels in teams, is an excellent
communicator, and has an insatiable desire to help the customer. The ideal
candidate will have 7-10 years of development experience using
SQL/Hive/Pig/MR, ETL, Java, Python, and PHP.



The role of this individual will be to design and develop solutions that
allow for the ingestion of data into a variety of different stores, and then
deliver transformed extracts to various business units to solve their data
needs. This role will have an active hand in optimizing these data feeds and
working with very large, petabyte-scale data sets. The individual will
utilize Hive, Pig, MapReduce jobs, query tools, and ETL tools to manage these
data solutions. Working with internal customers, they will help develop
successful proof-of-concepts, analytics, and experimentation with the
business units, providing them technical solutions to solve their particular
business needs.



*Experience Required:*

1. 10+ years of deep data warehousing experience

2. Experience with integration tools: Netezza (Teradata, Greenplum)

3. Scripting experience with Pig, Hive, and Python

4. Informatica data manipulation

5. Experience leading a team and mentoring/teaching others

6. Great communication skills and executive presence

*Preferred Experience:*

7. Experience with Big Data, including the Hadoop ecosystem and MapReduce

8. Good problem-solving skills

9. Java development experience



*Additional Information:*

This team is made up of Big Data, Data Warehousing, and Analytics engineers
and analysts who deploy large-scale solutions for ESPN, ABC, Disney
Interactive, Walt Disney Studios, Parks and Resorts, etc.

The team is made up of 15 people: 9 FTEs and 4 contractors.



*Daily Tasks:*

- Develop technical solutions, services, and interfaces for business units

- Design new HDFS data solutions

- Support pilots, POCs, and experiments with various business units

- Develop unit, integration, and acceptance testing

- Architect new web services and APIs

- Provide production support and triage support issues



*Qualifications:*

7-10 years of software development in dynamic environments

Informatica expert with training experience

Java required

SQL, Hive/Pig, MapReduce, and stored procedures required

Scripting required

HDFS/HBase experience required

Python desirable

Hadoop/Cassandra/Solr/Mongo experience desired

Cloudera Certified Professional desired



Experience with open-source platforms is highly desirable, including
technologies like Hadoop, Cassandra, Solr, Mongo, and other large-scale
platforms. Experience using analytical tools like Pig and Hive and developing
MapReduce jobs on Hadoop is highly preferred.


*Work Environment:* Casual and collaborative. The team values work-life
balance, but the work is deadline intensive. Must be able to work
independently and within a team. Must also be able to help lead the team and
work with VPs and Directors.


*Interview Information:* Phone screen with a team member, then a full
in-person loop with the team.



*Impact to the Internal/External Customer:* Deploying data for various
business groups.

*Business Challenge:* This team owns the data storage for all of Disney's
business groups, driving the data and trends for these groups that ultimately
lead to successful projects.



*Non-Technical Skills:* Good communication; ability to interact with VPs and
Directors.

*Project Stage/Lifecycle Info:* Ongoing

*Technical Environment:* Java, Hadoop, Netezza, Informatica, Hive, Pig,
Python



Please let me know if you have any questions.



Sincerely,

*Barry S*

415-424-4275

barryconqt...@gmail.com

www.conq-tech.com
