A new Job, ID: 139823 <http://www.ejobsville.com/display-job/139823> was
added at eJobsVille.com - For the Best Tech Jobs in Town
<http://www.ejobsville.com>

Title: Hadoop Engineer (posted on 2014-03-31 11:00:42)

Job Description:
Position: Hadoop Engineer
Location: Renton, WA
Duration: 6 months plus/FTE
Visa Status: GC/USC

Role:
The Hadoop Software Engineer applies analysis and programming skills to
transform complex data sources within the Hadoop environment into relevant
data sets used for advanced analytics.
The Hadoop Software Engineer is fluent in Hadoop applications as well as
Perl, Ruby, Java MapReduce, Mahout, and HiveQL.
The Hadoop Software Engineer works with Data Scientists using advanced
analytics frameworks and models applied to structured and unstructured data,
using a variety of software applications and analytic methods such as
decision trees, univariate and bivariate analysis, clustering, association
rule learning, collaborative filtering, regression analysis, optimization,
and randomized algorithms.
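Of the analytic methods listed above, regression analysis is the simplest to sketch concretely. The following is a minimal, illustrative ordinary least squares fit in plain Python; it is not part of the posting, and the function name `fit_line` and the sample data are hypothetical:

```python
# Minimal ordinary least squares fit for simple linear regression,
# y = slope * x + intercept, using only built-in Python.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Points lying exactly on y = 2x recover slope 2, intercept 0.
slope, intercept = fit_line([1, 2, 3, 4], [2, 4, 6, 8])
```

In practice this kind of model would be built with R, SPSS, or Mahout as the posting suggests; the sketch only shows the underlying arithmetic.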
The Hadoop Software Engineer has applied knowledge of the information
management life cycle, data acquisition, data management, and information
visualization.
The Hadoop Software Engineer has the interpersonal communication and
presentation skills necessary to work across lines of business with
executives, product managers, operations, marketing research, and analytics
teams to understand the problem to solve and develop insights to run and
grow the business.
The Hadoop Software Engineer combines advanced programming skills and the
design of data sets, including facts and dimensions (star and snowflake
schemas), with a strong working knowledge of advanced analytics and machine
learning.
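To illustrate the fact/dimension ("star schema") modeling mentioned above: a central fact table holds measures plus keys into small dimension tables. A minimal Python sketch follows; the table and column names are invented for illustration and are not part of the posting:

```python
# Illustrative star schema: one fact table referencing two dimension tables.
# All table and column names here are hypothetical.
dim_product = {1: {"name": "widget"}, 2: {"name": "gadget"}}
dim_date = {20140331: {"month": "2014-03"}}

fact_sales = [
    {"product_id": 1, "date_id": 20140331, "amount": 9.99},
    {"product_id": 2, "date_id": 20140331, "amount": 19.99},
]

def denormalize(facts, products, dates):
    """Join each fact row to its dimensions, as a star-schema query would."""
    return [
        {
            "product": products[f["product_id"]]["name"],
            "month": dates[f["date_id"]]["month"],
            "amount": f["amount"],
        }
        for f in facts
    ]

rows = denormalize(fact_sales, dim_product, dim_date)
```

In a Hadoop environment the same join would typically be expressed in HiveQL over warehouse tables rather than in-memory dictionaries.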
* Hands-on experience in a "Big Data" environment and with complex data
processing
* Knowledge of business issues, trends, tools, and techniques
* Creative in developing programs to acquire and process any data within the
Hadoop environment
* Hands-on experience with machine learning techniques
* Hands-on experience with Perl, Ruby, Java MapReduce, Java
* Proficient in Teradata on Unix/Linux as well as Windows platforms
* Works well in teams working with emerging technologies
* Expert or working knowledge of Hadoop (2.x)
* Working knowledge of SPSS, Mahout, and R tools and techniques

Responsibilities:
- Apply analysis and programming skills to transform complex data sources
within the Hadoop environment into relevant data sets used for advanced
analytics
- Design and build robust Hadoop solutions for Big Data problems
- Guide the full lifecycle of the Big Data solution, including requirements
analysis, technical architecture design, solution design, solution
development, testing & deployment



Skills (candidate self-rating: Basic Knowledge / Medium / Expert, plus years
of experience):
- Hadoop applications as well as Perl, Ruby, Java MapReduce, Mahout, and
HiveQL
- Data acquisition, data management, information visualization
- Big Data
- Teradata
- Hadoop (2.x)

*Click here to view full job description and apply
<http://www.ejobsville.com/display-job/139823>  (Registration not mandatory
to apply for this job)*

------------------------------
Best regards,
eJobsVille.com - For The Best Tech Jobs In Town

-- 
You received this message because you are subscribed to the Google Groups "SAP 
or Oracle Financials" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to sap-or-oracle-financials+unsubscr...@googlegroups.com.
To post to this group, send email to sap-or-oracle-financials@googlegroups.com.
Visit this group at http://groups.google.com/group/sap-or-oracle-financials.
For more options, visit https://groups.google.com/d/optout.
