*1) Big Data Lead*

*Location: Sunnyvale, CA*

*Client: Equinix*

*Duration: 6+ months*

   1. Equinix Evaluation Matrix will need to be completed (see attached)
   2. Big Data Lead: $85 C2C pay range; MUST lead a project
   3. Local candidates are definitely a plus!



*Job Responsibilities:*

·         Lead innovation in the Big Data space by exploring, investigating,
recommending, benchmarking, and implementing data-centric technologies.

·         Technically lead the software engineering team in the design,
architecture, and implementation of solutions using Big Data technologies.

·         Establish scalable, efficient, automated processes for large-scale
data analysis and model development.

·         Work with different application engineering teams to understand
their requirements around data integration, processing, and consumption, and
implement solutions for those requirements on a central Big Data platform.

·         Report critical issues to management effectively, in a timely
manner, and with clarity.

·         Hands-on system design and development, as needed.

·         Participate in the governance process of data management strategy
/ roadmap / execution plan.



*Qualifications and Experience:*

·         Bachelor's degree in Computer Science/Engineering or equivalent
work experience, with 8-12 years of Java and J2EE-based software development
experience (must be hands-on).

·         Must have experience in architecting, building, and maintaining
large-scale systems with high availability and high scalability (e.g.,
e-commerce systems, retail systems, search, etc.).

·         2+ years of hands-on experience with NoSQL technologies such as
Cassandra, HBase, and/or the Hadoop ecosystem (MapReduce, Pig, Hive, Sqoop),
along with experience with real-time and stream-processing systems.

·         Knowledge of cloud computing infrastructure (e.g. Amazon Web
Services EC2, Elastic MapReduce) and considerations for scalable and
distributed systems.

·         The ideal candidate should have experience and knowledge in machine
learning algorithms, statistical modeling, predictive analysis, and data
mining.

·         Must have a deep understanding of different SDLC methodologies
and experience with all phases of SDLC.

·         Must have strong estimating and planning skills.

·         Previous experience with high-scale or distributed RDBMS platforms
(Teradata, Netezza, Greenplum, Aster Data, Vertica) is a plus.

·         Knowledge of Apache Solr, Lucene, Storm and IBM Streams is a
strong plus.

·         Experience with and knowledge of Apache Mahout, the R language, and
statistical modeling are highly desirable.

·         Contributions to open source projects like Apache Hadoop or
Cassandra are a strong plus.
-- 



Regards

*Mayank Sharma*

Technical Recruiter

Sage Group Consulting Inc.

Direct: 732-837-2134 :: Phone: 732-767-0010 Ext: 305
Email Id: mysha...@sagetl.com
Yahoo: mayank_sharma_recruiter :: Gtalk: mayank.mayank999

Website: www.sageci.com
