Hi Anil,

Hope you are doing well.
Please find the requirement below and revert ASAP with your updated resume.

----------------------------------------------------------------------------------
Full Name of Candidate:
Email Address:
Contact Details:
Current Location:
Relocation (Yes/No):
Travelling (Yes/No):
Visa Status:
Last 4 Digits of SSN:
Availability for the Role:
Availability for Calls:
Any Interviews Lined Up (Yes/No):
Expected Rate:
Expected Salary After Contract:
Skype ID:
____________________________________________________

*Job Title: Hadoop Solution Architect*
*Location: Chicago, IL*
*Duration: 12+ Months*

*Solution Architect with Hadoop*

Big Data Solution Architects will be responsible for guiding the full lifecycle of a Hadoop solution, including requirements analysis, platform selection, technical architecture design, application design and development, testing, and deployment.

- Provide technical and managerial leadership in a team that designs and develops path-breaking, large-scale cluster data processing systems
- Experience dealing with large data sets and distributed computing
- Experience working in data warehousing and Business Intelligence systems
- Experience working with various relational and MPP database platforms such as Netezza and Teradata
- Hands-on experience with the Hadoop stack (e.g. MapReduce, Sqoop, Pig, Hive, HBase, Flume)
- Deep experience working on large Linux clusters
- Hands-on experience with related/complementary open-source software platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef)
- Hands-on experience with ETL (Extract-Transform-Load) tools (e.g. Informatica, Talend, Pentaho)
- Hands-on experience with BI tools and reporting software (e.g. MicroStrategy, Cognos, Pentaho)
- Hands-on experience with R, MATLAB, or another statistical package
- Hands-on experience with "productionalizing" Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning)
- Previous experience with high-scale or distributed RDBMS (Teradata, Netezza, Greenplum, Aster Data, Vertica)
- Knowledge of cloud computing infrastructure (e.g. Amazon Web Services EC2, Elastic MapReduce) and considerations for scalable, distributed systems
- Knowledge of NoSQL platforms (e.g. key-value stores, graph databases, RDF triple stores)

*Basic Qualifications*
- Seven or more years of experience in the design and implementation of complex IT systems
- Five or more years of experience in a data-related role such as data scientist or data architect
- Significant experience with data analytics tools including Apache Hadoop, Hive, Pig, Spark, Storm, Mahout, and others
- Exceptional interpersonal and communication skills
- Demonstrated effectiveness working across multiple business units to achieve results
- An understanding of cloud computing deployment models as they relate to data and analytics

*Preferred Qualifications*
- An advanced degree in computer science, engineering, or a related discipline
- Experience in a customer-facing, sales-aligned role such as consultant, solutions engineer, or solutions architect
- Working knowledge of one or more computer languages such as Java, R, C++, Perl, Ruby, or Python
- Hands-on experience with Amazon Web Services

Best Regards,
Amar
amarosai...@gmail.com
Voice: 630-566-7324

--
You received this message because you are subscribed to the Google Groups "OracleD2K" group.