*Hi,*


*Please find the requirement below and share suitable resumes with
rah...@sysmind.com <rah...@sysmind.com>*



*Role : Hadoop Developer*

*Location : Malvern, PA*

*Duration : 12+ Months*

*Experience : 8+Years*

*Rate : $50/hr on C2C*



*Please share the profile with Name, Current Location, and Visa Status.*



Data:

·        RDBMS, OLAP, OLTP concepts

·        Modeling design considerations and optimization

Development:

·        File management on UNIX

·        Application performance monitoring and troubleshooting on UNIX

·        SH / KSH

·        Java

·        Python

·        SQL

·        ETL tool (DataStage, Informatica)

Big Data/Hadoop:

·        Hadoop Certified Developer (Cloudera preferred, Hortonworks, etc.)

·        Flume

o        agent design, development, and optimization

o        sourcing from flat files to HDFS files, HBase, and serialized file
formats

o        advanced concepts including durable channels, fault tolerance,
and interceptors

o        access metrics, e.g., via MBeans, for active monitoring
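
As an illustration of the Flume items above, a minimal agent configuration — a spooling-directory source feeding a durable file channel and an HDFS sink — might look like the sketch below. All names (agent1, the spool directory, the NameNode URL) are hypothetical placeholders, not values from this requirement:

```properties
# Hypothetical Flume agent: spooldir source -> durable file channel -> HDFS sink
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = sink1

# Source: pick up completed flat files dropped into a spool directory
agent1.sources.src1.type = spooldir
agent1.sources.src1.spoolDir = /var/data/incoming
agent1.sources.src1.channels = ch1

# File-backed channel survives agent restarts (fault tolerance)
agent1.channels.ch1.type = file
agent1.channels.ch1.checkpointDir = /var/flume/checkpoint
agent1.channels.ch1.dataDirs = /var/flume/data

# Sink: land events in date-partitioned HDFS directories
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.channel = ch1
agent1.sinks.sink1.hdfs.path = hdfs://namenode:8020/data/incoming/%Y-%m-%d
agent1.sinks.sink1.hdfs.fileType = DataStream
```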

·        HBase

o        conceptual and physical differences compared to RDBMS

o        design, development, and creation of HBase schemas

o        advanced techniques including blocksize configuration, in-memory
column families, and compression
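
The advanced HBase options listed above are set per column family at table creation; a hedged HBase-shell sketch (table and family names are hypothetical) combining blocksize, in-memory caching, and compression:

```ruby
# HBase shell (JRuby): small blocksize for random reads, in-memory family, Snappy compression
create 'events', { NAME => 'cf',
                   BLOCKSIZE => '16384',
                   IN_MEMORY => 'true',
                   COMPRESSION => 'SNAPPY' }
```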

·        HDFS

o        file storage concepts and optimization

·        Hive / Impala

o        design, development, and creation of metastore tables, views

o        optimization including indexing, partitioning, compression, and
serialization formats
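
For the Hive/Impala points, a minimal partitioned and compressed table definition could be sketched as below; the table, columns, and compression choice are illustrative assumptions, not part of this requirement:

```sql
-- Hypothetical metastore table: partitioned by date, stored as Snappy-compressed Parquet
CREATE TABLE sales (
  order_id BIGINT,
  amount   DECIMAL(10,2)
)
PARTITIONED BY (sale_date STRING)
STORED AS PARQUET
TBLPROPERTIES ('parquet.compression' = 'SNAPPY');
```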

·        MapReduce

o        application processing and parallelization concepts

o        streaming API for development of non-Java MR applications

o        advanced development techniques using ToolRunner, distributed
cache, logging

o        advanced development techniques using custom partitioners,
combiners, and formats

o        unit testing using MRUnit

o        common algorithms: sort, search, indexing, co-occurrence
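
The streaming-API bullet above can be sketched in Python: Hadoop Streaming pipes records over stdin/stdout, so a mapper and a reducer are just line filters. The word-count pair below is a minimal sketch — the function names and the local harness are our own, with a local sort standing in for Hadoop's shuffle phase:

```python
from itertools import groupby


def mapper(lines):
    """Map step: emit one tab-separated (word, 1) pair per word."""
    for line in lines:
        for word in line.split():
            yield f"{word}\t1"


def reducer(sorted_pairs):
    """Reduce step: sum counts per word. Input must arrive sorted by key,
    which Hadoop's shuffle guarantees (here the local sort stands in)."""
    keyed = (pair.split("\t") for pair in sorted_pairs)
    for word, group in groupby(keyed, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(count) for _, count in group)}"


if __name__ == "__main__":
    # Local simulation of: cat input | mapper | sort | reducer
    shuffled = sorted(mapper(["hadoop streaming", "hadoop mapreduce"]))
    for out in reducer(shuffled):
        print(out)
```

On a cluster, the same two functions would read `sys.stdin` and be wired up via the streaming jar's `-mapper`/`-reducer` options.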

·        Oozie

o        design, development, and execution of complex scalable and
fault-tolerant workflows

o        linking and coordination of HDFS, MR, Java, Sqoop, Hive, SSH, etc.,
actions and sub-workflows to build application pipelines for data ingestion
and processing

o        application failure analysis and troubleshooting

o        advanced design techniques including graceful failure and
recovery, parameterization, parallel action execution, and decision control

o        coordinator design, development, and execution for automated
workflow execution

o        bundle design, development, and execution for advanced coordinator
control
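
A skeletal Oozie workflow covering the points above — two chained actions with ok/error transitions and a kill node for graceful failure. Every name, path, and parameter here is a hypothetical placeholder:

```xml
<workflow-app name="ingest-wf" xmlns="uri:oozie:workflow:0.5">
  <start to="import"/>
  <action name="import">
    <sqoop xmlns="uri:oozie:sqoop-action:0.4">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <command>import --connect ${jdbcUrl} --table orders --target-dir ${stagingDir}</command>
    </sqoop>
    <ok to="load"/>
    <error to="fail"/>
  </action>
  <action name="load">
    <hive xmlns="uri:oozie:hive-action:0.5">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>load_orders.hql</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Ingest pipeline failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

A coordinator would then trigger this workflow on a schedule or on data availability, and a bundle would group several such coordinators.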

·        Sqoop

o        data imports and exports, configuration and optimization thereof

o        design and limitations; incremental imports
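
The incremental-import bullet could be sketched as a single Sqoop invocation; the connection string, table, and column are hypothetical. With `--incremental append`, only rows whose check column exceeds the recorded last value are re-imported:

```
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --table orders \
  --target-dir /data/orders \
  --incremental append \
  --check-column order_id \
  --last-value 100000
```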





Thanks and Regards,



Rahul Kushwaha

Technical Recruiter

Email: rah...@sysmind.com

Website: www.sysmind.com

Gmail : rahulkushwaha2...@gmail.com

Phone:609-897-9670 Extn 2160

Fax: 609-228-5522

Address: 38 Washington Road, Princeton Junction, NJ 08850




NOTE: Under Bill S.1618 Title III, passed by the 105th U.S. Congress, this
mail cannot be considered spam as long as we include the contact
information for removal from our mailing list. To be removed from our
mailing list please reply with 'remove' in the subject heading and your
email address in the body. Include complete address and/or domain/aliases
to be removed. If you still get these emails, please call us at the number
in the signature.

-- 
You received this message because you are subscribed to the Google Groups 
"USITCV" group.