Hi,

 

Please help me out by providing a good resource for this requirement at 
*s...@puresoftinc.com 
<man...@puresoftinc.com>*

 

 

*Hortonworks Hadoop Data Platform Engineer- Rochester, NY*

*Location: Rochester, NY *

*Rate: Open*

*Client: Excellus *

*Duration: *Through 12/31/18, C2H.  *All candidates must be GC EAD, GC, or USC and 
able to convert with no sponsorship now or in the future.*

*Interviews: *F2F for local candidates.  Phone/Skype/WebEx for out of state.  Work 
must be done on site.

*(ALL RESUMES MUST HAVE A PHOTO ON THEM FOR EVERY CANDIDATE SUBMITTED), or they 
must have an active LinkedIn profile.*

*Excellent communication skills*

*Data Platform Engineer*

Responsibilities:

·         Hortonworks Hadoop administration. Development of SOPs, 
policies, and documentation for a new big data platform.

·         Working with the project team on the standup and configuration of 
the Hadoop ecosystem.

·         Work on a ground-up Big Data implementation.

·         Implementation, configuration, administration, build-out, adding 
nodes, and troubleshooting of Hortonworks Hadoop.

·         40 hrs weekly / business casual.  Temp-to-perm potential.

Required Skills:

·         *Hortonworks Hadoop architecture, administration, configuration, 
troubleshooting, etc.  Very in-depth Hortonworks Hadoop skills are required;*

·         *Big Data architecture; *

·         *Security configurations for big data platforms*

·         *NiFi (ETL processor)*

·         *HDP (Hortonworks Data Platform)*

·         *Development experience (Spark)*

·         *Very strong communication and documentation skills.*

Details:

The Data Platform Engineer engages in the design, development, and 
maintenance of the big data platform. This platform hosts structured and 
unstructured data sets that support various business operations and 
enable data-driven decisions. The role involves administering various 
data-hub ecosystem open source software tools along with other vendor-supported 
tools such as Hortonworks Hadoop, Cisco DV, and IBM Cognos. The 
DPE will work closely with Data Scientists, Infrastructure Administrators, 
and Data Platform Architects to ensure the platform meets business demands.

Essential Responsibilities/Accountabilities: Level I

   - Installs and/or upgrades, configures, administers, and troubleshoots 
   the data hub software environment in order to achieve a reliable, highly 
   available, well-performing system. 
   - Provides direct technical support to data warehouse user community, 
   and triages support to appropriate personnel when technical support is not 
   sufficient. 
   - Develops standards, policies and procedures for the form, structure 
   and attributes of the data warehouse tools and systems. 
   - Develops data/information quality metrics. 
   - Develops processes to monitor cluster performance and resource usage. 
   - Works with Server Engineers to install new nodes, resolve node 
   failures, and apply patches and upgrades. 
   - Maintains processes that feed data from various systems across the 
   enterprise, ensuring data quality and process efficiency. 
   - Follows and helps streamline procedures for provisioning access to the 
   BI system and establishing security. 

Level II In addition to Level I responsibilities:

   - Provides training to staff on the use of the tools. 
   - Develops and applies standards and best practices for data storage, 
   data processing and platform integration. 
   - Ensures security of the environment by managing permissions and 
   encryption strategy. 

Level III In addition to Level II responsibilities:

   - Researches new technology. 
   - Addresses performance and scalability issues and performs necessary 
   capacity planning to meet new business initiatives. 
   - Develops backup, restore and mirroring strategies to ensure a highly 
   available platform. 

Minimum Qualifications:

   - Bachelor's degree in Information Technology, Computer Science, 
   Software Engineering, or a closely related field (or four additional years 
   of related work experience in lieu of a bachelor's). 
   - Related work experience (e.g., co-ops/internships) preferred. 
   - Ability to take initiative and have a strong vision for driving large 
   scale distributed data platforms. 
   - Experience with ecosystem components such as Hadoop, Cisco DV, Cognos, 
   etc. 
   - Familiarity with Linux system administration, Linux scripting, and 
   networking. 
   - Proficiency in a programming language such as Python or Java is a plus. 
   - Experience with relational and NoSQL databases, including modeling and 
   writing complex queries is a plus. 
   - Excellent communication and analytical skills. 

Level I

   - Ability to perform routine tool maintenance working with the vendors 
   and others on the team. 
   - Ability to identify data warehouse application and data issues. 
   - Monitor system uptime and performance. 
   - Escalate appropriately to management and/or more senior Tool 
   Administrators. 
   - Minimum of 3 years’ experience in tool administration. 
   - Demonstrated experience with Data Warehousing. 
   - Proactively identify and address risks in the system before they become 
   system issues. 
   - Research and resolve data warehouse application and data issues. 

Level II

   - Minimum of 4 years’ experience in tool administration, with a minimum 
   of 2 years in the tool being requested to support. 
   - Experience with object-oriented design, coding, and testing patterns, 
   as well as experience in engineering (commercial or open source) software 
   platforms and large-scale data infrastructures. Understanding of how to 
   apply technologies to solve big data problems and to develop innovative big 
   data solutions, which requires knowledge of different programming or 
   scripting languages such as Java, Linux shell scripting, C++, PHP, Ruby, 
   Python, and/or R. 
   - Ability to facilitate effective presentations to front line management 
   staff. 
   - Training facilitation skills. 

Level III

   - Minimum of 5 years’ experience using the particular tools they will be 
   asked to support. 
   - Recognized as subject matter expert in the field. 
   - Experienced in managing vendor contract support. 
   - Capability to architect highly scalable distributed systems using 
   different open source tools. 
   - Understanding of how algorithms work and experience building 
   high-performance algorithms. 
   - Proactively plan for growth, system improvements, upgrades, etc. 

Physical Requirements

   - Ability to travel across regions 

Equal Opportunity Employer

All qualified applicants will receive consideration for employment without 
regard to race, color, religion, sex, sexual orientation, gender identity, 
national origin, disability or veteran status.


*Thanks & Regards*

*Om Shiv *

*Puresoft, Inc*

*W: 408-442-3664, Ext. 4425*

*Email: s...@puresoftinc.com* <man...@puresoftinc.com> *|| 
Hangout: kesharioms...@gmail.com*

*Website: **www.puresoftinc.com* <http://www.puresoftinc.com>



*This message contains information that may be privileged or confidential 
and is the property of Puresoft, Inc. It is intended only for the person to 
whom it is addressed. If you are not the intended recipient, you are not 
authorized to read, print, retain copy, disseminate, distribute, or use 
this message or any part thereof. If you receive this message in error, 
please notify the sender immediately and delete all copies of this message. 
Puresoft, Inc does not accept any liability for virus infected mails.*

 

 

 

-- 
You received this message because you are subscribed to the Google Groups 
"CorptoCorp" group.