Greetings


*Kindly share profiles to **s...@terminalcontacts.com*



*Only genuine profiles; a DL/Visa copy and a LinkedIn profile are a must*

*#1*

*Location: Tampa, FL*

*Duration: 1 Year*

*Experience: 10 years*

*Role: BigData Lead*



*The Tech Lead for Big Data* is responsible for managing the full
life-cycle of a Hadoop solution. Detailed responsibilities are given below:

•              Should have at least 9 years of experience, including at
least 4 years as a Tech Lead

•              Perform requirements analysis, platform selection, technical
architecture design, application design and development, testing, and
deployment of the proposed solution; benchmark systems, analyze system
bottlenecks, and propose solutions to eliminate them

•              Have experience with the major big data solutions such as
Hadoop, Spark, Kafka, MapReduce, Hive, HBase, MongoDB, and Cassandra, as
well as with tools such as Impala, Oozie, Mahout, Flume, ZooKeeper, and/or
Sqoop

•              Have a firm understanding of major programming/scripting
languages such as Java, Scala, and Python, as well as experience working
with ETL tools such as Informatica and Talend

•              Experience in the banking domain is an added advantage

•              Clearly articulate pros and cons of various technologies and
platforms;

•              Document use cases, solutions and recommendations;

•              Have excellent written and verbal communication skills;

•              Explain the work in plain language;

•              Help program and project managers in the design, planning
and governance of implementing projects of any kind;

•              Perform detailed analysis of business problems and technical
environments and use this in designing the solution;

•              Work creatively and analytically in a problem-solving
environment;

•              Be a self-starter, work in teams, as a big data environment
is developed in a team of employees with different disciplines;

•              Work in a fast-paced agile development environment.



#2

Location: Denver, CO

Duration: 1 year

Experience: 12+ years

Role: BigData Architect

*Below is the JD for the Sr. Big Data Architect profile*

•              10-12 years of experience as an Architect/Consultant
working on Big Data platforms

•              Experience handling very large data repositories and
technologies (multi-terabyte scale or larger)

•              Experience delivering distributed and highly scalable
applications on NoSQL/Sharded relational Databases/Mapreduce

•              Ability to quickly prototype, architect, and build software
using the latest technologies, including Java-based ones

•              Experience in application deployment architectures and
concerns such as scalability, performance, availability, reliability,
security etc.

•              Experience in any one or more of the following technologies

o   1+ yrs on Hadoop (Apache/Cloudera) and/or other MapReduce platforms

o   1+ yrs on Hive, Pig, Sqoop, Flume and/or Mahout

o   3+ yrs as a J2EE architect with strong experience in core Java

o   1+ yrs in architecting NoSQL and competent SQL skills

o   Good background in Configuration Management/Ticketing systems like
Maven/Ant/JIRA etc.

o   Strong in Shell Scripting/Linux programming

•              Good knowledge of any Data Integration and/or DW tools is a
plus

•              Good knowledge of columnar databases (Vertica, Redshift) and
data modeling

•              Candidate should have at least one or two projects delivered
on above technology stacks.

•              Candidate should have worked on the solutions using EDW
products or niche Open Source software

*Key Responsibilities*

•              Design and implement core technologies in a fast-paced
environment, with minimal guidance

•              Implement robust, highly scalable, highly optimized
distributed components

•              Evaluate and integrate the latest technologies and
third-party tools/APIs

•              Optimize architecture for security, operational stability,
scalability and cost

•              Develop capacity planning models

*Other Responsibilities: *

•              Enable competency building and support Center of
Excellence activities

•              Handle proposals and support business development

•              Provide architectural and capability presentations to
customers on the Big Data platform

•              Support development of Industry/Horizontal solutions along
with the Domain teams

*Other Qualification:*

•              Degree in Computer Science or Engineering or equivalent work
experience.

•              Highly proficient, with customer-facing project experience
involving design, development, and deployment in one of the areas mentioned
above

•              Must have a proven record of delivering technical solutions

•              At least 2-4 years of related Hadoop technology experience







*#3*

*Location: Denver, CO*

*Duration: 1 year*

*Experience: 8 years*

*Role: BigData Engineer*



*Below is the JD for the Big Data ETL Developer*

•              Essential skills: AWS, Hortonworks, Kafka, H2O. Nice to
have: Hive, R, and Pig

•              1-2 years of experience as a big data developer

•              Experience delivering distributed and highly scalable
applications on NoSQL/Sharded relational Databases/Mapreduce

•              Experience in application deployment architectures and
concerns such as scalability, performance, availability, reliability,
security etc.

•              Experience in any one or more of the following technologies

o   1+ yrs on Amazon AWS big data platforms (S3, Redshift)

o   1+ yrs on Hadoop (Apache/Cloudera) and/or other MapReduce platforms

o   1+ yrs on Hive, Pig, Sqoop, Flume and/or Mahout

o   1+ yrs as a J2EE architect with strong experience in core Java

o   1+ yrs in architecting NoSQL and competent SQL skills

o   Strong in Shell Scripting/Linux programming

o   Good knowledge of any Data Integration and/or DW tools is a plus

•              Candidate should have worked on the solutions using EDW
products or niche Open Source software



*Key Responsibilities*

•              Develop using Hadoop and Hadoop-related tools for ETL and
other performance-related work

•              Develop, test, and perform UAT in a big data Hadoop
environment using Agile methodology

•              Design and build scalable infrastructure and platforms to
collect and process very large amounts of data (structured and
unstructured), including streaming real-time data.

•              Work closely across an array of various teams and
organizations in the company and industry (including partners, customers,
and researchers).



*Skills*

•              2+ years of software development experience using multiple
computer languages. Experience building large scale distributed data
processing systems/applications or large-scale internet systems (cloud
computing).

•              Strong foundational knowledge and experience with
distributed systems and computing systems in general. Hands-on engineering
skills.

•              Experience with a range of big data architectures, including
OpenStack, Hadoop, Pig, Hive or other big data frameworks.

•              Broad understanding and experience of real-time analytics,
NoSQL data stores, data modeling and data management, analytical tools,
languages, or libraries (e.g. SAS, SPSS, R, Mahout).

•              Strong interpersonal communication skills.

•              Ability to lead initiatives and people toward common goals.

•              Excellent oral and written communication, presentation, and
analytical skills

•              Bachelor's degree in Computer Science/Engineering, higher
degrees preferred.



*Kindly share the below details for quick process:*

Full Legal Name as in Driving License/ Passport:

DOB (MM/DD/YYYY):

Current Location, City and State:

Mobile and Home Phone No:

Email ID:

US work authorization:

Highest Educational degree:

Year of graduation:

Overall years of experience:

Currently on a project:

Willingness to relocate across US:

Passport Number:

Interview Availability:

Available to join from (Availability):

Skype Id:

Expected Hourly Rate (W2/1099/C2C):







Thanks,



N.Sivakumar

Terminal Contacts

s...@terminalcontacts.com

813 321 5307



*Post REQUIREMENTS/HOT LISTS @ **new-deliberat...@googlegroups.com*

-- 
You received this message because you are subscribed to the Google Groups 
"mainframe" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to mainframe+unsubscr...@googlegroups.com.
To post to this group, send email to mainframe@googlegroups.com.
Visit this group at https://groups.google.com/group/mainframe.
For more options, visit https://groups.google.com/d/optout.
