Hi *Vendors*,

Please share suitable profiles for the below requirements to
*u...@genisists.com*

*ROLES: DATA ANALYST, DATA ENGINEER, BIG DATA LEAD, ETL INFORMATICA*
*Location: Charlotte, NC*
*Duration: Long Term*
*Experience: 8+ Yrs*
*Visa: No OPT & CPT*

****DIRECT NUMBER IS MANDATORY****

*1) Role - Data Analyst*
Job Description:
• Under the direct supervision of an experienced Data Analyst, the Data
Analytics Analyst will learn how to explore data to find its meaning and its
relationship to solving business problems:
o Document technical and business data definitions and data lineage from
source to target in a metadata management system.
o Work with peer technical personnel, members of the DAO team, and our
business partners to assemble data from various sources into datasets, and
analyze that information to solve business problems and improve business
efficiency.
o Support technical and business testing as a member of development project
teams.
o Provide support to data quality and data governance efforts through
analysis of data and by assisting in understanding the meaning of data.
o Learn to perform data profiling.
o Administer and generate metrics on data to monitor and improve data
quality.
o Participate in changes to key data elements and source data, and document
the changes to business and technical definitions and data lineage in our
metadata repository.
o Learn to perform periodic user access reviews.

Job Requirements:
• Bachelor's degree, or at least 1-2 years of training or relevant work
experience in Data Analysis (e.g., Business and Data Analysis, Data
Engineering).
• Excellent written, verbal, and interpersonal skills are a must, as there
will be significant collaboration with the business and IT.
• Experience in creating dashboards with data quality key performance
indicators (KPIs).
• Relevant experience in at least one of the following areas is desired:
• Connecting, wrangling, and cleaning data from relational, columnar, or
Hadoop databases using SQL, Python, R, or other tools to make data
consumable for analysis. You can automate and document your processes to
make them repeatable.
• Creating metadata: defining the meaning of data and analyzing data flows
to determine source-to-target data lineage, then storing this information in
a metadata reference library. Documenting data entity relationships in a
tool such as Excel, Erwin, etc.
• Analyzing data to obtain new insights that will lead to improved
efficiencies, lower cost, or reduced business risk. You may be using Excel,
SAS, Python, R, SQL/HQL, and other tools. You may be able to present your
results using data visualization tools to make your conclusions more
understandable to your audience.
• Analyzing the statistical relationship of data to a problem to be solved
using SQL, Python, or R; finding new and creative sources of data, then
determining the comparative strength of the relationship to the problem
after the new data is applied. You may also have the training and skills to
create a predictive or analytical model from your dataset that will enable
repeated, automated results on an ongoing basis.
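As a concrete illustration of the data profiling and cleaning duties listed above, here is a minimal sketch in Python (standard library only); the sample records and field names are hypothetical, not the client's actual data:

```python
def profile(records):
    """Compute simple per-column data-quality metrics:
    row count, null count, and distinct-value count."""
    columns = {key for row in records for key in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in records]
        non_null = [v for v in values if v not in (None, "")]
        report[col] = {
            "rows": len(values),
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return report

# Hypothetical sample: two policy records, one with a missing field.
sample = [
    {"policy_id": "P1", "state": "NC"},
    {"policy_id": "P2", "state": None},
]
print(profile(sample))
# → {'policy_id': {'rows': 2, 'nulls': 0, 'distinct': 2},
#    'state': {'rows': 2, 'nulls': 1, 'distinct': 1}}
```

In practice a report like this would be generated per table and tracked over time as the data-quality metrics the posting mentions.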
Preferred Skills:
• Training or a degree in Data Analysis, Data Engineering, or Statistical
Modeling.
• Experience in exploratory data analysis and in how data analysis fits into
an analytic lifecycle is desired.
• Experience in metadata management with a metadata management tool (such as
the IBM InfoSphere tool suite).
• Relevant IT or other technical experience in data analysis/mapping and
logical and physical data modeling is preferred.
• Multi-line Life Insurance and Annuity experience is preferred for working
with the business to understand the business impact of data issues.



*2) Role - Data Engineer*
Job Description:
• Creation of data products for all consumers – business users, analysts,
and modelers. Explore and understand data sets.
• Visualize the data set; determine whether the data set has enough
information to answer the question that the business is asking.
• Work with IT support to create ETL/ELT interfaces to the data lake, and
create and visualize the data and data products on the data lake.
• Implement required data transformations in the data lake.
• Configure required security and data masking for a data set.
• Support testing of data acquisition, data set correlation, and/or model
development. Investigate and resolve interface issues.
• Work with IT to harden and productionize the model, model interfaces, and
business procedures.
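The transformation and data-masking duties above can be sketched in plain Python; the field names and masking rule here are hypothetical examples, not the client's actual scheme:

```python
def mask_ssn(ssn):
    """Mask all but the last four digits of an SSN-style field."""
    digits = ssn.replace("-", "")
    return "***-**-" + digits[-4:]

def transform(row):
    """A toy in-lake transformation: mask a sensitive field and
    derive a full_name column from first/last name."""
    return {
        "full_name": f'{row["first_name"]} {row["last_name"]}',
        "ssn": mask_ssn(row["ssn"]),
        "premium": round(row["premium"], 2),
    }

record = {"first_name": "Ada", "last_name": "Lovelace",
          "ssn": "123-45-6789", "premium": 199.999}
print(transform(record))
# → {'full_name': 'Ada Lovelace', 'ssn': '***-**-6789', 'premium': 200.0}
```

In a real data lake the same row-level logic would typically run inside a Spark or Hive job rather than as a standalone script.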
Requirements:
• Master's/Bachelor's degree in one of the following: Engineering,
Statistical Analytics, Data Science, or Actuarial Science.
o At least 3-5 years of relevant work experience in implementing data and
analytics projects.
• Resources must have domain and technical experience in delivering data
engineering solutions using data lake technology.
• Experience with the following: Hadoop (CDH), relational databases and SQL,
ETL development, Spark, and data validation and testing (data warehousing,
ETL/ELT to the data lake, using the data lake for data analysis with Hadoop
tools such as Hive, Impala, Pig, Sqoop, Hue, and Kafka), plus Python, R,
Java, Docker, and Dakota.
• Knowledge of cloud platform implementation (Azure or Amazon). Knowledge of
data visualization tools is a plus (Tableau on multiple platforms, along
with Python visualization in the data lake using the pandas and Bokeh
packages).
• Excellent written, verbal, and interpersonal skills are a must, as there
will be significant collaboration with the business and IT.
• Experience with collaborative development workflows (e.g., Microsoft VSTS,
TFS, Bamboo, GitHub).


*3) ROLE: BIG DATA TECHNICAL LEAD*
*JOB DESCRIPTION:*
• Big Data/Hadoop Technical Lead with extensive experience on the Hadoop
platform and related Big Data tools/technologies.

• 10+ years of experience in implementing and managing high-performance,
scalable enterprise applications in the Financial Services industry.

• Good knowledge of architecture, design patterns, source-to-target
mappings, ETL architecture in the Hadoop space, data modeling techniques,
and performance tuning in a Hadoop environment.

• Experience with big data tools: Hadoop, Spark, Kafka, PySpark, Python,
Impala, Hive, HDFS, and related tools.

• Experience with CI/CD tools such as Maven, Git/Stash, Jenkins, Docker,
etc.

• Advanced-level knowledge of and experience with SQL queries.

• Able to analyze huge volumes of data, understand patterns and data
quality issues, and design solutions to manage volume.

• Expertise in building applications on cloud platforms such as AWS and
leveraging native services is an asset.

• Experience working in an Agile environment.

• Experience working with shell scripting.

• Modularize the project components. Architect the complete project and
provide solution architecture.
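As one illustration of the SQL-based data-quality analysis this role calls for, here is a small sketch using Python's built-in sqlite3 module; the table, column names, and sample rows are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id INTEGER, symbol TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [(1, "AAPL", 100), (2, "MSFT", None), (2, "MSFT", None), (3, None, 50)],
)

# Duplicate-key check: trade_ids that appear more than once.
dupes = conn.execute(
    "SELECT trade_id, COUNT(*) FROM trades"
    " GROUP BY trade_id HAVING COUNT(*) > 1"
).fetchall()

# Null-count check per column (IS NULL evaluates to 0/1 in SQLite).
nulls = conn.execute(
    "SELECT SUM(symbol IS NULL), SUM(qty IS NULL) FROM trades"
).fetchone()

print(dupes)   # → [(2, 2)]  (trade_id 2 appears twice)
print(nulls)   # → (1, 2)    (one null symbol, two null qty)
```

At Hadoop scale the same GROUP BY / HAVING and null-count queries would run through Hive or Impala rather than SQLite, but the analysis pattern is the same.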



*Basic Qualifications:*

• 5+ years of experience in Big Data technologies; 10+ years of experience
in the Data Warehouse, Business Intelligence, and analytics area.

• 3+ years of experience in Hadoop ecosystems.

• 2+ years of experience in Spark.

• 1+ years of experience with Microsoft Azure and/or AWS.

• Any exposure to Mainframe as a source is preferred.


*4) ROLE: ETL Informatica Developer*
*Location: CT*
*Client: MassMutual*
*Experience: 8+*
*12+ Months contract (C2C)*

Required Skills:

   - Informatica
   - PowerCenter
   - Big Data
   - ETL
   - SSIS


*Basic Qualifications:*

   - ETL & ELT solutions
   - Informatica PowerCenter experience
   - 5+ years of design and development in data warehousing
   - This role is to strategically design and implement ETL & ELT
   solutions, including integration with on-premise and cloud databases and
   data warehouses.

Additional Skills:

   - Hadoop, MySQL, SAS
   - Agile is a plus
   - Informatica BDM/PowerCenter Administrator & Developer


*Specific elements of skills and responsibilities for this position are
listed below:*

   - Experience providing administration and technical leadership for all
   aspects of Informatica BDM/PowerCenter, Test Data Management, Data
   Quality, and/or Big Data
   - Managing Informatica domains
   - Working knowledge of Informatica architecture and installation
   processes
   - 5+ years of experience in Data Architecture and Database Design
   - 5+ years of experience designing and implementing Data Warehouses and
   related applications
   - Experience in coordinating data operations and liaising with
   Infrastructure, Security, Data Platform, and Application teams
   - Familiarity with popular cloud database and big database systems,
   tools, and modelers (such as MySQL, Hadoop, PostgreSQL, SAS, R), etc.
   - Demonstrated understanding of IT environment issues and
   troubleshooting
   - Experience in Agile development methodology and tools
   - Ability to work in a team and mentor junior resources; therefore,
   excellent written and verbal communication skills are a must
   - Experience in the financial industry is a plus


*Thanks & Regards,*

*Umashankar*

*US IT Recruiter*

*Genisis Technology Solutions*

*Email: **u...@genisists.com* <u...@genisists.com>

*Desk number: **+1(908)-801-6926*


-- 
You received this message because you are subscribed to the Google Groups 
"Android Developers" group.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/android-developers/CAMSH8spbusWcnsUGqWmGu7jPC4ZjGxKA0Am9ZW1FkPhQW%3D8tNA%40mail.gmail.com.
