Hi
We have multiple requirements for different C2C positions. Please share suitable consultant profiles to u...@genisists.com, or reach me at +1(908)440-2928.

****DIRECT MOBILE & PASSPORT NUMBERS ARE MANDATORY For All Requirements****

*1. Role: Salesforce Lightning Developer*
*Location: Connecticut*
*Duration: Long Term*
*Experience: 6+ Yrs*
*Visa: Any Except CPTs*

- Sound technical knowledge of Salesforce development and leadership, with 6+ years of relevant experience.
- Must have 2 years of hands-on experience with the Lightning development framework.
- Thorough knowledge of Salesforce Sales Cloud and Salesforce Service Cloud.
- In-depth understanding of Force.com application architecture.
- Experience building complex projects on the Force.com platform.
- Experience with Apex classes and Lightning components.
- Deployment automation experience is required, using Metadata API, Change Sets, and ANT.
- Good knowledge of Apex custom controllers, Apex classes, Apex triggers, Visualforce pages/components, Apex web services, SOQL, SOSL, Workflow and Approvals, and Reports and Dashboards.
- Good knowledge of Copado deployments and GitHub.
- Minimum 2+ years of experience with Agile methodology and JIRA.
- Should be able to review SFDC code, lead, and suggest standard improvements.

*2. Role: Data Engineer*
*Location: Charlotte, NC*
*Duration: Long Term*
*Experience: 8+ Yrs*
*Visa: No OPT & CPTs*

Requirements:
- Master's/Bachelor's degree in one of the following: Engineering, Statistical Analytics, Data Science, or Actuarial Science.
- At least 3-5 years of relevant work experience implementing data and analytics projects.
- The resources must have domain technical experience delivering data engineering solutions using data lake technology.
- Experience with the following: Hadoop (CDH), relational databases and SQL, ETL development, Spark, data validation and testing (data warehousing, ETL/ELT to the data lake, using the data lake for data analysis with Hadoop tools — Hive, Impala, Pig, Sqoop, Hue, Kafka, etc.), Python, R, Java, Docker, Dakota.
- Knowledge of cloud platform implementation (Azure or Amazon). Knowledge of data visualization tools is a plus (Tableau on multiple platforms, along with Python visualization in the data lake using the Pandas and Bokeh packages).
- Excellent written, verbal, and interpersonal skills are a must, as there will be significant collaboration with the business and IT.
- Experience with collaborative development workflows (e.g., Microsoft VSTS, TFS, Bamboo, GitHub).

*3. Role: Data Analyst*
*Location: Charlotte, NC*
*Duration: Long Term*
*Experience: 8+ Yrs*
*Visa: No OPT & CPTs*

Job Requirements:
- Bachelor's degree, or at least 1-2 years of training or relevant work experience in data analysis (e.g., business and data analysis, data engineering).
- Excellent written, verbal, and interpersonal skills are a must, as there will be significant collaboration with the business and IT.
- Experience creating dashboards with data quality key performance indicators (KPIs).
- Relevant experience in at least one of the following areas is desired:
- Connecting, wrangling, and cleaning data from relational, columnar, or Hadoop databases using SQL, Python, R, or other tools to make data consumable for analysis. You can automate and document your processes to make them repeatable.
- Creating metadata: defining the meaning of data and analyzing data flows to determine source-to-target data lineage, then storing this information in a metadata reference library.
- Documenting data entity relationships in a tool such as Excel, Erwin, etc.
- Analyzing data to obtain new insights that will lead to improved efficiencies, lower cost, or reduced business risk. You may be using Excel, SAS, Python, R, SQL/HQL, and other tools, and may be able to visualize your results using data visualization tools to make your conclusions more understandable to your audience.
- Analyzing the statistical relationship of data to a problem to be solved, using SQL, Python, or R; determining new and creative sources of data, then determining the comparative strength of the relationship to the problem after the new data is applied. You may also have the training and skills to create a predictive or analytical model from your dataset that enables repeated automatic results on an ongoing basis.

*4. Role: Big Data Lead*
*Location: Charlotte, NC*
*Duration: Long Term*
*Experience: 10+ Yrs*
*Visa: No OPT & CPTs*

- Big Data/Hadoop technical lead with extensive experience on the Hadoop platform and related big data tools/technologies.
- 10+ years of experience implementing and managing high-performance, scalable enterprise applications in the financial services industry.
- Good knowledge of architecture, design patterns, source-to-target mappings, ETL architecture in the Hadoop space, data modeling techniques, and performance tuning in a Hadoop environment.
- Experience with big data tools: Hadoop, Spark, Kafka, PySpark, Python, Impala, Hive, HDFS, and related tools.
- Experience with CI/CD tools such as Maven, Git/Stash, Jenkins, Docker, etc.
- Advanced knowledge of and experience with SQL queries.
- Able to analyze huge volumes of data, understand patterns and data quality issues, and design solutions to manage the volume.
- Expertise in building applications on cloud platforms such as AWS and leveraging native services is an asset.
- Experience working in an Agile environment.
- Experience with shell scripting.
- Able to modularize the project components, architect the complete project, and provide solution architecture.

*5. Role: ETL Informatica Developer*
*Location: Connecticut*
*Duration: 12+ Months contract (C2C)*
*Visa: No OPT & CPTs*
*Experience: 8+ Yrs*

Required Skills:
- UNIX
- Informatica PowerCenter
- Big Data
- ETL
- SSIS

Additional Skills:
- Hadoop, MySQL, SAS
- Agile is a plus
- Informatica BDM/PowerCenter Administrator & Developer

This role is to strategically design and implement ETL and ELT solutions, including integration with on-premise and cloud databases and data warehouses. Specific skills and responsibilities for this position are listed below:
- Experience providing administration and technical leadership for all aspects of the Informatica BDM/PowerCenter, Test Data Management, Data Quality, and/or Big Data Management Informatica domains.
- Working knowledge of Informatica architecture and installation processes.
- 5+ years of experience in data architecture and database design.
- 5+ years of experience designing and implementing data warehouses and related applications.
- Experience coordinating data operations and liaising with infrastructure, security, data platform, and application teams.
- Familiarity with popular cloud database and big database systems, tools, and modelers (such as MySQL, Hadoop, PostgreSQL, SAS, R), etc.
- Demonstrated knowledge of IT environmental issues and troubleshooting.
- Experience with Agile development methodology and tools.
- Must be able to work in a team and mentor junior resources; therefore, must possess excellent written and verbal communication skills.
- Experience in the financial industry is a plus.

*Thanks & Regards,*
*Umashankar*
*US IT Recruiter*
*Genisis Technology Solutions*
*Email: u...@genisists.com*
*Desk number: +1(908)-440-2928*

--
You received this message because you are subscribed to the Google Groups "Android Developers" group. To unsubscribe from this group and stop receiving emails from it, send an email to android-developers+unsubscr...@googlegroups.com. To view this discussion on the web visit https://groups.google.com/d/msgid/android-developers/CAMSH8srd%2B_MQXbNwGUE9QohgVLhW2J8LHJUDp9cya-iy2VQ81g%40mail.gmail.com.