*Immediate Need for Business Analyst // Data Warehouse Engineer // Big Data Architect // DevOps/Implementation Engineer // ReactJS Developer*
*Business Analyst*
*Location:* Salt Lake City, UT
Interview slots are available.
10+ years' experience. Need a *Business Architect/Sr. BA/BSA* who can drive requirements, communicate them to the India team, write user stories, and capture minutes of meetings. He/she should drive the calls.

*Skill* *How Many Years Used* *Year Last Used*
· Visio Business Process Diagrams
· User story writing / building mock-ups
· Mortgage/consumer lending
· Onshore/offshore coordination

**********************************************************************************

*Data Warehouse Engineer*
*Location:* Century City, CA
*Contract position:* 12+ months with possible extension
Interview slots are available.
*Must-have skills:* Informatica, ETL, IBM DataStage, scheduling tools such as Control-M, Autosys

*Skill* *How Many Years Used* *Year Last Used*
· Informatica, ETL
· IBM DataStage
· Scheduling tools such as Control-M, Autosys

*Job Description*
· Candidate should have around 8-10 years of experience; the work location is Century City, California.
· Knowledge of IBM DataStage architecture, stages/transformations, design, ETL flow, and advanced SQL.
· Knowledge of data-loading techniques using DataStage.
· Hands-on ETL development experience, with good knowledge of data warehousing concepts and experience in data modeling.
· Experienced in DataStage parallel job development using stages such as Aggregator, Join, Merge, Lookup, Dataset, Filter, Row Generator, Column Generator, Change Capture, Copy, Funnel, Sort, Peek, Routines, etc.
· Knowledge of and experience with DataStage Director for running, monitoring, and error management of jobs.
· Experienced in developing DataStage sequences to run batch jobs.
· Understanding of and experience with Unix/Linux systems, file systems, and shell scripting.
· Knowledge of scheduling tools such as Control-M, Autosys, Tivoli, etc. (any one).

*Good to have:*
· Hands-on experience applying modeling techniques to actual business solutions.
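As a rough illustration of the stage types named in this JD (Lookup, Filter, Aggregator), here is a minimal plain-Python sketch of the same pipeline pattern; the loan records, branch codes, and threshold below are invented for illustration and are not taken from the posting.

```python
# Sketch of a Lookup -> Filter -> Aggregator flow in plain Python.
# All data values here are hypothetical examples.
from collections import defaultdict

loans = [
    {"loan_id": 1, "branch": "UT01", "amount": 250000},
    {"loan_id": 2, "branch": "CA02", "amount": 400000},
    {"loan_id": 3, "branch": "UT01", "amount": 180000},
]
# Lookup/reference table (analogous to a Lookup stage's reference link).
branches = {"UT01": "Salt Lake City", "CA02": "Century City"}

def run_pipeline(rows):
    # Lookup stage: enrich each row with the branch city.
    enriched = [{**r, "city": branches.get(r["branch"], "UNKNOWN")} for r in rows]
    # Filter stage: keep loans at or above an illustrative 200,000 threshold.
    filtered = [r for r in enriched if r["amount"] >= 200000]
    # Aggregator stage: total loan amount per city.
    totals = defaultdict(int)
    for r in filtered:
        totals[r["city"]] += r["amount"]
    return dict(totals)
```

In a real DataStage parallel job these steps would be separate stages wired by links; the sketch only shows the data flow they implement.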
· Knowledge of Salesforce.
· Good experience with onsite/offshore coordination and handling an ETL team.
· Good communication skills; ability to work independently on ETL tasks, including data analysis and documentation.

************************************************************************************

*Big Data Architect*
*Location:* San Jose, CA
*Contract position:* 12+ months with possible extension
Interview slots are available.
Architect with 10+ years of experience. Need an excellent candidate. The JD is attached.

*Skill* *How Many Years Used* *Year Last Used*
· Hadoop
· Spark
· Big Data
· AWS
· Cloudera
· Scala

*Must-have skill sets:* Hadoop, Spark, Big Data, AWS, Cloudera, Scala. Strong programming and design skills.

************************************************************************************

*DevOps/Implementation Engineer*
*Location:* Costa Mesa, CA
*Contract position:* 12+ months with possible extension
Interview slots are available.

*Skill* *How Many Years Used* *Year Last Used*
· AWS
· Java
· Cloudera installation and administration in Linux

*Must-have skill sets:* Expert in AWS, Java, and Cloudera installation and administration in Linux. 4-6 years in AWS and Cloudera are required, so look at candidates with 10-12 years of overall experience.

*Job Description:*
· With direction from the Chief Architect and Architect Leader, align application technical architecture with key business strategy.
· Evaluate and introduce technology tools and processes that enable the organization to develop products and solutions, embrace business opportunities, and/or improve operational efficiency.
· Responsible for building scalable distributed data solutions.
· Proven prototyping track record in assessing Big Data technology integration or a design pattern for a project or a proof-of-concept solution.
· Promote and influence the use of Big Data and Cloud architectural standards, patterns, and models in projects, and be a support anchor for development teams.
· Self-driven; able to work independently and as part of a team, with a proven track record of developing and launching products at scale.
· Exceptional problem-solving and debugging ability to support multiple projects, teams, and clients simultaneously with crucial or ad-hoc custom tasks to aid in the successful delivery of the product.
· Hands-on experience designing and implementing production data applications using the Hadoop ecosystem (MapReduce, Hive, HBase, Spark, Sqoop, Flume, Pig, etc.) and NoSQL (e.g., Cassandra or Redshift).
· Ability to deploy a full AWS environment that includes IAM, multiple peered VPCs with Direct Connect, RDS, an auto-scaling EC2 cluster, S3 buckets, and a Redshift server.
· Ability to use a nested-stack deployment and a single centralized pipeline.
· Create and maintain build scripts using CloudFormation, Python, Batch, Ant, NAnt, and Rake to perform builds and deployments efficiently.
· Understand and execute SysOps and DevOps engineering responsibilities in Continuous Delivery and automation.
· Expertise in CloudFormation, which is primarily used for properly managing an AWS cloud environment.
· Expertise in the AWS CLI for creating and managing stacks and producing code deployments that create EC2 instances such as Jenkins, Chef, and web servers.
· Expert in CloudWatch concepts for creating alarms and metrics and collecting logs via the UI or CLI API commands, as well as the ability to deploy a consistent logging solution for ingesting all cloud logs into Splunk.
· Draw conclusions and effectively communicate findings to both technical and non-technical team members.

*Required Qualifications*
*IDE and Tools:* PyCharm, Atlassian Suite, Git, CVS, SVN, IBM Rational ClearCase, IBM Rational ClearQuest, Ant, NAnt, PowerShell, Rake, WiX, Shell, PVCS, JSON, YAML, CruiseControl, TeamCity, Jenkins, Hudson, Chef, Visual Studio, Eclipse, Apache Tomcat.
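As a rough illustration of the CloudFormation and AWS CLI items above, here is a minimal Python sketch that emits a one-resource CloudFormation template as JSON; the logical ID, description, and bucket properties are placeholder assumptions, not details from the posting.

```python
# Hypothetical sketch: build a minimal CloudFormation template as JSON.
# The logical ID "DeployArtifacts" and the settings are illustrative only.
import json

def make_template(bucket_logical_id="DeployArtifacts"):
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Description": "Minimal artifact-bucket stack (illustration only)",
        "Resources": {
            bucket_logical_id: {
                "Type": "AWS::S3::Bucket",
                # Enable object versioning on the bucket.
                "Properties": {"VersioningConfiguration": {"Status": "Enabled"}},
            }
        },
    }

template_body = json.dumps(make_template(), indent=2)
```

A template like this could then be deployed from the AWS CLI with, for example, `aws cloudformation deploy --template-file template.json --stack-name demo-stack`.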
*Cloud Technologies:* Amazon Web Services EC2, S3, EMR, Redshift, CodeDeploy, EKS, Kinesis, Config, CloudWatch, CloudTrail, CodePipeline, CodeCommit, Athena, Glue, CloudFormation, Elastic Beanstalk, Chef, Scrum, R-Studio, SAS, RDS
*Operating Systems:* UNIX, Linux, Windows platforms.
· Experience using container technology such as Docker, Kubernetes, or PaaS (OpenShift) to deploy Cassandra DB, microservices, and related tools is an added advantage.
· Experience using and integrating with Big Data technologies such as HDFS, HBase, Hive, MapReduce, Pig, Spark, Impala, Storm, Kafka, and Solr.
· Experience with Cloud technologies: AWS, Boto, S3, and security protocols.
· Cloudera or AWS certification.
· Experience with SAS, Redpoint, or similar technologies preferred.

************************************************************************************

*ReactJS Developer*
*Location:* New York City, NY
*Contract position:* 12+ months with possible extension
Interview slots are available.
Need good React.js front-end developers.

*Job Description:*
We need at least two mid-experience front-end React professionals. The React engineers should be really good at cranking out front-end improvements quickly. They don't have to be senior, but they should be super enthusiastic and hard-working, and they need to be very proficient. Basically, we need really good hands-on React.js developers with around 3 to 4 years of experience working with React.js.

--
*Regards,*
Anand Koppula
Sr. Technical Recruiter
*Network Objects Inc*
E-mail: anan...@networkobjects.com
7709 San Jacinto Place, Suite 201, Plano, TX 7024
www.networkobjects.com

--
You received this message because you are subscribed to the Google Groups "CorptoCorp" group.
To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To view this discussion on the web visit https://groups.google.com/d/msgid/corptocorp/CAN_UxXQucTvXCbAFZw_2gdOnVfHqVwfUD9s7oKQ61jF58E4FeQ%40mail.gmail.com.