*Kindly Reply: sumit.ku...@braintreeus.com or sumit.braintre...@gmail.com*
Hi Vendors,

I have multiple openings for the jobs below with one of my clients in CO and CA. Kindly share consultant profiles matching the skills below.

*Position #1:*
*Job Title: Test Engineer*
*Location: Littleton, CO*
*Description:*
· Strong experience creating and executing functional and integration test cases using automation tools
· Test automation is a requirement; experience with Selenium, QTP, or other test automation tools
· Experience testing both web portal and enterprise applications
· Understanding of the web technologies involved in rendering a web page in the browser (DOM, AJAX, JS, CSS)
· Experience with detailed test case creation and cross-platform browser knowledge is desired
· Continuous Integration experience is preferred
· Sound knowledge of REST APIs and JSON
· Strong experience with defect tracking and reporting tools
· Experience with testing and defect management tools (JIRA, QC, Rally, etc.)
· Experience working in an agile team

*Position #2:*
*Job Title: Senior Big Data Engineer*
*Location: Irvine, CA*
*Only US citizen, GC, GC-EAD*
*Description:*
· The Senior Big Data Engineer will work within the Enterprise Data Services team in an Agile environment to install, update, maintain, monitor, and support the Hadoop and Enterprise Data Warehouse/Enterprise Business Intelligence tools, applications, and databases that support all Kelley Blue Book business units.
· The individual will work closely in a team of other big data and data warehouse engineers, business analysts, business intelligence engineers, system analysts, quality assurance engineers, and database administrators to develop data warehouse solutions that meet stated requirements, and to develop detailed specifications and unit test plans for implementation.
· Since EDW works with multiple business units to create cross-functional solutions, the Senior Big Data Engineer must be able to interface with various business units to understand requirements and prepare documentation to support development.
· The individual must understand the Hadoop ecosystem, the complex object design, and the underlying data model of the system. The individual must be comfortable developing data-centric applications using Hadoop tools, Netezza, Informatica, Informatica BDE (Big Data Edition), Hive, MapReduce, and Spark; must be able to develop ETL packages; and is expected to develop queries and stored procedures involving complex database structures and the Hadoop Distributed File System (HDFS).
· The individual must have excellent communication skills, work well in a team environment, enjoy solving complex problems, and be able to work in a fast-paced environment.

*Responsibilities:*
· Define technical scope and objectives through research and participation in requirements gathering and definition of processes

Warm Regards,

*Sumit Kumar*
*Braintree Technology Solutions*
*630-343-0522*
*sumit.ku...@braintreeus.com*
*Gtalk: sumit.braintre...@gmail.com*
*www.braintreeus.com*

*Note:* This email is not intended to be a solicitation. Please accept our apologies and reply with REMOVE in the subject heading to be removed from our mailing list.

--
You received this message because you are subscribed to the Google Groups "USITCV" group.
To unsubscribe from this group and stop receiving emails from it, send an email to usitcv+unsubscr...@googlegroups.com.
To post to this group, send email to usitcv@googlegroups.com.
Visit this group at https://groups.google.com/group/usitcv.
For more options, visit https://groups.google.com/d/optout.