Hi,

We have an opening with one of our direct clients in Dallas, TX. Please let me know
if you have any resources for this position.

Job Title: Sr. Big Data Engineer
Location: Dallas, TX
Duration: 12 months

Job Description:
* Massive data: You will source, examine, and engineer data pipelines for
gigabytes/terabytes of structured and unstructured data on our platform to
create value for customers.
* Production deployment: You will be responsible for integrating and deploying
machine learning pipelines into production, where your ideas can come to life.
* Linux hacking: You will make masterful use of the command line, including
tools like vi/emacs, and go well beyond the basics of grep, bash, awk, sed,
etc., to dive aggressively into data, systems, and compute platforms to get the
results you are seeking.
* Pushing the limits: This role sits at the cutting edge of our Data /
Machine Learning platform. As we push to solve more of our customers'
challenges, you will prototype new features, tools, and ideas, innovating at a
very fast pace to maintain our competitive edge.
* You will coordinate and work with cross-functional teams, sometimes located
in different geographic locations.

What you have done:
* Commercial software engineering: You have 3+ years of professional software
development experience with languages and tools such as Java, Python, and
version control (Git), along with strong analytical and debugging skills.
* Big data: You have extensive experience with data analytics and working
knowledge of big data infrastructure such as the Hadoop ecosystem, HDFS, Spark,
and AWS (nice to have). You have routinely built data pipelines handling
gigabytes/terabytes of data and understand the challenges of manipulating such
large datasets.
* Environments: You have worked in at least one cloud environment such as AWS
or GCP (preferred) and have experience in a Hadoop environment.
* Data Modeling: You have a flair for data, schemas, data models, PL/SQL, and
star & snowflake schemas; you know how to make data models efficient so that
data can be queried efficiently for analysis; you understand the criticality of
TDD and develop data validation techniques.
* Real Time Systems: You understand the evolution of databases toward
in-memory, NoSQL, and indexing technologies, and have experience with real-time
and stream processing systems such as Kafka and Storm.
* Project management: You demonstrate excellent project and time management
skills and have exposure to Scrum or other agile practices in JIRA.
* CS fundamentals: You have earned at least a B.S. or M.S. in Computer Science
or a related degree, and you have a strong ethos of continuous learning.


Thanks & Regards,
Amit |
Intelliswift Software Inc
39600 Balentine Dr., Suite 200 Newark, CA 94560
www.intelliswift.com | Phone: 510-370-4448 | Fax: 510-578-7710
...............................................................................................................

  *   Ranked #1 among Best Places to Work in the Bay Area in 2017 (Small
Business category)
  *   Intelliswift Recognized as Top Diversity Staffing Firm in the U.S. by
Staffing Industry Analysts
  *   Intelliswift Ranked Number 1 Minority-Owned IT Staffing Company by the
San Francisco Business Times (http://www.intelliswift.us/news#news_section-block-0)
