Remote position ==== Looking for Hadoop Architect ==== Must be US citizen or GC holder.

2017-03-16 Thread Mazhar Khan
*Job Title: Hadoop Architect - Remote Work*
*Duration: 3+ Years Contract*

*Rate: $65/hr on C2C.*
*Must be a US citizen or GC holder.*

*Overall Purpose*: Responsible for the development of distributed computing
tasks, including MapReduce, NoSQL, and other distributed-environment
technologies, chosen based on need or preference to solve the problems that
arise. Using programming languages and technology, writes code, completes
programming and documentation, and performs testing and debugging of
applications. Analyzes, designs, programs, debugs, and modifies software
enhancements and/or new products used in local, networked, or
Internet-related computer programs. May interact with users to define
system requirements and/or necessary modifications.
Roles & Responsibilities:
1) 8-10 years’ experience developing software applications, including
analysis, design, coding, testing, deployment, and support
2) Proficiency in application/software architecture (definition, business
process modeling, etc.)
3) Project management experience
4) Experience building Big Data solutions using Hadoop technology
5) Extensive experience with software development and the complete software
lifecycle (analysis, design, implementation, testing, quality assurance)
6) Ability to work with non-technical resources on the team to translate
data needs into Big Data solutions using the appropriate tools
7) Extensive experience developing complex MapReduce programs against
structured and unstructured data
8) Experience loading data into Hive and writing software that accesses
Hive data (see the sketch after this list)
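
The posting itself contains no code; purely as an illustration of item 8,
here is a minimal Java sketch of loading a staged file into Hive and
querying it back over JDBC. The host name, credentials, table, and paths
are invented placeholders, not details from the job.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveQueryExample {
    public static void main(String[] args) throws Exception {
        // Register the HiveServer2 JDBC driver (shipped in hive-jdbc).
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Host, port, database, and user below are placeholders.
        String url = "jdbc:hive2://hive-server.example.com:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "etl_user", "");
             Statement stmt = conn.createStatement()) {
            // Create a simple delimited table, then load a staged HDFS file into it.
            stmt.execute("CREATE TABLE IF NOT EXISTS events (event_type STRING, payload STRING) "
                    + "ROW FORMAT DELIMITED FIELDS TERMINATED BY ','");
            stmt.execute("LOAD DATA INPATH '/staging/events.csv' INTO TABLE events");

            // Read the data back to confirm the load.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT event_type, COUNT(*) FROM events GROUP BY event_type")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
                }
            }
        }
    }
}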

*MUST HAVE*
Minimum of 3 years’ experience with Red Hat Enterprise Linux 6/7 and UNIX
administration supporting large-scale environments
Minimum of 3 years’ experience in administrative support of Hortonworks
HDP, Cloudera, or MapR
Minimum of 3 years’ experience in administrative support of clusters
secured with Kerberos (see the sketch after this list)
Minimum of 3 years’ experience in administrative support of Apache Kafka,
Spark, and streaming applications
Minimum of 1 year’s experience hardening clusters with Knox and Ranger
Minimum of 1 year’s experience with LDAP integration with HDP and other tools
Minimum of 1 year’s experience with automation
Minimum of 1 year’s experience with Solr or Elasticsearch administration
and tuning
Minimum of 1 year’s experience supporting HBase
Experience with issues that affect a distributed system
Experience supporting MySQL and NoSQL stores such as MongoDB
Experience programming in Java, bash/ksh scripting, and Python
Strong written communication and documentation experience
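
As a rough illustration of the Kerberos requirement above, a minimal
sketch of a keytab-based login from a Java Hadoop client. The principal,
realm, and keytab path are assumptions invented for the example.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberizedHdfsAccess {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Tell the Hadoop client that the cluster is secured with Kerberos.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Principal and keytab path are placeholders for this sketch.
        UserGroupInformation.loginUserFromKeytab(
                "svc-hadoop@EXAMPLE.COM", "/etc/security/keytabs/svc-hadoop.keytab");

        // Once logged in, ordinary client calls carry the Kerberos credentials.
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Resolved: " + fs.getFileStatus(new Path("/user")).getPath());
    }
}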

*Responsibilities*
Manage cluster hardening activities through the implementation and
maintenance of security and governance components across various clusters
Manage large-scale Hadoop cluster environments, handling all Hadoop
environment builds, including design, capacity planning, cluster setup,
performance tuning, Hadoop log file analysis, support, maintenance,
backups, and ongoing monitoring
Work closely with the network, infrastructure, Linux admin, and application
development teams to ensure data processing completes within SLA
Identify hardware and software technical problems and storage and/or
related system malfunctions; document all root-cause analysis results to
minimize future system issues
Continually work to automate administrative tasks while designing for
resiliency and high availability of the Hadoop platform
Manage and troubleshoot scheduled Hadoop jobs with TWS and Oozie (see the
sketch after this list)
Provide off-hours support on a rotational basis for installations,
upgrades, and production issues
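
For the TWS/Oozie line above (Oozie side only), a minimal sketch of
submitting a workflow and polling its status through Oozie's Java client
API. The server URL, application path, and property values are
placeholders, not details from the posting.

import java.util.Properties;
import org.apache.oozie.client.OozieClient;
import org.apache.oozie.client.WorkflowJob;

public class OozieSubmitExample {
    public static void main(String[] args) throws Exception {
        // URL of the Oozie server is a placeholder.
        OozieClient oozie = new OozieClient("http://oozie.example.com:11000/oozie");

        Properties conf = oozie.createConfiguration();
        conf.setProperty(OozieClient.APP_PATH, "hdfs://nn.example.com:8020/apps/etl/workflow.xml");
        conf.setProperty("nameNode", "hdfs://nn.example.com:8020");

        // Submit and start the workflow, then poll until it leaves RUNNING.
        String jobId = oozie.run(conf);
        while (oozie.getJobInfo(jobId).getStatus() == WorkflowJob.Status.RUNNING) {
            Thread.sleep(10_000);
        }
        System.out.println("Workflow " + jobId + " finished: "
                + oozie.getJobInfo(jobId).getStatus());
    }
}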

*Additional Experience and Skills (a plus)*
Languages: R, Scala, and Ruby
Additional Hadoop-related software: Accumulo, Sqrrl, Atlas, NiFi, and
Cloudbreak
Monitoring software: Nagios XI, Nagios Core
Docker or Kubernetes
HDP Certified Administrator
HDP Certified Developer
Experience with automation with Chef for application deployment
Experience deploying HDP via blueprints
Experience with Solr and Elasticsearch administration and tuning
Tuning Hadoop applications using the ORC storage format (see the sketch
after this list)
Experience with automation in Chef, Puppet, Ansible, and Jenkins
SAML SSO, PAM
Confluent, Apache Flink
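
For the ORC tuning line above, one common approach is writing compressed,
partitioned ORC from Spark; the posting does not name a tool, so this Java
sketch is illustrative only, and the paths and the event_date column are
invented.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class OrcTuningExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("orc-tuning-sketch")
                .getOrCreate();

        // Source and target paths are placeholders for this sketch.
        Dataset<Row> events = spark.read().json("hdfs:///staging/events.json");

        // Columnar ORC with zlib compression; partitioning by date lets
        // readers prune whole directories, and ORC's indexes enable
        // predicate pushdown when the data is read back.
        events.write()
                .format("orc")
                .option("compression", "zlib")
                .partitionBy("event_date")
                .save("hdfs:///warehouse/events_orc");

        spark.stop();
    }
}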


*Mazhar Khan*
*Tel: 703-962-7227 x427 || Fax: 703-439-2550*
*Email: maz...@gsquire.com*
*Gsquire Business Solutions Inc || www.gsquire.com
|| 4229 Lafayette Center Dr, Suite #1625, Chantilly, VA 20151 || Women Owned
Small Business / MBE / SWAM Certified*



Remote position ==== Looking for Hadoop Architect ==== Must be local to GA.

2017-03-16 Thread Mohammed Mazhar Khan
*Job Title: Hadoop Architect - Remote Work*
*Duration: 3+ Years Contract*
*Location: Atlanta, Georgia, USA, 30308*
*Manager prefers local candidates, but would consider remote.*

*Overall Purpose:* Responsible for the development of distributed computing
tasks, including MapReduce, NoSQL, and other distributed-environment
technologies, chosen based on need or preference to solve the problems that
arise. Using programming languages and technology, writes code, completes
programming and documentation, and performs testing and debugging of
applications. Analyzes, designs, programs, debugs, and modifies software
enhancements and/or new products used in local, networked, or
Internet-related computer programs. May interact with users to define
system requirements and/or necessary modifications.
Roles & Responsibilities:
1) 8-10 years’ experience developing software applications, including
analysis, design, coding, testing, deployment, and support
2) Proficiency in application/software architecture (definition, business
process modeling, etc.)
3) Project management experience
4) Experience building Big Data solutions using Hadoop technology
5) Extensive experience with software development and the complete software
lifecycle (analysis, design, implementation, testing, quality assurance)
6) Ability to work with non-technical resources on the team to translate
data needs into Big Data solutions using the appropriate tools
7) Extensive experience developing complex MapReduce programs against
structured and unstructured data
8) Experience loading data into Hive and writing software that accesses
Hive data

*Additional Experience and Skills (a plus)*
Languages: R, Scala, and Ruby
Additional Hadoop-related software: Accumulo, Sqrrl, Atlas, NiFi, and
Cloudbreak
Monitoring software: Nagios XI, Nagios Core
Docker or Kubernetes
HDP Certified Administrator
HDP Certified Developer
Experience with automation with Chef for application deployment
Experience deploying HDP via blueprints
Experience with Solr and Elasticsearch administration and tuning
Tuning Hadoop applications using the ORC storage format
Experience with automation in Chef, Puppet, Ansible, and Jenkins
SAML SSO, PAM
Confluent, Apache Flink
*MUST HAVE*
Minimum of 3 years’ experience with Red Hat Enterprise Linux 6/7 and UNIX
administration supporting large-scale environments
Minimum of 3 years’ experience in administrative support of Hortonworks
HDP, Cloudera, or MapR
Minimum of 3 years’ experience in administrative support of clusters
secured with Kerberos
Minimum of 3 years’ experience in administrative support of Apache Kafka,
Spark, and streaming applications (see the sketch after this list)
Minimum of 1 year’s experience hardening clusters with Knox and Ranger
Minimum of 1 year’s experience with LDAP integration with HDP and other tools
Minimum of 1 year’s experience with automation
Minimum of 1 year’s experience with Solr or Elasticsearch administration
and tuning
Minimum of 1 year’s experience supporting HBase
Experience with issues that affect a distributed system
Experience supporting MySQL and NoSQL stores such as MongoDB
Experience programming in Java, bash/ksh scripting, and Python
Strong written communication and documentation experience
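
As an illustration of the Kafka line above, a minimal Java producer
sketch. The broker address, topic, key, and payload are invented
placeholders, not details from the posting.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker list is a placeholder for this sketch.
        props.put("bootstrap.servers", "kafka1.example.com:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("acks", "all"); // wait for the full in-sync replica set to acknowledge

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Send one keyed record to an example topic and flush before closing.
            producer.send(new ProducerRecord<>("events", "user-42", "{\"action\":\"login\"}"));
            producer.flush();
        }
    }
}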

*Responsibilities*
Manage cluster hardening activities through the implementation and
maintenance of security and governance components across various clusters
Manage large-scale Hadoop cluster environments, handling all Hadoop
environment builds, including design, capacity planning, cluster setup,
performance tuning, Hadoop log file analysis, support, maintenance,
backups, and ongoing monitoring
Work closely with the network, infrastructure, Linux admin, and application
development teams to ensure data processing completes within SLA
Identify hardware and software technical problems and storage and/or
related system malfunctions; document all root-cause analysis results to
minimize future system issues
Continually work to automate administrative tasks while designing for
resiliency and high availability of the Hadoop platform
Manage and troubleshoot scheduled Hadoop jobs with TWS and Oozie
Provide off-hours support on a rotational basis for installations,
upgrades, and production issues



*Mazhar Khan*
*Tel: 703-962-7227 x427 || Fax: 703-439-2550*
*Email: maz...@gsquire.com*
*Gsquire Business Solutions Inc || www.gsquire.com
|| 4229 Lafayette Center Dr, Suite #1625, Chantilly, VA 20151 || Women Owned
Small Business / MBE / SWAM Certified*


Looking for Hadoop Architect

2016-03-31 Thread Sudhakar Reddy
Hi,

Hope you are doing great.

This is *Sudhakar* from *HCL Global Systems Inc*. We have a requirement
for a *Hadoop Architect* in *Englewood, CO*.

Please review the job description below, and if you’d like to pursue this,
please include a Word copy of your latest resume along with a daytime phone
number and rate in your response.

You can also reach me at 248-473-0720*191, or drop suitable profiles
at r...@hclglobal.com.





Title: Hadoop Architect

Location: Englewood, CO

Duration: 6+ Months

1. Fluency in programming (Java/Python; Scala would be an advantage);
should be able to review design, code, and the flow of data

2. Fluency in multiple Hadoop tools (minimum: Hive, HBase, Flume, Sqoop,
MapReduce); see the sketch after this list

3. In-depth understanding of Hadoop architecture

4. Strong in performance tuning for data workflows

5. Able to talk to senior management and suggest future plans for the
company's Big Data platform

6. Identify pitfalls in the existing data flow and suggest changes

Minimum work experience: 5+ years
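
For the Hadoop-tools item (2), a minimal sketch of a single put/get round
trip through the HBase Java client. The ZooKeeper quorum, table name, row
key, and column family are invented for the example.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseClientExample {
    public static void main(String[] args) throws Exception {
        // Picks up hbase-site.xml from the classpath; the quorum is a placeholder.
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "zk1.example.com");

        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("users"))) {
            // Write one cell, then read it back.
            Put put = new Put(Bytes.toBytes("user-42"));
            put.addColumn(Bytes.toBytes("info"), Bytes.toBytes("email"),
                    Bytes.toBytes("u42@example.com"));
            table.put(put);

            Result row = table.get(new Get(Bytes.toBytes("user-42")));
            System.out.println(Bytes.toString(
                    row.getValue(Bytes.toBytes("info"), Bytes.toBytes("email"))));
        }
    }
}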



*Thanks and regards,*

*Sudhakar*
Technical Recruiter
HCL Global Systems, Inc
24543 Indoplex Circle,
Suite 220, Farmington, MI 48335

Direct: 248-473-0720*191

Email: r...@hclglobal.com



Looking for Hadoop Architect

2016-02-11 Thread sudhakar reddy
Hi,

Hope you are doing great.

This is *Sudhakar* from *HCL Global Systems Inc*. We have a requirement
for a *Hadoop Architect* in *Denver, CO*.

Please review the job description below, and if you’d like to pursue this,
please include a Word copy of your latest resume along with a daytime phone
number and rate in your response.

You can also reach me at 248-473-0720*191, or drop suitable profiles
at r...@hclglobal.com.



*Position: Hadoop Architect*

*Location: Denver, CO*

*Duration: 6+ Months*



*Job Description:*

*Key skills required for the job are:*



1) 4+ years of hands-on experience with technologies in the Hadoop
ecosystem such as Hadoop, HDFS, Spark, MapReduce, Pig, Hive, Flume, Sqoop,
Cloudera Impala, Oozie, and Kafka (see the sketch after this list)
2) 5+ years of hands-on large-scale software development/integration
experience using Java/J2EE technologies
3) Proven experience driving technology and architectural execution for
enterprise-grade solutions based on Big Data analytics platforms
4) Experience leading and mentoring a Big Data team
5) Research, evaluate, architect, and deploy new tools, frameworks, and
patterns to build sustainable Big Data platforms
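
As an illustration of the MapReduce experience named in item 1, the
canonical word-count job in Java; this is a generic textbook example, not
code from the posting, and the input and output paths come from the
command line.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {
    // Map phase: emit (word, 1) for every token in the input line.
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context ctx)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    ctx.write(word, ONE);
                }
            }
        }
    }

    // Reduce phase: sum the counts for each word.
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            ctx.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // combiner cuts shuffle volume
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}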



*Thanks and regards,*

*Sudhakar*
Technical Recruiter
HCL Global Systems, Inc
24543 Indoplex Circle,
Suite 220, Farmington, MI 48335

Direct: 248-473-0720*191

Email: r...@hclglobal.com



Looking for Hadoop Architect at El Segundo, CA.

2015-12-02 Thread Mohammed Mazhar Khan
Hi,

This is Mazhar Khan from Systel, one of America's premier staffing
organizations. This mail is in regard to a career opportunity with one of
our clients; we are currently looking for tech-savvy professionals for a
*Hadoop Architect* role in *El Segundo, CA*.





*Position: Hadoop Architect*

*Location: El Segundo, CA*

*Duration: 6-12 Months.*

*Max Rate: $70/hr on C2C, all inclusive.*



*Description:*

· Extensive experience in development, including a low-level
understanding of Big Data technologies

· Hadoop MR (MR1 and YARN), HDFS, Sqoop, Hive, Pig, Flume, Oozie,
Impala, DistCp, Solr (see the HDFS sketch after this list)

· High-level understanding of, or working knowledge in, Spark, Storm,
Kafka, and other evolving technologies

· Programming language experience in Java, Python, Scala, and shell
scripting; RDBMS experience is a plus

· Experience leading a team or demonstrated leadership skills; motivated,
proactive, and able to work with multiple teams on a daily basis

· Troubleshooting production issues and working with the team to
design and develop solutions

· 10+ years of experience, with a minimum of 2 years of design/architecture
experience in Big Data

· Excellent verbal and written communication skills

· Excellent analytical and problem-solving skills
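
For the HDFS entry in the list above, a minimal sketch of basic HDFS
operations through Hadoop's Java FileSystem API; the paths are invented
placeholders.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsBasicsExample {
    public static void main(String[] args) throws Exception {
        // Reads fs.defaultFS from core-site.xml on the classpath.
        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(conf)) {
            // Create a directory, stage a local file into it, then list it.
            Path dir = new Path("/data/incoming"); // placeholder path
            fs.mkdirs(dir);
            fs.copyFromLocalFile(new Path("/tmp/events.csv"), new Path(dir, "events.csv"));
            for (FileStatus st : fs.listStatus(dir)) {
                System.out.println(st.getPath() + "\t" + st.getLen() + " bytes");
            }
        }
    }
}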





*About Systel:*

Systel is a professional services organization providing IT staff
augmentation, IT product development, and business process outsourcing
services. Through our 18 years of existence we have delivered to the
expectations of the largest system integrators and Fortune 1000 companies.

IT staffing services have been Systel’s core competency, and over the past
few years Systel has added and groomed its capabilities for IT application
development & maintenance, testing, and BPO services. Systel is currently
investing in product development for the Health Care and E-Governance
domains, and it has also recently launched a major initiative on the social
media platform. Some of our key achievements are:

· Awarded Regional Supplier of the Year by NMSDC

· Certified MBE from the Georgia Minority Supplier Development Council

· Prime vendor status for 3 Fortune 500 companies

· Excellence in E-Governance Award



*SYSTEL IS AN EQUAL OPPORTUNITY EMPLOYER*





*Warm Regards,*

*Mazhar Khan*

*Resourcing Specialist*

*SYSTEL* | Atlanta, GA

*A Certified (MBE) Minority Business Enterprise*

*Direct:* +1-678-203-2434

*mazh...@systelcomputers.com*

Gmail: kmazhar...@gmail.com

Yahoo: mazhar...@yahoo.com
