Hadoop Administrator (24-Month Contract) - ERP Implementation
We're seeking an experienced Hadoop Administrator to join our dynamic team on a 24-month contract. In this role, you'll play a crucial part in our enterprise resource planning (ERP) initiatives by managing our Cloudera environment and ensuring optimal performance of our data infrastructure.
The Ideal Candidate
We're looking for a self-sufficient professional who takes initiative and approaches challenges proactively. You'll thrive in our environment if you're comfortable balancing day-to-day operations (including on-call responsibilities) while contributing to strategic project work. Your experience with Cloudera upgrades will be essential as we enhance and optimize our data ecosystem.
This opportunity offers the chance to make a significant impact on our organization's data capabilities while working with cutting-edge technologies in a collaborative environment.
location: San Antonio, Texas
job type: Contract
salary: $40 - $55 per hour
work hours: 8am to 5pm
education: Bachelor's
responsibilities:
- Provides support and maintenance for the Hadoop platform and its ecosystem components, including HDFS, YARN, Hive, LLAP, Druid, Impala, Spark, Kafka, HBase, Cloudera Workbench, etc.
- Accountable for storage, performance tuning and volume management of Hadoop clusters and MapReduce routines
- Deploys Hadoop clusters, adds and removes nodes, tracks jobs, monitors critical parts of the cluster, configures NameNode high availability, and schedules and performs backups.
- Installs and configures software, installs patches, and upgrades software as needed.
- Capacity planning and implementation of new/upgraded hardware and software releases for storage infrastructure.
- Handles cluster design, capacity planning, cluster setup, performance tuning, monitoring, structural planning, scaling, and administration.
- Communicates with development, administration, and business teams, including infrastructure, application, network, database, and business intelligence teams.
- Responsible for Data Lake and Data Warehousing design and development.
- Collaboration with various technical/non-technical resources such as infrastructure and application teams regarding project work, POCs (Proofs of Concept) and/or troubleshooting exercises.
- Configures Hadoop security, specifically implementing Kerberos integration.
- Creates and maintains job and task schedules and administers jobs.
- Responsible for data movement in and out of Hadoop clusters and data ingestion using Sqoop and/or Flume
- Review Hadoop environments and determine compliance with industry best practices and regulatory requirements.
- Data modeling, designing and implementation of data based on recognized standards.
- Serves as the key contact for vendor escalations.
- Participates in an on-call rotation to support a 24/7 environment and is expected to work outside business hours when needed to support corporate needs.
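To illustrate the NameNode high-availability responsibility above, a minimal hdfs-site.xml sketch is shown below. The nameservice ID "mycluster" and the hostnames are hypothetical placeholders, not values from this posting; in a Cloudera environment these properties are typically managed through Cloudera Manager rather than edited by hand.

```xml
<!-- Minimal sketch of NameNode HA settings in hdfs-site.xml.
     The nameservice ID and hostnames below are illustrative only. -->
<configuration>
  <!-- Logical name for the HA nameservice -->
  <property>
    <name>dfs.nameservices</name>
    <value>mycluster</value>
  </property>
  <!-- The two NameNodes participating in HA -->
  <property>
    <name>dfs.ha.namenodes.mycluster</name>
    <value>nn1,nn2</value>
  </property>
  <!-- RPC addresses for each NameNode -->
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn1</name>
    <value>namenode1.example.com:8020</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn2</name>
    <value>namenode2.example.com:8020</value>
  </property>
  <!-- Enable automatic failover via ZooKeeper failover controllers -->
  <property>
    <name>dfs.ha.automatic-failover.enabled</name>
    <value>true</value>
  </property>
</configuration>
```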
qualifications:
- Bachelor's degree in Information Systems, Engineering, Computer Science, or related field from an accredited university.
- Intermediate experience in a Hadoop production environment.
- Must have intermediate experience and expert knowledge with at least 4 of the following:
- Hands on experience with Hadoop administration in Linux and virtual environments.
- Well-versed in installing and managing Hadoop distributions (Cloudera).
- Expert knowledge and hands-on experience with Hadoop ecosystem components, including HDFS, YARN, Hive, LLAP, Druid, Impala, Spark, Kafka, HBase, Cloudera Workbench, etc.
- Thorough knowledge of Hadoop overall architecture.
- Experience using and troubleshooting Open Source technologies including configuration management and deployment.
- Data Lake and Data Warehousing design and development.
- Experience reviewing existing database and Hadoop infrastructure and determining areas for improvement.
- Implementing software lifecycle methodology to ensure supported release and roadmap adherence.
- Configuring high availability of name-nodes.
- Scheduling and taking backups for the Hadoop ecosystem.
- Data movement in and out of Hadoop clusters.
- Good hands-on scripting experience in a Linux environment.
skills:
- Experience in project management concepts, tools (MS Project) and techniques.
- A record of working effectively with application and infrastructure teams.
- Strong ability to organize information, manage tasks and use available tools to effectively contribute to a team and the organization.
Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.
At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact HRsupport@randstadusa.com.
Pay offered to a successful candidate will be based on several factors including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package, including health, an incentive and recognition program, and 401K contribution (all benefits are based on eligibility).
This posting is open for thirty (30) days.
It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.