Description:
- AWS Data Engineer IV Core Technical Skills
- 5+ years of AWS experience
- AWS services - S3, EMR, Glue Jobs, Lambda, Athena, CloudTrail, SNS, SQS, CloudWatch, Step Functions, QuickSight
- Experience with Kafka/messaging, preferably Confluent Kafka
- Experience with AWS data stores and catalog services such as Glue Data Catalog, Lake Formation, Redshift, DynamoDB, and Aurora
- Experience with AWS data warehousing tools such as Amazon Redshift and Amazon Athena
- Proven track record in the design and implementation of data warehouse solutions using AWS
- Skilled in data modeling and executing ETL processes tailored for data warehousing
- Competence in developing and refining data pipelines within AWS
- Proficient in handling both real-time and batch data processing tasks
- Extensive understanding of database management fundamentals
- Expertise in creating alerts and automated solutions for handling production problems
- Tools and Languages - Python, Spark, PySpark and Pandas
- Infrastructure as Code technology - Terraform/CloudFormation
- Experience with secrets management platforms such as Vault and AWS Secrets Manager
- Experience with Event Driven Architecture
- DevOps pipeline (CI/CD); Bitbucket; Concourse
- Experience with RDBMS platforms and good proficiency with SQL
- Experience with REST APIs and API Gateway
- Deep knowledge of IAM roles and Policies
- Experience using AWS monitoring services such as CloudWatch, CloudTrail, and CloudWatch Events
- Deep understanding of networking: DNS, TCP/IP, and VPN
- Experience with AWS workflow orchestration tools such as Airflow or Step Functions
- Experience with native AWS technologies for data and analytics such as Kinesis and OpenSearch
- Databases - DocumentDB, MongoDB
- Hadoop platform (Hive; HBase; Druid)
- Java, Scala, Node.js
- Workflow Automation
- Experience transitioning on-premises big data platforms to cloud-based platforms such as AWS
- Excellent background in Kubernetes, Distributed Systems, Microservice architecture and containers
- Ability to perform hands on development and peer review for certain components / tech stack on the product
- Standing up development instances and migration paths (with required security and access/roles)
- Develop components and related processes (e.g. data pipelines and associated ETL processes, workflows)
- Lead implementation of integrated data quality framework
- Ensures optimal framework design and load-testing scope to meet performance targets (specifically for Big Data)
- Supports data scientists with testing and validation of models
- Performs impact analysis and identifies risk to design changes
- Ability to build new data pipelines, identify existing data gaps and provide automated solutions to deliver analytical capabilities and enriched data to applications
- Ability to implement data pipelines with the right attentiveness to durability and data quality
- Implements data warehousing products with the end user's experience in mind (ease of use with the right performance)
- Ensures test-driven development
- 5+ years of experience leading teams to deliver complex products
- Good technical skills and communication skills
- Good skills with business stakeholder interactions
- Good solutioning and architecture skills
- 5+ years of experience building real-time, event-driven data ingestion streams
- Ensure data security and permissions solutions, including data encryption, user access controls and logging
location: Charlotte, North Carolina
job type: Contract
salary: $74.45 - $84.45 per hour
work hours: 8am to 5pm
education: Bachelors
responsibilities:
- Provides technical direction, guides the team on key technical aspects, and is responsible for product tech delivery
- Leads the design, build, test, and deployment of components
- Collaborates, where applicable, with lead developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead)
- Understand requirements / use case to outline technical scope and lead delivery of technical solution
- Confirm required developers and skillsets specific to product
- Provides leadership, direction, peer review and accountability to developers on the product (key responsibility)
- Works closely with the Product Owner to align on delivery goals and timing
- Assists Product Owner with prioritizing and managing team backlog
- Collaborates with Data and Solution architects on key technical decisions
- Owns the architecture and design needed to deliver the required functionality
- Skilled in developing data pipelines, focusing on long-term reliability and maintaining high data quality
- Designs data warehousing solutions with the end-user in mind, ensuring ease of use without compromising on performance
- Manage and resolve issues in production data warehouse environments on AWS
qualifications:
- Experience level: Experienced
- Minimum 5 years of experience
- Education: Bachelors
skills:
Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.
At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact HRsupport@randstadusa.com.
Pay offered to a successful candidate will be based on several factors including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package, including health, an incentive and recognition program, and 401K contribution (all benefits are based on eligibility).
This posting is open for thirty (30) days.