job summary:
Description: AWS Data Engineer IV

Core Technical Skills
- 5+ years of AWS experience
- AWS services - S3, EMR, Glue Jobs, Lambda, Athena, CloudTrail, SNS, SQS, CloudWatch, Step Functions, QuickSight
- Experience with Kafka/messaging, preferably Confluent Kafka
- Experience with EMR databases such as Glue Catalog, Lake Formation, Redshift, DynamoDB and Aurora
- Experience with AWS data warehousing tools such as Amazon Redshift and Amazon Athena
- Proven track record in the design and implementation of data warehouse solutions using AWS
- Skilled in data modeling and executing ETL processes tailored for data warehousing
- Competence in developing and refining data pipelines within AWS
- Proficient in handling both real-time and batch data processing tasks
- Extensive understanding of database management fundamentals
- Expertise in creating alerts and automated solutions for handling production problems
- Tools and languages - Python, Spark, PySpark and Pandas
- Infrastructure as Code technology - Terraform/CloudFormation
- Experience with a secrets management platform such as Vault or AWS Secrets Manager
- Experience with event-driven architecture
- DevOps pipeline (CI/CD); Bitbucket; Concourse
- Experience with RDBMS platforms and good proficiency with SQL
- Experience with REST APIs and API Gateway
- Deep knowledge of IAM roles and policies
- Experience using AWS monitoring services such as CloudWatch, CloudTrail and CloudWatch Events
- Deep understanding of networking: DNS, TCP/IP and VPN
- Experience with an AWS workflow orchestration tool such as Airflow or Step Functions

AWS Data Engineer IV Additional Technical Skills (nice to have, but not required for the role)
- Experience with native AWS technologies for data and analytics such as Kinesis and OpenSearch
- Databases - DocumentDB, MongoDB
- Hadoop platform (Hive; HBase; Druid)
- Java, Scala, Node.js
- Workflow automation
- Experience transitioning on-premises big data platforms to cloud-based platforms such as AWS
- Excellent background in Kubernetes, distributed systems, microservice architecture and containers

Core Experience and Abilities
- Ability to perform hands-on development and peer review for certain components/tech stack on the product
- Standing up development instances and migration paths (with required security, access/roles)
- Develops components and related processes (e.g. data pipelines and associated ETL processes, workflows)
- Leads implementation of an integrated data quality framework
- Ensures optimal framework design and load testing scope to optimize performance (specifically for big data)
- Supports data scientists with testing and validation of models
- Performs impact analysis and identifies risks of design changes
- Ability to build new data pipelines, identify existing data gaps and provide automated solutions to deliver analytical capabilities and enriched data to applications
- Ability to implement data pipelines with the right attentiveness to durability and data quality
- Implements data warehousing products with the end user's experience in mind (ease of use with the right performance)
- Ensures test-driven development
- 5+ years of experience leading teams to deliver complex products
- Good technical and communication skills
- Good skills with business stakeholder interactions
- Good solutioning and architecture skills
- 5+ years of experience building real-time data ingestion streams (event-driven)
- Ensures data security and permissions solutions, including data encryption, user access controls and logging

location: Charlotte, North Carolina
job type: Contract
salary: $74.45 - 84.45 per hour
work hours: 8am to 5pm
education: Bachelors

responsibilities:
- Provides technical direction, guides the team on key technical aspects and is responsible for product tech delivery
- Leads the design, build, test and deployment of components, where applicable in collaboration with lead developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead)
- Understands requirements/use cases to outline technical scope and lead delivery of the technical solution
- Confirms required developers and skillsets specific to the product
- Provides leadership, direction, peer review and accountability to developers on the product (key responsibility)
- Works closely with the Product Owner to align on delivery goals and timing
- Assists the Product Owner with prioritizing and managing the team backlog
- Collaborates with data and solution architects on key technical decisions, including the architecture and design to deliver the requirements and functionality
- Skilled in developing data pipelines, focusing on long-term reliability and maintaining high data quality
- Designs data warehousing solutions with the end user in mind, ensuring ease of use without compromising on performance
- Manages and resolves issues in production data warehouse environments on AWS

qualifications:
- Experience level: Experienced
- Minimum 5 years of experience
- Education: Bachelors

skills: Data Warehouse

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.

At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact HRsupport@randstadusa.com.

Pay offered to a successful candidate will be based on several factors including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package, including health, an incentive and recognition program, and 401K contribution (all benefits are based on eligibility).

This posting is open for thirty (30) days.