job summary:
Seeking a Subject Matter Expert to help develop an interconnected network of data capabilities and data products designed to deliver data efficiently and at scale. Candidates should have expertise in designing and building data platforms, with demonstrated experience overcoming obstacles and avoiding common pitfalls. They should also be skilled at optimizing and automating deliverables to production using the required tech stack, and should be adaptable to changing demands and priorities in an Agile development environment. We are specifically looking for individuals with 5+ years of experience in Data Engineering and/or Software Engineering roles who can provide knowledge and support to our existing engineers.

Must have experience with similar platform engineering/management solutions:
- Building/optimizing a data lakehouse with open table formats
- Kubernetes deployments/cluster administration
- Transitioning on-premise big data platforms to scalable cloud-based platforms such as AWS
- Distributed systems, microservice architecture, and containers
- Cloud streaming use cases in big data ecosystems (e.g., EMR, EKS, Hadoop, Spark, Hudi, Kafka/Kinesis)

location: Charlotte, North Carolina
job type: Contract
salary: $74.86 - $84.86 per hour
work hours: 8am to 5pm
education: Bachelors

responsibilities:
- Provides technical direction, engages the team in discussion on how best to guide/build features on key technical aspects, and is responsible for product tech delivery
- Works closely with the Product Owner and team to align on delivery goals and timing
- Collaborates with architects on key technical decisions for data and the overall solution
- Leads design and implementation of data quality check methods
- Ensures data security and permissions solutions, including data encryption, user access controls, and logging
- Able to think unconventionally to find the best way to solve a defined use case with fuzzy requirements
- Self-starter mentality: willing to do their own research to solve problems, and able to clearly present findings and engage in conversation on what makes one solution better than another
- Thrives in a fail-fast environment involving mini PoCs, and participates in an inspect-and-adapt process
- Questioning and improvement mindset: must be ready to ask why something is currently done the way it is and to suggest alternative solutions
- Customer-facing skills: interfaces with stakeholders and other product teams via pairing, troubleshooting support, and debugging issues they encounter with our products

qualifications:
Must have experience with the below tech stack:
- GitHub and GitHub Actions
- AWS
  o IAM
  o API Gateway
  o Lambda
  o Step Functions
  o Lake Formation
  o EKS & Kubernetes
  o Glue: Catalog, ETL, Crawler
  o Athena
  o S3 (strong foundational concepts such as object data store vs. block data store, encryption/decryption, storage tiers, etc.)
- Apache Hudi
- Apache Flink
- PostgreSQL and SQL
- RDS (Relational Database Service)
- Python
- Java
- Terraform Enterprise
  o Must be able to explain what Terraform is used for
  o Understand and explain basic principles (e.g., modules, providers, functions)
  o Must be able to write and debug Terraform

Helpful tech stack experience would include:
- Helm
- Kafka and Kafka Schema Registry
- AWS services: CloudTrail, SNS, SQS, CloudWatch, Step Functions, Aurora, EMR, Redshift, Iceberg
- Secrets management platforms: Vault, AWS Secrets Manager

skills: Same must-have and helpful tech stack as listed under qualifications above.

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.

At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact HRsupport@randstadusa.com.

Pay offered to a successful candidate will be based on several factors, including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package, including health, an incentive and recognition program, and 401K contribution (all benefits are based on eligibility).

This posting is open for thirty (30) days.