job summary: Designing and building ETL pipelines using Sqoop, Hive, MapReduce, and Spark in on-prem and cloud environments. Functional programming in Python and Scala for complex data transformations and in-memory computations. Using Erwin for logical/physical and dimensional data modeling. Designing and developing UNIX/Linux scripts for handling complex file formats and structures. Orchestrating workflows and jobs with Airflow.
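As a rough illustration of the "functional programming in Python for data transformations" work described above, here is a minimal, dependency-free sketch. The records, field names, and filtering rule are hypothetical; a real pipeline would express the same filter-then-fold pattern over Spark RDDs or DataFrames.

```python
from functools import reduce

# Hypothetical sample records, mimicking rows landed by an ETL job
records = [
    {"user": "a", "amount": 120.0, "status": "ok"},
    {"user": "b", "amount": -5.0, "status": "ok"},
    {"user": "a", "amount": 30.0, "status": "error"},
    {"user": "c", "amount": 50.0, "status": "ok"},
]

def valid(rec):
    """Keep only well-formed, successful records (illustrative rule)."""
    return rec["status"] == "ok" and rec["amount"] > 0

def total_by_user(recs):
    """Fold the cleaned records into a per-user total, purely functionally."""
    return reduce(
        lambda acc, r: {**acc, r["user"]: acc.get(r["user"], 0.0) + r["amount"]},
        filter(valid, recs),
        {},
    )

print(total_by_user(records))  # {'a': 120.0, 'c': 50.0}
```

The same filter/reduce shape translates directly to `rdd.filter(valid).reduceByKey(...)` in PySpark, which is why functional style is called out for in-memory computations.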
job summary: Strong in real-time and batch pipelines built on big data technologies (e.g., Spark, Kafka, Cassandra, Hadoop, Hive, Elasticsearch). Proficient in RESTful services, Java, Scala, Spring Boot/Play Framework, RDBMS, NoSQL, and Python. Proficient in developing scalable cloud-native microservices and in designing and building APIs. location: Bentonville, Arkansas job type: Contract salary: $70 - 80 per hour work hours: 8am to 4pm education:
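To illustrate the kind of real-time aggregation this role involves, here is a minimal, dependency-free sketch of a tumbling-window count over an event stream. The events and key names are hypothetical; in production this would be a Kafka topic consumed by a Spark Structured Streaming job using `groupBy(window(...))`, here reduced to plain Python with minute-wide windows.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical event stream: (timestamp, key, value), e.g. decoded Kafka messages
events = [
    (datetime(2024, 1, 1, 12, 0, 5), "clicks", 1),
    (datetime(2024, 1, 1, 12, 0, 40), "clicks", 2),
    (datetime(2024, 1, 1, 12, 1, 10), "clicks", 1),
]

def tumbling_window_counts(stream):
    """Bucket events into fixed, non-overlapping one-minute windows
    and sum the values per (window, key)."""
    buckets = defaultdict(int)
    for ts, key, value in stream:
        # Align the timestamp down to its minute boundary (window start)
        window_start = ts - timedelta(seconds=ts.second,
                                      microseconds=ts.microsecond)
        buckets[(window_start, key)] += value
    return dict(buckets)

counts = tumbling_window_counts(events)
# Two windows: 12:00 holds 1+2=3 clicks, 12:01 holds 1 click
```

Tumbling (non-overlapping) windows are the simplest streaming aggregation; sliding or session windows generalize the same bucketing idea.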