job summary: This is a 100% onsite position in Charlotte, NC 28273. Job Description: - Ensure the writing aligns with industry standards and company goals. - Gather technical information by collaborating with subject matter experts (SMEs), developers, and product teams. - Organize information logically for easy understanding. - Create outlines, templates, and structures to streamline content delivery. - Coordinate reviews and approvals with stakeholders. - Revise doc
job summary: Resource will be responsible for end-to-end development covering Financial Attribution, SCD, Booking and Referring Agreements, Data Aggregations, and System of Record Onboarding. This will be a large data modernization effort for a custom PySpark metrics engine requiring PySpark, Python, Dremio, ETL, and financial experience. location: Charlotte, North Carolina job type: Contract salary: $54 - 58 per hour work hours: 8am to 4pm ed
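As an illustration of the SCD work this role describes, here is a minimal PySpark sketch of a Type 2 slowly-changing-dimension merge. The table, key, and column names (dim, account_id, segment) are hypothetical placeholders, not details from the posting, and inserts for brand-new keys are omitted for brevity.

    # Sketch only: hypothetical dimension with a tracked "segment" attribute.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

    # Current dimension table; is_current marks the active version of each key.
    dim = spark.createDataFrame(
        [(1, "retail", "2024-01-01", None, True)],
        "account_id INT, segment STRING, effective_date STRING, end_date STRING, is_current BOOLEAN",
    )

    # Incoming snapshot from the system of record being onboarded.
    incoming = spark.createDataFrame(
        [(1, "commercial", "2025-06-01")],
        "account_id INT, segment STRING, effective_date STRING",
    )

    current = dim.filter("is_current")

    # Keys whose tracked attribute changed in the new snapshot.
    changed = (
        current.alias("d")
        .join(incoming.alias("i"), "account_id")
        .filter(F.col("d.segment") != F.col("i.segment"))
    )

    # Close out the old version of each changed key...
    closed = changed.select(
        "account_id",
        F.col("d.segment").alias("segment"),
        F.col("d.effective_date").alias("effective_date"),
        F.col("i.effective_date").alias("end_date"),
        F.lit(False).alias("is_current"),
    )

    # ...and open a new current version carrying the updated attribute.
    opened = changed.select(
        "account_id",
        F.col("i.segment").alias("segment"),
        F.col("i.effective_date").alias("effective_date"),
        F.lit(None).cast("string").alias("end_date"),
        F.lit(True).alias("is_current"),
    )

    # Keep history untouched: drop only the current rows of changed keys.
    flagged = dim.join(
        changed.select("account_id").withColumn("_chg", F.lit(True)),
        "account_id",
        "left",
    )
    retained = flagged.filter(F.col("_chg").isNull() | ~F.col("is_current")).drop("_chg")

    result = retained.unionByName(closed).unionByName(opened)
    result.show()

The Type 2 pattern (close the old row, open a new one) preserves full history, which is typically what a financial-attribution metrics engine needs; the posting does not say which SCD type is in use.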
job summary: - Proven experience designing and implementing Apache Airflow workflows. - Strong programming skills in Python, with knowledge of SQL for data manipulation. - Design and implement scalable Apache Airflow workflows tailored for model validation tasks, including data preprocessing, model scoring, validation checks, and result reporting. - Develop DAGs (Directed Acyclic Graphs) to automate model validation workflows across multiple environments (a sketch follows below)
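As an illustration of the DAG development described above, here is a minimal Airflow sketch of a model validation pipeline (preprocess -> score -> validate -> report), assuming Airflow 2.4+ for the "schedule" argument. The dag_id and task callables are hypothetical stubs, not details from the posting.

    # Sketch only: each task is a stub standing in for real validation logic.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def preprocess(**context):
        # Pull and clean the validation dataset (stubbed).
        print("preprocessing input data")


    def score(**context):
        # Run the model against the prepared data (stubbed).
        print("scoring model")


    def validate(**context):
        # Apply validation checks to the scores (stubbed).
        print("running validation checks")


    def report(**context):
        # Publish validation results (stubbed).
        print("reporting results")


    with DAG(
        dag_id="model_validation_sketch",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t1 = PythonOperator(task_id="preprocess", python_callable=preprocess)
        t2 = PythonOperator(task_id="score", python_callable=score)
        t3 = PythonOperator(task_id="validate", python_callable=validate)
        t4 = PythonOperator(task_id="report", python_callable=report)

        # The DAG encodes the order: preprocess -> score -> validate -> report.
        t1 >> t2 >> t3 >> t4

Promoting the same DAG across environments is usually handled by parameterizing connections and datasets via Airflow Variables or environment-specific configuration rather than by editing the DAG itself.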