xelys jobs

Senior Data Engineer

8:00 AM

Full remote · Senior · Permanent · Backend · Today via WTTJ


Tags

Python, SQL, ETL, Data Warehousing, Kafka, AWS, Terraform, CI/CD, Airflow, PySpark

About the role

Role Overview

Join 8:00 AM as a Senior Data Engineer to design and scale the company’s data platform. You’ll own key parts of the data architecture, build reliable data pipelines, and collaborate with product, engineering, and analytics teams.

Key Missions

  • Lead design discussions around data architecture, data modeling strategy, and platform evolution.
  • Improve data reliability, observability, and data quality; establish data modeling best practices.
  • Develop and optimize ETL processes and integrate data sources across a broad product portfolio.
  • Help set engineering standards and mentor teammates.
  • Keep data engineering initiatives moving by coordinating work across the team.

Responsibilities

  • Build and maintain large-scale data pipelines and data platform components.
  • Work with multiple data stores, including relational and NoSQL systems.
  • Integrate data from third-party APIs.
  • Contribute to CI/CD and infrastructure automation practices.

Requirements

  • 7+ years of data engineering experience.
  • Advanced Python and SQL.
  • Experience with data warehouses and large-scale analytics systems (e.g., Redshift, Snowflake, BigQuery).
  • Experience building/maintaining relational and NoSQL databases (document, key-value, and other types).
  • Experience with event/streaming systems (e.g., Kafka).
  • Experience with business process automation.
  • Experience working with third-party APIs for data integration.
  • CI/CD experience.
  • Experience with Infrastructure as Code (Terraform / CloudFormation).
  • Cloud infrastructure experience (AWS preferred).
  • Experience with workflow orchestration tools (e.g., Airflow, Prefect, Dagster).
  • Strong experience designing and building data pipelines and platforms.
  • Bachelor’s degree or higher in a technical field (or equivalent education/experience).

Nice to Have

  • PySpark and Spark experience.
  • Redshift experience.

Scraped 5/12/2026
