Machine Learning Engineer
Twilio
Full-remote · Mid-level · Backend · Data — United States · Yesterday via LinkedIn
Tags
Machine Learning, Python, SQL, MLOps, ETL/ELT, Airflow, Dagster, Snowflake, BigQuery, Kubernetes
About the role
Join Twilio’s rapidly growing AI & Data Platform team as a Machine Learning Engineer (L3) for the Trust Intelligence Platform. You will design, build, and operate cloud-native data and ML infrastructure that turns raw events into real-time intelligence for Twilio product teams and customers.
Responsibilities
- Architect, implement, and maintain scalable data pipelines and feature stores for batch and real-time workloads.
- Build reproducible ML training, evaluation, and inference workflows using modern orchestration and MLOps tools.
- Integrate event streams from Twilio products (e.g., Messaging, Voice, Segment) into unified, analytics-ready datasets.
- Monitor and improve data quality, model performance, latency, and cost.
- Partner with product, data science, and security teams to ship resilient, compliant services.
- Automate deployment using CI/CD, infrastructure-as-code, and container orchestration best practices.
- Produce documentation, dashboards, and runbooks; share knowledge via code reviews and brown-bag sessions.
- Take ownership of problems and drive them to completion in line with Twilio’s “We are Builders” values.
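To make the pipeline responsibilities above concrete, here is a minimal batch ETL sketch in Python. The event fields (`account_sid`, `event_type`, `ts`) and the `Counter`-based aggregation are illustrative assumptions for this posting, not Twilio's actual schema or tooling; a production version would run under an orchestrator such as Airflow or Dagster and load into a warehouse rather than returning rows.

```python
# Sketch of a batch ETL step: extract well-formed events, transform
# them into analytics-ready aggregates, and "load" the result.
# All field names here are hypothetical examples.
from collections import Counter

def extract(raw_events):
    """Keep only events that carry the expected keys."""
    required = {"account_sid", "event_type", "ts"}
    return [e for e in raw_events if required <= e.keys()]

def transform(events):
    """Count events per (account, event_type) pair."""
    return Counter((e["account_sid"], e["event_type"]) for e in events)

def load(aggregates):
    """Stand-in for a warehouse write: emit sorted, row-shaped dicts."""
    return sorted(
        ({"account_sid": a, "event_type": t, "count": n}
         for (a, t), n in aggregates.items()),
        key=lambda r: (r["account_sid"], r["event_type"]),
    )

raw = [
    {"account_sid": "AC1", "event_type": "sms.sent", "ts": 1},
    {"account_sid": "AC1", "event_type": "sms.sent", "ts": 2},
    {"account_sid": "AC2", "event_type": "call.started", "ts": 3},
    {"event_type": "malformed"},  # dropped by extract()
]
rows = load(transform(extract(raw)))
```

Separating extract, transform, and load into pure functions like this keeps each stage independently testable, which is what makes the "reproducible workflows" in the role description practical.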
Required Qualifications
- B.S. in Computer Science/Data Engineering/Electrical Engineering/Mathematics (or related) or equivalent practical experience.
- 3–5 years building and operating data or ML systems in production.
- Proficient in Python and SQL; strong software engineering fundamentals (testing, version control, code reviews).
- Hands-on experience with ETL/ELT orchestration (e.g., Airflow, Dagster) and cloud data warehouses (Snowflake, BigQuery, or Redshift).
- Familiarity with ML lifecycle tooling (MLflow, SageMaker, Vertex AI, or similar).
- Working knowledge of Docker, Kubernetes, and at least one major cloud (AWS, GCP, or Azure).
- Understanding of data modeling, distributed computing, and streaming frameworks (Spark, Flink, Kafka Streams).
- Strong analytical thinking, clear communication, and a strong sense of ownership and curiosity.
Desired Qualifications
- Experience with Twilio Segment, Kafka/Kinesis, or other high-throughput event buses.
- Exposure to infrastructure-as-code (Terraform, Pulumi) and GitHub-based CI/CD.
- Practical knowledge of generative AI workflows, foundation-model fine-tuning, or vector databases.
- Contributions to open-source data/ML projects or published technical content.
- Domain experience in communications, marketing automation, or customer engagement analytics.
About Twilio
Twilio is a communications technology company that provides cloud-based tools for businesses and developers to build and deliver personalized customer experiences. It operates a remote-first culture and develops AI- and data-driven platforms to power real-time intelligence across customer interactions.
Scraped 4/24/2026