Data Platform Engineer
Worth AI
About the role
Role Overview
As a Data Platform Engineer at Worth AI, you will design, build, and operate core data services that power product features and analytics. You’ll own end-to-end data pipelines and API services that ingest, process, and expose high-quality data to internal teams and external partners, treating the data platform as a product with strong SLAs.
Key Responsibilities
- Architect and implement entity resolution to deduplicate and link data into unified "Golden Records"
- Build and maintain a global business knowledge graph and ontology (including ownership chains, UBOs, and cross-border risk relationships)
- Implement a hybrid storage strategy combining graph databases (relationships) with document/search stores (metadata and adverse media content)
- Optimize for real-time risk assessment, enabling fast multi-level ownership traversal for automated Go/No-Go onboarding decisions
- Design and build scalable data services and APIs for ingesting, transforming, and serving data
- Develop batch and streaming pipelines using modern data processing frameworks on AWS
- Own reliability, performance, and API-first platform capabilities (monitoring, alerting, and on-call where appropriate)
- Apply best practices for data modeling, quality, lineage, and governance
- Partner with data scientists, analysts, and application engineers to translate needs into platform capabilities
- Drive automation and standardization via CI/CD, model as a service, and reproducible environments
- Help evolve the platform architecture with clear contracts, SLAs, and versioned APIs
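To illustrate the entity-resolution responsibility above, here is a minimal sketch of probabilistic record linkage that merges duplicate company records into unified "golden records". All names, thresholds, and the greedy clustering strategy are illustrative assumptions, not Worth AI's actual pipeline; production systems would use a dedicated engine (e.g., Senzing) or blocking plus trained matchers.

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase and strip common legal suffixes so name variants compare equal."""
    name = name.lower().strip()
    for suffix in (" inc.", " inc", " llc", " ltd", " corp.", " corp"):
        if name.endswith(suffix):
            name = name[: -len(suffix)].strip()
    return name

def similarity(a: str, b: str) -> float:
    """Character-level similarity ratio between two normalized names."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

def resolve(records: list[dict], threshold: float = 0.9) -> list[dict]:
    """Greedily cluster records whose names exceed the match threshold,
    merging each cluster into a single golden record that links back to
    its source record IDs rather than duplicating them."""
    golden: list[dict] = []
    for rec in records:
        for g in golden:
            if similarity(rec["name"], g["name"]) >= threshold:
                g["sources"].append(rec["id"])
                break
        else:
            golden.append({"name": rec["name"], "sources": [rec["id"]]})
    return golden

records = [
    {"id": "r1", "name": "Acme Corp."},
    {"id": "r2", "name": "ACME Corp"},
    {"id": "r3", "name": "Bolt Logistics LLC"},
]
print(resolve(records))  # r1 and r2 collapse into one golden record
```

The greedy pass is O(n²) and only suitable for small batches; at scale, candidate pairs would first be narrowed by blocking keys (normalized name prefix, registration number, country) before scoring.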
Requirements
- Hands-on experience with graph databases (e.g., Neo4j, AWS Neptune, TigerGraph) and graph query languages (e.g., Cypher or Gremlin)
- Proven experience with entity resolution / record linkage (e.g., Senzing, Quantexa, or custom probabilistic matching)
- Ability to design flexible ontologies/schemas that adapt to evolving regulatory data (e.g., PEP/sanctions formats)
- Experience building GraphQL or REST APIs optimized for graph traversals and deep lookups
- Strong software engineering skills in at least one of: Python, Java, Go, Rust
- Hands-on experience building ETL/ELT pipelines on a major cloud provider (AWS preferred)
- Experience with modern data stack tools such as Spark/Flink, Kafka/Kinesis, and Airflow/managed schedulers, plus data warehouses (e.g., Snowflake, Redshift, BigQuery, Databricks)
- Familiarity with DevOps practices: CI/CD, Docker, Kubernetes, and Terraform
- Strong focus on observability (metrics/logs/traces), resilience, and early warning/alerting
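The ownership-traversal and UBO requirements above can be sketched with a small in-memory example: effective ownership is the sum, over all paths from a person to the target entity, of the product of ownership fractions along each path. The edge data, 25% threshold, and recursive traversal are hypothetical; a real graph store (Neo4j, Neptune) would express this as a variable-length path query instead.

```python
# Ownership edges: (owner, owned, fraction held). Assumes the graph is
# acyclic; a production traversal would also guard against ownership cycles.
EDGES = [
    ("Alice", "HoldCo", 0.60),
    ("HoldCo", "OpCo", 0.50),
    ("Bob", "OpCo", 0.10),
]

def effective_ownership(owner: str, target: str, edges) -> float:
    """Sum the products of ownership fractions over every path owner -> target."""
    if owner == target:
        return 1.0
    total = 0.0
    for src, dst, frac in edges:
        if src == owner:
            total += frac * effective_ownership(dst, target, edges)
    return total

def is_ubo(person: str, target: str, edges, threshold: float = 0.25) -> bool:
    """Flag a person as an ultimate beneficial owner if their effective
    stake meets the regulatory threshold (commonly 25%)."""
    return effective_ownership(person, target, edges) >= threshold

# Alice holds 60% of HoldCo, which holds 50% of OpCo: 0.6 * 0.5 = 30%,
# above the 25% threshold; Bob's direct 10% is not.
print(is_ubo("Alice", "OpCo", EDGES))
print(is_ubo("Bob", "OpCo", EDGES))
```

For a Go/No-Go onboarding decision this check sits on the hot path, which is why the posting emphasizes fast multi-level traversal: in a graph database the same logic is a single bounded-depth path query rather than repeated application-side recursion.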
Nice-to-Haves
- Experience operating data-as-a-service / centralized data platforms at scale
About Worth AI
Worth AI is a computer software company focused on applying artificial intelligence to revolutionize decision-making. The team builds data and risk-related capabilities to support reliable, high-impact product and onboarding processes, with an emphasis on collaboration and ownership.
Scraped 4/1/2026