Senior Data Engineer at Turaco

March 16, 2026

Job Description

Job Summary

We are seeking an experienced Senior Data Engineer to architect, build, and optimize our data infrastructure. In this role, you will move beyond simple execution to provide technical leadership, designing scalable systems that handle sensitive financial data with speed and accuracy. You will proactively implement methods to improve data reliability while ensuring our architecture meets the rigorous security standards of the FinTech industry.

Key Duties & Responsibilities

  • Architect and own scalable batch and real-time data pipelines supporting high-volume financial transactions
  • Design resilient data infrastructure using modern streaming and processing technologies (e.g., Kafka, Spark)
  • Build, optimize, and maintain robust ETL/ELT pipelines integrating core banking systems, internal platforms, and third-party APIs
  • Continuously improve platform scalability by automating manual workflows and re-engineering data processes
  • Ensure strong data quality, reliability, and accuracy, safeguarding critical financial reporting and customer balance integrity
  • Embed security, governance, and compliance by design into all data systems handling sensitive financial data and any PII
  • Conduct deep root-cause analysis to resolve data anomalies and prevent systemic issues
  • Define monitoring, validation, and observability practices to proactively detect pipeline failures and data drift
  • Mentor and guide junior data engineers, raising technical standards across the team
  • Collaborate with Product, Risk, Finance, and Engineering teams to translate complex business requirements into scalable data solutions
  • Build and enable analytics-ready data models and tooling that drive actionable business insights and decision-making

Educational Qualifications, Experience, & Skills Required

  • Values: Live Turaco’s values – 1) Pushing boundaries, 2) Working with excellence, and 3) Profound respect for the individual.
  • Experience: 5+ years in Data Engineering, ideally within Financial Services or FinTech.
  • Education: Degree in Computer Science, Statistics, IT, or similar field.
  • Programming: Advanced proficiency in Python, Java, or Scala.
  • Database Mastery: Expert-level SQL skills and hands-on experience with database design and data modeling. Experience with modern data warehouses (Snowflake, BigQuery, or Redshift).
  • Big Data Tech: Working knowledge of message queuing (Kafka, RabbitMQ) and stream processing.
  • Orchestration: Experience with workflow management tools (Airflow, dbt, Luigi).
  • Infrastructure as Code: Experience with IaC tools (Terraform, CloudFormation).
  • Containerization: Familiarity with Docker and Kubernetes.
  • Visualization: Experience visualizing data using Tableau, Power BI, or open-source libraries (D3, Matplotlib).

