SoftBlues

Multi-Asset Market Data Platform

Dmytro K.

Senior Data Engineer

Data Engineering Lead

Data Engineer & Big Data

Key Expertise

Cloud Data Platforms, Data Engineering Leadership, Data Warehouse Migration, FinTech & Banking

Experience

10+ years

Timezone

CET (UTC +1)

Skills

Languages

Java, Python, Scala

Databases

Google BigQuery, MongoDB, Oracle DB, kdb+, Kinetica, Neo4j

Infrastructure

GCP, Dataproc, Docker, Kubernetes, Grafana, Kafka

Frameworks

Apache Spark, Apache Airflow, dbt

Integrations & Protocols

Apache Atlas

Overview

Mission-critical market data platform covering equities across dozens of global exchanges, FX, and fixed income, serving internal quantitative trading desks. At this scale and in this context, reliability is not a quality attribute – it is the product. The system had to sustain consistent ingestion and low-latency access during peak trading hours across multiple time zones, with no tolerance for data gaps or processing delays that could affect trading decisions.

Achievements

  • Substantially reduced average processing time on the core ingestion pipelines, directly improving throughput for downstream trading systems.
  • Delivered a custom streaming sink integration that cut both average and peak query latency by a significant margin – the peak reduction being the more operationally important figure, as it addressed the tail-latency spikes affecting time-sensitive analytics.
  • Improved the startup behavior of the main streaming component, eliminating instability during high-load trading windows.
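The project's actual sink implementation is not shown here, but the core idea behind cutting peak write latency with a custom streaming sink can be sketched generically: buffer records and flush them in micro-batches, on either a size or a time trigger, instead of issuing one write per record. Everything below (class and parameter names included) is a hypothetical, stdlib-only illustration, not project code.

```python
import time

class MicroBatchSink:
    """Hypothetical, simplified micro-batching sink.

    Records are buffered and flushed as a batch when the buffer fills
    or a time budget expires - a common pattern for trimming both
    average and tail write latency against a bulk-capable store.
    """

    def __init__(self, write_batch, max_batch=500, max_wait_s=0.05):
        self.write_batch = write_batch    # bulk-write callable (e.g. a DB bulk insert)
        self.max_batch = max_batch        # flush when this many records are buffered
        self.max_wait_s = max_wait_s      # ...or when this much time has elapsed
        self.buffer = []
        self.last_flush = time.monotonic()

    def add(self, record):
        self.buffer.append(record)
        if (len(self.buffer) >= self.max_batch
                or time.monotonic() - self.last_flush >= self.max_wait_s):
            self.flush()

    def flush(self):
        if self.buffer:
            self.write_batch(self.buffer)
            self.buffer = []
        self.last_flush = time.monotonic()

# Usage: collect flushed batches in a list to observe the batching behavior.
batches = []
sink = MicroBatchSink(batches.append, max_batch=3, max_wait_s=999)
for tick in range(7):
    sink.add(tick)
sink.flush()  # drain whatever remains in the buffer
```

The size/time trade-off is the operative design choice: a larger `max_batch` raises throughput, while a tighter `max_wait_s` caps how stale a buffered record can get.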

Responsibilities

  • Owned end-to-end reliability of ETL processes ingesting global equities data and delivering analytics to internal trading systems – a zero-defect-tolerance environment with direct business exposure on failure.
  • Built ingestion and analytics pipelines for FX and fixed income asset classes, expanding the platform's cross-asset coverage and enabling unified analytics across instrument types.
  • Led a performance engineering initiative on the primary real-time streaming ingestion component, improving startup time and processing stability during peak load periods.
  • Developed an ETL migration framework to automate transition from legacy warehouse infrastructure to a modern architecture, reducing per-pipeline development effort dramatically and accelerating onboarding of new engineers.
  • Delivered storage format optimizations that reduced memory consumption during ETL execution; coordinated phased production rollouts with distributed teams across multiple time zones.
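The storage-format point in the last bullet can be illustrated generically. None of the code below comes from the project; it is a stdlib-only sketch of why packing fields into columnar arrays, rather than keeping one object per record, shrinks the in-memory footprint of an ETL batch.

```python
import sys
from array import array

# Row-oriented layout: one dict per tick (hypothetical two-field schema).
rows = [{"price": float(i), "size": float(i % 100)} for i in range(10_000)]

# Column-oriented layout: one packed float64 array per field (8 bytes/value).
prices = array("d", (r["price"] for r in rows))
sizes = array("d", (r["size"] for r in rows))

# Compare container overhead only (sys.getsizeof is shallow); this still
# understates the gap, since it ignores the boxed float objects in `rows`.
row_bytes = sum(sys.getsizeof(r) for r in rows)
col_bytes = sys.getsizeof(prices) + sys.getsizeof(sizes)
```

The same principle is what columnar formats like Parquet exploit on disk: fixed-width, homogeneous columns compress and scan far better than per-record objects.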

Technologies Used

Java, Python, Scala, Apache Spark, Oracle DB, kdb+, Kinetica, Kafka

This project was delivered by Dmytro K.
