Data Quality Engineer

Kaiko



About the job

Founded in 2014, Kaiko is a global fintech company with offices in NYC, London, Paris, and Singapore. Rapidly growing, we are the leading crypto market data provider for financial institutions and enterprises in the digital asset space.


What We Do


Kaiko provides financial data products and solutions across three main business units:

1 - Market Data: “CEX” (Centralized Exchanges) Market Data: we collect, structure, and distribute market data from 100+ cryptocurrency trading venues; “DEX” (Decentralized Protocols) Market Data: we run blockchain infrastructure to read, collect, engineer, and distribute venue-level market data from DeFi protocols.

2 - Analytics: proprietary quantitative models & data solutions to price and assess risk.

3 - Indices: a suite of mono-asset rates and benchmarks, as well as cross-asset indices.


Kaiko's products are available worldwide on all networks and infrastructures: public APIs; private & on-premises networks; private & hybrid cloud set-ups; blockchain native (Kaiko oracles solution).


Additionally, Kaiko's Research publications are read by thousands of industry professionals and cited in the world's leading media organizations. We provide original insights and in-depth analysis of crypto markets using Kaiko's data and products.


Who We Are


We're a team of 80 (and growing) passionate individuals with a deep interest in building data solutions and supporting the growth of the digital finance economy. We're proud of Kaiko's talented team and are committed to our international representation and diversity. Our people and their values are the foundation of our continued success.


About The Role


Kaiko operates a data infrastructure that is becoming a critical service for most of our customers. We collect and redistribute terabytes of data from hundreds of diverse sources across multiple channels and protocols. Providing a high and consistent level of quality across all data types and sources is a key component of our value proposition.

As with any critical service, and under its ongoing SOC 2 audit process, Kaiko has to meet the operational standards of the financial environment: 99.9% uptime, very strict data quality and requirements rules, and 100% auditable processes for its regulated services.


Missions & Responsibilities


The Data Quality Engineering team defines, owns, and guarantees the highest standards of quality and consistency of data for the Kaiko platform. To achieve this objective, the team focuses on two main activities:

1 - Ensure that the Kaiko platform is equipped with the best data quality standards and tools:
  • Define data quality standards and KPIs
  • Build and maintain the systems needed for monitoring data quality and consistency, including real-time stream analysis
  • Work hand-in-hand with the Product and Engineering teams to implement multidimensional anomaly detection algorithms (signal processing, statistics, machine learning, ...); a minimal sketch of one such check appears after the lists below
  • Support the Product team to ensure that new data product features are tested at scale with large and representative sets of data
  • Work hand-in-hand with the Engineering team to improve ingestion and storage systems

2 - Ensure that the Kaiko platform data is always in a high-quality and consistent state:

  • Monitor data quality against defined standards and KPIs
  • Review data to identify patterns or trends that may indicate errors or inconsistencies
  • Build and maintain the systems needed for data correction and backfilling
  • Work hand-in-hand with the Product Operations department to identify and fix data issues, both proactively and retroactively
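
For concreteness, here is a minimal, hypothetical sketch (in Go, which the role lists as a plus) of the kind of statistical check the anomaly detection work above can involve: a rolling z-score filter that flags trade prices far from their recent mean. The window size, threshold, and all names are illustrative assumptions for this sketch, not Kaiko's actual parameters.

// Hypothetical illustration only: a rolling z-score check over a
// stream of trade prices. Window size and threshold are assumed
// values for the sketch, not Kaiko's actual parameters.
package main

import (
	"fmt"
	"math"
)

// Detector keeps a fixed-size window of recent prices and flags
// observations that deviate strongly from the window mean.
type Detector struct {
	window []float64
	size   int
	thresh float64 // z-score threshold, e.g. 4.0
}

func NewDetector(size int, thresh float64) *Detector {
	return &Detector{size: size, thresh: thresh}
}

// Observe adds a price to the window and reports whether it looks
// anomalous relative to the prices seen before it.
func (d *Detector) Observe(price float64) bool {
	anomalous := false
	if len(d.window) == d.size {
		mean, std := stats(d.window)
		if std > 0 && math.Abs(price-mean)/std > d.thresh {
			anomalous = true
		}
		d.window = d.window[1:] // slide the window forward
	}
	d.window = append(d.window, price)
	return anomalous
}

// stats returns the mean and population standard deviation of xs.
func stats(xs []float64) (mean, std float64) {
	for _, x := range xs {
		mean += x
	}
	mean /= float64(len(xs))
	for _, x := range xs {
		std += (x - mean) * (x - mean)
	}
	return mean, math.Sqrt(std / float64(len(xs)))
}

func main() {
	d := NewDetector(100, 4.0) // assumed window and threshold
	prices := []float64{ /* ... trades from one venue ... */ }
	for _, p := range prices {
		if d.Observe(p) {
			fmt.Printf("possible bad tick: %f\n", p)
		}
	}
}

In a production setting, checks like this would run as part of a real-time stream pipeline (for example, consuming trades from Kafka) and feed the monitoring, correction, and backfilling systems described above.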


About You


  • Solid software engineering experience (knowledge of Go is a plus), combined with a good understanding of data science, basic analytics, and machine learning concepts
  • Experience building large-scale data pipelines in distributed environments with technologies such as Hadoop, Spark, Dataproc, and/or Kafka; HBase is a plus
  • Experience working in Linux environments and with cloud platforms (GCP/AWS/Azure), Terraform, Git, Docker, Kubernetes, and Nomad
  • Experience working with relational and column-oriented DBMSs such as PostgreSQL, ClickHouse, and/or TimescaleDB
  • Experience creating and monitoring workflows
  • Master's degree in Computer Science, Information Technology, or a related field; or equivalent work experience
  • Good communicator, fluent in written and spoken English


What we offer


  • 25 paid holidays & 8 RTT in 2023
  • The hardware of your choice
  • Great health insurance (Alan)
  • Meal vouchers
  • Contribution to your monthly gym subscription
  • Contribution to daily commuting
  • Remote-friendly
  • Multiple team events (annual retreat, casual drinks, etc.)
  • An entrepreneurial environment with a lot of autonomy and responsibilities
  • Staff surprises!


Talent Acquisition Process


  • Interview with the Talent Acquisition team (30 min)
  • Call with the Hiring Manager (45 min)
  • Technical test / Business Case (1h, via Meet)
  • Cross team interviews with 2-3 team members (30 min)
  • Offer & reference check


Interested? Apply directly 🙂


Diversity & Inclusion


At Kaiko, we believe in diversity of thought because we appreciate that it makes us stronger. We therefore encourage applications from everyone who can offer their unique experience to our collective achievements.

