We are looking for an ETL DWH Developer to join our Data Infrastructure team.

The Data Infrastructure team is responsible for collecting and storing all company data, ensuring its quality, and developing and supporting analytical services and tools. In this position, you will have the opportunity to own a dedicated domain area within our data warehouse and improve its structure.

Job Responsibilities

  • Developing the DWH and data marts, and optimizing ETL pipelines;
  • Communicating with business analysts and formalizing tasks;
  • Participating in analytical warehouse architecture enhancements and data modeling;
  • Describing the business subject matter domain and logical model design;
  • Configuring data transports from external sources.

Key Qualifications

  • 3+ years of experience with Python;
  • 1+ year of experience with MPP DBMSs (Vertica/Trino/ClickHouse/Greenplum);
  • Experience with Hadoop (writing Hive queries, understanding of general MPP concepts and core ecosystem components, optimization principles);
  • 1+ year of experience with Docker and Airflow;
  • Experience with Git;
  • Knowledge of data warehouse architecture (normal forms, Star schema, Data Vault);
  • English at B1 level or higher.
Will be a plus:
  • Experience with Postgres.

We Offer You

  • remote work;
  • a flexible schedule (we don’t require you to be online at 09:00 sharp; you can start work at a time that suits you);
  • interesting and ambitious tasks that will take you to the next professional level;
  • learning opportunities: seminars, trainings, and conferences. If you want to attend a conference, we will help you organize it;
  • private health insurance;
  • team-building activities: movie nights, quizzes, thematic parties, annual trips to the countryside, football and volleyball matches;
  • corporate discounts on hotels and other services;
  • a young, active team of top specialists.