Middle Data Engineer
FAVBET Tech develops software for the international company FAVBET Entertainment that is used by millions of players around the world.
We develop innovations in the field of gambling and betting through a complex multi-component platform that can withstand enormous loads and provide a unique experience for players.
FAVBET Tech does not organize and conduct gambling on its platform. Its main focus is software development.
We are looking for a Middle/Senior Data Engineer to join our Data Integration Team.
Main areas of work:
- Betting/Gambling Platform Software Development – development of software that is easy to use and personalized for each customer.
- Highload Development – development of highly loaded services and systems.
- CRM System Development – development of a number of services to ensure a high level of customer service, effective engagement of new customers, and retention of existing ones.
- Big Data – development of complex systems for processing and analysis of big data.
- Cloud Services – we use cloud technologies for scaling and business efficiency.
Responsibilities:
- Design, build, install, test, and maintain highly scalable data management systems.
- Develop ETL/ELT processes and frameworks for efficient data transformation and loading.
- Implement, optimize, and support reporting solutions for the Sportsbook domain.
- Ensure effective storage, retrieval, and management of large-scale data.
- Improve data query performance and overall system efficiency.
- Collaborate closely with data scientists and analysts to deliver data solutions and actionable insights.
Requirements:
- At least 2 years of experience in designing and implementing modern data integration solutions.
- Master's degree in Computer Science or a related field.
- Proficiency in Python and SQL, particularly for data engineering tasks.
- Hands-on experience with data processing, ETL (Extract, Transform, Load), ELT (Extract, Load, Transform) processes, and data pipeline development.
- Experience with the DBT framework and Airflow orchestration.
- Practical experience with both SQL and NoSQL databases (e.g., PostgreSQL, MongoDB).
- Experience with Snowflake.
- Working knowledge of cloud services, particularly AWS (S3, Glue, Redshift, Lambda, RDS, Athena).
- Experience in managing data warehouses and data lakes.
- Familiarity with star and snowflake schema design.
- Understanding of the difference between OLAP and OLTP.
Would be a plus:
- Experience with other cloud data services (e.g., AWS Redshift, Google BigQuery).
- Experience with version control tools (e.g., GitHub, GitLab, Bitbucket).
- Experience with real-time data processing (e.g., Kafka, Flink).
- Familiarity with orchestration tools (e.g., Airflow, Luigi).
- Experience with monitoring and logging tools (e.g., ELK Stack, Prometheus, CloudWatch).
- Knowledge of data security and privacy practices.
We offer:
- 30 days off – we value rest and recreation;
- Medical insurance for employees, company-paid training, and gym membership;
- Remote work or the option to work from our own modern loft-style office with spacious workplaces and brand-new equipment (near Pochaina metro station);
- Flexible work schedule – we expect a full-time commitment but do not track your working hours;
- Flat hierarchy without micromanagement – our doors are open, and all teammates are approachable.
During the war, the company actively supports the Ministry of Digital Transformation of Ukraine in its initiative to deploy an IT army and has already organized its own cyber warfare unit, which strikes the enemy's IT infrastructure 24/7, coordinates with other cyber volunteers, and plans offensive actions on its IT front line.
Required languages
| Language | Level |
| English | B1 – Intermediate |
| Ukrainian | Native |