Solution Architect / Data Architect
Turbo Stars is looking for a Solution Architect / Data Architect to join our team and work on complex data solutions for products in the sports and iGaming domain.
Our platforms handle large-scale data — from game analytics and user behavior to CRM and external APIs. We build systems that turn data into business impact through analytics, monitoring, prediction, and personalization.
If you’re passionate about data architecture, engineering, and analytics, love optimizing pipelines, scaling data systems, and improving data quality — we’d love to have you on our team.
Key Responsibilities:
- Design, develop, and maintain ETL pipelines connecting CRM, APIs, databases, and BI systems.
- Configure and monitor Airflow DAGs (daily tasks, retry policies, logging); see the sketch after this list.
- Optimize queries and build aggregations in ClickHouse; create new tables and views.
- Manage data integrations with external APIs (CRM, Google Sheets, internal services).
- Monitor and optimize Power BI Dataflows, ensuring data freshness and performance.
- Build and maintain monitoring and alerting systems (Airflow + Telegram bot).
- Ensure data quality through validation and reconciliation with CRM and internal systems.
- Contribute to defining and evolving data architecture and infrastructure scalability.
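To give a feel for the orchestration work listed above, here is a minimal, hypothetical sketch of an Airflow DAG with a daily schedule, a retry policy, and a Telegram failure alert. This is not Turbo Stars' production code: the DAG/task names, bot token, and chat id are made-up placeholders.

```python
# Illustrative sketch only: a daily Airflow DAG with retries and a Telegram
# alert on task failure. All identifiers and secrets below are hypothetical.
from datetime import datetime, timedelta

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

TELEGRAM_BOT_TOKEN = "..."  # placeholder; normally stored in an Airflow connection/variable
TELEGRAM_CHAT_ID = "..."    # placeholder chat id for the alerting channel


def notify_failure(context):
    """on_failure_callback: post a short failure message to a Telegram chat."""
    ti = context["task_instance"]
    text = f"Airflow task failed: {ti.dag_id}.{ti.task_id} ({context['ds']})"
    requests.post(
        f"https://api.telegram.org/bot{TELEGRAM_BOT_TOKEN}/sendMessage",
        json={"chat_id": TELEGRAM_CHAT_ID, "text": text},
        timeout=10,
    )


def load_daily_aggregates(**context):
    # Placeholder for the actual ETL step, e.g. running an aggregation in
    # ClickHouse and reconciling the result with CRM data.
    pass


default_args = {
    "owner": "data-team",
    "retries": 3,                            # retry policy
    "retry_delay": timedelta(minutes=10),
    "on_failure_callback": notify_failure,   # Telegram alerting
}

with DAG(
    dag_id="daily_crm_aggregates",           # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                       # daily task
    catchup=False,
    default_args=default_args,
) as dag:
    PythonOperator(
        task_id="load_daily_aggregates",
        python_callable=load_daily_aggregates,
    )
```

In practice the token and chat id would be read from an Airflow connection or variable rather than module-level constants; the sketch keeps them inline only for brevity.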
Requirements:
- 2+ years of experience as a Data Engineer / BI Developer / ETL Engineer.
- Strong SQL skills (PostgreSQL, ClickHouse).
- Experience working with Airflow (DAG creation, troubleshooting, scheduling).
- Solid Python skills (API integrations, pandas, Airflow hooks/operators).
- Experience building BI solutions (Power BI / Looker / Tableau).
- Hands-on experience with large data volumes and query optimization.
- Basic understanding of streaming systems (Kafka / RabbitMQ).
- Ability to write clean, structured, and well-documented code.
Would be a plus:
- Background in iGaming or sports tech.
- Experience with GA4, GTM, or clickstream data.
- Automation of alerts and notifications (Slack / Telegram).
- Understanding of CI/CD processes and DevOps practices for data projects.
What we offer:
🧘‍♂️ Work-Life Balance
Even in challenging times, we’ve built an environment where it’s pleasant to work and grow. You’ll find the right balance between work and rest, and feel what it’s like to be part of a team that truly values you.
💻 Work Format
- Choose where and how you work — from home, the office, or anywhere in the world
- Flexible schedule — we respect your personal rhythm
🎉 Atmosphere & Development
- Gifts, raffles, training, team-building events, and corporate parties — it’s never boring with us!
- Internal training, mentoring, and access to courses
- Career growth without bureaucracy
🏝 Vacation & Sick Leave
- 20 working days of paid annual vacation
- 100% paid sick leave with no bureaucracy
🚀 Why us?
If you’re looking for a friendly and ambitious product company that:
- Uses modern technologies
- Values initiative and independence
- Strives for continuous growth of both team and product
Then we’re waiting for you 🤝