Jobs

  • · 29 views · 6 applications · 1d

    Data Engineer

    Countries of Europe or Ukraine · 2 years of experience · Intermediate

    Looking for a Data Engineer to join the Dataforest team. If you are looking for a friendly team, a healthy working environment, and a flexible schedule, you have found the right place to send your CV.

     

    Skills requirements:
    • 2+ years of experience with Python;
    • 2+ years of experience as a Data Engineer;
    • Experience with Pandas;
    • Experience with SQL and NoSQL databases (Redis, MongoDB, Elasticsearch) or BigQuery;
    • Familiarity with Amazon Web Services;
    • Knowledge of data algorithms and data structures is a must;
    • Experience working with high-volume tables (10M+ rows).


    Optional skills (a plus):
    • Experience with Spark (PySpark);
    • Experience with Airflow;
    • Experience with Kafka;
    • Experience in statistics;
    • Knowledge of data science and machine learning algorithms.

     

    Key responsibilities:
    • Create ETL pipelines and data management solutions (APIs, integration logic);
    • Implement various data processing algorithms;
    • Contribute to the creation of forecasting, recommendation, and classification models.
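    The ETL work described above can be illustrated with a minimal sketch. This is not Dataforest's actual pipeline; the table and column names are invented, and the standard-library sqlite3 module stands in for the SQL/NoSQL stores named in the requirements:

```python
import sqlite3

# Hypothetical minimal ETL sketch: extract raw events, transform, load a summary.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [(1, 10.0), (1, 5.5), (2, 7.25)],
)

# Transform: aggregate per user (the kind of step Pandas or SQL would handle).
rows = conn.execute(
    "SELECT user_id, SUM(amount) FROM raw_events GROUP BY user_id ORDER BY user_id"
).fetchall()

# Load: write the aggregate into a reporting table.
conn.execute("CREATE TABLE user_totals (user_id INTEGER, total REAL)")
conn.executemany("INSERT INTO user_totals VALUES (?, ?)", rows)
totals = dict(conn.execute("SELECT user_id, total FROM user_totals"))
print(totals)  # {1: 15.5, 2: 7.25}
```

    In a real high-volume setting the aggregation would run inside the warehouse or a distributed engine rather than in application code, but the extract-transform-load shape is the same.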

     

    We offer:

    • Great networking opportunities with international clients and challenging tasks;

    • Building interesting projects from scratch using new technologies;

    • Personal and professional development opportunities;

    • Competitive salary fixed in USD;

    • Paid vacation and sick leave;

    • Flexible work schedule;

    • Friendly working environment with minimal hierarchy;

    • Team-building activities and corporate events.

  • · 62 views · 9 applications · 22d

    Data Engineer

    Full Remote · EU · Product · 2 years of experience · Upper-Intermediate

    Role Overview:

    We are looking for a Data Engineer to join the growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow. The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.

    Key Responsibilities:

    — Develop and maintain data infrastructure and data warehouse solutions;

    — Design, develop, and maintain scalable and efficient data pipelines and ETL processes;

    — Develop APIs;

    — Gather and define business requirements for data tools and analytics;

    — Communicate and collaborate with the analytics team;

    — Monitor and troubleshoot data pipelines and infrastructure, and implement measures to ensure data integrity, security, and performance;

    — Assist in the implementation of data science solutions;

    — Develop and maintain documentation for data pipelines, infrastructure, and workflows;

    — Stay up to date with the latest data engineering technologies and best practices, and recommend new tools and approaches to improve efficiency and quality;

    — Automate data processes;

    — Collect data from different sources.
     

    Ideal profile for the position:

    — 2+ years of work experience as a Data Engineer;

    — Experience with AWS: S3, Redshift, DMS, Glue, Lambda, Athena, QuickSight;

    — Excellent command of SQL;

    — Proficiency in Python;

    — Knowledge of and experience with data warehousing and ETL pipeline development;

    — API development experience;

    — Basic understanding of machine learning and data science;

    — Experience with relational and non-relational databases;

    — Good written and verbal communication skills;

    — Upper-Intermediate or higher English level.
     

    The company guarantees you the following benefits:

    — Global Collaboration: Join an international team where everyone treats each other with respect and moves towards the same goal;

    — Autonomy and Responsibility: Enjoy the freedom and responsibility to make decisions without the need for constant supervision;

    — Competitive Compensation: Receive a competitive salary reflective of your expertise and knowledge, as our partner seeks top performers;

    — Remote Work Opportunities: Embrace the flexibility of fully remote work, with the option to visit company offices that align with your current location;

    — Flexible Work Schedule: Focus on performance, not hours, with a flexible work schedule that promotes a results-oriented approach;

    — Unlimited Paid Time Off: Prioritize work-life balance with unlimited paid vacation and sick leave days to prevent burnout;

    — Career Development: Access continuous learning and career development opportunities to enhance your professional growth;

    — Corporate Culture: Experience a vibrant corporate atmosphere with exciting parties and team-building events throughout the year;

    — Referral Bonuses: Refer talented friends and receive a bonus after they successfully complete their probation period;

    — Medical Insurance Support: Choose the right private medical insurance and receive compensation (full or partial) based on the cost;

    — Flexible Benefits: Customize your compensation by selecting activities or expenses you'd like the company to cover, such as a gym subscription, language courses, a Netflix subscription, spa days, and more;

    — Education Foundation: Participate in a biannual raffle for a chance to learn something new, unrelated to your job, as part of our commitment to ongoing education.

     

     

    Interview process:

    — A 30-minute interview with a member of our HR team to get to know you and your experience;

    — A final two-hour interview with the team to gauge your fit with our culture and working style.

     

     

    If you find this opportunity right for you, don't hesitate to apply or get in touch with us if you have any questions!

     

  • · 46 views · 2 applications · 22d

    Data Engineer (Azure)

    Full Remote · Countries of Europe or Ukraine · 2 years of experience · Upper-Intermediate

    Dataforest is looking for a Data Engineer to join an interesting software development project in the field of water monitoring. Our EU client's platform offers full visibility into water quality, compliance management, and system performance. If you are looking for a friendly team, a healthy working environment, and a flexible schedule, you have found the right place to send your CV.

    Key Responsibilities:
    - Create and manage scalable data pipelines with Azure SQL and other databases;
    - Use Azure Data Factory to automate data workflows;
    - Write efficient Python code for data analysis and processing;
    - Develop data reports and dashboards using Power BI;
    - Use Docker for application containerization and streamlined deployment;
    - Manage code quality and version control with Git.

    Skills requirements:
    - 3+ years of experience with Python;
    - 2+ years of experience as a Data Engineer;
    - Strong SQL knowledge, preferably with Azure SQL experience;
    - Python skills for data manipulation;
    - Expertise in Docker for app containerization;
    - Familiarity with Git for managing code versions and collaboration;
    - Upper-Intermediate level of English.

    Optional skills (as a plus):
    - Experience with Azure Data Factory for orchestrating data processes;
    - Experience developing APIs with FastAPI or Flask;
    - Proficiency in Databricks for big data tasks;
    - Experience in a dynamic, agile work environment;
    - Ability to manage multiple projects independently;
    - Proactive attitude toward continuous learning and improvement.

    We offer:

    - Great networking opportunities with international clients and challenging tasks;

    - Building interesting projects from scratch using new technologies;

    - Personal and professional development opportunities;

    - Competitive salary fixed in USD;

    - Paid vacation and sick leave;

    - Flexible work schedule;

    - Friendly working environment with minimal hierarchy;

    - Team-building activities and corporate events.

  • · 20 views · 0 applications · 4d

    Middle BI/DB Developer

    Office Work · Ukraine (Lviv) · Product · 2 years of experience · Upper-Intermediate

    About us:

    EveryMatrix is a leading B2B SaaS provider delivering iGaming software, content and services. We provide casino, sports betting, platform and payments, and affiliate management to 200 customers worldwide.

    But that's not all! We're not just about numbers, we're about people. With a team of over 1,000 passionate individuals spread across twelve countries in Europe, Asia, and the US, we're all united by our love for innovation and teamwork.

    EveryMatrix is a member of the World Lottery Association (WLA) and the European Lotteries Association. In September 2023 it became the first iGaming supplier to receive WLA Safer Gambling Certification. EveryMatrix is proud of its commitment to safer gambling and player protection while producing market-leading gaming solutions.

    Join us on this exciting journey as we continue to redefine the iGaming landscape, one groundbreaking solution at a time.
     

    We are looking for a passionate and dedicated Middle BI/DB Developer to join our team in Lviv!

    About the unit:

    DataMatrix is the part of the EveryMatrix platform responsible for collecting, storing, processing, and utilizing hundreds of millions of transactions from the whole platform every single day. We develop Business Intelligence solutions, reports, 3rd-party integrations, data streaming, and other products for both external and internal use. The team consists of 35 people and is located in Lviv.

    What You'll get to do:

    • Develop real-time data processing and aggregations
    • Create and modify data marts (enhance our data warehouse)
    • Take care of internal and external integrations
    • Forge various types of reports

    Our main stack:

    • DB: BigQuery, PostgreSQL
    • ETL: Apache Airflow, Apache NiFi
    • Streaming: Apache Kafka
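    As a rough illustration of the kind of real-time aggregation this stack performs, here is a toy tumbling-window sum in plain Python. The real system would do this with Kafka and Airflow/NiFi over millions of transactions; the function name and event shape here are invented for demonstration:

```python
from collections import defaultdict

def tumbling_window_totals(events, window_sec=60):
    """Toy stand-in for stream aggregation: group (timestamp, amount)
    events into fixed-size windows and sum the amounts per window."""
    totals = defaultdict(float)
    for ts, amount in events:
        # Align each event's timestamp to the start of its window.
        totals[ts - ts % window_sec] += amount
    return dict(totals)

events = [(5, 1.0), (30, 2.0), (65, 4.0), (119, 0.5)]
print(tumbling_window_totals(events))  # {0: 3.0, 60: 4.5}
```

    A streaming engine applies the same windowing logic continuously over an unbounded event stream instead of a finite list.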

    What You need to know:

    Here's what we offer:

    • Start with 22 days of annual leave, with 2 additional days added each year, up to 32 days by your fifth year with us.
    • Stay Healthy: 10 sick leave days per year, no doctor's note required; 30 medical leave days with medical allowance
    • Support for New Parents:
      • 21 weeks of paid maternity leave, with the flexibility to work from home full-time until your child turns 1 year old.
      • 4 weeks of paternity leave, plus the flexibility to work from home full-time until your child is 13 weeks old.

    Our office perks include on-site massages and frequent team-building activities in various locations.

    Benefits & Perks:

    • Daily catered lunch or monthly lunch allowance.
    • Private Medical Subscription.
    • Access to online learning platforms like Udemy for Business, LinkedIn Learning, or O'Reilly, and a budget for external training.
    • Gym allowance.

    At EveryMatrix, we're committed to creating a supportive and inclusive workplace where you can thrive both personally and professionally. Come join us and experience the difference!

  • · 28 views · 5 applications · 24d

    Cloud System Engineer

    Full Remote · Ukraine · Product · 2 years of experience · Pre-Intermediate

    Requirements:

    • Knowledge of the core functionality of virtualization platforms;
    • Experience implementing and migrating workloads in virtualized environments;
    • Experience in complex IT solutions and Hybrid Cloud solution projects;
    • Good understanding of IT infrastructure services is a plus;
    • Strong troubleshooting skills for complex environments in case of failure;
    • At least basic knowledge of networking and information security is an advantage;
    • Hyper-V, Proxmox, or VMware experience would be an advantage;
    • Experience in services outsourcing (as customer and/or provider) is an advantage;
    • 2+ years of work experience in a similar position;
    • Scripting and programming experience in PowerShell/Bash is an advantage;
    • Strong team communication skills, both verbal and written;
    • Experience in writing and preparing technical documentation;
    • English skills: Intermediate level is the minimum, mandatory for communication with global teams;
    • Industry certification focused on the relevant solution area.

    Areas of responsibility include:

    • Participating in deployment and IT infrastructure migration projects and Hybrid Cloud solution projects; client support;
    • Consulting on the migration of IT workloads in complex infrastructures;
    • Presales support (articulating service value in the sales process; upsell and cross-sell capability);
    • Project documentation: technical concepts;
    • Education and development in the professional area, including necessary certifications.
  • · 32 views · 1 application · 18d

    Middle BigData Engineer to $2300

    Full Remote · Ukraine · 2 years of experience

    Description of the project:

    We are looking for a Middle Big Data Engineer to join a large-scale telecommunications project. This role involves designing and implementing robust data processing systems, building data warehouses, and working with modern big data tools and technologies.

     

    Your qualification:

    • 2+ years of experience in Big Data engineering.
    • Solid knowledge and practical experience with OLAP technologies.
    • Strong SQL skills and experience with schema design.
    • Proficiency in Java or Python for process automation.
    • Experience with NoSQL databases such as HBase, Elasticsearch; familiarity with Redis or MongoDB is a plus.
    • Hands-on experience with Vertica or other DBMS suitable for large-scale data analysis.
    • Understanding of distributed systems such as Spark, Hadoop, etc.
    • Experience working with Kafka or other message broker systems.
    • Familiarity with data governance tools and data science/analytics workbenches.
    • Experience with Ezmeral Data Fabric is a plus.
    • Knowledge of UNIX and experience in Shell scripting for automation tasks.
    • Technical English proficiency (reading and understanding documentation).

     

    Responsibilities:

    • Design and implement data extraction, processing, and transformation pipelines based on MPP architecture.
    • Build and maintain data warehouses and OLAP-based systems.
    • Design database schemas and develop dimensional data models.
    • Work with distributed systems and clusters for big data processing.
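    The dimensional-modeling responsibility can be sketched with a toy star schema: one fact table joined to a dimension table, queried the way a warehouse report would be. The tables and figures below are hypothetical, and the standard-library sqlite3 module stands in for an MPP warehouse such as Vertica:

```python
import sqlite3

# Hypothetical star-schema sketch: a dimension table plus a fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_calls (call_id INTEGER, region_id INTEGER, duration_sec INTEGER);
INSERT INTO dim_region VALUES (1, 'West'), (2, 'East');
INSERT INTO fact_calls VALUES (10, 1, 120), (11, 1, 60), (12, 2, 30);
""")

# A typical OLAP-style rollup: total call duration per region.
result = conn.execute("""
    SELECT d.name, SUM(f.duration_sec)
    FROM fact_calls f JOIN dim_region d USING (region_id)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(result)  # [('East', 30), ('West', 180)]
```

    In an MPP system the fact table would be partitioned and distributed across nodes, but the schema design and the rollup query keep the same shape.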

     

    We are delighted to provide you with the following benefits:

    • Opportunities for growth and development within the project
    • Flexible working hours
    • Option to work remotely or from the office
  • · 65 views · 9 applications · 17d

    Middle Data Engineer

    Full Remote · Countries of Europe or Ukraine · Product · 2 years of experience · Intermediate

    At Promova, we're redefining language education to make it accessible, personal, and effective for today's fast-paced world. Our growing team of 150+ professionals is on a mission to connect people, bridge cultures, and empower lifelong learners, reaching every country except aggressor states (yes, even Antarctica).

    We blend AI-driven innovation with human expertise to create tools that help people speak with confidence, embrace new cultures, and truly belong in any language. As part of our team, you'll make a real impact, work in an environment built on care, quality, and creativity, and grow alongside a community that values progress.

    With flexible work options, comprehensive benefits, and endless opportunities for growth, Promova is more than a workplace: it's a movement.

    If you're ready to help reimagine language learning for today's world, let's do it together!

    We're looking for a highly motivated and experienced Data Engineer to build and optimize our data infrastructure. If you're passionate about data architecture, ETL development, and driving data quality, this is your opportunity to make a real impact.

     

    About You:

    • 2+ years of experience as a Data Engineer or Python Backend Developer.
    • Proficiency in SQL and Data Modeling for data warehouses.
    • Strong focus on data quality and commitment to building solutions with high data integrity.
    • Experience with Dataform or DBT.
    • Solid Python skills, especially with a background in backend development.
    • Hands-on experience working with MongoDB or other databases (such as DynamoDB, Firebase Firestore, etc.).
    • Ability to adapt to new challenges, work proactively, and collaborate effectively within a team or independently.
    • English level: Intermediate (B1+) or higher

     

    Nice to Have:

    • Proven experience in building, optimizing, and maintaining ETL pipelines.
    • Familiarity with Airflow and other modern data orchestration tools.
    • Proficiency in working with Git and a basic understanding of Docker.

     

    Your Areas of Impact:

    • Build and optimize data architecture: Design and develop scalable and reliable DWH architecture and data models (BigQuery).
    • Maintain data accuracy: Monitor, maintain, and document data models (DataMart) to ensure high-quality, trustworthy data.
    • ETL development: Build and optimize ETL pipelines to process data from internal databases (MongoDB, PostgreSQL) and external APIs, leveraging tools like Airflow and Fivetran.
    • Optimize existing processes: Continuously enhance the performance, reliability, and scalability of existing data pipelines and architecture.
    • Cloud service configuration: Manage and configure cloud infrastructure (GCP), ensuring optimal performance and cost-efficiency.
    • Support data-driven decision-making: Enable the business with timely, accurate, and accessible data to power insights and innovation.
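    One building block of ETL pipelines like those described above is incremental extraction against a watermark, so that each scheduled run picks up only new or changed records. A minimal sketch with an invented record shape (the real pipelines would read from MongoDB or PostgreSQL via Airflow or Fivetran):

```python
def extract_incremental(records, last_watermark):
    """Hypothetical sketch of an incremental-extraction step: return only
    records newer than the stored watermark, plus the updated watermark."""
    fresh = [r for r in records if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=last_watermark)
    return fresh, new_watermark

records = [
    {"id": 1, "updated_at": 100},
    {"id": 2, "updated_at": 205},
    {"id": 3, "updated_at": 310},
]
fresh, wm = extract_incremental(records, last_watermark=200)
print([r["id"] for r in fresh], wm)  # [2, 3] 310
```

    Persisting the returned watermark between runs is what makes the pipeline re-runnable without reprocessing the full source table.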

     

    What We Offer:

    • Opportunity to work with a modern technology stack, including Google Cloud, BigQuery, Airflow, and Tableau.
    • Ability to influence product development and participate in technology stack selection decisions.
    • Access to a large volume of transactional and product data for analysis and optimization.
    • A team of high-performing, results-driven professionals, eager to share knowledge and expertise.
    • Full ownership of the entire product lifecycle within the team (development, promotion, monetization), providing the opportunity to understand the business holistically, work with metrics across all stages of the user journey, and drive impact through analytics.
    • Cross-platform work environment, with products available on iOS, Android, and Web.

     

    Corporate Benefits:

    🎓 Growth — offered to help develop your skills, advance your career, and reach your full potential: compensation for additional training at external events and seminars; access to a large electronic library; paid online courses and conferences; Promova English Group; English Classes; Promova Speaking Club; and access to Promova Premium.

     

    🧘🏼 Wellbeing — offered to support your overall health, happiness, and resilience: work remotely from any safe location worldwide; flexible work schedule; 20 paid vacation days per year; an unlimited number of sick days; medical insurance coverage; mental health support; power station reimbursement; and employee discounts and special benefits for remote employees.

     

    πŸ„πŸΌβ€β™‚οΈFun & Activities β€” offered to foster informal communication and strengthen social connections among teammates: remote team compensation for gathering and team-building episodes.

     

    Interview Process:

    • Pre-screen with Recruiter (40 minutes)
    • Interview with the Hiring Manager (1.5 hours)
    • Test Task
    • Bar-raising (1 hour)

     

    Hit the apply button and let's create unicorns together! 🦄

  • · 74 views · 2 applications · 17d

    Middle Strong/Senior Data Engineer

    Full Remote · Ukraine · 2 years of experience · Upper-Intermediate

    Our mission at Geniusee is to help businesses thrive through tech partnership and to strengthen the engineering community by sharing knowledge and creating opportunities. 🌿 Our values are Continuous Growth, Team Synergy, Taking Responsibility, Conscious Openness, and Result Driven. We offer a safe, inclusive, and productive environment for all team members, and we're always open to feedback. 💜
    If you want to work from home or work in the city center of Kyiv, great: apply right now.

    About the project:
    Generative AI technologies are rapidly changing how digital content is created and consumed. However, many of these systems are trained on vast amounts of data, including articles, videos, and other creative works, often without the knowledge or consent of the original creators. As a result, publishers, journalists, and content producers face the risk of losing both visibility and critical revenue streams such as advertising, subscriptions, and licensing.

    Our project addresses this issue by developing a system that allows AI platforms to identify when specific content has influenced a generated result. This enables transparent attribution and the possibility for content creators to receive compensation based on how often their work is used. The goal is to build a sustainable ecosystem where creators are fairly rewarded, while AI-generated content remains trustworthy and ethically grounded.

    Requirements:
    ● 3+ years of experience in Data Engineering;
    ● Solid Python programming skills, especially in data processing and system automation;
    ● Proven experience with Airflow, Kubeflow, or Kafka for orchestrating data workflows;
    ● Familiarity with search engine concepts and indexing;
    ● Experience working with structured and semi-structured web data (HTML, JSON, APIs);
    ● Ability to work with large-scale distributed systems and cloud platforms (e.g., AWS, GCP, Azure);
    ● English: Upper-Intermediate+.

    What you will get:
    ● Competitive salary and good compensation package;
    ● Exciting, challenging and stable startup projects with a modern stack;
    ● Corporate English course;
    ● Ability to practice English and communication skills through permanent interaction with clients from all over the world;
    ● Professional study compensation, online courses and certifications;
    ● Career development opportunity, semi-annual and annual salary review process;
    ● Necessary equipment to perform work tasks;
    ● VIP medical insurance or sports coverage;
    ● Informal and friendly atmosphere;
    ● The ability to focus on your work: a lack of bureaucracy and micromanagement;
    ● Flexible working hours (start your day between 8:00 and 11:30);
    ● Team buildings, corporate events;
    ● Paid vacation (18 working days) and sick leaves;
    ● Cozy offices in 2 cities (Kyiv & Lviv) with electricity and Wi-Fi (generator & Starlink);
    ● Compensation for coworking (except for employees from Kyiv and Lviv);
    ● Corporate lunch + soft skills clubs;
    ● Unlimited work from home from anywhere in the world (remote);
    ● Geniusee has its own charity fund.




     

  • · 87 views · 7 applications · 16d

    Data Engineer

    Office Work · Ukraine (Kyiv) · Product · 2 years of experience · Intermediate

    3Shape develops 3D scanners and software solutions that enable dental and hearing professionals to treat more people, more effectively. Our products are market-leading, innovative solutions that make a real difference in the lives of both patients and dental professionals around the world.
     

    3Shape is headquartered in Copenhagen, with development teams in Denmark, Ukraine, North Macedonia and with a production site in Poland.
     

    We are a global company with a presence in Europe, Asia, and the Americas. Founded in 2000, today we provide services to customers in over 130 countries. Our growing talent pool of over 2,500 employees spans 45+ nationalities.
     

    3Shape as an employer is committed to Ukraine. Our UA office was founded in 2006, and we continue to grow, hire, and take care of our employees even during the war in Ukraine. Among other actions, we support our contractors who are called up for military service, and we care about our colleagues' mental health by implementing different activities.
     

    If you are looking for stability in your future, we are the right place for you.


    About the role:

    The Customer Data Strategy is a high-priority initiative with significant potential and senior management buy-in. Join our expanding team that currently includes a Data Analyst, Data Engineer, Data Architect, and Manager.
     

    Key responsibilities: 

    • Develop and optimize Azure Databricks in collaboration with cross-functional teams to enable a 'one-stop-shop' for analytical data
    • Translate customer-focused commercial needs into concrete data products
    • Build data products to unlock commercial value and help integrate systems
    • Coordinate technical alignment meetings between functions
    • Act as customer data ambassador to improve 'data literacy' across the organization

    Your profile:

    • 1-2 years of experience working with data engineering in a larger organization, tech start-up, or as an external consultant
    • Extensive experience with Azure Databricks, Apache Spark, and Delta Lake
    • Proficiency in Python, PySpark and SQL
    • Experience with optimizing and automating data engineering processes
    • Familiarity with GitHub and GitHub Actions for CI/CD processes
    • Knowledge of Terraform as a plus


    Being part of us means:

    • Make an impact in one of the most exciting Danish tech companies in the medical device industry
    • Work on solutions used by thousands of dental professionals worldwide
    • Be part of 3Shape's continued accomplishments and growth
    • Contribute to meaningful work that changes the future of dentistry
    • Develop professionally in a unique and friendly environment
    • Enjoy a healthy work-life balance
    • Occasional business trips to Western Europe
       

    We offer:

    • 39 hours of cooperation per week within a flexible time frame
    • 24 business days of annual leave
    • Medical insurance (with an additional dentistry budget and 10 massage sessions per year included)
    • Possibility of flexible remote cooperation
    • Good working conditions in a comfortable office near the National Technical University "KPI", including blackout-ready infrastructure, a corporate paper book library, and a gym room with a shower
    • A parking lot with free spaces for employees
    • Partial compensation for lunches
    • Paid sick leave and child sick leave
    • Maternity, paternity, and family leave
    • Well-being program: monthly well-being meetings and an individual psychological hotline
       

    Want to join us and change the future of dentistry?

  • · 147 views · 18 applications · 14d

    Senior Data Engineer

    Full Remote · Worldwide · 2 years of experience

    Responsibilities:

    • Design and develop data pipelines and ETL/ELT processes to support internal data analytics and reporting.
    • Build and maintain databases, with a strong understanding of both OLTP and OLAP architectures.
    • Optimize data workflows to ensure reliable, accessible, and actionable data for product and analytics teams.
    • Collaborate with analysts, engineers, and product teams to align data infrastructure with business goals.
    • Utilize tools like Apache Spark, Spark Streaming, Kafka, Airflow, and dbt for data transformation and pipeline orchestration.
    • Implement data visualization solutions using modern BI tools.
     

    Requirements:

    • 1–3 years of experience as a Data Engineer or in a related role.
    • Proficiency in database design and development.
    • Strong SQL skills and experience with ClickHouse, PostgreSQL, and MongoDB.
    • Knowledge of OLTP and OLAP architectures.
    • Experience building and managing ETL/ELT pipelines.
    • Hands-on experience with data orchestration tools like Apache Spark, Kafka, Airflow, and dbt.
    • English level A2+; native Ukrainian speaker.
     

    Benefits:

    • Remote or office-based work (flexible depending on your location).
    • Work in a fast-paced, data-driven product team with real business impact.
    • Opportunities to grow professionally, influence product decisions, and own your data architecture.

  • · 55 views · 1 application · 14d

    Middle-Senior Data Engineer (Grafana)

    Full Remote · Ukraine · 2 years of experience · Upper-Intermediate

    Our mission at Geniusee is to help businesses thrive through tech partnership and strengthen the engineering community by sharing knowledge and creating opportunities 🌿Our values are Continuous Growth, Team Synergy, Taking Responsibility, Conscious Openness and Result Driven. We offer a safe, inclusive and productive environment for all team members, and we’re always open to feedbackπŸ’œ
    If you want to work from home or work in the city center of Kyiv, great β€” apply right now.

     

    About the project:
    Generative AI technologies are rapidly changing how digital content is created and consumed. However, many of these systems are trained on vast amounts of data, including articles, videos, and other creative worksβ€”often without the knowledge or consent of the original creators. As a result, publishers, journalists, and content producers face the risk of losing both visibility and critical revenue streams such as advertising, subscriptions, and licensing.

    Our project addresses this issue by developing a system that allows AI platforms to identify when specific content has influenced a generated result. This enables transparent attribution and the possibility for content creators to receive compensation based on how often their work is used. The goal is to build a sustainable ecosystem where creators are fairly rewarded, while AI-generated content remains trustworthy and ethically grounded.

     

    Requirements:
    ● 2+ years of experience in Data Engineering;
    ● Hands-on experience with Grafana, Loki, Promtail, and Grafana Agent;
    ● Strong knowledge of log processing pipelines, including log parsing, structuring, and indexing;
    ● Proficiency in query languages such as LogQL, PromQL, or SQL;
    ● Experience setting up alerting and reporting in Grafana;
    ● Proficiency in Python;
    ● English: Upper-Intermediate+.
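The log-processing skills listed above (parsing, structuring, filtering) can be illustrated with a small Python sketch; the log format and field names are hypothetical, not taken from the project:

```python
# Minimal log-structuring sketch: parse raw logfmt-style lines into dicts,
# then filter by level -- the kind of work LogQL pipelines do at query time.
import re

# Hypothetical format: 2024-05-01T12:00:00Z level=error msg="timeout" service=api
LOG_PATTERN = re.compile(
    r'(?P<ts>\S+)\s+level=(?P<level>\w+)\s+msg="(?P<msg>[^"]*)"\s+service=(?P<service>\S+)'
)

def parse_line(line):
    """Turn one raw log line into a structured record, or None if it doesn't match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

lines = [
    '2024-05-01T12:00:00Z level=error msg="timeout" service=api',
    '2024-05-01T12:00:01Z level=info msg="ok" service=web',
    'malformed line',
]
records = [r for r in (parse_line(l) for l in lines) if r]
errors = [r for r in records if r["level"] == "error"]
print(len(records), errors[0]["service"])  # 2 api
```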

     

    What you will get:
    ● Competitive salary and good compensation package;
    ● Exciting, challenging and stable startup projects with a modern stack;
    ● Corporate English course;
    ● Ability to practice English and communication skills through permanent interaction with clients from all over the world;
    ● Professional study compensation, online courses and certifications;
    ● Career development opportunity, semi-annual and annual salary review process;
    ● Necessary equipment to perform work tasks;
    ● VIP medical insurance or sports coverage;
    ● Informal and friendly atmosphere;
    ● The ability to focus on your work: a lack of bureaucracy and micromanagement;
    ● Flexible working hours (start your day between 8:00 and 11:30);
    ● Team buildings, corporate events;
    ● Paid vacation (18 working days) and sick leaves;
    ● Cozy offices in 2 cities (Kyiv & Lviv) with electricity and Wi-Fi (generator & Starlink);
    ● Compensation for coworking (except for employees from Kyiv and Lviv);
    ● Corporate lunch + soft skills clubs;
    ● Unlimited work from home from anywhere in the world (remote);
    ● Geniusee has its own charity fund.

  • Β· 106 views Β· 10 applications Β· 10d

    Data Engineer

    Part-time Β· Full Remote Β· Countries of Europe or Ukraine Β· Product Β· 2 years of experience Β· Intermediate

    Novoplex is a group of companies that develop iGaming products and provide services in various areas of performance marketing.

    We are looking for an experienced Data Engineer to design efficient data workflows and ensure data reliability across our systems.

    Key Responsibilities:
    - Design, build, and maintain scalable data pipelines using Airflow and Python.
    - Develop and optimize SQL queries for data transformation, analysis, and reporting in BigQuery.
    - Ensure data quality, reliability, and integrity across the pipeline.
    - Automate data ingestion from various sources (APIs, cloud storage, etc.).
    - Monitor and troubleshoot data pipeline performance and failures.
    - Collaborate with analysts and stakeholders to understand data needs.
    - Implement data governance best practices (logging, monitoring, versioning).
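The "data quality, reliability, and integrity" responsibility above often amounts to validation gates between pipeline stages. A minimal sketch, assuming a hypothetical schema (this is illustrative, not the team's actual code):

```python
# Sketch of a data-quality gate an Airflow task might run before loading
# rows into BigQuery: split incoming rows into valid and rejected batches.
# Field names are assumptions for the example.

def validate_rows(rows, required_fields=("user_id", "event_ts")):
    """Return (valid, rejected); a row is rejected if any required field is missing or None."""
    valid, rejected = [], []
    for row in rows:
        if all(row.get(f) is not None for f in required_fields):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

rows = [
    {"user_id": 1, "event_ts": "2024-01-01T00:00:00Z"},
    {"user_id": None, "event_ts": "2024-01-01T00:00:05Z"},  # fails null check
]
valid, rejected = validate_rows(rows)
print(len(valid), len(rejected))  # 1 1
```

In practice the rejected batch would be logged and routed to a dead-letter location rather than silently dropped, which is what the logging/monitoring bullet above points at.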

    Requirements:
    - 2–4 years of experience as a data engineer or similar role.
    - Strong Python skills, especially for scripting and data processing.
    - Expertise in SQL (especially analytical queries and data modeling).
    - Experience with Google BigQuery: data loading, partitioning, and performance tuning.
    - Solid understanding and experience with Apache Airflow: DAG design, scheduling, and troubleshooting.
    - Hands-on experience with Google Cloud Platform (GCP) services like:
    β€’ Cloud Storage
    β€’ Cloud Functions (optional)
    β€’ Pub/Sub (nice to have)
    β€’ Dataflow (bonus)
    - Familiarity with ETL/ELT best practices and orchestration patterns.
    - Experience working with version control systems (e.g., Git).
    - Comfortable working in CI/CD environments.

    We offer:
    - 4 hours working day from Monday to Friday.
    - 10 working days of annual paid vacations.
    - 3 working days of sick leave per year without a medical certificate; unlimited with a medical certificate.
    - Working equipment provision.
    - Open-minded and engaged team. 


     

    Our hiring process:
    - HR Screening
    - General/Technical Interview – with a Data Engineer and the CTO
    - Final Interview – with the Head of Analytics and the CTO
    - OfferπŸ₯³
  • Β· 73 views Β· 0 applications Β· 8d

    Middle Data Engineer

    Full Remote Β· Countries of Europe or Ukraine Β· 2 years of experience Β· Intermediate

    Dataforest is looking for a Middle Data Engineer to join our team and work on the Dropship project β€” a cutting-edge data intelligence platform for e-commerce analytics. You will be responsible for developing and maintaining a scalable data architecture that powers large-scale data collection, analysis, and integrations. We are waiting for your CV!

    Requirements:

    • 2+ years of commercial experience with Python.
    • Experience working with PostgreSQL databases.
    • Experience with monitoring tools, ideally CloudWatch, Prometheus, or Grafana.
    • Familiarity with data structures and algorithms, able to choose suitable solutions for common data processing tasks.
    • Excellent programming skills in Python with a strong emphasis on optimization and code structuring.
    • Understanding of ETL principles.
    • Strong teamwork and communication abilities, ready to learn from and assist team members.
    • Experience working with Linux environments, cloud services (AWS), and Docker.
    • Ability to manage your tasks and work proactively under supervision.
       

    Nice to Have:

    • Experience in web scraping, data extraction, cleaning, and visualization.
    • Understanding of multiprocessing/multithreading in Python.
    • Familiarity with Redis.
    • Experience with Flask / Flask-RESTful for API development.
    • Knowledge and experience with Kafka.
       

    Key Responsibilities:

    • Develop and maintain a robust data processing architecture using Python.
    • Design and manage data pipelines using Kafka and SQS.
    • Optimize code for better performance and maintainability.
    • Design and implement efficient ETL processes.
    • Work with AWS technologies to ensure flexible and reliable data processing systems.
    • Collaborate with colleagues, actively participate in code reviews, and improve technical knowledge.
    • Take responsibility for your tasks and suggest improvements to processes and systems.

      We offer:
    • Working in a fast growing company;
    • Great networking opportunities with international clients, challenging tasks;
    • Personal and professional development opportunities;
    • Competitive salary fixed in USD;
    • Paid vacation and sick leaves;
    • Flexible work schedule;
    • Friendly working environment with minimal hierarchy;
    • Team building activities, corporate events.
  • Β· 46 views Β· 4 applications Β· 3d

    Data Engineer (with Azure)

    Full Remote Β· Countries of Europe or Ukraine Β· 2 years of experience Β· Upper-Intermediate

    Would you like to increase your cloud expertise? We’re looking for a Data Engineer to join an international cloud technology company.

    This is a leading Microsoft & Azure partner providing cloud services in Europe and East Asia.

    Working with different customer domains + the most professional team – growth! Let’s discuss.

     

    Main Responsibilities:

    The Data Engineer is responsible for helping select, deploy, and manage the systems and infrastructure required for a data processing pipeline that supports customer requirements.

     

    You will work with cutting-edge cloud technologies, including Microsoft Fabric, Azure Synapse Analytics, Apache Spark, Data Lake, Databricks, Data Factory, Cosmos DB, HDInsight, Stream Analytics, and Event Grid, on implementation projects for corporate clients across the EU, CIS, the United Kingdom, and the Middle East.

    Our ideal candidate is a professional who is passionate about technology, curious, and self-motivated.

     

    Responsibilities revolve around DevOps and include implementing ETL pipelines, monitoring and maintaining data pipeline performance, and model optimization.

     

    Mandatory Requirements:

    – 2+ years of experience, ideally within a Data Engineer role.

    – Understanding of data modeling, data warehousing concepts, and ETL processes

    – Experience with Azure Cloud technologies

    – Experience in distributed computing principles and familiarity with key architectures; broad experience across a set of data stores (Azure Data Lake Store, Azure Synapse Analytics, Apache Spark, Azure Data Factory)

    – Understanding of landing and staging areas, data cleansing, data profiling, data security, and data architecture concepts (DWH, Data Lake, Delta Lake/Lakehouse, Datamart)

    – SQL skills

    – Communication and interpersonal skills

    – English: B2

    – Ukrainian language

     

    Experience in SQL migration from on-premises to the cloud, data modernization and migration, or advanced analytics projects, and/or a professional certification in data & analytics, will be a plus.

     

    We offer:

    – professional growth and international certification

    – free technical and business training and the best bootcamps (worldwide, including courses at Microsoft HQ in Redmond)

    – innovative data & analytics projects and practical experience with cutting-edge Azure data & analytics technologies on various customer projects

    – great compensation and individual bonus remuneration

    – medical insurance

    – long-term employment

    – individual development plan

  • Β· 28 views Β· 2 applications Β· 2d

    Middle Data Engineer

    Full Remote Β· Georgia, Hungary, Kazakhstan, Poland, Serbia Β· 2 years of experience Β· Upper-Intermediate

    One of the leading digital health and wellness companies with millions of global customers is looking for a Middle BI/Data Engineer. The company offers robust digital capabilities and provides opportunities for entrepreneurial talents and visionaries to find new, effective ways to help people live healthier lives. 

    Responsibilities:

    • Employ GCP tools (dbt, Airflow, and Looker) to enhance data quality, efficiency, and delivery of accurate and timely data.
    • Identify, investigate, and solve data issues: data quality, data discrepancies, and missing data.
    • Contribute to the development and improvement of data solutions.
    • Leverage the power of dbt for overcoming complex modelling problems with a focus on performance, robustness, and scalability.
    • Work on prevention and alerting solutions.
    • Adopt and refine our best practices, e.g., naming convention, data modeling, and data quality testing.
    • Communicate with cross-functional teams and non-technical stakeholders in a clear and structured manner.
    • Assist and support other team members in the design, development, and implementation of data warehousing, reporting, and analytics solutions.

     

    Technological stack: ETL, SQL, Airflow.
     

    Requirements

    • 2+ years of proven SQL experience: ability to join and manipulate data of various types (String, Integer, JSON, Array), write parameterized scripts, and debug SQL code.
    • Understanding of ETL/ELT concepts and modern data transformation practices.
    • Ability to understand, address, and clearly communicate data-related challenges from both technical and business perspectives.
    • Upper-Intermediate English proficiency (ability to pass a technical interview in English and collaborate effectively on an international team).
    • Strong soft skills: communication, proactivity, and a positive attitude.


    Benefits:

    • Fully remote collaboration
    • 20 business days of paid annual vacation
    • Competitive compensation package
    • Needed software license reimbursement
    • Collaborative and supportive team, where initiative is always valued
    • Annual salary review based on performance and contribution

       

    Our hiring process:

    • Screening with a recruiter (30 minutes)
    • English check (15 minutes) 
    • Technical interview (1-1,5 hour) 
    • Interview with the client (40 minutes - 1 hour)