Jobs
· 60 views · 7 applications · 12d
Middle Data Engineer
Full Remote · Countries of Europe or Ukraine · Product · 2 years of experience · Intermediate
At Promova, we're redefining language education to make it accessible, personal, and effective for today's fast-paced world. Our growing team of 150+ professionals is on a mission to connect people, bridge cultures, and empower lifelong learners, reaching every country except aggressor states (yes, even Antarctica).
We blend AI-driven innovation with human expertise to create tools that help people speak with confidence, embrace new cultures, and truly belong in any language. As part of our team, you'll make a real impact, work in an environment built on care, quality, and creativity, and grow alongside a community that values progress.
With flexible work options, comprehensive benefits, and endless opportunities for growth, Promova is more than a workplace: it's a movement.
If you're ready to help reimagine language learning for today's world, let's do it together!
We're looking for a highly motivated and experienced Data Engineer to build and optimize our data infrastructure. If you're passionate about data architecture, ETL development, and driving data quality, this is your opportunity to make a real impact.
About You:
- 2+ years of experience as a Data Engineer or Python Backend Developer.
- Proficiency in SQL and Data Modeling for data warehouses.
- Strong focus on data quality and commitment to building solutions with high data integrity.
- Experience with Dataform or DBT.
- Solid Python skills, especially with a background in backend development.
- Hands-on experience working with MongoDB or other databases (such as DynamoDB, Firebase Firestore, etc.).
- Ability to adapt to new challenges, work proactively, and collaborate effectively within a team or independently.
- English level: Intermediate (B1+) or higher
Nice to Have:
- Proven experience in building, optimizing, and maintaining ETL pipelines.
- Familiarity with Airflow and other modern data orchestration tools.
- Proficiency in working with Git and a basic understanding of Docker.
Your Areas of Impact:
- Build and optimize data architecture: Design and develop scalable and reliable DWH architecture and data models (BigQuery).
- Maintain data accuracy: Monitor, maintain, and document data models (DataMart) to ensure high-quality, trustworthy data.
- ETL development: Build and optimize ETL pipelines to process data from internal databases (MongoDB, PostgreSQL) and external APIs, leveraging tools like Airflow and Fivetran.
- Optimize existing processes: Continuously enhance the performance, reliability, and scalability of existing data pipelines and architecture.
- Cloud service configuration: Manage and configure cloud infrastructure (GCP), ensuring optimal performance and cost-efficiency.
- Support data-driven decision-making: Enable the business with timely, accurate, and accessible data to power insights and innovation.
What We Offer:
- Opportunity to work with a modern technology stack, including Google Cloud, BigQuery, Airflow, and Tableau.
- Ability to influence product development and participate in technology stack selection decisions.
- Access to a large volume of transactional and product data for analysis and optimization.
- A team of high-performing, results-driven professionals, eager to share knowledge and expertise.
- Full ownership of the entire product lifecycle within the team (development, promotion, monetization), providing the opportunity to understand the business holistically, work with metrics across all stages of the user journey, and drive impact through analytics.
- Cross-platform work environment, with products available on iOS, Android, and Web.
Corporate Benefits:
Growth: to help you develop your skills, advance your career, and reach your full potential, we offer compensation for additional training at external events and seminars; access to a large electronic library; paid online courses and conferences; Promova English Group; English Classes; Promova Speaking Club; and access to Promova Premium.
Wellbeing: to support your overall health, happiness, and resilience, we offer the option to work remotely from any safe location worldwide; a flexible work schedule; 20 paid vacation days per year; an unlimited number of sick days; medical insurance coverage; mental health support; power station reimbursement; and employee discounts and special benefits for remote employees.
Fun & Activities: to foster informal communication and strengthen social connections among teammates, we offer compensation for remote team gatherings and team-building events.
Interview Process:
- Pre-screen with Recruiter (40 minutes)
- Interview with the Hiring Manager (1.5 hours)
- Test Task
- Bar-raising (1 hour)
Hit the apply button and let's create unicorns together!
-
· 16 views · 1 application · 12d
Senior Data Engineer with Snowflake
Full Remote · Ukraine · 7 years of experience · Upper-Intermediate
Project Description:
As a Data & Application Engineer for FP&A, you are responsible for the engineering team and the technology that the team owns. You will not only work as a coach for your team but also as a technical leader, ensuring that the right technical decisions are made when building our data and reporting product(s).
As a data and analytics team, we are responsible for building a cloud-based Data Platform for BHI Global Services and its stakeholders across brands. We aim to provide our end users from different Finance departments (e.g., Risk, FP&A, Tax, Order to Cash) the best possible platform for all of their Analytics, Reporting & Data needs.
Collaborating closely with a talented team of engineers and product managers, you'll lead the on-time delivery of features that meet the evolving needs of our business. You will be responsible for conceptualizing, designing, building, and maintaining data services through data platforms for the assigned business units. Together, we'll tackle complex engineering challenges to ensure seamless operations at scale and in (near) real time.
If you're passionate about owning end-to-end solution delivery, planning for the future, and driving innovation, and you thrive in a fast-paced environment, join us in shaping the future of the Data and Analytics team!
Responsibilities:
Strategy and Project Delivery
- Together with business Subject Matter Experts and the Product Manager, conceptualize, define, shape, and deliver the roadmap for achieving company priorities and objectives
- Lead business requirement gathering sessions and translate the results into an actionable delivery backlog for the team to build
- Lead technical decisions in the process to achieve excellence and contribute to organizational goals
- Lead the D&A teams in planning and scheduling the delivery process, including defining project scope, milestones, risk mitigation, and timeline management, allocating tasks to team members, and ensuring that the project stays on track
- Take full responsibility for the D&A teams' successful, on-time delivery of new products, setting up end-to-end processes and operational plans, e.g., collecting user requirements, designing, building, and testing the solution, and ops maintenance
- Act as a technical leader with strategic thinking for the team and the organization; a visionary who can deliver strategic projects and products for the organization
- Own the data engineering processes and architecture across the teams
Technology, Craft & Delivery
- Experience in designing and architecting data engineering frameworks that deal with high volumes of data
- Experience in large-scale data processing and workflow management
- Demonstrated technology leadership
- Ownership of engineering delivery, quality, and practices within your own team
- Participation in defining, shaping, and delivering the wider engineering strategic objectives
- Ability to get into the technical detail (where required) to coach, support, and mentor the team
- Drive a culture of ownership and technical excellence, including reactive work such as incident escalations
- Learn new technologies and keep abreast of existing ones so you can share learnings and apply them across a variety of projects when needed
Mandatory Skills Description:
Role Qualifications and Requirements:
- Bachelor's degree
- At least 5 years of experience leading and managing one or more teams of engineers in a fast-paced and complex environment to deliver complex projects or products on time and with demonstrable positive results
- 7+ years' experience with data at scale, using Kafka, Spark, Hadoop/YARN, MySQL (CDC), Airflow, Snowflake, S3, and Kubernetes
- Solid working experience with data engineering platforms involving languages like PySpark, Python, or other equivalent scripting languages
- Experience with public cloud platforms and services such as AWS and Snowflake
- Experience working in complex stakeholder organizations
- A deep understanding of software or big data solution development in a team, and a track record of leading an engineering team in developing and shipping data products and solutions
- Strong technical skills (coding & system design) with the ability to get hands-on with your team when needed
- An excellent communicator with strong stakeholder management experience, good commercial awareness, and technical vision
- You have driven successful technical, business, and people-related initiatives that improved productivity, performance, and quality
- You are a humble and thoughtful technology leader; you lead by example and gain your teammates' respect through actions, not the title
- Exceptional and demonstrable leadership capabilities in creating unified and motivated engineering teams
- Languages:
- English: B2 Upper Intermediate
-
· 66 views · 2 applications · 12d
Middle Strong/Senior Data Engineer
Full Remote · Ukraine · 2 years of experience · Upper-Intermediate
Our mission at Geniusee is to help businesses thrive through tech partnership and to strengthen the engineering community by sharing knowledge and creating opportunities. Our values are Continuous Growth, Team Synergy, Taking Responsibility, Conscious Openness, and Result Driven. We offer a safe, inclusive, and productive environment for all team members, and we're always open to feedback.
If you want to work from home or in the city center of Kyiv, great: apply right now.
About the project:
Generative AI technologies are rapidly changing how digital content is created and consumed. However, many of these systems are trained on vast amounts of data, including articles, videos, and other creative works, often without the knowledge or consent of the original creators. As a result, publishers, journalists, and content producers face the risk of losing both visibility and critical revenue streams such as advertising, subscriptions, and licensing.
Our project addresses this issue by developing a system that allows AI platforms to identify when specific content has influenced a generated result. This enables transparent attribution and the possibility for content creators to receive compensation based on how often their work is used. The goal is to build a sustainable ecosystem where creators are fairly rewarded, while AI-generated content remains trustworthy and ethically grounded.
Requirements:
- 3+ years of experience in Data Engineering;
- Solid Python programming skills, especially in data processing and system automation;
- Proven experience with Airflow, Kubeflow, or Kafka for orchestrating data workflows;
- Familiarity with search engine concepts and indexing;
- Experience working with structured and semi-structured web data (HTML, JSON, APIs);
- Ability to work with large-scale distributed systems and cloud platforms (e.g., AWS, GCP, Azure);
- English: Upper-Intermediate+.
What you will get:
- Competitive salary and a good compensation package;
- Exciting, challenging, and stable startup projects with a modern stack;
- Corporate English course;
- Ability to practice English and communication skills through constant interaction with clients from all over the world;
- Professional study compensation, online courses, and certifications;
- Career development opportunities, with semi-annual and annual salary reviews;
- Necessary equipment to perform work tasks;
- VIP medical insurance or sports coverage;
- Informal and friendly atmosphere;
- The ability to focus on your work: a lack of bureaucracy and micromanagement;
- Flexible working hours (start your day between 8:00 and 11:30);
- Team buildings and corporate events;
- Paid vacation (18 working days) and sick leaves;
- Cozy offices in 2 cities (Kyiv & Lviv) with electricity and Wi-Fi (generator & Starlink);
- Compensation for coworking (except for employees from Kyiv and Lviv);
- Corporate lunch + soft skills clubs;
- Unlimited work from home from anywhere in the world (remote);
- Geniusee has its own charity fund.
-
· 22 views · 0 applications · 12d
Technical Lead/Senior Data Engineer
Full Remote · Ukraine · 7 years of experience · Upper-Intermediate
Project Description:
As a Data & Application Engineer for FP&A, you are responsible for the engineering team and the technology that the team owns. You will not only work as a coach for your team but also as a technical leader, ensuring that the right technical decisions are made when building our data and reporting product(s).
As a data and analytics team, we are responsible for building a cloud-based Data Platform for BHI Global Services and its stakeholders across brands. We aim to provide our end users from different Finance departments (e.g., Risk, FP&A, Tax, Order to Cash) the best possible platform for all of their Analytics, Reporting & Data needs.
Collaborating closely with a talented team of engineers and product managers, you'll lead the on-time delivery of features that meet the evolving needs of our business. You will be responsible for conceptualizing, designing, building, and maintaining data services through data platforms for the assigned business units. Together, we'll tackle complex engineering challenges to ensure seamless operations at scale and in (near) real time.
If you're passionate about owning end-to-end solution delivery, planning for the future, and driving innovation, and you thrive in a fast-paced environment, join us in shaping the future of the Data and Analytics team!
Responsibilities:
Strategy and Project Delivery
- Together with business Subject Matter Experts and the Product Manager, conceptualize, define, shape, and deliver the roadmap for achieving company priorities and objectives
- Lead business requirement gathering sessions and translate the results into an actionable delivery backlog for the team to build
- Lead technical decisions in the process to achieve excellence and contribute to organizational goals
- Lead the D&A teams in planning and scheduling the delivery process, including defining project scope, milestones, risk mitigation, and timeline management, allocating tasks to team members, and ensuring that the project stays on track
- Take full responsibility for the D&A teams' successful, on-time delivery of new products, setting up end-to-end processes and operational plans, e.g., collecting user requirements, designing, building, and testing the solution, and ops maintenance
- Act as a technical leader with strategic thinking for the team and the organization; a visionary who can deliver strategic projects and products for the organization
- Own the data engineering processes and architecture across the teams
Technology, Craft & Delivery
- Experience in designing and architecting data engineering frameworks that deal with high volumes of data
- Experience in large-scale data processing and workflow management
- Demonstrated technology leadership
- Ownership of engineering delivery, quality, and practices within your own team
- Participation in defining, shaping, and delivering the wider engineering strategic objectives
- Ability to get into the technical detail (where required) to coach, support, and mentor the team
- Drive a culture of ownership and technical excellence, including reactive work such as incident escalations
- Learn new technologies and keep abreast of existing ones so you can share learnings and apply them across a variety of projects when needed
Mandatory Skills Description:
Role Qualifications and Requirements:
- Bachelor's degree
- At least 5 years of experience leading and managing one or more teams of engineers in a fast-paced and complex environment to deliver complex projects or products on time and with demonstrable positive results
- 7+ years' experience with data at scale, using Kafka, Spark, Hadoop/YARN, MySQL (CDC), Airflow, Snowflake, S3, and Kubernetes
- Solid working experience with data engineering platforms involving languages like PySpark, Python, or other equivalent scripting languages
- Experience with public cloud platforms and services such as AWS and Snowflake
- Experience working in complex stakeholder organizations
- A deep understanding of software or big data solution development in a team, and a track record of leading an engineering team in developing and shipping data products and solutions
- Strong technical skills (coding & system design) with the ability to get hands-on with your team when needed
- An excellent communicator with strong stakeholder management experience, good commercial awareness, and technical vision
- You have driven successful technical, business, and people-related initiatives that improved productivity, performance, and quality
- You are a humble and thoughtful technology leader; you lead by example and gain your teammates' respect through actions, not the title
- Exceptional and demonstrable leadership capabilities in creating unified and motivated engineering teams
- Languages:
- English: B2 Upper Intermediate
-
· 34 views · 1 application · 12d
Data Engineer 2070/06 to $5500
Office Work · Poland · 3 years of experience · Upper-Intermediate
Our partner is a leading programmatic media company, specializing in ingesting large volumes of data, modeling insights, and offering a range of products and services across Media, Analytics, and Technology. Among their clients are well-known brands such as Walmart, Barclaycard, and Ford.
The company has expanded to over 700 employees, with 15 global offices spanning four continents. With the imminent opening of a new office in Warsaw, we are seeking experienced
Data Engineers to join their expanding team.
The Data Engineer will be responsible for developing, designing, and maintaining end-to-end optimized, scalable Big Data pipelines for our products and applications. In this role, you will collaborate closely with team leads across various departments and receive support from peers and experts across multiple fields.
Opportunities:
- Possibility to work in a successful company
- Career and professional growth
- Competitive salary
- Hybrid work model (3 days per week work from office space in the heart of Warsaw city)
- Long-term employment with 20 working days of paid vacation, sick leaves, and national holidays
Responsibilities:
- Follow and promote best practices and design principles for Big Data ETL jobs
- Help in technological decision-making for the business's future data management and analysis needs by conducting POCs
- Monitor and troubleshoot performance issues on data warehouse/lakehouse systems
- Provide day-to-day support of data warehouse management
- Assist in improving data organization and accuracy
- Collaborate with data analysts, scientists, and engineers to ensure best practices in terms of technology, coding, data processing, and storage technologies
- Ensure that all deliverables adhere to our world-class standards
Skills:
- 3+ years of overall experience in Data Warehouse development and database design
- Deep understanding of distributed computing principles
- Experience with AWS cloud platform, and big data platforms like EMR, Databricks, EC2, S3, Redshift
- Experience with Spark, PySpark, Hive, Yarn, etc.
- Experience in SQL and NoSQL databases, as well as experience with data modeling and schema design
- Proficiency in programming languages such as Python for implementing data processing algorithms and workflows
- Experience with Presto and Kafka is a plus
- Experience with DevOps practices and tools for automating deployment, monitoring, and management of big data applications is a plus
- Excellent communication, analytical, and problem-solving skills
- Knowledge of scalable service architecture
- Experience in scalable data processing jobs on high-volume data
- Self-starter, proactive, and able to work to deadlines
- Nice to have: experience with Scala
If you are looking for an environment where you can grow professionally, learn from the best in the field, balance work and life, and enjoy a pleasant and enthusiastic atmosphere, submit your CV today and become part of our team!
Everything you do will help us lead the programmatic industry and make it better.
-
· 81 views · 16 applications · 12d
Data Engineer
Countries of Europe or Ukraine · Product · 3 years of experience · Intermediate · Ukrainian Product
Headway Inc is a global tech company, revolutionizing lifelong learning by creating digital products for over 150 million users worldwide. Our mission is to help people grow. We're proud to be ranked 4th among the World's Top EdTech Companies by TIME magazine. We believe lifelong learning should be accessible, personalized, and impactful to each individual. That's how we change the world and why we bring together exceptional minds.
The core of our achievements is our team. We believe in people and our shared values, SELECT. That's why, together with Yuliya Savchuk, Engineering Manager of the MIT team, we're looking for a Data Engineer to join our team of superstars transforming the EdTech industry.
About the role:
With business scaling, we see the need to strengthen the team that is working on building a data analytics platform for Headway Inc. We need to ensure that every business area and our products have reliable data to drive deep insights and innovation.
Data is at the core of our company. You will build and maintain a reliable, efficient, and scalable data infrastructure that enables Headway Inc to leverage data as a strategic asset for informed decision-making, driving innovation, and achieving business goals.
What awaits you on our team:
- Have the opportunity to join the team of a global EdTech company that creates socially impactful products for the international market.
- Have the opportunity to collaborate with a large team of analysts and marketers to create solutions that have a direct and tangible impact on their work.
- You'll be able to use a wide variety of modern tools and independently decide which technologies are most appropriate to apply.
- We work in an atmosphere of freedom and responsibility.
- Your decisions and ideas will actively impact the business. You'll own the full development lifecycle, from solution design through to user feedback and iteration.
What will you do:
At MIT, the Engineering team develops data platforms and automation tools that help teams work more efficiently and make informed marketing decisions. We create solutions that allow us to analyze and utilize data for effective decision-making in marketing strategies, improving results and increasing return on investment.
- Communicate and collaborate with the analytics team, being responsible for delivering data to the analytical database for visualization.
- Create and maintain optimal and scalable pipeline architecture. Develop new pipelines and refine existing ones.
- Develop ETL/ELT processes and Data Lake architecture.
- Research and collect large, complex data sets.
- Identify, design, and implement internal process improvements.
- Continuously learn, develop, and utilize cutting-edge technologies.
What do you need to join us:
- Experience in production development and knowledge of any programming language, including Python, Golang, Java, etc.
- Understanding of Data Lakes, Data Warehousing, OLAP/OLTP approaches, and ETL/ELT processes.
- Proficiency in SQL and experience working with databases.
- Workflow orchestration experience.
- Problem-solving skills and a passion for creating efficient, well-tested, and maintainable solutions.
- Alignment with the values of our team (SELECT).
Good to have:
- Experience with GCP Data Services and Airflow.
- Experience with CI/CD in Data Engineering.
- Knowledge of Data Governance and Security principles.
- Experience optimizing data pipeline performance.
- Experience in MarTech or AdTech platforms, like marketing campaign orchestration.
What do we offer:
- Work within an ambitious team on a socially impactful education product.
- An office with a reliable shelter, generators, satellite internet, and other amenities.
- Access to our corporate knowledge base and professional communities.
- Personal development plan.
- Partial compensation for English language learning, external training, and courses.
- Medical insurance coverage with a $70 employee contribution and full sick leave compensation.
- Company doctor and massage in the office.
- Sports activities: running, yoga, boxing, and more.
- Corporate holidays: twice a year, we go on a week-long paid holiday to rest and recharge.
- Supporting initiatives that help Ukraine. Find out more about our projects here.
Working schedule:
This is a full-time position with a hybrid remote option. It means that you can decide for yourself: whether you want to work from the office, remotely, or combine these options.
Are you interested?
Send your CV!
-
· 22 views · 4 applications · 11d
Data Engineering Team Lead
Poland · 5 years of experience · Upper-Intermediate
About Us
We are a leading Israeli IT company with 15 years of market experience and 8 years in Ukraine. Officially registered in Ukraine, Israel, and Estonia, we employ over 100 professionals worldwide. Specializing in successful startup collaboration, we offer services across e-commerce, Fintech, logistics, and healthcare.
Our client is a leading mobile app company that depends on high-volume, real-time data pipelines to drive user acquisition and engagement. This role is instrumental in maintaining data reliability, supporting production workflows, and enabling operational agility across teams. This is a hands-on leadership role that requires deep technical expertise, an ownership mindset, and strong collaboration with engineering and business stakeholders.
Key Requirements:
- 5+ years of experience in data engineering, with strong hands-on expertise in building and maintaining data pipelines;
- At least 2 years in a team leadership or technical lead role;
- Proficiency in Python, SQL, and data orchestration tools such as Airflow;
- Experience with both SQL and NoSQL databases, such as MySQL, Presto, Couchbase, MemSQL, or MongoDB;
- Bachelor's degree in Computer Science, Engineering, or a related field;
- English: Upper-Intermediate or higher.
Will be plus:
- A background in NOC or DevOps environments is a plus;
- Familiarity with PySpark is an advantage.
What you will do:
- Oversee daily data workflows, troubleshoot failures, and escalate critical issues to ensure smooth and reliable operations;
- Use Python, SQL, and Airflow to configure workflows, extract client-specific insights, and adjust live processes as needed;
- Build and maintain automated data validation and testing frameworks to ensure data reliability at scale;
- Own and evolve the metadata system, maintaining table lineage, field definitions, and data usage context to support a unified knowledge platform;
- Act as the primary point of contact for operational teams and stakeholders, ensuring consistent collaboration and high data quality across the organization.
Interview stages:
- HR Interview;
- Pro-Interview;
- Technical Interview;
- Final Interview;
- Reference Check;
- Offer.
Why Join Us?
- Be part of a friendly international team, working together on interesting global projects;
- Enjoy many chances to grow, learn from mentors, and work on projects that make a real difference;
- Join a team that loves fresh ideas and supports creativity and new solutions;
- Work closely with clients, building great communication skills and learning directly from their needs;
- Thrive in a workplace that values your needs, offering flexibility and a good balance between work and life.
-
· 38 views · 0 applications · 11d
Lead Data Engineer (ETL)
Full Remote · Ukraine, Poland · 5 years of experience · Upper-Intermediate
Description:
Our client is a worldwide enterprise company. The product you will be working with provides management and data processing/handling capabilities for networks of the client's scientific lab equipment, such as microscopes. The main goals are:
Collection and centralized management of data outputs (measurement results, etc.) provided by the client's devices
Outdated data utilization
Managing large volumes of data acquired from measurement devices in the cloud securely and reliably
Seamless sharing of measurement data with collaborators
The ability to share measurement results and accelerate customer service.
Requirements:
We are looking for a Lead Data Engineer with at least 6 years of commercial experience in the development of data platforms for enterprise applications, and with the experience to lead a team of engineers and take responsibility for the technical solution.
- Proficiency in Airflow for workflow orchestration, dbt for data transformation, and SQL for data querying and manipulation.
- Experience in data modeling, ETL (Extract, Transform, Load) processes, and data warehousing concepts.
- Familiarity with cloud platforms (AWS) and their data services.
- Excellent analytical and problem-solving skills with meticulous attention to detail.
- Strong communication and collaboration skills with the ability to lead and motivate cross-functional teams.
Good to have: the ability to participate in onsite meetings.
Job responsibilities:
- Implement new solutions in the current system, both by refactoring and from scratch;
- Prepare technical documentation;
- Participate in client meetings to understand business and user requirements and estimate tasks;
- Collaborate closely with other engineers, product owners, and testers to identify and solve challenging problems;
- Take part in defect investigation, bug fixing, and troubleshooting.
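As a rough illustration of the ETL fundamentals this role calls for, here is a minimal extract-transform-load sketch in plain Python, with an in-memory SQLite database standing in for the warehouse; the measurement-record fields and the unit conversion are invented for the example:

```python
import sqlite3

def extract(rows):
    """Extract: yield raw measurement records (stubbed source)."""
    yield from rows

def transform(record):
    """Transform: drop incomplete records and normalize units (mm -> um, assumed)."""
    if record.get("value") is None:
        return None
    return {"device_id": record["device_id"].strip().upper(),
            "value_um": round(record["value"] * 1000, 3)}

def load(conn, records):
    """Load: insert cleaned records into a warehouse staging table."""
    conn.execute("CREATE TABLE IF NOT EXISTS measurements (device_id TEXT, value_um REAL)")
    conn.executemany("INSERT INTO measurements VALUES (:device_id, :value_um)", records)
    conn.commit()

raw = [{"device_id": " mic-01 ", "value": 0.25},
       {"device_id": "mic-02", "value": None}]  # incomplete -> filtered out
conn = sqlite3.connect(":memory:")
load(conn, [r for r in (transform(x) for x in extract(raw)) if r])
print(conn.execute("SELECT device_id, value_um FROM measurements").fetchall())
# [('MIC-01', 250.0)]
```

In a production setup the same three stages would typically be orchestrated as Airflow tasks, with the transform expressed as a dbt model rather than inline Python.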
· 84 views · 6 applications · 11d
Data Engineer
Office Work · Ukraine (Kyiv) · Product · 2 years of experience · Intermediate
3Shape develops 3D scanners and software solutions that enable dental and hearing professionals to treat more people, more effectively. Our products are market-leading innovative solutions that make a real difference in the lives of both patients and dental professionals around the world.
3Shape is headquartered in Copenhagen, with development teams in Denmark, Ukraine, North Macedonia and with a production site in Poland.
We are a global company with a presence in Europe, Asia and the Americas. Founded in 2000, today we provide services to customers in over 130 countries. Our growing talent pool of over 2,500 employees spans 45+ nationalities.
3Shape as an employer is committed to Ukraine. Our UA office was founded in 2006, and we are continuing to grow, hire and take care of our employees even during the war in Ukraine. Among other actions, we support our contractors who are called up for military service, and we care about our colleagues' mental health by implementing different activities.
If you are looking for stability in your future, we are the right place for you.
About the role:
The Customer Data Strategy is a high-priority initiative with significant potential and senior management buy-in. Join our expanding team that currently includes a Data Analyst, Data Engineer, Data Architect, and Manager.
Key responsibilities:
- Develop and optimize Azure Databricks in collaboration with cross-functional teams to enable a 'one-stop-shop' for analytical data
- Translate customer-focused commercial needs into concrete data products
- Build data products to unlock commercial value and help integrate systems
- Coordinate technical alignment meetings between functions
- Act as customer data ambassador to improve 'data literacy' across the organization
Your profile:
- 1-2 years of experience working with data engineering in a larger organization, tech start-up, or as an external consultant
- Extensive experience with Azure Databricks, Apache Spark, and Delta Lake
- Proficiency in Python, PySpark and SQL
- Experience with optimizing and automating data engineering processes
- Familiarity with GitHub and GitHub Actions for CI/CD processes
- Knowledge of Terraform as a plus
Being part of us means:
- Make an impact in one of the most exciting Danish tech companies in the medical device industry
- Work on solutions used by thousands of dental professionals worldwide
- Be part of 3Shape's continued accomplishments and growth
- Contribute to meaningful work that changes the future of dentistry
- Develop professionally in a unique and friendly environment
- Enjoy a healthy work-life balance
- Occasional business trips to Western Europe
We offer:
- 39 hours of cooperation per week within a flexible time frame
- 24 business days of annual leave
- Medical insurance (with additional Dentistry Budget and 10 massage sessions per year included)
- Possibility of flexible remote cooperation
- Good working conditions in a comfortable office near the National Technical University "KPI", including blackout-ready infrastructure, a corporate paper book library, and a gym room with a shower
- A parking lot with free spaces for employees
- Partial compensation of lunches
- Paid sick leaves and child sick leaves
- Maternity, paternity and family issues leaves
- Well-being program: monthly well-being meetings and individual psychology hot-line
Want to join us and change the future of dentistry?
More -
· 40 views · 5 applications · 10d
Power BI Developer to $3000
Full Remote · Countries of Europe or Ukraine · 4 years of experience · Upper-Intermediate
We're implementing a Microsoft-first analytics stack, designed to integrate data from Google Forms, ESRI ArcGIS / Survey123, and other HTTP-based sources into OneLake (Microsoft Fabric), with insights delivered through Power BI and access controlled via Microsoft 365 roles.
As a Power BI Engineer, you'll own the end-to-end data pipeline, from ingestion to visualization. You'll be responsible for building connectors, modeling data in OneLake, and delivering fast, accurate, and secure dashboards.
Key Responsibilities
- Develop and maintain Dataflows, Pipelines, and Power Query connectors for various sources including Google Forms, ArcGIS REST, Survey123, CSV/JSON, and other HTTP-based feeds
- Design efficient OneLake tables and implement star-schema models for Power BI reporting
- Deliver high-quality executive dashboards and enable self-service analytics for internal users
- Optimize dataset refresh, manage incremental data loads, and configure DirectQuery/Import modes
- Implement and manage row-level and role-based security, integrated with Microsoft 365 group permissions
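To make the star-schema idea from the responsibilities above concrete, here is a small Python sketch that splits flat rows (as they might arrive from a hypothetical Survey123 feed; the column names are invented) into a dimension table with surrogate keys plus a narrow fact table:

```python
# Hypothetical flat survey rows, as they might land before modeling.
flat_rows = [
    {"region": "North", "form": "Site Audit", "score": 4},
    {"region": "North", "form": "Site Audit", "score": 5},
    {"region": "South", "form": "Site Audit", "score": 3},
]

def build_star(rows, dim_cols):
    """Split flat rows into one dimension table plus a fact table of keys and measures."""
    dim, fact = {}, []
    for row in rows:
        key = tuple(row[c] for c in dim_cols)
        dim_id = dim.setdefault(key, len(dim) + 1)  # surrogate key per distinct combo
        fact.append({"dim_id": dim_id, "score": row["score"]})
    dim_table = [{"dim_id": i, **dict(zip(dim_cols, k))} for k, i in dim.items()]
    return dim_table, fact

dim_table, fact_table = build_star(flat_rows, ["region", "form"])
print(len(dim_table), len(fact_table))  # 2 3
```

In Power BI the same shape keeps the fact table narrow while slicers filter through the dimension, which is what makes star-schema models refresh and query efficiently.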
Required Skills & Experience
- 4+ years of hands-on experience with Power BI development
- Strong knowledge of Microsoft Fabric and OneLake
- Experience building custom or REST-based Power Query connectors
- Proficiency in SQL for data modeling and performance optimization
- Practical experience with security models in Power BI, including row-level security and M365 role-based access
- Upper-intermediate or higher English for daily communication with international clients
Why Join Us?
Work on modern, mission-driven data solutions using cutting-edge Microsoft tools. Enjoy the freedom of remote work, a supportive team, and real ownership of your work.
More -
· 128 views · 17 applications · 9d
Senior Data Engineer
Full Remote · Worldwide · 2 years of experience
Responsibilities:
- Design and develop data pipelines and ETL/ELT processes to support internal data analytics and reporting.
- Build and maintain databases, with a strong understanding of both OLTP and OLAP architectures.
- Optimize data workflows to ensure reliable, accessible, and actionable data for product and analytics teams.
- Collaborate with analysts, engineers, and product teams to align data infrastructure with business goals.
- Utilize tools like Apache Spark, Spark Streaming, Kafka, Airflow, and dbt for data transformation and pipeline orchestration.
- Implement data visualization solutions using modern BI tools.
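One ETL/ELT building block behind responsibilities like these, incremental (watermark-based) extraction, can be sketched in a few lines of plain Python; the source table and timestamps are made up for the example:

```python
# Toy incremental-load step: only rows changed after the stored watermark are pulled.
source = [
    {"id": 1, "updated_at": "2024-01-01", "amount": 10},
    {"id": 2, "updated_at": "2024-01-03", "amount": 20},
    {"id": 3, "updated_at": "2024-01-05", "amount": 30},
]

def incremental_extract(rows, watermark):
    """Return rows changed after `watermark`, plus the new watermark to persist."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

batch, wm = incremental_extract(source, "2024-01-02")
print(len(batch), wm)  # 2 2024-01-05
```

Orchestrators such as Airflow typically persist the watermark between runs (e.g. in a metadata table), so each scheduled run processes only the delta instead of the full table.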
Requirements:
- 1–3 years of experience as a Data Engineer or in a related role.
- Proficiency in database design and development.
- Strong SQL skills and experience with ClickHouse, PostgreSQL, MongoDB.
- Knowledge of OLTP and OLAP architecture.
- Experience building and managing ETL/ELT pipelines.
- Hands-on experience with data orchestration tools like Apache Spark, Kafka, Airflow, and dbt.
- English level A2+; Ukrainian: native speaker.
Benefits:
- Remote or office-based work (flexible depending on your location).
- Work in a fast-paced, data-driven product team with real business impact.
- Opportunities to grow professionally, influence product decisions, and own your data architecture.
· 68 views · 6 applications · 9d
Data Engineer (Billing Automation)
Full Remote · Countries of Europe or Ukraine · Product · 2 years of experience · Intermediate
We're looking for a Data Engineer to support and analyze our Atlas CRM system. In this role, you'll be responsible for running monthly invoicing processes, generating client reports, and collaborating closely with Sales, Account Management, Billing, Legal, and Integration teams to gather and process key customer data.
Responsibilities:
- Develop and maintain Python scripts to retrieve and process data from APIs;
- Clean and transform raw data into structured formats;
- Troubleshoot and debug issues related to API requests and data processing;
- Continuously improve the project/application by optimizing performance, enhancing features, and implementing best practices;
- Generate, analyze, and visualize reports using Tableau to support business decisions;
- Create and manage dashboards, filters, and data visualizations to provide insights;
- Collaborate with teams to ensure data accuracy and system efficiency.
Requirements:
- 2+ years of working experience as a Data Engineer;
- Knowledge of Python, with a focus on data processing and automation;
- Experience working with RESTful APIs, JSON and authentication mechanisms;
- Understanding of data structures and usage of Pandas for data manipulation;
- Experience with SQL for querying and managing datasets;
- Ability to create and interpret reports and dashboards;
- Basic understanding of financial concepts and working with financial data;
- Basic understanding of SQL for working with structured data;
- Basic experience with Docker is a plus.
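A typical step in this kind of work, flattening a nested JSON API response into rows ready for SQL or Tableau, might look like the following sketch; the payload shape and field names are hypothetical:

```python
import json

# A hypothetical invoicing-API payload; field names are invented for illustration.
payload = json.loads("""
{"invoices": [
  {"id": "INV-1", "client": {"name": "Acme"}, "lines": [
     {"desc": "setup", "amount": "100.00"},
     {"desc": "fees",  "amount": "25.50"}]}
]}
""")

def flatten_invoices(data):
    """Flatten nested invoice JSON into one row per line item, with numeric amounts."""
    rows = []
    for inv in data["invoices"]:
        for line in inv["lines"]:
            rows.append({"invoice_id": inv["id"],
                         "client": inv["client"]["name"],
                         "desc": line["desc"],
                         "amount": float(line["amount"])})
    return rows

rows = flatten_invoices(payload)
print(sum(r["amount"] for r in rows))  # 125.5
```

With Pandas, a list of dicts like this drops straight into `pd.DataFrame(rows)` for further cleaning, aggregation, or export to a Tableau data source.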
Benefits:
- Be part of the international iGaming industry – Work with a top European solution provider and shape the future of online gaming;
- A Collaborative Culture – Join a supportive and understanding team;
- Competitive salary and bonus system – Enjoy additional rewards on top of your base salary;
- Unlimited vacation & sick leave – Because we prioritize your well-being;
- Professional Development – Access a dedicated budget for self-development and learning;
- Healthcare coverage – Available for employees in Ukraine and compensation across the EU;
- Mental health support – Free consultations with a corporate psychologist;
- Language learning support – We cover the cost of foreign language courses;
- Celebrating Your Milestones – Special gifts for life's important moments;
- Flexible working hours – Start your day anytime between 9:00 and 11:00 AM;
- Flexible Work Arrangements – Choose between remote, office, or hybrid work;
- Modern Tech Setup – Get the tools you need to perform at your best;
- Relocation support – Assistance provided if you move to one of our hubs.
· 27 views · 3 applications · 9d
Data Engineer
Full Remote · Poland · 4 years of experience · Upper-Intermediate
Who we are:
Adaptiq is a technology hub specializing in building, scaling, and supporting R&D teams for high-end, fast-growing product companies in a wide range of industries.
About the Product:
Our client, Harmonya, develops an AI-powered product data enrichment, insights, and attribution platform for retailers and brands. Its proprietary technology processes millions of online product listings, extracting valuable insights from titles, descriptions, ingredients, consumer reviews, and more.
Harmonya builds robust tools to help uncover insights about the consumer drivers of market performance, improve assortment and merchandising, categorize products, guide product innovation, and engage target audiences more effectively.
About the Role:
We're seeking talented data engineers to join our rapidly growing team, which includes senior software and data engineers. Together, we drive our data platform from acquisition and processing to enrichment, delivering valuable business insights. Join us in designing and maintaining robust data pipelines, making an impact in our collaborative and innovative workplace.
Key Responsibilities:
- Design, implement, and optimize scalable data pipelines for efficient processing and analysis.
- Build and maintain robust data acquisition systems to collect, process, and store data from diverse sources.
- Collaborate with DevOps, Data Science, and Product teams to understand needs and deliver tailored data solutions.
- Monitor data pipelines and production environments proactively to detect and resolve issues promptly.
- Apply best practices for data security, integrity, and performance across all systems.
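Robust data acquisition, as described above, usually means tolerating transient source failures. Below is a minimal retry-with-backoff sketch in plain Python; the flaky source is a stub standing in for a real HTTP call:

```python
import time

def fetch_with_retry(fetch, attempts=3, base_delay=0.01):
    """Call `fetch`, backing off exponentially on failure; re-raise after the last attempt."""
    for attempt in range(attempts):
        try:
            return fetch()
        except OSError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

calls = {"n": 0}
def flaky_source():
    """Stand-in for an HTTP fetch that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise OSError("transient network error")
    return {"listings": 42}

result = fetch_with_retry(flaky_source)
print(result)  # {'listings': 42}
```

In a real pipeline the same pattern is usually delegated to the orchestrator (e.g. Airflow task retries) or an HTTP client's retry adapter, but the backoff logic is the same.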
Required Competence and Skills:
- 4+ years of experience in data or backend engineering, with strong proficiency in Python for data tasks.
- Proven track record in designing, developing, and deploying complex data applications.
- Hands-on experience with orchestration and processing tools (e.g. Apache Airflow and/or Apache Spark).
- Experience with public cloud platforms (preferably GCP) and cloud-native data services.
- Bachelorβs degree in Computer Science, Information Technology, or a related field (or equivalent practical experience).
- Ability to perform under pressure and make strategic prioritization decisions in fast-paced environments.
- Strong verbal and written communication skills in English.
- Excellent communication skills and a strong team player, capable of working cross-functionally.
Nice to have:
- Familiarity with data science tools and libraries (e.g., pandas, scikit-learn).
- Experience working with Docker and Kubernetes.
- Hands-on experience with CI tools such as GitHub Actions.
Why Us?
We provide 20 days of vacation leave per calendar year (plus the official national holidays of the country you are based in).
We provide full accounting and legal support in all countries where we operate.
We utilize a fully remote work model, with a powerful workstation and co-working space in case you need it.
We offer a highly competitive package with yearly performance and compensation reviews.
· 52 views · 1 application · 9d
Middle-Senior Data Engineer (Grafana)
Full Remote Β· Ukraine Β· 2 years of experience Β· Upper-IntermediateOur mission at Geniusee is to help businesses thrive through tech partnership and strengthen the engineering community by sharing knowledge and creating opportunities Our values are Continuous Growth, Team Synergy, Taking Responsibility, Conscious...Our mission at Geniusee is to help businesses thrive through tech partnership and strengthen the engineering community by sharing knowledge and creating opportunities πΏOur values are Continuous Growth, Team Synergy, Taking Responsibility, Conscious Openness and Result Driven. We offer a safe, inclusive and productive environment for all team members, and weβre always open to feedbackπ
If you want to work from home or work in the city center of Kyiv, great – apply right now.
About the project:
Generative AI technologies are rapidly changing how digital content is created and consumed. However, many of these systems are trained on vast amounts of data, including articles, videos, and other creative works, often without the knowledge or consent of the original creators. As a result, publishers, journalists, and content producers face the risk of losing both visibility and critical revenue streams such as advertising, subscriptions, and licensing.
Our project addresses this issue by developing a system that allows AI platforms to identify when specific content has influenced a generated result. This enables transparent attribution and the possibility for content creators to receive compensation based on how often their work is used. The goal is to build a sustainable ecosystem where creators are fairly rewarded, while AI-generated content remains trustworthy and ethically grounded.
Requirements:
- 2+ years of experience in Data Engineering;
- Hands-on experience with Grafana, Loki, Promtail, and Grafana Agent;
- Strong knowledge of log processing pipelines, including log parsing, structuring, and indexing;
- Proficiency in query languages such as LogQL, PromQL, or SQL;
- Experience setting up alerting and reporting in Grafana;
- Proficiency in Python;
- English: Upper-Intermediate+.
What you will get:
- Competitive salary and good compensation package;
- Exciting, challenging and stable startup projects with a modern stack;
- Corporate English course;
- Ability to practice English and communication skills through permanent interaction with clients from all over the world;
- Professional study compensation, online courses and certifications;
- Career development opportunity, semi-annual and annual salary review process;
- Necessary equipment to perform work tasks;
- VIP medical insurance or sports coverage;
- Informal and friendly atmosphere;
- The ability to focus on your work: a lack of bureaucracy and micromanagement;
- Flexible working hours (start your day between 8:00 and 11:30);
- Team buildings, corporate events;
- Paid vacation (18 working days) and sick leaves;
- Cozy offices in 2 cities (Kyiv & Lviv) with electricity and Wi-Fi (generator & Starlink);
- Compensation for coworking (except for employees from Kyiv and Lviv);
- Corporate lunch + soft skills clubs;
- Unlimited work from home from anywhere in the world (remote);
- Geniusee has its own charity fund.
· 34 views · 0 applications · 9d
Senior Data Architect
Full Remote · Poland · 9 years of experience · Upper-Intermediate
N-iX is a software development service company that helps businesses across the globe develop successful software products. During 21 years on the market, and by leveraging the capabilities of Eastern European talent, the company has grown to 2000+ professionals with a broad portfolio of customers in the area of Fortune 500 companies as well as technological start-ups. N-iX has come a long way and increased its presence in nine countries: Poland, Ukraine, Romania, Bulgaria, Sweden, Malta, the UK, the US, and Colombia.
The Data and Analytics practice, part of the Technology Office, is a team of high-end experts in data strategy, data governance, and data platforms, and contributes to shaping the future of data platforms for our customers. As Senior Data Architect, you will play a crucial role in designing and overseeing the implementation of our strategic Databricks-based data and AI platforms. You will collaborate with data engineers and data scientists, define architecture standards, and ensure alignment across multiple business units. Your role will be pivotal in shaping the future state of our data infrastructure and driving innovative solutions within the automotive claims management domain.
Key Responsibilities:
- Design scalable and robust data architectures using Databricks and cloud technologies (Azure/AWS)
- Oversee and guide the implementation of Databricks platforms across diverse business units
- Collaborate closely with data engineers, data scientists, and stakeholders to define architecture standards and practices
- Develop and enforce governance strategies, ensuring data quality, consistency, and security across platforms
- Lead strategic decisions on data ingestion, processing, storage, and analytics frameworks
- Evaluate and integrate new tools and technologies to enhance data processing capabilities
- Provide mentorship and guidance to engineering teams, ensuring architectural compliance and effective knowledge transfer
- Develop and maintain detailed architectural documentation.
Requirements:
- 5+ years of experience as a Solution/Data Architect in complex enterprise environments
- Extensive expertise in designing and implementing Databricks platforms
- Strong experience in cloud architecture, preferably Azure or AWS
- Proficient in Apache Spark and big data technologies
- Advanced understanding of data modeling, data integration patterns, and data governance
- Solid background in relational databases (MS SQL preferred) and SQL proficiency
- Practical knowledge of data orchestration and CI/CD practices (Terraform, GitLab)
- Ability to articulate complex technical strategies to diverse stakeholders
- Strong leadership and mentorship capabilities
- Fluent English (B2 level or higher)
- Exceptional interpersonal and communication skills in an international team setting.
Nice to have:
- Experience with Elasticsearch or vector databases
- Knowledge of containerization technologies (Docker, Kubernetes)
- Familiarity with dbt (data build tool)
- Willingness and ability to travel internationally twice a year for workshops and team alignment.