Jobs
Data Engineer / DataOps
Full Remote · Countries of Europe or Ukraine · Product · 2 years of experience · B2 - Upper Intermediate

DeepX is looking for an experienced Data Engineer to drive our data integration initiatives. In this role, you will connect, transform, and prepare complex datasets to support centralized reporting and actionable business insights. Leveraging modern cloud-based technologies, data orchestration frameworks, and API integrations, you will play a pivotal role in ensuring our data infrastructure meets the evolving needs of our organization.
Key Responsibilities
- Architect, build, and maintain scalable and reliable ETL/ELT pipelines to integrate data from diverse international sources.
- Engineer data transformations that convert raw, complex data into clean, analysis-ready formats suitable for downstream analytics.
- Leverage the Google Cloud Platform (GCP) suite to build and manage scalable data storage and processing solutions, ensuring optimal security, reliability, and performance.
- Orchestrate complex data workflows using Apache Airflow, developing and maintaining robust DAGs for scheduling and monitoring.
- Troubleshoot and resolve issues within data pipelines and optimize workflow scheduling to guarantee timely data availability.
- Independently integrate with third-party services by interpreting API documentation, managing authentication, and developing custom data extraction solutions.
- Master Google Analytics 4's BigQuery export, structuring raw event data by flattening nested fields (e.g., event_params, user_properties) into query-optimized tables (see the sketch after this list).
- Partner with our Business Intelligence teams to align data models and pipelines, seamlessly feeding into visualization tools like Looker Studio, DOMO, and Looker.
- Provide dedicated data support for dashboards, analytical projects, and ad-hoc reporting.
- Integrate and manage modern data connector tools, such as Stitch Data, and stay current with emerging technologies to enhance our data capabilities.
- Collaborate effectively with data analysts, data scientists, and other cross-functional teams to translate business needs into technical specifications.
- Curate and maintain comprehensive documentation for all data workflows, architectural designs, and transformation logic.
- Implement rigorous data validation, monitoring, and testing strategies to ensure data integrity and continuously improve pipeline performance and cost-efficiency.
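For context on the flattening mentioned above: the GA4 BigQuery export stores event parameters as repeated key/value records, and reporting tables usually pull each needed key into its own column. A minimal sketch, assuming the standard events_* export schema; the project, dataset, and chosen parameter keys are placeholder assumptions, not details from this posting.

```python
# A minimal sketch of flattening the GA4 BigQuery export. Project, dataset,
# and parameter keys below are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
CREATE OR REPLACE TABLE `my_project.reporting.ga4_events_flat` AS
SELECT
  PARSE_DATE('%Y%m%d', event_date) AS event_date,
  event_name,
  user_pseudo_id,
  -- event_params is a repeated key/value record; pull each needed key
  -- into its own column so BI tools can query it directly.
  (SELECT value.string_value FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page_location,
  (SELECT value.int_value FROM UNNEST(event_params)
   WHERE key = 'ga_session_id') AS ga_session_id
FROM `my_project.analytics_123456.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240107'
"""

client.query(sql).result()  # blocks until the flattened table is written
```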
Qualifications
- A minimum of 3 years of professional experience in a data engineering role, preferably with exposure to international datasets.
- Deep, hands-on experience with the Google Cloud Platform (GCP) ecosystem.
- Demonstrable expertise in orchestrating data pipelines with Apache Airflow, including DAG development and maintenance.
- Solid background in building production-grade ETL/ELT pipelines and utilizing connector tools like Stitch Data.
- Proven ability to work with APIs, from reading documentation to implementing data extraction logic.
- Experience handling Google Analytics 4 BigQuery exports, specifically with flattening nested data structures.
- Proficiency in SQL and at least one programming language (e.g., Python, Java, or Scala) for data manipulation and automation.
- Familiarity with BI platforms (Looker Studio, DOMO, Looker) and supporting BI team requirements.
- Proficiency with version control systems, particularly Git.
- Strong problem-solving skills with the ability to translate business requirements into technical solutions and optimize complex data processes.
- Excellent communication and collaboration skills, with the ability to work effectively in an international team environment.
- A proactive and detail-oriented mindset with a commitment to data quality and performance.
- English proficiency: Upper-Intermediate or higher.
About DeepX
DeepX is an R&D intensive and innovation-driven consortium that provides Artificial Intelligence-powered Computer Vision solutions for businesses. To find out more about us, please visit: https://deepxhub.com/
Infrastructure Engineer
Full Remote · Countries of Europe or Ukraine · 5 years of experience · C1 - Advanced

We are looking for a Senior Infrastructure Engineer to manage and improve our IT systems and cloud environments. You'll work closely with DevOps and security teams to ensure system availability and reliability.
Details:
Experience: 5 years
Schedule: Full time, remote
Start: ASAP
English: Fluent
Employment: B2B Contract
Responsibilities:
- Design, deploy, and manage infrastructure environments
- Automate deployments using Terraform, Ansible, etc.
- Monitor and improve system performance and availability
- Implement disaster recovery plans
- Support troubleshooting across environments
Requirements:
- Strong Linux administration background
- Experience with AWS, GCP, or Azure
- Proficiency with containerization tools (Docker, Kubernetes)
- Infrastructure as Code (IaC) using Terraform or similar
- Scripting skills in Python, Bash, etc.
Data Architect
Full Remote · Spain, Poland, Portugal, Romania · 5 years of experience · B2 - Upper Intermediate

Project tech stack: Python, Snowflake, Azure, Azure Data Factory, AWS
About the role
This is an exciting opportunity to work on a high-impact project, architecting an end-to-end data solution in a collaborative and forward-thinking environment. The ideal candidate will be at the forefront of delivering scalable, efficient, and best-in-class data engineering solutions supporting business-critical insights and reporting capabilities.
We are seeking a Lead/Architect Data Engineer to lead the design and implementation of robust data pipelines and warehouse solutions leveraging Snowflake, Azure, and Azure Data Factory. This role will focus on ingesting and transforming data from marketing and sales systems, enabling advanced analytics and reporting capabilities. The candidate will play a key advisory role in defining and implementing best practices for data ingestion, transformation, and reporting.
About the project
Our client is a global real estate services company specializing in the management and development of commercial properties. Over the past several years, the organization has made significant strides in systematizing and standardizing its reporting infrastructure and capabilities. Due to the increased demand for reporting, the organization is seeking a dedicated team to expand capacity and free up existing resources.
Location
Remote: Ukraine / Europe
Skills & Experience
- Bachelor's degree in Computer Science, Engineering, or related field;
- 7+ years of experience in data engineering/ architecture roles;
- Database management and SQL proficiency, Knowledge of modern data warehousing tools like Snowflake, Databricks, Redshift (hands-on);
- Data modeling and design;
- Programming skills (Spark, Python);
- Experience with data governance frameworks and compliance requirements;
- Knowledge of big data technologies, machine learning integration or API development;
- Proficiency with cloud platforms (at least one of AWS, Azure, GCP) for scalable solutions;
- Expertise in streaming pipeline design and complex data transformation;
- Ability to analyze system requirements and translate them into effective technical designs;
- Experience with performance optimization for large-scale databases;
- Understanding of CI/CD practices;
- Problem-solving mindset to address technical challenges in dynamic environments;
- Collaboration skills to work effectively with cross-functional teams;
- Expertise in using and/or introducing AI-based coding practices to the projects.
Responsibilities
- Design and maintain the organizationβs data architecture, including databases, data warehouses, and data lakes;
- Develop and implement data models to structure and organize data for storage and access;
- Design data pipeline architectures to handle real-time data processing;
- Define and implement Change Data Capture (CDC) pipelines (see the sketch after this list);
- Ensure data security, integrity and compliance, and assist in implementing data governance practices;
- Monitor system health, identify bottlenecks, and recommend improvements to ensure scalability and efficiency;
- Collaborate with cross-functional teams on data relation topics;
- Stay updated on emerging technologies to continuously improve the organization's data infrastructure.
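As a concrete illustration of the CDC bullet above: change rows are typically landed in a staging table and applied to the target with a MERGE. A hedged sketch, assuming Snowflake from the project stack; the table names, the 'op' flag convention, and the connection details are illustrative, not taken from this posting.

```python
# Hypothetical CDC apply step: merge staged change rows (with an 'op' flag
# for insert/update/delete) into a Snowflake target. All names are placeholders.
import snowflake.connector

MERGE_SQL = """
MERGE INTO analytics.dim_customer AS tgt
USING staging.customer_changes AS src
  ON tgt.customer_id = src.customer_id
WHEN MATCHED AND src.op = 'D' THEN DELETE
WHEN MATCHED THEN UPDATE SET
  tgt.email = src.email,
  tgt.updated_at = src.updated_at
WHEN NOT MATCHED AND src.op <> 'D' THEN
  INSERT (customer_id, email, updated_at)
  VALUES (src.customer_id, src.email, src.updated_at)
"""

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="ETL_WH", database="ANALYTICS",
)
try:
    conn.cursor().execute(MERGE_SQL)
finally:
    conn.close()
```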
Platform Data Engineer (Python, Kubernetes, AWS) to $7500
Full Remote · Ukraine, Poland · Product · 5 years of experience · B2 - Upper Intermediate

About the Product:
Our client - Finaloop - reshapes bookkeeping to fit e-commerce needs, building a fully automated, real-time accounting platform that replaces traditional bookkeeping for e-commerce and DTC brands. That means handling vast volumes of financial data with precision, scale, and zero margin for error.
To support this, we're investing in our platform's core infrastructure, the foundation that powers real-time financial insight across thousands of businesses globally.
About the Role
We're seeking an outstanding and passionate Senior Platform Data Engineer to take part in shaping Finaloop's data infrastructure at the forefront of Fintech and AI.
You'll join a high-impact R&D team in a fast-paced startup environment, building scalable pipelines and robust data systems that empower eCommerce businesses to make smarter decisions.
Key Responsibilities:
- Designing, building, and maintaining scalable data pipelines and ETL processes for our financial data platform
- Developing and optimizing data infrastructure to support real-time analytics and reporting
- Implementing data governance, security, and privacy controls to ensure data quality and compliance
- Creating and maintaining documentation for data platforms and processes
- Collaborating with data scientists and analysts to deliver actionable insights to our customers
- Troubleshooting and resolving data infrastructure issues efficiently
- Monitoring system performance and implementing optimizations
- Staying current with emerging technologies and implementing innovative solutions
Required Competence and Skills:
- 5+ years experience in Data Engineering or Platform Engineering roles
- Strong programming skills in Python and SQL
- Experience with orchestration platforms and tools (Airflow, Dagster, Temporal or similar)
- Experience with MPP platforms (e.g., Snowflake, Redshift, Databricks)
- Hands-on experience with cloud platforms (AWS) and their data services
- Understanding of data modeling, data warehousing, and data lake concepts
- Ability to optimize data infrastructure for performance and reliability
- Experience working with containerization (Docker) in Kubernetes environments
- Familiarity with CI/CD concepts and principles
- Fluent English (written and spoken)
Nice to have skills:
- Experience with big data processing frameworks (Apache Spark, Hadoop)
- Experience with stream processing technologies (Flink, Kafka, Kinesis)
- Knowledge of infrastructure as code (Terraform)
- Experience building analytics platforms or clickstream pipelines
- Familiarity with ML workflows and MLOps
- Experience working in a startup environment or fintech industry
The main components of our current technology stack:
- AWS Serverless, Python, Airflow, Airbyte, Temporal, PostgreSQL, Snowflake, Kubernetes, Terraform, Docker.
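Given how central Airflow is to this stack, here is a minimal sketch of the DAG shape such pipelines typically take; the DAG id, task names, and the extract/load callables are invented for illustration, not taken from the actual codebase.

```python
# A minimal Airflow 2.x DAG sketch; all identifiers are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    """Pull raw orders from a source API into object storage (stub)."""
    ...


def load_to_snowflake(**context):
    """Copy the staged files into Snowflake (stub)."""
    ...


with DAG(
    dag_id="orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    extract >> load
```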
Senior Data Engineer (Healthcare)
Full Remote · EU · 5 years of experience · B2 - Upper Intermediate

We are looking for a Senior Data Engineer to lead a team, improve data practices, and help drive new opportunities and growth.
Would you like to join an experienced team and work with a world-known Customer? We have a fantastic opportunity for an experienced and skilled Senior Data Engineer to participate in the development of a rapidly growing project.
Feel challenged? Then we would like to hear from you!
Customer
Our client is a leading global company with years of experience in innovative technologies. By bridging the real and digital worlds with top solutions, they enable the transformation of everyday life for billions of people. Their portfolio of products, services, and solutions is focused on clinical decision-making and treatment pathways.
Project
The project's mission is to enable healthcare providers to increase their value by providing them with innovative technology and services in diagnostic and therapeutic imaging, laboratory diagnostics, molecular medicine, digital health, and enterprise services.
Requirements
- Proven experience in data engineering and hands-on work with cloud computing services, specifically Microsoft Azure
- Strong understanding of data analytics fundamentals, including dimensional modeling, ETL, reporting tools, data governance, data warehousing, and both structured and unstructured data
- Proficiency in SQL and Python
- Hands-on experience with Databricks and PySpark for data processing and modeling
- Familiarity with big data platforms such as Snowflake, BigQuery, etc.; experience with Snowflake is a plus
- Fluency in business English
Personal Profile
- Excellent communication skills
Responsibilities
- Work closely with the client (PO) and other Team Leads to clarify technical requirements and expectations
- Provide support where the Team Lead's support is required
- Implement architecture based on Azure cloud platforms (Data Factory, Databricks, and Event Hub)
- Design, develop, optimize, and maintain squad-specific data architectures and pipelines that adhere to defined ETL and Data Lake principles
- Discover, analyze, and organize disparate data sources and structure them into clean data models with clear, understandable schemas
- Contribute to the evaluation of new tools for analytical data engineering or data science
- Suggest and contribute to training and improvement plans related to analytical data engineering skills, standards, and processes
- Participate in pre-sale activities and work with potential opportunities in the data direction
Data Quality Engineer
Office Work · Ukraine (Kyiv) · Product · 3 years of experience · B1 - Intermediate · MilTech

We're building a large-scale data analytics ecosystem powered by Microsoft Azure and Power BI. Our team integrates, transforms, and visualizes data from multiple sources to support critical business decisions. Data quality is one of our top priorities, and we're seeking an engineer who can help us enhance the reliability, transparency, and manageability of our data landscape.
Your responsibilities:
- Develop and maintain data quality monitoring frameworks within the Azure ecosystem (Data Factory, Data Lake, Databricks).
- Design and implement data quality checks, including validation, profiling, cleansing, and standardization (see the sketch after this list).
- Detect data anomalies and design alerting systems (rules, thresholds, automation).
- Collaborate with Data Engineers, Analysts, and Business stakeholders to define data quality criteria and expectations.
- Ensure high data accuracy and integrity for Power BI reports and dashboards.
- Document data validation processes and recommend improvements to data sources.
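To make the checks above concrete, here is a minimal sketch of a rule-based validation step, assuming Databricks/PySpark from the stack named above; the table name and thresholds are illustrative, and a production version would persist metrics and feed the alerting rules rather than just raising.

```python
# Hypothetical rule-based data quality check for a Databricks/PySpark job.
# Table name and thresholds are placeholders.
import pyspark.sql.functions as F
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.read.table("silver.sales_orders")  # placeholder table

total = df.count()
null_ids = df.filter(F.col("order_id").isNull()).count()
latest = df.agg(F.max("order_date")).first()[0]

if total == 0:
    raise ValueError("data quality: silver.sales_orders is empty")
if null_ids / total > 0.01:  # completeness rule: at most 1% null keys
    raise ValueError(f"data quality: order_id null rate {null_ids / total:.2%}")
print(f"OK: {total} rows, latest order_date={latest}")
```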
Requirements:
- 3+ years of experience in a Data Quality, Data Engineering, or BI Engineering role.
- Hands-on experience with Microsoft Azure services (Data Factory, SQL Database, Data Lake).
- Advanced SQL skills (complex queries, optimization, data validation).
- Familiarity with Power BI or similar BI tools.
- Understanding of DWH principles and ETL/ELT pipelines.
- Experience with data quality frameworks and metrics (completeness, consistency, timeliness).
- Knowledge of Data Governance, Master Data Management, and Data Lineage concepts.
Would be a plus:
- Experience with Databricks or Apache Spark.
- DAX and Power Query (M) knowledge.
- Familiarity with DataOps or DevOps principles in a data environment.
- Experience in creating automated data quality dashboards in Power BI.
Data Engineer
Office Work · Cyprus · Product · 2 years of experience · B1 - Intermediate

Location: Cyprus office
An iGaming product company is looking for a Data Engineer to join the team and work on the brand. We need a person with successful experience in a similar position in a product company (gambling/betting verticals).
Responsibilities:
- Design, develop, and maintain scalable data warehouse solutions
- Develop ETL pipelines to integrate data from multiple sources
- Optimize database performance and ensure data integrity
- Collaborate with BI analysts, data engineers, and stakeholders to understand requirements and translate them into effective database structures
- Implement best practices for data modelling, warehousing, and governance
- Troubleshoot database and ETL issues to maintain high availability and reliability
- Ensure data security and compliance standards are met
Requirements:
- Bachelorβs degree in Computer Science, Information Systems, or related field
- 2+ years of experience as a Data Engineer
- Strong SQL skills (complex queries, optimisation, indexing)
- Experience with data warehouse architecture and ETL development
- Knowledge of relational databases (PostgreSQL, MySQL), including CTEs
- Understanding of data modelling concepts (Star Schema, Snowflake Schema)
- Strong problem-solving and analytical skills
- Experience with Python and Apache Airflow
- Experience with cloud data warehousing solutions (AWS Redshift, Google BigQuery, Snowflake, ClickHouse)
- Experience with Apache Kafka
- Experience working with GIT
- Good communication skills in English
Nice to have:
- Knowledge of scripting languages (Bash) for automation
- Experience with Pyspark
- Experience working in Agile environments
What you will get:
- Modern Office in Sunny Limassol. Work from our brand-new three-story office building in Limassol. Enjoy spacious meeting rooms, fully equipped workspaces, ergonomic chairs, smart climate control, a private elevator, and everything you need for comfort and productivity. Each floor has cozy coffee stations, a large dining area stocked with fresh fruits, snacks, healthy treats, and a selection of premium coffee – all provided by the company.
- Premium Health Insurance – Paid by the Company. Our health insurance plan is 100% company-paid and offers comprehensive coverage up to €3,600,000 per year, with no deductible per incident. It includes in-patient and out-patient care, doctor visits, diagnostics, dental, and emergency treatment – because your health matters.
- Monthly Well-being Budget – Fully Sponsored. Every month, we provide a company-paid allowance for your physical and mental wellness: sports, fitness memberships, online/offline courses, massage, spa, creative hobbies – whatever keeps you energized.
- Daily Lunches – On Us. No need to bring lunch – we pay for your meals daily. Choose your favorite dishes from any restaurant via our Wolt corporate account, and your lunch will be delivered right to the office.
- Time Off – 100% Paid. We offer 21 working days of paid vacation, 10 fully paid sick leave days, and all public holidays in Cyprus – so you can truly rest and recharge.
- Life Happens – We've Got You. We offer 3 additional paid days off per year for significant life events: birth of a child, your wedding, or the loss of a close family member. These days are fully paid by the company – because you come first.
- Referral Bonus Program. Know someone who'd be a great fit? Refer and earn! We offer generous cash bonuses for successful hires you bring into the team.
- Celebrations – Backed by Bonuses. On your birthday, you'll receive €100 (probation) or €200 (post-probation) – plus a delicious office celebration, all covered by us. On your work anniversaries, enjoy personalized gifts and growing monetary rewards – the longer you stay, the more we celebrate.
- Impactful Work in the iGaming Industry. Work on innovative iGaming projects that blend creativity with top-tier tech. You'll join a team of specialists passionate about sharing knowledge and pushing boundaries.
- Learning & Development – Company-Funded. We believe in growth. All company-initiated training is 100% paid by us. If you choose additional personal development, we cover 50% of the cost, up to €800/year.
- Industry Conferences – Fully Sponsored. Attend leading industry events like SiGMA and more. The company covers all travel, tickets, and participation costs – whether you're visiting or representing us at the booth.
Data Engineer
Full Remote · Worldwide · 3 years of experience · B1 - Intermediate

Tech Stack & Key Competencies
Cloud & Infrastructure
- Hands-on experience with Microsoft Azure (Data & Analytics ecosystem)
- Proficiency in Azure DevOps, Terraform (Infrastructure as Code, CI/CD pipelines)
- Solid understanding of cloud security, networking, and monitoring
Data Engineering & Pipelines
- Strong expertise in Apache Spark (including performance tuning & optimization)
- Deep experience with Databricks on Azure
- Proficient in Delta Lake, Azure Data Factory (complex pipelines)
- Experience working with Azure Data Lake Storage Gen2 (ADLS Gen2)
- Familiarity with Azure Synapse Analytics (serverless preferred)
Programming & Querying
- Advanced hands-on skills in Python
- Strong experience writing and optimizing complex SQL queries
Big Data & Architecture
- Experience designing enterprise-grade Big Data architectures on Azure
- Ability to evaluate architectural components (Databricks vs. Snowflake vs. Microsoft Fabric, etc.)
- Proven track record working on PoCs, MVPs, and solution prototyping
Streaming & Real-Time Processing
- Hands-on experience with Kafka, Spark Structured Streaming, or cloud-native streaming alternatives (see the sketch below)
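As a concrete reference for the streaming competency above, a minimal Kafka-to-Delta sketch with Spark Structured Streaming; the broker, topic, and paths are placeholders, and the spark-sql-kafka connector package must be available on the cluster.

```python
# Hedged sketch: consume a Kafka topic with Spark Structured Streaming and
# land it in a Delta table. Broker, topic, and paths are placeholders.
import pyspark.sql.functions as F
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events-stream").getOrCreate()

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers binary key/value columns; decode the payload before sinking.
events = raw.select(F.col("value").cast("string").alias("payload"))

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/chk/events")  # required for exactly-once
    .start("/lake/bronze/events")
)
query.awaitTermination()
```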
Other Tools & Concepts
- Familiarity with Microsoft Fabric (a plus)
- Strong understanding of DevOps and CI/CD best practices
- Experience working directly with customers (e.g., technical consulting, stakeholder engagement)
Senior Data Engineer
Full Remote · Ukraine · 6 years of experience · B2 - Upper Intermediate

N-iX is looking for a Senior Data Engineer to join our skilled and continuously growing team! The position is for our fintech customer from Europe. The person would be part of the customer's Data Platform team – a key function within the company, responsible for the architecture, development, and management of our core data infrastructure. We leverage Snowflake, Looker, Airflow (MWAA), and dbt while managing DevOps configurations for the platform. Our goal is to build and maintain a self-serve data platform that empowers stakeholders with tools for efficient data management while ensuring security, governance, and compliance standards.
Requirements:
- 6+ years of experience in Data Engineering.
- Strong proficiency in Airflow, Python, and SQL.
- Hands-on experience with cloud data warehouses (Snowflake or equivalent).
- Solid understanding of AWS services and Kubernetes at an advanced user level.
- Familiarity with Data Quality and Observability best practices.
- Ability to thrive in a dynamic environment with a strong sense of ownership and responsibility.
- Analytical mindset and problem-solving skills for tackling complex technical challenges.
- Bachelor's degree in Mathematics, Computer Science, or another relevant quantitative field.
Nice-to-Have Skills:
- Experience with DevOps practices, CI/CD, and Infrastructure as Code (IaC).
- Hands-on experience with Looker or other BI tools.
- Performance optimization of large-scale data pipelines.
- Knowledge of metadata management and Data Governance best practices.
Responsibilities:
- Design and develop a scalable data platform to efficiently process and analyze large volumes of data using Snowflake, Looker, Airflow, and dbt.
- Enhance the self-serve data platform by implementing new features to improve stakeholder access and usability.
- Work with cross-functional teams to provide tailored data solutions and optimize data pipelines.
- Foster a culture of knowledge sharing within the team to enhance collaboration and continuous learning.
- Stay updated on emerging technologies and best practices in data engineering and bring innovative ideas to improve the platform.
We offer*:
- Flexible working format - remote, office-based or flexible
- A competitive salary and good compensation package
- Personalized career growth
- Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
- Active tech communities with regular knowledge sharing
- Education reimbursement
- Memorable anniversary presents
- Corporate events and team buildings
- Other location-specific benefits
*not applicable for freelancers
Cloud Engineer
Full Remote · Countries of Europe or Ukraine · 5 years of experience · C1 - Advanced

We're in search of The Dream Maker - a talented and experienced Cloud Engineer, someone with a wealth of hands-on cloud engineering experience specializing in AWS. As part of our team, you'll be at the forefront of crafting a groundbreaking solution that leverages cutting-edge technology to combat fraud. You will design, implement, and manage cloud infrastructure and services. You will work closely and as part of the R&D team to ensure optimal performance, security, and scalability of our applications. Our goal is to equip our clients with resilient safeguards against chargebacks, empowering them to safeguard their revenue and optimize their profitability.
Join us on this thrilling mission to redefine the battle against fraud.
What you'll do?
- Design, implement, and manage cloud solutions on AWS, ensuring best practices in security and scalability.
- Monitor and optimize cloud performance, costs, and resource utilization.
- Implement security measures to protect data and applications in the cloud.
- Build a highly scalable data platform for diversified and complex data flows.
- Have end-to-end ownership: design, build, ship, measure, and maintain our AWS Cloud services. You build it – you own it!
- Troubleshoot and resolve cloud-related issues, providing support to other teams as needed.
- Stay up-to-date with AWS services and industry trends, recommending improvements and innovations.
- Work closely with our amazing Product team that created an industry-leading SaaS Product-Led-Growth.
Join us in building a professional, fun company and engineering culture with the highest standards.
Requirements:
- 5+ years of experience in cloud engineering, specifically with AWS - Must
- Experience with Node.js and TypeScript - Must
- Deep experience with AWS cloud Serverless infrastructures and architectures - Must
- Strong understanding of AWS services (e.g., ECS, EKS, EC2, S3, RDS, Lambda, IAM) - Must
- Experience in backend engineering and supporting containerization technology (Kubernetes, EKS/GKE, ECS) - Must
- Experience with IaC tools - CloudFormation or SAM (must) and Terraform - Advantage
- Experience with large-scale systems
- Experience with Python - Advantage
- Experience with EDA - Advantage
- Experience in building infrastructure, choosing and integrating tools, and best practices around development speed, velocity, and standardization
- B.Sc. in Computer Science or equivalent
Senior Data Platform Engineer (Python)
Full Remote · Countries of Europe or Ukraine · Product · 5 years of experience · B2 - Upper Intermediate

Our client - Finaloop - reshapes bookkeeping to fit e-commerce needs, building a fully automated, real-time accounting platform that replaces traditional bookkeeping for e-commerce and DTC brands. That means handling vast volumes of financial data with precision, scale, and zero margin for error.
To support this, we're investing in our platform's core infrastructure, the foundation that powers real-time financial insight across thousands of businesses globally.
About the Role
We're seeking an outstanding and passionate Senior Data Platform Engineer to take part in shaping Finaloop's data infrastructure at the forefront of Fintech and AI.
You'll join a high-impact R&D team in a fast-paced startup environment, building scalable pipelines and robust data systems that empower eCommerce businesses to make smarter decisions.
Key Responsibilities:
- Designing, building, and maintaining scalable data pipelines and ETL processes for our financial data platform
- Developing and optimizing data infrastructure to support real-time analytics and reporting
- Implementing data governance, security, and privacy controls to ensure data quality and compliance
- Creating and maintaining documentation for data platforms and processes
- Collaborating with data scientists and analysts to deliver actionable insights to our customers
- Troubleshooting and resolving data infrastructure issues efficiently
- Monitoring system performance and implementing optimizations
- Staying current with emerging technologies and implementing innovative solutions
Required Competence and Skills:
- 5+ years experience in Data Engineering or Platform Engineering roles
- Strong programming skills in Python and SQL
- Experience with orchestration platforms and tools (Airflow, Dagster, Temporal or similar)
- Experience with MPP platforms (e.g., Snowflake, Redshift, Databricks)
- Hands-on experience with cloud platforms (AWS) and their data services
- Understanding of data modeling, data warehousing, and data lake concepts
- Ability to optimize data infrastructure for performance and reliability
- Experience working with containerization (Docker) in Kubernetes environments
- Familiarity with CI/CD concepts and principles
- Fluent English (written and spoken)
Nice to have skills:
- Experience with big data processing frameworks (Apache Spark, Hadoop)
- Experience with stream processing technologies (Flink, Kafka, Kinesis)
- Knowledge of infrastructure as code (Terraform)
- Experience building analytics platforms or clickstream pipelines
- Familiarity with ML workflows and MLOps
- Experience working in a startup environment or fintech industry
The main components of our current technology stack:
- AWS Serverless, Python, Airflow, Airbyte, Temporal, PostgreSQL, Snowflake, Kubernetes, Terraform, Docker.
Middle/Senior Data Engineer (Europe)
Full Remote · Poland · 3 years of experience · B2 - Upper Intermediate

We're looking for a technically strong and highly motivated Middle/Senior Implementation Engineer (Data Engineer) who will become a "super user" of the platform and help deliver end-to-end, data-driven solutions for enterprise clients.
This is a hybrid role combining business analysis, data transformation, system integration, and configuration. You'll work across the full solution lifecycle – from requirements gathering to platform setup, data modeling, and validation – helping clients unlock the full value of the platform through advanced configuration and light custom development.
You'll work closely with both internal teams and client stakeholders, leveraging your communication, analytical, and problem-solving skills to deliver impactful outcomes.
Responsibilities:
As an Implementation Engineer, you will be configuring, extending, deploying, testing and validating complete solutions end to end for our customers:
- Data analysis, exploration, testing and validation, interacting with the client to understand data structures and use cases
- Configure connectors (e.g., Shopify, Akeneo, Bloomreach, Optimove)
- Set up the client's Platform workflows/data transformations
- Functions and transforms - configure and write new plugins
- Canonical data models (XDM) and mapping - configure, extend, and map the data using the Platform portal, JSONata, or code plugins (see the plugin sketch after this list)
- Set up Platform data hubs (data mapping, domain-specific components)
- Platform data quality dashboards
- Data pipelines and warehouse/lake tables and views, using Databricks and other tooling and the medallion architecture
- Data lineage, ML models
- Configure, extend, or create Platform BI dashboards with Power BI on top of data layers
- Testing and validation
- Work with our clients to analyse data-driven business processes and understand and check the data
- Produce documentation and training guides on the packs and how to use them
- Advise on the best approach to leverage the Platform and achieve end results
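To illustrate the "code plugins" mentioned in the canonical-model bullet above, a hypothetical mapping function is sketched below; the source fields, the canonical shape, and the plugin contract are all invented for illustration, since the platform's actual XDM schemas are not described in this posting.

```python
# Hypothetical "code plugin" mapping a raw source record into a canonical
# order model. Field names and the canonical shape are invented.
from datetime import datetime, timezone


def map_order(source: dict) -> dict:
    """Map a raw e-commerce order (e.g., from a connector) to canonical form."""
    return {
        "orderId": source["id"],
        "currency": source.get("currency", "USD"),
        "totalAmount": float(source["total_price"]),
        # assumes created_at is a unix timestamp from the source system
        "placedAt": datetime.fromtimestamp(
            source["created_at"], tz=timezone.utc
        ).isoformat(),
        "lines": [
            {"sku": item["sku"], "qty": int(item["quantity"])}
            for item in source.get("line_items", [])
        ],
    }
```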
Requirements:
- Leadership - Independent. Driven. Get things done
- Communication - Very strong written and verbal. Comfortable in front of a client
- Business Analysis - Capture requirements, understand use cases and translate into solutions
- Domain knowledge - Retail preferred. Must know at least one complex domain
- Data modelling, transformation - Strong data modelling knowledge and ability to design complex models
- Integration and data warehousing - Have used a platform and configured it to create complete solutions for customers
- Programming and software engineering - Python or equivalent at competent level to write functions and plugins, not expert
- BI and dashboarding - Power BI or similar required, or the ability to pick it up rapidly
- Databases and SQL is a must have
- Technical understanding - Good technical understanding of modern architectures and leading data platforms, if possible Databricks, Spark (as a user, not an expert)
- Cloud - Should know their way around at least one cloud
- Previous experience working for a system integrator, a consultancy or professional services organization to build solutions for customers would be extremely beneficial
Data Engineer (Warsaw/hybrid)
Hybrid Remote · Poland · 3 years of experience

A fast-growing product company that develops and promotes mobile utilities and services for iOS. Our business operates on a web-to-app subscription model and focuses on Tier-1 markets.
Currently, we are launching a new product and building an infrastructure team from scratch. We are looking for a Data Engineer who will help us design a scalable analytics system, integrating data from payment systems, advertising platforms, and product events.
Role Objective:
Build and maintain a robust architecture for collecting, integrating, and analyzing data for web-to-app subscriptions. Implement end-to-end analytics from click to payment, design ETL processes, integrate with payment service providers (PSPs) and ad platforms, and create BI dashboards with key business metrics.
Responsibilities:
- Design and implement analytics logic for web-to-app subscriptions: redirects, paywalls, deeplinks
- Integrate and process events from Stripe and Solidgate (invoice.created, paid, failed, etc.)
- Link payment data to click IDs to enable end-to-end tracking (see the sketch after this list)
- Collect and process data on initial payments, declines, refunds, chargebacks, grace periods, and rebills
- Develop and automate ETL processes
- Build BI dashboards and reports on key metrics (CAC, ROAS, MRR, ARR, Churn Rate, LTV, ARPU, Trial-to-Paid Conversion)
- Integrate data from multiple sources: Stripe, Solidgate, Facebook Ads, Google Ads, Adsterra, backend
- Work within Google Cloud Platform (BigQuery, Cloud Functions, Cloud Storage)
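To sketch what linking payments to click IDs can look like in practice: if the click ID is stored in Stripe metadata at checkout, a webhook consumer can carry it onto every invoice event. The event types appear in the posting; the metadata convention, field names, and output shape below are assumptions, not the company's actual design.

```python
# Hedged sketch: carry an ad click ID onto Stripe invoice events so payments
# can be joined back to clicks. Assumes the checkout flow stored click_id in
# the invoice metadata; that convention and all field names are assumptions.
import json
from typing import Optional


def handle_stripe_event(payload: bytes) -> Optional[dict]:
    """Turn a raw Stripe webhook payload into a row for the funnel table."""
    event = json.loads(payload)
    if event.get("type") not in ("invoice.paid", "invoice.payment_failed"):
        return None  # only revenue-relevant events feed end-to-end tracking

    invoice = event["data"]["object"]
    return {
        "event_type": event["type"],
        "invoice_id": invoice["id"],
        "amount": invoice.get("amount_paid", 0) / 100.0,  # cents -> units
        "click_id": invoice.get("metadata", {}).get("click_id"),  # join key
    }
```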
Requirements:
Must-have skills & experience:
- Strong SQL knowledge and experience with PostgreSQL (or other relational databases)
- Proven track record of building ETL processes and API/Webhook integrations
- Hands-on experience with Stripe, Solidgate, or other PSPs
- Knowledge of Google Cloud ecosystem: BigQuery, Cloud Functions, Cloud Storage
- Proficiency in Python (pandas, analytics, ETL automation)
- Experience with BI tools (Looker, Tableau, Redash)
- Understanding of subscription business models: trial, recurring, dunning process
- Background in mobile or web products (mandatory)
Nice-to-have:
- Knowledge of advertising platforms: Google Ads, Facebook Ads, Adsterra
- Experience with tracking systems (e.g., Keitaro)
- Understanding of high-load data flow architectures
- Experience with Airflow or similar tools
Work Format:
- Office in central Warsaw
- Hybrid model (office + remote)
- Relocation after probation is possible
We Offer:
- Competitive salary
- Flexible approach to schedule and tasks
- Minimal bureaucracy, high trust, and autonomy
- A professional and ambitious team where initiative and contribution are valued
Lead Big Data Engineer
Hybrid Remote · Ukraine · 5 years of experience · B1 - Intermediate

We are looking for an experienced Lead Big Data Engineer to join our innovative team.
Requirements
- 5+ years of experience in data engineering within consumer finance or related fields
- Demonstrated expertise in Python, PySpark, or AWS-specific technologies including Lambda, Airflow, and Athena
- Production experience with Big Data tools such as HDFS, YARN, Hive, Spark, Kafka, and AWS
- Proficiency in utilizing Docker, Kubernetes, and Snowflake in a production environment
- Strong foundation in mathematics, statistics, computer science, data science, or similar
- Proficiency in business intelligence, analytical tools, and data visualization techniques
- Excellent organizational skills with the ability to manage multiple projects and meet deadlines
- Effective communication skills to articulate results to non-technical audiences
- Continuous learning and integration of new technologies into solutions
Nice to have
- AWS certification
- Experience with Kafka Streaming and Kafka Connect
- Familiarity with ELK Stack
- Background in managing Cassandra or MongoDB databases