Jobs Lviv, Data Engineer

  • 36 views · 3 applications · 12 November

    Senior Data Engineer at Payments AI Team

    Hybrid Remote · Ukraine · Product · 3 years of experience · B2 - Upper Intermediate

    Job Description

    As a Senior Data Engineer on the Wix Payments AI Team, you’ll play a crucial role in the design and integration of emerging AI solutions into the Payments product. You’ll have significant responsibilities which include:

    • Developing & maintaining infrastructure for both generative AI and classical data science applications.
    • Researching emerging AI technology stacks and methodologies to identify optimal solutions.
    • Monitoring data pipeline performance and troubleshooting issues.
    • Leading & driving the entire lifecycle of a typical team project: ideation → map business constraints, research and evaluate alternative solutions → design & implement a proof-of-concept in collaboration with various stakeholders across the organization, including data engineers, analysts, data scientists and product managers (a minimal sketch follows this list).
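
    For illustration only: the kind of lightweight proof-of-concept mentioned above is often just a small Streamlit app wrapped around a model call. The sketch below assumes a hypothetical `score_transaction` helper standing in for whatever model or LLM the PoC would use; nothing here is taken from the actual Wix stack.

```python
# Hypothetical Streamlit proof-of-concept sketch; `score_transaction` is a placeholder,
# not a real Wix component.
import streamlit as st


def score_transaction(description: str) -> float:
    """Placeholder risk score; a real PoC would call a model or an LLM here."""
    return min(1.0, len(description) / 100)


st.title("Payments PoC: transaction risk score (sketch)")
description = st.text_area("Transaction description")

if st.button("Score") and description:
    st.metric("Risk score", f"{score_transaction(description):.2f}")
```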

     

    Qualifications

    • Proficient in Trino SQL (with the ability to craft complex queries) and highly skilled in Python, with expertise in Python frameworks (e.g., Streamlit, Airflow, Pyless, etc.).
    • Ability to design, prototype, code, test and deploy production-ready systems.
    • Experience with a versatile range of infrastructure, server and frontend tech stacks.
    • Experience implementing and integrating GenAI models, particularly LLMs, into production systems. 
    • Experience with AI agentic technologies (e.g. MCP, A2A, ADK) - an advantage.
    • An independent and quick learner.
    • Passion for product and technical leadership.
    • Business-oriented thinking and skills: data privacy and system security awareness, understanding of business objectives and how to measure their key performance indicators (KPIs), derive and prioritize actionable tasks from complex business problems, business impact guided decision making. 
    • Open-minded, capable of coming up with creative solutions and adapting to frequently changing circumstances and technological advances.
    • Fluent in English with strong communication abilities.

     

    About the Team

    We’re the Wix Payments team.

    We provide Wix users with the best way to collect payments from their customers and manage their Wix income online, in person, and on-the-go. We’re passionate about crafting the best experience for our users, and empowering any business on Wix to realize its full financial potential. We have developed our own custom payment processing solution that blends many integrations into one clean and intuitive user interface. We also build innovative products that help our users manage their cash and grow their business. The Payments AI team is instrumental in promoting AI based capabilities within the payments domain and is responsible for ensuring the company is always at the forefront of the AI revolution.

     

  • 60 views · 3 applications · 11d

    Middle/Senior/Lead Python Cloud Engineer (IRC280058)

    Hybrid Remote · Ukraine · 5 years of experience · B2 - Upper Intermediate

    Job Description

    • Terraform

    • AWS Platform: Working experience with AWS services - in particular serverless architectures (S3, RDS, Lambda, IAM, API Gateway, etc.) supporting API development in a microservices architecture (a minimal handler sketch follows this list)

    • Programming Languages: Python (strong programming skills)

    • Data Formats: Experience with JSON, XML, and other relevant data formats

    • CI/CD Tools: experience setting up and managing CI/CD pipelines using GitLab CI, Jenkins, or similar tools

    • Scripting and automation: experience in scripting languages such as Python, PowerShell, etc.

    • Monitoring and Logging: Familiarity with monitoring & logging tools like CloudWatch, ELK, Dynatrace, Prometheus, etc.

    • Source Code Management: Expertise with git commands and associated VCS (GitLab, GitHub, Gitea, or similar)
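
    Purely as an illustration of the serverless API work listed above (none of this comes from the posting; the payload fields and response shape are invented), a minimal Lambda handler behind an API Gateway proxy integration might look like this:

```python
# Hypothetical AWS Lambda handler for an API Gateway proxy-integration endpoint.
# Payload fields and responses are illustrative only.
import json


def lambda_handler(event, context):
    # API Gateway proxy integration passes the HTTP body as a JSON string.
    try:
        body = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}

    record_id = body.get("id")
    if not record_id:
        return {"statusCode": 400, "body": json.dumps({"error": "missing 'id'"})}

    # A real service would read or write S3/RDS here via boto3.
    return {"statusCode": 200, "body": json.dumps({"id": record_id, "status": "accepted"})}
```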

     

     

    NICE TO HAVE

    • Strongly preferred: Infrastructure as Code - proven ability to write and manage IaC with Terraform and CloudFormation
    • Documentation: Experience with Markdown and, in particular, Antora for creating technical documentation
    • Experience working with healthcare data, including HL7v2, FHIR, and DICOM
    • FHIR and/or HL7 certifications
    • Building software classified as Software as a Medical Device (SaMD)
    • Understanding of EHR technologies such as Epic, Cerner, etc.
    • Experience implementing enterprise-grade cyber security & privacy by design into software products
    • Experience working in Digital Health software
    • Experience developing global applications
    • Strong understanding of SDLC - Waterfall & Agile methodologies
    • Software estimation
    • Experience leading software development teams onshore and offshore

    Job Responsibilities

    • Develops, documents, and configures system specifications that conform to defined architecture standards and address business requirements and processes in cloud development & engineering.
    • Involved in planning system and deployment activities, and responsible for meeting compliance and security standards.
    • API development using AWS services
    • Experience with Infrastructure as Code (IaC)
    • Actively identifies system functionality or performance deficiencies, executes changes to existing systems, and tests functionality of the system to correct deficiencies and maintain more effective data handling, data integrity, conversion, input/output requirements, and storage.
    • May document testing and maintenance of system updates, modifications, and configurations.
    • May act as a liaison with key technology vendor technologists or other business functions.
    • Function specific: strategically design technology solutions that meet the needs and goals of the company and its customers/users.
    • Leverages platform process expertise to assess whether existing standard platform functionality will solve a business problem or a customization would be required.
    • Test the quality of a product and its ability to perform a task or solve a problem.
    • Perform basic maintenance and performance optimization procedures in each of the primary operating systems.
    • Ability to document detailed technical system specifications based on business system requirements
    • Ensures system implementation compliance with global & local regulatory and security standards (e.g., HIPAA, SOC 2, ISO 27001).

     

    Department/Project Description

    The Digital Health organization is a technology team that focuses on next-generation Digital Health capabilities, which deliver on the Medicine mission and vision to deliver Insight Driven Care. This role will operate within the Digital Health Applications & Interoperability subgroup of the broader Digital Health team, focused on patient engagement, care coordination, AI, healthcare analytics & interoperability amongst other advanced technologies which enhance our product portfolio with new services, while improving clinical & patient experiences.

     

    Authorization and Authentication platform & services for Digital Health

     

    Secure cloud platform for storing and managing medical images (DICOM compliant). Leverages AWS for cost-effective storage and access, integrates with existing systems (EHR, PACS), and offers a customizable user interface.
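
    For orientation only (the bucket name, key layout, and expiry below are invented, not part of the project), cost-effective storage and access for DICOM objects on AWS typically boils down to S3 plus short-lived presigned URLs:

```python
# Illustrative sketch: store a DICOM file in S3 and hand out a short-lived presigned URL.
# Bucket name, key layout, and expiry are hypothetical.
import boto3

s3 = boto3.client("s3")

BUCKET = "example-dicom-archive"                   # hypothetical bucket
KEY = "studies/study-123/series-1/image-1.dcm"     # hypothetical key layout

# Upload the object; encryption and access policies would be enforced at the bucket level.
s3.upload_file("image-1.dcm", BUCKET, KEY)

# A presigned URL lets an integrated system (EHR, PACS viewer) fetch the object
# without holding AWS credentials.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": BUCKET, "Key": KEY},
    ExpiresIn=900,  # 15 minutes
)
print(url)
```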

  • 38 views · 2 applications · 12 November

    Data Engineer (Strong Middle / Senior) to $4500

    Hybrid Remote · Countries of Europe or Ukraine · Product · 5 years of experience · C1 - Advanced

    We’re looking for an experienced Data Engineer to help our team design and implement reliable, scalable, and high-performance data solutions. If you have hands-on experience building data pipelines, working with cloud technologies, and optimizing complex systems, this position is for you.

     

    Requirements:

    • 3-4+ years of professional experience in Data Engineering;
    • Proven experience in designing and deploying Data Lake or Data Warehouse pipelines;
    • Strong understanding of ETL/ELT principles and large-scale data processing;
    • Proficiency in SQL;
    • Practical experience with Python for data processing and automation tasks;
    • Hands-on experience with Spark (Cloud / On-Prem / Databricks);
    • Experience working with at least one major cloud provider (AWS / GCP / Azure) in data-related environments.

       

    Nice to have:

    • Knowledge of Airflow or similar orchestration tools;
    • Experience with Infrastructure as Code tools (Terraform, Terragrunt, Pulumi);
    • Understanding of DevOps practices within the data engineering domain.

       

    We Offer:

    • Participation in building modern data processes and enterprise-grade solutions;
    • Full-time schedule and the possibility of fully remote collaboration;
    • A team of skilled engineers open to knowledge sharing and continuous improvement;
    • Stable, long-term cooperation with opportunities for professional growth;
    • Comfortable and well-equipped working environment (MacBook Pro, cosy office available if preferred);
    • 18 business days of paid vacation, 10 paid sick leaves.
    • English lessons to support continuous language improvement.
  • 26 views · 1 application · 30d

    Data Engineer with Expertise in SQL Development and Snowflake

    Hybrid Remote · Ukraine · 8 years of experience · B2 - Upper Intermediate

    Project overview

     

    Our client is a leading global travel agency network specializing in luxury and experiential journeys. They are seeking to strengthen their relational database and Azure data platform through enhanced design, architecture, development, and the creation of new features.

     

    Position overview

     

    We seek a skilled Data Engineer with expertise in SQL development and Snowflake. This role focuses on building data ingestion pipelines, ensuring data integrity, and developing service layers that support external users.

     

    Technology stack

     

    Azure Cloud, SQL / T-SQL, Python, Snowflake, SQL Server

     

    Responsibilities

    • Analyze, plan, develop, deploy, and manage large, scalable, distributed data systems.
    • Develop automated tests for unit, integration, regression, performance, and build verification.
    • Understand and apply advanced principles of entity-relationship model design, proper data typing practices, index management, data management, and data security.
    • Research and prototype new product and database features, design, and architecture ahead of mainstream development.
    • Implement monitoring and logging solutions to ensure reliability and traceability of data flows.
    • Ensure security, scalability, and performance of data services exposed to external users.
    • Review designs, code, and test plans of other developers, providing recommendations for improvement or optimization.
    • Develop and maintain microservices and stateless architectures.
    • Follow defined software development lifecycle best practices.
    • Collaborate with management and stakeholders to accurately identify requirements and establish priorities.

    Requirements

    • More than 8 years of experience designing and developing solutions with SQL, including 3 years specializing in Snowflake cloud data warehouses, along with extensive work on other relational and cloud-based databases.
    • Intermediate-level knowledge of developing solutions using Python and REST APIs.
    • Experience in developing relational and non-relational data platforms/data pipelines using Azure cloud solutions.
    • Familiarity with ETL/ELT processes, data modeling, and data warehousing concepts.
    • Proficiency with Git and CI/CD tools (e.g., Azure DevOps).
    • Desire and ability to work as part of a team with minimal supervision in a results-oriented, fast-paced, dynamic environment.
    • Time zone alignment until 5 PM UTC-3 (exclusive).
    • Good spoken English.

    Nice to have

    • Database architecture and design experience.
    • Advanced Snowflake experience
    • Advanced-level knowledge of automation test creation
    • Experience working with foreign clients
    • Understanding of Agile development methodologies
    • Microsoft certificates
    • Experience with the Travel domain
    • Team player
  • 24 views · 0 applications · 26d

    Senior Data Streaming Engineer

    Hybrid Remote · Countries of Europe or Ukraine · 5 years of experience · C1 - Advanced

    🔹 Who we are!

    At Levi9, we are passionate about what we do. We love our work and together in a team, we are smarter and stronger. We are looking for skilled team players who make change happen. Are you one of these players?

     

    🔹 About the role

    As a Data Streaming Engineer in the customer team, you will leverage millions of daily connections with readers and viewers across the online platforms as a competitive advantage to deliver reliable, scalable streaming solutions. You will collaborate closely with analysts, data scientists and developers across all departments throughout the entire customer organisation. You will design and build cloud-based data pipelines, both batch and streaming, and their underlying infrastructure. In short: you live up to our principle You Build It, You Run It.

    You will be working closely with a tech stack that includes Scala, Kafka, Kubernetes, Kafka Streams, and Snowflake.
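
    The team’s stack is Scala and Kafka Streams, but purely to illustrate the consume-and-aggregate pattern behind a real-time reader profile (topic name and event fields below are made up), a sketch of the idea in Python could look like this:

```python
# Illustration only: the actual stack is Scala/Kafka Streams; this sketch just shows the
# consume-and-aggregate shape. Topic name and event fields are hypothetical.
import json
from collections import Counter, defaultdict

from kafka import KafkaConsumer  # kafka-python

consumer = KafkaConsumer(
    "page-views",                                   # hypothetical topic
    bootstrap_servers="localhost:9092",
    group_id="profile-builder-sketch",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

# In-memory stand-in for a real profile store (key-value store, Snowflake table, etc.).
profiles = defaultdict(Counter)

for message in consumer:
    event = message.value
    user_id = event.get("user_id")
    category = event.get("article_category")
    if user_id and category:
        profiles[user_id][category] += 1  # crude per-reader interest profile
```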

     

    🔹 Responsibilities

    • Deliver reliable, scalable streaming solutions
    • Collaborate closely with analysts, data scientists and developers across all departments throughout the entire organisation
    • Design and build cloud-based data pipelines, both batch and streaming, and their underlying infrastructure
    • You Build It, You Run It.
    • Building a robust real-time customer profile by aggregating their online behaviour and allowing the usage of this profile to recommend other articles on our online platforms.
    • Co-develop and cooperate on streaming architectures from inception and design, through deployment, operation and refinement to meet the needs of millions of real-time interactions.
    • Closely collaborate with business stakeholders, data scientists and analysts in our daily work, data engineering guild and communities of practice.
    • You play a leading role in harmonising our data landscape across countries, departments, and acquisitions

       

    🔹 Requirements

    • Experience implementing highly available and scalable big data solutions
    • In-depth knowledge of at least one cloud provider, preferably AWS
    • Proficiency in languages such as Scala, Python, or shell scripting, specifically in the context of streaming data workflows
    • Experience with Infrastructure as Code and CI/CD pipelines
    • Full understanding of modern software engineering best practices
    • Experience with Domain-driven design
    • DevOps mindset
    • You see the value in a team and enjoy working together with others, also with techniques like pair programming
    • You either have an AWS certification or are willing to achieve AWS certification within 6 months (minimum: AWS Certified Associate)

       

    🔹 Interview stages

    • HR interview
    • Technical interview in English
    • Test assignment
    • Final interview

       

    🔹 9 reasons to join us:

    1. Today we’re working with the technology of tomorrow.
    2. We don’t wait for a change. We are the change.
    3. We’re experts in creating experts (Levi9 academy, Lead9 program for leaders).
    4. No micromanagement. We are free birds with a clear understanding of what high performance is!
    5. Learning in Levi9 never stops (unlimited Udemy for Business, meetups, English & German courses, professional trainings).
    6. Here you can train your body and mind.
    7. We’ve gathered the best locations: comfortable, cozy and pet-friendly offices in Kyiv (5 minutes from Olimpiyska metro station) and Lviv (overlooking the Stryiskyi Park) with regular offline internal events.
    8. We have a master’s degree in work-life balance.
    9. We are actively supporting Ukraine with constant donations and volunteering
  • 35 views · 4 applications · 25d

    Data Engineer

    Hybrid Remote · Countries of Europe or Ukraine · Product · 5 years of experience · B2 - Upper Intermediate

     

    🔥 We’re looking for a highly skilled Data Expert! 🔥

     

    Product | Remote

     

    About the role

    We’re looking for a data engineer who bridges technical depth with curiosity. You’ll help Redocly turn data into insight, driving smarter product, growth, and business decisions.

     

    This role combines data governance, analytics, and development. You’ll build reliable data pipelines, improve observability, and uncover meaningful patterns that guide how we grow and evolve.

     

    You’ll work closely with product and technical teams to analyze user behavior, run experiments, build predictive models, and turn complex findings into actionable recommendations. You’ll also design and support systems for collecting, transforming, and analyzing data across our stack.
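
    Purely as an example of the experiment-evaluation side of this role (the counts below are made up), evaluating an A/B test on conversion rates often comes down to a two-proportion z-test:

```python
# Illustrative A/B evaluation: two-proportion z-test on hypothetical conversion counts.
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical results: conversions and visitors for control (A) and variant (B).
conversions = [412, 480]
visitors = [10_000, 10_050]

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

# A pre-registered threshold (e.g. alpha = 0.05) decides whether the variant ships.
if p_value < 0.05:
    print("Difference is statistically significant at the 5% level.")
else:
    print("No significant difference detected.")
```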

     

    You're a great fit if you have 

    • 5+ years of software engineering experience.
    • 3+ years focused on data engineering, data science or analytics.
    • Strong SQL skills and experience with data modeling (dbt preferred).
    • Solid understanding of statistics, hypothesis testing, and experimental design.
    • Proven experience in data governance, analytics, and backend systems.
    • Familiarity with columnar databases or analytics engines (ClickHouse, Postgres, etc.).
    • Experience with modern data visualization tools.
    • Strong analytical mindset, attention to detail, and clear communication.
    • Passionate about clarity, simplicity, and quality in both data and code.
    • English proficiency: Upper-Intermediate or higher.

     

     

    What you’ll do 

    • Analyze product and user behavior to uncover trends, bottlenecks, and opportunities.
    • Design and evaluate experiments (A/B tests) to guide product and growth decisions.
    • Build and maintain data pipelines, ETL processes, and dashboards for analytics and reporting.
    • Develop and validate statistical and machine learning models for prediction, segmentation, and forecasting.
    • Design and optimize data models for new features and analytics (e.g., using dbt).
    • Work with event-driven architectures and standards like AsyncAPI and CloudEvents.
    • Collaborate with engineers to improve data quality, consistency, and governance across systems.
    • Use observability and tracing tools (e.g., OpenTelemetry) to monitor and improve performance.
    • Create visualizations and reports that clearly communicate results to technical and non-technical audiences.
    • Support existing frontend and backend systems related to analytics and data processing.
    • Champion experimentation, measurement, and data-driven decision-making across teams.

     

    Nice to have

    • Understanding of product analytics and behavioral data.
    • Experience with causal inference or time-series modeling.
    • Strong proficiency with Node.js, React, JavaScript, and TypeScript.
    • Experience with frontend or backend performance optimization.
    • Familiarity with Git-based workflows and CI/CD for data pipelines.
       

    How you’ll know you’re doing a great job

    • Teams make better product decisions, faster, because of your insights.
    • Data pipelines are trusted, observable, and performant.
    • Experiments drive measurable product and business outcomes.
    • Metrics and dashboards are used across teams, not just built once.
    • You’re the go-to person for clarity when questions arise about “what the data says.”

     

    About Redocly

    Redocly builds tools that accelerate API ubiquity. Our platform helps teams create world-class developer experiences, from API documentation and catalogs to internal developer hubs and public showcases. We're a globally distributed team that values clarity, autonomy, and craftsmanship. You'll work alongside people who love developer experience, storytelling, and building tools that make technical work simpler and more joyful.

    Headquarters: Austin, Texas, US. There is also an office in Lviv, Ukraine.

     

    Redocly is trusted by leading tech, fintech, telecom, and enterprise teams to power API documentation and developer portals. Redocly’s clients range from startups to Fortune 500 enterprises.

    https://redocly.com/

     

    Working with Redocly

    • Team: 4-6 people (middle-seniors)
    • Team’s location: Ukraine & Europe
    • There are functional, product, and platform teams; each has its own ownership and line structure, and the teams themselves decide when to hold weekly meetings.
    • Cross-functional teams are formed for each two-month cycle, giving team members the opportunity to work across all parts of the product.
    • Methodology: Shape Up

     

    Perks

    • Competitive salary based on your expertise (approximately $6,000 - $6,500 per month)
    • Full remote, though you’re welcome to come to the office occasionally if you wish.
    • Cooperation on a B2B basis with a US-based company (for EU citizens) or under a gig contract (for Ukraine).
    • After a year of working with the company, you can buy a certain number of the company’s shares.
    • Around 30 days of vacation (unlimited, but let’s keep it reasonable)
    • 10 working days of sick leave per year
    • Public holidays according to the standards
    • No trackers and screen recorders
    • Working hours: EU/UA timezone. Working day: 8 hours. Most people start working at 10-11 am.
    • Equipment provided: MacBooks (M1-M4)
    • Regular performance reviews

     

    Hiring Stages

    • Prescreening (30-45 min)
    • HR Call (45 min)
    • Initial Interview (30 min)
    • Trial Day (paid)
    • Offer

     

    If you are an experienced Data Scientist, and you want to work on impactful data-driven projects, we’d love to hear from you! 


    Apply now to join our team!

  • 29 views · 0 applications · 22d

    Data Engineer

    Hybrid Remote · Ukraine · 5 years of experience · B2 - Upper Intermediate

    The project is a global leader in sports, music, travel, culture, and more. It creates and delivers extraordinary experiences by providing official access and premium hospitality at marquee global sporting events, global music tours, etc.

    Tech Stack
    Snowflake | dbt Cloud | GitHub | SQL | Omni Analytics | Tableau.


    Requirements:

    • 5+ years building dimensional data models in production (facts and dimensions)
    • Expert SQL + dbt with deep understanding of materialization strategies (views, tables, incremental)
    • Snowflake production experience
    • Git workflows: pull requests, code review, peer review collaboration
    • Code quality adherence: linting, pre-commit hooks, following team conventions
    • Strong data validation and quality assurance skills
    • Experience debugging and correcting transformation logic
    • Can deliver independently with minimal supervision while collaborating effectively with Analytics/Data Platform teams
    • Portfolio required: GitHub or similar showing dimensional modeling/dbt work
    • English: Upper-Intermediate (speaking and writing) and fluent Ukrainian.

    Nice to Have:

    • E-commerce/ticketing/transactional business experience
    • Experience with cross-system data reconciliation
    • BI platform integration knowledge (Tableau, Omni Analytics, Looker)
    • US Central timezone overlap availability

     

    Key Responsibilities:

    • Build dimensional analytics infrastructure (dbt + Snowflake) for a high-volume live events ticketing & hospitality business.
    • Transform existing Snowflake source data into production-ready BI models for commercial strategy, pricing, and revenue optimization.
    • Significant validation and data quality work required to handle complex business logic, null handling, and cross-system reconciliation.

     

    We offer:

    • Big stable project with a professional team.
    • Enterprise project.
    • Friendly and supportive work environment.
    • Competitive salary and benefits package.
    • Room for personal and professional growth.
    • Zero bureaucracy.
    • 18 business days of paid vacation + public holidays compensation.
    • Insurance Fund of the company.
    • Coverage of all professional studies.
    • Coverage of sick leaves, sports activities, and English language courses.

     

    📌 If you believe this role could be a great match for you, please send us your resume via the link, and we’ll be happy to get in touch with you.

  • 16 views · 0 applications · 16d

    Azure Data Engineer

    Hybrid Remote · Ukraine · 5 years of experience · B2 - Upper Intermediate

    Client

     

    The client is a premier institution offering world-class postgraduate business education, including MBA, Executive MBA, and specialised finance and management programs. Its mission is to transform global business practices. The client is globally recognized for its rigorous academics, exceptional faculty, and cutting-edge research. It consistently ranks among the world’s top business schools, securing high positions in global MBA rankings. The client fosters leadership and innovation, equipping students for impactful careers in the international business landscape. Join a great company, not merely an individual project.
     

    Project overview

     

    The Client recently completed their Data & AI Strategy and Roadmap, establishing a foundation for a data-driven future. The assessment phase reviewed the current state of Data and AI, analyzing technology, processes, resources, and structure, and provided strategic recommendations aligned with the School’s 5-year transformation plan. Building on this, the discovery phase focused on data governance, use cases, business drivers, service offerings, technology, and roles, delivering further strategic insights. With this groundwork complete, The Client is now entering the delivery phase, which includes implementing a Data Platform, a first use case, and establishing Data Governance roles, processes, and technologies.

     

    Responsibilities
     

    • Design, build, and optimize complex ETL/ELT pipelines using Azure Data Factory, including mapping data flows and orchestration.
    • Develop scalable data processing solutions using Azure Synapse Analytics (dedicated SQL pools, Spark pools) for enterprise-grade analytics.
    • Implement and maintain Medallion architecture (Bronze/Silver/Gold) on Azure Data Lake Storage (ADLS Gen2) with proper data organization, security, and governance.
    • Build large-scale data transformation workflows using Azure Databricks/Synapse Spark with PySpark/Python (a rough Bronze-to-Silver sketch follows this list).
    • Develop and integrate Azure Functions (Python/C#) to enable event-driven processing and custom pipeline logic.
    • Implement CI/CD pipelines for data platforms using Azure DevOps Pipelines or GitHub Actions.
    • Automate cross-environment deployments using IaC tools (ARM, Bicep, Terraform).
    • Optimize SQL queries, database objects, and Spark jobs for performance and reliability.
    • Design dimensional models (Star/Snowflake schemas) and develop production-grade data models.
    • Collaborate with cross-functional teams to clarify requirements, communicate architecture decisions, and resolve data issues.
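
    The following is only a rough sketch of the Bronze-to-Silver step in the Medallion layout referred to above; the storage account, container paths, and column names are invented, and a real implementation on Databricks would more likely write Delta tables.

```python
# Rough Bronze -> Silver sketch for a Medallion layout on ADLS Gen2 (Databricks/Synapse Spark).
# Storage account, paths, and columns are hypothetical; assumes bronze is partitioned by ingest_date.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze-to-silver-sketch").getOrCreate()

bronze_path = "abfss://bronze@examplelake.dfs.core.windows.net/sales/"
silver_path = "abfss://silver@examplelake.dfs.core.windows.net/sales/"

# Bronze: raw files as landed (the ingest_date partition column is discovered from the layout).
raw = spark.read.json(bronze_path)

# Silver: typed, de-duplicated, lightly conformed records.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("order_id").isNotNull())
)

cleaned.write.mode("overwrite").partitionBy("ingest_date").parquet(silver_path)
```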
       

    Requirements
     

    • Proven expertise with core Azure data engineering services:
    • Azure Data Factory
    • Azure Synapse Analytics (SQL & Spark)
    • Azure Data Lake Storage Gen2
    • Azure Databricks / Synapse Spark
    • Azure Functions
    • Strong command of Python and PySpark for data processing and automation.
    • Advanced SQL skills: complex queries, stored procedures, optimization, and performance tuning.
    • Solid understanding of data modeling, including design of facts, dimensions, and analytical structures.
    • Hands-on experience building and maintaining CI/CD pipelines (Azure DevOps or GitHub Actions).
    • Practical experience with Infrastructure-as-Code for multi-environment deployment (ARM, Bicep, Terraform).
    • Excellent Git knowledge: branching strategies, pull requests, code reviews.
    • Strong problem-solving abilities for debugging pipelines, resolving deployment issues, and optimizing performance.
    • Strong communication skills for effectively explaining technical solutions to technical and non-technical stakeholders.
       

    Nice to have
     

    • Experience with Microsoft Fabric ecosystem, including:
    • OneLake & Lakehouse
    • Fabric Data Engineering / Warehouse
    • Notebooks & Spark jobs
    • Power BI development experience:
    • Building dashboards and reports
    • Advanced DAX
    • Optimized data modeling
    • Row-Level Security (RLS) setup
    • Microsoft certifications:
    • DP-203: Azure Data Engineer Associate
    • DP-600: Fabric Data Engineer Associate
  • 22 views · 1 application · 16d

    Data Engineer with Expertise in SQL Development and Snowflake, Digital Platform

    Hybrid Remote · Ukraine · 4 years of experience · B2 - Upper Intermediate

    Project overview

     

    Our client is a leading global travel agency network specializing in luxury and experiential journeys. They are seeking to strengthen their relational database and Azure data platform through enhanced design, architecture, development, and the creation of new features.

     

    Position overview

     

    We seek a skilled Data Engineer with expertise in SQL development and Snowflake. This role focuses on building data ingestion pipelines, ensuring data integrity, and developing service layers that support external users.

     

    Technology stack

     

    Azure Cloud, SQL / T-SQL, Python, Snowflake, SQL Server

     

    Responsibilities
     

    • Analyze, plan, develop, deploy, and manage large, scalable, distributed data systems.
    • Develop automated tests for unit, integration, regression, performance, and build verification.
    • Understand and apply advanced principles of entity-relationship model design, proper data typing practices, index management, data management, and data security.
    • Research and prototype new product and database features, design, and architecture ahead of mainstream development.
    • Implement monitoring and logging solutions to ensure reliability and traceability of data flows.
    • Ensure security, scalability, and performance of data services exposed to external users.
    • Review designs, code, and test plans of other developers, providing recommendations for improvement or optimization.
    • Develop and maintain microservices and stateless architectures.
    • Follow defined software development lifecycle best practices.
    • Collaborate with management and stakeholders to accurately identify requirements and establish priorities.
       

    Requirements
     

    • More than 8 years of experience designing and developing solutions with SQL, including 3 years specializing in Snowflake cloud data warehouses, along with extensive work on other relational and cloud-based databases.
    • Intermediate-level knowledge of developing solutions using Python and REST APIs.
    • Experience in developing relational and non-relational data platforms/data pipelines using Azure cloud solutions.
    • Familiarity with ETL/ELT processes, data modeling, and data warehousing concepts.
    • Proficiency with Git and CI/CD tools (e.g., Azure DevOps).
    • Desire and ability to work as part of a team with minimal supervision in a results-oriented, fast-paced, dynamic environment.
    • Time zone alignment until 5 PM UTC-3 (exclusive).
    • Good spoken English.
       

    Nice to have
     

    • Database architecture and design experience.
    • Advanced Snowflake experience
    • Advanced-level knowledge of automation test creation
    • Experience working with foreign clients
    • Understanding of Agile development methodologies
    • Microsoft certificates
    • Experience with the Travel domain
    • Team player
  • 10 views · 0 applications · 16d

    Snowflake Platform Engineer/Administrator

    Hybrid Remote · Ukraine · 5 years of experience · B2 - Upper Intermediate

    Client

     

    Our client is a leading financial services business operating a comprehensive data marketplace that supports the entire Client’s business.

     

    Project overview

     

    The Snowflake platform administration team underpins the data marketplace ecosystem, including Snowflake, Confluent, dbt Labs, and Astronomer. The team is expanding to include new skills in Terraform-based platform configuration automation to better support the platform's operational needs. The role focuses on platform administration and operational stability.

     

    Position overview

     

    We are seeking an experienced Snowflake Platform Engineer/Administrator to join the team. The successful candidate will primarily deliver platform configuration automation using Terraform within a CI/CD environment, onboard new consuming applications, troubleshoot user issues, and ensure overall stability of the Snowflake environment. This role requires strong expertise in managing Snowflake environments, platform integrations, and infrastructure-as-code automation with Terraform.

    A $1,000 bonus will be provided after a successful trial period.

     

    Technology stack

     

    Snowflake platform and RBAC management
    Terraform for infrastructure as code and change management within CI/CD pipelines
    Azure cloud services, including Azure Functions, security integrations, Private Link, and authentication mechanisms
    SaaS integrations with private link connectivity
    Data ecosystem, including Confluent, dbt Labs, and Astronomer

     

    Responsibilities
     

    • Manage the full lifecycle of Snowflake environments from account setup through production deployment
    • Administer Snowflake RBAC, storage integrations, and application integrations
    • Deliver platform configuration automation using Terraform in a CI/CD environment, supporting multiple SaaS capabilities
    • Onboard new consumer applications onto the platform and provide support and troubleshooting for platform issues
    • Collaborate across teams to maintain platform stability and reliability
    • Learn and apply new technologies and best practices to enhance team capability in automation and platform administration
       

    Requirements
     

    • Expert experience managing platform changes using Terraform within CI/CD pipelines
    • Minimum 5 years of experience managing Snowflake environments, including implementation and production support
    • Strong skills with Snowflake RBAC, storage, application integrations, and knowledge of Snowpark Container Services
    • Minimum 3 years of strong Azure knowledge, including Azure Functions, security integrations, Private Link, and authentication mechanisms
    • Experience integrating SaaS capabilities using a private link
    • Ability to troubleshoot and resolve platform and integration issues effectively
    • Strong communication skills and ability to collaborate across technical and business teams
       

    Nice to have
     

    • Experience working within complex, multi-technology ecosystems
    • Ability to work autonomously and as part of a team to manage platform stability and support business needs
    • Willingness to learn new automation technologies and contribute to team growth
  • 53 views · 0 applications · 5d

    Data Engineer (Strong Middle / Senior) to $4500

    Hybrid Remote · Countries of Europe or Ukraine · Product · 5 years of experience · C1 - Advanced

    We’re looking for an experienced Data Engineer to help our team design and implement reliable, scalable, and high-performance data solutions. If you have hands-on experience building data pipelines, working with cloud technologies, and optimizing complex systems, this position is for you.

     

    Requirements:

    • 3-4+ years of professional experience in Data Engineering;
    • Proven experience in designing and deploying Data Lake or Data Warehouse pipelines;
    • Strong understanding of ETL/ELT principles and large-scale data processing;
    • Proficiency in SQL;
    • Practical experience with Python for data processing and automation tasks;
    • Hands-on experience with Spark (Cloud / On-Prem / Databricks);
    • Experience working with at least one major cloud provider (AWS / GCP / Azure) in data-related environments.

       

    Nice to have:

    • Knowledge of Airflow or similar orchestration tools;
    • Experience with Infrastructure as Code tools (Terraform, Terragrunt, Pulumi);
    • Understanding of DevOps practices within the data engineering domain.

       

    We Offer:

    • Participation in building modern data processes and enterprise-grade solutions;
    • Full-time schedule and the possibility of fully remote collaboration;
    • A team of skilled engineers open to knowledge sharing and continuous improvement;
    • Stable, long-term cooperation with opportunities for professional growth;
    • Comfortable and well-equipped working environment (MacBook Pro, cosy office available if preferred);
    • 18 business days of paid vacation, 10 paid sick leaves.
    • English lessons to support continuous language improvement.
  • 25 views · 0 applications · 5d

    Middle Data/ETL Engineer

    Office Work · Ukraine (Lviv) · 3 years of experience · B2 - Upper Intermediate

    We are looking for a Middle Data/ETL Engineer with 3+ years of experience for a successful web project from the USA.

    As a Software Engineer, you will be a key contributor within a high-performing engineering team responsible for building, enhancing, and supporting our SaaS technology products. You will apply strong engineering fundamentals, deliver high-quality code, and model best practices in solution design, collaboration, and continuous improvement.

     

    Requirements

     

    Must have

    • 3+ years of experience building enterprise-grade software solutions

    • Strong experience with .NET and C#

    • Hands-on experience with REST APIs, including JSON, and related standards

    • Experience building backend or middleware services

    • Experience with SQL and/or NoSQL databases (e.g., MySQL, MongoDB)

    • Strong problem-solving ability and a passion for tackling complex challenges

    • High personal integrity, accountability, and a continuous-learning mindset

    • Team-oriented with a customer-first approach

    • Positive, self-motivated, and adaptable, with a sense of humor and a collaborative work style

    • English level: Upper-Intermediate or Advanced

     

    Will be a plus

    • Experience in healthcare systems, especially around claims, eligibility, or provider data

    • Strong communication skills with the ability to explain technical concepts to non-technical audiences

    • Technical documentation experience

    • Experience supporting end users or conducting user training

    • Experience gathering and refining technical requirements from non-technical stakeholders

    • Experience with Agile development practices (Kanban, Scrum)

    • Experience with JavaScript technologies or frameworks (e.g., Node.js, Angular)

    • Experience with MongoDB and/or Snowflake

     

    Responsibilities

    • Design, develop, and implement enterprise-class SaaS software solutions

    • Perform system analysis, development, troubleshooting, and testing of software products

    • Integrate with third-party platforms, APIs, and services to extend and enhance product capabilities

    • Collaborate as an empowered and engaged member of an agile engineering team

    • Contribute independently to team initiatives, feature implementations, and technical decision-making

    • Follow established engineering standards and participate in peer reviews

    • Participate in improving engineering processes, tools, and automation where needed

     

    About Project

    Our client is a Software company from the USA.

     

    Product

    This platform is dedicated to ensuring payment integrity for health payers. Our project focuses on creating and implementing cutting-edge technology solutions to tackle current and future challenges in the healthcare industry for health plans. By bringing together payers, providers, and business partners on one technology platform, we aim to facilitate transparent communication and support collaboration.

     

    Stage

    Active phase of development

     

    Project team

    Existing team on the client side. We work with the US team (Eastern and Central time zones), around 10 people. Our team is 17 people (9 MEAN stack developers, 8 QA, 1 PM)

     

    Project Technologies

    MEAN (MongoDB, Express.js, Angular and Node.js)

     

    Work Schedule

    Full-time working day in our office (flexible hours).

     

    Interview Stages

    1st stage: Call with Recruiter (30 minutes)
    2nd stage: Interview with our Senior Engineer and Project Manager (1 hour)
    3rd stage: Client interview

     

    Our Benefits

    • Projects with a modern JS stack (React.js, React Native, Vue.js, Angular, Node.js)
    • Strong JavaScript community at the company (80 developers)
    • Paid vacations and sick leaves, additional days off, relocation bonus
    • Education: regular tech talks, educational courses, paid certifications, English classes
    • One of the best IT employers in Lviv based on DOU rating

     

    Recruiter - Anastasiia Vaskiv

  • 22 views · 0 applications · 4d

    Senior Data Engineer (Enterprise and Game Solution Unit)

    Hybrid Remote · Countries of Europe or Ukraine · 5 years of experience · B2 - Upper Intermediate

    Are you an experienced Data Engineer ready to tackle complex, high-load, and data-intensive systems? We are looking for a Senior professional to join our team in Ukraine, Europe, working full-time on a project that will make a real impact in the public sector.

    At Sigma Software, we specialize in delivering innovative solutions for enterprise clients and public organizations. In this role, you will contribute to building an integrated platform that collects, processes, and visualizes critical indicators, enabling better decision-making and analytics.

    Why join us? You will work with a modern big data stack, have end-to-end involvement from ingestion to machine learning workflows, and be part of a professional team that values ownership, collaboration, and continuous improvement.

    Project
    You will be involved in developing an integrated platform that processes both batch and streaming data, ensures secure and governed data environments, and supports advanced analytics and machine learning workflows. The solution will leverage modern big data technologies to provide actionable insights for the public sector.
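
    Purely for illustration (broker addresses, topic, and target paths are invented), the streaming-ingestion side of such a platform is often a Spark Structured Streaming job that lands Kafka events into the raw layer of the data lake:

```python
# Illustration only: Spark Structured Streaming from Kafka into a raw data-lake layer.
# Requires the spark-sql-kafka package; servers, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("streaming-ingest-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "indicator-events")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; keep the payload plus ingestion metadata.
raw = events.select(
    F.col("value").cast("string").alias("payload"),
    F.col("topic"),
    F.col("timestamp").alias("kafka_ts"),
)

query = (
    raw.writeStream.format("parquet")
    .option("path", "s3a://datalake/raw/indicator_events/")
    .option("checkpointLocation", "s3a://datalake/_checkpoints/indicator_events/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```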

    Responsibilities

    • Design and implement data ingestion pipelines for batch and streaming data
    • Configure and maintain data orchestration workflows (Airflow, NiFi) and CI/CD automation for data processes
    • Design and organize data layers within Data Lake architecture (HDFS, Iceberg, S3)
    • Build and maintain secure and governed data environments using Apache Ranger, Atlas, and SDX
    • Develop SQL queries and optimize performance for analytical workloads in Hive/Impala
    • Collaborate on data modeling for analytics and BI, ensuring clean schemas and dimensional models
    • Support machine learning workflows using Spark MLlib or Cloudera Machine Learning (CML)

    Requirements

    • Proven experience in building and maintaining large-scale data pipelines (batch and streaming)
    • Strong knowledge of data engineering fundamentals: ETL/ELT, data governance, data warehousing, Medallion architecture
    • Strong SQL skills for Data Warehouse data serving
    • Minimum 3 years of experience in Python or Scala for data processing
    • Hands-on experience with Apache Spark, Kafka, Airflow, and distributed systems optimization
    • Experience with Apache Ranger and Atlas for security and metadata management
    • Upper-Intermediate English proficiency

    Will be a plus

    • Experience with Cloudera Data Platform (CDP)
    • Advanced SQL skills and Hive/Impala query optimization
    • BS in Computer Science or related field
    • Exposure to ML frameworks and predictive modeling

    Personal profile

    • Ownership mindset and proactive approach
    • Ability to drive initiatives forward and suggest improvements
    • Team player with shared responsibility for delivery speed, efficiency, and quality
    • Excellent written and verbal communication skills
  • 18 views · 0 applications · 2d

    Senior DB/BI Developer

    Office Work · Ukraine (Lviv) · Product · 3 years of experience · B2 - Upper Intermediate

    About us:

    EveryMatrix is a leading B2B SaaS provider delivering iGaming software, content and services. We provide casino, sports betting, platform and payments, and affiliate management to 200 customers worldwide.

    But that’s not all! We’re not just about numbers, we’re about people. With a team of over 1000 passionate individuals spread across twelve countries in Europe, Asia, and the US, we’re all united by our love for innovation and teamwork.

    EveryMatrix is a member of the World Lottery Association (WLA) and European Lotteries Association. In September 2023 it became the first iGaming supplier to receive WLA Safer Gambling Certification. EveryMatrix is proud of its commitment to safer gambling and player protection whilst producing market leading gaming solutions.

    Join us on this exciting journey as we continue to redefine the iGaming landscape, one groundbreaking solution at a time.

    We are looking for a passionate and dedicated Senior DB/BI Developer to join our team in Lviv!
     

    About the job:
    Our main stack: DB – BigQuery, PostgreSQL; ETL – Apache Airflow, Apache NiFi; Streaming – Apache Kafka.
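
    For flavour only (project, dataset, and table names below are invented), refreshing a small data mart in BigQuery from Python is typically a single SQL statement run through the client library:

```python
# Illustration only: rebuild a small BigQuery data mart table.
# Project, dataset, table, and columns are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
CREATE OR REPLACE TABLE `example-project.marts.daily_bets` AS
SELECT
  DATE(placed_at) AS bet_date,
  brand_id,
  COUNT(*)        AS bets,
  SUM(stake_eur)  AS turnover_eur
FROM `example-project.raw.bets`
GROUP BY bet_date, brand_id
"""

client.query(sql).result()  # run the statement and wait for completion
print("daily_bets refreshed")
```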

    What You’ll get to do:

    • Develop real time data processing and aggregations;
    • Create and modify data marts (enhance our data warehouse);
    • Take care of internal and external integrations;
    • Forge various types of reports.
       

    What You need to know:

    • Bachelor’s/Master’s degree in S.T.E.M.;
    • Understanding of Computer Science, Software Engineering, Algorithms, Operating Systems, Networking, etc.;
    • Good knowledge of at least one RDBMS (PostgreSQL, MSSQL, Oracle, MySQL): understanding of how the database works, query optimisation, indexes, partitioning.
    • Practical experience with ETL/ELT processes.
    • Experience in Data Warehouse creation or support;
    • Practical experience with at least one enterprise business intelligence platform;
    • English: Intermediate+ (reading/writing/speaking).

      Will be a plus:
    • Knowledge of analytical data warehouse systems (Google BigQuery, Azure Synapse Analytics, AWS Redshift, Snowflake);
    • Programming skills in one language (Python, Java);
    • Experience with other tools and systems like Apache Airflow, Apache NiFi.
       

    Here’s what we offer:

    • Start with 22 days of annual leave, with 2 additional days added each year, up to 32 days by your fifth year with us.
    • Stay Healthy: 3 sick leave days per year, no doctor’s note required; 30 medical leave days with medical allowance
    • Support for New Parents:
    • 21 weeks of paid maternity leave, with the flexibility to work from home full-time until your child turns 1 year old.
    • 4 weeks of paternity leave, plus the flexibility to work from home full-time until your child is 13 weeks old.

    Our office perks include on-site massages and frequent team-building activities in various locations.
     

    Benefits & Perks:

    • Daily catered lunch or monthly lunch allowance.
    • Private Medical Subscription.
    • Access online learning platforms like Udemy for Business, LinkedIn Learning or O’Reilly, and a budget for external training.
    • Gym allowance

    At EveryMatrix, we’re committed to creating a supportive and inclusive workplace where you can thrive both personally and professionally. Come join us and experience the difference!

  • 10 views · 0 applications · 2d

    Principal Data Engineer

    Hybrid Remote · Poland, Romania, Ukraine · 7 years of experience · B2 - Upper Intermediate

    As a Data Engineer you will work with our Product and Engineering team, as well as other feature development teams, to build, deliver and operate our data platform. The role is focused on analyzing and sourcing data for our teams to use; building and maintaining pipelines and automations that wrangle, cleanse, secure, govern and provide that data to teams; scaling DS prototypes to operational ML solutions; owning tests; supporting junior engineers; and contributing to the wider team's principles, practices, and tools. The role has no line management responsibilities.

     

    Our data platform is built with Python and Airflow, deployed using CI/CD, heavily exploits automations, and runs on GCP, k8s, Spark, Redis and more. Our efforts in data engineering support our adtech platform which supports hundreds of millions of ad buys annually. You’ll play a leading role in significantly scaling this further.
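
    For flavour only (the DAG id, schedule, and task bodies are invented), a minimal pipeline on a Python-and-Airflow platform like this is declared roughly as follows:

```python
# Illustration only: minimal Airflow DAG shape. DAG id, schedule, and task bodies are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_ad_events(**context):
    # A real task would pull source data into staging (e.g., GCS or BigQuery).
    print("extracting ad events for", context["ds"])


def build_features(**context):
    # A real task would run a Spark or BigQuery job that materialises model features.
    print("building features for", context["ds"])


with DAG(
    dag_id="ad_events_features_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_ad_events", python_callable=extract_ad_events)
    features = PythonOperator(task_id="build_features", python_callable=build_features)

    extract >> features
```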

    Our team consists of 100+ engineers, designers, data, and product people, working in small inter-disciplinary teams closely with creative agencies, media agencies, and with our customers, to develop and scale our DCO platform, a leading digital advertising optimization suite that delivers amazing outcomes for brands and audiences.

     

    Responsibilities: 

     

    • Work with product, product engineering, data engineering, and data science peers to build and support our AdTech platform.
    • Build data-oriented solutions that are simple, scalable, reliable, secure, maintainable, and make a measurable impact.
    • Provide support and coaching for junior developers in the team.
    • Provide our teams with the data they need to build, sell, and manage our platform, and scale DS prototypes into production solutions. Develop, deliver and maintain batch and real-time data pipelines, analysis services, workflows and orchestrations, and create and manage the platforms and data infrastructure that hold, secure, cleanse and validate, govern, and manage our data.
    • Manage our data platform, incorporating services using Airflow, CloudSQL, BigQuery, Kafka, Dataproc, and Redis running on Kubernetes and GCP.
    • Support our Data Science teams with access to data, performing code reviews, aiding model evaluation and testing, deploying models, and supporting their execution.
    • Employ modern pragmatic engineering principles, practices, and tooling, including TDD/BDD/ATDD, XP, QA Engineering, Trunk Based Development, Continuous Delivery, automation, DevSecOps, and Site Reliability Engineering.
    • Contribute to driving ongoing improvements to our engineering principles, practices, and tooling. Provide support and mentorship to junior engineers.
    • Develop and maintain a contemporary understanding of AdTech developments, industry standards, partner and competitor platform developments, and commercial models, from an engineering perspective.

     

    Requirements: 

     

    • Experience architecting ML-based solutions in conjunction with DS teams, software engineering teams, and Product teams.
    • Proven experience translating data science prototypes into production services with clear APIs, SLAs/SLOs, and acceptance criteria in high-volume, low-latency contexts (e.g., AdTech).
    • Proven experience designing, building, and operating batch/streaming feature pipelines with schema control, validation, lineage, and offline/online parity using Python, Airflow/Composer, Kafka, and BigQuery; leveraging Spark, MySQL, and Redis as appropriate.
    • Proven experience implementing reproducible ML training workflows (data prep, hyperparameter tuning, evaluation) with artifact and model versioning on public cloud (GCP strongly preferred).
    • Proven experience packaging and deploying models as containers/services with staged promotion, canary/shadow/A/B rollouts, rollbacks, and environment parity via CI/CD.
    • Proven experience running scalable inference (batch, microservice, streaming) that meets latency/error budgets, with autoscaling, observability, and SRE-style reliability practices.
    • Proven experience establishing CI/CD for data and models with automated tests, data quality gates, model regression/drift detection, and API/data contract testing.
    • Proven experience applying DevSecOps in ML systems: IAM, secrets management, network policies, vulnerability scanning, artifact signing, and policy-as-code on GCP.
    • Proven experience collaborating with data science on feature design, labeling/annotation strategies, evaluation metrics, error analysis, and defining retraining triggers/schedules.
    • Exposure to contributing to product strategy and KPI definition; planning experiments (A/B) and prioritizing ML features aligned to SaaS delivery and operational needs.
    • Exposure to coaching and uplifting teams on data/ML testing, observability, CI/CD, trunk-based development/XP, and writing clear documentation (design docs, runbooks, model/data cards).
    • Proven experience operating in ambiguous, fast-changing environments; iterating from prototype to production with safe rollouts, clear ownership, and continuous improvement.
    • Strong English, excellent influencing and communication skills, and excellent documentation skills.

     

    What’s in it for You:

     

    Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. 

     

    Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

     

    Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

     

    Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

     

    High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.
