Jobs: Data Engineer (147)
  • 70 views · 0 applications · 17d

    Sales Executive (Google Cloud+Google Workspace)

    Full Remote · Czechia · Product · 2 years of experience · English - B2

    Cloudfresh ⛅️ is a Global Google Cloud Premier Partner, Zendesk Premier Partner, Asana Solutions Partner, GitLab Select Partner, Hubspot Platinum Partner, Okta Activate Partner, and Microsoft Partner.

    Since 2017, we’ve been specializing in the implementation, migration, integration, audit, administration, support, and training for top-tier cloud solutions. Our products focus on cutting-edge cloud computing, advanced location and mapping, seamless collaboration from anywhere, unparalleled customer service, and innovative DevSecOps.

    We are seeking a dynamic Sales Executive to lead our sales efforts for GCP and GWS solutions across the EMEA and CEE regions. The ideal candidate will be a high-performing A-player with experience in SaaS sales, adept at navigating complex sales environments, and driven to exceed targets through strategic sales initiatives.

    Requirements:

    • Fluency in English and native-level Czech are essential;
    • At least 2 years of proven sales experience in SaaS/IaaS, with a documented history of achieving and exceeding sales targets, particularly in enterprise sales;
    • Sales experience specifically with GCP and/or GWS;
    • Sales or technical certifications related to Cloud Solutions are advantageous;
    • Experience in expanding new markets with outbound activities;
    • Excellent communication, negotiation, and strategic planning abilities;
    • Proficient in managing CRM systems and understanding their strategic importance in sales and customer relationship management.

    Responsibilities:

    • Develop and execute sales strategies for GCP and GWS solutions, targeting enterprise clients within the Cloud markets across EMEA and CEE;
    • Identify and penetrate new enterprise market segments, leveraging GCP and GWS to improve client outcomes;
    • Conduct high-level negotiations and presentations with major companies across Europe, focusing on the strategic benefits of adopting GCP and GWS solutions;
    • Work closely with marketing and business development teams to align sales strategies with broader company goals;
    • Continuously assess the competitive landscape and customer needs, adapting sales strategies to meet market demands and drive revenue growth.

    Work conditions:

    • Competitive Salary & Transparent Motivation: Receive a competitive base salary with commission on sales and performance-based bonuses, providing clear financial rewards for your success.
    • Flexible Work Format: Work remotely with flexible hours, allowing you to balance your professional and personal life efficiently.
    • Freedom to Innovate: Utilize multiple channels and approaches for sales, allowing you the freedom to find the best strategies for success.
    • Training with Leading Cloud Products: Access in-depth training on cutting-edge cloud solutions, enhancing your expertise and equipping you with the tools to succeed in an ever-evolving industry.
    • International Collaboration: Work alongside A-players and seasoned professionals in the cloud industry. Expand your expertise by engaging with international markets across the EMEA and CEE regions.
    • Vibrant Team Environment: Be part of an innovative, dynamic team that fosters both personal and professional growth, creating opportunities for you to advance in your career.
    • When applying to this position, you consent to the processing of your personal data by CLOUDFRESH for the purposes necessary to conduct the recruitment process, in accordance with Regulation (EU) 2016/679 of the European Parliament and of the Council of April 27, 2016 (GDPR).
    • Additionally, you agree that CLOUDFRESH may process your personal data for future recruitment processes.
  • 42 views · 6 applications · 18d

    Principal Analytics Developer

    Full Remote · EU · 3 years of experience · English - B2

    The Principal Analytics Developer is a new role that will support the newly created Product Data Domain teams. The role requires strong skills in dimensional modelling, conforming and integrating data from multiple sources, as well as experience in leading strong analytics engineering teams.
    Responsibilities:
     

    • Planning workloads and delegating tasks in an agile environment
    • Assisting with the daily operation of the organisation, including support and incidents
    • Providing feedback to team members, including constructive areas for development
    • Leading the design, implementation and maintenance of dimensional data models that promote a self-service approach to data consumption, including ensuring that data quality within the data warehouse is maintained throughout the data lifecycle
    • Defining best practices in dimensional data modelling and database design and ensuring standards are adhered to across the team
    • Mentoring, coaching and supporting other team members in developing data modelling skills through knowledge transfer
    • Automating data pipelines using proprietary technology and Airflow (a minimal sketch follows this list)
    • Using your expert knowledge of the company's products and their features to inform the design and development of data products, and upskilling the team through this knowledge
    • Developing ways of working between product data domains and other data teams within the product group
    • Creating processes for data product development, ensuring these processes are documented, and advocating their use throughout the organisation
    • Supporting analytics, data science and other colleagues outside the digital product area in managing projects and fielding queries
    • Building and maintaining strong working relationships where you might, as a specialist, have to manage the expectations of more senior colleagues
    • Working across mobile, web, television and voice platforms, supporting Product Managers and Business Analysts, and working closely with Software & Data Engineers
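
    For illustration only, automation of the kind described above is often expressed as an Airflow DAG. The following is a minimal sketch, assuming a recent Airflow 2.x install; the DAG id, task names, and schedule are hypothetical and not taken from this posting:

      # Hypothetical sketch: a daily Airflow DAG that extracts a raw batch and
      # rebuilds a dimensional model. Names are illustrative only.
      from datetime import datetime

      from airflow import DAG
      from airflow.operators.python import PythonOperator


      def extract_events(**context):
          # Placeholder for pulling the day's raw event batch from a source system.
          print("extracting events for", context["ds"])


      def build_dim_and_fact(**context):
          # Placeholder for conforming sources into star-schema tables
          # (e.g. a dim_user and fact_playback pair).
          print("rebuilding dimensional model for", context["ds"])


      with DAG(
          dag_id="product_domain_daily",   # hypothetical name
          start_date=datetime(2024, 1, 1),
          schedule="@daily",
          catchup=False,
      ) as dag:
          extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
          model = PythonOperator(task_id="build_dim_and_fact", python_callable=build_dim_and_fact)

          extract >> model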

       

    Requirements:
     

    • Extensive (5+ years) experience in managing teams building data warehouses / analytics from a diverse set of data sources (including event streams, various forms of batch processing)
    • At least 5 years’ experience in a Data Analyst, Data Modelling, Data Engineering or Analytics Engineering role, preferably in digital products, with an interest in data modelling and ETL processes
    • Proven experience in dimensional modelling of complex data at the conceptual, logical and physical layers.
    • Experience designing star schemas
    • Excellent SQL skills for extracting and manipulating data. Experience with tools such as DBT, Looker and Airflow would be an advantage.
    • Good knowledge of analytical database systems (Redshift, Snowflake, BigQuery).
    • Comfortable working alongside cross-functional teams interacting with Product Managers, Engineers, Data Scientists, and Analysts.
    • Knowledge of digital products and their components, as well as what metrics affect their performance.
    • An understanding of how digital products use experimentation.
    • Some experience coding in R or Python.
    • A good understanding of on-demand audio and video media products, with a knowledge of key competitors.
       

    Will be a plus:

     

    • Ability to listen to others’ ideas and build on them 
    • Ability to clearly communicate to both technical and non-technical audiences.
    • Ability to collaborate effectively, working alongside other team members towards the team’s goals, and enabling others to succeed, where possible.
    • Ability to prioritise. A structured approach and the ability to bring others on the journey.
    • Strong attention to detail
       
  • 119 views · 13 applications · 18d

    Senior Solana Engineer (Smart Wallet)

    Full Remote · Countries of Europe or Ukraine · 3 years of experience · English - B1

    Senior Solana Developer - CoFo Neobank

    About the Project

    We're building CoFo Neobank – the first AI-first smart wallet on Solana that brings the banking app experience (like Revolut, Robinhood) into the on-chain environment.

    Our goal is to abstract blockchain complexity. We're building an architecture where every user gets a Smart Account (a programmable account, not a simple EOA) that supports multi-factor authentication (2/3 Multisig), access recovery, spending limits, and native integration of complex financial products (Staking, Yield, Perps, RWA).

    Core Responsibilities

    • Smart Account Architecture Development: Design and write custom Rust programs (Anchor) for managing user accounts. Implement multisig logic (Device Key + 2FA Key), key rotation, and access recovery (Social Recovery).

    • DeFi Composability (Integrations): Write adapters and CPI (Cross-Program Invocation) calls to integrate external protocols directly into the smart account:

    • Swap: Aggregation through Jupiter
    • Yield & Lending: Integration with Kamino, MarginFi, Meteora
    • Perps: Integration with Drift Protocol

    • Security and Access Control: Implement a spending limits system, protocol whitelisting, and replay attack protection.

    • Portfolio Logic: Develop on-chain structures for position tracking (storing data about deposits, debts, PnL) for fast frontend/AI reading.

    • Gas Abstraction: Implement mechanisms for paying fees on behalf of users (Fee Bundling / Gas Tank).

    Requirements (Hard Skills)

    • Expert Rust & Anchor: Deep understanding of Solana Sealevel runtime, memory management, PDAs, and Compute Units (CU) limitations.

    • Account Abstraction Experience: Understanding of how to build smart contract wallets that differ from standard system accounts.

    • DeFi Integration Experience: You’ve already worked with SDKs or IDLs of major Solana protocols (Jupiter, Kamino, Drift, etc.). You understand what CPI is and how to safely call external code.

    • Cryptography: Understanding of signature operations (Ed25519), transaction verification, and building secure multisig schemes.

    • Security Mindset: Experience with audits, knowledge of attack vectors on Solana (re-entrancy, account substitution, ownership checks).

    Nice to Have

    • Experience with Privy (for authentication)
    • Understanding of cross-chain bridges (Wormhole/LayerZero) for implementing deposits from other networks
    • Experience with tokenized assets (RWA) and Token-2022 standard

    Tech Stack

    • Solana (Rust, Anchor Framework)
    • Integrations: Jupiter, Kamino, Drift, MarginFi
    • Infrastructure: Helius, Privy

    We Offer

    • Work on a product that's changing UX in DeFi
    • Complex architectural challenges (not just another token fork, but sophisticated wallet infrastructure)
    • Competitive compensation in stablecoins/fiat + project options/tokens

  • 64 views · 31 applications · 18d

    Python Data Engineer

    Full Remote · Worldwide · 5 years of experience · English - B2

    Core Responsibilities

    • Data Pipeline Management: Develop, optimize, and maintain scalable data pipelines to ensure high-quality data flow.

    • API Development: Build and maintain high-performance backend APIs using FastAPI (a minimal sketch follows this list).

    • System Reliability: Proactively identify bottlenecks and improve system stability within existing infrastructures.

    • Collaboration: Work closely with cross-functional teams to integrate AWS services and workflow orchestration tools into the production environment.
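
    As a rough illustration of the FastAPI responsibility above, a minimal service might look like the sketch below; the endpoint paths and the PipelineRun model are hypothetical, not part of this posting:

      # Minimal FastAPI sketch (hypothetical endpoints, not the actual API).
      from fastapi import FastAPI
      from pydantic import BaseModel

      app = FastAPI(title="data-service")


      class PipelineRun(BaseModel):
          pipeline: str
          rows_processed: int


      @app.get("/health")
      def health() -> dict:
          # Lightweight liveness probe used by monitoring.
          return {"status": "ok"}


      @app.post("/runs")
      def record_run(run: PipelineRun) -> dict:
          # A real implementation would persist the run to a database.
          return {"accepted": True, "pipeline": run.pipeline, "rows": run.rows_processed}

    Run locally with, for example, uvicorn app:app --reload (assuming the file is saved as app.py).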

     

    Required Qualifications 

    • Experience: 3+ years of professional Python development experience.

    • Databases: Strong proficiency in both SQL and NoSQL database design and management.

    • DevOps Tools: Hands-on experience with Docker, CI/CD pipelines, and Git version control.

    • Frameworks: Proven experience building applications with FastAPI.

    • Cloud & Orchestration: Practical experience with AWS services and familiarity with Airflow (or similar workflow orchestration tools).

    • Communication: Upper-Intermediate level of English (written and spoken) for effective team collaboration.

    Preferred Skills (Nice to Have)

    • Experience within the Financial Domain.

    • Hands-on experience with Apache Spark and complex ETL pipelines.

    • Knowledge of container orchestration using Kubernetes.

    • Exposure to or interest in Large Language Models (LLMs) and AI integration.

  • 47 views · 11 applications · 18d

    Senior Data Engineer

    Worldwide · Product · 4 years of experience · English - C1

    How about building a high-load data architecture that handles millions of transactions daily?
    We're looking for a Senior Data Engineer with a path to grow into a Data Lead, ready to design scalable pipelines from scratch.
    An international iGaming company with a data-first mindset.
    Remote, top salary.

     

    Responsibilities

    – Build and run scalable pipelines (batch + streaming) that power gameplay, wallet, and promo analytics.

    – Model data for decisions (star schemas, marts) that Product, BI, and Finance use daily.

    – Make things reliable: tests, lineage, alerts, SLAs. Fewer surprises, faster fixes.

    – Optimize ETL/ELT for speed and cost (partitioning, clustering, late arrivals, idempotency); a minimal idempotency sketch follows this list.

    – Keep promo data clean and compliant (PII, GDPR, access controls).

    – Partner with POs and analysts on bets/wins/turnover KPIs, experiment readouts, and ROI.

    – Evaluate tools, migrate or deprecate with clear trade-offs and docs.

    – Handle prod issues without drama, then prevent the next one.
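
    As a minimal sketch of the idempotency point above (illustrative only: sqlite3 stands in for the warehouse, and the table name is made up), re-running a day's load should replace that day's slice rather than duplicate it:

      # Hypothetical sketch: an idempotent daily load. Replaying the same day
      # (late data, retries) does not double-count rows.
      import sqlite3

      def load_day(conn: sqlite3.Connection, event_date: str, rows: list) -> None:
          with conn:  # single transaction: delete the day's slice, then reinsert it
              conn.execute("DELETE FROM fact_bets WHERE event_date = ?", (event_date,))
              conn.executemany(
                  "INSERT INTO fact_bets (event_date, player_id, bet_amount) VALUES (?, ?, ?)",
                  [(event_date, player_id, amount) for player_id, amount in rows],
              )

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE fact_bets (event_date TEXT, player_id TEXT, bet_amount REAL)")

      batch = [("p1", 10.0), ("p2", 25.5)]
      load_day(conn, "2024-06-01", batch)
      load_day(conn, "2024-06-01", batch)  # replay is safe
      print(conn.execute("SELECT SUM(bet_amount) FROM fact_bets").fetchone()[0])  # 35.5, not 71.0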

     

     

    Requirements

    – 4+ years building production data systems. You’ve shipped, broken, and fixed pipelines at scale.

    – SQL that sings and Python you’re proud of.

    – Real experience with OLAP and BI (Power BI / Tableau / Redash – impact > logo).

    – ETL/ELT orchestration (Airflow/Prefect or similar) and CI/CD for data.

    – Strong grasp of warehouses & lakes: incremental loads, SCDs, partitioning.

    – Data quality mindset: contracts, tests, lineage, monitoring.

    – Product sense: you care about player/client impact, not just rows processed.

    ✨ Nice to Have (tell us if you’ve got it)

    – Kafka (or similar streaming), ClickHouse (we like it), dbt (modular ELT).

    – AWS data stack (S3, IAM, MSK/Glue/Lambda/Redshift) or equivalents.

    – Containers & orchestration (Docker/K8s), IaC (Terraform).

    – Familiarity with AI/ML data workflows (feature stores, reproducibility).

    – iGaming context: provider metrics (bets / wins / turnover), regulated markets, promo events.

     

     

    We offer

    – Fully remote (EU-friendly time zones) or Bratislava/Malta/Cyprus if you like offices.

    – Unlimited vacation + paid sick leave.

    – Quarterly performance bonuses.

    – No micromanagement. Real ownership, real impact.

    – Budget for conferences and growth.

    – Product-led culture with sharp people who care.

     

     

    🧰 Our Day-to-Day Stack (representative)
    Python, SQL, Airflow/Prefect, Kafka, ClickHouse/OLAP DBs, AWS (S3 + friends), dbt, Redash/Tableau, Docker/K8s, GitHub Actions.

  • 53 views · 9 applications · 19d

    Data Engineer

    Ukraine · 4 years of experience · English - B2

    Role Summary

    A key role in our data engineering team, working closely with the rest of the technology team to provide a first class service to both internal and external users. In this role you will be responsible for building solutions that allow us to use our data in a robust, flexible and efficient way while also maintaining the integrity of our data, much of which is of a sensitive nature.

    Role and Responsibilities

    Manages resources (internal and external) in the delivery of the product roadmap for our data asset. Key responsibilities include, but are not limited to:

    • Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes
    • Work closely with the development and product teams (both internal and external) to ensure that products meet the required specification prior to release.
    • Working closely with our technology colleagues throughout the delivery lifecycle to ensure that all data related processes are efficient and accurate
    • Providing expert assistance with design and implementation of all new products. All of our new technology stack has data at its heart.
    • Ensuring data is available for business and management reporting purposes.
    • Assist with the development and refinement of the agile process.
    • Be an advocate for best practices and continued learning
    • Strong technical understanding of the data experience
    • Ensure the ongoing maintenance of your own CPD
    • Carry out all duties in a manner that always reflects Financial Wellness Group’s values and principles

    Essential Criteria

    • Extensive knowledge of using Python to build ETL and ELT products in AWS using Lambda and Batch processing.
    • A keen understanding of developing and tuning Microsoft SQL Server.
    • Exposure to development in Postgres.
    • A good understanding of CI/CD for data and the challenges inherent.
    • Ability to use Source Control Systems such as Git/Azure DevOps
    • An understanding of dimensional modelling and data warehousing methodologies and an interest in Data Lakehousing technologies.
    • An understanding of infrastructure-as-code provisioning (for example, Terraform)
    • The ability to rapidly adapt to new technologies and technical challenges.
    • The flexibility to quickly react to changing business priorities.
    • A team player, with a natural curiosity and a desire to learn new skills
    • An interest in finding the β€˜right way’
    • Passionate about data delivery and delivering change

    What To Expect From Digicode?

    🌎 Work from Anywhere: From an office, home, or travel continuously if that’s your thing. Everything we do is online. As long as you have the Internet and your travel nomad lifestyle doesn’t affect the work process (you meet all deadlines and are present at all the meetings), you’re all set.

    💼 Professional Development: We offer great career development opportunities in a growing company, international work environment, paid language classes, conference and education budget, & internal 42 Community training.

    🧘‍♂️ Work-life Balance: We provide employees with 18+ paid vacation days and paid sick leave, flexible schedule, medical insurance for employees and their children, monthly budget for things like a gym or pool membership.

    🙌 Culture of Openness: We’re committed to fostering a community where everyone feels welcome, seen, and heard, with minimal bureaucracy, and a flat organization structure.

    Also: corporate gifts, corporate celebrations, free food & snacks, and play & relax rooms for those who visit the office.

    Did we catch your attention? We’d love to hear from you.

  • 30 views · 1 application · 19d

    Senior/Lead Data Engineer

    Full Remote · Ukraine · 5 years of experience · English - B2

    Description

    The GlobalLogic technology team is focused on next-generation health capabilities that align with the client’s mission and vision to deliver Insight-Driven Care. This role operates within the Health Applications & Interoperability subgroup of our broader team, with a focus on patient engagement, care coordination, AI, healthcare analytics, and interoperability. These advanced technologies enhance our product portfolio with new services while improving clinical and patient experiences.

     

    Requirements

    An AWS Data Engineer designs, develops, and maintains scalable data solutions using AWS cloud services.
    Key Responsibilities:
        • Design, build, and manage ETL (Extract, Transform, Load) pipelines using AWS services (e.g., Glue, Lambda, EMR, Redshift, S3); a minimal Lambda-based sketch follows this list.
        • Develop and maintain data architecture (data lakes, warehouses, databases) on AWS.
        • Implement data quality and governance solutions.
        • Automate data workflows and monitor pipeline health.
        • Ensure data security and compliance with company policies.
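
    A minimal sketch of the Lambda-based ETL step referenced above (illustrative only: the bucket layout, curated prefix, and CSV transform are assumptions, not the client's design):

      # Hypothetical AWS Lambda handler: triggered by an S3 "object created"
      # event, it normalises a CSV header and writes the result to a curated prefix.
      import csv
      import io

      import boto3

      s3 = boto3.client("s3")
      CURATED_PREFIX = "curated/"  # assumption


      def handler(event, context):
          record = event["Records"][0]["s3"]
          bucket = record["bucket"]["name"]
          key = record["object"]["key"]

          body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

          rows = list(csv.reader(io.StringIO(body)))
          rows[0] = [col.lower() for col in rows[0]]  # trivial "transform" step

          out = io.StringIO()
          csv.writer(out).writerows(rows)
          s3.put_object(Bucket=bucket, Key=CURATED_PREFIX + key.split("/")[-1], Body=out.getvalue())
          return {"rows_written": len(rows) - 1}
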
    Required Skills:
        • Proficiency with AWS cloud services, especially data-related offerings (S3, Glue, Redshift, Athena, EMR, Kinesis, Lambda).
        • Strong SQL and Python skills.
        • Experience with ETL tools and frameworks.
        • Familiarity with data modelling and warehousing concepts.
        • Knowledge of data security, access management, and best practices in AWS.
    Preferred Qualifications:
        • AWS certifications (e.g., AWS Certified Data Analytics – Speciality, AWS Certified Solutions Architect).
        • Background in software engineering or data science.

    β€’ Hands-on experience with Oracle Database and log-based Change Data Capture (CDC) replication using AWS Database Migration Service (DMS) for near real-time data ingestion.

    Job responsibilities

    • Develops, documents, and configures systems specifications that conform to defined architecture standards and address business requirements and processes in cloud development & engineering.
    • Involved in planning of system and development deployment, as well as responsible for meeting compliance and security standards.
    • API development using AWS services in a scalable, microservices-based architecture
    • Actively identifies system functionality or performance deficiencies, executes changes to existing systems, and tests functionality of the system to correct deficiencies and maintain more effective data handling, data integrity, conversion, input/output requirements, and storage.
    • May document testing and maintenance of system updates, modifications, and configurations.
    • May act as a liaison with key technology vendor technologists or other business functions.
    • Function Specific: Strategically design technology solutions that meet the needs and goals of the company and its customers/users.
    • Leverages platform process expertise to assess if existing standard platform functionality will solve a business problem or if a customisation solution would be required.
    • Test the quality of a product and its ability to perform a task or solve a problem.
    • Perform basic maintenance and performance optimisation procedures in each of the primary operating systems.
    • Ability to document detailed technical system specifications based on business system requirements
    • Ensures system implementation compliance with global & local regulatory and security standards (i.e. HIPAA, SOCII, ISO27001, etc.)
  • 45 views · 1 application · 19d

    Middle Data Engineer IRC285068

    Full Remote · Ukraine · 3 years of experience · English - None

    Description

    The GlobalLogic technology team is focused on next-generation health capabilities that align with the client’s mission and vision to deliver Insight-Driven Care. This role operates within the Health Applications & Interoperability subgroup of our broader team, with a focus on patient engagement, care coordination, AI, healthcare analytics, and interoperability. These advanced technologies enhance our product portfolio with new services while improving clinical and patient experiences.

     

    Requirements

    MUST HAVE

    AWS Platform: Working experience with AWS data technologies, including S3
    Programming Languages: Strong programming skills in Python
    Data Formats: Experience with JSON, XML and other relevant data formats
    HealthCare Interoperability Tools: Previous experience with integration engines such as Intersystems, Lyniate, Redox, Mirth Connect, etc…

    Hands-on experience with Oracle Database and log-based Change Data Capture (CDC) replication using AWS Database Migration Service (DMS) for near real-time data ingestion.

    CI/CD Tools: experience setting up and managing CI/CD pipelines using GitLab CI, Jenkins, or similar tools
    Scripting and automation: experience in scripting languages such as Python, PowerShell, etc.
    Monitoring and Logging: Familiarity with monitoring & logging tools like CloudWatch, ELK, Dynatrace, Prometheus, etc…
    Source Code Management: Expertise with git commands and associated VCS (Gitlab, Github, Gitea or similar)
    Documentation: Experience with markdown and in particular Antora for creating technical documentation

     

    NICE TO HAVE
    Strongly Preferred:
    Previous Healthcare or Medical Device experience
    Other data technologies such as Snowflake, Trino/Starburst
    Experience working with Healthcare Data, including HL7v2, FHIR and DICOM
    FHIR and/or HL7 Certifications
    Building software classified as Software as a Medical Device (SaMD)
    Understanding of EHR technologies such as EPIC, Cerner, etc…
    Experience implementing enterprise-grade cyber security & privacy by design into software products
    Experience working in Digital Health software
    Experience developing global applications
    Strong understanding of SDLC – Waterfall & Agile methodologies
    Software estimation
    Experience leading software development teams onshore and offshore

     

    Job responsibilities

    – Develops, documents, and configures systems specifications that conform to defined architecture standards and address business requirements and processes in cloud development & engineering.

    – Involved in planning of system and development deployment as well as responsible for meeting compliance and security standards.

    – API development using AWS services in a scalable, microservices based architecture

    – Actively identifies system functionality or performance deficiencies, executes changes to existing systems, and tests functionality of the system to correct deficiencies and maintain more effective data handling, data integrity, conversion, input/output requirements, and storage.

    – May document testing and maintenance of system updates, modifications, and configurations.

    – May act as a liaison with key technology vendor technologists or other business functions.

    – Function Specific: Strategically design technology solutions that meet the needs and goals of the company and its customers/users.

    – Leverages platform process expertise to assess if existing standard platform functionality will solve a business problem or if a customization solution would be required.

    – Test the quality of a product and its ability to perform a task or solve a problem.

    – Perform basic maintenance and performance optimization procedures in each of the primary operating systems.

    – Ability to document detailed technical system specifications based on business system requirements

    – Ensures system implementation compliance with global & local regulatory and security standards (i.e. HIPAA, SOCII, ISO27001, etc…)

     

    What we offer

    Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. 

    Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

    Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

    Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

    High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

  • 24 views · 1 application · 19d

    Senior Data Engineer IRC284644

    Full Remote · Ukraine · 4 years of experience · English - None

    Description

    Our client is a luxury skincare and beauty brand. The brand is based in San Francisco and sells luxury skincare products worldwide.

    Client’s main IT “product” is its e-commerce website, which functions as a digital platform to sell products, educate customers, and personalize experiences.

    • Runs on Salesforce Commerce Cloud (formerly Demandware) – an enterprise e-commerce platform that supports online shopping, order processing, customer accounts, and product catalogs.
    • Hosted on cloud infrastructure (e.g., AWS, Cloudflare) for reliable performance and security.
    • Uses HTTPS/SSL encryption to secure data transfers.
    • Integrated marketing and analytics technologies such as Klaviyo (email & SMS automation), Google Tag Manager, and personalization tools to track behavior, optimize campaigns, and increase conversions

    It’s both a shopping platform and a digital touchpoint for customers worldwide.

     

    Requirements

    • 4+ years of experience as a Data Engineer, Analytics Engineer, or in a similar data-focused role.
    • Strong SQL skills for complex data transformations and analytics-ready datasets.
    • Hands-on experience with Python for data pipelines, automation, and data processing.
    • Experience working with cloud-based data platforms (AWS preferred).
    • Solid understanding of data warehousing concepts (fact/dimension modeling, star schemas).
    • Experience building and maintaining ETL/ELT pipelines from multiple data sources.
    • Familiarity with data quality, monitoring, and validation practices.
    • Experience handling customer, transactional, and behavioral data in a digital or e-commerce environment.
    • Ability to work with cross-functional stakeholders (Marketing, Product, Analytics, Engineering).

    Nice to have:

    • Experience with Snowflake, Redshift, or BigQuery.
    • Experience with dbt or similar data transformation frameworks.
    • Familiarity with Airflow or other orchestration tools.
    • Experience with marketing and CRM data (e.g. Klaviyo, GA4, attribution tools).
    • Exposure to A/B testing and experimentation data.
    • Understanding of privacy and compliance (GDPR, CCPA).
    • Experience in consumer, retail, or luxury brands.
    • Knowledge of event tracking and analytics instrumentation.
    • Ability to travel + visa to the USA

     

    Job responsibilities

    • Design, build, and maintain scalable data pipelines ingesting data from multiple sources: the e-commerce platform (e.g. Salesforce Commerce Cloud), CRM/marketing tools (Klaviyo), web analytics, and fulfillment and logistics systems.
    • Ensure reliable, near-real-time data ingestion for customer behavior, orders, inventory, and marketing performance.
    • Develop and optimize ETL/ELT workflows using cloud-native tools.
    • Model and maintain customer, order, product, and session-level datasets to support analytics and personalization use cases.
    • Enable a 360° customer view by unifying data from website interactions, email/SMS campaigns, purchases, and returns (a minimal illustration follows this list).
    • Support data needs for personalization tools (e.g. product recommendation quizzes, ritual finders).
    • Build datasets that power marketing attribution, funnel analysis, cohort analysis, and LTV calculations.
    • Enable data access for growth, marketing, and CRM teams to optimize campaign targeting and personalization
    • Ensure accurate tracking and validation of events, conversions, and user journeys across channels.
    • Work closely with Product, E-commerce, Marketing, Operations, and Engineering teams to translate business needs into data solutions.
    • Support experimentation initiatives (A/B testing, new digital experiences, virtual stores).
    • Act as a data partner in decision-making for growth, CX, and operational efficiency.
    • Build and manage data solutions on cloud infrastructure (e.g. AWS).
    • Optimize storage and compute costs while maintaining performance and scalability.
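
    As a toy illustration of the customer-unification item above (column names and sources are made up, not the client's schema), the core idea is a set of joins on a customer key:

      # Hypothetical sketch: unify sessions, orders, and email events per customer.
      import pandas as pd

      orders = pd.DataFrame({"customer_id": ["c1", "c1", "c2"], "order_total": [120.0, 80.0, 45.0]})
      email_events = pd.DataFrame({"customer_id": ["c1", "c2", "c2"], "opened": [1, 1, 0]})
      sessions = pd.DataFrame({"customer_id": ["c1", "c2", "c3"], "sessions": [5, 2, 1]})

      customer_360 = (
          sessions
          .merge(orders.groupby("customer_id", as_index=False)["order_total"].sum(), how="left")
          .merge(email_events.groupby("customer_id", as_index=False)["opened"].sum(), how="left")
          .fillna({"order_total": 0.0, "opened": 0})
      )
      print(customer_360)  # one row per customer with sessions, revenue, and email opens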

     

    What we offer

    Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. 

    Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

    Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

    Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

    High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

  • 80 views · 12 applications · 20d

    Data Platform Engineer

    Full Remote · Countries of Europe or Ukraine · Product · 6 years of experience · English - B1

    WHO WE ARE

    At Bennett Data Science, we’ve been pioneering the use of predictive analytics and data science for over ten years, for some of the biggest brands and retailers. We’re at the top of our field because we focus on actionable technology that helps people around the world. Our deep experience and product-first attitude set us apart from other groups and it's why people who work with us tend to stay with us long term.

     

    WHY YOU SHOULD WORK WITH US

    You'll work on an important problem that improves the lives of a lot of people. You'll be at the cutting edge of innovation and get to work on fascinating problems, supporting real products, with real data. Your perks include expert mentorship from senior staff, competitive compensation, paid leave, a flexible work schedule, and the ability to travel internationally.

    Essential Requirements for Data Platform Engineer:

    • Architecture & Improvement: Continuously review the current architecture and implement incremental improvements, facilitating a gradual transition of production operations from Data Science to Engineering.
    • AWS Service Ownership: Own the full lifecycle (development, deployment, support, and monitoring) of client-facing AWS services (including SageMaker endpoints, Lambdas, and OpenSearch). Maintain high uptime and adherence to Service Level Agreements (SLAs); a minimal endpoint-invocation sketch follows this list.
    • ETL Operations Management: Manage all ETL processes, including the operation and maintenance of Step Functions and Batch jobs (scheduling, scaling, retry/timeout logic, failure handling, logging, and metrics).
    • Redshift Operations & Maintenance: Oversee all Redshift operations, focusing on performance optimization, access control, backup/restore readiness, cost management, and general housekeeping.
    • Performance Optimization: Post-stabilization of core monitoring and pipelines, collaborate with the Data Science team on targeted code optimizations to enhance reliability, reduce latency, and lower operational costs.
    • Security & Compliance: Implement and manage the vulnerability monitoring and remediation workflow (Snyk).
    • CI/CD Implementation: Establish and maintain robust Continuous Integration/Continuous Deployment (CI/CD) systems.
    • Infrastructure as Code (Optional): Utilize IaC principles where necessary to ensure repeatable and streamlined release processes.
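
    For illustration, owning a client-facing SageMaker endpoint usually includes being able to exercise it directly. A minimal sketch with boto3 follows; the endpoint name and payload shape are hypothetical:

      # Hypothetical sketch: invoke a SageMaker endpoint via boto3.
      import json

      import boto3

      runtime = boto3.client("sagemaker-runtime")


      def score(payload: dict, endpoint_name: str = "recs-endpoint") -> dict:
          # endpoint_name is a made-up example, not a real endpoint.
          response = runtime.invoke_endpoint(
              EndpointName=endpoint_name,
              ContentType="application/json",
              Body=json.dumps(payload),
          )
          return json.loads(response["Body"].read())


      if __name__ == "__main__":
          print(score({"user_id": "u123", "k": 10}))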


    Mandatory Hard Skills:

    • AWS Core Services: Proven experience with production fundamentals (IAM, CloudWatch, and VPC networking concepts).
    • AWS Deployment: Proficiency in deploying and operating AWS SageMaker and Lambda services.
    • ETL Orchestration: Expertise in using AWS Step Functions and Batch for ETL and job orchestration.
    • Programming & Debugging: Strong command of Python for automation and troubleshooting.
    • Containerization: Competence with Docker/containers (build, run, debug).
    • Version Control & CI/CD: Experience with CI/CD practices and Git (GitHub Actions preferred).
    • Data Platform Tools: Experience with Databricks, or a demonstrated aptitude and willingness to quickly learn.

    Essential Soft Skills:

    • Accountability: Demonstrate complete autonomy and ownership over all assigned systems ("you run it, you fix it, you improve it").
    • Communication: Fluent in English; capable of clear, direct communication, especially during incidents.
    • Prioritization: A focus on delivering a minimally-supportable, deployable solution to meet deadlines, followed by optimization and cleanup.
    • Incident Management: Maintain composure under pressure and possess strong debugging and incident handling abilities.
    • Collaboration: Work effectively with the Data Science team while communicating technical trade-offs clearly and maintaining momentum.
  • 110 views · 16 applications · 20d

    Senior Data Engineer

    Full Remote · Ukraine · Product · 5 years of experience · English - B2

    Join a Company That Invests in You

    Seeking Alpha is the world’s leading community of engaged investors. We’re the go-to destination for investors looking for actionable stock market opinions, real-time market analysis, and unique financial insights. At the same time, we’re also dedicated to creating a workplace where our team thrives. We’re passionate about fostering a flexible, balanced environment with remote work options and an array of perks that make a real difference.

    Here, your growth matters. We prioritize your development through ongoing learning and career advancement opportunities, helping you reach new milestones. Join Seeking Alpha to be part of a company that values your unique journey, supports your success, and champions both your personal well-being and professional goals.

     

    What We're Looking For

    Seeking Alpha is looking for a Senior Data Engineer responsible for designing, building, and maintaining the infrastructure necessary for analyzing large data sets. This individual should be an expert in data management, ETL (extract, transform, load) processes, and data warehousing and should have experience working with various big data technologies, such as Hadoop, Spark, and NoSQL databases. In addition to technical skills, a Senior Data Engineer should have strong communication and collaboration abilities, as they will be working closely with other members of the data and analytics team, as well as other stakeholders, to identify and prioritize data engineering projects and to ensure that the data infrastructure is aligned with the overall business goals and objectives.

     

    What You'll Do

    • Work closely with data scientists/analytics and other stakeholders to identify and prioritize data engineering projects and to ensure that the data infrastructure is aligned with business goals and objectives
    • Design, build and maintain optimal data pipeline architecture for extraction, transformation, and loading of data from a wide variety of data sources, including external APIs, data streams, and data stores (a minimal PySpark sketch follows this list).
    • Continuously monitor and optimize the performance and reliability of the data infrastructure, and identify and implement solutions to improve scalability, efficiency, and security
    • Stay up-to-date with the latest trends and developments in the field of data engineering, and leverage this knowledge to identify opportunities for improvement and innovation within the organization
    • Solve challenging problems in a fast-paced and evolving environment while maintaining uncompromising quality.
    • Implement data privacy and security requirements to ensure solutions comply with security standards and frameworks.
    • Enhance the team's dev-ops capabilities.
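
    As a minimal PySpark sketch of the pipeline work described above (paths, column names, and the aggregation are illustrative, not the company's actual jobs):

      # Hypothetical sketch: read raw JSON events, aggregate per day, and write
      # a partitioned Parquet output.
      from pyspark.sql import SparkSession, functions as F

      spark = SparkSession.builder.appName("events-daily-agg").getOrCreate()

      events = spark.read.json("s3://example-bucket/raw/events/")  # made-up path

      daily = (
          events
          .withColumn("event_date", F.to_date("event_ts"))
          .groupBy("event_date", "event_type")
          .agg(F.count("*").alias("events"), F.countDistinct("user_id").alias("users"))
      )

      daily.write.mode("overwrite").partitionBy("event_date").parquet(
          "s3://example-bucket/curated/daily_events/"
      )
      spark.stop()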

     

    Requirements

    • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
    • 2+ years of proven experience developing large-scale software using an object-oriented or functional language.
    • 5+ years of professional experience in data engineering, focusing on building and maintaining data pipelines and data warehouses
    • Strong experience with Spark, Scala, and Python, including the ability to write high-performance, maintainable code
    • Experience with AWS services, including EC2, S3, Athena, Kinesis/Firehose, Lambda, and EMR
    • Familiarity with data warehousing concepts and technologies, such as columnar storage, data lakes, and SQL
    • Experience with data pipeline orchestration and scheduling using tools such as Airflow
    • Strong problem-solving skills and the ability to work independently as well as part of a team
    • High-level English - a must. 
    • A team player with excellent collaboration skills.

      

    Nice to Have:

    • Expertise with Vertica or Redshift, including experience with query optimization and performance tuning
    • Experience with machine learning and/or data science projects
    • Knowledge of data governance and security best practices, including data privacy regulations such as GDPR and CCPA.
    • Knowledge of Spark internals (tuning, query optimization)
  • 66 views · 9 applications · 22d

    Senior Data Engineer (Healthcare domain)

    Full Remote · EU · 5 years of experience · English - None

    Are you passionate about building large-scale cloud data infrastructure that makes a real difference? We are looking for a Senior Data Engineer to join our team and work on an impactful healthcare technology project. This role offers a remote work format with the flexibility to collaborate across international teams.

    At Sigma Software, we deliver innovative IT solutions to global clients in multiple industries, and we take pride in projects that improve lives. Joining us means working with cutting-edge technologies, contributing to meaningful initiatives, and growing in a supportive environment.


    CUSTOMER
    Our client is a leading medical technology company. Its portfolio of products, services, and solutions is at the center of clinical decision-making and treatment pathways. Patient-centered innovation has always been, and will always be, at the core of the company. The client is committed to improving patient outcomes and experiences, regardless of where patients live or what they face. The Customer is innovating sustainably to provide healthcare for everyone, everywhere. 


    PROJECT
    The project focuses on building and maintaining large-scale cloud-based data infrastructure for healthcare applications. It involves designing efficient data pipelines, creating self-service tools, and implementing microservices to simplify complex processes. The work will directly impact how healthcare providers access, process, and analyze critical medical data, ultimately improving patient care.

     

    Responsibilities:

    • Collaborate with the Product Owner and team leads to define and design efficient pipelines and data schemas
    • Build and maintain infrastructure using Terraform for cloud platforms
    • Design and implement large-scale cloud data infrastructure, self-service tooling, and microservices
    • Work with large datasets to optimize performance and ensure seamless data integration
    • Develop and maintain squad-specific data architectures and pipelines following ETL and Data Lake principles
    • Discover, analyze, and organize disparate data sources into clean, understandable schemas

     

    Requirements:

    • Hands-on experience with cloud computing services in data and analytics
    • Experience with data modeling, reporting tools, data governance, and data warehousing
    • Proficiency in Python and PySpark for distributed data processing
    • Experience with Azure, Snowflake, and Databricks
    • Experience with Docker and Kubernetes
    • Knowledge of infrastructure as code (Terraform)
    • Advanced SQL skills and familiarity with big data databases such as Snowflake, Redshift, etc.
    • Experience with stream processing technologies such as Kafka and Spark Structured Streaming (a minimal streaming sketch follows this list)
    • At least an Upper-Intermediate level of English 
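
    A minimal sketch of the Kafka / Spark Structured Streaming requirement above (the broker, topic, and sink paths are made up, and the spark-sql-kafka connector package is assumed to be available at runtime):

      # Hypothetical sketch: consume a Kafka topic with Spark Structured Streaming
      # and land the raw payloads as Parquet.
      from pyspark.sql import SparkSession, functions as F

      spark = SparkSession.builder.appName("events-stream").getOrCreate()

      raw = (
          spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # made-up broker
          .option("subscribe", "patient-events")             # made-up topic
          .load()
      )

      parsed = raw.select(
          F.col("key").cast("string"),
          F.col("value").cast("string").alias("payload"),
          "timestamp",
      )

      query = (
          parsed.writeStream.format("parquet")
          .option("path", "/data/curated/patient_events")              # made-up sink
          .option("checkpointLocation", "/data/checkpoints/patient_events")
          .start()
      )
      query.awaitTermination()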

     

  • 21 views · 0 applications · 22d

    Senior Snowflake Data Engineer

    Full Remote · Ukraine · 5 years of experience · English - B2

    The project is for one of the world's most famous science and technology companies in the pharmaceutical industry, supporting initiatives in AWS, AI, and data engineering, with plans to launch over 20 additional initiatives in the future. Modernizing the data infrastructure through the transition to Snowflake is a priority, as it will enhance capabilities for implementing advanced AI solutions and unlock numerous opportunities for innovation and growth.

    We are seeking a highly skilled Snowflake Data Engineer to design, build, and optimize scalable data pipelines and cloud-based solutions across AWS, Azure, and GCP. The ideal candidate will have strong expertise in Snowflake, ETL tools such as DBT, Python, visualization tools such as Tableau, and modern CI/CD practices, with a deep understanding of data governance, security, and role-based access control (RBAC). Knowledge of data modeling methodologies (OLTP, OLAP, Data Vault 2.0), data quality frameworks, Streamlit application development, SAP integration, and infrastructure-as-code with Terraform is essential. Experience working with different file formats such as JSON, Parquet, CSV, and XML is highly valued.

    • Responsibilities:

      • Design and develop data pipelines using Snowflake and Snowpipe for real-time and batch ingestion.
      • Implement CI/CD pipelines in Azure DevOps for seamless deployment of data solutions.
      • Automate DBT jobs to streamline transformations and ensure reliable data workflows.
      • Apply data modeling techniques including OLTP, OLAP, and Data Vault 2.0 methodologies to design scalable architectures.
      • Document data models, processes, and workflows clearly for future reference and knowledge sharing.
      • Build data tests, unit tests, and mock data frameworks to validate and maintain reliability of data solutions.
      • Develop Streamlit applications integrated with Snowflake to deliver interactive dashboards and self-service analytics (a minimal sketch follows this list).
      • Integrate SAP data sources into Snowflake pipelines for enterprise reporting and analytics.
      • Leverage SQL expertise for complex queries, transformations, and performance optimization.
      • Integrate cloud services across AWS, Azure, and GCP to support multi-cloud data strategies.
      • Develop Python scripts for ETL/ELT processes, automation, and data quality checks.
      • Implement infrastructure-as-code solutions using Terraform for scalable and automated cloud deployments.
      • Manage RBAC and enforce data governance policies to ensure compliance and secure data access.
      • Collaborate with cross-functional teams, including business analysts and business stakeholders, to deliver reliable data solutions.
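
    A minimal sketch of a Streamlit application reading from Snowflake, as in the responsibilities above (the account, credentials, warehouse, and table names are placeholders; a real app would load secrets from configuration):

      # Hypothetical sketch: a Streamlit dashboard backed by Snowflake.
      import pandas as pd
      import snowflake.connector
      import streamlit as st


      @st.cache_data
      def load_daily_revenue() -> pd.DataFrame:
          conn = snowflake.connector.connect(
              account="example_account",  # placeholders: use secrets management in practice
              user="example_user",
              password="example_password",
              warehouse="ANALYTICS_WH",
              database="ANALYTICS",
              schema="MARTS",
          )
          try:
              cur = conn.cursor()
              cur.execute(
                  "SELECT order_date, SUM(amount) AS revenue "
                  "FROM fact_orders GROUP BY order_date ORDER BY order_date"
              )
              return cur.fetch_pandas_all()
          finally:
              conn.close()


      st.title("Daily revenue (illustrative)")
      df = load_daily_revenue()
      st.line_chart(df, x="ORDER_DATE", y="REVENUE")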

    • Mandatory Skills Description:

      • Strong proficiency in Snowflake (Snowpipe, RBAC, performance tuning).
      • Hands-on experience with Python, SQL, Jinja, and JavaScript for data engineering tasks.
      • CI/CD expertise using Azure DevOps (build, release, version control).
      • Experience automating DBT jobs for data transformations.
      • Experience building Streamlit applications with Snowflake integration.
      • Cloud services knowledge across AWS (S3, Lambda, Glue), Azure (Data Factory, Synapse), and GCP (BigQuery, Pub/Sub).

  • 20 views · 3 applications · 22d

    Senior Snowflake Data Engineer

    Full Remote · Ukraine · 5 years of experience · English - B2
    The project is for one of the world's most famous science and technology companies in the pharmaceutical industry, supporting initiatives in AWS, AI, and data engineering, with plans to launch over 20 additional initiatives in the future. Modernizing the data infrastructure through the transition to Snowflake is a priority, as it will enhance capabilities for implementing advanced AI solutions and unlock numerous opportunities for innovation and growth.

      We are seeking a highly skilled Snowflake Data Engineer to design, build, and optimize scalable data pipelines and cloud-based solutions across AWS, Azure, and GCP. The ideal candidate will have strong expertise in Snowflake, ETL tools such as DBT, Python, visualization tools such as Tableau, and modern CI/CD practices, with a deep understanding of data governance, security, and role-based access control (RBAC). Knowledge of data modeling methodologies (OLTP, OLAP, Data Vault 2.0), data quality frameworks, Streamlit application development, SAP integration, and infrastructure-as-code with Terraform is essential. Experience working with different file formats such as JSON, Parquet, CSV, and XML is highly valued.

     

     

    • Responsibilities:

      • Design and develop data pipelines using Snowflake and Snowpipe for real-time and batch ingestion.
      • Implement CI/CD pipelines in Azure DevOps for seamless deployment of data solutions.
      • Automate DBT jobs to streamline transformations and ensure reliable data workflows.
      • Apply data modeling techniques including OLTP, OLAP, and Data Vault 2.0 methodologies to design scalable architectures.
      • Document data models, processes, and workflows clearly for future reference and knowledge sharing.
      • Build data tests, unit tests, and mock data frameworks to validate and maintain reliability of data solutions.
      • Develop Streamlit applications integrated with Snowflake to deliver interactive dashboards and self-service analytics.
      • Integrate SAP data sources into Snowflake pipelines for enterprise reporting and analytics.
      • Leverage SQL expertise for complex queries, transformations, and performance optimization.
      • Integrate cloud services across AWS, Azure, and GCP to support multi-cloud data strategies.
      • Develop Python scripts for ETL/ELT processes, automation, and data quality checks.
      • Implement infrastructure-as-code solutions using Terraform for scalable and automated cloud deployments.
      • Manage RBAC and enforce data governance policies to ensure compliance and secure data access.
      • Collaborate with cross-functional teams, including business analysts and business stakeholders, to deliver reliable data solutions.

     

     

    • Mandatory Skills Description:

      • Strong proficiency in Snowflake (Snowpipe, RBAC, performance tuning).
      • Hands-on experience with Python, SQL, Jinja, and JavaScript for data engineering tasks.
      • CI/CD expertise using Azure DevOps (build, release, version control).
      • Experience automating DBT jobs for data transformations.
      • Experience building Streamlit applications with Snowflake integration.
      • Cloud services knowledge across AWS (S3, Lambda, Glue), Azure (Data Factory, Synapse), and GCP (BigQuery, Pub/Sub).

     

    • Nice-to-Have Skills Description:

      - Cloud certifications are a plus

     

     

    • Languages:
      • English: B2 Upper Intermediate
  • 66 views · 9 applications · 23d

    ETL Developer

    Full Remote · Countries of Europe or Ukraine · 3 years of experience · English - B2

    Description

    We are looking for an ETL Developer to join our team and work on data integration for a Pharmaceutical Marketing company.

    You will develop and support ETL processes that run in Docker containers. Your daily work will primarily involve writing complex SQL queries and views, performing data transformations, and ensuring accurate and timely delivery of data by monitoring notifications and logs in AWS CloudWatch. Work also involves scripting in Bash and Python for automation, SFTP data transfers, and connecting to APIs when required. 
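
    As a minimal sketch of the SFTP-and-load flow described above (the host, credentials, file names, and S3 bucket are placeholders, not the project's real configuration):

      # Hypothetical sketch: pull a CSV over SFTP with paramiko, sanity-check it
      # with pandas, then stage it to S3 with boto3.
      import boto3
      import pandas as pd
      import paramiko

      HOST, USER, PASSWORD = "sftp.example.com", "feeds", "secret"  # placeholders
      REMOTE_PATH, LOCAL_PATH = "/outbox/extract_2024-06-01.csv", "/tmp/extract.csv"

      client = paramiko.SSHClient()
      client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
      client.connect(HOST, username=USER, password=PASSWORD)
      try:
          sftp = client.open_sftp()
          sftp.get(REMOTE_PATH, LOCAL_PATH)
          sftp.close()
      finally:
          client.close()

      df = pd.read_csv(LOCAL_PATH)
      assert not df.empty, "empty extract, refusing to load"  # simple validation step

      boto3.client("s3").upload_file(LOCAL_PATH, "example-staging-bucket", "staging/extract.csv")
      print(f"staged {len(df)} rows")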

    We work as a team, care about code and data quality, and like people who want to learn and improve. 

    Our teams have daily standups and direct communication with a client on a daily basis. 
    The platform processes sensitive data, so development is manual, controlled, and accuracy-driven rather than highly automated. 

     

    Requirements

    • 3+ years of experience working with ETL processes or data pipelines
    • Strong SQL skills: creating and debugging complex queries, aggregations, and validation logic
    • Experience with a relational database (preferably PostgreSQL)
    • Basic understanding of data warehouse concepts (facts, dimensions, SCD, star schema)
    • Experience building ETL pipelines
    • Python knowledge (Pandas, boto3, paramiko), connecting to SFTPs, APIs, and pulling/pushing data
    • Understanding of clean code and good coding practices
    • Experience using Git and pipelines
    • Solid Bash scripting skills for automation and troubleshooting
    • Experience with Docker (images, containers, passing data between containers)
    • Basic knowledge of AWS, including:
      • Running containers in ECS
      • Mounting EFS volumes
      • Viewing logs in CloudWatch
    • English level B2 (can communicate and understand documentation)
    • Willingness to learn and improve skills
    • Interest in software development and data work

    Nice to have

    • Experience with Amazon Redshift, Snowflake, Postgres
    • Experience using AWS CLI
    • Knowledge of AWS services such as:
      • ECR
      • ECS
      • EventBridge
      • CloudWatch
      • Lambda
      • Step Functions
    • Experience working with REST APIs
    • Knowledge of NoSQL databases
    • Experience with CI/CD tools

    We offer:

    • Possibility to propose solutions on a project
    • Dynamic and challenging tasks
    • Team of professionals
    • Competitive salary
    • Low bureaucracy
    • Continuous self-improvement
    • Long-term employment with paid vacation and other social benefits
    • Bunch of perks 😊

    This vacancy is exclusively for Ukrainian developers!





     
