Data Engineer Jobs


    Data Engineer (Python, SQL, Dagster, Pandas)

    Part-time · Full Remote · Worldwide · 4 years of experience · English - C1

    Company: German E-Commerce BI Solutions Provider
    Engagement: Part-time to start, with clear expansion potential

     

    About Us

    We are a German-based provider of Business Intelligence solutions for the e-commerce sector. Our mission is to enable data-driven decision-making for online businesses through scalable data architectures, robust analytics pipelines, and actionable insights.

    To strengthen our data team, we are looking for a hands-on Data Engineer who enjoys building reliable data pipelines and working at the intersection of engineering and analytics.
     

    Your Role

    As a Data Engineer, you will design, build, and maintain scalable data pipelines and data models that power our BI and analytics solutions for e-commerce clients. You will work closely with analytics, product, and business stakeholders to ensure high data quality and availability.

    This role starts as a part-time engagement with strong potential to grow into a larger scope as the collaboration evolves.
     

    Responsibilities

    • Design, develop, and maintain robust data pipelines using Python
    • Transform and process data using Pandas DataFrames
    • Write and optimize complex SQL queries (SQL proficiency is a core requirement)
    • Build and maintain data workflows using orchestration tools (e.g., Dagster or similar)
    • Work with cloud data warehouses (e.g., Amazon Redshift or similar technologies)
    • Ensure data quality, reliability, and performance across pipelines
    • Collaborate with BI and analytics teams to translate business requirements into technical solutions
    • Continuously improve data architecture and engineering best practices
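
    To make the stack above concrete, here is a minimal, hypothetical Dagster-plus-pandas sketch of the pipeline style this role implies; the file name and column names are invented for illustration and are not from the posting:

    ```python
    # Hypothetical sketch only: a raw ingest asset and a pandas cleaning step.
    # "raw_orders.csv", "order_id", and "amount" are invented placeholders.
    import pandas as pd
    from dagster import Definitions, asset


    @asset
    def raw_orders() -> pd.DataFrame:
        # In a real pipeline this would read from a shop backend or warehouse export.
        return pd.read_csv("raw_orders.csv")


    @asset
    def clean_orders(raw_orders: pd.DataFrame) -> pd.DataFrame:
        # Deduplicate on the order key and coerce the amount column to numeric.
        df = raw_orders.drop_duplicates(subset=["order_id"]).dropna(subset=["order_id"])
        df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0.0)
        return df


    defs = Definitions(assets=[raw_orders, clean_orders])
    ```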
       

    Must-Have Qualifications

    • Minimum 4 years of professional experience with Python, including extensive hands-on experience with Pandas
    • Strong, production-level SQL skills (natural prerequisite for the role)
    • Solid understanding of data modeling and ETL/ELT processes
    • Experience building and maintaining data pipelines in a professional environment
    • Ability to work independently in a part-time setup and take ownership of deliverables
       

    Nice-to-Have

    • Experience with Amazon Redshift
    • Experience with Dagster
    • Experience with other SQL dialects beyond Redshift
    • Experience with orchestration tools such as Airflow or similar
    • Background in e-commerce, BI, or analytics-driven environments
    • Familiarity with cloud-based data architectures
       

    What We Offer

    • Flexible part-time engagement with scalability toward a larger role
    • Remote setup
    • High impact in a growing e-commerce BI environment
    • Opportunity to shape and improve modern data architecture
    • Collaborative, pragmatic, and engineering-focused culture

    Data Engineer / Data Architect

    Full Remote · Worldwide · Product · 4 years of experience · English - B2

    We are a leading sea moss superfood brand in the U.S., on a mission to redefine natural wellness by making it simple, accessible, and affordable for everyone, everywhere.
    As we continue to scale across multiple marketplaces and marketing channels, data has become our backbone. We are now looking for a Data Engineer / Data Architect who will take ownership of building a fast, reliable, and scalable data infrastructure for our e-commerce, marketing, and finance teams.

     

    Why join us?

    You’ll be building the core data infrastructure for a fast-growing, multi-channel wellness brand. This role has real ownership, high impact, and direct influence on how leadership makes decisions across marketing, finance, and operations.

     

    What we’re looking for:

    You’re a builder at heart. You’ve worked with e-commerce data before and understand how marketing platforms, stores, subscriptions, and marketplaces connect behind the scenes.
     

    You are:
    ● Experienced in e-commerce data – You’ve built or maintained data systems for online brands before.
    ● System-oriented – You think in pipelines, schemas, and automation.
    ● Fast and pragmatic – You can move quickly and deliver working solutions, not just plans.
    ● Detail-driven – You care about data accuracy, consistency, and definitions.
    ● Independent – You can take ownership of the data stack end to end.
    ● Collaborative – You work closely with marketing and finance teams to understand real needs.

     

    What you’ll do:

    Data Collection & Integration
    ● Pull data via APIs from: Google Ads, Meta, Shopify, Recharge, Klaviyo, Postscript, Amazon, TikTok, Walmart, and other platforms.
    ● Build reliable pipelines to collect and sync data continuously.

    Data Infrastructure & Storage

    ● Store and manage data in AWS (S3, Redshift) and/or BigQuery.
    ● Design scalable data models and schemas for marketing and finance use cases.
    ● Ensure data is clean, matched, transformed, and ready for analysis.

    Transformation & Automation

    ● Transform raw data based on business requirements (metrics, attribution, matching).
    ● Ensure automatic updates and stable refresh schedules.
    ● Optimize performance to eliminate delays and bottlenecks.

    Reporting & Accessibility

    ● Make data available via Google Sheets, data warehouses, and BI tools.
    ● Enable seamless access for Power BI (and other BI tools if needed).
    ● Support leadership dashboards and recurring reports.
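
    As a rough illustration of the ingestion work above, landing one API pull in S3 might look like the sketch below; the shop name, access token, bucket, and API version are placeholders, not details from this posting:

    ```python
    # Illustrative only: fetch one page of Shopify orders and land the raw JSON in S3.
    import datetime
    import json

    import boto3
    import requests

    SHOP = "example-shop"          # placeholder store name
    TOKEN = "shpat_..."            # placeholder access token
    BUCKET = "raw-ecommerce-data"  # placeholder S3 bucket

    resp = requests.get(
        f"https://{SHOP}.myshopify.com/admin/api/2024-01/orders.json",
        headers={"X-Shopify-Access-Token": TOKEN},
        params={"status": "any", "limit": 250},
        timeout=30,
    )
    resp.raise_for_status()

    # Partition raw landings by load date so downstream models can reprocess a day.
    key = f"shopify/orders/dt={datetime.date.today():%Y-%m-%d}/orders.json"
    boto3.client("s3").put_object(Bucket=BUCKET, Key=key, Body=json.dumps(resp.json()))
    ```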
     

    Key Technical Skills
    ● AWS: S3, Redshift, Airflow, Data Pipelines
    ● SQL
    ● Python
    ● Experience with API integrations
    ● BI tools: Power BI, Tableau (nice to have)

     

    What we offer:

    ● Welcome Pack and custom True Sea Moss merch to ensure you arrive like you were always meant to be here
    ● Sports reimbursement to support your physical and mental health
    ● Coaching & career consultations to support your personal and professional growth
    ● Access to corporate English lessons to sharpen your communication skills
    ● WHOOP membership to help you track your health, sleep, and recovery
    ● Coworking membership if you prefer a hybrid work lifestyle
    ● Sabbatical options after long-term contributions
    ● Project grants for side ideas, personal initiatives, or creative experiments
    If you’re ready to build clean, scalable data systems that power a real, fast-growing wellness brand – we’d love to hear from you.


    Data Engineer (DBT, Snowflake)

    Ukraine · 5 years of experience · English - B2

    Client

    Our client is one of the world’s top 20 investment companies headquartered in Great Britain, with branch offices in the US, Asia, and Europe.

     

    Project overview

    The company’s IT environment is constantly growing, with around 30 programs and more than 60 active projects. They are building a data marketplace that aggregates and analyzes data from multiple sources such as stock exchanges, news feeds, brokers, and internal quantitative systems.

    As the company moves to a new data source, the main goal of this project is to create a golden source of data for all downstream systems and applications. The team is performing classic ELT/ETL: transforming raw data from multiple sources (third-party and internal) and creating a single interface for delivering data to downstream applications.

     

    Position overview

    We are looking for a Data Engineer with strong expertise in DBT, Snowflake, and modern data engineering practices. In this role, you will design and implement scalable data models, build robust ETL/ELT pipelines, and ensure high-quality data delivery for critical investment management applications.

     

    Responsibilities

    • Design, build, and deploy DBT Cloud models.
    • Design, build, and deploy Airflow jobs (Astronomer).
    • Identify and test for bugs and bottlenecks in the ELT/ETL solution.
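
    For context, a minimal Airflow (Astronomer-style) DAG that schedules a dbt build could look like the sketch below; the project path, target name, and schedule are assumptions for illustration, not project specifics:

    ```python
    # Minimal sketch, assuming Airflow 2.4+ and a dbt project available on the worker.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="dbt_daily_build",
        start_date=datetime(2024, 1, 1),
        schedule="0 6 * * *",  # daily at 06:00 UTC (assumed cadence)
        catchup=False,
    ) as dag:
        dbt_build = BashOperator(
            task_id="dbt_build",
            # Path and target are placeholders; a Snowflake profile is assumed to exist.
            bash_command="cd /usr/local/airflow/dbt && dbt build --target prod",
        )
    ```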


    Requirements

    • 5+ years of experience in software engineering (Git, CI/CD, shell scripting).
    • 3+ years of experience building scalable and robust Data Platforms (SQL, DWH, Distributed Data Processing).
    • 2+ years of experience developing in DBT Core/Cloud.
    • 2+ years of experience with Snowflake.
    • 2+ years of experience with Airflow.
    • 2+ years of experience with Python.
    • Good spoken English.

    Nice to have

    • Proficiency in message queues (Kafka).
    • Experience with cloud services (Azure).
    • CI/CD knowledge (Jenkins, Groovy scripting).

    Data Solutions Architect

    Ukraine · 5 years of experience · English - B2

    Client

    Our client is a large international manufacturing company listed in the S&P 500 and operating within a complex, multi‑site global environment. As part of a Data Strategy Development initiative, the organization is working to define a clear, pragmatic, and scalable data strategy aimed at improving decision‑making, reducing manual processes, strengthening data ownership, and laying a solid foundation for future analytics and digital transformation. The engagement is currently focused on strategy definition and is expected to progress into platform and analytics implementation phases.
     

    Position overview

    The Data Solutions Architect leads the definition of the future-state data and analytics architecture as part of the overall data strategy. The role is responsible for designing the end-to-end architectural blueprint, ensuring alignment between business objectives, governance requirements, and technology capabilities.

    This position operates at a strategic and conceptual level, defining logical architecture, integration patterns, and platform standards centered on Microsoft Azure and Snowflake, without committing to physical implementation during the strategy phase.

     

    Responsibilities

    • Assess the current-state technology and data landscape across enterprise, manufacturing, logistics, and analytics systems
    • Define the future-state data and analytics architecture aligned with business and governance objectives
    • Design logical data integration and data flow models for a complex, multi-site environment
    • Define target BI and analytics architecture supporting enterprise reporting and advanced analytics
    • Establish data models, security principles, and platform standards for a scalable and governed data ecosystem
    • Develop a phased data platform evolution roadmap with a focus on Azure and Snowflake
    • Provide architectural inputs into the Data Governance Framework and operating model
    • Contribute architecture sections to the Data Strategy Report
    • Define logical data sourcing and integration approaches for enterprise systems without physical implementation commitments

    Requirements

    • Strong experience in enterprise data and analytics architecture
    • Proven expertise with Microsoft Azure and Snowflake
    • Experience designing future-state architectures in complex, global environments
    • Solid understanding of data governance, security, and analytics enablement
    • Ability to operate at strategy and blueprint level while grounding decisions in practical feasibility
    • Experience working with senior stakeholders in high-visibility initiatives

    Nice to have

    • Experience with manufacturing, logistics, or industrial enterprise environments
    • Familiarity with SAP ECC / SAP BW, SAP C4C, ERP and operational systems
    • Experience with BI and analytics platforms such as Power BI, Qlik Sense, Alteryx
    • Background in data strategy or large-scale transformation programs

    Senior Salesforce Marketing Cloud Engineer

    Full Remote · Countries of Europe or Ukraine · 4 years of experience · English - B2

     

    N-iX is looking for a Senior Salesforce Marketing Cloud Engineer to join the fast-growing team on one of our projects! Our customer is a European online car marketplace with over 30 million monthly users and a market presence in 18 countries. As a Salesforce Marketing Cloud Engineer, you will play a pivotal role in shaping the future of online car markets and enhancing the user experience for millions of car buyers and sellers. This role ensures day-to-day platform reliability, supports scalable customer engagement across multiple brands and markets, and contributes to the continuous development of our omni-channel automation capabilities.
    The position combines technical execution, platform administration, data-driven troubleshooting, and close collaboration with CRM Managers, Product, Data, and Engineering teams.

     

    Responsibilities:

    • Platform Management: Execute and support marketing activities across Email Studio, Content Builder, Contact Builder, Automation Studio, and Journey Builder. Monitor platform health, journey execution, error rates, and data synchronization issues across channels (Email, Push, In-App, WhatsApp).
    • Data Orchestration: Manage and optimize data flows between Salesforce CRM (Sales/Service Cloud) and SFMC. Ensure data integrity and seamless synchronization across platforms. Diagnose and resolve issues related to journey failures, template rendering, deliverability, and API-driven components (see the API sketch after this list).
    • Solution Design: Build and maintain complex automated journeys and multi-step automations for various use cases. Provide day-to-day troubleshooting and solution guidance on segmentation, templates, content automation, journeys, and campaign execution.
    • Optimization: Review existing CloudPages and content blocks, proposing and implementing technical optimizations to improve performance or scalability, using AMPscript or SSJS and SQL for data segmentation and personalization. Assist in monitoring and improving data synchronization reliability across systems. Maintain and evolve template libraries, modular email components, and reusable assets.
    • Stakeholder Collaboration: Engage directly with business stakeholders to gather requirements, provide updates, and translate business needs into technical solutions. Administer users, roles, permissions, and configuration settings across tools. Enable CRM Managers, Campaign Managers, and Data Delivery teams through training, documentation, and hands-on onboarding to platform capabilities.
    • Knowledge Sharing: Be an active voice in team meetings – don’t just listen but challenge the status quo and share your expertise with the team. Contribute to automation initiatives enabling highly personalized, real-time, multi-channel communication journeys for B2C and B2B audiences.
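
    As a hedged illustration of the API-driven work above, the common SFMC REST pattern of authenticating and then upserting a row into a data extension looks roughly like this; the subdomain, credentials, and data extension key are placeholders:

    ```python
    # Sketch only: OAuth token request, then a data extension rowset upsert.
    import requests

    SUBDOMAIN = "your-subdomain"   # placeholder tenant subdomain
    CLIENT_ID = "..."              # placeholder installed-package credentials
    CLIENT_SECRET = "..."
    DE_KEY = "Customer_Master"     # hypothetical data extension external key

    auth = requests.post(
        f"https://{SUBDOMAIN}.auth.marketingcloudapis.com/v2/token",
        json={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        },
        timeout=30,
    )
    auth.raise_for_status()
    token = auth.json()["access_token"]

    resp = requests.post(
        f"https://{SUBDOMAIN}.rest.marketingcloudapis.com"
        f"/hub/v1/dataevents/key:{DE_KEY}/rowset",
        headers={"Authorization": f"Bearer {token}"},
        json=[{"keys": {"SubscriberKey": "12345"},
               "values": {"Email": "test@example.com"}}],
        timeout=30,
    )
    resp.raise_for_status()
    ```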
       

    Requirements: 

    • Minimum 4 years of experience in marketing automation, CRM engineering, or MarTech operations.
    • Hands-on experience within the Salesforce ecosystem.
    • Solid experience within the Salesforce Marketing Cloud.
    • A deep understanding of how data moves, syncs, and updates between CRM and SFMC.
    • Proven expertise in both stakeholder management and project management.
    • Hands-on experience working with REST APIs, data integration patterns, SSJS, AMPscript, and SQL.
    • Experience building, troubleshooting, and optimizing multi-step journeys or workflows.
    • Strong analytical skills, comfort with dashboards, and campaign performance metrics.
    • Familiarity with GDPR and data privacy principles related to CRM operations.
    • English level - at least Upper-Intermediate, both spoken and written.

      

    We offer*:

    • Flexible working format - remote, office-based or flexible
    • A competitive salary and good compensation package
    • Personalized career growth
    • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
    • Active tech communities with regular knowledge sharing
    • Education reimbursement
    • Memorable anniversary presents
    • Corporate events and team buildings
    • Other location-specific benefits

    *not applicable for freelancers


    Senior Data Engineer (GCP)

    Full Remote · Worldwide · Product · 5 years of experience · English - B2 · MilTech

    OpenMinds is seeking a skilled and curious Data Engineer who’s excited to design and build data systems that power meaningful insight. You’ll work closely with a passionate team of behavioral scientists and ML engineers on creating a robust data infrastructure that supports everything from large-scale narrative tracking to sentiment analysis.

    In the position you will:
    – Take ownership of our multi-terabyte data infrastructure, from data ingestion and orchestration to transformation, storage, and lifecycle management
    – Collaborate with data scientists, analysts, ML engineers, and domain experts to develop impactful data solutions
    – Optimize and troubleshoot data infrastructure to ensure high performance, cost-efficiency, scalability, and resilience
    – Stay up-to-date with trends in data engineering and apply modern tools and practices
    – Define and implement best practices for data processing, storage, and governance
    – Translate complex requirements into efficient data workflows that support threat detection and response
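
    As one concrete flavor of the ingestion and storage work described above, a minimal GCS-to-BigQuery batch load might look like the following sketch; the bucket, dataset, and table names are invented for illustration:

    ```python
    # Sketch only: load newline-delimited JSON from GCS into a BigQuery table.
    from google.cloud import bigquery

    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,  # fine for a sketch; production would pin a schema
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    load_job = client.load_table_from_uri(
        "gs://example-bucket/narratives/2024-01-01/*.json",  # placeholder URI
        "example_dataset.narratives_raw",                    # placeholder table
        job_config=job_config,
    )
    load_job.result()  # block until the load job finishes
    ```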

    We are a perfect match if you have:
    – 5+ years of hands-on experience as a Data Engineer, with a proven track record of leading complex data projects from design to production
    – Experience with Google Cloud Platform (GCP) and its data ecosystem (BigQuery, GCS, Pub/Sub, Cloud Run, Cloud Functions, Looker)
    – Highly skilled in SQL and Python for advanced data processing, pipeline development, and optimization
    – Deep understanding of software engineering best practices, including SOLID, error handling, observability, performance tuning, and modular architecture
    – Ability to write, test, and deploy production-ready code
    – Extensive experience in database design, data modeling, and modern data warehousing, including ETL orchestration using Airflow or equivalent
    – Open-minded, capable of coming up with creative solutions and adapting to frequently changing circumstances and technological advances
    – Experience in DevOps (Docker/K8s, IaC, CI/CD) and MLOps
    – Fluent in English with excellent communication and cross-functional collaboration skills

    We offer:

    – Professional development within a multidisciplinary team with backgrounds in academia, tech, and the defence sector.
    – Competitive market salary with room for performance-based progression.
    – Work in a fast-growing company with proprietary AI technologies, solving the most complex problems in the domains of information resilience and national security.
    – Participation in Tier-1 international industry conferences, events, and closed briefings.
    – Flexible work arrangements – adjustable hours, location, and hybrid/remote options.
    – A work culture that values focus, speed, experimentation, strategic thinking, and elegance in the proposed solutions.


    Cloud Platform Engineer

    Full Remote · Ukraine · 2 years of experience · English - B2

    PwC is a global network of more than 370,000 professionals in 149 countries that turns challenges into opportunities. We create innovative solutions in audit, consulting, tax and technology, combining knowledge from all over the world.

     

    PwC SDC Lviv, opened in 2018, is part of this global space. It is a place where technology is combined with team spirit, and ambitious ideas find their embodiment in real projects for Central and Eastern Europe.

     

    What do we guarantee?

    • Work format: Remote or in a comfortable office in Lviv - you choose.
    • Development: Personal development plan, mentoring, English and Polish language courses.
    • Stability: Official employment from day one, annual review of salary and career prospects.
    • Corporate culture: Events that unite the team and a space where everyone can be themselves.

     

    We are looking for a talented and experienced Cloud Engineer to join our growing team. In addition to bringing experience with cloud infrastructure, networking, and security, the Cloud Platform Engineer will be responsible for operating, monitoring, and maintaining our many Azure subscriptions and associated global applications. Drawing on your cloud architecture background, you will work closely with data professionals, software engineers, and business stakeholders to build, maintain, and support efficient, scalable, and secure data pipelines. The role spans both engineering and operations, so you should be comfortable with project work as well as handling incident tickets as part of the team.

     

    Responsibilities: 

     

    • Design, build and maintain Azure based infrastructure with Terraform modules (IaaS and PaaS resources) as part of new projects or updates needed to production environments;
    • Troubleshoot IaaS and PaaS issues raised via ServiceNow (SNow) incidents and meet ticket-closure SLAs;
    • Documentation and knowledge sharing: create and maintain environment diagrams, playbooks, runbooks, and best practice documents;
    • Communicate with internal business partners including guidance on application design.

     

    Qualifications:

     

    • 2+ years of experience with Azure core services: Resource Manager, Virtual Machines/Scale Sets, Storage (Blob/Files), Networking (VNets, NSGs, Load Balancer, App Gateway, DNS, Private Link);
    • 2+ years of experience in Windows or Linux administration, with strong problem-solving abilities;
    • 2+ years of experience with Azure Kubernetes Service;
    • 2+ years of experience with Terraform;
    • Ability to work independently and initiate tasks without supervision and utilize strong interpersonal and organizational skills;
    • Excellent problem-solving and analytical skills.

     

    Policy statements:
    https://www.pwc.com/ua/uk/about/privacy.html


    Senior Data Engineer

    Full Remote · Worldwide · 8 years of experience · English - B2

    A US company is searching for a Senior Data Engineer. Interesting project, distributed team, full-time, official contract. Remote work, CET business hours.

     

    Brief project description:
    The product is a complex SaaS platform whose applications perform multiple functions for improving safety, sustainability, and productivity (work planning / incident management / monitoring & analytics) in high-risk industries: oil & gas gathering and transportation, chemicals, construction, and the energy market.

     

    Main Responsibilities:

    • Build and operate production data pipelines for product, analytics, and AI.
    • Integrate internal/external sources with strong quality, reliability, and SLAs.
    • Maintain and improve the data warehouse (performance, cost).
    • Model curated datasets/semantic layers and implement transformations as code (dbt or similar).
    • Ensure observability and operational readiness (tests, freshness checks, runbooks).
    • Support embedded Power BI in production (monitoring, incidents, improvements).
    • Partner with stakeholders and AI/ML teams to keep data current and clarify requirements.
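
    To illustrate the observability point above, a bare-bones freshness check might look like this sketch; the connection string and table are placeholders, and a production setup would more likely lean on dbt source freshness tests:

    ```python
    # Sketch only: alert when a warehouse table has not been loaded recently.
    import datetime

    import psycopg2  # works for Redshift and other Postgres-compatible warehouses

    THRESHOLD = datetime.timedelta(hours=6)  # assumed SLA

    conn = psycopg2.connect("dbname=analytics host=warehouse.example.com user=etl")
    with conn, conn.cursor() as cur:
        cur.execute("SELECT MAX(loaded_at) FROM analytics.fct_orders")  # placeholder
        (last_load,) = cur.fetchone()

    # Assumes loaded_at is stored as a naive UTC timestamp.
    age = datetime.datetime.utcnow() - last_load
    if age > THRESHOLD:
        raise RuntimeError(f"fct_orders is stale: last load was {age} ago")
    ```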

     

    Must-Have Requirements:

    • 5+ years in data engineering / data platform / analytics engineering roles.
    • Strong SQL and experience with relational databases and analytical warehouses/lakehouses.
    • Proficiency in Python (or equivalent) for data processing, automation, and APIs.
    • Experience with cloud platforms (AWS and/or Azure) and cloud-native data services.
    • Production experience with workflow orchestration (Airflow/dbt or similar).
    • Solid data modeling skills (dimensional modeling/star schema or equivalent).
    • Strong software engineering habits: Git, CI/CD, automated testing, readable maintainable code.
    • Clear communication in English and comfort explaining tradeoffs to non-experts.
    • BI enablement: Power BI / Tableau / Looker.

     

    Considered as a BIG plus:

    • Experience with Snowflake / Databricks / BigQuery or similar analytical engines.
    • Infrastructure-as-code: Terraform, plus Docker
    • Experience using AI coding assistants (e.g., GitHub Copilot, Cursor, Claude Code)
    • Familiarity with GenAI/LLM and machine-learning fundamentals

     

    Work conditions:

    • Distributed team, remote work.
    • Kanban or scrum approach, 5-6 team members / team.
    • Full-time (40 hours per week).
    • Official contract: salary, sick-leave days, holidays, vacations.

     

    Hiring process:
    Step 1 - preliminary interview (main questions) - 30 mins
    Step 2 - internal tech interview (tech questions) - 40-50 mins
    Step 3 - tech interview with team leader and architect - 1 hour


    Data Engineer

    Full Remote · Worldwide · 3 years of experience · English - B2

    The CHI Software team is not standing still. We love our job and give it one hundred percent of ourselves! Every new project is a challenge that we face successfully. The only thing that can stop us is… Wait, it’s nothing! The number of projects is growing, and our team is growing with them. And now we need a Data Engineer.
     

    Project Description:

    It is a real-time data processing and analytics solution for a high-traffic web application.
    Tech stack: AWS Glue Studio, Redshift, RDS, Airflow, AWS Step Functions, Lambda, AWS Kinesis, Athena, Apache Iceberg, AWS Data Brew, S3, OpenSearch, Python, SQL, CI/CD, dbt, Snowflake.


     Responsibilities:

    • Design a scalable and robust AWS cloud architecture;
    • Utilize AWS Kinesis for real-time data streaming and aggregation;
    • Implement AWS Lambda for serverless data processing, reducing operational costs (a minimal handler sketch follows after this list);
    • Configure AWS RDS (Relational Database Service) for structured data storage and AWS DynamoDB for NoSQL requirements;
    • Ensure data security and compliance with AWS IAM (Identity and Access Management) and encryption services;
    • Develop and deploy data pipelines using AWS Glue for ETL processes;
    • Write Python scripts and SQL queries for data transformation and loading;
    • Set up continuous integration and continuous deployment (CI/CD) pipelines using AWS CodePipeline and CodeBuild;
    • Monitor system performance and data quality using AWS CloudWatch and custom logging solutions;
    • Collaborate with other teams to integrate data sources and optimize data flow;
    • Deliver a highly scalable real-time data processing system, targeting a 40% increase in data analysis efficiency and a significant reduction in operational costs;
    • Build ETL pipelines from S3 to AWS OpenSearch using AWS Glue;
    • Upper-Intermediate or higher English level is required.
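
    As referenced in the list above, a hedged example of the Kinesis-plus-Lambda pattern is sketched below: a handler that decodes incoming stream records and writes validated events to S3. The bucket and field names are illustrative, not from the project:

    ```python
    # Sketch only: AWS Lambda handler for a Kinesis stream trigger.
    import base64
    import json

    import boto3

    s3 = boto3.client("s3")
    BUCKET = "processed-events"  # placeholder bucket


    def handler(event, context):
        processed = 0
        for record in event["Records"]:
            # Kinesis payloads arrive base64-encoded inside the event.
            payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
            if "event_id" not in payload:  # minimal validation for the sketch
                continue
            s3.put_object(
                Bucket=BUCKET,
                Key=f"events/{payload['event_id']}.json",
                Body=json.dumps(payload),
            )
            processed += 1
        return {"processed": processed}
    ```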

    Physical Security Infrastructure, Systems Engineer

    Full Remote · EU · Product · 2 years of experience · English - B2

    RISK inc: An International iGaming Company Pushing the Boundaries of Entertainment
    Who We Are:
    An international iGaming company specializing in identifying and fostering the growth of high-potential entertainment markets. With 600+ professionals in 20+ locations, we operate in 10 countries, serving over 300,000 customers.
    Always Pushing the Boundaries? You Already Belong at RISK!
    Our global-scale operations are based on strong internal expertise, analytics, and data research. We have expertise in iGaming operations (sports betting, online casino), digital and affiliate marketing, tech solutions, and data analytics.

    We are looking for a Physical Security Infrastructure & Systems Engineer to join our team.

    Responsibilities:

    • Ensure continuous operation and maintenance of security systems, including CCTV, access control, alarms, and network infrastructure;
    • Participate in design and implementation of: AI video analytics solutions (e.g. BriefCam), PSIM platforms (e.g. IMMIX), integrations with InCoreSoft;
    • Develop and maintain: response scenarios, risk-based logic;
    • Provide technical support for the SOC;
    • Quality control of external system integrators;
    • Configure and support Mikrotik routers and VPNs for secure remote site connectivity;
    • Manage server infrastructure and video storage systems, including monitoring and backups;
    • Administer Active Directory for the security team and manage user access across systems;
    • Monitor network performance 24/7 and respond to technical issues or security incidents;
    • Participate in the planning, deployment, and testing of security systems at local and remote sites;
    • Maintain equipment inventory and configure Ajax alarm systems as part of integrated security solutions.


    Requirements:

    • 2+ years of experience as a system or network administrator in security or technical infrastructure;
    • Hands-on experience with CCTV / VMS platforms (specific vendors are not mandatory);
    • Solid understanding of SOC / PSIM logic, including: incidents, escalations, operator roles;
    • Practical experience with, or strong understanding of, PSIM platforms (IMMIX experience is a strong advantage but not mandatory);
    • Practical experience or strong understanding of: video analytics, behavior detection, object tracking;
    • Clear understanding of AI as an assistive tool, not an autonomous “autopilot” system;
    • Understanding of REST APIs / Webhooks at an integration level;
    • Experience working with: external vendors and system integrators;
    • Linux (basic level: services, logs, networking);
    • Experience in network diagnostics and coordination with ISPs;
    • Strong problem-solving skills and a proactive, security-oriented mindset;
    • Good written and verbal communication skills in English;
    • Willingness to travel occasionally for system installations and technical support.

      Our Benefit Cafeteria is Packed with Goodies:
      - Children Allowance
      - Mental Health Support
      - Sport Activities
      - Language Courses
      - Automotive Services
      - Veterinary Services
      - Home Office Setup Assistance
      - Dental Services
      - Books and Stationery
      - Training Compensation
      - And yes, even Massage!

    Data Engineer (Python / SQL)

    Full Remote · Countries of Europe or Ukraine · 5 years of experience · English - B2

    Requirements
    • Strong proficiency in SQL
    • Solid experience with Python
    • Commercial experience as a Data Engineer / Python Developer
    • Hands-on experience with data migration and database processes
    • Ability to work independently and ensure data consistency
    • English: B2+

    • Duration: 3–5 months
    • Start: ASAP


    We offer:

    • Cooperation with a stable company with well-established processes and a positive atmosphere
    • Experience in project management of different levels of complexity, methodology, and approaches
    • Flexible schedule: Mon-Fri (8 hours a day)
    • 18 days of paid vacation; 15 days sick leave
    • A decent level of remuneration with regular reviews of the results of cooperation
    • Extensive loyalty program 

     


    Senior Data Engineer (Python-first, ETL, Azure)

    Full Remote · Ukraine · 5 years of experience · English - B2

    COMPANY
    Atlas Technica is a US-based MSP providing services in the hedge fund vertical. Founded in New York in 2016 and growing rapidly (twice a year) ever since, it now comprises 200+ engineers and 10+ established offices in the US, UK, Ukraine, Hong Kong, and Singapore.

    Location/Type: Remote (Ukraine only)

    Hours: UA timezone, flexible

     

    We are seeking an experienced Data Engineer to lead the design, implementation, and ongoing maintenance of scalable data pipelines and cloud-native solutions. You will work extensively with Python, Azure cloud services, and SQL-based data models, with a strong focus on automation, reliability, and data security, and collaborate closely with cross-functional teams to turn data into actionable insights.

     

    Responsibilities:

    • Build and maintain efficient ETL workflows using Python 3, applying both object-oriented and functional paradigms.
    • Write comprehensive unit, integration, and end-to-end tests; troubleshoot complex Python traces.
    • Automate deployment and integration processes.
    • Develop Azure Functions, configure and deploy Storage Accounts and SQL Databases.
    • Design relational schemas, optimize queries, and manage advanced MSSQL features including temporal tables, external tables, and row-level security.
    • Author and maintain stored procedures, views, and functions.
    • Collaborate with cross-functional teams.
       

    Requirements:

    • English level – B2 or higher
    • 5+ years of proven experience as a Data engineer
    • Programming
      • Proficient in Python 3, with both object-oriented and functional paradigms
      • Design and implement ETL workflows using sensible code patterns
      • Discover, navigate and understand third-party library source code
      • Author unit, integration and end-to-end tests for new or existing ETL (pytest, fixtures, mocks, monkey patching)
      • Ability to troubleshoot esoteric Python traces encountered in the terminal, logs, or debugger
    • Tooling & Automation
      • Git for version control and branching strategies
      • Unix-like shells (Nix-based OS) in cloud environments
      • Author CI/CD configs and scripts (JSON, YAML, Bash, PowerShell)
    • Cloud & Serverless Patterns
      • Develop Azure Functions (HTTP, Blob, Queue triggers) using azure-functions SDK
      • Implement concurrency and resilience (thread pools, tenacity, rate limiters)
    • Azure SDKs & Services
      • Deploy and configure:
      • Functions, Web Apps & App Service Plans
      • Storage Accounts, Communication Services
      • SQL Database / Managed Instance
    • Database Administration
      • Relational data modeling & schema design
      • Data partitioning strategies & temporal tables (system-versioned)
      • Query performance tuning (indexes, execution plans)
      • Selection of optimal data types
      • Complex T-SQL (windowing, CTEs, advanced joins)
      • Advanced MSSQL features (External Tables, Row-Level Security)
    • SQL Objects & Schema Management
      • Author and maintain tables, views, stored procedures, functions, and external tables (PolyBase)
      • Strong analytical and problem-solving skills, with meticulous attention to detail
      • Strong technical documentation skills
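
    To make the Azure Functions expectation concrete, a minimal sketch in the Python v2 programming model with an HTTP trigger and a Blob trigger might look like this; the route, container path, and connection name are assumptions:

    ```python
    # Sketch only: azure-functions v2 decorators (HTTP + Blob triggers).
    import azure.functions as func

    app = func.FunctionApp()


    @app.route(route="health", auth_level=func.AuthLevel.ANONYMOUS)
    def health(req: func.HttpRequest) -> func.HttpResponse:
        # Trivial liveness endpoint.
        return func.HttpResponse("ok", status_code=200)


    @app.blob_trigger(arg_name="blob", path="incoming/{name}",
                      connection="AzureWebJobsStorage")
    def on_new_blob(blob: func.InputStream):
        # A real ETL step would parse this file and load it into SQL Database.
        print(f"Received {blob.name}, {blob.length} bytes")
    ```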

     

    WE OFFER:

    • Direct long-term contract with a US-based company
    • Full-time remote role aligned with EST
    • B2B set-up via SP (FOP in $USD)
    • Competitive compensation
    • Annual salary reviews and performance-based bonuses
    • Company equipment provided for work
    • Professional, collaborative environment with the ability to influence strategic decisions
    • Opportunities for growth within a scaling global organization

    Senior Data Engineer

    Full Remote · Ukraine · Product · 5 years of experience · English - B2

    About the job:

    We are an innovative AI-driven construction intelligence startup, committed to transforming the construction industry with cutting-edge technology. Our mission is to enhance the efficiency, safety, and productivity of construction projects through intelligent solutions.
     

    We’re hiring a hands-on Senior Data Engineer who wants to build data products that move the needle in the physical world. Your work will help construction professionals make better, data-backed decisions every day. You’ll be part of a high-performing engineering team based in Tel Aviv.


    Responsibilities:

    • Lead the design, development, and ownership of scalable data pipelines (ETL/ELT) that power analytics, product features, and downstream consumption.
    • Collaborate closely with Product, Data Science, Data Analytics, and full-stack/platform teams to deliver data solutions that serve product and business needs.
    • Build and optimize data workflows using Databricks, Spark (PySpark, SQL), Kafka, and AWS-based tooling.
    • Implement and manage data architectures that support both real-time and batch processing, including streaming, storage, and processing layers.
    • Develop, integrate, and maintain data connectors and ingestion pipelines from multiple sources.
    • Manage the deployment, scaling, and performance of data infrastructure and clusters, including Databricks, Spark on Kubernetes, Kafka, and AWS services.
    • Use Terraform (and similar tools) to manage infrastructure-as-code for data platforms.
    • Model and prepare data for analytics, BI, and product-facing use cases, ensuring high performance and reliability.


    Requirements:
     

    • 8+ years of hands-on experience working with large-scale data systems in production environments.
    • Proven experience designing, deploying, and integrating big data frameworks - PySpark, Kafka, Databricks. 
    • Strong expertise in Python and SQL, with experience building and optimizing batch and streaming data pipelines.
    • Experience with AWS cloud services and Linux-based environments.
    • Background in building ETL/ELT pipelines and orchestrating workflows end-to-end.
    • Proven experience designing, deploying, and operating data infrastructure / data platforms.
    • Mandatory hands-on experience with Apache Spark in production environments. 
    • Mandatory experience running Spark on Kubernetes.
    • Mandatory hands-on experience with Apache Kafka, including Kafka connectors.
    • Understanding of event-driven and domain-driven design principles in modern data architectures.
    • Familiarity with infrastructure-as-code tools (e.g., Terraform) – advantage.
    • Experience supporting machine learning or algorithmic applications – advantage.
    • BSc or higher in Computer Science, Engineering, Mathematics, or another quantitative field.
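
    For a concrete flavor of the mandatory Spark and Kafka experience, a minimal PySpark structured-streaming job reading a Kafka topic is sketched below; the broker, topic, schema, and paths are placeholders:

    ```python
    # Sketch only: parse JSON events from Kafka and land them as Parquet.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StringType, StructField, StructType

    spark = SparkSession.builder.appName("site-events-stream").getOrCreate()

    schema = StructType([
        StructField("site_id", StringType()),
        StructField("event_type", StringType()),
    ])

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
        .option("subscribe", "site-events")                # placeholder topic
        .load()
        .select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    query = (
        events.writeStream.format("parquet")
        .option("path", "s3a://example-bucket/site_events/")        # placeholder
        .option("checkpointLocation", "s3a://example-bucket/chk/")  # placeholder
        .start()
    )
    query.awaitTermination()
    ```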

    Lead Analytics Manager

    Full Remote · Countries of Europe or Ukraine · 3 years of experience · English - B1

    We are looking for an experienced Head of Analytics / Analytics Manager to lead the development and maintenance of company-wide reporting, manage core data sources, and drive the implementation of modern analytics and AI-enabled solutions.
    This role combines hands-on expertise in Power BI, SQL, and Python with ownership of analytics processes, data quality, and a small analytics team. You will play a key role in ensuring reliable management reporting, supporting business growth, and evolving our analytics ecosystem.


    Key Responsibilities


    • Design, build, and maintain Power BI dashboards and reports for company management and business stakeholders;
    • Own and manage global data sources, including:
     - HRM system;
     - Time tracking system;
     - Agent schedules and workforce data;
    • Fully build and support reporting and data logic for the company’s main project, ensuring data accuracy and consistency;
    • Implement reporting for new projects, including:
     - Connecting to new data sources;
     - Integrating data via APIs;
     - Creating new dashboards and data models;
    • Develop and improve data models and DAX calculations in Power BI;
    • Write and optimize SQL queries and data transformations;
    • Participate in the development of Microsoft Fabric capabilities within existing processes;
    • Coordinate implementation of ML / forecasting solutions together with external vendors;
    • Lead and manage a small team: Reporting & Data Analyst and Operational Analyst;
    • Define priorities, distribute tasks, and review results;
    • Ensure documentation, stability, and reliability of reporting solutions;
    • Collect, process, and analyze Customer Experience (CX) data (CSAT, NPS, CES, QA scores, customer feedback, complaints, etc.);
    • Build CX dashboards and analytical views to monitor service quality and customer satisfaction.

      Required Qualifications
       
    • Higher education in IT, Computer Science, Mathematics, Finance, or related field;
    • 3+ years of hands-on experience with Power BI;
    • Strong and practical knowledge of DAX;
    • 3+ years of experience with SQL and building complex queries;
    • 1+ year of experience with Python (for data processing / automation / ETL tasks);
    • Experience connecting to external systems via APIs;
    • Solid understanding of data modeling and BI best practices;
    • Experience working with large datasets;
    • English level: B1 or higher;

     

    Nice to Have
     

    • Experience with Microsoft Fabric (Dataflows Gen2, Lakehouse/Warehouse, Notebooks, Pipelines);
    • Exposure to forecasting or machine learning concepts;
    • Experience in BPO / Contact Center / Operations analytics;

     

    What We Offer

     

    • Opportunity to build and shape the analytics function
    • Direct impact on management decision-making
    • Participation in AI-driven analytics transformation
    • Professional growth in a fast-scaling company

    Data Engineer

    Full Remote · Worldwide · 6 years of experience · English - B2

    Business

    Digital marketing agency that specializes in enhancing the digital presence and growth of automotive dealerships nationwide. Initially, they focused on supporting their service teams through automation tools and have since expanded to develop and sell white-label, productized services to their partners. Their mission is to deliver high-quality tools that meet the needs of their partners, leveraging their expertise in digital marketing to drive significant improvements and efficiencies in the automotive sector. The agency places a strong emphasis on building scalable, functional tools and fostering team growth through continuous improvement and collaboration.
     

     

    Requirements

    The client is seeking a skilled Data Engineer with experience in architecting scalable data solutions using Amazon Web Services (AWS). This role will focus on building and maintaining data infrastructure to support our customer-facing tools, ensuring data is efficiently stored, processed, and available for analytics.

    Overlap with EST is required
     

    Key Responsibilities:

    • Data Architecture Design: Architect, develop, and maintain scalable and robust data architectures using AWS services such as Amazon S3, RDS, Redshift, and EMR.
    • ETL Processes: Design and implement ETL processes using AWS Glue, ensuring data is clean, reliable, and available for analysis.
    • Data Pipeline Management: Develop and manage data pipelines using tools like Amazon Kinesis and AWS Lambda for real-time and batch processing.
    • Database Management: Optimize databases using Amazon RDS and Redshift, ensuring data integrity, security, and performance.
    • Data Lake Management: Utilize Amazon S3 for building and managing data lakes to store structured and unstructured data.
    • Collaboration: Work closely with software engineers, data scientists, and product managers to ensure data solutions meet business requirements.
    • Security and Compliance: Implement security best practices and ensure compliance with industry standards and regulations.
    • Monitoring and Optimization: Implement monitoring tools and strategies to optimize the performance of data systems.
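
    As a rough sketch of the Glue-based ETL described above, a job that moves cataloged S3 data into Redshift might look like this; the database, table, and connection names are placeholders:

    ```python
    # Sketch only: AWS Glue job skeleton (catalog source -> Redshift sink).
    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read raw records registered in the Glue Data Catalog (placeholder names).
    leads = glue_context.create_dynamic_frame.from_catalog(
        database="raw_lake", table_name="dealership_leads"
    )

    # Write to Redshift through a catalog connection (placeholder names).
    glue_context.write_dynamic_frame.from_jdbc_conf(
        frame=leads,
        catalog_connection="redshift-conn",
        connection_options={"dbtable": "analytics.leads", "database": "dev"},
        redshift_tmp_dir="s3://example-bucket/tmp/",
    )
    job.commit()
    ```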

       

    Qualifications:

    • Education: Bachelor’s degree in Computer Science, Engineering, or a related field.
    • Experience: Minimum of 5 years in data engineering roles with experience in AWS services. Proven track record in building scalable data architectures.
    • Technical Skills: Proficiency in AWS tools (e.g., S3, RDS, Redshift, Glue), data processing frameworks (e.g., Apache Spark, Hadoop), and programming languages (e.g., Python, SQL).
    • Analytical Skills: Strong problem-solving abilities and a data-driven mindset. Experience with performance tuning and data optimization.
    • Communication Skills: Excellent verbal and written communication skills. Ability to collaborate effectively with cross-functional teams.
    • Attention to Detail: High attention to detail and a commitment to delivering high-quality solutions.

     

    Additional requirements

    • Certifications: AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect, or similar certifications.
    • Domain Knowledge: Experience in the automotive industry or digital marketing sector.

    • Additional Skills: Familiarity with machine learning models and integration with customer-facing applications.

     
