Jobs Data & Analytics

  • 16 views · 1 application · 1d

    CRM manager (iGaming)

    Full Remote · Worldwide · Product · 1 year of experience · English - B2

    Join our growing team as a CRM Executive/Email Marketing Manager for retention activities for a multi-country iGaming product.

    We're backed by ambitious industry innovators with a strong multi-regional presence and over 3 years of successful operations. Our portfolio features two large-scale brands active across Tier 1 markets.

    As part of our expanding team, you’ll play a crucial role in elevating user engagement, increasing retention, and driving lifetime value through segmented communications, localized campaigns, and smart automation. This is a high-impact position where you’ll directly influence CRM performance metrics like deposit activity, average check size, and player retention rates.

     

     

    Your Mission

    • Implement and execute an omnichannel CRM strategy, product mechanics, and communications to enhance onboarding, retention, engagement, and monetization across two iGaming brands.
    • Coordinate and execute hands-on multi-channel marketing campaigns using relevant segmentation and personalization, as well as traffic-specific instruments and channels.
    • Analyze user behavior and in-platform activity to develop data-driven retention strategies and improve CRM campaign effectiveness.
    • Implement market-specific CRM approaches and solutions tailored to local languages, cultural nuances, and player preferences.
    • Manage and configure CRM platforms and marketing automation tools across multiple media, channels, and formats.
    • Segment customer databases to deliver personalized communications and offers.
    • Collaborate with cross-functional teams - including development, marketing, and analytics - to align CRM initiatives with business goals and product features.
    • Research and stay ahead of CRM trends in the iGaming sector, bringing innovative mechanics and techniques to your campaigns.

     

    Our Requirements

    • 2+ years of experience in a CRM Executive or Email Marketing Manager role, ideally in online gaming or a similarly dynamic industry.
    • Deep, hands-on knowledge of Customer Journey Optimization and CRM best practices, with a proven ability to implement and test strategies across multiple channels.
    • Expertise in multi-channel engagement strategies, including resolving deliverability and localization issues.
    • Strong command of A/B testing, campaign optimization, and performance analysis.
    • Solid skills in CRM platform management, customer segmentation, and personalization tactics.
    • Ability to manage multiple CRM projects concurrently in a fast-paced organization.
    • English at Intermediate level or higher (written and spoken).

     

     

    Preferred Qualifications

    • Experience working with Customer.io or similar CRM tools.
    • Familiarity with predictive analytics and advanced automation platforms.
    • Experience in designing loyalty programs and retention mechanics.
    • Strong creative thinking and problem-solving abilities.

     

     

    What we offer:

    • Competitive salary;
    • Remote work in a flexible environment;
    • 20 working days of paid vacation, plus education projects;
    • A great product built on our own software solution;
    • Opportunities for professional growth, including attendance at top industry events and conferences and international workshops at our competence centers.

     

    Don’t delay! Send your CV right now and join our highly professional and ambitious team!

  • 12 views · 0 applications · 1d

    Delivery Analyst

    Full Remote · Ukraine · 3 years of experience · English - B2

    About the Job

     

    The Delivery Operations Analyst is a critical operational role responsible for supporting Mira’s ability to plan, forecast, and execute work predictably across the company.


    As a Delivery Operations Analyst at Mira Commerce, you will play a central role in driving operational efficiency and data-backed decision-making across teams that support ecommerce strategy, delivery, and managed services. You’ll work closely with technical and business stakeholders to analyze performance, forecast workload and billable utilization, develop work scheduling insights, and assist in optimizing operational processes that support capacity management.

     

    This role blends analytics, operational planning, and cross-functional collaboration to ensure delivery excellence and continuous improvement aligned with client and business goals.

     

    Job Responsibilities

    Operational & Data Analysis

    • Collect, clean, and interpret operational, delivery, and support performance data across multiple teams.
    • Build dashboards and recurring reports that show KPIs related to delivery performance, team velocity, SLA adherence, utilization, and client success.
    • Identify trends, bottlenecks, inefficiencies, and opportunities to improve delivery processes and cross-team workflows.
    • Monitor project and operational health metrics, providing actionable insights to leadership and team leads.
       

    Capacity Planning & Resource Forecasting

    • Forecast workload, skill demand, and resource capacity across delivery, development, UX/design, QA, and managed services teams.
    • Conduct ongoing utilization analysis to understand team load, availability, and potential resource constraints.
    • Work with project managers and department leads to assign resources, balance workloads, and ensure accurate planning for upcoming initiatives.
    • Collaborate with recruiting/HR to identify short- and long-term hiring needs based on projected capacity gaps.
    • Develop models to predict peak periods, staffing requirements, and throughput expectations.
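    The utilization analysis described above can be sketched in a few lines of Python; the team names, hours, and 85% threshold below are hypothetical, purely to illustrate the kind of model the role involves:

    ```python
    # Minimal sketch of billable-utilization analysis (illustrative data only).
    # Utilization = billable hours / capacity hours; teams above a threshold
    # are flagged as potential resource constraints.

    TEAMS = {  # hypothetical weekly figures
        "delivery": {"billable": 310, "capacity": 360},
        "qa": {"billable": 150, "capacity": 200},
        "ux": {"billable": 95, "capacity": 80},  # overbooked
    }

    def utilization(billable: float, capacity: float) -> float:
        return billable / capacity

    def constrained(teams: dict, threshold: float = 0.85) -> list[str]:
        """Return team names whose utilization exceeds the threshold."""
        return sorted(
            name for name, t in teams.items()
            if utilization(t["billable"], t["capacity"]) > threshold
        )

    print(constrained(TEAMS))  # teams likely to need re-balancing or hiring
    ```

    In practice the inputs would come from Jira worklogs or timesheet exports rather than a hard-coded dict.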
       

    Process Optimization

    • Evaluate delivery, support, and internal operational workflows; recommend improvements to increase efficiency, reduce friction, and improve predictability.
    • Help establish and maintain standard operating procedures (SOPs), documentation, and delivery best practices.
    • Support continuous improvement initiatives and lead small operational improvement projects as needed.
       

    Jira Management & Reporting

    • Configure and optimize Jira boards, workflows, fields, and reporting structures to support efficient project tracking and consistent cross-team processes.
    • Build and maintain Jira dashboards for team performance, SLAs, capacity, and throughput metrics.
    • Ensure Jira governance by establishing usage standards and supporting adoption across delivery teams.
    • Leverage Jira data to improve operational visibility and resource planning accuracy.
       

    Cross-Functional Collaboration

    • Partner with delivery, engineering, UX/design, managed services, and strategy teams to align operational initiatives with business and client priorities.
    • Work closely with project managers to validate schedules, capacity constraints, and workload planning assumptions.
    • Present insights, recommendations, and forecast models to leadership to influence decision-making.
    • Serve as an operational liaison across functional groups, helping teams translate data into action.
       

    Tooling & Reporting

    • Use BI and analytics tools (Looker Studio, internal dashboards, spreadsheets) to build automated, accurate operational reporting.
    • Maintain data integrity across operational systems and ensure reports are timely, relevant, and actionable.
    • Support the adoption of new tools and enhancements that improve delivery efficiency and resource planning.

     

    Required Qualifications

     

    • Bachelor’s degree in Business, Analytics, Operations Management, or a similar field.
    • Strong analytical skills with experience in data processing and reporting.
    • Familiarity with Systems Integration (SI) / Software Development processes and related operations in professional IT services environments.
    • Excellent communication skills with the ability to present insights to technical and non-technical stakeholders.
    • Practical experience with Google Sheets/Excel, Looker Studio / Power BI / Tableau or similar tools.
    • Hands-on experience with Atlassian Jira/Confluence for project tracking, workflow management, reporting / data extraction, and board configuration.

     

    Preferred (Nice-to-Have)

    • Knowledge of operational KPIs specific to professional B2B services (IT services, Agency services, Consulting etc.).
    • Experience working with and extending functionality of project management tools such as Jira workflows, automation rules, or cross-team reporting structures.
    • Technical knowledge and experience with systems integration via tools such as n8n, make.com, Zapier, or similar.

     

  • 18 views · 2 applications · 1d

    Lead Product Analyst

    Full Remote · Countries of Europe or Ukraine · Product · 5 years of experience · English - B2

    We’re looking for a Lead Product Analyst who can turn raw data into product strategy and help the team make smarter decisions every day. If you enjoy connecting metrics with real user behavior, leading analysts, and shaping product direction with data - this quest is for you πŸš€
     

    πŸ’ͺ YOUR QUEST-LINE:

    • Guide and grow a team of Product Analysts - help them think deeper and keep the analytical bar high;
    • Own analytics across key product domains - understand how users behave, what drives engagement, and where monetization actually happens;
    • Shape and refine core product metrics (LTV, retention, churn, cohorts, funnels, activation/A-ha moments) and ensure consistent metric definitions and event taxonomy across the company;
    • Develop and maintain dashboards in Tableau / Amplitude that reflect key product and business metrics;
    • Drive experimentation culture - from shaping hypotheses to interpreting results and explaining what they mean beyond the numbers;
    • Bring structure and statistical sanity into A/B testing and decision-making;
    • Analyze user lifecycle and segmentation to uncover growth opportunities, friction points, and revenue drivers;
    • Keep an eye on product health metrics, set up alerts/anomaly checks, investigate drops, and drive quick RCA with clear next steps;
    • Work closely with CEO, CPO, Product Managers, and key stakeholders to make data part of everyday decisions;
    • Continuously improve how we work with analytics - definitions, reporting logic, internal standards.

     

    πŸ‘¨β€πŸš€ YOUR IN-GAME SKIN:

    • 5+ years as a Product Analyst in product-based tech environments;
    • 1+ year of experience leading or mentoring analysts;
    • Strong SQL skills with hands-on experience working with large datasets;
    • Experience defining and evolving core product metrics: retention, LTV, churn, funnels, cohorts, segmentation, activation;
    • Hands-on experience designing and analyzing A/B tests, understanding statistical significance and experiment pitfalls;
    • Experience working with Tableau and Amplitude;
    • Experience building dashboards and reporting systems that support real product decisions;
    • Experience owning metric definitions/event tracking and ensuring data consistency across tools (single source of truth);
    • Understanding of how metrics connect to business impact and monetization logic;
    • Strong communication skills - able to translate complex analysis into clear, actionable insights;
    • Upper-Intermediate level of English.
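    The "statistical sanity in A/B testing" point above boils down to checks like this stdlib-only sketch of a two-proportion z-test; the conversion counts are made up for illustration:

    ```python
    import math

    def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int):
        """Two-sided z-test for the difference between two conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the standard normal CDF (via erf).
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # Hypothetical experiment: 20% vs 26% conversion on 1,000 users per arm.
    z, p = two_proportion_ztest(200, 1000, 260, 1000)
    print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 → statistically significant
    ```

    Real experiment pipelines would add guardrails the bullets hint at (sample-size planning, peeking correction), but the core significance check is this small.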

     

    πŸ§œβ€β™€οΈ WILL BE YOUR BOOST:

    • Familiarity with the Steam trade system, skins market, or gaming experience (CS2, Dota2, Rust);
    • Domain experience in high-load E-commerce, Fintech, or iGaming projects;
    • Proficiency in Python for deeper product analysis.

     

    πŸ₯· YOUR POWER-UPS SOURCE:

    • Spontaneous weekend from the CEO;
    • Support for professional development, including events (we went on a MAJOR trip to Copenhagen, for example), certifications, and educational materials;
    • Start working at a time that’s comfortable for you;
    • People Partner who will always support you in word and deed;
    • Anniversary and performance bonuses;
    • Input on project direction in a bureaucracy-free environment;
    • Regular performance reviews and personal feedback sessions;
    • 20 paid vacation days and 7 sick leave days;
    • Assisting with the accounting and tax management for individual entrepreneurs;
    • English language and Well-being benefit – so that you can improve your mental health the way you want;
    • No time tracking;
    • We purchase CHARGING STATIONS for our spacers;
    • We help Ukraine: every month the company supports AFU and transfers donations for ammunition and supplies.

     

    If you read everything carefully and exclaimed: «Oh, yeah, I'm good at everything and I want to do cool things together!», then we are already looking forward to your resume!

    Send it as soon as possible! 🚀

     

  • 39 views · 6 applications · 1d

    Senior AI Data Engineer (Python) to $8000

    Full Remote · Worldwide · Product · 5 years of experience · English - C2

    About Pulse Intelligence

    Pulse Intelligence is building the definitive data platform for the global mining industry. We aggregate, process, and enrich data from hundreds of sources (regulatory filings, stock exchanges, company websites, news, and financial APIs) to give mining investors and analysts a real-time, comprehensive view of every mining asset, company, and commodity on the planet.

     

    Our platform combines large-scale web scraping with LLM-powered data extraction to turn unstructured documents (NI 43-101 technical reports, RNS announcements, SEDAR filings) into structured, queryable intelligence. We're a small team shipping fast, and every engineer has an outsized impact on the product.

     

    About the Role

    We're looking for a Senior AI Data Engineer to take ownership of our entire data pipeline, from raw document ingestion through AI-powered extraction to clean, structured records in our database. You'll be the technical lead on data acquisition and enrichment: architecting scrapers for new sources, designing LLM extraction strategies, making decisions on data modeling, and driving the quality and coverage of our mining asset database.

     

    This is a high-autonomy role for someone who can see the big picture and execute on the details. You'll decide which data sources to prioritise, how to structure extraction pipelines, and when to invest in automation vs. manual curation. You'll ship scrapers one day, redesign an entity extraction pipeline the next, and mentor the team on best practices throughout.

     

    What You'll Do

    • Own data acquisition and scraping - identify, prioritise, and build scrapers for new data sources (exchanges, regulatory filings, company websites, financial APIs) and scale them to run reliably in production
    • Design LLM extraction pipelines - architect and iterate on prompt-driven pipelines that extract structured mining data (assets, production, reserves, companies) from unstructured documents
    • Build the document processing pipeline - take raw PDFs, HTML, and filings from ingestion through to clean, structured data using OCR, parsing, deduplication, and text normalisation
    • Drive data quality and coverage - design verification, deduplication, and enrichment workflows, and own the data model that keeps our mining asset database accurate and well-structured
    • Keep pipelines running - monitor scheduled jobs, design for failure recovery, and ensure the system scales without manual intervention
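    The extraction step described above might look roughly like this; the schema fields and the stubbed model response are illustrative assumptions, not the company's actual pipeline, and a real implementation would call an LLM API instead of the stub:

    ```python
    import json

    # Fields we ask the model to return for each document
    # (hypothetical schema, for illustration only).
    SCHEMA_FIELDS = {"asset_name", "commodity", "reserves_tonnes"}

    PROMPT_TEMPLATE = (
        "Extract the following fields as JSON with keys "
        f"{sorted(SCHEMA_FIELDS)} from the document below.\n\n{{document}}"
    )

    def call_llm(prompt: str) -> str:
        """Stub standing in for a real LLM API call (OpenAI, Anthropic, etc.)."""
        return ('{"asset_name": "Red Hill", "commodity": "gold", '
                '"reserves_tonnes": 1200000}')

    def extract_record(document: str) -> dict:
        """Run the prompt, then validate the model's JSON against the schema."""
        raw = call_llm(PROMPT_TEMPLATE.format(document=document))
        record = json.loads(raw)
        missing = SCHEMA_FIELDS - record.keys()
        if missing:
            raise ValueError(f"model response missing fields: {missing}")
        return record

    print(extract_record("NI 43-101 technical report text..."))
    ```

    The validation step is the part that matters at scale: rejecting malformed model output early is what keeps downstream records clean.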

     

    What You Need

    • 5+ years of Python in data engineering or backend development
    • Web scraping at scale - you've built and maintained production scrapers (Scrapy, Playwright, Selenium, or similar)
    • Prompt engineering - you've used LLM APIs (OpenAI, Anthropic, or similar) to extract structured data from unstructured text, and you iterate on prompts systematically
    • Strong SQL and data modeling - you've designed schemas and optimised queries in PostgreSQL or similar
    • Self-directed - you identify what needs doing and drive it to completion with minimal oversight

     

    Nice to Haves

    • Mining or resources industry knowledge (NI 43-101, JORC, resource classifications)
    • AWS (S3, EKS) or similar cloud infrastructure
    • LLM self-verification, chain-of-thought, or agentic pipelines
    • Experience with workflow orchestration tools (Airflow, Dagster, or similar)
    • Experience mentoring engineers or leading a small data team

     

    Benefits

    • Work on a product that maps the entire global mining industry
    • Small team - your work directly shapes the product
    • Remote-friendly with flexible hours
    • Equity in a growing platform

     

    Hiring Process

    • Introductory call - 30 minutes
    • Take-home challenge - 6 hours
    • Technical & cultural fit interview - 1 hour
    • System design interview - 1 hour
    • Final chat with CEO - offer within 48 hours
  • 10 views · 3 applications · 1d

    Senior Data Engineer

    Full Remote · Countries of Europe or Ukraine · 5 years of experience · English - C1

    We’re looking for a Senior Data Engineer to join a SaaS company working with large-scale web data and consumer insights for global brands.

     

    Details:
    Format: Full-time, Remote
    Duration: 6 months (with possible extension)
    Start: ASAP
    English: Fluent

     

    Requirements:

    - 5+ years of Data Engineering experience

    - Strong PySpark, Python, SQL

    - Hands-on experience with AWS

    - Practical experience with Databricks (critical)

    - Understanding of production-grade data pipelines and data quality

    - Confident English for cross-team collaboration

     

    Nice to have:

    - MLflow / LLM exposure

    - Big Data & Data Lake architectures

    - CI/CD, DevOps experience

  • 7 views · 0 applications · 1d

    Systems and Data Analyst

    Full Remote · Ukraine · Product · 4 years of experience · English - B2

    Team Summary

    Our team provides insights during the development and rollout of AI features (e.g., motion and object detection) to improve user comprehension. We quantify user experience, conduct deep-dive studies, and evaluate new initiatives for core device detection functionality.

     

    Job Summary

    We are seeking an experienced Systems & Data Analyst to analyze, define, and document system requirements to ensure seamless system and data integration. This role involves working with complex datasets and system data flows: you will collaborate with business and technical teams to translate needs into effective solutions, optimize reporting, and support data-driven decisions.

     

    Responsibilities and Duties

    • Perform system analysis and define requirements for performance metrics and monitoring tools
    • Contribute to the design, development, and continuous improvement of monitoring solutions, including the definition of dashboard structure, content, and key indicators
    • Optimise data processing and retrieval processes, ensuring efficient interaction with databases, data sources, and monitoring platforms
    • Identify opportunities for automation and drive the implementation of solutions that streamline routine analytical and system processes
    • Review and validate the quality of system solutions, configurations, and dashboards to ensure accuracy, reliability, and alignment with requirements
    • Collaborate closely with data engineering, architecture, and development teams to define requirements for automating reports, system integrations, and data flows, including contributions to data warehouse and system design

       

    Qualifications and Skills

    • 4+ years of hands-on experience as a System Analyst, Computer Systems Analyst, or a similar role involving system analysis of information systems, definition and maintenance of functional and non-functional requirements, and analysis of system architecture and integrations between systems
    • Strong expertise in system analysis methodologies, tools, and best practices
    • Strong understanding of ETL/ELT process design and proven database query writing and optimization skills for consolidated data
    • Proven experience in modeling and documenting complex data flows, system integrations, and component interactions across enterprise environments
    • Proven experience with dashboarding and data visualization platforms for system and performance monitoring
    • Solid understanding of database management principles and data architecture
    • Advanced skills in preparing analytical reports, functional specifications, and comprehensive technical documentation
    • Good written and spoken English (preferably B2+)

       

    Nice to have

    • Experience designing interaction architecture between databases

       

     We offer multiple benefits that include

    • The environment of equal opportunities, transparent and value-based corporate culture, and an individual approach to each team member
    • Competitive compensation and perks. Annual performance review
    • Gig-contract
    • 21 paid vacation days per year, paid public holidays according to Ukrainian legislation
    • Development opportunities like corporate courses, knowledge hubs, and free English classes as well as educational leaves
    • Medical insurance is provided from day one. Sick leaves and medical leaves are available
    • Remote working mode is available within Ukraine only
    • Free meals, fruits, and snacks when working in the office.
  • 4 views · 1 application · 1d

    BAS ERP Business Analyst

    Full Remote · Countries of Europe or Ukraine · Product · 3 years of experience · English - B1

    We are looking for a Business Analyst to work on automating business processes within the existing BAS ERP 2.5 information system. You will analyze business requirements, optimize processes, and ensure effective communication between the business and the technical team.

    Responsibilities:

    • Communication with business representatives regarding automation requirements, conducting interviews with key stakeholders.
    • Independent analysis of requests and finding solutions based on documentation and system knowledge.
    • Developing functional requirements for automation systems.
    • Studying existing processes, evaluating the capabilities of the system and its alignment with business requirements.
    • Modeling and describing business processes, implementing changes, and optimizing them.
    • Analyzing employee requests, identifying problems, and formulating tasks for developers.
    • Direct involvement in business automation projects.
    • Testing improvements and preparing for deployment into production.
    • Writing user manuals and providing consultation support.
    • Consultative support and task setting for further development.
       

    Requirements:

    • 3+ years of experience as a business analyst with BAS or 1C.
    • At least 1 year of experience working with BAS ERP configuration or similar systems (e.g., BAS KUP), preferably with support experience.
    • Knowledge of the standard BAS ERP configuration.
    • Understanding of key business areas: management accounting, payroll, budgeting.
    • Industry knowledge in business process automation and accounting.
    • Knowledge of IFRS (International Financial Reporting Standards).
    • Understanding of BAS integration with other systems.
    • English proficiency: A2-B1 (ability to read technical documentation).

    Nice to have:

    • Experience in successful integrations with other systems (via API services).
    • Knowledge of modern development approaches.
       

    We offer:

    • Tax expenses coverage for private entrepreneurs in Ukraine.
    • Expert support and guidance for Ukrainian private entrepreneurs.
    • 20 paid vacation days per year.
    • 10 paid sick leave days per year.
    • Public holidays as per the company’s approved public holiday list.
    • Medical insurance.
    • Opportunity to work remotely.
    • Professional education budget.
    • Language learning budget.
    • Wellness budget (gym membership, sports gear, and related expenses).

       

    Explore opportunities at JustMarkets and join our team of professionals!

  • 27 views · 2 applications · 1d

    Senior Data Engineer (Batch and Streaming)

    Full Remote · Countries of Europe or Ukraine · 2.5 years of experience · English - B2

    Role Overview

    We are building a greenfield analytics platform supporting both batch and real-time data processing. We are looking for a Senior Data Engineer who can design, implement, and evolve scalable data systems in AWS.

    This role combines hands-on development, architectural decision-making, and platform ownership.

     

    Core Responsibilities

    • Design and implement batch and streaming data pipelines using Apache Spark.
    • Build and evolve a scalable AWS-based data lake architecture.
    • Develop and maintain real-time data processing systems (event-driven pipelines).
    • Own performance tuning and cost optimization of Spark workloads.
    • Define best practices for data modeling, partitioning, and schema evolution.
    • Implement monitoring, observability, and data quality controls.
    • Contribute to infrastructure automation and CI/CD for data workflows.
    • Participate in architectural decisions and mentor other engineers.

       

    Required Qualifications

     

    Experience

    • 5+ years of experience in Data Engineering.
    • Strong hands-on experience with Apache Spark (including Structured Streaming).
    • Experience building both batch and streaming pipelines in production environments.
    • Proven experience designing AWS-based data lake architectures (S3, EMR, Glue, Athena).

       

    Streaming & Event-Driven Systems

    • Experience with event streaming platforms such as Apache Kafka or Amazon Kinesis.

       

    Data Architecture & Modeling

    • Experience implementing lakehouse formats such as Delta Lake.
    • Strong understanding of partitioning strategies and schema evolution.

       

    Performance & Reliability

    • Experience using SparkUI and AWS CloudWatch for profiling and optimization.
    • Strong understanding of Spark performance tuning (shuffle, skew, memory, partitioning).
    • Proven track record of cost optimization in AWS environments.
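    Skew handling, one of the tuning topics above, can be illustrated without a cluster: key salting spreads a hot join key across N buckets so a single partition no longer receives all of its rows. This is a pure-Python sketch with made-up data; on Spark the same idea is applied to DataFrame keys before the shuffle:

    ```python
    import random
    from collections import Counter

    random.seed(0)

    # 10,000 events where one key ("hot") dominates — classic shuffle skew.
    events = [("hot" if i % 10 else "cold", i) for i in range(10_000)]

    SALTS = 8  # number of buckets to spread each key across

    def salted_key(key: str) -> str:
        """Append a random salt so a hot key hashes to several partitions."""
        return f"{key}#{random.randrange(SALTS)}"

    distribution = Counter(salted_key(k) for k, _ in events)
    # Without salting, "hot" alone would put 9,000 rows on one partition;
    # with salting, each hot#i bucket gets roughly 9000 / 8 ≈ 1125 rows.
    print(max(distribution.values()))
    ```

    The cost of salting is that the other side of the join must be replicated once per salt value, which is the trade-off a Spark engineer in this role would be weighing (alongside AQE's built-in skew-join handling).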

       

    DevOps & Platform Engineering

    • Experience with Docker and CI/CD pipelines.
    • Experience with Infrastructure as Code (Terraform, AWS CDK, or similar).
    • Familiarity with monitoring and observability practices.
  • 131 views · 35 applications · 1d

    Senior Data Analyst to $5500

    Full Remote · Countries of Europe or Ukraine · Product · 3 years of experience · English - C1

    We are looking for a Senior Data Analyst to join a leading AI-powered sports content platform serving top clients like NBA, Bundesliga, LaLiga, and ESPN.

     

    Key Responsibilities:

     

    • Own all analytical aspects within assigned product domains;
    • Build and maintain dashboards, reports, and KPIs;
    • Monitor data quality, integrity, and validity;
    • Use data mining, analytical tools, and business logic to support Product teams;
    • Analyze product performance and contribute to design and implementation decisions;
    • Manage multiple data initiatives (ongoing and ad-hoc);
    • Define alerts and monitor performance to improve operational efficiency;
    • Provide data-driven insights for prioritizing new features and initiatives;
    • Work closely with Product, R&D, Operations, and Business teams.

       

     Requirements:

     

    • 3+ years of experience in Product / Data / Business / Operations Analysis in a data-rich tech environment;
    • Strong SQL skills with daily hands-on query writing;
    • Excellent communication skills and ability to clearly present insights;
    • Strong multitasking skills and fast learning ability;
    • Bachelor’s degree in a quantitative field (Math, Statistics, Engineering, etc.);
    • Proven experience using AI tools for analytics.
       

    Nice to Have:

     

    • Experience with Tableau / Power BI / Looker;
    • Experience analyzing datasets using LLM tools;
    • Python or R experience;
    • Passion for sports.

       

    Conditions:

     

    • Fully remote work format.
    • Salary range: $4,000–5,500.
    • International, product-focused team.
    • Work on a globally impactful AI-driven product.


     

  • 22 views · 1 application · 1d

    Senior Data Engineer

    Full Remote · EU · Product · 4 years of experience · English - B2

    Equals 5 is a Healthcare Marketing SaaS for Pharma and Life Sciences. Our platform leverages exclusive NPI-level targeting technology to help brands reach healthcare professionals across 20+ channels with precise, user-level reporting.

    We are looking for a Senior Data Engineer to join the Identity team. This is not a standard ETL role. We are building a dynamic data ecosystem where AI is deeply integrated, both as a productivity multiplier and as a core component of our data processing logic for identity data enrichment and scoring.

    You will own the infrastructure that handles over 10,000 executions per minute, ensuring stability, scalability, and data integrity. You will work with a modern stack on Google Cloud Platform, utilizing Cloud Functions, Kubernetes, and a highly advanced N8N implementation (up to 20M executions per 24 hours).

     

    Responsibilities

    • Design and implement pipelines that utilize LLMs to analyze and score identity data in real-time. Integrate AI models directly into the decision-making loop, balancing accuracy with latency and cost.
    • Architect scalable data solutions using GCP, Python, and N8N. Decide how to route, process, and store massive volumes of identity data. Manage storage with BigQuery and Apache Iceberg for TBs of data.
    • Maintain and optimize N8N instances: complex dataflows, custom Python nodes, and performance tuning for 10k+ executions per minute.
    • Manage PostgreSQL performance under heavy load, optimizing complex queries and indexing strategies.
    • Utilize Apache Spark for data transformations and batch processing when lightweight cloud functions are not enough.
    • Proactively monitor the system.
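The first responsibility above, balancing LLM accuracy against latency and cost, often comes down to escalation logic: score each record cheaply first, and pay for a model call only on ambiguous cases. A minimal sketch of that pattern follows; every name, threshold, and the stubbed LLM call are hypothetical, not Equals 5's actual system:

```python
# Hypothetical sketch: route identity records between a cheap heuristic
# and an (stubbed) LLM scorer, trading accuracy against latency and cost.

def heuristic_score(record: dict) -> float:
    """Cheap rule-based score: fraction of key identity fields present."""
    fields = ("email", "phone", "npi")
    return sum(1 for f in fields if record.get(f)) / len(fields)

def llm_score(record: dict) -> float:
    """Stub for an expensive LLM call; a real system would hit an API here."""
    return 0.9  # placeholder value

def score_record(record: dict, low: float = 0.3, high: float = 0.8) -> float:
    """Escalate to the LLM only when the heuristic is inconclusive."""
    s = heuristic_score(record)
    if low <= s <= high:      # ambiguous band: pay for the LLM call
        return llm_score(record)
    return s                  # confident cheap answer: skip the LLM

print(score_record({"email": "a@b.c", "phone": "1", "npi": "123"}))  # 1.0, no LLM call
print(score_record({"email": "a@b.c"}))  # ambiguous -> 0.9 via the stub
```

At 10k+ executions per minute, the width of the ambiguous band is effectively the cost dial: narrowing it trades model spend for accuracy on edge cases.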

     

    Requirements

    • 4-5+ years of experience in Data Engineering or Backend Engineering with a strong data focus
    • Production AI Integration: Experience integrating LLMs (OpenAI, Anthropic, Gemini) into production applications via API
    • Expertise in GCP: Cloud Functions, IAM, Networking, Cloud Run
    • Strong Python: Clean, efficient, and testable code; comfortable building custom logic
    • Kubernetes (K8s): Experience deploying and scaling services in containerized environments
    • PostgreSQL Mastery: Proven ability to handle heavy write/read loads and optimize schemas
    • English: B2+ (Upper-Intermediate) or higher

     

    Nice-to-have: 

    • N8N / Workflow Automation experience at a deep technical level

     

    What We Offer
    • Fully remote with flexible hours (aligned with EU timezones for syncs).
    • Influence on quality strategy across the entire engineering organization.
    • A cross-team role with visibility into every part of the product.
    • AI-first tooling.
    • Claude Code licenses and cutting-edge AI development workflows.
    • A team with no bureaucracy; decisions are made fast.

  • Β· 10 views Β· 3 applications Β· 1d

    Senior AI Engineer

    Full Remote Β· Countries of Europe or Ukraine Β· 5 years of experience Β· English - B2

    In a partnership with a Berlin-based startup that has successfully established a leading digital trading platform for industrial metals across Europe, we are looking for a heavy-hitting Senior AI Engineer to join our team. You aren’t just someone who calls APIs; you are an architect of intelligent systems who understands the "why" behind the "how."

    You will be responsible for building, fine-tuning, and deploying sophisticated AI agents and custom models within the GCP ecosystem. If you have a deep obsession with agentic workflows, the Model Context Protocol (MCP), and the nuances of model evaluation, we want to talk to you.

    Key Responsibilities

    • Architect Agentic Systems: Design and implement complex agentic layers and workflows using LangChain and LangGraph to solve multi-step reasoning problems.
    • Model Engineering: Select, deploy, and fine-tune open-source models (Llama, Mistral, etc.) alongside proprietary LLMs (Gemini, OpenAI).
    • R&D & MCP: Leverage the Model Context Protocol (MCP) to integrate local and remote resources into agentic environments.
    • Production Excellence: Implement rigorous model evaluation frameworks, regression testing, and observability using LangSmith.
    • System Design: Build scalable AI infrastructure on GCP, ensuring seamless integration between custom models and cloud-native services.
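The agentic workflows described above reduce to a plan/act loop: a model inspects state, picks a tool, and repeats until it decides it is done. A toy sketch of that control flow in plain Python, with a stubbed planner standing in for a real LLM and for LangChain/LangGraph orchestration (every name here is illustrative only):

```python
# Toy agent loop: plan a step, call a tool, repeat until 'finish'.
# Real projects on this stack use LangChain/LangGraph; this only
# illustrates the control flow with hypothetical names.

def fake_llm(state: dict) -> dict:
    """Stand-in planner: decide the next action from the current state."""
    if "sum" not in state:
        return {"action": "add", "args": (state["a"], state["b"])}
    return {"action": "finish"}

TOOLS = {"add": lambda a, b: a + b}  # registry of callable tools

def run_agent(state: dict, max_steps: int = 5) -> dict:
    """Multi-step loop: the planner picks a tool until it says 'finish'."""
    for _ in range(max_steps):
        decision = fake_llm(state)
        if decision["action"] == "finish":
            break
        state["sum"] = TOOLS[decision["action"]](*decision["args"])
    return state

print(run_agent({"a": 2, "b": 3}))  # {'a': 2, 'b': 3, 'sum': 5}
```

The `max_steps` cap is the same guardrail production frameworks impose so a confused planner cannot loop forever.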

    Technical Requirements

    • Experience: 3–5+ years of high-level experience in AI/ML engineering (Very Senior level).
    • GenAI Stack: Expert-level command of Gemini, OpenAI, and Llama.
    • Orchestration: Deep expertise in LangChain and LangSmith for building and monitoring agentic tools.
    • Cloud: Professional experience with GCP (Vertex AI, Cloud Run, GKE).
    • Coding: Python mastery is a given.
    • Fine-Tuning: Proven track record of leveraging and fine-tuning open-source models for specific domain tasks.
    • Testing: Strong focus on model evaluation, benchmarking, and regression testing.
    • Nice to Have: Deep learning frameworks like PyTorch or TensorFlow and a solid grasp of classic ML concepts.

     

    Interview Process

    1. Technical Deep-Dive: A 1-hour session with our Engineering Manager focusing on system design, agentic logic, and live problem-solving.
    2. Culture & Vision: A conversation with our Founder or HR lead to discuss soft skills, values, and long-term fit.

     

  • Β· 7 views Β· 0 applications Β· 1d

    Senior Azure Data Engineer IRC289060

    Full Remote Β· Ukraine Β· 4 years of experience Β· English - B2

    Description

    GlobalLogic is searching for a motivated, results-driven, and innovative software engineer to join our project team at a dynamic startup specializing in pet insurance. Our client is a leading global holding company that is dedicated to developing an advanced pet insurance claims clearing solution designed to expedite and simplify the veterinary invoice reimbursement process for pet owners.
    You will be working on a cutting-edge system built from scratch, leveraging Azure cloud services and adopting a low-code paradigm. The project adheres to industry best practices in quality assurance and project management, aiming to deliver exceptional results.
    We are looking for an engineer who thrives in collaborative, supportive environments and is passionate about making a meaningful impact on people’s lives. If you are enthusiastic about building innovative solutions and contributing to a cause that matters, this role could be an excellent fit for you.

     

    Requirements

    • Strong hands-on experience with Azure Databricks (Delta Live Tables (DLT) pipelines, Lakeflow Connect, Unity Catalog, Time Travel, Delta Share) for large-scale data processing and analytics
    • Proficiency in data engineering with Apache Spark, using PySpark, Scala, or Java for data ingestion, transformation, and processing
    • Proven expertise in the Azure data ecosystem: Databricks, ADLS Gen2, Azure SQL, Azure Blob Storage, Azure Key Vault, Azure Service Bus/Event Hub, Azure Functions, Azure Data Factory, and Azure Cosmos DB
    • Solid understanding of Lakehouse architecture, Modern Data Warehousing, and Delta Lake concepts
    • Experience designing and maintaining config-driven ETL/ELT pipelines with support for Change Data Capture (CDC) and event/stream-based processing
    • Proficiency with RDBMS (MS SQL, MySQL, PostgreSQL) and NoSQL databases
    • Strong understanding of data modeling, schema design, and database performance optimization
    • Practical experience working with various file formats, including JSON, Parquet, and ORC
    • Familiarity with machine learning and AI integration within the data platform context
    • Hands-on experience building and maintaining CI/CD pipelines (Azure DevOps, GitLab) and automating data workflow deployments
    • Solid understanding of data governance, lineage, and cloud security (Unity Catalog, encryption, access control)
    • Strong analytical and problem-solving skills with attention to detail
    • Excellent teamwork and communication skills
    • Upper-Intermediate English (spoken and written)

     

    Job responsibilities

    • Design, implement, and optimize scalable and reliable data pipelines using Databricks, Spark, and Azure data services
    • Develop and maintain config-driven ETL/ELT solutions for both batch and streaming data
    • Ensure data governance, lineage, and compliance using Unity Catalog and Azure Key Vault
    • Work with Delta tables, Delta Lake, and Lakehouse architecture to ensure efficient, reliable, and performant data processing
    • Collaborate with developers, analysts, and data scientists to deliver trusted datasets for reporting, analytics, and machine learning use cases
    • Integrate data pipelines with event-based and microservice architectures leveraging Service Bus, Event Hub, and Functions
    • Design and maintain data models and schemas optimized for analytical and operational workloads
    • Identify and resolve performance bottlenecks, ensuring cost efficiency and maintainability of data workflows
    • Participate in architecture discussions, backlog refinement, estimation, and sprint planning
    • Contribute to defining and maintaining best practices, coding standards, and quality guidelines for data engineering
    • Perform code reviews, provide technical mentorship, and foster knowledge sharing within the team
    • Continuously evaluate and enhance data engineering tools, frameworks, and processes in the Azure environment
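Several responsibilities above mention config-driven ETL/ELT. The core idea, pipeline behavior defined by configuration rather than code changes, can be sketched in plain Python; the config schema and step names below are invented for illustration, and a real implementation on this project would sit on Databricks/Spark:

```python
# Minimal illustration of a config-driven pipeline: the transformation
# sequence lives in data, so adding a step needs no code change.
# Schema and operation names are hypothetical.

CONFIG = {
    "source": "orders",
    "steps": [
        {"op": "filter", "column": "amount", "min": 0},       # drop bad rows
        {"op": "rename", "from": "amount", "to": "amount_usd"},
    ],
}

def run_pipeline(rows: list[dict], config: dict) -> list[dict]:
    """Apply each configured step in order."""
    for step in config["steps"]:
        if step["op"] == "filter":
            rows = [r for r in rows if r[step["column"]] >= step["min"]]
        elif step["op"] == "rename":
            rows = [{step["to"] if k == step["from"] else k: v
                     for k, v in r.items()} for r in rows]
    return rows

data = [{"id": 1, "amount": 10}, {"id": 2, "amount": -3}]
print(run_pipeline(data, CONFIG))  # [{'id': 1, 'amount_usd': 10}]
```

The same shape scales up: swap the list comprehensions for DataFrame operations and store the config per source system, and CDC or streaming variants become additional step types rather than new pipelines.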

     

    What we offer

    Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. 

    Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

    Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

    Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

    High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

  • Β· 24 views Β· 3 applications Β· 1d

    Senior Data Engineer

    Full Remote Β· Countries of Europe or Ukraine Β· 4 years of experience Β· English - B2

    Our client is a global jewelry manufacturer that is transforming the retail experience with cutting-edge technology. Its mission is to provide an exceptional shopping experience to customers by leveraging data-driven insights and innovative solutions. We are looking for a talented Data Engineer to join our dynamic team and help shape the future of retail; you will play a critical role in developing and maintaining robust data pipelines in both Azure Synapse and Databricks.

    Joining our team, you will contribute to building a solid foundation for the data infrastructure, supporting our marketing analytics organization's goal of becoming more data-driven and customer-centric. You will collaborate closely with cross-functional teams, helping to drive impactful data product deliveries and optimizing our analytical framework for scalable insights globally.

     

    Responsibilities

     

    • Develop and maintain data products and pipelines in Databricks, Azure Data Factory, and Azure Synapse Analytics
    • Communicate with stakeholders and users of our data products by understanding their problems and supporting them in their needs
    • Optimize data pipelines for performance and scalability, automating repetitive tasks to improve efficiency and reduce the time from data ingestion to actionable insights
    • Implement and maintain data quality processes, including data validation, cleansing, and error handling, to ensure high data integrity across all systems
    • Improve existing data integration processes to provide better reliability and robustness
    • Partner with product managers, analysts, and business stakeholders to understand data requirements, provide data engineering support, and ensure data is accessible and usable for analysis and reporting
    • Stay up-to-date with the latest trends and best practices in data engineering, bringing innovative ideas and solutions to improve our data infrastructure and capabilities

     

    Skills Required

    • 4+ years of experience as a Data Engineer, ETL Developer, or similar role
    • Experience with Azure Synapse Analytics
    • Strong knowledge of SQL and Spark SQL
    • Understanding of dimensional data modeling concepts
    • Understanding of streaming data ingestion processes
    • Ability to develop & manage Apache Spark data processing applications using PySpark on Databricks
    • Experience with version control (e.g., Git), DevOps, and CI/CD
    • Experience with Python
    • Experience with Microsoft data platform, Microsoft Azure stack, and Databricks
    • Experience in Marketing, Retail, or E-commerce is a plus.

    Soft Skills:
    • Strong problem-solving skills and the ability to work independently as well as part of a team.
    • Excellent communication skills, with the ability to translate technical concepts into business-friendly language.
    • Detail-oriented with a commitment to delivering high-quality, reliable data solutions.

  • Β· 19 views Β· 0 applications Β· 1d

    Data Analyst / Data Engineer

    Hybrid Remote Β· Ukraine Β· Product Β· 2 years of experience Β· English - B2


    About the role:

    We are looking for an experienced Data Analyst / Engineer (Data Analyst+) to work in our Kyiv office as part of Product Management (Lab & Studios). In this role, your primary mission is to enhance the speed and quality of analytical insights, turning data into clear answers and recommendations that support informed product and business decisions.

    This position is analytics-first: you will focus on metrics, in-depth analysis, data validation, and delivering stakeholder-facing insights. You will collaborate closely with data engineering, product management, R&D, and BI colleagues when changes in pipelines, models, or data structures are needed.

     

     

    Areas of responsibility:

    • Drive analytical work with Product Management and Data Engineering (Lab & Studios): clarify the question, define metrics, select the right approach, and deliver decision-ready insights.
    • Perform deep-dive analyses on product usage and operational performance (e.g., segmentation, trends, anomaly/root-cause analysis, adoption and behavioral patterns).
    • Build and maintain dashboards and visualizations, primarily using Databricks Dashboards, to support fast and reliable decision-making.
    • Create data extracts and curated analytical datasets to enable ad-hoc analysis and stakeholder self-service.
    • Define, align, and maintain metric definitions (single source of truth): assumptions, logic, filters, limitations, and interpretation guidelines.
    • Sanity check and validate usage and business data; identify inconsistencies, gaps, and tracking issues and follow through until resolved.
    • Define and adapt data collection strategies across our products together with relevant stakeholders in R&D and Product (ensure we measure the right things consistently).
    • Create and maintain relevant reports and dashboards together with Product Management (in alignment with existing reporting standards and best practices).
    • Apply advanced analytical techniques, including basic data science approaches, forecasting, or AI-assisted analysis, where relevant to generate deeper insights.
    • Enable faster self-service for stakeholders by documenting datasets/metrics and recommending the right views for recurring questions.
    • Collaborate with Data Engineers / BI Developers on data availability and reliability (request and specify needed transformations rather than owning heavy infrastructure).
    • Align with key stakeholders in R&D Management, Development Teams, Product Management, Finance, BI team, and corporate IT.


    To succeed in your role, you should possess the following skills and competencies:

    • Strong analytical mindset with a bias for clarity and actionable conclusions
    • Ability to apply critical sense to analysis results (challenge assumptions; quantify uncertainty)
    • Strong stakeholder management skills and the ability to translate business questions into analytical tasks
    • Excellent communication and data storytelling skills
    • Detail-oriented approach and high standards for data accuracy
    • Ability to prioritize in a fast-changing environment and deliver with short lead times

    You will be able to utilize the following experience:

    • Strong SQL skills (data extraction, validation, consistency checks, performance-aware querying)
    • Proven experience delivering insights from product / usage / operational data (not only reporting, but interpretation and recommendations)
    • Hands-on experience building dashboards and visualizations, preferably with Databricks Dashboards
    • Solid understanding of data quality practices (validation, reconciliation, monitoring)
    • Experience with Power BI (a strong plus)
    • Python for data analysis (a strong plus)
    • Experience or interest in data science techniques, forecasting, or applying AI/ML models to analytical insights (nice to have)
    • Familiarity with data modeling concepts (e.g., dimensional thinking, metric layers) is a plus
    • Basic understanding of ETL/ELT and modern data platforms (e.g., Databricks) is a plus - infrastructure ownership is not the focus
    • Knowledge of OLTP/OLAP (online transaction/analytical processing)
    • Programming experience and version control (Git) are a plus
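As an illustration of the SQL validation and reconciliation work this role centers on, here is a minimal sketch using Python's built-in sqlite3; the table and column names are invented, and real checks would run against the warehouse (e.g. Databricks SQL) rather than an in-memory database:

```python
# Hypothetical data-quality sketch: reconcile a curated table against
# its raw source. Table/column names are illustrative only.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE raw_events (user_id INT, amount REAL);
    CREATE TABLE curated_events (user_id INT, amount REAL);
    INSERT INTO raw_events VALUES (1, 10.0), (2, 5.0), (2, 5.0);
    INSERT INTO curated_events VALUES (1, 10.0), (2, 10.0);
""")

# Check 1: totals must reconcile between the raw and curated layers.
raw_total, = con.execute("SELECT SUM(amount) FROM raw_events").fetchone()
cur_total, = con.execute("SELECT SUM(amount) FROM curated_events").fetchone()
print(raw_total == cur_total)  # True: 20.0 on both sides

# Check 2: no user present in raw may be missing from curated.
missing = con.execute("""
    SELECT COUNT(*) FROM raw_events r
    WHERE NOT EXISTS (SELECT 1 FROM curated_events c
                      WHERE c.user_id = r.user_id)
""").fetchone()[0]
print(missing)  # 0
```

Checks like these are what "sanity check and validate usage and business data" looks like in practice: cheap queries that fail loudly before a stakeholder sees a wrong number.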

     

     

    Being part of us means:

    • Meaningful work that helps to change the future of dentistry
    • Work in a unique professional, friendly and supportive environment
    • Constant professional growth and development
    • A healthy work-life balance
    • Comprehensive benefits, incl. 24 working days of annual vacation, medical insurance, paid sick leave and child sick leave, maternity and paternity leave, etc.
    • Breakfasts and lunches in the office
    • Good working conditions in a comfortable office in UNIT.City
    • A parking lot with free spaces for employees
    • Occasional business trips to Western Europe
    • Opportunity to become a part of the success that 3Shape has created over the past 25 years.
       
  • Β· 57 views Β· 13 applications Β· 1d

    Lead AI Consultant

    Full Remote Β· Countries of Europe or Ukraine Β· 5 years of experience Β· English - B2

    About the Role

    We are expanding our AI practice and are looking for a hands-on Lead AI Consultant who can step into real business contexts, understand how companies operate, and design practical AI Agent–based solutions that automate processes, reduce manual work, and improve efficiency.


    We are looking for a consultant: someone who designs, builds, evaluates, explains, and delivers.

    You will work across internal initiatives and client projects, collaborating directly with business stakeholders, department heads, and technical teams. 

     

    Key Responsibilities

    AI Agents & Automation

    • Design and implement AI agents that act as technical/operational assistants (process execution, monitoring, decision support).
    • Translate business problems into practical AI use cases using LLMs, APIs, and orchestration frameworks.
    • Build end-to-end automation flows integrating AI agents with enterprise systems.

       

    Business & Consulting Work

    • Engage directly with business stakeholders and department heads to understand workflows, constraints, and priorities.
    • Evaluate client businesses and map current IT and process landscapes.
    • Propose realistic AI-driven improvements aligned with business goals, security requirements, and operational constraints.
    • Support clients in adopting AI solutions, including explaining how systems work and how teams should interact with them.

       

    Technical Design & Security

    • Design AI solutions with a strong focus on security, compliance, and data protection, especially for enterprise and regulated environments.
    • Work with APIs, system integrations, and data pipelines to ensure stable and scalable solutions.
    • Make architectural decisions balancing speed, reliability, and long-term maintainability.

       

    Research, R&D & Enablement

    • Continuously research new approaches in LLMs, AI agents, orchestration frameworks, and automation tools.
    • Participate in internal R&D initiatives and help shape internal best practices.
    • Contribute to knowledge-sharing and educational activities (internal sessions, client workshops, presentations).
    • Support product teams with AI expertise when needed.

       

    Pre-Sales & Client Interaction

    • Participate in pre-sales activities: technical discovery, solution design, estimations, and presentations.
    • Help shape proposals and technical narratives for clients.
    • Represent the company in client meetings as a trusted technical and business advisor.

       

    Requirements

    • Strong technical background (engineering, architecture, or senior technical consulting).
    • Practical experience working with LLMs, APIs, and automation systems.
    • Hands-on involvement in client-facing roles (consulting, solution delivery, pre-sales).
    • Experience evaluating business processes and proposing technical solutions.
    • Solid understanding of:
      • Generative AI & LLMs
      • AI Agents and orchestration concepts
      • APIs and system integrations
      • Business process automation
    • Ability to quickly understand how a business operates and turn that understanding into working AI solutions.
    • Strong communication skills: able to explain complex technical topics to non-technical audiences.
    • Comfortable working in environments with rapid change and evolving requirements.
    • Curious, pragmatic, and solution-oriented.
    • Fluent English is required; additional languages (e.g., Arabic, Italian, German, French) are a plus.

       

    Education:

    • Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field.
    • MBA or equivalent business leadership training preferred.