Jobs Data & Analytics

  • 33 views · 2 applications · 10d

    Junior+ Database Administrator

    Full Remote · EU · Product · 3 years of experience · English - B1

    We are seeking an experienced Database Administrator (DBA) with strong expertise in database management and AWS cloud infrastructure.

    Requirements
    Key Responsibilities:

    • Administer and optimize databases such as Amazon RDS (PostgreSQL, MariaDB), MongoDB, and ClickHouse
    • Monitor database performance, availability, and capacity; proactively tune and optimize systems
    • Ensure database security, encryption, access control, and compliance with best practices
    • Collaborate with application and DevOps teams to support cloud-native architectures
    • Troubleshoot database and infrastructure issues in production environments
    • Automate routine DBA tasks using scripts and cloud-native tools
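The automation bullet above can be sketched in a few lines. This is an illustrative sketch only: the instance names and metric values are hand-made sample data, and in a real setup they would come from CloudWatch or Performance Insights (e.g. via boto3) rather than a hardcoded dict.

```python
# Hedged sketch of a routine DBA health check. All data below is invented
# for illustration; production code would pull metrics from CloudWatch.

CPU_THRESHOLD = 80.0      # percent
STORAGE_THRESHOLD = 90.0  # percent of allocated storage used

def flag_unhealthy(metrics):
    """Return instance names breaching the CPU or storage thresholds."""
    flagged = []
    for name, m in metrics.items():
        if m["cpu_percent"] > CPU_THRESHOLD or m["storage_percent"] > STORAGE_THRESHOLD:
            flagged.append(name)
    return sorted(flagged)

# Hypothetical snapshot of three database instances.
sample = {
    "rds-postgres-prod": {"cpu_percent": 91.5, "storage_percent": 40.0},
    "rds-mariadb-prod":  {"cpu_percent": 35.0, "storage_percent": 95.2},
    "mongodb-analytics": {"cpu_percent": 22.1, "storage_percent": 60.0},
}

print(flag_unhealthy(sample))  # -> ['rds-mariadb-prod', 'rds-postgres-prod']
```

A real job would run this on a schedule (cron, Lambda, or EventBridge) and page or auto-remediate instead of printing.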


    Required Skills & Qualifications:

    • Strong hands-on experience with cloud databases on AWS
    • Proficiency in one or more databases (PostgreSQL, MariaDB, MongoDB, ClickHouse, etc.)
    • Strong knowledge of database performance tuning, indexing, and query optimization
    • Familiarity with IAM, VPC, security groups, and cloud security best practices
    • Experience with monitoring tools such as CloudWatch, Performance Insights, or similar
    • Strong troubleshooting and problem-solving skills
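As a quick illustration of the indexing and query-optimization skills listed above, here is a minimal sketch using Python's built-in SQLite driver (the listing's engines are PostgreSQL/MariaDB/ClickHouse, but the plan-inspection idea carries over); the table and index names are invented.

```python
# Show how adding an index changes a query plan, using SQLite as a stand-in.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql):
    """Return SQLite's query-plan detail text for a statement."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT total FROM orders WHERE customer_id = 42"
print(plan(q))   # before the index: a full table scan (typically "SCAN orders")
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
print(plan(q))   # after: an index search mentioning idx_orders_customer
```

The same workflow (run EXPLAIN, add or adjust an index, re-check the plan) applies to `EXPLAIN ANALYZE` in PostgreSQL and `EXPLAIN` in MariaDB.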


    Nice to Have:

    • Experience with Terraform/Terragrunt
    • Experience with data migration tools like AWS DMS

     

    What we offer:
     

    Rewards & Celebrations

    • Quarterly Bonus System
    • Team Building Compensation
    • Memorable Days Financial Benefit

     

    Learning & Development

    • Annual fixed budget for personal learning 
    • English Language Courses Compensation

     

    Time Off & Leave

    • Paid Annual Leave (Vacation) - 24 working days
    • Sick leave - unlimited number of days, fully covered

     

    Wellbeing Support

    • Mental Health Support (Therapy Compensation)
    • Holiday Helper Service

     

    Workplace Tools & Assistance

    • Laptop provided by Company (after probation)

     

    Work conditions:

    • Remote work from EU
    • Flexible 8-hour workday, typically between 9:00 - 18:00 CET
    • Five working days, Monday to Friday
    • Public holidays observed according to Ukrainian legislation
    • Business trips to Bratislava every 3-6 months (expenses covered by the company)

    At Ixilix, we value transparency, trust, and ownership. We believe that great results come from people who care - about their work, their team, and the impact they create. 

    Sounds like you? Let’s connect! We’re just one click away.

  • 16 views · 1 application · 10d

    Middle Database Administrator

    Full Remote · EU · Product · 3 years of experience · English - B1

    We are seeking an experienced Database Administrator (DBA) with strong expertise in database management and AWS cloud infrastructure.

    Requirements
    Key Responsibilities:

    • Administer and optimize databases such as Amazon RDS (PostgreSQL, MariaDB), MongoDB, and ClickHouse
    • Monitor database performance, availability, and capacity; proactively tune and optimize systems
    • Ensure database security, encryption, access control, and compliance with best practices
    • Collaborate with application and DevOps teams to support cloud-native architectures
    • Troubleshoot database and infrastructure issues in production environments
    • Automate routine DBA tasks using scripts and cloud-native tools


    Required Skills & Qualifications:

    • Strong hands-on experience with cloud databases on AWS
    • Proficiency in one or more databases (PostgreSQL, MariaDB, MongoDB, ClickHouse, etc.)
    • Strong knowledge of database performance tuning, indexing, and query optimization
    • Familiarity with IAM, VPC, security groups, and cloud security best practices
    • Experience with monitoring tools such as CloudWatch, Performance Insights, or similar
    • Strong troubleshooting and problem-solving skills


    Nice to Have:

    • Experience with Terraform/Terragrunt
    • Experience with data migration tools like AWS DMS

     

    What we offer:
     

    Rewards & Celebrations

    • Quarterly Bonus System
    • Team Building Compensation
    • Memorable Days Financial Benefit

     

    Learning & Development

    • Annual fixed budget for personal learning 
    • English Language Courses Compensation

     

    Time Off & Leave

    • Paid Annual Leave (Vacation) - 24 working days
    • Sick leave - unlimited number of days, fully covered

     

    Wellbeing Support

    • Mental Health Support (Therapy Compensation)
    • Holiday Helper Service

     

    Workplace Tools & Assistance

    • Laptop provided by Company (after probation)

     

    Work conditions:

    • Remote work from EU
    • Flexible 8-hour workday, typically between 9:00 - 18:00 CET
    • Five working days, Monday to Friday
    • Public holidays observed according to Ukrainian legislation
    • Business trips to Bratislava every 3-6 months (expenses covered by the company)

    At Ixilix, we value transparency, trust, and ownership. We believe that great results come from people who care - about their work, their team, and the impact they create. 

    Sounds like you? Let’s connect! We’re just one click away.

  • 72 views · 17 applications · 10d

    Senior AI Engineer

    Full Remote · Worldwide · Product · 5 years of experience · English - C1

    We are seeking a highly skilled Senior AI Engineer to design, integrate, and optimize AI-driven solutions across our platform. This role will focus on integrating existing AI tools and services, enhancing data processing and analysis workflows with AI, and laying the technical foundation for advanced AI capabilities.

     

    You will work closely with backend, frontend, and product teams, and over time, grow into an AI Lead position, shaping the long-term AI strategy and architecture.

    Key Responsibilities

    AI Integration & Architecture

    • Integrate and orchestrate third-party and in-house AI/ML tools (e.g., LLMs, embeddings, vector databases, AI APIs).
    • Design scalable AI-powered services that integrate seamlessly with existing systems.
    • Evaluate new AI technologies and tools, and recommend adoption based on business impact.
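The embeddings and vector-database work mentioned above boils down to a nearest-neighbor search over vectors. A toy sketch of that pattern follows; the three-dimensional vectors and document names are hand-made stand-ins, whereas a real system would get high-dimensional embeddings from a model and store them in a vector database.

```python
# Toy cosine-similarity search, the core pattern behind embedding retrieval.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical document "embeddings" (real ones have hundreds of dimensions).
store = {
    "refund policy":    [0.9, 0.1, 0.0],
    "shipping times":   [0.1, 0.9, 0.1],
    "account deletion": [0.0, 0.2, 0.9],
}

def nearest(query_vec, k=1):
    """Return the k documents most similar to the query vector."""
    ranked = sorted(store, key=lambda doc: cosine(query_vec, store[doc]), reverse=True)
    return ranked[:k]

print(nearest([0.85, 0.15, 0.05]))  # -> ['refund policy']
```

In a RAG setup, the retrieved documents would then be passed to an LLM as context.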

    Data Processing & AI Optimization

    • Optimize data pipelines for AI-driven processing, analysis, and automation.
    • Apply AI techniques to improve data quality, insights generation, and operational efficiency.
    • Collaborate with data and product teams to translate business needs into AI solutions.

    Backend & Frontend Collaboration

    • Build and maintain AI-enabled backend services using Python, AWS, and RESTful APIs.
    • Expose AI capabilities via well-structured APIs for frontend consumption.
    • Work with frontend engineers to integrate AI features into React applications.

    Scalability, Reliability & Security

    • Ensure AI solutions are production-ready, scalable, cost-efficient, and secure.
    • Monitor performance, latency, and accuracy of AI systems in production.
    • Implement best practices for model versioning, observability, and responsible AI use.

    Leadership & Growth

    • Act as a technical mentor for engineers working with AI-related features.
    • Contribute to defining AI standards, best practices, and long-term roadmap.
    • Gradually take ownership of AI architecture and strategy, growing into an AI Lead role.

     

     

    Required Qualifications

    • 5+ years of relevant experience in software engineering, machine learning, or AI-focused roles.
    • Strong proficiency in Python, with experience building production backend systems.
    • Hands-on experience with AI/ML integration, including LLMs, ML APIs, or AI platforms.
    • Experience with AWS (e.g., EC2, Lambda, S3, SageMaker, or similar services).
    • Solid understanding of RESTful API design and microservice architectures.
    • Experience collaborating with frontend teams using React or similar frameworks.
    • Strong knowledge of data processing, analysis pipelines, and performance optimization.
    • Ability to design systems that balance experimentation with production reliability.

    Nice to Have

    • Experience with MLOps, model deployment, and monitoring.
    • Familiarity with vector databases, retrieval-augmented generation (RAG), or agent-based systems.
    • Experience optimizing AI cost, latency, and inference performance.
    • Prior experience mentoring engineers or leading technical initiatives.
    • Exposure to AI ethics, governance, and compliance considerations.

    What We Offer

    • Opportunity to build and scale AI capabilities in a real-world production environment.
    • Clear growth path toward an AI Lead position with strategic ownership.
    • High level of autonomy and technical influence.
    • Collaborative environment working across backend, frontend, and product teams.
    • Competitive compensation and benefits, commensurate with experience.
  • 52 views · 1 application · 10d

    Solutions Data Analyst

    Full Remote · Countries of Europe or Ukraine · 5 years of experience · English - B2

    On behalf of our client, we are looking for a Solutions Data Analyst

     

    Responsibilities:

     

    - Collect, analyze, and interpret data from Salesforce, Looker, BigQuery, and other business systems.

    - Design and maintain the analytics layer in Google BigQuery and Looker (LookML), building refined tables and semantic models.

    - Build and maintain dashboards, reports, and visualizations for business stakeholders.

    - Present structured insights to both technical and non-technical audiences.

    - Identify gaps in Salesforce data capture and implement light configuration changes (fields, validation rules, flows, reporting enhancements).

    - Collaborate with RevOps, Sales, Marketing, Engineering, and other teams to translate business needs into reporting solutions.

    - Monitor data quality, identify anomalies, and ensure ongoing data integrity.

    - Ensure compliance with data privacy and security standards (ISO 27001, Cyber Essentials).

    - Recommend system and process improvements to enhance business intelligence and decision-making.
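The "analytics layer" and "refined tables" responsibilities above follow a common pattern: raw records rolled up into a summary table that a BI tool reads. A minimal sketch, with SQLite standing in for BigQuery and all table and column names invented:

```python
# Build a "refined" aggregate table from raw records, BI-layer style.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_opportunities (id INTEGER, stage TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_opportunities VALUES (?, ?, ?)", [
    (1, "Prospecting", 1000.0),
    (2, "Closed Won",  5000.0),
    (3, "Closed Won",  2500.0),
])

# The refined table a BI tool (Looker, Power BI, ...) would query.
conn.execute("""
    CREATE TABLE rpt_pipeline_by_stage AS
    SELECT stage, COUNT(*) AS deals, SUM(amount) AS total_amount
    FROM raw_opportunities
    GROUP BY stage
""")

for row in conn.execute(
        "SELECT stage, deals, total_amount FROM rpt_pipeline_by_stage ORDER BY stage"):
    print(row)
```

In BigQuery the same rollup would typically be a scheduled query or a dbt/LookML-defined model rather than a `CREATE TABLE AS` run by hand.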

     

    Requirements:

     

    - 5+ years of experience as a Data Analyst.

    - Strong SQL skills and experience with BI tools (Looker preferred; Power BI, Tableau as alternatives).

    - Hands-on experience working with Salesforce, including reporting and dashboards.

    - Understanding of Salesforce data structures and their impact on reporting.

    - Experience working with Google BigQuery.

    - Strong analytical and problem-solving skills.

    - Good understanding of data governance, privacy, and security principles.

    - Excellent communication and presentation skills.

    - Experience collaborating with cross-functional teams.

    - Salesforce Administrator certification is a plus.

    - Upper-Intermediate or higher level of English.

     

    Company offers:

     

    - Long-term employment with possibilities for professional growth

    - Fully remote work

    - Reasonably flexible schedule

    - 15 days of paid vacation

    - Regular performance reviews

  • 71 views · 9 applications · 10d

    Freelance Business Analyst

    Part-time · Full Remote · Ukraine · 2 years of experience · English - B2

    At Embrox Solutions, we bring ideas to life. We’re not just about writing code — we create complete hardware and software products that make a difference. By combining software development with electrical and mechanical engineering, we deliver the full new product development cycle (NPD), turning even the boldest ideas into reality.

    We’re a team of passionate innovators who value reliability, speed, and flexibility in everything we do. If you want to grow, create, and be part of a friendly, forward-thinking team, Embrox is the place for you!

    About the Role:
    We’re looking for a detail-oriented Business Analyst to join our team on a freelance basis. You’ll work closely with Product Managers, Designers, and Developers to transform business needs into clear, structured requirements and ensure smooth communication between all parties involved.

    Responsibilities:

    • Gather, analyze, and document business and functional requirements
    • Create user stories, acceptance criteria, and process flows
    • Collaborate with stakeholders to clarify goals and priorities
    • Support the development team during implementation and testing
    • Participate in backlog grooming and sprint planning
    • Contribute to product improvement ideas based on data and user feedback

    Requirements:

    • 2+ years of experience as a Business Analyst in IT (outsourcing or product environment)
    • Strong understanding of SDLC, Agile/Scrum methodology
    • Experience working with tools like Jira, Confluence, Figma, or similar
    • Excellent analytical and communication skills
    • Upper-Intermediate+ English level (both written and spoken)
    • Ability to work independently and manage priorities in a remote setup

    Nice to have:

    • Experience with UX/UI teams or pre-sale processes
    • Knowledge of SQL or data analysis tools
    • Previous freelance or project-based experience

    We offer:

    • Flexible workload (10–20 hours/week, with potential to expand)
    • Remote collaboration with a friendly and result-driven team
    • Fair hourly rate and timely payments
    • Involvement in diverse international projects.
  • 20 views · 1 application · 10d

    Senior Product Analyst

    Office Work · Poland · Product · 5 years of experience · English - C1

    A leading international product company in the live-streaming industry is looking for a Senior Product Analyst in Warsaw.

     

    The platform has over 450 million registered users and enables creators worldwide to connect with their audiences and monetize their talents. The team consists of 350+ global professionals.

     

    They offer:

    • Stock options
    • Medical insurance (100% for employees and 75% for family members)
    • Office lunches
    • Parking
    • Multisport card

     

    Requirements:

    • Experience as a Data Analyst / Product Analyst / Game Analyst
    • Experience with BI tools (Looker, Tableau, Power BI, etc.)
    • SQL experience
    • Experience with cloud platforms
    • B2C experience
    • Mobile analytics

     

    Location: Warsaw (office-based).

     

  • 36 views · 3 applications · 10d

    Senior BI Engineer (Power BI and DevExpress)

    Full Remote · Poland, Ukraine · 5 years of experience · English - B2

    Description

    You’ll be working on the client-side team, which focuses on data engineering, business analysis, and reporting.

    The team works in the Scrum framework.

    You’ll be close to the people who make business decisions, so the value the team provides is significant and highly appreciated.

    The client is located in Ohio, USA.

    Working hours are:

    • Warsaw: till 6pm
    • Kyiv: till 7pm


    Interview stages:

    • Internal tech interview
    • Interview with project PM
    • Client’s interview with team representative

     

    Client Information

    People
    1. USA: owner, top management, PM, PO, Arch, 4 full SCRUM teams.
    2. Ukraine: PM, PPOs (one per SCRUM team), Arch, 4 SCRUM teams, DevOps, RM, QA Leads (MQA & AQA). The GL team counts 80 people in total and is growing.

    Product.
    We developed a POS solution that supports attended (employee-operated) and unattended (customer self-service) tunnel car washing. There are two labs:
    1. One in USA (Ohio)
    2. One in Ukraine (Lviv)

    Both support the US and Canadian markets. One hardware set includes several points of sale, a set of different sensors, readers and cameras, payment terminals, servers, and automatic gates.
    The Ukrainian team develops a PWA application and a hybrid web application (.NET & Angular) that operates on hardware & cloud and supports the full customer flow: from customer recognition or account creation to sale & directing their car to a tunnel. All embedded development is on the client’s side. Some teams have hardware dependencies based on the areas of the application they work with, meaning office presence is necessary.

    Flow.
    SCRUM, 2-week sprints, releases every 3 weeks. Sprints include mandatory daily stand-ups, grooming, planning, demos, and retro sessions
     

    Requirements:

    • Strong Power BI knowledge and practical experience;
    • Strong DevExpress knowledge and practical experience;
    • Good .NET knowledge and practical experience;

     

    • Able to work with the customer’s stakeholders to understand business and technical requirements;
    • SQL skills are a must to produce efficient reports;
    • Strong decision-making, problem-solving, and analytical skills;
    • Experience with ETL tools and relational/non-relational databases is an advantage;
    • Upper-Intermediate English level; Advanced is preferred

     

    • Experience with UI design principles is a plus;
    • Creativity and ability to think outside the box while defining sound and practical solutions;
    • Experience with complex data modeling is an advantage;
    • Analytical background is an advantage;
    • Knowledge of Agile processes (Scrum preferred);

    Job responsibilities

    • Develop Power BI & DevExpress Reports;
    • Work in Agile environment
    • Collaborate with the Client and the team members on how to create visualizations, reporting, and analytical dashboards/reports in Power BI & DevExpress Reports as we transform their data into a modern landscape;
    • Get requirements on reports from BA, PO and client;
    • Show regular progress of report development;
    • Educate the client and the team on Power BI capabilities, report creation, and analytics best practices;
    • Meet deadlines agreed upon with the client on each piece of report;
    • Keep learning leading-edge Power BI & DevExpress capabilities and incorporate modern approaches into the Empower product and business decisions.
  • 35 views · 1 application · 10d

    Big Data Engineer to $8000

    Full Remote · Bulgaria, Poland, Romania · 6 years of experience · English - B2

    Who We Are:

    Adaptiq is a technology hub specializing in building, scaling, and supporting R&D teams for high-end, fast-growing product companies in a wide range of industries.  

     

    About the Product:

    The product is an enterprise-grade digital experience platform that provides real-time visibility into system performance, application stability, and end-user experience across on-premises, virtual, and cloud environments. It ingests large volumes of telemetry from distributed agents on employee devices and infrastructure, processes and enriches data through streaming pipelines, detects anomalies, and stores analytical data for monitoring and reporting. The platform serves a global customer base with high throughput and strict requirements for security, correctness, and availability. Rapid adoption has driven significant year-over-year growth and demand from large, distributed teams seeking to secure and stabilize digital environments without added complexity.

     

    About the Role:

    This is a true Big Data engineering role focused on designing and building real-time data pipelines that operate at scale in production environments serving real customers. You will join a senior, cross-functional platform team responsible for the end-to-end data flow: ingestion, processing, enrichment, anomaly detection, and storage. You will own both architecture and delivery, collaborating with Product Managers to translate requirements into robust, scalable solutions and defining guardrails for data usage, cost control, and tenant isolation. The platform is evolving from distributed, product-specific flows to a centralized, multi-region, highly observable system designed for rapid growth, advanced analytics, and future AI-driven capabilities. Strong ownership, deep technical expertise, and a clean-code mindset are essential.

     

    Key Responsibilities: 

    • Design, build, and maintain high-throughput, low-latency data pipelines handling large volumes of telemetry.
    • Develop real-time streaming solutions using Kafka and modern stream-processing frameworks (Flink, Spark, Beam, etc.).
    • Contribute to the architecture and evolution of a large-scale, distributed, multi-region data platform.
    • Ensure data reliability, fault tolerance, observability, and performance in production environments.
    • Collaborate with Product Managers to define requirements and translate them into scalable, safe technical solutions.
    • Define and enforce guardrails for data usage, cost optimization, and tenant isolation within a shared platform.
    • Participate actively in system monitoring, troubleshooting incidents, and optimizing pipeline performance.
    • Own end-to-end delivery: design, implementation, testing, deployment, and monitoring of data platform components.
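The anomaly-detection responsibility above can be illustrated with a rolling z-score over a stream of telemetry values. This is a toy sketch of the idea only: a production pipeline would run an equivalent operator in Flink, Spark, or Kafka Streams, and the telemetry values below are invented.

```python
# Rolling z-score anomaly detection over a stream of values.
from collections import deque
import statistics

def anomalies(stream, window=5, threshold=3.0):
    """Yield values deviating more than `threshold` std-devs from the trailing window."""
    recent = deque(maxlen=window)
    for value in stream:
        if len(recent) == window:
            mean = statistics.mean(recent)
            stdev = statistics.pstdev(recent)
            if stdev > 0 and abs(value - mean) / stdev > threshold:
                yield value
        recent.append(value)

# Hypothetical telemetry: steady values around 10 with one spike.
telemetry = [10, 11, 9, 10, 10, 11, 95, 10, 9]
print(list(anomalies(telemetry)))  # -> [95]
```

The streaming framing matters: the detector sees each value once with bounded state, which is what makes the same logic viable at high throughput.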

     

    Required Competence and Skills:

    • 5+ years of hands-on experience in Big Data or large-scale data engineering roles.
    • Strong programming skills in Java or Python, with willingness to adopt Java and frameworks like Vert.x or Spring.
    • Proven track record of building and operating production-grade data pipelines at scale.
    • Solid knowledge of streaming technologies such as Kafka, Kafka Streams, Flink, Spark, or Apache Beam.
    • Experience with cloud platforms (AWS, Azure, or GCP) and designing distributed, multi-region systems.
    • Deep understanding of production concerns: availability, data loss prevention, latency, and observability.
    • Hands-on experience with data stores such as ClickHouse, PostgreSQL, MySQL, Redis, or equivalents.
    • Strong system design skills, able to reason about trade-offs, scalability challenges, and cost efficiency.
    • Clean code mindset, solid OOP principles, and familiarity with design patterns.
    • Experience with AI-first development tools (e.g., GitHub Copilot, Cursor) is a plus.

     

    Nice to have:

    • Experience designing and operating globally distributed, multi-region data platforms.
    • Background in real-time analytics, enrichment, or anomaly detection pipelines.
    • Exposure to cost-aware data architectures and usage guardrails.
    • Experience in platform or infrastructure teams serving multiple products.

     

    Why Us?

    We provide 20 days of vacation leave per calendar year (plus official national holidays of the country you are based in).

    We provide full accounting and legal support in all countries in which we operate.

    We utilize a fully remote work model with a powerful workstation and co-working space in case you need it.

    We offer a highly competitive package with yearly performance and compensation reviews.

  • 30 views · 3 applications · 10d

    Senior Data Engineer (Python + AWS)

    Full Remote · Ukraine · 4 years of experience · English - B2

    Description

    We are a global audience and location intelligence company that helps marketers connect the digital and physical world. We provide data-driven solutions to enhance marketing campaigns by leveraging location and audience data to reveal consumer behavior and enable more precise targeting and measurement. We work on high-end, high-performance, high-throughput systems for real-time analysis of data for autonomous driving and other big data applications, e.g. e-commerce.

     

    Requirements

    • You have 4+ years of experience in a similar position.
    • You have significant experience with Python. Familiarity with Java or Scala is a plus.
    • Hands-on experience building scalable solutions in AWS.
    • Proficiency in NoSQL and SQL databases and in high-throughput data-related architecture and technologies (e.g. Kafka, Spark, Hadoop, MongoDB, AWS Batch, AWS Glue, Athena, Airflow, dbt).
    • Excellent SQL and data transformation skills.
    • Excellent written and verbal communication skills with an ability to simplify complex technical information.
    • Experience guiding and mentoring junior team members in a collaborative environment.

     

    Job responsibilities

    • Work in a self-organised agile team with a high level of autonomy, and you will actively shape your team’s culture.
    • Design, build, and standardise privacy-first big data architectures, large-scale data pipelines, and advanced analytics solutions in AWS.
    • Develop complex integrations with third-party partners, transferring terabytes of data.
    • Align with other Data experts on data (analytics) engineering best practices and standards, and introduce those standards and data engineering expertise to the team in order to enhance existing data pipelines and build new ones.
    • Successfully partner up with the Product team to constantly develop further and improve our platform features.
  • 140 views · 7 applications · 10d

    Data Entry Analyst

    Office Work · Ukraine (Poltava) · English - B1

    We are looking for a Car Data Specialist with a strong interest in automobiles and a high level of attention to detail. The role involves working with large volumes of automotive data, performing repetitive tasks, and ensuring data accuracy and consistency. This position is ideal for someone who is analytical, organized, stress-resistant, and motivated to work with automotive information on a daily basis.

    Key Responsibilities

    • Collect, analyze, filter, and update large volumes of automotive data
    • Perform repetitive data-processing tasks while maintaining high accuracy
    • Validate, structure, and maintain car-related information
    • Detect inconsistencies or errors in data and correct them
    • Follow internal processes, quality standards, and deadlines
    • Collaborate with team members to achieve individual and team goals

    Required Skills & Qualifications

    • Good knowledge and understanding of cars (brands, models, specifications, automotive terminology)
    • Ability to work efficiently with repetitive tasks
    • Strong analytical skills and attention to detail
    • Ability to process and manage large amounts of information
    • Stress resistance and ability to stay focused under workload
    • Strong organizational skills
    • Good manners and professional attitude
    • Punctuality and reliability

    Nice to Have (Will Be a Plus)

    • Previous experience in data management, data analytics, or data science
    • Experience working with databases or large datasets
    • Background in the automotive industry

    Working Conditions & Benefits

    • 5-day working week, 8 hours per day
    • Office-based position (no remote work)
    • Possibility of paid overtime in the future
    • Cozy and fully equipped office with all necessary facilities
    • Office in the City center
    • Health insurance for employees
    • Additional health-related bonuses
    • 24 paid vacation days per year
    • Friendly, supportive, and professional team environment

    Why Join Us?

    We believe in teamwork, mutual support, and continuous growth. Together, we set ambitious goals and work consistently to achieve them. If you are passionate about cars, value accuracy, and want to be part of a growing international company — we would be happy to meet you.

    Don’t hesitate to apply for the position!

  • 150 views · 14 applications · 10d

    Junior Data Engineer (Python)

    Full Remote · Ukraine · 1 year of experience · English - B2

    Description

    Customer is one of the biggest companies on the market of home entertainment consumer electronics devices that strives to provide their clients with high-quality products and services.

    This position collaborates with a geographically diverse team to develop, deliver, and maintain systems for digital subscription and transactional products across the Customer’s SVOD portfolio.

     

    Requirements

    – 1+ years of intermediate to advanced SQL

    – 1+ years of Python development (intermediate level is fine: Pandas, NumPy, boto3, seaborn, requests, unittest)

    – Experience building ETLs

    – Experience with data tools (ex.: Airflow, Grafana, AWS Glue, AWS Athena)

    – Excellent understanding of database design

    – Cloud experience (AWS S3, Lambda, or alternatives)

    – Agile SDLC knowledge
    – Detail oriented
    – Data-focused
    – Strong verbal/written communication and data presentation skills, including an ability to effectively communicate with both business and technical teams
    – An ability and interest in working in a fast-paced and rapidly changing environment
    – Be self-driven and show ability to deliver on ambiguous projects with incomplete or dirty data
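The "experience building ETLs" requirement above can be sketched in its smallest form: extract rows, transform them (filter and type-cast), and load the result. This toy version uses in-memory CSV strings; the column names and data are invented, and a real pipeline would read from S3/Athena and be orchestrated by something like Airflow.

```python
# Minimal extract-transform-load over CSV data, entirely in memory.
import csv
import io

# Hypothetical raw export of subscription records.
raw = "user_id,plan,monthly_fee\n1,basic,4.99\n2,premium,9.99\n3,basic,4.99\n"

def etl(source_csv):
    """Extract CSV rows, keep premium subscribers with cast fees, load to CSV."""
    reader = csv.DictReader(io.StringIO(source_csv))
    rows = [{"user_id": int(r["user_id"]), "fee": float(r["monthly_fee"])}
            for r in reader if r["plan"] == "premium"]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["user_id", "fee"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

print(etl(raw))  # header line plus the single premium row
```

Swapping the in-memory strings for S3 objects and adding retries, logging, and an Airflow DAG around this skeleton is essentially the production version of the same shape.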

     

    Would be a plus:
    – Understanding of basic SVOD store purchase workflows
    – Background in supporting data scientists in conducting data analysis / modelling to support business decision making

    – Experience in supervising subordinate staff

     

    Job responsibilities

    – Data analysis, auditing, statistical analysis
    – ETL buildouts for data reconciliation
    – Creation of automated audit tools
    – Interactive log auditing to look for potential data problems
    – Help in troubleshooting customer support team cases
    – Troubleshooting and analyzing subscriber reporting issues:
          – Answer management questions related to subscriber count trends
          – App purchase workflow issues
          – Audit/reconcile store subscriptions vs userdb

  • · 32 views · 5 applications · 10d

    Middle Data Engineer (Python)

    Full Remote · Ukraine · 3 years of experience · English - B2

    Description

    Customer is one of the biggest companies on the market of home entertainment consumer electronics devices that strives to provide their clients with high-quality products and services.

    This position collaborates with a geographically diverse team to develop, deliver, and maintain systems for digital subscription and transactional products across the Customer's SVOD portfolio.

     

    Requirements

    – 3+ years of intermediate to advanced SQL

    – 3+ years of Python development (intermediate level is fine: Pandas, NumPy, boto3, seaborn, requests, unittest)

    – Experience building ETLs

    – Experience with data tools (ex.: Airflow, Grafana, AWS Glue, AWS Athena)

    – Excellent understanding of database design

    – Cloud experience (AWS S3, Lambda, or alternatives)

    – Agile SDLC knowledge
    – Detail oriented
    – Data-focused
    – Strong verbal/written communication and data presentation skills, including an ability to effectively communicate with both business and technical teams
    – An ability and interest in working in a fast-paced and rapidly changing environment
    – Self-driven, with the ability to deliver on ambiguous projects with incomplete or dirty data

     

    Would be a plus:
    – Understanding of basic SVOD store purchase workflows
    – Background in supporting data scientists in conducting data analysis / modelling to support business decision making

    – Experience in supervising subordinate staff

     

    Job responsibilities

    – Data analysis, auditing, statistical analysis
    – ETL buildouts for data reconciliation
    – Creation of automated audit tools
    – Interactive log auditing to look for potential data problems
    – Help in troubleshooting customer support team cases
    – Troubleshooting and analyzing subscriber reporting issues:
          – Answer management questions related to subscriber count trends
          – App purchase workflow issues
          – Audit/reconcile store subscriptions vs userdb

  • · 26 views · 1 application · 10d

    Senior Machine Learning Engineer

    Full Remote · Poland, Ukraine · 5 years of experience · English - B2

    A leading mobile marketing and audience platform empowers the app ecosystem with cutting-edge solutions in mobile marketing, audience building, and monetization. With integration into over 500,000 monthly active apps and a global reach, the platform leverages first-party data to deliver impactful and scalable advertising solutions.

    We’re looking for a highly skilled, independent, and driven Machine Learning Engineer to lead the design and development of our next-generation real-time inference services - the core engine powering algorithmic decision-making at scale. This is a rare opportunity to own the system at the heart of our product, serving billions of daily requests across mobile apps, with tight latency and performance constraints.

    You’ll work at the intersection of machine learning, large-scale backend engineering, and business logic, building robust services that blend predictive models with dynamic engineering logic - all while meeting extreme performance and reliability requirements.

    Description:

    • Own and lead the design and development of low-latency Algo inference services handling billions of requests per day.
    • Build and scale robust real-time decision-making engines, integrating ML models with business logic under strict SLAs.
    • Collaborate closely with DS to deploy models seamlessly and reliably in production.
    • Design systems for model versioning, shadowing, and A/B testing at runtime.
    • Ensure high availability, scalability, and observability of production systems.
    • Continuously optimize latency, throughput, and cost-efficiency using modern tooling and techniques.
    • Work independently while interfacing with cross-functional stakeholders from Algo, Infra, Product, Engineering, BA & Business.

    Requirements:

    • B.Sc. or M.Sc. in Computer Science, Software Engineering, or a related technical discipline.
    • 5+ years of experience building high-performance backend or ML inference systems.
    • Deep expertise in Python and experience with low-latency APIs and real-time serving frameworks (e.g., FastAPI, Triton Inference Server, TorchServe, BentoML).
    • Experience with scalable service architecture, message queues (Kafka, Pub/Sub), and async processing.
    • Strong understanding of model deployment practices, online/offline feature parity, and real-time monitoring.
    • Experience in cloud environments (AWS, GCP, or OCI) and container orchestration (Kubernetes).
    • Experience working with in-memory and NoSQL databases (e.g. Aerospike, Redis, Bigtable) to support ultra-fast data access in production-grade ML services.
    • Familiarity with observability stacks (Prometheus, Grafana, OpenTelemetry) and best practices for alerting and diagnostics.
    • A strong sense of ownership and the ability to drive solutions end-to-end.
    • Passion for performance, clean architecture, and impactful systems.

    Why join us?

    • Lead the mission-critical inference engine that drives our core product.
    • Join a high-caliber Algo group solving real-time, large-scale, high-stakes problems.
    • Work on systems where every millisecond matters, and every decision drives real value.
    • Enjoy a fast-paced, collaborative, and empowered culture with full ownership of your domain.

    What we offer:

    • Polish public holidays.
    • 20 working days per year of Non-Operational Allowance, intended for personal recreation and compensated in full. These days must be used within the calendar year, with no rollover.
    • Health Insurance.
    • Gym Subscription (Multisport).
  • · 47 views · 7 applications · 10d

    Strong Junior Business Analyst (BA)

    Office Work · Ukraine (Lviv) · Product · 2 years of experience · English - B1

    Milla Nova – a world-renowned bridal fashion brand – is looking for a Business Analyst for e-commerce projects.

     

    Key Responsibilities:

    1. Analyze business requirements and processes for e-commerce projects (official website, marketplaces, data import/export, internal services).
    2. Implement and support e-commerce solutions, integrate plugins, set up synchronizations and data exchange.
    3. Write and formalize technical requirements for development tasks.
    4. Collaborate with developers and QA teams throughout the project lifecycle.
    5. Test developed software and prepare user guides.
    6. Conduct training sessions and provide user support.

    Candidate Requirements:

    • Experience with e-commerce platforms, especially Shopify.
    • English level: Intermediate or higher.
    • Knowledge of data import/export processes and system integration.
    • Familiarity with BPMN and functional requirements documentation.
    • Understanding of Agile and SDLC.
    • Confident use of MS Office (Word, Excel, PowerPoint, Visio), Jira, Google Docs.
    • Strong analytical skills, attention to detail, responsibility.
    • Teamwork and time-management skills.

     

    We Offer:

    • Competitive salary based on experience and skills.
    • Work in a successful international e-commerce company.
    • Friendly atmosphere and modern office in Lviv.
    • Official employment.
    • Opportunities for professional growth.
  • · 35 views · 4 applications · 10d

    AI/ML Engineer (LLM/ Autonomous Systems Specialist)

    Full Remote · Ukraine · Product · 5 years of experience · English - None

    We are looking for an AI/ML Engineer (LLM and Autonomous Systems Specialist) to help us deliver an ambitious new project - the development and training of intelligent behavioral models for autonomous platforms capable of operating and making decisions without constant network connectivity.

     

    The project lies at the intersection of artificial intelligence, machine learning, simulation, and robotics.

    Its goal is to create a system where AI not only reacts but also understands context and adapts to its environment.

     

    We are a technology-driven FinTech company that has gone far beyond traditional financial services - actively expanding R&D directions related to autonomous systems, model training, and next-generation agent architectures.

     

    If you’re ready to build technologies at the intersection of AI, machine learning, and autonomous systems - welcome to the team!

     

    Responsibilities:

     

    • Train AI models that analyze sensor data and make autonomous decisions (Behavioral Cloning, Reinforcement Learning).
    • Work with virtual simulation environments that create scenarios for reinforcement learning.
    • Configure and optimize algorithms for computer vision and navigation — obstacle detection, route planning, spatial orientation.
    • Transfer trained models from simulation to real devices and adapt them to limited hardware resources.
    • Develop and test behavioral strategies for operation under weak or no connectivity.

     

     

    Requirements:

     

    • Strong experience with Python, PyTorch, and/or TensorFlow.
    • Deep understanding of CNNs, RNNs, Transformers, and LLM fine-tuning techniques (LoRA, PEFT).
    • Experience with LLM serving frameworks such as vLLM, Ollama.
    • Simulation & training: the AI model is initially trained in a virtual environment using cloud platforms.
    • Computing power: a dedicated high-performance machine is provided for this work.

     

    Nice to have: 
     

    • Work with simulation environments (AirSim, CARLA, Isaac Sim) for synthetic data generation and model validation.
    • Data Handling: MongoDB is used as a local onboard database to store instructions and sensor data. 
    • Hardware Integration: A Raspberry Pi acts as the intermediary, receiving AI commands and relaying them to the controller (e.g., Jetson, Pixhawk). 

     

    What We Offer: 

     

    • Competitive salary and performance-based bonuses.
    • Opportunity to work on cutting-edge AI + blockchain projects.
    • Flexible working hours and remote work options.
    • Professional development, training, and access to AI & blockchain research resources.
    • A collaborative and innovative work environment. 

     

    Join us and help shape the future of AI-driven blockchain automation!
