Jobs Data & Analytics

  • 540 views · 112 applications · 10d

    Billing and Payment Collection Specialist

    Full Remote · Worldwide · 1 year of experience · English - B1

    📍 Location: Remote

    🕐 Schedule: Full-time, Sunday to Thursday

    🌍 Company: A fast-growing affiliate marketing network connecting advertisers with high-quality leads

     

    We’re looking for a detail-oriented and reliable Billing and Payment Collection Specialist to join our finance team. This is a great opportunity for someone who wants to grow in a remote-friendly, performance-driven environment.

     

    🔧 Responsibilities:

    • Assist in managing billing operations, ensuring accurate billing calculations and on-time payments from affiliates and advertisers;
    • Maintain and update billing records using Google Sheets;
    • Coordinate with advertiser managers and finance teams to capture all costs and revenues accurately;
    • Work closely with the sales and finance teams to ensure accurate cost and revenue tracking;
    • Handle client billing inquiries (affiliates and advertisers);
    • Support in preparing financial reports and analysis;
    • Contribute to improving billing workflows and overall team efficiency.

    ✅ Requirements:

    • Experience in finance, billing, accounting, or related roles;
    • Strong knowledge of Google Sheets;
    • Excellent attention to detail and organization;
    • English level: B1 or higher;
    • Self-starter with the ability to stay organized and deliver results remotely.
  • 51 views · 4 applications · 11d

    Senior Data Engineer - new platform concept

    Full Remote · Countries of Europe or Ukraine · 7 years of experience · English - B2

    Looking for talent who thrives on establishing new technologies and on the challenges that come with that.

     

    This is the development and go-live of an entirely new platform and offering in the agriculture space, working with big Ag manufacturers and their data.

     

    You'll Own:

    • ETL pipelines ingesting vendor pricebooks, inventory feeds, and product metadata
    • Data normalization and enrichment workflows (JSON sidecar pattern)
    • Multi-source data unification into MongoDB Atlas
    • Image/metadata processing pipelines for ML-ready assets

     

    Stack: GCP (Cloud Storage, Cloud Run), MongoDB Atlas, Python, JSON/XMP metadata
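
    As a rough illustration of the "JSON sidecar" normalization and multi-source unification described above, the Python sketch below normalizes one vendor pricebook row, writes a sidecar file next to the source asset, and upserts the record into MongoDB. The field names, collection name, and connection string are assumptions for illustration, not details from the project.

```python
# Illustrative sketch of the JSON-sidecar normalization pattern described above.
# Field names, the collection name, and the connection string are assumptions.
import json
from pathlib import Path

from pymongo import MongoClient  # pip install pymongo


def normalize_pricebook_row(raw: dict) -> dict:
    """Map one vendor-specific pricebook row onto a common schema."""
    return {
        "vendor_sku": str(raw.get("SKU", "")).strip(),
        "title": (raw.get("ProductName") or "").strip(),
        "price_usd": float(raw.get("ListPrice") or 0),
        "category": (raw.get("Category") or "uncategorized").lower(),
    }


def write_sidecar(asset_path: Path, metadata: dict) -> Path:
    """Write enrichment metadata as a JSON sidecar next to the source asset."""
    sidecar = asset_path.parent / (asset_path.name + ".json")
    sidecar.write_text(json.dumps(metadata, indent=2))
    return sidecar


def upsert_product(collection, product: dict) -> None:
    """Unify records from multiple vendors, keyed on vendor SKU."""
    collection.update_one(
        {"vendor_sku": product["vendor_sku"]},
        {"$set": product},
        upsert=True,
    )


if __name__ == "__main__":
    # Replace with the Atlas SRV connection string in a real deployment.
    client = MongoClient("mongodb://localhost:27017")
    products = client["catalog"]["products"]

    row = {"SKU": "AG-1001", "ProductName": "Hydraulic pump", "ListPrice": "449.00"}
    doc = normalize_pricebook_row(row)
    write_sidecar(Path("AG-1001.jpg"), doc)
    upsert_product(products, doc)
```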

    Must Have:

    • 3+ years ETL/data pipeline experience
    • Python proficiency
    • Experience with document databases (MongoDB preferred)
    • Comfortable with messy, real-world data normalization

    Nice to Have: Ecommerce catalog experience, CLIP/vector embeddings, agricultural/equipment industry

  • 14 views · 0 applications · 11d

    Senior Product Analyst

    Hybrid Remote · Poland · Product · 4 years of experience · English - B2

    The product company is looking for a Senior Product Analyst in Warsaw.


    A successful market leader, a live-streaming platform with 450+ million registered users.
    The B2C mobile platform allows millions of talented people worldwide to connect with their fans and monetize their talents.


    The live streaming platform was founded in 2018 in the USA and is powered by 350+ global employees.


    They offer: options, medical insurance (100% for employees and 75% for family members), lunch in the office, parking, multisport card.


    Requirements:

    • experience as a data analyst / product analyst / game analyst
    • experience with BI tools such as Looker / Tableau / Power BI, etc.
    • SQL experience
    • cloud experience (any major cloud provider; no specific one required)
    • B2C experience
    • mobile analytics
  • 44 views · 5 applications · 11d

    Senior Data Engineer (Scala) – Tieto Tech Consulting (m/f/d)

    Full Remote · Ukraine · 5 years of experience · English - B2

    Tieto Tech Consulting is inviting a talented Data Engineer to join our growing team and support our customer BICS, a global telecommunication enabler with a physical network spanning the globe. In this role, you will work on the BICS Voice and CC Value Streams, delivering qualified customer and network support by designing, building, and optimizing large-scale data pipelines within the telecom domain. The position requires strong expertise in Scala Spark, Databricks, and AWS cloud services, and focuses on developing high-performance data platforms that enable network analytics, customer insights, real-time monitoring, and regulatory reporting.

     

    Key Responsibilities

    • Design, develop, and maintain scalable batch data pipelines using Scala, Databricks Spark, Databricks SQL and Airflow
    • Implement optimized ETL/ELT processes to ingest, cleanse, transform, and enrich large volumes of telecom network, usage, and operational data
    • Ensure pipeline reliability, observability, and performance tuning of Spark workloads
    • Build and manage data architectures leveraging AWS services such as (but not limited to) S3, Lambda, IAM, and CloudWatch
    • Implement infrastructure-as-code using Terraform
    • Ensure security best practices and compliance with telecom regulatory requirements (GDPR, Data sovereignty, retention)
    • Collaborate with cross-functional teams (Architecture, DevOps, Network Engineering, Business Intelligence)
    • Document system designs, data flows, and best practices
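
    For orientation, batch orchestration of this kind is often expressed as an Airflow DAG (authored in Python even when the Spark jobs themselves are written in Scala). The sketch below triggers a pre-configured Databricks job and then runs a lightweight data-quality gate; the DAG name, job ID, connection ID, and check are placeholders rather than details of the BICS project.

```python
# Illustrative Airflow DAG for the kind of batch orchestration described above.
# The DAG name, Databricks job_id, connection id, and quality check are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator


def check_row_counts(**context) -> None:
    """Placeholder data-quality gate; a real check would query Databricks SQL."""
    print("verifying that yesterday's partition is complete")


with DAG(
    dag_id="voice_usage_batch",       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_spark_job = DatabricksRunNowOperator(
        task_id="run_scala_spark_job",
        databricks_conn_id="databricks_default",
        job_id=12345,                 # placeholder Databricks job
    )

    quality_gate = PythonOperator(
        task_id="quality_gate",
        python_callable=check_row_counts,
    )

    run_spark_job >> quality_gate
```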

     

    Requirements

    • 4+ years of experience as a Data Engineer or Big Data Developer
    • Strong proficiency in Scala and functional programming concepts
    • Advanced experience with Apache Spark (batch processing using the DataFrame API and low-level Spark APIs, performance tuning, cluster optimization)
    • Experience with optimized SQL-based data transformations for analytics and machine learning workloads
    • Hands-on experience with Databricks including notebooks, jobs, Delta Lake, Unity Catalog, and MLflow (nice to have)
    • Solid understanding of CI/CD practices with Git and Jenkins/GitLab CI
    • Strong AWS skills: S3, Lambda, IAM, CloudWatch, and related services
    • Knowledge of distributed systems, data governance, and security best practices
    • Experience with Airflow integration with AWS services for end-to-end orchestration across cloud data pipelines
    • Experience with IaC tools: Terraform or CloudFormation
    • Experience with Python is a plus
    • Experience with DBT is a plus
    • Experience with Snowflake is a plus

     

    Soft Skills

    • Strong analytical and problem-solving skills
    • High degree of ownership and a mindset for continuous improvement
    • Quality-oriented, pragmatic, and solution-oriented
    • Excellent communication and teamwork abilities
    • Ability to translate business requirements into technical solutions
    • Experience in telecom sector is a plus
    • Experience with an agile way of working is a plus
    • English proficiency
  • 227 views · 42 applications · 11d

    AI Developer

    Full Remote · Worldwide · 1.5 years of experience · English - B2

    We are looking for an AI Developer for our company. For this position, we are considering only Ukrainians, from Ukraine or abroad.

     

    Requirements:
    - English is a must 
    - back-end Python skills (Python, Django/FastAPI/Flask)
    - AI & ML skills (prompt engineering, computer vision, voice recognition, etc.)

     

    We value responsible and serious developers open to new knowledge and technologies.
    The first stage is a meeting with HR, the second is a test task, and the last one is a technical interview. 

  • 40 views · 5 applications · 11d

    Senior Product Analyst

    Full Remote · Countries of Europe or Ukraine · Product · 4 years of experience · English - B1

    We are building one of the most powerful live-streaming platforms, already connecting 400K+ monthly active users through live events, interactive video chats, and real-time community experiences. Our mission is bold: bring joy, fight loneliness, and deliver safe, ad-free entertainment that keeps people coming back.
    This is a complex, fast-moving space, and we love it that way! So, you will join a passionate, expert team tackling everything from scaling AI-powered recommendations to launching high-impact, data-driven features that boost monetization and engagement. Every idea counts here, and every release reaches a massive global audience.
    We are now expanding into Arab countries and Europe, enhancing content quality, and rolling out new features to keep users hooked. We are among the Top 5 streaming products in the US and rank in the Top 10 worldwide. Our next milestone: break into the global Top 5, and we are moving fast to make it happen.

    About the role: In this role, you will strengthen our product analytics by bringing senior-level expertise in statistical analysis, retention analytics, and experimentation. From your first weeks, you’ll dive into our product, data pipelines, and key metrics, delivering deep insights into retention and monetization, and independently running A/B tests to shape decisions at scale. Your input will help refine analytical processes and frameworks, directly influencing product strategy. With the chance to grow horizontally through diverse tasks, and to be involved in many of the company’s key processes, you’ll see the tangible impact of your work across the business.

    In this role, you will

    • Drive deep-dive analyses of user behaviour, retention, and monetization
    • Independently design, monitor, and evaluate A/B tests and experiments
    • Build, maintain, and improve analytical frameworks, dashboards, and reporting systems to ensure clarity and efficiency across the team
    • Collaborate closely with product managers to define success metrics, uncover growth opportunities, and shape the product strategy
    • Identify gaps and inefficiencies in current processes and elevate the team’s analytical capabilities by proposing smart improvements
    • Ensure all product research meets the highest standards of statistical rigour, reliability, and scalability
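
    To make the experimentation part concrete: evaluating a simple conversion A/B test with appropriate statistical rigour could start from a sketch like the one below. The counts are invented, and a real analysis would also consider test power, guardrail metrics, and retention effects.

```python
# Minimal sketch of evaluating a conversion A/B test; all numbers are invented.
from statsmodels.stats.proportion import proportions_ztest

conversions = [1840, 1975]   # converting users in control / variant (hypothetical)
exposures = [52000, 51800]   # users exposed to each arm (hypothetical)

cr_control = conversions[0] / exposures[0]
cr_variant = conversions[1] / exposures[1]

# Two-sided two-proportion z-test on the conversion rates.
z_stat, p_value = proportions_ztest(conversions, exposures, alternative="two-sided")

print(f"control CR {cr_control:.4%}, variant CR {cr_variant:.4%}")
print(f"relative uplift {cr_variant / cr_control - 1:+.2%}, z = {z_stat:.2f}, p = {p_value:.4f}")
```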


    It’s all about you

    • 4+ years of experience in product or marketing analytics and working cross-functionally with product, marketing, and engineering teams
    • Advanced SQL skills and a solid foundation in statistics, including hypothesis testing, causal inference, regression, and retention/survival analysis
    • Proven track record in product analytics, ideally in transactional or subscription-based business models, with experience in designing and evaluating A/B tests
    • Ownership mindset, strong communication skills, and structured thinking that allow you to translate complex data into actionable business recommendations for both technical and non-technical stakeholders
       

    Would be a plus

    • Experience with GBQ, Python and R for analytics
    • Familiarity with cohort analysis, LTV modelling, and churn prediction
    • Regular usage of AI tools like ChatGPT, Claude AI, Gemini, or similar to support your work


    What we offer

    Care and support: 

    • 20 paid vacation days, 15 sick days, and 6 additional days off for family events
    • Up to 10 additional days off for public holidays
    • 100% medical insurance coverage
    • Sports and equipment reimbursement
    • Team building events, corporate gifts, and stylish merch
    • Financial and legal support
    • Position retention and support for those who join the Armed Forces of Ukraine
    • Participation in social initiatives supporting Ukraine
       

    Comfortable working environment:

    • Work from our Kyiv hub or remotely with a flexible schedule 
    • Workspace rental reimbursement in other cities and abroad
    • Modern equipment or depreciation of your own tools
       

    Investment in your future:

    • Collaborate with a highly-skilled team of Middle & Senior professionals, sharing practical cases and expertise in the social networking niche
    • 70% of our heads and leads have grown into their roles here – so can you!
    • Performance-oriented reviews and Individual Development Plans (IDPs)
    • Reimbursement for professional courses and English classes
    • Corporate library, book club, and knowledge-sharing events
       

    Hiring process

    • Intro call
    • Technical interview
    • Reference check
    • Offer
  • 60 views · 15 applications · 11d

    Senior LLM Systems / Agent Infrastructure Engineer

    Full Remote · Countries of Europe or Ukraine · 5 years of experience · English - C1

    Job Description

    We are hiring a Senior LLM Systems / Agent Infrastructure Engineer to optimize and scale a production multi-agent LLM system that translates natural language into Cypher / SQL queries, executes guarded graph retrieval, and synthesizes structured RAG responses. You will engineer a faster, cheaper, and more reliable LLM system in production.

     

    What We’re Building

    A production multi-agent LLM system that:

    • Converts Natural Language (NL) → Cypher (Neo4j) / SQL (PostgreSQL)
    • Executes guarded graph queries
    • Runs structured Retrieval-Augmented Generation (RAG)
    • Orchestrates tool usage

     

    What You’ll Own

    • Cut NL → Cypher / SQL latency
    • Optimize model routing & tool orchestration
    • Replace unnecessary LLM calls with deterministic logic
    • Redesign context strategy
    • Evaluate / replace current LLM stack (Google ADK)

    Outcome: A faster, cheaper, and more stable pipeline.
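
    As a rough illustration of "guarded graph queries" and of replacing unnecessary LLM calls with deterministic logic, a guard plus a deterministic routing shortcut might look like the sketch below; the forbidden-keyword list and the simple-intent rule are assumptions, not the actual system's logic.

```python
# Illustrative guard for generated Cypher plus a deterministic routing shortcut.
# The forbidden-keyword list and the "how many X" rule are assumptions.
import re

FORBIDDEN = ("CREATE", "MERGE", "DELETE", "SET ", "REMOVE", "DROP", "CALL DBMS")


def is_read_only(cypher: str) -> bool:
    """Naive guard: reject generated Cypher that could mutate the graph."""
    upper = cypher.upper()
    return not any(keyword in upper for keyword in FORBIDDEN)


def route(question: str) -> str | None:
    """Answer trivial count questions deterministically instead of calling the LLM."""
    match = re.fullmatch(r"\s*how many (\w+?)s\??\s*", question, re.IGNORECASE)
    if match:
        label = match.group(1).capitalize()
        return f"MATCH (n:{label}) RETURN count(n)"   # no LLM call needed
    return None  # fall through to the NL -> Cypher agent


if __name__ == "__main__":
    question = "How many suppliers?"
    cypher = route(question) or "/* call the NL -> Cypher agent here */"
    assert is_read_only(cypher), "guard rejected a mutating query"
    print(cypher)
```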

     

    Required

    • 5+ years backend / ML systems experience
    • 2+ years production LLM pipelines experience
    • Strong Python (async, FastAPI)
    • Experience with agent frameworks (LangGraph, DSPy, Semantic Kernel, etc.)
    • Structured outputs / function calling
    • RAG over structured data
    • LLM + SQL or graph DB integration
    • Strong grasp of token economics & latency optimization
    • Inference optimization (vLLM, batching, streaming)
  • 29 views · 1 application · 11d

    Data Scientist / Machine Learning Engineer - AI at Massive Scale

    Office Work · Ukraine (Dnipro, Lviv) · Product · 3 years of experience · English - B1

    Help us push AI further – and faster

    LoopMe’s Data Science team builds production AI that powers real-time decisions for campaigns seen by hundreds of millions of people every day. We process billions of data points daily – and we don’t just re-apply old tricks. We design and deploy genuinely novel machine learning systems, from idea to prototype to production.

    You’ll join a high-trust team led by Leonard Newnham with a 5-star Glassdoor rating, where your work moves fast, ships to production, and makes a measurable impact.

     

    What you’ll do:

    • Design, build, and run large-scale ML pipelines that process terabytes of data
    • Apply a mix of supervised learning, custom algorithms, and statistical modelling to real-world problems
    • Ship production-grade Python code that’s clear, documented, and tested
    • Work in small, agile squads (3–4 people) with DS, ML, and engineering peers
    • Partner with product and engineering to take models from idea → production → impact
    • Work with Google Cloud, Docker, Kafka, Spark, Airflow, ElasticSearch, ClickHouse and more
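
    A toy PySpark version of the kind of large-scale aggregation such pipelines perform is sketched below; the bucket paths, column names, and win-rate metric are invented for illustration and are not LoopMe's actual schema.

```python
# Toy PySpark aggregation in the spirit of the pipelines described above.
# Paths, column names, and the win-rate metric are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-win-rate").getOrCreate()

# Placeholder path; in practice this would be a partitioned event store.
events = spark.read.parquet("gs://example-bucket/bid-events/dt=2024-06-01/")

daily_win_rate = (
    events
    .groupBy("campaign_id")
    .agg(
        F.count("*").alias("bids"),
        F.sum(F.col("won").cast("int")).alias("wins"),
    )
    .withColumn("win_rate", F.col("wins") / F.col("bids"))
)

daily_win_rate.write.mode("overwrite").parquet(
    "gs://example-bucket/features/win_rate/dt=2024-06-01/"
)
spark.stop()
```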

     

    What you bring:

    • Bachelor’s degree in Computer Science, Maths, Engineering, Physics or similar (MSc/PhD a plus)
    • 3+ years’ commercial Python experience
    • Track record building ML pipelines that handle large-scale data
    • Excellent communication skills – comfortable working across time zones
    • A curious, scientific mindset – you ask “why?” and prove the answer

     

    Bonus if you have:

    • Experience with adtech or real-time bidding
    • Agile / Scrum experience
    • Knowledge of high-availability infrastructure (ElasticSearch, Kafka, ClickHouse)
    • Airflow expertise

     

    About the Data Science Team:

    We’re 17 ML engineers, data scientists, and data engineers, distributed across London, Poland, and Ukraine – acting as one team, not a satellite office.

    What sets us apart:

    • Led by an experienced Chief Data Scientist who codes, leads, and listens
    • Inclusive, supportive culture where ideas are heard and people stay
    • Strong values: open communication, continual innovation, fair treatment, and high standards
    • Track record of publishing award-winning research in automated bidding

    Don’t just take our word for it – check our Glassdoor reviews (search “Data Scientist”) for a real view of the culture.

     

    About LoopMe:

    LoopMe was founded to close the loop on brand advertising. Our platform combines AI, mobile data, and attribution to deliver measurable brand outcomes – from purchase intent to foot traffic. Founded in 2012, we now have offices in New York, London, Chicago, LA, Dnipro, Singapore, Beijing, Dubai and more.

     

    What we offer:

    • Competitive salary + bonus
    • Billions of real-world data points to work with daily
    • Flexible remote/hybrid options
    • Learning budget and career growth support
    • Friendly, transparent culture with strong leadership

     

    Hiring process:

    1. Intro with Talent Partner
    2. 30-min technical interview with Chief Data Scientist
    3. Panel with 2 team members (technical, culture & collaboration)
    4. Offer – usually within 48 hours of final round

     

    Are you ready to design and deploy AI systems that run at truly massive scale?

  • 27 views · 3 applications · 11d

    Senior Snowflake Data Engineer

    Full Remote · Ukraine · 5 years of experience · English - B2

    Project description

    The project is for one of the world's most famous science and technology companies in the pharmaceutical industry, supporting initiatives in AWS, AI, and data engineering, with plans to launch over 20 additional initiatives in the future. Modernizing the data infrastructure through the transition to Snowflake is a priority, as it will enhance capabilities for implementing advanced AI solutions and unlock numerous opportunities for innovation and growth.

    We are seeking a highly skilled Snowflake Data Engineer to design, build, and optimize scalable data pipelines and cloud-based solutions across AWS, Azure, and GCP. The ideal candidate will have strong expertise in Snowflake, ETL tools like DBT, Python, visualization tools like Tableau, and modern CI/CD practices, with a deep understanding of data governance, security, and role-based access control (RBAC). Knowledge of data modeling methodologies (OLTP, OLAP, Data Vault 2.0), data quality frameworks, Streamlit application development, SAP integration, and infrastructure-as-code with Terraform is essential. Experience working with different file formats such as JSON, Parquet, CSV, and XML is highly valued.

    Responsibilities

    Design and develop data pipelines using Snowflake and Snowpipe for real-time and batch ingestion.

    Implement CI/CD pipelines in Azure DevOps for seamless deployment of data solutions.

    Automate DBT jobs to streamline transformations and ensure reliable data workflows.

    Apply data modeling techniques including OLTP, OLAP, and Data Vault 2.0 methodologies to design scalable architectures.

    Document data models, processes, and workflows clearly for future reference and knowledge sharing.

    Build data tests, unit tests, and mock data frameworks to validate and maintain reliability of data solutions.

    Develop Streamlit applications integrated with Snowflake to deliver interactive dashboards and self-service analytics.

    Integrate SAP data sources into Snowflake pipelines for enterprise reporting and analytics.

    Leverage SQL expertise for complex queries, transformations, and performance optimization.

    Integrate cloud services across AWS, Azure, and GCP to support multi-cloud data strategies.

    Develop Python scripts for ETL/ELT processes, automation, and data quality checks.

    Implement infrastructure-as-code solutions using Terraform for scalable and automated cloud deployments.

    Manage RBAC and enforce data governance policies to ensure compliance and secure data access.

    Collaborate with cross-functional teams, including business analysts and business stakeholders, to deliver reliable data solutions.
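
    The "Streamlit applications integrated with Snowflake" responsibility could start from something like the sketch below; the table, column names, and secret names are placeholders, not details of the client's environment.

```python
# Minimal Streamlit-over-Snowflake sketch; table, columns, and secrets are placeholders.
import snowflake.connector  # pip install "snowflake-connector-python[pandas]"
import streamlit as st

st.title("Daily order volume")  # illustrative dashboard

conn = snowflake.connector.connect(
    account=st.secrets["sf_account"],    # placeholder secret names
    user=st.secrets["sf_user"],
    password=st.secrets["sf_password"],
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="MART",
)

query = """
    select order_date, count(*) as orders
    from fct_orders              -- hypothetical DBT model
    group by order_date
    order by order_date
"""

df = conn.cursor().execute(query).fetch_pandas_all()

st.dataframe(df)
st.line_chart(df.set_index("ORDER_DATE")["ORDERS"])
```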

    Skills

    Must have

    Strong proficiency in Snowflake (Snowpipe, RBAC, performance tuning).

    Hands-on experience with Python, SQL, Jinja, and JavaScript for data engineering tasks.

    CI/CD expertise using Azure DevOps (build, release, version control).

    Experience automating DBT jobs for data transformations.

    Experience building Streamlit applications with Snowflake integration.

    Cloud services knowledge across AWS (S3, Lambda, Glue), Azure (Data Factory, Synapse), and GCP (BigQuery, Pub/Sub).

    Nice to have

    Cloud certifications are a plus

    Languages

    English: B2 Upper Intermediate

  • 174 views · 38 applications · 11d

    AI Engineer

    Full Remote · Ukraine · 2 years of experience · English - B2

    We seek an AI Engineer with strong expertise in state-of-the-art techniques for RAG and LLM agents. You will be responsible for designing, developing, and deploying AI-driven solutions for our clients and for internal use. The ideal candidate has hands-on experience with a wide range of techniques in modern agent development.

    🧙‍♂️ In This Role You Will

    • Develop AI agents, including chat agents and metric extraction agents.
    • Create evaluation pipelines for agents, including tool usage and RAG (Retrieval-Augmented Generation) assessments like coherence and relevance.
    • Improve results from Qdrant & Azure AI Search.
    • Research and apply state-of-the-art techniques for agents and RAG systems.
    • Maintain and support evaluation datasets.
    • Write documentation for back-end integration or handle integration directly when needed.
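
    A stripped-down version of the relevance/coherence evaluation loop mentioned above might look like the following sketch. Here call_judge_llm is a stand-in for whichever model endpoint is actually used, and the 1-5 groundedness rubric is an assumption.

```python
# Skeleton of an "LLM as a Judge" groundedness evaluation over a small dataset.
# call_judge_llm is a stand-in for the real model client; the 1-5 rubric is an assumption.
import json
import statistics
from dataclasses import dataclass


@dataclass
class EvalCase:
    question: str
    retrieved_context: str
    answer: str


JUDGE_PROMPT = """Rate from 1 to 5 how well the answer is grounded in the context.
Return JSON like {{"score": 3, "reason": "..."}}.
Question: {question}
Context: {context}
Answer: {answer}"""


def call_judge_llm(prompt: str) -> str:
    """Placeholder: swap in the actual LLM client (Azure OpenAI, local model, ...)."""
    return json.dumps({"score": 4, "reason": "stubbed response"})


def judge(case: EvalCase) -> int:
    raw = call_judge_llm(JUDGE_PROMPT.format(
        question=case.question, context=case.retrieved_context, answer=case.answer,
    ))
    return int(json.loads(raw)["score"])


def run_eval(cases: list[EvalCase]) -> float:
    """Mean groundedness score over the evaluation dataset."""
    return statistics.mean(judge(case) for case in cases)


if __name__ == "__main__":
    dataset = [
        EvalCase("What is the SLA?", "The SLA is 99.9% uptime.", "The SLA is 99.9% uptime."),
    ]
    print(f"mean groundedness: {run_eval(dataset):.2f}")
```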
       

    🎯 You May Be Interested If

    • You are excited about learning and working with AI technologies.
    • You enjoy solving complex problems and have strong analytical skills.
    • You thrive in a remote work setup and appreciate flexible working hours.
    • You are eager to join a small, dynamic team and make a big impact.
    • You value a company culture that promotes transparency, respect, and continuous learning.
    • You are self-motivated and willing to learn new skills.
    • You are an active user of coding agents.
       

    🍰 Role Requirements

    Agents

    • Experience in LLM Agents architecture (not only LangChain)
    • Qdrant (Must have), Azure AI Search, BM25 experience
    • Semantic Reranking
    • RAG evals (embeddings, LLM as a Judge)
    • RAG (Naive, Graph, Hybrid, Agentic)
    • Experience in different data mining techniques from unstructured data
    • OCR
    • PydanticAI


    Web

    • Django (DRF)
    • FastAPI
    • Celery

     

    Infrastructure

    • Azure Cloud
    • Docker
       

    💻 Working Conditions

    • This is a fully remote position with the entire team across Ukraine.
    • The team collaboratively defines deadlines, ensuring a balanced and manageable workload. The focus is on delivering commitments on time.
    • 20 working days of paid vacation per year.
    • Financial compensation for completing certifications and encouraging continuous learning and professional development.
    • Preferred working hours: To ensure effective collaboration with our US-based client, the workday is shifted to the afternoon/evening. The 8-hour workday is scheduled within the 12:00 – 23:00 (Kyiv time) window, with the specific shift to be coordinated based on the required overlap
  • 32 views · 3 applications · 11d

    Founding Machine Learning Engineer (Inference / Computer Vision)

    Ukraine · 4 years of experience · English - B2 MilTech 🪖

    We are a British-Ukrainian air defence tech company building distributed systems across the frontline. We are looking for a Founding Engineer with deep expertise in Machine Learning and Computer Vision to take full ownership of the architecture and implementation of detection systems for our current and future product lines.

    Our core value is impact for Ukraine: we build technology that makes a real difference on the actual battlefield, not in labs or on paper specs. We operate with a deep-tech mindset, pushing new technologies where engineering rigor truly matters. We value ownership and responsibility – people who execute, deliver, and stand behind the results.

    We especially value engineers with strong analytical thinking and proven problem-solving skills – such as participation in mathematical/programming olympiads or experience contributing to open-source projects.

    Key Responsibilities

    • Detection Systems Development: Build and maintain the full lifecycle of ML solutions across the company’s product lines.
    • Inference Engineering: Optimize models for real-time operations.
    • Backlog Execution & Implementation:
      • Develop RF Classification models (Radio Frequency signals).
      • Improve the accuracy of DoA (Direction of Arrival) algorithms.
      • Implement drone behavior analysis and pilot localization algorithms.
    • Best Practices: Define the architecture, tooling, and approaches to build a scalable ML infrastructure from the ground up.
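
    A routine piece of the inference-engineering work described above is exporting a trained model to a faster runtime. The generic sketch below uses a stock torchvision backbone purely as a stand-in for the real detection model, and ONNX Runtime as one possible inference engine.

```python
# Generic sketch: export a PyTorch classifier to ONNX and run it with ONNX Runtime.
# The resnet18 backbone and 224x224 input are stand-ins for the real detection model.
import numpy as np
import onnxruntime as ort
import torch
from torchvision.models import resnet18

model = resnet18(weights=None).eval()
dummy = torch.randn(1, 3, 224, 224)

torch.onnx.export(
    model,
    dummy,
    "classifier.onnx",
    input_names=["frames"],
    output_names=["logits"],
    dynamic_axes={"frames": {0: "batch"}},  # allow variable batch size
)

session = ort.InferenceSession("classifier.onnx", providers=["CPUExecutionProvider"])

frame = np.random.rand(1, 3, 224, 224).astype(np.float32)  # placeholder input frame
(logits,) = session.run(["logits"], {"frames": frame})
print("predicted class index:", int(logits.argmax()))
```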

    Skills & Requirements

    • Strong theoretical background in Linear Algebra, Probability, and Statistics.
    • Experience with Computer Vision models: Classification, Object Recognition, and Tracking.
    • Python + relevant ML libraries, PyTorch.
    • Experience taking models to production (Production ML).
    • Hands-on experience with inference engines.
    • Ability to build low-latency, real-time detection pipelines.
    • Proficiency in C++ and/or CUDA for building high-performance modules.

    Nice to Have

    • Experience in DefenseTech, SigInt (Signals Intelligence), or Embedded Systems.
    • Understanding of Radio Frequency (RF) physics.

    We Offer

    • Meaningful and high-impact work in the air defence domain
    • Practical R&D environment with real autonomy
    • Direct influence on technical decisions and the product direction
    • Work in an experienced senior engineering team
    • Clear, fast, senior-to-senior communication with a minimum number of team meetings
    • Office in the center of Kyiv (available for onsite work when required)
    • Full-time position
    • Standard business hours; some tasks may require involvement outside the usual schedule (testing, critical updates)
  • 115 views · 18 applications · 11d

    Business Analyst

    Ukraine · 1 year of experience · English - B2

    Dotcode is looking for a Business Analyst to join our team on a long-term product with active development and integrations with external services.
    You will work closely with stakeholders, developers, and QA, participate in requirements elicitation and analysis, help shape product functionality, and ensure clear, structured, and high-quality requirements throughout the development lifecycle.

    Skills requirements: 

    • 1+ years of experience in system, functional, or business analysis; 
    • English: B2 and higher; 
    • Practical experience with key business analysis techniques and requirement management tools;
    • Understanding of SDLC and agile development processes; 
    • Knowledge of business process modeling; 
    • Proficiency in identifying and translating business needs into clearly defined requirements; 
    • Experience in drafting functional / business / system requirement specifications; 
    • Experience in working with stakeholders and requirements elicitation; 
    • Critical thinking and problem-solving skills; 
    • Ability to decompose complex requirements into manageable tasks; 
    • Understanding of web development concepts and technologies;
    • Excellent interpersonal and communication skills. 

       

    Nice to have: 

    • Experience in using SQL; 
    • Experience with integration projects.

       

    We offer:

    • Competitive salary;
    • Flexible working time;
    • Professional training opportunities;
    • Friendly work environment and office in a good location;
    • Paid vacations and sick leaves.

      Hiring flow:
    • Call with HR Manager;
    • Technical interview;
    • Final interview with CEO.
       

      If you have any questions, feel free to contact us :)

     

  • 42 views · 0 applications · 11d

    Machine Learning Engineer

    Hybrid Remote · Ukraine · Product · 3 years of experience · English - B2

    As a Machine Learning Engineer, you'll work as part of the Wix CTO Office team, researching problems that can give Wix’s products a competitive edge across various challenges. A key focus of the team is on Agents over LLMs, exploring new techniques for building agents and developing innovative products that leverage them.  

     

    In your day-to-day, you will:  

    • Build POCs for research projects led by the team  
    • Evaluate results and provide actionable insights  
    • Collaborate with different teams at Wix to advance their agent implementations  
    • Build shared infrastructure for agents  

       

    Requirements

    • Creativity and willingness to tackle ambitious, high-risk problems  
    • 3+ years of experience working on production code with active users
    • BSc in Computer Science or related field, MSc preferred  
    • Proficient in Python; TypeScript is a significant advantage  
    • Experience in training and evaluating Machine Learning models  
    • Hands-on experience with building GenAI systems using LLMs and agents  
    • Proven ability to work in a collaborative, cross-functional environment  
    • Excellent written and verbal communication skills in English 

     

    About the Team  

    We are Wix's Data Science CTO Office team, a small group of researchers and engineers. We collaborate with various groups at Wix and the CEO on innovative research projects. Some projects aim to enhance Wix products with new features, while others focus on strategic research areas that can provide Wix with a competitive advantage.

     

  • 61 views · 14 applications · 11d

    Middle Cloud/Data Engineer

    Part-time · Full Remote · Countries of Europe or Ukraine · 4 years of experience · English - B2

    Metamindz is a fast-growing UK-based IT software company. We support global clients by providing fractional CTOs-as-a-service, building digital products, hiring exceptional technical talent, and conducting in-depth tech due diligence.

     

    We’re currently looking for a Cloud & Data Engineer (GCP / IoT) to join one of our startup clients in a part-time engagement. This is an opportunity for a hands-on engineer who can take ownership of cloud data platforms and backend systems, working with high-volume IoT data and real-time analytics in production environments.

     

    Responsibilities:

     

    • Own and operate the cloud-based backend and data platform supporting large-scale IoT deployments
    • Architect, build, and maintain high-volume data ingestion pipelines using GCP services (BigQuery, Dataflow, Pub/Sub)
    • Design and manage streaming and batch data workflows for real-time and historical analytics
    • Define data storage, querying, retention, and archiving strategies across warehouses and data lakes
    • Ensure backend services, APIs, and data pipelines are secure, scalable, observable, and fault-tolerant
    • Set up monitoring, logging, alerting, and recovery strategies for event-driven workloads
    • Collaborate closely with the CTO, embedded engineers, and product teams to align device capabilities with cloud and data architecture
    • Contribute to data platform evolution, including governance, access policies, and metadata management
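
    A bare-bones version of the streaming ingestion described above (Pub/Sub into BigQuery) is sketched below; the project, subscription, table, and message schema are placeholders, and a production pipeline would more likely run on Dataflow with batching and dead-lettering.

```python
# Bare-bones Pub/Sub -> BigQuery ingestion sketch; names and schema are placeholders.
# A production pipeline would typically use Dataflow with batching and dead-lettering.
import json

from google.cloud import bigquery, pubsub_v1

PROJECT = "example-project"                      # placeholder
SUBSCRIPTION = "iot-telemetry-sub"               # placeholder
TABLE = "example-project.telemetry.raw_events"   # placeholder

bq = bigquery.Client(project=PROJECT)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)


def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    """Parse one device reading and stream it into BigQuery."""
    payload = json.loads(message.data.decode("utf-8"))
    row = {
        "device_id": payload.get("device_id"),
        "ts": payload.get("ts"),
        "temperature_c": payload.get("temperature_c"),
    }
    errors = bq.insert_rows_json(TABLE, [row])
    if errors:
        message.nack()  # let Pub/Sub redeliver on failure
    else:
        message.ack()


future = subscriber.subscribe(sub_path, callback=handle)
print(f"Listening on {sub_path} ...")
future.result()  # blocks; interrupt to stop
```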

     

    Requirements:

     

    • 3–5 years of commercial engineering experience in cloud, data, or backend roles
    • Strong hands-on experience with GCP and its data ecosystem (BigQuery, Dataflow, Pub/Sub)
    • Solid experience with relational databases (Postgres, MySQL), including schema design, migrations, indexing, and scaling strategies
    • Proven experience building and maintaining data pipelines, particularly for IoT or time-series data
    • Hands-on experience with Python (Node.js is a plus)
    • Experience designing and consuming APIs in distributed or microservices-based systems
    • Familiarity with CI/CD pipelines, environment management, and Infrastructure as Code (Terraform)
    • Good understanding of cloud security, IAM, and best practices for production systems
    • Ability to work independently in a startup environment and make pragmatic technical decisions

     

    Nice to Have:

     

    • Google Professional Data Engineer certification
    • Experience with orchestration tools such as Airflow / Cloud Composer
    • Exposure to applied ML or AI use cases (e.g. anomaly detection, forecasting on IoT data)
    • Experience using managed ML services like GCP Vertex AI

     

    What We Offer:

     

    • Opportunity to work on a real-world, IoT-powered product with visible impact
    • High ownership and influence over technical architecture and data strategy
    • Collaborative startup environment with direct access to decision-makers
    • Modern cloud stack and meaningful engineering challenges around scale and reliability
    • Competitive compensation aligned with experience and responsibilities

     

    How to Apply:

     

    Please send a short blurb about yourself – and tell us your favorite ice cream flavor (mine is cherry 🍒)

  • 48 views · 1 application · 11d

    Senior Business Analyst (Fintech)

    Full Remote · EU · Product · 3 years of experience · English - B1

    Paycord is a PayTech company with a high-load platform for payment processing. We combine fintech expertise with merchant insights to create innovative solutions. We’ve successfully developed a strong product that helps businesses succeed in new markets.

    Our primary focus is on solution-driven development, and we prioritize the needs of our business clients. We provide access to a wide range of local and international payment methods, supporting businesses in reaching new heights and achieving excellence.

    We're rapidly growing and inviting a Senior Business (System) Analyst to our team.

    Your tasks will include:

    • Collaborate proactively with business stakeholders, product managers, and the tech team to reach a common understanding of the problems and needs to be addressed;
    • Analyze the gathered requirements and expectations and create requirements specifications;
    • Present business requirements to the tech team, support the tech team in clarifying requirements and managing change requests, and lead refinement sessions;
    • Plan, facilitate, and conduct requirements elicitation, documentation, and validation;
    • Create various artifacts, such as SRSs, minimal schema and configuration requirements, presentations, user guides, introduction materials, and release notes.

    Required skills and expertise:

    • 3+ years of experience as a Business Analyst or Systems Analyst;
    • Ability to understand complex business tasks and translate them into requirements;
    • Ability to write functional and non-functional requirements, create user stories and use cases (including diagrams, UML, BPMN);
    • Good knowledge of prototyping tools;
    • Strong technical knowledge of client-server architecture, API, SQL, HTML/CSS;
    • Strong understanding of the Software Development Life Cycle (SDLC);
    • Attention to detail, a strong logical mindset, the ability to identify corner cases, and a results-oriented approach; 
    • Strong time management skills and confidence handling multiple tasks; 
    • Experience in payments domain;
    • English: Upper-Intermediate.

    Tech Stack:

    PHP 8, Nginx, MySQL, Redis, RabbitMQ, GitLabCI, AWS, Docker, Kubernetes, Grafana

     

    We offer:

    Care for your health and well-being

    • 100% paid sick leave;
    • 20 working days of paid vacation;
    • Medical support;
    • Benefits Cafeteria (budget for gym, dental care, psychological services, etc.);
    • Ability to work remotely or in the office (as you wish);
    • Corporate gifts & events.

    Professional growth & development

    • Competitive salary with annual salary promotions;
    • The annual budget for professional courses, conferences, workshops, and books;
    • Internal training courses;
    • Work with a team of professionals and have the opportunity to share knowledge.

    Corporate Culture

    • Dynamic and result-oriented work environment;
    • The ability to influence product development at an early stage;
    • Openness to new ideas and approaches, healthy team discussions;
    • No β€œred tape” culture.