Jobs Data Science
· 22 views · 4 applications · 25d
Senior Computer Vision Engineer
Full Remote · Ukraine · 4 years of experience · English - B2
Job Description
4+ years of experience in computer vision or related fields.
Strong knowledge of machine learning/deep learning.
Hands-on experience with object detectors, instance segmentation, keypoint/pose detection, RNNs/Transformers (computer vision in time domain), tracking algorithms.
Proficiency in Python; C++ is a plus.
Extensive experience with computer vision libraries and frameworks such as PyTorch and OpenCV.
Familiarity with image processing techniques and annotation tools.
Experience with hardware integration for vision-based systems.
Good understanding of software development best practices (code reviews, TDD, Git, etc.).
Advanced English (written and verbal) for daily communication with the customer.
Efficiency in remote development on Linux (VMs, on-premise machines).
Would be a Plus:
Understanding of real-time processing and optimization.
Experience with edge AI deployment.
Experience with optimization and inference libraries (ONNX, TensorRT, OpenVINO, etc.).
Linux development.
Experience in medical computer vision.
Experience in managing huge volumes of visual data.
Ability to leverage advanced NVIDIA Ada GPU features to speed up training and inference.
Experience with CPU and GPU profiling (NVIDIA Nsight, cProfile)
Job Responsibilities
Contribute to the design, development, code review, and testing of computer vision algorithms and systems.
Develop new features and improve existing functionality in vision-based projects.
Work on the integration of computer vision solutions with third-party tools and hardware.
Collaborate with cross-functional teams to deliver high-quality, compliant products.
Stay updated with the latest advancements in computer vision and machine learning technologies.
Department/Project Description
As a Computer Vision Engineer, you will join a mature and senior team dedicated to developing cutting-edge computer vision solutions for medical applications. Our projects range from advanced image processing to real-time vision systems, contributing to fields like medical devices, robotics, autonomous vehicles and others. We emphasize technical excellence and offer a stimulating environment that encourages innovation and professional growth.
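The requirements above name cProfile for CPU profiling; a minimal sketch of that workflow using only Python's standard library (the pipeline functions here are invented for illustration, not part of the actual project):

```python
import cProfile
import io
import pstats

def preprocess(frames):
    # Stand-in for a per-frame preprocessing step (invented workload).
    return [sum(f) / len(f) for f in frames]

def run_pipeline():
    frames = [[i % 7 for i in range(1000)] for _ in range(200)]
    return preprocess(frames)

# Profile the pipeline and report the hottest calls by cumulative time.
profiler = cProfile.Profile()
profiler.enable()
run_pipeline()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue().splitlines()[0].strip())
```

The same pattern scales to real inference loops; NVIDIA Nsight covers the GPU side that cProfile cannot see.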
-
· 59 views · 20 applications · 22d
Data Scientist / ML Engineer
Full Remote · Countries of Europe or Ukraine · Product · 3 years of experience · English - B2
We're hiring: Data Scientist / ML Engineer
Product: E-commerce solution with AI integration
Format: Remote
What you'll do
- Build and improve ML models (risk/anomaly-style scenarios)
- Analyze patterns, validate hypotheses, and iterate on models
- Support deployment and monitoring in collaboration with engineering/product
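As a flavor of the risk/anomaly-style modeling mentioned above, a toy z-score rule in pure Python (the threshold and data are illustrative only, not the team's actual approach):

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=2.5):
    """Return indices of values more than `threshold` standard
    deviations away from the mean (a deliberately simple rule)."""
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

# A spike hiding in otherwise stable order amounts.
amounts = [100, 102, 98, 101, 99, 103, 97, 100, 500]
print(zscore_anomalies(amounts))  # [8]
```

Production systems replace the z-score with learned models, but the loop of score, flag, and review is the same.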
Requirements
- 3+ years of experience
- Hands-on ML/DS workflow experience (data prep, modeling, evaluation)
- Ability to work with production data and deliver measurable outcomes.
What we offer
- 24 paid vacation days/year (after probation)
- Paid sick leave (after probation)
- Remote work option
- Company-supported English courses
-
· 111 views · 33 applications · 11d
Senior AI / Machine Learning Engineer to $6500
Full Remote · Worldwide · Product · 5 years of experience · English - B2
About Tie
Tie is building the next generation of identity resolution and marketing intelligence. Our platform connects hundreds of millions of consumers across devices, browsers, and channels, without relying on cookies, to power higher deliverability, smarter targeting, and measurable revenue lift for modern marketing teams.
At Tie, AI is not a feature; it is a core execution advantage. We operate large-scale identity graphs, real-time scoring systems, and production ML pipelines that directly impact revenue, deliverability, and customer growth.
The Role
We are looking for a Senior AI / Machine Learning Engineer to design, build, and deploy production ML systems that sit at the heart of our identity graph and scoring platform. You will work at the intersection of machine learning, graph data, and real-time systems, owning models end to end, from feature engineering and training through deployment, monitoring, and iteration.
This role is highly hands-on and impact-driven. You will help define Tieβs ML architecture, ship models that operate at sub-second latency, and partner closely with platform engineering to ensure our AI systems scale reliably.
What You'll Do
- Design and deploy production-grade ML models for identity resolution, propensity scoring, deliverability, and personalization
- Build and maintain feature pipelines across batch and real-time systems (BigQuery, streaming events, graph-derived features)
- Develop and optimize classification models (e.g., XGBoost, logistic regression) with strong handling of class imbalance and noisy labels
- Integrate ML models directly with graph databases to support real-time inference and identity scoring
- Own model lifecycle concerns: evaluation, monitoring, drift detection, retraining, and performance reporting
- Partner with engineering to expose models via low-latency APIs and scalable services
- Contribute to GPU-accelerated and large-scale data processing efforts as we push graph computation from hours to minutes
- Help shape ML best practices, tooling, and standards across the team
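On the class-imbalance point above: a common baseline is to weight the minority class by the negative-to-positive ratio, which is what XGBoost's scale_pos_weight parameter expects. A small sketch of that computation (the labels are synthetic):

```python
def scale_pos_weight(labels):
    """Negative-to-positive ratio for a binary label list; commonly
    passed to a booster (e.g. XGBoost's scale_pos_weight) so errors
    on the rare positive class are weighted up."""
    pos = sum(labels)
    if pos == 0:
        raise ValueError("no positive examples in labels")
    return (len(labels) - pos) / pos

# Synthetic 1% positive rate, typical of match/propensity labels.
labels = [1] * 10 + [0] * 990
print(scale_pos_weight(labels))  # 99.0
```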
What You'll Bring
Required Qualifications
- 5+ years of experience building and deploying machine learning systems in production
- Strong proficiency in Python for ML, data processing, and model serving
- Hands-on experience with feature engineering, model training, and evaluation for real-world datasets
- Ability to travel outside of Ukraine is a must
- Experience deploying ML models via APIs or services (e.g., FastAPI, containers, Kubernetes)
- Solid understanding of data modeling, SQL, and analytical workflows
- Experience working in a cloud environment (GCP, AWS, or equivalent)
- Experience with graph data, graph databases, or graph-based ML
- Familiarity with Neo4j, Cypher, or graph algorithms (community detection, entity resolution)
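One building block behind the entity resolution mentioned above is grouping linked identifiers into connected components. A minimal union-find sketch in pure Python (the identifier formats are hypothetical):

```python
def resolve_entities(pairs):
    """Group linked record IDs into entities via union-find."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for a, b in pairs:
        union(a, b)

    groups = {}
    for node in parent:
        groups.setdefault(find(node), set()).add(node)
    return sorted(map(sorted, groups.values()))

# Identifiers observed together imply the same person.
links = [("email:a", "device:1"), ("device:1", "cookie:x"), ("email:b", "device:2")]
print(resolve_entities(links))
```

Graph databases run this at scale (e.g. via community-detection algorithms), but the core idea is the same transitive grouping.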
Preferred / Bonus Experience
- Experience with XGBoost, tree-based models, or similar classical ML approaches
- Exposure to real-time or streaming systems (Kafka, Pub/Sub, event-driven architectures)
- Experience with MLOps tooling and practices (CI/CD for ML, monitoring, retraining pipelines)
- GPU or large-scale data processing experience (e.g., RAPIDS, CUDA, Spark, or similar)
- Domain experience in identity resolution, marketing technology, or email deliverability
Our Technology Stack
- ML & Data: Python, Pandas, Scikit-learn, XGBoost
- Graphs: Neo4j (Enterprise, GDS)
- Cloud: Google Cloud Platform (BigQuery, Vertex AI, Cloud Run, Pub/Sub)
- Infrastructure: Docker, Kubernetes, GitHub Actions
- APIs: FastAPI, REST-based inference services
What We Offer
- Competitive compensation, including salary, equity, and performance incentives
- Opportunity to work on core AI systems that directly impact revenue and product differentiation
- High ownership and autonomy in a senior, hands-on role
- Remote-first culture with a strong engineering and data focus
- Exposure to cutting-edge problems in identity resolution, graph ML, and real-time AI systems
- Clear growth path toward Staff / Principal IC roles
What else:
- 4 weeks of paid vacation per year (flexible scheduling)
- Unlimited sick leave - we trust your judgment and care about your health
- US Bank Holidays off (American calendar)
- Remote-first culture and flexible working hours
- Flat structure, no micromanagement, and full ownership
- Opportunity to make a real impact during a critical growth phase
Interview Process
- Recruitment Screening Call
- Initial call with Head of Data Science & AI and CTO (30 min) in English
- Technical deep-dive interview (1.5h) in English
- Optional test task (paid)
Why Join Us?
- High-impact delivery leadership role during a critical period
- Real ownership and autonomy
- Opportunity to shape delivery across the entire engineering organization
- Exposure to SaaS, data, integrations, automation, and platform work
- Collaboration with global teams and vendors
- A strong product with real scale and momentum
Why This Role Matters
At Tie, your work will not live in notebooks or experiments; it will power production systems used by real customers at scale. You will help define how AI is embedded into the company's core platform and play a key role in making machine learning a durable competitive advantage.
-
· 73 views · 5 applications · 1d
Head of Data Science
Full Remote · Countries of Europe or Ukraine · Product · 6 years of experience · English - B2
About Everstake
Everstake is the largest decentralized staking provider in Ukraine and one of the top 5 blockchain validators worldwide. We help institutional and retail investors participate in staking across more than 85 blockchain networks, including Solana, Ethereum, Cosmos, and many others. By building secure, scalable, and reliable blockchain infrastructure, we support the growth of the global Web3 ecosystem and enable the adoption of decentralized technologies worldwide.
About the Role
We are looking for a Head of Data Science to own and scale Everstake's data science and analytics function. This is a hands-on leadership role with a strong technical focus. You will define the data science direction, lead senior-level engineers, and work closely with the CDO, product, engineering, and business teams to drive data-informed decisions across a complex Web3 infrastructure.
You will be responsible not only for analytics and modeling, but also for data architecture, orchestration, performance, reliability, and engineering standards in a fast-growing blockchain environment.
Key Responsibilities:
- Own and evolve data science and analytics architecture across Everstake
- Design and maintain scalable data pipelines, metrics layers, and analytical models
- Lead technical decision-making across data platforms, BI, and orchestration
- Translate blockchain, product, and business problems into clear data solutions
- Define data standards, best practices, and development guidelines
- Review code, data models, and pipelines for quality, performance, and correctness
- Mentor senior data scientists and analysts, provide technical leadership
- Partner closely with product, backend, infrastructure, and finance teams
- Ensure data reliability, observability, and correctness in production
- Actively contribute hands-on where technical depth is required
Requirements (Must-Have):
Seniority & Leadership
- 6+ years of professional experience in data-related roles
- Strong experience as a Senior / Lead Data Scientist or Analytics Engineer
- Proven ability to lead technically strong teams and initiatives
- Ability to balance hands-on execution with leadership responsibilities
Core Technical Skills
- Python - expert level (data processing, analytics, modeling, production code)
- Apache Airflow - 2-3+ years of hands-on experience
(DAG design, dependencies, retries, backfills, monitoring, failure handling)
Databases & Warehouses
- ClickHouse (performance tuning, large-scale analytics)
- PostgreSQL
- Snowflake
BI & Analytics
- Power BI and/or Tableau
- Strong understanding of semantic layers, metrics definitions, and data modeling
Infrastructure & Observability
- Docker
- Git
- Grafana (monitoring data pipelines and platform health)
Data & Systems Thinking
- Strong understanding of data modeling (facts, dimensions, slowly changing data)
- Experience designing KPIs and metrics that actually reflect business reality
- Ability to identify incorrect assumptions, misleading metrics, and data biases
- Experience working with high-volume, high-frequency, or near-real-time data
- Strong SQL skills and performance-oriented thinking
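For the "slowly changing data" item above, a toy type-2 dimension update in pure Python: the current row is expired and a new current row is appended, so history is preserved (the field names are illustrative):

```python
from datetime import date

def scd2_upsert(history, key, new_attrs, as_of):
    """Type-2 style update: expire the open row for `key` (if its
    attributes changed) and append a new current row."""
    for row in history:
        if row["key"] == key and row["valid_to"] is None:
            if row["attrs"] == new_attrs:
                return history  # nothing changed; keep the open row
            row["valid_to"] = as_of
    history.append({"key": key, "attrs": new_attrs,
                    "valid_from": as_of, "valid_to": None})
    return history

dim = []
scd2_upsert(dim, "validator:42", {"commission": 0.05}, date(2024, 1, 1))
scd2_upsert(dim, "validator:42", {"commission": 0.07}, date(2024, 6, 1))
print(len(dim))  # 2: the first version is expired, the second is current
```

In a warehouse this is typically expressed as a MERGE over the dimension table rather than Python, but the versioning logic is identical.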
Blockchain / Crypto Domain (Required)
- Practical experience in blockchain, crypto, or Web3 products
- Experience working with blockchain-derived datasets or crypto-financial metrics
- Ability to reason about probabilistic, noisy, and incomplete on-chain data
- Understanding of blockchain mechanics (validators, staking, rewards, transactions)
- Wallets, addresses, and transaction flows
- On-chain vs off-chain data
Soft Skills:
- Systems and critical thinking
- Strong communication skills with technical and non-technical stakeholders
- Team-oriented mindset with high ownership and accountability
- Fluent English (B2+ or higher)
Nice-to-Have:
- Experience in staking, DeFi, or blockchain infrastructure companies
- Background in analytics engineering or data platform teams
- Experience building data systems from scratch or scaling them significantly
- Familiarity with financial or yield-related metrics
- Experience working in globally distributed teams
What We Offer:
- Opportunity to work on mission-critical Web3 infrastructure used globally
- Head-level role with real influence on data and technical strategy
- Fully remote work format
- Competitive compensation aligned with experience and seniority
- Professional growth in a top-tier Web3 engineering organization
- Strong engineering culture with focus on quality, ownership, and impact
-
· 289 views · 61 applications · 14d
Junior Data Scientist
Full Remote · Worldwide · English - B2
As a Junior Data Scientist, you will contribute to the development and delivery of data science products, working alongside senior data scientists. You will be involved in implementing and refining supervised learning, bandit algorithms, and generative AI models, as well as supporting experimentation and analysis.
You will write production-quality Python code, collaborate on cloud-based deployments, and help translate data insights into actionable recommendations that drive business impact. This role provides hands-on experience while allowing you to take ownership of well-scoped components within larger projects.
This is a fantastic opportunity for an early-career data scientist with an analytical background to join and grow within a market-leading digital content agency and media network.
CORE RESPONSIBILITIES
Model Development: Assist in developing, testing, and improving machine learning models, with a focus on bandit algorithms and experimentation frameworks.
Experimentation: Support the setup, execution, and analysis of A/B tests and online experiments to evaluate the impact of our generative AI-driven products.
Production Support: Assist with deploying and monitoring models and experiments on GCP (Airflow, Docker, Cloud Run, SQL databases, etc.), following existing patterns and CI/CD workflows.
Data Analysis: Perform exploratory data analysis, data validation, and basic feature engineering to support modelling and experimentation efforts.
Collaboration: Work closely with senior data scientists, engineers, and product stakeholders to understand business problems and translate them into actionable tasks.
SKILLS REQUIRED FOR THIS ROLE
Essential Functional/Job-specific skills
• Bachelor's or Master's degree in Data Science, Computer Science, Mathematics, Statistics, or a related field with 1+ years of relevant work experience.
• Solid foundation in SQL and Python, including experience with common libraries such as Pandas, NumPy, Scikit-learn, Matplotlib, and Statsmodels.
• Basic understanding of supervised learning, experimentation, causal inference, and concepts in reinforcement learning and multi-armed bandits.
• Foundational knowledge of probability, statistics, and linear algebra.
• Working knowledge of Git, including version control and collaboration through pull requests and code reviews.
• Ability to write good documentation and to explain analysis results clearly to technical and non-technical audiences.
• Familiarity with deploying machine learning models in production cloud environments (GCP or AWS).
Essential core skills:
- Communication
- Collaboration
- Organisation
- Delivering Results
- Solutions Focused
- Adaptability
-
· 15 views · 1 application · 14d
Data Architect
Full Remote · Ukraine · 4 years of experience · English - B2
PwC is a global network of more than 370,000 professionals in 149 countries that turns challenges into opportunities. We create innovative solutions in audit, consulting, tax and technology, combining knowledge from all over the world.
PwC SDC Lviv, opened in 2018, is part of this global space. It is a place where technology is combined with team spirit, and ambitious ideas find their embodiment in real projects for Central and Eastern Europe.
What do we guarantee?
- Work format: Remote or in a comfortable office in Lviv - you choose.
- Development: Personal development plan, mentoring, English and Polish language courses.
- Stability: Official employment from day one, annual review of salary and career prospects.
- Corporate culture: Events that unite the team and a space where everyone can be themselves.
Join us as a Data Architect / Lead and play a key role in shaping the data foundation behind our next generation of analytics and AI solutions. In this position, you'll define the architecture vision for our modern data ecosystem, guide a growing team of Data Engineers, and build cloud-native platforms that unlock enterprise-wide insights. This is a high-impact role where you'll work closely with business and technology leaders, influence strategic decisions, and drive the adoption of advanced analytics and Generative AI across the organization.
What You'll Do:
Lead & Strategize
- Lead and mentor Data Engineers, fostering innovation and continuous improvement.
- Own the data architecture vision aligned with business goals.
- Partner with executives and stakeholders to turn strategic needs into scalable data solutions.
Architect & Build
- Design and optimize modern data platforms using Azure Data Lake, Databricks, SQL Server, Microsoft Fabric, and NoSQL.
- Build robust, scalable data pipelines across diverse data sources.
- Implement strong data quality, governance, and security practices.
- Support advanced analytics, machine learning, and AI-based solutions.
Enable Insights
- Build solutions that deliver accurate, timely insights for decision-makers.
- Collaborate on Power BI dashboards and executive reporting.
- Integrate Generative AI into insight-generation workflows.
Requirements:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- 4+ years in data engineering, architecture, or analytics roles.
- Experience leading Data Engineering teams in enterprise settings will be a plus.
- Strong skills in Azure data services, Databricks, SQL, NoSQL, and Microsoft Fabric.
- Hands-on Power BI and enterprise reporting experience.
- Proven ability to build data pipelines and enforce data quality.
- Excellent communication skills, especially with executive stakeholders.
- Relevant certifications (Azure Data Engineer, Databricks, Fabric Analytics, etc.) are a plus.
Policy statements:
https://www.pwc.com/ua/uk/about/privacy.html
-
· 10 views · 1 application · 14d
Data Architect (AWS) (IRC286424)
Full Remote · Croatia, Poland, Romania, Slovakia, Ukraine · 10 years of experience · English - B2
Description
The client is a pioneer in medical devices for less invasive surgical procedures, ranking as a leader in the market for coronary stents. The company's medical devices are used in a variety of interventional medical specialties, including interventional cardiology, peripheral interventions, vascular surgery, electrophysiology, neurovascular intervention, oncology, endoscopy, urology, gynecology, and neuromodulation.
The client's mission is to improve the quality of patient care and the productivity of health care delivery through the development and advocacy of less-invasive medical devices and procedures. This is accomplished through the continuing refinement of existing products and procedures and the investigation and development of new technologies that can reduce risk, trauma, cost, procedure time and the need for aftercare.
Requirements
Boston Scientific is seeking a highly motivated R&D Data Engineer to support our R&D team in data management and development of complex electro-mechanical medical device systems. In this role you will use your technical and collaboration skills alongside your passion for data, innovation, and continuous improvement to help drive our product development forward.
• Design a systems-level architecture for clinical, device, and imaging data and pipelines to support machine learning & classical algorithm development throughout the product lifecycle.
• Ensure the architecture supports high-throughput image ingestion, indexing, and retrieval.
• Advance conceptual, logical, and physical data models for structured, semi-structured, and unstructured data.
• Help define and document data standards and definitions.
• Implement governance frameworks that apply healthcare and data regulations to the data architecture (HIPAA, FDA Part 11, GDPR, etc.).
• Perform strategic validation tasks of data management tools and platforms.
• Collaborate closely with data scientists, cloud data engineers, algorithm engineers, clinical engineers, software engineers and systems engineers locally and globally.
• Investigate, research, and recommend appropriate software designs, machine learning operations, and tools for dataset organization, controls, and traceability.
• In all actions, lead with integrity and demonstrate a primary commitment to patient safety and product quality by maintaining compliance with all documented quality processes and procedures.
Job responsibilities
Required Qualifications
• Bachelor's degree or higher in Computer Science, Software Engineering, Data Science, Biomedical Engineering or a related field
• 6+ years of relevant work experience with a Bachelor's degree
• 3+ years of relevant work experience with a Master's or PhD
• 4+ years of consistent coding in Python
• Strong understanding and use of relational databases and clinical data models
• Experience working with medical imaging data (DICOM) and computer vision algorithms and tools
• Experience with AWS and cloud technologies and AWS DevOps tools
• Experience with creating and managing CI/CD pipelines in AWS
• Experience with Infrastructure as Code (IaC) using Terraform, CloudFormation or AWS CDK
• Excellent organizational, communication, and collaboration skills
• Foundational knowledge in machine learning (ML) operations and imaging ML pipelines
Preferred Qualifications
• Experience with software validation in a regulated industry
• Experience with cloud imaging tools (e.g., AWS HealthImaging, Azure Health Data Services)
• Working knowledge of data de-identification/pseudonymization methods
• Experience manipulating tabular metadata using SQL and Python's Pandas library
• Experience with the Atlassian tool chain
• Experience with data and annotation version control tools and processes
• Knowledge of HIPAA, FDA regulations (21 CFR Part 11), and GDPR for medical device data governance.
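On the de-identification/pseudonymization point above, a minimal sketch using a keyed hash (HMAC-SHA256 from Python's standard library): the same key maps the same identifier to the same pseudonym, so records stay linkable, while the original ID is not recoverable without the key. The key and field names are illustrative:

```python
import hashlib
import hmac

def pseudonymize(patient_id, secret_key):
    """Replace an identifier with a stable keyed hash (HMAC-SHA256),
    truncated for readability; rotate and protect the key in practice."""
    return hmac.new(secret_key, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"patient_id": "MRN-000123", "study": "CT-CHEST"}
key = b"example-key-rotate-in-production"
record["patient_id"] = pseudonymize("MRN-000123", key)
print(record["patient_id"])
```

Real DICOM de-identification also strips or remaps many other metadata tags, per the applicable regulations.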
-
· 12 views · 0 applications · 14d
Data Architect (Azure Platform) IRC279265
Full Remote · Ukraine, Poland, Romania, Croatia, Slovakia · 10 years of experience · English - B2
Description
As the Data Architect, you will be the senior technical visionary for the Data Platform. You will be responsible for the high-level design of the entire solution, ensuring it is scalable, secure, and aligned with the company's long-term strategic goals. Your decisions will form the technical foundation upon which the entire platform is built, from initial batch processing to future real-time streaming capabilities.
Requirements
Required Skills (Must-Haves)
• Cloud Architecture: Extensive experience designing and implementing large-scale data platforms on Microsoft Azure.
• Expert Technical Knowledge: Deep, expert-level understanding of the Azure data stack, including ADF, Databricks, ADLS, Synapse, and Purview.
• Data Concepts: Mastery of data warehousing, data modeling (star schemas), data lakes, and both batch and streaming architectural patterns.
• Strategic Thinking: Ability to align technical solutions with long-term business strategy.
Nice-to-Have Skills:
• Hands-on Coding Ability: Proficiency in Python/PySpark, allowing for the creation of architectural proofs-of-concept.
• DevOps & IaC Acumen: Deep understanding of CI/CD for data platforms and experience with Infrastructure as Code (Bicep/Terraform); experience with Azure DevOps for Big Data services.
• Azure Cost Management: Experience with FinOps and optimizing the cost of Azure data services.
Job responsibilities
• End-to-End Architecture Design: Design and document the complete, end-to-end data architecture, encompassing data ingestion, processing, storage, and analytics serving layers.
• Technology Selection & Strategy: Make strategic decisions on the use of Azure services (ADF, Databricks, Synapse, Event Hubs) to meet both immediate MVP needs and future scalability requirements.
• Define Standards & Best Practices: Establish data modeling standards, development best practices, and governance policies for the engineering team to follow.
• Technical Leadership: Provide expert technical guidance and mentorship to the data engineers and BI developers, helping them solve the most complex technical challenges.
• Stakeholder Communication: Clearly articulate the architectural vision, benefits, and trade-offs to technical teams, project managers, and senior business leaders.
-
· 20 views · 0 applications · 13d
ML Architect / Principal Data Scientist, Healthcare Business Unit (EMEA)
Full Remote · Poland, Romania, Slovakia, Croatia · 7 years of experience · English - C1
Job Description
- Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Applied Mathematics, or related field.
- 7+ years of experience in data science or machine learning roles, ideally with exposure to healthcare projects.
- Strong knowledge of ML frameworks such as scikit-learn, TensorFlow, PyTorch, XGBoost, or LightGBM.
- Proficiency in Python for data science and related libraries (NumPy, pandas, matplotlib, seaborn, etc.).
- Experience working with large datasets and data processing frameworks (e.g., Spark, Dask, SQL).
- Understanding of MLOps concepts and tools (e.g., MLflow, Kubeflow, Vertex AI, Azure ML).
- Familiarity with cloud environments (Azure, AWS, or GCP) for training and deploying models.
- Experience with model interpretability, fairness, and explainability techniques.
- Strong communication and visualization skills for storytelling with data.
- English proficiency at Upper-Intermediate level or higher.
Preferred Qualifications (Nice to Have)
- Experience working with medical data (EHR, imaging, wearables, clinical trials, etc.).
- Familiarity with healthcare regulations related to data and AI (e.g., HIPAA, GDPR, FDA AI/ML guidelines).
- Knowledge of FHIR, HL7, or other healthcare interoperability standards.
- Practical experience with deep learning models (e.g., CNNs for imaging, transformers for NLP).
- Involvement in presales, proposal writing, or technical advisory work.
Job Responsibilities
- Lead the design and development of AI/ML solutions across HealthTech and MedTech projects.
- Participate in technical presales by analyzing business cases and identifying opportunities for AI/ML application.
- Build and validate predictive models, classification systems, NLP workflows, and optimization algorithms.
- Collaborate with software engineers, cloud architects, and QA to integrate models into scalable production systems.
- Define and guide data acquisition, preprocessing, labeling, and augmentation strategies.
- Contribute to the development of GlobalLogic's healthcare-focused AI accelerators and reusable components.
- Present technical solutions to clients, both business and technical audiences.
- Support model monitoring, drift detection, and retraining pipelines in deployed systems.
- Ensure adherence to privacy, security, and compliance standards for data and AI usage.
- Author clear documentation and contribute to knowledge sharing within the Architects Team.
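The model-monitoring and drift-detection responsibility above is often implemented with a simple distribution-comparison metric. As an illustrative sketch only (the bin fractions and the conventional 0.1/0.25 thresholds are hypothetical, not requirements of this posting), here is a minimal Population Stability Index (PSI) check in pure Python:

```python
import math

def population_stability_index(expected, actual, eps=1e-6):
    """PSI between two binned distributions (each a list of bin fractions summing to ~1)."""
    return sum((a - e) * math.log((a + eps) / (e + eps))
               for e, a in zip(expected, actual))

# Hypothetical bin fractions: a feature's training-time vs. live distribution
train_bins = [0.20, 0.20, 0.20, 0.20, 0.20]
live_bins = [0.22, 0.18, 0.20, 0.25, 0.15]

psi = population_stability_index(train_bins, live_bins)
# Common rule of thumb: < 0.1 stable, 0.1-0.25 moderate drift, > 0.25 significant drift
status = "stable" if psi < 0.1 else "drifting"
print(f"PSI = {psi:.4f} -> {status}")
```

In practice a retraining pipeline would compute this per feature on a schedule and alert or trigger retraining when the threshold is crossed.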
Department/Project Description
Join GlobalLogic's Architects Team within the Healthcare Business Unit, supporting clients across the EMEA region. This strategic role focuses on engaging new clients, solving real-world healthcare challenges, and launching data-driven AI/ML projects. You will work closely with clients and internal stakeholders to translate complex business needs into impactful data science solutions. If you're passionate about applying data science in a meaningful domain and want to shape the future of healthcare, we'd love to hear from you.
· 80 views · 30 applications · 12d
Data Scientist
Full Remote · Worldwide · Product · 3 years of experience · English - B1
Almus is looking for a Data Scientist to join our Analytics team and build production-grade machine learning models that directly impact marketing and business performance.
You will work on end-to-end ML solutions, from data and features to deployment and monitoring, focusing on improving LTV prediction quality, optimizing ML-driven costs, and driving key metrics such as LTV, ROAS, retention, and CAC. This is an individual contributor role with strong ownership, close collaboration with Marketing, Product, and Data teams, and a clear focus on real business impact.
Apply to join Almus and take ownership of high-impact data initiatives!
Responsibilities
- Design, develop, and deploy machine learning models to production
- Improve product and business decision-making through data-driven approaches
- Build and evolve end-to-end ML pipelines (data → features → model → inference → monitoring)
- Drive measurable impact on key product and commercial metrics
- Standardize ML approaches within the team (best practices, documentation, reproducibility)
- Provide technical input to the architecture of analytics and ML infrastructure
- Develop and deploy models that drive growth in LTV, ROAS, retention, and CAC
- Influence performance and lifecycle marketing strategy
- Act as a domain expert and collaborate closely with Marketing, Product, and Data Engineering teams
What We Look For
- 3+ years of experience as a Data Scientist / ML Engineer
- Experience working with mobile subscription-based products
- Strong Python skills (production-level code)
- Solid knowledge of classical machine learning algorithms and practical experience applying them
- Experience with feature engineering, model evaluation, and bias–variance trade-offs
- Hands-on experience with marketing models such as LTV, churn, cohort, and funnel modeling
- Experience with attribution, incrementality, and uplift modeling
- Strong SQL skills and experience working with analytical datasets
- Experience with production ML systems and A/B testing
English level: Intermediate+
Nice to have
- Experience with BigQuery
- MLOps experience (Docker, CI/CD, model registries)
- Experience working with performance marketing data (Meta, Google Ads, Adjust)
- Knowledge of causal inference
- Experience with AutoML and Bayesian models
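The attribution, incrementality, and uplift modeling items above all come down to estimating treatment effects. As a hedged sketch (segment names and counts are hypothetical), the simplest uplift estimate is the conversion-rate difference between treated and control users per segment; production systems typically layer two-model or meta-learner approaches on ML classifiers:

```python
from collections import defaultdict

# Hypothetical user records: (segment, treated?, converted?)
records = [
    ("ios", True, True), ("ios", True, False), ("ios", True, True),
    ("ios", False, False), ("ios", False, True),
    ("android", True, False), ("android", True, True),
    ("android", False, False), ("android", False, False),
]

def segment_uplift(records):
    """Naive per-segment uplift: treated conversion rate minus control conversion rate."""
    stats = defaultdict(lambda: {"t": [0, 0], "c": [0, 0]})  # [conversions, totals]
    for segment, treated, converted in records:
        arm = stats[segment]["t" if treated else "c"]
        arm[0] += int(converted)
        arm[1] += 1
    return {
        segment: arms["t"][0] / arms["t"][1] - arms["c"][0] / arms["c"][1]
        for segment, arms in stats.items()
    }

print(segment_uplift(records))
```

A positive uplift for a segment suggests the campaign is incremental there; a value near zero suggests spend could be reallocated.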
We Offer
- Exciting challenges and growth prospects together with an international company
- High decision-making speed and diverse projects
- Flexibility in approaches, no processes for the sake of processes
- Effective and friendly communication at any level
- Highly competitive compensation package that recognizes your expertise and experience, plus a Performance Review practice to exchange feedback and discuss terms of cooperation
- Flexible schedule, opportunity to work in a stylish and comfortable office or remotely
- Respect for work-life balance (holidays, sick days - of course)
- Bright corporate events and gifts for employees
- Additional medical insurance
- Compensation for specialized training and conference attendance
- Restaurant lunches at the company's expense for those working in the office, endless supplies of delicious food all year round
· 32 views · 2 applications · 7d
Senior Data Science Engineer
Full Remote · Poland, Spain, Portugal, Romania, Bulgaria · 5 years of experience · English - C1
We are seeking a skilled Senior Data Science Engineer to design, build, and deploy production machine learning solutions for an enterprise Fleet Cascading & Optimization Platform managing 46,000+ vehicles across 545+ locations. In this role, you will develop and operationalize demand forecasting, cascading optimization, contract intelligence (NLP/Vision), and out-of-spec prediction models with a strong focus on explainability and business impact. You will own the end-to-end ML lifecycle, from experimentation and model development to scalable production deployment on AWS, working closely with engineering and business stakeholders to deliver reliable, data-driven outcomes.
Must-Have Requirements:
- Programming & ML Frameworks: Python; PyTorch or TensorFlow; scikit-learn; XGBoost or LightGBM; pandas; NumPy
- Time Series & Forecasting: BSTS; Prophet; Temporal Fusion Transformer (TFT); hierarchical forecasting with MinT reconciliation
- Optimization: Linear Programming and MILP using tools such as PuLP and OR-Tools; constraint satisfaction; min-cost flow optimization
- AWS ML Stack: Amazon SageMaker (Training Jobs, Endpoints, Model Monitor, Clarify, Feature Store, Pipelines)
Nice-to-have:
- NLP & Document AI: Amazon Textract; LayoutLMv3; Retrieval-Augmented Generation (RAG) pipelines; Amazon Bedrock (Claude); OpenSearch vector databases
- Advanced Machine Learning: Graph Neural Networks (GNNs); Deep Reinforcement Learning; Survival Analysis (Cox Proportional Hazards, XGBoost-Survival); attention-based models
- Explainability & MLOps: SHAP, LIME, Captum; MLflow; A/B testing; champion/challenger frameworks; model and data drift detection
Core Responsibilities:
- Build demand forecasting models (XGBoost, BSTS, Temporal Fusion Transformer) with hierarchical reconciliation across 545+ locations
- Develop cascading optimization using MILP/Min-Cost Flow solvers (PuLP, OR-Tools, Gurobi) and Hybrid ML+Optimization pipelines
- Implement document intelligence pipeline: Textract + LayoutLMv3 for document extraction, RAG with Bedrock (Claude) for semantic reasoning
- Deploy models on SageMaker with MLOps (Model Monitor, Feature Store, Pipelines); implement SHAP/LIME explainability
Models You'll Build:
- Demand Forecasting: Gradient-boosted models (XGBoost), Bayesian Structural Time Series (BSTS), and Temporal Fusion Transformers (TFT), including hierarchical reconciliation
- Cascading Optimization: Mixed-Integer Linear Programming (MILP) and Min-Cost Flow models, evolving to hybrid ML + solver approaches and advanced Graph Neural Network (GNN) and Deep Reinforcement Learning (DRL) solutions
- Document Intelligence: Automated document extraction using Amazon Textract and LayoutLMv3, advancing to Retrieval-Augmented Generation (RAG) pipelines with Amazon Bedrock and Vision-Language Models
- Survival & Out-of-Spec Prediction: Kaplan–Meier estimators, Cox Proportional Hazards models, and XGBoost-Survival techniques
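The cascading optimization described above is, at its core, an assignment problem that solvers like PuLP, OR-Tools, or Gurobi handle at fleet scale. As a toy illustration only (the 3x3 cost matrix is hypothetical, and brute force replaces a real MILP solver purely to show the objective), here is the minimum-cost one-to-one vehicle-to-location assignment:

```python
from itertools import permutations

# Hypothetical relocation costs: cost[v][l] = cost of cascading vehicle v to location l
cost = [
    [4, 1, 3],
    [2, 0, 5],
    [3, 2, 2],
]

def brute_force_assignment(cost):
    """Exhaustively search all one-to-one assignments for the minimum total cost."""
    n = len(cost)
    best_assign, best_cost = None, float("inf")
    for perm in permutations(range(n)):  # perm[v] = location assigned to vehicle v
        total = sum(cost[v][perm[v]] for v in range(n))
        if total < best_cost:
            best_assign, best_cost = perm, total
    return best_assign, best_cost

assign, total = brute_force_assignment(cost)
print(assign, total)  # -> (1, 0, 2) 5
```

Brute force is factorial in the fleet size, which is exactly why the role calls for MILP and min-cost-flow formulations: they encode the same objective with constraints a solver can handle for tens of thousands of vehicles.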
What we offer:
- Continuous learning and career growth opportunities
- Professional training and English/Spanish language classes
- Comprehensive medical insurance
- Mental health support
- Specialized benefits program with compensation for fitness activities, hobbies, pet care, and more
- Flexible working hours
- Inclusive and supportive culture
About Us:
Established in 2011, Trinetix is a dynamic tech service provider supporting enterprise clients around the world.
Headquartered in Nashville, Tennessee, we have a global team of over 1,000 professionals and delivery centers across Europe, the United States, and Argentina. We partner with leading global brands, delivering innovative digital solutions across Fintech, Professional Services, Logistics, Healthcare, and Agriculture.
Our operations are driven by a strong business vision, a people-first culture, and a commitment to responsible growth. We actively give back to the community through various CSR activities and adhere to international principles for sustainable development and business ethics.
To learn more about how we collect, process, and store your personal data, please review our Privacy Notice: https://www.trinetix.com/corporate-policies/privacy-notice
· 17 views · 0 applications · 7d
Data Architect (Azure Platform)
Full Remote · Ukraine · 10 years of experience · English - B2
Description
As the Data Architect, you will be the senior technical visionary for the Data Platform. You will be responsible for the high-level design of the entire solution, ensuring it is scalable, secure, and aligned with the company's long-term strategic goals. Your decisions will form the technical foundation upon which the entire platform is built, from initial batch processing to future real-time streaming capabilities.
Requirements
Required Skills (Must-Haves)
- Cloud Architecture: Extensive experience designing and implementing large-scale data platforms on Microsoft Azure.
- Expert Technical Knowledge: Deep, expert-level understanding of the Azure data stack, including ADF, Databricks, ADLS, Synapse, and Purview.
- Data Concepts: Mastery of data warehousing, data modeling (star schemas), data lakes, and both batch and streaming architectural patterns.
- Strategic Thinking: Ability to align technical solutions with long-term business strategy.
Nice-to-Have Skills:
- Hands-on Coding Ability: Proficiency in Python/PySpark, allowing for the creation of architectural proofs-of-concept.
- DevOps & IaC Acumen: Deep understanding of CI/CD for data platforms, experience with Infrastructure as Code (Bicep/Terraform), and experience with Azure DevOps for big data services.
- Azure Cost Management: Experience with FinOps and optimizing the cost of Azure data services.
Job responsibilities
- End-to-End Architecture Design: Design and document the complete, end-to-end data architecture, encompassing data ingestion, processing, storage, and analytics serving layers.
- Technology Selection & Strategy: Make strategic decisions on the use of Azure services (ADF, Databricks, Synapse, Event Hubs) to meet both immediate MVP needs and future scalability requirements.
- Define Standards & Best Practices: Establish data modeling standards, development best practices, and governance policies for the engineering team to follow.
- Technical Leadership: Provide expert technical guidance and mentorship to the data engineers and BI developers, helping them solve the most complex technical challenges.
- Stakeholder Communication: Clearly articulate the architectural vision, benefits, and trade-offs to technical teams, project managers, and senior business leaders.
· 37 views · 6 applications · 6d
Data Scientist to $6000
Full Remote · Countries of Europe or Ukraine · 5 years of experience · English - B2
We are looking for an experienced Data Scientist to join our team.
Requirements:
- 5–8+ years of experience in Data Science/Analytics
- Strong background in Mathematics, Statistics, or related field
- Solid knowledge of statistical inference and hypothesis testing (p-values, Z-tests, Chi-Square)
- Experience with machine learning for insight generation (e.g., clustering, segmentation, prediction)
- Strong skills in Python and SQL
- Strong stakeholder management and communication skills
- Proven experience creating executive-ready reports and PowerPoint presentations
- Ability to explain complex analytics to non-technical audiences
- Upper-Intermediate or higher level of English (B2+)
Responsibilities
- Lead end-to-end analytics: from problem definition to insights and recommendations
- Apply statistical analysis and machine learning for segmentation, trend analysis, and predictive insights
- Design and interpret statistical tests (p-values, Z-tests, Chi-Square, confidence intervals)
- Translate analytical results into clear business narratives
- Prepare analytical reports and executive-level PowerPoint presentations (core part of the role)
- Partner with business teams to align analytics with commercial objectives
- Query, clean, and analyze data using SQL and Python
- Act as a trusted analytics partner; mentor junior team members when needed
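The statistical tests named in this role (p-values, Z-tests, Chi-Square, confidence intervals) can be sketched in a few lines. As an illustrative example only (the conversion counts are hypothetical), here is a two-sided two-proportion Z-test in pure Python, using `math.erf` for the standard normal CDF:

```python
import math

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion Z-test; returns (z statistic, p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Standard normal CDF via the error function; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical A/B conversion counts: 120/1000 vs. 150/1000
z, p = two_proportion_z_test(120, 1000, 150, 1000)
print(f"z = {z:.3f}, p = {p:.4f}")
```

In a report, the p-value would be paired with the effect size and a confidence interval so non-technical stakeholders see both significance and practical impact.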
We offer:
- A full-time job and a long-term contract
- Flexible working hours
- Paid vacation and sick leave
- Managing your taxes and accounting
- Career and professional growth opportunities
- Optional benefits package that includes Health insurance, Gym membership, English courses, compensation of certification, courses, and training
- Creative and lively team of IT specialists, adequate management, and zero unnecessary bureaucracy
· 16 views · 0 applications · 6d
Senior Data Scientist
Ukraine · Product · 5 years of experience · English - B2
Data Science UA is a service company with strong data science and AI expertise. Our journey began in 2016 with uniting top AI talents and organizing the first Data Science tech conference in Kyiv. Over the past 9 years, we have diligently fostered one of the largest Data Science & AI communities in Europe.
About the client:
The company is a trailblazer in the world of data-driven advertising, known for its innovative approach to optimizing ad placements and campaign effectiveness through advanced analytics and machine learning techniques. Our mission is to revolutionize the advertising sector by enabling brands to reach their audiences more effectively.
About the role:
We are seeking an experienced and motivated Senior Data Scientist to join our dynamic team. The ideal candidate will have deep expertise in supervised learning, reinforcement learning, and optimization techniques. You will play a pivotal role in developing and implementing advanced machine learning models, driving actionable insights, and optimizing our advertising solutions.
This position is based in Ukraine. The team primarily works remotely, with occasional in-person meetings in the Kyiv or Lviv office.
Responsibilities:
- Develop and implement advanced supervised and reinforcement learning models to improve ad targeting and campaign performance.
- Collaborate with cross-functional teams to identify opportunities for leveraging machine learning and optimization techniques to solve business problems.
- Conduct extensive data analysis and feature engineering to prepare datasets for machine learning tasks.
- Apply optimization algorithms to enhance the effectiveness and efficiency of advertising campaigns.
- Evaluate and refine existing models to enhance their accuracy, efficiency, and scalability.
- Utilize statistical techniques and machine learning algorithms to analyze large and complex datasets.
- Communicate findings and recommendations effectively to both technical and non-technical stakeholders.
- Stay updated with the latest advancements in machine learning, reinforcement learning, and optimization techniques.
- Work with engineering teams to integrate models into production systems.
- Monitor, troubleshoot, and improve the performance of deployed models.
- Mentor junior data scientists and contribute to the continuous improvement of the data science practice within the company.
Requirements:
- 5+ years of experience in data science or machine learning roles, with a strong focus on supervised learning, reinforcement learning, and optimization techniques.
- Technical Skills:
- Proficiency in Python.
- Strong understanding of working with relational databases and SQL.
- Experience with machine learning libraries such as scikit-learn, TensorFlow, PyTorch, or similar.
- Deep understanding of statistical modeling and supervised learning algorithms (e.g., linear regression, logistic regression, decision trees, random forests, SVMs, gradient boosting, neural networks).
- Hands-on experience with reinforcement learning algorithms and frameworks like OpenAI Gym.
- Practical experience with optimization algorithms (linear, non-linear, combinatorial, etc.).
- Hands-on experience with data manipulation tools and libraries (e.g., pandas, NumPy).
- Familiarity with cloud services, specifically AWS, is a plus.
- Practical experience building and managing cloud-based ML pipelines using AWS services (e.g. SageMaker, Bedrock) is a plus.
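The reinforcement learning requirement above is commonly introduced through bandit problems in ad selection. As a hedged sketch (the click-through rates, step count, and epsilon are hypothetical, and real systems would use full RL frameworks rather than this toy loop), here is an epsilon-greedy multi-armed bandit in pure Python:

```python
import random

def epsilon_greedy_bandit(true_ctrs, steps=5000, epsilon=0.1, seed=0):
    """Simulate epsilon-greedy ad selection over arms with hypothetical CTRs."""
    rng = random.Random(seed)
    n = len(true_ctrs)
    counts = [0] * n
    values = [0.0] * n  # running mean reward per arm
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n)  # explore: random arm
        else:
            arm = max(range(n), key=lambda a: values[a])  # exploit: best estimate
        reward = 1.0 if rng.random() < true_ctrs[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
    return counts, values

counts, values = epsilon_greedy_bandit([0.02, 0.05, 0.11])
print(counts, [round(v, 3) for v in values])
```

Over enough steps, the pull counts concentrate on the highest-CTR arm while epsilon keeps a floor of exploration, which is the trade-off supervised models alone cannot express.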
- Education:
- Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, Engineering, or a related field. A PhD is a plus.
Other Skills:
- Strong analytical and problem-solving skills.
- Excellent communication skills, with the ability to clearly articulate complex concepts to diverse audiences.
- Ability to work in a fast-paced environment and manage multiple priorities.
- Strong organizational skills and attention to detail.
- Ability to mentor and guide junior data scientists.
- Must be able to communicate with U.S.-based teams
The company offers:
- An opportunity to be at the forefront of advertising technology, impacting major marketing decisions.
- A collaborative, innovative environment where your contributions make a difference.
- The chance to work with a passionate team of data scientists, engineers, product managers, and designers.
- A culture that values learning, growth, and the pursuit of excellence.