- Flat structure. There are no "bosses" or "subordinates".
- We hire people to the company, not to a project. If the project (or your work on it) ends, you move to another project or to paid "Idle" time.
- Flexible schedule, the ability to change projects, work from home, and try yourself in different roles.
- Minimal bureaucracy and micromanagement, convenient corporate services.
· 123 views · 5 applications · 10d
QA Trainee
Hybrid Remote · Ukraine (Kyiv) · 0.5 years of experience · B2 - Upper Intermediate
Position overview
DataArt is a global software engineering firm and a trusted technology partner for market leaders and visionaries. Our world-class team designs and engineers data-driven, cloud-native solutions to deliver immediate and enduring business value.
We promote a culture of radical respect, prioritizing your personal well-being as much as your expertise. We stand firmly against prejudice and inequality, valuing each of our employees equally.
We respect the autonomy of others before all else, offering remote, onsite, and hybrid work options. Our Learning and development centers, R&D labs, and mentorship programs encourage professional growth.
Our long-term approach to collaboration with clients and colleagues alike focuses on building partnerships that extend beyond one-off projects. We provide the ability to switch between projects and technology stacks, creating opportunities for exploration through our learning and networking systems to advance your career.
Gain practical experience, enhance your skills, and master independent work on real IT projects.
Demonstrate your expertise during the trial period, and if you meet the professional standards, you'll earn the Junior QA qualification, with the opportunity to join DataArt full-time.
Requirements
- Basic knowledge of manual testing (types of tests, methodologies of organizing testing processes, test design techniques)
- Basic knowledge of databases and hands-on skills with at least one database management system (MSSQL, Oracle, MySQL, PostgreSQL, etc.)
- Good interpersonal skills
- Excellent spoken English
Nice to have
- Practical and/or theoretical knowledge of API and mobile testing
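As a small, informal illustration of the test design techniques listed in the requirements, the sketch below applies boundary-value analysis to a hypothetical age-validation rule (the function, its limits, and the cases are invented for the example, not taken from any DataArt project):

```python
# Boundary-value analysis: for a valid input range, test the edges and
# the values just outside them rather than every value.
# The function and its 18-65 rule are hypothetical.

def is_valid_age(age: int) -> bool:
    """Accepts ages from 18 to 65 inclusive (example business rule)."""
    return 18 <= age <= 65

# Just below, on, and just above each boundary of the range.
boundary_cases = {
    17: False,  # just below lower bound
    18: True,   # lower bound
    19: True,   # just above lower bound
    64: True,   # just below upper bound
    65: True,   # upper bound
    66: False,  # just above upper bound
}

for age, expected in boundary_cases.items():
    assert is_valid_age(age) == expected, f"failed for age={age}"
print("all boundary cases passed")
```

The same six-point pattern generalizes to any closed numeric range, which is why boundary-value analysis is usually the first test design technique taught alongside equivalence partitioning.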
· 34 views · 0 applications · 26d
System Administrator
Office Work · Ukraine (Dnipro) · 1 year of experience · B1 - Intermediate
Client
DataArt is a global software engineering firm and a trusted technology partner for market leaders and visionaries. Our world-class team designs and engineers data-driven, cloud-native solutions to deliver immediate and enduring business value.
Technology stack
Operating Systems: Microsoft Windows, macOS, Linux
Network Technologies: AD, DNS, DHCP, NAT, VPN, VLAN, Group Policies
Responsibilities
- Workstation Setup & Maintenance: Configure and deploy hardware and software for new employees
- Network Configuration & Support: Ensure seamless connectivity and troubleshoot network issues
- Peripheral Device Management: Set up printers, scanners, and other peripherals
- System & Infrastructure Monitoring: Monitor system performance and promptly address any disruptions
- User Support: Provide efficient technical support to end-users, troubleshooting issues and answering queries
- Technical Documentation: Maintain accurate records of system configurations and changes
- Routine Maintenance: Regularly update software, perform backups, and manage incidents
- Security & Compliance: Implement security measures and adhere to industry standards
Requirements
- Computer Proficiency: Excellent understanding of hardware and software troubleshooting
- Microsoft Windows Expertise: Familiarity with Windows Server and desktop operating systems
- Network Knowledge: Strong understanding of Active Directory (AD), DNS, DHCP, NAT, VPN, VLAN, and Group Policies
- Cross-Platform Skills: Comfortable working with Windows, macOS, and Linux
- Soft Skills: Effective communication with internal customers and colleagues
- Adaptability & Teamwork: Initiative, ability to multitask, and desire to collaborate
- English Proficiency: Excellent verbal and written communication skills
Nice to have
- Virtualization: Experience with VMware ESXi
- Scripting: Proficiency in PowerShell scripts for task automation
- Server Administration: Experience managing Windows Servers
- Access Management: Managing access rights and File Server service quotas
- Monitoring Tools: Experience with Zabbix and Syslog
- Certifications: CCNA, MCSA, or other relevant vendor certifications are highly valued
· 23 views · 1 application · 26d
Dynamics 365 Developer
Hybrid Remote · Ukraine (Dnipro, Ivano-Frankivsk, Kyiv + 4 more cities) · 4 years of experience · B1 - Intermediate
Client
Our client is the UK's development finance institution, dedicated to supporting companies in developing countries and driving sustainable economic growth.
Position overview
We are seeking a skilled Dynamics 365 Developer to design and build a custom business application using Microsoft Dynamics 365 and Dataverse. The ideal candidate will have hands-on experience with the Power Platform, strong knowledge of Dataverse schema design, and the ability to integrate business logic and workflows into scalable, user-friendly solutions.
Responsibilities
- Design and develop a custom application using Dynamics 365 and Dataverse.
- Customize forms, views, dashboards, and business process flows.
- Implement business logic using Power Automate, JavaScript, and Plugins (C#).
- Integrate external systems and services using Dataverse APIs, Azure Logic Apps, or custom connectors.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.
- Ensure application security, performance, and scalability.
- Provide documentation and user training as needed.
Requirements
- Proven experience with Microsoft Dynamics 365 CE/CRM and Power Platform.
- Strong understanding of Dataverse (formerly Common Data Service).
- Experience with Power Apps (Model-driven and/or Canvas).
- Proficiency in Power Automate, JavaScript, and C# Plugins.
- Familiarity with Azure services, REST APIs, and custom connectors.
- Ability to work independently and manage project timelines.
- Excellent communication and problem-solving skills.
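The Dataverse APIs mentioned in the responsibilities are exposed as an OData v4 web endpoint. As a hedged sketch only, the snippet below assembles a query URL and headers for a hypothetical organization; the org name, entity set, and token are placeholders, and no request is actually sent:

```python
# Build a Dataverse Web API (OData v4) query without sending it.
# "contoso", the entity set, and the bearer token are placeholders.
from urllib.parse import urlencode

def build_dataverse_query(org: str, entity_set: str,
                          select: list[str], top: int) -> tuple[str, dict]:
    base = f"https://{org}.crm.dynamics.com/api/data/v9.2/{entity_set}"
    # Keep the OData "$" prefix and the comma-separated column list readable.
    params = urlencode({"$select": ",".join(select), "$top": top}, safe="$,")
    headers = {
        "Authorization": "Bearer <access-token>",  # placeholder, not a real token
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        "Accept": "application/json",
    }
    return f"{base}?{params}", headers

url, headers = build_dataverse_query("contoso", "accounts",
                                     ["name", "accountid"], top=5)
print(url)
# https://contoso.crm.dynamics.com/api/data/v9.2/accounts?$select=name,accountid&$top=5
```

In a real integration the same URL would be issued through an authenticated HTTP client (or via Azure Logic Apps / a custom connector, as the posting notes), but the query shape is the part a Dynamics developer works with daily.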
· 26 views · 0 applications · 25d
Senior Data Engineer
Hybrid Remote · Ukraine (Dnipro, Kyiv, Lviv + 2 more cities) · 7 years of experience · B2 - Upper Intermediate
Client
Our client is a hedge fund sponsor that mainly manages pooled investment vehicles and typically invests in fixed income, private equity, rates, credit, and foreign exchange. The company operates offices in London, New York, and Hong Kong.
We are seeking an experienced Senior Data Engineer with 7+ years in asset management or financial services to join our team. The ideal candidate will have expertise in handling diverse datasets delivered via batch files, APIs, and streaming from both internal and external sources.
Responsibilities
- Onboard new datasets and develop data models using Snowflake and DBT
- Build and maintain data transformation pipelines
- Design and manage data orchestration and ETL workflows with Azure Data Factory
- Optimize queries and apply data warehousing best practices for large and complex datasets
- Collaborate with development teams using agile methodologies, DevOps, Git, and CI/CD pipelines
- Support cloud-based services, especially Azure Functions, KeyVault, and LogicApps
- Optionally develop APIs to serve data to internal or external stakeholders
Requirements
- 7+ years as a Data Engineer in asset management or financial services
- Expertise in Snowflake, DBT, and data pipeline orchestration tools (Azure Data Factory)
- Strong knowledge of SQL, Python, data modeling, and warehousing principles
- Familiarity with DevOps practices including CI/CD and version control (Git)
- Experience with Azure cloud services
Nice to have
- Industry knowledge of Security Master, IBOR, and Portfolio Management
· 24 views · 0 applications · 18d
AI and ML Engineer
Hybrid Remote · Ukraine (Dnipro, Kyiv, Lviv + 2 more cities) · 5 years of experience · B2 - Upper Intermediate
Position overview
We are seeking a highly skilled AI & ML Engineer to join our innovative team. In this role, you will lead the development, optimization, and deployment of advanced machine learning and artificial intelligence models. You will drive state-of-the-art research, build scalable pipelines, and collaborate with cross-functional teams to translate business needs into effective ML solutions.
This position offers the opportunity to work on cutting-edge technologies in a dynamic, fast-paced environment, contributing to impactful projects that power next-generation AI applications.
Responsibilities
- Design, implement, and optimize machine learning and AI models, including classification, regression, clustering, and agent tuning.
- Experiment with the latest methods, frameworks, and architectures to enhance model performance and efficiency.
- Develop robust, scalable ML pipelines for training, validation, and inference.
- Deploy ML models into production environments (cloud or on-prem), ensuring high reliability, low latency, and scalability.
- Apply MLOps best practices including CI/CD, monitoring, automated retraining, and model registry management.
- Partner with data engineering teams to source, clean, and transform large datasets for model training and inference.
- Ensure high data quality, perform feature engineering, and support real-time data integration processes.
- Work closely with data scientists, software engineers, and product managers to align ML solutions with business goals.
- Clearly communicate complex technical results to both technical and non-technical stakeholders.
- Provide technical guidance and mentorship to junior ML engineers and data scientists.
- Contribute to establishing team best practices, code reviews, and architectural decisions.
Requirements
- 5+ years of professional experience in machine learning, AI engineering, or related fields.
- Master's degree in Computer Science, Machine Learning, Physics, or related field; PhD preferred.
- Proficient in Python and ML frameworks such as TensorFlow, PyTorch, and Scikit-learn.
- Strong understanding of algorithms, statistics, probability, and linear algebra.
- Hands-on experience with data pipelines and ETL tools (e.g., Spark, AWS Lambda, AWS Glue).
- Practical experience with cloud platforms, preferably AWS.
- Solid software engineering fundamentals including version control (Git), testing, and design patterns.
- Demonstrated success in deploying ML models into production at scale.
- Familiarity with MLOps tools such as MLflow, Kubeflow, SageMaker, or Vertex AI.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work autonomously and collaboratively within a fast-paced, cross-functional team environment.
Nice to have
- Experience with agentic frameworks like LangChain or LangGraph
· 16 views · 1 application · 18d
Senior GenAI Data Scientist
Hybrid Remote · Ukraine (Dnipro, Kyiv, Lviv + 2 more cities) · 5 years of experience · B2 - Upper Intermediate
Client
Our client is a leading Fortune 500 financial technology company that provides comprehensive payment solutions and financial services across multiple continents. They process billions of transactions annually and serve millions of customers worldwide.
You'll collaborate with a world-class team of senior data scientists, ML engineers, and technology consultants from leading organizations in the fintech and cloud computing space. This diverse group brings together deep technical expertise, industry knowledge, and proven experience delivering mission-critical solutions at enterprise scale.
Position overview
We are seeking an experienced Senior Data Scientist with deep expertise in Generative AI implementations. This role is designed for seasoned data science professionals who have successfully transitioned their expertise into production GenAI environments - not for those simply exploring AI technologies.
Technology stack
AWS Bedrock, SageMaker, and comprehensive AI/ML service ecosystem
Vector databases and advanced RAG architectures
Enterprise-scale data processing and real-time model deployment systems
Automated CI/CD pipelines specifically designed for ML workflows
Responsibilities
- Design and implement data architectures for GenAI solutions across structured, semi-structured, and unstructured data sources
- Extract, prepare, and optimize data for consumption into AI platforms from data lakes and direct model ingestion
- Structure diverse data sources for proper ingestion into AI workflows and model training
- Develop and manage automated data streams and pipeline orchestration
- Collaborate with MLOps engineers to ensure seamless data flow for model training and inference
- Implement data quality monitoring and validation frameworks for GenAI applications
- Design feature engineering strategies specifically for Foundation Models and LLM implementations
- Scale proof-of-concepts to production-ready, enterprise-grade data solutions
Requirements
- Hands-on experience with diverse data sources (structured, semi-structured, unstructured) for AI platform integration
- Proven ability to extract, prepare, and structure data for consumption into AI platforms from data lakes or direct model ingestion
- Experience structuring various data sources for proper ingestion into AI workflows and Foundation Model training
- Advanced knowledge of automated data stream management and pipeline orchestration for AI/ML workloads
- Demonstrated experience building scalable data infrastructure supporting GenAI applications in production environments
- Strong background in AWS data services (S3, Glue, Kinesis, etc.) and integration with AI/ML platforms
- Advanced Python, SQL, and experience with big data technologies (Spark, Kafka, etc.)
- Proven track record of transitioning POCs to production-ready, enterprise-scale data solutions
- 5+ years of data science experience, including 2+ years of dedicated GenAI data engineering and preparation experience
- Availability during US Eastern Time (ET) business hours to collaborate with onsite team
Nice to have
- Bachelor's degree in Computer Science, Data Science, Engineering, Statistics, or related technical field (Master's preferred)
- AWS certifications (Data Analytics, Machine Learning Specialty, etc.)
- Experience with financial services or payment processing data systems
- Knowledge of data governance and compliance frameworks in regulated industries
· 16 views · 0 applications · 18d
Senior MLOps Engineer / AI/ML Developer
Hybrid Remote · Ukraine (Dnipro, Ivano-Frankivsk, Kyiv + 3 more cities) · 5 years of experience · B2 - Upper Intermediate
Client
Our client is a leading Fortune 500 financial technology company that provides comprehensive payment solutions and financial services across multiple continents. They process billions of transactions annually and serve millions of customers worldwide.
You'll collaborate with a world-class team of senior data scientists, ML engineers, and technology consultants from leading organizations in the fintech and cloud computing space. This diverse group brings together deep technical expertise, industry knowledge, and proven experience delivering mission-critical solutions at enterprise scale.
Position overview
We are seeking an experienced Senior MLOps Engineer with deep expertise in Generative AI implementations. This role is designed for seasoned ML engineering professionals who have successfully transitioned their expertise into production GenAI environments - not for those simply exploring AI technologies.
Technology stack
AWS Bedrock, SageMaker, and comprehensive AI/ML service ecosystem
Vector databases and advanced RAG architectures
Enterprise-scale data processing and real-time model deployment systems
Automated CI/CD pipelines specifically designed for ML workflows
Responsibilities
- Design and implement robust MLOps pipelines for GenAI solutions using AWS Bedrock platform
- Lead the selection, training, and fine-tuning of Foundation Models (FM) and Large Language Models (LLM) for specific business use cases
- Architect and deploy RAG (Retrieval-Augmented Generation) systems with vector databases
- Develop and optimize prompt engineering strategies for production environments
- Integrate AI-powered chatbots and conversational interfaces into existing business workflows
- Implement comprehensive automation frameworks across the AI/ML lifecycle
- Manage diverse data sources and ensure optimal data preparation for AI platform consumption
- Scale proof-of-concepts to production-ready, enterprise-grade solutions
Requirements
- Hands-on, production-level experience with GenAI solution implementation and management
- Proven experience selecting, training, and fine-tuning Foundation Models (FM) or Large Language Models (LLM) for specific business use cases
- Deep hands-on knowledge of Retrieval-Augmented Generation implementation and vector database management
- Advanced skills in designing and optimizing prompt engineering strategies for production AI applications
- Demonstrated experience integrating and deploying chatbots within business workflows and implementing AI automation frameworks
- Expert understanding of diverse data types (structured, semi-structured, unstructured) and their utilization within AI platforms
- Proven track record of transitioning POCs to production-ready, enterprise-scale solutions
- Strong AWS cloud services background and advanced Python skills with ML frameworks (TensorFlow, PyTorch, etc.)
- ML engineering/MLOps experience, including 2+ years of dedicated GenAI production experience
- Availability during US Eastern Time (ET) business hours to collaborate with the onsite team
Nice to have
- Bachelor's degree in Computer Science, Data Science, Engineering, or related technical field (Master's preferred)
- AWS certifications (Machine Learning Specialty, Solutions Architect, etc.)
- Experience with financial services or payment processing systems
· 26 views · 1 application · 18d
Data Modeler with expertise in Snowflake and SqlDBM
Hybrid Remote · Ukraine (Dnipro, Ivano-Frankivsk, Kyiv + 4 more cities) · 4 years of experience · B1 - Intermediate
Client
Our client revolutionizes the retail direct store delivery model by addressing key challenges like communication gaps, out-of-stocks, invoicing errors, and price inconsistencies to boost sales, profits, and customer loyalty with innovative technology and partnerships.
Position overview
We are seeking an experienced Data Modeler to lead logical and physical data modeling efforts across Raw, Conformed, and CDM layers within a medallion architecture. The ideal candidate will design dimensional (star/snowflake) and 3NF schemas where appropriate, implement and enforce modeling standards (naming conventions, data types, SCD strategies), and maintain thorough documentation using SqlDBM or equivalent tools.
Responsibilities
- Lead logical and physical data modeling across Raw, Conformed, and CDM layers following medallion architecture
- Design dimensional (star/snowflake) and 3NF schemas where appropriate
- Implement and enforce modeling standards, including naming conventions, data types, and SCD strategies
- Document models and processes using SqlDBM or equivalent tools
- Translate business requirements into canonical data models and source-to-target mappings
- Collaborate closely with engineers and analysts to ensure alignment and accuracy
- Design and optimize Snowflake schemas focusing on DDL, micro-partitions, clustering, RBAC, and cost-efficient patterns
- Manage Postgres schema design, perform tuning, and optimize queries
- Review and refactor PL/pgSQL code (bonus)
- Maintain strong data governance practices including version control, peer reviews, and data lineage tracking
Requirements
- Proven expertise in data modeling across multiple layers of medallion architecture
- Strong understanding of data modeling concepts (OLTP, Star Schema, Medallion architecture)
- Strong experience with dimensional modeling (star/snowflake) and 3NF design
- Skilled in Snowflake schema design, performance tuning, and security best practices
- Experience with Postgres schema design and query optimization
- Familiarity with data governance standards and tools
- Proficiency with SqlDBM/ERwin or equivalent modeling/documentation platforms
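As a minimal illustration of the SCD strategies mentioned in the responsibilities, the sketch below applies Type 2 (history-preserving) logic to an in-memory dimension table; the column names, keys, and records are invented for the example, and in practice this logic would live in DBT/SQL merge statements rather than Python:

```python
# Slowly Changing Dimension Type 2: instead of overwriting a changed
# attribute, close out the current row and insert a new current version.
# Column names and records below are hypothetical.
from datetime import date

def apply_scd2(dim_rows: list[dict], incoming: dict, as_of: date) -> list[dict]:
    key = incoming["customer_id"]
    for row in dim_rows:
        if row["customer_id"] == key and row["is_current"]:
            if row["city"] == incoming["city"]:
                return dim_rows          # no change, nothing to do
            row["valid_to"] = as_of      # close out the old version
            row["is_current"] = False
    dim_rows.append({                    # insert the new current version
        "customer_id": key,
        "city": incoming["city"],
        "valid_from": as_of,
        "valid_to": None,
        "is_current": True,
    })
    return dim_rows

dim = [{"customer_id": 1, "city": "Kyiv",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, {"customer_id": 1, "city": "Lviv"}, date(2024, 6, 1))
assert len(dim) == 2 and dim[1]["is_current"] and not dim[0]["is_current"]
```

The same close-and-insert pattern, keyed by business key plus validity dates, is what the "SCD strategies" standard in the posting would pin down per dimension.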
· 23 views · 0 applications · 18d
Senior Data Engineer
Hybrid Remote · Ukraine (Dnipro, Kyiv, Lviv + 2 more cities) · 7 years of experience · B2 - Upper Intermediate
Client
Our client is a hedge fund sponsor that mainly manages pooled investment vehicles and typically invests in fixed income, private equity, rates, credit, and foreign exchange. The company operates offices in London, New York, and Hong Kong.
Join a great company, not merely an individual project
Position overview
We are seeking an experienced Senior Data Engineer with 7+ years in asset management or financial services to join our team. The ideal candidate will have expertise in handling diverse datasets delivered via batch files, APIs, and streaming from both internal and external sources.
Responsibilities
- Onboard new datasets and develop data models using Snowflake and DBT
- Build and maintain data transformation pipelines
- Design and manage data orchestration and ETL workflows with Azure Data Factory
- Optimize queries and apply data warehousing best practices for large and complex datasets
- Collaborate with development teams using agile methodologies, DevOps, Git, and CI/CD pipelines
- Support cloud-based services, especially Azure Functions, KeyVault, and LogicApps
- Optionally develop APIs to serve data to internal or external stakeholders
Requirements
- 7+ years as a Data Engineer in asset management or financial services
- Expertise in Snowflake, DBT, and data pipeline orchestration tools (Azure Data Factory)
- Strong knowledge of SQL, Python, data modeling, and warehousing principles
- Familiarity with DevOps practices including CI/CD and version control (Git)
- Experience with Azure cloud services
Nice to have
- Industry knowledge of Security Master, IBOR, and Portfolio Management
· 28 views · 0 applications · 13d
AI/ML Engineer
Hybrid Remote · Bulgaria, Georgia, Kazakhstan, Poland, Ukraine · 5 years of experience · B2 - Upper Intermediate
Position overview
We are seeking a highly skilled AI & ML Engineer to join our innovative team. In this role, you will lead the development, optimization, and deployment of advanced machine learning and artificial intelligence models. You will drive state-of-the-art research, build scalable pipelines, and collaborate with cross-functional teams to translate business needs into effective ML solutions.
This position offers the opportunity to work on cutting-edge technologies in a dynamic, fast-paced environment, contributing to impactful projects that power next-generation AI applications.
Responsibilities
- Design, implement, and optimize machine learning and AI models, including classification, regression, clustering, and agent tuning.
- Experiment with the latest methods, frameworks, and architectures to enhance model performance and efficiency.
- Develop robust, scalable ML pipelines for training, validation, and inference.
- Deploy ML models into production environments (cloud or on-prem), ensuring high reliability, low latency, and scalability.
- Apply MLOps best practices including CI/CD, monitoring, automated retraining, and model registry management.
- Partner with data engineering teams to source, clean, and transform large datasets for model training and inference.
- Ensure high data quality, perform feature engineering, and support real-time data integration processes.
- Work closely with data scientists, software engineers, and product managers to align ML solutions with business goals.
- Clearly communicate complex technical results to both technical and non-technical stakeholders.
- Provide technical guidance and mentorship to junior ML engineers and data scientists.
- Contribute to establishing team best practices, code reviews, and architectural decisions.
Requirements
- 5+ years of professional experience in machine learning, AI engineering, or related fields.
- Master's degree in Computer Science, Machine Learning, Physics, or related field; PhD preferred.
- Proficient in Python and ML frameworks such as TensorFlow, PyTorch, and Scikit-learn.
- Strong understanding of algorithms, statistics, probability, and linear algebra.
- Hands-on experience with data pipelines and ETL tools (e.g., Spark, AWS Lambda, AWS Glue).
- Practical experience with cloud platforms, preferably AWS.
- Solid software engineering fundamentals including version control (Git), testing, and design patterns.
- Demonstrated success in deploying ML models into production at scale.
- Familiarity with MLOps tools such as MLflow, Kubeflow, SageMaker, or Vertex AI.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work autonomously and collaboratively within a fast-paced, cross-functional team environment.
Nice to have
- Experience with agentic frameworks like LangChain or LangGraph
· 25 views · 0 applications · 13d
AI Engineer
Hybrid Remote · Bulgaria, Poland, Ukraine · 5 years of experience · B1 - Intermediate
Position overview
We are seeking an AI Engineer with a strong software engineering background, proficient in Python and modern cloud-native technologies. The ideal candidate has hands-on experience with Snowflake, BigQuery, or AWS data platforms and solid expertise in data engineering, including ETL, Spark, Spark Streaming, Jupyter Notebooks, data quality, and medallion architecture and design.
Experience with machine learning best practices such as model training, evaluation, and weighting is essential.
Responsibilities
- Design, develop, and deploy scalable AI and machine learning models.
- Build and maintain data pipelines and ETL processes using Spark, Spark Streaming, and related tools.
- Ensure high data quality and implement medallion architecture design principles.
- Collaborate with data scientists, engineers, and product teams to translate requirements into technical solutions.
- Implement best practices for model training, evaluation, and performance tuning.
- Develop, integrate, and maintain AI agents and conversational AI solutions where applicable.
Requirements
- Strong software engineering skills (Python, cloud-native stacks)
- Hands-on experience with Snowflake, BigQuery, or AWS data platforms
- Solid data engineering experience (ETL, Spark, Spark Streaming, Jupyter Notebooks, data quality, medallion architecture)
- Knowledge of machine learning best practices (model training, evaluation, weighting)
Nice to have
- Experience building AI agents (Langchain, Langgraph, OpenAI Agents, PydanticAI)
- Experience building conversational AI agents (AI chats, Evaluation-Driven Development)
· 23 views · 0 applications · 13d
Senior ML Engineer
Hybrid Remote · Bulgaria, Georgia, Kazakhstan, Poland, Ukraine · 4 years of experience · B1 - Intermediate
Position overview
DataArt is a global software engineering firm and a trusted technology partner for market leaders and visionaries. Our world-class team designs and engineers data-driven, cloud-native solutions to deliver immediate and enduring business value.
We promote a culture of radical respect, prioritizing your personal well-being as much as your expertise. We stand firmly against prejudice and inequality, valuing each of our employees equally.
We respect the autonomy of others before all else, offering remote, onsite, and hybrid work options. Our Learning and development centers, R&D labs, and mentorship programs encourage professional growth.
Our long-term approach to collaboration with clients and colleagues alike focuses on building partnerships that extend beyond one-off projects. We provide the ability to switch between projects and technology stacks, creating opportunities for exploration through our learning and networking systems to advance your career.
We are looking for a Senior Machine Learning Engineer to design, develop, and deploy advanced ML models focused on betting position forecasting, real-time analytics, and anomaly detection. The ideal candidate will have strong expertise in time series forecasting, predictive modeling, and scalable production deployment. You will work closely with cross-functional teams to integrate machine learning solutions that drive data-driven decision-making in a fast-paced betting environment, ensuring high model accuracy and reliability.
Responsibilities
- Design and develop machine learning models for betting position forecasting and recommendation systems
- Build and deploy predictive analytics solutions delivering real-time betting insights
- Implement anomaly detection systems to identify unusual betting patterns and potential risks
- Develop pattern recognition algorithms for market trend analysis and user behavior prediction
- Ensure scalable, reliable deployment of ML models within production betting environments
- Monitor, evaluate, and optimize model performance and accuracy continuously
- Collaborate closely with development teams to integrate ML solutions into betting platforms
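As a loose illustration of the anomaly-detection responsibility above, the sketch below flags unusual bet stakes with a simple z-score rule. All names, data, and the threshold are invented for the example; a production system would use trained models (e.g. isolation forests) over engineered features rather than a one-line statistic:

```python
from statistics import mean, stdev

def flag_anomalies(stakes, z_threshold=2.5):
    """Return indices of stakes whose z-score exceeds the threshold.

    Toy stand-in for production anomaly detection: real betting
    platforms would score streaming features with a trained model.
    """
    if len(stakes) < 2:
        return []
    mu, sigma = mean(stakes), stdev(stakes)
    if sigma == 0:
        return []
    return [i for i, s in enumerate(stakes)
            if abs(s - mu) / sigma > z_threshold]

# A burst of typical stakes with one extreme outlier at index 9:
stakes = [10, 12, 9, 11, 10, 13, 10, 9, 11, 500]
print(flag_anomalies(stakes))  # → [9]
```

Even this toy version shows the shape of the monitoring loop described above: score incoming events, compare against a baseline, and surface the outliers for review.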
Requirements
- Strong proficiency in Python and core ML libraries (e.g., scikit-learn, TensorFlow, PyTorch)
- Proven experience in time series forecasting, predictive modeling, and anomaly detection
- Solid grasp of statistical analysis and probability theory relevant to betting/gaming contexts
- Hands-on experience deploying ML models in real-time production environments
- Expertise in data preprocessing, feature engineering, and model validation techniques
- Familiarity with ML Ops best practices including model monitoring and versioning
Nice to have
- Experience with Azure ML or similar cloud ML platforms
- Knowledge of databases, data pipelines, and data engineering fundamentals
- Understanding of A/B testing and experimentation frameworks
- Experience with containerization tools like Docker and CI/CD pipelines
· 21 views · 0 applications · 13d
Senior GenAI Data Scientist
Hybrid Remote · Ukraine, Poland, Bulgaria, Serbia · 5 years of experience · B2 - Upper Intermediate
Client
Our client is a leading Fortune 500 financial technology company that provides comprehensive payment solutions and financial services across multiple continents. They process billions of transactions annually and serve millions of customers worldwide.
Join a great company, not merely an individual project
Team
You'll collaborate with a world-class team of senior data scientists, ML engineers, and technology consultants from leading organizations in the fintech and cloud computing space. This diverse group brings together deep technical expertise, industry knowledge, and proven experience delivering mission-critical solutions at enterprise scale.
Position overview
We are seeking an experienced Senior Data Scientist with deep expertise in Generative AI implementations. This role is designed for seasoned data science professionals who have successfully transitioned their expertise into production GenAI environments - not for those simply exploring AI technologies.
Technology stack
AWS Bedrock, SageMaker, and comprehensive AI/ML service ecosystem
Vector databases and advanced RAG architectures
Enterprise-scale data processing and real-time model deployment systems
Automated CI/CD pipelines specifically designed for ML workflows
Responsibilities
- Design and implement data architectures for GenAI solutions across structured, semi-structured, and unstructured data sources
- Extract, prepare, and optimize data for consumption into AI platforms from data lakes and direct model ingestion
- Structure diverse data sources for proper ingestion into AI workflows and model training
- Develop and manage automated data streams and pipeline orchestration
- Collaborate with MLOps engineers to ensure seamless data flow for model training and inference
- Implement data quality monitoring and validation frameworks for GenAI applications
- Design feature engineering strategies specifically for Foundation Models and LLM implementations
- Scale proof-of-concepts to production-ready, enterprise-grade data solutions
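One concrete flavor of the data-preparation work listed above is chunking unstructured text for ingestion into a Foundation Model. The sketch below is a hypothetical word-window chunker; real pipelines would split on tokens and respect document structure, and every name here is invented for the example:

```python
def chunk_text(text, max_words=50, overlap=10):
    """Split text into overlapping word windows for LLM ingestion.

    Toy sketch of one preparation step: production pipelines chunk
    on model tokens, not whitespace words, and tune overlap per use
    case so retrieval does not lose context at chunk boundaries.
    """
    words = text.split()
    step = max_words - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_words]))
        if start + max_words >= len(words):
            break
    return chunks

# 120 synthetic "words" produce 3 chunks with a 10-word overlap:
doc = " ".join(f"w{i}" for i in range(120))
print(len(chunk_text(doc)))  # → 3
```

The overlap parameter is the design choice worth noting: adjacent chunks share a window of words so that a fact straddling a boundary still appears whole in at least one chunk.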
Requirements
- Hands-on experience with diverse data sources (structured, semi-structured, unstructured) for AI platform integration
- Proven ability to extract, prepare, and structure data for consumption into AI platforms from data lakes or direct model ingestion
- Experience structuring various data sources for proper ingestion into AI workflows and Foundation Model training
- Advanced knowledge of automated data stream management and pipeline orchestration for AI/ML workloads
- Demonstrated experience building scalable data infrastructure supporting GenAI applications in production environments
- Strong background in AWS data services (S3, Glue, Kinesis, etc.) and integration with AI/ML platforms
- Advanced Python, SQL, and experience with big data technologies (Spark, Kafka, etc.)
- Proven track record of transitioning POCs to production-ready, enterprise-scale data solutions
- 5+ years data science experience with 2+ years dedicated GenAI data engineering and preparation experience
- Availability during US Eastern Time (ET) business hours to collaborate with the onsite team
Nice to have
- Bachelor's degree in Computer Science, Data Science, Engineering, Statistics, or related technical field (Master's preferred)
- AWS certifications (Data Analytics, Machine Learning Specialty, etc.)
- Experience with financial services or payment processing data systems
- Knowledge of data governance and compliance frameworks in regulated industries
· 26 views · 0 applications · 13d
Senior MLOps Engineer / AI/ML Developer
Hybrid Remote · Bulgaria, Poland, Serbia, Ukraine · 3 years of experience · B2 - Upper Intermediate
Client
Our client is a leading Fortune 500 financial technology company that provides comprehensive payment solutions and financial services across multiple continents. They process billions of transactions annually and serve millions of customers worldwide.
Join a great company, not merely an individual project
Team
You'll collaborate with a world-class team of senior data scientists, ML engineers, and technology consultants from leading organizations in the fintech and cloud computing space. This diverse group brings together deep technical expertise, industry knowledge, and proven experience delivering mission-critical solutions at enterprise scale.
Position overview
We are seeking an experienced Senior MLOps Engineer with deep expertise in Generative AI implementations. This role is designed for seasoned ML engineering professionals who have successfully transitioned their expertise into production GenAI environments - not for those simply exploring AI technologies.
Technology stack
AWS Bedrock, SageMaker, and comprehensive AI/ML service ecosystem
Vector databases and advanced RAG architectures
Enterprise-scale data processing and real-time model deployment systems
Automated CI/CD pipelines specifically designed for ML workflows
Responsibilities
- Design and implement robust MLOps pipelines for GenAI solutions using AWS Bedrock platform
- Lead the selection, training, and fine-tuning of Foundation Models (FM) and Large Language Models (LLM) for specific business use cases
- Architect and deploy RAG (Retrieval-Augmented Generation) systems with vector databases
- Develop and optimize prompt engineering strategies for production environments
- Integrate AI-powered chatbots and conversational interfaces into existing business workflows
- Implement comprehensive automation frameworks across the AI/ML lifecycle
- Manage diverse data sources and ensure optimal data preparation for AI platform consumption
- Scale proof-of-concepts to production-ready, enterprise-grade solutions
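To make the RAG responsibility above concrete, here is a deliberately minimal retrieval-and-prompt-assembly sketch. It uses bag-of-words cosine similarity in place of a real vector database, and all function names, documents, and the prompt format are invented for the illustration; a production system on AWS Bedrock would use learned embeddings and a managed vector store:

```python
from collections import Counter
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most lexically similar to the query."""
    q = Counter(query.lower().split())
    ranked = sorted(docs, reverse=True,
                    key=lambda d: cosine(q, Counter(d.lower().split())))
    return ranked[:k]

def build_prompt(query, docs):
    """Assemble a retrieval-augmented prompt from the top documents."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Chargebacks are disputed card transactions reversed by the issuer.",
    "Settlement batches close daily at midnight UTC.",
]
print(build_prompt("when do settlement batches close", docs))
```

The structure mirrors the production pattern: retrieve the most relevant context for a query, then prepend it to the prompt so the LLM answers from supplied facts rather than its parametric memory.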
Requirements
- Hands-on, production-level experience implementing and managing GenAI solutions
- Proven experience selecting, training, and fine-tuning Foundation Models (FM) or Large Language Models (LLM) for specific business use cases
- Deep hands-on knowledge of Retrieval-Augmented Generation implementation and vector database management
- Advanced skills in designing and optimizing prompt engineering strategies for production AI applications
- Demonstrated experience integrating and deploying chatbots within business workflows and implementing AI automation frameworks
- Expert understanding of diverse data types (structured, semi-structured, unstructured) and their utilization within AI platforms
- Proven track record of transitioning POCs to production-ready, enterprise-scale solutions
- Strong AWS cloud services background and advanced Python skills with ML frameworks (TensorFlow, PyTorch, etc.)
- ML engineering/MLOps experience with 2+ years dedicated GenAI production experience
- Availability during US Eastern Time (ET) business hours to collaborate with the onsite team
Nice to have
- Bachelor's degree in Computer Science, Data Science, Engineering, or related technical field (Master's preferred)
- AWS certifications (Machine Learning Specialty, Solutions Architect, etc.)
- Experience with financial services or payment processing systems
· 31 views · 0 applications · 13d
AI Tech Lead
Hybrid Remote · Bulgaria, Poland, Ukraine, Serbia · 5 years of experience · B2 - Upper Intermediate
Position overview
We are looking for a skilled AI Tech Lead to guide our AI development efforts, mentor a talented team, and deliver innovative machine learning and AI solutions that drive business value.
Responsibilities
- Lead and mentor a team of AI engineers and data scientists.
- Oversee the design, development, and deployment of AI and machine learning models.
- Collaborate with stakeholders to define project goals and align AI solutions with business needs.
- Ensure best practices in AI development, including code quality, testing, and documentation.
- Drive innovation by researching and applying the latest AI technologies and techniques.
- Manage project timelines, priorities, and deliverables within an agile environment.
- Facilitate cross-team communication and collaboration.
- Monitor model performance and lead continuous improvements.
Requirements
- Experience leading teams or projects as a Tech Lead / Senior Engineer
- Strong software engineering background (Python, modern cloud-native stacks)
- Hands-on experience with Snowflake, BigQuery, or AWS data platforms
- Solid experience with data engineering (ETL, Spark, Jupyter Notebooks, medallion architecture and design)
- Experience building conversational AI agents (AI chats, Evaluation-Driven Development)
- Understanding of constraint solving (SAT/CP-SAT) and/or optimization algorithms
- Experience with machine learning best practices (model training, evaluation, weighting)
- Solid API integration experience (REST, gRPC, messaging systems)
- Excellent communication and leadership skills
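The constraint-solving requirement above can be illustrated with a toy brute-force SAT solver over CNF clauses. This is a sketch only, written to show the problem shape; real work would use a dedicated solver such as OR-Tools CP-SAT, and the clause encoding here is the standard one, not anything specific to this role:

```python
from itertools import product

def solve_sat(clauses, n_vars):
    """Brute-force SAT over CNF clauses.

    Each clause is a list of ints: positive i means variable i,
    negative i means NOT variable i (variables are 1-indexed).
    Returns a satisfying assignment as a dict, or None.
    Exponential in n_vars - fine for a sketch, not for production.
    """
    for bits in product([False, True], repeat=n_vars):
        assign = {i + 1: b for i, b in enumerate(bits)}
        if all(any(assign[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses):
            return assign
    return None

# (x1 OR x2) AND (NOT x1 OR x3) AND (NOT x2 OR NOT x3)
clauses = [[1, 2], [-1, 3], [-2, -3]]
print(solve_sat(clauses, 3))
```

Dedicated solvers replace this enumeration with clause learning and propagation, but the interface idea is the same: encode business constraints as clauses, then ask the solver for a feasible assignment.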
Nice to have
- Knowledge of agentic AI patterns (human-in-the-loop, ReAct)
- Experience building AI agents with frameworks like LangChain, LangGraph, OpenAI Agents, PydanticAI