Jobs Lviv


    GenAI Consultant

    Ukraine · 5 years of experience · B2 - Upper Intermediate

    EPAM GenAI Consultants are changemakers who bridge strategy and technology—applying agentic intelligence, RAG, and multimodal AI to transform how enterprises operate, serve users, and make decisions. 

     

    Preferred Tech stack 

     

     Programming Languages 

    • Python (*) 
    • TypeScript 
    • Rust 
    • Mojo 
    • Go 

     

     Fine-Tuning & Optimization 

    • LoRA (Low-Rank Adaptation) 
    • PEFT (Parameter-Efficient Fine-Tuning) 
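
    For orientation only (not part of the posting): a minimal sketch of attaching a LoRA adapter with Hugging Face PEFT. The base model and hyperparameters are placeholders chosen for illustration.

    ```python
    # Illustrative LoRA fine-tuning setup with Hugging Face PEFT.
    # Model name and hyperparameters are placeholders, not values from the posting.
    from transformers import AutoModelForCausalLM
    from peft import LoraConfig, TaskType, get_peft_model

    base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")  # small open model for the example

    lora_cfg = LoraConfig(
        task_type=TaskType.CAUSAL_LM,
        r=8,                                  # rank of the low-rank update matrices
        lora_alpha=16,                        # scaling applied to the update
        lora_dropout=0.05,
        target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    )

    model = get_peft_model(base, lora_cfg)
    model.print_trainable_parameters()        # only the small adapter weights are trainable
    ```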

     

    Foundation & Open Models 

    • OpenAI (GPT series), Anthropic Claude Family, Google Gemini, Grok (*, at least one of them)
    • Llama 
    • Falcon 
    • Mistral 

     

    Inference Engines  

    • vLLM

     

     Prompting & Reasoning Paradigms (*) 

    • CoT (Chain of Thought) 
    • ToT (Tree of Thought) 
    • ReAct (Reasoning + Acting) 
    • DSPy 
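
    As a point of reference (not from the posting), a minimal chain-of-thought prompt via the OpenAI Python SDK; the model name is an assumption.

    ```python
    # Minimal chain-of-thought prompting sketch using the OpenAI Python SDK.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    question = "A warehouse ships 140 boxes per day. How many boxes does it ship in 3 weeks?"

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "Reason step by step, then give the final answer on the last line."},
            {"role": "user", "content": question},
        ],
    )
    print(response.choices[0].message.content)
    ```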

     

    Multimodal AI Models 

    • CLIP (*) 
    • BLIP2 
    • Whisper 
    • LLaVA 
    • SAM (Segment Anything Model) 
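
    For illustration (not part of the posting), zero-shot image-text matching with CLIP via Hugging Face transformers; the image path and labels are placeholders.

    ```python
    # Zero-shot image-text matching with CLIP (illustrative file name and labels).
    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    image = Image.open("example.jpg")  # hypothetical local image
    labels = ["a photo of a cat", "a photo of a dog", "a photo of a warehouse"]

    inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
    outputs = model(**inputs)
    probs = outputs.logits_per_image.softmax(dim=-1)  # similarity of the image to each label
    print(dict(zip(labels, probs[0].tolist())))
    ```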

     

     Retrieval-Augmented Generation (RAG) 

    • RAG (core concept) (*) 
    • RAGAS (RAG evaluation and scoring) (*) 
    • Haystack (RAG orchestration & experimentation) 
    • LangChain Evaluation (LCEL Eval) 
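
    As a minimal sketch of the core RAG loop (embed, retrieve, generate): the documents, model names, and in-memory retrieval below are illustrative, not any client's architecture.

    ```python
    # Minimal RAG loop: embed a tiny corpus, retrieve the closest passage, generate an answer.
    import numpy as np
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set

    docs = [
        "Our refund policy allows returns within 30 days of purchase.",
        "Premium support is available 24/7 for enterprise customers.",
    ]

    def embed(texts):
        resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
        return np.array([d.embedding for d in resp.data])

    doc_vecs = embed(docs)
    query = "How long do customers have to return an item?"
    q_vec = embed([query])[0]

    # Cosine-similarity retrieval over the in-memory corpus.
    scores = doc_vecs @ q_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
    context = docs[int(np.argmax(scores))]

    answer = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user",
                   "content": f"Answer using only this context:\n{context}\n\nQuestion: {query}"}],
    )
    print(answer.choices[0].message.content)
    ```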

     

    Agentic Frameworks 

     

    • CrewAI (*) 
    • AutoGen, AutoGPT, LangGraph, Semantic Kernel, LangChain (*, at least 2 of them) 
    • Prompt Tools: PromptLayer, PromptFlow (Azure), Guidance by Microsoft (*, at least one of them) 
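
    For orientation, a two-agent CrewAI sketch; the roles and tasks are invented for illustration and follow CrewAI's documented basic API.

    ```python
    # Minimal two-agent CrewAI flow (roles and tasks are illustrative).
    from crewai import Agent, Task, Crew

    researcher = Agent(
        role="Market researcher",
        goal="Summarise recent GenAI adoption trends for a retail client",
        backstory="A consultant who scans public sources and condenses findings.",
    )
    writer = Agent(
        role="Report writer",
        goal="Turn research notes into a one-page executive brief",
        backstory="A concise business writer.",
    )

    research = Task(
        description="List three notable GenAI use cases in retail.",
        expected_output="Three bullet points, one sentence each.",
        agent=researcher,
    )
    brief = Task(
        description="Write a short executive brief based on the research bullets.",
        expected_output="A one-paragraph brief.",
        agent=writer,
    )

    crew = Crew(agents=[researcher, writer], tasks=[research, brief])
    print(crew.kickoff())  # runs the tasks sequentially by default
    ```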

     

    Evaluation & Observability 

    • RAGAS – Quality metrics for RAG (faithfulness, context precision, etc.) (*) 
    • TruLens – LLM eval with attribution and trace inspection (*) 
    • EvalGAI – GenAI evaluation testbench 
    • Giskard – Bias and robustness testing for NLP 
    • Helicone – Real-time tracing and logging for LLM apps 
    • HumanEval – Code generation correctness testing 
    • OpenRAI – Evaluation agent orchestration 
    • PromptBench – Prompt engineering comparison 
    • Phoenix by Arize AI – Multimodal and LLM observability 
    • Zeno – Human-in-the-loop LLM evaluation platform 
    • LangSmith – LangChain observability and evaluation 
    • WhyLabs – Data drift and model behavior monitoring 
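
    For reference, a hedged RAGAS scoring sketch; the column names and evaluate API follow older ragas examples and may differ across versions, so treat this as an outline rather than a drop-in snippet.

    ```python
    # Scoring a single RAG interaction with RAGAS (older-style API; check current docs).
    from datasets import Dataset
    from ragas import evaluate
    from ragas.metrics import faithfulness, answer_relevancy, context_precision

    sample = Dataset.from_dict({
        "question": ["How long do customers have to return an item?"],
        "answer": ["Customers may return items within 30 days of purchase."],
        "contexts": [["Our refund policy allows returns within 30 days of purchase."]],
        "ground_truth": ["Returns are accepted within 30 days."],
    })

    scores = evaluate(sample, metrics=[faithfulness, answer_relevancy, context_precision])
    print(scores)  # per-metric scores between 0 and 1
    ```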

     

    Explainability & Interpretability (understanding) 

    • SHAP 
    • LIME 
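
    A minimal SHAP sketch on a scikit-learn model; the dataset and model are stand-ins used only to show the explainer workflow.

    ```python
    # Explaining a gradient-boosted classifier with SHAP (illustrative dataset and model).
    import shap
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import GradientBoostingClassifier

    X, y = load_breast_cancer(return_X_y=True, as_frame=True)
    model = GradientBoostingClassifier().fit(X, y)

    explainer = shap.Explainer(model, X)    # dispatches to a tree explainer for this model
    shap_values = explainer(X.iloc[:100])   # per-feature contributions for the first 100 rows
    shap.plots.beeswarm(shap_values)        # global view of which features drive predictions
    ```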

     

    Orchestration & Experimentation (*) 

    • MLflow 
    • Airflow 
    • Weights & Biases (W&B) 
    • LangSmith 

     

     Infrastructure & Deployment 

    • Kubernetes 
    • Amazon SageMaker 
    • Microsoft Azure AI 
    • Google Vertex AI 
    • Docker 
    • Ray Serve (for distributed model serving) 

     

    Responsibilities 

    • Lead GenAI discovery workshops with clients
    • Design Retrieval-Augmented Generation (RAG) systems and agentic workflows
    • Deliver PoCs and MVPs using LangChain, LangGraph, CrewAI, Semantic Kernel, DSPy, RAGAS 
    • Ensure Responsible AI principles in deployments (bias, fairness, explainability) 
    • Support RFPs, technical demos, and GenAI architecture narratives 
    • Reuse accelerators and templates for faster delivery 
    • Set up governance & compliance for enterprise-scale AI 
    • Use evaluation frameworks to close feedback loops 

     

    Requirements 

    • Consulting: Experience in exploring the business problem and converting it into applied AI technical solutions; expertise in pre-sales and solution definition activities 
    • Data Science: 3+ years of hands-on experience with core Data Science, as well as knowledge of one of the advanced Data Science and AI domains (Computer Vision, NLP, Advanced Analytics etc.)   
    • Engineering: Experience delivering applied AI from concept to production; familiarity and experience with MLOps, data, design of Data Analytics platforms, data engineering, and technical leadership 
    • Leadership: Track record of delivering complex AI-empowered and/or AI-empowering programs to clients in a leadership position. Experience in managing and growing a team to scale up Data Science, AI, and ML capabilities is a big plus. 
    • Excellent communication skills (active listening, writing and presentation), drive for problem solving and creative solutions, high EQ 
    • Experience with LLMOps or GenAIOps tooling (e.g., guardrails, tracing, prompt tuning workflows) 
    • Understanding of the importance of AI product evaluation is a must 
    • Knowledge of cloud GenAI platforms (AWS Bedrock, Azure OpenAI, GCP Vertex AI) 
    • Understanding of data privacy, compliance, and Governance in GenAI (GDPR, HIPAA, SOC2, RAI, etc.) 
    • In-depth understanding of a specific industry or a broad range of industries. 

     


    Computer Vision Engineer (SLAM, VIO)

    Ukraine · Product · 3 years of experience · MilTech 🪖

    We are looking for a Computer Vision Engineer with a background in classical computer vision techniques and hands-on implementation of low-level CV algorithms.

    The ideal candidate will have experience with SLAM, Visual-Inertial Odometry (VIO), and sensor fusion.

    We consider engineers at Middle/Senior levels — tasks and responsibilities will be adjusted accordingly.

     

    Required Qualifications:

    • 3+ years of hands-on experience with classical computer vision
    • Knowledge of popular computer vision networks and components 
    • Understanding of geometrical computer vision principles
    • Hands-on experience in implementing low-level CV algorithms
    • Practical experience with SLAM and/or Visual-Inertial Odometry (VIO)
    • Proficiency in C++
    • Experience with Linux
    • Ability to quickly navigate through recent research and trends in computer vision.
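
    For context (not part of the posting), a bare-bones two-frame visual-odometry step with OpenCV in Python. File names and camera intrinsics are placeholders; a production VIO stack would typically be in C++ with proper calibration, feature tracking, and IMU fusion.

    ```python
    # Two-frame relative pose estimation (monocular VO step) with ORB + essential matrix.
    import cv2
    import numpy as np

    # Hypothetical consecutive grayscale frames.
    img1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

    # Assumed pinhole intrinsics; a real system would use calibrated values.
    K = np.array([[700.0, 0.0, 320.0],
                  [0.0, 700.0, 240.0],
                  [0.0, 0.0, 1.0]])

    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Essential matrix with RANSAC, then relative camera pose (R, t up to scale).
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    print("Rotation:\n", R, "\nTranslation direction:\n", t.ravel())
    ```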

    Nice to Have:

    • Experience with Python
    • Familiarity with neural networks and common CV frameworks/libraries (OpenCV, NumPy, PyTorch, ONNX, Eigen, etc.)
    • Experience with sensor fusion.

    Senior Data Scientist

    Countries of Europe or Ukraine · 4 years of experience · B2 - Upper Intermediate

    We’re looking for a Senior Data Scientist to help shape how our clients build and scale AI solutions on AWS. In this role, you’ll develop and deploy cutting-edge generative AI models on SageMaker – from model training and fine-tuning to optimized deployment – guiding customers from ideation to production through proof of concept. You’ll work closely with startup founders, technical leaders, and account teams to create scalable, high-impact AI solutions that drive real business value.

     

    Responsibilities:

    • Model Development & Deployment: Deploy and train models on AWS SageMaker (using TensorFlow/PyTorch).
    • Model Tuning & Optimization: Fine-tune and optimize models using techniques like quantization and distillation, and tools like Pruna.ai and Replicate.
    • Generative AI Solutions: Design and implement advanced GenAI solutions, including prompt engineering and retrieval-augmented generation (RAG) strategies.
    • LLM Workflows: Develop agentic LLM workflows that incorporate tool usage, memory, and reasoning for complex problem-solving.
    • Scalability & Performance: Maximize model performance on AWS by leveraging techniques such as model compilation, distillation, and quantization, and by using AWS-specific features.
    • Collaboration: Work closely with Data Engineering, DevOps, and MLOps teams to integrate models into production pipelines and workflows.
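
    As a rough sketch of the deployment half of this work: hosting a fine-tuned Hugging Face model on a SageMaker endpoint. The S3 path, instance type, and container versions below are placeholders, not values from the posting.

    ```python
    # Deploying a fine-tuned Hugging Face model to a SageMaker real-time endpoint.
    import sagemaker
    from sagemaker.huggingface import HuggingFaceModel

    role = sagemaker.get_execution_role()  # assumes this runs inside a SageMaker environment

    model = HuggingFaceModel(
        model_data="s3://my-bucket/fine-tuned-model.tar.gz",  # hypothetical artifact location
        role=role,
        transformers_version="4.37",
        pytorch_version="2.1",
        py_version="py310",
    )

    predictor = model.deploy(initial_instance_count=1, instance_type="ml.g5.xlarge")
    print(predictor.predict({"inputs": "Summarise this quarterly report in two sentences."}))
    ```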

     

    Requirements:

    • 4+ years of experience in machine learning or data science roles, with deep learning (NLP, LLMs) expertise.
    • Expert in Python and deep learning frameworks (PyTorch/TensorFlow), and hands-on with AWS ML services (especially SageMaker and Bedrock).
    • Proven experience with generative AI and fine-tuning large language models.
    • Strong experience deploying ML solutions on AWS cloud infrastructure and familiarity with MLOps best practices.
    • Excellent communication skills and ability to work directly with customers in a consulting capacity.
    • A master’s degree in a relevant field and AWS ML certifications are a plus.

     

    Benefits:

    • Professional training and certifications covered by the company (AWS, FinOps, Kubernetes, etc.)
    • International work environment
    • Referral program – enjoy cooperation with your colleagues and get a bonus 
    • Company events and social gatherings (happy hours, team events, knowledge sharing, etc.)
    • English classes
    • Soft skills training

       

    Country-specific benefits will be discussed during the hiring process.

     

    Automat-it is committed to fostering a workplace that promotes equal opportunities for all and believes that a diverse workforce is crucial to our success. Our recruitment decisions are based on your experience and skills, recognizing the value you bring to our team.


    Data Science Engineer

    Hybrid Remote · Spain, Poland, Portugal, Ukraine · 5 years of experience · B2 - Upper Intermediate

    Quantum is a global technology partner delivering high-end software products that address real-world problems.

    We advance emerging technologies for outside-the-box solutions. We focus on Machine Learning, Computer Vision, Deep Learning, GIS, MLOps, Blockchain, and more.

    Here at Quantum, we are dedicated to creating state-of-the-art solutions that effectively address the pressing issues faced by businesses and the world. To date, our team of exceptional people has already helped many organizations globally attain technological leadership.

    We constantly discover new ways to solve never-ending business challenges by adopting new technologies, even when there isn’t yet a best practice. If you share our passion for problem-solving and making an impact, join us and enjoy getting to know our wealth of experience!

     

    About the position

    Quantum is expanding the team and has brilliant opportunities for a Data Science Engineer. The client is a technological research company that utilizes proprietary AI-based analysis and language models to provide comprehensive insights into global stocks in all languages. Our mission is to bridge the knowledge gap in the investment world and empower investors of all types to become “super-investors.”

    Through our generative AI technology implemented into brokerage platforms and other financial institutions’ infrastructures, we offer instant fundamental analyses of global stocks alongside bespoke investment strategies, enabling informed investment decisions for millions of investors worldwide.

     

    Must have skills:

    • At least 5 years of commercial experience in Data Science
    • Strong knowledge of linear algebra, calculus, statistics, and probability theory
    • Proficiency in algorithms and data structures
    • Experience with Machine Learning libraries (NumPy, SciPy, Pandas, Scikit-learn)
    • Experience with at least one Deep Learning framework (TensorFlow, Keras, or PyTorch)
    • Knowledge of modern Neural Network architectures
    • Experience in developing solutions with LLMs
    • Experience with Cloud Computing Platforms (AWS, Google Cloud, or Azure)
    • Practical experience with Docker
    • Experience with SQL
    • Strong understanding of Object-Oriented Programming (OOP) principles
    • Hands-on experience in building solutions for the financial domain
    • At least an Upper-Intermediate level of English (spoken and written)

     

    Would be a plus:

    • Experience with MLOps solutions
    • Basic understanding of Big Data concepts
    • Experience in classical Computer Vision algorithms
    • Participation in Kaggle competitions

     

    Your tasks will include:

    • Full-cycle data science projects
    • Data analysis and data preparation
    • Development of NLP / Deep Learning / Machine Learning solutions; developing models and deploying them to production
    • Sometimes, this will require the ability to implement methods from scientific papers and apply them to new domains

     

    We offer:

    • Delivering high-end software projects that address real-world problems
    • Surrounding experts who are ready to move forward professionally
    • Professional growth plan and team leader support
    • Taking ownership of R&D and socially significant projects
    • Participation in worldwide tech conferences and competitions
    • Taking part in regular educational activities
    • Being a part of a multicultural company with a fun and lighthearted atmosphere
    • Working from anywhere with flexible working hours
    • Paid vacation and sick leave days

     

    Join Quantum and take a step toward your data-driven future.


    Data Architect

    Hybrid Remote · Ukraine · 3 years of experience · B2 - Upper Intermediate

    Client

    Our client is a leading global travel agency network specializing in luxury and experiential journeys. They seek to develop a unified API framework that ensures secure, flexible, and seamless system integration.

     

     

    As a Data Architect, you will design and implement scalable, secure, and high-performance data architectures that support business needs. You will leverage cloud platforms—particularly Azure Data Services—and data warehousing solutions such as Snowflake to build robust data pipelines, ensure data quality, and optimize data storage and processing. Collaborating closely with data engineers, analysts, and business stakeholders, you will translate complex requirements into effective technical solutions.

    You will also define data standards and governance policies, lead data migration and modernization initiatives, and provide technical leadership and mentorship to the team. Your work will drive data-driven decision-making and enable the organization to efficiently manage and utilize its data assets.

     

    Responsibilities

    • Design and implement robust, scalable, and secure data architectures using Azure Data Services (e.g., Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake).
    • Architect and optimize Snowflake data warehouses for performance, scalability, and cost-efficiency.
    • Develop and maintain complex SQL and T-SQL scripts for data transformation, integration, and reporting.
    • Collaborate with data engineers, analysts, and business stakeholders to translate business requirements into technical solutions.
    • Define and enforce data architecture standards, best practices, and governance policies.
    • Lead data migration and modernization initiatives from legacy systems to cloud-based platforms.
    • Evaluate and recommend new tools, technologies, and frameworks to improve data infrastructure.
    • Mentor junior team members and provide technical leadership across projects.

    Requirements

    • Azure Data Services (Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, Azure Data Lake Storage)
    • Cloud Data Architecture & Design (especially Azure cloud)
    • Snowflake Data Warehouse Design & Optimization
    • Advanced SQL and T-SQL Programming
    • Data Transformation & ETL/ELT Processes
    • Data Integration Techniques
    • Data Modeling (Star Schema, Snowflake Schema, Normalization & Denormalization)
    • Technical Leadership and Mentorship
    • Cross-Functional Collaboration with Business and Technical Teams
    • Requirements Gathering and Technical Solution Design
    • Documentation and Standardization of Data Architecture
    • Problem Solving and Analytical Thinking
    • Work schedule alignment until 5 pm UTC-3 (exclusive)

    Junior Data Science Engineer

    Hybrid Remote · Spain, Poland, Portugal, Ukraine · 1 year of experience · B2 - Upper Intermediate

    Quantum is a global technology partner delivering high-end software products that address real-world problems. 

    We advance emerging technologies for outside-the-box solutions. We focus on Machine Learning, Computer Vision, Deep Learning, GIS, MLOps, Blockchain, and more.

     

    About the position

    Quantum expands the team in Central Europe and has brilliant opportunities for Data Science Engineers. 

    If you are interested in working on areas related to Data Analysis, fintech, image processing, and solving real-world challenges with innovative technologies, apply for the vacancy below. 

     

    Must have skills:

    • 1-2 years of commercial experience as a Data Science Engineer
    • Strong knowledge of linear algebra, calculus, statistics, and probability theory
    • Knowledge and experience with algorithms and data structures
    • Strong experience with Machine Learning
    • Expertise in areas of Computer Vision or Natural Language Processing
    • Knowledge of modern Neural Network architectures (DNN, CNN, LSTM, etc.)
    • Experience with at least one of the Deep Learning frameworks (TensorFlow, PyTorch)
    • Experience with SQL
    • Strong knowledge of OOP
    • At least an Upper-Intermediate level of English (spoken and written)

     

    Nice to have skills:

    • Experience with production ML/DL frameworks (OpenVino, TensorRT, etc.)
    • Docker practical experience
    • Experience with Cloud Computing Platforms (AWS, GCloud, Azure)
    • Participation in Kaggle competitions

     

    Your tasks will include:

    • Full-cycle data science projects
    • Data analysis and data preparation
    • Development of Machine Learning / Computer Vision / Deep Learning / NLP solutions; Developing models and deploying them to production
    • Sometimes, this will require the ability to implement methods from scientific papers and apply them to new domains

     

    We offer:

    • Delivering high-end software projects that address real-world problems
    • Surrounding experts who are ready to move forward professionally
    • Professional growth plan and team leader support
    • Taking ownership of R&D and socially significant projects
    • Participation in worldwide tech conferences and competitions
    • Taking part in regular educational activities
    • Being a part of a multicultural company with a fun and lighthearted atmosphere
    • Working from anywhere with flexible working hours
    • Paid vacation and sick leave days

     

    Join Quantum and take a step toward your data-driven future.
