EPAM Systems

Joined in 2014
37% answer rate
EPAM strives to provide its global team of more than 60,000 professionals in over 45 countries with opportunities for professional growth from day one of collaboration. Our colleagues are the source of EPAM's success, so we value cooperation, strive to always understand our clients' business, and aim for the highest quality standards. No matter where you are, you will join a dedicated, diverse community that will help you realize your potential to the fullest.
  • Big Data Engineer

    Ukraine · 3 years of experience · Upper-Intermediate

    We are looking for a Senior/Lead Big Data Engineer who can help us to create the data pipelines for the data platform we are designing now.

     

    - 3+ years of hands-on experience in the big data field: Hadoop, Spark, Spark Streaming, MapReduce.

    - 3+ years of experience with relevant cloud data services (AWS, Azure, GCP) such as EMR, Glue, S3, Lambda, Fargate, DynamoDB, ADF, Azure Functions, Azure Blob Storage, DataProc & DataFlow, BigQuery.

    - Knowledge of Databricks will be a significant benefit for the candidate.

    - Excellent knowledge of and hands-on experience with SQL in the context of Big Data: Spark SQL, HiveQL.

    - Ability to understand and optimize Spark execution plans via the Spark UI.

    - Excellent knowledge of Python or Scala.

    - Batch processing and ETL principles in data warehouses.

    - Data completeness signals and orchestration.

    - Approaches to historical reprocessing and data correction.

    - Handling bad data and late data in inputs and outputs.

    - Schema migrations and dataset evolution.

    - Strong spoken English (B2).

    - Ability to quickly learn the new set of tools and technologies used internally: Radar, platform services, telemetry providers, Spark-as-a-Service, the build system, and much more.

     

    Will be a plus

    - Understanding of functional programming ideas and principles.

    - Experience in building and using web services.

    - Experience with any of Teradata, Vertica, Oracle, Tableau.

    - Spark Streaming and Kafka.

    - Experience with or knowledge of: Apache Iceberg, Trino (Presto), Druid, Cassandra, blob storage such as AWS S3.

    - Understanding of or experience with Azkaban or Airflow.
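
    Several of the requirements above (late data, historical reprocessing, data correction) describe one common pattern. As a hypothetical sketch in plain Python (not EPAM's or any client's actual pipeline; the watermark value and field names are illustrative assumptions), late events can be detected against a watermark and routed to a correction path:

```python
# Illustrative sketch only: bucket events by event time and route anything
# older than a watermark to a late/correction path for reprocessing.
from datetime import datetime, timedelta

WATERMARK = timedelta(hours=1)  # assumed allowed lateness

def partition_events(events, now):
    """Split events into on-time (keyed by hourly bucket) and late."""
    on_time, late = {}, []
    cutoff = now - WATERMARK
    for ev in events:
        ts = datetime.fromisoformat(ev["event_time"])
        if ts < cutoff:
            late.append(ev)  # hand off to a reprocessing/correction job
        else:
            bucket = ts.replace(minute=0, second=0, microsecond=0)
            on_time.setdefault(bucket, []).append(ev)
    return on_time, late

now = datetime(2024, 1, 1, 12, 0)
events = [
    {"event_time": "2024-01-01T11:30:00", "v": 1},  # within the watermark
    {"event_time": "2024-01-01T09:00:00", "v": 2},  # arrived too late
]
on_time, late = partition_events(events, now)
print(len(on_time), len(late))  # 1 1
```

    Real systems typically delegate this to the engine (e.g. Spark Structured Streaming watermarks), but the routing decision is the same.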

  • Big Data Engineer (Azure, Databricks)

    Full Remote · Ukraine · 4 years of experience · Upper-Intermediate

    RESPONSIBILITIES

    Review existing practices and solutions from our client

    Design and develop solutions to support data science teams in putting models into production

    Integrate data pipelines and ML pipelines

    Refactor models into modularized Python code

    Create standardized APIs for model serving, registering models in the model registry and features in the feature store

    Diagnose and resolve post-implementation issues with new and existing environments that were provisioned

    Respond to and consult end users during office hours

    Develop unit and component tests, and support CDP end-to-end testing

    Add performance metrics to monitoring solutions and update code to address areas of the solution that do not meet those metrics

     

     

    REQUIREMENTS

    4+ years of experience in Data Engineering

    Python, Java

    Azure

    Databricks

    Azure ML Ops

    Potential for Dataiku

    Intermediate level of English, both spoken and written (B1+)

     

     

    WE OFFER

    Competitive compensation depending on experience and skills

    Individual career path

    Unlimited access to LinkedIn learning solutions

    Sick leave and regular vacation

    English classes with certified English teachers

    Flexible work hours

  • Senior/Lead Data Scientist

    Ukraine · 5 years of experience · Intermediate

    As a Generative AI Data Scientist, you will play a critical role in use case evaluation, architecture, and development planning, developing and implementing state-of-the-art generative models to solve complex business problems. This is an exciting opportunity to apply cutting-edge techniques and push the boundaries of AI technology.

     

    Qualifications:

     

    - Proven experience in developing and implementing generative models, such as GANs, VAEs, or deep generative models.

    - Strong proficiency in programming languages such as Python, with experience using deep learning frameworks such as TensorFlow or PyTorch.

    - Solid understanding of machine learning, deep learning, and statistical modeling concepts.

    - Experience working with large-scale datasets and preprocessing techniques.

    - Proficiency in data visualization and exploratory analysis tools.

    - Strong problem-solving skills and ability to think creatively to design innovative solutions.

    - Excellent written and verbal communication skills, with the ability to effectively communicate complex technical concepts to both technical and non-technical audiences.

    - Proven ability to work collaboratively in a team environment and contribute to cross-functional projects.

    - Strong research and self-learning abilities, with a passion for staying up-to-date with the latest advancements in generative AI and related fields.

     

     

    Preferred Skills:

     

    - Experience with natural language processing (NLP) and text generation models.

    - Familiarity with cloud-based machine learning platforms and Generative AI services, such as Azure (OpenAI, ChatGPT), Google Cloud, or AWS.

    - Knowledge of parallel computing and distributed training frameworks.

    - Publications or contributions to the research community in the field of generative AI or related disciplines.

     

    Responsibilities:

    - Research and Development: Conduct research and stay up-to-date with the latest advancements in generative AI, deep learning, and related fields. Explore and experiment with different generative models, architectures, and algorithms to enhance our capabilities.

     

    - Model Development: Design, develop, and implement novel generative models tailored to specific use cases. Create and optimize deep neural networks, variational autoencoders (VAEs), generative adversarial networks (GANs), or other generative architectures to generate realistic and diverse synthetic data.

     

    - Data Preprocessing: Work closely with data engineers and domain experts to preprocess and clean large-scale datasets. Apply statistical techniques and data augmentation methods to ensure high-quality input data for training generative models.

     

    - Model Training and Evaluation: Train and fine-tune generative models using large-scale datasets, leveraging techniques such as transfer learning and unsupervised learning. Develop evaluation metrics and benchmarks to assess model performance and generate insights to guide model improvements.

     

    - Collaboration: Collaborate with cross-functional teams, including data scientists, software engineers, insurance experts, and experience designers, to integrate generative models into real-world applications. Provide technical guidance and support to ensure successful implementation and deployment of generative AI solutions.

     

    - Innovation and Optimization: Continuously explore new techniques, frameworks, and tools to optimize and enhance the performance of generative models. Stay informed about emerging trends and best practices in the field and contribute to the advancement of the organization's data science capabilities.

     

    - Documentation and Reporting: Prepare clear and concise technical documentation, including model architectures, methodologies, and experiment results. Present findings and insights to both technical and non-technical stakeholders, contributing to knowledge-sharing and decision-making processes.

  • Lead Big Data Engineer (with Python experience)

    Ukraine · 5 years of experience · Intermediate

    We are looking for an experienced data engineer with design and development experience in automating scalable and high-performance data processing systems (batch and/or streaming) on the cloud (preferably AWS).

     

    What You’ll Do

    ● Design our data models for optimal storage and retrieval on the cloud and to meet critical product and business requirements

    ● Build scalable and highly-performant distributed data processing systems as we migrate to the cloud

    ● Work closely with our business stakeholders to flesh out and deliver on requirements in an agile manner

    ● Set and evolve data standards and best practices

    ● Contribute to the data architecture and align it with the business and technology

    ● Adhere to and enforce software development best practices on the cloud in areas including but not limited to CI/CD, code reviews, automated testing, operational excellence, data quality etc.

     

    What You Should Have

    ● 5+ years of experience with Big Data

    ● 4+ years of programming experience in Java/Python

    ● 3+ years of experience using an enterprise cloud-based solution

    ● 3+ years implementing data processing pipelines on the cloud (batch and/or streaming)

    ● 3+ years of experience with Spark, Kafka

    ● Advanced understanding of SQL, relational and NoSQL databases required

    ● Experience with various data access patterns, streaming technology, data quality, data modeling, data performance, and cost optimization

  • Senior/Lead Big Data Engineer with Azure and Databricks

    Ukraine · 5 years of experience · Intermediate

    5+ years of experience in Data Engineering
    Experience with Azure: Data Factory, Data Lake, Databricks, Azure Functions
    Hands-on experience and a high level of proficiency in Python
    Hands-on experience with Spark and Kafka
    Ability to integrate, transform, and consolidate data from various structured and unstructured data systems into a structure suitable for building analytical solutions
    Good communication skills and the ability to work in a team
    Experience working in an Agile environment (Scrum, Kanban)
    Experience with Azure ML is a plus
    Written and spoken English at Upper-Intermediate level (B2) or higher

  • Senior Big Data Engineer

    Ukraine · 4 years of experience · Intermediate

    EPAM is looking for a Senior Big Data Software Engineer to develop and maintain internal data warehouse systems for our client's products and applications.

    Skills

    • 4+ years of experience in data warehouse development and database design
    • SQL expert with at least 3 years of experience
    • Good Python development skills
    • Development of software solutions using Hadoop ecosystem components (Yarn, MapReduce, Hive, HDFS)
    • Hands-on experience with Spark and Kafka
    • Experience with AWS cloud services (EMR, S3, Lambda)
    • Excellent analytical and complex problem-solving skills
    • Good communication and presentation skills
    • Attention to detail
    • Ability to meet tight deadlines and prioritize tasks

    Nice to have

    • Database administration experience
    • Experience with AWS Redshift
    • Hands-on experience with PySpark
    • Understanding of approaches to scaling services/applications
  • Lead Big Data Engineer

    Ukraine · 5 years of experience · Upper-Intermediate

    5+ years of hands-on experience in the big data field: Hadoop, Spark, Spark Streaming, MapReduce.

    - 3+ years of experience with relevant cloud data services (AWS, Azure, GCP) such as EMR, Glue, S3, Lambda, Fargate, DynamoDB, ADF, Azure Functions, Azure Blob Storage, DataProc & DataFlow, BigQuery.

    - Knowledge of Databricks will be a significant benefit for the candidate.

    - Excellent knowledge of and hands-on experience with SQL in the context of Big Data: Spark SQL, HiveQL.

    - Ability to understand and optimize Spark execution plans via the Spark UI.

    - Excellent knowledge of Python, Scala (including functional Scala libraries such as Cats), or Java SE.

    - Batch processing and ETL principles in data warehouses.

    - Data completeness signals and orchestration.

    - Approaches to historical reprocessing and data correction.

    - Handling bad data and late data in inputs and outputs.

    - Schema migrations and dataset evolution.

    - Strong spoken English (B2+).

    - Ability to quickly learn the new set of tools and technologies used internally: Radar, platform services, telemetry providers, Spark-as-a-Service, the build system, and much more.

     

     

    Will be a plus

    - Understanding of functional programming ideas and principles.

    - Experience in building and using web services.

    - Experience with any of Teradata, Vertica, Oracle, Tableau.

    - Spark Streaming and Kafka.

    - Experience with or knowledge of: Apache Iceberg, Trino (Presto), Druid, Cassandra, blob storage such as AWS S3.

    - Understanding of or experience with Azkaban or Airflow.

  • Senior/Lead Big Data Engineer

    Ukraine · 4 years of experience · Intermediate

    A huge Fortune top-3 healthcare vendor has partnered with EPAM to accelerate their digital transformation. One of the programs we have started working on together is cloud transformation and cloud migration. We are looking for experienced developers passionate about big data and cloud technologies to join our team of professionals.

    4+ years of experience with Big Data

    Detailed requirements: 

    Mandatory 

    • English  

    • Overall experience in IT area 

    • Python 

    • AWS (EMR, Lambda, Kinesis, EKS, S3)  

    • Spark (core, sql)  

    • Kafka, Kafka streams 

    • SQL 

    • Linux/Unix OS

    Good to have

    • NoSQL (HBase, DynamoDB) 

    • CDH platform 

    • Yarn

  • Senior Big Data Software Engineer

    Ukraine · 5 years of experience · Intermediate

    Responsibilities

    Ensure high quality performance by implementing and refining robust data processing using Java, Scala or Python

    Build scalable analytics solutions, including data processing, storage, and serving large-scale data through batch and streaming

    Help operationalize machine learning models and build apps

    Contribute to making our data platform more scalable, resilient and reliable

    Participate in code review sessions

    Requirements

    5+ years of experience with Java, Scala or Python

    Data engineering skills (data ingestion, storage and processing) in batch and streaming solutions using Kafka and Spark

    Understanding and practical experience with AWS

    Skills with big data framework such as Hadoop & Apache Spark, NoSQL systems such as Cassandra or DynamoDB, streaming technologies such as Apache Kafka

    Understanding of reactive programming and dependency injection frameworks such as Spring to develop REST services

    Experience working with data scientists to operationalize machine learning models and build apps that harness the power of machine learning

    Experience with newer technologies relevant to the data space such as Spark, Kafka, Apache Druid 

    Good problem-solving skills

    Spoken and written English at B1+ level or higher

  • Senior Data Scientist

    Ukraine · 4 years of experience · Intermediate

    Responsibilities

    • Design and develop AI/ML-enabled solutions to search, visualize and explain data from various sources (data warehouses, databases)
    • Compare performance of different LLMs for particular business problems
    • Test and evaluate AI/ML solutions with the objective of meeting defined criteria for accuracy and coherence
    • Contribute to continuous improvement of solution performance and cost efficiency

    Requirements

    • University degree in Computer Science, Data Science or equivalent practical experience
    • Deep understanding of capabilities and challenges of applying Generative AI in business context
    • Familiarity with GenAI/LLM frameworks (e.g. Langchain, LlamaIndex, haystack)
    • Proficiency in Python and hands-on experience with ML frameworks and platforms (e.g. Tensorflow, PyTorch, MLflow)
    • Experience working with multi-dimensional data cubes and building analytics on top of them
    • Ability to work in cross-functional teams in a dynamic business environment based on agile development principles
    • Capability to describe business use-cases, implementation vision and technology choices for both technical and non-technical audiences


     

  • Lead Big Data Engineer

    Ukraine · 4 years of experience · Intermediate

    Responsibilities

    • Development of scalable, secure, and high-performing software solutions
    • Conducting code reviews to ensure coding standards are met
    • Enforcement of the team's adherence to standard engineering practices, including testing and CI/CD
    • Performing testing and optimization to enhance system efficiency
    • Integration of monitoring and observability capabilities into the software solutions

    Requirements

    • Minimum of 5 years' experience in Data Software Engineering or a related field
    • At least 1 year of relevant leadership experience
    • Proficiency in AWS, Apache Kafka, PySpark, Snowflake
    • Proven experience in developing scalable, secure, and reusable data transformation and processing solutions
    • Comprehensive knowledge of integration with monitoring and observability capabilities
    • Experience in performance testing and optimization
    • Solid understanding of standard engineering practices such as testing and CI/CD
    • Ability to conduct code reviews and enforce adherence to coding standards
    • Excellent problem-solving and analytical skills
    • Strong communication and collaboration skills
    • Bachelor's degree in Computer Science, Engineering, or a related field
    • English communication skills at a B2+ level
  • Big Data Developer

    Ukraine · 2 years of experience · Intermediate

    DESCRIPTION

    Our client is one of the largest consumer credit reporting agencies, with almost a billion individual clients and thousands of employees worldwide.

    This is the largest project on Google Cloud, and that is the first and main reason why you should be interested. The main goal of the project consists of two parts: first, to develop an Enterprise Data Platform that will be used across the whole company by every department and unit; second, to migrate current solutions entirely to Google Cloud.

    REQUIREMENTS

    4+ years of commercial experience

    Experience in Google Storage, DataProc

    Good knowledge of Stackdriver, DataFlow

    Hands on BigQuery, KMS

    Practical experience in Airflow, Docker

    Good knowledge of Jenkins, Spark SQL

    Hands on Terraform

    Knowledge of Oracle

    Experience in SQL Server

    Experience in Python

  • Senior Data Software Engineer (Big Data)

    Ukraine · 3.5 years of experience · Intermediate

    We are currently looking for a Senior Data Software Engineer to join the Data Practice team and work with EPAM's strategic clients.

     

    As a Senior Data Software Engineer, you will build data systems and applications to support decision making. You will focus on developing and maintaining data pipelines in public clouds such as AWS, Azure, or GCP. You will participate in data migration from on-premises to public clouds and build components for central enterprise data platforms.

    RESPONSIBILITIES

    Contribute to the success of our customer projects

    Build data platform solutions in the cloud

    Develop data pipelines and migrate data

    Design and implement new business requirements

    REQUIREMENTS

    Bachelor's degree in Computer Science, Mathematics, related technical field or equivalent practical experience

    4+ years of direct experience working in Enterprise Data Platforms technologies

    Experience with public cloud (AWS, Azure, or GCP) as well as with on-premises data technologies

    Experience in data acquisition (API calls/FTP downloads), ETL, transformation/normalization (from raw data to a DB table schema structure), storage (raw files, database server), and distribution & access (entitlements for users, building APIs and access points for data)

    Proficiency in ETL/ELT processes

    Experience building components for central data platforms for enterprise use (data warehouses, Operational Data Stores, Access layers with APIs, file extracts, user queries)

    Hands-on experience with SQL, Python, Spark, Kafka
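
    The acquisition and normalization steps above can be sketched in stdlib Python. This is an illustrative toy only (sqlite3 stands in for a warehouse; the table and field names are assumptions, not the client's actual schema):

```python
# Illustrative sketch: normalize raw JSON records (trim strings, coerce
# types) into a fixed table schema -- the "raw to DB table" step.
import json
import sqlite3

raw = [
    '{"id": 1, "name": " Alice ", "amount": "10.50"}',
    '{"id": 2, "name": "Bob",     "amount": "3"}',
]

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE payments (id INTEGER PRIMARY KEY, name TEXT, amount REAL)"
)

# Transform/normalize: parse JSON, strip whitespace, coerce amount to float.
rows = [
    (rec["id"], rec["name"].strip(), float(rec["amount"]))
    for rec in map(json.loads, raw)
]
conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
conn.commit()

total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 13.5
```

    In practice the same shape appears with Spark DataFrames writing to a warehouse; parameterized inserts (rather than string formatting) are the important habit either way.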

     

  • Senior/Lead Power BI Software Engineer

    Ukraine · 3 years of experience · Upper-Intermediate

    We are looking for a Senior / Lead Data Analytics and Visualization Engineer passionate about data to extend our team of professionals. In this role, you will work with clients from various business domains – Finance, Telecom, Media, Insurance, Oil&Gas, Pharma and others.

    The remote option applies only to candidates who will be working from any location in Ukraine.

     

    Responsibilities

    • Build advanced-level dashboards and reports
    • Design and build data models, conduct data preparation and profiling
    • Perform detailed analysis of business problems and technical environments
    • Manage security and administer the reporting environment
    • Participate in architecture and testing processes to ensure they meet best-practice specifications

     

    Requirements

    • 5+ years of practical experience with one of the reporting tools – Power BI (preferable)
    • Proficiency in Data Analytics and Data Visualization concepts
    • Skills in developing reports and dashboards
    • Advanced SQL knowledge and background in relational databases such as MS SQL Server, Oracle, MySQL, and PostgreSQL
    • Understanding of landing, staging area, data cleansing, data profiling, data security and data architecture concepts (DWH, Data Lake, Delta Lake/Lakehouse, Datamart)
    • Familiarity with Cloud technologies (AWS, Azure, GCP, Snowflake)
    • Expertise in data products testing
    • Ability to work in an agile development environment (SCRUM, Kanban)
    • Understanding of CI/CD principles and best practices
    • English level – Upper-Intermediate and higher

     

    We offer

    • Work on a flexible schedule remotely or from any of our comfortable offices or coworking spaces in Ukraine
    • Receive the necessary equipment to perform your work tasks
    • Change projects and technology stacks within EPAM
    • Gain experience in various business domains (Insurance, E-commerce, Healthcare, Finance, Travelling, Media, Artificial Intelligence, and more)
    • Consider relocation options in over 30 countries worldwide
    • Participate in volunteer, charity programs and communities (both technical and interest-based)
    • Plan your individual career path together with your manager
    • Receive regular feedback from colleagues
    • Improve your English for free with certified teachers (Speaking Clubs, client interview preparation courses, etc.)
    • Get the opportunity to undergo free training and certification in AWS, GCP, or Azure Clouds
    • Use the internal E-learn training program (18,200+ specialized training and mentoring programs)
    • Access corporate accounts on LinkedIn Learning, Get Abstract and other partner resources
    • Study at EPAM Solution Architecture School with the instructors who are practicing architects
    • Develop as a leader, join Delivery Management, Resource Management, Leadership Essentials school and more
    • Participate in internal communities (500+ meetups, technical discussions, brainstorming sessions, online events and conferences annually)
    • Vacation and sick leave (including a sick leave without a medical certificate)
    • A wide range of Voluntary Medical Insurance programs providing both medical treatment and various preventive options (including sports activities)
    • Medical insurance for family members at corporate rates
    • Company support during significant life events (childbirth or adoption, marriage, etc.)
    • Support for psychological comfort: discounts on services from mental health specialists or coaches, thematic training
    • E-kids program - a free programming language training program for EPAMers' children

     


  • Data Delivery Manager

    Full Remote · Ukraine · 5 years of experience · Advanced/Fluent

    As a Data Delivery Manager you will leverage your data-driven mindset and your passion for people in a multi-faceted role. You will be bringing delivery expertise and theoretical knowledge from Data and Analytics domain, working alongside globally renowned, award-winning development and technology teams who are passionate about what they do, and how they do it. 
    You will play a key role in building strong business / technology relationships, taking a hands-on approach to define and deliver a culture of continuous improvement.

    WHAT YOU’LL DO
    Own end-to-end solution delivery on the assigned account(s)
    Assure governance of delivery-management and production processes as per the selected delivery model
    Act as the single point of responsibility toward top management and the client for any delivery-related matters, including escalations, upsells, ramp-downs, etc.
    Take responsibility for the resources needed for delivery, assuring sound skills and seniority in staffing according to the delivery roadmap
    Take accountability for technical leadership of the delivery; ensure a sound and future-proof architecture is planned and the implementation meets technical quality standards
    Coordinate between multiple disciplines and stakeholders
    Establish and manage long-term partnerships with the client(s)
    Address clients' issues, identify and manage engagement risks
    Coordinate the preparation of client proposals and statements of work
    Ensure that projects are delivered in line with EPAM processes and methodologies
    Make sure that the client executes their responsibilities on the engagements
    Build a delivery plan along with estimates of the timeframes, quality, and quantity of resources required to successfully deliver projects
    Establish a strategy of continuous delivery risk management that enables proactive decisions and actions throughout the delivery life cycle
    Support sales activities: participate in the bid process and provide technical and management expertise to win bids
    Ensure EPAM delivery standards while bringing to life the best products for our clients
    Manage your team via KPIs around delivery, local Data Practice growth, and technology proficiency

    WHAT YOU HAVE
    Practical experience delivering projects in Data and Analytics, Big Data, Data Warehousing, Business Intelligence, and LLMs
    Knowledge of the corresponding technological solutions and industry best practices
    Good understanding of data engineering challenges and proven experience with data platform engineering (batch and streaming, ingestion, storage, processing, management, integration, consumption)
    Familiarity with multiple Data & Analytics technology stacks
    Awareness of various Data & Analytics tools and techniques (e.g. Python, data mining, predictive analytics, machine learning, data modeling, etc.)
    Experience with data visualization and awareness of various tools and technologies
    Understanding of data-related security challenges
    Experience with one or more leading cloud providers (AWS/Azure/GCP); a leading role in on-prem-to-cloud migration projects
    Experience executing solutions from the ground up
    A thorough understanding of how to plan, resource, and deliver large-scale technical projects
    Knowledge of software development/product lifecycle methodologies (Scrum, Kanban, etc.)
    A high degree of comfort presenting to both large and small audiences
    Ability to manage multiple tasks and projects in a fast-moving environment
    Excellent interpersonal skills and the ability to work with diverse personality types
    Excellent business communication skills
     
