Jobs (154)
  • · 74 views · 1 application · 1d

    Data Engineer

    Hybrid Remote · Spain · 2 years of experience · B2 - Upper Intermediate

    We are looking for a talented Data Engineer (4 months project) to join our growing team. In this role, you will be responsible for designing, implementing, and optimizing sophisticated data pipeline flows within our advanced financial crime detection system. 

     

    Requirements:

    • 2+ years of hands-on experience with Apache Spark using PySpark or Scala (mandatory requirement)
    • Bachelor's degree or higher in Computer Science, Statistics, Informatics, Information Systems, Engineering, or related quantitative field
    • Proficiency in SQL for data querying and manipulation
    • Experience with version control systems, particularly Git
    • Working knowledge of Apache Hadoop ecosystem components (Hive, Impala, Hue, HDFS, Sqoop)
    • Demonstrated experience in data transformation, validation, cleansing, and ML feature engineering
    • Strong analytical capabilities with structured and semi-structured datasets
    • Excellent collaboration skills for cross-functional team environments
    • Fluent English
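    The SQL proficiency this list asks for is easiest to picture with a small example. The sketch below uses Python's built-in sqlite3 module with an invented transactions table and a toy "structuring" aggregation; the schema, sample data, and thresholds are illustrative assumptions, not part of the role.

```python
import sqlite3

# Hypothetical transactions table; schema and rows are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (account_id TEXT, amount REAL, country TEXT)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [("a1", 9500.0, "ES"), ("a1", 9700.0, "ES"), ("a2", 120.0, "FR")],
)

# Flag accounts with repeated just-under-threshold activity
# (a toy heuristic, not a real financial-crime detection rule).
rows = conn.execute(
    """
    SELECT account_id, SUM(amount) AS total, COUNT(*) AS n
    FROM transactions
    WHERE amount BETWEEN 9000 AND 10000
    GROUP BY account_id
    HAVING total > 15000
    """
).fetchall()
print(rows)  # [('a1', 19200.0, 2)]
```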

     

    Nice to have:

    • Machine learning pipeline development and deployment
    • Proficiency with Zeppelin or Jupyter notebook environments
    • Experience with workflow automation platforms (Jenkins, Apache Airflow)
    • Knowledge of microservices architecture, including containerization technologies (Docker, Kubernetes)

     

    Responsibilities:

    • Design, implement, and maintain production-ready data pipeline flows 
    • Build and optimize machine learning data pipelines to support advanced analytics capabilities
    • Develop solution-specific data flows tailored to unique use cases and customer requirements
    • Create sophisticated data tools and frameworks to empower analytics and data science teams
    • Collaborate closely with product, R&D, data science, and analytics teams to enhance system functionality and drive innovation
    • Work with cross-functional stakeholders to translate business requirements into scalable technical solutions
    • Build and nurture technical relationships with customers and strategic partners

     

    Recruitment process:

    • Screening call (20 min)
    • Technical Interview with Team Lead (60 min)
    • Technical Test (2-hour take-home assignment)
    • Interview with Global Head of Data (30 min)
    • Interview with VP of R&D (30 min)
    • HR Interview (30 min)
    • Reference Checks

    Timeline: Complete process typically takes 2-3 weeks from application to offer.

     

    We offer:

    • Competitive compensation based on your skills and experience.
    • Exciting project involving the newest technologies.
    • Flexible working hours.
  • · 24 views · 8 applications · 8d

    Azure Data and AI Engineer

    Full Remote · Worldwide · 5 years of experience · B2 - Upper Intermediate

    Requirement:

    - Must have strong experience with Azure

    - Must have strong experience in data engineering

    - Must have strong experience with AI

     

    We offer:

    • Attractive financial package

    • Paid vacation, holidays and sick leaves

    • Challenging projects

    • Professional & career growth

    • Great atmosphere in a friendly small team

    • Flexible working hours

  • · 30 views · 6 applications · 22d

    GCP Data Engineer

    Full Remote · Worldwide · 5 years of experience · B2 - Upper Intermediate

    Requirements:

    - Experience with GCP (BigQuery and Dataform)

    - Excellent communication skills

     

    We offer:

    • Attractive financial package

    • Challenging projects

    • Professional & career growth

    • Great atmosphere in a friendly small team

    • Flexible working hours

  • · 347 views · 39 applications · 2d

    Data Engineer

    Countries of Europe or Ukraine · 2 years of experience · B1 - Intermediate

    Looking for a Data Engineer to join the Dataforest team. If you are looking for a friendly team, a healthy working environment, and a flexible schedule ‒ you have found the right place to send your CV.

     

    Skills requirements:
    • 2+ years of experience with Python;
    • 2+ years of experience as a Data Engineer;
    • Experience with Pandas;
    • Experience with SQL DB / NoSQL (Redis, Mongo, Elasticsearch) / BigQuery;
    • Familiarity with Amazon Web Services;
    • Knowledge of data algorithms and data structures is a MUST;
    • Experience working with high-volume tables (10M+ rows).
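    Working with 10M+ row tables usually means streaming and chunking rather than loading everything into memory at once. A minimal stdlib sketch of that pattern; the rows and batch size are invented for illustration:

```python
from itertools import islice

def batched(rows, size):
    """Yield fixed-size chunks so a huge table never sits in memory at once."""
    it = iter(rows)
    while chunk := list(islice(it, size)):
        yield chunk

# Simulated table; in practice rows would stream from a DB cursor or file.
rows = ({"id": i, "value": i * 2} for i in range(10))
totals = [sum(r["value"] for r in chunk) for chunk in batched(rows, 4)]
print(totals)  # [12, 44, 34]
```

    The same chunked shape carries over to pandas (`read_sql` with `chunksize`) or Spark partitions.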


    Optional skills (as a plus):
    • Experience with Spark (pyspark);
    • Experience with Airflow;
    • Experience with Kafka;
    • Experience in statistics;
    • Knowledge of DS and machine learning algorithms.

     

    Key responsibilities:
    • Create ETL pipelines and data management solutions (API, Integration logic);
    • Implement various data processing algorithms;
    • Contribute to building forecasting, recommendation, and classification models.

     

    We offer:

    • Great networking opportunities with international clients, challenging tasks;

    • Building interesting projects from scratch using new technologies;

    • Personal and professional development opportunities;

    • Competitive salary fixed in USD;

    • Paid vacation and sick leaves;

    • Flexible work schedule;

    • Friendly working environment with minimal hierarchy;

    • Team building activities, corporate events.

  • · 1111 views · 111 applications · 2d

    Junior Data Engineer

    Full Remote · Countries of Europe or Ukraine · 0.5 years of experience · B1 - Intermediate

    We seek a Junior Data Engineer with basic pandas and SQL experience.

    At Dataforest, we are actively seeking Data Engineers of all experience levels.

    If you're ready to take on a challenge and join our team, please send us your resume.

    We will review it and discuss potential opportunities with you.

     

    Requirements:

    • 6+ months of experience as a Data Engineer

    • Experience with SQL;

    • Experience with Python;

     

     

    Optional skills (as a plus):

    • Experience with ETL / ELT pipelines;

    • Experience with PySpark;

    • Experience with Airflow;

    • Experience with Databricks;

     

    Key Responsibilities:

    • Create ETL/ELT pipelines and data management solutions;

    • Work with SQL queries for data extraction and analysis;

    • Analyze data and apply data processing algorithms to solve business problems;
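    A toy end-to-end ETL pipeline of the kind listed above can be sketched with the standard library alone; the CSV sample and table schema are invented:

```python
import csv, io, sqlite3

# Extract: read raw CSV (here an in-memory string standing in for a file).
raw = "name,age\nAlice, 30\nBob,\nCarol,25\n"
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: strip whitespace, drop rows with a missing age, cast to int.
clean = [
    {"name": r["name"].strip(), "age": int(r["age"])}
    for r in records
    if r["age"].strip()
]

# Load: write into SQLite and run a sanity-check aggregate.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE people (name TEXT, age INTEGER)")
db.executemany("INSERT INTO people VALUES (:name, :age)", clean)
avg = db.execute("SELECT AVG(age) FROM people").fetchone()[0]
print(avg)  # 27.5
```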

     

     

    We offer:

    • Onboarding phase with hands-on experience with major DE stack, including Pandas, Kafka, Redis, Cassandra, and Spark

    • Opportunity to work with a highly skilled engineering team on challenging projects;

    • Interesting projects with new technologies;

    • Great networking opportunities with international clients, challenging tasks;

    • Building interesting projects from scratch using new technologies;

    • Personal and professional development opportunities;

    • Competitive salary fixed in USD;

    • Paid vacation and sick leaves;

    • Flexible work schedule;

    • Friendly working environment with minimal hierarchy;

    • Team building activities, corporate events.

  • · 60 views · 6 applications · 19d

    Data Engineer

    Full Remote · Ukraine · Product · 3 years of experience · B1 - Intermediate

    We are looking for an experienced Data Engineer to design and maintain robust data infrastructure across our systems. In this role, you will be responsible for building scalable data pipelines, ensuring data integrity, and integrating third-party data sources. Your primary focus will be to enable efficient data flow and support analytical capabilities across the organization. You will also contribute to the development of our data architecture, implement best engineering practices, and collaborate closely with cross-functional teams to turn raw data into actionable insights.

     

    Responsibilities

    • Communicate with both technical and non-technical audiences to gather requirements
    • Review and analyze data and logic to ensure consistency and accuracy
    • Design, implement, and maintain data pipelines for efficient data flow
    • Integrate and support developed solutions
    • Research and evaluate third-party components for potential use
    • Follow best engineering practices: refactoring, code review, testing, continuous delivery, and Scrum
    • Design, optimize, and support data storage

     

    Requirements

    • 5+ years of experience in data engineering
    • Experience in requirement gathering and communication with stakeholders
    • Strong knowledge of DWH (data warehouse) architecture and principles
    • Practical experience building ETL pipelines and designing data warehouses
    • Deep experience with Python with a strong focus on PySpark
    • Proficiency in SQL and databases such as PostgreSQL, ClickHouse, MySQL
    • Hands-on experience with data scraping and integrating third-party sources and APIs
    • Solid understanding of software design patterns, algorithms, and data structures
    • Intermediate English proficiency

     

    Will be a plus

    • Experience with RabbitMQ or Kafka
    • Understanding of web application architecture
    • Familiarity with DataOps practices
    • Background in FinTech or Trading domains

     

    We offer

    • Tax expenses coverage for private entrepreneurs in Ukraine
    • Expert support and guidance for Ukrainian private entrepreneurs
    • 20 paid vacation days per year
    • 10 paid sick leave days per year
    • Public holidays as per the company's approved Public holiday list
    • Medical insurance
    • Opportunity to work remotely
    • Professional education budget
    • Language learning budget
    • Wellness budget (gym membership, sports gear and related expenses)


     

  • · 24 views · 3 applications · 19d

    Cloud System Engineer

    Full Remote · Ukraine · Product · 2 years of experience · A2 - Elementary

    Requirements:

    • Knowledge of the core functionality of virtualization platforms;
    • Experience implementing and migrating workloads in virtualized environments;
    • Experience with complex IT solutions and Hybrid Cloud projects;
    • Good understanding of IT infrastructure services is a plus;
    • Strong troubleshooting skills in complex environments in case of failure;
    • At least basic knowledge of networking and information security is an advantage;
    • Hyper-V, Proxmox, VMware experience would be an advantage;
    • Experience in services outsourcing (as customer and/or provider) is an advantage;
    • 2+ years of work experience in a similar position;
    • Scripting and programming experience in PowerShell/Bash is an advantage;
    • Strong team communication skills, both verbal and written;
    • Experience writing and preparing technical documentation;
    • English: intermediate level at minimum, mandatory for communication with global teams;
    • Industry certification in the relevant solution area.

    Areas of responsibility include:

    • Participating in deployment and IT-infrastructure migration projects, Hybrid Cloud solution projects, and client support;
    • Consulting on migrating IT workloads in complex infrastructures;
    • Presales support (articulating service value in the sales process; up- and cross-sell capability);
    • Project documentation: technical concepts;
    • Education and development in the professional area, including necessary certifications.
  • · 9 views · 0 applications · 14d

    Data Engineer TL / Poland

    Office Work · Poland · 4 years of experience · B2 - Upper Intermediate

    On behalf of our customer, we are seeking a DataOps Team Lead to join their global R&D department.

     

    Our customer is an innovative technology company led by data scientists and engineers devoted to mobile app growth. They focus on solving the key challenge of growth for mobile apps by building Machine Learning and Big Data-driven technology that can both accurately predict what apps a user will like and connect them in a compelling way. 

    We are looking for a data-centric, quality-driven team leader focusing on data process observability, someone passionate about building high-quality data products and processes as well as supporting production data processes and ad-hoc data requests.

    As a DataOps Team Lead, you will be in charge of quality of service as well as the quality of the data and knowledge platform for all data processes. You will coordinate with stakeholders and play a major role in driving the business by promoting the quality and stability of data performance and lifecycle, and by giving the operational groups immediate ability to affect daily business outcomes.

     

    Responsibilities:

    • Process monitoring - managing and monitoring the daily data processes; troubleshooting server and process issues, escalating bugs and documenting data issues.
    • Ad-hoc operation configuration changes - Be the extension of the operation side into the data process; Using Airflow and python scripting alongside SQL to extract specific client relevant data points and calibrate certain aspects of the process.
    • Data quality automation - Creating and maintaining data quality tests and validations using python code and testing frameworks.
    • Metadata store ownership - creating and maintaining the metadata store, which holds metadata on tables, columns, calculations, and lineage; participating in the design and development of the knowledge-base metastore and UX, so as to be the pivotal point of contact for questions about tables and columns and how they are connected (What is the data source? What is it used for? Why is this field calculated this way?).
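    The data-quality automation described above comes down to running rule-based checks over batches of records. A minimal pure-Python sketch; the rules and sample rows are invented, and a real setup would wrap the same idea in a testing framework:

```python
# Toy data-quality validation: uniqueness and range checks over a batch.
def validate(rows):
    """Return a list of human-readable violations for a batch of records."""
    errors = []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row["id"] in seen_ids:
            errors.append(f"row {i}: duplicate id {row['id']}")
        seen_ids.add(row["id"])
        if row["amount"] is None or row["amount"] < 0:
            errors.append(f"row {i}: bad amount {row['amount']}")
    return errors

batch = [{"id": 1, "amount": 10.0}, {"id": 1, "amount": -5.0}]
print(validate(batch))
# ['row 1: duplicate id 1', 'row 1: bad amount -5.0']
```

    In an Airflow setting, a check like this would typically run as its own task and fail the DAG run when the returned list is non-empty.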

       

    Requirements:

    • Over 2 years in a leadership role within a data team.
    • Over 3 years of hands-on experience as a Data Engineer, with strong proficiency in Python and Airflow.
    • Solid background in working with both SQL and NoSQL databases and data warehouses, including but not limited to MySQL, Presto, Athena, Couchbase, MemSQL, and MongoDB.
    • Bachelor’s degree or higher in Computer Science, Mathematics, Physics, Engineering, Statistics, or a related technical discipline.
    • Highly organized with a proactive mindset.
    • Strong service orientation and a collaborative approach to problem-solving.

       

    Nice to have skills:

    • Previous experience as a NOC or DevOps engineer is a plus.
    • Familiarity with PySpark is considered an advantage.

       

    What we can offer you

    • Remote work from Poland, flexible working schedule
    • Accounting support & consultation
    • Opportunities for learning and developing on the project
    • 20 working days of annual vacation
    • 5 days paid sick leaves/days off; state holidays
    • Working equipment provided
  • · 45 views · 8 applications · 8d

    Data Engineer

    Hybrid Remote · Ukraine (Kyiv) · Product · 2 years of experience · B1 - Intermediate

    3Shape develops 3D scanners and software solutions that enable dental and hearing professionals to treat more people, more effectively. Our products are market leading innovative solutions that make a real difference in the lives of both patients and dental professionals around the world.
     

    3Shape is headquartered in Copenhagen, with development teams in Denmark, Ukraine, North Macedonia and with a production site in Poland.
     

    We are a global company with a presence in Europe, Asia, and the Americas. Founded in 2000, today we provide services to customers in over 130 countries. Our growing talent pool of over 2,500 employees spans 45+ nationalities.
     

    3Shape as an employer is committed to Ukraine. Our UA office was founded in 2006, and we continue to grow, hire, and take care of our employees even during the war in Ukraine. Among other actions, we support our contractors who are called up for military service and look after our colleagues' mental health through various activities.
     

    If you are looking for stability in your future, we are the right place for you.


    About the role:

    The Customer Data Strategy is a high-priority initiative with significant potential and senior management buy-in. Join our expanding team that currently includes a Data Analyst, Data Engineer, Data Architect, and Manager.
     

    Key responsibilities: 

    • Develop and optimize Azure Databricks in collaboration with cross-functional teams to enable a 'one-stop-shop' for analytical data
    • Translate customer-focused commercial needs into concrete data products
    • Build data products to unlock commercial value and help integrate systems
    • Coordinate technical alignment meetings between functions
    • Act as customer data ambassador to improve 'data literacy' across the organization

    Your profile:

    • Experience working with data engineering in a larger organization, tech start-up, or as an external consultant
    • Extensive experience with Azure Databricks, Apache Spark, and Delta Lake
    • Proficiency in Python, PySpark and SQL
    • Experience with optimizing and automating data engineering processes
    • Familiarity with GitHub and GitHub Actions for CI/CD processes
    • Knowledge of Terraform as a plus


    Being part of us means:

    • Make an impact in one of the most exciting Danish tech companies in the medical device industry
    • Work on solutions used by thousands of dental professionals worldwide
    • Be part of 3Shape's continued accomplishments and growth
    • Contribute to meaningful work that changes the future of dentistry
    • Develop professionally in a unique and friendly environment
    • Enjoy a healthy work-life balance
    • Occasional business trips to Western Europe
       

    We offer:

    • 39 hours of cooperation per week within a flexible time frame
    • 24 business days of annual leaves
    • Medical insurance (with additional Dentistry Budget and 10 massaging sessions per year included)
    • Possibility of flexible remote cooperation
    • Good working conditions in a comfortable office near the National Technical University “KPI”, including blackout-ready infrastructure, a corporate paper book library, and a gym room with a shower
    • A parking lot with free spaces for employees
    • Partial compensation of lunches
    • Paid sick leaves and child sick leaves
    • Maternity, paternity and family issues leaves
    • Well-being program: monthly well-being meetings and individual psychology hot-line
       

    Want to join us and change the future of dentistry?

  • · 446 views · 40 applications · 2d

    Middle Data Engineer

    Full Remote · Countries of Europe or Ukraine · 2 years of experience · B1 - Intermediate

    Dataforest is looking for a Middle Data Engineer to join our team and work on the Dropship project — a cutting-edge data intelligence platform for e-commerce analytics. You will be responsible for developing and maintaining a scalable data architecture that powers large-scale data collection, analysis, and integrations. We are waiting for your CV!

    Requirements:

    - 2+ years of commercial experience with Python.

    - Experience working with PostgreSQL databases.
    - Profound understanding of algorithms and their complexities, with the ability to analyze and optimize them effectively.
    - Solid understanding of ETL principles and best practices.
    - Excellent collaborative and communication skills, with demonstrated ability to mentor and support team members.
    - Experience working with Linux environments, cloud services (AWS), and Docker.
    - Strong decision-making capabilities with the ability to work independently and proactively.

    Will be a plus:
    - Experience in web scraping, data extraction, cleaning, and visualization.
    - Understanding of multiprocessing and multithreading, including process and thread management.
    - Familiarity with Redis.
    - Excellent programming skills in Python with a strong emphasis on optimization and code structuring.
    - Experience with Flask / Flask-RESTful for API development.
    - Knowledge and experience with Kafka.
     

    Key Responsibilities:

    - Develop and maintain a robust data processing architecture using Python.

    - Design and manage data pipelines using Kafka and SQS.

    - Optimize code for better performance and maintainability.

    - Design and implement efficient ETL processes.

    - Work with AWS technologies to ensure flexible and reliable data processing systems.

    - Collaborate with colleagues, actively participate in code reviews, and improve technical knowledge.

    - Take responsibility for your tasks and suggest improvements to processes and systems.
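    The Kafka/SQS pipeline work listed above follows a producer/consumer shape that can be approximated with the standard library's queue module; broker concerns such as partitions, acks, and redelivery are deliberately out of scope in this sketch, and the messages are invented:

```python
import queue
import threading

# Stand-in for a Kafka/SQS stage: one producer, one consumer, FIFO delivery.
q = queue.Queue()
results = []

def consumer():
    while True:
        msg = q.get()
        if msg is None:                  # sentinel: shut down cleanly
            break
        results.append(msg.upper())      # the "processing" step
        q.task_done()

t = threading.Thread(target=consumer)
t.start()
for msg in ["order.created", "order.paid"]:
    q.put(msg)
q.put(None)
t.join()
print(results)  # ['ORDER.CREATED', 'ORDER.PAID']
```

    Swapping `q.put`/`q.get` for a Kafka producer/consumer client keeps the same overall structure while adding durability and replay.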

    We offer:

    - Working in a fast-growing company;

    - Great networking opportunities with international clients, challenging tasks;

    - Personal and professional development opportunities;

    - Competitive salary fixed in USD;

    - Paid vacation and sick leaves;

    - Flexible work schedule;

    - Friendly working environment with minimal hierarchy;

    - Team building activities, corporate events.

  • · 43 views · 3 applications · 22d

    Senior Market Data Engineer

    Full Remote · Countries of Europe or Ukraine · 5 years of experience · B2 - Upper Intermediate

    We are looking for a skilled and experienced Software Engineer to join our team, building high-performance real-time data pipelines to process financial market data, including security prices for various asset classes such as equities, options, futures, and more. You will play a key role in designing, developing, and optimizing data pipelines that handle large volumes of data with low latency and high throughput, ensuring that our systems can process market data in real time and batch modes.

     

    Key Responsibilities:

    • Architect, develop, and enhance market data systems
    • Contribute to the software development lifecycle in a collaborative team environment, including design, implementation, testing, and support
    • Design highly efficient, scalable, mission-critical systems
    • Maintain good software quality and test coverage
    • Participate in code reviews
    • Troubleshoot incidents and reported bugs

     

    Requirements:

    • Bachelor’s or advanced degree in Computer Science or Electrical Engineering
    • Proficiency in the following programming languages: Java, Python or Go
    • Prior experience working with equities or futures market data, such as CME data, US Equities Options, is a must
    • Experience in engineering and supporting Market Data feed handlers
    • Technically fluent (Python, SQL, JSON, ITCH, FIX, CSV); comfortable discussing pipelines and validation specs.
    • Prior experience working on tick data storage, such as KDB+ or Clickhouse
    • Familiarity with time series analysis
    • Good understanding of the Unix/Linux programming environment
    • Expertise with SQL and relational databases
    • Excellent problem-solving and communication skills
    • Self-starter who works well in a fast-paced environment
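    Of the formats listed in the requirements, FIX is the least self-describing: a message is a string of SOH-delimited tag=value pairs. A minimal parser sketch; the sample message is invented, and the tag meanings (8 = BeginString, 35 = MsgType, 55 = Symbol, 44 = Price) follow the FIX specification:

```python
# FIX tag=value parsing: fields are separated by the SOH control character.
SOH = "\x01"

def parse_fix(raw):
    """Split a raw FIX message into a {tag: value} dict."""
    return dict(field.split("=", 1) for field in raw.strip(SOH).split(SOH))

msg = SOH.join(["8=FIX.4.2", "35=D", "55=AAPL", "44=187.25"]) + SOH
fields = parse_fix(msg)
print(fields["55"], float(fields["44"]))  # AAPL 187.25
```

    A production feed handler would additionally verify BodyLength (tag 9) and CheckSum (tag 10) and handle repeating groups, which this sketch omits.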
  • · 23 views · 2 applications · 12d

    Back End and Data Engineer

    Hybrid Remote · Spain, Ukraine · Product · 3 years of experience · C1 - Advanced

    Back End Data Engineer

    Company: NeuCurrent
    Location: Remote / Hybrid

    NeuCurrent is an AI-powered CRM and omnichannel marketing platform for retailers. NeuCurrent empowers brands to connect with their customers more intelligently through real-time personalization across email, SMS, WhatsApp, and push.

     

    We are looking for a motivated, bright, and independent Data and Back End Engineer with at least 3 years' experience who is willing to take on broad responsibilities across product development and wants to progress quickly to a senior role within the company.

     

    Why you should apply:

    This is a challenging job: building something new is not easy. We are building a new product, it takes time to grow and get results, and working for a start-up means plenty of challenges and frequent multi-tasking as we grow.

    BUT if you want to be part of a fast-growing business and team, generate your own ideas, and see the direct impact of your work on our users' businesses, you should consider applying. Apply if you want to try yourself in something new!

     

    About company:

    At NeuCurrent we are building a disruptive technology for retail which enables all retailers to implement data driven customer retention without expensive data analytics and marketing resources. We are actively expanding in Europe at present. 

     

    Our values:

    We care for our people and customers

    We develop innovation that generates real benefit

    We are not afraid to take risks and see failure as a learning opportunity

    We are determined and always look for solutions to move forward

     

    Responsibilities and Tech Stacks:

    Data Processing

     

    • Maintain and improve data collection pipelines, which are the backbone of our product
    • Create new integrations
    • Maintain DWH architecture
    • Support and improve Kubernetes infrastructure on GCP
    • Develop infrastructure for ML pipelines

     

    Tech stack:

    Redis, Argo

    Python 3, Pandas, Numpy, Sqlalchemy

    PostgreSQL, Google BigQuery

    Docker, Kubernetes, Google Cloud Platform

    Flask

     

    Backend Development

     

    • Support and further develop the product backend API
    • Develop infrastructure for ML pipelines

     

    Tech stack:

    Python, Flask

    PostgreSQL

    Docker, Kubernetes, Google Cloud Platform

     

    ML and Data Analysis 

     

    • Improve and support ML models of product recommendation engine
    • Developing dashboards

     

    Tech stack:

    Collaborative filtering models

    Content and hybrid recommendation models

    Learn to rank models

    Sequence prediction neural network models

    Python, Jupyter, pandas

    BigQuery

    dbt (Data Build Tool)

    Looker Studio

     

    Other Requirements:

    Work closely with the founders on product development and improvement

    Actively contribute ideas to product innovation including AI implementation

    Customer focused 

    Good working English is a must, as you will deal with English-speaking customers

     

    Personal Qualities:

    A self-starter who has the experience and confidence to act independently

    Have a positive mindset and attitude to solve problems and represent NeuCurrent with our customers

    A strong team player who is keen to learn and share her/his knowledge

    Enthusiasm and desire to progress professionally to a managerial role quickly

    Be structured and a good communicator 

     

    What we offer:

    You will have a lot of freedom in your field to create direct impact on our clients and NeuCurrent business and product development

    We have a young, highly motivated, and talented team

    You will become an integral part of new product development

    You will have plenty of opportunities to make your own decisions

    You will have unlimited opportunities to progress to managerial roles in our fast growing company

    Options / shares are available for a proven candidate

     

  • · 26 views · 3 applications · 8d

    Lead Data Engineer

    Full Remote · Countries of Europe or Ukraine · 7 years of experience · B2 - Upper Intermediate

    We are seeking a highly skilled Lead Data Engineer to design, develop, and optimize our Data Warehouse solutions. The ideal candidate will have extensive experience in ETL/ELT development, data modeling, and big data technologies, ensuring efficient data processing and analytics. This role requires strong collaboration with Data Analysts, Data Scientists, and Business Stakeholders to drive data-driven decision-making.

     

    Does this relate to you?

    • 7+ years of experience in the data engineering field
    • At least 1 year of experience as a Lead/Architect
    • Strong expertise in SQL and data modeling concepts.
    • Hands-on experience with Airflow.
    • Experience working with Redshift.
    • Proficiency in Python for data processing.
    • Strong understanding of data governance, security, and compliance.
    • Experience in implementing CI/CD pipelines for data workflows.
    • Ability to work independently and collaboratively in an agile environment.
    • Excellent problem-solving and analytical skills.

    A new team member will be in charge of:

    • Design, develop, and maintain scalable data warehouse solutions.
    • Build and optimize ETL/ELT pipelines for efficient data integration.
    • Design and implement data models to support analytical and reporting needs.
    • Ensure data integrity, quality, and security across all pipelines.
    • Optimize data performance and scalability using best practices.
    • Work with big data technologies such as Redshift.
    • Collaborate with cross-functional teams to understand business requirements and translate them into data solutions.
    • Implement CI/CD pipelines for data workflows.
    • Monitor, troubleshoot, and improve data processes and system performance.
    • Stay updated with industry trends and emerging technologies in data engineering.

    Already looks interesting? Awesome! Check out the benefits prepared for you:

    • Regular performance and remuneration reviews
    • Up to 25 paid days off per year for well-being
    • Flexible cooperation hours with work-from-home
    • Fully paid English classes with an in-house teacher
    • Perks on special occasions such as birthdays, marriage, childbirth
    • Referral program with attractive bonuses
    • External & internal training and IT certifications

     

    Ready to try your hand? Don't hesitate to send your CV!

  • · 19 views · 0 applications · 6h

    Data Engineer

    Full Remote · Countries of Europe or Ukraine · 5 years of experience · B2 - Upper Intermediate MilTech 🪖

    Who We Are
     

    OpenMinds is a cognitive defence tech company countering authoritarian influence in the battle for free and open societies. We work with over 30 governments and organisations worldwide, including Ukraine, the UK, and NATO member governments, leading StratCom agencies, and research institutions.

    Our expertise lies in accessing restricted and high-risk environments, including conflict zones and closed platforms.

    We combine ML technologies with deep local expertise. Our team, based in Kyiv, Lviv, London, Ottawa, and Washington, DC, includes behavioural scientists, ML/AI engineers, data journalists, communications experts, and regional specialists.

    Our core values are: speed, experimentation, elegance and focus. We are expanding the team and welcome passionate, proactive, and resourceful professionals who are eager to contribute to the global fight in cognitive warfare.
     

    Who we’re looking for

    OpenMinds is seeking a skilled and curious Data Engineer who’s excited to design and build data systems that power meaningful insight. You’ll work closely with a passionate team of behavioral scientists and ML engineers on creating a robust data infrastructure that supports everything from large-scale narrative tracking to sentiment analysis.
     

    In the position you will:

    • Take ownership of our multi-terabyte data infrastructure, from data ingestion and orchestration to transformation, storage, and lifecycle management
    • Collaborate with data scientists, analysts, ML engineers, and domain experts to develop impactful data solutions
    • Optimize and troubleshoot data infrastructure to ensure high performance, cost-efficiency, scalability, and resilience
    • Stay up-to-date with trends in data engineering and apply modern tools and practices
    • Define and implement best practices for data processing, storage, and governance
    • Translate complex requirements into efficient data workflows that support threat detection and response
       

    We are a perfect match if you have:

    • 5+ years of hands-on experience as a Data Engineer, with a proven track record of leading complex data projects from design to production
    • Highly skilled in SQL and Python for advanced data processing, pipeline development, and optimization
    • Deep understanding of software engineering best practices, including SOLID, error handling, observability, performance tuning, and modular architecture
    • Ability to write, test and deploy production-ready code
    • Extensive experience in database design, data modeling, and modern data warehousing, including ETL orchestration using Airflow or equivalent
    • Familiarity with Google Cloud Platform (GCP) and its data ecosystem (BigQuery, GCS, Pub/Sub, Cloud Run, Cloud Functions, Looker)
    • Open-minded, capable of coming up with creative solutions and adapting to frequently changing circumstances and technological advances
    • Experience in DevOps (Docker/Kubernetes, IaC, CI/CD) and MLOps
    • Fluent in English with excellent communication and cross-functional collaboration skills
       

    We offer:

    • Work in a fast-growing company with proprietary AI technologies, solving the most difficult problems in the domains of social behaviour analytics and national security
    • Competitive market salary
    • Opportunities to present your work at tier-1 conferences, panels, and closed-door briefings
    • Work face-to-face with world-leading experts in their fields, who are our partners and friends
    • Flexible work arrangements, including adjustable hours, location, and remote/hybrid options
    • Unlimited vacation and leave policies
    • Opportunities for professional development within a multidisciplinary team, boasting experience from academia, tech, and intelligence sectors
    • A work culture that values resourcefulness, proactivity, and independence, with a firm stance against micromanagement
  • · 15 views · 1 application · 8d

    Senior ML/GenAI Engineer

    Full Remote · Ukraine · Product · 5 years of experience · B2 - Upper Intermediate

    Senior ML Engineer 

    Full-time / Remote 

     

    About Us

    ExpoPlatform is a UK-based company founded in 2013, delivering advanced technology for online, hybrid, and in-person events across 30+ countries. Our platform provides end-to-end solutions for event organizers, including registration, attendee management, event websites, and networking tools.

     

    Role Responsibilities:

    • Develop AI Agents, tools for AI Agents, API as a service
    • Prepare development and deployment documentation
    • Participate in R&D activities of Data Science team

     

    Required Skills & Experience:

    • 5+ years of experience with DL frameworks (PyTorch and/or TensorFlow)
    • 5+ years of experience in software development in Python
    • Hands-on experience with LLM, RAG, and AI Agent development
    • Experience with Amazon SageMaker, Amazon Bedrock, LangChain, LangGraph, LangSmith, LlamaIndex, Hugging Face, OpenAI
    • Hands-on experience using AI tools for software development and code review to increase efficiency and code quality
    • Knowledge of SQL, NoSQL, and vector databases
    • Understanding of embedding vectors and semantic search
    • Proficiency in Git (Bitbucket) and Docker
    • Upper-Intermediate (B2+) or higher level of English

     

    Would Be a Plus:

    • Hands-on experience with SLM and LLM fine-tuning
    • Education in Data Science, Computer Science, Applied Math or similar
    • AWS certifications (AWS Certified ML or equivalent)
    • Experience with TypeSense
    • Experience with speech recognition, speech-to-text ML models

     

    What We Offer:

    • Career growth with an international team.
    • Competitive salary and financial stability.
    • Flexible working hours (Mon-Fri, 8 hours).
    • Free English courses and a budget for education


     
