Jobs: Data Engineer (161)
  • · 44 views · 5 applications · 17d

    Senior Data Platform Engineer

    Full Remote · Countries of Europe or Ukraine · Product · 7 years of experience · English - B2

    🎯 What You’ll Actually Do

    • Architect and run high-load, production-grade data pipelines where correctness and latency matter.
    • Design systems that survive schema changes, reprocessing, and partial failures.
    • Own data availability, freshness, and trust - not just pipeline success.
    • Make hard calls: accuracy vs cost, speed vs consistency, rebuild vs patch.
    • Build guardrails so downstream consumers (Analysts, Product, Ops) don’t break.
    • Improve observability: monitoring, alerts, data quality checks, and SLAs (see the sketch after this list).
    • Partner closely with backend engineers, data analysts, and Product - no handoffs, shared ownership.
    • Debug incidents, own RCA, and make sure the same class of failure doesn’t return.
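
    A minimal illustration of the observability point above: a freshness check that fails loudly when a table breaches its SLA. This is a generic sketch, not the team's actual tooling; the table, column, and one-hour threshold are assumptions, and SQLite stands in for whatever warehouse is in use.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_id TEXT, ingested_at TEXT)")
conn.execute(
    "INSERT INTO events VALUES ('e1', ?)",
    (datetime.now(timezone.utc).isoformat(),),
)

def check_freshness(conn, table: str, max_lag: timedelta) -> None:
    """Raise if the newest ingested_at timestamp breaches the freshness SLA."""
    (latest,) = conn.execute(f"SELECT MAX(ingested_at) FROM {table}").fetchone()
    if latest is None:
        raise RuntimeError(f"{table}: no rows ingested yet")
    lag = datetime.now(timezone.utc) - datetime.fromisoformat(latest)
    if lag > max_lag:
        raise RuntimeError(f"{table}: newest row is {lag} old, SLA is {max_lag}")

check_freshness(conn, "events", max_lag=timedelta(hours=1))  # an alert hook would go here
print("freshness check passed")
```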

    This is a hands-on IC role with platform-level responsibility.

     

    🧠 What You Bring

    • 5+ years in data or backend engineering on real production systems.
    • Strong experience with columnar analytical databases (ClickHouse, Snowflake, BigQuery, similar).
    • Experience with event-driven / streaming systems (Kafka, pub/sub, CDC, etc.).
    • Strong SQL + at least one general-purpose language (Python, Java, Scala).
    • You think in failure modes, not happy paths.
    • You explain why something works - and when it shouldn’t be used.

    Bonus: You’ve rebuilt or fixed a data system that failed in production.

     

    πŸ”§ How We Work

    • Reliability > elegance. Correct data beats clever data.
    • Ownership > tickets. You run what you build.
    • Trade-offs > dogma. Context matters.
    • Direct > polite. We fix problems, not dance around them.
    • One team, one system. No silos.

    πŸ”₯ What We Offer

    • Fully remote.
    • Unlimited vacation + paid sick leave.
    • Quarterly performance bonuses.
    • Medical insurance for you and your partner.
    • Learning budget (courses, conferences, certifications).
    • High trust, high autonomy.
    • Zero bureaucracy. Real engineering problems.

       

    πŸ‘‰ Apply if you see data platforms as systems to be engineered - not pipelines to babysit.

  • · 43 views · 10 applications · 17d

    Senior Data Engineer

    Full Remote · Countries of Europe or Ukraine · Product · 7 years of experience · English - B2

    🎯 What You’ll Actually Do

    • Design and run high-throughput, production-grade data pipelines.
    • Own data correctness, latency, and availability end to end.
    • Make hard trade-offs: accuracy vs speed, cost vs freshness, rebuild vs patch.
    • Design for change - schema evolution, reprocessing, and new consumers.
    • Protect BI, Product, and Ops from breaking changes and silent data issues.
    • Build monitoring, alerts, and data quality checks that catch problems early.
    • Work side-by-side with Product, BI, and Engineering — no handoffs, shared ownership.
    • Step into incidents, own RCA, and make sure the same class of failure never repeats.

    This is a hands-on senior IC role with real accountability.

     

     

    🧠 What You Bring (Non-Negotiable)

    • 5+ years in data or backend engineering on real production systems.
    • Strong experience with analytical databases
      (ClickHouse, Snowflake, BigQuery, or similar).
    • Experience with event-driven or streaming systems
      (Kafka, CDC, pub/sub).
    • Solid understanding of (see the sketch after this list):
      • at-least-once vs exactly-once semantics
      • schema evolution & backfills
      • mutation and reprocessing costs
    • Strong SQL and at least one programming language
      (Python, Java, Scala, etc.).
    • You don’t just ship - you own what happens after.
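
    A minimal sketch of the at-least-once vs exactly-once point above: if the sink is keyed and idempotent, redelivered events do not change the result, which is how "effectively exactly-once" is usually achieved in practice. The table and event shape are illustrative assumptions; SQLite stands in for the real store.

```python
# Turning an at-least-once feed into effectively-exactly-once results by
# making the sink idempotent (dedup on a primary key).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS wallet_events ("
    "  event_id TEXT PRIMARY KEY,"   # dedup key: redeliveries collide here
    "  user_id  TEXT,"
    "  amount   REAL"
    ")"
)

def consume(batch):
    """Apply a batch that may contain redelivered events.

    INSERT OR IGNORE drops duplicates by primary key, so replaying the same
    batch (normal under at-least-once delivery) leaves totals unchanged.
    """
    with conn:  # one transaction per batch: all-or-nothing on partial failure
        conn.executemany(
            "INSERT OR IGNORE INTO wallet_events (event_id, user_id, amount) "
            "VALUES (:event_id, :user_id, :amount)",
            batch,
        )

batch = [
    {"event_id": "e1", "user_id": "u1", "amount": 10.0},
    {"event_id": "e2", "user_id": "u1", "amount": 5.0},
]
consume(batch)
consume(batch)  # simulated redelivery: no double counting
total = conn.execute("SELECT SUM(amount) FROM wallet_events").fetchone()[0]
print(total)  # 15.0
```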

       

    πŸ”§ How We Work

    • Reliability > cleverness.
    • Ownership > process.
    • Impact > output.
    • Direct > polite.
    • One team, one system.

       

    πŸ”₯ What We Offer

    • Fully remote (Europe).
    • Unlimited vacation + paid sick leave.
    • Quarterly performance bonuses.
    • Medical insurance for you and your partner.
    • Learning budget (courses, conferences, certifications).
    • High trust, high autonomy.
    • No bureaucracy. Real data problems.

       

    πŸ‘‰ Apply if you treat data like production software - and feel uncomfortable when numbers can’t be trusted.

  • · 51 views · 9 applications · 17d

    Senior Data Engineer

    Full Remote · Countries of Europe or Ukraine · Product · 5 years of experience · English - None

    About us:
    Data Science UA is a service company with strong data science and AI expertise. Our journey began in 2016 with the organization of the first Data Science UA conference, setting the foundation for our growth. Over the past 9 years, we have diligently fostered the largest Data Science Community in Eastern Europe, boasting a network of over 30,000 top AI engineers.

    About the client:
    We are working with a new-generation data service provider specializing in data consulting and data-driven digital marketing, dedicated to transforming data into business impact across the entire value chain of organizations. The company’s data-driven services are built upon the deep AI expertise it has acquired serving a 1,000+ client base around the globe. The company has 1,000 employees across 20 offices who are focused on accelerating digital transformation.

    About the role:
    We are seeking a Senior Data Engineer (Azure) to design and maintain data pipelines and systems for analytics and AI-driven applications. You will work on building reliable ETL/ELT workflows and ensuring data integrity across the organization.

    Required skills:
    - 6+ years of experience as a Data Engineer, preferably in Azure environments.
    - Proficiency in Python, SQL, NoSQL, and Cypher for data manipulation and querying.
    - Hands-on experience with Airflow and Azure Data Services for pipeline orchestration (see the sketch after this list).
    - Strong understanding of data modeling, ETL/ELT workflows, and data warehousing concepts.
    - Experience in implementing DataOps practices for pipeline automation and monitoring.
    - Knowledge of data governance, data security, and metadata management principles.
    - Ability to work collaboratively with data science and analytics teams.
    - Excellent problem-solving and communication skills.
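
    A minimal sketch (recent Airflow 2.x) of the orchestration experience asked for above. The DAG id, task callables, and Azure targets are placeholders, not project assets; a real pipeline would use Azure-specific operators and connections.

```python
# A daily extract -> transform -> load DAG skeleton.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Placeholder: pull a daily increment from the source system.
    return [{"id": 1, "value": 42}]

def transform(**context):
    # Placeholder: reshape extracted rows into the warehouse format.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{"id": r["id"], "value_doubled": r["value"] * 2} for r in rows]

def load(**context):
    # Placeholder: write transformed rows to the target (e.g. ADLS / Synapse).
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"loading {len(rows)} rows")

with DAG(
    dag_id="azure_daily_etl",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```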

    Responsibilities:
    - Transform data into formats suitable for analysis by developing and maintaining processes for data transformation, structuring, metadata management, and workload management.
    - Design, implement, and maintain scalable data pipelines on Azure.
    - Develop and optimize ETL/ELT processes for various data sources.
    - Collaborate with data scientists and analysts to ensure data readiness.
    - Monitor and improve data quality, performance, and governance.

  • · 71 views · 4 applications · 17d

    Data Engineer

    Full Remote · Ukraine · Product · 3 years of experience · English - None

    About us:
    Data Science UA is a service company with strong data science and AI expertise. Our journey began in 2016 with uniting top AI talents and organizing the first Data Science tech conference in Kyiv. Over the past 9 years, we have diligently fostered one of the largest Data Science & AI communities in Europe.

    About the client:
    Our client is an IT company that develops technological solutions and products to help companies reach their full potential and meet the needs of their users. The team comprises over 600 specialists in IT and Digital, with solid expertise in various technology stacks necessary for creating complex solutions.

    About the role:
    We are looking for a Data Engineer (NLP-Focused) to build and optimize the data pipelines that fuel the Ukrainian LLM and NLP initiatives. In this role, you will design robust ETL/ELT processes to collect, process, and manage large-scale text and metadata, enabling the Data Scientists and ML Engineers to develop cutting-edge language models.

    You will work at the intersection of data engineering and machine learning, ensuring that the datasets and infrastructure are reliable, scalable, and tailored to the needs of training and evaluating NLP models in a Ukrainian language context.

    Requirements:
    - Education & Experience: 3+ years of experience as a Data Engineer or in a similar role, building data-intensive pipelines or platforms. A Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field is preferred. Experience supporting machine learning or analytics teams with data pipelines is a strong advantage.
    - NLP Domain Experience: Prior experience handling linguistic data or supporting NLP projects (e.g., text normalization, handling different encodings, tokenization strategies). Knowledge of Ukrainian text sources and data sets, or experience with multilingual data processing, can be an advantage given the project’s focus.
    - Understanding of FineWeb2 or a similar processing pipeline approach.
    - Data Pipeline Expertise: Hands-on experience designing ETL/ELT processes, including extracting data from various sources, using transformation tools, and loading into storage systems. Proficiency with orchestration frameworks like Apache Airflow for scheduling workflows. Familiarity with building pipelines for unstructured data (text, logs) as well as structured data.
    - Programming & Scripting: Strong programming skills in Python for data manipulation and pipeline development. Experience with NLP packages (spaCy, NLTK, langdetect, fasttext, etc.). Experience with SQL for querying and transforming data in relational databases. Knowledge of Bash or other scripting for automation tasks. Writing clean, maintainable code and using version control (Git) for collaborative development.
    - Databases & Storage: Experience working with relational databases (e.g., PostgreSQL, MySQL), including schema design and query optimization. Familiarity with NoSQL or document stores (e.g., MongoDB) and big data technologies (HDFS, Hive, Spark) for large-scale data is a plus. Understanding of or experience with vector databases (e.g., Pinecone, FAISS) is beneficial, as the NLP applications may require embedding storage and fast similarity search.
    - Cloud Infrastructure: Practical experience with cloud platforms (AWS, GCP, or Azure) for data storage and processing. Ability to set up services such as S3/Cloud Storage, data warehouses (e.g., BigQuery, Redshift), and use cloud-based ETL tools or serverless functions. Understanding of infrastructure-as-code (Terraform, CloudFormation) to manage resources is a plus.
    - Data Quality & Monitoring: Knowledge of data quality assurance practices. Experience implementing monitoring for data pipelines (logs, alerts) and using CI/CD tools to automate pipeline deployment and testing. An analytical mindset to troubleshoot data discrepancies and optimize performance bottlenecks.
    - Collaboration & Domain Knowledge: Ability to work closely with data scientists and understand the requirements of machine learning projects. Basic understanding of NLP concepts and the data needs for training language models, so you can anticipate and accommodate the specific forms of text data and preprocessing they require. Good communication skills to document data workflows and to coordinate with team members across different functions.

    Nice to have:
    - Advanced Tools & Frameworks: Experience with distributed data processing frameworks (such as Apache Spark or Databricks) for large-scale data transformation, and with message streaming systems (Kafka, Pub/Sub) for real-time data pipelines. Familiarity with data serialization formats (JSON, Parquet) and handling of large text corpora.
    - Web Scraping Expertise: Deep experience in web scraping, using tools like Scrapy, Selenium, or Beautiful Soup, and handling anti-scraping challenges (rotating proxies, rate limiting). Ability to parse and clean raw text data from HTML, PDFs, or scanned documents.
    - CI/CD & DevOps: Knowledge of setting up CI/CD pipelines for data engineering (using GitHub Actions, Jenkins, or GitLab CI) to test and deploy changes to data workflows. Experience with containerization (Docker) to package data jobs and with Kubernetes for scaling them is a plus.
    - Big Data & Analytics: Experience with analytics platforms and BI tools (e.g., Tableau, Looker) used to examine the data prepared by the pipelines. Understanding of how to create and manage data warehouses or data marts for analytical consumption.
    - Problem-Solving: Demonstrated ability to work independently in solving complex data engineering problems, optimizing existing pipelines, and implementing new ones under time constraints. A proactive attitude to explore new data tools or techniques that could improve the workflows.

    Responsibilities:
    - Design, develop, and maintain ETL/ELT pipelines for gathering, transforming, and storing large volumes of text data and related information.
    - Ensure pipelines are efficient and can handle data from diverse sources (e.g., web crawls, public datasets, internal databases) while maintaining data integrity.
    - Implement web scraping and data collection services to automate the ingestion of text and linguistic data from the web and other external sources. This includes writing crawlers or using APIs to continuously collect data relevant to the language modeling efforts.
    - Implement NLP/LLM-specific data processing: text cleaning and normalization, filtering of toxic content, de-duplication, de-noising, and detection and removal of personal data (see the sketch after this list).
    - Form specific SFT/RLHF datasets from existing data, including data augmentation and labeling with an LLM as a teacher.
    - Set up and manage cloud-based data infrastructure for the project. Configure and maintain data storage solutions (data lakes, warehouses) and processing frameworks (e.g., distributed compute on AWS/GCP/Azure) that can scale with growing data needs.
    - Automate data processing workflows and ensure their scalability and reliability.
    - Use workflow orchestration tools like Apache Airflow to schedule and monitor data pipelines, enabling continuous and repeatable model training and evaluation cycles.
    - Maintain and optimize analytical databases and data access layers for both ad-hoc analysis and model training needs.
    - Work with relational databases (e.g., PostgreSQL) and other storage systems to ensure fast query performance and well-structured data schemas.
    - Collaborate with Data Scientists and NLP Engineers to build data features and datasets for machine learning models.
    - Provide data subsets, aggregations, or preprocessing as needed for tasks such as language model training, embedding generation, and evaluation.
    - Implement data quality checks, monitoring, and alerting. Develop scripts or use tools to validate data completeness and correctness (e.g., ensuring no critical data gaps or anomalies in the text corpora), and promptly address any pipeline failures or data issues. Implement data version control.
    - Manage data security, access, and compliance.
    - Control permissions to datasets and ensure adherence to data privacy policies and security standards, especially when dealing with user data or proprietary text sources.
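
    A minimal sketch of the corpus-cleaning responsibility referenced above: Unicode normalization, a naive regex-based PII scrub, removal of trivial fragments, and exact de-duplication by hash. Real pipelines (FineWeb2-style) add language identification, toxicity filtering, and fuzzy dedup; everything here is an illustrative simplification.

```python
import hashlib
import re
import unicodedata

# Naive PII patterns; a real pipeline would use dedicated PII-detection tooling.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s\-()]{8,}\d")

def normalize(text: str) -> str:
    """Unicode-normalize and collapse whitespace."""
    text = unicodedata.normalize("NFC", text)
    return re.sub(r"\s+", " ", text).strip()

def scrub_pii(text: str) -> str:
    """Replace obvious e-mails and phone numbers with placeholders."""
    text = EMAIL_RE.sub("<EMAIL>", text)
    return PHONE_RE.sub("<PHONE>", text)

def clean_corpus(docs):
    """Yield cleaned documents, dropping exact duplicates and short fragments."""
    seen = set()
    for doc in docs:
        text = scrub_pii(normalize(doc))
        if len(text) < 20:          # de-noising: drop trivial fragments
            continue
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        if digest in seen:          # exact de-duplication
            continue
        seen.add(digest)
        yield text

docs = ["ΠŸΡ€ΠΈΠ²Ρ–Ρ‚!  Пиши Π½Π° test@example.com", "ΠŸΡ€ΠΈΠ²Ρ–Ρ‚! Пиши Π½Π° test@example.com"]
print(list(clean_corpus(docs)))  # one cleaned, scrubbed document
```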

    The company offers:
    - Competitive salary.
    - Equity options in a fast-growing AI company.
    - Remote-friendly work culture.
    - Opportunity to shape a product at the intersection of AI and human productivity.
    - Work with a passionate, senior team building cutting-edge tech for real-world business use.

  • · 18 views · 1 application · 17d

    Senior Snowflake Data Engineer

    Full Remote · Ukraine · 5 years of experience · English - B2

    The project is for a world-famous science and technology company in the pharmaceutical industry, supporting initiatives in AWS, AI, and data engineering, with plans to launch over 20 additional initiatives in the future. Modernizing the data infrastructure through the transition to Snowflake is a priority, as it will enhance capabilities for implementing advanced AI solutions and unlock numerous opportunities for innovation and growth.

    We are seeking a highly skilled Snowflake Data Engineer to design, build, and optimize scalable data pipelines and cloud-based solutions across AWS, Azure, and GCP. The ideal candidate will have strong expertise in Snowflake, ETL tools such as dbt, Python, visualization tools such as Tableau, and modern CI/CD practices, with a deep understanding of data governance, security, and role-based access control (RBAC). Knowledge of data modeling methodologies (OLTP, OLAP, Data Vault 2.0), data quality frameworks, Streamlit application development, SAP integration, and infrastructure-as-code with Terraform is essential. Experience working with different file formats such as JSON, Parquet, CSV, and XML is highly valued.

    • Responsibilities:

      • In-depth knowledge of Snowflake's data warehousing capabilities.
      • Understanding of Snowflake's virtual warehouse architecture and how to optimize performance and cost.
      • Proficiency in using Snowflake's data sharing and integration features for seamless collaboration.
      • Develop and optimize complex SQL scripts, stored procedures, and data transformations.
      • Work closely with data analysts, architects, and business teams to understand requirements and deliver reliable data solutions.
      • Implement and maintain data models, dimensional modeling for data warehousing, data marts, and star/snowflake schemas to support reporting and analytics.
      • Integrate data from various sources including APIs, flat files, relational databases, and cloud services.
      • Ensure data quality, data governance, and compliance standards are met.
      • Monitor and troubleshoot performance issues, errors, and pipeline failures in Snowflake and associated tools.
      • Participate in code reviews, testing, and deployment of data solutions in development and production environments.

    • Mandatory Skills Description:

      • 5+ years of experience.
      • Strong proficiency in Snowflake (Snowpipe, RBAC, performance tuning).
      • Ability to write complex SQL queries, stored procedures, and user-defined functions.
      • Skills in optimizing SQL queries for performance and efficiency.
      • Experience with ETL/ELT tools and techniques, including Snowpipe, AWS Glue, Openflow, Fivetran, or similar tools for real-time and periodic data processing.
      • Proficiency in transforming data within Snowflake using SQL, with Python being a plus (see the sketch after this list).
      • Strong understanding of data security, compliance, and governance.
      • Experience with dbt for database object modeling and provisioning.
      • Experience with version control tools, particularly Azure DevOps.
      • Good documentation and coaching practice.
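
    A minimal sketch of SQL-plus-Python transformation in Snowflake, as referenced above: stage a batch, then MERGE into the target so reruns do not duplicate rows. Credentials and object names are placeholders, the orders table is assumed to already exist, and error handling is omitted.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<wh>", database="<db>", schema="<schema>",
)
cur = conn.cursor()

rows = [("o-1", "2024-01-01", 99.90), ("o-2", "2024-01-01", 15.00)]

# Load the batch into a transient staging table.
cur.execute("CREATE OR REPLACE TRANSIENT TABLE stg_orders "
            "(order_id STRING, order_date DATE, amount NUMBER(10,2))")
cur.executemany("INSERT INTO stg_orders VALUES (%s, %s, %s)", rows)

# MERGE keeps the target keyed by order_id, so re-running the same load
# updates existing rows instead of inserting duplicates.
cur.execute("""
    MERGE INTO orders t
    USING stg_orders s ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET t.order_date = s.order_date, t.amount = s.amount
    WHEN NOT MATCHED THEN INSERT (order_id, order_date, amount)
                          VALUES (s.order_id, s.order_date, s.amount)
""")
conn.commit()
cur.close()
conn.close()
```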

  • · 37 views · 0 applications · 17d

    Infrastructure Developer (C++), Vinnytsia HUB, Ukraine

    Hybrid Remote · Ukraine · Product · 5 years of experience · English - B2

    An engineering and technology company that creates cutting-edge robotic, autonomous, and mission-critical systems used in real-world conditions around the world. Teams work on complex hardware and software solutions, from system architecture and electronics to high-performance real-time software.
     

    The company's employees work in international engineering hubs, where local talent interacts with teams and partners from different countries, sees the direct impact of their work, and participates in global projects. This opens up opportunities for professional growth, development of expertise in robotics and autonomous systems, and participation in the creation of innovative solutions that shape the future of high-tech industries.

    We are looking for an Infrastructure Developer to take ownership of the core system infrastructure that ensures reliable, low-latency, real-time operation. You will work with Linux, embedded platforms, and video systems, collaborating with backend, frontend, and hardware teams to maintain system stability, performance, and scalability throughout the full software lifecycle. This is a unique opportunity to work on complex, real-world systems at the intersection of robotics, autonomy, and high-performance software engineering.


    KEY RESPONSIBILITIES
    • Develop, maintain, and optimize infrastructure and low-level components for embedded systems.
    • Develop and maintain video pipelines for real-time and low-latency systems.
    • Build, customize, and maintain Linux kernels and BSPs.
    • Develop and maintain Docker-based build and deployment environments for embedded systems.
    • Optimize system performance, latency, reliability, and resource usage.
    • Debug, profile, and maintain complex production and embedded systems.
    • Conduct code reviews and ensure high code quality and adherence to best practices.
    • Collaborate with cross-disciplinary teams to deliver robust system solutions.

    BASIC QUALIFICATIONS

    • At least 5 years of hands-on C++ development experience.
    • Strong experience working in Linux-based environments.
    • Experience with Docker and containerized deployments.
    • Experience with real-time or low-latency systems.
    • Strong debugging, profiling, and performance optimization skills.
    • Experience with Git and modern development tools.
    • Ability to work independently and take ownership of infrastructure components.

    PREFERRED SKILLS AND EXPERIENCE

    • Experience with video streaming protocols (e.g., RTP, RTSP, WebRTC).
    • Experience with GStreamer.
    • Familiarity with GPU / hardware-accelerated video pipelines.
    • Background in robotics or autonomous systems.
    • Experience with mission-critical or safety-critical environments.

    WHAT WE OFFER
    • Experience in a fast-growing, highly innovative global industry.
    • Excellent working conditions and an open-minded team.
    • Corporate events, regular internal activities, and other benefits.
    • Professional development opportunities and training.

  • · 70 views · 0 applications · 17d

    Sales Executive (Google Cloud+Google Workspace)

    Full Remote · Czechia · Product · 2 years of experience · English - B2

    Cloudfresh ⛅️ is a Global Google Cloud Premier Partner, Zendesk Premier Partner, Asana Solutions Partner, GitLab Select Partner, HubSpot Platinum Partner, Okta Activate Partner, and Microsoft Partner.

    Since 2017, we’ve been specializing in the implementation, migration, integration, audit, administration, support, and training for top-tier cloud solutions. Our products focus on cutting-edge cloud computing, advanced location and mapping, seamless collaboration from anywhere, unparalleled customer service, and innovative DevSecOps.

    We are seeking a dynamic Sales Executive to lead our sales efforts for GCP and GWS solutions across the EMEA and CEE regions. The ideal candidate will be a high-performing A-player with experience in SaaS sales, adept at navigating complex sales environments, and driven to exceed targets through strategic sales initiatives.

    Requirements:

    • Fluency in English and native Czech is essential;
    • At least 2 years of proven sales experience in SaaS/IaaS fields, with a documented history of achieving and exceeding sales targets, particularly in enterprise sales;
    • Sales experience on GCP and/or GWS specifically;
    • Sales or technical certifications related to Cloud Solutions are advantageous;
    • Experience in expanding new markets with outbound activities;
    • Excellent communication, negotiation, and strategic planning abilities;
    • Proficient in managing CRM systems and understanding their strategic importance in sales and customer relationship management.

    Responsibilities:

    • Develop and execute sales strategies for GCP and GWS solutions, targeting enterprise clients within the Cloud markets across EMEA and CEE;
    • Identify and penetrate new enterprise market segments, leveraging GCP and GWS to improve client outcomes;
    • Conduct high-level negotiations and presentations with major companies across Europe, focusing on the strategic benefits of adopting GCP and GWS solutions;
    • Work closely with marketing and business development teams to align sales strategies with broader company goals;
    • Continuously assess the competitive landscape and customer needs, adapting sales strategies to meet market demands and drive revenue growth.

    Work conditions:

    • Competitive Salary & Transparent Motivation: Receive a competitive base salary with commission on sales and performance-based bonuses, providing clear financial rewards for your success.
    • Flexible Work Format: Work remotely with flexible hours, allowing you to balance your professional and personal life efficiently.
    • Freedom to Innovate: Utilize multiple channels and approaches for sales, allowing you the freedom to find the best strategies for success.
    • Training with Leading Cloud Products: Access in-depth training on cutting-edge cloud solutions, enhancing your expertise and equipping you with the tools to succeed in an ever-evolving industry.
    • International Collaboration: Work alongside A-players and seasoned professionals in the cloud industry. Expand your expertise by engaging with international markets across the EMEA and CEE regions.
    • Vibrant Team Environment: Be part of an innovative, dynamic team that fosters both personal and professional growth, creating opportunities for you to advance in your career.
    • When applying to this position, you consent to the processing of your personal data by CLOUDFRESH for the purposes necessary to conduct the recruitment process, in accordance with Regulation (EU) 2016/679 of the European Parliament and of the Council of April 27, 2016 (GDPR).
    • Additionally, you agree that CLOUDFRESH may process your personal data for future recruitment processes.
  • · 42 views · 6 applications · 18d

    Principal Analytics Developer

    Full Remote · EU · 3 years of experience · English - B2

    The Principal Analytics Developer is a new role that will support the newly created Product Data Domain teams. The role requires strong skills in dimensional modelling and in conforming and integrating data from multiple sources, as well as experience in leading strong analytics engineering teams.
    Responsibilities:
     

    • Planning workloads and delegating tasks in an agile environment
    • Assisting with the daily operation of the organisation, including support and incidents
    • Able to provide feedback to team members, including constructive areas for development
    • Leading on the design, implementation and maintenance of dimensional data models that promote a self-service approach to data consumption. This includes ensuring that data quality within the data warehouse is maintained throughout the data lifecycle.
    • Define best practices in dimensional data modelling and database design and ensure standards are adhered to across the team.
    • Mentoring, coaching and supporting other team members in developing data modelling skills through knowledge transfer.
    • Automating data pipelines using proprietary technology & Airflow.
    • Using your expert knowledge of the company products and their features to inform the design and development of data products and upskilling the team through this knowledge.
    • Developing ways of working between product data domains and other data teams within product group.
    • The creation of processes for data product development, ensuring these processes are documented and advocating their use throughout the organisation.
    • Supporting analytics, data science and other colleagues outside the digital product area in managing projects and fielding queries.
    • Ability to build and maintain strong working relationships where you might, as a specialist, have to manage the expectations of more senior colleagues.
    • Working across mobile, web, television and voice platforms supporting Product Managers, Business Analysts and working closely with Software & Data Engineers.

       

    Requirements:
     

    • Extensive (5+ years) experience in managing teams building data warehouses / analytics from a diverse set of data sources (including event streams, various forms of batch processing)
    • At least 5 years’ experience in a Data Analyst, Data Modelling, Data Engineering or Analytics Engineering role, preferably in digital products, with an interest in data modelling and ETL processes
    • Proven experience in dimensionally modelling complex data at the conceptual, logical and physical layer.
    • Experience of designing STAR Schemas (see the sketch after this list)
    • Excellent SQL skills for extracting and manipulating data. Experience of using tools such as DBT, Looker and Airflow would be an advantage.
    • Good knowledge of analytical database systems (Redshift, Snowflake, BigQuery).
    • Comfortable working alongside cross-functional teams interacting with Product Managers, Engineers, Data Scientists, and Analysts.
    • Knowledge of digital products and their components, as well as what metrics affect their performance.
    • An understanding of how digital products use experimentation.
    • Some experience coding in R or Python.
    • A good understanding of on-demand audio and video media products, with a knowledge of key competitors.
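
    A minimal pandas sketch of the star-schema design skill referenced above: splitting a flat extract into a fact table plus a conformed dimension with a surrogate key. Column names and the toy data are assumptions, not the company's model.

```python
import pandas as pd

events = pd.DataFrame({
    "event_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
    "platform": ["web", "mobile", "web"],
    "country": ["UA", "PL", "UA"],
    "plays": [120, 45, 98],
})

# Dimension: distinct platform/country combinations with a surrogate key.
dim_device = (
    events[["platform", "country"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("device_key")
    .reset_index()
)

# Fact: measures plus a foreign key into the dimension.
fact_plays = events.merge(dim_device, on=["platform", "country"], how="left")[
    ["event_date", "device_key", "plays"]
]

print(dim_device)
print(fact_plays)
```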
       

    Will be a plus:

     

    • Ability to listen to others’ ideas and build on them 
    • Ability to clearly communicate to both technical and non-technical audiences.
    • Ability to collaborate effectively, working alongside other team members towards the team’s goals, and enabling others to succeed, where possible.
    • Ability to prioritise. A structured approach and the ability to bring others on the journey.
    • Strong attention to detail
       
  • · 119 views · 13 applications · 18d

    Senior Solana Engineer (Smart Wallet)

    Full Remote · Countries of Europe or Ukraine · 3 years of experience · English - B1

    Senior Solana Developer - CoFo Neobank

    About the Project

    We're building CoFo Neobank — the first AI-first smart wallet on Solana that brings the banking app experience (like Revolut, Robinhood) into the on-chain environment.

    Our goal is to abstract blockchain complexity. We're building an architecture where every user gets a Smart Account (a programmable account, not a simple EOA) that supports multi-factor authentication (2/3 Multisig), access recovery, spending limits, and native integration of complex financial products (Staking, Yield, Perps, RWA).

    Core Responsibilities

    • Smart Account Architecture Development: Design and write custom Rust programs (Anchor) for managing user accounts. Implement multisig logic (Device Key + 2FA Key), key rotation, and access recovery (Social Recovery).

    • DeFi Composability (Integrations): Write adapters and CPI (Cross-Program Invocations) calls to integrate external protocols directly into the smart account:

    • Swap: Aggregation through Jupiter
    • Yield & Lending: Integration with Kamino, MarginFi, Meteora
    • Perps: Integration with Drift Protocol

    • Security and Access Control: Implement a spending limits system, protocol whitelisting, and replay attack protection.

    • Portfolio Logic: Develop on-chain structures for position tracking (storing data about deposits, debts, PnL) for fast frontend/AI reading.

    • Gas Abstraction: Implement mechanisms for paying fees on behalf of users (Fee Bundling / Gas Tank).

    Requirements (Hard Skills)

    • Expert Rust & Anchor: Deep understanding of Solana Sealevel runtime, memory management, PDAs, and Compute Units (CU) limitations.

    • Account Abstraction Experience: Understanding of how to build smart contract wallets that differ from standard system accounts.

    • DeFi Integration Experience: You've already worked with SDKs or IDLs of major Solana protocols (Jupiter, Kamino, Drift, etc.). You understand what CPI is and how to safely call external code.

    • Cryptography: Understanding of signature operations (Ed25519), transaction verification, and building secure multisig schemes.

    • Security Mindset: Experience with audits, knowledge of attack vectors on Solana (re-entrancy, account substitution, ownership checks).

    Nice to Have

    • Experience with Privy (for authentication)
    • Understanding of cross-chain bridges (Wormhole/LayerZero) for implementing deposits from other networks
    • Experience with tokenized assets (RWA) and the Token-2022 standard

    Tech Stack

    • Solana (Rust, Anchor Framework)
    • Integrations: Jupiter, Kamino, Drift, MarginFi
    • Infrastructure: Helius, Privy

    We Offer

    • Work on a product that's changing UX in DeFi
    • Complex architectural challenges (not just another token fork, but sophisticated wallet infrastructure)
    • Competitive compensation in stablecoins/fiat + project options/tokens

  • · 64 views · 31 applications · 18d

    Python Data Engineer

    Full Remote · Worldwide · 5 years of experience · English - B2

    Core Responsibilities

    • Data Pipeline Management: Develop, optimize, and maintain scalable data pipelines to ensure high-quality data flow.

    • API Development: Build and maintain high-performance backend APIs using FastAPI (see the sketch after this list).

    • System Reliability: Proactively identify bottlenecks and improve system stability within existing infrastructures.

    • Collaboration: Work closely with cross-functional teams to integrate AWS services and workflow orchestration tools into the production environment.
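
    A minimal FastAPI sketch for the API point referenced above. Endpoint paths, the model, and the in-memory store are illustrative assumptions only; it can be served locally with uvicorn (e.g. `uvicorn app:app --reload` if saved as app.py).

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="data-service")

# In-memory stand-in for a real datastore, just to keep the example runnable.
_PIPELINE_RUNS: dict[str, str] = {"daily_orders": "success"}

class RunStatus(BaseModel):
    pipeline: str
    status: str

@app.get("/health")
def health() -> dict[str, str]:
    """Liveness probe used by the orchestrator / load balancer."""
    return {"status": "ok"}

@app.get("/runs/{pipeline}", response_model=RunStatus)
def get_run(pipeline: str) -> RunStatus:
    """Return the last recorded status for a pipeline."""
    if pipeline not in _PIPELINE_RUNS:
        raise HTTPException(status_code=404, detail="unknown pipeline")
    return RunStatus(pipeline=pipeline, status=_PIPELINE_RUNS[pipeline])
```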

     

    Required Qualifications 

    • Experience: 3+ years of professional Python development experience.

    • Databases: Strong proficiency in both SQL and NoSQL database design and management.

    • DevOps Tools: Hands-on experience with Docker, CI/CD pipelines, and Git version control.

    • Frameworks: Proven experience building applications with FastAPI.

    • Cloud & Orchestration: Practical experience with AWS services and familiarity with Airflow (or similar workflow orchestration tools).

    • Communication: Upper-Intermediate level of English (written and spoken) for effective team collaboration.

     

     Preferred Skills (Nice to Have) 

    • Experience within the Financial Domain.

    • Hands-on experience with Apache Spark and complex ETL pipelines.

    • Knowledge of container orchestration using Kubernetes.

    • Exposure to or interest in Large Language Models (LLMs) and AI integration.

  • · 47 views · 11 applications · 18d

    Senior Data Engineer

    Worldwide · Product · 4 years of experience · English - C1

    How about building a high-load data architecture that handles millions of transactions daily?
    We're looking for a Senior Data Engineer with a path to grow into a Data Lead,
    to design scalable pipelines from scratch.
    An international iGaming company with a data-first mindset.
    Remote, top salary.

     

    Responsibilities

    – Build and run scalable pipelines (batch + streaming) that power gameplay, wallet, and promo analytics.

    – Model data for decisions (star schemas, marts) that Product, BI, and Finance use daily.

    – Make things reliable: tests, lineage, alerts, SLAs. Fewer surprises, faster fixes.

    – Optimize ETL/ELT for speed and cost (partitioning, clustering, late arrivals, idempotency).

    – Keep promo data clean and compliant (PII, GDPR, access controls).

    – Partner with POs and analysts on bets/wins/turnover KPIs, experiment readouts, and ROI.

    – Evaluate tools, migrate or deprecate with clear trade-offs and docs.

    – Handle prod issues without drama, then prevent the next one.

     

     

    Requirements

    – 4+ years building production data systems. You've shipped, broken, and fixed pipelines at scale.

    – SQL that sings and Python you're proud of.

    – Real experience with OLAP and BI (Power BI / Tableau / Redash — impact > logo).

    – ETL/ELT orchestration (Airflow/Prefect or similar) and CI/CD for data.

    – Strong grasp of warehouses & lakes: incremental loads, SCDs, partitioning (see the sketch after this list).

    – Data quality mindset: contracts, tests, lineage, monitoring.

    – Product sense: you care about player/client impact, not just rows processed.
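
    A minimal sketch of the SCD handling mentioned above: a Type 2 slowly changing dimension update that closes the current row and opens a new one when a tracked attribute changes. Table, keys, and dates are illustrative assumptions; SQLite stands in for the warehouse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_player (
        player_id   TEXT,
        country     TEXT,
        valid_from  TEXT,
        valid_to    TEXT,     -- NULL means "current" version
        is_current  INTEGER
    )
""")
conn.execute("INSERT INTO dim_player VALUES ('p1', 'UA', '2024-01-01', NULL, 1)")

def apply_scd2(conn, player_id: str, country: str, as_of: str) -> None:
    """Insert a new current version if the tracked attribute changed."""
    row = conn.execute(
        "SELECT country FROM dim_player WHERE player_id = ? AND is_current = 1",
        (player_id,),
    ).fetchone()
    if row and row[0] == country:
        return  # nothing changed, keep the current version
    with conn:
        conn.execute(
            "UPDATE dim_player SET valid_to = ?, is_current = 0 "
            "WHERE player_id = ? AND is_current = 1",
            (as_of, player_id),
        )
        conn.execute(
            "INSERT INTO dim_player VALUES (?, ?, ?, NULL, 1)",
            (player_id, country, as_of),
        )

apply_scd2(conn, "p1", "MT", "2024-06-01")   # country change opens a new version
print(conn.execute("SELECT * FROM dim_player ORDER BY valid_from").fetchall())
```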

    ✨ Nice to Have (tell us if you've got it)

    – Kafka (or similar streaming), ClickHouse (we like it), dbt (modular ELT).

    – AWS data stack (S3, IAM, MSK/Glue/Lambda/Redshift) or equivalents.

    – Containers & orchestration (Docker/K8s), IaC (Terraform).

    – Familiarity with AI/ML data workflows (feature stores, reproducibility).

    – iGaming context: provider metrics (bets / wins / turnover), regulated markets, promo events.

     

     

    We offer

    – Fully remote (EU-friendly time zones) or Bratislava/Malta/Cyprus if you like offices.

    – Unlimited vacation + paid sick leave.

    – Quarterly performance bonuses.

    – No micromanagement. Real ownership, real impact.

    – Budget for conferences and growth.

    – Product-led culture with sharp people who care.

     

     

    🧰 Our Day-to-Day Stack (representative)
    Python, SQL, Airflow/Prefect, Kafka, ClickHouse/OLAP DBs, AWS (S3 + friends), dbt, Redash/Tableau, Docker/K8s, GitHub Actions.

  • · 53 views · 9 applications · 19d

    Data Engineer

    Ukraine · 4 years of experience · English - B2

    Role Summary

    A key role in our data engineering team, working closely with the rest of the technology team to provide a first class service to both internal and external users. In this role you will be responsible for building solutions that allow us to use our data in a robust, flexible and efficient way while also maintaining the integrity of our data, much of which is of a sensitive nature.

    Role and Responsibilities

    Manages resources (internal and external) in the delivery of the product roadmap for our data asset. Key responsibilities include, but are not limited to:

    • Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes
    • Work closely with the development and product teams (both internal and external) to ensure that products meet the required specification prior to release.
    • Working closely with our technology colleagues throughout the delivery lifecycle to ensure that all data related processes are efficient and accurate
    • Providing expert assistance with design and implementation of all new products. All of our new technology stack has data at its heart.
    • Ensuring data is available for business and management reporting purposes.
    • Assist with the development and refinement of the agile process.
    • Be an advocate for best practices and continued learning
    • Strong technical understanding of a data experience
    • Ensure the ongoing maintenance of your own CPD (continuing professional development)
    • Carry out all duties in a manner that always reflects Financial Wellness Group’s values and principles

    Essential Criteria

    • Extensive knowledge of using Python to build ETL and ELT products in AWS using Lambda and Batch processing.
    • A keen understanding of developing and tuning Microsoft SQL Server.
    • Exposure to development in Postgres.
    • A good understanding of CI/CD for data and the challenges inherent.
    • Ability to use Source Control Systems such as Git/Azure DevOps
    • An understanding of dimensional modelling and data warehousing methodologies and an interest in Data Lakehousing technologies.
    • An understanding of Infrastructure as a Service provision (for example Terraform)
    • The ability to rapidly adapt to new technologies and technical challenges.
    • The flexibility to quickly react to changing business priorities.
    • A team player, with a natural curiosity and a desire to learn new skills
    • An interest in finding the 'right way'
    • Passionate about data delivery and delivering change

    What To Expect From Digicode?

    🌎 Work from Anywhere: From an office, home, or travel continuously if that’s your thing. Everything we do is online. As long as you have the Internet and your travel nomad lifestyle doesn’t affect the work process (you meet all deadlines and are present at all the meetings), you’re all set.

    πŸ’Ό Professional Development: We offer great career development opportunities in a growing company, international work environment, paid language classes, conference and education budget, & internal 42 Community training.

    πŸ§˜β€β™‚οΈ Work-life Balance: We provide employees with 18+ paid vacation days and paid sick leave, flexible schedule, medical insurance for employees and their children, monthly budget for things like a gym or pool membership.

    πŸ™Œ Culture of Openness: We’re committed to fostering a community where everyone feels welcome, seen, and heard, with minimal bureaucracy, and a flat organization structure.

    There are also corporate gifts, corporate celebrations, free food & snacks, and play & relax rooms for those who visit the office.

    Did we catch your attention? We’d love to hear from you.

  • · 30 views · 1 application · 19d

    Senior/Lead Data Engineer

    Full Remote · Ukraine · 5 years of experience · English - B2

    Description

    The GlobalLogic technology team is focused on next-generation health capabilities that align with the client’s mission and vision to deliver Insight-Driven Care. This role operates within the Health Applications & Interoperability subgroup of our broader team, with a focus on patient engagement, care coordination, AI, healthcare analytics, and interoperability. These advanced technologies enhance our product portfolio with new services while improving clinical and patient experiences.

     

    Requirements

    An AWS Data Engineer designs, develops, and maintains scalable data solutions using AWS cloud services.
    Key Responsibilities:
        • Design, build, and manage ETL (Extract, Transform, Load) pipelines using AWS services (e.g., Glue, Lambda, EMR, Redshift, S3); a minimal sketch follows this list.
        • Develop and maintain data architecture (data lakes, warehouses, databases) on AWS.
        • Implement data quality and governance solutions.
        • Automate data workflows and monitor pipeline health.
        • Ensure data security and compliance with company policies.
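
    A minimal sketch of the kind of Lambda-based ETL step implied above: read a raw CSV object from S3 on an event trigger, reshape it, and write JSON to a curated bucket. Bucket names, key layout, and the transform are assumptions, not the client's pipeline.

```python
import csv
import io
import json

import boto3

s3 = boto3.client("s3")

RAW_BUCKET = "raw-bucket-placeholder"        # hypothetical bucket names
CURATED_BUCKET = "curated-bucket-placeholder"

def handler(event, context):
    """Triggered by an S3 put event; processes each new object in the batch."""
    for record in event.get("Records", []):
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=RAW_BUCKET, Key=key)["Body"].read()

        # Transform: CSV rows -> list of dicts (one JSON document per file).
        rows = list(csv.DictReader(io.StringIO(body.decode("utf-8"))))

        out_key = key.rsplit(".", 1)[0] + ".json"
        s3.put_object(
            Bucket=CURATED_BUCKET,
            Key=out_key,
            Body=json.dumps(rows).encode("utf-8"),
            ContentType="application/json",
        )
    return {"processed": len(event.get("Records", []))}
```
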
    Required Skills:
        • Proficiency with AWS cloud services, especially data-related offerings (S3, Glue, Redshift, Athena, EMR, Kinesis, Lambda).
        • Strong SQL and Python skills.
        • Experience with ETL tools and frameworks.
        • Familiarity with data modelling and warehousing concepts.
        • Knowledge of data security, access management, and best practices in AWS.
    Preferred Qualifications:
        • AWS certifications (e.g., AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect).
        • Background in software engineering or data science.

    • Hands-on experience with Oracle Database and log-based Change Data Capture (CDC) replication using AWS Database Migration Service (DMS) for near real-time data ingestion.

    Job responsibilities

    • Develops, documents, and configures systems specifications that conform to defined architecture standards, address business requirements, and processes in the cloud development & engineering.
    • Involved in planning of system and development deployment, as well as responsible for meeting compliance and security standards.
    • API development using AWS services in a scalable, microservices-based architecture
    • Actively identifies system functionality or performance deficiencies, executes changes to existing systems, and tests functionality of the system to correct deficiencies and maintain more effective data handling, data integrity, conversion, input/output requirements, and storage.
    • May document testing and maintenance of system updates, modifications, and configurations.
    • May act as a liaison with key technology vendor technologists or other business functions.
    • Function Specific: Strategically design technology solutions that meet the needs and goals of the company and its customers/users.
    • Leverages platform process expertise to assess if existing standard platform functionality will solve a business problem or if a customisation solution would be required.
    • Test the quality of a product and its ability to perform a task or solve a problem.
    • Perform basic maintenance and performance optimisation procedures in each of the primary operating systems.
    • Ability to document detailed technical system specifications based on business system requirements
    • Ensures system implementation compliance with global & local regulatory and security standards (e.g., HIPAA, SOC 2, ISO 27001)
  • · 45 views · 1 application · 19d

    Middle Data Engineer IRC285068

    Full Remote · Ukraine · 3 years of experience · English - None

    Description

    The GlobalLogic technology team is focused on next-generation health capabilities that align with the client’s mission and vision to deliver Insight-Driven Care. This role operates within the Health Applications & Interoperability subgroup of our broader team, with a focus on patient engagement, care coordination, AI, healthcare analytics, and interoperability. These advanced technologies enhance our product portfolio with new services while improving clinical and patient experiences.

     

    Requirements

    MUST HAVE

    AWS Platform: Working experience with AWS data technologies, including S3
    Programming Languages: Strong programming skills in Python
    Data Formats: Experience with JSON, XML, and other relevant data formats (see the sketch after this list)
    HealthCare Interoperability Tools: Previous experience with integration engines such as InterSystems, Lyniate, Redox, Mirth Connect, etc.

    Hands-on experience with Oracle Database and log-based Change Data Capture (CDC) replication using AWS Database Migration Service (DMS) for near real-time data ingestion.

    CI/CD Tools: Experience setting up and managing CI/CD pipelines using GitLab CI, Jenkins, or similar tools
    Scripting and automation: Experience in scripting languages such as Python, PowerShell, etc.
    Monitoring and Logging: Familiarity with monitoring & logging tools like CloudWatch, ELK, Dynatrace, Prometheus, etc.
    Source Code Management: Expertise with git commands and associated VCS (GitLab, GitHub, Gitea, or similar)
    Documentation: Experience with markdown and in particular Antora for creating technical documentation
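
    A minimal sketch tying together the data-format and interoperability requirements above: flattening a heavily simplified FHIR R4 Patient JSON payload into an analytics-friendly record. A production integration would use a proper FHIR library and handle far more fields and edge cases.

```python
import json

raw = json.dumps({
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "birthDate": "1990-01-01",
})

def flatten_patient(payload: str) -> dict:
    """Pick a few common Patient fields into a flat record."""
    resource = json.loads(payload)
    if resource.get("resourceType") != "Patient":
        raise ValueError("expected a Patient resource")
    name = (resource.get("name") or [{}])[0]
    return {
        "patient_id": resource.get("id"),
        "family_name": name.get("family"),
        "given_name": " ".join(name.get("given", [])),
        "birth_date": resource.get("birthDate"),
    }

print(flatten_patient(raw))
```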

     

    NICE TO HAVE
    Strongly Preferred:
    Previous Healthcare or Medical Device experience
    Other data technologies such as Snowflake, Trino/Starburst
    Experience working with Healthcare Data, including HL7v2, FHIR, and DICOM
    FHIR and/or HL7 Certifications
    Building software classified as Software as a Medical Device (SaMD)
    Understanding of EHR technologies such as EPIC, Cerner, etc.
    Experience implementing enterprise-grade cyber security & privacy by design in software products
    Experience working in Digital Health software
    Experience developing global applications
    Strong understanding of SDLC – Waterfall & Agile methodologies
    Software estimation
    Experience leading software development teams onshore and offshore

     

    Job responsibilities

    – Develops, documents, and configures systems specifications that conform to defined architecture standards, address business requirements, and processes in the cloud development & engineering.

    – Involved in planning of system and development deployment, as well as responsible for meeting compliance and security standards.

    – API development using AWS services in a scalable, microservices-based architecture

    – Actively identifies system functionality or performance deficiencies, executes changes to existing systems, and tests functionality of the system to correct deficiencies and maintain more effective data handling, data integrity, conversion, input/output requirements, and storage.

    – May document testing and maintenance of system updates, modifications, and configurations.

    – May act as a liaison with key technology vendor technologists or other business functions.

    – Function Specific: Strategically design technology solutions that meet the needs and goals of the company and its customers/users.

    – Leverages platform process expertise to assess if existing standard platform functionality will solve a business problem or if a customization solution would be required.

    – Test the quality of a product and its ability to perform a task or solve a problem.

    – Perform basic maintenance and performance optimization procedures in each of the primary operating systems.

    – Ability to document detailed technical system specifications based on business system requirements

    – Ensures system implementation compliance with global & local regulatory and security standards (e.g., HIPAA, SOC 2, ISO 27001)

     

    What we offer

    Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. 

    Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

    Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

    Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

    High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

  • · 24 views · 1 application · 19d

    Senior Data Engineer IRC284644

    Full Remote · Ukraine · 4 years of experience · English - None

    Description

    Our client is a luxury skincare and beauty brand. The brand is based in San Francisco and sells luxury skincare products worldwide.

    Client’s main IT β€œproduct” is its e-commerce website, which functions as a digital platform to sell products, educate customers, and personalize experiences.

    • Runs on Salesforce Commerce Cloud (formerly Demandware) — an enterprise e-commerce platform that supports online shopping, order processing, customer accounts, and product catalogs.
    • Hosted on cloud infrastructure (e.g., AWS, Cloudflare) for reliable performance and security.
    • Uses HTTPS/SSL encryption to secure data transfers.
    • Integrated marketing and analytics technologies such as Klaviyo (email & SMS automation), Google Tag Manager, and personalization tools to track behavior, optimize campaigns, and increase conversions

    It’s both a shopping platform and a digital touchpoint for customers worldwide.

     

    Requirements

    • 4+ years of experience as a Data Engineer, Analytics Engineer, or in a similar data-focused role.
    • Strong SQL skills for complex data transformations and analytics-ready datasets.
    • Hands-on experience with Python for data pipelines, automation, and data processing.
    • Experience working with cloud-based data platforms (AWS preferred).
    • Solid understanding of data warehousing concepts (fact/dimension modeling, star schemas).
    • Experience building and maintaining ETL/ELT pipelines from multiple data sources.
    • Familiarity with data quality, monitoring, and validation practices.
    • Experience handling customer, transactional, and behavioral data in a digital or e-commerce environment.
    • Ability to work with cross-functional stakeholders (Marketing, Product, Analytics, Engineering).

    Nice to have:

    • Experience with Snowflake, Redshift, or BigQuery.
    • Experience with dbt or similar data transformation frameworks.
    • Familiarity with Airflow or other orchestration tools.
    • Experience with marketing and CRM data (e.g. Klaviyo, GA4, attribution tools).
    • Exposure to A/B testing and experimentation data.
    • Understanding of privacy and compliance (GDPR, CCPA).
    • Experience in consumer, retail, or luxury brands.
    • Knowledge of event tracking and analytics instrumentation.
    • Ability to travel + visa to the USA

     

    Job responsibilities

    • Design, build, and maintain scalable data pipelines ingesting data from multiple sources:
      e-commerce platform (e.g. Salesforce Commerce Cloud), CRM/marketing tools (Klaviyo), web analytics, fulfillment and logistics systems.
    • Ensure reliable, near-real-time data ingestion for customer behavior, orders, inventory, and marketing performance.
    • Develop and optimize ETL/ELT workflows using cloud-native tools.
    • Model and maintain customer, order, product, and session-level datasets to support analytics and personalization use cases.
    • Enable a 360° customer view by unifying data from website interactions, email/SMS campaigns, purchases, and returns.
    • Support data needs for personalization tools (e.g. product recommendation quizzes, ritual finders).
    • Build datasets that power marketing attribution, funnel analysis, cohort analysis, and LTV calculations (see the sketch after this list).
    • Enable data access for growth, marketing, and CRM teams to optimize campaign targeting and personalization
    • Ensure accurate tracking and validation of events, conversions, and user journeys across channels.
    • Work closely with Product, E-commerce, Marketing, Operations, and Engineering teams to translate business needs into data solutions.
    • Support experimentation initiatives (A/B testing, new digital experiences, virtual stores).
    • Act as a data partner in decision-making for growth, CX, and operational efficiency.
    • Build and manage data solutions on cloud infrastructure (e.g. AWS).
    • Optimize storage and compute costs while maintaining performance and scalability.
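
    A minimal pandas sketch of the cohort / LTV datasets referenced above: orders bucketed by first-purchase month with cumulative revenue per cohort. Column names and the toy data are assumptions, not the client's schema.

```python
import pandas as pd

orders = pd.DataFrame({
    "customer_id": ["c1", "c1", "c2", "c3", "c2"],
    "order_date": pd.to_datetime(
        ["2024-01-05", "2024-02-10", "2024-01-20", "2024-02-02", "2024-03-15"]
    ),
    "revenue": [120.0, 80.0, 60.0, 200.0, 90.0],
})

orders["order_month"] = orders["order_date"].dt.to_period("M")
# Cohort = month of the customer's first order.
orders["cohort"] = orders.groupby("customer_id")["order_month"].transform("min")
# Whole months elapsed since the cohort month (0 for the first month).
orders["month_index"] = (
    (orders["order_month"].dt.year - orders["cohort"].dt.year) * 12
    + (orders["order_month"].dt.month - orders["cohort"].dt.month)
)

cohort_ltv = (
    orders.pivot_table(index="cohort", columns="month_index",
                       values="revenue", aggfunc="sum", fill_value=0.0)
    .cumsum(axis=1)  # cumulative revenue per cohort: a simple LTV curve
)
print(cohort_ltv)
```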

     

    What we offer

    Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. 

    Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

    Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

    Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

    High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.
