Jobs: Data Engineer (147)
  • 43 views · 5 applications · 11d

    Senior Data Engineer (Scala) – Tieto Tech Consulting (m/f/d)

    Full Remote · Ukraine · 5 years of experience · English - B2

    Tieto Tech Consulting is inviting a talented Data Engineer to join our growing team and support our customer BICS, a global telecommunication enabler with a physical network spanning the globe. In this role, you will work on the BICS Voice and CC Value Streams, delivering qualified customer and network support by designing, building, and optimizing large-scale data pipelines within the telecom domain. The position requires strong expertise in Scala Spark, Databricks, and AWS cloud services, and focuses on developing high-performance data platforms that enable network analytics, customer insights, real-time monitoring, and regulatory reporting.

     

    Key Responsibilities

    • Design, develop, and maintain scalable batch data pipelines using Scala, Databricks Spark, Databricks SQL, and Airflow (see the orchestration sketch after this list)
    • Implement optimized ETL/ELT processes to ingest, cleanse, transform, and enrich large volumes of telecom network, usage, and operational data
    • Ensure pipeline reliability, observability, and performance tuning of Spark workloads
    • Build and manage data architectures leveraging AWS services such as (but not limited to) S3, Lambda, IAM, and CloudWatch
    • Implement infrastructure-as-code using Terraform
    • Ensure security best practices and compliance with telecom regulatory requirements (GDPR, data sovereignty, retention)
    • Collaborate with cross-functional teams (Architecture, DevOps, Network Engineering, Business Intelligence)
    • Document system designs, data flows, and best practices
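
    For illustration, a minimal sketch of how one of the Databricks Spark jobs above might be triggered from Airflow, assuming the apache-airflow-providers-databricks package. The DAG name, job ID, connection ID, and parameters are hypothetical, not taken from the actual project.

```python
# Illustrative Airflow DAG that triggers an existing Databricks Workflows job
# wrapping the Scala Spark batch code. All identifiers here are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="voice_usage_daily_batch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_enrichment = DatabricksRunNowOperator(
        task_id="run_usage_enrichment",
        databricks_conn_id="databricks_default",
        job_id=12345,  # Databricks job that runs the Scala Spark pipeline
        notebook_params={"run_date": "{{ ds }}"},
    )
```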

     

    Requirements

    • 4+ years of experience as a Data Engineer or Big Data Developer
    • Strong proficiency in Scala and functional programming concepts
    • Advanced experience with Apache Spark (batch processing using the DataFrame API and low-level Spark APIs, performance tuning, cluster optimization)
    • Experience with optimized SQL-based data transformations for analytics and machine learning workloads
    • Hands-on experience with Databricks including notebooks, jobs, Delta Lake, Unity Catalog, and MLflow (nice to have)
    • Solid understanding of CI/CD practices with Git, Jenkins/GitLab CI
    • Strong AWS skills: S3, Lambda, IAM, CloudWatch, and related services
    • Knowledge of distributed systems, data governance, and security best practices
    • Experience with Airflow integration with AWS services for end-to-end orchestration across cloud data pipelines
    • Experience with IaC tools: Terraform or CloudFormation
    • Experience with Python is a Plus
    • Experience with DBT is a Plus
    • Experience with Snowflake is a Plus

     

    Soft Skills

    • Strong analytical and problem-solving skills
    • High degree of ownership and a mindset for continuous improvement
    • Quality-oriented, pragmatic, and solution-oriented
    • Excellent communication and teamwork abilities
    • Ability to translate business requirements into technical solutions
    • Experience in telecom sector is a plus
    • Experience with an agile way of working is a plus
    • English proficiency
  • 27 views · 3 applications · 11d

    Senior Snowflake Data Engineer

    Full Remote · Ukraine · 5 years of experience · English - B2

    Project description

    The project is for one of the world's most famous science and technology companies in the pharmaceutical industry, supporting initiatives in AWS, AI, and data engineering, with plans to launch over 20 additional initiatives in the future. Modernizing the data infrastructure through the transition to Snowflake is a priority, as it will enhance capabilities for implementing advanced AI solutions and unlock numerous opportunities for innovation and growth.

    We are seeking a highly skilled Snowflake Data Engineer to design, build, and optimize scalable data pipelines and cloud-based solutions across AWS, Azure, and GCP. The ideal candidate will have strong expertise in Snowflake, ETL tools like DBT, Python, visualization tools like Tableau, and modern CI/CD practices, with a deep understanding of data governance, security, and role-based access control (RBAC). Knowledge of data modeling methodologies (OLTP, OLAP, Data Vault 2.0), data quality frameworks, Streamlit application development, SAP integration, and infrastructure-as-code with Terraform is essential. Experience working with different file formats such as JSON, Parquet, CSV, and XML is highly valued.

    Responsibilities

    Design and develop data pipelines using Snowflake and Snowpipe for real-time and batch ingestion.

    Implement CI/CD pipelines in Azure DevOps for seamless deployment of data solutions.

    Automate DBT jobs to streamline transformations and ensure reliable data workflows.

    Apply data modeling techniques including OLTP, OLAP, and Data Vault 2.0 methodologies to design scalable architectures.

    Document data models, processes, and workflows clearly for future reference and knowledge sharing.

    Build data tests, unit tests, and mock data frameworks to validate and maintain reliability of data solutions.

    Develop Streamlit applications integrated with Snowflake to deliver interactive dashboards and self-service analytics.
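
    For illustration, a minimal sketch of the Streamlit-plus-Snowflake pattern described above, assuming the streamlit and snowflake-connector-python packages. The account, credentials, table, and column names are hypothetical.

```python
# Minimal Streamlit page backed by a Snowflake query (illustrative only).
import snowflake.connector
import streamlit as st

@st.cache_data(ttl=600)
def load_daily_volumes() -> list:
    # Credentials would normally come from a secrets manager, not literals.
    conn = snowflake.connector.connect(
        account="my_account", user="svc_streamlit", password="***",
        warehouse="ANALYTICS_WH", database="ANALYTICS", schema="REPORTING",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT load_date, row_count FROM daily_load_volumes ORDER BY load_date")
        return cur.fetchall()
    finally:
        conn.close()

st.title("Daily load volumes")
rows = load_daily_volumes()
st.bar_chart({"rows_loaded": [r[1] for r in rows]})
```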

    Integrate SAP data sources into Snowflake pipelines for enterprise reporting and analytics.

    Leverage SQL expertise for complex queries, transformations, and performance optimization.

    Integrate cloud services across AWS, Azure, and GCP to support multi-cloud data strategies.

    Develop Python scripts for ETL/ELT processes, automation, and data quality checks.

    Implement infrastructure-as-code solutions using Terraform for scalable and automated cloud deployments.

    Manage RBAC and enforce data governance policies to ensure compliance and secure data access.

    Collaborate with cross-functional teams, including business analysts and business stakeholders, to deliver reliable data solutions.

    Skills

    Must have

    Strong proficiency in Snowflake (Snowpipe, RBAC, performance tuning).

    Hands-on experience with Python, SQL, Jinja, and JavaScript for data engineering tasks.

    CI/CD expertise using Azure DevOps (build, release, version control).

    Experience automating DBT jobs for data transformations.

    Experience building Streamlit applications with Snowflake integration.

    Cloud services knowledge across AWS (S3, Lambda, Glue), Azure (Data Factory, Synapse), and GCP (BigQuery, Pub/Sub).

    Nice to have

    Cloud certifications are a plus

    Languages

    English: B2 Upper Intermediate

  • 61 views · 14 applications · 11d

    Middle Cloud/Data Engineer

    Part-time · Full Remote · Countries of Europe or Ukraine · 4 years of experience · English - B2

    Metamindz is a fast-growing UK-based IT software company. We support global clients by providing fractional CTOs-as-a-service, building digital products, hiring exceptional technical talent, and conducting in-depth tech due diligence.

     

    We're currently looking for a Cloud & Data Engineer (GCP / IoT) to join one of our startup clients in a part-time engagement. This is an opportunity for a hands-on engineer who can take ownership of cloud data platforms and backend systems, working with high-volume IoT data and real-time analytics in production environments.

     

    Responsibilities:

     

    • Own and operate the cloud-based backend and data platform supporting large-scale IoT deployments
    • Architect, build, and maintain high-volume data ingestion pipelines using GCP services (BigQuery, Dataflow, Pub/Sub); see the ingestion sketch after this list
    • Design and manage streaming and batch data workflows for real-time and historical analytics
    • Define data storage, querying, retention, and archiving strategies across warehouses and data lakes
    • Ensure backend services, APIs, and data pipelines are secure, scalable, observable, and fault-tolerant
    • Set up monitoring, logging, alerting, and recovery strategies for event-driven workloads
    • Collaborate closely with the CTO, embedded engineers, and product teams to align device capabilities with cloud and data architecture
    • Contribute to data platform evolution, including governance, access policies, and metadata management
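
    For illustration, a minimal sketch of the Pub/Sub-to-BigQuery ingestion path mentioned above, assuming the google-cloud-pubsub and google-cloud-bigquery packages. Project, subscription, and table names are hypothetical; at real IoT volumes, Dataflow or batch loads would usually replace per-message streaming inserts.

```python
# Illustrative Pub/Sub subscriber that streams IoT telemetry rows into BigQuery.
import json

from google.cloud import bigquery, pubsub_v1

PROJECT = "my-iot-project"            # hypothetical
SUBSCRIPTION = "device-telemetry-sub"
TABLE = "my-iot-project.telemetry.raw_events"

bq = bigquery.Client(project=PROJECT)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    event = json.loads(message.data)
    errors = bq.insert_rows_json(TABLE, [event])  # streaming insert
    if errors:
        message.nack()   # let Pub/Sub redeliver on failure
    else:
        message.ack()

streaming_pull = subscriber.subscribe(sub_path, callback=handle)
streaming_pull.result(timeout=60)  # block briefly for this demo
```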

     

    Requirements:

     

    • 3–5 years of commercial engineering experience in cloud, data, or backend roles
    • Strong hands-on experience with GCP and its data ecosystem (BigQuery, Dataflow, Pub/Sub)
    • Solid experience with relational databases (Postgres, MySQL), including schema design, migrations, indexing, and scaling strategies
    • Proven experience building and maintaining data pipelines, particularly for IoT or time-series data
    • Hands-on experience with Python (Node.js is a plus)
    • Experience designing and consuming APIs in distributed or microservices-based systems
    • Familiarity with CI/CD pipelines, environment management, and Infrastructure as Code (Terraform)
    • Good understanding of cloud security, IAM, and best practices for production systems
    • Ability to work independently in a startup environment and make pragmatic technical decisions

     

    Nice to Have:

     

    • Google Professional Data Engineer certification
    • Experience with orchestration tools such as Airflow / Cloud Composer
    • Exposure to applied ML or AI use cases (e.g. anomaly detection, forecasting on IoT data)
    • Experience using managed ML services like GCP Vertex AI

     

    What We Offer:

     

    • Opportunity to work on a real-world, IoT-powered product with visible impact
    • High ownership and influence over technical architecture and data strategy
    • Collaborative startup environment with direct access to decision-makers
    • Modern cloud stack and meaningful engineering challenges around scale and reliability
    • Competitive compensation aligned with experience and responsibilities

     

    How to Apply:

     

    Please send a short blurb about yourself, and tell us your favorite ice cream flavor (mine is cherry 🍒)

  • 71 views · 10 applications · 11d

    Senior Data Engineer (with relocation to Cyprus)

    Full Remote · Worldwide · 6 years of experience · English - B2

    About the project:

    We are looking for a Senior Data Engineer to take ownership of and evolve both our Data Warehouse and core databases within a microservices-based, multi-tenant application. This role encompasses more than just analytics and includes broad responsibility for database architecture, ensuring data consistency, and managing technical debt across production systems. It is a hands-on opportunity focused on long-term stewardship of the data layer, working closely with backend engineers and product teams to deliver scalable, reliable, and maintainable data infrastructure.

    Please note that this position requires Cyprus-based candidates or readiness to relocate to Cyprus after the probation period. We provide full support throughout the relocation process, along with a relocation bonus 🎁

     

    A new team member will be in charge of:

    • Designing, developing, and maintaining scalable data warehouse solutions.
    • Building and optimizing ETL/ELT pipelines for efficient data integration (see the load sketch after this list).
    • Designing and implementing data models to support analytical and reporting needs.
    • Ensuring data integrity, quality, and security across all pipelines.
    • Optimizing data performance and scalability using best practices.
    • Working with big data technologies such as Redshift.
    • Collaborating with cross-functional teams to understand business requirements and translate them into data solutions.
    • Implementing CI/CD pipelines for data workflows.
    • Monitoring, troubleshooting, and improving data processes and system performance.
    • Staying updated with industry trends and emerging technologies in data engineering.
    • Taking ownership of core production databases across a microservices, multi-tenant application.

       

    Does this relate to you?

    • 6+ years of experience in Data Engineering or a related field.
    • Strong expertise in SQL and data modeling concepts.
    • Hands-on experience with Airflow.
    • Experience working with Redshift.
    • Proficiency in Python for data processing.
    • Strong understanding of data governance, security, and compliance.
    • Experience in implementing CI/CD pipelines for data workflows.
    • Ability to work independently and collaboratively in an agile environment.
    • Excellent problem-solving and analytical skills.
    • Experience managing production databases in microservices environments.
    • Experience designing and supporting multi-tenant data models.
    • English proficiency at an upper-intermediate level.

     

    Ready to try your hand? Don't hesitate to send us your CV!

  • 28 views · 3 applications · 11d

    Senior Data Engineer (AWS, E-commerce)

    Full Remote · Ukraine · 3 years of experience · English - C1

    About the Project: We are looking for a Senior Data Engineer to take over and further develop our e-commerce analytics solution. The project focuses on building a "Product as a Service" (SaaS) platform that helps customers improve their business through data-driven insights. You will be responsible for the full lifecycle of data: from scraping and parsing to making it available for AI model training.

  • 100 views · 18 applications · 11d

    Data Engineering Lead

    Full Remote · Worldwide · Product · 5 years of experience · English - None

    About Traffic Label
    Traffic Label is a performance marketing and technology company with nearly two decades of experience driving engagement and conversion across the iGaming and digital entertainment sectors.
    We're now building a Customer Data Platform (CDP) on Snowflake and AWS - unifying player data across multiple brands to power automation, insights, and personalization.

    The Role
    We're looking for a Data Engineering Lead to own the technical delivery and development of this platform. You'll architect scalable pipelines, lead a small team, and ensure data reliability, accuracy, and performance.
    Team size: 3–4 engineers/analysts

    Key Responsibilities

    • Design and implement scalable data pipelines processing millions of events daily
    • Own Snowflake data warehouse architecture, optimization, and cost control
    • Lead the engineering team through delivery and performance improvements
    • Ensure >95% data accuracy and 99.9% pipeline uptime
    • Collaborate with marketing, analytics, and compliance teams to align data with business goals

    Requirements

    • 5+ years in data engineering, 2+ in leadership roles
    • Expert in Snowflake, SQL, and Python
    • Proficient with AWS (S3, Lambda, IAM) and orchestration tools (Airflow, dbt, etc.)
    • Strong understanding of data governance, cost optimization, and performance tuning
    • Experience with iGaming data, Kafka/Kinesis, or MLflow is a plus

    Why Join Us

    • Build a core data platform from the ground up
    • Competitive salary and performance bonuses
    • Flexible remote or hybrid work across Europe
    • Supportive, innovative, data-driven culture

    Ready to lead a data platform that powers smarter decisions across global iGaming brands?
    Apply now to join Traffic Label's Data & Technology team.

  • 36 views · 11 applications · 12d

    Senior Data Engineer (Snowflake and Informatica)

    Full Remote · EU · 5 years of experience · English - B2

    We are looking for a Senior Data Engineer to work with an existing enterprise Data Warehouse. They will build, support, and enhance data pipelines and data assets while working under defined SLAs in a regulated utility environment.

    Who are we looking for?
    ● 5+ years of experience in data engineering or data-centric roles
    ● Proven hands-on experience supporting production data platforms
    ● Strong experience with Snowflake, including: data structures and transformations, working with existing schemas and layered architectures.
    ● Solid experience with Informatica Cloud (IDMC / CDI): building and supporting ETL / ELT pipelines
    ● Good understanding of ETL / ELT concepts and data pipelines
    ● Experience working under SLAs and structured support processes
    ● Ability to investigate, diagnose, and resolve data incidents and defects
    ● Experience performing impact analysis and technical validation for changes
    ● Familiarity with Agile delivery and tools such as Jira
    ● Willingness to travel occasionally for business trips to the UK.
    ● Strong communication skills and ability to work closely with both technical and business stakeholders.
    ● Excellent proficiency in both verbal and written English communication skills.

     

    We offer:
    ● A place with a friendly environment where you can reach your full potential and grow your career
    ● Flexible work schedules
    ● Work from home
    ● Social package: paid sick leave and vacation
    ● English courses, medical insurance, legal support, etc.

     

    As a Senior Data Engineer you will:
    ● Build, maintain, and support data pipelines using Informatica IDMC (CDI)
    ● Develop and support data transformations and data assets in Snowflake.
    ● Ensure stable operation of data pipelines in line with BAU and SLA requirements.
    ● Investigate and resolve production incidents and defects.
    ● Deliver approved service requests and incremental enhancements.
    ● Perform impact analysis and technical validation for data changes.
    ● Execute unit testing and support release activities.
    ● Produce and maintain technical documentation and operational artefacts.
    ● Collaborate closely with other Data Engineers, BI specialists, and stakeholders
    ● Operate within defined role boundaries, without ownership of business rules, data definitions, or platform configuration.


     

    Our client is a global energy company focused on renewable power generation and low-carbon energy solutions. Operating across multiple regions, the company develops, builds, and operates large-scale energy assets, including wind, solar, and hybrid power projects. The company works in a highly regulated environment, where data accuracy, traceability, and reliability are essential.

  • 34 views · 3 applications · 12d

    Performance Engineer (Data Platform)

    Full Remote · EU · Product · 3 years of experience · English - B2

    We are looking for a specialist to design and implement an end-to-end performance testing framework for a healthcare system running on Databricks and Microsoft Azure. You will build a repeatable, automated approach to measure and improve performance across data ingestion, ETL/ELT pipelines, Spark workloads, serving layers, APIs, security/identity flows, integration components, and presentation/UI, while meeting healthcare-grade security and compliance expectations.

    This role sits at the intersection of performance engineering, cloud architecture, and test automation, with strong attention to regulated-domain requirements (privacy, auditability, access controls).

    Key Responsibilities

    • Design and build a performance testing strategy and framework for a Databricks + Azure healthcare platform.
    • Define performance KPIs/SLOs (e.g., pipeline latency, throughput, job duration, cluster utilization, cost per run, data freshness).
    • Create workload models that reflect production usage (batch, streaming, peak loads, concurrency, backfills).
    • Create a test taxonomy: smoke perf, baseline benchmarks, load, stress, soak/endurance, spike tests, and capacity planning.

       
    • Implement automated performance test suites for:
      • Databricks jobs/workflows (Workflows, Jobs API; see the timing sketch after this list)
      • Spark/Delta Lake operations (reads/writes, merges, compaction, Z-Ordering where relevant)
      • Data ingestion (ADF, Event Hubs, ADLS Gen2, Autoloader, etc. as applicable)
    • Build test data generation and data anonymization/synthetic data approaches suitable for healthcare contexts.
    • Instrument, collect, and analyze metrics from:
      • Spark UI / event logs
      • Databricks metrics and system tables
      • Azure Monitor / Log Analytics
      • Application logs and telemetry (if applicable)
    • Produce actionable performance reports and dashboards (trend, regression detection, run-to-run comparability).
    • Create performance tests for key user journeys (page load, search, dashboards) using appropriate tooling.
    • Measure client-side and network timings and correlate them with API/backend performance.
    • Integrate performance tests into CI/CD (Azure DevOps or GitHub Actions), including gating rules and baselines.
    • Document framework usage, standards, and provide enablement to engineering teams.
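
    For illustration, a minimal sketch of a performance smoke check against the Databricks Jobs API of the kind described above: trigger a job, wait for it to finish, and compare wall-clock duration against a stored baseline. The workspace URL, token, job ID, and regression threshold are hypothetical.

```python
# Illustrative perf-smoke harness for a Databricks job (Jobs API 2.1).
import time
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"   # hypothetical
HEADERS = {"Authorization": "Bearer <personal-access-token>"}
JOB_ID = 123
BASELINE_SECONDS = 900      # agreed baseline for this workload
MAX_REGRESSION = 1.2        # fail the check if >20% slower than baseline

run = requests.post(f"{HOST}/api/2.1/jobs/run-now",
                    headers=HEADERS, json={"job_id": JOB_ID}).json()
run_id = run["run_id"]

start = time.time()
while True:
    state = requests.get(f"{HOST}/api/2.1/jobs/runs/get",
                         headers=HEADERS, params={"run_id": run_id}).json()
    if state["state"]["life_cycle_state"] in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
        break
    time.sleep(30)

duration = time.time() - start
print(f"run {run_id} took {duration:.0f}s (baseline {BASELINE_SECONDS}s)")
assert duration <= BASELINE_SECONDS * MAX_REGRESSION, "performance regression detected"
```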

    Required Qualifications

    • Proven experience building performance testing frameworks (not just executing tests), ideally for data platforms.
    • Strong hands-on expertise with Databricks and Apache Spark performance tuning and troubleshooting.
    • Strong knowledge of Azure services used in data platforms (commonly ADLS Gen2, ADF, Key Vault, Azure Monitor/Log Analytics; others as relevant).
    • Strong programming/scripting ability in Python and/or Java/TypeScript.
    • Familiarity with load/performance tools and approaches (e.g., custom harnesses, Locust/JMeter/k6 where appropriate, or Spark-specific benchmarking).
    • Ability to design repeatable benchmarking (baseline creation, environment parity, noise reduction, statistical comparison).
    • Understanding of data security and compliance needs typical for healthcare (e.g., HIPAA-like controls, access management, auditability; adapt to your jurisdiction).
    • High-level proficiency in English

    Nice-to-Have / Preferred

    • Experience with Delta Lake optimization (OPTIMIZE, ZORDER, liquid clustering where applicable), streaming performance, and structured streaming.
    • Experience with Terraform/IaC for reproducible test environments.
    • Knowledge of Unity Catalog, data governance, and fine-grained access controls.
    • Experience with OpenTelemetry tracing and correlation across UI → API → data workloads.
    • FinOps mindset: performance improvements tied to cost efficiency on Databricks/Azure.
    • Prior work on regulated domains (healthcare, pharma, insurance).

     Working Model

    • Contract
    • Remote
    • Collaboration with Data Engineering, Platform Engineering, Security/Compliance, and Product teams.


     

  • 216 views · 9 applications · 12d

    Google Cloud Solutions Architect / Pre-Sales Engineer

    Full Remote · Ukraine · Product · 1 year of experience · English - B1

    Google Cloud Solutions Architect / Pre-Sales Engineer

     

    Company: Softprom Solutions
    Location: Remote (Ukraine)
    Employment: Full-time, FOP (contractor)

     

    About the role

    Softprom Solutions is looking for a Google Cloud Solutions Architect / Pre-Sales Engineer to join our Cloud team.

    This role is ideal for a specialist with hands-on experience in Google Cloud Platform and an active Google Cloud certification, who wants to work on real commercial projects, participate in pre-sales activities, and grow in cloud architecture.

     

    Responsibilities

    • Design and document Google Cloud architectures according to the Google Cloud Architecture Framework
    • Participate in pre-sales activities:
      • technical discovery with customers
      • solution design
      • participation in demos and presentations
    • Deployment and configuration of core GCP services, including:
      • Compute Engine, Cloud Storage, Cloud SQL
      • VPC, IAM, Load Balancing
      • Cloud Functions / Cloud Run
    • Design and configure GCP networking:
      • VPC networks, subnets
      • Firewall rules
      • Routes
    • Implement and support Infrastructure as Code (IaC) using Terraform
    • Create technical and solution documentation
    • Act as a technical point of contact for sales and customers

     

    Requirements (Must have)

    • Active Google Cloud certification
      (Associate Cloud Engineer or Professional Cloud Architect)
    • Experience as Cloud Engineer / Solutions Architect / Pre-Sales Engineer
    • Practical understanding of Google Cloud core services:
      • Compute, Storage, Databases, Networking, Security
    • Solid networking knowledge:
      • TCP/IP (L3, L4)
      • IP addressing, subnetting
      • DNS
    • Understanding of cloud principles:
      • High Availability
      • Fault Tolerance
      • Scalability
      • Security
    • Linux skills (command line)

     

    Nice to have

    • Experience in pre-sales or customer-facing roles
    • Experience with Terraform
    • Basic scripting skills (Python, Bash)
    • Experience with Docker, Kubernetes (GKE), Cloud Run
    • Experience with AWS or multi-cloud environments

     

    Soft skills

    • Ability to communicate technical solutions to non-technical audiences
    • Structured and analytical thinking
    • Proactivity and ownership
    • Ukrainian – fluent
    • English – reading and basic spoken (technical / pre-sales)

     

    We offer

    • Compensation: $2000 USD + project-based bonuses
    • FOP cooperation
    • Full-time workload
    • Real commercial Google Cloud projects
    • Participation in pre-sales and architecture design
    • Professional growth in Cloud & Solutions Architecture
    • International projects and vendors
    • Strong cloud team and mentorship
  • 41 views · 4 applications · 12d

    AWS Cloud Engineer

    Full Remote · Worldwide · Product · 2 years of experience · English - B2

    AWS Cloud Engineer

    Softprom Solutions
    Azerbaijan | Remote / Hybrid
    Full-time | Contractor (Individual Entrepreneur)
    Contract with Austria

     

    About Softprom

    Softprom Solutions is an international IT distributor and solutions provider working with leading global vendors in Cloud, Cybersecurity, Infrastructure, and Enterprise IT.

    We are expanding our Cloud Practice and are looking for an AWS Cloud Engineer in Azerbaijan who wants to grow professionally, work on real customer projects, and collaborate with international teams under an Austrian contract.

     

    About the role

    This role is ideal for an AWS-certified engineer with solid fundamentals who wants to deepen hands-on experience in cloud architecture, deployments, automation, and customer-facing work.

    You will work closely with senior architects, sales teams, and customers, participating in both technical delivery and pre-sales activities.

     

    Responsibilities

    • Support the design and documentation of AWS cloud architectures following the AWS Well-Architected Framework
       
    • Participate in deployment and configuration of core AWS services, including
      VPC, EC2, S3, RDS, IAM, Lambda, Load Balancers
       
    • Assist with AWS networking configuration:
      Subnets, Route Tables, Security Groups, NACLs
       
    • Contribute to automation and Infrastructure as Code (IaC) initiatives using
      Terraform and/or AWS CloudFormation
       
    • Create and maintain technical documentation for architectures and configurations
       
    • Participate in customer meetings, presentations, and demos, explaining AWS solutions and capabilities

       

    Requirements (Must have)

    • Active AWS Certified Solutions Architect – Associate (SAA-C03)
       
    • Solid theoretical understanding of AWS core services:
      Compute, Storage, Databases, Networking, Security
       
    • Basic networking knowledge:
      TCP/IP (L3, L4), IP addressing, subnetting, DNS
       
    • Understanding of cloud principles:
      High Availability, Fault Tolerance, Scalability, Security
       
    • Basic Linux skills (command line)
       
    • Ability to read and understand technical documentation in English

       

    Nice to have

    • Hands-on experience with Terraform or CloudFormation
       
    • Basic scripting skills (Python, Bash)
       
    • Familiarity with Docker, Kubernetes, ECS
       
    • Personal, educational, or non-commercial projects deployed on AWS

       

    Soft skills

    • Strong motivation to learn and grow in cloud engineering
       
    • Structured and analytical thinking
       
    • Clear and confident communication
       
    • Ukrainian or Russian – fluent
       
    • English – reading technical documentation (spoken English is a plus)

       

     

    We offer
     

    • Individual Entrepreneur (FOP) / contractor model
       
    • Official contract with Austria
       
    • Full-time workload
       
    • Real commercial AWS projects (not internal labs)
       
    • Participation in architecture design and pre-sales activities
       
    • Professional growth within Cloud & Solutions Architecture
       
    • International customers and vendors
       
    • Supportive, senior cloud team and mentorship
  • 32 views · 6 applications · 12d

    Senior Data Engineer (Capacity and Forecasting Systems)

    Full Remote · Countries of Europe or Ukraine · 5 years of experience · English - B2

    We at Sigma Software are looking for a skilled Senior Data Engineer to join an exciting short-term project with a US-based customer. This is a remote position, offering you the flexibility to work from anywhere while contributing to a high-impact data platform.

    In this role, you will take ownership of designing and optimizing the data foundation for an automated storage capacity forecasting platform. You'll work with modern technologies, collaborate with experienced engineers, and have the opportunity to influence both technical and process decisions.

    Why join us? At Sigma Software, you'll work in a culture that values innovation, encourages knowledge sharing, and offers the chance to make a real impact on projects used by thousands of businesses worldwide.

    CUSTOMER
    Our customer is ConnectWise – a US-based software company providing business automation solutions for Managed Service Providers (MSPs). ConnectWise offers a suite of tools for IT service management, cybersecurity, remote monitoring, and business process automation. Their solutions are used globally by thousands of MSPs to streamline operations, improve service delivery, and enhance security for small and medium-sized businesses (SMBs).

    PROJECT
    The project focuses on building an automated storage capacity forecasting platform for MSPs. The platform will model historical infrastructure data, enable predictive insights, and support lifecycle planning for hardware and storage resources.
    It will integrate PostgreSQL, Python-based ETL pipelines, and PowerBI analytics to deliver accurate capacity forecasts and actionable reports for an 18-month planning horizon. The work environment encourages technical ownership, process improvement, and collaborative problem-solving with the customer’s engineering team.

    Project duration is 3-4 months

     

    RESPONSIBILITIES

    • Design and optimize PostgreSQL data models for historical capacity and lifecycle tracking
    • Build and maintain robust ETL pipelines using Python for structured and semi-structured (JSON) data
    • Aggregate and structure data by Region, Node Type, and time dimensions (see the sketch after this list)
    • Support time-series analysis and capacity forecasting use cases
    • Develop and enable PowerBI datasets, models, and reports based on clean, reliable data
    • Ensure data quality, performance, and scalability across the pipeline
    • Translate infrastructure and business requirements into scalable data solutions
    • Collaborate closely with software developers and stakeholders on end-to-end data workflows
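
    For illustration, a minimal sketch of the JSON-to-PostgreSQL aggregation step referenced above, assuming pandas and SQLAlchemy. The file path, column names, and target table are hypothetical.

```python
# Illustrative ETL step: flatten semi-structured capacity snapshots (JSON) and
# aggregate used storage by region, node type, and month before loading into PostgreSQL.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://etl:***@localhost:5432/capacity")

raw = pd.read_json("snapshots/2024-06.json", lines=True)   # one JSON object per line
flat = pd.json_normalize(raw.to_dict("records"))            # flatten nested fields

flat["snapshot_month"] = (
    pd.to_datetime(flat["captured_at"]).dt.to_period("M").dt.to_timestamp()
)
monthly = (
    flat.groupby(["region", "node_type", "snapshot_month"], as_index=False)
        .agg(used_tb=("used_tb", "sum"), capacity_tb=("capacity_tb", "sum"))
)

monthly.to_sql("capacity_monthly", engine, schema="analytics",
               if_exists="append", index=False)
```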

       

    REQUIREMENTS

    • At least 5 years of experience as a Data Engineer or in a similar data-focused role
    • Strong proficiency in SQL and relational databases, preferably PostgreSQL
    • Solid experience with Python for data transformation and pipeline development
    • Hands-on experience working with JSON and semi-structured data formats
    • Proven track record of building and optimizing ETL processes
    • Practical experience with PowerBI, including dataset modeling and report creation
    • Experience working with time-series and historical datasets
    • Strong understanding of data modelling principles for analytics and forecasting
    • Upper-Intermediate level of English 

       

    WILL BE A PLUS:

    • Experience with Kibana or other BI/visualization tools
    • Familiarity with monitoring, infrastructure, or capacity planning data
    • Exposure to forecasting techniques or growth trend analysis
    • Experience integrating data from metrics and inventory systems
  • 35 views · 7 applications · 12d

    Senior Data Engineer

    Full Remote · Worldwide · Product · 5 years of experience · English - B2

    About us:
    Data Science UA is a service company with strong data science and AI expertise. Our journey began in 2016 with the organization of the first Data Science UA conference, setting the foundation for our growth. Over the past 9 years, we have diligently fostered the largest Data Science Community in Eastern Europe, boasting a network of over 30,000 AI top engineers.

    About the client:
    We are working with a new-generation data service provider specializing in data consulting and data-driven digital marketing, dedicated to transforming data into business impact across the entire value chain of organizations. The company's data-driven services are built upon the deep AI expertise it has acquired with a 1000+ client base around the globe. The company has 1000 employees across 20 offices who are focused on accelerating digital transformation.

    About the role:
    We are seeking a Senior Data Engineer (Azure) to design and maintain data pipelines and systems for analytics and AI-driven applications. You will work on building reliable ETL/ELT workflows and ensuring data integrity across the organization.

    Required skills:
    - 6+ years of experience as a Data Engineer, preferably in Azure environments.
    - Proficiency in Python, SQL, NoSQL, and Cypher for data manipulation and querying.
    - Hands-on experience with Airflow and Azure Data Services for pipeline orchestration.
    - Strong understanding of data modeling, ETL/ELT workflows, and data warehousing concepts.
    - Experience in implementing DataOps practices for pipeline automation and monitoring.
    - Knowledge of data governance, data security, and metadata management principles.
    - Ability to work collaboratively with data science and analytics teams.
    - Excellent problem-solving and communication skills.

    Responsibilities:
    - Transform data into formats suitable for analysis by developing and maintaining processes for data transformation, structuring, metadata management, and workload management.
    - Design, implement, and maintain scalable data pipelines on Azure.
    - Develop and optimize ETL/ELT processes for various data sources.
    - Collaborate with data scientists and analysts to ensure data readiness.
    - Monitor and improve data quality, performance, and governance.

  • 38 views · 3 applications · 12d

    Senior Data Engineer

    Full Remote · Poland, Romania, Ukraine · 6 years of experience · English - B2

    Transcenda is a global provider of design and engineering services. We put people first and strive to be agents of change by building a better future through technology. We are dedicated to empowering organizations to rapidly scale, digitally transform, and bring new products to market.

    Recognized by Newsweek as one of America's greatest workplaces of 2025, Transcenda is home to 200+ engineers, designers, analysts, and advisors solving complex business challenges through technology. By approaching our work through a variety of cultures and perspectives, we take calculated risks to design and develop innovative solutions that will have a positive impact tomorrow.

     

    Interesting Facts:

    • Over 200 team members
    • Fully remote – we let people work where they work best.
    • We work with clients who value our opinion and thought leadership, and where we can make a meaningful contribution to architectural decisions, engineering decisions, and product decisions.
    • We have a strong social agenda and promote diversity and inclusion, and participate in a variety of charity initiatives throughout the year.
    • We have fun team-building activities.
    • Since we are rapidly growing, the ability to grow and advance your career is available and at a fairly quick rate.


    Must Haves:

    • Strong experience with Python, Java, or other programming languages
    • Advanced knowledge of SQL, including complex queries, query modularization, and optimization for performance and readability
    • Familiarity with the modern data stack and cloud-native data platforms, such as Snowflake, BigQuery, or Amazon Redshift
    • Hands-on experience with dbt (data build tool) for data modeling and transformations
    • Experience with data orchestration tools, such as Airflow or Dagster

    ‍

    Nice to Have:

    • Experience with GitOps, continuous delivery for data pipelines
    • Experience with Infrastructure-as-Code tooling (Terraform)

    ‍

    Key Responsibilities:

    • Design and build a data platform that standardizes data practices across multiple internal teams
    • Support the entire data lifecycle
    • Build and maintain integrations across data processing layers, including ingestion, orchestration, transformation, and consumption (see the sketch after this list)
    • Collaborate closely with cross-functional teams to understand data needs and ensure the platform delivers value
    • Document architectures, solutions, and integrations to promote best practices, maintainability, and usability
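
    For illustration, a minimal sketch of how the orchestration and transformation layers can meet: a Dagster asset that runs the dbt models tagged "daily". It assumes the dagster package and a configured dbt project with the dbt CLI on PATH; the tag name is hypothetical, and in practice an integration such as dagster-dbt would usually replace the raw subprocess call.

```python
# Illustrative bridge between orchestration (Dagster) and transformation (dbt).
import subprocess

from dagster import asset

@asset
def daily_dbt_models() -> None:
    # dbt owns the SQL transformations; the orchestrator only sequences them.
    subprocess.run(["dbt", "run", "--select", "tag:daily"], check=True)
```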
  • 33 views · 2 applications · 12d

    Sr Data Engineer

    Full Remote · Countries of Europe or Ukraine · 5 years of experience · English - B2

    You'll take ownership of a large-scale AWS data platform powering analytics for thousands of hotels and restaurants worldwide. This is a hands-on role where your work directly impacts business decisions across the hospitality industry – not internal dashboards nobody uses.

    We're looking for someone who doesn't just build pipelines – but runs them, fixes them, and makes them bulletproof.

     

    About the Product

    A hospitality technology company operating a data analytics platform serving:

    • 2,500+ hotels
    • 500+ restaurants

    The system processes operational and performance data, delivering insights to product and analytics teams who rely on it daily.

     

    Your Mission

    Own and operate the AWS data infrastructure:

    • Build scalable, production-grade data pipelines
    • Ensure reliability, performance, and cost-efficiency
    • Keep everything running smoothly in real production environments

    This is not a "design slides and disappear" role – it's real ownership of real data systems.

     

    What You’ll Be Doing

    Data Engineering & Pipelines

    • Build and operate Spark / PySpark workloads on EMR and Glue
    • Design end-to-end pipelines:
      API / DB / file ingestion → transformation → delivery to analytics consumers (see the sketch after this list)
    • Implement data validation, monitoring, and quality checks
    • Optimize pipelines for performance, cost, and scalability
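
    For illustration, a minimal sketch of one PySpark step such a pipeline might contain on EMR or Glue: read raw events from S3, apply a basic validation gate, and write curated Parquet. Bucket paths, column names, and the 5% rejection threshold are hypothetical.

```python
# Illustrative PySpark curation step with a simple data-quality gate.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bookings_curation").getOrCreate()

raw = spark.read.json("s3://raw-zone/bookings/2024/06/01/")

validated = (
    raw.filter(F.col("hotel_id").isNotNull() & (F.col("amount") >= 0))
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["booking_id"])
)

# Fail the job if too many rows were rejected by the validation rules.
rejected_ratio = 1 - validated.count() / max(raw.count(), 1)
if rejected_ratio > 0.05:
    raise ValueError(f"{rejected_ratio:.1%} of rows failed validation")

(validated.write.mode("overwrite")
          .partitionBy("event_date")
          .parquet("s3://curated-zone/bookings/"))
```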

     

    Infrastructure & Operations

    • Manage AWS infrastructure using Terraform
    • Monitor via CloudWatch
    • Debug production failures and implement preventive solutions
    • Maintain IAM and security best practices

     

    Collaboration

    • Work closely with product and analytics teams
    • Define clear data contracts
    • Deliver reliable datasets for BI and analytics use cases

     

    Must-Have Experience

    • 5+ years of hands-on data engineering in production
      (actual pipelines running in production, not only architecture work)
    • Strong Spark / PySpark
    • Advanced Python
    • Advanced SQL
    • AWS data stack: EMR, Glue, S3, Redshift (or similar), IAM, CloudWatch
    • Infrastructure as Code with Terraform
    • Experience debugging and stabilizing production data systems

     

    Nice to Have

    • Kafka or Kinesis (streaming)
    • Airflow or similar orchestration tools
    • Experience supporting BI tools and analytics teams

     

    What We Care About

    • You've handled pipeline failures in production – and learned from them
    • You prioritize data correctness, not just speed
    • You write maintainable, readable code
    • You understand AWS cost and scaling trade-offs
    • You avoid over-engineering – and ship what delivers value
  • 72 views · 18 applications · 12d

    Data Engineer

    Full Remote · Countries of Europe or Ukraine · Product · 3 years of experience · English - B2

    We are seeking a skilled Data Engineer to join our team and contribute to the development of large-scale analytics platforms. The ideal candidate will have strong experience in cloud ecosystems such as Azure and AWS, as well as expertise in AI and machine learning applications. Knowledge of the healthcare industry and life sciences is a plus.

    Key Responsibilities

    • Design, develop, and maintain scalable data pipelines for large-scale analytics platforms.
    • Implement cloud-based solutions using Azure and AWS, ensuring reliability and performance.
    • Work closely with data scientists and AI/ML teams to optimize data workflows.
    • Ensure data quality, governance, and security across platforms.
    • Collaborate with cross-functional teams to integrate data solutions into business processes.

    Required Qualifications

    • Bachelor's degree (or higher) in Computer Science, Engineering, or a related field.
    • 3+ years of experience in data engineering, big data processing, and cloud-based architecture.
    • Strong proficiency in cloud services (Azure, AWS) and distributed computing frameworks.
    • Mandatory hands-on experience with Databricks (UC, DLTs, Delta Sharing, etc.)
    • Expertise in SQL and database management systems (SQL Server, MySQL, etc.).
    • Experience with data modeling, ETL processes, and data warehousing solutions.
    • Knowledge of AI and machine learning concepts and their data requirements.
    • Proficiency in Python, Scala, or similar programming languages.
    • Basic knowledge of C# and/or Java programming.
    • Familiarity with DevOps, CI/CD pipelines.
    • High-level proficiency in English (written and spoken).

    Preferred Qualifications

    • Experience in the healthcare or life sciences industry.
    • Understanding of regulatory compliance related to healthcare data (HIPAA, GDPR, etc.).
    • Familiarity with interoperability standards such as HL7, FHIR, and EDI.