Data Engineer $$$

Profisea is an Israeli boutique DevOps and Cloud company offering a full cycle of services. For more than nine years, we have been implementing best practices in GitOps, DevSecOps, and FinOps, and providing Kubernetes-based infrastructure services to help businesses of all sizes, from SMBs and SMEs to large enterprise clients, stay innovative and effective.


We are looking for a Data Engineer with 3+ years of experience building and maintaining scalable data platforms in cloud environments. The ideal candidate will design, implement, and optimize data pipelines that support analytics and data-driven applications across AWS and GCP ecosystems.
Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL/ELT workflows.
  • Build and manage data processing solutions using AWS Athena, AWS Glue, Glue Data Catalog, EMR, and S3.
  • Implement data ingestion and replication pipelines using AWS DMS and other integration tools.
  • Develop and orchestrate workflows using Apache Airflow and GCP Cloud Composer.
  • Manage and optimize analytical workloads in Google BigQuery.
  • Perform database migrations using AWS DMS and GCP Database Migration Service.
  • Work with relational and NoSQL databases including PostgreSQL, MongoDB, Amazon RDS, DocumentDB, OpenSearch/Elasticsearch, and CloudSQL.
  • Ensure data quality, reliability, and performance across data platforms.
  • Collaborate with data analysts, data scientists, and software engineers to enable efficient data access and analytics.

Requirements

  • 3+ years of experience in Data Engineering or related roles.
  • Hands-on experience with AWS data services including Athena, Glue, Glue Data Catalog, EMR, DMS, and S3.
  • Experience working with GCP data platforms, especially BigQuery and Cloud Composer.
  • Experience with database migration tools such as AWS DMS or GCP Database Migration Service.
  • Strong knowledge of SQL and data modeling.
  • Experience working with PostgreSQL, MongoDB, and search platforms such as OpenSearch or Elasticsearch.
  • Experience managing relational databases like RDS and CloudSQL.
  • Familiarity with workflow orchestration tools such as Airflow.
  • Understanding of data warehousing, data lakes, and distributed data processing concepts.

Nice to Have

  • Experience with large-scale data lake architectures.
  • Experience optimizing BigQuery or Athena query performance and cost.
  • Knowledge of infrastructure-as-code and CI/CD for data platforms.
  • Experience working in multi-cloud environments (AWS + GCP).

What We Offer

  • Competitive salary   
  • Remote work 
  • Flexible schedule 
  • Career growth 
  • Sports compensation 
  • Professional working environment where you'd be an essential member of our company 
  • Corporate culture and mutual support 

Required languages

English: B2 - Upper Intermediate
Ukrainian: Native
Published 1 April