Data Engineer (Python / PySpark) to $4000

We’re hiring a Data Engineer (Python / PySpark) who will bring their professional skills and passion to delivering outstanding enterprise solutions.

✅ English at the Upper-Intermediate level or higher is required.

 

  • Location: Lviv, Ukraine (on-site / hybrid)
  • Candidates must be based in Lviv or willing to relocate before the start date; this position requires on-site work in Lviv

 

Responsibilities:

  • Design, build, and maintain ETL pipelines for large-scale data processing
  • Develop batch and streaming data workflows using Python and PySpark
  • Work with cloud data platforms such as AWS Glue, EMR, S3, and AWS Data Pipeline
  • Integrate and manage messaging systems like Kafka and RabbitMQ
  • Develop and maintain solutions using Hadoop ecosystem components: HDFS, Hive
  • Optimize data storage and query performance in relational databases (PostgreSQL, Redshift)
  • Containerize data workflows using Docker
  • Orchestrate workflows with Airflow
  • Implement CI/CD pipelines for data workflows and maintain version control (Git)
  • Monitor data pipelines and system performance using Grafana and logging tools
  • Ensure data security and access control: encryption, IAM, and compliance best practices

Requirements:

  • 4–5+ years of experience in Data Engineering
  • Strong proficiency in Python and PySpark
  • Hands-on experience with ETL pipelines and data modeling
  • Knowledge of cloud data services (AWS Glue, EMR, S3, Data Pipeline)
  • Experience with messaging systems: Kafka, RabbitMQ
  • Familiarity with Hadoop ecosystem: HDFS, Hive
  • Strong SQL skills (PostgreSQL, Redshift)
  • Experience with Docker and workflow orchestration (Airflow)
  • Knowledge of CI/CD and version control (Git)
  • Monitoring and logging experience (Grafana)
  • Understanding of data security, encryption, and access control
  • Analytical mindset and strong problem-solving skills
  • Upper-Intermediate English or higher

 

Nice to Have:

  • Experience with multi-cloud environments or hybrid infrastructures
  • Familiarity with big data performance tuning (partitioning, memory optimization)
  • Experience with real-time streaming data processing
  • Knowledge of data governance and compliance standards

 

What we can offer:

  • Full-time flexible working schedule
  • Comfortable, cosy, and well-equipped office
  • Modern workplace with a MacBook Pro
  • 18 business days of paid vacation / 20 paid sick days / public holidays in Ukraine
  • English lessons
Published 2 September