Senior Big Data DevOps Engineer, up to $4000

We are looking for a Senior Big Data DevOps Engineer to join a telecommunications project focused on data-intensive systems. You’ll be responsible for supporting and optimizing infrastructure for big data pipelines and ensuring system reliability and performance.


Requirements:

  • 4+ years of experience as a DevOps Engineer.
  • Strong hands-on experience with Linux and Bash scripting.
  • Solid knowledge of SQL.
  • Confident command of at least one programming language (Python or JavaScript); experience with Ezmeral Data Fabric is a plus.
  • Experience with Kubernetes and containerized environments.
  • Hands-on experience with Spark, Kafka, and Airflow.
  • Familiarity with monitoring systems: Zabbix, Grafana, Prometheus.
  • Good understanding of TCP/IP, HTTP, and REST API principles.
  • Experience with Git, GitLab, and CI/CD tools.
  • Experience with Terraform and Ansible.
  • English sufficient for reading and understanding technical documentation.


Responsibilities:

  • Support and maintain big data infrastructure components.
  • Automate deployment and infrastructure provisioning processes.
  • Work with large-scale data tools to ensure stability and performance.
  • Collaborate with developers and data engineers to support end-to-end solutions.


Benefits:

  • Opportunities for growth and development within the project
  • Flexible working hours
  • Option to work remotely or from the office
Published 9 June