Data Engineer

Overview: 

We are seeking a skilled and motivated Data Engineer with expertise in Snowflake and Python to join our dynamic team. In this role, you will be responsible for designing, building, and optimizing scalable data pipelines, ensuring data accuracy and availability, and driving efficient data workflows. Your work will directly support data-driven decision-making across the organization. 


Key Responsibilities: 

  • Design, develop, and maintain scalable ETL/ELT pipelines for structured and unstructured data using Snowflake and Python. 
  • Implement and optimize Snowflake-based data warehouses, ensuring performance, scalability, and security. 
  • Collaborate with cross-functional teams to understand business requirements and translate them into efficient data models and pipelines. 
  • Perform data ingestion, transformation, and integration from various data sources (cloud and on-premises). 
  • Develop and manage data processing workflows and scheduling using tools such as Apache Airflow for orchestration and dbt for transformations. 
  • Monitor, troubleshoot, and resolve data pipeline and data quality issues. 
  • Establish best practices for Snowflake configuration, including data sharing, partitioning, clustering, and query optimization. 
  • Automate data validation and testing processes to ensure high data quality and accuracy. 
  • Maintain data security, privacy, and compliance in adherence to organizational and regulatory standards. 
  • Document data engineering processes, pipelines, and systems to ensure transparency and continuity. 


Qualifications: 

  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. 
  • 3+ years of experience in data engineering, with proven expertise in Snowflake and Python. 
  • Hands-on experience in implementing and managing Snowflake solutions, including data modeling, schema design, and query optimization. 
  • Proficiency in Python for data manipulation, automation, and integration. 
  • Experience with cloud platforms such as AWS, Azure, or GCP, and their associated data services. 
  • Strong knowledge of SQL, including query optimization and stored procedure development. 
  • Familiarity with data integration tools (e.g., Fivetran, Talend, Informatica, SSIS) and orchestration tools (e.g., Airflow or Orchestra). 
  • Understanding of data governance, security, and compliance standards (e.g., GDPR, HIPAA). 
  • Experience with CI/CD pipelines for data workflows is a plus. 
  • Excellent analytical, problem-solving, and communication skills. 


Preferred Skills: 

  • Knowledge of machine learning workflows and integration with data pipelines. 
  • Experience with APIs for data integration. 
  • Familiarity with BI tools like Tableau, Power BI, AWS QuickSight or Looker. 
  • Certification in Snowflake or cloud platforms (e.g., AWS Certified Data Analytics – Specialty). 


Soft Skills: 

  • Strong collaboration skills to work with cross-functional teams. 
  • Proactive and innovative mindset to identify opportunities for improvement in data processes. 
  • Attention to detail and commitment to delivering high-quality solutions. 


Why Join Us? 

  • Opportunity to work on cutting-edge data technologies and projects. 
  • Collaborative and inclusive work environment. 
  • Career development opportunities through training and certifications. 
  • Competitive compensation package, including benefits and performance bonuses. 
