Middle Data Engineer

Our customer (originally the Minnesota Mining and Manufacturing Company) is an American multinational conglomerate operating in the fields of industry, worker safety, and consumer goods. Based in Maplewood, a suburb of Saint Paul, Minnesota, the company produces over 60,000 products, including adhesives, abrasives, laminates, passive fire protection, personal protective equipment, window films, paint protection film, electrical and electronic connecting and insulating materials, car-care products, electronic circuits, and optical films.

 

We are looking for a skilled and motivated Data Engineer to join our growing team. In this role, you will contribute to the development and maintenance of our data pipelines, helping to build and support robust pipelines that transform raw data into valuable assets for analytics and business intelligence. This is an excellent opportunity to enhance your skills in a collaborative environment while making a significant impact on our data capabilities.

 

Required Qualifications & Skills

  • 3+ years of professional experience in data engineering or a related role.
  • Solid proficiency with Python for data processing and automation, with at least 2-3 years of hands-on experience.
  • Strong SQL skills for querying and manipulating complex datasets.
  • Experience with cloud data services, preferably Azure (Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage).
  • Hands-on experience with big data processing frameworks like Spark (PySpark) and platforms such as Databricks (a short PySpark sketch follows this list).
  • Good understanding of data warehousing concepts, ETL processes, and data integration techniques.
  • Experience in applying data quality assessment and improvement techniques.
  • Experience working with various data formats, including structured, semi-structured, and unstructured data (e.g., CSV, JSON, Parquet).
  • Familiarity with Agile and Scrum methodologies and project management tools (e.g., Azure DevOps, Jira).
  • Good communication skills and the ability to work effectively as part of a team.
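
For illustration of the kind of hands-on PySpark work described above, here is a minimal sketch that reads raw CSV data, types a few columns, and writes Parquet for analytics. The paths and column names are hypothetical and only show the general pattern.

    # Minimal PySpark sketch: read raw CSV, apply simple typing, write Parquet.
    # Paths and column names are illustrative only.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

    # Read raw data from a hypothetical landing path.
    raw = spark.read.option("header", True).csv("/mnt/raw/orders/*.csv")

    # Cast raw strings to proper types and derive a partition column.
    orders = (
        raw.withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .withColumn("order_date", F.to_date("order_ts"))
    )

    # Persist in a columnar format for downstream analytics (hypothetical path).
    orders.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/orders/")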

Preferred Qualifications & Skills

  • Knowledge of DevOps methodologies and CI/CD practices for data pipelines.
  • Familiarity with modern data platforms like Microsoft Fabric for data modeling and integration.
  • Experience with consuming data from REST APIs (see the sketch after this list).
  • Experience with database design concepts and performance tuning.
  • Knowledge of dimensional data modeling concepts (Star Schema, Snowflake Schema).
  • Awareness of modern data architecture concepts such as Data Mesh.
  • Experience in supporting production data pipelines.
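
As a small illustration of the REST API point above, the following sketch pulls a paginated endpoint into a list of records using Python's requests library. The endpoint, credentials, and pagination parameters are hypothetical.

    # Minimal sketch of consuming a paginated REST API.
    # Endpoint, auth header, and pagination scheme are hypothetical.
    import requests

    BASE_URL = "https://api.example.com/v1/products"   # placeholder endpoint
    HEADERS = {"Authorization": "Bearer <token>"}       # placeholder credentials

    def fetch_all(page_size: int = 100) -> list[dict]:
        """Collect every page of results into a single list of records."""
        records, page = [], 1
        while True:
            resp = requests.get(
                BASE_URL,
                headers=HEADERS,
                params={"page": page, "per_page": page_size},
                timeout=30,
            )
            resp.raise_for_status()
            batch = resp.json()
            if not batch:  # an empty page signals the end of the data
                break
            records.extend(batch)
            page += 1
        return records

    if __name__ == "__main__":
        print(f"Fetched {len(fetch_all())} records")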

Key Responsibilities

  • Develop & Maintain Data Pipelines: Develop, test, and maintain robust and efficient data pipelines using Python, SQL, and Spark on the Azure cloud platform.
  • Implement Data Solutions: Implement and support end-to-end data solutions, from data ingestion and processing to storage in our data lake (Azure Data Lake Storage, Delta Lake) and data warehouse.
  • Utilize Cloud Data Services: Work with Azure services like Azure Data Factory, Databricks, and Azure SQL Database to build and manage data workflows.
  • Ensure Data Quality: Implement data quality checks, including data profiling, cleansing, and validation routines, to help ensure the accuracy and reliability of our data (see the sketch after this list).
  • Performance Tuning: Assist in monitoring and optimizing data pipelines for performance and scalability under the guidance of senior engineers.
  • Code Reviews & Best Practices: Actively participate in code reviews and adhere to team best practices in data engineering and coding standards.
  • Stakeholder Collaboration: Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and assist in delivering effective solutions.
  • Troubleshooting: Provide support for production data pipelines by investigating and resolving data-related issues.
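
To make the data-quality responsibility more concrete, here is a minimal PySpark sketch of profiling and validation checks run before data is loaded downstream; the dataset, key column, and rules are hypothetical examples.

    # Minimal PySpark data-quality sketch: profile nulls and validate simple rules.
    # Dataset path and column names are illustrative only.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_quality_checks").getOrCreate()
    orders = spark.read.parquet("/mnt/curated/orders/")  # hypothetical path

    # Profiling: count nulls per column.
    orders.select(
        [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in orders.columns]
    ).show()

    # Validation: the business key should be unique and amounts non-negative.
    duplicate_keys = orders.groupBy("order_id").count().filter(F.col("count") > 1)
    negative_amounts = orders.filter(F.col("amount") < 0)

    if duplicate_keys.count() > 0 or negative_amounts.count() > 0:
        raise ValueError("Data quality check failed: duplicate keys or negative amounts")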

Required languages

English B2 - Upper Intermediate