Senior Data Engineer

Description

Method is a global design and engineering consultancy founded in 1999. We believe that innovation should be meaningful, beautiful and human. We craft practical, powerful digital experiences that improve lives and transform businesses. Our teams, based in New York, Charlotte, Atlanta, London, Bengaluru, and remote, work with a wide range of organizations in many industries, including Healthcare, Financial Services, Retail, Automotive, Aviation, and Professional Services.

 

Method is part of GlobalLogic, a digital product engineering company. GlobalLogic integrates experience design and complex engineering to help our clients imagine what’s possible and accelerate their transition into tomorrow’s digital businesses. GlobalLogic is a Hitachi Group Company.

 

Your role is to collaborate with multidisciplinary individuals and support the project lead on data strategy and implementation projects. You will be responsible for data and systems assessment, identifying the critical data and quality gaps that must be closed for effective decision support, and contributing to the data platform modernization roadmap.

 

Responsibilities:

  • Work closely with data scientists, data architects, business analysts, and other disciplines to understand data requirements and deliver accurate data solutions.
  • Analyze and document existing data system processes to identify areas for improvement.
  • Develop detailed process maps that describe data flow and integration across systems.
  • Create a data catalog and document data structures across various databases and systems.
  • Compare data across systems to identify inconsistencies and discrepancies.
  • Contribute towards gap analysis and recommend solutions for standardizing data.
  • Recommend data governance best practices to organize and manage data assets effectively.
  • Propose database design standards and best practices to suit various downstream systems, applications, and business objectives.
  • Strong problem-solving abilities with meticulous attention to detail.
  • Experience with requirements gathering and related methodologies.
  • Excellent communication and presentation skills with the ability to clearly articulate technical concepts, methodologies, and business impact to both technical teams and clients.
  • A unique point of view. You are trusted to question approaches, processes, and strategy to better serve your team and clients.

 

Skills Required 

Technical skills

  • 5+ years of proven data engineering experience, with expertise in data warehousing, data management, and data governance across SQL and NoSQL databases.
  • Deep understanding of data modeling, data architecture, and data integration techniques.
  • Advanced proficiency in ETL/ELT processes and data pipeline development, from raw and structured layers through to business/analytics layers that support BI analytics and AI/GenAI models.
  • Hands-on experience with ETL tools such as Databricks (preferred), Matillion, Alteryx, or similar platforms.
  • Commercial experience with a major cloud platform like Microsoft Azure (e.g., Azure Data Factory, Azure Synapse, Azure Blob Storage).
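The raw-to-business/analytics layering described above can be sketched in miniature. This is an illustrative Python example only; the schema, field names, and cleaning rules are hypothetical, not taken from the role:

```python
from collections import defaultdict

# Illustrative raw-layer records (hypothetical schema).
RAW_ORDERS = [
    {"order_id": "1", "region": " north ", "amount": "120.50"},
    {"order_id": "2", "region": "North", "amount": "80.00"},
    {"order_id": "3", "region": "south", "amount": "not-a-number"},
]

def to_structured(raw_rows):
    """Raw -> structured layer: type-cast fields and standardize values,
    quarantining rows that fail validation."""
    clean, rejects = [], []
    for row in raw_rows:
        try:
            clean.append({
                "order_id": int(row["order_id"]),
                "region": row["region"].strip().title(),
                "amount": float(row["amount"]),
            })
        except (KeyError, ValueError):
            rejects.append(row)
    return clean, rejects

def to_analytics(structured_rows):
    """Structured -> business/analytics layer: revenue per region."""
    totals = defaultdict(float)
    for row in structured_rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

structured, rejected = to_structured(RAW_ORDERS)
analytics = to_analytics(structured)
```

In a production pipeline each layer would be a persisted table rather than an in-memory list, but the shape of the work, validate, standardize, then aggregate, is the same.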

 

Core Technology stack

Databases:

  • Oracle RDBMS (for OLTP): Expert SQL for complex queries, DML, DDL.
  • Oracle Exadata (for OLAP/Data Warehouse): Advanced SQL optimized for analytical workloads. Experience with data loading techniques and performance optimization on Exadata.
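The analytical SQL called for here can be illustrated with a small, portable example. This sketch uses Python's built-in sqlite3 purely as a stand-in for Oracle/Exadata; the window-function query itself is standard SQL of the kind used on analytical workloads, and the table and data are hypothetical:

```python
import sqlite3

# In-memory database stands in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, product TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('EMEA', 'widget', 500.0),
        ('EMEA', 'gadget', 300.0),
        ('APAC', 'widget', 200.0),
        ('APAC', 'gadget', 450.0);
""")

# Rank products by revenue within each region (an analytic/window query).
rows = conn.execute("""
    SELECT region, product, revenue,
           RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rnk
    FROM sales
""").fetchall()

# Top-selling product per region.
top_per_region = {r[0]: r[1] for r in rows if r[3] == 1}
```

The same `RANK() OVER (PARTITION BY ...)` construct runs on Oracle; on Exadata the optimizer can offload such analytic scans to storage, which is where the performance-tuning experience mentioned above comes in.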

Storage:

  • S3-Compatible Object Storage (On-Prem): Proficiency with S3 APIs for data ingest, retrieval, and management.

Programming & Scripting:

  • Python: Core language for ETL/ELT development, automation, and data manipulation.
  • Shell Scripting (Linux/Unix): Bash/sh for automation, file system operations, and job control.

Version Control: 

  • Git: Managing all code artifacts (SQL scripts, Python code, configuration files).

Related Technologies & Concepts:

  • Data Pipeline Orchestration Concepts: Understanding of scheduling, dependency management, monitoring, and alerting for data pipelines.
  • Containerization: Docker, plus a basic understanding of how containerization works.
  • API Interaction: Understanding of REST APIs for data exchange (e.g., integrating with Java Spring Boot microservices).
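The orchestration concepts above (scheduling and dependency management) can be sketched with Python's standard-library graphlib. This is a toy illustration, not a substitute for a real orchestrator such as Airflow or Databricks Workflows, and the task names are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each task maps to the set of tasks it
# depends on (its predecessors).
DAG = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

def run_pipeline(dag, runner):
    """Execute tasks in dependency order; a failed task halts the run
    (a stand-in for real monitoring/alerting)."""
    executed = []
    for task in TopologicalSorter(dag).static_order():
        if not runner(task):
            raise RuntimeError(f"task {task} failed; alerting on-call")
        executed.append(task)
    return executed

# Trivial runner that "succeeds" at every task.
order = run_pipeline(DAG, runner=lambda task: True)
```

Real orchestrators add retries, backfills, and per-task alerting on top of exactly this dependency-ordering core.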

Location

  • Remote across Poland

 

Why Method?

We look for individuals who are smart, kind and brave. Curious people with a natural ability to think on their feet, learn fast, and develop points of view for a constantly changing world find Method an exciting place to work. Our employees are excited to collaborate with dispersed and diverse teams that bring together the best in thinking and making. We champion the ability to listen and believe that critique and dissonance lead to better outcomes. We believe everyone has the capacity to lead, and we look for proactive individuals who can take and give direction, lead by example, and enjoy the making as much as the thinking, especially at senior and leadership levels.

Next Steps

If Method sounds like the place for you, please submit an application. Also, let us know if you have a presence online with a portfolio, GitHub, Dribbble, or another platform.

 

* For information on how we process your personal data, please see Privacy: https://www.method.com/privacy/

Published 23 May