
MAPEGY
We are a leading provider of Business Intelligence in Innovation, Research & Development. Our platform helps enterprises uncover trends, track technology, and analyze competitors – easier, faster, and smarter.
- We prioritize expertise and results over location or contract type.
- Work from anywhere, ideally within European time zones to facilitate collaboration.
- Join a highly skilled, passionate team that thrives on data and innovation.
- Enjoy fair compensation, professional growth, and an engaging work environment.
-
Senior Data Engineer
Part-time · Full Remote · Worldwide · Product · 5 years of experience · Intermediate

About the Role
We're looking for a Senior Data Engineer to help us optimize SQL-heavy workflows, improve ETL processes, and streamline data pipelines. This is an opportunity for a highly experienced expert, freelancer, or agency to collaborate with us in an impactful way. While we offer flexibility in working arrangements, we are invested in a long-term collaboration with the right professional.

If you have deep technical expertise, a problem-solving mindset, and a proven ability to deliver results, we'd love to hear from you!
What You'll Do
- Unify and optimize data infrastructure to enhance efficiency and maintainability.
- Redesign and streamline ETL workflows for improved scalability and performance.
- Ensure data integrity, reliability, and accessibility to support business needs.
- Optimize PostgreSQL queries and indexing strategies to improve system performance.
- Automate and orchestrate workflows using Python and Apache Airflow.
- Act as a subject-matter expert, guiding the team on best practices in data engineering.

What We're Looking For
Must-Have Skills:
- Expertise in SQL and PostgreSQL, with a strong focus on query optimization.
- Hands-on experience in ETL pipeline design, data transformation, and workflow automation.
- Proficiency in Python for scripting and automation.
- Deep knowledge of Apache Airflow for orchestrating data workflows.
- Strong understanding of data cleaning, deduplication, and integration best practices.
- Ability to collaborate effectively within European time zones.

Bonus Skills (Nice to Have):
+ Experience with document databases for handling large-scale data.
+ Strong problem-solving skills and ability to troubleshoot data issues.
+ A good sense of data architecture best practices for efficiency.

Your Background
5+ years of experience in data engineering, working with SQL-heavy workflows, ETL pipelines, and PostgreSQL-based systems.