Data Engineer
PIN-UP Global is an international holding specializing in the development and implementation of advanced technologies, B2B solutions and innovative products for the iGaming industry. We certify and license our products, providing the holding's customers and partners with high-quality and reliable solutions.
We are looking for a Data Engineer to join our team!
Requirements:
- 3+ years of experience as a Data Engineer, particularly with data integration.
- Proficiency in Python and SQL for data manipulation and processing.
- Hands-on experience with Snowflake or other databases for data warehousing solutions.
- Experience with Data Build Tool (DBT) for data transformation and modeling.
- Familiarity with orchestration and deployment tools such as Airflow and ArgoCD.
- Knowledge of OpenMetadata for managing metadata within data projects.
- Experience with GitLab for version control and collaboration.
- Familiarity with NATS or other message brokers for data streaming.
Will be a plus:
- Familiarity with machine learning frameworks and their integration with data pipelines.
- Familiarity with cloud services such as AWS for infrastructure management.
Responsibilities:
- Build and optimize scalable data pipelines to integrate datasets from multiple sources (APIs, databases, NATS events) into Snowflake, ensuring compliance with both functional and non-functional requirements.
- Provide the technology, tools, and curated datasets that data scientists and business intelligence analysts need to work effectively.
- Implement and manage a comprehensive data quality framework that monitors data integrity and quality in collaboration with data source teams, addressing any issues promptly.
- Collaborate with team members to identify, troubleshoot, and resolve infrastructure issues that may impact data accessibility and performance.
Technical Stack:
Programming languages:
- Python (for data manipulation, ETL processes, and scripting)
- SQL (for querying and managing data within databases)
Data warehousing:
- Snowflake (primarily) or other databases (e.g., ClickHouse)
Data orchestration tools:
- Apache Airflow (for scheduling and automating complex data pipelines)
- ArgoCD (for continuous delivery of Kubernetes applications)
Data transformation:
- Data Build Tool (DBT) (for modeling and transforming data inside the warehouse)
Metadata management:
- OpenMetadata (for managing and documenting metadata across data projects)
Version control and collaboration:
- GitLab (for version control, code collaboration, and project management)
Messaging and data streaming:
- NATS or other message brokers (for real-time data streaming and event-driven architecture)
Cloud platform:
- AWS, specifically S3
Our benefits to you:
🍀 An exciting and challenging job in a fast-growing holding and the opportunity to be part of a multicultural team of top professionals in Development, Architecture, Management, Operations, Marketing, Legal, Finance and more
🤝🏻 A great working atmosphere is guaranteed, with passionate experts and leaders who share a friendly culture and a success-driven mindset
🧑🏻‍💻 Modern corporate equipment based on macOS or Windows, as well as additional equipment, is provided
🏖️ Paid vacations, sick leave, personal events days, days off
💵 Referral program: enjoy cooperating with colleagues you refer and get a bonus
📚 Educational programs: regular internal training sessions, compensation for external education, and attendance at specialized global conferences
🎯 Rewards program for mentoring and coaching colleagues
🗣️ Free internal English courses
🦄 Multiple internal activities: an online platform for employees with quests, gamification, presents and news; PIN-UP clubs for movie, book and pet lovers; and more
🎳 Other benefits could be added based on your location