Senior Data Engineer
We are looking for a Senior Data Engineer to join AltexSoft and strengthen our data engineering practice on a large-scale, high-impact project within the travel and hospitality domain. In this role, you will work hands-on with complex data ecosystems, taking responsibility for the stability, scalability, and performance of mission-critical data integrations.
You will play a key role in resolving non-trivial technical issues, improving existing pipelines, and evolving engineering processes to make data operations more resilient and proactive. The position offers a strong technical challenge and real influence on systems that process massive volumes of data used by analytics platforms and customer-facing products.
You Have
- 5+ years of experience with Python and a proven track record of working with large-scale datasets.
- Experience with PMS (property management system) integrations and/or the hospitality domain.
- Solid background in designing, building, and maintaining data processing pipelines.
- Experience with cloud platforms (GCP, AWS, or Azure).
- Hands-on skills with SQL and data storage/querying systems (e.g., BigQuery, Bigtable, or similar).
- Knowledge of containerization and orchestration tools (Docker, Kubernetes).
- Ability to troubleshoot and debug complex technical issues in distributed systems.
- Strong communication skills in English, with the ability to explain technical details to both technical and non-technical stakeholders.
- Experience using AI coding assistants (e.g., Cursor, GitHub Copilot, or similar) in day-to-day development tasks.
- Experience with Google Cloud services such as Pub/Sub, Dataflow, and ML-driven data workflows (see the sketch after this list).
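
To give candidates a concrete sense of the kind of work these requirements describe, here is a minimal Python sketch of a Pub/Sub-to-BigQuery ingestion flow. It is illustrative only, not the project's actual stack or code; the project, subscription, and table identifiers are invented placeholders.

```python
import json
from concurrent.futures import TimeoutError

from google.cloud import bigquery, pubsub_v1

PROJECT_ID = "example-project"                           # hypothetical
SUBSCRIPTION_ID = "booking-events-sub"                   # hypothetical
TABLE_ID = "example-project.hospitality.booking_events"  # hypothetical

bq_client = bigquery.Client(project=PROJECT_ID)
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
    """Parse one booking event and stream it into BigQuery."""
    row = json.loads(message.data.decode("utf-8"))
    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if errors:
        # Nack so Pub/Sub redelivers the message instead of silently dropping it.
        message.nack()
    else:
        message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=handle_message)
try:
    # Block the main thread; this sketch stops listening after 60 seconds.
    streaming_pull.result(timeout=60)
except TimeoutError:
    streaming_pull.cancel()
    streaming_pull.result()  # wait for the shutdown to complete
```

In production this kind of flow would typically run on Dataflow or another managed runner with retries, dead-lettering, and monitoring; the sketch only shows the basic shape.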
Would be a plus
- Experience with airline, travel, or hospitality-related datasets.
- Exposure to observability and monitoring tools for large-scale data systems.
- Experience building AI-powered solutions or integrating AI pipelines/APIs into software projects.
- Experience with second-tier PMS vendors such as Tesipro or Maestro, or with any property management system APIs (see the hypothetical sketch after this list).
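
For candidates less familiar with PMS work, the following purely hypothetical Python sketch shows what a small PMS integration step can look like: polling a vendor's REST API for reservations and normalizing them for downstream pipelines. The endpoint, authentication scheme, and field names are invented; real vendors (including Tesipro and Maestro) expose different APIs.

```python
import requests

PMS_BASE_URL = "https://pms.example.com/api/v1"  # hypothetical endpoint
API_TOKEN = "vendor-issued-token"                # hypothetical auth scheme

def fetch_reservations(hotel_id: str, since_iso: str) -> list[dict]:
    """Return reservations updated since the given ISO-8601 timestamp."""
    response = requests.get(
        f"{PMS_BASE_URL}/hotels/{hotel_id}/reservations",
        params={"updated_since": since_iso},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    # Map vendor-specific field names onto a common internal schema.
    return [
        {
            "reservation_id": item["id"],
            "check_in": item["arrivalDate"],
            "check_out": item["departureDate"],
            "status": item["status"],
        }
        for item in response.json()["reservations"]
    ]

# Example usage (hypothetical identifiers):
# rows = fetch_reservations("hotel-123", "2024-01-01T00:00:00Z")
```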
You Are Going To
- Maintain and enhance existing data integrations, ensuring the reliability, accuracy, and quality of incoming data (an example of this kind of quality check follows this list).
- Lead the investigation and resolution of complex incidents by performing deep technical analysis and debugging.
- Communicate effectively with stakeholders (including customer-facing teams and external partners) by providing transparent and timely updates.
- Collaborate with partners to troubleshoot integration issues and ensure smooth data flow.
- Identify opportunities to improve processes, tooling, and documentation to scale and streamline data operations.
- Contribute to the design and delivery of new data engineering solutions supporting business-critical systems.
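
As an illustration of the first responsibility above, here is a small, hypothetical data-quality check in Python against BigQuery that flags duplicate booking records for a given ingestion date. The dataset, table, and column names are invented for illustration.

```python
from google.cloud import bigquery

client = bigquery.Client()

DUPLICATE_CHECK = """
    SELECT booking_id, COUNT(*) AS copies
    FROM `example-project.hospitality.booking_events`
    WHERE DATE(ingested_at) = @run_date
    GROUP BY booking_id
    HAVING COUNT(*) > 1
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01")]
)

duplicates = list(client.query(DUPLICATE_CHECK, job_config=job_config).result())
if duplicates:
    # In a real pipeline this result would feed alerting/observability tooling,
    # not stdout.
    print(f"{len(duplicates)} booking_ids were ingested more than once on the run date")
```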
Required skills and experience
| Skill | Experience |
| --- | --- |
| Python | 4.5 years |
| Docker | 2 years |
| Data Engineering | 5 years |
| GCP (Google Cloud Platform) | 2 years |
Required languages
| Language | Level |
| --- | --- |
| English | B2 - Upper Intermediate |