Middle/Senior Big Data Engineer (Big Data Competence Center) (offline)

REQUIREMENTS
- 4+ years of development experience with Java or Python
- Solid knowledge in algorithms and data structures
- Experience with AWS — Glue, Lambda, Step Functions, S3, etc.
- Experience developing data pipelines based on a mainstream framework such as Spark, Flink, or Presto
- Experience developing data lakes and data warehouses based on mainstream technologies such as Hive, Snowflake, Hudi, or other OLAP engines
- Knowledge in SQL and solid experience with NoSQL or RDBMS technologies
- Experience with data integrations and various data formats: CSV, Protobuf, Parquet, Avro, ORC, etc.
- Solid understanding of technical excellence and hands-on experience with code reviews, test development, and CI/CD

WOULD BE A PLUS:
- Experience building a data platform using cloud provider services (GCP or Azure)
- Experience developing Snowflake-driven data warehouses
- Experience developing event-driven data pipelines

RESPONSIBILITIES
- Contributing to investigations of new technologies and the design of complex solutions, supporting a culture of innovation while considering security, scalability, and reliability
- Working with a modern data stack, coming up with well-designed technical solutions and robust code, and implementing data governance processes
- Working and communicating professionally with the customer’s team
- Taking responsibility for delivering major solution features
- Participating in the requirements gathering and clarification process, proposing optimal architecture strategies, and leading the data architecture implementation
- Developing core modules and functions, designing scalable and cost-effective solutions
- Performing code reviews, writing unit and integration tests
- Scaling the distributed system and infrastructure to the next level
- Building a data platform using the power of modern cloud providers (AWS/GCP/Azure)

EXTRA RESPONSIBILITIES:
- Developing Micro Batch/Real-Time streaming pipelines (Lambda architecture)
- Working on POCs for validating proposed solutions and migrations
- Leading the migration to a modern technology platform, providing technical guidance
- Adhering to CI/CD methods and helping to implement best practices in the team
- Contributing to unit growth, mentoring other members in the team (optional)
- Owning the whole pipeline and optimizing the engineering processes
- Designing complex ETL processes for analytics and data management, and driving their large-scale implementation

About Sigma Software

Sigma Software is a place where Nordic Traditions meet Ukrainian Spirit to create Superior Software. We combine the best practices and approaches of Swedish and Ukrainian cultures: from the Swedish side we take high demands for quality, minimal hierarchy, freedom of decision-making, and attention to every opinion, while as a Ukrainian company we bring flexibility and dedication to every project and every customer.

We are an IT consulting and software product company with development offices in Ukraine, Sweden, the USA, Canada, Poland, and Australia. We deliver the smartest solutions to our customers in areas such as government, telecommunications, advertising, automotive, and gaming. As part of Sigma Group, one of the largest Nordic IT corporations, we are a global player with more than 5,000 employees in 12 countries, including over 1,500 in Ukraine.

We work with startups, software houses, and enterprises, providing the products and services that suit our clients best. The company’s R&D centers are mastering trending technologies and directions: Infotainment, AR/VR, Blockchain, Machine Learning, Data Science, Artificial Intelligence, and others.

Company website:
https://career.sigma.software/

DOU company page:
https://jobs.dou.ua/companies/sigma-software/

The job ad is no longer active
Job unpublished on 22 February 2023
