Our customer is a Fortune 500 company. As a leading business-to-business organization, it serves more than 3.2 million customers who rely on its products in categories such as safety, material handling, and metalworking, along with services like inventory management and technical support.
Skills:
- Hadoop experience, particularly familiarity with HDFS and Hive
- Python (PySpark a plus) and Java
- AWS experience, especially EC2, S3, and EMR (CloudFormation or other infrastructure scripting a plus)
- SQL
- Test-driven development (unit testing)
- RDBMS/ETL experience
- Streaming data (Kafka, Kinesis, etc.)
- CI/CD, Docker
- ETL orchestration (Airflow)
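To illustrate the test-driven development expectation above, here is a minimal sketch (a hypothetical example, not code from the customer's systems): a small ETL-style record-normalization function developed alongside its unit test, using only Python's standard `unittest` module.

```python
import unittest

def normalize_record(record):
    """Trim whitespace from values and lowercase/trim keys in a raw input record.

    Hypothetical helper for illustration only; assumes string keys and values.
    """
    return {key.strip().lower(): value.strip() for key, value in record.items()}

class NormalizeRecordTest(unittest.TestCase):
    def test_keys_are_lowercased_and_trimmed(self):
        raw = {" Name ": " Widget ", "SKU": "A-1 "}
        self.assertEqual(normalize_record(raw),
                         {"name": "Widget", "sku": "A-1"})

if __name__ == "__main__":
    # argv/exit kept explicit so the file also runs cleanly outside a terminal
    unittest.main(argv=["normalize_record_test"], exit=False)
```

In TDD, the test above would be written first and fail until `normalize_record` is implemented; each new edge case (empty keys, non-string values) starts as a new failing test.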
Responsibilities:
- Build and maintain AWS big data solutions.
- Perform data conversions, imports, exports, and transformations.
- Design and implement processes to ensure data integrity and standardization.
- Continuously improve the Hadoop environment so that it is highly performant and provides an optimal end-user experience.
- Implement data reliability, efficiency, and quality improvements.
- Bring complex concepts into the organization and mentor others.
- Learn independently as well as from others, and push for improvement by bringing new ideas into the organization.
We offer:
Flexible working hours
A competitive salary and good compensation package
18 business days of paid annual leave
10 days of paid sick leave
An IT Club membership card that provides discounts
A masseur and a corporate doctor
An inspiring and comfy office
Regular office fruit delivery
Professional growth:
Challenging tasks and projects
An individual development plan
A personal education budget
A regular performance appraisal
Meetups and events for professional development
Mentorship opportunities
Business trips
Fun:
Corporate events and outstanding parties
Exciting team buildings
Memorable anniversary presents
A fun zone where you can play video games, football, ping pong, and more