Senior Big Data Developer (offline)

Project Description:
There are millions of cars on the road today with solutions designed & implemented by us focusing on Autonomous Drive, Embedded Applications, Digital Cockpit, Connected Mobility & overall excellence in delivery.

Software inside vehicles used to be a self-contained environment. Equipping cars with perception, intelligence & connecting them changed a lot for the vehicle manufacturer. Our goal is to empower our customers with smart solutions to help them develop the cars of the future.

About the Project:
The Big Data Developer is part of the worldwide (WW) offering build team and works with the respective automotive accounts, Hadoop experts, experienced software developers, data scientists, and technology experts to deliver analytic solutions for our customers.

The goal of the program is to solve the 'billion-mile validation challenge' by providing end-to-end features and services for Level 3+ Autonomous Driving software development and validation, deployable as a collaboration ecosystem accessible to the automotive market worldwide.
Responsibilities:
• Develop a data life cycle management engine to handle hundreds of petabytes of autonomous driving data, meeting regulatory compliance and data retrieval requirements
• R&D-driven feature development and enhancement in data tiering, backup/archival, deletion, mirroring/cross-sharing, and governance across multiple environments (MapR, Azure Blob, Cloudera, etc.)
• Build innovative features for auditing, full data lineage throughout the entire data lifecycle, and metadata-driven exploration (e.g. file age, sensor topics)
• Performance benchmarking, data auditing reports, and statistics
• Design and develop prototypes and proof-of-concept solutions in support of presales activities, and support the development of standardized analytic offerings across multiple industries
• Establish demos to showcase hybrid data life cycle management capabilities for DXC clients as part of business development, presales, and PoCs
Mandatory Skills:
• Bachelor's or Master's degree in informatics, business informatics, business administration, or an equivalent qualification
• At least 2 years of experience with Hadoop distributions and innovative technologies in the Hadoop ecosystem (e.g. Spark, Hive, HBase)
• Proven experience in all phases of a Big Data project: concept and design, development, implementation, change, and operation
• At least 5 years of experience in Java
• Advanced know-how in Python
• Advanced know-how in Hadoop components such as HDFS and YARN
• Experience with Jenkins, Git, SQL, and REST APIs
• Ability to determine the impact of architectural solutions and recommendations
• Experience in maintenance and enhancement of deployed products
• Ability to work in a fast-paced environment with diverse cultures
• Able to communicate confidently with internal and external stakeholders
Nice-to-Have Skills:
• Knowledge and basic experience in Airflow, Docker, Kubernetes

About Luxoft

Luxoft is a high-end application outsourcing provider of choice and a trusted technology advisor to Global 2000 and medium-sized growth companies that apply compelling technologies to obtain leadership positions in their respective markets.
Luxoft attracts top talent and offers career growth and employment benefits. Our teams are involved in highly complex, innovative projects for leading companies around the globe.

Company website:
https://career.luxoft.com/locations/ukraine/

DOU company page:
https://jobs.dou.ua/companies/luxoft/
