Senior Data Engineer

What’s in it for you?
• Working with cutting-edge data streaming technologies
• Helping to disrupt a century-old industry in a startup environment
• Having a direct influence on how we build our data streaming platform at 90poe
• Growing and developing your core skills
• Delivering a greenfield system
• Working with a diverse, multicultural team in an agile environment
• A variety of knowledge-sharing and self-development opportunities
• Competitive salary
• State-of-the-art, centrally located offices with a warm atmosphere and good working conditions
• Opportunities to travel to the London office
• Occasional visits to vessels to see how our software and hardware are used in the real world
• Experiencing firsthand our squad-chapter-guild workflow model, our version of the Spotify model

Responsibilities
• Work closely with the CTO
• Contribute to data pipeline design, development, and monitoring
• Develop Java-based Kafka Streams applications (a minimal sketch follows this list)
• Supervise managed Kafka and Elasticsearch clusters in production
• Conform to company-wide code standards and tech culture
• Own the full lifecycle of data pipelines, taking the services you build from design through implementation and into production
• Design solutions for monitoring data pipeline operations, with proper alerting
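
To give a flavour of this work, here is a minimal Kafka Streams DSL sketch in Java. The topic names (vessel-positions-raw, vessel-positions-clean), the application id, and the filtering logic are illustrative assumptions only, not a description of the actual 90poe pipelines.

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Properties;

public class VesselPositionFilterApp {

    public static void main(String[] args) {
        // Basic Streams configuration; the application id and broker address are placeholders.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "vessel-position-filter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw position reports keyed by vessel ID, drop empty payloads,
        // and forward the remaining records to a cleaned topic.
        KStream<String, String> positions = builder.stream(
                "vessel-positions-raw",
                Consumed.with(Serdes.String(), Serdes.String()));

        positions
                .filter((vesselId, payload) -> payload != null && !payload.isBlank())
                .to("vessel-positions-clean",
                        Produced.with(Serdes.String(), Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the topology cleanly on JVM shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

Real topologies in this role involve more than a simple filter (joins, windowed aggregations, Avro serdes via the Schema Registry), but the overall shape of a Streams application is broadly the same.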

Requirements
• Full hands-on development experience
Proficiency in:
o Java and/or Golang
o Developing with the latest Java version, building with Gradle, and testing and deploying your own code into production
o Kafka technologies (Kafka Streams DSL, Processor API, Kafka Connect, Avro Schema Registry)
o Elasticsearch
o Code & systems testing
o RDBMS and NoSQL databases
o Kubernetes and Docker
o Advanced use of git
o Use of Unix/Linux shell commands
o Microservices architecture concepts
o Event-driven paradigm
o Evaluating/designing/building data solutions for operations & support (i.e. metrics, tracing, logging)
Understanding of:
o Protobuf/gRPC
o Best practices in scaling & monitoring data pipelines

Nice to have
• AWS stack experience
• Ability to perform basic DevOps tasks
• Python and related data science packages such as pandas/numpy/scikit-learn
• Basic data analysis techniques
• Understanding of statistics

Experience
A demonstrated track record and proficiency in the following:
• Delivering features autonomously with a high degree of team coordination
• Delivering code both from a precise architecture spec and without relying on a detailed spec or requirements
• Automated testing
• Working with CI and GitOps practices
• Delivering code to production
• Maintaining production-ready code
• Collaborating in small but fast-paced teams
• Event-driven architecture and message passing

More about you
• Good level of English
• Willingness to learn and an open mind about new technologies
• Confidence operating in a fast-paced environment
• A collaborative approach and willingness to engage in an environment of active idea sharing
• Ability to learn autonomously
• Excellent all-round communication skills

About Ninety Percent of Everything

Our goal is to revolutionize the maritime industry by creating a suite of comprehensive software and hardware solutions commercialized under the SaaS model. Over the next couple of years, our squads will build more than 30 products from the ground up, covering everything from global vessel tracking to vessel performance analysis, crew allocation optimization, and much more. This is an exciting and challenging opportunity to apply cutting-edge technology to transforming an iconic industry.

Our tech stack consists of React, React Native, and Flutter applications communicating via GraphQL with microservice containers orchestrated by Kubernetes. The majority of our services are written in Golang, with stream processing in Java. They use gRPC for communication, achieve high scalability through an Apache Kafka-based event-driven architecture, and persist data to a mix of RDBMS and NoSQL databases, including PostgreSQL, MongoDB, Cassandra, S3, and Elasticsearch. We follow CI/CD and agile methodologies to deploy to production multiple times per week.
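
As a rough, hypothetical illustration of the event-driven pattern described above, the Java sketch below publishes a single keyed event to Kafka. The topic name, vessel identifier, and JSON payload are invented for the example; a production service here would more likely serialize events with Avro via the Schema Registry mentioned in the requirements rather than send raw JSON from a standalone main method.

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class VesselEventPublisher {

    public static void main(String[] args) {
        // Minimal producer configuration; the broker address is a placeholder.
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by vessel ID keeps all events for one vessel on the same partition,
            // which preserves per-vessel ordering for downstream consumers.
            ProducerRecord<String, String> event = new ProducerRecord<>(
                    "vessel-events",
                    "vessel-123",
                    "{\"type\":\"POSITION_REPORTED\",\"lat\":51.5,\"lon\":-0.1}");
            producer.send(event);
        }
    }
}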

Company website:
https://www.90poe.io/

DOU company page:
https://jobs.dou.ua/companies/studio53/
