You will be working in the maritime industry, developing next-gen vessel engine optimization and predictive-maintenance solutions.
About the client
The client is a dynamic, young, and fast-growing company that provides a broad set of products and services for maritime vessels, including predictive maintenance and advanced cloud-based monitoring tools for engine operations and other vessel equipment. The company positions itself as a leader in advanced engine optimization backed by cloud computing and pioneers the digitalization of vessels and fleets globally.
You will be part of the Scrum team, which is in charge of technical analysis, data-algorithm development, implementing software functionality, and deploying the solution to vessels. To continuously improve the way the client delivers customer-focused insights and value, our team develops the science behind the engines that power and run the most efficient vessels on earth.
We are looking for a strong Python generalist with a good understanding of modern DevOps practices. We need someone who can continuously deliver quality code from product ideation to production delivery. Cloud and on-premise infrastructure are at the core of our company, so you will be a good fit if you know how to maintain and develop container-based applications.
Requirements and Qualifications
* 3 years of hands-on, active development experience
* Superior understanding and consistent application of software development best practices
* Proficiency in Python and related frameworks and technologies: Flask, Dash, NumPy, pandas
* Strong database skills: big data, SQL, NoSQL
* Message Queues expertise (e.g., RabbitMQ, Kafka, Kinesis, or similar)
* Proficiency in code-quality best practices (TDD, BDD)
* Confidence in container technologies (Docker, Kubernetes)
* Experience with Ansible / Terraform is a big plus
* Proficiency in written and spoken English
Duties and Responsibilities
* Create detailed designs and solutions from defined business requirements
* Build dashboards with the Dash Python library
* Work with the time-series stack: Telegraf, InfluxDB, Kapacitor, Grafana
* Build data pipelines on top of Kafka message queues to consume and process data through pre-defined research code
* Support code quality across all projects by following unit- and integration-testing best practices
* Develop estimates and plans and ensure the team meets these targets
* Contribute to the infrastructure activities on the cloud
* Enforce best practices within your team for all aspects of software development
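To give a flavor of the pipeline work described above, here is a minimal sketch of consuming telemetry messages and running them through a piece of "pre-defined research code" with pandas. The column names, the 20-degree threshold, and the stubbed message list are all illustrative assumptions; in production the JSON payloads would be consumed from a Kafka topic (e.g. via confluent-kafka or kafka-python) and the results fed to a Dash dashboard.

```python
import json

import pandas as pd


def flag_spikes(df: pd.DataFrame, window: int = 3,
                threshold: float = 20.0) -> pd.DataFrame:
    """Stand-in for research code: flag readings that deviate sharply
    from a rolling median baseline (threshold chosen for illustration)."""
    baseline = df["exhaust_temp"].rolling(window, min_periods=1).median()
    return df.assign(
        baseline=baseline,
        spike=(df["exhaust_temp"] - baseline).abs() > threshold,
    )


# Stubbed stand-in for messages consumed from a Kafka topic.
raw_messages = [
    json.dumps({"ts": f"2021-03-2{i}", "exhaust_temp": t})
    for i, t in enumerate([410.0, 412.0, 411.0, 495.0, 413.0])
]

records = [json.loads(m) for m in raw_messages]
df = flag_spikes(pd.DataFrame(records))
print(df[["ts", "exhaust_temp", "spike"]])
```

The rolling median rather than a mean keeps the baseline robust to the very spikes being detected, so a single outlier does not mask itself.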
We are the techflower team. We help international businesses and startups launch remote software engineering teams and open efficient software R&D offices in Ukraine.
We're located in Kyiv, the capital of Ukraine, Europe's fastest-growing tech center.
Since 2018, we have helped 15 clients from Europe, the US, and Singapore open R&D offices in Kyiv, driving significant transformation and results for them.
Job posted on
22 March 2021