The company is looking for a self-motivated, goal-oriented Big Data Engineer who would like to be part of a team working on brand-new startup projects for international customers.
ABOUT THE PROJECT
Antidote is revolutionizing telemedicine, creating an environment for doctors to remotely treat patients for various health conditions at affordable prices. The company collects and leverages large amounts of health data to create analytical and AI-based solutions for its products. If you are eager to dive in and learn many new things, you belong with us.
People-oriented management without bureaucracy
A friendly atmosphere inside the company, confirmed by former employees frequently returning
Flexible working schedule
Paid time off (18 working days per year, plus all national holidays and 9 sick days)
Laptop of your choice: MacBook Pro or Windows/Linux business laptop + large extra screen
Full financial and legal support for private entrepreneurs
Free English classes with native speakers or with Ukrainian teachers (your choice)
Comfortable, pet-friendly offices in the city center
Your choice of workspace: fully remote, or a combination of home and one of our development offices, with the option to rotate between offices (Kyiv, Kharkiv)
Relocation support: if you’d like to move to Kyiv or Kharkiv, we’ll do our best to make your relocation smooth and effortless
Last but not least, regular team-building activities and corporate events
Develop the company’s DWH and data infrastructure to ensure data consistency and enable easy, reliable data extraction for analytical and ML purposes.
Build data infrastructure architecture, ETL processes, and data integration.
Additionally, since the company receives large amounts of data from its data partners, you will be responsible for building supporting tools to ingest that data into our infrastructure and DWH, as well as for building data validation pipelines.
You will work with tabular, JSON, and textual data formats, integrating several cloud servers, various files, and stand-alone data sources.
A minimum of 3 years of proven hands-on experience as a Big Data engineer: building data pipelines, ETL processes, and data integration pipelines
Proficiency with Python and SQL – a must
Experience working with both SQL and NoSQL databases
Experience working with cloud environments
Experience with stream processing, such as Kafka/Pub/Sub/Kinesis
Experience with data orchestration tools such as Airflow/Argo
Bachelor's degree in Engineering/Computer Science/related discipline
English level: intermediate or higher
WILL BE A PLUS
Experience working with CI/CD
Experience working with GCP tools and Grafana
All Technical Assistance
We deliver IT solutions that help companies improve business efficiency and quality.
Job posted on
20 July 2021