We’re here to accelerate the journey to the cloud, improve business operations, and provide our customers with exciting, innovative solutions.

Come join us!

What will your job look like?

— Design, create, enhance, and support Hadoop data pipelines for different domains using Big Data technologies (see the sketch after this list).
— Own data transformation, data models, schemas, metadata, and workload management.
— Perform development and deployment tasks: code, unit test, and deploy.
— Analyze applications and propose technical solutions for enhancements, for example by optimizing existing ETL processes.
— Handle and resolve production issues (Tier 2 and weekend support) and ensure SLAs are met.
— Create the necessary documentation for all project deliverable phases.
— Collaborate with stakeholders across disciplines: the development team, business analysts, infrastructure, information security, and end users.
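
To give a concrete flavor of the pipeline work, here is a minimal sketch of a batch Hive rollup in PySpark. It is illustrative only: the job name, HDFS path, column names, and target table are hypothetical examples, not actual Amdocs systems.

    # Minimal sketch of a batch Hadoop pipeline: read raw events from HDFS,
    # transform them, and load a Hive table. All names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = (SparkSession.builder
             .appName("daily_usage_rollup")   # hypothetical job name
             .enableHiveSupport()
             .getOrCreate())

    # Extract: raw events landed on HDFS by an upstream ingest job.
    raw = spark.read.parquet("hdfs:///data/raw/usage/dt=2024-01-01")  # hypothetical path

    # Transform: drop bad records and aggregate per subscriber.
    rollup = (raw
              .filter(F.col("duration_sec") > 0)
              .groupBy("subscriber_id")
              .agg(F.sum("duration_sec").alias("total_duration_sec"),
                   F.count("*").alias("event_count")))

    # Load: publish a Hive table for downstream BI tools.
    rollup.write.mode("overwrite").saveAsTable("analytics.daily_usage_rollup")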


All you need is...

— 4+ years of experience with Hadoop architecture and related Big Data tools such as HBase, HDFS, Hive, and MapReduce.
— Strong experience with Hadoop administration (Ambari, etc.).
— Strong experience with Kafka message queuing, Apache NiFi stream data integration, RESTful APIs, and open systems (a streaming sketch follows this list).
— Strong experience in object-oriented/functional scripting using languages such as R, Python, Scala, or similar.
— Strong experience with Informatica BDM (Big Data Management) for implementing complex data transformations.
— Strong ability to design, build, and manage data pipelines in Python and related technologies.
— Hands-on experience with SQL, Unix, and advanced Unix shell scripting.
— Experience working with Data Governance, Data Quality, and Data Security teams and standards.
— Superb communication and collaboration skills.
— An independent, self-learning attitude.
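
On the streaming side, here is a minimal sketch of consuming a Kafka topic with the kafka-python client. The topic, broker, and consumer-group names are hypothetical; in production the batches would be flushed to HDFS or Hive rather than printed.

    # Minimal sketch of the Kafka side of the role: consume JSON events
    # and collect them into batches. All names are hypothetical.
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "usage-events",                       # hypothetical topic
        bootstrap_servers=["broker1:9092"],   # hypothetical broker
        group_id="etl-rollup",                # hypothetical consumer group
        auto_offset_reset="earliest",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )

    batch = []
    for message in consumer:
        batch.append(message.value)
        if len(batch) >= 1000:
            # A real pipeline would write this batch to HDFS/Hive here.
            print(f"flushing batch of {len(batch)} events")
            batch.clear()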

Good-to-have skills:

— Knowledge of handling XML, JSON, structured, fixed-width, and unstructured files using custom Pig/Hive scripts (a short parsing sketch follows this list).
— Understanding of or experience with data virtualization tools such as TIBCO DV and Denodo.
— Understanding of or experience with data governance tools such as Informatica Data Catalog.
— Front-end experience working with data discovery, analytics, and BI tools such as Tableau and Power BI.
— Knowledge of any cloud platform (AWS/Azure/GCP) is a plus.
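
As a small illustration of the file-handling item above, here is a sketch of parsing fixed-width records in plain Python; the field offsets are invented for the example.

    # Minimal sketch of slicing a fixed-width record into named fields.
    # The field offsets below are invented for illustration.
    FIELDS = {"subscriber_id": (0, 10), "plan_code": (10, 14), "usage_mb": (14, 22)}

    def parse_line(line: str) -> dict:
        """Slice one fixed-width record into a dict of stripped fields."""
        return {name: line[start:end].strip() for name, (start, end) in FIELDS.items()}

    print(parse_line("0000012345GOLD00001024"))
    # -> {'subscriber_id': '0000012345', 'plan_code': 'GOLD', 'usage_mb': '00001024'}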

Why you will love this job:

— You will be challenged to design and develop new software applications.
— You will have the opportunity to work in a growing organization, with ever-growing opportunities for personal growth.
— You will be part of an international expert team in the CTO line of the Amdocs IT/MIS organization.
— You will be able to work remotely from any location.
— You will have health insurance with premium-level clinics.
— Last but not least: high compensation for your efforts.

About X1 Group

Since 2010, X1 Group has been building R&D offices and dedicated development teams in Ukraine for EU and US tech startups. The company is headquartered in Berlin, Germany, and has a development office in Kharkiv, Ukraine.

Company website:
https://careers.x1group.com/

DOU company page:
https://jobs.dou.ua/companies/x1group/
