Big Data Middle/Senior Engineer
Description:
Our client is an internationally leading fintech company and an independent provider for the financial industry that both develops and operates its own software. It is headquartered in Switzerland, employs highly qualified domain and IT specialists, and serves a global customer base of more than a hundred financial institutions.
With its core and digital banking software and its international network of BPO centers, the client brings trustworthy and efficient banking to the world, delivered through a great user experience.
Requirements:
- PhD, Master's, or Bachelor's degree in Computer Science, Math, Physics, Engineering, Statistics, or another technical field;
- 4+ years of experience in IT;
- 2+ years of experience in Big Data engineering;
- Good command of major programming languages such as Java and Python;
- Familiarity with defining data governance concepts (incl. data lineage, data dictionaries);
- Strong knowledge of data modelling and query optimization across different storage solutions such as RDBMSs and document stores;
- Commercial experience with:
- Distributed systems and Big Data technologies;
- Streaming technologies and SaaS-based architectures (e.g. Hadoop, Spark, Kafka, data lakes);
- Architecting data pipelines, including data collection, storage, processing, and analysis at scale and elastically (a minimal illustrative sketch follows this list);
- CI/CD;
- Experience with different delivery and development processes and practices:
- Troubleshooting, profiling and debugging applications;
- Agile/Scrum software processes and technologies;
- Code review process;
- Refactoring process;
- Upper-Intermediate English.
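
To illustrate the pipeline-architecture experience described above, here is a minimal sketch in Python of a streaming pipeline that reads events from Kafka with Spark Structured Streaming and lands them in a data lake. It is an illustration only, not part of the role description; the broker address, topic name, schema, and paths are hypothetical assumptions.

# Minimal sketch: Kafka -> Spark Structured Streaming -> data lake (Parquet).
# All endpoints, topic names, schemas, and paths below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = (
    SparkSession.builder
    .appName("payments-ingest")  # hypothetical job name
    .getOrCreate()
)

# Assumed JSON payload of each Kafka message.
schema = StructType([
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "payments")                   # hypothetical topic
    .load()
)

# Kafka delivers raw bytes; decode the value column and parse the JSON payload.
events = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(F.from_json("json", schema).alias("e"))
    .select("e.*")
)

# Write to the lake as Parquet, partitioned by day, with a checkpoint
# location so the stream recovers cleanly after restarts.
query = (
    events.withColumn("dt", F.to_date("event_time"))
    .writeStream
    .format("parquet")
    .option("path", "/lake/payments")              # hypothetical lake path
    .option("checkpointLocation", "/lake/_chk/payments")
    .partitionBy("dt")
    .outputMode("append")
    .start()
)

query.awaitTermination()
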
Desirable:
- Knowledge of Gradle-based tooling for building CI/CD pipelines, DevOps, and automation;
- Commercial experience with:
- Cloud Computing Platforms (e.g. AWS, GCP, OpenShift);
- Automated testing, code quality;
- Containerization and orchestration software (Docker, Kubernetes);
- Graph databases, time-series databases, data warehouses;
- Experience with creating software architecture and designing complex solutions.
Preferences:
Java, Python, Azure Data Lake Store, Gradle, OpenShift, Kubernetes, Docker
Responsibilities:
- Integrating a new generation of tools into the existing environment to ensure access to accurate and current data;
- Capturing existing data flows and developing clear pipelines;
- Selecting appropriate technologies from open-source, commercial on-premises, and cloud-based offerings;
- Taking part in the decision-making process in design, solution development and code review;
- Working in an international distributed team;
- Communicating with PMs, POs, developers, and other colleagues and stakeholders;
- Delivering the product roadmap and plans, and creating estimations.
We Offer:
Exciting Projects: Come take your place at the forefront of digital transformation! With clients across all industries and sectors, we offer an opportunity to participate in creating market-defining products using the latest technologies.
Collaborative Environment: Expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible opportunities and options.
Professional Development: Our dedicated Learning & Development team regularly organizes certification and technical / soft skill training to help you realize your professional goals.
Excellent Benefits: We provide our consultants with competitive compensation and benefits.
Fun Perks: We want you to love where you work, which is why we host sports classes, cultural, social and team building activities such as sports competitions and end-of-year corporate parties. Our vibrant offices also include dedicated GL Zones and rooftop decks where you can drink coffee or tea with your colleagues over a game of table football or darts!