Senior/Lead Big Data Engineer IRC162149

GlobalLogic is seeking an experienced Senior Big Data Engineer to help us improve and develop a new, modern, personalized digital platform for a customer base in the U.S.

Every developer on our team has the opportunity to make a tremendous impact. We are looking for innovative, passionate developers who want to take ownership of features and projects and collaborate with other developers and product managers to evaluate, design, and implement solutions from top to bottom. You will have the opportunity to work with cutting-edge technology that helps millions of people save money.

We are seeking creative and motivated engineers to work on our digital coupon solution. A wide variety of challenges have been solved, and many still need to be. You will have opportunities for growth and development, and you will be part of a team that works on services that have saved over 1 billion coupons.

The project serves a U.S. customer base and is shaping the way CPG brands operate in the market. Work with us using the latest technologies and influence how over 50 million shoppers act each day.

The mission of the Analytics Architecture team is to enable analytics at scale. We do this by making sure the Analytics and Data Science teams have the necessary cloud infrastructure, tools, and data to execute their research and development of machine learning/AI models. We then work to implement these models into the larger technology infrastructure. The team is the link between the organization's Analytics business mission, strategy, and processes and its IT strategy. The team is unique in its role as a bridge between Analytics and technology, with a good understanding of analytics' particular needs, tech stack, and tools.

Requirements

- Recent experience in hands-on software development
- Experience building software for distributed production systems
- Master's degree in computer science or a related field
- 8 years of overall work experience
- 5+ years of experience with Big Data technologies such as Spark (Scala/Python), Azure Cloud, and Hadoop
- Experience in scaling Spark applications, optimization, and performance tuning
- Expert knowledge of distributed computing, RDDs, and optimization techniques
- Knowledgeable in data warehouse solutions (e.g., Yellowbrick, Snowflake)
- Experience with work automation
- Process-mapping experience
- Proven proficiency with data analysis and the ability to troubleshoot data issues
- 5+ years of experience with Linux/Unix systems and scripting
- Software engineering experience in Python and Scala
- Experience using DevOps tools (e.g., Jenkins, Azure DevOps)
- Able to develop, unit test, and deploy complex data and analytics solutions
- Able to mentor co-workers to expand overall team skills
- Advanced communication skills to present solutions clearly to the team and users
- Working knowledge of Agile software engineering processes
- Community developer presence (GitHub, Apache, open-source projects, etc.)
- Positive attitude toward challenges

Responsibilities

- Design, develop, deploy, and maintain solutions in Big Data technologies (Hadoop, Azure Cloud, Hive, Databricks, Spark (Scala/Python), and other open-source technologies) across three main areas:
  - Data ingestion pipelines from diverse sources of large-scale structured and unstructured data
  - API-based data transmission between systems
  - Organizing the data captured and stored within our systems

All of this work should be delivered as scalable code that follows company standards and is clearly documented.
- Actively participate in Agile Scrum ceremonies, including:
  - Refinement, planning, stand-ups, and retros
  - Fleshing out business requirements and adding acceptance criteria and tasks to user stories
- Participate in peer code review sessions to ensure code quality
- Communicate clearly with management on proposed solutions and challenges
- Track and resolve data issues, demonstrating creative problem-solving skills
- Lastly, responsibilities also include:
  - Maintaining in-depth knowledge of the data ecosystem and trends; being a subject matter expert
  - Recommending new tools to improve the current stack and meet data needs efficiently

What We Offer

Exciting Projects: Come take your place at the forefront of digital transformation! With clients across all industries and sectors, we offer an opportunity to participate in creating market-defining products using the latest technologies.

Collaborative Environment: Expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centers or client facilities!

Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible opportunities and options.

Professional Development: Our dedicated Learning & Development team regularly organizes certification and technical/soft skill training to help you realize your professional goals.

Excellent Benefits: We provide our consultants with competitive compensation and benefits.

Fun Perks: We want you to love where you work, which is why we host sports classes, cultural, social, and team-building activities such as sports competitions and end-of-year corporate parties. Our vibrant offices also include dedicated GL Zones and rooftop decks where you can drink coffee or tea with your colleagues over a game of table football or darts!

About GlobalLogic

GlobalLogic, a Hitachi Group Company, is a leader in digital product engineering. We help our clients design and build innovative products, platforms, and digital experiences for the modern world. We help our clients imagine what’s possible and accelerate their transition into tomorrow’s digital businesses. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers around the world.

What is GlobalLogic in numbers:
24,000+ engineers
1,800+ product releases per year
400+ active clients
70+ private label customer labs
35 product engineering centers
14 countries

Visit our website to learn more about GlobalLogic, view our open positions and career opportunities, and see why you should join us!

Job posted on 29 July 2022

  • Category: Data Engineer
  • Big Data, Spark, Scala, Azure Cloud, Data Warehouse, Linux/Unix, Python
  • English: Upper-Intermediate
  • 5 years of experience
  • Office/Remote of your choice
  • Outsource
  • Ukraine (Kyiv, Lviv, Mykolaiv, Kharkiv)
  • Only candidates from Ukraine