BigData Developer (with PySpark)

Requirements:
Strong ETL skills and working knowledge of data warehousing
PySpark (Python + Spark) – transforming data into a Common Data Model (CDM) and loading it into Snowflake
Talend – data ingestion
Snowflake
Strong communication skills, both written and oral

Nice to have:
Professional data engineering experience focused on batch and real-time data pipelines using Talend, Spark, PySpark, Python, SQL, and Java
Hands-on design and development experience in data space: data processing / data transformation using ETL tools, data warehouse (data modeling, programming), RDBMS
Experience building cloud data warehouse using Snowflake is a plus
Exposure to Microsoft technologies such as SSIS, SQL Server, and SSRS
Experience with a DevOps model utilizing a CI/CD tool
Experience with Azure Cloud is a strong plus
Familiarity with Agile practices and methodologies

We offer:
Opportunity to work on bleeding-edge projects
Work with a highly motivated and dedicated team
Competitive salary
Flexible schedule
Medical insurance
Benefits program
Corporate social events

About us:
Grid Dynamics is an engineering services company known for transformative, mission-critical cloud solutions in the retail, finance, and technology sectors. We have architected some of the busiest e-commerce services on the Internet and have never had an outage during peak season. Founded in 2006 and headquartered in San Ramon, California, with offices throughout the US and Eastern Europe, we focus on big data analytics, scalable omnichannel services, DevOps, and cloud enablement.

The job ad is no longer active
Job unpublished on 28 August 2020
