GlobalLogic4

    Middle Data Engineer

    Full Remote · Croatia, Poland, Romania, Slovakia, Ukraine · 4 years of experience · English - B2

    Description:

    Our Client is a Fortune 500 company and one of the largest global manufacturing companies, operating in the fields of industrial systems, worker safety, health care, and consumer goods. The company is dedicated to creating technology and products that advance every business, improve every home, and enhance every life.

    Minimum Requirements:

    • Minimum of 4 years of experience in SQL and Python programming languages, specifically for data engineering tasks.
    • Proficiency in working with cloud technologies such as Azure or AWS.
    • Experience with Spark and Databricks, or similar big data processing and analytics platforms.
    • Experience working with large data environments, including data processing, data integration, and data warehousing.
    • Experience with data quality assessment and improvement techniques, including data profiling, data cleansing, and data validation.
    • Familiarity with data lakes and their associated technologies, such as Azure Data Lake Storage, AWS S3, or Delta Lake, for scalable and cost-effective data storage and management.
    • Experience with NoSQL databases, such as MongoDB or Azure Cosmos DB, for handling unstructured and semi-structured data.
    • Fluent English.

    Additional Skillset (Nice to Have):

    • Familiarity with Agile and Scrum methodologies, including working with Azure DevOps and Jira for project management.
    • Knowledge of DevOps methodologies and practices, including continuous integration and continuous deployment (CI/CD).
    • Experience with Azure Data Factory or similar data integration tools for orchestrating and automating data pipelines.
    • Ability to build and maintain APIs for data integration and consumption.
    • Experience with data backends for software platforms, including database design, optimization, and performance tuning.

    Job responsibilities:

    • Build, deploy, and maintain mission-critical analytics solutions that process data quickly at big-data scale.
    • Design and implement data integration pipelines.
    • Contribute design, code, configurations, and documentation for components that manage data ingestion, real-time streaming, batch processing, and data extraction, transformation, and loading across multiple data stores.
    • Take part in the full cycle of feature development (requirements analysis, decomposition, design, etc.).
    • Contribute to the overall quality of development services through brainstorming, unit testing, and proactively proposing improvements and innovations.