Jobs Data Engineer

  • · 59 views · 0 applications · 26d

    Data Engineer (Relocate)

    Office Work · Spain · Product · 3 years of experience · English - B1 · Ukrainian Product 🇺🇦

    We are the creators of a new fintech era!
    Our mission is to change this world by making blockchain accessible to everyone in everyday life. WhiteBIT is a global team of over 1,200 professionals united by one mission — to shape the new world order in the Web3 era. Each of our employees is fully engaged in this transformative journey.
    We work on our blockchain platform, providing maximum transparency and security for more than 8 million users worldwide. Our breakthrough solutions, incredible speed of adaptation to market challenges, and technological superiority are the strengths that take us beyond ordinary companies. Our official partners include the National Football Team of Ukraine, FC Barcelona, Lifecell, FACEIT and VISA.

    The future of Web3 starts with you: join us as a Data Engineer!

     

    Requirements

    — 3+ years of experience as a Data Engineer in high-load or data-driven environments
    — Proficient in Python for data processing and automation (pandas, pyarrow, sqlalchemy, etc.)
    — Advanced knowledge of SQL: query optimization, indexes, partitions, materialized views
    — Hands-on experience with ETL/ELT orchestration tools (e.g., Airflow, Prefect)
    — Experience with streaming technologies (e.g., Kafka, Flink, Spark Streaming)
    — Solid background in data warehouse solutions: ClickHouse, BigQuery, Redshift, or Snowflake
    — Familiarity with cloud platforms (AWS, GCP, or Azure) and infrastructure-as-code principles
    — Experience with containerization and deployment tools (e.g., Docker, Kubernetes, CI/CD)
    — Understanding of data modeling, data versioning, and schema evolution (e.g., dbt, Avro, Parquet)
    — English — at least intermediate (for documentation & communication with tech teams)
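    As a quick illustration of the Python + SQL work the requirements describe, here is a minimal sketch of an idempotent batch load, using Python's built-in sqlite3 as a stand-in for a real warehouse. The table, column names, and data are invented for the example:

```python
import sqlite3

# Minimal sketch of an idempotent batch load: re-running the same
# batch must not duplicate rows. Schema and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE trades (
        trade_id INTEGER PRIMARY KEY,
        symbol   TEXT NOT NULL,
        price    REAL NOT NULL
    )
""")

batch = [(1, "BTC-USDT", 67100.5), (2, "ETH-USDT", 3150.0)]

def load(rows):
    # The upsert keeps the load idempotent: a replay updates
    # existing rows instead of inserting duplicates.
    conn.executemany(
        """
        INSERT INTO trades (trade_id, symbol, price) VALUES (?, ?, ?)
        ON CONFLICT(trade_id) DO UPDATE SET price = excluded.price
        """,
        rows,
    )

load(batch)
load(batch)  # replaying the same batch is safe

count = conn.execute("SELECT COUNT(*) FROM trades").fetchone()[0]
print(count)  # 2 rows despite two loads
```

    The same pattern appears in warehouse SQL as MERGE; the point is that a pipeline retry must not change the result.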

     

    Responsibilities

    β€” Design, build, and maintain scalable and resilient data pipelines (batch and real-time)
    β€” Develop and support data lake/data warehouse architectures
    β€” Integrate internal and external data sources/APIs into unified data systems
    β€” Ensure data quality, observability, and monitoring of pipelines
    β€” Collaborate with backend and DevOps engineers on infrastructure and deployment
    β€” Optimize query performance and data processing latency across systems
    β€” Maintain documentation and contribute to internal data engineering standards
    β€” Implement data access layers and provide well-structured data for downstream teams

     

    Work conditions

    Immerse yourself in Crypto & Web3:
    — Master cutting-edge technologies and become an expert in the most innovative industry.
    Work with the Fintech of the Future:
    — Develop your skills in digital finance and shape the global market.
    Take Your Professionalism to the Next Level:
    — Gain unique experience and be part of global transformations.
    Drive Innovations:
    — Influence the industry and contribute to groundbreaking solutions.
    Join a Strong Team:
    — Collaborate with top experts worldwide and grow alongside the best.
    Work-Life Balance & Well-being:
    — Modern equipment.
    — Comfortable working conditions and an inspiring environment to help you thrive.
    — 22 business days of paid leave.
    — Additional days off for national holidays.

  • · 39 views · 0 applications · 26d

    Data Quality Engineer

    Office Work · Ukraine (Kyiv) · Product · 3 years of experience · English - None · MilTech 🪖

    We're building a large-scale data analytics ecosystem powered by Microsoft Azure and Power BI. Our team integrates, transforms, and visualizes data from multiple sources to support critical business decisions. Data quality is one of our top priorities, and we're seeking an engineer who can help us enhance the reliability, transparency, and manageability of our data landscape.

    Your responsibilities: 

    • Develop and maintain data quality monitoring frameworks within the Azure ecosystem (Data Factory, Data Lake, Databricks). 
    • Design and implement data quality checks, including validation, profiling, cleansing, and standardization. 
    • Detect data anomalies and design alerting systems (rules, thresholds, automation). 
    • Collaborate with Data Engineers, Analysts, and Business stakeholders to define data quality criteria and expectations. 
    • Ensure high data accuracy and integrity for Power BI reports and dashboards. 
    • Document data validation processes and recommend improvements to data sources. 
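    The validation and profiling work above boils down to checks like the following minimal sketch: completeness and freshness over a batch of records. Field names and thresholds are invented for the example; in the Azure stack these checks would run inside Data Factory or Databricks jobs:

```python
from datetime import datetime, timedelta, timezone

# Toy batch of records; in practice these come from a lake or warehouse table.
rows = [
    {"order_id": 1, "amount": 120.0, "loaded_at": datetime.now(timezone.utc)},
    {"order_id": 2, "amount": None,  "loaded_at": datetime.now(timezone.utc)},
]

def completeness(rows, field):
    # Share of rows where the field is present and non-null.
    filled = sum(1 for r in rows if r.get(field) is not None)
    return filled / len(rows)

def is_fresh(rows, max_age=timedelta(hours=1)):
    # Freshness: the newest record must be younger than the threshold.
    newest = max(r["loaded_at"] for r in rows)
    return datetime.now(timezone.utc) - newest < max_age

amount_completeness = completeness(rows, "amount")
print(amount_completeness)  # 0.5 -> below a typical 0.99 threshold, so alert
print(is_fresh(rows))       # True
```

    An alerting rule is then just a threshold on these metrics (e.g., page the team when completeness drops below an agreed level).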

    Requirements: 

    • 3+ years of experience in a Data Quality, Data Engineering, or BI Engineering role. 
    • Hands-on experience with Microsoft Azure services (Data Factory, SQL Database, Data Lake). 
    • Advanced SQL skills (complex queries, optimization, data validation). 
    • Familiarity with Power BI or similar BI tools. 
    • Understanding of DWH principles and ETL/ELT pipelines. 
    • Experience with data quality frameworks and metrics (completeness, consistency, timeliness). 
    • Knowledge of Data Governance, Master Data Management, and Data Lineage concepts. 

    Would be a plus: 

    • Experience with Databricks or Apache Spark. 
    • DAX and Power Query (M) knowledge. 
    • Familiarity with DataOps or DevOps principles in a data environment. 
    • Experience in creating automated data quality dashboards in Power BI. 

     

  • · 25 views · 1 application · 26d

    Cloud DevOps Engineer

    Hybrid Remote · Ukraine · 4 years of experience · English - B2

    Hello everyone 👋

    At Intobi, we're a software and product development company passionate about driving innovation and progress.

    We help our clients succeed by delivering custom-built tech solutions designed to meet their unique needs.

    Our expertise lies in developing cutting-edge Web and Mobile applications.

     

    We're hiring a Cloud DevOps Engineer to drive the design, automation, and reliability of our multi-cloud infrastructure. This is a key role in a fast-paced startup environment, where you'll play a critical part in building, managing, and securing our cloud-native platform across AWS, Azure, and GCP.

     

    Cyngular is an Israeli cybersecurity company focused on cloud investigation and automated incident response. The platform helps security teams detect, investigate, and respond to complex threats across AWS, Azure, and GCP.

     

    Role Overview:

    As a Cloud DevOps Engineer, you will be responsible for implementing CI/CD pipelines, managing infrastructure as code, automating cloud operations, and ensuring high availability and security across environments. You'll work closely with development, security, and data teams to enable fast, reliable, and secure deployments.

     

    Key Responsibilities:

    —  Design, build, and maintain infrastructure using Terraform, CloudFormation, or Bicep.

    —  Manage CI/CD pipelines (GitHub Actions, GitLab CI, Azure DevOps, etc.) across multiple cloud platforms.

    —  Automate provisioning and scaling of compute, storage, and networking resources in AWS, Azure, and GCP.

    —  Implement and maintain monitoring, logging, and alerting solutions (CloudWatch, Stackdriver, Azure Monitor, etc.).

    —  Harden environments according to security best practices (IAM, service principals, KMS, firewall rules, etc.).

    —  Support cost optimization strategies and resource tagging/governance.

    —  Collaborate with engineers to streamline developer workflows and cloud-based deployments.
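    To make the tagging/governance responsibility concrete, here is a minimal sketch of a tag-compliance check in plain Python. The resource records and required tag keys are invented; in practice the inventory would come from the cloud provider's API:

```python
# Required tag keys are an assumption for the example; real policies
# are defined per organization.
REQUIRED_TAGS = {"owner", "environment", "cost-center"}

# Stand-in for a resource inventory fetched from a cloud API.
resources = [
    {"id": "vm-001", "tags": {"owner": "data-team", "environment": "prod", "cost-center": "42"}},
    {"id": "vm-002", "tags": {"owner": "data-team"}},
]

def missing_tags(resource):
    # Which mandatory tag keys are absent on this resource?
    return REQUIRED_TAGS - resource["tags"].keys()

violations = {r["id"]: sorted(missing_tags(r)) for r in resources if missing_tags(r)}
print(violations)  # {'vm-002': ['cost-center', 'environment']}
```

    A check like this typically runs in CI/CD or on a schedule, feeding cost-allocation reports and governance alerts.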

     

    Required Skills:

    —  4+ years of experience in DevOps, Site Reliability Engineering, or Cloud Engineering.

    —  Hands-on experience with at least two major cloud providers (AWS, Azure, GCP); familiarity with the third is a plus.

    —  Proficiency in infrastructure as code (Terraform required; CloudFormation/Bicep is a plus).

    —  Experience managing containers and orchestration platforms (EKS, AKS, GKE, or Kubernetes).

    —  Strong knowledge of CI/CD tooling and best practices.

    —  Familiarity with secrets management, role-based access controls, and audit logging.

    —  Proficiency in scripting with Python, Bash, or PowerShell.

     

    The position requires a high level of English — reading, writing, and speaking.

    This role is not suitable for juniors or those with little to no experience.

    We're looking for professional DevOps engineers who are passionate about technology and tools, and who aren't afraid to take on significant responsibilities, including self-directed learning.

     

    Please send your CV here or via email

     

    Should the first stage be successfully completed, you’ll be invited to a personal interview.

  • · 40 views · 5 applications · 26d

    Middle/Senior/Lead Data Engineer

    Full Remote · Ukraine · 3 years of experience · English - B2

    An AWS Data Engineer designs, develops, and maintains scalable data solutions using AWS cloud services.
    Key Responsibilities:
        • Design, build, and manage ETL (Extract, Transform, Load) pipelines using AWS services (e.g., Glue, Lambda, EMR, Redshift, S3).
        • Develop and maintain data architecture (data lakes, warehouses, databases) on AWS.
        • Implement data quality and governance solutions.
        • Automate data workflows and monitor pipeline health.
        • Ensure data security and compliance with company policies.
    Required Skills:
        • Proficiency with AWS cloud services, especially data-related offerings (S3, Glue, Redshift, Athena, EMR, Kinesis, Lambda).
        • Strong SQL and Python skills.
        • Experience with ETL tools and frameworks.
        • Familiarity with data modelling and warehousing concepts.
        • Knowledge of data security, access management, and best practices in AWS.
    Preferred Qualifications:
        • AWS certifications (e.g., AWS Certified Data Analytics – Speciality, AWS Certified Solutions Architect).
        • Background in software engineering or data science.

    • Hands-on experience with Oracle Database and log-based Change Data Capture (CDC) replication using AWS Database Migration Service (DMS) for near real-time data ingestion.
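    The core idea behind near real-time CDC ingestion is incremental extraction against a watermark: pull only what changed since the last successful run. A minimal sketch, with an in-memory list standing in for the source database and invented field names:

```python
# Stand-in for a source table; timestamps are ISO strings so plain
# string comparison orders them correctly.
source = [
    {"id": 1, "updated_at": "2024-05-01T10:00:00"},
    {"id": 2, "updated_at": "2024-05-01T11:30:00"},
    {"id": 3, "updated_at": "2024-05-01T12:15:00"},
]

def extract_since(rows, watermark):
    # Pull only rows modified after the last successful run, and
    # advance the watermark to the newest timestamp seen.
    changed = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in changed), default=watermark)
    return changed, new_watermark

batch, wm = extract_since(source, "2024-05-01T11:00:00")
print([r["id"] for r in batch])  # [2, 3]
print(wm)                        # 2024-05-01T12:15:00
```

    Log-based CDC tools such as DMS replace the timestamp column with the database's transaction log, but the "resume from a stored position" contract is the same.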

     

    Job Responsibilities

    • Develops, documents, and configures system specifications that conform to defined architecture standards and address business requirements and processes in cloud development & engineering.
    • Participates in planning of system and development deployments, and is responsible for meeting compliance and security standards.
    • API development using AWS services in a scalable, microservices-based architecture.
    • Actively identifies system functionality or performance deficiencies, executes changes to existing systems, and tests functionality of the system to correct deficiencies and maintain more effective data handling, data integrity, conversion, input/output requirements, and storage.
    • May document testing and maintenance of system updates, modifications, and configurations.
    • May act as a liaison with key technology vendor technologists or other business functions.
    • Function Specific: Strategically design technology solutions that meet the needs and goals of the company and its customers/users.
    • Leverages platform process expertise to assess if existing standard platform functionality will solve a business problem or if a customisation solution would be required.
    • Test the quality of a product and its ability to perform a task or solve a problem.
    • Perform basic maintenance and performance optimisation procedures in each of the primary operating systems.
    • Documents detailed technical system specifications based on business system requirements.
    • Ensures system implementation compliance with global & local regulatory and security standards (e.g., HIPAA, SOC 2, ISO 27001).

    Department/Project Description

    The GlobalLogic technology team is focused on next-generation health capabilities that align with the client’s mission and vision to deliver Insight-Driven Care. This role operates within the Health Applications & Interoperability subgroup of our broader team, with a focus on patient engagement, care coordination, AI, healthcare analytics, and interoperability. These advanced technologies enhance our product portfolio with new services while improving clinical and patient experiences.

  • · 30 views · 1 application · 26d

    Senior DBA/BI Engineer

    Full Remote · Ukraine · 5 years of experience · English - B2

    ABOUT CLIENT

    This independent research group focuses on tracking and forecasting infectious diseases. Initially studying illnesses like influenza and dengue, they later expanded to global outbreaks such as COVID-19. The team gathers unique data sources, extracts key indicators of disease activity, and shares them publicly to support real-time monitoring and short-term forecasts. Their mission is to strengthen global readiness for future epidemics through data-driven insights and predictive modeling.
     

    PROJECT TECH STACK

    Apache Airflow 3.0, PostgreSQL
     

    PROJECT STAGE

    Live product
     

    QUALIFICATIONS AND SKILLS

    • 5+ years of experience with Airflow, with a focus on quality and depth of understanding, not just duration on the platform.
    • Proven experience designing, deploying, and maintaining production-grade ETL workflows using Apache Airflow, with a strong understanding of DAG orchestration, performance optimization, and operation in managed environments such as Astronomer.
    • Senior-level expertise in scaling, best practices, maintainability, and cost management — not just someone who can build pipelines.
    • Strong DBT skills would be a plus (the team has not yet adopted DBT, but is considering it).
    • Solid knowledge of Postgres optimization and the ability to clearly explain not only how to implement optimizations but also why they are needed.
    • Excellent communication skills are needed as the team needs guidance to enhance their capabilities more than immediate technical fixes.
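    On the "explain why an optimization is needed" point: the clearest evidence is usually the query plan before and after a change. A small demonstration using SQLite's EXPLAIN QUERY PLAN as a stand-in for Postgres EXPLAIN (the schema is invented; the plan wording varies by SQLite version):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (id INTEGER PRIMARY KEY, region TEXT, reported_on TEXT)")

def plan(sql):
    # First row of the plan; the detail text is at column index 3,
    # e.g. a full scan vs. an index search.
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

query = "SELECT * FROM cases WHERE region = 'EU'"
before = plan(query)   # full table scan: every row is examined
conn.execute("CREATE INDEX idx_cases_region ON cases (region)")
after = plan(query)    # index search: only matching rows are touched

print(before)  # e.g. 'SCAN cases'
print(after)   # e.g. 'SEARCH cases USING INDEX idx_cases_region (region=?)'
```

    The "why" follows directly: a scan costs time proportional to the table, an index search to the matching rows, and that difference is what you explain to the team, not just the CREATE INDEX statement.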
       

    RESPONSIBILITIES

    • Ensure and help define best practices that promote maintainability and high code quality across the team.
    • Assist in resolving performance issues as they arise and proactively identify potential performance concerns.
    • Help the team detect bugs and problem areas early in the development process.
    • Guide the building of scalable solutions and the effective management of hosting costs.
  • · 41 views · 3 applications · 26d

    Data Engineer

    Full Remote · Ukraine · Product · 3 years of experience · English - None

    About us:
    Data Science UA is a service company with strong data science and AI expertise. Our journey began in 2016 with uniting top AI talents and organizing the first Data Science tech conference in Kyiv. Over the past 9 years, we have diligently fostered one of the largest Data Science & AI communities in Europe.

    About the client:
    Our client is an IT company that develops technological solutions and products to help companies reach their full potential and meet the needs of their users. The team comprises over 600 specialists in IT and Digital, with solid expertise in various technology stacks necessary for creating complex solutions.

    About the role:
    We are looking for a Data Engineer (NLP-Focused) to build and optimize the data pipelines that fuel the Ukrainian LLM and NLP initiatives. In this role, you will design robust ETL/ELT processes to collect, process, and manage large-scale text and metadata, enabling the Data Scientists and ML Engineers to develop cutting-edge language models.

    You will work at the intersection of data engineering and machine learning, ensuring that the datasets and infrastructure are reliable, scalable, and tailored to the needs of training and evaluating NLP models in a Ukrainian language context.

    Requirements:
    - Education & Experience: 3+ years of experience as a Data Engineer or in a similar role, building data-intensive pipelines or platforms. A Bachelor's or Master's degree in Computer Science, Engineering, or a related field is preferred. Experience supporting machine learning or analytics teams with data pipelines is a strong advantage.
    - NLP Domain Experience: Prior experience handling linguistic data or supporting NLP projects (e.g., text normalization, handling different encodings, tokenization strategies). Knowledge of Ukrainian text sources and data sets, or experience with multilingual data processing, can be an advantage given the project's focus. Understanding of FineWeb2 or a similar processing pipeline approach.
    - Data Pipeline Expertise: Hands-on experience designing ETL/ELT processes, including extracting data from various sources, using transformation tools, and loading into storage systems. Proficiency with orchestration frameworks like Apache Airflow for scheduling workflows. Familiarity with building pipelines for unstructured data (text, logs) as well as structured data.
    - Programming & Scripting: Strong programming skills in Python for data manipulation and pipeline development. Experience with NLP packages (spaCy, NLTK, langdetect, fasttext, etc.). Experience with SQL for querying and transforming data in relational databases. Knowledge of Bash or other scripting for automation tasks. Writing clean, maintainable code and using version control (Git) for collaborative development.
    - Databases & Storage: Experience working with relational databases (e.g., PostgreSQL, MySQL), including schema design and query optimization. Familiarity with NoSQL or document stores (e.g., MongoDB) and big data technologies (HDFS, Hive, Spark) for large-scale data is a plus. Understanding of or experience with vector databases (e.g., Pinecone, FAISS) is beneficial, as the NLP applications may require embedding storage and fast similarity search.
    - Cloud Infrastructure: Practical experience with cloud platforms (AWS, GCP, or Azure) for data storage and processing. Ability to set up services such as S3/Cloud Storage, data warehouses (e.g., BigQuery, Redshift), and use cloud-based ETL tools or serverless functions. Understanding of infrastructure-as-code (Terraform, CloudFormation) to manage resources is a plus.
    - Data Quality & Monitoring: Knowledge of data quality assurance practices. Experience implementing monitoring for data pipelines (logs, alerts) and using CI/CD tools to automate pipeline deployment and testing. An analytical mindset to troubleshoot data discrepancies and optimize performance bottlenecks.
    - Collaboration & Domain Knowledge: Ability to work closely with data scientists and understand the requirements of machine learning projects. Basic understanding of NLP concepts and the data needs for training language models, so you can anticipate and accommodate the specific forms of text data and preprocessing they require. Good communication skills to document data workflows and to coordinate with team members across different functions.

    Nice to have:
    - Advanced Tools & Frameworks: Experience with distributed data processing frameworks (such as Apache Spark or Databricks) for large-scale data transformation, and with message streaming systems (Kafka, Pub/Sub) for real-time data pipelines. Familiarity with data serialization formats (JSON, Parquet) and handling of large text corpora.
    - Web Scraping Expertise: Deep experience in web scraping, using tools like Scrapy, Selenium, or Beautiful Soup, and handling anti-scraping challenges (rotating proxies, rate limiting). Ability to parse and clean raw text data from HTML, PDFs, or scanned documents.
    - CI/CD & DevOps: Knowledge of setting up CI/CD pipelines for data engineering (using GitHub Actions, Jenkins, or GitLab CI) to test and deploy changes to data workflows. Experience with containerization (Docker) to package data jobs and with Kubernetes for scaling them is a plus.
    - Big Data & Analytics: Experience with analytics platforms and BI tools (e.g., Tableau, Looker) used to examine the data prepared by the pipelines. Understanding of how to create and manage data warehouses or data marts for analytical consumption.
    - Problem-Solving: Demonstrated ability to work independently in solving complex data engineering problems, optimizing existing pipelines, and implementing new ones under time constraints. A proactive attitude to explore new data tools or techniques that could improve the workflows.

    Responsibilities:
    - Design, develop, and maintain ETL/ELT pipelines for gathering, transforming, and storing large volumes of text data and related information.
    - Ensure pipelines are efficient and can handle data from diverse sources (e.g., web crawls, public datasets, internal databases) while maintaining data integrity.
    - Implement web scraping and data collection services to automate the ingestion of text and linguistic data from the web and other external sources. This includes writing crawlers or using APIs to continuously collect data relevant to the language modeling efforts.
    - Implement NLP/LLM-specific data processing: cleaning and normalization of text, such as filtering toxic content, de-duplication, de-noising, and detection and removal of personal data.
    - Form task-specific SFT/RLHF datasets from existing data, including data augmentation/labeling with an LLM as a teacher.
    - Set up and manage cloud-based data infrastructure for the project. Configure and maintain data storage solutions (data lakes, warehouses) and processing frameworks (e.g., distributed compute on AWS/GCP/Azure) that can scale with growing data needs.
    - Automate data processing workflows and ensure their scalability and reliability.
    - Use workflow orchestration tools like Apache Airflow to schedule and monitor data pipelines, enabling continuous and repeatable model training and evaluation cycles.
    - Maintain and optimize analytical databases and data access layers for both ad-hoc analysis and model training needs.
    - Work with relational databases (e.g., PostgreSQL) and other storage systems to ensure fast query performance and well-structured data schemas.
    - Collaborate with Data Scientists and NLP Engineers to build data features and datasets for machine learning models.
    - Provide data subsets, aggregations, or preprocessing as needed for tasks such as language model training, embedding generation, and evaluation.
    - Implement data quality checks, monitoring, and alerting. Develop scripts or use tools to validate data completeness and correctness (e.g., ensuring no critical data gaps or anomalies in the text corpora), and promptly address any pipeline failures or data issues. Implement data version control.
    - Manage data security, access, and compliance.
    - Control permissions to datasets and ensure adherence to data privacy policies and security standards, especially when dealing with user data or proprietary text sources.
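    The cleaning and de-duplication step described in the responsibilities can be sketched in a few lines: Unicode normalization plus exact dedup by content hash. This is a deliberately reduced example (real corpus pipelines add language ID, near-duplicate detection, PII filtering, etc.); the sample texts are invented:

```python
import hashlib
import unicodedata

def normalize(text):
    # NFC-normalize, lowercase, and collapse runs of whitespace so
    # trivially different copies hash identically.
    text = unicodedata.normalize("NFC", text)
    return " ".join(text.lower().split())

def dedupe(docs):
    # Exact dedup on the hash of the normalized text.
    seen, unique = set(), []
    for doc in docs:
        digest = hashlib.sha256(normalize(doc).encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

docs = ["Добрий день!", "добрий   день!", "Вітаємо в Україні"]
clean = dedupe(docs)
print(len(clean))  # 2: the first two normalize to the same text
```

    Hashing normalized text rather than raw text is the key design choice: it makes dedup cheap at corpus scale while tolerating case and whitespace noise.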

    The company offers:
    - Competitive salary.
    - Equity options in a fast-growing AI company.
    - Remote-friendly work culture.
    - Opportunity to shape a product at the intersection of AI and human productivity.
    - Work with a passionate, senior team building cutting-edge tech for real-world business use.

  • · 91 views · 7 applications · 27d

    Data Engineer — Azure Data Factory, Functions, Snowflake (Nature-based Solutions) to $5500

    Full Remote · Countries of Europe or Ukraine · 4 years of experience · English - B2

    About the Client & Mission

    Our client is the world’s largest environmental nonprofit focused on reforestation and sustainable development (Nature-based Solutions). We are building a modern cloud data platform on Azure and Snowflake that will serve as a single source of truth and enable faster, data-driven decision-making.

     

    About the Initiative

    This role supports a Data Warehouse initiative focused on tangible delivery impact: trusted data, clear and scalable models, and fast release cycles (1–3 months) with well-defined SLAs. You'll work in a collaborative setup across Data Engineering ↔ BI ↔ Product, often handling 1–2 parallel workstreams with proactive risk and dependency management.

     

    Core Stack

    • ELT/DWH: Azure Data Factory + Azure Functions (Python) → Snowflake
    • CI/CD: Azure DevOps pipelines + DL Sync (Snowflake objects and pipeline deployments)
    • Primary data sources: CRM/ERP (Dynamics 365, Salesforce), MS SQL, API-based ingestion, CDC concepts
    • Data formats: JSON, Parquet.

       

    Team (our side)

    Lead Data Engineering, PM, DevOps, QA.

     

    Your Responsibilities

    • Design, build, and maintain incremental and full-refresh ELT pipelines (ADF + Azure Functions → Snowflake).
    • Develop and optimize Snowflake SQL for the DWH and data marts (Star Schema, incremental patterns, basic SCD2).
    • Build production-grade Python code in Azure Functions for ingestion, orchestration, and lightweight pre-processing.
    • Implement and maintain data quality controls (freshness, completeness, duplicates, late-arriving data).
    • Support CI/CD delivery for Snowflake objects and pipelines across dev/test/prod (Azure DevOps + DL Sync).
    • Contribute to documentation, best practices, and operational standards for the platform.
    • Communicate clearly and proactively: status → risk → options → next step, ensuring predictable delivery.

       

    Requirements (Must-have)

    • 4+ years in Data Engineering or related roles.
    • Strong Snowflake SQL (CTEs, window functions, COPY INTO, MERGE).
    • Hands-on experience with incremental loading (watermarks, merge patterns) and basic SCD2 (effective dating / current flag).
    • Strong Python (production-ready code), including API integration (pagination, retries, error handling), logging, configuration, and secrets handling.
    • Solid experience with Azure Data Factory (pipelines, parameters, triggers) and Azure Functions (HTTP/Timer triggers, idempotency, retries).
    • Understanding of ELT/DWH modeling (Star Schema, fact/dimension design, performance implications of joins).
    • CI/CD familiarity: Azure DevOps and automated deployment practices for data platforms (DL Sync for Snowflake is a strong plus).
    • Strong communication skills and a proactive, accountable approach to teamwork.
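    Since "basic SCD2 (effective dating / current flag)" is a hard requirement, here is the mechanic in plain Python so it is visible without a warehouse; in the real stack this is a Snowflake MERGE. All field names and values are invented:

```python
from datetime import date

def apply_scd2(dim, incoming, today):
    # dim: list of dimension rows; incoming: {key: new_attribute_value}.
    for key, value in incoming.items():
        current = next((r for r in dim if r["key"] == key and r["is_current"]), None)
        if current and current["value"] == value:
            continue  # unchanged: SCD2 writes nothing
        if current:
            # Close out the old version (effective dating).
            current["valid_to"] = today
            current["is_current"] = False
        # Open a new current version of the row.
        dim.append({"key": key, "value": value, "valid_from": today,
                    "valid_to": None, "is_current": True})
    return dim

dim = [{"key": "C1", "value": "Kyiv", "valid_from": date(2024, 1, 1),
        "valid_to": None, "is_current": True}]
apply_scd2(dim, {"C1": "Lviv"}, date(2024, 6, 1))

current = [r for r in dim if r["is_current"]]
print(len(dim), current[0]["value"])  # 2 Lviv
```

    The "current flag" makes point-in-time joins cheap (filter on is_current) while valid_from/valid_to preserve full history for as-of queries.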

       

    Nice to Have

    • PySpark (DataFrame API, joins, aggregations; general distributed processing understanding).
    • Experience with D365 / Salesforce, MS SQL sources, API-based ingestion, and CDC patterns.
    • Data governance/security basics, Agile/Scrum, and broader analytics tooling exposure.

       

    Selection Process (Transparent & Practical)

    Stage 1 — Intro + TA + Short Tech Screen (40–60 min, Zoom):

    • project context (multi-project setup, 1–3 month delivery cycles), must-haves for Azure/ELT, a short SQL/Python scenario;
    • soft skills & culture match discussion covering: Proactive communication & stakeholders, Critical thinking & judgment, Problem solving & systems thinking, Ownership & maturity.

       

    Stage 2 — Deep-Dive Technical Interview (75–90 min, with 2 engineers):
    Live SQL (CTE/window + incremental load/SCD2 approach), PySpark mini-exercises, Azure lakehouse architecture discussion, plus a mini-case based on a real delivery situation.
    No take-home task — we simulate day-to-day work during the session.

     

    What We Offer

    • Competitive compensation.
    • Learning and growth alongside strong leaders, deepening expertise in Snowflake/ Azure / DWH.
    • Opportunity to expand your expertise over time across diverse, mission-driven & AI projects.
    • Flexible work setup: remote / abroad / office (optional), gig contract (with an option to transition if needed).
    • Equipment and home-office support.
    • 36 paid days off per year: 20 vacation days + UA public holidays (and related days off, as applicable).
    • Monthly cafeteria benefit: $25 to support your personal needs (learning, mental health support, etc.).
    • Performance reviews: ongoing feedback, compensation review after 12 months, then annually.
    • Paid sabbatical after 5 years with the company.

       

    P.S. Dear fellow Ukrainians,
    we kindly ask you to apply for this role in a professional and well-reasoned manner, clearly highlighting the experience that is most relevant to the position.

    If you are unsure whether your background fully matches the requirements, please feel free to mention this openly in your application. This will not reduce your chances of being considered; it helps us review your profile fairly and prioritize candidates based on overall fit for the role.

  • · 113 views · 14 applications · 29d

    Data Engineer

    Full Remote · Countries of Europe or Ukraine · Product · 5 years of experience · English - None

    🎯 What You'll Actually Do

    • Build and run scalable pipelines (batch + streaming) that power gameplay, wallet, and promo analytics.
    • Model data for decisions (star schemas, marts) that Product, BI, and Finance use daily.
    • Make things reliable: tests, lineage, alerts, SLAs. Fewer surprises, faster fixes.
    • Optimize ETL/ELT for speed and cost (partitioning, clustering, late arrivals, idempotency).
    • Keep promo data clean and compliant (PII, GDPR, access controls).
    • Partner with POs and analysts on bets/wins/turnover KPIs, experiment readouts, and ROI.
    • Evaluate tools, migrate or deprecate with clear trade-offs and docs.
    • Handle prod issues without drama, then prevent the next one.
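    The idempotency point above can be sketched minimally. Assuming a SQLite table as a stand-in warehouse (the posting's actual stack is ClickHouse/OLAP), a delete-then-insert per partition makes a batch load safe to re-run for late-arriving data:

```python
import sqlite3

def load_partition(conn, rows, day):
    """Idempotently (re)load one daily partition: delete-then-insert
    means re-running the job for late-arriving data never duplicates rows."""
    with conn:  # one transaction per partition
        conn.execute("DELETE FROM bets WHERE day = ?", (day,))
        conn.executemany(
            "INSERT INTO bets (day, player_id, amount) VALUES (?, ?, ?)",
            [(day, r["player_id"], r["amount"]) for r in rows],
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bets (day TEXT, player_id INTEGER, amount REAL)")
batch = [{"player_id": 1, "amount": 10.0}, {"player_id": 2, "amount": 5.0}]
load_partition(conn, batch, "2024-01-01")
load_partition(conn, batch, "2024-01-01")  # re-run: still 2 rows, not 4
count = conn.execute("SELECT COUNT(*) FROM bets").fetchone()[0]
print(count)  # 2
```

    The table name and columns are illustrative; the point is the transactional delete-then-insert pattern, not the schema.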

       

    🧠 What You Bring

    • 4+ years building production data systems. You’ve shipped, broken, and fixed pipelines at scale.
    • SQL that sings and Python you’re proud of.
    • Real experience with OLAP and BI (Power BI / Tableau / Redash β€” impact > logo).
    • ETL/ELT orchestration (Airflow/Prefect or similar) and CI/CD for data.
    • Strong grasp of warehouses & lakes: incremental loads, SCDs, partitioning.
    • Data quality mindset: contracts, tests, lineage, monitoring.
    • Product sense: you care about player impact, not just rows processed.

       

    ✨ Nice to Have (tell us if you’ve got it)

    • Kafka (or similar streaming), ClickHouse (we like it), dbt (modular ELT).
    • AWS data stack (S3, IAM, MSK/Glue/Lambda/Redshift) or equivalents.
    • Containers & orchestration (Docker/K8s), IaC (Terraform).
    • Familiarity with AI/ML data workflows (feature stores, reproducibility).
    • iGaming context: provider metrics (bets / wins / turnover), regulated markets, promo events.

       

    πŸ”§ How We Work

    • Speed > perfection. Iterate, test, ship.
    • Impact > output. One rock-solid dataset beats five flaky ones.
    • Behavior > titles. Ownership matters more than hierarchy.
    • Direct > polite. Say what matters, early.

       

    πŸ”₯ What We Offer

    • Fully remote (EU-friendly time zones) or Bratislava if you like offices.
    • Unlimited vacation + paid sick leave.
    • Quarterly performance bonuses.
    • No micromanagement. Real ownership, real impact.
    • Budget for conferences and growth.
    • Product-led culture with sharp people who care.

       

    🧰 Our Day-to-Day Stack (representative)
    Python, SQL, Airflow/Prefect, Kafka, ClickHouse/OLAP DBs, AWS (S3 + friends), dbt, Redash/Power BI/Tableau, Docker/K8s, GitHub Actions.

     

    πŸ‘‰ If you know how to make data boringly reliable and blisteringly fast β€” hit apply and let’s talk.

  • Β· 47 views Β· 2 applications Β· 29d

    Data Engineer

    Full Remote Β· Ukraine Β· 5 years of experience Β· English - B2
    Data Engineer Full-time. Remote. B2B. Working time zone: EET (Ukraine). Location of candidates: Ukraine About the company: It is a US-based Managed IT Services (MSP) company, founded in 2016. Services: IT management, user support, cybersecurity,...

    Data Engineer 
     

    Full-time. Remote. B2B. 
    Working time zone: EET (Ukraine). 
    Location of candidates: Ukraine
     

    About the company: It is a US-based Managed IT Services (MSP) company, founded in 2016.

    Services: IT management, user support, cybersecurity, cloud solutions (Microsoft Azure, M365), and data engineering.
    Core clients: Hedge funds, investment and asset management firms (financial sector focus) across North America, Europe, and Asia.

     

    As a Data Engineer, you will be responsible for designing, implementing, and maintaining robust data pipelines and cloud-native solutions that support scalable analytics and operational efficiency. This role requires deep expertise in Python programming, Azure cloud services, and SQL-based data modeling, with a strong emphasis on automation, reliability, and security.

    Currently, the data processing system is built entirely in pure Python, without external ETL or data integration platforms (such as Snowflake or Data Factory). The company plans to continue relying on Python as the core technology for data processing, making it essential that the new engineer has strong, hands-on expertise in Python-based ETL development β€” including automation, testing, error handling, and code stability.
    You will play a key role in evolving the current data platform as the company moves toward adopting Microsoft Fabric, while maintaining core Python ETL logic.

    This role will work closely with another Data Engineer on internal company projects.

    Team: Cloud Engineering Team

    Reports to: Cloud DevOps Manager
     

    Responsibilities:
    - Build and maintain efficient ETL workflows using Python 3, applying both object-oriented and functional paradigms.

    - Write comprehensive unit, integration, and end-to-end tests; troubleshoot complex Python traces.

    - Automate deployment and integration processes.

    - Develop Azure Functions, configure and deploy Storage Accounts and SQL Databases.

    - Design relational schemas, optimize queries, and manage advanced MSSQL features including temporal tables, external tables, and row-level security.

    - Author and maintain stored procedures, views, and functions.

    - Collaborate with cross-functional teams
     

    Requirements:
    - English level – B2 or higher (English speaking environment)
    - 5+ years of proven experience as a Data engineer
    - Proficient in Python 3, with both object-oriented and functional paradigms
    - Experience with Python (vanilla), Dagster, Prefect, Apache Airflow, Apache Beam
    - Design and implement ETL workflows using sensible code patterns

    - Discover, navigate and understand third-party library source code

    - Author unit, integration and end-to-end tests for new or existing ETL (pytest, fixtures, mocks, monkey patching)

    - Ability to troubleshoot esoteric Python traces encountered in the terminal, logs, or debugger

    - Git (branching), Unix-like shells (Nix-based) in cloud environments

    - Author CI/CD configs and scripts (JSON, YAML, Bash, PowerShell)

    - Develop Azure Functions (HTTP, Blob, Queue triggers) using azure-functions SDK

    - Implement concurrency and resilience (thread pools, tenacity, rate limiters)

    - Deploy and configure: Functions, Web Apps & App Service Plans, Storage Accounts, Communication Services, SQL Database / Managed Instance

    - Secrets/access management, data validation, data quality checks

    - Relational data modeling, schema design, data partitioning strategies, and temporal tables (system-versioned) 

    - Query performance tuning (indexes, execution plans)

    - Selection of optimal data types

    - Complex T-SQL (windowing, CTEs, advanced joins)
    - Advanced MSSQL features (External Tables, Row-Level Security)
    - SQL Objects & Schema Management: author and maintain tables, views, stored procedures, functions, and external tables (PolyBase)
    - Strong analytical, problem-solving, and documentation skills

    - Microsoft certifications would be a plus.
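    For the resilience requirement above (thread pools, tenacity, rate limiters), here is a minimal stdlib sketch of retry with exponential backoff, the core behavior the `tenacity` library provides out of the box:

```python
import time

def with_retries(fn, attempts=4, base_delay=0.01):
    """Retry a flaky callable with exponential backoff; a stdlib
    stand-in for what tenacity's @retry decorator automates."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # exhausted: surface the real error
            time.sleep(base_delay * (2 ** attempt))

calls = {"n": 0}
def flaky():
    """Hypothetical transient failure: succeeds on the third call."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = with_retries(flaky)
print(result, calls["n"])  # ok 3
```

    In production code you would typically narrow the caught exception types and add jitter to the delay.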
     

    Work conditions:

    - B2B. Remote. Full-time.

    - Competitive salary and a performance-based bonus of up to 10% of the annual salary, paid at the end of the year.

    - Paid vacation (4 weeks / 20 working days) to start, increasing with years of service, plus sick leave.

    - Official Ukrainian public holidays are days off

    - Professional development: company-paid courses and certifications. Successful certification exams are rewarded with several paid days off or a monetary bonus.
     

    Hiring stages:

    - Interview with the recruiter of the recruitment agency ~20–30 min (call recorded)

    - Personality test ~20 min & Cognitive test ~5 min (to be completed on your own) 

    - Technical interview

    - Final interview

    - Offer

    We are a recruitment agency helping our client find a Data Engineer. If you have any questions or would like to know more about the company, feel free to reach out to us.

  • Β· 6 views Β· 1 application Β· 1d

    Head of Cloud PaaS Product (Cloud Platforms / Kubernetes)

    Office Work Β· Ukraine (Kyiv) Β· Product Β· 5 years of experience Β· English - C1
    We are looking for someone who can turn a Kubernetes/OpenShift platform into a full-fledged commercial PaaS product for Enterprise and government customers in Ukraine. This is not an operations role and not a "product by Jira" position. It is a product leadership position for someone who...

    We are looking for someone who can turn a Kubernetes/OpenShift platform into a full-fledged commercial PaaS product for Enterprise and government customers in Ukraine.

    This is not an operations role, nor a "product by Jira" position.
    It is a product leadership role that combines a deep technical understanding of cloud-native platforms with business thinking and the ability to bring complex infrastructure solutions to a real market.

    Your main task is to create and launch a PaaS service based on OpenShift (or an alternative platform) that:

    • becomes a clear and secure alternative to VMware and to part of the hyperscalers' services;
    • is actually used in critical systems: banking, logistics, the public sector;
    • has clear economics, migration scenarios, and a working sales model.

    Areas of responsibility

    Product and strategy

    • Shaping the product vision and roadmap for PaaS services: Container Platform, OpenShift Virtualization, AI/ML tooling.
    • Selecting and evolving the platform (OpenShift or an equivalent), with a focus not on what is fashionable but on scalability, stability, and commercial viability.
    • Defining the value proposition: why it pays for Ukrainian Enterprise and government customers to build PaaS here rather than in AWS/Azure, or to stay on VMware.
    • Developing pricing models: compute, storage, licensing, platform as a service.

    Execution and collaboration with engineering

    • Managing the R&D team's backlog: prioritization, user stories, acceptance of results.
    • Continuously balancing business expectations against the platform's technical reality.
    • Participating in key architectural decisions, not as an executor but as the product arbiter.

    Go-to-Market and customers

    • Participating in pre-sales with key clients: explaining the architecture, scenarios, and risks.
    • Designing migration paths for VMware customers: from VM-oriented thinking to a PaaS model.
    • Training the sales team and partners on how to sell a complex PaaS product without oversimplification or misleading claims.
    • Gathering real customer feedback and turning it into product decisions.

    Who we are looking for

    This role is a good fit if you:

    • have a deep background in Kubernetes / OpenShift / cloud-native architectures;
    • have worked with the PaaS offerings of AWS, GCP, or Azure and understand their strengths and weaknesses;
    • have already been a DevOps Lead, Platform Architect, or Tech Lead and want to influence not only how things work, but what is built and why;
    • can speak the language of architecture with engineers and the language of risk and value with the business.

    Required experience

    • 5+ years in DevOps / platform engineering / administration of high-load systems.
    • Hands-on experience with Kubernetes (OpenShift is a strong advantage).
    • Understanding of CI/CD and Infrastructure as Code (Terraform, Ansible, etc.).
    • Experience with the hyperscalers' cloud PaaS platforms.
    • 2+ years as a Product Owner or Product Manager in a B2B / Cloud / Enterprise environment.
    • English: Upper-Intermediate or higher.

    Why this is a strong opportunity

    • Scale and meaning: you will build a platform that becomes part of the country's critical infrastructure.
    • Real impact: your decisions affect how businesses and the state get through crises, blackouts, and attacks.
    • Freedom to decide: this is not a "maintain the existing" role but the creation of a new product from scratch.
    • Stability: a company with critical-infrastructure status, energy independence, draft deferral, and a social benefits package.
  • Β· 29 views Β· 1 application Β· 1d

    Middle/Senior Data Engineer

    Full Remote Β· Ukraine Β· Product Β· 3 years of experience Β· English - B1
    PrivatBank is the largest bank in Ukraine and one of the most innovative banks in the world. It leads the industry on all financial indicators and accounts for about a quarter of the country's entire banking system. We are looking for a Data Engineer who is eager...

    PrivatBank is the largest bank in Ukraine and one of the most innovative banks in the world. It leads the industry on all financial indicators and accounts for about a quarter of the country's entire banking system.
     

    We are looking for a Data Engineer who wants to work in a dynamic environment and shares the values of mutual trust, openness, and initiative.

    We want to find a goal-oriented professional who can multitask and is focused on quality and results.
     

    Key responsibilities:

    • Building and maintaining a DWH and Data Marts on AWS Redshift
    • Designing, developing, and implementing ETL/ELT processes
    • Developing hooks and operators on top of Airflow
    • Migrating SAP IQ -> Redshift; SQL optimization

       

    Key requirements:

    • 3+ years of experience as a Data Engineer
    • Experience with relational and analytical databases (primarily Amazon Redshift)
    • SQL query optimization for large data volumes
    • Ability to build ELT/ETL processes
    • Knowledge of Java Core or Python
    • Experience with the Git version control system

     

    What we offer our employees:

    • Official employment and 24+4 calendar days of vacation
    • A competitive salary
    • Bonuses in line with company policy
    • Medical insurance and corporate mobile service
    • Corporate training
    • Corporate financial assistance in critical situations
       

    PrivatBank is open to supporting and employing veterans and people with disabilities.

    We find discrimination based on health and physical ability, age, race or ethnicity, gender, or marital status unacceptable.

    We are ready to train veterans and candidates with disabilities who have no prior banking experience.

  • Β· 23 views Β· 2 applications Β· 2d

    Data Engineer

    Full Remote Β· Ukraine Β· Product Β· 4 years of experience Β· English - native MilTech πŸͺ–
    Twist Robotics is a defense product company that develops unmanned aerial vehicles (FPV, copter, and fixed-wing types) and related systems. Our public products include the UAV mission simulator "Obrii" and the Saker Scout copter, which has already...

    Twist Robotics is a defense product company that develops unmanned aerial vehicles (FPV, copter, and fixed-wing types) and related systems. Our public products include the UAV mission simulator "Obrii" and the Saker Scout copter, which has been operating successfully on the front line for more than two years.

     

    We are looking for a Data Engineer to join the work on autonomous control and planning systems for unmanned platforms built around the "Obrii" project. This role sits at the intersection of data, ML infrastructure, and autonomy algorithms, with a focus on building a reliable, scalable system for handling large volumes of sensor and technical data.

     

    Key responsibilities:

    • building and maintaining data pipelines for collecting, processing, and storing sensor data (video, telemetry, IMU, GPS, etc.);
    • organizing and preparing datasets for computer vision and autonomous navigation tasks;
    • supporting ML model training, testing, and validation on the data side;
    • optimizing work with large volumes of logs and technical data;
    • automating data ingestion, preprocessing, and versioning;
    • integrating data workflows into the ML/MLOps infrastructure;
    • ensuring data quality, consistency, and reproducibility;
    • close collaboration with adjacent teams.

     

    What matters to us:

    • 4+ years of commercial experience as a Data Engineer or in an adjacent role in autonomous systems, autopilots, or computer vision;
    • experience with data for autopilots, autonomous navigation, or robotics;
    • understanding of sensor data (cameras, LiDAR, radar, IMU, GPS);
    • experience building scalable data pipelines for ML tasks;
    • strong command of Python and data tooling;
    • understanding of the ML lifecycle and the needs of computer vision teams;
    • experience with compute clusters (CPU/GPU);
    • systems thinking, independence, and ownership of results.

     

    Nice to have:

    • experience with simulation environments;
    • a background in computer vision or perception tasks;
    • understanding of autonomous system architecture;
    • experience with real-time streaming data.

     

    We offer:

    • complex and interesting work on advancing technology;
    • the opportunity to apply your experience and skills to strengthening Ukraine's defense capability;
    • comfortable working conditions and a market-rate salary;
    • remote work, or hybrid work in Lviv;
    • vacation, sick leave, and other social guarantees in line with current labor law;
    • a team that values independence and initiative;
    • draft deferral.
  • Β· 57 views Β· 4 applications Β· 3d

    Data Engineer (Snowflake + Cloud DWH) β€” Nature-based Solutions to $5000

    Full Remote Β· Ukraine Β· 5 years of experience Β· English - B2
    ΠŸΡ€ΠΎ ΠΌΡ–ΡΡ–ΡŽ Ρ‚Π° ΠΊΠ»Ρ–Ρ”Π½Ρ‚Π° Ми Π±ΡƒΠ΄ΡƒΡ”ΠΌΠΎ Ρ…ΠΌΠ°Ρ€Π½Ρƒ ΠΏΠ»Π°Ρ‚Ρ„ΠΎΡ€ΠΌΡƒ Π΄Π°Π½ΠΈΡ… для ΠΎΠ΄Π½ΠΎΠ³ΠΎ Π· Π½Π°ΠΉΠ±Ρ–Π»ΡŒΡˆΠΈΡ… Ρƒ світі environmental nonprofit, який інвСстує Π² лісовідновлСння Ρ‚Π° Nature-based Solutions. Π¦Π΅ Π½Π΅ β€œΡ‡Π΅Ρ€Π³ΠΎΠ²ΠΈΠΉ DWH Π·Π°Ρ€Π°Π΄ΠΈ DWH” β€” Π΄Π°Π½Ρ– Ρ‚ΡƒΡ‚ напряму Π²ΠΏΠ»ΠΈΠ²Π°ΡŽΡ‚ΡŒ Π½Π° Ρ€Ρ–ΡˆΠ΅Π½Π½Ρ, ΠΏΡ€ΠΎΠ·ΠΎΡ€Ρ–ΡΡ‚ΡŒ,...

    ΠŸΡ€ΠΎ ΠΌΡ–ΡΡ–ΡŽ Ρ‚Π° ΠΊΠ»Ρ–Ρ”Π½Ρ‚Π°

    Ми Π±ΡƒΠ΄ΡƒΡ”ΠΌΠΎ Ρ…ΠΌΠ°Ρ€Π½Ρƒ ΠΏΠ»Π°Ρ‚Ρ„ΠΎΡ€ΠΌΡƒ Π΄Π°Π½ΠΈΡ… для ΠΎΠ΄Π½ΠΎΠ³ΠΎ Π· Π½Π°ΠΉΠ±Ρ–Π»ΡŒΡˆΠΈΡ… Ρƒ світі environmental nonprofit, який інвСстує Π² лісовідновлСння Ρ‚Π° Nature-based Solutions. Π¦Π΅ Π½Π΅ β€œΡ‡Π΅Ρ€Π³ΠΎΠ²ΠΈΠΉ DWH Π·Π°Ρ€Π°Π΄ΠΈ DWH” β€” Π΄Π°Π½Ρ– Ρ‚ΡƒΡ‚ напряму Π²ΠΏΠ»ΠΈΠ²Π°ΡŽΡ‚ΡŒ Π½Π° Ρ€Ρ–ΡˆΠ΅Π½Π½Ρ, ΠΏΡ€ΠΎΠ·ΠΎΡ€Ρ–ΡΡ‚ΡŒ, Π·Π²Ρ–Ρ‚Π½Ρ–ΡΡ‚ΡŒ Ρ– ΠΌΠ°ΡΡˆΡ‚Π°Π±ΡƒΠ²Π°Π½Π½Ρ ΠΏΡ€ΠΎΠ³Ρ€Π°ΠΌ.

     

    ΠŸΡ€ΠΎ Ρ–Π½Ρ–Ρ†Ρ–Π°Ρ‚ΠΈΠ²Ρƒ

    ΠŸΡ€ΠΎΠ΅ΠΊΡ‚ DWH β€” Ρ†Π΅ сучасна cloud data platform, Π΄Π΅ Snowflake β€” ΠΊΠ»ΡŽΡ‡ΠΎΠ²Π° Π°Π½Π°Π»Ρ–Ρ‚ΠΈΡ‡Π½Π° Π±Π°Π·Π°, Π° Ρ–Π½ΠΆΠ΅Π½Π΅Ρ€Π½Π° ΠΊΠΎΠΌΠ°Π½Π΄Π° ΡΡ‚Π²ΠΎΡ€ΡŽΡ” single source of truth Π· Ρ€Ρ–Π·Π½ΠΈΡ… систСм ΠΊΠ»Ρ–Ρ”Π½Ρ‚Π°. 

    ΠŸΡ€Π°Ρ†ΡŽΡ”ΠΌΠΎ ΠΊΠΎΡ€ΠΎΡ‚ΠΊΠΈΠΌΠΈ Ρ€Π΅Π»Ρ–Π·Π°ΠΌΠΈ 1–3 місяці, Π· Ρ‡Ρ–Ρ‚ΠΊΠΈΠΌΠΈ SLA, ΠΏΡ€Ρ–ΠΎΡ€ΠΈΡ‚ΠΈΠ·Π°Ρ†Ρ–Ρ”ΡŽ Ρ‚Π° Ρ€Π΅Π°Π»ΡŒΠ½ΠΈΠΌ ownership. 

    Π—Π°Π·Π²ΠΈΡ‡Π°ΠΉ Ρƒ Ρ€ΠΎΠ±ΠΎΡ‚Ρ– ΠΏΠ°Ρ€Π°Π»Π΅Π»ΡŒΠ½ΠΎ 1–2 ΠΏΠΎΡ‚ΠΎΠΊΠΈ Π·Π°Π΄Π°Ρ‡ β€” Π²Π°ΠΆΠ»ΠΈΠ²Ρ– комунікація Ρ‚Π° ΠΏΡ€ΠΎΠ³Π½ΠΎΠ·ΠΎΠ²Π°Π½Ρ–ΡΡ‚ΡŒ.

     

    Π’Π°ΠΆΠ»ΠΈΠ²ΠΎ: ΠΌΠΈ ΡˆΡƒΠΊΠ°Ρ”ΠΌΠΎ сильного data engineer Π· Ρ€Π΅Π°Π»ΡŒΠ½ΠΈΠΌ досвідом Snowflake + DWH/ETL. Досвід Azure Π±Π°ΠΆΠ°Π½ΠΈΠΉ, Π°Π»Π΅ Π½Π΅ обов’язковий, якщо Ρ” сильна Π±Π°Π·Π° ΠΉ Π³ΠΎΡ‚ΠΎΠ²Π½Ρ–ΡΡ‚ΡŒ швидко Π΄ΠΎΠ±Ρ€Π°Ρ‚ΠΈ Azure-частину.

     

    Π’Π΅Ρ…Π½ΠΎΠ»ΠΎΠ³Ρ–Ρ— (core)

    • DWH / ELT: Snowflake (SQL, modeling, performance basics)
    • Orchestration / ingestion: Azure Data Factory + Azure Functions (Python) (Π°Π±ΠΎ Π°Π½Π°Π»ΠΎΠ³ΠΈ β€” якщо Π²ΠΌΡ–Ρ”Ρˆ пСрСносити ΠΏΡ–Π΄Ρ…Ρ–Π΄)
    • CI/CD: Azure DevOps pipelines + автоматизація дСплою Snowflake-об’єктів (DL Sync/Π°Π½Π°Π»ΠΎΠ³)
    • Π”ΠΆΠ΅Ρ€Π΅Π»Π°: CRM/ERP (D365/Salesforce), MS SQL, API-Ρ–Π½Ρ‚Π΅Π³Ρ€Π°Ρ†Ρ–Ρ—, CDC-ΠΊΠΎΠ½Ρ†Π΅ΠΏΡ‚ΠΈ
    • Π€ΠΎΡ€ΠΌΠ°Ρ‚ΠΈ: JSON, Parquet
    • Команда: Lead Data Engineering / Tech Lead, PM, DevOps, QA. BI team Π½Π° стороні ΠΏΠ°Ρ€Ρ‚Π½Π΅Ρ€Π°.

       

    Π§ΠΈΠΌ Ρ‚ΠΈ Π±ΡƒΠ΄Π΅Ρˆ Π·Π°ΠΉΠΌΠ°Ρ‚ΠΈΡΡŒ

    • ΠŸΡ€ΠΎΡ”ΠΊΡ‚ΡƒΠ²Π°Ρ‚ΠΈ Ρ‚Π° Π±ΡƒΠ΄ΡƒΠ²Π°Ρ‚ΠΈ Ρ–Π½ΠΊΡ€Π΅ΠΌΠ΅Π½Ρ‚Π°Π»ΡŒΠ½Ρ– Ρ‚Π° full-refresh ELT ΠΏΠ°ΠΉΠΏΠ»Π°ΠΉΠ½ΠΈ (ADF + Functions β†’ Snowflake).
    • Розробляти Ρ‚Π° ΠΎΠΏΡ‚ΠΈΠΌΡ–Π·ΡƒΠ²Π°Ρ‚ΠΈ Snowflake SQL для DWH Ρ– Π²Ρ–Ρ‚Ρ€ΠΈΠ½ (Star schema, incremental patterns, Π±Π°Π·ΠΎΠ²ΠΈΠΉ SCD2).
    • ΠŸΠΈΡΠ°Ρ‚ΠΈ production-ready Python для ingestion/ΠΎΠ±Ρ€ΠΎΠ±ΠΊΠΈ/API (pagination, retries, error handling, логування, ΠΊΠΎΠ½Ρ„Ρ–Π³ΠΈ, secrets).
    • Π’Π±ΡƒΠ΄ΠΎΠ²ΡƒΠ²Π°Ρ‚ΠΈ data quality: freshness, completeness, duplicates, late-arriving data, ΠΊΠΎΠ½Ρ‚Ρ€ΠΎΠ»ΡŒ Ρ–Π½ΠΊΡ€Π΅ΠΌΠ΅Π½Ρ‚Ρ–Π².
    • ΠŸΡ–Π΄Ρ‚Ρ€ΠΈΠΌΡƒΠ²Π°Ρ‚ΠΈ CI/CD Ρ‚Π° ΠΏΡ€ΠΎΠΌΠΎΡƒΡ‚ Π·ΠΌΡ–Π½ ΠΌΡ–ΠΆ dev/test/prod.
    • ΠŸΡ€Π°Ρ†ΡŽΠ²Π°Ρ‚ΠΈ ΠΏΡ€ΠΎΠ·ΠΎΡ€ΠΎ ΠΉ ΠΏΠ΅Ρ€Π΅Π΄Π±Π°Ρ‡ΡƒΠ²Π°Π½ΠΎ: status β†’ risk β†’ options β†’ next step.
    • По готовності (Π· часом), ΠΏΠ΅Ρ€Π΅ΠΉΠΌΠ΅Ρˆ Π½Π° сСбС всю Π²Ρ–Π΄ΠΏΠΎΠ²Ρ–Π΄Π°Π»ΡŒΠ½Ρ–ΡΡ‚ΡŒ Π·Π° успіх ΠΏΡ€ΠΎΠ΅ΠΊΡ‚Ρƒ Ρ‚Π° ΠΊΠΎΠΌΡƒΠ½Ρ–ΠΊΠ°Ρ†Ρ–Ρ— Π· ΠΏΠ°Ρ€Ρ‚Π½Π΅Ρ€ΠΎΠΌ.
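    The ingestion duties above include paginated API pulls with retries. A minimal sketch of the pagination loop, where `fetch_page(offset, limit)` is a hypothetical client call standing in for an ADF/Functions HTTP ingestion step:

```python
def ingest_all(fetch_page, page_size=2):
    """Drain a paginated source: keep requesting pages until a short
    page signals the end. fetch_page(offset, limit) is a stand-in for
    a real API client (retries/auth omitted for brevity)."""
    rows, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        rows.extend(page)
        if len(page) < page_size:  # short page => last page
            break
        offset += page_size
    return rows

# Fake source of 5 records, served 2 at a time.
data = list(range(5))
def fetch_page(offset, limit):
    return data[offset:offset + limit]

ingested = ingest_all(fetch_page)
print(ingested)  # [0, 1, 2, 3, 4]
```

    Real sources often paginate by cursor token rather than offset; the loop shape is the same, only the continuation condition changes.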

       

    Who we are looking for (Must-have)

    • 5+ years in Data Engineering / DWH / ETL.
    • Snowflake SQL in production: CTEs, window functions, COPY INTO, MERGE.
    • Incremental load (watermarks / merge pattern) + basic SCD2 (effective dating, current flag).
    • Production-level Python: API integrations (pagination/retries/errors), logging, configs, secrets, preparing data for ELT (validation/normalization).
    • Understanding of DWH modeling (fact/dimension, star schema, joins and their performance implications).
    • Experience with migrations, large data volumes, or complex sources is a plus.
    • Desirable: basic familiarity with OpenMetadata (or similar catalog/lineage tools) and data governance.

    Azure experience (ADF/Functions) is desirable, but if you are strong in Snowflake + DWH/ETL and have a cloud background, we are ready to consider you and bring the Azure side up to speed during onboarding.
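    The SCD2 must-have (effective dating, current flag) can be sketched in a few lines. Using SQLite as a stand-in for Snowflake's MERGE, the pattern is: close the current row and open a new one whenever the tracked attribute changes:

```python
import sqlite3

def scd2_upsert(conn, key, value, as_of):
    """Minimal SCD2: if the attribute changed, expire the current row
    (is_current = 0, valid_to set) and insert a new current row."""
    cur = conn.execute(
        "SELECT value FROM dim WHERE key = ? AND is_current = 1", (key,)
    ).fetchone()
    if cur and cur[0] == value:
        return  # no change, nothing to version
    with conn:
        conn.execute(
            "UPDATE dim SET is_current = 0, valid_to = ? "
            "WHERE key = ? AND is_current = 1",
            (as_of, key),
        )
        conn.execute(
            "INSERT INTO dim (key, value, valid_from, valid_to, is_current) "
            "VALUES (?, ?, ?, NULL, 1)",
            (key, value, as_of),
        )

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim (key TEXT, value TEXT, valid_from TEXT, "
    "valid_to TEXT, is_current INTEGER)"
)
scd2_upsert(conn, "acct-1", "bronze", "2024-01-01")
scd2_upsert(conn, "acct-1", "gold", "2024-06-01")
rows = conn.execute(
    "SELECT value, is_current FROM dim ORDER BY valid_from"
).fetchall()
print(rows)  # [('bronze', 0), ('gold', 1)]
```

    In Snowflake itself this is typically a single MERGE statement per batch; the table and column names here are illustrative.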

     

    Nice-to-have

    • PySpark (DataFrame API, joins, aggregations; understanding of distributed processing).
    • Experience with D365/Salesforce, MS SQL, API ingestion, CDC patterns.
    • Data governance/security basics, experience with observability.

       

    How our selection process works (honest and practical)

    *Stage 1: Intro + short tech screen (40–60 min, Zoom):
    project context, expectations, a short SQL/Python scenario, plus questions on communication and your approach to tasks.

    *Stage 2: Technical deep-dive (75–90 min, with the Tech Lead and an engineer):
    live SQL (CTE/window + incremental/SCD2), a pipeline/integration case, your approach to quality and deployment.
    No test task: we simulate real work during the interview.

     

    What we offer

    • Compensation from $4500 (discussed based on level and expectations).
    • A mission-driven project where data has real impact.
    • A strong environment: Tech Lead, architecture practices, mature delivery.
    • Flexible setup: remote / abroad / office (optional).
    • Draft deferral if needed (subject to quotas and process compliance).
    • Equipment and home-office support (by agreement).
    • 36 paid days off per year: 20 vacation days + UA public holidays (and related days off, as applicable).
    • Monthly cafeteria benefit of $25 (learning, mental health support, etc.).
    • Performance reviews: ongoing feedback, compensation review after 12 months, then annually.
    • Paid sabbatical after 5 years (we believe in healthy, long-term relationships).

     

    Ready to apply?

    Send your CV or LinkedIn profile, and briefly note in your message:

    1. which Snowflake projects you have run in production,
    2. whether you have ETL/DWH modeling + incremental load experience,
    3. what you have used for orchestration/ingestion (ADF/Airflow/other).

    Glory to Ukraine and to the resilience of our nation!

  • Β· 29 views Β· 6 applications Β· 4d

    Data Engineer

    Full Remote Β· Countries of Europe or Ukraine Β· Product Β· 5 years of experience Β· English - B1
    We are looking for an experienced Data Engineer who can and wants to shape the architecture, approaches, and evolution of our data platform. What you will do: design and evolve ETL/ELT pipelines for financial and transactional data; build and optimize analytical...

    We are looking for an experienced Data Engineer who can and wants to shape the architecture, approaches, and evolution of our data platform.

    What you will do:

    • Design and evolve ETL/ELT pipelines for financial and transactional data
    • Build and optimize analytical models in StarRocks for high load and complex aggregations
    • Work with data flows in batch and near-real-time scenarios, balancing speed, cost, and data freshness
    • Optimize latency, concurrency, cost, and data freshness against business and regulatory requirements
    • Improve data quality, consistency, and manageability by establishing Data Quality and Data Governance practices
    • Together with the team, shape the DWH and BI architecture, gradually modernizing legacy without interrupting existing processes

    Who we are looking for:
    β€’ A Data Engineer with OLAP experience
    (StarRocks is the main platform, or readiness to master it quickly)
    β€’ Strong SQL and experience optimizing complex analytical queries
    β€’ Understanding of data warehouse design and analytical modeling
    β€’ Experience with ETL tools (Airflow, dbt, or equivalents)
    β€’ Basic knowledge of Python or another language for data processing
    β€’ An engineering mindset, the ability to work with uncertainty and make technical decisions in a complex financial domain

    A big plus:

    • Experience with data streaming (Kafka, Flink, CDC)
    • Experience with Parquet and lakehouse approaches
    • Practice optimizing large data volumes and high-load analytical systems
    • Experience with BI tools or building a BI layer
    • Experience with cloud infrastructure (AWS or equivalents)
  • Β· 63 views Β· 9 applications Β· 4d

    Python Engineer (Airflow)

    Full Remote Β· Ukraine Β· Product Β· 2 years of experience Β· English - None Ukrainian Product πŸ‡ΊπŸ‡¦
    Not just code, but your impact on millions of Ukrainians. Nova Digital is the technology heart of the NOVA ecosystem, where your code becomes part of the daily life of an entire country. The scale of our services: β€’ 50+ million requests pass through our systems every day β€’ 10+ million...

    НС просто ΠΊΠΎΠ΄, Π° Ρ‚Π²Ρ–ΠΉ Π²ΠΏΠ»ΠΈΠ² Π½Π° ΠΌΡ–Π»ΡŒΠΉΠΎΠ½ΠΈ ΡƒΠΊΡ€Π°Ρ—Π½Ρ†Ρ–Π².

    Nova Digital β€” Ρ†Π΅ Ρ‚Π΅Ρ…Π½ΠΎΠ»ΠΎΠ³Ρ–Ρ‡Π½Π΅ сСрцС СкосистСми NOVA, Π΄Π΅ Ρ‚Π²Ρ–ΠΉ ΠΊΠΎΠ΄ стає Ρ‡Π°ΡΡ‚ΠΈΠ½ΠΎΡŽ Ρ‰ΠΎΠ΄Π΅Π½Π½ΠΎΠ³ΠΎ Тиття Ρ†Ρ–Π»ΠΎΡ— ΠΊΡ€Π°Ρ—Π½ΠΈ.

    ΠŸΡ€ΠΎ ΠΌΠ°ΡΡˆΡ‚Π°Π± Π½Π°ΡˆΠΈΡ… сСрвісів:
    β€’ 50+ ΠΌΠ»Π½ Π·Π°ΠΏΠΈΡ‚Ρ–Π² щодня ΠΏΡ€ΠΎΡ…ΠΎΠ΄ΡΡ‚ΡŒ Ρ‡Π΅Ρ€Π΅Π· Π½Π°ΡˆΡ– систСми
    β€’ 10+ ΠΌΠ»Π½ Π°ΠΊΡ‚ΠΈΠ²Π½ΠΈΡ… користувачів ΠΏΠΎΠΊΠ»Π°Π΄Π°ΡŽΡ‚ΡŒΡΡ Π½Π° Π½Π°ΡˆΡ– Ρ€Ρ–ΡˆΠ΅Π½Π½Ρ
    Ми Π±ΡƒΠ΄ΡƒΡ”ΠΌΠΎ Π½Π΅ просто ΠΏΡ€ΠΎΠ΄ΡƒΠΊΡ‚ β€” ΠΌΠΈ ΡΡ‚Π²ΠΎΡ€ΡŽΡ”ΠΌΠΎ Ρ‚Π΅Ρ…Π½ΠΎΠ»ΠΎΠ³Ρ–Ρ‡Π½Ρƒ інфраструктуру, яка об’єднує людСй, бізнСси Ρ‚Π° моТливості Π²ΠΆΠ΅ ΠΌΠ°ΠΉΠΆΠ΅ ΠΏΠΎ Ρ†Ρ–Π»ΠΎΠΌΡƒ світу.

    What you will do as a Data Engineer (Airflow):

    β€’ Design, optimize, and maintain the Apache Airflow cluster and the Python code for data orchestration
    β€’ Develop the logic for and maintain ETL processes (data collection, transformation, loading)
    β€’ Design integrations between systems; build services for integration and data exchange
    β€’ Optimize the performance of data-processing workflows; identify and eliminate bottlenecks
    β€’ Collaborate with analysts and developers to ensure high-quality data delivery
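    The ETL cycle in the responsibilities above (collect, transform, load) can be sketched as a minimal, self-contained Python example; the tables, column names, and the kopiykas-to-hryvnias conversion are invented for illustration and are not part of the posting:

    ```python
    import sqlite3

    def run_etl(conn: sqlite3.Connection) -> int:
        """Hypothetical ETL step: raw_orders -> orders_clean."""
        cur = conn.cursor()
        # Extract: read raw order rows.
        rows = cur.execute("SELECT id, amount FROM raw_orders").fetchall()
        # Transform: drop non-positive amounts, convert kopiykas to hryvnias.
        cleaned = [(i, amt / 100) for i, amt in rows if amt > 0]
        # Load: write the cleaned rows into the analytics table.
        cur.executemany("INSERT INTO orders_clean VALUES (?, ?)", cleaned)
        conn.commit()
        return len(cleaned)

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (id INTEGER, amount INTEGER)")
    conn.execute("CREATE TABLE orders_clean (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                     [(1, 2500), (2, -100), (3, 999)])
    print(run_etl(conn))  # prints 2: two rows survived the transform
    ```

    In a production Airflow setup each of the three phases would typically be a task in a DAG rather than one function, but the data flow is the same.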


    Your ideal profile:
    Must-have:

    • ΠŸΡ€ΠΎΡ„Π΅ΡΡ–ΠΉΠ½ΠΈΠΉ досвід Ρ€ΠΎΠ±ΠΎΡ‚ΠΈ Π· Apache Airflow: створСння DAG’ів, Π½Π°Π»Π°ΡˆΡ‚ΡƒΠ²Π°Π½Π½Ρ ΠΎΠΏΠ΅Ρ€Π°Ρ‚ΠΎΡ€Ρ–Π², ΠΌΠΎΠ½Ρ–Ρ‚ΠΎΡ€ΠΈΠ½Π³ Ρ‚Π° оптимізація продуктивності
    • Досвід Π½Π΅ мСншС 3 Ρ€ΠΎΠΊΡ–Π² Ρƒ Ρ€ΠΎΠ»Ρ– Python-Ρ€ΠΎΠ·Ρ€ΠΎΠ±Π½ΠΈΠΊΠ° Π· Π°ΠΊΡ†Π΅Π½Ρ‚ΠΎΠΌ Π½Π° Π±Π΅ΠΊΠ΅Π½Π΄ Ρ‚Π° Ρ–Π½Ρ‚Π΅Π³Ρ€Π°Ρ†Ρ–ΠΉΠ½Ρ– Ρ€Ρ–ΡˆΠ΅Π½Π½Ρ
    • ΠŸΡ€Π°ΠΊΡ‚ΠΈΡ‡Π½Ρ– Π½Π°Π²ΠΈΡ‡ΠΊΠΈ Ρ€ΠΎΠ±ΠΎΡ‚ΠΈ Π· Bash, Docker, DBT, GitLab CI/CD (автоматизація процСсів, Π΄Π΅ΠΏΠ»ΠΎΠΉ, оптимізація ΠΏΠ°ΠΉΠΏΠ»Π°ΠΉΠ½Ρ–Π²)
    • Досвід Ρ€ΠΎΠ±ΠΎΡ‚ΠΈ Π· Π±Π°Π·Π°ΠΌΠΈ Π΄Π°Π½ΠΈΡ… MS SQL Server Ρ‚Π° PostgreSQL (написання складних SQL-Π·Π°ΠΏΠΈΡ‚Ρ–Π², оптимізація, створСння ΠΏΡ€ΠΎΡ†Π΅Π΄ΡƒΡ€ Ρ– Ρ„ΡƒΠ½ΠΊΡ†Ρ–ΠΉ)

    Nice-to-have:

    β€’ Experience with Apache Spark (distributed processing of large datasets)
    β€’ Knowledge of other Big Data tools or cloud platforms (AWS/GCP/Azure)
    β€’ Understanding of DataOps and CI/CD principles in data management

    Why experts across disciplines work with us:
    Technologically:

    β€’ High-load systems with real scaling challenges
    β€’ A modern stack and freedom in technology choices
    β€’ The opportunity to shape the architecture of products used by millions

    ΠŸΡ€ΠΎΡ„Π΅ΡΡ–ΠΉΠ½ΠΎ:
     

    • ΠœΠ΅Π½Ρ‚ΠΎΡ€ΡΡ‚Π²ΠΎ Π²Ρ–Π΄ ΡΠ΅Π½ΡŒΠΉΠΎΡ€Ρ–Π² Π· Π΄ΠΎΡΠ²Ρ–Π΄ΠΎΠΌ enterprise-Ρ€Ρ–ΡˆΠ΅Π½ΡŒ
    • Π Π°Π·ΠΎΠΌ Π±ΡƒΠ΄ΡƒΡ”ΠΌΠΎ Ρ–Π½Π½ΠΎΠ²Π°Ρ†Ρ–Ρ—, Ρ‰ΠΎ ΠΏΡ€Π°Ρ†ΡŽΡŽΡ‚ΡŒ для ΠΌΡ–Π»ΡŒΠΉΠΎΠ½Ρ–Π²

    In spirit:

    β€’ A product that changes the country, not just another startup
    β€’ Teams here become drivers of technological change

    Benefits:

    • ΠšΠΎΠΌΠΏΠ΅Π½ΡΠ°Ρ†Ρ–Ρ Π΄ΠΎΠ΄Π°Ρ‚ΠΊΠΎΠ²ΠΈΡ… Π²ΠΈΡ‚Ρ€Π°Ρ‚, пов’язаних Π· Ρ€ΠΎΠ±ΠΎΡ‡ΠΈΠΌΠΈ завданнями, Π·Π΄Ρ–ΠΉΡΠ½ΡŽΡŽΡ‚ΡŒΡΡ Π΄ΠΎ Π²Π½ΡƒΡ‚Ρ€Ρ–ΡˆΠ½Ρ–Ρ… ΠΏΠΎΠ»Ρ–Ρ‚ΠΈΠΊ ΠΊΠΎΠΌΠΏΠ°Π½Ρ–ΠΉ
    • Π”ΠΎΠ±Ρ€ΠΎΠ²Ρ–Π»ΡŒΠ½Π΅ ΠΌΠ΅Π΄ΠΈΡ‡Π½Π΅ страхування Ρ‚Π° ΡΡ‚рахування Тиття
    • ΠšΠΎΡ€ΠΏΠΎΡ€Π°Ρ‚ΠΈΠ²Π½Ρ– Π·Π½ΠΈΠΆΠΊΠΈ Π²Ρ–Π΄ ΠΏΠ°Ρ€Ρ‚Π½Π΅Ρ€Ρ–Π² Nova
    • ΠŸΡ–Π΄Ρ‚Ρ€ΠΈΠΌΠΊΠ° ΠΌΠ΅Π½Ρ‚Π°Π»ΡŒΠ½ΠΎΠ³ΠΎ здоровʼя. ΠœΠΎΠΆΠ»ΠΈΠ²Ρ–ΡΡ‚ΡŒ Π±Π΅Π·ΠΎΠΏΠ»Π°Ρ‚Π½ΠΈΡ… ΠΊΠΎΠ½ΡΡƒΠ»ΡŒΡ‚Π°Ρ†Ρ–ΠΉ Π· ΠΊΠΎΡ€ΠΏΠΎΡ€Π°Ρ‚ΠΈΠ²Π½ΠΈΠΌ психологом
    • Π‘ΠΎΡ†Ρ–Π°Π»ΡŒΠ½Π° Π²Ρ–Π΄ΠΏΠΎΠ²Ρ–Π΄Π°Π»ΡŒΠ½Ρ–ΡΡ‚ΡŒ Ρ‚Π° Π²ΠΎΠ»ΠΎΠ½Ρ‚Π΅Ρ€ΡΡŒΠΊΠΈΠΉ Ρ€ΡƒΡ… Β«Π‘Π²ΠΎΡ— для своїх»
    • ΠšΠ»ΡƒΠ±ΠΈ Π·Π° Ρ–нтСрСсами: Π±Ρ–Π³ΠΎΠ²ΠΈΠΉ, ΡˆΠ°Ρ…ΠΎΠ²ΠΈΠΉ Ρ‚Π° Ρ–Π½ΡˆΡ– Ρ–Π½Ρ–Ρ†Ρ–Π°Ρ‚ΠΈΠ²ΠΈ для натхнСння
    • Π‘ΠΎΡ†Ρ–Π°Π»ΡŒΠ½Ρ– відпустки Ρ‚Π° ΠΊΠΎΠΌΠΏΠ΅Π½ΡΠ°Ρ†Ρ–Ρ— Π²Ρ–Π΄ΠΏΠΎΠ²Ρ–Π΄Π½ΠΎ Π΄ΠΎ ΠšΠ—ΠΏΠŸ Π£ΠΊΡ€Π°Ρ—Π½ΠΈ

    The next step is yours. We look forward to meeting you; all that's left is for us to receive your resume.

    Expected response time: up to 10 business days.