Jobs at Djinni

  • 94 views · 20 applications · 4d

    Senior Software Engineer (Web)

    Hybrid Remote · Ukraine · 4 years of experience · B2 - Upper Intermediate

    Job Description

    We're excited if you have

    • 4+ years of web development experience building consumer-facing websites and backend services using REST APIs
    • Good culture fit and excellent team player
    • Strong knowledge of JavaScript and Node.js (Express.js)
    • Solid experience with ReactJS, Jest, Cypress
    • Experience with Material UI would be a plus
    • Knowledge of the latest open standards and web technologies
    • Ability to decompose complex problems into manageable units of work
    • Experience with AWS would be a plus
    • BS or MS degree in Computer Science or equivalent
    • Strong verbal and written communication skills in English
    • Plus: familiar with e-commerce websites and trends

     

    Job Responsibilities

    What you'll be doing

    • Build innovative features in the Web Commerce and Payments areas (subscriptions, invoices, transactions, refunds, promos, etc.)
    • Code, test, document and deliver features with cross-browser compatibility, accessibility and search engine optimization
    • Enhance existing and create new features for Node.js backend
    • Participate in the Agile development process, including scoping, technical design, effort estimation, coding, testing, debugging, code reviews, maintenance and support
    • Collaborate with program managers, marketing, designers, software and QA engineers to build and enhance both UI and backend
    • Innovate and create the best user experience online

     

    Department/Project Description

    The Client is the #1 TV streaming platform in the U.S., with tens of millions of customers. They pioneered TV streaming in the U.S. Their mission is to be the TV streaming platform that connects the entire TV ecosystem.

    The project is an e-commerce website related to selling consumer equipment for video streaming services.

  • 145 views · 19 applications · 4d

    DevOps Trainee

    Hybrid Remote · Ukraine · B1 - Intermediate

    Job Description

    Required skills:
    - Understanding of networking basics
    - Linux experience
    - Basic understanding of CI/CD processes and DevOps concepts
    - Basic knowledge of containerization and virtualization (Docker)
    - Basic knowledge of cloud providers (AWS/GCP)
    - Experience with Git (GitHub/GitLab)
    - Infrastructure as Code (Terraform)
    - Basic knowledge of SQL and NoSQL databases
    - Basic knowledge of a scripting language (Bash/Python)
    - Good communication skills

     

    Job Responsibilities

    Identify and implement automation strategies in the SDLC process that enable high-quality and fast delivery of new solutions to our business users.

    - Manage and optimize infrastructure and deployment workflows.
    - Learn and contribute to the implementation of CI/CD.
    - Participate in troubleshooting and resolving issues.
    - Participate in vendor software evaluations and integration strategies.
    - Champion our CI/CD agile practices across all of the development teams.
    - Create and maintain functional / technical design specifications and solutions to satisfy project requirements.
    - Develop infrastructure as code to meet the needs of our platforms.
    - Monitor system and application performance and troubleshoot / resolve escalated issues.
    - Be part of a high-performing Agile / Continuous Integration engineering practice.
    - Continually seek ways to optimize and improve all operational aspects of our technical solutions.

     

    Department/Project Description

    We are looking for an innovative, results-oriented, and passionate DevOps trainee. You will be working with a team of like-minded engineers to design, develop, and implement cutting-edge consumer desktop, web, and mobile applications. We are building an infrastructure platform, and the Platform Engineering team is responsible for designing and writing Infrastructure-as-Code and nurturing a DevOps culture by providing tools and training for development teams. This role will be key in building cloud-based, fault-tolerant, highly available, high-performing applications that provide a seamless experience to our development teams and our end users.

  • 49 views · 5 applications · 4d

    Team Lead Recruitment

    Full Remote · Countries of Europe or Ukraine · Product · 6 years of experience · B2 - Upper Intermediate

    Team Lead Recruitment (Remote)

    We are a fintech product company building white-label and SaaS solutions for EMI, PSP, e-commerce, and open banking. As a remote-first and rapidly growing company, we are looking for a Team Lead Recruitment to strengthen our Talent Acquisition function and lead our team of recruiters.

    Role Overview

    You will lead, mentor, and develop a team of recruiters, optimize recruitment processes, work closely with hiring managers, and personally manage key vacancies. Focus areas: leadership, strategy, delivery, and quality of hiring.

    Key Responsibilities

    • Manage, mentor, and develop a team of 5+ recruiters
    • Set goals, track performance, and provide feedback
    • Drive sourcing strategies and optimize recruitment processes
    • Close key roles, including management and C-level positions
    • Collaborate with hiring managers to define requirements and expectations
    • Track and use recruitment metrics (time-to-hire, quality-of-hire, pipeline health)
    • Implement best practices, tools, and automation
    • Ensure excellent candidate experience and internal client satisfaction
    • Use analytics to improve hiring efficiency and generate actionable insights

    Requirements

    • 6+ years in recruitment, including 2+ years as Team Lead or Recruitment Manager
    • Experience managing a team of 5+ recruiters
    • Proven track record closing management & C-level roles
    • Experience recruiting in at least two of: IT, product companies, outsourcing/outstaffing
    • Experience building or improving recruitment processes
    • Strong sourcing, assessment, and communication skills
    • Experience with recruitment analytics and metrics
    • Ability to work in a fast-paced, dynamic environment
    • Remote-first experience is a plus
    • Fluent Ukrainian, Russian, and English

    Recruitment Process

    1. Application screening
    2. Prescreening call (60 min)
    3. Technical interview (60 min)
    4. Offer + background check

    We Offer

    • Competitive compensation
    • 19 business days PTO
    • Fully remote work, flexible schedule (CET-friendly)
    • Supportive culture & growth opportunities
    • Participation in conferences and industry events
    • Efficient matrix structure with minimal bureaucracy

    If you're ready to lead a strong recruitment team and shape the company's hiring strategy, apply now!

  • 72 views · 1 application · 4d

    Senior Grafana Developer

    Full Remote · Ukraine · 5 years of experience · B2 - Upper Intermediate

    We are seeking a Grafana Developer to design, build, and maintain interactive dashboards and observability solutions for enterprise-level environments. The ideal candidate will have strong experience in Grafana, Prometheus, and related monitoring and visualization tools, with the ability to connect to various data sources (e.g., Snowflake, PostgreSQL, Elasticsearch, InfluxDB, Azure Monitor, or Mulesoft metrics).

    The role involves close collaboration with platform, DevOps, and integration teams to deliver actionable insights and performance monitoring dashboards for complex distributed systems.

    Responsibilities:

    • Design and develop Grafana dashboards to visualize system health, performance, and business KPIs.
    • Integrate Grafana with various data sources such as Prometheus, Loki, Elastic, Snowflake, or SQL-based stores.
    • Configure and manage Grafana plugins, alerts, and notifications across different environments.
    • Develop custom queries (e.g., PromQL, SQL) to extract and visualize relevant metrics.
    • Work with DevOps and Cloud teams to implement observability best practices and automated dashboard deployments via CI/CD pipelines.
    • Create and maintain documentation for dashboards, data models, and metric definitions.
    • Troubleshoot performance issues and optimize dashboard load times and data queries.
    • Collaborate with architecture and platform teams to ensure monitoring alignment with SLIs, SLOs, and SLAs.

    Mandatory Skills:

    • 4+ years of hands-on experience with Grafana (Grafana Cloud, OSS, or Enterprise).
    • Practical experience with Python.
    • Strong skills in PromQL, SQL, or Elastic queries.
    • Experience integrating Grafana with Prometheus, Loki, Snowflake, or other time-series databases.
    • Familiarity with alerting and notification channels (Slack, Teams, PagerDuty, email, etc.).
    • Knowledge of Grafana provisioning (JSON models, configuration-as-code).
    • Experience with CI/CD and infrastructure-as-code (e.g., Terraform, GitHub Actions, Jenkins).
    • Understanding of monitoring frameworks, metrics collection, and log aggregation.
    • Experience working in cloud environments (Azure).

    Nice-to-Have Skills:

    • Experience with Mulesoft, Confluent Kafka, or API monitoring.
    • Familiarity with Azure Monitor, App Insights, or Dynatrace.
    • Basic scripting knowledge (Golang, Bash, or PowerShell).
    • Exposure to DevOps or SRE practices.

  • 20 views · 3 applications · 4d

    Senior Marketing Automation Specialist

    Full Remote · EU · 4 years of experience · B2 - Upper Intermediate

    N-iX is looking for a Senior Marketing Automation Specialist to join the fast-growing team on one of our projects! Our customer is the European online car market with over 30 million monthly users and a market presence in 18 countries. As a Senior Marketing Automation Specialist, you will play a pivotal role in shaping the future of online car markets and enhancing the user experience for millions of car buyers and sellers.

    This role ensures day-to-day platform reliability, supports scalable customer engagement across multiple brands and markets, and contributes to the continuous development of our omni-channel automation capabilities. The position combines technical execution, platform administration, data-driven troubleshooting, and close collaboration with CRM Managers, Product, Data, and Engineering teams.
     

    Responsibilities:

    • Ensure operational continuity and performance of the MarTech stack (Iterable, Salesforce, Marketing Cloud, supporting integrations with Sales and Service Cloud, Litmus, and more).
    • Monitor platform health, journey execution, error rates, and data synchronization issues across channels (Email, Push, In-App, WhatsApp).
    • Administer users, roles, permissions, and configuration settings across tools.
    • Enable CRM Managers, Campaign Managers, and Data Delivery teams through training, documentation, and hands-on onboarding to platform capabilities.
    • Provide day-to-day troubleshooting and solution guidance on segmentation, templates, content automation, journeys, and campaign execution.
    • Support data flow configuration and maintenance across CRM systems, AWS/Databricks pipelines, audience sync tools (e.g., Mediarithmics), and CMS integrations (e.g., Contentful).
    • Contribute to automation initiatives enabling highly personalized, real-time, multi-channel communication journeys for B2C and B2B audiences.
    • Assist in monitoring and improving data synchronization reliability across systems.
    • Maintain and evolve template libraries, modular email components, and reusable assets.
    • Diagnose and resolve issues related to journey failures, template rendering, deliverability, and API-driven components.
       

    Requirements: 

    • Minimum 4 years of experience in marketing automation, CRM engineering, or MarTech operations.
    • Hands-on experience with at least one major automation platform (Iterable, Salesforce, Marketing Cloud, Hubspot, or equivalent).
    • Proven expertise in both stakeholder management and project management.
    • Hands-on experience with REST APIs, data integration patterns, JavaScript, and SQL.
    • Experience building, troubleshooting, and optimizing multi-step journeys or workflows.
    • Strong analytical skills, comfort with dashboards, and campaign performance metrics.
    • Familiarity with GDPR and data privacy principles related to CRM operations.
    • Knowledge of deliverability monitoring tooling (Google Postmaster, MxToolbox) and ESP authentication methods.
    • English level - at least Upper-Intermediate, both spoken and written.  
       

    We offer*:

    • Flexible working format - remote, office-based or flexible
    • A competitive salary and good compensation package
    • Personalized career growth
    • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
    • Active tech communities with regular knowledge sharing
    • Education reimbursement
    • Memorable anniversary presents
    • Corporate events and team buildings
    • Other location-specific benefits

    *not applicable for freelancers

  • 74 views · 2 applications · 4d

    Automation Tester

    Full Remote · Poland · 4 years of experience · B2 - Upper Intermediate

    Project: Insurance
    Contract Duration: February 2, 2026 – June 30, 2026
    Experience Required: 4+ years
    Work Format: Remote (from Poland)


    Job Description
    We are looking for an experienced Automation Tester to join an insurance-related project. You will work as part of a Scrum team on a large-scale system involving multiple teams, focusing on automated and API testing.


    Responsibilities

    • Design and implement automated tests based on requirements
    • Perform API testing to ensure system reliability and data integrity
    • Collaborate closely with Scrum team members across multiple teams
    • Contribute to quality assurance processes in a complex, large-scale environment


    Technical Requirements (Must Have)

    • Strong experience in automated testing using Java and Selenium
    • Ability to create automated tests from requirements using the EIS Testing Framework
    • Hands-on experience with API testing
    • Experience working in Scrum teams
    • Background in large projects involving 10 or more teams
    • Advanced level of English for daily communication


    Nice to Have

    • Experience with EIS Insurance Suite (versions 12 or 20)
    • Previous work with the EIS Testing Framework
    • Exposure to projects in the UK insurance domain


    Required Technical Skills

    • Java
    • Selenium
    • API Testing


    Location Requirements

    • Remote work possible only from Poland
  • 18 views · 1 application · 4d

    Senior Business Analyst

    Full Remote · Ukraine · 5 years of experience · C2 - Proficient

    The project involves building an enterprise-grade web portal for an energy domain company that consolidates Power BI project data into a single, secure, role-based user interface.
    The solution includes dynamic dashboards, embedded Power BI reports, certified document previews, SSO authentication via Microsoft Entra ID, RBAC permissions, audit logging, and integration with multiple sources of project information.
    As part of the team, you will work closely with business stakeholders, architects, UX, and engineering to ensure the solution meets the operational and analytical needs of various user groups across the organization.


    Requirements

    We are looking for a highly skilled Senior Business Analyst with experience in enterprise software projects and an understanding of the energy domain.
    You will drive the requirements discovery process, analyze client needs, work with stakeholders, and prepare structured documentation for the development team.
    Your work will include reviewing client materials, extracting user and business problems, structuring them into clear deliverables, and supporting the team with functional flows and acceptance criteria. You will be contributing to the following key activities:

    • Requirements elicitation & analysis
    • Gap analysis and impact assessment
    • Backlog creation and prioritization
    • Preparing user stories, acceptance criteria, workflows
    • Supporting design of role-based flows, dashboards, reporting logic, and certification workflows


    Job responsibilities

    Your role will include the following activities:

    • Research and analyze client documents
    • Prepare gap analysis reports and identify missing capabilities
    • Extract user problems and business needs from unstructured sources
    • Create and maintain a prioritized backlog of market / client problems
    • Define user stories, acceptance criteria, and functional specifications
    • Support UX/UI and architectural discussions with clearly structured requirements
    • Collaborate with Product Owner, architects, and the engineering team
    • Validate requirements with stakeholders and ensure alignment with business goals
    • Support UAT preparation and documentation
  • 53 views · 1 application · 4d

    JavaScript full-stack developer (React + Node.js), Egypt

    Full Remote · Egypt · 5 years of experience · B2 - Upper Intermediate

    About the Project:

    We are building a modern cloud-based platform designed to help mid-size and enterprise-level companies manage and optimize their internal digital infrastructure. The product streamlines workflows, enhances transparency, and provides real-time insights into operational performance.
     

    The platform includes a real-time reporting module, a flexible role-based access system, integrations with third-party services, and an intuitive interface for visualizing complex business processes.

    Our architecture is based on a microservices approach, leveraging Kubernetes, cloud services, and up-to-date DevOps practices. The team follows Scrum, with bi-weekly release cycles and strong engineering standards.

     

    Requirements:

    • 5+ years of commercial experience in software development
    • Strong experience with Node.js, Kubernetes, TypeScript
    • Hands-on experience with React
    • Familiarity with Next.js (nice to have)
    • Solid understanding of modern CI/CD practices
    • English Upper-Intermediate or higher

     

    Responsibilities:

    • Develop new features and modules for the platform
    • Work on both frontend (React) and backend (Node.js) parts of the system
    • Participate in architectural discussions and contribute to technical decisions
    • Work with microservices, Kubernetes deployments, and cloud infrastructure
    • Optimize performance, ensure code quality, and maintain best engineering practices
    • Collaborate closely with QA, DevOps, and Product teams
    • Take part in sprint planning, task estimation, and code reviews

     

     

    We Offer:

    • Work in a strong international engineering team
    • Opportunity to influence technical decisions and product architecture
    • Fully remote & flexible schedule
    • Competitive compensation
    • Modern tech stack and challenging tasks
  • 11 views · 0 applications · 4d

    Middle/Senior Data Engineers (Egypt)

    Full Remote · Egypt · 5 years of experience · B2 - Upper Intermediate

    Middle/Senior Data Engineer

     

    We are developing a next-generation analytics platform that helps global companies transform raw data into actionable insights. The product centralizes data from multiple sources, automates data quality checks, and provides a unified semantic layer for analytics and reporting teams.

    Our data ecosystem is built on a modern tech stack using a cloud data warehouse (BigQuery/Snowflake), dbt for transformations, Airflow for orchestration, and a collection of BI tools for dashboards and self-service analytics.

    You will join a highly skilled data team that collaborates closely with Data Scientists, Analysts, and Backend Engineers to design scalable and reliable data pipelines.

     

    Requirements:

    • 5+ years of experience working with data pipelines in production
    • Strong skills in SQL and Python
    • Hands-on experience with dbt and cloud data warehouses (BigQuery, Snowflake)
    • Experience with Apache Airflow or other orchestration tools
    • Good knowledge of Docker and modern CI/CD workflows
    • Familiarity with BI tools (Superset, Metabase, Looker, Tableau)
    • Fluent English 

     

    Responsibilities:

    • Build, maintain, and optimize production-grade ETL/ELT pipelines
    • Develop dbt models and contribute to the platform's data transformation layer
    • Work with BigQuery/Snowflake to design efficient data architectures
    • Set up and manage workflows in Airflow (or similar orchestration systems)
    • Ensure data quality, reliability, and proper documentation
    • Collaborate with data analysts, ML engineers, and product teams
    • Improve pipeline performance and implement best practices in DevOps for data

     

    We Offer:

    • Work with a modern cloud data stack and cutting-edge tools
    • An international team with a strong data-driven culture
    • Flexible remote work
    • Competitive compensation
    • Opportunity to shape the data architecture of a growing product
  • 16 views · 2 applications · 4d

    Senior Data Engineer

    Hybrid Remote · Ukraine · Product · 4 years of experience · B2 - Upper Intermediate

    Your future responsibilities:

    • Collaborate with data and analytics experts to strive for greater functionality in our data systems
    • Design, use and test the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies (DevOps & Continuous Integration)
    • Drive the advancement of data infrastructure by designing and implementing the underlying logic and structure for how data is set up, cleansed, and ultimately stored for organizational usage
    • Assemble large, complex data sets that meet functional / non-functional business requirements
    • Build data integration from various sources and technologies to the data lake infrastructure as part of an agile delivery team
    • Monitor capabilities and react to unplanned interruptions, ensuring that environments are provided and loaded on time

    Your skills and experience:

    • Minimum 5 years of experience in a dedicated data engineer role
    • Experience working with large structured and unstructured data in various formats
    • Knowledge of or experience with streaming data frameworks and distributed data architectures (e.g. Spark Structured Streaming, Apache Beam, or Apache Flink)
    • Experience with cloud technologies (preferably AWS or Azure)
    • Experience with cloud services (Data Flow, Data Proc, BigQuery, Pub/Sub)
    • Practical experience operating the Big Data stack: Hadoop, HDFS, Hive, Presto, Kafka
    • Experience with Python in the context of creating ETL data pipelines
    • Experience with Data Lake / Data Warehouse solutions (AWS S3 // Minio)
    • Experience with Apache Airflow
    • Development skills in a Docker / Kubernetes environment
    • Open and team-minded personality and communication skills
    • Willingness to work in an agile environment

    We offer what matters most to you:

    • Competitive salary: we guarantee a stable income and annual bonuses for your personal contribution. Additionally, we have a referral program with rewards for bringing in new colleagues to Raiffeisen Bank
    • Social package: official employment, 28 days of paid leave, additional paternity leave, and financial assistance for parents with newborns
    • Comfortable working conditions: possibility of a hybrid work format, offices equipped with shelters and generators, and modern equipment
    • Wellbeing program: all employees have access to medical insurance from the first working day; consultations with a psychologist, nutritionist, or lawyer; discount programs for sports and purchases; family days for children and adults; in-office massages
    • Training and development: access to over 130 online training resources; corporate training programs in CX, Data, IT Security, Leadership, and Agile; a corporate library and English lessons
    • Great team: our colleagues form a community where curiosity, talent, and innovation are welcome. We support each other, learn together, and grow. You can find like-minded individuals in over 15 professional communities, reading clubs, or sports clubs
    • Career opportunities: we encourage advancement within the bank across functions
    • Innovations and technologies: Infrastructure: AWS, Kubernetes, Docker, GitHub, GitHub Actions, ArgoCD, Prometheus, VictoriaMetrics, Vault, OpenTelemetry, ElasticSearch, Crossplane, Grafana. Languages: Java (main), Python (data), Go (infra, security), Swift (iOS), Kotlin (Android). Data stores: Oracle, PostgreSQL, MS SQL, Sybase. Data management: Kafka, Airflow, Spark, Flink
    • Support program for defenders: we maintain jobs and pay average wages to mobilized individuals. For veterans, we have a support program and develop the Bank’s veterans community. We work on increasing awareness among leaders and teams about the return of veterans to civilian life. Raiffeisen Bank has been recognized as one of the best employers for veterans by Forbes

    Why Raiffeisen Bank?

    • Our main value is people, and we support and recognize them, educate them and involve them in changes. Join Raif's team because for us YOU matter!
    • One of the largest lenders to the economy and agricultural business among private banks
    • Recognized as the best employer by EY, Forbes, Randstad, Franklin Covey, and Delo.UA
    • The largest humanitarian aid donor among banks (Ukrainian Red Cross, UNITED24, Superhumans, БМІЛИВІ)
    • One of the largest IT product teams among the country's banks
    • One of the largest taxpayers in Ukraine: 6.6 billion UAH paid in taxes in 2023

    Opportunities for Everyone:

    • Raif is guided by principles that focus on people and their development, with 5,500 employees and more than 2.7 million customers at the center of attention
    • We support the principles of diversity, equality and inclusiveness
    • We are open to hiring veterans and people with disabilities and are ready to adapt the work environment to your special needs
    • We cooperate with students and older people, creating conditions for growth at any career stage

    Want to learn more? Follow us on social media:

    Facebook, Instagram, LinkedIn

    ___________________________________________________________________________________________

    Π Π°ΠΉΡ„Ρ„Π°ΠΉΠ·Π΅Π½ Π‘Π°Π½ΠΊ β€” Π½Π°ΠΉΠ±Ρ–Π»ΡŒΡˆΠΈΠΉ ΡƒΠΊΡ€Π°Ρ—Π½ΡΡŒΠΊΠΈΠΉ Π±Π°Π½ΠΊ Π· Ρ–Π½ΠΎΠ·Π΅ΠΌΠ½ΠΈΠΌ ΠΊΠ°ΠΏΡ–Ρ‚Π°Π»ΠΎΠΌ. Π‘Ρ–Π»ΡŒΡˆΠ΅ 30 Ρ€ΠΎΠΊΡ–Π² ΠΌΠΈ ΡΡ‚Π²ΠΎΡ€ΡŽΡ”ΠΌΠΎ Ρ‚Π° Π²ΠΈΠ±ΡƒΠ΄ΠΎΠ²ΡƒΡ”ΠΌΠΎ Π±Π°Π½ΠΊΡ–Π²ΡΡŒΠΊΡƒ систСму Π½Π°ΡˆΠΎΡ— Π΄Π΅Ρ€ΠΆΠ°Π²ΠΈ.

    Π£ Π Π°ΠΉΡ„Ρ– ΠΏΡ€Π°Ρ†ΡŽΡ” ΠΏΠΎΠ½Π°Π΄ 5 500 ΡΠΏΡ–Π²Ρ€ΠΎΠ±Ρ–Ρ‚Π½ΠΈΠΊΡ–Π², сСрСд Π½ΠΈΡ… ΠΎΠ΄Π½Π° Ρ–Π· Π½Π°ΠΉΠ±Ρ–Π»ΡŒΡˆΠΈΡ… ΠΏΡ€ΠΎΠ΄ΡƒΠΊΡ‚ΠΎΠ²ΠΈΡ… Π†Π’-ΠΊΠΎΠΌΠ°Π½Π΄, Ρ‰ΠΎ Π½Π°Π»Ρ–Ρ‡ΡƒΡ” ΠΏΠΎΠ½Π°Π΄ 800 Ρ„Π°Ρ…Ρ–Π²Ρ†Ρ–Π². Щодня ΠΏΠ»Ρ–Ρ‡-ΠΎ-ΠΏΠ»Ρ–Ρ‡ ΠΌΠΈ ΠΏΡ€Π°Ρ†ΡŽΡ”ΠΌΠΎ, Ρ‰ΠΎΠ± Π±Ρ–Π»ΡŒΡˆ Π½Ρ–ΠΆ 2,7 ΠΌΡ–Π»ΡŒΠΉΠΎΠ½Π° Π½Π°ΡˆΠΈΡ… ΠΊΠ»Ρ–Ρ”Π½Ρ‚Ρ–Π² ΠΌΠΎΠ³Π»ΠΈ ΠΎΡ‚Ρ€ΠΈΠΌΠ°Ρ‚ΠΈ якіснС обслуговування, користуватися ΠΏΡ€ΠΎΠ΄ΡƒΠΊΡ‚Π°ΠΌΠΈ Ρ– ΡΠ΅Ρ€Π²Ρ–сами Π±Π°Π½ΠΊΡƒ, Ρ€ΠΎΠ·Π²ΠΈΠ²Π°Ρ‚ΠΈ бізнСс, Π°Π΄ΠΆΠ΅ ΠΌΠΈ #Π Π°Π·ΠΎΠΌ_Π·_Π£ΠΊΡ€Π°Ρ—Π½ΠΎΡŽ.β€―

    Π’Π²ΠΎΡ— ΠΌΠ°ΠΉΠ±ΡƒΡ‚Π½Ρ– обов’язки:

    • Бпівпраця Π· Π΅ΠΊΡΠΏΠ΅Ρ€Ρ‚Π°ΠΌΠΈ Π· Π΄Π°Π½ΠΈΡ… Ρ‚Π° Π°Π½Π°Π»Ρ–Ρ‚ΠΈΠΊΠΈ, Ρ‰ΠΎΠ± досягти Π±Ρ–Π»ΡŒΡˆΠΎΡ— Ρ„ΡƒΠ½ΠΊΡ†Ρ–ΠΎΠ½Π°Π»ΡŒΠ½ΠΎΡΡ‚Ρ– Π½Π°ΡˆΠΈΡ… систСм Π΄Π°Π½ΠΈΡ…
    • ΠŸΡ€ΠΎΠ΅ΠΊΡ‚ΡƒΠ²Π°Π½Π½Ρ, використання Ρ‚Π° Ρ‚Сстування інфраструктури, Π½Π΅ΠΎΠ±Ρ…Ρ–Π΄Π½ΠΎΡ— для ΠΎΠΏΡ‚ΠΈΠΌΠ°Π»ΡŒΠ½ΠΎΠ³ΠΎ вилучСння, пСрСтворСння Ρ‚Π° Π·Π°Π²Π°Π½Ρ‚аТСння Π΄Π°Π½ΠΈΡ… Π· ΡˆΠΈΡ€ΠΎΠΊΠΎΠ³ΠΎ спСктру Π΄ΠΆΠ΅Ρ€Π΅Π» Π΄Π°Π½ΠΈΡ… Π·Π° Π΄ΠΎΠΏΠΎΠΌΠΎΠ³ΠΎΡŽ Ρ‚Π΅Ρ…Π½ΠΎΠ»ΠΎΠ³Ρ–ΠΉ SQL Ρ‚Π° AWS для Π²Π΅Π»ΠΈΠΊΠΈΡ… Π΄Π°Π½ΠΈΡ… (DevOps Ρ‚Π° Π±Π΅Π·ΠΏΠ΅Ρ€Π΅Ρ€Π²Π½Π° інтСграція)
    • Бприяння Ρ€ΠΎΠ·Π²ΠΈΡ‚ΠΊΡƒ інфраструктури Π΄Π°Π½ΠΈΡ… ΡˆΠ»ΡΡ…ΠΎΠΌ проСктування Ρ‚Π° Π²ΠΏΡ€ΠΎΠ²Π°Π΄ΠΆΠ΅Π½Π½Ρ Π±Π°Π·ΠΎΠ²ΠΎΡ— Π»ΠΎΠ³Ρ–ΠΊΠΈ Ρ‚Π° ΡΡ‚Ρ€ΡƒΠΊΡ‚ΡƒΡ€ΠΈ для Π½Π°Π»Π°ΡˆΡ‚ΡƒΠ²Π°Π½Π½Ρ, очищСння Ρ‚Π°, Π·Ρ€Π΅ΡˆΡ‚ΠΎΡŽ, збСрігання Π΄Π°Π½ΠΈΡ… для використання Π² ΠΎΡ€Π³Π°Π½Ρ–Π·Π°Ρ†Ρ–Ρ—
    • Π—Π±ΠΈΡ€Π°Ρ‚ΠΈ Π²Π΅Π»ΠΈΠΊΡ–, складні Π½Π°Π±ΠΎΡ€ΠΈ Π΄Π°Π½ΠΈΡ…, Ρ‰ΠΎ Π²Ρ–Π΄ΠΏΠΎΠ²Ρ–Π΄Π°ΡŽΡ‚ΡŒ Ρ„ΡƒΠ½ΠΊΡ†Ρ–ΠΎΠ½Π°Π»ΡŒΠ½ΠΈΠΌ/Π½Π΅Ρ„ΡƒΠ½ΠΊΡ†Ρ–ΠΎΠ½Π°Π»ΡŒΠ½ΠΈΠΌ бізнСс-Π²ΠΈΠΌΠΎΠ³Π°ΠΌ
    • Π‘Ρ‚Π²ΠΎΡ€ΡŽΠ²Π°Ρ‚ΠΈ Ρ–Π½Ρ‚Π΅Π³Ρ€Π°Ρ†Ρ–ΡŽ Π΄Π°Π½ΠΈΡ… Π· Ρ€Ρ–Π·Π½ΠΈΡ… Π΄ΠΆΠ΅Ρ€Π΅Π» Ρ‚Π° Ρ‚Π΅Ρ…Π½ΠΎΠ»ΠΎΠ³Ρ–ΠΉ Π² Ρ–нфраструктуру ΠΎΠ·Π΅Ρ€Π° Π΄Π°Π½ΠΈΡ… як Ρ‡Π°ΡΡ‚ΠΈΠ½Π° Π³Π½ΡƒΡ‡ΠΊΠΎΡ— ΠΊΠΎΠΌΠ°Π½Π΄ΠΈ Π· ΠΏΠΎΡΡ‚ачання
    • ΠœΠΎΠ½Ρ–Ρ‚ΠΎΡ€ΠΈΡ‚ΠΈ моТливості Ρ‚Π° Ρ€Π΅Π°Π³ΡƒΠ²Π°Ρ‚ΠΈ Π½Π° Π½Π΅Π·Π°ΠΏΠ»Π°Π½ΠΎΠ²Π°Π½Ρ– ΠΏΠ΅Ρ€Π΅Π±ΠΎΡ—, Π·Π°Π±Π΅Π·ΠΏΠ΅Ρ‡ΡƒΡŽΡ‡ΠΈ своєчаснС надання Ρ‚Π° Π·Π°Π²Π°Π½Ρ‚аТСння сСрСдовищ

    Your experience and skills:

    • At least 5 years of experience in a dedicated data engineering role
    • Experience working with large structured and unstructured data sets in various formats
    • Knowledge of or experience with streaming-data frameworks and distributed data architectures (e.g., Spark Structured Streaming, Apache Beam, or Apache Flink)
    • Experience with cloud technologies (preferably AWS, Azure)
    • Experience with cloud services (Dataflow, Dataproc, BigQuery, Pub/Sub)
    • Hands-on operational experience with the Big Data stack: Hadoop, HDFS, Hive, Presto, Kafka
    • Experience with Python for building ETL data pipelines
    • Experience with Data Lake / Data Warehouse solutions (AWS S3 / MinIO)
    • Experience with Apache Airflow
    • Development skills in a Docker / Kubernetes environment
    • An open, team-oriented personality and good communication skills
    • Readiness to work in an agile environment

    We offer what matters most to you:

    • Competitive salary: we guarantee a stable income and annual bonuses for your personal contribution. In addition, we run a referral program that rewards you for bringing new colleagues to Raiffeisen Bank.
    • Social package: official employment, 28 days of paid vacation, additional paternity leave, and financial assistance for parents of newborns.
    • Comfortable working conditions: the option of a hybrid work format, offices equipped with shelters and generators, and modern equipment.
    • Wellbeing program: all employees have access to medical insurance from the first working day; consultations with a psychologist, nutritionist, or lawyer; discount programs for sports and shopping; family days for children and adults; in-office massages.
    • Training and development: access to more than 130 online learning resources; corporate training programs in CX, Data, IT Security, Leadership, and Agile; a corporate library and English lessons.
    • A great team: our colleagues form a community that welcomes curiosity, talent, and innovation. We support each other, learn together, and grow. You can find like-minded people in more than 15 professional communities, a reading club, or sports clubs.
    • Career opportunities: we encourage advancement within the bank across functions.
    • Innovation and technology. Infrastructure: AWS, Kubernetes, Docker, GitHub, GitHub Actions, ArgoCD, Prometheus, VictoriaMetrics, Vault, OpenTelemetry, ElasticSearch, Crossplane, Grafana. Languages: Java (main), Python (data), Go (infra, security), Swift (iOS), Kotlin (Android). Datastores: Oracle, PostgreSQL, MS SQL, Sybase. Data management: Kafka, Airflow, Spark, Flink.
    • Defender support program: we keep jobs open for mobilized employees and pay them their average salary. For veterans we run a support program, and the bank's veteran community keeps growing. We work to raise awareness among managers and teams about veterans' return to civilian life. Raiffeisen Bank has been recognized as one of the best employers for veterans (Forbes).

    Why Raiffeisen Bank?

    • Our main value is people, and we give them support and recognition, train them, and involve them in change. Join the Raif team, because YOU matter to us!
    • One of the largest lenders to the economy and agricultural business among private banks
    • Recognized as the best employer by EY, Forbes, Randstad, Franklin Covey, and Delo.UA
    • The largest donor of humanitarian aid among banks (Ukrainian Red Cross, UNITED24, Superhumans, Π‘ΠœΠ†Π›Π˜Π’Π†)
    • One of the largest taxpayers in Ukraine: UAH 6.6 billion paid for 2023

    ΠœΠΎΠΆΠ»ΠΈΠ²ΠΎΡΡ‚Ρ– для всіх:β€―

    • Raif is guided by principles focused on people and their development, with 5,500 employees and more than 2.7 million customers at the center of attention
    • We support the principles of diversity, equity, and inclusion
    • We are open to hiring veterans and people with disabilities and are ready to adapt the work environment to your special needs
    • We work with students and older people, creating conditions for growth at any stage of your career

    Want to know more? Follow us on social media:

    Facebook, Instagram, LinkedIn

  • Β· 25 views Β· 0 applications Β· 4d

    Python Engineer (Egypt)

    Full Remote Β· Egypt Β· 4 years of experience Β· B2 - Upper Intermediate
    About the Project: We are developing a high-performance backend platform that powers data-intensive operations, real-time processing, and integrations with multiple external services. The system is built with a modern microservice architecture, leveraging...

    About the Project:

    We are developing a high-performance backend platform that powers data-intensive operations, real-time processing, and integrations with multiple external services. The system is built with a modern microservice architecture, leveraging cloud infrastructure, asynchronous Python, and robust CI/CD pipelines.

    You will work in a skilled engineering team focused on scalability, clean architecture, and automation.
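    As a rough illustration of the "asynchronous Python" pattern this kind of platform relies on, the snippet below fans out several simulated external-service calls concurrently using only the stdlib asyncio module. The service names and delays are invented; a real system would make HTTP calls via a framework such as FastAPI or aiohttp:

    ```python
    import asyncio

    async def call_service(name: str, delay: float) -> str:
        # Stand-in for an HTTP call to an external integration (hypothetical).
        await asyncio.sleep(delay)
        return f"{name}: ok"

    async def gather_integrations() -> list:
        # Fan out all calls concurrently instead of awaiting them one by one;
        # total latency is roughly max(delay), not sum(delay).
        tasks = [
            call_service(name, delay)
            for name, delay in [("billing", 0.02), ("crm", 0.01), ("search", 0.03)]
        ]
        return await asyncio.gather(*tasks)

    results = asyncio.run(gather_integrations())
    ```

    `asyncio.gather` preserves the order in which the coroutines were passed, so results line up with the request list regardless of completion order.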

     

    Requirements:

    • 4+ years of experience in backend development
    • Strong hands-on expertise with Python (FastAPI, Django, or Flask)
    • Solid understanding of REST APIs, asynchronous programming, and clean architecture
    • Experience with relational and/or NoSQL databases
    • Familiarity with Docker and CI/CD processes
    • Ability to work independently in a remote environment
    • English: Upper Intermediate or higher

     

    Responsibilities:

    • Develop and maintain backend services and APIs
    • Design scalable architecture and contribute to technical decisions
    • Work with databases, optimize queries, and ensure data reliability
    • Implement integrations with third-party services
    • Write clean, maintainable code and participate in code reviews
    • Collaborate with DevOps, QA, and product teams
    • Troubleshoot, debug, and optimize system performance

     

    We Offer:

    • Work with a modern Python backend stack
    • A flexible remote schedule
    • Competitive compensation
    • International engineering culture with strong technical standards
    • An opportunity to work on a long-term, impactful product
  • Β· 46 views Β· 2 applications Β· 4d

    Full-Stack Developer (React + Node.js), Serbia

    Full Remote Β· Serbia Β· 5 years of experience Β· B2 - Upper Intermediate
    About the Project: We are building a modern cloud-based platform designed to help mid-size and enterprise-level companies manage and optimize their internal digital infrastructure. The product streamlines workflows, enhances transparency, and provides...

    About the Project: 

    We are building a modern cloud-based platform designed to help mid-size and enterprise-level companies manage and optimize their internal digital infrastructure. The product streamlines workflows, enhances transparency, and provides real-time insights into operational performance. 
     

    The platform includes a real-time reporting module, a flexible role-based access system, integrations with third-party services, and an intuitive interface for visualizing complex business processes. 

    Our architecture is based on a microservices approach, leveraging Kubernetes, cloud services, and up-to-date DevOps practices. The team follows Scrum, with bi-weekly release cycles and strong engineering standards. 

     

    Requirements: 

    • 5+ years of commercial experience in software development 
    • Strong experience with Node.js, Kubernetes, TypeScript 
    • Hands-on experience with React 
    • Familiarity with Next.js (nice to have) 
    • Solid understanding of modern CI/CD practices 
    • English: Upper-Intermediate or higher 

     

    Responsibilities: 

    • Develop new features and modules for the platform 
    • Work on both frontend (React) and backend (Node.js) parts of the system 
    • Participate in architectural discussions and contribute to technical decisions 
    • Work with microservices, Kubernetes deployments, and cloud infrastructure 
    • Optimize performance, ensure code quality, and maintain best engineering practices 
    • Collaborate closely with QA, DevOps, and Product teams 
    • Take part in sprint planning, task estimation, and code reviews 

     

    We Offer: 

    • Work in a strong international engineering team 
    • Opportunity to influence technical decisions and product architecture 
    • Fully remote & flexible schedule 
    • Competitive compensation 
    • Modern tech stack and challenging tasks 
  • Β· 9 views Β· 0 applications Β· 4d

    Middle/Senior Data Engineers (Serbia)

    Full Remote Β· Serbia Β· 5 years of experience Β· B2 - Upper Intermediate
    Middle/Senior Data Engineer We are developing a next-generation analytics platform that helps global companies transform raw data into actionable insights. The product centralizes data from multiple sources, automates data quality checks, and provides a...

    Middle/Senior Data Engineer

     

    We are developing a next-generation analytics platform that helps global companies transform raw data into actionable insights. The product centralizes data from multiple sources, automates data quality checks, and provides a unified semantic layer for analytics and reporting teams.

    Our data ecosystem is built on a modern tech stack using a cloud data warehouse (BigQuery/Snowflake), dbt for transformations, Airflow for orchestration, and a collection of BI tools for dashboards and self-service analytics.
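    The automated data-quality checks mentioned above can be sketched in plain Python. This is illustrative only; in the stack described here such checks would typically live in dbt as schema tests (`not_null`, `unique`), and the sample `orders` data is invented:

    ```python
    # Minimal data-quality checks of the kind dbt schema tests automate:
    # not-null and uniqueness constraints on a key column.

    def check_not_null(rows, column):
        # True if every row has a non-null value in the given column.
        return all(r.get(column) is not None for r in rows)

    def check_unique(rows, column):
        # True if no two rows share a value in the given column.
        values = [r[column] for r in rows]
        return len(values) == len(set(values))

    orders = [
        {"order_id": "a1", "amount": 10},
        {"order_id": "a2", "amount": 15},
    ]

    quality_report = {
        "order_id_not_null": check_not_null(orders, "order_id"),
        "order_id_unique": check_unique(orders, "order_id"),
    }
    ```

    Running such checks on every pipeline run, and failing the run when one returns False, is what turns ad-hoc validation into an automated quality gate.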

    You will join a highly skilled data team that collaborates closely with Data Scientists, Analysts, and Backend Engineers to design scalable and reliable data pipelines.

     

    Requirements:

    • 5+ years of experience working with data pipelines in production
    • Strong skills in SQL and Python
    • Hands-on experience with dbt and cloud data warehouses (BigQuery, Snowflake)
    • Experience with Apache Airflow or other orchestration tools
    • Good knowledge of Docker and modern CI/CD workflows
    • Familiarity with BI tools (Superset, Metabase, Looker, Tableau)
    • Fluent English 

     

    Responsibilities:

    • Build, maintain, and optimize production-grade ETL/ELT pipelines
    • Develop dbt models and contribute to the platform’s data transformation layer
    • Work with BigQuery/Snowflake to design efficient data architectures
    • Set up and manage workflows in Airflow (or similar orchestration systems)
    • Ensure data quality, reliability, and proper documentation
    • Collaborate with data analysts, ML engineers, and product teams
    • Improve pipeline performance and implement best practices in DevOps for data

     

    We Offer:

    • Work with a modern cloud data stack and cutting-edge tools
    • An international team with a strong data-driven culture
    • Flexible remote work
    • Competitive compensation
    • Opportunity to shape the data architecture of a growing product
  • Β· 35 views Β· 0 applications Β· 4d

    Python Engineer (Serbia)

    Full Remote Β· Serbia Β· 4 years of experience Β· B2 - Upper Intermediate
    About the Project: We are developing a high-performance backend platform that powers data-intensive operations, real-time processing, and integrations with multiple external services. The system is built with a modern microservice architecture,...

    About the Project: 

    We are developing a high-performance backend platform that powers data-intensive operations, real-time processing, and integrations with multiple external services. The system is built with a modern microservice architecture, leveraging cloud infrastructure, asynchronous Python, and robust CI/CD pipelines. 

    You will work in a skilled engineering team focused on scalability, clean architecture, and automation. 

     

    Requirements: 

    • 4+ years of experience in backend development 
    • Strong hands-on expertise with Python (FastAPI, Django, or Flask) 
    • Solid understanding of REST APIs, asynchronous programming, and clean architecture 
    • Experience with relational and/or NoSQL databases 
    • Familiarity with Docker and CI/CD processes 
    • Ability to work independently in a remote environment 
    • English: Upper Intermediate or higher 

     

    Responsibilities: 

    • Develop and maintain backend services and APIs 
    • Design scalable architecture and contribute to technical decisions 
    • Work with databases, optimize queries, and ensure data reliability 
    • Implement integrations with third-party services 
    • Write clean, maintainable code and participate in code reviews 
    • Collaborate with DevOps, QA, and product teams 
    • Troubleshoot, debug, and optimize system performance 

     

    We Offer: 

    • Work with a modern Python backend stack 
    • A flexible remote schedule 
    • Competitive compensation 
    • International engineering culture with strong technical standards 
    • An opportunity to work on a long-term, impactful product 

     

  • Β· 21 views Β· 3 applications Β· 4d

    Middle/Senior Data Engineers (Armenia)

    Full Remote Β· Armenia Β· 5 years of experience Β· B2 - Upper Intermediate
    Middle/Senior Data Engineer We are developing a next-generation analytics platform that helps global companies transform raw data into actionable insights. The product centralizes data from multiple sources, automates data quality checks, and provides a...

    Middle/Senior Data Engineer

     

    We are developing a next-generation analytics platform that helps global companies transform raw data into actionable insights. The product centralizes data from multiple sources, automates data quality checks, and provides a unified semantic layer for analytics and reporting teams.

    Our data ecosystem is built on a modern tech stack using a cloud data warehouse (BigQuery/Snowflake), dbt for transformations, Airflow for orchestration, and a collection of BI tools for dashboards and self-service analytics.

    You will join a highly skilled data team that collaborates closely with Data Scientists, Analysts, and Backend Engineers to design scalable and reliable data pipelines.

     

    Requirements:

    • 5+ years of experience working with data pipelines in production
    • Strong skills in SQL and Python
    • Hands-on experience with dbt and cloud data warehouses (BigQuery, Snowflake)
    • Experience with Apache Airflow or other orchestration tools
    • Good knowledge of Docker and modern CI/CD workflows
    • Familiarity with BI tools (Superset, Metabase, Looker, Tableau)
    • Fluent English 

     

    Responsibilities:

    • Build, maintain, and optimize production-grade ETL/ELT pipelines
    • Develop dbt models and contribute to the platform’s data transformation layer
    • Work with BigQuery/Snowflake to design efficient data architectures
    • Set up and manage workflows in Airflow (or similar orchestration systems)
    • Ensure data quality, reliability, and proper documentation
    • Collaborate with data analysts, ML engineers, and product teams
    • Improve pipeline performance and implement best practices in DevOps for data

     

    We Offer:

    • Work with a modern cloud data stack and cutting-edge tools
    • An international team with a strong data-driven culture
    • Flexible remote work
    • Competitive compensation
    • Opportunity to shape the data architecture of a growing product