Jobs at Djinni
· 94 views · 20 applications · 4d
Senior Software Engineer (Web)
Hybrid Remote · Ukraine · 4 years of experience · B2 - Upper Intermediate
Job Description
We're excited if you have
- 4+ years of web development experience building consumer-facing websites and backend services using REST APIs
- Good culture fit and excellent team player
- Strong knowledge of JavaScript and Node.js (Express.js)
- Solid experience with ReactJS, Jest, Cypress
- Experience with Material UI would be a plus
- Knowledge of the latest open standards and web technologies
- Ability to decompose complex problems into manageable units of work
- Experience with AWS would be a plus
- BS or MS degree in Computer Science or equivalent
- Strong verbal and written communication skills in English
- Plus: familiar with e-commerce websites and trends
Job Responsibilities
What you'll be doing
- Build innovative features in the Web Commerce and Payments areas (subscriptions, invoices, transactions, refunds, promos, etc.)
- Code, test, document and deliver features with cross-browser compatibility, accessibility and search engine optimization
- Enhance existing and create new features for Node.js backend
- Participate in the Agile development process, including scoping, technical design, effort estimation, coding, testing, debugging, code reviews, maintenance, and support
- Collaborate with program managers, marketing, designers, software and QA engineers to build and enhance both UI and backend
- Innovate and create the best user experience online
Department/Project Description
The Client is the #1 TV streaming platform in the U.S., with tens of millions of customers. They pioneered TV streaming in the U.S. Their mission is to be the TV streaming platform that connects the entire TV ecosystem.
The project is an e-commerce website related to selling consumer equipment for video streaming services.
-
· 145 views · 19 applications · 4d
DevOps Trainee
Hybrid Remote · Ukraine · B1 - Intermediate
Job Description
Required skills:
- Networking basics understanding
- Linux experience
- Basic understanding of CI/CD processes and DevOps concepts
- Basic knowledge of containerization and virtualization (Docker)
- Basic knowledge of cloud providers (AWS/GCP)
- Experience with Git (GitHub/GitLab)
- Infrastructure as Code (Terraform)
- Basic knowledge of SQL and NoSQL databases
- Basic knowledge of any scripting language (Bash/Python)
- Good communication skills
Job Responsibilities
Identify and implement automation strategies in the SDLC process that enable high-quality, fast delivery of new solutions to our business users.
- Manage and optimize infrastructure and deployment workflows.
- Learn and contribute to the implementation of CI/CD.
- Participate in troubleshooting and resolving issues.
- Participate in vendor software evaluations and integration strategies.
- Champion our CI/CD agile practices across all of the development teams.
- Create and maintain functional / technical design specifications and solutions to satisfy project requirements.
- Develop infrastructure as code to meet the needs of our platforms.
- Monitor system and application performance and troubleshoot / resolve escalated issues.
- Be part of a high-performing Agile / Continuous Integration engineering practice.
- Continually seek ways to optimize and improve all operational aspects of our technical solutions.
Department/Project Description
We are looking for an innovative, results-oriented, and passionate DevOps trainee. You will be working with a team of like-minded engineers to design, develop and implement cutting-edge consumer desktop, web, and mobile applications. We are building an infrastructure platform, and the Platform Engineering team is responsible for designing and writing Infrastructure-as-Code and nurturing a DevOps culture by providing tools and training for development teams. This role will be key in building cloud-based, fault tolerant, highly available, high performing applications that provide a seamless experience to our development teams and our end users.
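The scripting requirement above (Bash/Python) points at the kind of small automation task a trainee typically starts with. Here is a minimal, hypothetical sketch in Python; the service names, status report shape, and failure threshold are all invented for illustration, not part of any real deployment:

```python
# Hypothetical trainee-level automation sketch: given a simple status
# report mapping service name -> consecutive failure count, decide which
# services need a restart. All names and thresholds are illustrative.

def services_to_restart(statuses, max_failures=3):
    """Return sorted service names whose failure count meets the threshold."""
    return sorted(
        name for name, failures in statuses.items()
        if failures >= max_failures
    )

if __name__ == "__main__":
    report = {"api-gateway": 5, "auth": 0, "billing": 3, "web": 1}
    print(services_to_restart(report))  # ['api-gateway', 'billing']
```

In a real CI/CD setting, logic like this would consume output from a monitoring system rather than a hard-coded dictionary.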
-
· 49 views · 5 applications · 4d
Team Lead Recruitment
Full Remote · Countries of Europe or Ukraine · Product · 6 years of experience · B2 - Upper Intermediate
Team Lead Recruitment – Remote
We are a fintech product company building white-label and SaaS solutions for EMI, PSP, e-commerce, and open banking. As a remote-first and rapidly growing company, we are looking for a Team Lead Recruitment to strengthen our Talent Acquisition function and lead our team of recruiters.
Role Overview
You will lead, mentor, and develop a team of recruiters, optimize recruitment processes, work closely with hiring managers, and personally manage key vacancies. Focus areas: leadership, strategy, delivery, and quality of hiring.
Key Responsibilities
- Manage, mentor, and develop a team of 5+ recruiters
- Set goals, track performance, and provide feedback
- Drive sourcing strategies and optimize recruitment processes
- Close key roles, including management and C-level positions
- Collaborate with hiring managers to define requirements and expectations
- Track and use recruitment metrics (time-to-hire, quality-of-hire, pipeline health)
- Implement best practices, tools, and automation
- Ensure excellent candidate experience and internal client satisfaction
- Use analytics to improve hiring efficiency and generate actionable insights
Requirements
- 6+ years in recruitment, including 2+ years as Team Lead or Recruitment Manager
- Experience managing a team of 5+ recruiters
- Proven track record closing management & C-level roles
- Experience recruiting in at least two of: IT, product companies, outsourcing/outstaffing
- Experience building or improving recruitment processes
- Strong sourcing, assessment, and communication skills
- Experience with recruitment analytics and metrics
- Ability to work in a fast-paced, dynamic environment
- Remote-first experience is a plus
- Ukrainian, Russian, English – fluent
Recruitment Process
- Application screening
- Prescreening call (60 min)
- Technical interview (60 min)
- Offer + background check
We Offer
- Competitive compensation
- 19 business days PTO
- Fully remote work, flexible schedule (CET-friendly)
- Supportive culture & growth opportunities
- Participation in conferences and industry events
- Efficient matrix structure with minimal bureaucracy
If you're ready to lead a strong recruitment team and shape the company's hiring strategy – apply now!
-
· 72 views · 1 application · 4d
Senior Grafana Developer
Full Remote · Ukraine · 5 years of experience · B2 - Upper Intermediate
We are seeking a Grafana Developer to design, build, and maintain interactive dashboards and observability solutions for enterprise-level environments. The ideal candidate will have strong experience in Grafana, Prometheus, and related monitoring and visualization tools, with the ability to connect to various data sources (e.g., Snowflake, PostgreSQL, Elasticsearch, InfluxDB, Azure Monitor, or Mulesoft metrics).
The role involves close collaboration with platform, DevOps, and integration teams to deliver actionable insights and performance monitoring dashboards for complex distributed systems.
Responsibilities:
Design and develop Grafana dashboards to visualize system health, performance, and business KPIs.
Integrate Grafana with various data sources such as Prometheus, Loki, Elastic, Snowflake, or SQL-based stores.
Configure and manage Grafana plugins, alerts, and notifications across different environments.
Develop custom queries (e.g., PromQL, SQL) to extract and visualize relevant metrics.
Work with DevOps and Cloud teams to implement observability best practices and automated dashboard deployments via CI/CD pipelines.
Create and maintain documentation for dashboards, data models, and metric definitions.
Troubleshoot performance issues and optimize dashboard load times and data queries.
Collaborate with architecture and platform teams to ensure monitoring alignment with SLIs, SLOs, and SLAs.
Mandatory Skills Description:
4+ years of hands-on experience with Grafana (Grafana Cloud, OSS, or Enterprise).
Practical experience with Python.
Strong skills in PromQL, SQL, or Elastic queries.
Experience integrating Grafana with Prometheus, Loki, Snowflake, or other time-series databases.
Familiarity with alerting and notification channels (Slack, Teams, PagerDuty, email, etc.).
Knowledge of Grafana provisioning (JSON models, configuration-as-code).
Experience with CI/CD and infrastructure-as-code (e.g., Terraform, GitHub Actions, Jenkins).
Understanding of monitoring frameworks, metrics collection, and log aggregation.
Experience working in cloud environments (Azure).
Nice-to-Have Skills Description:
Experience with Mulesoft, Confluent Kafka, or API monitoring.
Familiarity with Azure Monitor, App Insights, or Dynatrace.
Basic scripting knowledge (Golang, Bash, or PowerShell).
Exposure to DevOps or SRE practices.
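Since the role lists Python experience alongside Grafana provisioning via JSON models, a dashboard can be generated programmatically rather than clicked together. The sketch below follows the general shape of Grafana's dashboard JSON model (title, panels, Prometheus targets), but it is a simplified illustration, not a complete schema, and the PromQL query and schema version are assumptions:

```python
import json

# Hedged sketch of Grafana "configuration as code": build a dashboard
# JSON model with one Prometheus time-series panel. Field names mirror
# the Grafana dashboard JSON model in simplified form.

def make_dashboard(title, promql_expr):
    """Return a minimal dashboard model dict with a single panel."""
    return {
        "title": title,
        "schemaVersion": 39,  # illustrative; real value depends on Grafana version
        "panels": [
            {
                "type": "timeseries",
                "title": "Request rate",
                "targets": [{"expr": promql_expr, "refId": "A"}],
            }
        ],
    }

if __name__ == "__main__":
    model = make_dashboard(
        "API health",
        'sum(rate(http_requests_total{job="api"}[5m]))',
    )
    print(json.dumps(model, indent=2))
```

In practice the generated JSON would be committed to a repository and deployed through Grafana provisioning or a CI/CD pipeline, as the responsibilities above describe.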
-
· 20 views · 3 applications · 4d
Senior Marketing Automation Specialist
Full Remote · EU · 4 years of experience · B2 - Upper Intermediate
N-iX is looking for a Senior Marketing Automation Specialist to join the fast-growing team on one of our projects! Our customer is the European online car market with over 30 million monthly users and a market presence in 18 countries. As a Senior Marketing Automation Specialist, you will play a pivotal role in shaping the future of online car markets and enhancing the user experience for millions of car buyers and sellers. This role ensures day-to-day platform reliability, supports scalable customer engagement across multiple brands and markets, and contributes to the continuous development of our omni-channel automation capabilities. The position combines technical execution, platform administration, data-driven troubleshooting, and close collaboration with CRM Managers, Product, Data, and Engineering teams.
Responsibilities:
- Ensure operational continuity and performance of the MarTech stack (Iterable, Salesforce, Marketing Cloud, supporting integrations with Sales and Service Cloud, Litmus, and more).
- Monitor platform health, journey execution, error rates, and data synchronization issues across channels (Email, Push, In-App, WhatsApp).
- Administer users, roles, permissions, and configuration settings across tools.
- Enable CRM Managers, Campaign Managers, and Data Delivery teams through training, documentation, and hands-on onboarding to platform capabilities.
- Provide day-to-day troubleshooting and solution guidance on segmentation, templates, content automation, journeys, and campaign execution.
- Support data flow configuration and maintenance across CRM systems, AWS/Databricks pipelines, audience sync tools (e.g., Mediarithmics), and CMS integrations (e.g., Contentful).
- Contribute to automation initiatives enabling highly personalized, real-time, multi-channel communication journeys for B2C and B2B audiences.
- Assist in monitoring and improving data synchronization reliability across systems.
- Maintain and evolve template libraries, modular email components, and reusable assets.
- Diagnose and resolve issues related to journey failures, template rendering, deliverability, and API-driven components.
Requirements:
- Minimum 4 years of experience in marketing automation, CRM engineering, or MarTech operations.
- Hands-on experience with at least one major automation platform (Iterable, Salesforce, Marketing Cloud, Hubspot, or equivalent).
- Proven expertise in both stakeholder management and project management
- Hands-on experience with REST APIs, data integration patterns, JavaScript, and SQL.
- Experience building, troubleshooting, and optimizing multi-step journeys or workflows.
- Strong analytical skills, comfort with dashboards, and campaign performance metrics.
- Familiarity with GDPR and data privacy principles related to CRM operations.
- Knowledge of deliverability monitoring tooling (Google Postmaster, MxToolbox) and ESP authentication methods.
- English level - at least Upper-Intermediate, both spoken and written.
We offer*:
- Flexible working format - remote, office-based or flexible
- A competitive salary and good compensation package
- Personalized career growth
- Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
- Active tech communities with regular knowledge sharing
- Education reimbursement
- Memorable anniversary presents
- Corporate events and team buildings
- Other location-specific benefits
*not applicable for freelancers
-
· 74 views · 2 applications · 4d
Automation Tester
Full Remote · Poland · 4 years of experience · B2 - Upper Intermediate
Project: Insurance
Contract Duration: February 2, 2026 – June 30, 2026
Experience Required: 4+ years
Work Format: Remote (from Poland)
Job Description
We are looking for an experienced Automation Tester to join an insurance-related project. You will work as part of a Scrum team on a large-scale system involving multiple teams, focusing on automated and API testing.
Responsibilities
- Design and implement automated tests based on requirements
- Perform API testing to ensure system reliability and data integrity
- Collaborate closely with Scrum team members across multiple teams
- Contribute to quality assurance processes in a complex, large-scale environment
Technical Requirements (Must Have)
- Strong experience in automated testing using Java and Selenium
- Ability to create automated tests from requirements using the EIS Testing Framework
- Hands-on experience with API testing
- Experience working in Scrum teams
- Background in large projects involving 10 or more teams
- Advanced level of English for daily communication
Nice to Have
- Experience with EIS Insurance Suite (versions 12 or 20)
- Previous work with the EIS Testing Framework
- Exposure to projects in the UK insurance domain
Required Technical Skills
- Java
- Selenium
- API Testing
Location Requirements
- Remote work possible only from Poland
-
· 18 views · 1 application · 4d
Senior Business Analyst
Full Remote · Ukraine · 5 years of experience · C2 - Proficient
The project involves building an enterprise-grade web portal for an energy domain company that consolidates Power BI project data into a single, secure, role-based user interface.
The solution includes dynamic dashboards, embedded Power BI reports, certified document previews, SSO authentication via Microsoft Entra ID, RBAC permissions, audit logging, and integration with multiple sources of project information.
As part of the team, you will work closely with business stakeholders, architects, UX, and engineering to ensure the solution meets the operational and analytical needs of various user groups across the organization.
Requirements
We are looking for a highly skilled Senior Business Analyst with experience in enterprise software projects and an understanding of the energy domain.
You will drive the requirements discovery process, analyze client needs, work with stakeholders, and prepare structured documentation for the development team.
Your work will include reviewing client materials, extracting user and business problems, structuring them into clear deliverables, and supporting the team with functional flows and acceptance criteria. You will be contributing to the following key activities:
- Requirements elicitation & analysis
- Gap analysis and impact assessment
- Backlog creation and prioritization
- Preparing user stories, acceptance criteria, workflows
- Supporting design of role-based flows, dashboards, reporting logic, and certification workflows
Job responsibilities
Your role will include the following activities:
- Research and analyze client documents
- Prepare gap analysis reports and identify missing capabilities
- Extract user problems and business needs from unstructured sources
- Create and maintain a prioritized backlog of market / client problems
- Define user stories, acceptance criteria, and functional specifications
- Support UX/UI and architectural discussions with clearly structured requirements
- Collaborate with Product Owner, architects, and the engineering team
- Validate requirements with stakeholders and ensure alignment with business goals
- Support UAT preparation and documentation
-
· 53 views · 1 application · 4d
JavaScript full-stack developer (React + Node.js), Egypt
Full Remote · Egypt · 5 years of experience · B2 - Upper Intermediate
About the Project:
We are building a modern cloud-based platform designed to help mid-size and enterprise-level companies manage and optimize their internal digital infrastructure. The product streamlines workflows, enhances transparency, and provides real-time insights into operational performance.
The platform includes a real-time reporting module, a flexible role-based access system, integrations with third-party services, and an intuitive interface for visualizing complex business processes.
Our architecture is based on a microservices approach, leveraging Kubernetes, cloud services, and up-to-date DevOps practices. The team follows Scrum, with bi-weekly release cycles and strong engineering standards.
Requirements:
- 5+ years of commercial experience in software development
- Strong experience with Node.js, Kubernetes, TypeScript
- Hands-on experience with React
- Familiarity with Next.js (nice to have)
- Solid understanding of modern CI/CD practices
- English Upper-Intermediate or higher
Responsibilities:
- Develop new features and modules for the platform
- Work on both frontend (React) and backend (Node.js) parts of the system
- Participate in architectural discussions and contribute to technical decisions
- Work with microservices, Kubernetes deployments, and cloud infrastructure
- Optimize performance, ensure code quality, and maintain best engineering practices
- Collaborate closely with QA, DevOps, and Product teams
- Take part in sprint planning, task estimation, and code reviews
We Offer:
- Work in a strong international engineering team
- Opportunity to influence technical decisions and product architecture
- Fully remote & flexible schedule
- Competitive compensation
- Modern tech stack and challenging tasks
-
· 11 views · 0 applications · 4d
Middle/Senior Data Engineers (Egypt)
Full Remote · Egypt · 5 years of experience · B2 - Upper Intermediate
Middle/Senior Data Engineer
We are developing a next-generation analytics platform that helps global companies transform raw data into actionable insights. The product centralizes data from multiple sources, automates data quality checks, and provides a unified semantic layer for analytics and reporting teams.
Our data ecosystem is built on a modern tech stack using a cloud data warehouse (BigQuery/Snowflake), dbt for transformations, Airflow for orchestration, and a collection of BI tools for dashboards and self-service analytics.
You will join a highly skilled data team that collaborates closely with Data Scientists, Analysts, and Backend Engineers to design scalable and reliable data pipelines.
Requirements:
- 5+ years of experience working with data pipelines in production
- Strong skills in SQL and Python
- Hands-on experience with dbt and cloud data warehouses (BigQuery, Snowflake)
- Experience with Apache Airflow or other orchestration tools
- Good knowledge of Docker and modern CI/CD workflows
- Familiarity with BI tools (Superset, Metabase, Looker, Tableau)
- Fluent English
Responsibilities:
- Build, maintain, and optimize production-grade ETL/ELT pipelines
- Develop dbt models and contribute to the platform's data transformation layer
- Work with BigQuery/Snowflake to design efficient data architectures
- Set up and manage workflows in Airflow (or similar orchestration systems)
- Ensure data quality, reliability, and proper documentation
- Collaborate with data analysts, ML engineers, and product teams
- Improve pipeline performance and implement best practices in DevOps for data
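As a rough illustration of the data-quality responsibility above, here is a minimal pure-Python sketch of a pre-load check: reject rows missing a primary key and deduplicate on that key. In practice this would more likely be a dbt test or a warehouse-side SQL assertion; the row shape and key name are assumptions for the example:

```python
# Hedged sketch of a pre-load data-quality step: drop rows with a
# missing primary key and keep only the last occurrence per key.
# Row shape and key name are illustrative assumptions.

def clean_rows(rows, key="id"):
    """Return rows with non-null keys, deduplicated (last occurrence wins)."""
    deduped = {}
    for row in rows:
        if row.get(key) is None:
            continue  # missing primary key: fails the quality check
        deduped[row[key]] = row
    return list(deduped.values())

if __name__ == "__main__":
    raw = [
        {"id": 1, "amount": 10},
        {"id": None, "amount": 5},   # rejected: no key
        {"id": 1, "amount": 12},     # duplicate: later row wins
        {"id": 2, "amount": 7},
    ]
    print(clean_rows(raw))  # [{'id': 1, 'amount': 12}, {'id': 2, 'amount': 7}]
```

The same idea scales to BigQuery/Snowflake via `QUALIFY ROW_NUMBER() OVER (PARTITION BY id ...)`-style deduplication in SQL.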
We Offer:
- Work with a modern cloud data stack and cutting-edge tools
- An international team with a strong data-driven culture
- Flexible remote work
- Competitive compensation
- Opportunity to shape the data architecture of a growing product
-
· 16 views · 2 applications · 4d
Senior Data Engineer
Hybrid Remote · Ukraine · Product · 4 years of experience · B2 - Upper Intermediate
Your future responsibilities:
- Collaborate with data and analytics experts to strive for greater functionality in our data systems
- Design, use and test the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies (DevOps & Continuous Integration)
- Drive the advancement of data infrastructure by designing and implementing the underlying logic and structure for how data is set up, cleansed, and ultimately stored for organizational usage
- Assemble large, complex data sets that meet functional / non-functional business requirements
- Build data integration from various sources and technologies to the data lake infrastructure as part of an agile delivery team
- Monitor the capabilities and react on unplanned interruptions ensuring that environments are provided & loaded in time
Your skills and experience:
- Minimum 5 years of experience in a dedicated data engineer role
- Experience working with large structured and unstructured data in various formats
- Knowledge or experience with streaming data frameworks and distributed data architectures (e.g. Spark Structured Streaming, Apache Beam or Apache Flink)
- Experience with cloud technologies (preferable AWS, Azure)
- Experience with cloud services (Dataflow, Dataproc, BigQuery, Pub/Sub)
- Experience of practical operation of Big Data stack: Hadoop, HDFS, Hive, Presto, Kafka
- Experience of Python in the context of creating ETL data pipelines
- Experience with Data Lake / Data Warehouse solutions (AWS S3 / MinIO)
- Experience with Apache Airflow
- Development skills in a Docker / Kubernetes environment
- Open and team-minded personality and communication skills
- Willingness to work in an agile environment
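To illustrate the streaming-aggregation concept behind the frameworks named above (Spark Structured Streaming, Apache Beam, Apache Flink), here is a toy pure-Python tumbling-window count; the event format (timestamp, key) and the window size are assumptions made purely for the example:

```python
from collections import defaultdict

# Illustrative sketch of the core idea in stream processing frameworks:
# grouping events into fixed, non-overlapping (tumbling) time windows
# and aggregating per window. Event shape is an assumption.

def tumbling_window_counts(events, window_seconds=60):
    """Count events per (window_start, key) for tumbling windows."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)
        counts[(window_start, key)] += 1
    return dict(counts)

if __name__ == "__main__":
    events = [(5, "login"), (30, "login"), (65, "login"), (70, "payment")]
    print(tumbling_window_counts(events))
    # {(0, 'login'): 2, (60, 'login'): 1, (60, 'payment'): 1}
```

Real streaming engines add what this sketch omits: out-of-order events, watermarks, and incremental state that does not fit in one process's memory.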
We offer what matters most to you:
- Competitive salary: we guarantee a stable income and annual bonuses for your personal contribution. Additionally, we have a referral program with rewards for bringing in new colleagues to Raiffeisen Bank
- Social package: official employment, 28 days of paid leave, additional paternity leave, and financial assistance for parents with newborns
- Comfortable working conditions: possibility of a hybrid work format, offices equipped with shelters and generators, modern equipment
- Wellbeing program: all employees have access to medical insurance from the first working day; consultations with a psychologist, nutritionist, or lawyer; discount programs for sports and purchases; family days for children and adults; in-office massages
- Training and development: access to over 130 online training resources; corporate training programs in CX, Data, IT Security, Leadership, Agile. Corporate library and English lessons.
- Great team: our colleagues form a community where curiosity, talent, and innovation are welcome. We support each other, learn together, and grow. You can find like-minded individuals in over 15 professional communities, reading clubs, or sports clubs
- Career opportunities: we encourage advancement within the bank across functions
- Innovations and technologies: Infrastructure: AWS, Kubernetes, Docker, GitHub, GitHub Actions, ArgoCD, Prometheus, VictoriaMetrics, Vault, OpenTelemetry, ElasticSearch, Crossplane, Grafana. Languages: Java (main), Python (data), Go (infra, security), Swift (iOS), Kotlin (Android). Data stores: SQL (Oracle, PostgreSQL, MS SQL, Sybase). Data management: Kafka, Airflow, Spark, Flink
- Support program for defenders: we maintain jobs and pay average wages to mobilized individuals. For veterans, we have a support program and develop the Bankβs veterans community. We work on increasing awareness among leaders and teams about the return of veterans to civilian life. Raiffeisen Bank has been recognized as one of the best employers for veterans by Forbes
Why Raiffeisen Bank?
- Our main value is people, and we support and recognize them, educate them, and involve them in changes. Join Raif's team, because for us YOU matter!
- One of the largest lenders to the economy and agricultural business among private banks
- Recognized as the best employer by EY, Forbes, Randstad, Franklin Covey, and Delo.UA
- The largest humanitarian aid donor among banks (Ukrainian Red Cross, UNITED24, Superhumans, Π‘ΠΠΠΠΠΠ)
- One of the largest IT product teams among the country's banks
- One of the largest taxpayers in Ukraine: 6.6 billion UAH paid in taxes in 2023
Opportunities for Everyone:
- Raif is guided by principles that focus on people and their development, with 5,500 employees and more than 2.7 million customers at the center of attention
- We support the principles of diversity, equality and inclusiveness
- We are open to hiring veterans and people with disabilities and are ready to adapt the work environment to your special needs
- We cooperate with students and older people, creating conditions for growth at any career stage
Want to learn more? – Follow us on social media:
Facebook, Instagram, LinkedIn
___________________________________________________________________________________________
Π Π°ΠΉΡΡΠ°ΠΉΠ·Π΅Π½ ΠΠ°Π½ΠΊ β Π½Π°ΠΉΠ±ΡΠ»ΡΡΠΈΠΉ ΡΠΊΡΠ°ΡΠ½ΡΡΠΊΠΈΠΉ Π±Π°Π½ΠΊ Π· ΡΠ½ΠΎΠ·Π΅ΠΌΠ½ΠΈΠΌ ΠΊΠ°ΠΏΡΡΠ°Π»ΠΎΠΌ. ΠΡΠ»ΡΡΠ΅ 30 ΡΠΎΠΊΡΠ² ΠΌΠΈ ΡΡΠ²ΠΎΡΡΡΠΌΠΎ ΡΠ° Π²ΠΈΠ±ΡΠ΄ΠΎΠ²ΡΡΠΌΠΎ Π±Π°Π½ΠΊΡΠ²ΡΡΠΊΡ ΡΠΈΡΡΠ΅ΠΌΡ Π½Π°ΡΠΎΡ Π΄Π΅ΡΠΆΠ°Π²ΠΈ.
Π£ Π Π°ΠΉΡΡ ΠΏΡΠ°ΡΡΡ ΠΏΠΎΠ½Π°Π΄ 5 500 ΡΠΏΡΠ²ΡΠΎΠ±ΡΡΠ½ΠΈΠΊΡΠ², ΡΠ΅ΡΠ΅Π΄ Π½ΠΈΡ ΠΎΠ΄Π½Π° ΡΠ· Π½Π°ΠΉΠ±ΡΠ»ΡΡΠΈΡ ΠΏΡΠΎΠ΄ΡΠΊΡΠΎΠ²ΠΈΡ ΠΠ’-ΠΊΠΎΠΌΠ°Π½Π΄, ΡΠΎ Π½Π°Π»ΡΡΡΡ ΠΏΠΎΠ½Π°Π΄ 800 ΡΠ°Ρ ΡΠ²ΡΡΠ². Π©ΠΎΠ΄Π½Ρ ΠΏΠ»ΡΡ-ΠΎ-ΠΏΠ»ΡΡ ΠΌΠΈ ΠΏΡΠ°ΡΡΡΠΌΠΎ, ΡΠΎΠ± Π±ΡΠ»ΡΡ Π½ΡΠΆ 2,7 ΠΌΡΠ»ΡΠΉΠΎΠ½Π° Π½Π°ΡΠΈΡ ΠΊΠ»ΡΡΠ½ΡΡΠ² ΠΌΠΎΠ³Π»ΠΈ ΠΎΡΡΠΈΠΌΠ°ΡΠΈ ΡΠΊΡΡΠ½Π΅ ΠΎΠ±ΡΠ»ΡΠ³ΠΎΠ²ΡΠ²Π°Π½Π½Ρ, ΠΊΠΎΡΠΈΡΡΡΠ²Π°ΡΠΈΡΡ ΠΏΡΠΎΠ΄ΡΠΊΡΠ°ΠΌΠΈ Ρ ΡΠ΅ΡΠ²ΡΡΠ°ΠΌΠΈ Π±Π°Π½ΠΊΡ, ΡΠΎΠ·Π²ΠΈΠ²Π°ΡΠΈ Π±ΡΠ·Π½Π΅Ρ, Π°Π΄ΠΆΠ΅ ΠΌΠΈ #Π Π°Π·ΠΎΠΌ_Π·_Π£ΠΊΡΠ°ΡΠ½ΠΎΡ.β―
Π’Π²ΠΎΡ ΠΌΠ°ΠΉΠ±ΡΡΠ½Ρ ΠΎΠ±ΠΎΠ²βΡΠ·ΠΊΠΈ:
- Π‘ΠΏΡΠ²ΠΏΡΠ°ΡΡ Π· Π΅ΠΊΡΠΏΠ΅ΡΡΠ°ΠΌΠΈ Π· Π΄Π°Π½ΠΈΡ ΡΠ° Π°Π½Π°Π»ΡΡΠΈΠΊΠΈ, ΡΠΎΠ± Π΄ΠΎΡΡΠ³ΡΠΈ Π±ΡΠ»ΡΡΠΎΡ ΡΡΠ½ΠΊΡΡΠΎΠ½Π°Π»ΡΠ½ΠΎΡΡΡ Π½Π°ΡΠΈΡ ΡΠΈΡΡΠ΅ΠΌ Π΄Π°Π½ΠΈΡ
- ΠΡΠΎΠ΅ΠΊΡΡΠ²Π°Π½Π½Ρ, Π²ΠΈΠΊΠΎΡΠΈΡΡΠ°Π½Π½Ρ ΡΠ° ΡΠ΅ΡΡΡΠ²Π°Π½Π½Ρ ΡΠ½ΡΡΠ°ΡΡΡΡΠΊΡΡΡΠΈ, Π½Π΅ΠΎΠ±Ρ ΡΠ΄Π½ΠΎΡ Π΄Π»Ρ ΠΎΠΏΡΠΈΠΌΠ°Π»ΡΠ½ΠΎΠ³ΠΎ Π²ΠΈΠ»ΡΡΠ΅Π½Π½Ρ, ΠΏΠ΅ΡΠ΅ΡΠ²ΠΎΡΠ΅Π½Π½Ρ ΡΠ° Π·Π°Π²Π°Π½ΡΠ°ΠΆΠ΅Π½Π½Ρ Π΄Π°Π½ΠΈΡ Π· ΡΠΈΡΠΎΠΊΠΎΠ³ΠΎ ΡΠΏΠ΅ΠΊΡΡΡ Π΄ΠΆΠ΅ΡΠ΅Π» Π΄Π°Π½ΠΈΡ Π·Π° Π΄ΠΎΠΏΠΎΠΌΠΎΠ³ΠΎΡ ΡΠ΅Ρ Π½ΠΎΠ»ΠΎΠ³ΡΠΉ SQL ΡΠ° AWS Π΄Π»Ρ Π²Π΅Π»ΠΈΠΊΠΈΡ Π΄Π°Π½ΠΈΡ (DevOps ΡΠ° Π±Π΅Π·ΠΏΠ΅ΡΠ΅ΡΠ²Π½Π° ΡΠ½ΡΠ΅Π³ΡΠ°ΡΡΡ)
- Π‘ΠΏΡΠΈΡΠ½Π½Ρ ΡΠΎΠ·Π²ΠΈΡΠΊΡ ΡΠ½ΡΡΠ°ΡΡΡΡΠΊΡΡΡΠΈ Π΄Π°Π½ΠΈΡ ΡΠ»ΡΡ ΠΎΠΌ ΠΏΡΠΎΠ΅ΠΊΡΡΠ²Π°Π½Π½Ρ ΡΠ° Π²ΠΏΡΠΎΠ²Π°Π΄ΠΆΠ΅Π½Π½Ρ Π±Π°Π·ΠΎΠ²ΠΎΡ Π»ΠΎΠ³ΡΠΊΠΈ ΡΠ° ΡΡΡΡΠΊΡΡΡΠΈ Π΄Π»Ρ Π½Π°Π»Π°ΡΡΡΠ²Π°Π½Π½Ρ, ΠΎΡΠΈΡΠ΅Π½Π½Ρ ΡΠ°, Π·ΡΠ΅ΡΡΠΎΡ, Π·Π±Π΅ΡΡΠ³Π°Π½Π½Ρ Π΄Π°Π½ΠΈΡ Π΄Π»Ρ Π²ΠΈΠΊΠΎΡΠΈΡΡΠ°Π½Π½Ρ Π² ΠΎΡΠ³Π°Π½ΡΠ·Π°ΡΡΡ
- ΠΠ±ΠΈΡΠ°ΡΠΈ Π²Π΅Π»ΠΈΠΊΡ, ΡΠΊΠ»Π°Π΄Π½Ρ Π½Π°Π±ΠΎΡΠΈ Π΄Π°Π½ΠΈΡ , ΡΠΎ Π²ΡΠ΄ΠΏΠΎΠ²ΡΠ΄Π°ΡΡΡ ΡΡΠ½ΠΊΡΡΠΎΠ½Π°Π»ΡΠ½ΠΈΠΌ/Π½Π΅ΡΡΠ½ΠΊΡΡΠΎΠ½Π°Π»ΡΠ½ΠΈΠΌ Π±ΡΠ·Π½Π΅Ρ-Π²ΠΈΠΌΠΎΠ³Π°ΠΌ
- Π‘ΡΠ²ΠΎΡΡΠ²Π°ΡΠΈ ΡΠ½ΡΠ΅Π³ΡΠ°ΡΡΡ Π΄Π°Π½ΠΈΡ Π· ΡΡΠ·Π½ΠΈΡ Π΄ΠΆΠ΅ΡΠ΅Π» ΡΠ° ΡΠ΅Ρ Π½ΠΎΠ»ΠΎΠ³ΡΠΉ Π² ΡΠ½ΡΡΠ°ΡΡΡΡΠΊΡΡΡΡ ΠΎΠ·Π΅ΡΠ° Π΄Π°Π½ΠΈΡ ΡΠΊ ΡΠ°ΡΡΠΈΠ½Π° Π³Π½ΡΡΠΊΠΎΡ ΠΊΠΎΠΌΠ°Π½Π΄ΠΈ Π· ΠΏΠΎΡΡΠ°ΡΠ°Π½Π½Ρ
- ΠΠΎΠ½ΡΡΠΎΡΠΈΡΠΈ ΠΌΠΎΠΆΠ»ΠΈΠ²ΠΎΡΡΡ ΡΠ° ΡΠ΅Π°Π³ΡΠ²Π°ΡΠΈ Π½Π° Π½Π΅Π·Π°ΠΏΠ»Π°Π½ΠΎΠ²Π°Π½Ρ ΠΏΠ΅ΡΠ΅Π±ΠΎΡ, Π·Π°Π±Π΅Π·ΠΏΠ΅ΡΡΡΡΠΈ ΡΠ²ΠΎΡΡΠ°ΡΠ½Π΅ Π½Π°Π΄Π°Π½Π½Ρ ΡΠ° Π·Π°Π²Π°Π½ΡΠ°ΠΆΠ΅Π½Π½Ρ ΡΠ΅ΡΠ΅Π΄ΠΎΠ²ΠΈΡ
Π’Π²ΡΠΉ Π΄ΠΎΡΠ²ΡΠ΄ ΡΠ° Π½Π°Π²ΠΈΡΠΊΠΈ:
- ΠΡΠ½ΡΠΌΡΠΌ 5 ΡΠΎΠΊΡΠ² Π΄ΠΎΡΠ²ΡΠ΄Ρ ΡΠΎΠ±ΠΎΡΠΈ Π½Π° ΠΏΠΎΡΠ°Π΄Ρ ΡΠΏΠ΅ΡΡΠ°Π»ΡΠ·ΠΎΠ²Π°Π½ΠΎΠ³ΠΎ ΡΠ½ΠΆΠ΅Π½Π΅ΡΠ° Π· Π΄Π°Π½ΠΈΡ
- ΠΠΎΡΠ²ΡΠ΄ ΡΠΎΠ±ΠΎΡΠΈ Π· Π²Π΅Π»ΠΈΠΊΠΈΠΌΠΈ ΡΡΡΡΠΊΡΡΡΠΎΠ²Π°Π½ΠΈΠΌΠΈ ΡΠ° Π½Π΅ΡΡΡΡΠΊΡΡΡΠΎΠ²Π°Π½ΠΈΠΌΠΈ Π΄Π°Π½ΠΈΠΌΠΈ Π² ΡΡΠ·Π½ΠΈΡ ΡΠΎΡΠΌΠ°ΡΠ°Ρ
- ΠΠ½Π°Π½Π½Ρ Π°Π±ΠΎ Π΄ΠΎΡΠ²ΡΠ΄ ΡΠΎΠ±ΠΎΡΠΈ Π· ΡΡΠ΅ΠΉΠΌΠ²ΠΎΡΠΊΠ°ΠΌΠΈ ΠΏΠΎΡΠΎΠΊΠΎΠ²ΠΈΡ Π΄Π°Π½ΠΈΡ ΡΠ° ΡΠΎΠ·ΠΏΠΎΠ΄ΡΠ»Π΅Π½ΠΈΠΌΠΈ Π°ΡΡ ΡΡΠ΅ΠΊΡΡΡΠ°ΠΌΠΈ Π΄Π°Π½ΠΈΡ (Π½Π°ΠΏΡΠΈΠΊΠ»Π°Π΄,
- Spark Structured Streaming, Apache Beam Π°Π±ΠΎ Apache Flink)
- ΠΠΎΡΠ²ΡΠ΄ ΡΠΎΠ±ΠΎΡΠΈ Π· Ρ ΠΌΠ°ΡΠ½ΠΈΠΌΠΈ ΡΠ΅Ρ Π½ΠΎΠ»ΠΎΠ³ΡΡΠΌΠΈ (Π±Π°ΠΆΠ°Π½ΠΎ AWS, Azure)
- ΠΠΎΡΠ²ΡΠ΄ ΡΠΎΠ±ΠΎΡΠΈ Π· Ρ ΠΌΠ°ΡΠ½ΠΈΠΌΠΈ ΡΠ΅ΡΠ²ΡΡΠ°ΠΌΠΈ (Data Flow, Data Proc, BigQuery, Pub/Sub)
- ΠΠΎΡΠ²ΡΠ΄ ΠΏΡΠ°ΠΊΡΠΈΡΠ½ΠΎΡ Π΅ΠΊΡΠΏΠ»ΡΠ°ΡΠ°ΡΡΡ ΡΡΠ΅ΠΊΡ Big Data: Hadoop, HDFS, Hive, Presto, Kafka
- ΠΠΎΡΠ²ΡΠ΄ ΡΠΎΠ±ΠΎΡΠΈ Π· Python Ρ ΠΊΠΎΠ½ΡΠ΅ΠΊΡΡΡ ΡΡΠ²ΠΎΡΠ΅Π½Π½Ρ ETL-ΠΏΠΎΡΠΎΠΊΡΠ² Π΄Π°Π½ΠΈΡ
- ΠΠΎΡΠ²ΡΠ΄ ΡΠΎΠ±ΠΎΡΠΈ Π· ΡΡΡΠ΅Π½Π½ΡΠΌΠΈ Data Lake / Data Warehouse (AWS S3 // Minio)
- ΠΠΎΡΠ²ΡΠ΄ ΡΠΎΠ±ΠΎΡΠΈ Π· Apache Airflow
- ΠΠ°Π²ΠΈΡΠΊΠΈ ΡΠΎΠ·ΡΠΎΠ±ΠΊΠΈ Π² ΡΠ΅ΡΠ΅Π΄ΠΎΠ²ΠΈΡΡ Docker / Kubernetes
- ΠΡΠ΄ΠΊΡΠΈΡΠ° ΡΠ° ΠΊΠΎΠΌΠ°Π½Π΄Π½Π° ΠΎΡΠΎΠ±ΠΈΡΡΡΡΡΡ, ΠΊΠΎΠΌΡΠ½ΡΠΊΠ°ΡΠΈΠ²Π½Ρ Π½Π°Π²ΠΈΡΠΊΠΈ
- ΠΠΎΡΠΎΠ²Π½ΡΡΡΡ ΠΏΡΠ°ΡΡΠ²Π°ΡΠΈ Π² Π³Π½ΡΡΠΊΠΎΠΌΡ ΡΠ΅ΡΠ΅Π΄ΠΎΠ²ΠΈΡΡ
We offer what matters to you:
- Competitive salary: we guarantee a stable income and annual bonuses for your personal contribution. We also run a referral program that rewards you for bringing new colleagues to Raiffeisen Bank.
- Social package: official employment, 28 days of paid vacation, additional paternity leave for fathers, and financial assistance for parents at the birth of a child.
- Comfortable working conditions: a hybrid work option, offices equipped with shelters and generators, and modern hardware provided.
- Wellbeing program: medical insurance for all employees from the first working day; consultations with a psychologist, nutritionist, or lawyer; discount programs for sports and shopping; family days for children and adults; in-office massage.
- Learning and development: access to over 130 online learning resources; corporate training programs in CX, Data, IT Security, Leadership, and Agile; a corporate library and English lessons.
- A great team: our colleagues are a community that welcomes curiosity, talent, and innovation. We support each other, learn together, and grow. You can find like-minded people in more than 15 professional communities and in our reading and sports clubs.
- Career opportunities: we encourage internal moves between functions within the bank.
- Innovation and technology. Infrastructure: AWS, Kubernetes, Docker, GitHub, GitHub Actions, ArgoCD, Prometheus, Victoria, Vault, OpenTelemetry, ElasticSearch, Crossplane, Grafana. Languages: Java (main), Python (data), Go (infra, security), Swift (iOS), Kotlin (Android). Datastores: Oracle, PostgreSQL, MS SQL, Sybase. Data management: Kafka, Airflow, Spark, Flink.
- Defender support program: we keep jobs open for mobilized employees and pay them their average salary. We run a support program for veterans, and the Bank's veteran community keeps growing. We also work on raising managers' and teams' awareness of veterans' return to civilian life. Raiffeisen Bank has been recognized as one of the best employers for veterans (Forbes).
Why Raiffeisen Bank?
- Our main value is people, and we give them support and recognition, train them, and involve them in change. Join the Raif team, because YOU matter to us!
- One of the largest lenders to the economy and to agribusiness among private banks
- Recognized as a top employer by EY, Forbes, Randstad, Franklin Covey, and Delo.UA
- The largest donor of humanitarian aid among banks (Ukrainian Red Cross, UNITED24, Superhumans, СМІЛИВІ)
- One of the largest taxpayers in Ukraine: UAH 6.6 billion paid for 2023
Opportunities for everyone:
- Raif is guided by people-centered principles focused on personal development, with 5,500 employees and more than 2.7 million clients at the center of attention
- We uphold the principles of diversity, equity, and inclusion
- We are open to hiring veterans and people with disabilities and are ready to adapt the work environment to your specific needs
- We work with students and older professionals, creating conditions for growth at any stage of a career
Want to learn more? Follow us on social media:
Facebook, Instagram, LinkedIn
-
Python Engineer (Egypt)
Full Remote · Egypt · 4 years of experience · B2 - Upper Intermediate
About the Project:
We are developing a high-performance backend platform that powers data-intensive operations, real-time processing, and integrations with multiple external services. The system is built with a modern microservice architecture, leveraging cloud infrastructure, asynchronous Python, and robust CI/CD pipelines.
You will work in a skilled engineering team focused on scalability, clean architecture, and automation.
Requirements:
- 4+ years of experience in backend development
- Strong hands-on expertise with Python (FastAPI, Django, or Flask)
- Solid understanding of REST APIs, asynchronous programming, and clean architecture
- Experience with relational and/or NoSQL databases
- Familiarity with Docker and CI/CD processes
- Ability to work independently in a remote environment
- English Upper Intermediate or higher
Responsibilities:
- Develop and maintain backend services and APIs
- Design scalable architecture and contribute to technical decisions
- Work with databases, optimize queries, and ensure data reliability
- Implement integrations with third-party services
- Write clean, maintainable code and participate in code reviews
- Collaborate with DevOps, QA, and product teams
- Troubleshoot, debug, and optimize system performance
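The asynchronous style this stack relies on (whether under FastAPI or plain asyncio) comes down to running I/O-bound integration calls concurrently instead of sequentially. `fetch_quote` and the service names below are invented stand-ins for real third-party calls:

```python
import asyncio

async def fetch_quote(service: str, delay: float) -> dict:
    """Simulate an I/O-bound call to an external service."""
    await asyncio.sleep(delay)
    return {"service": service, "ok": True}

async def gather_quotes() -> list[dict]:
    # Run the calls concurrently, so total latency is close to the
    # slowest single call rather than the sum of all of them.
    return await asyncio.gather(
        fetch_quote("payments", 0.02),
        fetch_quote("shipping", 0.01),
    )

results = asyncio.run(gather_quotes())
print(results)
```

`asyncio.gather` preserves the order of its arguments in the returned list, which keeps responses easy to correlate with the services that produced them.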
We Offer:
- Work with a modern Python backend stack
- A flexible remote schedule
- Competitive compensation
- International engineering culture with strong technical standards
- An opportunity to work on a long-term, impactful product
-
Full-Stack Developer (React + Node.js), Serbia
Full Remote · Serbia · 5 years of experience · B2 - Upper Intermediate
About the Project:
We are building a modern cloud-based platform designed to help mid-size and enterprise-level companies manage and optimize their internal digital infrastructure. The product streamlines workflows, enhances transparency, and provides real-time insights into operational performance.
The platform includes a real-time reporting module, a flexible role-based access system, integrations with third-party services, and an intuitive interface for visualizing complex business processes.
Our architecture is based on a microservices approach, leveraging Kubernetes, cloud services, and up-to-date DevOps practices. The team follows Scrum, with bi-weekly release cycles and strong engineering standards.
Requirements:
- 5+ years of commercial experience in software development
- Strong experience with Node.js, Kubernetes, TypeScript
- Hands-on experience with React
- Familiarity with Next.js (nice to have)
- Solid understanding of modern CI/CD practices
- English Upper-Intermediate or higher
Responsibilities:
- Develop new features and modules for the platform
- Work on both frontend (React) and backend (Node.js) parts of the system
- Participate in architectural discussions and contribute to technical decisions
- Work with microservices, Kubernetes deployments, and cloud infrastructure
- Optimize performance, ensure code quality, and maintain best engineering practices
- Collaborate closely with QA, DevOps, and Product teams
- Take part in sprint planning, task estimation, and code reviews
We Offer:
- Work in a strong international engineering team
- Opportunity to influence technical decisions and product architecture
- Fully remote & flexible schedule
- Competitive compensation
- Modern tech stack and challenging tasks
-
Middle/Senior Data Engineers (Serbia)
Full Remote · Serbia · 5 years of experience · B2 - Upper Intermediate
Middle/Senior Data Engineer
We are developing a next-generation analytics platform that helps global companies transform raw data into actionable insights. The product centralizes data from multiple sources, automates data quality checks, and provides a unified semantic layer for analytics and reporting teams.
Our data ecosystem is built on a modern tech stack using a cloud data warehouse (BigQuery/Snowflake), dbt for transformations, Airflow for orchestration, and a collection of BI tools for dashboards and self-service analytics.
You will join a highly skilled data team that collaborates closely with Data Scientists, Analysts, and Backend Engineers to design scalable and reliable data pipelines.
Requirements:
- 5+ years of experience working with data pipelines in production
- Strong skills in SQL and Python
- Hands-on experience with dbt and cloud data warehouses (BigQuery, Snowflake)
- Experience with Apache Airflow or other orchestration tools
- Good knowledge of Docker and modern CI/CD workflows
- Familiarity with BI tools (Superset, Metabase, Looker, Tableau)
- Fluent English
Responsibilities:
- Build, maintain, and optimize production-grade ETL/ELT pipelines
- Develop dbt models and contribute to the platformβs data transformation layer
- Work with BigQuery/Snowflake to design efficient data architectures
- Set up and manage workflows in Airflow (or similar orchestration systems)
- Ensure data quality, reliability, and proper documentation
- Collaborate with data analysts, ML engineers, and product teams
- Improve pipeline performance and implement best practices in DevOps for data
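The automated data-quality checks mentioned above can be sketched with plain Python; in practice tools like dbt tests cover this ground, and the rule set and field names below are assumptions for illustration, not the product's actual schema:

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool
    failures: int

def check_not_null(rows: list[dict], column: str) -> CheckResult:
    """Fail if any row has a missing or empty value in the column."""
    bad = sum(1 for r in rows if r.get(column) in (None, ""))
    return CheckResult(f"not_null:{column}", bad == 0, bad)

def check_unique(rows: list[dict], column: str) -> CheckResult:
    """Fail if the column contains duplicate values."""
    values = [r.get(column) for r in rows]
    dupes = len(values) - len(set(values))
    return CheckResult(f"unique:{column}", dupes == 0, dupes)

rows = [
    {"user_id": 1, "email": "a@x.io"},
    {"user_id": 2, "email": ""},
    {"user_id": 2, "email": "c@x.io"},
]
report = [check_not_null(rows, "email"), check_unique(rows, "user_id")]
print([(c.name, c.passed, c.failures) for c in report])
```

A pipeline would typically run such checks as a gating step after each load, failing the run (or raising an alert) when a check reports failures.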
We Offer:
- Work with a modern cloud data stack and cutting-edge tools
- An international team with a strong data-driven culture
- Flexible remote work
- Competitive compensation
- Opportunity to shape the data architecture of a growing product
-
Python Engineer (Serbia)
Full Remote · Serbia · 4 years of experience · B2 - Upper Intermediate
About the Project:
We are developing a high-performance backend platform that powers data-intensive operations, real-time processing, and integrations with multiple external services. The system is built with a modern microservice architecture, leveraging cloud infrastructure, asynchronous Python, and robust CI/CD pipelines.
You will work in a skilled engineering team focused on scalability, clean architecture, and automation.
Requirements:
- 4+ years of experience in backend development
- Strong hands-on expertise with Python (FastAPI, Django, or Flask)
- Solid understanding of REST APIs, asynchronous programming, and clean architecture
- Experience with relational and/or NoSQL databases
- Familiarity with Docker and CI/CD processes
- Ability to work independently in a remote environment
- English Upper Intermediate or higher
Responsibilities:
- Develop and maintain backend services and APIs
- Design scalable architecture and contribute to technical decisions
- Work with databases, optimize queries, and ensure data reliability
- Implement integrations with third-party services
- Write clean, maintainable code and participate in code reviews
- Collaborate with DevOps, QA, and product teams
- Troubleshoot, debug, and optimize system performance
We Offer:
- Work with a modern Python backend stack
- A flexible remote schedule
- Competitive compensation
- International engineering culture with strong technical standards
- An opportunity to work on a long-term, impactful product
-
Middle/Senior Data Engineers (Armenia)
Full Remote · Armenia · 5 years of experience · B2 - Upper Intermediate
Middle/Senior Data Engineer
We are developing a next-generation analytics platform that helps global companies transform raw data into actionable insights. The product centralizes data from multiple sources, automates data quality checks, and provides a unified semantic layer for analytics and reporting teams.
Our data ecosystem is built on a modern tech stack using a cloud data warehouse (BigQuery/Snowflake), dbt for transformations, Airflow for orchestration, and a collection of BI tools for dashboards and self-service analytics.
You will join a highly skilled data team that collaborates closely with Data Scientists, Analysts, and Backend Engineers to design scalable and reliable data pipelines.
Requirements:
- 5+ years of experience working with data pipelines in production
- Strong skills in SQL and Python
- Hands-on experience with dbt and cloud data warehouses (BigQuery, Snowflake)
- Experience with Apache Airflow or other orchestration tools
- Good knowledge of Docker and modern CI/CD workflows
- Familiarity with BI tools (Superset, Metabase, Looker, Tableau)
- Fluent English
Responsibilities:
- Build, maintain, and optimize production-grade ETL/ELT pipelines
- Develop dbt models and contribute to the platformβs data transformation layer
- Work with BigQuery/Snowflake to design efficient data architectures
- Set up and manage workflows in Airflow (or similar orchestration systems)
- Ensure data quality, reliability, and proper documentation
- Collaborate with data analysts, ML engineers, and product teams
- Improve pipeline performance and implement best practices in DevOps for data
We Offer:
- Work with a modern cloud data stack and cutting-edge tools
- An international team with a strong data-driven culture
- Flexible remote work
- Competitive compensation
- Opportunity to shape the data architecture of a growing product