Busy Rebel

Joined in 2020
30% answers
At Busy Rebel, we aim to redefine the technology and innovation landscape, creating a distinctive ecosystem of exceptional talent while fostering ownership, inclusion, and diversity. We catalyze societal and technological evolution by empowering tech entrepreneurs to challenge norms and spearhead change.

Busy Rebel aims to forge unparalleled partnerships, driving global impact through creative rebellion. We commit to being the trusted ally that amplifies visions with expert insight and inventive strategies, fostering a community of disruptive thinkers for technological empowerment and societal betterment. Every partnership and project is a step toward a more interconnected and enlightened world.

Be part of a company where every day is an opportunity to learn, challenge, and transcend the ordinary. Your journey with Busy Rebel is not just a career; it’s a pivotal chapter in the larger story of technological revolution and societal evolution. Join us, and let’s build the future together.
  • 77 views · 10 applications · 25d

    Data Engineer

    Part-time · Full Remote · Countries of Europe or Ukraine · Product · 5 years of experience · English - B2

    We are looking for a Senior Data Engineer to join us on a freelance basis to drive R&D initiatives, PoCs, and architecture validations for our enterprise and startup clients. You’ll work at the edge of innovation: validating modern data technologies, designing scalable prototypes, and enabling real-world applications in complex, regulated environments. The role is ideal for an experienced engineer with strong systems thinking, deep knowledge of modern data architectures, and the ability to move fast in R&D cycles that may mature into production environments.


    What You’ll Do

    • Design and build proof-of-concept pipelines in cloud-native environments (AWS/GCP) to validate performance, scalability, and architecture (see the sketch after this list).
    • Work across OLAP systems like ClickHouse, Redshift, BigQuery, and support data persistence strategies for batch and real-time use cases.
    • Contribute to the design of data-agnostic platforms capable of working across structured, semi-structured, and unstructured data sources.
    • Evaluate and experiment with modern approaches such as Data Mesh, Lakehouse, and unified metadata/catalog strategies.
    • Prototype graph-based analytics components where applicable (e.g., Neo4j, Amazon Neptune).
    • Collaborate with architects, AI/ML engineers, and domain experts to deliver validated data foundations for further automation and intelligence.
    • Work with enterprise teams, adapting solutions to their compliance, security, and governance requirements.
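
    To give a flavour of the PoC work, here is a minimal sketch that batch-loads a Parquet extract into ClickHouse and times an analytical query. The host, table, columns, and file path are hypothetical placeholders, not an actual client setup.

```python
# Minimal PoC sketch: batch-load a Parquet extract into ClickHouse and time a
# typical aggregation. Host, table, columns, and file path are placeholders.
# Requires: pip install clickhouse-connect pandas pyarrow
import time

import clickhouse_connect
import pandas as pd

client = clickhouse_connect.get_client(host="localhost", port=8123)

# Analytical table on the MergeTree engine, ordered for the query below.
client.command(
    """
    CREATE TABLE IF NOT EXISTS events_poc (
        event_time DateTime,
        user_id    UInt64,
        event_type LowCardinality(String),
        value      Float64
    ) ENGINE = MergeTree ORDER BY (event_type, event_time)
    """
)

# Batch ingestion: assumes the Parquet columns match the table schema.
df = pd.read_parquet("events_sample.parquet")
client.insert_df("events_poc", df)

# Validation query: time a representative aggregation so the same harness can
# be pointed at Redshift or BigQuery for an apples-to-apples comparison.
start = time.perf_counter()
result = client.query_df(
    "SELECT event_type, count() AS events, avg(value) AS avg_value "
    "FROM events_poc GROUP BY event_type ORDER BY events DESC"
)
print(result)
print(f"query took {time.perf_counter() - start:.3f}s")
```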


    Required Skills & Experience

    • 7+ years in data engineering, with a strong record of delivering backend and infrastructure for large-scale data systems.
    • Hands-on experience with AWS and/or GCP (IAM, VPCs, storage, compute, cost control).
    • Proven use of ClickHouse, Redshift, BigQuery, or similar for high-performance analytical workloads.
    • Practical knowledge of Lakehouse, Data Mesh, Data Lake + Warehouse hybrid models.
    • Experience building data ingestion frameworks (batch & streaming), including CDC, schema evolution, and orchestration.
    • Strong Python or Go; advanced SQL; CI/CD familiarity.
    • Comfort interacting with enterprise stakeholders; clear, concise documentation and proposal skills.
    • Product-oriented, research-driven, and able to handle ambiguity while delivering value fast.

      Bonus Points
    • Experience with graph technologies (e.g., Neo4j, Neptune, TigerGraph); a brief sketch follows this list.
    • Familiarity with dbt, Airflow, Dagster, or similar orchestrators.
    • Knowledge of open metadata/catalog tools (OpenMetadata, DataHub, Amundsen).
    • Experience in highly regulated or enterprise environments.
    • Involvement in cloud cost optimization, FinOps, or scalable query engine evaluation.
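
    For the graph bonus, a minimal prototyping sketch with the official Neo4j Python driver might look like this; the URI, credentials, and toy data model are hypothetical:

```python
# Minimal graph-analytics sketch using the official Neo4j Python driver.
# The URI, credentials, and toy data model are hypothetical placeholders.
# Requires: pip install neo4j
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "secret"))

with driver.session() as session:
    # Seed a tiny entity graph: a customer connected to a purchased product.
    session.run(
        """
        MERGE (c:Customer {id: $cid})
        MERGE (p:Product {sku: $sku})
        MERGE (c)-[:BOUGHT]->(p)
        """,
        cid="c-1001",
        sku="sku-42",
    )

    # Representative analytics query: which product pairs share customers?
    records = session.run(
        """
        MATCH (a:Product)<-[:BOUGHT]-(:Customer)-[:BOUGHT]->(b:Product)
        WHERE a.sku < b.sku
        RETURN a.sku AS left, b.sku AS right, count(*) AS shared
        ORDER BY shared DESC LIMIT 10
        """
    )
    for rec in records:
        print(rec["left"], rec["right"], rec["shared"])

driver.close()
```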


    Engagement Details

    • Type: Freelance / B2B contract
    • Extension: High potential to convert into a core team role or longer-term engagement
    • Location: Remote (preference for overlap with European time zones)
  • 142 views · 25 applications · 11d

    DevOps Engineer

    Part-time · Full Remote · Countries of Europe or Ukraine · Product · 6 years of experience · English - B2

    About the project

    We are building a cloud- and data-agnostic process intelligence platform. The platform connects operational data, KPIs, and process flows to deliver real-time process visibility, analytics, and automation across heterogeneous environments. The system runs primarily on Google Cloud, supports hybrid deployments, and serves data-heavy, security-sensitive workloads.


    Responsibilities:

    • Operate and evolve GCP infrastructure (VPC, IAM, GKE, Secret Manager).
    • Manage Kubernetes workloads in production environments.
    • Own GitOps-based CI/CD using Argo CD and Helm.
    • Maintain the observability stack (Prometheus, Grafana, Loki); a monitoring sketch follows this list.
    • Support hybrid cloud / customer-VPC deployments.
    • Enforce security, access control, and environment isolation.
    • Monitor and optimize cost, reliability, and performance.
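
    To give a flavour of the observability side, here is a minimal sketch that polls the Prometheus HTTP API for scrape targets that are down; the Prometheus URL is a placeholder and the alert action is stubbed out:

```python
# Minimal observability sketch: ask Prometheus' HTTP API for scrape targets
# that failed their last scrape ("up == 0"). The URL is a placeholder.
# Requires: pip install requests
import requests

PROMETHEUS_URL = "http://prometheus.monitoring.svc:9090"

def down_targets() -> list[dict]:
    resp = requests.get(
        f"{PROMETHEUS_URL}/api/v1/query",
        params={"query": "up == 0"},
        timeout=10,
    )
    resp.raise_for_status()
    payload = resp.json()
    if payload["status"] != "success":
        raise RuntimeError(f"Prometheus query failed: {payload}")
    # Each result carries the label set of one down target.
    return [r["metric"] for r in payload["data"]["result"]]

if __name__ == "__main__":
    for target in down_targets():
        # A real setup would route this to Alertmanager or a chat webhook.
        print(f"DOWN: job={target.get('job')} instance={target.get('instance')}")
```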


    Required Technical Stack:

    • Cloud: Google Cloud Platform (GKE Standard, IAM, VPC, Cloud DNS, Secret Manager)
    • Containers: Kubernetes, Docker
    • CI/CD: Argo CD, Helm, GitHub or GitLab pipelines
    • Observability: Prometheus, Grafana, Loki
    • Datastores (operational familiarity): PostgreSQL, Redis, Neo4j
    • Messaging: NATS or similar
    • Runtime environments: Python (FastAPI), Node.js
    • Security: Workload Identity, RBAC, secrets management


    Requirements:

    • Proven production experience with Kubernetes on GCP
    • Strong understanding of GitOps and declarative infrastructure
    • Experience operating data-intensive platforms
    • Comfortable supporting hybrid and multi-environment deployments
    • Able to work independently in a part-time, remote setup
    • Pragmatic, operationally focused, low-ceremony mindset


    Nice to Have:

    • FinOps or cloud cost optimization experience
    • Experience in regulated or enterprise environments
    • Familiarity with data platforms or analytics workloads


    Engagement:

    • Remote, part-time
    • UK or EU timezone preferred
    • Hands-on execution role, not advisory
  • 174 views · 44 applications · 8d

    UX/UI Designer

    Part-time · Full Remote · Countries of Europe or Ukraine · Product · 5 years of experience · English - C1

    We are building a consumer-facing web platform that helps employees increase their purchasing power through better bill management, exclusive offers, and a rewards model tied to real-world community impact. The product includes member onboarding, group-based experiences, directories, offer discovery, a wallet-style balance and transactions, and contribution flows to vetted community projects.

    This is a hands-on role for a designer who can take early concepts and AI-generated prototypes and turn them into a coherent, scalable, implementation-ready product experience.

    What You’ll Do

    • Design and adapt initial concepts into a design system while preserving usability and consistency with the tech stack.
    • Bring clarity and usability improvements while remaining aligned with the original product vision.
    • Create wireframes, UI mockups, and interaction flows based on evolving requirements.
    • Collaborate with developers to ensure smooth implementation of your designs.
    • Maintain design consistency across projects and contribute to the growth of our design system.


    What You’ll Be Designing

    You will work across several core areas of a modern B2C web product:

    • Public site and onboarding flows (sign-up, verification, joining a group-based space).
    • Member profile and settings, including wallet-like views (balances, history, claims, transfers).
    • Discovery and directory UX (search, filters, landing pages for groups, projects, partner businesses).
    • Offer and voucher purchasing flows, including clear pricing, value explanation, and trust cues.
    • Contribution flows to community initiatives, including attribution and transparency elements.
    • Admin-lite interfaces for partners (basic content publishing, program management concepts).


    What We’re Looking For

    Must-have experience:

    • 4+ years of experience designing web products, particularly in B2C environments.
    • Strong proficiency with Figma and modern AI design tools.
    • Ability to work with and adapt existing design systems.
    • Experience balancing business requirements, usability, and tech feasibility.

    Nice-to-have:

    • Familiarity with prototyping tools like Lovable.
    • Understanding of responsive design principles and front-end basics.
    • Experience working in dynamic or startup environments.


    A Unique Collaboration Model

    This role involves tight collaboration with our CEO, who acts as both visionary and product owner. The CEO often brings unconventional and ambitious ideas to the table, initially shaping them via AI-powered prototyping.


    We’re not looking for someone to challenge or redirect this vision, but rather a designer who can embrace the creative challenge, refine the raw concepts, and inject usability and structure, making these ideas come alive in a user-friendly and tech-aligned way.


    This calls for someone who is flexible, pragmatic, and capable of navigating ambiguity while staying grounded in solid design principles.


    How We Work (Practical Expectations)

    • You should be comfortable taking imperfect inputs (rough prototypes, partial requirements, shifting priorities) and turning them into crisp deliverables.
    • You will ship in short cycles: concept, wireframes, UI, interaction details, developer handoff, iteration.
    • You will document decisions inside the design system to prevent drift as features expand.


    Deliverables You’ll Own

    • A structured design system (tokens, components, patterns, states, responsive rules).
    • Key user journeys mapped end-to-end with interaction flows.
    • High-fidelity UI with clear handoff specs (component behavior, edge cases, empty states, error handling).
    • Iteration support during implementation (quick feedback loops with engineering).

    Engagement

    • Contract or part-time to start, with potential to extend based on delivery and product momentum.
    • Remote-friendly; collaboration hours should overlap with Europe time zones.
  • 30 views · 5 applications · 7d

    Social Media Manager

    Part-time · Full Remote · Countries of Europe or Ukraine · Product · 6 years of experience · English - None

    We are looking for a Social Media Manager (Instagram, YouTube) who can manage our social channels and also create content hands-on. You should be comfortable designing social posts in Figma and making simple Reels with basic motion and clean editing.


    What you will do

    • Plan and manage our social media content (weekly and monthly).
    • Create posts and carousels in Figma (clean layout, strong typography, consistent branding).
    • Create Reels/short videos: write simple scripts, edit videos, add captions, and publish.
    • Add minimal motion when needed (simple text animation, transitions, UI highlights).
    • Post consistently, engage with comments and DMs, and keep the account active.
    • Track performance and improve content based on what works.


    What we need

    • Experience running social media for a product or startup (or strong portfolio showing you can).
    • Strong Figma skills for social creatives.
    • Good Reels editing skills (CapCut, Premiere, AE, or similar).
    • Basic motion skills (simple animations, not high-end production).
    • Good writing skills: clear hooks, simple messaging, strong CTAs.
    • Reliable execution and ability to deliver content on schedule.


    Nice to have

    • Experience in B2B SaaS or tech products.
    • Paid ads creatives experience.
    • Ability to turn product features into content quickly.


    Output expectations (example)

    • Several posts per week (static + carousels)
    • Several Reels per week (short-form video)


  • 55 views · 7 applications · 7d

    Senior Data Engineer

    Part-time · Full Remote · Countries of Europe or Ukraine · Product · 5 years of experience · English - B2

    About the Platform

    We’re building a unified data ecosystem that connects raw data, analytical models, and intelligent decision layers. The platform combines the principles of data lakes, lakehouses, and modern data warehouses, structured around the Medallion architecture (Bronze / Silver / Gold). Every dataset is versioned, governed, and traceable through a unified catalog and lineage framework. The environment supports analytics, KPI computation, and AI-driven reasoning, and is designed for performance, transparency, and future scalability (in partnership with GCP, OpenAI, and Cohere).

    What You’ll Work On

    1. Data Architecture & Foundations

    • Design, implement, and evolve medallion-style data pipelines — from raw ingestion to curated, business-ready models.
    • Build hybrid data lakes and lakehouses using Iceberg, Delta, or Parquet formats with ACID guarantees and schema evolution (see the sketch after this list).
    • Architect data warehouses that unify batch and streaming sources into a consistent, governed analytics layer.
    • Ensure optimal partitioning, clustering, and storage strategies for large-scale analytical workloads.
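
    For illustration, a minimal Bronze-to-Silver promotion might look like the sketch below; paths, columns, and the Delta session configuration are assumptions, not the platform’s actual layout:

```python
# Minimal medallion sketch: promote raw Bronze events to a deduplicated,
# typed Silver table. Paths and columns are hypothetical; assumes the
# delta-spark package is available to the Spark session.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("bronze-to-silver")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config(
        "spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
    )
    .getOrCreate()
)

# Bronze: raw JSON landed as-is from ingestion.
bronze = spark.read.json("gs://lake/bronze/events/")

# Silver: typed, filtered, deduplicated, partitioned for analytical scans.
silver = (
    bronze
    .withColumn("event_time", F.to_timestamp("event_time"))
    .withColumn("event_date", F.to_date("event_time"))
    .filter(F.col("event_id").isNotNull())  # basic quality gate
    .dropDuplicates(["event_id"])           # idempotent re-runs
)

(
    silver.write.format("delta")
    .mode("append")
    .partitionBy("event_date")
    .save("gs://lake/silver/events/")
)
```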

    2. Data Ingestion & Transformation

    • Create ingestion frameworks for APIs, IoT, ERP, and streaming systems (Kafka, Pub/Sub).
    • Develop reproducible ETL/ELT pipelines using Airflow, dbt, Spark, or Dataflow (an orchestration sketch follows this list).
    • Manage CDC and incremental data loads, ensuring freshness and resilience.
    • Apply quality validation, schema checks, and contract-based transformations at every stage.
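
    A minimal orchestration sketch for such an incremental load is shown below; the DAG id, schedule, and task bodies are illustrative placeholders:

```python
# Minimal Airflow sketch: a daily incremental load driven by the logical date.
# DAG id, schedule, and the extract/load bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_increment(ds: str, **_):
    # "ds" is Airflow's logical date (YYYY-MM-DD); a real pipeline would use
    # it to drive a CDC read or a WHERE updated_at >= ds filter.
    print(f"extracting rows changed on {ds}")

def load_increment(ds: str, **_):
    print(f"merging increment for {ds} into the Silver layer")

with DAG(
    dag_id="incremental_events_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_increment)
    load = PythonOperator(task_id="load", python_callable=load_increment)
    extract >> load
```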

    3. Governance, Cataloging & Lineage

    • Implement a unified data catalog with lineage visibility, metadata capture, and schema versioning.
    • Integrate dbt metadata, OpenLineage, and Great Expectations to enforce data quality.
    • Define clear governance rules: data contracts, access policies, and change auditability.
    • Ensure every dataset is explainable and fully traceable back to its source.

    4. Data Modeling & Lakehouse Operations

    • Design dimensional models and business data marts to power dashboards and KPI analytics.
    • Develop curated Gold-layer tables that serve as trusted sources of truth for analytics and AI workloads.
    • Optimize materialized views and tune query performance for analytical efficiency.
    • Manage cross-domain joins and unified semantics across products, customers, or operational processes.

    5. Observability, Reliability & Performance

    • Monitor data pipeline health, freshness, and cost using modern observability tools (Prometheus, Grafana, Cloud Monitoring).
    • Build proactive alerting, anomaly detection, and drift monitoring for datasets.
    • Implement CI/CD workflows for data infrastructure using Terraform, Helm, and Argo CD.
    • Continuously improve query performance and storage efficiency across warehouses and lakehouses.

    6. Unified Data & Semantic Layers

    • Help define a unified semantic model that connects operational, analytical, and AI-ready data.
    • Work with AI and analytics teams to structure datasets for semantic search, simulation, and reasoning systems.
    • Collaborate on vectorized data representation and process-relationship modeling (graph or vector DBs).


    What We’re Looking For

    • 5+ years of hands-on experience building large-scale data platforms, warehouses, or lakehouses.
    • Strong proficiency in SQL, Python, and distributed processing frameworks (PySpark, Spark, Dataflow).
    • Deep understanding of Medallion architecture, data modeling, and modern ETL orchestration (Airflow, dbt).
    • Experience implementing data catalogs, lineage tracking, and validation frameworks.
    • Knowledge of data governance, schema evolution, and contract-based transformations.
    • Familiarity with streaming architectures, CDC patterns, and real-time analytics.
    • Practical understanding of FinOps, data performance tuning, and cost management in analytical environments.
    • Strong foundation in metadata-driven orchestration, observability, and automated testing.
    • Bonus: experience with ClickHouse, Trino, Iceberg, or hybrid on-prem/cloud data deployments.


    You’ll Excel If You

    • Think of data systems as living, evolving architectures — not just pipelines.
    • Care deeply about traceability, scalability, and explainability.
    • Love designing platforms that unify data across analytics, AI, and process intelligence.
    • Are pragmatic, hands-on, and focused on building systems that last.
  • 117 views · 53 applications · 4d

    Python Web Engineer (Django + FastAPI)

    Part-time · Full Remote · Countries of Europe or Ukraine · Product · 5 years of experience · English - B2

    About the project:

    We are building a cloud- and data-agnostic process intelligence platform. The product exposes APIs and web services for process modeling, KPI computation, analytics, and automation. The system is API-first, data-heavy, and designed to run in cloud and hybrid environments.


    Responsibilities:

    • Build and maintain backend services using Django and FastAPI.
    • Design and implement REST APIs for data, process, and KPI services (a minimal sketch follows this list).
    • Work with PostgreSQL as the primary transactional datastore.
    • Implement async and background workloads where required.
    • Integrate services with data pipelines, analytics layers, and messaging systems.
    • Write production-grade code with clear boundaries and minimal technical debt.
    • Collaborate with DevOps on deployment, observability, and scaling.
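
    As a minimal illustration of the API work above, here is a hedged sketch of a KPI endpoint; the route, model, and in-memory store are hypothetical, not the product’s actual API:

```python
# Minimal sketch of an API-first KPI service endpoint. Route, model, and the
# KPI data are hypothetical placeholders, not the real product API.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="kpi-service")

class KPIResult(BaseModel):
    process_id: str
    name: str
    value: float

# Stand-in for the PostgreSQL-backed storage used in the real service.
FAKE_STORE = {"p-1": {"name": "cycle_time_days", "value": 3.7}}

@app.get("/processes/{process_id}/kpi", response_model=KPIResult)
async def get_kpi(process_id: str) -> KPIResult:
    row = FAKE_STORE.get(process_id)
    if row is None:
        raise HTTPException(status_code=404, detail="unknown process")
    return KPIResult(process_id=process_id, **row)
```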


    Required Technical Stack:

    • Languages: Python 3.x
    • Frameworks: Django, FastAPI
    • APIs: REST (OpenAPI)
    • Databases: PostgreSQL
    • Async & background: Celery, async Python, or equivalent
    • Caching / queues: Redis (operational familiarity)
    • Auth: JWT, role-based access patterns
    • Testing: Pytest or equivalent (a test sketch follows this list)
    • Deployment awareness: Docker, basic Kubernetes literacy
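
    And a matching Pytest sketch using FastAPI’s TestClient, assuming the hypothetical KPI endpoint above lives in a module named app:

```python
# Test sketch for the hypothetical KPI endpoint above, using FastAPI's
# TestClient (requires the httpx package) under Pytest.
from fastapi.testclient import TestClient

from app import app  # hypothetical module containing the FastAPI app

client = TestClient(app)

def test_known_process_returns_kpi():
    resp = client.get("/processes/p-1/kpi")
    assert resp.status_code == 200
    assert resp.json()["name"] == "cycle_time_days"

def test_unknown_process_returns_404():
    assert client.get("/processes/p-404/kpi").status_code == 404
```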


    Requirements:

    • Strong, hands-on experience with Django and FastAPI in production.
    • Solid understanding of API design, data modeling, and transactions.
    • Experience building data-driven backend systems.
    • Comfortable working in a distributed, service-oriented architecture.
    • Able to work independently in a remote, part-time setup.
    • Focus on correctness, maintainability, and performance.


    Nice to Have:

    • Experience with analytics or data platforms.
    • Familiarity with BigQuery or large-scale data processing.
    • Experience integrating with Kubernetes-deployed services.
    • Exposure to enterprise or regulated environments.


    Engagement:

    • Remote, part-time (80h+), with potential conversion to a full-time role within a few months
    • UK or EU timezone preferred
    • Hands-on development role, no management responsibilities