Senior Python AI Engineer (FastAPI)

Who:
We are looking for a Senior Backend Engineer with 7+ years of experience who takes ownership of designing, building, and running production systems.

What:
You will own core backend services and data pipelines that power AI workflows, handle high-volume data, and serve LLMs in production.

When:
Immediate start.

Where:
Remote.

Why:
To build scalable, reliable, data- and AI-driven systems with real production impact and minimal process overhead.

Office Environment:
Fast-growing startup with high ownership, hands-on engineering, and close collaboration across teams.

Salary:
Competitive, based on experience.

Position Overview:

This is a senior, hands-on backend role for engineers who enjoy building systems end-to-end. You’ll design APIs, data pipelines, and AI-adjacent services, take responsibility for their reliability in production, and continuously improve performance, scalability, and maintainability.

Key Responsibilities:

  • Build and own core backend services using Python and FastAPI
  • Design and maintain production-grade APIs for data ingestion, processing, and inference
  • Handle high-volume structured and unstructured data (documents, text, embeddings, events)
  • Design and operate AI data pipelines for ingestion, transformation, and serving
  • Support hosting and scaling of LLMs and ML models in production
  • Design and optimize PostgreSQL schemas and queries
  • Build asynchronous and event-driven systems using RabbitMQ
  • Make pragmatic architecture decisions focused on simplicity and scalability
  • Ensure systems are observable, fault-tolerant, and easy to operate
  • Own production incidents, debugging, and continuous improvement
  • Collaborate closely with product, frontend, and ML teams

Requirements:

  • 7+ years of backend engineering experience
  • Strong proficiency in Python, including async programming
  • Extensive experience with FastAPI or similar modern frameworks
  • Solid fundamentals in API design, authentication, and security
  • Deep experience with PostgreSQL (schema design, indexing, query optimization)
  • Hands-on experience with RabbitMQ or similar messaging systems
  • Experience working with large-scale structured and unstructured data

Nice to Have:

  • Experience building or supporting AI/ML data pipelines
  • Exposure to LLM inference systems (self-hosted or managed)
  • Experience with Docker and containerized deployments
  • Familiarity with CI/CD and automated releases
  • Kubernetes or container orchestration experience
  • Experience running compute-intensive or GPU-backed services
  • Experience with vector databases or search systems

Required Languages:

  • English: B2 (Upper Intermediate)
  • Ukrainian: Native