Confitech Dienstleistungs GmbH

Joined in 2019
55% answers
We started out in 2000 with engineering work in the automotive sector. Driven by our enthusiasm for cars and the desire to make mobility environmentally sustainable, we entered the car-sharing business in 2006 at our company location in Ulm. In 2017, the topic of "needs-based mobility" was expanded through the acquisition of a long-established commercial vehicle rental company in Stuttgart. Alongside application (calibration) work in the automotive sector, function development and software testing have gained ever greater importance in recent years. Consequently, a dedicated IT development division was established in 2020.
  • 57 views · 19 applications · 20d

    Data Scientist — Python, FastAPI, AWS CDK to $5500

    Full Remote · Countries of Europe or Ukraine · 4 years of experience · Upper-Intermediate English

    We are looking for an experienced Data Scientist to join a distributed international team working on AI-driven backend systems. This is a long-term, full-time contract opportunity with a fully remote setup. English is the working language.
     

    You will contribute to the design, development, and optimization of services built on modern ML/AI technologies, with a focus on backend integrations and infrastructure automation.
     

    Project details
    Start: ASAP
    Duration: Until 31.12.2025 (possible extension)
    Workload: Full-time (5 days/week)
    Location: Remote
    Language: English

    Responsibilities

    • Develop and maintain scalable APIs using FastAPI (a short sketch follows this list)
    • Contribute to AI/ML-driven features, including RAG-based systems and LangGraph pipelines
    • Manage dependencies with Poetry or UV
    • Implement infrastructure as code using AWS CDK (Python) or other IaC tools
    • Work collaboratively using GitLab workflows and CI/CD pipelines
    • Write modular, testable code in large-scale Python projects
    • Support serverless deployments with AWS Lambda, Aurora, API Gateway, and other services
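
    As a rough, non-authoritative sketch of the FastAPI part of this stack, the snippet below shows a minimal asynchronous endpoint. The /search route, the request and response models, and the retrieve_documents helper are invented for illustration only and are not part of the project codebase.

        # Minimal FastAPI sketch for the kind of AI-backed service described above.
        # Endpoint, models, and the retrieval helper are illustrative placeholders.
        from fastapi import FastAPI
        from pydantic import BaseModel

        app = FastAPI(title="example-ai-backend")

        class QueryRequest(BaseModel):
            question: str
            top_k: int = 5

        class QueryResponse(BaseModel):
            answer: str
            sources: list[str]

        async def retrieve_documents(question: str, top_k: int) -> list[str]:
            # Placeholder for a retrieval step (e.g. a vector-store lookup in a RAG setup).
            return [f"doc-{i}" for i in range(top_k)]

        @app.post("/search", response_model=QueryResponse)
        async def search(payload: QueryRequest) -> QueryResponse:
            docs = await retrieve_documents(payload.question, payload.top_k)
            # A real implementation would hand docs to an LLM or LangGraph pipeline here.
            return QueryResponse(answer="stubbed answer", sources=docs)

    Such a service would typically be started locally with uvicorn (e.g. uvicorn main:app --reload, module name assumed) and deployed on the AWS serverless services listed above.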


    Required skills

    • Strong Python programming experience
    • Hands-on experience with FastAPI
    • Experience with dependency management tools (Poetry or UV)
    • Familiarity with Retrieval-Augmented Generation (RAG) and LangGraph or similar
    • Proficiency with Docker, Git, and GitLab CI/CD
    • Experience working with AWS serverless services (Lambda, Aurora, API Gateway)
    • Knowledge of infrastructure as code with AWS CDK or equivalent (see the CDK sketch below)
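
    To give the infrastructure-as-code requirement some context, below is a minimal AWS CDK v2 (Python) sketch that places a Lambda function behind an API Gateway REST endpoint. The stack name, handler name, and the "lambda" asset directory are assumptions made for the example, not project specifics.

        # Minimal AWS CDK v2 (Python) sketch: one Lambda function exposed via API Gateway.
        # Stack name, handler, and the local "lambda" asset directory are assumptions.
        from aws_cdk import App, Stack, aws_apigateway as apigw, aws_lambda as _lambda
        from constructs import Construct

        class ExampleServiceStack(Stack):
            def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
                super().__init__(scope, construct_id, **kwargs)

                handler = _lambda.Function(
                    self, "ApiHandler",
                    runtime=_lambda.Runtime.PYTHON_3_11,
                    handler="app.handler",                   # assumed module.function
                    code=_lambda.Code.from_asset("lambda"),  # assumed asset path
                )

                # Proxy all incoming API Gateway requests to the Lambda function.
                apigw.LambdaRestApi(self, "ExampleApi", handler=handler)

        app = App()
        ExampleServiceStack(app, "ExampleServiceStack")
        app.synth()

    In a GitLab CI/CD setup, such a stack would usually be synthesized and deployed with the standard cdk synth / cdk deploy commands.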


    Optional but nice to have

    • Experience integrating ML models into production systems
    • Understanding of asynchronous programming and performance optimization
    • Prior experience in remote or distributed teams

     

    If you meet the requirements and are interested in the project, we would be happy to schedule an initial phone call to provide further details.

  • 67 views · 19 applications · 20d

    Data Engineer to $4800

    Full Remote · Countries of Europe or Ukraine · 4 years of experience · Upper-Intermediate English

    We are currently seeking a skilled Data Engineer to join our team in the development and maintenance of robust data solutions. This role involves building and optimizing data pipelines, managing ETL processes, and supporting data visualization needs for business-critical use cases.
     

    As part of your responsibilities, you will design and implement cloud infrastructure on AWS using AWS CDK in Python, contribute to solution architecture, and develop reusable components to streamline delivery across projects. You will also implement data quality checks and design scalable data models leveraging both SQL and NoSQL technologies.
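
    As one possible reading of "reusable components", the sketch below shows a small AWS CDK (Python) construct that bundles a landing bucket with a processing Lambda so the pattern can be shared across stacks. All names, properties, and resources are illustrative assumptions rather than actual project code.

        # Illustrative reusable AWS CDK (Python) construct for a simple ingestion step.
        # Construct name, handler, and environment variables are assumptions.
        from aws_cdk import Duration, aws_lambda as _lambda, aws_s3 as s3
        from constructs import Construct

        class IngestionStep(Construct):
            """Bundles a landing bucket with a processing Lambda so teams can reuse the pattern."""

            def __init__(self, scope: Construct, construct_id: str, *, handler_asset: str) -> None:
                super().__init__(scope, construct_id)

                self.bucket = s3.Bucket(self, "LandingBucket")

                self.processor = _lambda.Function(
                    self, "Processor",
                    runtime=_lambda.Runtime.PYTHON_3_11,
                    handler="etl.handler",                    # assumed handler name
                    code=_lambda.Code.from_asset(handler_asset),
                    timeout=Duration.minutes(5),
                    environment={"LANDING_BUCKET": self.bucket.bucket_name},
                )

                # Allow the processing function to read newly landed objects.
                self.bucket.grant_read(self.processor)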

     

    Project details:

    • Start: ASAP
    • Duration: Until 31.12.2026
    • Location: Remote
    • Language: English


    Responsibilities:

    • Develop, monitor, and maintain efficient ETL pipelines and data workflows
    • Build infrastructure on AWS using AWS CDK (Python)
    • Design and implement reusable data engineering components and frameworks
    • Ensure data quality through validation, testing, and monitoring mechanisms (see the sketch after this list)
    • Contribute to solution architecture and technical design
    • Create and optimize scalable data models in both SQL and NoSQL databases
    • Collaborate with cross-functional teams including data scientists, analysts, and product owners
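
    As a simple illustration of the data quality work listed above, here is a short pandas-based validation sketch; the column names and rules are invented for the example and do not reflect the actual data model.

        # Small pandas sketch of rule-based data quality checks inside an ETL step.
        # Column names ("order_id", "amount") and rules are invented for illustration.
        import pandas as pd

        def validate_orders(df: pd.DataFrame) -> list[str]:
            """Return human-readable data quality violations (empty list = all checks pass)."""
            issues: list[str] = []

            if df["order_id"].isna().any():
                issues.append("order_id contains null values")

            if df["order_id"].duplicated().any():
                issues.append("order_id is not unique")

            if (df["amount"] < 0).any():
                issues.append("amount contains negative values")

            return issues

        if __name__ == "__main__":
            sample = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, 3.5]})
            for issue in validate_orders(sample):
                print("DQ violation:", issue)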

     

    Requirements:

    • Solid experience in building and maintaining ETL pipelines
    • Hands-on experience with data visualization tools or integrations (e.g., Tableau, Power BI, or custom dashboards via APIs)
    • Strong working knowledge of AWS services, especially with AWS CDK (Python)
    • Good understanding of SQL and NoSQL database technologies
    • Familiarity with version control systems (e.g., Git)
    • Experience working in Agile environments
    • Strong communication skills and ability to work autonomously in remote teams