TeamCraft Roofing Inc
TeamCraft Roofing is one of the largest commercial roofing contractors in the United States, with 300+ employees and Fortune 500 clients across the East Coast.
We're scaling, and AI is at the core of that strategy. Not marketing buzzwords - actual production systems handling real data every day.
What we're building:
- Enterprise Data Lakehouse - medallion architecture integrating multiple systems
- RAG assistants
- ShelterCareAI - real-time weather monitoring platform
- Computer vision for field compliance
- AI-powered workflow automation
- Other AI use cases
Why engineers join us:
- Greenfield build - no legacy code. We're designing everything from scratch with modern tooling
- AI Committee - engineers shape technical direction, not just execute tickets
- Leadership gets it - company leadership genuinely understands what AI is (and isn’t)
- Small team, high autonomy - your decisions matter
Stack: Azure (Container Apps, OpenAI Service, Data Factory, ADLS), Python, PostgreSQL, dbt, Power BI, modern CI/CD.
100% remote. US-based company, European engineering team.
Senior Data Engineer to $6000
Full Remote · Countries of Europe or Ukraine · Product · 5 years of experience · English - B2
Job Description
- Solid experience with the Azure data ecosystem: Factory, Databricks or Fabric, ADLS Gen2, Azure SQL, Blob Storage, Key Vault, and Functions
- Proficiency in Python and SQL for building ingestion, transformation, and processing workflows
- Clear understanding of Lakehouse architecture principles, Delta Lake patterns, and modern data warehousing
- Practical experience building config-driven ETL/ELT pipelines, including API integrations and Change Data Capture (CDC)
- Working knowledge of relational databases (MS SQL, PostgreSQL) and exposure to NoSQL concepts
- Ability to design data models and schemas optimized for analytics and reporting workloads
- Comfortable working with common data formats: JSON, Parquet, CSV
- Experience with CI/CD automation for data workflows (GitHub Actions, Azure DevOps, or similar)
- Familiarity with data governance practices: lineage tracking, access control, encryption
- Strong problem-solving mindset with attention to detail
- Clear written and verbal communication for async collaboration
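To give a flavor of the config-driven ETL/ELT pattern the role centers on, here is a minimal plain-Python sketch. All source names, fields, and transforms below are invented for illustration; in production this logic would live in Data Factory pipelines or Python ingestion services.

```python
"""Minimal sketch of a config-driven ingestion step.
Source names and fields are hypothetical, not TeamCraft's actual systems."""

# Each source is described by data, not code: onboarding a new feed
# means adding a config entry, not writing a new pipeline.
PIPELINE_CONFIG = {
    "source": "crm_accounts",            # hypothetical API feed
    "key_field": "account_id",
    "rename": {"acctName": "account_name"},
    "required": ["account_id", "account_name"],
}

def ingest(records: list[dict], config: dict) -> list[dict]:
    """Apply config-declared renames and drop rows missing required fields."""
    out = []
    for rec in records:
        row = {config["rename"].get(k, k): v for k, v in rec.items()}
        if all(row.get(f) is not None for f in config["required"]):
            out.append(row)
    return out

raw = [
    {"account_id": 1, "acctName": "Acme Roofing"},
    {"account_id": 2, "acctName": None},   # fails the required-field check
]
clean = ingest(raw, PIPELINE_CONFIG)
print(clean)  # [{'account_id': 1, 'account_name': 'Acme Roofing'}]
```

The same config dictionary can carry CDC watermarks or API pagination settings, which is what keeps a multi-source lakehouse maintainable as feeds are added.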
Nice-to-Have
- Proficiency with Apache Spark using PySpark for large-scale data processing
- Experience with Azure Service Bus/Event Hub for event-driven architectures
- Familiarity with machine learning and AI integration within data platform context (RAG, vector search, Azure AI Search)
- Data quality frameworks (Great Expectations, dbt tests)
- Experience with Power BI semantic models and Row-Level Security
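For context on the data-quality frameworks mentioned above: tools like Great Expectations or dbt tests declare checks of the kind sketched below in plain Python. Column names here are invented for illustration.

```python
"""Illustrative row-level data-quality checks -- the kind of assertions a
Great Expectations suite or dbt test would declare. Columns are hypothetical."""

def check_not_null(rows: list[dict], column: str) -> list[int]:
    """Return indices of rows where the column is null."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def check_unique(rows: list[dict], column: str) -> list[int]:
    """Return indices of rows whose column value duplicates an earlier row."""
    seen, dupes = set(), []
    for i, r in enumerate(rows):
        v = r.get(column)
        if v in seen:
            dupes.append(i)
        seen.add(v)
    return dupes

rows = [
    {"property_id": "P-1", "sq_ft": 12000},
    {"property_id": "P-1", "sq_ft": None},   # duplicate key and a null measure
]
assert check_unique(rows, "property_id") == [1]
assert check_not_null(rows, "sq_ft") == [1]
```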
Job Responsibilities
- Design, implement, and optimize scalable and reliable data pipelines using Azure Data Factory, Synapse, and Azure data services
- Develop and maintain config-driven ETL/ELT solutions for batch and API-based data ingestion
- Build Medallion architecture layers (Bronze → Silver → Gold) ensuring efficient, reliable, and performant data processing
- Ensure data governance, lineage, and compliance using Azure Key Vault and proper access controls
- Collaborate with developers and business analysts to deliver trusted datasets for reporting, analytics, and AI/ML use cases
- Design and maintain data models and schemas optimized for analytical and operational workloads
- Implement cross-system identity resolution (global IDs, customer/property keys across multiple platforms)
- Identify and resolve performance bottlenecks, ensuring cost efficiency and maintainability of data workflows
- Participate in architecture discussions, backlog refinement, and sprint planning
- Contribute to defining and maintaining best practices, coding standards, and quality guidelines for data engineering
- Perform code reviews and foster knowledge sharing within the team
- Continuously evaluate and enhance data engineering tools, frameworks, and processes in the Azure environment
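The Bronze → Silver → Gold flow in the responsibilities above can be sketched as follows. This toy version uses plain Python; in production the layers would be Delta tables moved by Data Factory or Spark jobs, and all table and field names here are invented.

```python
"""Toy Bronze -> Silver -> Gold medallion flow (illustrative only)."""

# Bronze: raw records land as-is, including malformed rows.
bronze = [
    {"job_id": "J1", "region": "NC", "revenue": "1200.50"},
    {"job_id": "J2", "region": "NC", "revenue": "800"},
    {"job_id": None, "region": "VA", "revenue": "100"},   # no key -> rejected
]

# Silver: validated, typed, key-complete records.
silver = [
    {**r, "revenue": float(r["revenue"])}
    for r in bronze
    if r["job_id"] is not None
]

# Gold: business-level aggregate, ready for reporting and AI features.
gold: dict[str, float] = {}
for r in silver:
    gold[r["region"]] = gold.get(r["region"], 0.0) + r["revenue"]

print(gold)  # {'NC': 2000.5}
```

The point of the layering is that each stage has one job: Bronze preserves raw history, Silver enforces schema and quality, Gold serves consumers.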
Why TeamCraft?
- Greenfield project - build architecture from scratch, no legacy debt
- Direct impact - your pipelines power real AI products and business decisions
- Small team, big ownership - no bureaucracy, fast iteration, your voice matters
- Stable foundation - US-based company, 300+ employees
- Growth trajectory - scaling with technology as the driver
About the Project
TeamCraft is a large U.S. commercial roofing company undergoing an ambitious AI transformation. We’re building a centralized data platform from scratch - a unified Azure Lakehouse that integrates multiple operational systems into a single source of truth (Bronze → Silver → Gold).
This is greenfield development with real business outcomes - not legacy maintenance.
Senior DevOps Engineer to $5500
Full Remote · Countries of Europe or Ukraine · Product · 5 years of experience · English - None
We are looking for a Senior DevOps Engineer to own the infrastructure for our new Medallion Lakehouse and AI-powered applications. You will not be maintaining legacy on-prem servers; you will be building the cloud-native foundation that drives our growth.
Tech Stack
• Cloud: Azure PaaS (AKS, Container Apps, Functions, Key Vault, ADLS Gen2, Azure SQL).
• IaC: Terraform (Module design, state management).
• CI/CD: GitHub Actions (Automated pipelines, quality gates).
• Orchestration: Kubernetes (AKS), Docker.
• Observability: Prometheus, Grafana, Azure Monitor.
• Scripting: Python, Bash.
Responsibilities
• Infrastructure as Code: Design and maintain scalable Azure infrastructure using Terraform. Implement module-based architectures for our Data Lakehouse (Bronze/Silver/Gold layers) and AI microservices.
• CI/CD Mastery: Build bulletproof GitHub Actions pipelines for Python applications and data workflows. Implement automated testing, security scanning, and seamless deployment strategies.
• Reliability & Observability: Implement monitoring dashboards (Grafana/Azure Monitor) and alerting strategies. Define and track SLIs/SLOs to distinguish between application and infrastructure noise.
• Security & Compliance: Enforce "Zero Trust" and "Least Privilege" principles. Manage identity (Azure AD/Entra ID), secrets (Key Vault), and network security (VNet, Private Endpoints).
• Disaster Recovery: Design and test backup/recovery strategies for databases and data lakes to ensure business continuity.
• Collaboration: Act as a bridge between Data Engineering and Backend teams, helping optimize Docker containers and troubleshoot production issues.
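As a flavor of the SLI/SLO work described above, here is a minimal error-budget calculation. The 99.5% target and run counts are illustrative, not a committed TeamCraft SLO.

```python
"""Sketch of an SLI/SLO error-budget calculation (numbers are illustrative)."""

SLO_TARGET = 0.995   # e.g. 99.5% of pipeline runs succeed per 30-day window

def error_budget_remaining(total_runs: int, failed_runs: int,
                           slo: float = SLO_TARGET) -> float:
    """Fraction of the error budget still unspent (negative = SLO breached)."""
    allowed_failures = total_runs * (1 - slo)
    if allowed_failures == 0:
        return 0.0
    return 1 - failed_runs / allowed_failures

# 1000 runs allow ~5 failures at 99.5%; 2 failures leave ~60% of the budget.
print(round(error_budget_remaining(1000, 2), 3))  # 0.6
```

Tracking budget burn like this is what lets alerting distinguish "infrastructure noise" from genuine reliability regressions.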
Requirements
• 5+ years of DevOps/Platform Engineering experience.
• Deep expertise in Azure ecosystems (proven track record of delivering Azure-based systems).
• Strong proficiency with Terraform in a production environment.
• Experience managing Kubernetes (AKS) and containerized workloads.
• Solid understanding of networking (VNets, DNS, Firewalls) and security (RBAC, IAM).
• English B2 – We communicate daily with the US team.
Nice to Have
• Experience supporting Data Engineering teams (Data Factory, Databricks, Spark).
• Background in Python development or complex scripting.
• Experience with GitOps workflows (ArgoCD).
What We Offer
• Greenfield Project: No legacy code or technical debt. You choose the right tools for the job.
• High Autonomy: You are the infrastructure owner. Your architecture decisions matter.
• Remote-First: Flexible schedule.
• Stability: Long-term role with a stable US company.
• Direct Impact: Your work directly powers AI models predicting roof damage and optimizing multimillion-dollar construction projects.
About the Project
TeamCraft is a large U.S. commercial roofing company undergoing an ambitious AI transformation. We’re building a centralized data platform from scratch - a unified Azure Lakehouse that integrates multiple operational systems into a single source of truth.
This is greenfield development with real business outcomes - not legacy maintenance.