Data Engineer
Full Remote · Poland, Ukraine, Romania, Bulgaria, Lithuania · Product · 5 years of experience · B2 - Upper Intermediate
Data Engineer (100% remote) in either Poland, Ukraine, Romania, Bulgaria, Lithuania, Latvia, or Estonia
Point Wild helps customers monitor, manage, and protect against the risks associated with their identities and personal information in a digital world. Backed by WndrCo, Warburg Pincus and General Catalyst, Point Wild is dedicated to creating the world’s most comprehensive portfolio of industry-leading cybersecurity solutions. Our vision is to become THE go-to resource for every cyber protection need individuals may face - today and in the future.
Join us for the ride!
About the Role:
We are seeking a highly skilled Data Engineer with deep experience in Databricks and modern lakehouse architectures to join the Lat61 platform team. This role is critical in designing, building, and optimizing the pipelines, data structures, and integrations that power Lat61.
You will collaborate closely with data architects, AI engineers, and product leaders to deliver a scalable, resilient, and secure foundation for advanced analytics, machine learning, and cryptographic risk management use cases.
Your Day to Day:
- Build and optimize data ingestion pipelines on Databricks (batch and streaming) to process structured, semi-structured, and unstructured data.
- Implement scalable data models and transformations leveraging Delta Lake and open data formats (Parquet, Delta).
- Design and manage workflows with Databricks Workflows, Airflow, or equivalent orchestration tools.
- Implement automated testing, lineage, and monitoring frameworks using tools like Great Expectations and Unity Catalog.
- Build integrations with enterprise and third-party systems via cloud APIs, Kafka/Kinesis, and connectors into Databricks.
- Partner with AI/ML teams to provision feature stores, integrate vector databases (Pinecone, Milvus, Weaviate), and support RAG-style architectures.
- Optimize Spark and SQL workloads for speed and cost efficiency across multi-cloud environments (AWS, Azure, GCP).
- Apply secure-by-design data engineering practices aligned with Point Wild’s cybersecurity standards and evolving post-quantum cryptographic frameworks.
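To make the day-to-day concrete, here is a minimal, purely illustrative sketch of the validate-and-quarantine pattern that tools like Great Expectations formalize in Databricks pipelines. All record and field names are hypothetical, and a production version would operate on Spark DataFrames rather than plain Python lists.

```python
# Illustrative sketch only: route records that pass a required-fields check to
# the "valid" path and everything else to a quarantine path for inspection.

def split_valid_invalid(records, required_fields):
    """Partition records into (valid, quarantine) by presence of required fields."""
    valid, quarantine = [], []
    for rec in records:
        if all(rec.get(f) is not None for f in required_fields):
            valid.append(rec)
        else:
            quarantine.append(rec)
    return valid, quarantine

# Hypothetical telemetry events, one of which is missing its user_id.
events = [
    {"user_id": "u1", "event": "login", "ts": 1700000000},
    {"user_id": None, "event": "login", "ts": 1700000001},
]
good, bad = split_valid_invalid(events, ["user_id", "event", "ts"])
print(len(good), len(bad))  # → 1 1
```

In a real lakehouse pipeline the same split would feed the valid branch into a Delta table and land the quarantined records in a separate table for monitoring and replay.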
What you bring to the table:
- At least 5 years in Data Engineering with strong experience building production data systems on Databricks.
- Expertise in PySpark, SQL, and Python.
- Strong hands-on experience with AWS services commonly used in data platforms (e.g., S3, Kinesis, IAM).
- Strong knowledge of Delta Lake, Parquet, and lakehouse architectures.
- Experience with streaming frameworks (Structured Streaming, Kafka, Kinesis, or Pub/Sub).
- Familiarity with DBT for transformation and analytics workflows.
- Strong understanding of data governance and security controls (Unity Catalog, IAM).
- Exposure to AI/ML data workflows (feature stores, embeddings, vector databases).
- Detail-oriented, collaborative, and comfortable working in a fast-paced innovation-driven environment.
Bonus Points:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
- Data Engineering experience in a B2B SaaS organization.
Lat61 Mission
The Lat61 platform will power the next generation of cybersecurity and AI-enabled decision-making. As a Data Engineer on this team, you will help deliver:
- Multi-Modal Data Ingestion: Bringing together logs, telemetry, threat intel, identity data, cryptographic assets, and third-party feeds into a unified lakehouse.
- AI Agent Enablement: Supporting Retrieval-Augmented Generation (RAG) workflows, embeddings, and feature stores to fuel advanced AI use cases across Point Wild products.
- Analytics & Decision Systems: Providing real-time insights into risk posture, compliance, and security events through scalable pipelines and APIs.
- Future-Proofing for Quantum: Laying the groundwork for automated remediation and transition to post-quantum cryptographic standards.
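The RAG retrieval step mentioned above can be reduced, for illustration only, to cosine similarity over embedding vectors. Real deployments would query a vector database such as Pinecone, Milvus, or Weaviate; the documents and vectors below are hypothetical toy data.

```python
# Illustrative sketch only: rank a toy document corpus by cosine similarity
# to a query embedding, the core of the retrieval step in a RAG workflow.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query_vec, corpus, k=1):
    """Return the texts of the k documents whose embeddings best match the query."""
    ranked = sorted(corpus, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in ranked[:k]]

docs = [
    {"text": "password rotation policy", "vec": [0.9, 0.1, 0.0]},
    {"text": "quarterly revenue report", "vec": [0.0, 0.2, 0.9]},
]
print(top_k([1.0, 0.0, 0.0], docs))  # → ['password rotation policy']
```

The retrieved texts would then be passed as context to a generative model, which is what makes the workflow "retrieval-augmented."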
Your work won’t just be about pipelines and data models - it will directly shape how enterprises anticipate, prevent, and respond to cybersecurity risks in an era of quantum disruption.
AI Engineer
Full Remote · Bulgaria, Lithuania, Poland, Romania, Ukraine · Product · 4 years of experience · B2 - Upper Intermediate
AI Engineer (100% remote) in either Poland, Ukraine, Romania, Bulgaria, Lithuania, Latvia, or Estonia
About the Role:
As an AI Engineer, you will work closely with the AI & Data Engineering team to develop, deploy, and optimize AI models that power our next-generation security products.
This is a hands-on engineering role, requiring expertise in machine learning model development, deployment, and optimization. You will contribute to multiple AI initiatives, ensuring that models are efficient, scalable, and production-ready.
Your Day to Day:
- Model Development & Optimization: Implement, refine, and maintain ML models (supervised, unsupervised, NLP-based models).
- Production Integration: Work with Data/MLOps engineers to package, deploy, and monitor models within a robust CI/CD pipeline.
- Performance Tuning: Optimize model efficacy, efficiency, and reliability.
- Parallel Project Support: Contribute to and help manage multiple AI initiatives in parallel, enabling faster execution across various R&D efforts.
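As a purely illustrative sketch of the develop-evaluate loop behind "Model Development & Optimization," the example below fits a one-feature threshold classifier and measures its accuracy. Production work would use PyTorch, TensorFlow, or scikit-learn; the feature values and labels here are hypothetical toy data.

```python
# Illustrative sketch only: fit the threshold on a single feature that
# maximizes training accuracy, then evaluate it - a miniature version of
# the train/evaluate cycle in real model development.

def fit_threshold(xs, ys):
    """Pick the decision threshold that maximizes accuracy on (xs, ys)."""
    best_t, best_acc = 0.0, 0.0
    for t in xs:
        acc = sum((x >= t) == bool(y) for x, y in zip(xs, ys)) / len(xs)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def accuracy(t, xs, ys):
    """Fraction of labels correctly predicted by thresholding at t."""
    return sum((x >= t) == bool(y) for x, y in zip(xs, ys)) / len(xs)

xs = [0.1, 0.2, 0.4, 0.8, 0.9]  # hypothetical risk scores
ys = [0, 0, 0, 1, 1]            # hypothetical labels
t = fit_threshold(xs, ys)
print(t, accuracy(t, xs, ys))  # → 0.8 1.0
```

The same loop, scaled up, is what performance tuning iterates on: change the model, re-evaluate, and only promote to production when the metrics improve.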
What you bring to the table:
- At least 4 years of experience in AI/ML Engineering with strong hands-on expertise in developing and deploying production-grade ML models.
- Strong ML & AI Fundamentals: Hands-on experience with supervised/unsupervised learning, NLP, and deep learning techniques.
- Production Experience: Proven track record of building, optimizing, and deploying models into production environments.
- Model Deployment & MLOps Experience: Comfortable working with containerized models, cloud-based AI pipelines (AWS preferred), and CI/CD for ML.
- Proficiency in Python & ML Frameworks: Experience with PyTorch, TensorFlow, Scikit-learn, or similar tools.
- Performance-Driven Mindset: Skilled in optimizing models for accuracy, efficiency, and scalability.
- Collaboration & Growth: Thrives in a cross-functional team and is eager to learn from and contribute to AI best practices.
Lat61 Mission
The Lat61 platform will power the next generation of cybersecurity and AI-enabled decision-making. As an AI Engineer on this team, you will help deliver:
- Advanced AI Model Development: Designing and optimizing machine learning models (supervised, unsupervised, and NLP-based) to drive security-focused intelligence across the platform.
- RAG & Embedding Workflows: Building and deploying RAG pipelines, embeddings, and feature stores that enable context-aware, real-time decision-making.
- Production-Ready AI Systems: Collaborating with Data and MLOps engineers to package, deploy, and monitor models in scalable, cloud-native environments.
- AI-Driven Analytics: Leveraging deep learning and NLP to provide actionable insights into risk posture, compliance, and security events.
- Future-Proofing for Quantum: Researching and prototyping AI approaches that can adapt to post-quantum cryptographic standards and automated remediation.