
Monster Lead Group
Since 2013 we've provided the mortgage industry with a data-driven direct mail solution and lead management technology. But forget about how fantastic our technology is or how good we are at managing complex data. You can't accomplish much without great people. And we have the best people in the industry.
Website:
https://www.monsterleadgroup.com/
Data Operations Engineer (AWS Bedrock and SageMaker Focus)
Full Remote · Ukraine · Product · 5 years of experience · Upper-Intermediate

We are seeking a talented and driven Data Operations Engineer to join our data team. In this role, you will be instrumental in building and maintaining our data infrastructure on AWS, specifically leveraging AWS Bedrock, SageMaker, and related services. You will collaborate with data analysts, machine learning engineers, and software developers to ensure the reliability, scalability, and efficiency of our data pipelines and machine learning workflows.
If you have a strong background in AWS, a passion for data, and a desire to work with cutting-edge technologies, we want to hear from you.
Qualifications:
- 5+ years of experience working with AWS services.
- Strong understanding of AWS Bedrock, SageMaker, S3, RDS, Glue, Lambda, and CloudWatch.
- Proficiency in at least one scripting language (Python or C# preferred).
- Experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
- Experience with data pipeline development and optimization.
- Familiarity with containerization technologies (Docker, Kubernetes) is a plus.
- Strong problem-solving and troubleshooting skills.
- Excellent communication and collaboration skills.
Preferred Qualifications:
- Experience with LLMs and prompt engineering is a plus.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- AWS certifications (e.g., AWS Certified DevOps Engineer, AWS Certified Solutions Architect).
- Experience with data warehousing and data lake technologies.
- Experience with CI/CD pipelines.
Responsibilities:
- Infrastructure Development & Management:
- Design, build, and maintain scalable and reliable data infrastructure on AWS, with a focus on AWS Bedrock and SageMaker.
- Automate infrastructure deployments and configurations using Infrastructure as Code (IaC) tools like CDK/CloudFormation.
- Implement and manage data storage solutions, including S3, RDS, and other relevant AWS services. (We also use Snowflake and Looker)
- Ensure data security and compliance by implementing best practices for access control and data encryption.
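To give candidates a concrete sense of the IaC and encryption bullets above, here is a minimal, illustrative sketch (not Monster's actual stack): a CloudFormation template for an encrypted, versioned, access-blocked S3 bucket, built as a plain Python dict. In practice this would live in CDK constructs or a template file; the logical ID is a hypothetical name.

```python
import json

def make_bucket_template(bucket_logical_id="RawDataBucket"):
    """Build a minimal CloudFormation template for a locked-down S3 bucket.

    Illustrative sketch only: the logical ID is a placeholder, and a real
    stack would add lifecycle rules, tags, and bucket policies.
    """
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            bucket_logical_id: {
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    # Server-side encryption by default (SSE-S3)
                    "BucketEncryption": {
                        "ServerSideEncryptionConfiguration": [
                            {"ServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
                        ]
                    },
                    # Keep object history for recovery and auditing
                    "VersioningConfiguration": {"Status": "Enabled"},
                    # Block all forms of public access
                    "PublicAccessBlockConfiguration": {
                        "BlockPublicAcls": True,
                        "BlockPublicPolicy": True,
                        "IgnorePublicAcls": True,
                        "RestrictPublicBuckets": True,
                    },
                },
            }
        },
    }

template_json = json.dumps(make_bucket_template(), indent=2)
```

The same resource could be expressed more concisely with CDK's `aws_s3.Bucket` construct; the dict form is shown here only because it maps one-to-one onto raw CloudFormation.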
- Data Pipeline Development & Optimization:
- Develop and maintain robust data pipelines for data ingestion, transformation, and processing using AWS Glue, Lambda, and other related services.
- Optimize data pipelines for performance, reliability, and cost-effectiveness.
- Monitor and troubleshoot data pipeline issues, ensuring data quality and availability.
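As a rough illustration of the pipeline work described above, here is a hedged sketch of a Lambda-style transform step: normalizing incoming records before they land in S3 or Snowflake. The event shape and field names are assumptions for illustration, not an actual Monster pipeline.

```python
import json

def handler(event, context=None):
    """Normalize raw lead records from a hypothetical ingestion event.

    Illustrative only: assumes an event of the form
    {"records": [{"email": ..., "state": ...}, ...]}.
    """
    cleaned = []
    for record in event.get("records", []):
        cleaned.append({
            # Lowercase and trim emails so downstream joins are consistent
            "email": record.get("email", "").strip().lower(),
            # Uppercase two-letter state codes
            "state": record.get("state", "").strip().upper(),
        })
    return {"statusCode": 200, "body": json.dumps({"records": cleaned})}
```

In a real deployment this handler would be wired to an S3 event or a Glue workflow trigger, with failures routed to a dead-letter queue for the troubleshooting work mentioned above.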
- Machine Learning Support:
- Collaborate with machine learning engineers to deploy and manage machine learning models using AWS SageMaker.
- Develop and maintain data pipelines to support model training and inference.
- Implement and monitor model performance metrics and ensure model reliability.
- Work with AWS Bedrock to integrate LLMs and other ML capabilities into company workflows.
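For candidates new to Bedrock, the integration work above centers on the `bedrock-runtime` API. Below is a hedged sketch that builds a request body in the Anthropic Messages format used on Bedrock; the model ID and prompt are illustrative assumptions, and the actual call (which requires AWS credentials) is shown only in comments.

```python
import json

def build_messages_body(prompt, max_tokens=256):
    """Build an Anthropic Messages-format request body for Bedrock.

    Illustrative sketch: a production wrapper would also set temperature,
    system prompts, and retries.
    """
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

# With AWS credentials configured, the invocation would look roughly like:
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# resp = client.invoke_model(
#     modelId="anthropic.claude-3-haiku-20240307-v1:0",  # illustrative model ID
#     body=build_messages_body("Summarize this lead record: ..."),
# )
# result = json.loads(resp["body"].read())
```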
- Monitoring & Logging:
- Implement comprehensive monitoring and logging solutions using Datadog, CloudWatch, and other monitoring tools.
- Proactively identify and resolve infrastructure and data pipeline issues.
- Work with the BI team to develop and maintain dashboards and alerts for key performance indicators of the systems you build.
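As a concrete example of the alerting bullet above, here is an illustrative sketch of defining a CloudWatch alarm on a Lambda pipeline's error metric via `put_metric_alarm`. The alarm naming scheme, thresholds, and function name are assumptions; the boto3 call itself needs AWS credentials, so it is shown only in a comment.

```python
def make_error_alarm_params(pipeline_name):
    """Build put_metric_alarm parameters for a hypothetical pipeline.

    Illustrative defaults: alarm if the Lambda reports any error
    within a 5-minute window.
    """
    return {
        "AlarmName": f"{pipeline_name}-errors",  # hypothetical naming scheme
        "Namespace": "AWS/Lambda",
        "MetricName": "Errors",
        "Dimensions": [{"Name": "FunctionName", "Value": pipeline_name}],
        "Statistic": "Sum",
        "Period": 300,  # 5-minute windows
        "EvaluationPeriods": 1,
        "Threshold": 1.0,
        "ComparisonOperator": "GreaterThanOrEqualToThreshold",
    }

# With credentials configured, the alarm would be created roughly like:
# import boto3
# boto3.client("cloudwatch").put_metric_alarm(**make_error_alarm_params("lead-ingest"))
```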
- Collaboration & Documentation:
- Work closely with cross-functional teams, including data analysts, machine learning engineers, data engineers, and software developers.
- Document infrastructure designs, data pipelines, and operational procedures.
- Participate in code reviews and contribute to team knowledge sharing.