Data Science Engineer
Requirements:
Education: Bachelor’s degree (IT/Software Engineering/Mathematics)
Must have:
- Work experience: 3+ years
- Strong expertise in:
- Machine Learning and Data Science
- Statistics (probability distributions, hypothesis testing)
- Classical ML algorithms (Linear Regression, Decision Trees, SVM, Clustering, etc.)
- MLOps (BentoML, Docker, Kubernetes, Kubeflow, Knative)
- Python (PyTorch or TensorFlow)
- General programming skills (OOP, OO design, MVC, Design patterns)
- Solid understanding of ML/DS processes, from data preparation to model deployment (see the sketch after this list)
- Proven portfolio of successful projects
- Participation in team projects
- Good verbal and written English communication skills (Upper-Intermediate level)
- Experience in data extraction, cleansing, and preparation for ML training and test data sets
- Experience in analysis of structured, semi-structured, and unstructured data (e.g. raw text, JSON, XML)
- Statistical Analysis & Modelling experience
- Good knowledge of Predictive Modelling, Regression and Classification techniques
- Good knowledge of Deep learning tools and techniques
- Understanding and knowledge of Natural Language Processing (NLP) applications
- Knowledge of and/or experience with any big data platform (e.g. Apache Spark, Hadoop, Hive)
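For illustration only, below is a minimal sketch of the kind of end-to-end workflow the list above refers to: data extraction and cleansing, train/test preparation, training a classical classification model, and evaluation. The CSV file and column names are hypothetical placeholders, and scikit-learn is just one possible toolkit.

```python
# Minimal, illustrative ML workflow sketch: data preparation -> split -> train -> evaluate.
# "customers.csv" and its columns are hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

# Data extraction and cleansing (hypothetical dataset)
df = pd.read_csv("customers.csv").dropna(subset=["age", "income", "churned"])

X = df[["age", "income"]]
y = df["churned"]

# Hold out a test set for unbiased evaluation
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# A classical classification model wrapped in a preprocessing pipeline
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

# Evaluate on the held-out data
print(classification_report(y_test, model.predict(X_test)))
```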
Good to have:
- Hackathons/Kaggle experience
- Data Visualisation (e.g. using Python, Tableau, R Shiny)
- Hands-on experience developing with relevant Python toolkits
- Experience with:
- C# and .Net
- Go, Scala
- DataRobot, Chatterbox Labs, Google ML, Ludwig
- Working with sensitive data
- Data Engineering skills
- General knowledge of enterprise data architecture and data modelling
- Relational database modelling including transactional systems and data warehousing
- NoSQL databases (e.g. MongoDB, Cassandra, Elasticsearch, graph databases)
- Experience working with Unix command line and shell scripting
- Cloud Architecture or Engineering experience with emphasis on data and analytics services
- Relevant certifications in Microsoft Azure or AWS
- Knowledge of deployment specifics of ML models in CI/CD
- Hands-on experience with microservice architectures and containerisation (e.g. Docker and Kubernetes)
- Experience developing and working with REST APIs (see the sketch after this list)
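As a rough illustration of the deployment-related items above, the sketch below exposes a trained model through a REST API that could be containerised and deployed as a microservice. FastAPI is used here only as one possible toolkit, and the model artifact name is a hypothetical placeholder.

```python
# Minimal sketch: serving a pre-trained model over a REST API.
# "model.joblib" is a hypothetical artifact; FastAPI/uvicorn are assumed installed.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # hypothetical pre-trained scikit-learn model

class Features(BaseModel):
    age: float
    income: float

@app.post("/predict")
def predict(features: Features):
    # Run inference and return a JSON-serialisable result
    prediction = model.predict([[features.age, features.income]])
    return {"prediction": int(prediction[0])}

# Run locally with: uvicorn service:app --reload
```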
Personal skills:
- Passionate about technology
- Eager learner of new technologies and systems
- Written, verbal and interpersonal communication skills
- Team player attitude
Responsibilities:
- Bring new ideas to software development
- Leverage industry knowledge and stay up to date with technology developments
- Collaborate with cross-functional teams
We offer:
- Medical insurance;
- 20 working days of paid vacation and other social benefits;
- Mid-year and annual performance reviews with constructive feedback and a development plan;
- Continuing education/training;
- Free access to global online educational platforms.
Project:
Cortex AI Platform streamlines the building, management, and deployment of ML models in a no-code / low-code fashion, powered by in-house orchestration of a continuously expanding landscape of third-party AutoML engines (DataRobot, Chatterbox Labs, Google ML, Ludwig, etc.) and the open-source container orchestration framework Kubeflow (see the pipeline sketch at the end of this section).
Project development is expected to last at least a few years, with 30+ engineers involved (Ukrainian side only). You will be able to influence the project's key solutions.
Scrum and SAFe are used as the development process methodology: 2-week sprints and quarterly releases are expected.
The system is implemented using the following technologies and tools: C#, .Net, Python, Go, React, JavaScript, AWS, and Kubernetes.
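For illustration only, here is a minimal sketch of how a training step might be wired into a Kubeflow pipeline, the orchestration framework mentioned above. The component and pipeline names are hypothetical, the training step is a placeholder, and the kfp v2 SDK is assumed to be installed.

```python
# Illustrative Kubeflow Pipelines (kfp v2) sketch; names are hypothetical.
from kfp import dsl, compiler

@dsl.component(base_image="python:3.11")
def train_model(learning_rate: float) -> str:
    # Placeholder training step; a real component would pull data,
    # train a model, and push the artifact to a registry.
    return f"trained-with-lr-{learning_rate}"

@dsl.pipeline(name="demo-training-pipeline")
def training_pipeline(learning_rate: float = 0.01):
    train_model(learning_rate=learning_rate)

if __name__ == "__main__":
    # Compile to a YAML spec that can be uploaded to a Kubeflow cluster
    compiler.Compiler().compile(training_pipeline, "training_pipeline.yaml")
```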