Data Science (offline)
Long-term project (1+ year)
Must have:
- Experience with structured, non-machine-created data, i.e. image data, sensor data, etc. are not relevant for now; relevant is e.g.
- Solid experience with exploratory data analysis (EDA)
- Solid experience with predictive modeling, preferably forecasting
- Experience in stakeholder communication, not just on a purely technical level but also with business stakeholders
- Experience with databases and Amazon Web Services (AWS)
- Experience with ETL pipelines
- Experience with AWS Glue and AWS S3
- Good communication skills in English: clear and to the point
Will be a plus:
- Basic data engineering skills
- Solid experience in these fields implies the required technical skills (coding, libraries, etc.)
About project:
Steps/Cost Estimation Breakdown for the Client:
1. Set up the end-to-end platform
- Set up Virtual Network, e.g. with Terraform
- Should allow for subnet structure if needed
- Consider + set up OAuth
- Ensure connectivity within the VNET to other (cloud) services, e.g. AWS Glue (if needed), AWS S3, SageMaker, Databricks, etc., so that Data Engineers (DE) and Data Scientists (DS) can start working directly
- Data Engineering efforts should include
- Setting up + maintaining “standard” ETL pipelines
- pseudonymisation/anonymisation, e.g. hashing
- DevOps + MLOps (Airflow or Kafka, depending on use case)
- Optional: modeling
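The pseudonymisation/anonymisation step above can be sketched with keyed hashing in Python (standard library only; the field names, record values, and salt handling are illustrative assumptions, not part of the project spec):

```python
import hashlib
import hmac

def pseudonymise(value: str, secret_salt: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    A keyed hash rather than a plain hash makes dictionary attacks
    impractical as long as the salt/key stays secret.
    """
    return hmac.new(secret_salt, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Illustrative usage on a fictional customer record:
salt = b"store-this-key-in-a-secrets-manager"  # assumption: managed secret
record = {"customer_id": "C-1042", "email": "jane@example.com", "revenue": 129.90}
pseudonymised = {
    **record,
    "customer_id": pseudonymise(record["customer_id"], salt),
    "email": pseudonymise(record["email"], salt),
}
```

The hash is deterministic, so the same customer maps to the same pseudonym across pipeline runs, which preserves joins; non-identifying fields (here `revenue`) pass through unchanged.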
2. Build Use Cases
- Data Scientist (DS) efforts for developing end-to-end data science projects across the full life cycle
- specifically for deployment:
- Either: full stack DS who can do MLOps themselves
- Or: DS supports DE in understanding the DS code correctly to do MLOps properly
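A "standard" ETL pipeline as mentioned in step 1 can be sketched with the standard library alone (in production the extract/load ends would be AWS Glue jobs reading from and writing to S3; the in-memory CSV, table name, and column names here are assumptions for illustration):

```python
import csv
import io
import sqlite3

# Extract: an in-memory CSV stands in for the raw source (e.g. an S3 object).
raw_csv = io.StringIO("order_id,amount\n1,10.5\n2,20.0\n")

def extract(source) -> list[dict]:
    return list(csv.DictReader(source))

def transform(rows: list[dict]) -> list[tuple]:
    # Cast types and silently drop malformed rows.
    out = []
    for r in rows:
        try:
            out.append((int(r["order_id"]), float(r["amount"])))
        except (KeyError, ValueError):
            continue
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # stands in for the target warehouse
load(transform(extract(raw_csv)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

Keeping extract, transform, and load as separate functions is what lets an orchestrator such as Airflow schedule and retry each stage independently.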
About DTeam:
We are an IT outsourcing company providing software development, testing, and DevOps services for startups, technology companies, and direct clients. You can read more about us on our website https://dteam.dev/ or at https://clutch.co/profile/dteam
Company website:
https://dteam.dev/
DOU company page:
https://jobs.dou.ua/companies/dteam/