Candidates 146
$10000 / mo
≈ $120000 / year net
Lead Data Engineer
Ukraine · 7 years of experience · Advanced/Fluent · Published today · In passive search
Stack: Python.
2. Built NLP processing algorithm from scratch to enhance marketing SMS conversion (for a telecom).
Managed a team, researched reinforcement learning applications for the same task.
Stack: Python, Bash, PostgreSQL, Keras, GCP.
3. Built multiple data pipelines and automation routines to replace manual labor previously performed by 10 employees.
Currently constructing a fully autonomous system to be integrated into the business. The system has already replaced approximately 5 employees and covers data exchange and processing, system monitoring, and reporting for executives.
Stack: Python, GCP, Bash, PostgreSQL.
4. Led a team of data scientists and data engineers at one of the top live-streaming platforms to deliver data-driven recommendations to viewers. The work included ML models, A/B testing, and working with GCP. Stack: BigQuery, Cloud Functions, Dataflow, Container Registry, Cloud Run, App Engine.
5. Developed an adtech fraud-detection solution from scratch for an auction participant. Stack included: Snowflake, ClickHouse, GCP Cloud Run, Cloud Functions, Dataflow.
6. Built and maintained an IT division from scratch at a commodity trading enterprise, including management of on-premise and cloud-based infrastructure, integration of existing solutions, research and development of ETL pipelines, BI and analytics setup, and development of Django-based internal solutions for operational activities. Stack included: Azure AD, Azure Virtual Machines, Azure Power BI, Azure Flexible SQL, GCP Dataflow, GCP Cloud Functions, GCP SQL, GCP Storage, GCP BigQuery, GCP Cloud Run, GCP Domains.
$5000 / mo
≈ $60000 / year net
Database Developer | Data Engineer
Ukraine · Kyiv · More than 10 years of experience · Upper-Intermediate · Published today · In passive search
Responsibilities: Database Design and Development, Performance tuning, Reporting, BI dashboards, Backend development and collaboration, Cloud.
Tech stack: Oracle, MSSQL, MongoDB, AWS, C#/.NET
Current role: Database Developer, Data Engineer, Backend Developer
- AWS Certified Cloud Practitioner
- Associate Cloud Engineer - Google Cloud
- Oracle PL/SQL Developer Certified Associate
- Oracle9i Database Administrator Certified Associate
$4000 / mo
≈ $48000 / year net
Data Analyst
Ukraine · Kyiv · 4 years of experience · Upper-Intermediate · Published today · In passive search
Data Visualization, Excel, R, MySQL, MS SQL, PostgreSQL, Python, statistics, PowerPoint, Adobe Reader, ME.DOC, Outlook, Analytics, Data Analysis, Git, Access, Microsoft Word.
I want to work productively with datasets and be paid fairly for the work I deliver.
$1000 / mo
≈ $12000 / year net
Python Developer/Data Engineer
Ukraine · 1 year of experience · Advanced/Fluent · Published yesterday
As a Data Engineer, I was responsible for:
1) Researching sources for information extraction.
2) Writing custom scripts to extract the information.
3) Preparing, structuring, and exporting the results in different formats.
4) Developing algorithmic solutions for continuous information extraction based on limited input data.
Stack: Scrapy, Selenium, Requests, spaCy, pandas, Docker, NumPy, BeautifulSoup, Python.
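The extraction workflow above can be sketched in a few lines. The page markup, field names, and output format here are hypothetical, not taken from the candidate's actual work; on a real, messy page you would fetch with requests and parse with BeautifulSoup, but the stdlib ElementTree keeps this example self-contained and runnable.

```python
# Sketch: parse a page, pull out structured fields, and export the results.
# The HTML snippet stands in for a page fetched with requests.get(url).text.
import json
import xml.etree.ElementTree as ET

PAGE = """
<ul class="items">
  <li><span class="name">Alpha</span><span class="price">10</span></li>
  <li><span class="name">Beta</span><span class="price">20</span></li>
</ul>
"""

def extract_items(page: str) -> list[dict]:
    """Return one dict per <li> item found in the page."""
    root = ET.fromstring(page)
    items = []
    for li in root.iter("li"):
        # Map each span's class attribute ("name"/"price") to its text.
        spans = {span.get("class"): span.text for span in li.iter("span")}
        items.append({"name": spans["name"], "price": int(spans["price"])})
    return items

# Export the structured results in one of the "different formats" (JSON here).
print(json.dumps(extract_items(PAGE)))
```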
Personal projects:
Streamlined LinkedIn profile configuration with a Python-based automation bot.
Tech Stack: Python, Selenium, BeautifulSoup, Docker, Pandas.
Developed a web scraper to extract all Japanese lessons from a website.
Tech Stack: Python, Scrapy, BeautifulSoup, Docker
Created a Python script for extracting comprehensive animal data.
Tech Stack: Python, requests, pandas, BeautifulSoup
Designing a Telegram bot for interactive Japanese language lessons.
Tech Stack: node.js, MongoDB, node-telegram-bot-api, Docker
Developed a prototype of a web application focused on League of Legends.
Tech Stack: Python, Django, openpyxl, matplotlib, MySQL
In the past, I participated in several math boot camps and medaled at city math olympiads.
$2500 / mo
≈ $30000 / year net
Data / BI Engineer
Ukraine · Kyiv · 2.5 years of experience · Advanced/Fluent · Published yesterday
During my 1 year and 4 months at Adastra, I specialized in preparing technical data for interface agreements and integrated data from diverse sources (nearly 15 at once) using heterogeneous integration strategies for the Dr. Max client. On a project for Mubadala Capital, I implemented mappings and crafted data pipelines using Azure Data Factory. Additionally, I developed stored procedures for SCD processes, demonstrating my proficiency in database management.
My stint at Novalytica for 8 months allowed me to leverage Power BI for report development and Python for ETL scripting. I also gained experience in building pipelines using Azure Data Factory and BigQuery.
At RAZOM GROUP, where I spent 1 year, I focused on developing and supporting econometric models using R and Excel. I also created interactive dashboards for clients using Power BI and prepared analytical reviews in PowerPoint for various industries such as pharmaceuticals and FMCG. Additionally, I provided support for internal company software utilizing R.
$3000 / mo
≈ $36000 / year net
MLOps Engineer / Data Engineer
Ukraine · 2.5 years of experience · Upper-Intermediate · Published yesterday
Fractal Analytics | Dec 2021 - present
- Engineered a Models Monitoring solution integrated into existing Vertex AI pipelines using Cloud Monitoring, Cloud Logging, BigQuery, Looker Studio etc. This initiative helped to assess the robustness and performance of the deployed models on a regular basis.
- Orchestrated end-to-end MLOps pipelines setup and deployment on Vertex AI, including the establishment of CI/CD pipelines.
- Played a key role in a migration project, moving data and ML pipelines from Airflow to Vertex AI. Contributed to the development and optimization of pipelines using the KubeFlow SDK, utilized various GCP services, and worked on CI/CD pipelines.
- Assisted in the migration of data from legacy storage to GCP, employing SQL for data transformation and loading.
Data Analyst
City Development Solutions | Sep 2020 - Feb 2021
Responsibilities:
- Data collection, cleaning and processing
- Analysis of real estate market objects
- Plotting graphs, creating reports
Achievements:
Automated and sped up the process of collecting information using Python.
$6000 / mo
≈ $72000 / year net
Data Engineer
Ukraine · 7 years of experience · Advanced/Fluent · Published yesterday · In passive search
Work best if trusted with team-level objectives.
- Built a Data Studio connector.
- Competed on Kaggle.
- Go crazy about dev productivity: 51 GitHub gists (and counting), version-controlled VS Code snippets, half a dozen self-built utilities, and paid subscriptions to multiple AI-based dev tools.
- Been developing physics engines: a state-of-the-art one at work and an ordinary one for fun, as a hobby.
Didn't save a forest, didn't sell a company, but well, that's something too.
$700 / mo
≈ $8400 / year net
SQL Developer, QA Engineer
Ukraine · Kyiv · 6 months · Upper-Intermediate · Published yesterday
Manual testing of web and mobile applications.
Strong understanding of and experience with test design techniques (equivalence class, boundary value, pairwise, etc.). Understanding of types of testing and types of test documentation (bug report, test case, test report, test plan). Experience in creating bug reports, test cases, and checklists. Experience in requirements analysis.
$8000 / mo
≈ $96000 / year net
Business Intelligence Engineer
Ukraine · Kyiv · 9 years of experience · Advanced/Fluent · Published yesterday
— Scoping of the new reporting platform with a customer.
— Analyzing data structures in Snowflake DWH, connecting Looker to the DWH using SQL and LookML queries.
— Arranging dimensions and measures with a customer and implementing them in Looker.
— Creating LookML views, explores, models, and dashboards as per client's requirements.
— Setting up user access logic using access grants and user attributes.
— Tracking development progress and cooperation with colleagues in the GitLab versioning system.
QlikView Dashboards:
— Composing / modifying / optimizing QlikView data reload scripts.
— Implementing an automatic data-load error-handling system for all projects.
— Developing QlikView data models.
— Designing reports layout (creating and modifying objects using complex formulas, set analysis, variables, triggers, bookmarks, alternate states, etc.).
— Data visualization in QlikView reports using filters, pivot tables, charts.
— Managing QlikView reports automatic reload and distribution on the QlikView server.
— Managing user access via Section Access / on the server level.
— Understanding client’s business logic and the best ways of its reflection in BI solutions.
— Supporting and guiding Project Managers and stakeholders on the proposed solution structure and the components necessary for successful delivery.
— Working with clients on preliminary scoping and dashboard architecture.
— Writing technical specifications for the developed dashboards (instructions for the end-users, manuals for QA).
— Optimizing dashboards for better performance.
— Reviewing dashboards with legacy back-end or front-end.
— Implementing ETL logic in the data reload process.
— Tracking development process in the GitLab versioning system.
— QlikView Dashboard for a global industrial water-treatment company. Analysis of the regular security audit results at the company's facilities in the UK, European Union, and Australia.
— QlikView Dashboards for a top US retail corporation. Mainly used to analyze data about each employee's primary and secondary skills. The company has headcount requirements for each skill, and the dashboards help it dynamically regulate the number of people with each skill across departments.
— QlikView Dashboard for a US distributor and merchandiser of supermarket non-foods. Company performance analysis, broken down by districts, stores, and categories of goods.
— QlikView Dashboards for the software development company. Analyzing time logged by company employees in JIRA and cost-effectiveness of company’s projects.
— QlikView Dashboards for one of the top insurance companies in the United States. Analyzing various aspects of clients' credit risk, including scoring, geographical analysis, and plan/actual analysis.
— QlikView Dashboards for the operator of a senior-homes network in the United Kingdom and the United States. Used to analyze the operational effectiveness of the homes as well as the seniors' comfort living there.
— Dashboards for one of the world's top coffee-making franchises. Used to analyze the performance of coffee shops, the quality of coffee, and the quality of service (including the Mystery Shopping methodology).
$2500 / mo
≈ $30000 / year net
Java Developer, Big Data Developer
Ukraine · Kyiv · 5 years of experience · Intermediate · Published 1 May
5+ years as Java / Big Data dev.
Strong at JVM languages as well as scripting languages: JS, Python, Bash, etc.
Hands-on experience with Big Data (Spark, Kafka, etc) and clouds (AWS, GCP).
Participated in the successful migration of an existing Big Data AWS solution (Spark apps, cloud functions) to another cloud (GCP).
Provided analysis of optimization opportunities and cloud-migration options for an existing on-premise Big Data solution (Spark jobs, Hive queries).
Provided mentorship for team members.