Jobs Kyiv
· 80 views · 2 applications · 3d
Senior Detective, Digital Investigations and Criminal Data Management Unit (Data Engineer) — to $1950
Office Work · Ukraine (Kyiv) · 2 years of experience · Intermediate
Senior Detective, Digital Investigations and Criminal Data Management Unit, Criminal Analysis and Financial Investigations Department
The National Anti-Corruption Bureau of Ukraine (NABU) is an independent state body specializing in the investigation of top-level corruption crimes. Working at NABU is an opportunity to join the fight against corruption and to contribute to building a transparent and fair society.
TERMS:
- Work in Kyiv.
- Full-time employment.
- Base salary: UAH 79,939.00.*
Additional payments: in accordance with Article 23 of the Law of Ukraine "On the National Anti-Corruption Bureau of Ukraine".
- In accordance with clause 24, part 1, Article 23 of the Law of Ukraine "On Mobilization Preparation and Mobilization", employees of the National Bureau liable for military service are not subject to conscription during mobilization (deferral).
REQUIREMENTS:
- Knowledge of and hands-on skills in one or more of the following programming languages: Python (Django, Flask, FastAPI) / JavaScript (Node.js) / Java / Dart / Go (Golang) / C# (.NET) / Rust / PHP, or other programming languages.
- Ability to work with APIs and integrate external services.
- IT solutions and automation: experience developing and implementing information and communication systems and automating business processes.
- Understanding of data processing principles; basic knowledge of methods for collecting and analyzing information, preparing analytical documentation, and visualizing data; experience with data collection and analysis tools.
- Knowledge of the fundamentals of working with databases (SQL/NoSQL).
- Higher education (master's, specialist, or bachelor's degree obtained from 2016 onward) in one of the following fields: Electronics, Automation and Electronic Communications; Information Technology; Mathematics and Statistics; Social and Behavioral Sciences (Economics).
- At least two years of work experience in one of the following areas: information technology, audit, risk management, systems and business analysis, corporate (economic) intelligence, or data management.
- Fluent command of the state language. Command of a foreign language (English, French, or German) at Upper-Intermediate (B2) level or higher is an additional advantage.
SCOPE OF WORK:
- Analysis and preparation for development:
Reviewing the technical specification; identifying the required tools and technologies, estimating data volumes, and choosing the optimal approach.
- Business process automation:
Identifying tasks that can be optimized with software solutions; developing and implementing automation tools (e.g., data parsing, automatic form filling, document generation, etc.).
- Software development:
Creating scripts, modules, or full-fledged applications for automation using programming languages within your competence; designing and implementing APIs for integration with other systems; related work with databases.
- Testing solutions:
Verifying code against test data; detecting errors, testing edge cases (empty data, large volumes), etc.
- Optimization.
- Integration with systems:
Connecting the developed solutions to other platforms via APIs or direct database queries; configuring data exchange between different services; verifying the stability of integrations under real-world load.
- Providing, within the scope of competence, information and analytical support for the National Bureau's activities in order to prevent, detect, stop, investigate, and solve corruption and other criminal offenses falling under the National Bureau's jurisdiction, as well as other offenses.
- Participating in investigative (search) and other procedural actions as an information technology specialist.
- Collecting, processing, and analyzing data from digital devices using software and hardware tools.
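To give a flavor of the automation scripting described in the duties above, here is a minimal, purely illustrative Python sketch of data parsing and report generation; the file names, the "category" column, and the output format are assumptions made for the example, not part of the vacancy.

```python
# Minimal, hypothetical automation sketch: parse a CSV export and produce a
# short summary report. File names and the "category" column are assumptions.
import csv
from collections import Counter
from pathlib import Path

def summarize(input_csv: str, output_txt: str) -> None:
    counts = Counter()
    with open(input_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row.get("category", "unknown")] += 1  # tally records per category
    lines = [f"{category}: {total}" for category, total in counts.most_common()]
    Path(output_txt).write_text("\n".join(lines), encoding="utf-8")

if __name__ == "__main__":
    summarize("export.csv", "summary.txt")
```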
STAGES OF THE SELECTION PROCESS:
- Submission of documents.
- Testing (level-1 legislation knowledge test — the list of questions is available at https://nabu.gov.ua/perelik-pytan-do-kvalifikaciynogo-ispytu; general abilities test; psychological test).
- Interview.
*Base salaries of National Bureau employees undergoing probation are set with a reduction factor of 1.5.
**At the time of document submission, candidates must hold or obtain the State Certificate of Proficiency in the State Language.
-
· 53 views · 8 applications · 4d
ETL Developer
Ukraine · Product · 2 years of experience · Intermediate
ABOUT US
UKRSIBTECH is an ambitious IT team of about 400 specialists driving the technology behind UKRSIBBANK.
We build top-tier banking for more than 2,000,000 clients and strive to take Ukraine's financial sector to a new level. Our products are used by everyday banking customers, leaders of the Ukrainian economy, and large international corporations.
We are grateful to our defenders, who devotedly protect Ukraine's freedom and independence, and we create a supportive working environment at the bank.
Your future tasks:
- Developing application software in the area of analytical databases
- Creating physical data structures in relational databases
- Solving tasks using DataStage
- Implementing Oracle procedures and writing SQL scripts
- Creating the necessary technical documentation for deliverables
- Providing technical support for scheduled (routine) procedures
- Participating in incident management for data loading/unloading processes
- Preparing and delivering reports
We are looking for a specialist who:
- Has experience and knowledge of the Oracle DBMS
- Is confident with PL/SQL
- Knows the basic principles of building data warehouses
Nice to have:
- Experience with DataStage (or Informatica/Oracle Data Integrator)
In addition to a team of like-minded people and interesting work, you will get:
Stability:
- official employment
- medical and life insurance, fully paid by the Bank
- a salary on par with leading top employers
- 25 days of annual vacation, additional vacation days for special occasions, and social leave in accordance with Ukrainian law
- annual salary reviews based on your performance and the Bank's financial results
Developing your talents for your career:
- training: Leadership School, Service Design, Business Analytics, Data, Digital, Agile, the I-Players talent development program, and a wide range of micro-trainings (SQL, BI, EQ, etc.)
- the opportunity to take part in internal and international mobility programs within BNP Paribas Group
- support for continuous learning and professional growth: every year our employees draw up an individual development plan together with their manager or mentor and move confidently toward their goals
A great place to work:
- the "War&Life balance: how to bring life back into life" program — practical workshops with specialists to support emotional well-being
- digests with expert advice on maintaining physical and mental health as part of the "New Ways of Working" program
- equal opportunities: diversity and inclusion
- involvement in sustainable development and corporate social responsibility projects
If you want to succeed in banking and join a community that strives to have a positive impact on society, we will be glad to welcome you to the UKRSIB Tech team.
#veteranfriendly
-
· 102 views · 9 applications · 5d
Data Engineer
Countries of Europe or Ukraine · Product · 1 year of experience · Intermediate · Ukrainian Product 🇺🇦
OBRIO is an IT company with Ukrainian roots inside the Genesis business ecosystem. Our team consists of more than 300 talented professionals, whose ambitions and striving for success help us build the best products on the market. We have offices in Kyiv, Lviv and Warsaw.
We are developing Nebula — the biggest brand in the spiritual niche:
- Nebula is #1 in its niche in terms of downloads and revenue targets;
- 60 million users worldwide;
- Users from 50+ countries;
- 4.8 — our average App Store rating (with more than 215 thousand ratings).
As we establish and grow our data platform and data engineering team, we're looking for a fourth Data Engineer to join us. You'll team up with Daniil Yolkin, our Lead Data Engineer, along with data analysts in different domains (product, payments, marketing, etc.), to optimize processes and craft innovative solutions that drive the business forward.
Our technical stack: AWS, Airflow, Redshift, SQL, Python, Git.
Why You'll Thrive Here:
- Work with data at scale — Gain hands-on experience handling extensive batch and streaming datasets.
- Shape the future — Build our data engineering function with frameworks like Medallion, OBT, and Data Mesh.
- Empowered decision-making — Take ownership of decisions in the analytical domain and collaborate directly with stakeholders.
- Drive innovation — Play a key role in influencing architectural decisions that define our data strategy.
- Career growth — Progress from Data Engineer to Data Architect, with long-term opportunities in Data Science or ML Engineering roles.
Your impact:
- Maintain and optimize existing data processes in Airflow;
- Extract and process data from RESTful APIs and OLTP systems;
- Develop and enhance a scalable data platform on Redshift.
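To illustrate the kind of daily Airflow job implied by this stack (REST API extraction feeding the warehouse), here is a minimal, hypothetical sketch; the endpoint URL and the load step are assumptions, not OBRIO's actual pipeline.

```python
# Minimal, hypothetical Airflow DAG: pull records from a REST API once a day and
# hand them to a load step. The endpoint and the load logic are assumptions.
from datetime import datetime

import requests
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def api_to_warehouse():
    @task
    def extract() -> list:
        resp = requests.get("https://api.example.com/v1/events", timeout=30)
        resp.raise_for_status()
        return resp.json()  # assumed to return a JSON array of event records

    @task
    def load(rows: list) -> None:
        # A real pipeline would COPY/INSERT into Redshift here; the sketch only
        # logs the row count to stay self-contained.
        print(f"would load {len(rows)} rows into the warehouse")

    load(extract())

api_to_warehouse()
```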
About You:
- 1+ years of experience working with data;
- Proficient in Python for ETL/ELT processes (Airflow, PySpark, or similar);
- Strong SQL knowledge (Postgres or similar);
- Experience working with RESTful APIs;
- Familiar with analytical database architectures (Kimball, Inmon, Data Vault, Medallion);
- Experience with cloud-native analytical databases (Redshift or similar);
- Hands-on experience in cloud environments (AWS or similar);
- Worked with big data storage/processing tools (AWS MWAA, AWS Glue, Redshift);
- Familiar with orchestration tools (Apache Airflow or similar).
Nice To Have:
- Experience with streaming analytics;
- Writing Data Quality scripts;
- Experience with external RESTful APIs (Google Ads, Facebook Ads, TikTok Ads).
Our benefits:
- Work from the comfort of your home or from one of our offices in Kyiv, Lviv or Warsaw. The choice is yours!
- Enjoy 20 annual vacation days and unlimited sick leave, all covered by the company;
- Don't worry about getting the right equipment, we've got you covered if necessary;
- Stay healthy with access to a corporate doctor online, and health insurance options in Ukraine or a fixed amount towards insurance abroad after your probation period;
- Keep learning with our extensive corporate library, internal online meetings, and lectures;
- Grow your skills with our training compensation program;
- Take advantage of our supportive corporate culture, including assistance with relocation, advice on legal stay abroad, housing support, and help for third-country nationals;
- Have fun with our online events and team-building activities!
Here's what our hiring journey looks like: Introductory Call with a Recruiter (20–30 minutes) → Technical Interview (1.5 hours) → Final Interview with a C-level Executive (1 hour) → Job Offer.
Let's team up and reach for the stars together!
-
· 67 views · 10 applications · 11d
Data Engineer
Countries of Europe or Ukraine · Product · 4 years of experience · Ukrainian Product 🇺🇦
Hi!
We are E-Com — a team that is passionate about Foodtech and Ukrainian products.
We are a standalone, ambitious startup within a powerful company, with a cool, strong team and big dreams.
We are looking for a Data Engineer who will build progressive products with us — products used by thousands of Ukrainians, quite possibly including your own family.
And one more thing: we are breaking the stereotype that retail is only about tomatoes. Believe us, the technical side of our projects offers plenty of room for creativity and for exercising your brain.
What we are working on right now:
- improving our existing wide-assortment delivery from Silpo stores
- building ultra-fast delivery of groceries and ready meals under the new LOKO brand
What values do we build our projects on?
- A Ukrainian product for Ukrainians. Made by our own, for our own. Our products are important and useful to everyone — to our families and our friends.
- An Agile mindset in practice, not just in words: applied sensibly, consciously, and based on our own context.
- The ability to influence the product. We hire rising stars and give them room to shine and thereby light up our products. Common sense matters more to us than "that's how it has always been done".
How we picture our dream candidate:
- 4+ years of experience in data analysis/processing, including at least 3 years as a Data Engineer
- Deep knowledge of data processing with Python and SQL, plus hands-on experience with Airflow
- Strong database skills: experience with SQL and NoSQL databases such as PostgreSQL, MySQL, MongoDB, and Redshift
- Understanding of ETL (Extract, Transform, Load) concepts: experience developing and deploying ETL processes that integrate data from different sources
- Knowledge of data management principles: experience designing database schemas, ensuring reliability, and optimizing performance
- Experience with cloud technologies such as AWS and the corresponding services for data storage and processing
- Teamwork: experience coordinating work across teams, communicating with stakeholders, and distributing tasks among team members
- Good understanding of AWS Cloud and Google BigQuery
- Experience with Spark
Nice to have:
- Knowledge of the basic principles of working with BI systems
- Experience with data from analytics systems (GA, Firebase)
- Experience with Scala
What you will do:
- Build high-performance, scalable systems for transferring and processing analytics data
- Process and aggregate raw data from various systems (user activity data from our apps and website)
- Maintain and adjust data processing flows
- Optimize SQL scripts on request
What we offer:
- Work on a flagship project with a large user base
- The ability to influence decisions
- Competitive pay, discussed with the successful candidate based on the interview results
- Voluntary medical insurance
- Employee discounts across the chain's stores
- A concierge service for employees
- Remote work. And if you want to work from an office, there is a cozy coworking space in the SilverBreeze business center in Bereznyaky. For safety reasons, however, we currently recommend limiting trips to the office
-
· 29 views · 3 applications · 12d
Senior Data Engineer
Ukraine · Product · 4 years of experience
About us:
Data Science UA is a service company with strong data science and AI expertise. Our journey began in 2016 with the organization of the first Data Science UA conference, setting the foundation for our growth. Over the past 8 years, we have diligently fostered the largest Data Science Community in Eastern Europe.
About the client:
Our client is a large-scale financial project with an extensive network and a strong market presence. The company manages vast amounts of financial data and is focused on enhancing its data infrastructure to support innovative solutions. With a commitment to long-term development, the team works on complex, high-impact projects — from optimizing data pipelines to implementing modern technologies across both cloud and on-premises environments. This is an opportunity to join a dynamic, data-driven company that values technical expertise and encourages growth.
About the role:
We are looking for a talented and passionate Data Engineer to join the team.
Requirements:
- 3-5+ years of experience working as a Data Engineer;
- Master's or PhD in Mathematics, Cybernetics, Computer Science, or a related field;
- Experience building and scaling data pipelines over large tabular data, both in the cloud and on-premises;
- Ability to understand data requirements coming from the DS team;
- Ability to transform raw data with quality issues into data ready for modeling;
- Experience working with temporal data and with scraping;
- Experience working with Python, PyTorch, Pandas, and Oracle DB;
- Knowledge of cloud technologies such as AWS and of the Python ecosystem;
- English - Upper-intermediate+.
Responsibilities:
- Design and implement scalable data pipelines for financial information processing using Python;
- Design and support hybrid data infrastructure spanning on-premises and AWS environments;
- Implement and support data integration solutions using tools such as Airbyte;
- Ensure data quality, reliability, and system performance;
- Collaborate with data scientists and analysts to optimize data delivery;
- Establish best practices for data processing and documentation processes.
The company offers:
- Medical insurance;
- Laptop and cloud services provided by the company;
- Paid training, the opportunity to participate in the creation and publication of scientific articles (open-source knowledge development);
- Social package (paid sick leave and vacation).
-
· 20 views · 1 application · 16d
SAP Analytics Cloud Consultant
Office Work · Ukraine (Kyiv) · Product · 2 years of experience · Pre-Intermediate · Ukrainian Product 🇺🇦
Ajax Systems is a global technology company and the leading developer and manufacturer of Ajax security systems with smart home capabilities in Europe. It encompasses a comprehensive ecosystem featuring 135 devices, mobile and desktop applications, and a robust server infrastructure. Each year, we experience substantial growth in both our workforce and our user base worldwide. Currently, the company employs over 3,300 individuals, while Ajax sensors safeguard 2.5 million users across more than 187 countries.
We have an open position for a Senior SAC Consultant in our team. We are looking for a professional who will help us design and implement advanced analytics solutions, transforming business data into strategic insights using SAP Analytics Cloud and SAP S/4HANA Embedded Analytics.
Key Responsibilities:
- Analytics Solution Development: Design and develop interactive dashboards, reports, and KPIs in SAP Analytics Cloud (SAC);
- Focus on visual design principles to create user-friendly and intuitive dashboards. Propose a design code guide to follow in the team;
- Utilize SAC capabilities, including planning, forecasting, and predictive analytics;
- Enable and optimize Embedded Analytics in SAP S/4HANA by consuming CDS Views.
- Data Integration and exploration: Integrate data from various sources (SAP and non-SAP systems, SQL databases, etc.) into SAC for unified analytics and reporting;
- Perform exploratory data analysis (EDA) to uncover patterns, validate data quality, and derive actionable insights.
- Business Collaboration: Work closely with business stakeholders to gather requirements, define KPIs, and translate business needs into technical solutions;
- Provide guidance on self-service analytics to empower business users and enhance BI adoption.
- Solution Optimization and Governance: Ensure SAC solutions align with data governance and performance best practices;
- Manage access controls, data security, and compliance across SAC deployments;
- Monitor and optimize dashboards for usability, performance, and scalability.
- Documentation and Knowledge Sharing: Document all solutions, models, data integrations, and reporting workflows to ensure transparency and maintainability;
- Create clear technical and user documentation to support stakeholders and end-users.
- Integration and Enablement: Collaborate with SAP Data Engineers to ensure data readiness from SAP S/4HANA and SAP Data Sphere;
- Integrate SAC with SAP back-end systems and other data sources for a unified analytics experience.
- Leadership and Support: Provide thought leadership on advanced SAC functionalities, trends, and best practices;
- Mentor team members and train end-users to maximize the value of SAC.
Key Requirements:
- 4+ years of experience in SAP Analytics Cloud (SAC), including planning, predictive analytics, and dashboard development.
- Strong knowledge of SAP S/4HANA Embedded Analytics and integration of CDS Views into SAC models.
- Hands-on experience performing exploratory data analysis (EDA) across various data sources to identify patterns, insights, and data quality issues.
- Experience working with SQL-based databases to query, validate, and manipulate datasets for analytics purposes.
- Solid understanding of data visualization best practices and performance optimization techniques.
- Familiarity with integrating SAC with SAP and non-SAP systems for seamless data consumption.
- Excellent ability to gather and translate business requirements into technical analytics solutions.
- Strong communication and stakeholder engagement skills, enabling collaboration with technical and business teams.
- Problem-solving mindset to address complex analytics challenges and ensure data-driven decision-making.
- Ability to work in cross-functional teams, coordinating with Data Engineers and IT stakeholders to ensure data readiness.
- Proactive, detail-oriented, and committed to delivering high-quality solutions.
We offer:
- Opportunity to build your own processes and best practices;
- A dynamic team working within a zero-bullshit culture;
- Working in a comfortable office at UNIT.City (Kyiv). The office is safe as it has a bomb shelter;
- Reimbursement for external training for professional development;
- Ajax's security system kit to use;
- Official employment with Diia City;
- Medical Insurance;
- Flexible work schedule.
Ajax Systems is a Ukrainian success story, a place of incredible strength and energy.
-
· 77 views · 9 applications · 16d
Data Engineer
Ukraine · 4 years of experience
Hello! We are looking for an experienced data engineer to develop and maintain the data infrastructure for a virtual power plant project aimed at optimizing energy consumption and resource allocation.
Responsibilities:
- Design and develop reliable ETL processes for handling large volumes of data.
- Build and maintain data warehouses for efficient storage of time series.
- Integrate with various data sources across the energy infrastructure.
- Develop and optimize data flows for ML models.
- Ensure data quality and consistency.
- Develop and maintain a data monitoring system.
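As an illustration of the time-series side of these responsibilities, here is a minimal pandas sketch that resamples raw meter readings into hourly averages and flags gaps; the CSV source and the "timestamp"/"kwh" column names are assumptions made for the example.

```python
# Minimal, hypothetical sketch: resample raw meter readings to hourly averages
# and flag gaps. The CSV source and the "timestamp"/"kwh" columns are assumptions.
import pandas as pd

def hourly_consumption(path: str) -> pd.DataFrame:
    raw = pd.read_csv(path, parse_dates=["timestamp"])
    hourly = (
        raw.set_index("timestamp")
           .sort_index()["kwh"]
           .resample("1h")
           .mean()
           .to_frame("avg_kwh")
    )
    hourly["missing"] = hourly["avg_kwh"].isna()  # simple data-quality flag for gaps
    return hourly

if __name__ == "__main__":
    print(hourly_consumption("meter_readings.csv").head())
```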
Requirements:
- 4+ years of experience as a Data Engineer.
- Strong command of Python:
- Pandas and NumPy for data manipulation
- PySpark for big data processing
- Airflow/Luigi for data pipeline orchestration
- Experience with AWS.
- Deep understanding of SQL and experience working with databases.
- Experience with the following tools:
- Git for code versioning
- Docker/Kubernetes for containerization and orchestration
- Terraform/CloudFormation for infrastructure as code
- Grafana/Kibana for visualization and monitoring
We offer:
- Work on an innovative project in a dynamic field.
- Opportunities for professional development and learning.
- A flexible work schedule.
- A competitive salary.
-
· 105 views · 3 applications · 16d
Strong Junior Data Engineer
Worldwide · 1 year of experience · Intermediate
Dataforest is looking for a growth-minded Data Engineer to become part of our friendly team. As a Data Engineer, you will solve interesting problems using cutting-edge technologies for data collection, processing, analysis, and monitoring.
If you are not afraid of challenges, this vacancy is for you!
What matters to us:
• 1+ year of experience as a Data Engineer;
• Experience with Python;
• Experience with Databricks and Data Factory;
• Experience with AWS/Azure;
• Experience with ETL/ELT pipelines;
• Experience with SQL.
Responsibilities:
• Building ETL/ELT pipelines and data management solutions;
• Applying data processing algorithms;
• Working with SQL queries for data extraction and analysis;
• Analyzing data and applying data processing algorithms to solve business problems.
We offer:
• Work with a highly skilled engineering team on interesting and challenging projects;
• Learning the latest technologies;
• Communication with international clients and challenging tasks;
• Opportunities for personal and professional growth;
• A competitive salary fixed in USD;
• Paid vacation and sick leave;
• A flexible work schedule;
• A friendly working atmosphere without bureaucracy;
• We have many traditions — corporate parties, team-building activities, themed events, and much more!
If this vacancy speaks to you, send us your resume and become part of our team.
-
· 19 views · 1 application · 16d
Senior Data Engineer
Office Work · Ukraine (Kyiv) · 4 years of experience · Upper-Intermediate
We are looking for a Senior Data Engineer with strong expertise in PySpark to join our team in Kyiv. In this role, you will design, develop, and optimize large-scale data pipelines, ensuring efficient data processing and integration. You will also collaborate closely with business stakeholders to gather and refine data requirements, ensuring solutions align with business needs.
Experience with Palantir Foundry is a plus.
Responsibilities:
- Design and implement scalable ETL/ELT pipelines using PySpark.
- Work closely with product owners and other stakeholders to gather, analyze, and translate data requirements into technical solutions.
- Optimize data processing workflows for performance and reliability.
- Ensure data integrity, governance, and security best practices.
- Develop and maintain technical documentation for data pipelines and transformations.
- Troubleshoot and optimize Spark jobs to improve efficiency.
- Work with cloud-based or on-premises big data technologies.
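For reference, here is a minimal PySpark sketch of the kind of ETL step described above (read, deduplicate, aggregate, write); the S3 paths and column names are assumptions, not the project's actual pipeline.

```python
# Minimal, hypothetical PySpark ETL sketch: read raw events, deduplicate,
# aggregate per day, and write partitioned Parquet. Paths and columns are assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_events_aggregation").getOrCreate()

events = spark.read.parquet("s3a://raw-bucket/events/")  # assumed input location
daily = (
    events.dropDuplicates(["event_id"])
          .withColumn("event_date", F.to_date("event_ts"))
          .groupBy("event_date", "event_type")
          .agg(F.count("*").alias("events"),
               F.countDistinct("user_id").alias("users"))
)
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://curated-bucket/daily_events/"  # assumed output location
)
```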
Requirements:
- 5+ years of experience in data engineering, with a focus on PySpark.
- Strong knowledge of Apache Spark, distributed computing, and performance tuning.
- Experience gathering and refining business and technical data requirements.
- Hands-on experience with SQL, data modeling, and ETL processes.
- Proficiency in working with big data technologies (Hadoop, Databricks, etc.).
- Experience with cloud platforms (AWS, Azure, GCP) is a plus.
- Knowledge of CI/CD, version control (Git), and DevOps practices.
- Excellent problem-solving and communication skills.
Nice to Have:
- Experience with Palantir Foundry.
- Familiarity with streaming technologies (Kafka).
- Exposure to containerization and orchestration (Docker, Kubernetes, Airflow).
-
· 57 views · 8 applications · 17d
Senior Data Engineer
Ukraine · 5 years of experience · Upper-Intermediate
Our client is an AI-driven company revolutionizing biopharma research and commercialization. Our mission is to accelerate the development of lifesaving cures by enhancing data-driven decision-making.
Key Responsibilities:
- Develop scalable data pipelines for ingestion, processing, and analysis of complex datasets.
- Integrate structured & unstructured data sources to create unified, AI-ready data environments.
- Collaborate with AI & product teams to enhance data architectures supporting AI models.
- Ensure compliance & security in line with pharma industry regulations (FDA, EMA, etc.).
- Optimize cloud-based workflows for large-scale data processing.
- Work cross-functionally with data scientists, engineers, and domain experts to align data solutions with business needs.
Requirements:
- 5+ years of experience in data engineering, preferably in pharmaceuticals or life sciences
- Expertise in Python, PySpark, SQL, Databricks, Snowflake, DBT, PostgreSQL
- Hands-on experience with AWS (preferred), Terraform, CloudFormation, Docker, Kubernetes
- Strong background in designing scalable data pipelines
- Knowledge of data governance & compliance frameworks in the pharma industry
- Excellent problem-solving & collaboration skills.
Would be a plus:
- Experience with LLMs & embeddings
- Integration with ML platforms (e.g., AWS SageMaker)
- Experience handling unstructured data (PDFs, images)
What We Offer:
- Startup culture, a strong goal-oriented team, and a research mindset
- Opportunity to leverage your engineering skills for fellow engineers and shape the future of AI
- Working with the newest technical equipment
- 20 working days of annual vacation leave
- English courses, Educational Events & Conferences
- Medical insurance
-
· 13 views · 0 applications · 17d
Technical Lead Python Engineer with expertise in Data
Ukraine · 6 years of experience · Upper-Intermediate
Our Client is a global investment advisor organization that utilizes advanced technologies and data analytics to improve the operational performance of their portfolio companies. Key objectives include delivering strong financial returns, offering diverse investment opportunities, enhancing portfolio company operations, and proactively managing risks. They have invested in proprietary technology and data-driven strategies to extract additional value. Currently, they are expanding their data science team in London, seeking a Technical Lead Data Engineer to further develop their internal Data Platform.
Required Skills and Qualifications:
Must have:
- Python Development: minimum 5 years of professional experience in production environments, emphasising performance optimisation and code quality.
- Ingestion and modelling:
- Experience with Python and orchestration tools like Airflow is beneficial.
- SQL Proficiency: advanced knowledge of SQL:
- At least one of PostgreSQL, MySQL, MSSQL
- Ability to write complex queries and optimise database performance
- Utilise DPT tools.
- Infrastructure as Code: experience with Terraform or equivalent tools to facilitate code-controlled infrastructure management.
- Cloud Experience: over 3 years of hands-on experience with cloud providers, specifically:
- AWS or GCP is acceptable
- Azure is highly preferred due to project-specific requirements.
- English — Upper-Intermediate or Advanced.
- Ukrainian — Advanced or higher.
Education: advanced degree (Master's or Ph.D.) in Data Science, Statistics, Computer Science, Engineering, or a related field.
Nice to have:
- Data Pipeline Design: Strong understanding of designing robust and scalable data pipelines for large-scale applications.
- Version Control and CI/CD: Familiarity with Git-based workflows and continuous integration/deployment practices to ensure seamless code integration and deployment processes.
- Communication Skills: Ability to articulate complex technical concepts to technical and non-technical stakeholders alike.
- Experience with programming: Python, PySpark, DPT, SQLMesh (optional), PostgreSQL, MySQL, MSSQL, Terraform
Hands-on experience with tools and technologies: Git, Azure DevOps, JIRA, Confluence
We offer:
- Flexible working format - remote, office-based or flexible
- A competitive salary and good compensation package
- Personalized career growth
- Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
- Active tech communities with regular knowledge sharing
- Education reimbursement
- Memorable anniversary presents
- Corporate events and team buildings
- Other location-specific benefits
-
· 81 views · 7 applications · 18d
Data Engineer
Countries of Europe or Ukraine · 2 years of experience · Intermediate
Looking for a Data Engineer to join the Dataforest team. If you are looking for a friendly team, a healthy working environment, and a flexible schedule — you have found the right place to send your CV.
Skills requirements:
• 2+ years of experience with Python;
• 2+ years of experience as a Data Engineer;
• Experience with Pandas;
• Experience with SQL DB / NoSQL (Redis, Mongo, Elasticsearch) / BigQuery;
• Familiarity with Amazon Web Services;
• Knowledge of data algorithms and data structures is a MUST;
• Working with high-volume tables (10M+).
Optional skills (as a plus):
• Experience with Spark (PySpark);
• Experience with Airflow;
• Experience with Kafka;
• Experience in statistics;
• Knowledge of DS and Machine Learning algorithms.
Key responsibilities:
• Create ETL pipelines and data management solutions (APIs, integration logic);
• Implement various data processing algorithms;
• Involvement in the creation of forecasting, recommendation, and classification models.
We offer:
• Great networking opportunities with international clients, challenging tasks;
• Building interesting projects from scratch using new technologies;
• Personal and professional development opportunities;
• Competitive salary fixed in USD;
• Paid vacation and sick leaves;
• Flexible work schedule;
• Friendly working environment with minimal hierarchy;
• Team building activities, corporate events.
-
· 35 views · 4 applications · 19d
Middle/Senior Data Engineer (Azure)
Ukraine · 2 years of experience · Upper-Intermediate
N-iX is seeking a Data Engineer to join our UK/EU Client's team.
Position overview: We are seeking a data engineer with proven cloud skills to design and develop multiple data pipelines for ERP system data ingestion as part of a migration from Talend to Azure Fabric.
Requirements:
- Extensive knowledge in SQL and writing complex stored procedures
- Knowledgeable in Azure Data Fabric pipelines
- Knowledgeable in Azure Data Fabric Data warehouse
- Knowledgeable in Azure Data Fabric Lake house
- Knowledgeable in Azure Data Factory
- Expertise in Power BI would be a plus
- Experience with Talend would be a plus as this project would be a migration from it into Azure data factory
- Working Agile (Scrum or Kanban) experience
- Upper-intermediate English level
Responsibilities:
- Establish data pipelines with automated batch, micro-batch, and incremental data refreshes
- Ensure data quality using appropriate tools and technologies
- Provision, deploy, and scale resources
- Create technical design documentation in adherence to business, source system, target system, and solution architecture requirements
We offer:
- Flexible working format - remote, office-based or flexible
- A competitive salary and good compensation package
- Personalized career growth
- Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
- Active tech communities with regular knowledge sharing
- Education reimbursement
- Memorable anniversary presents
- Corporate events and team buildings
- Other location-specific benefits
-
· 62 views · 6 applications · 19d
Middle Data Support Engineer
Ukraine · 2 years of experience · Intermediate
N-iX is seeking a highly motivated Middle Data Support Engineer to join the Technical Support Team. As an Engineer, you will work closely with our experienced Data Engineering team members to provide technical support and assistance to our stakeholders in utilizing our data-driven solutions. You will play a crucial role in ensuring the smooth operation and successful implementation of our data products while gaining valuable hands-on experience in the field of data engineering.
Responsibilities:
- Ensure that all data is refreshed automatically every day as per the SLA and that data quality is continuously monitored
- Troubleshoot and resolve complex issues of Foundry users within SLAs
- Handle small change requests and bug fixes independently
- Provide standard and ad-hoc reports on pipeline monitoring, issues, health checks, and system usage
- Create new or missing documentation based on completed activities to share newly gained knowledge within the team, and keep existing documentation up to date
- Communicate promptly with end users about Foundry outages and delays, as well as upcoming changes and other support information.
- Implement continuous improvement of existing processes
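To make the daily SLA/freshness check described above concrete, here is a small, hypothetical PySpark sketch (deliberately generic rather than Foundry-specific); the dataset path, the "ingested_at" column, and the alerting mechanism are assumptions.

```python
# Minimal, hypothetical freshness check (generic PySpark, not Foundry-specific):
# verify that a dataset received rows for yesterday. Path, the "ingested_at"
# column, and the alerting mechanism are assumptions.
from datetime import date, timedelta

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("freshness_check").getOrCreate()

df = spark.read.parquet("s3a://warehouse/orders/")  # assumed dataset location
yesterday = date.today() - timedelta(days=1)
rows = df.filter(F.to_date("ingested_at") == F.lit(yesterday)).count()

if rows == 0:
    # A real setup would raise an alert or open a ticket instead of printing.
    print(f"SLA breach: no rows ingested for {yesterday}")
else:
    print(f"OK: {rows} rows ingested for {yesterday}")
```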
Requirements:
MUST HAVE:
- 2+ years experience with SQL
- Good Python skills
- Hands-on experience with PySpark (or Spark)
- Good knowledge of Data Warehouse concepts
- Intermediate+ English level
NICE TO HAVE:
- Strong troubleshooting skills from any prior support experience
- Readiness to work with TypeScript
We offer:
- Flexible working format - remote, office-based or flexible
- A competitive salary and good compensation package
- Personalized career growth
- Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
- Active tech communities with regular knowledge sharing
- Education reimbursement
- Memorable anniversary presents
- Corporate events and team buildings
- Other location-specific benefits
-
· 21 views · 1 application · 19d
Lead Data Analyst
Office Work · Ukraine (Kyiv) · Product · 3 years of experience
Hello!
We are a Ukrainian company that has been working in retail sales, procurement, and the manufacturing of clothing and accessories for over 20 years. We are looking for a talented Lead Data Analyst to join our team and help us reach new heights.
Key responsibilities:
- Effectively manage a team of data analysts to ensure high-quality data analysis; distribute tasks and oversee their completion.
- Develop innovative approaches to solving business problems and coordinate the processes of data processing, visualization, and interpretation that support strategic decision-making.
- Conduct in-depth data analysis to identify trends, patterns, and anomalies.
- Introduce new tools and technologies to improve analytics processes.
- Optimize data collection, processing, and storage processes.
- Train and mentor team members and support their professional development.
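As a small illustration of the trend and anomaly analysis mentioned in the responsibilities above, here is a hedged pandas sketch using a simple rolling 3-sigma rule; the input columns and the threshold are assumptions, not the company's actual method.

```python
# Minimal, hypothetical sketch of a trend/anomaly pass over daily sales: compute
# a 7-day rolling mean and flag days that deviate strongly (simple 3-sigma rule).
# The "date"/"revenue" columns and the threshold are assumptions.
import pandas as pd

def flag_anomalies(daily_sales: pd.DataFrame) -> pd.DataFrame:
    s = daily_sales.sort_values("date").set_index("date")["revenue"]
    rolling_mean = s.rolling(window=7, min_periods=7).mean()
    rolling_std = s.rolling(window=7, min_periods=7).std()
    z = (s - rolling_mean) / rolling_std
    out = s.to_frame()
    out["anomaly"] = z.abs() > 3
    return out.reset_index()
```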
Requirements:
- Higher education in mathematics, statistics, economics, computer science, or a related discipline.
- 3+ years of experience as a Data Analyst, including leadership roles.
- Fluency with data analysis tools (SQL, Python, Excel).
- Experience with BI tools (Power BI, Tableau).
- Understanding of machine learning methods and statistical analysis.
- MDX/DAX.
- Ability to work in a team, strong communication skills, and organizational abilities.
- Independence, initiative, and the ability to prioritize tasks and plan your own time.
We will take care of:
- Your income. A competitive salary and stable payments.
- Your peace of mind. Paid sick leave and vacation (24 calendar days per year).
- Your mood. Friendly, fun corporate parties and Friday board-game get-togethers ;)
- Your career growth. Training compensation and every condition for self-development.
- Your comfort. A corporate culture of mutual respect and support that will help you settle into the team easily and tackle challenges together.
- Your health. An active sports life — football, volleyball, yoga.
- Your development. English language courses.
Ready for new challenges and interesting tasks? Then we are waiting for your resume!