Database Engineer
Full Remote · Ukraine, Poland, Hungary · Product · 5 years of experience · English - B2
We’re hiring a Database Engineer to design, build, and operate reliable data platforms and pipelines. You’ll focus on robust ETL/ELT workflows, scalable big data processing, and cloud-first architectures (Azure preferred) that power analytics and applications.
What You’ll Do
- Design, build, and maintain ETL/ELT pipelines and data workflows (e.g., Azure Data Factory, Databricks, Spark, ClickHouse, Airflow); a minimal pipeline sketch appears at the end of this posting.
- Develop and optimize data models and data warehouse/lake/lakehouse schemas (partitioning, indexing, clustering, cost/performance tuning, etc.).
- Build scalable batch and streaming processing jobs (Spark/Databricks, Delta Lake; Kafka/Event Hubs a plus).
- Ensure data quality, reliability, and observability (tests, monitoring, alerting, SLAs).
- Implement CI/CD and version control for data assets and pipelines.
- Secure data and environments (IAM/Entra ID, Key Vault, strong tenancy guarantees, encryption, least privilege).
- Collaborate with application, analytics, and platform teams to deliver trustworthy, consumable datasets.
Required Qualifications
- ETL or ELT experience required (ADF/Databricks/dbt/Airflow or similar).
- Big data experience required.
- Cloud experience required; Azure preferred (Synapse, Data Factory, Databricks, Azure Storage, Event Hubs, etc.).
- Strong SQL and performance tuning expertise; hands-on with at least one warehouse/lakehouse (Synapse/Snowflake/BigQuery/Redshift or similar).
- Solid data modeling fundamentals (star/snowflake schemas, normalization/denormalization, CDC, etc.).
- Experience with CI/CD, Git, and infrastructure automation basics.
Nice to Have
- Streaming pipelines (Kafka, Event Hubs, Kinesis, Pub/Sub) and exactly-once/at-least-once patterns.
- Orchestration and workflow tools (Airflow, Prefect, Azure Data Factory).
- Python for data engineering.
- Data governance, lineage, and security best practices.
- Infrastructure as Code (Terraform) for data platform provisioning.
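To illustrate the kind of pipeline work described above, here is a minimal batch ETL sketch, assuming a PySpark environment with Delta Lake available (e.g., Databricks or Synapse Spark). The storage path, table name, and columns (order_id, order_ts, amount) are hypothetical and not part of this posting.

```python
# Minimal batch ETL sketch (assumption: PySpark + Delta Lake, e.g. on Databricks).
# Paths, table names, and columns below are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Extract: read raw JSON landed in cloud storage (hypothetical path).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

# Transform: deduplicate, type the timestamp, derive a partition column, filter bad rows.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Load: append into a Delta table partitioned by date for pruning and cost/performance.
(cleaned.write
        .format("delta")
        .mode("append")
        .partitionBy("order_date")
        .saveAsTable("analytics.orders"))
```

In an ADF/Airflow setup, a step like this would typically be one task in a larger orchestrated workflow, with data quality tests and monitoring wrapped around it.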
Software Development Engineer
Full Remote · Hungary, Poland, Ukraine · 5 years of experience · English - B2
We are looking for an experienced software engineer with strong technical expertise, excellent problem-solving skills, and the ability to work effectively in a collaborative environment. The ideal candidate should have a proven track record of working on complex systems, particularly in networking and Python development. Candidates with a strong background in both software development and QA engineering are encouraged to apply for this role. Below are the key technical requirements for the role:
Technical Requirements:
1. Networking Experience:
· Experience with networking devices like routers and switches.
· Experience with FW/SW development for networking devices using C/C++.
· Experience with BSP, data plane, or control plane protocol development for networking devices.
· Experience with automated data path test development for network devices using Python/Pytest (see the sketch after this list).
· Experience with SW/HW bring-up/integration of networking devices.
· Solid understanding of the differences between routing and forwarding, as well as switches and routers.
· Familiarity with VLANs.
· Proficiency in TCP/IP and UDP protocols.
2. Advanced Network Skills:
· Experience with link aggregation and LACP.
· Understanding of load balancing algorithms such as ECMP.
· Understanding of SPAN/RSPAN/ERSPAN and ACLs.
· Familiarity with traffic mirroring.
· Knowledge of automatic routing protocols like BGP, OSPF, RIP, and IS-IS.
3. Programming and Tools:
· Advanced C/C++ programming skills.
· Advanced Python programming skills.
· Experience with Pytest, Scapy, traffic generators, traffic analysers, etc.
· Git/GitHub.
4. Additional Skills (Optional but Preferred):
· Knowledge of embedded systems, Linux, or related technologies is an advantage.
· Familiarity with tools such as Jira and CI/CD pipelines.
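To make the Pytest/Scapy expectation concrete, below is a minimal data path test sketch. It assumes Scapy is installed, the test host has two interfaces wired through the device under test, and the test runs with privileges sufficient to send and sniff raw frames. Interface names, MAC/IP addresses, and the VLAN ID are hypothetical.

```python
# Minimal Pytest/Scapy data path check (assumptions: Scapy installed, two host
# ports wired through the DUT, raw-socket privileges). All names/addresses below
# are hypothetical.
import time

from scapy.all import AsyncSniffer, Dot1Q, Ether, IP, Raw, UDP, sendp

TX_IFACE = "eth1"  # host port facing the DUT ingress (hypothetical)
RX_IFACE = "eth2"  # host port facing the DUT egress (hypothetical)


def test_vlan_100_forwarded_with_tag_intact():
    """Send a VLAN-tagged UDP probe into the DUT and expect it on the egress port."""
    probe = (
        Ether(src="02:00:00:00:00:01", dst="02:00:00:00:00:02")
        / Dot1Q(vlan=100)
        / IP(src="10.0.0.1", dst="10.0.0.2")
        / UDP(sport=1234, dport=5678)
        / Raw(b"datapath-probe")
    )

    # Capture on the egress side while transmitting on the ingress side.
    sniffer = AsyncSniffer(iface=RX_IFACE)
    sniffer.start()
    time.sleep(0.5)  # let the sniffer attach before sending
    sendp(probe, iface=TX_IFACE, count=3, verbose=False)
    time.sleep(2)
    captured = sniffer.stop()

    # Find our probe among captured frames and check the 802.1Q tag survived.
    forwarded = [
        pkt for pkt in captured
        if UDP in pkt and pkt[UDP].dport == 5678
        and Raw in pkt and pkt[Raw].load == b"datapath-probe"
    ]
    assert forwarded, "probe was not forwarded to the egress port"
    assert all(Dot1Q in pkt and pkt[Dot1Q].vlan == 100 for pkt in forwarded)
```

In practice such tests are usually parameterized over ports, VLANs, and frame sizes and often driven by hardware traffic generators; this sketch only shows the shape of a Scapy-based check.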