Lesnikov.co.il
We are a leading provider of comprehensive digital solutions specializing in software development, website design, and digital marketing services. Our mission is to empower businesses with cutting-edge technology and innovative strategies to thrive in the digital landscape.
- Software Development: We craft custom software solutions tailored to the unique needs of each business, ensuring efficiency and scalability.
- Website Design: Our creative team designs visually stunning, user-friendly websites that enhance user experience and drive engagement.
- Digital Marketing: We offer strategic digital marketing services, including SEO, PPC, and social media marketing, to boost your online presence and attract targeted traffic.
We combine technical expertise, creative flair, and strategic insight to deliver exceptional results that help your business succeed.
Data Engineer
Full Remote · Countries of Europe or Ukraine · Product · 3 years of experience · English - B2
We are seeking a skilled Data Engineer to join our team and contribute to the development of large-scale analytics platforms. The ideal candidate will have strong experience in cloud ecosystems such as Azure and AWS, as well as expertise in AI and machine learning applications. Knowledge of the healthcare industry and life sciences is a plus.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines for large-scale analytics platforms (see the sketch after this list).
- Implement cloud-based solutions using Azure and AWS, ensuring reliability and performance.
- Work closely with data scientists and AI/ML teams to optimize data workflows.
- Ensure data quality, governance, and security across platforms.
- Collaborate with cross-functional teams to integrate data solutions into business processes.
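For illustration only, here is a minimal PySpark sketch of the kind of batch pipeline the first responsibility describes. The source path and table names (raw_events, analytics.events_clean) are hypothetical placeholders, not part of the role.

```python
# Minimal batch ingestion step for a Delta-based analytics pipeline.
# Paths and table names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

# Read newly landed raw files (a production pipeline would pin an explicit schema).
raw = spark.read.json("/mnt/landing/raw_events/")

# Basic cleansing: drop incomplete rows, normalize timestamps, deduplicate.
clean = (
    raw.dropna(subset=["event_id", "event_ts"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Append into a partitioned Delta table so downstream analytics can scale.
(clean.write.format("delta")
      .mode("append")
      .partitionBy("event_date")
      .saveAsTable("analytics.events_clean"))
```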
Required Qualifications
- Bachelor's degree (or higher) in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering, big data processing, and cloud-based architecture.
- Strong proficiency in cloud services (Azure, AWS) and distributed computing frameworks.
- Mandatory hands-on experience with Databricks (Unity Catalog, Delta Live Tables, Delta Sharing, etc.); see the sketch after this list.
- Expertise in SQL and database management systems (SQL Server, MySQL, etc.).
- Experience with data modeling, ETL processes, and data warehousing solutions.
- Knowledge of AI and machine learning concepts and their data requirements.
- Proficiency in Python, Scala, or similar programming languages.
- Basic knowledge of C# and/or Java programming.
- Familiarity with DevOps, CI/CD pipelines.
- High-level proficiency in English (written and spoken).
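As a rough illustration of the Databricks experience listed above, here is a minimal two-table Delta Live Tables pipeline: Auto Loader ingestion followed by an expectation-based validation step. It assumes it runs inside a DLT pipeline (where spark is provided), and the source path and table names are hypothetical.

```python
# Minimal DLT sketch: ingest raw orders with Auto Loader, then validate them.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested with Auto Loader.")
def orders_raw():
    return (
        spark.readStream.format("cloudFiles")
             .option("cloudFiles.format", "json")
             .load("/Volumes/main/landing/orders/")  # hypothetical source path
    )

@dlt.table(comment="Validated orders ready for analytics.")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # drop rows failing the check
def orders_clean():
    return dlt.read_stream("orders_raw").withColumn(
        "ingested_at", F.current_timestamp()
    )
```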
Preferred Qualifications
- Experience in the healthcare or life sciences industry.
- Understanding of regulatory compliance related to healthcare data (HIPAA, GDPR, etc.).
- Familiarity with interoperability standards such as HL7, FHIR, and EDI.
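To show what the interoperability standards above look like in practice, here is a toy example of reading one field from a FHIR R4 Patient resource (plain JSON). The payload is the public FHIR example patient, not real data.

```python
# Toy illustration: parse a minimal FHIR R4 Patient resource.
import json

patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}]
}
"""

patient = json.loads(patient_json)
assert patient["resourceType"] == "Patient"
print("Family name:", patient["name"][0]["family"])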
Performance Engineer (Data Platform / Databricks)
Full Remote · EU · Product · 3 years of experience · English - B2
We are looking for a specialist to design and implement an end-to-end performance testing framework for a healthcare system running on Databricks and Microsoft Azure. You will build a repeatable, automated approach to measure and improve performance across data ingestion, ETL/ELT pipelines, Spark workloads, serving layers, APIs, security/identity flows, integration components, and presentation/UI, while meeting healthcare-grade security and compliance expectations.
This role sits at the intersection of performance engineering, cloud architecture, and test automation, with strong attention to regulated-domain requirements (privacy, auditability, access controls).
Key Responsibilities
- Design and build a performance testing strategy and framework for a Databricks + Azure healthcare platform.
- Define performance KPIs/SLOs (e.g., pipeline latency, throughput, job duration, cluster utilization, cost per run, data freshness).
- Create workload models that reflect production usage (batch, streaming, peak loads, concurrency, backfills).
- Create a test taxonomy: smoke perf, baseline benchmarks, load, stress, soak/endurance, spike tests, and capacity planning.
- Implement automated performance test suites for:
- Databricks jobs/workflows (Workflows, Jobs API; see the timing sketch after this list)
- Spark/Delta Lake operations (reads/writes, merges, compaction, Z-Ordering where relevant)
- Data ingestion (ADF, Event Hubs, ADLS Gen2, Autoloader, etc. as applicable)
- Build test data generation and data anonymization/synthetic data approaches suitable for healthcare contexts.
- Instrument, collect, and analyze metrics from:
- Spark UI / event logs
- Databricks metrics and system tables
- Azure Monitor / Log Analytics
- Application logs and telemetry (if applicable)
- Produce actionable performance reports and dashboards (trend, regression detection, run-to-run comparability).
- Create performance tests for key user journeys (page load, search, dashboards) using appropriate tooling.
- Measure client-side and network timings and correlate them with API/backend performance.
- Integrate performance tests into CI/CD (Azure DevOps or GitHub Actions), including gating rules and baselines.
- Document framework usage, standards, and provide enablement to engineering teams.
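One building block of such a suite, sketched under stated assumptions: trigger a Databricks job through the Jobs API 2.1 and record its wall-clock duration. DATABRICKS_HOST, DATABRICKS_TOKEN, and JOB_ID are placeholders supplied via the environment; error handling and retry policy are omitted for brevity.

```python
# Hedged sketch: run one Databricks job via the Jobs API and time it.
import os
import time
import requests

HOST = os.environ["DATABRICKS_HOST"]   # e.g. workspace URL (placeholder)
TOKEN = os.environ["DATABRICKS_TOKEN"]
JOB_ID = int(os.environ["JOB_ID"])
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def run_job_and_time() -> float:
    """Trigger one run, poll until it finishes, return duration in seconds."""
    resp = requests.post(f"{HOST}/api/2.1/jobs/run-now",
                         headers=HEADERS, json={"job_id": JOB_ID})
    resp.raise_for_status()
    run_id = resp.json()["run_id"]

    start = time.monotonic()
    while True:
        run = requests.get(f"{HOST}/api/2.1/jobs/runs/get",
                           headers=HEADERS, params={"run_id": run_id}).json()
        if run["state"]["life_cycle_state"] in ("TERMINATED", "SKIPPED",
                                                "INTERNAL_ERROR"):
            break
        time.sleep(15)
    return time.monotonic() - start

if __name__ == "__main__":
    print(f"Job {JOB_ID} finished in {run_job_and_time():.1f}s")
```

Durations collected this way feed the baselines, dashboards, and CI/CD gates described above.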
Required Qualifications
- Proven experience building performance testing frameworks (not just executing tests), ideally for data platforms.
- Strong hands-on expertise with Databricks and Apache Spark performance tuning and troubleshooting.
- Strong knowledge of Azure services used in data platforms (commonly ADLS Gen2, ADF, Key Vault, Azure Monitor/Log Analytics; others as relevant).
- Strong programming/scripting ability in Python and/or Java/TypeScript.
- Familiarity with load/performance tools and approaches (e.g., custom harnesses, Locust/JMeter/k6 where appropriate, or Spark-specific benchmarking).
- Ability to design repeatable benchmarking (baseline creation, environment parity, noise reduction, statistical comparison; see the sketch after this list).
- Understanding of data security and compliance needs typical for healthcare (e.g., HIPAA-like controls, access management, auditability; adapt to your jurisdiction).
- High-level proficiency in English.
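A minimal sketch of the statistical comparison mentioned above, which also doubles as a CI/CD gating rule: compare repeated run durations of a candidate build against a stored baseline with a Welch t-test. It assumes scipy is available, and the timings are made-up illustrative numbers.

```python
# Regression gate: fail if the slowdown is statistically real and material.
from statistics import mean
from scipy.stats import ttest_ind

baseline_s = [412.0, 405.3, 418.9, 409.7, 414.2]   # seconds, previous release
candidate_s = [452.1, 447.8, 460.3, 449.9, 455.0]  # seconds, current build

t_stat, p_value = ttest_ind(candidate_s, baseline_s, equal_var=False)
slowdown = mean(candidate_s) / mean(baseline_s) - 1.0

# Gate rule: significant (p < 0.05) AND material (> 5%) slowdowns fail the build.
if p_value < 0.05 and slowdown > 0.05:
    raise SystemExit(f"Perf regression: +{slowdown:.1%} (p={p_value:.3f})")
print(f"OK: {slowdown:+.1%} vs baseline (p={p_value:.3f})")
```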
Nice-to-Have / Preferred
- Experience with Delta Lake optimization (OPTIMIZE, ZORDER, liquid clustering where applicable), streaming performance, and structured streaming (see the sketch after this list).
- Experience with Terraform/IaC for reproducible test environments.
- Knowledge of Unity Catalog, data governance, and fine-grained access controls.
- Experience with OpenTelemetry tracing and correlation across UI → API → data workloads.
- FinOps mindset: performance improvements tied to cost efficiency on Databricks/Azure.
- Prior work on regulated domains (healthcare, pharma, insurance).
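For reference, here is what the Delta Lake optimization mentioned above looks like on Databricks: compact small files and co-locate rows by a frequently filtered column. The table and column names are hypothetical.

```python
# Sketch: compact a Delta table and cluster it by a common filter column.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# OPTIMIZE rewrites small files into larger ones; ZORDER BY clusters rows so
# queries filtering on patient_id can skip unrelated files.
spark.sql("OPTIMIZE analytics.observations ZORDER BY (patient_id)")

# Inspect the effect: Delta records each optimization in the table history.
spark.sql("DESCRIBE HISTORY analytics.observations").show(5, truncate=False)
```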
Working Model
- Contract
- Remote
- Collaboration with Data Engineering, Platform Engineering, Security/Compliance, and Product teams.