As a Data engineer you will take part in projects aiming to solve business problems like user experience modelling, real-time churn prediction, advanced segmentation and others.
The ideal candidate will draw on their experience implementing advanced data streaming applications that ingest ~10TB of data daily, feeding all of the organization's units including marketing, BI, AI, and statistical/mathematical models.
You will be working with professional data teams made up of data engineers and architects, data scientists, economy managers, and business analysts.
• Architect, design, and implement ETL pipelines that ingest massive amounts of data from multiple sources
• Tune and optimize data application architecture to reach real-time capabilities
• Promote coding best practices, code-complete standards, OO programming & design patterns
• Develop smoke and integration tests run as part of a daily-build-based CI
• Champion the overall strategy for data governance, security, and quality to ensure requirements are met
• Collaborate with our architects to capture, format, and prepare data for machine learning training and testing
• Proficiency in Java with 3+ years of experience writing Spark-based batch and streaming data applications (Streaming/Structured Streaming/Spark SQL)
• Experience integrating at least one Big Data key/value store (Aerospike, Redis, DynamoDB, MongoDB, Hazelcast, Cassandra, HBase, Memcached)
• Strong understanding of the Hadoop ecosystem (especially Kafka, HDFS, ZooKeeper, YARN, Mesos, Parquet, Hive) and related technologies, with 2+ years of experience
• Strong working knowledge of SQL or SQL-like data management languages, with 5+ years of experience
• Linux knowledge with bash scripting experience
• Deep understanding of Computer Science fundamentals: object-oriented design using best practices and design patterns, data structures, and systems, application, and multithreaded programming
• Passion for data engineering automation and efficiency; a self-starter who loves challenges, is independent, and is tech-savvy
• Advantage: knowledge of K8s, Docker, Delta Lake, Aerospike, Redis, Vertica
• BSc in Computer Science or equivalent
AppReal-VR is an outsourcing software development company with offices in Tel Aviv and an R&D center in Kyiv (Ukraine). AppReal-VR offers software development services for companies and individuals who wish to outsource their development. AppReal-VR specializes in Virtual Reality and Augmented Reality development, as well as games and gaming. AppReal-VR offers different development models, such as fixed-price projects and dedicated teams.
Job posted on 18 November 2020