– 2 years of experience
– English: Upper-Intermediate
We are looking for a talented DevOps engineer to strengthen our team, help us scale in many ways, and manage the ever-growing complexity. There are plenty of opportunities to grow, learn new skills, and make a positive impact (ask us how we are connected to the R&D on the COVID-19 vaccines).
Requirements
– 3+ years of experience in DevOps engineering
– Experience with at least one scripting language, such as shell scripting
– Experience building CI/CD pipelines with TeamCity, Jenkins, CircleCI, or similar
– Experience working with Docker or other container technologies
– Experience deploying and maintaining Kubernetes clusters
– Cloud platform management and best practices (AWS, Docker, Kubernetes)
– Infrastructure as Code / automation / configuration management experience (Terraform, CloudFormation, Ansible, Puppet)
– Skills in AWS cloud networking: routing, firewalls, Shared VPC, Cloud VPN, load balancers
– Experience with Unix/Linux operating system internals and administration (e.g., filesystems, network shares, security, authentication, access control) or networking (e.g., TCP/IP, routing, network topologies)
– Experience organizing, maintaining, and running CI/testing pipelines day to day
– Ability to effectively communicate technical work to a wide audience
– Experience working with RDBMS and SQL
– Good communication skills
Would be a plus
– REST APIs (OpenAPI / Swagger, data formats, etc.)
– Basic understanding of ML workflows
– Experience with one or more ML frameworks (seldon.io, H2O, SageMaker, MLflow, Kubeflow, MLlib, Azure ML Studio, Torch, Keras, TensorFlow, etc.)
– Knowledge of domain networks and directory services (Active Directory, OpenLDAP, Samba)
– Scientific background
What you will do
– Take ownership of our systems for CI/CD, availability monitoring, etc.
– Help us design a framework for scalable computations
– Optimize the deployment of our platform for different cloud services
– Establish policies for backup, access, security, disaster recovery, etc.
Datagrok's mission is to help people understand their data. To do that, we are building a revolutionary platform that lets people connect to any data source, interactively visualize up to 10 million rows entirely in the browser, and leverage artificial intelligence techniques to derive actionable insights at a speed previously unheard of.
Think of Datagrok as an operating system for data. The platform is extremely flexible and can be extended in many ways: visualizations, data pipelines, predictive models, R/Python scripts, and even applications built on top of it. Our customers include some of the biggest companies with the most demanding data analysis needs. Free public and academic versions are coming (currently in public beta). Check it out: just click "LAUNCH" and drag-and-drop a CSV file to get started.
Datagrok helps you unlock the value of your organization's complex data by empowering non-technical users to discover, cleanse, visualize, explore, and model data themselves, and to share the results.
Enhance your company's ecosystem by managing connections to data, building data pipelines, keeping a repository of domain-related scripts, and defining ontologies. Harness the power of AI by letting computers learn from your data and your actions.
Finally, build reusable components and domain-specific applications on top of the platform.