Akvelon

Joined in 2019
Akvelon is an outsourcing company that has been developing projects for clients around the world since 2000, from small start-ups to Fortune 500 companies. From our 13 offices worldwide, we create software for large enterprise products and ML applications.

The development office in Ukraine (Kharkiv) was opened in November 2008 and currently employs more than 300 people. Akvelon Ukraine now has offices in Kharkiv, Dnipro, Lviv, Ivano-Frankivsk, and Gdansk. During 2021 alone, our staff grew by 49%!

Companies such as Microsoft, Reddit, LinkedIn, GitHub, Amazon, Pinterest, Airbnb, Starbucks, T-Mobile, Intel, Nokia, Tideworks, Dropbox, and many more have greatly benefited from working with Akvelon's talented employees.

We ๐—ผ๐—ณ๐—ณ๐—ฒ๐—ฟ ๐—ฎ ๐—ฐ๐—ต๐—ฎ๐—ป๐—ฐ๐—ฒ ๐˜๐—ผ ๐—ฏ๐—ฒ ๐—ฟ๐—ฒ๐—น๐—ผ๐—ฐ๐—ฎ๐˜๐—ฒ๐—ฑ ๐˜๐—ผ ๐˜๐—ต๐—ฒ USA ๐˜๐—ผ ๐—ผ๐˜‚๐—ฟ ๐—ฐ๐˜‚๐˜€๐˜๐—ผ๐—บ๐—ฒ๐—ฟ๐˜€โ€™ ๐—›๐—ค.

Akvelon is about socially significant projects, career growth, development across a variety of stacks, a culture of healthy communication and empathy, innovative technologies, and a flexible approach to work. We don't hire candidates for a single project; we bring people into a company where there is always an opportunity to grow and develop effectively in tandem with the team.

    Data Engineer with Python/Spark skills

    Full Remote · Worldwide · 3 years of experience · Upper-Intermediate

    🌎 Akvelon is a well-known US company with offices in Seattle, Mexico, Ukraine, Poland, and Serbia. Our company is an official vendor of Microsoft and Google. Our clients also include Amazon, Evernote, Intel, HP, Reddit, Pinterest, AT&T, T-Mobile, Starbucks, and LinkedIn. Working with Akvelon means being connected with the best and brightest engineering teams from around the globe and using a modern technology stack to build Enterprise, CRM, LOB, Cloud, AI and Machine Learning, cross-platform, mobile, and other types of applications customized to each client's needs and processes.

    We are looking for a Data Engineer with Python/Spark skills to join the Data Platform Team on a 3-month contract basis.

     

    About the Project

    Our client is a leading provider of innovative software solutions for terminal operating systems and logistics management. Its products help ports, terminals, and intermodal facilities optimize cargo movement, improve operational efficiency, and streamline supply chain processes. The platform offers data-driven solutions for real-time container tracking, yard management, vessel and rail planning, and automated workflows, enabling businesses to handle increasing cargo volumes with greater accuracy and speed.

     

    Responsibilities:

    • Develop, maintain, and optimize ETL pipelines using Python and Apache Spark.
    • Implement data transformations, cleansing, and enrichment processes to support business needs.
    • Work with large-scale distributed data processing using Apache Spark.
    • Design and optimize SQL databases, ensuring efficient data modeling and querying.
    • Work with Kubernetes, ensuring smooth deployment and management of containerized applications.
    • Analyze and document data mapping and data journey workflows.
    • Work with messaging architectures and platforms, understanding their advantages, limitations, and best use cases.
    • Collaborate in an Agile Scrum environment, following industry-standard SDLC practices.
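    As an illustration of the transform, cleanse, and enrich steps named above, here is a minimal sketch in plain Python. All field names (`container_id`, `terminal_id`) and the lookup table are hypothetical examples; on the project itself these steps would run at scale on Apache Spark DataFrames rather than Python lists.

    ```python
    # Hypothetical ETL sketch: extract -> cleanse -> enrich.
    # Field names are illustrative, not taken from the project.

    def extract(rows):
        """Extract: yield raw records (an in-memory stand-in for a real source)."""
        yield from rows

    def cleanse(record):
        """Cleanse: drop records missing a container id; normalize whitespace/casing."""
        container_id = (record.get("container_id") or "").strip()
        if not container_id:
            return None  # data-quality rule: reject incomplete records
        return {**record, "container_id": container_id.upper()}

    def enrich(record, terminal_lookup):
        """Enrich: join in the terminal name from a reference table."""
        terminal = terminal_lookup.get(record.get("terminal_id"), "UNKNOWN")
        return {**record, "terminal": terminal}

    def run_pipeline(rows, terminal_lookup):
        """Run the full pipeline over an iterable of raw records."""
        out = []
        for raw in extract(rows):
            clean = cleanse(raw)
            if clean is None:
                continue  # filtered out during cleansing
            out.append(enrich(clean, terminal_lookup))
        return out

    raw_rows = [
        {"container_id": " msku123 ", "terminal_id": "T1"},
        {"container_id": "", "terminal_id": "T2"},  # dropped: empty id
    ]
    lookup = {"T1": "North Yard"}
    print(run_pipeline(raw_rows, lookup))
    ```

    In Spark, `cleanse` would typically become a `filter` plus `withColumn` transformation and `enrich` a broadcast join against the reference table, but the pipeline shape is the same.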

       

    Requirements:

    • Strong experience in Python for production environments.
    • Proven hands-on experience with Apache Spark (batch and streaming processing).
    • Solid understanding of ETL processes, data transformation, and pipeline optimization.
    • Experience working with large-scale distributed data systems.
    • Proficiency with Kubernetes and familiarity with kubectl.
    • Expertise in SQL, data modeling, and database design.
    • Understanding of messaging architectures and their trade-offs.
    • Experience with data mapping and documentation methods.
    • Previous experience working in an Agile Scrum environment.

     

    Ready to take the next step? Apply now! 🚀
