PR Volt's mission is to make PR more affordable, efficient, and transparent for small businesses. At PR Volt, we use technology and automation to streamline the entire PR process, from the way we interact with clients to the way we target and reach out to journalists. Core to our focus is eliminating manual touchpoints that have historically added cost without adding value.
We're applying the latest in machine learning and AI to develop innovative ways to generate press coverage and level the playing field for our customers.
PR Volt is a simpler, higher quality, lower cost and more scalable approach to PR than building out your own internal PR team or hiring an expensive agency.
Website: prvolt.com
Senior Python Developer (Team Lead)
Full Remote · Worldwide · 5 years of experience · C1 - Advanced

PR Volt is at the forefront of public relations. We leverage cutting-edge technology and automation to deliver impactful media coverage more efficiently and affordably. If you're passionate about innovation, efficiency, and helping businesses succeed, PR Volt is the place for you!
We are seeking a proactive and results-oriented Senior Python Developer (Team Lead) to join our Engineering team. In this role, you will be instrumental in expanding our core media database capabilities, architecting robust and scalable systems, and extracting structured data from the vast and ever-evolving web landscape.

Key Responsibilities:
- Lead the design and implementation of highly efficient, large-scale web scraping systems capable of monitoring and acquiring data from millions of web pages (a minimal sketch of such a worker follows this list).
- Develop and deploy sophisticated techniques to effectively bypass anti-scraping mechanisms such as FunCaptcha, reCAPTCHA, Distil, and Cloudflare at scale, utilizing advanced open-source solutions.
- Architect and manage data storage and processing solutions on physical hardware, focusing on cost-efficiency and handling terabytes of incoming and stored data.
- Apply strong architectural principles and design patterns to ensure the development of clean, maintainable, and highly performant backend code.
- Provide technical leadership, code review, and mentorship to junior members of the engineering team.
- Integrate newly acquired data seamlessly with our existing internal data platforms and sources, ensuring data consistency and accessibility for downstream applications.
- Continuously research and implement new technologies and methodologies to enhance our data acquisition capabilities and overcome evolving web challenges.
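To give a concrete flavor of the first responsibility above, here is a minimal sketch of a concurrency-limited fetch worker in Python. The asyncio/aiohttp stack, the URL, and the limits are illustrative assumptions, not PR Volt's actual pipeline:

import asyncio
import aiohttp

CONCURRENCY = 50  # illustrative cap; tune per target site and hardware
TIMEOUT = aiohttp.ClientTimeout(total=15)

async def fetch(session: aiohttp.ClientSession, sem: asyncio.Semaphore, url: str) -> str | None:
    # Bound the number of in-flight requests so millions of URLs
    # can be queued without exhausting sockets or memory.
    async with sem:
        try:
            async with session.get(url, timeout=TIMEOUT) as resp:
                if resp.status == 200:
                    return await resp.text()
        except (aiohttp.ClientError, asyncio.TimeoutError):
            pass  # a production worker would log and schedule a retry here
    return None

async def crawl(urls: list[str]) -> list[str | None]:
    sem = asyncio.Semaphore(CONCURRENCY)
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(*(fetch(session, sem, u) for u in urls))

if __name__ == "__main__":
    pages = asyncio.run(crawl(["https://example.com"]))
    print(sum(p is not None for p in pages), "pages fetched")

At production scale, a loop like this would sit behind a persistent queue and be extended with per-domain rate limits, retries, and proxy rotation.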
Requirements:
- Demonstrated deep expertise in Python development.
- Extensive, up-to-date experience in web scraping at scale (e.g., Selenium; see the browser-automation sketch after this list).
- Hands-on experience working with and optimizing infrastructure on physical hardware, with a clear understanding of managing costs associated with large-scale data storage and processing (terabytes).
- A clear understanding and practical application of software design patterns, clean code principles, and experience building highly maintainable, scalable, and resilient systems.
- Exceptional analytical and problem-solving skills, with the ability to tackle complex, ambiguous technical challenges independently.
- Experience coaching junior developers and conducting thorough code reviews.
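For the Selenium requirement above, a minimal headless-browser fetch might look like the sketch below; it is useful for pages that only render their content via JavaScript. A local Chrome/chromedriver installation is assumed, and the options shown are illustrative rather than a prescribed configuration:

from selenium import webdriver
from selenium.webdriver.chrome.options import Options

def render_page(url: str) -> str:
    opts = Options()
    opts.add_argument("--headless=new")  # run without a display
    opts.add_argument("--disable-gpu")
    driver = webdriver.Chrome(options=opts)
    try:
        driver.get(url)
        return driver.page_source  # HTML after JavaScript execution
    finally:
        driver.quit()  # always release the browser process

if __name__ == "__main__":
    print(len(render_page("https://example.com")), "characters rendered")

At scale, workers like this are typically pooled and driven from a job queue, since each browser instance is far heavier than a plain HTTP request.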
Preferred Qualifications (Nice-to-Haves):
- Familiarity with building AI-powered features and working with AI/ML libraries.
- Significant understanding of, or experience with, frontend technologies (e.g., React).
- Experience with cloud providers (AWS, Azure, GCP) in addition to physical hardware, including an understanding of their cost implications for data-intensive workloads.