Plumerai is making deep learning tiny and radically more computationally efficient to enable real-time inference on the edge – for automated warehouses, retail, smart cameras, micromobility and many more. We unlock huge value for all of these edge applications by using the most efficient form of deep learning: Binarized Neural Networks (BNNs). The activations and weights in these models are encoded not with 32, 16 or 8 bits, but with just 1 bit.
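To illustrate why 1-bit encoding is so efficient, here is a minimal sketch (illustrative only, not Plumerai's implementation): when every value is +1 or -1, a dot product collapses into an XNOR/XOR plus a population count on packed bit words, replacing many multiply-accumulates with a couple of cheap bitwise instructions.

```python
def binarize(values):
    # Map each real value to +1 or -1 (the single-bit encoding).
    return [1 if v >= 0 else -1 for v in values]

def pack_bits(signs):
    # Pack a list of +1/-1 values into one integer, 1 bit per value
    # (bit set means +1, bit clear means -1).
    word = 0
    for i, s in enumerate(signs):
        if s == 1:
            word |= 1 << i
    return word

def binary_dot(w_packed, x_packed, n):
    # For ±1 vectors of length n: dot(w, x) = n - 2 * popcount(w XOR x),
    # since XOR counts the positions where the signs disagree.
    return n - 2 * bin(w_packed ^ x_packed).count("1")

# Example: dot([+1, -1, +1], [+1, +1, -1]) = 1 - 1 - 1 = -1
w = pack_bits(binarize([1.5, -0.2, 0.7]))
x = pack_bits(binarize([0.3, 2.0, -1.0]))
print(binary_dot(w, x, 3))  # -1
```

Real inference libraries vectorize this over 32- or 64-bit words with hardware popcount instructions, which is where the large speed and memory savings come from.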
We develop Larq – an ecosystem of open-source Python packages for building, training and deploying BNNs. It integrates seamlessly with TensorFlow Keras, provides ready-to-use pretrained models and includes a highly optimized inference library for deploying BNNs on mobile and edge devices.
We perform world-class research on Binarized Neural Networks, developing state-of-the-art architectures and training algorithms.
Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization, NeurIPS 2019. https://papers.nips.cc/paper/8971-latent-weights-do-not-exist-rethinking-binarized-neural-network-optimization
Our team is backed by world-class investors with strong backgrounds in deep learning and track records of founding multi-billion-dollar chip and hardware companies.