
Algorithms and Tools


The algorithms in EdgeML are implemented in TensorFlow and PyTorch for Python and are hosted on GitHub. The repository also provides fast and scalable C++ implementations of Bonsai and ProtoNN. The common use cases are as follows:

  • Bonsai or ProtoNN: Can be used for traditional machine learning tasks with pre-computed features, such as gesture recognition (GesturePod), activity detection, and image classification. They can also replace bulky traditional classifiers, such as fully connected layers and RBF-SVMs, in ML pipelines.
  • EMI-RNN & FastGRNN: These complementary techniques can be applied to time-series classification tasks that require the model to learn feature representations, such as wake-word detection (keyword spotting), sentiment classification, and activity recognition. FastGRNN can be used as a cheaper alternative to LSTM and GRU cells in deep learning pipelines, while EMI-RNN provides a framework for computational savings using multi-instance learning.
  • SeeDot: Can be used to compile trained models into fixed-point code, so that inference runs efficiently on devices without floating-point hardware.

A brief introduction to these algorithms and tools is provided below.

  1. Bonsai: Bonsai is a shallow, strong, non-linear, tree-based classifier designed to solve traditional ML problems with models as small as 2 KB. Bonsai has logarithmic prediction complexity and can be trained end-to-end with deep learning models.
    [Paper @ ICML 2017] [Bibtex] [Poster] [Cpp code] [Tensorflow example] [PyTorch example] [Blog]
  2. ProtoNN: ProtoNN is a prototype-based k-nearest neighbors (kNN) classifier designed to solve traditional ML problems with models as small as 2 KB. ProtoNN can be trained end-to-end with deep learning models and has been deployed in GesturePod. A minimal sketch of its prototype-based scoring rule is given after this list.
    [Paper @ ICML 2017] [Bibtex] [Poster] [Cpp code] [Tensorflow example] [PyTorch example] [Blog]
  3. EMI-RNN: A training routine that recovers the critical signature from time-series data for faster and more accurate RNN predictions. EMI-RNN speeds up RNN inference by up to 72x compared to traditional implementations.
    [Paper @ NeurIPS 2018] [Bibtex] [Poster] [Tensorflow example] [PyTorch example] [Video]
  4. FastRNN & FastGRNN: Fast, Accurate, Stable and Tiny (gated) RNN cells that can be used in place of LSTM and GRU. FastGRNN can be up to 35x smaller and faster than LSTM and GRU on time-series classification problems, with models under 10 KB. A sketch of the FastGRNN cell update is given after this list.
    [Paper @ NeurIPS 2018] [Bibtex] [Poster] [Tensorflow example] [PyTorch example] [Video] [Blog]
  5. SeeDot: A floating-point to fixed-point quantization tool, including a new language and compiler. A sketch of the fixed-point representation it targets is given after this list.
    [Paper @ PLDI 2019] [Bibtex] [Code] [Video]
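To make ProtoNN's prototype-based scoring (item 2) concrete, here is a minimal NumPy sketch of the prediction rule from the ICML 2017 paper: the input is projected into a low-dimensional space, compared against learned prototypes with an RBF kernel, and each prototype's label vector is weighted by its similarity. The matrices, dimensions, and gamma below are random placeholders rather than trained parameters; for real training and inference, use the TensorFlow/PyTorch examples in the repository.

```python
import numpy as np

def protonn_predict(x, W, B, Z, gamma):
    """Sketch of ProtoNN scoring: score(x) = sum_j Z[:, j] * exp(-gamma^2 * ||W x - B[:, j]||^2).

    x     : (D,)   input feature vector
    W     : (d, D) learned low-dimensional projection (d << D)
    B     : (d, m) learned prototypes, one per column
    Z     : (L, m) learned label vectors, one per prototype
    gamma : RBF kernel width
    """
    wx = W @ x                                          # project input to d dimensions
    sq_dist = np.sum((B - wx[:, None]) ** 2, axis=0)    # squared distance to each prototype
    sim = np.exp(-(gamma ** 2) * sq_dist)               # RBF similarity to each prototype
    scores = Z @ sim                                    # blend label vectors by similarity
    return np.argmax(scores)                            # predicted class

# Toy usage with random (untrained) parameters -- illustration only.
rng = np.random.default_rng(0)
D, d, m, L = 32, 8, 10, 4
print(protonn_predict(rng.normal(size=D), rng.normal(size=(d, D)),
                      rng.normal(size=(d, m)), rng.normal(size=(L, m)), gamma=1.0))
```

Because W, B and Z are all small, low-rank matrices, the whole model can fit in a few kilobytes while prediction is a handful of matrix-vector products.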
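The FastGRNN cell (item 4) adds a single gate and two trained scalars to a plain RNN update while sharing one pair of weight matrices between the gate and the candidate state, which is where its size advantage over LSTM/GRU comes from. Below is a minimal NumPy sketch of one time step following the update equations in the NeurIPS 2018 paper; the weights are random placeholders, and the actual trainable cells live in the repository's TensorFlow and PyTorch packages.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def fastgrnn_step(x_t, h_prev, W, U, b_z, b_h, zeta, nu):
    """One FastGRNN step:
        z_t  = sigmoid(W x_t + U h_{t-1} + b_z)
        hc_t = tanh   (W x_t + U h_{t-1} + b_h)
        h_t  = (zeta * (1 - z_t) + nu) * hc_t + z_t * h_{t-1}
    W and U are shared between the gate and the candidate state.
    """
    pre = W @ x_t + U @ h_prev
    z = sigmoid(pre + b_z)            # gate
    h_cand = np.tanh(pre + b_h)       # candidate state
    return (zeta * (1.0 - z) + nu) * h_cand + z * h_prev

# Toy run over a short sequence with random (untrained) parameters.
rng = np.random.default_rng(0)
input_dim, hidden_dim, T = 16, 8, 5
W = rng.normal(size=(hidden_dim, input_dim))
U = rng.normal(size=(hidden_dim, hidden_dim))
b_z, b_h = np.zeros(hidden_dim), np.zeros(hidden_dim)
h = np.zeros(hidden_dim)
for t in range(T):
    h = fastgrnn_step(rng.normal(size=input_dim), h, W, U, b_z, b_h, zeta=1.0, nu=1e-4)
print(h)
```

Dropping the gate and using h_t = alpha * tanh(W x_t + U h_{t-1} + b) + beta * h_{t-1} with two scalars alpha, beta gives the simpler FastRNN cell.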
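SeeDot (item 5) is a full language and compiler, so it cannot be reduced to a snippet, but the fixed-point representation it targets is easy to illustrate. The sketch below quantizes a weight matrix and an input to integers with power-of-two scales, performs the dot product in integer arithmetic, and rescales the result once at the end. The `to_fixed` helper and its max-magnitude scale heuristic are illustrative assumptions, a simplification of the scale selection the actual compiler performs.

```python
import numpy as np

def to_fixed(a, bits=16):
    """Quantize a float array to integers, representing a as round(a * 2**exp)."""
    max_mag = np.max(np.abs(a))
    # Largest exponent such that max_mag * 2**exp still fits in a signed `bits`-bit integer.
    exp = int(np.floor(np.log2((2 ** (bits - 1) - 1) / max_mag)))
    # int64 storage so the later integer matmul cannot overflow its accumulator.
    return np.round(a * (2.0 ** exp)).astype(np.int64), exp

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))          # "model" weights
x = rng.normal(size=8)               # input features

Wq, eW = to_fixed(W)
xq, ex = to_fixed(x)

y_fixed = (Wq @ xq) / (2.0 ** (eW + ex))   # integer matmul, then a single rescale
y_float = W @ x

print(np.max(np.abs(y_fixed - y_float)))   # quantization error stays small
```

On a device without floating-point hardware the rescale would also be done with integer shifts, which is exactly the kind of code SeeDot generates.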

All of the above algorithms and tools are aimed at enabling machine learning inference on the edge devices that form the backbone of the Internet of Things (IoT).