
StochasticNet in StochasticNet

Abstract

Deep neural networks have been shown to outperform conventional
state-of-the-art approaches in several structured prediction
applications. While high-performance computing devices such as
GPUs have made it possible to develop very powerful deep neural
networks, it is not feasible to run these networks on low-cost,
low-power computing devices such as embedded CPUs or even embedded
GPUs. As such, there has been significant recent interest in
producing efficient deep neural network architectures that can
run on small computing devices. Motivated by this, the idea of
StochasticNets was introduced, in which deep neural networks are
formed by leveraging random graph theory. StochasticNets have been
shown to form new networks with 2X or 3X architectural efficiency
while maintaining modeling accuracy. Motivated by these promising
results, we here investigate the idea of StochasticNet in
StochasticNet (SiS), in which highly efficient deep neural networks
with Network in Network (NiN) architectures are formed in a
stochastic manner. Such networks have an intertwined structure
composed of convolutional layers and micro neural networks that
boosts modeling accuracy. The experimental results show that SiS
can form deep neural networks with NiN architectures that have
4X greater architectural efficiency with only a 2% drop in
accuracy on the CIFAR10 dataset. The results are even more
promising on the SVHN dataset, where SiS formed deep neural
networks with NiN architectures that have 11.5X greater
architectural efficiency with only a 1% decrease in modeling accuracy.
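
To make the idea concrete, the following is a minimal sketch (in PyTorch)
of how a NiN-style block could be formed stochastically: each connection
(kernel weight) of the spatial convolution is realized with probability p,
following the random-graph formation that StochasticNets describe. The
class and function names (StochasticConv2d, sis_nin_block), the layer
sizes, and the choice of p are illustrative assumptions, not the authors'
implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class StochasticConv2d(nn.Conv2d):
        """Convolution whose weights are masked by a fixed random binary
        graph: each connection is realized with probability p. The graph
        is sampled once at network-formation time, not per forward pass."""
        def __init__(self, in_ch, out_ch, kernel_size, p=0.5, **kwargs):
            super().__init__(in_ch, out_ch, kernel_size, **kwargs)
            # Sample the random connectivity graph once (assumed Bernoulli(p)
            # per connection; the paper's exact graph model may differ).
            mask = (torch.rand_like(self.weight) < p).float()
            self.register_buffer("mask", mask)

        def forward(self, x):
            # Only the stochastically realized connections contribute.
            return F.conv2d(x, self.weight * self.mask, self.bias,
                            self.stride, self.padding, self.dilation,
                            self.groups)

    def sis_nin_block(in_ch, mid_ch, p=0.5):
        """One NiN-style block: a stochastically formed spatial convolution
        followed by a 1x1 'micro neural network' (kept dense here for
        simplicity)."""
        return nn.Sequential(
            StochasticConv2d(in_ch, mid_ch, kernel_size=3, padding=1, p=p),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid_ch, mid_ch, kernel_size=1),  # micro neural network
            nn.ReLU(inplace=True),
        )

    # Example: apply one block to a CIFAR10-sized input.
    x = torch.randn(1, 3, 32, 32)
    y = sis_nin_block(3, 64, p=0.5)(x)

Under this reading, lowering p trades modeling capacity for architectural
efficiency, since fewer realized connections mean fewer effective parameters
in each formed network.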
