
Binary Quantizer

Abstract

One-bit quantization is a general tool for executing a complex model,
such as a deep neural network, on a device with limited resources,
such as a cell phone. Naively compressing weights into one bit
yields a substantial loss of accuracy; one-bit models therefore
require careful re-training. Here we introduce a class of functions
devised to be used as a regularizer for re-training one-bit models.
Using a regularization function specifically devised for binary
quantization avoids heuristic adjustments to the optimization scheme
and saves considerable coding effort.
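
The abstract does not spell out the proposed class of functions, but the general pattern it describes, adding a binarization-promoting regularizer to the task loss during re-training so the optimizer itself is left untouched, can be sketched as follows. This is a minimal illustration assuming a common choice of penalty, R(w) = (|w| - 1)^2, which pulls each weight toward {-1, +1}; it is not necessarily the regularizer introduced in the paper, and the names and hyperparameters are placeholders.

```python
import torch
import torch.nn as nn

def binary_regularizer(params, strength=1e-4):
    """Illustrative penalty pulling each weight toward {-1, +1}.

    Uses R(w) = (|w| - 1)^2 as a generic example of a regularizer
    devised for binary quantization; the paper's actual class of
    functions may differ.
    """
    reg = 0.0
    for p in params:
        reg = reg + ((p.abs() - 1.0) ** 2).sum()
    return strength * reg

# Usage in an ordinary training step: the regularizer is simply added
# to the task loss, so no change to the optimizer is required.
model = nn.Linear(16, 4)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(8, 16)               # dummy batch of inputs
y = torch.randint(0, 4, (8,))        # dummy labels

loss = criterion(model(x), y) + binary_regularizer(model.parameters())
optimizer.zero_grad()
loss.backward()
optimizer.step()
```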
