New number formats for faster deep learning

Exploring the use of POSIT numbers for adaptive precision schemes in deep learning algorithms.
Description of the Project: 

The training phase in Deep Learning is highly compute- and data-intensive; its efficiency therefore typically restricts the quality of results that can be achieved within a given time frame.

Many learning algorithms are dominated by the speed at which data can be brought to the CPU, i.e., by the memory bandwidth of the executing hardware. Consequently, techniques based on reduced-precision number representations have been shown to produce faster results without a significant loss in the quality of results.
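
As a concrete illustration of the bandwidth argument (a minimal sketch assuming NumPy; float16 stands in here for any reduced-precision format), halving the element width halves the bytes that must cross the memory bus for the same number of values, at the cost of rounding error:

```python
import numpy as np

n = 1_000_000
w32 = np.random.rand(n).astype(np.float32)   # 4 bytes per value
w16 = w32.astype(np.float16)                 # 2 bytes per value

print(w32.nbytes, "bytes at fp32")           # 4000000
print(w16.nbytes, "bytes at fp16")           # 2000000

# The quality cost is the rounding error introduced by the narrower format:
err = np.max(np.abs(w32 - w16.astype(np.float32)))
print("max rounding error:", err)
```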

A new number format, named POSIT, promises better accuracy than the ubiquitous IEEE floating-point numbers.
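
Unlike IEEE floats, a posit encodes a variable-length "regime" field after the sign bit, trading fraction bits for dynamic range; the value is (-1)^sign * useed^k * 2^e * (1 + f), with useed = 2^(2^es). The following minimal sketch (no posit library is assumed; `decode_posit` is a hypothetical helper) decodes a posit<8,1> bit pattern into a Python float according to these published rules:

```python
def decode_posit(bits: int, n: int = 8, es: int = 1) -> float:
    """Decode an n-bit posit with es exponent bits into a Python float.

    An illustrative sketch of the posit decoding rules (sign, regime,
    exponent, fraction); not a production implementation.
    """
    mask = (1 << n) - 1
    bits &= mask
    if bits == 0:
        return 0.0
    if bits == 1 << (n - 1):              # the single NaR (Not a Real) pattern
        return float("nan")

    sign = bits >> (n - 1)
    if sign:                              # negative posits: two's complement first
        bits = (-bits) & mask

    body = bits & ((1 << (n - 1)) - 1)    # the n-1 bits after the sign
    first = (body >> (n - 2)) & 1         # leading regime bit
    run, pos = 1, n - 3
    while pos >= 0 and ((body >> pos) & 1) == first:
        run += 1                          # length of the run of identical bits
        pos -= 1
    k = run - 1 if first else -run        # regime value

    rem = max(n - 2 - run, 0)             # bits left after regime + terminator
    e_avail = min(es, rem)                # exponent bits actually present
    exp = ((body >> (rem - e_avail)) & ((1 << e_avail) - 1)) << (es - e_avail)

    f_len = rem - e_avail                 # fraction bits, implicit leading 1
    frac = body & ((1 << f_len) - 1)

    scale = k * (1 << es) + exp           # total power of two: k * 2^es + e
    value = (1 + frac / (1 << f_len)) * 2.0 ** scale
    return -value if sign else value


# posit<8,1> spot checks: 0x40 -> 1.0, 0x60 -> 4.0, 0x7F -> 4096.0 (maxpos)
assert decode_posit(0x40) == 1.0
assert decode_posit(0x60) == 4.0
assert decode_posit(0x7F) == 4096.0
```

The tapered layout is what makes posits attractive for deep learning: values near 1.0 get more fraction bits (hence more accuracy) than IEEE formats of the same width, while extreme magnitudes trade accuracy for range.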

This project investigates how POSIT numbers can be leveraged to improve the performance of Deep Learning algorithms.
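
One plausible starting point is a mixed-precision training loop in the style of master-weight schemes. The sketch below (assuming NumPy; float16 again stands in for a low-precision posit type, since no posit library is assumed) keeps a high-precision master copy of the weights while the bandwidth-heavy forward pass runs in the narrow format:

```python
import numpy as np

rng = np.random.default_rng(0)
master_w = rng.standard_normal(4).astype(np.float32)   # high-precision master weights
x = rng.standard_normal(4).astype(np.float16)          # low-precision activations
target = np.float32(1.0)
lr = 0.1

w16 = master_w.astype(np.float16)                  # narrow copy used for compute
pred = np.float32(w16 @ x)                         # forward pass in low precision
grad = 2 * (pred - target) * x.astype(np.float32)  # gradient accumulated wide

master_w -= lr * grad                              # update applied to the master copy
print("prediction:", pred, "updated weights:", master_w)
```

An adaptive-precision scheme in the project's sense could then vary the posit width per layer or per training phase instead of fixing it globally.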

Resources required: 
POSIT hardware (sponsored by Vivid Sparks)
Project number: 
400001
First Supervisor: 
University: 
Heriot-Watt University
First supervisor university: 
Heriot-Watt University
Essential skills and knowledge: 
Strong system software programming skills.
Desirable skills and knowledge: 
Advanced type systems; Compiler construction; Code generation.