BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:America/Chicago
X-LIC-LOCATION:America/Chicago
BEGIN:DAYLIGHT
TZOFFSETFROM:-0600
TZOFFSETTO:-0500
TZNAME:CDT
DTSTART:19700308T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0500
TZOFFSETTO:-0600
TZNAME:CST
DTSTART:19701101T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20210808T235336Z
LOCATION:Room B
DTSTART;TZID=America/Chicago:20210809T094500
DTEND;TZID=America/Chicago:20210809T100000
UID:icpp_ICPP 2021_sess137_ems_pap109@linklings.com
SUMMARY:Accelerating Neural Network Training using Arbitrary Precision App
 roximating Matrix Multiplication Algorithms
DESCRIPTION:EMS Workshop\n\nAccelerating Neural Network Training using Arb
 itrary Precision Approximating Matrix Multiplication Algorithms\n\nBallard
 , Weissenberger, Zhang\n\nMatrix multiplication is one of the bottleneck c
 omputations for training the weights within deep neural networks. To speed
  up the training phase, we propose to use faster algorithms for matrix mul
 tiplication known as Arbitrary Precision Approximating (APA) algorithms. A
 PA algorithms perform asymptotically fewer arithmetic operations than the
  classical algorithm, but they compute an approximate result with an error
  that can be made arbitrarily small in exact arithmetic. Practical APA algo
 rithms provide significant reduction in computation time and still provide
  enough accuracy for many applications like neural network training. We de
 monstrate that APA algorithms can be efficiently implemented and paralleli
 zed for multicore CPUs to obtain up to 28% and 21% speedups over the faste
 st implementation of the classical algorithm using one core and 12 cores,
  respectively. Furthermore, using these algorithms to train a Multi-Layer P
 erceptron (MLP) network yields no significant reduction in the training or
  testing error. Our performance results on a large MLP network show overal
 l sequential and multithreaded performance improvements of up to 25% and 1
 3%, respectively.
END:VEVENT
END:VCALENDAR