Evolving GPU-Accelerated Capsule Networks
Date
2018
Type
Thesis
Department
Computer Science and Engineering
Degree Level
Master's Degree
Abstract
Capsule Networks exploit a new, vector-based perceptron model, providing feature instantiation parameters on top of feature existence probabilities. With these vectors, simple scalar operations are elaborated into vector-matrix multiplication and multi-vector weighted reduction. Capsule Networks include convolutional layers that transform the initial input into a tensor. A novel data abstraction maps the individual values of this tensor to one-dimensional arrays while conceptualizing them as a 2D grid of multi-dimensional elements. Moreover, the loss function thresholds and architectural dimensions were set arbitrarily when Capsule Networks were introduced. While current machine learning libraries provide abstractions for convolutional layers, a full Capsule Network implementation in TensorFlow requires structural overhead, and such libraries lack simple optimizations tailored to Capsule Network data allocation. This thesis presents a scalable GPU optimization for the training and evaluation of Capsule Networks. Furthermore, hyperparameters are adjusted with a multi-objective evolutionary algorithm (MOEA) to minimize the loss function while maximizing accuracy.
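To make the data abstraction in the abstract concrete, here is a minimal CUDA sketch, not the thesis implementation: it assumes a GRID_H × GRID_W grid of CAPS_DIM-dimensional capsule vectors stored contiguously in a flat one-dimensional array, and a hypothetical kernel performing the per-capsule vector-matrix prediction step the abstract describes. All names and dimensions below are illustrative assumptions.

```cuda
// Hypothetical sketch (not the thesis code): a GRID_H x GRID_W grid of
// CAPS_DIM-dimensional capsule vectors held in one flat array, plus the
// vector-matrix "prediction" step u_hat = W * u for each capsule.
#include <cuda_runtime.h>

#define GRID_H   6   // assumed grid height
#define GRID_W   6   // assumed grid width
#define CAPS_DIM 8   // assumed input capsule vector length
#define OUT_DIM  16  // assumed output capsule vector length

// Map (row, col, dim) in the conceptual 2D grid of vectors to a flat index.
__host__ __device__ inline int flatIdx(int row, int col, int dim) {
    return (row * GRID_W + col) * CAPS_DIM + dim;
}

// One block per capsule, one thread per output dimension:
// uHat[cap][j] = sum_k W[cap][j][k] * u[cap][k]
__global__ void predictionKernel(const float* __restrict__ u,
                                 const float* __restrict__ W,
                                 float* __restrict__ uHat)
{
    int cap = blockIdx.x;          // flattened (row, col) capsule index
    int row = cap / GRID_W;
    int col = cap % GRID_W;
    int j   = threadIdx.x;         // output vector dimension
    if (cap >= GRID_H * GRID_W || j >= OUT_DIM) return;

    const float* Wc = W + cap * OUT_DIM * CAPS_DIM;  // this capsule's matrix
    float acc = 0.0f;
    for (int k = 0; k < CAPS_DIM; ++k)
        acc += Wc[j * CAPS_DIM + k] * u[flatIdx(row, col, k)];
    uHat[cap * OUT_DIM + j] = acc;
}
```

Launching one block per capsule and one thread per output dimension, e.g. `predictionKernel<<<GRID_H * GRID_W, OUT_DIM>>>(d_u, d_W, d_uHat);`, keeps reads contiguous along the flat array, which is the motivation for flattening the conceptual 2D grid in the first place.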
Permanent link
http://hdl.handle.net/11714/4551
Additional Information
| Committee Member | Latourrette, Nancy; Mortensen, Jeff |
| --- | --- |
| Rights | Creative Commons Attribution-ShareAlike 4.0 United States |
| Rights Holder | Author(s) |