
MNIST Experiments - Playing With Different Architectures

Leonardo M. Rocha


Sometimes we need to come back to the basics; this is the place I chose for that.

Here I'll experiment with different networks on the MNIST dataset and its variants, trying to find structures that reduce the number of parameters compared with a Fully Connected (FC) network.

Later on, I might try other datasets that are small enough for my GTX1080.

Yes, I know, the problem is already solved for images with Convolutional Networks, but that is not what I want to see. Instead, I want to understand ways in which fully connected networks can be replaced by other types of connections to minimize the number of parameters. This is exploratory work to get a deeper understanding of Neural Networks (NNs), and it will at least give me some fun.

ColumnNet

ColumnNet experiments

A neural network made up of several sub-networks, each forming a column; each column can use a different activation unit.
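To make the idea concrete, here is a minimal sketch of a ColumnNet in PyTorch. This is only an illustration under my own assumptions, not the project's actual code: the class name, layer sizes, choice of activations, and the decision to merge columns by summing their outputs are all placeholders.

    # Minimal ColumnNet sketch (assumed PyTorch; names and sizes are illustrative).
    import torch
    import torch.nn as nn


    class ColumnNet(nn.Module):
        """Several independent columns over the same input; each column is a small
        fully connected network with its own activation unit. Column outputs are
        summed here, but concatenating or averaging them would also work."""

        def __init__(self, in_features=28 * 28, hidden=128, n_classes=10,
                     activations=(nn.ReLU, nn.Tanh, nn.Sigmoid)):
            super().__init__()
            # One column per activation type.
            self.columns = nn.ModuleList(
                nn.Sequential(
                    nn.Linear(in_features, hidden),
                    act(),
                    nn.Linear(hidden, n_classes),
                )
                for act in activations
            )

        def forward(self, x):
            x = x.flatten(start_dim=1)  # (batch, 1, 28, 28) -> (batch, 784)
            # Run every column on the same input and merge by summation.
            return torch.stack([col(x) for col in self.columns]).sum(dim=0)


    # Quick shape check on a fake MNIST batch.
    logits = ColumnNet()(torch.randn(32, 1, 28, 28))
    print(logits.shape)  # torch.Size([32, 10])

With this layout, the parameter count grows with the number and width of the columns, which is exactly the kind of knob these experiments are meant to compare against a single FC network.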

Fully Connected ColumnNets