It’s been a while! I have been working on AI related stuff of course, but what exactly I have been spending the bulk of my time on will be revealed in the near future.
For now though, I would like to show a simple little demo I made for showing the generative characteristics of SDRs (Sparse Distributed Representations).
In terms of generative models, Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs) currently seem to be among the most popular.
I have come up with another, relatively simple way of building a generative model. It’s based on K-sparse autoencoders (here). SDRs are implicitly generative, as they force a certain (binary) distribution on the hidden units. With a couple of modifications, K-sparse autoencoders can be used to learn SDRs: the hidden units are made binary (the top K are set to 1, the rest to 0), and training proceeds by minimizing the reconstruction error with tied weights.
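To make the modified K-sparse autoencoder concrete, here is a minimal NumPy sketch. The sizes, learning rate, and variable names are my own illustrative choices, not taken from the original code; the gradient step simply ignores the non-differentiable top-K selection and updates through the decoder path only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (the post's MNIST demo uses 256 hidden units).
n_input, n_hidden, k = 64, 256, 16

# Tied weights: W encodes, W.T decodes.
W = rng.normal(0.0, 0.1, size=(n_hidden, n_input))

def encode(x):
    """Binary SDR: the K units with the largest pre-activation fire."""
    pre = W @ x
    h = np.zeros(n_hidden)
    h[np.argpartition(pre, -k)[-k:]] = 1.0
    return h

def train_step(x, lr=0.01):
    """One SGD step minimizing reconstruction error with tied weights."""
    global W
    h = encode(x)
    x_hat = W.T @ h               # decode with the transposed weights
    err = x - x_hat               # reconstruction error
    # Gradient of 0.5*||err||^2 through the decoder path only
    # (the binary top-K selection is non-differentiable).
    W += lr * np.outer(h, err)
    return float(np.sum(err ** 2))

x = rng.random(n_input)
before = train_step(x)
for _ in range(100):
    after = train_step(x)
```

Since only K rows of W participate in each reconstruction, each step shrinks the error on the active set geometrically, so the reconstruction error drops quickly even on a single sample.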
With these modifications, one can learn some nice features from data. It is also possible to control certain features by forcing a correlation between a feature and an input pattern. I did this with two networks: the K-sparse autoencoder (with binary states + reconstruction), and a random projection network that maps “control” features to random biases on the autoencoder’s hidden units.
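The two-network setup can be sketched as follows. This is my own minimal interpretation, with illustrative sizes and weight scales: a fixed random projection adds biases to the hidden pre-activations, which shifts which units make it into the top K.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes; names are mine, not from the original code.
n_input, n_hidden, n_control, k = 64, 256, 10, 16

W = rng.normal(0.0, 0.1, size=(n_hidden, n_input))   # tied autoencoder weights
# Fixed random projection: control features -> biases on the hidden units.
# The weights are relatively large so a bias can push units past the top-K cutoff.
P = rng.normal(0.0, 2.0, size=(n_hidden, n_control))

def encode(x, c):
    """Binary top-K code; the control vector c biases which units win."""
    pre = W @ x + P @ c
    h = np.zeros(n_hidden)
    h[np.argpartition(pre, -k)[-k:]] = 1.0
    return h

def decode(h):
    return W.T @ h   # tied weights: decode with the transpose

x = rng.random(n_input)
h_plain = encode(x, np.zeros(n_control))     # no control bias
h_biased = encode(x, np.eye(n_control)[3])   # control feature 3 active
```

With the bias active, a different set of K units wins the competition, so the same input yields a different SDR; the autoencoder is then trained to reconstruct under that bias.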
The resulting system learns to incorporate the controlled features into the hidden features, so that reconstructing from a biased set of features produces an image with the desired properties.
So let’s try the standard MNIST test. The control features are a 1-hot vector of length 10 (one for each digit), while the hidden size is 256 units. The random projections were initialized to relatively high weights to overcome the K-sparsity’s thresholding. After training for a few seconds, we can see the results:
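Putting the pieces together, here is an end-to-end sketch of the training and generation loop under the same assumptions as above. I substitute random binary prototypes for the actual MNIST digits to keep it self-contained; the sizes match the post (10 control features, 256 hidden units), while the projection scale and learning rate are guesses.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for MNIST: 10 fixed random binary prototype "digits".
n_input, n_hidden, n_control, k, lr = 64, 256, 10, 16, 0.01
prototypes = (rng.random((n_control, n_input)) > 0.5).astype(float)

W = rng.normal(0.0, 0.1, size=(n_hidden, n_input))
# High-magnitude random projections, to overcome the K-sparsity's thresholding.
P = rng.normal(0.0, 5.0, size=(n_hidden, n_control))

def encode(x, c):
    pre = W @ x + P @ c
    h = np.zeros(n_hidden)
    h[np.argpartition(pre, -k)[-k:]] = 1.0
    return h

# Train: reconstruct each prototype while its 1-hot label biases the code.
for _ in range(200):
    for d in range(n_control):
        c = np.eye(n_control)[d]
        x = prototypes[d]
        h = encode(x, c)
        err = x - W.T @ h
        W += lr * np.outer(h, err)

# Generate: feed only the control bias (blank input) and decode.
gen = W.T @ encode(np.zeros(n_input), np.eye(n_control)[3])
```

Because the strong bias largely fixes which units fire for each label, decoding from the bias alone recalls something close to the corresponding training pattern, which is what makes the scheme generative.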
I applied a thresholding shader to make it look a bit prettier, although it does cut off some of the transitions.
If you would like to experiment with GSDR (generative SDR) yourself, here is the code:
Until next time!