Dream Simulation (failure)

Posted: October 24, 2013 at 10:40 am

After training the MLP, I thought I should try running it in feedback mode to see if it would actually be predictive. It is not. There are some shifts in input patterns at the start of the dream, but it takes only 8 iterations for the network to settle into reproducing a stable pattern. The following image shows the results of the first ~1000 iterations of the dream (truncated from 10,000 iterations). The left panel is the raw output while the right is the thresholded version. Note the slight changes in output patterns early in the sequence (far left edge of both panels).

[Figure: backgroundState_sequential_dream.results — raw (left) and thresholded (right) output over the dream sequence]
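For reference, the dream loop itself is simple: run the trained network on the current state, threshold the output, and feed it back in as the next input. The following is only a rough sketch using FANN's C API; the saved-network file name, the unit count, and the 0.5 threshold are placeholders for whatever the real setup uses, and the real run also logs the raw output at every iteration to build the image above.

```cpp
// dream_loop.cpp -- sketch of the feedback ("dream") loop, assuming the trained
// MLP was saved with fann_save() and has equal input and output sizes.
#include "fann.h"
#include <vector>

int main()
{
    const unsigned int NUM_UNITS  = 1000;    // inputs == outputs in this test setup
    const unsigned int ITERATIONS = 10000;

    // hypothetical file name for the trained network
    struct fann *ann = fann_create_from_file("backgroundState_mlp.net");
    if (!ann) return 1;

    // seed state; the real run would start from a recorded input pattern
    std::vector<fann_type> state(NUM_UNITS, 0.0f);

    for (unsigned int i = 0; i < ITERATIONS; ++i) {
        fann_type *out = fann_run(ann, &state[0]);   // one forward pass

        // feed the thresholded output back in as the next input
        // (0.5 assumes sigmoid outputs in [0,1]; adjust for the actual activation)
        for (unsigned int j = 0; j < NUM_UNITS; ++j)
            state[j] = (out[j] > 0.5f) ? 1.0f : 0.0f;

        // log the raw output here to build the dream image
    }

    fann_destroy(ann);
    return 0;
}
```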

I am using the MLP at this stage because it's the first of the usual suspects. It seems I have two choices to continue: (1) rather than feeding single input patterns, I present a window of concatenated input patterns (sketched below), or (2) I leave the MLP aside and move to an explicit RNN (or something else). The problem with #1 is that it requires me to explicitly define the size of the window, and considering there are likely to be as many as 2000 inputs for a single iteration (I'm working with 1000 in this test setup), even a small window of 3 time-slices would require a very large network with as many as 6000 inputs. The problem with #2 is that neither FANN nor OpenNN, the two dominant C++ ANN libraries, supports RNNs.
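Option (1) is at least mechanically straightforward; the cost is entirely in the resulting input dimensionality. A sketch of the kind of window concatenation it would require (window size and unit count are placeholders):

```cpp
// window_input.cpp -- sketch of option (1): concatenate the most recent `window`
// input patterns (oldest first) into one long vector for a plain feed-forward net.
// With ~2000 units per pattern and window = 3, the result has ~6000 components.
#include <cassert>
#include <deque>
#include <vector>

std::vector<float> build_window_input(const std::deque< std::vector<float> > &history,
                                      std::size_t window)
{
    assert(history.size() >= window);   // need at least `window` past patterns

    std::vector<float> concatenated;
    concatenated.reserve(window * history.back().size());

    // append the last `window` patterns end to end, oldest first
    for (std::size_t i = history.size() - window; i < history.size(); ++i)
        concatenated.insert(concatenated.end(), history[i].begin(), history[i].end());

    return concatenated;
}
```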

To use a proper RNN I would need to switch to another library, one that is likely to be less popular and to lack examples and support. The learning curve on these libraries is not trivial: I spent about four days with each of FANN and OpenNN before they started making sense and I could write working code. That investment makes me hesitant to jump into another library without being confident it will function as needed. I abandoned OpenNN due to its lack of flexibility (no sequential learning, and little support for feeding patterns from variables rather than data files). It seems I can either use one of the lesser-used C++ RNN libraries, or jump to a different language that I can embed in C++. There appear to be popular and actively maintained RNN libraries for Python and R, but obviously that adds complexity and could introduce performance issues. For now I'll do more reading and take a closer look at the other options to see which direction will be most effective.
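For the embedding route, the mechanics on the C++ side are at least well trodden. Below is a minimal sketch of what embedding the CPython interpreter looks like; the script it runs is a placeholder, and a real version would import a Python module hosting the RNN and pass state vectors back and forth.

```cpp
// embed_python.cpp -- sketch of embedding the CPython interpreter from C++,
// one route to reaching the Python RNN libraries. The Python side shown here
// is a placeholder; build flags typically come from python-config.
#include <Python.h>

int main()
{
    Py_Initialize();   // start the embedded interpreter

    // A real setup would import a module wrapping the RNN and call into it
    // with the current state vector; this is just a stand-in statement.
    PyRun_SimpleString("print('python-side RNN would run here')");

    Py_Finalize();
    return 0;
}
```

The open question with this approach is whether crossing between C++ and Python on every iteration stays cheap enough for runs of 10,000+ iterations.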