~150,000 frame test with live camera

Posted: June 5, 2013 at 2:58 pm

In preparing for the exhibition of a prototype of the system at Creativity and Cognition, I’ve been running the system with live camera input. The good news is that the system is looking aesthetically interesting (sorry, no screen-grabs are currently available). The bad news is that this testing seems to have uncovered a memory leak somewhere, as shown by the continued increase in memory usage even after the maximum number of clusters has been reached for both FG and BG:
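One simple way to confirm a leak like this is to log memory usage every frame and fit a trend line to the samples taken after the cluster cap is hit; a persistently positive slope suggests unreclaimed allocations rather than normal growth. A minimal sketch of that check (the sampling itself and the slope threshold are my assumptions, not the system's actual instrumentation):

```python
def leak_slope(samples):
    """Least-squares slope of a memory-usage series (units per frame).

    A slope near zero after the cluster cap means memory has plateaued;
    a persistently positive slope is consistent with a leak.
    """
    n = len(samples)
    mean_x = (n - 1) / 2
    mean_y = sum(samples) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(samples))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Synthetic example: one series that keeps creeping up after the cap,
# one that stays flat.
leaking = [1000 + 2 * f for f in range(100)]  # grows ~2 units/frame
stable = [1000] * 100
print(leak_slope(leaking))  # 2.0 -> suspect a leak
print(leak_slope(stable))   # 0.0 -> memory has plateaued
```

In practice the samples would come from the process's resident set size (e.g. `/proc/self/status` on Linux), read at a fixed frame interval to keep the logging overhead negligible.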

[Image: 148456_debug_1000bgc_2000fgc_cameraA]

Note the clear day/night cycles of activity that match the mean brightness. The following image shows the state of the system over the first ~40,000 frames of the test. (I have not yet tried to fit all ~150,000 frames into a single image.) The X-axis is time, with 40,000 frames compressed down to 5,000 pixels, and the Y-axis is the index of each percept. The top panel shows the change in the mean colour of each percept over time, while the bottom panel shows a white pixel when that percept is present and a black pixel when it is not.
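With 40,000 frames packed into 5,000 pixels, each pixel in the bottom panel covers 8 frames, so each percept's per-frame presence bits have to be pooled somehow. A sketch of one plausible scheme, assuming OR-pooling (a pixel is white if the percept was present in any frame of its window; averaging to a grey level would be the obvious alternative):

```python
def compress_presence(presence, out_width):
    """Downsample a list of per-frame presence bits (0/1) to out_width
    pixels, marking a pixel 1 if the percept was present in any frame
    of that pixel's window (OR-pooling)."""
    window = len(presence) // out_width  # e.g. 40000 / 5000 = 8 frames/px
    return [
        1 if any(presence[i * window:(i + 1) * window]) else 0
        for i in range(out_width)
    ]

# One percept active only in frames 16..23 of a 40-frame run,
# compressed to 5 pixels (8 frames per pixel).
bits = [1 if 16 <= f < 24 else 0 for f in range(40)]
print(compress_presence(bits, 5))  # [0, 0, 1, 0, 0]
```

Stacking one such row per percept gives the full bottom panel; the top panel would be built the same way, but pooling mean colours instead of presence bits.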

Note the signs of cycles here as well, which are not visible in the above plot during the first ~40,000 frames, after the maximum number of clusters has been reached. These are likely bursts of activity in our living room separated by moments of stillness, but that does not explain the lack of activation of these background percepts, which should stay activated except during the night. As these slices of percept activity are saved for every frame, it seems unlikely that a scaling bug is making these results incompatible with the plot above…

[Image: meanColoursStateVectorBG-thumb]

On to debugging the memory leak…