I thought I would do a couple more explorations of the 7 days of frames following the previously posted average. The first recreates the same frame, but with each column extracted from a different time. I tested this using the sample of different lighting conditions from this post, and ran another test using the first 1920 frames from the full 7-day set, below it. I think the latter results are quite uninteresting due to the amount of change minute to minute (i.e. the change of traffic manifests as vertical lines in the image).
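The column-per-time composite can be sketched roughly as follows. This is a minimal illustration, assuming frames have already been decoded into NumPy arrays; the function name and the tiny synthetic frames are my own, not from the actual processing code.

```python
import numpy as np

def column_composite(frames):
    """Build one image where column x is taken from frame x.

    `frames` is a sequence of H x W x 3 arrays, one per time step;
    it must contain at least W frames (one frame per column).
    """
    height, width, channels = frames[0].shape
    out = np.zeros((height, width, channels), dtype=frames[0].dtype)
    for x in range(width):
        # Copy only column x from the frame captured at time step x.
        out[:, x, :] = frames[x][:, x, :]
    return out

# Tiny synthetic demo: four flat-colour 2x4 frames, brightness = 10 * index.
frames = [np.full((2, 4, 3), i * 10, dtype=np.uint8) for i in range(4)]
composite = column_composite(frames)
print(composite[0, :, 0])  # column x comes from frame x: [ 0 10 20 30]
```

With one frame per minute, a 1920-column composite spans 1920 minutes (32 hours) left to right, which is why minute-to-minute traffic shows up as vertical lines.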
I’ve visited the Public Art Control Room a few times now and set up a laptop and a 5TB disk array for image capture. After doing a few tests to make sure my two ideas are possible, I’m moving forward with the “final” captures. My two ideas are (1) a time-lapse with one capture every minute for ~7 days and (2) a 30fps capture for 24 hours. I was not sure the old laptop would be up to the job, but I was able to capture to disk in real time, though I did have to quit the window manager and run ffmpeg from the console to divert all available resources to the task. For the latter, I’ll be capturing on the summer solstice from midnight to midnight; the capture will actually run for 48 hours to avoid the awkward start and stop times, and I’ll only keep frames captured during the solstice itself. As I’m saving individual JPEG frames, the 48-hour test takes close to 3TB of space. The laptop is now deleting the before- and after-midnight files, which takes a very long time; thus I was unable to determine whether I’ll have the space to keep both the backup 24-hour test and the 48-hour solstice capture. I’ll find that out on Saturday morning when I’m back in the room to start the solstice capture.
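For scale, the numbers behind the disk pressure work out roughly like this. The per-frame size is back-computed from the ~3TB total, so it is an estimate rather than a measured value:

```python
# Rough storage math for the 30fps JPEG captures.
FPS = 30
SECONDS_48H = 48 * 3600
SECONDS_24H = 24 * 3600

frames_48h = FPS * SECONDS_48H        # 5,184,000 frames over 48 hours
frames_24h = FPS * SECONDS_24H        # 2,592,000 frames over 24 hours

total_bytes = 3 * 1000**4             # ~3 TB for the 48-hour test
per_frame = total_bytes / frames_48h  # estimated average JPEG size

print(frames_48h, frames_24h, round(per_frame / 1e6, 2))  # ~0.58 MB/frame
```

So keeping both the 24-hour backup and the 48-hour solstice capture would mean roughly 4.5TB on a 5TB array, which is why it is a close call.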
The following images are a snapshot of the diversity of weather over the 7 days of the capture. One thing I changed from the previous test was turning off the camera’s “sharpening”, which seems to give everything a bit of a blurry look. I’m not sure how this will work for possible segmentation experiments, but it does seem to cut down on some of the JPEG artifacts. I should also explore changing the compression quality for the 7-day capture, since it already takes up so much less space; the 24-hour capture will need to stay at this quality, though.
Looking back on these images, they really do look blurry; could the camera be having a focus issue? Comparing this to the previous test, the image quality is clearly less sharp, and it seems “sharpening” is not actually a sharpening post-process but a smoothing one. Too bad about that, since I’m not likely to have this amazing diversity of weather over a 7-day period any time soon. I’ll have to keep an eye on the weather and see if I have another opportunity. It could also be the aperture of the camera… The exposure is automatic, but the sky very often gets totally blown out (e.g. frame 4212, middle left). The exposure of the street does look good, though, so the camera is probably exposing for the majority of the frame; exposing for the sky could make the street very dark, and if I have to choose between an exposed street and an exposed sky, I suppose the street makes more sense. I could also try the camera’s dynamic range mode. The sun certainly does shine directly into the camera at times (e.g. frame 9837, bottom left). The following video shows the entire time-lapse.
Over the next six months or so I’ll be working on a new commission funded by the Grunt Gallery for the MPCAS (Mount Pleasant Community Art Screen)! During this residency period, I’ll be revisiting some of the methods I used in my work “As our gaze peers off into the distance, imagination takes over reality…”, commissioned by the City of Vancouver for the Platforms 2016: Coastal City program. For this new project I’ll be working with a video stream from a camera installed with the MPCAS, rather than the photographic panorama used in the 2016 work.
Last week I got my first chance to get into the server room for the MPCAS. This is the room with all the hardware that drives the screen, as well as the outdoor PTZ camera. The room is smaller than I expected, about the size of a closet, with a server rack on wheels. The space is so small that we have to prop the door open and push the rack slightly out of the room to open the access terminal! After some fiddling last week, I was able to get an old laptop onto the local network and access the camera feed! Unfortunately, I was unable to control the camera pan/tilt, resulting in a stream of images like this:
The camera is a BirdDog A200 outdoor PTZ IP camera. I have not looked at surveillance camera technology since working on the Dreaming Machine, and a newer protocol, NDI (Network Device Interface), looks to be quite popular. While this is a royalty-free protocol, working with it does not seem that straightforward. I found that ffmpeg does support these streams built in, but only in version 3.4 specifically. Newer versions do not support NDI because NewTek (the company that created NDI) distributed ffmpeg without adhering to the GPL. Downloading the ffmpeg source and the NDI SDK allowed me to compile ffmpeg and access the video stream.
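For reference, the build looks roughly like the sketch below. The configure flags (`--enable-nonfree`, `--enable-libndi_newtek`) are the ones the 3.4 source tree documents for NDI; the SDK paths and the stream name are placeholders, not the actual ones from my setup:

```python
# Sketch of the ffmpeg 3.4 + NDI build, assuming the NDI SDK has been
# unpacked to the paths below (example paths, not my real ones).
NDI_INCLUDE = "/opt/ndi-sdk/include"              # hypothetical SDK headers
NDI_LIB = "/opt/ndi-sdk/lib/x86_64-linux-gnu"     # hypothetical SDK libs

configure = [
    "./configure",
    "--enable-nonfree",        # NDI support is flagged non-free in 3.4
    "--enable-libndi_newtek",  # enables the NDI input/output device
    f"--extra-cflags=-I{NDI_INCLUDE}",
    f"--extra-ldflags=-L{NDI_LIB}",
]

# Once built, an NDI source is opened by name rather than by URL:
grab = ["./ffmpeg", "-f", "libndi_newtek", "-i", "SOURCE_NAME",
        "frame_%08d.jpg"]

print(" ".join(configure))
print(" ".join(grab))
```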
This morning Sebnem (the Grunt Gallery tech) and I got back into the room to check on the previous test. Unfortunately, only a few hours of frames were saved, as a technician with the screen company (who installed the camera) did something to the camera that required a power cycle. Once we were on site and cycled the power for the camera, we were able to control it! I also noticed that ffmpeg does seem to recover and continue writing new frames in cases where the camera becomes inaccessible, which is good to know for the future. After a little exploration I settled on the following frame as a starting point:
Compare this to my description of what I imagined the camera would see from my initial proposal:
I envision a frame showing the city-scape to the East with dynamism of Kingsway below, Kingsgate Mall in the lower middle, the growing density of the area beyond in the middle upper and the mountains and sky above. The camera’s view will contain a rich diversity of forms, colours and textures that are both natural and artificial.
Kingsgate Mall is out of frame to the right, behind the trees, but the dynamism of the busy intersection of Kingsway and Broadway below contrasts nicely with the trees, mountains and sky. I also saved a “preset” view without Kingsway below and with more sky, but I think this composition is more balanced. I ended up using as many manual settings as possible, but with this much contrast (between the shadows in the trees and the bright sky) the sky looks quite blown out, and some settings may need to be tweaked. There should be some rain coming at the end of this week, so I should get a better sense of the changing light through day, night and variable weather when I access the footage next week. I’m also quite curious about the image quality at night.
The laptop is capturing one frame to disk every minute; I’ll check in next week and get a sense of the variation in the material over time. I’ll see if any camera settings need to be tweaked, then start working with the new material and see where that leads, starting with code from “As our gaze peers off into the distance, imagination takes over reality…” and also “Through the haze of a machine’s mind we may glimpse our collective imaginations (Blade Runner)“.
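The one-frame-per-minute capture can be expressed with ffmpeg’s `fps` filter, sketched below. The stream source and output path are placeholders, and `-q:v 2` is just one reasonable JPEG quality setting, not necessarily the one in use:

```python
# Sketch of a one-frame-per-minute JPEG capture with ffmpeg.
STREAM = "CAMERA_STREAM"            # placeholder for the camera source
OUTPUT = "capture/frame_%08d.jpg"   # numbered JPEG frames

cmd = [
    "ffmpeg",
    "-i", STREAM,
    "-vf", "fps=1/60",  # keep one frame per 60 seconds of input
    "-q:v", "2",        # high-quality JPEG (2 is near the top of the scale)
    OUTPUT,
]
print(" ".join(cmd))
```

At this rate a week of capture is only about 10,080 frames, which is why the 7-day set is so much lighter on disk than the 30fps runs.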