Live Pixel Mapping with Python, OSC & a Webcam

Ever since software version 2.3 came out for the ETC Eos with OSC compatibility, I've wanted to do a project that incorporated that new functionality!

Originally, I thought of creating a static, interactive pixel mapping experience: one could go to a website, take their photo using their computer's webcam, and the photo would be pixelized and sent to the Eos via OSC. I did a quick mockup of this idea in Python, and in Nomad I created some blank fixtures to represent pixels in a magic sheet, linking intensity to color fill. [see video below]
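The core of a mockup like this is turning pixel values into OSC messages for the console. As a rough sketch, the helper below builds a minimal OSC packet by hand (in practice a library such as python-osc would do this); the `/eos/chan/.../param/intensity` address pattern, the channel numbering, and the host/port are assumptions about the patch and network setup, not confirmed details of my setup:

```python
import socket
import struct

def _osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    # Minimal OSC message: padded address, padded type-tag string (",f" per
    # float argument), then each argument as a big-endian 32-bit float
    msg = _osc_pad(address.encode("ascii"))
    msg += _osc_pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">f", a)
    return msg

def send_pixel_intensities(levels, host="127.0.0.1", port=8000):
    # levels: iterable of 0-100 intensities, one per 'pixel' channel.
    # The address pattern and 1-based channel numbers are assumptions.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for chan, level in enumerate(levels, start=1):
            packet = osc_message(f"/eos/chan/{chan}/param/intensity", float(level))
            sock.sendto(packet, (host, port))
    finally:
        sock.close()
```

A web front end would only need to post the captured photo to a small server running something like this, one message per blank fixture in the magic sheet.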


However, in actualizing this project, I found it difficult to find a lighting product that had enough pixels of control and that I could get as a demo. Eventually, I settled on ten Color Force 48s. This gave me a much more pixelized 'screen' to work with.


I also decided that the web interface was unnecessary, and rewrote my program to take a photo every tenth of a second. This created a pulsing feeling as the fixtures cycled and refreshed every tenth of a second. In testing, it quickly became apparent that there wasn't enough resolution to produce a meaningful image in greyscale. I reconfigured the program to work in color, and switched the instruments from extended RGBA mode to RGB mode with magic amber. This let me map the RGB pixel values from the image directly to the lighting fixtures.
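The color mapping step boils down to averaging the camera frame into a coarse grid, one cell per fixture cell. Here is a minimal sketch of that downsampling; the grid dimensions are placeholders (the real rig was ten Color Force 48s, whose cell layout I'm not reproducing exactly), and the commented capture loop assumes an OpenCV-style camera object:

```python
import numpy as np

def frame_to_grid(frame: np.ndarray, rows: int, cols: int) -> np.ndarray:
    """Average an (H, W, 3) RGB frame down to a (rows, cols, 3) grid of 0-100 levels."""
    h, w, _ = frame.shape
    # Crop so the frame divides evenly into the grid
    frame = frame[: h - h % rows, : w - w % cols]
    blocks = frame.reshape(rows, frame.shape[0] // rows,
                           cols, frame.shape[1] // cols, 3)
    means = blocks.mean(axis=(1, 3))   # per-cell average RGB, 0-255
    return means / 255.0 * 100.0       # scale to console-style 0-100 levels

# Sketch of the ten-times-a-second capture loop (not run here) -- the camera
# object, grid size, and OSC send step are assumptions:
#
#   while True:
#       ok, frame = capture.read()                       # BGR from OpenCV
#       grid = frame_to_grid(frame[:, :, ::-1], 10, 12)  # flip to RGB
#       ... send each grid[r, c] as red/green/blue over OSC ...
#       time.sleep(0.1)
```

With the fixtures in plain RGB mode (magic amber handled by the console), each cell's three averaged values can be sent straight to that fixture's red, green, and blue parameters.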

Finally, I gathered a collection of paint color swatches to display along with the installation. Viewers could hold up their color of choice, and witness the lights mapping to the color they had picked. 

Additionally, because the instruments faced the same wall as the camera, a feedback loop emerged when no one was interacting with the installation: the camera would see the color of the light being output, re-interpret it, and then send out a different version of that color.

See the final video below!