Xavier has an overhead projector which points toward a painting on a wall. So obviously, we had to project on it 🙂
After a bit of calibration to make sure we were projecting black outside of the painting, we tried different things:
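The calibration step boils down to masking: every projector pixel that does not land on the canvas is forced to black. Here is a minimal sketch of that idea in Python with NumPy; the frame size and painting rectangle are made-up values, since the real alignment was done by eye with the projector:

```python
import numpy as np

# Hypothetical projector resolution and painting bounding box, for illustration only.
FRAME_W, FRAME_H = 1280, 720
PAINTING = (200, 100, 900, 600)  # left, top, right, bottom in projector pixels

def mask_to_painting(frame, region):
    """Black out every projector pixel outside the painting's bounding box."""
    left, top, right, bottom = region
    masked = np.zeros_like(frame)
    masked[top:bottom, left:right] = frame[top:bottom, left:right]
    return masked

frame = np.full((FRAME_H, FRAME_W, 3), 255, dtype=np.uint8)  # all-white test frame
out = mask_to_painting(frame, PAINTING)
```

In practice the painting is rarely a perfect axis-aligned rectangle from the projector's point of view, so a real setup would use a polygon mask or a perspective warp instead of a simple slice.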
The first thing was to feed the painting into the Deep Dream algorithm, in order to create those trippy pictures from neural networks. Sadly, once projected, the result was not noticeable enough.
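At its core, Deep Dream is gradient ascent on the input image itself: you repeatedly nudge the pixels to maximize the activations of some layer of a trained network. Here is a toy sketch of that loop in Python with PyTorch, using a tiny randomly initialized conv layer as a stand-in for a real pretrained model (actual Deep Dream uses a deep network such as Inception, plus octaves and jitter, all omitted here):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for one layer of a pretrained network; real Deep Dream picks a
# layer of a trained model whose features it wants to amplify.
layer = nn.Conv2d(3, 8, kernel_size=3, padding=1)

img = torch.rand(1, 3, 64, 64, requires_grad=True)  # would be the painting photo
initial = layer(img).norm().item()

for _ in range(20):
    activation = layer(img).norm()  # "how strongly does this layer fire?"
    activation.backward()
    with torch.no_grad():
        # Normalized gradient ascent: change the image to excite the layer more.
        img += 0.05 * img.grad / (img.grad.norm() + 1e-8)
        img.grad.zero_()

final = layer(img).norm().item()
```

The dream-like artifacts come from iterating this on real feature detectors (eyes, fur, arches); with a random layer as above you only get noise amplification, but the optimization loop is the same.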
So I tried something different: style transfer. This is an algorithm, also based on machine learning, that can transfer the artistic style of one image onto another. Take a look at this gallery:
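Style transfer works by matching two kinds of statistics: the content image's raw feature maps, and the style image's Gram matrices, which capture which feature channels fire together while discarding where. A small sketch of the style-loss part in Python with NumPy; the feature maps here are random stand-ins for what a convnet would actually produce:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) feature map:
    channel-to-channel correlations, with spatial layout thrown away."""
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)

def style_loss(features_a, features_b):
    """Mean squared difference between the two Gram matrices."""
    return float(np.mean((gram_matrix(features_a) - gram_matrix(features_b)) ** 2))

rng = np.random.default_rng(0)
style = rng.standard_normal((8, 16, 16))  # stand-in for style-image features
same = style.copy()
other = rng.standard_normal((8, 16, 16))
```

The full algorithm then optimizes the output image so that its Gram matrices match the style image's while its feature maps stay close to the content image's, which is why the painting keeps its composition but takes on the new texture.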
I also worked on animations, using Adobe After Effects. The painting depicts a rather dull winter scene, so I created 3 different compositions to layer on top of the original style, each showing the scene under different weather: summer, rain and snow. Each of them is color graded and animated: summer has slowly moving trees and a blue sky, while for the other two I simulated raindrops and a snowfall. See the video below (sadly, the rain and snow do not render well in the recorded video):
Adobe After Effects makes it easy to display the current composition on a second screen, in my case the projector. This allowed real-time visualization of the effects, which is always better when working on something visual, especially when the computer screen is not the final medium: