Rendering Blender 3D scenes in the cloud

I created a simple app that renders a Blender 3D scene in the cloud: users can customize the displayed message by changing a URL parameter, and the app returns the rendered 3D image. Give it a try here.

The code is a very simple Python function that invokes the open source Blender software. It uses the Blender API to dynamically change the value of a Text object. The function is then wrapped in a basic Flask application to respond to HTTP requests. Because Blender needs to be installed, I use a Dockerfile that runs “apt-get install blender”.
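As a rough sketch of the approach (the actual code differs; the .blend file name, the “Text” object name and the paths here are assumptions), a handler can build a Blender command line that runs headless and uses `--python-expr` to rewrite the Text object before rendering:

```python
# Hypothetical sketch: build the Blender CLI invocation that a Flask
# handler could pass to subprocess.run(). The scene file name and the
# "Text" object name are assumptions, not the project's actual values.
def blender_command(message, blend_file="scene.blend", out="/tmp/render-"):
    # bpy snippet executed inside Blender to change the Text object's body.
    expr = f"import bpy; bpy.data.objects['Text'].data.body = {message!r}"
    return [
        "blender",
        "-b", blend_file,       # run in background mode (no UI)
        "--python-expr", expr,  # edit the scene before rendering
        "-o", out,              # output path prefix
        "-f", "1",              # render frame 1
    ]
```

A Flask route would then execute this command with `subprocess.run()` and return the rendered image with `send_file`.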

Find the code on GitHub.

This app was created to showcase the upcoming “serverless containers” product of the Google Cloud Platform. It lets you run any container “in a serverless way”: developers can deploy and run any language or library they want, and only pay for the resources actually used while the container serves requests. I demoed the feature during the Cloud Next 2018 keynote and other sessions. Sign up for early access at

Projection mapping on painting

Xavier has an overhead projector pointed at a painting on a wall. So obviously, we had to project onto it 🙂

After a bit of calibration to make sure we were projecting black outside of the painting, we tried different things:

The first thing was to feed the painting into the Deep Dream algorithm to create those trippy neural-network pictures. Sadly, once projected, the result was not noticeable enough.

So I tried something different: style transfer. This algorithm, also based on machine learning, transfers the artistic style of one image onto another. Take a look at this gallery:

I also worked on animations using Adobe After Effects. The painting depicts a rather dull winter scene, so I created three different compositions to layer on top of the original, each showing the scene under different weather: summer, rain and snow. Each is color graded and animated: summer has slowly moving trees and a blue sky, and for the others I simulated raindrops and a snowfall. See the video below (sadly, the rain and snow do not render well in the recorded video):

Adobe After Effects makes it easy to display the current composition on a second screen, in my case the projector. This allowed real-time visualization of the effects, which is always better when working on something visual, especially when the computer screen is not the final medium:


Trying to confuse Google’s Vision algorithms with dogs and muffins

When I saw this set of very similar pictures of dogs and muffins (which comes from @teenybiscuit‘s tweets), I had only one question: how would Google’s Cloud Vision API perform on them?

At a quick glance, it’s not obvious even for a human, so how does the machine perform? It turns out it does pretty well; check the results in this gallery:

(also find the album on imgur)

For almost every set, there is one tile that is completely wrong, but the rest are at least in the right category. Overall, I am really surprised by how well it performs.

You can try it yourself online with your own images here, and of course find the code on GitHub.

Technically, it is built entirely in the browser; there is no server-side component except what’s behind the API, of course:

  • Images are loaded from presets or via the browser’s File API.
  • Each tile is extracted into its own image and encoded as base64.
  • All of this is sent in a single request to the Google Cloud Vision API, asking for label detection results (this is what matters to us here, even though the API can do much more, like face detection, OCR, landmark detection…)
  • Only the label with the highest score is kept from the results and drawn back onto the main canvas.
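The actual implementation is browser-side JavaScript; as a hedged illustration, the same batched `images:annotate` request and top-label filtering can be sketched in Python (the function names here are my own, not from the project):

```python
import base64

def build_annotate_request(tiles):
    """Build the Vision API images:annotate payload.

    tiles: list of raw image bytes, one per grid tile. Each tile is
    base64-encoded and all tiles are submitted in a single request,
    asking only for LABEL_DETECTION.
    """
    return {
        "requests": [
            {
                "image": {"content": base64.b64encode(t).decode("ascii")},
                "features": [{"type": "LABEL_DETECTION", "maxResults": 5}],
            }
            for t in tiles
        ]
    }

def top_label(response_entry):
    # Keep only the highest-scoring label for a tile's response entry.
    labels = response_entry.get("labelAnnotations", [])
    if not labels:
        return None
    return max(labels, key=lambda l: l["score"])["description"]
```

The payload would be POSTed to `https://vision.googleapis.com/v1/images:annotate`, and `top_label` mirrors the “keep only the best label” step before drawing it on the canvas.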

Color palette showcase

I really like how Duminda Perera showcased color palettes in “Minimalist Color Palettes 2015”.

I decided to use this technique with the color theme of my wedding.

It is entirely built using HTML, CSS and SVG. Click the image below to render it in full screen, and see the code on CodePen (or in this gist). The background is a blend of gradient, blur and subtle noise using advanced SVG filters, which is rather CPU-intensive.


Download it as a wallpaper here


Painting in the style of Bob Ross

I followed Bob Ross‘s technique in this first attempt at painting. If you do not know Bob Ross, check out the official YouTube channel; it’s so relaxing. After watching a few episodes, I became familiar with a few techniques and gave it a try.

Here is my composition, featuring the usual cloudy sky, snowy mountains, a lake and happy little trees:

I got quite good comments from the Reddit community.

MRI proton spin: 3D animated mathematical curve in the browser using MathBox.js

My partner is defending her thesis today and needed some help producing visuals for the presentation. The goal was to illustrate the relaxation of a proton's spin after a radio-frequency pulse in a constant magnetic field, which is the principle at the root of Magnetic Resonance Imaging.

The visualisation needed to be animated and in 3D. I thought it would be a good use case for MathBox.js, a JavaScript library by Steven Wittens built on top of three.js, which is itself built on top of WebGL.

This library is designed for mathematical visualisations. You focus on curve expressions rather than geometric primitives; the engine then evaluates and interpolates them on every frame to draw them. It works in 2D and 3D space and is not restricted to Cartesian coordinates.

I created a three-step animation. MathBox lets you write an animation script: for each step, you define what is added, removed or animated.

You can load it in your browser at this address.
Or simply watch this YouTube video:

I am also glad that this animation now illustrates the Wikipedia articles about MRI and NMR.

Using chrome as my code editor
I wanted to experiment with something that I think will happen more and more in the future: this short project was developed entirely inside the browser.

Code in Chrome dev tools

I had to create the local git repository and its submodules, but after that I did not use any editor other than the “Sources” tab of Chrome’s Dev Tools. This is possible thanks to Chrome’s “Workspaces”, which let you map the current page's sources to a specific folder on your computer. You can then edit and save resources that are usually read-only. This workflow is more streamlined than the usual one: debug in the Dev Tools, go back to the editor, find the line to change, save, return to Chrome and reload. Here, you never leave the browser window and can edit and save the JavaScript file while debugging.

Of course, this is a different approach from using an online IDE such as Cloud9 IDE, which I also appreciate, but both are interesting and we should definitely keep an eye on these new workflows.

Poker tournament tracker

My parents regularly play poker with friends. They keep track of every game in a spreadsheet but had trouble determining the overall winners and seeing who has the best score across games. They either did it by hand or used ugly Excel macros that nobody could understand.

poker tournament

I quickly helped them and wrote a Google Spreadsheet custom function that does the job. Now they just fill in the position of every participant for each game, and the final scores and stats are generated automatically.

Get the code in this gist. To use it, copy-paste it into a new script linked to your spreadsheet.
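As an illustration of the kind of logic involved (the actual point scheme in the script may differ; the scoring rule below is an assumption of mine), the cross-game totals could be computed like this:

```python
# Hypothetical sketch of the tournament scoring logic. Assumed rule:
# a player finishing in position p out of n players earns n - p points,
# and totals are summed across all games to rank the overall winners.
from collections import defaultdict

def tournament_totals(games):
    """games: list of dicts mapping player name -> finishing position
    (1 = winner of that game). Returns (player, total) pairs, best first."""
    totals = defaultdict(int)
    for game in games:
        n = len(game)
        for player, position in game.items():
            totals[player] += n - position
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```

The spreadsheet version does the same aggregation over the rows of game results, so adding a new game automatically updates the standings.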

I’ve been using Google Apps Script more and more for personal and professional projects. I like that they chose JavaScript as the scripting language and that a lot of Google products expose a script API. Very often, it saves you from building a full web app for a simple task.