Mobile Camera Effects, 2014 - 2018
Rendering and creative coding work on post-processing effects for mobile cameras.

Tools: C++, GLSL, Swift, Meta Spark AR

For four years, first at Eyegroove (2014 - 2016) and then at Facebook (2016 - 2018), I created several dozen mobile camera filters for photo and video. I worked across the whole graphics stack, from implementing rendering logic in the C++ graphics engine to designing and coding the effects themselves in GLSL.
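
None of those shaders are public, but the sketch below shows the kind of single-pass GLSL filter this work involved: a color grade plus vignette applied to the camera frame. The uniform and varying names (uCameraTexture, vTexCoord, uTime) are illustrative assumptions, not the actual engine interface.

    // Illustrative sketch of a single-pass camera filter.
    // Uniform names are assumptions, not the real engine's interface.
    precision mediump float;

    uniform sampler2D uCameraTexture;  // current camera frame
    uniform float uTime;               // seconds since the effect started
    varying vec2 vTexCoord;

    void main() {
        vec3 color = texture2D(uCameraTexture, vTexCoord).rgb;

        // Warm color grade: gently lift reds, pull down blues.
        color *= vec3(1.08, 1.0, 0.92);

        // Vignette: darken toward the corners.
        float d = distance(vTexCoord, vec2(0.5));
        color *= smoothstep(0.9, 0.4, d);

        // Subtle pulsing exposure so the filter feels alive on video.
        color *= 1.0 + 0.05 * sin(uTime * 2.0);

        gl_FragColor = vec4(color, 1.0);
    }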

At Eyegroove, the effects targeted iOS and were interactive via touch. The app was designed for creating music videos and let you animate effects in sync with the music.
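
In practice that meant effect parameters were driven by touch input and by audio analysis done on the app side. A hedged sketch of that parameterization is below; the uniforms (uTouch, uAudioEnergy) are assumed names for values the Swift layer would feed in each frame, not the app's actual API.

    // Illustrative sketch: touch position and music energy driving a ripple.
    // uTouch and uAudioEnergy are assumed uniforms supplied by the app layer.
    precision mediump float;

    uniform sampler2D uCameraTexture;
    uniform vec2 uTouch;        // last touch position in UV space
    uniform float uAudioEnergy; // 0..1, derived from the music on the CPU side
    uniform float uTime;
    varying vec2 vTexCoord;

    void main() {
        vec2 toTouch = vTexCoord - uTouch;
        float dist = length(toTouch);
        vec2 dir = dist > 0.0 ? toTouch / dist : vec2(0.0);

        // Ripple radiating from the touch point, amplitude scaled by the music.
        float ripple = sin(dist * 40.0 - uTime * 6.0) * 0.01 * uAudioEnergy;
        vec2 uv = vTexCoord + dir * ripple;

        gl_FragColor = texture2D(uCameraTexture, uv);
    }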

In 2016, Eyegroove was acquired by Facebook and tasked with bringing top-class mobile camera experiences to its suite of apps: Facebook, Messenger, and Instagram.

My first task was to add pre- and post-processing steps to the C++ rendering engine, along with UI support for them in the AR creation desktop application that eventually became Meta Spark AR. I then joined the camera art team as a creative coder and worked with artists and designers to prototype and ship dozens of mobile camera effects using Meta Spark AR, GLSL, and JavaScript.

As new AI models such as face tracking and background segmentation became available in the authoring tool, the effects grew increasingly sophisticated.
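
As an example of what segmentation made possible, a shader could stylize only the background while keeping the person untouched. The sketch below assumes the segmentation model's output is exposed as a mask texture (uPersonMask is an assumed name); the details of the real authoring tool's interface differ.

    // Illustrative sketch: using a background-segmentation mask to stylize
    // only the background. Texture names are assumptions for this example.
    precision mediump float;

    uniform sampler2D uCameraTexture;
    uniform sampler2D uPersonMask;   // ~1.0 where a person is detected, 0.0 elsewhere
    varying vec2 vTexCoord;

    void main() {
        vec3 camera = texture2D(uCameraTexture, vTexCoord).rgb;
        float person = texture2D(uPersonMask, vTexCoord).r;

        // Stylize the background: desaturate and tint it, leave the person intact.
        float gray = dot(camera, vec3(0.299, 0.587, 0.114));
        vec3 background = mix(vec3(gray), vec3(0.2, 0.4, 0.9), 0.35);

        gl_FragColor = vec4(mix(background, camera, person), 1.0);
    }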