Wednesday, 16 February 2011

Optical Flow and Rotoscopy

Been feeling pretty ill the last few days. Managed to drag myself to the Broadcast Video Expo at Earls Court yesterday. Would love to go again tomorrow to watch some of Arri's lighting seminars and the DaVinci Resolve colour grading workshops, but I probably won't be feeling well enough yet, so it'll be another day at home storyboarding and tidying up odds and ends.



Last week I rotoscoped over the intro to 'Down by Law' with one long, continuous virtual line using Blender. At college Felipe suggested I could do a bit more with the data, and maybe find some way of using the video to distort the drawings. So I wrote a Python script using the OpenCV bindings for Python 2.7 to analyse the optical flow of the original film footage. I based the script on OpenCV's lkdemo sample, using CalcOpticalFlowPyrLK. It dumped out a load of text files containing the feature-tracking info.
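
Here's a minimal sketch of that sort of tracking script - not the script I actually ran, and written against the newer cv2 API rather than the old cv bindings, with a placeholder filename and a made-up one-file-per-frame dump layout:

    import cv2
    import numpy as np

    cap = cv2.VideoCapture("down_by_law_intro.avi")  # placeholder filename

    ok, frame = cap.read()
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Pick corner-like features to track, much as lkdemo does.
    points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                     qualityLevel=0.01, minDistance=7)

    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok or points is None or len(points) == 0:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Pyramidal Lucas-Kanade: follow each feature from the previous frame.
        new_points, status, err = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, points, None, winSize=(15, 15), maxLevel=3)

        # Keep only the features that were tracked successfully.
        tracked = new_points[status.ravel() == 1].reshape(-1, 2)

        # Dump this frame's feature positions as "x y" pairs in a text file
        # for the Blender script to pick up later (assumed layout).
        np.savetxt("flow_%04d.txt" % frame_idx, tracked, fmt="%.2f")

        prev_gray = gray
        points = tracked.reshape(-1, 1, 2)
        frame_idx += 1

    cap.release()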

In Blender I wrote a second script which warped my 'drawing' (made up of vertices and edges) on a frame-by-frame basis, using the nearest tracked feature point to drag the line around. Effectively the script puts the line's control points (vertices) into Voronoi cells which are shifted by the tracked features (like a virtual earthquake). I added in very simple outlier detection, checking each point's motion for consistency with the flow of neighbouring points along the rotoscoped line. I had wanted to convert the point cloud of tracked features to a mesh using Delaunay triangulation, and then use the animated mesh to deform the string of vertices, but it would probably have been excessively slow in Python!
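
The warp itself boils down to a nearest-neighbour lookup. Here's a rough sketch of that step in plain NumPy rather than the Blender API (again, not the actual script), assuming the line's vertices and the matched feature positions for two consecutive frames have already been loaded into arrays:

    import numpy as np

    def warp_vertices(verts, prev_feats, next_feats):
        """Drag each drawing vertex along with its nearest tracked feature.

        verts      -- (N, 2) array of the line's vertex positions
        prev_feats -- (M, 2) feature positions in the previous frame
        next_feats -- (M, 2) the same features' positions in the current frame
        """
        displacements = next_feats - prev_feats

        # Assign each vertex to its nearest previous-frame feature: this is
        # the Voronoi-cell partition described above.
        diffs = verts[:, None, :] - prev_feats[None, :, :]
        nearest = np.argmin((diffs ** 2).sum(axis=2), axis=1)

        # Shift each vertex by its cell's displacement (the 'virtual earthquake').
        return verts + displacements[nearest]

Repeating that per frame, after throwing away features whose motion disagrees with their neighbours along the line, gives more or less the behaviour described above; the Delaunay-mesh version would interpolate between features instead of snapping to the nearest one.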

Anyways, thanks to Felipe for pushing me that bit further. Meanwhile, progress on the scribbler continues - I've added three more drawing styles (two of them shown below) and a depth-of-field control. The next step is to work on edge detection (I've already got down on paper how it should work) and to add a bit of stability, to make sure no essential parts of an image get left undrawn in any one frame.

1 comment:

  1. Hi there! Found your blog from searches on optical flow and rotoscopy. Love your work! I was wondering if it would be possible to use O-flow data to warp automatically detected lines in a smoother way? I came across the idea while trying to use a cartoon filter in AfterEffects with posterized time and then morphing the in-between frames, but it looks very weird. I thought one might be able to use O-flow data to inform the morphing so that it is smoother than raw video and therefore more "animated"-looking. I was hoping to find an "easy" way to get that rotoscoped look using only filters and processing. Let me know if that is not clear and I can send you my test clips.