Abstract

Art-directability is a crucial aspect of creating aesthetically pleasing visual effects that help tell stories. A particularly common method of art direction is the retiming of a simulation. Unfortunately, retiming an existing simulation sequence while preserving its desired shapes is an ill-defined problem. Naively interpolating values between frames leads to visual artifacts such as choppy frames or jittering intensities. Because of the difficulty of formulating a proper interpolation method, we use a machine learning approach to approximate this function. Our model is based on the ODE-net structure; trained on frames from the original sequence, it reproduces the simulation at a set of desired time samples (in our case equivalent to time steps) that achieves the new sequence speed. The flexibility in the updated sequence's duration provided by the time-sample input makes this a visually effective and intuitively directable way to retime a simulation.
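The sketch below is not the thesis implementation; it is a minimal illustration, under assumed names and sizes, of the ODE-net idea the abstract describes: a learned dynamics function is integrated to arbitrary time samples, so a model trained on the original frame times can be re-evaluated at a denser or sparser set of times to slow down or speed up the sequence. The fixed-step RK4 integrator, network sizes, and toy data are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class Dynamics(nn.Module):
    """Learned time derivative dh/dt of a latent frame encoding (hypothetical)."""

    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, 64), nn.Tanh(),
                                 nn.Linear(64, dim))

    def forward(self, t: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # Concatenate the scalar time so the dynamics can depend on t.
        t_col = t.expand(h.shape[0], 1)
        return self.net(torch.cat([h, t_col], dim=-1))


def odeint_rk4(f, h0, times, steps_per_interval: int = 4):
    """Fixed-step RK4 integration; returns the state at each requested time."""
    out, h = [h0], h0
    for t0, t1 in zip(times[:-1], times[1:]):
        dt = (t1 - t0) / steps_per_interval
        t = t0
        for _ in range(steps_per_interval):
            k1 = f(t, h)
            k2 = f(t + dt / 2, h + dt / 2 * k1)
            k3 = f(t + dt / 2, h + dt / 2 * k2)
            k4 = f(t + dt, h + dt * k3)
            h = h + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
            t = t + dt
        out.append(h)
    return torch.stack(out)


# Toy stand-in for a simulation: one latent vector per frame.
dim, n_frames = 8, 20
frames = torch.cumsum(torch.randn(n_frames, 1, dim) * 0.1, dim=0)
orig_times = torch.linspace(0.0, 1.0, n_frames)

# Train the dynamics to reproduce the original frames at their original times.
f = Dynamics(dim)
opt = torch.optim.Adam(f.parameters(), lr=1e-3)
for _ in range(200):
    pred = odeint_rk4(f, frames[0], orig_times)
    loss = ((pred - frames) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Retiming: query the trained model at twice as many time samples,
# producing a sequence that plays back at half speed.
new_times = torch.linspace(0.0, 1.0, 2 * n_frames)
retimed = odeint_rk4(f, frames[0], new_times)
print(retimed.shape)  # (2 * n_frames, 1, dim)
```

Because the integrator accepts any monotone list of time samples, the same trained model can also ease in and out of a slowdown by spacing the samples non-uniformly, which is the source of the directability described above.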

Degree

MS

College and Department

Physical and Mathematical Sciences; Computer Science

Rights

https://lib.byu.edu/about/copyright/

Date Submitted

2020-03-24

Document Type

Thesis

Handle

http://hdl.lib.byu.edu/1877/etd11061

Keywords

retiming, art direction, fluid simulation, machine learning

Language

English