For this project, a GitHub repository has been created with Maya assets ready for use in matching the real Emily with different renderers. The data set currently includes example images and materials rendered using mental ray and PBRT v3. These were created to facilitate further experimentation with the existing data set, including extending the material definitions for V-Ray and for the OSL skin shader provided on the main site.
The Academy Color Encoding System 1.0 (ACES) was launched in December 2014 by the Academy of Motion Picture Arts and Sciences. The result of over 10 years of industry-driven development and testing, ACES provides a standardized color management infrastructure to replace what was lost in the transition away from film. A key application addressed by ACES 1.0 is visual effects production. The addition of the ACEScg color encoding to compositing, lighting, rendering and other CG workflows will simplify element interchange and preview, as well as enable high dynamic range and wide gamut deliverables.
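To illustrate what adopting ACEScg in a CG workflow involves, the sketch below converts a linear sRGB (Rec. 709 primaries, D65) triplet to ACEScg (AP1 primaries, D60 white) with the commonly published 3x3 matrix. The matrix values here are approximate and quoted from memory; verify them against the official ACES transforms before production use.

```python
# Sketch: converting a linear-light sRGB color to ACEScg (AP1 primaries).
# The matrix is the commonly cited Bradford-adapted
# linear sRGB (D65) -> ACEScg (D60) transform; values are approximate.

SRGB_TO_ACESCG = [
    [0.61319, 0.33951, 0.04737],
    [0.07021, 0.91634, 0.01345],
    [0.02062, 0.10957, 0.86961],
]

def srgb_linear_to_acescg(rgb):
    """Apply the 3x3 matrix to a linear-light sRGB triplet."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in SRGB_TO_ACESCG)

# Neutral values stay (nearly) neutral, since each matrix row sums to ~1.
print(srgb_linear_to_acescg((0.18, 0.18, 0.18)))
```

Note that this handles only the primaries and white point; scene-linear data should already be free of any display transfer function before the matrix is applied.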
A presentation from the SIGGRAPH 2010 course on color, “Filmic Tonemapping for Real-time Rendering”. This extends the 2006 presentation, adding the analytical approximation used in Naughty Dog’s Uncharted 2 as well as stills from the game.
Articles about the technique or games that have used the technique
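For reference, the analytical approximation discussed in the presentation can be sketched as follows. The constants are the frequently quoted defaults from John Hable's Uncharted 2 writeup; in practice they are tuned per title, and the exposure value here is only an illustrative assumption.

```python
# Sketch of the filmic tonemapping curve popularized by Uncharted 2
# (John Hable's analytical fit). Constants are the commonly quoted
# defaults; real titles tune them per project.

A = 0.15  # shoulder strength
B = 0.50  # linear strength
C = 0.10  # linear angle
D = 0.20  # toe strength
E = 0.02  # toe numerator
F = 0.30  # toe denominator
W = 11.2  # linear white point

def hable(x):
    return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F

def tonemap(hdr, exposure=2.0):
    """Map a linear HDR value to [0, 1], normalized so W maps to white."""
    return hable(exposure * hdr) / hable(W)
```

The curve compresses highlights toward the white point W while the toe terms (D, E, F) gently crush the shadows, which is what gives the result its "filmic" look compared with a plain Reinhard operator.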
SIGGRAPH Asia 2009 presentation “Earthquake! Building a pipeline to Destroy Los Angeles in 2012” on Bento, the central pipeline component developed at Digital Domain to enable the creation of the Los Angeles destruction sequence in the film 2012.
Haarm-Pieter Duiker, Digital Domain
Osiris Perez, Digital Domain
Masuo Suzuki, Digital Domain
Rito Trevino, Digital Domain
Abstract: 2012 presented a set of challenges that were unique in scale, even for an experienced company like Digital Domain. The LAP sequence follows the progress of a plane flying through Los Angeles as it is destroyed by a massive earthquake. From this aerial perspective, some shots had as many as 6000 individual objects. Each needed to be placed, animated, run through effects simulations, and finally lit and rendered. The time constraints and scale of this project meant that most of those steps would happen in parallel, and assets would be constantly modified, published and updated by each department as shots progressed.