
Paul Debevec

Paul Debevec is a computer graphics researcher renowned for pioneering image-based modeling and lighting techniques and for developing the Light Stage, a system for capturing high-fidelity facial performances that has revolutionized digital actors in major films. He holds a B.S. in Math and Computer Engineering from the University of Michigan (1992) and a Ph.D. in Computer Science from the University of California, Berkeley (1996), where his dissertation introduced Façade, an early image-based modeling and rendering system. Currently, Debevec serves as Director of Research for Creative Algorithms and Technology at Netflix's Eyeline Studios, while maintaining an adjunct research professorship at the University of Southern California's Viterbi School of Engineering and Institute for Creative Technologies (ICT). Debevec's career began with groundbreaking work at UC Berkeley, including the 1997 short film The Campanile Movie, which demonstrated image-based rendering of architectural scenes, and his seminal 1997 paper on modeling and rendering architecture from photographs. Joining USC's ICT in 2000, he led the Graphics Lab, where he advanced high dynamic range (HDR) imaging and co-authored the influential book High Dynamic Range Imaging: Acquisition, Display, and Image-Based Lighting (2005, second edition 2010). His innovations in image-based lighting enabled realistic illumination of computer-generated elements using real-world environment maps, applied in films such as Spider-Man 2 (2004) and Superman Returns (2006). From 2016 to 2021, he served as a principal researcher at Google, focusing on immersive capture and relighting technologies, before transitioning to Eyeline Studios in 2021 to oversee R&D in graphics, machine learning, and digital human technology for visual effects and virtual production. Debevec's Light Stage technology, iteratively developed from 2000 onward, uses arrays of lights and cameras to capture detailed facial geometry, reflectance, and subsurface scattering, allowing for relighting of digital characters under arbitrary conditions. This system has been integral to creating photorealistic digital humans in productions such as The Curious Case of Benjamin Button (2008) and Avatar (2009), and was used for a high-fidelity 2014 Light Stage scan of President Barack Obama.
Recent projects at Eyeline Studios include "DifFRelight" for performance relighting using diffusion models (SIGGRAPH Asia 2024) and "Lux Post Facto" for conditional video diffusion-based relighting (CVPR 2025). His contributions have earned numerous accolades, including the ACM SIGGRAPH Significant New Researcher Award (2001), two Academy Awards for Scientific and Technical Achievement (2010 for Light Stage and 2019 for Polarized Spherical Gradient Illumination), the SMPTE Progress Medal (2017), and the Charles F. Jenkins Lifetime Achievement Emmy Award (2022). Debevec was named one of MIT Technology Review's top 100 innovators under 35 in 2002 and has held leadership roles such as Vice President of ACM SIGGRAPH and Program Chair for FMX 2025.

Early life and education

Early years

Paul Debevec was born around 1970 and raised in Urbana, Illinois, as the only child of a physics professor and a psychiatric social worker. His father's position at the University of Illinois influenced his early exposure to scientific thinking, while his mother's role as a social worker provided a supportive environment for creative pursuits. Debevec attended University Laboratory High School in Urbana, graduating in 1988. From a young age, he developed a passion for special effects in films like Star Wars and Back to the Future, often staying up late to create stop-motion animations using an 8 mm camera. He was also fascinated by programming, tinkering with Commodore computers to experiment with graphics and design, viewing computers primarily as tools for creative computation rather than mere games or word processing. During high school, Debevec's interests in theater and visual media flourished through active participation in school activities. He acted in several productions, including The Lottery, Big Show '87, and Big Show '88. As a senior, he served as photo editor for the school newspaper Gargoyle and the yearbook, where he honed skills in photography and image processing via a junior-year apprenticeship that involved developing film in a darkroom. He was also a member of the Math Club, blending his analytical and artistic inclinations. These formative experiences in theater, photography, and computing laid the groundwork for Debevec's later academic pursuits in computer graphics and imaging.

Academic training

Paul Debevec earned a B.S. in Math and Computer Engineering from the University of Michigan in 1992. During his undergraduate studies, he developed an early interest in photorealistic computer graphics, stemming from high school involvement in theater production, and in 1991 created an image-based model of an automobile using photographs to achieve photorealistic rendering. Debevec pursued graduate studies at the University of California, Berkeley, where he obtained a Ph.D. in Computer Science in 1996. His doctoral thesis, titled Modeling and Rendering Architecture from Photographs, focused on hybrid geometry- and image-based approaches for reconstructing and relighting architectural scenes from sparse photographic inputs. Supervised by a committee including Professors Jitendra Malik, John Canny, and David Wessel, Debevec's work was influenced by Malik's expertise in computer vision, which shaped his development of interactive photogrammetric techniques and view-dependent texture mapping for realistic rendering. These academic efforts laid the groundwork for his subsequent contributions to image-based modeling, emphasizing practical tools for virtual cinematography.

Professional career

Positions at USC

After completing his Ph.D., Debevec continued graphics research at UC Berkeley from 1996 to 2000 before joining the University of Southern California (USC) in 2000. At USC's Institute for Creative Technologies (ICT), Debevec founded and served as director of the Graphics Lab from May 2000 to June 2016, where he oversaw pioneering efforts in computer graphics research. Under his leadership, the lab evolved into the Vision and Graphics Laboratory (VGL), focusing on advanced techniques for visual simulation and digital content creation. Debevec holds the position of Adjunct Research Professor in the Department of Computer Science at USC's Viterbi School of Engineering, a role he continues to maintain alongside his industry positions. In this capacity, he has played a key leadership role in establishing ICT's research programs on virtual humans and graphics, including the development of specialized lab facilities to support interdisciplinary collaborations between academia and the entertainment industry.

Roles in industry

During his Ph.D. studies at UC Berkeley, Debevec interned and consulted at Interval Research Corporation on projects involving image-based techniques for interactive applications, including contributions to the Immersion '94 project led by Michael Naimark. In 2016, Debevec became a researcher in Google's virtual reality group, serving until 2021 as a senior staff engineer focused on integrating advanced graphics technologies for immersive experiences. Since 2021, Debevec has served as Chief Research Officer at Netflix's Eyeline Studios, where he oversees research and development in computer graphics, machine learning, digital humans, and virtual production to advance filmmaking tools. Throughout these industry positions, Debevec has maintained an adjunct research professorship at the University of Southern California.

Research contributions

Image-based modeling and rendering

Paul Debevec's foundational contributions to image-based modeling and rendering emerged from his 1996 Ph.D. thesis at the University of California, Berkeley, titled Modeling and Rendering Architecture from Photographs. In this work, he developed a hybrid geometry- and image-based approach to reconstruct 3D models of architectural scenes using a sparse set of still photographs, requiring minimal user input for initial recovery. The system, known as Facade, employs photogrammetric modeling to estimate camera positions and basic structure by tracing edges and applying geometric constraints, enabling the creation of coarse models from as few as one photograph. This method significantly reduced the labor-intensive process of traditional 3D modeling, allowing for efficient capture of real-world environments. To refine these models and achieve photorealistic rendering, Debevec introduced model-based stereo, which uses the initial geometry to guide depth estimation from image pairs, recovering fine details such as architectural friezes that are challenging for pure stereo methods. For rendering, he pioneered view-dependent texture mapping, a technique that projects and blends textures from multiple input photographs onto the 3D model based on the novel viewpoint, capturing subtle effects like specular highlights and parallax without explicit geometric modeling of every surface detail. This approach addresses the "painted shoebox" limitation of simple texture mapping by simulating unmodeled complexity through image compositing, producing seamless fly-through animations from limited data—for instance, a 360-degree walkthrough of a building modeled in just four hours from 12 photographs. While the thesis anticipates extensions to estimate bidirectional reflectance distribution functions (BRDFs) for more accurate material properties under varying lighting, the core methods prioritize geometric fidelity and visual realism from photographs alone.
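The core idea of view-dependent texture mapping can be sketched in a few lines: input photographs are blended with weights that favor viewpoints close to the novel camera. The inverse-angle weighting and the function name below are illustrative assumptions, not the Facade system's exact formulation:

```python
import numpy as np

def blend_weights(novel_dir, photo_dirs, eps=1e-6):
    """Weight each input photograph by angular proximity to the novel view.

    novel_dir:  (3,) vector from the surface point toward the novel camera
    photo_dirs: (N, 3) vectors toward each input camera
    Returns (N,) nonnegative weights summing to 1.
    """
    novel_dir = novel_dir / np.linalg.norm(novel_dir)
    photo_dirs = photo_dirs / np.linalg.norm(photo_dirs, axis=1, keepdims=True)
    # Angle between the novel view direction and each photograph's direction
    angles = np.arccos(np.clip(photo_dirs @ novel_dir, -1.0, 1.0))
    w = 1.0 / (angles + eps)   # closer viewpoints dominate the blend
    return w / w.sum()

# A photograph taken from almost the novel viewpoint gets nearly all the weight:
photo_dirs = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
w = blend_weights(np.array([0.999, 0.04, 0.0]), photo_dirs)
```

Blending per-pixel with such weights is what lets unmodeled surface detail and specular highlights shift plausibly as the virtual camera moves.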
A seminal demonstration of these techniques is The Campanile Movie (1997), a short film Debevec directed as a capstone to his doctoral research, featuring a virtual fly-around of the UC Berkeley Campanile tower and surrounding campus. Rendered from a model built using Facade and view-dependent textures derived from about 50 photographs, the animation showcased smooth, photorealistic navigation, including an aerial perspective from 250 feet that blended real imagery with synthetic camera paths. This work, presented at the 1997 SIGGRAPH Electronic Theater, influenced subsequent visual effects innovations, notably the "bullet-time" sequences in The Matrix (1999), by demonstrating how image-based rendering could create immersive, time-frozen viewpoints from static photos. The techniques have since informed broader applications in virtual heritage and architectural visualization.

High dynamic range imaging

Paul Debevec's pioneering work in high dynamic range (HDR) imaging began with the development of techniques to capture the full range of light intensities in real-world scenes, addressing the limitations of conventional cameras that compress luminance into low-dynamic-range images. In his seminal 1997 SIGGRAPH paper, co-authored with Jitendra Malik, Debevec introduced a method to recover radiance maps from a sequence of photographs taken at different exposure times using standard imaging equipment. This approach models the camera's response function and deconvolves it from pixel values to estimate scene radiance, enabling the creation of images that preserve details in both bright highlights and dark shadows. The technique laid the foundation for subsequent advancements in image-based lighting and realistic rendering by providing accurate measurements of environmental illumination. Debevec extended these HDR capture methods to image-based lighting, where radiance maps serve as environment maps to realistically illuminate and relight 3D scenes. In the 1999 SIGGRAPH technical paper and accompanying short film Fiat Lux, co-authored with colleagues including Yizhou Yu and Tim Hawkins, he demonstrated the integration of HDR radiance maps captured from real environments to light synthetic objects inserted into photographed scenes, such as monoliths and spheres placed within a photographed interior of St. Peter's Basilica. By using these high-fidelity light probes—often assembled from panoramic photograph sets—Debevec's system computed global illumination effects, including soft shadows and interreflections, allowing for physically accurate relighting without manual specification of light sources. This innovation bridged traditional ray-tracing with image-based rendering, enabling dynamic adjustments to scene lighting for enhanced photorealism in computer graphics applications.
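The exposure-merging step can be sketched briefly. This is a simplified illustration that assumes an already-linearized camera response; Debevec and Malik's paper additionally solves for the response curve, which is omitted here, and `merge_exposures` is a hypothetical helper name:

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """images: list of (H, W) linearized intensities in [0, 1]
    exposure_times: matching shutter times in seconds
    Returns an (H, W) relative radiance map."""
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        # Hat weighting: trust mid-range pixels, discount clipped ones
        w = 1.0 - np.abs(2.0 * img - 1.0)
        num += w * img / t     # this exposure's radiance estimate, weighted
        den += w
    return num / np.maximum(den, 1e-8)

# A pixel saturated in the long exposure is recovered from the short one:
scene = np.array([[0.2, 4.0]])                       # true relative radiance
images = [np.clip(scene * t, 0.0, 1.0) for t in (1.0, 0.1)]
hdr = merge_exposures(images, [1.0, 0.1])
```

Dividing each pixel by its exposure time and averaging with clipping-aware weights is what lets the merged map span a far wider intensity range than any single photograph.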
Debevec further consolidated and expanded the field through his co-authorship, with Erik Reinhard, Greg Ward, and Sumanta Pattanaik, of the 2005 book High Dynamic Range Imaging: Acquisition, Display, and Image-Based Lighting. The volume provides a comprehensive overview of HDR principles, with dedicated chapters on acquisition techniques like Debevec's radiance map recovery, tone mapping operators for displaying HDR content on low-dynamic-range devices, and practical applications in image-based lighting for rendering relit scenes. By synthesizing theoretical foundations with implementation details, the book has become a key reference for researchers and practitioners, influencing tools for HDR imaging in film production and real-time graphics.

Facial capture and Light Stage technology

Debevec introduced the Light Stage in 2000 as a pioneering device for acquiring the reflectance field of a human face, allowing for realistic relighting under novel illumination conditions. The original system moved light sources around the subject on a rotating gantry while cameras captured images of the face from multiple viewpoints under a dense set of controlled lighting directions, factoring out geometric and photometric effects to isolate surface reflectance properties. This approach enabled the separation of specular and subsurface reflection components through chromaticity analysis, producing relightable face models that demonstrated photorealistic results when rendered with environment maps. Building on this foundation, Debevec and his collaborators advanced the Light Stage to spherical configurations with dense arrays of RGB LEDs, such as Light Stage 3 in 2002, which surrounded the actor with 156 individually controllable lights to sample illumination across a wide range of directions efficiently. Later iterations, including Light Stage 5 with 156 white LEDs, further improved capture speed and resolution, facilitating the acquisition of detailed 4D reflectance fields (parameterized by view and light directions) for faces. These systems integrated high-dynamic-range imaging principles to handle the wide range of intensities encountered in facial reflectance measurements. To separate the reflectance components of skin, Debevec developed polarized variants of the Light Stage, which use linear polarizers on both light sources and cameras to disentangle diffuse (including subsurface) and specular components in a single capture pass. A key technique, polarized spherical gradient illumination, employs opposing gradient patterns on parallel- and cross-polarized LED spheres to simultaneously estimate per-pixel surface normals, diffuse albedo, and specular reflectance maps with sub-millimeter accuracy, capturing the translucent light transport effects critical for lifelike skin rendering.
This method, demonstrated on high-resolution scans, reduced acquisition time from hours to seconds while enabling relighting that accounts for interreflection and shadow bleeding across facial features. Debevec's Light Stage technologies extended to performance-driven facial animation by incorporating real-time structured light projection and multi-view stereo during actor performances, capturing dynamic geometry, surface normals, and reflectance simultaneously. In systems like Light Stage X, high-speed cameras and controllable LED spheres track facial deformations frame-by-frame, producing animation rigs with polynomial displacement maps that deform high-resolution scans based on performance data, achieving sub-millimeter accuracy in expressive sequences. These techniques allow for the synthesis of photorealistic facial expressions while preserving acquired reflectance data for consistent relighting across animations. For creating photoreal digital humans, Debevec's group leveraged the Light Stage to build universal facial models from thousands of scans, compiling a diverse database of over 4,000 high-resolution face captures spanning ethnicities, ages, and genders. These scans informed models that blend geometry, albedo, and reflectance properties to generate customizable digital actors, enabling efficient production of relightable characters with anatomically accurate subsurface and specular variations. The resulting models support applications in visual effects by providing a baseline for performance retargeting and environmental integration. Debevec's ongoing research at Netflix's Eyeline Studios continues to advance capture and relighting technologies. Recent projects include "DifFRelight," which uses diffusion models for performance relighting (presented at SIGGRAPH Asia 2024), and "Lux Post Facto," a conditional video diffusion-based relighting system (presented at CVPR 2025). These innovations build on Light Stage foundations by integrating generative models to enable efficient, high-fidelity relighting for visual effects and virtual production.
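The relighting principle behind the original reflectance-field work can be sketched compactly: because light transport is linear in illumination, a subject photographed once per stage light can be relit under any environment by a weighted sum of those basis images. The array shapes, synthetic data, and function name here are illustrative assumptions for the demo:

```python
import numpy as np

def relight(basis_images, env_weights):
    """basis_images: (N, H, W, 3) one-light-at-a-time photographs
    env_weights:  (N,) intensity of the novel environment sampled along
    each of the N stage-light directions
    Returns the (H, W, 3) relit image."""
    # Linearity of light transport: the relit image is a linear
    # combination of the basis photographs.
    return np.tensordot(env_weights, basis_images, axes=(0, 0))

# Demo with two tiny synthetic basis images: turning on only the first
# stage light reproduces its basis image exactly.
basis = np.stack([np.full((2, 2, 3), 0.8), np.full((2, 2, 3), 0.2)])
relit = relight(basis, np.array([1.0, 0.0]))
```

In practice the environment weights come from sampling an HDR light probe at each stage-light direction, which is what ties the Light Stage back to Debevec's image-based lighting work.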

Notable projects and applications

Pioneering films and demonstrations

Paul Debevec's pioneering films and demonstrations have showcased innovative applications of image-based modeling, rendering, and facial capture technologies. These works, often presented at SIGGRAPH conferences, highlight his contributions to photorealistic reconstruction and animation, bridging academic research with visual storytelling. One of Debevec's early demonstrations, The Parthenon (2004), is a short computer animation that visually reunites the ancient temple in Athens with its original sculptural decorations, which had been separated since the early 1800s following the removal of the Elgin Marbles. Created using image-based methods such as time-of-flight laser scanning, structured light scanning, photometric stereo, and inverse global illumination, the film reconstructs the site's geometry and reflectance properties under natural illumination conditions measured on-site. Photogrammetric modeling and Monte Carlo global illumination rendering enabled the production of 20 fully computer-generated shots, premiered at the SIGGRAPH 2004 Electronic Theater. This project demonstrated the potential of image-based techniques for virtual heritage reconstruction, influencing subsequent documentaries like NHK's The Parthenon: Treasure of Wisdom and Beauty (2004) and MacGillivray Freeman's film Greece: Secrets of the Past (2006). In 2008, Debevec led the Digital Emily project, which produced the first relightable digital human face model, capturing actress Emily O'Brien's likeness to create a photorealistic digital face that convincingly animates facial expressions. Using the Light Stage 5 system—a dome equipped with 156 LED lights for controlled illumination—along with stereo digital still cameras and polarization imaging, the team acquired high-resolution 3D geometry, texture maps, and reflectance data in a single scanning session on March 24, 2008. The resulting model, rigged with 33 facial expressions via Image Metrics' performance-driven technology, was debuted at SIGGRAPH 2008 and demonstrated realistic relighting under varying environmental conditions, advancing the field of digital human simulation.
This work bridged the uncanny valley by achieving unprecedented facial detail and expressiveness in an animated sequence. Building on this foundation, Digital Ira (2013) extended Debevec's research to real-time performance capture and relighting of a digital human, featuring volunteer Ira Rubenstein in a short demonstration film. The project integrated high-resolution facial scans from the Light Stage with video-based performance capture, using dense image correspondences and sparse feature matches to drive a blendshape model for dynamic geometry and reflectance. Captured expressions were relit in real time, allowing controllable viewpoints and illumination changes while maintaining photorealism even in close-up shots. Presented at SIGGRAPH 2013's Computer Animation Festival and Real-Time Live!, this demonstration highlighted scalable techniques for creating reproducible digital actors, influencing advancements in virtual production. A notable demonstration from this period was the 2013 Light Stage capture of President Barack Obama at the White House, using a mobile Light Stage to acquire high-fidelity geometry and reflectance data. The resulting model was relit and integrated into a 2014 short film, showcasing the technology's ability to simulate the President's likeness under diverse lighting conditions, from dramatic spotlights to natural environments. This project, a collaboration with the Smithsonian Institution and presented at SIGGRAPH 2014, highlighted applications in archival and educational media. More recently, in 2024, Debevec collaborated with filmmaker Niko Pueringer to recreate the historic sodium vapor matting process—a 1950s technique for producing clean compositing mattes—using modern digital tools in a short video demonstration. Debevec contributed by designing custom sodium vapor light sources and a beamsplitter-based setup with synchronized cameras and spectral filters to isolate the narrow sodium emission line (589 nm) from the rest of the visible spectrum, enabling accurate foreground capture and holdout mattes without spill.
The resulting composites demonstrated spill-free integration of subjects against complex backgrounds, reviving the process for contemporary filmmaking while underscoring its foundational role in early compositing. This work was presented as a poster at SIGGRAPH 2024.

Applications in major motion pictures

Debevec's Light Stage technology and associated relighting techniques have been licensed for use in numerous major motion pictures, enabling advanced visual effects through precise facial capture and illumination simulation. In 2009, the technology was licensed by the USC Stevens Center for Innovation to OTOY, facilitating its integration into production pipelines for creating photorealistic digital characters. Early applications included the Matrix trilogy (1999–2003), where Debevec's image-based modeling and rendering methods, building toward Light Stage principles, supported virtual cinematography for bullet-time sequences and environmental compositing. Subsequent films leveraged the Light Stage for digital doubles and enhanced VFX, such as Spider-Man 2 (2004), where it captured actor Alfred Molina's performance as Doctor Octopus, allowing seamless integration of CGI tentacles and underwater sequences with realistic skin and lighting details. Similarly, in Superman Returns (2006), the system scanned actors to generate high-fidelity digital assets, contributing to the film's aerial and composite shots by ensuring accurate subsurface scattering and relighting of facial geometry. King Kong (2005) employed Light Stage scanning to bolster the realism of motion-captured apes and human characters in dynamic jungle environments, refining VFX pipelines for creature integration. The technology reached new heights in The Curious Case of Benjamin Button (2008), where Light Stage systems captured Brad Pitt's performance to create the digitally aged human protagonist, enabling relighting and integration across decades of visual transformations in collaboration with Digital Domain. The technology's polarized gradient illumination variant, which achieves sub-millimeter accuracy in facial geometry capture, was pivotal in Avatar (2009), where it digitized performers like Sam Worthington and Zoë Saldana to create Na'vi avatars, enabling relighting under Pandora's bioluminescent conditions during production at Weta Digital.
In The Avengers (2012), Light Stage processes supported the assembly of ensemble digital elements, including hero close-ups and battle composites, by providing relightable performance data that matched on-set lighting to virtual environments. These applications have profoundly influenced virtual production and digital doubles in contemporary cinema, standardizing Light Stage-derived methods for on-set capture and previewing of CG integrations. By allowing actors' performances to be relit post-capture without rescanning, the technology streamlines workflows in films reliant on heavy VFX, reducing production time while preserving realism in hybrid live-action/CG scenes.

Awards and honors

Academy and Emmy recognitions

In 2010, Paul Debevec received a Scientific and Engineering Award from the Academy of Motion Picture Arts and Sciences, shared with Tim Hawkins, John Monos, and Mark Sagar, for the design and engineering of the Light Stage capture devices and the image-based facial performance capture technique. This recognition highlighted the system's role in enabling realistic facial animations, as demonstrated in films like The Curious Case of Benjamin Button. Debevec earned a second Academy honor in 2019, a Technical Achievement Award shared with Timothy Hawkins, Wan-Chun Ma, and Xueming Yu, for inventing the Polarized Spherical Gradient Illumination facial appearance capture system. The technology uses a spherical array of LED lights to generate polarized gradient illumination, allowing for the rapid measurement of facial geometry and reflectance, which enhances the fidelity of digital character rendering in visual effects. In 2022, Debevec was awarded the Charles F. Jenkins Lifetime Achievement Award by the Television Academy at the 74th Engineering, Science & Technology Emmy Awards, recognizing his pioneering contributions to digital human creation and innovations in movie magic. The honor specifically acknowledged his foundational work in high dynamic range imaging, image-based lighting, and facial capture, which have transformed visual effects in film and television.

Other professional awards

In 2001, Paul Debevec received the inaugural ACM SIGGRAPH Significant New Researcher Award for his creative and innovative work in the field of image-based modeling and rendering. The following year, in 2002, he was named one of the top 100 young innovators by MIT Technology Review's TR100 list, recognizing his contributions to advanced imaging techniques. These early career honors highlighted Debevec's foundational research in computer graphics and imaging, which laid the groundwork for subsequent advancements in visual technology. In 2017, Debevec was awarded the SMPTE Progress Medal for his achievements and ongoing work in pioneering techniques for illuminating computer-generated objects based on measured real-world lighting. In 2023, Debevec received the Research Impact Award from the Conference on Visual Media Production (CVMP) for his pioneering research contributions to visual media production.

References

  1. [1]
    Paul Debevec Bio
    Paul Debevec is the Director of Research, Creative Algorithms and ... SMPTE Progress Medal in 2017 in recognition of his achievements and ongoing ...
  2. [2]
    Paul Debevec - USC Institute for Creative Technologies
    USC Viterbi School of Engineering Computer Science DepartmentPaul Debevec is a research professor of computer science at the University of Southern California ...
  3. [3]
    Transforming Hollywood Visual Effects with Graphics and Vision
    Sep 5, 2023 · Paul's work in technology for visual effects and virtual production has been recognized with two Academy Awards for Scientific and Technical Achievement, the ...
  4. [4]
    Paul Debevec Home Page
    Lifetime Achievement Emmy Award ; Research at Netflix Eyeline Studios 2021- ; Lux Post Facto: Performance Relighting with Conditional Video Diffusion CVPR 2025Bio · High Dynamic Range Imagery · St. Paul's Cross 1997 · Immersion '94 1994
  5. [5]
    Scientific & Technical Awards 2018 | 2019 - Oscars.org
    Feb 9, 2019 · To Paul Debevec, Timothy Hawkins and Wan-Chun Ma for the invention of the Polarized Spherical Gradient Illumination facial appearance ...
  6. [6]
  7. [7]
    Paul Debevec - USC Today
    Expert in digital visual effects, virtual reality, digital photography, holograms, video doctoring and virtual actors. Associate Director, Graphics Research ...
  8. [8]
    FMX 2025 Program Chair
    Nov 14, 2024 · FMX is happy to announce as Program Chair a true master in VFX and virtual production: Paul Debevec, Chief Research Officer at Netflix's Eyeline Studios.
  9. [9]
    Pixel Perfect | The New Yorker
    Apr 21, 2014 · He grew up in Illinois, the only child of a physics professor and a psychiatric social worker, and was a fan of special-effects-filled movies ...
  10. [10]
    Alum Paul Debevec honored with Emmy Lifetime Achievement ...
    Nov 17, 2022 · Debevec grew up in Illinois. His father was a professor of nuclear physics at the University of Illinois, which partly inspired Debevec's ...Missing: childhood | Show results with:childhood
  11. [11]
    Paul Debevec - Class of 1988 | University Laboratory High School
    Paul Debevec - Class of 1988. Image. Paul Debevec. debevec. Paul Debevec, a member ... SMPTE Progress Medal (2017). Current Occupation. Director of Research ...
  12. [12]
  13. [13]
    Ph.D. Dissertations - 1996 - UC Berkeley EECS
    Modeling and Rendering Architecture from Photographs Paul E. Debevec [advisor: Jitendra Malik]. Modeling and Simulation of the Automated Highway SystemMissing: thesis | Show results with:thesis<|separator|>
  14. [14]
    Paul Debevec to receive Emmy for Lifetime Achievement
    Jul 29, 2022 · EECS alumnus Paul Debevec (Ph.D. '96, advisor: Jitendra Malik) will receive the Charles F. Jenkins Lifetime Achievement Award at the Television Academy's 74th ...
  15. [15]
    50 Years in Artificial Intelligence - Institute for Creative Technologies
    Apr 18, 2024 · Paul Debevec, a rising star in computer graphics, joined us from UC Berkeley along with Tim Hawkins, Christopher Tchou, and Jonathan Cohen ...
  16. [16]
    Paul Debevec - Facebook
    Starting today, Scanline VFX - Powered by Netflix and Eyeline Studios officially unite under a single name: Eyeline. Together, we're continuing our legacy of ...
  17. [17]
    Man of a thousand faces - Berkeley Engineering
    Sep 8, 2010 · As director of the Graphics Lab at the University of Southern California (USC)'s Institute for Creative Technologies, Debevec helped develop ...
  18. [18]
    Advanced Computing at USC is Everywhere
    ... graphics, AI and mixed reality. The space houses the Vision & Graphics Lab, where USC's Paul Debevec won an Oscar for creating characters for films such as ...
  19. [19]
    USC's Paul Debevec's Role in The Matrix, Avatar, Gravity & More
    Oct 29, 2013 · Debevec leads the graphics laboratory at the University of Southern California's Institute for Creative Technologies, and is a research ...
  20. [20]
    Paul E. Debevec - ACM SIGGRAPH HISTORY ARCHIVES
    Paul Debevec earned degrees in Math and Computer Engineering at the University of Michigan in 1992 and completed his Ph.D. at the University of California at ...
  21. [21]
    The Paul Debevec long read - befores & afters
    Apr 4, 2024 · At SIGGRAPH Asia 2023 in Sydney, Debevec, who is now chief research officer at Eyeline Studios, presented an incredible historical look back at ...
  22. [22]
    Netflix Hires Paul Debevec: USC Researcher to Lead AI, Graphics ...
    Jul 15, 2021 · His 1996 doctoral thesis with Prof. Jitendra Malik presented Façade, an image-based modeling system for creating virtual cinematography of ...
  23. [23]
    Netflix Hires Researcher Paul Debevec to Oversee Emerging ...
    Jul 15, 2021 · He will additionally continue his responsibilities as an adjunct research professor of computer science at USC's Viterbi School of Engineering, ...
  24. [24]
    [PDF] Modeling and Rendering Architecture from Photographs: A hybrid ...
    Our approach can be used to recover models for use in either geometry-based or image-based rendering systems. We present results that demonstrate our ap-.Missing: bidirectional scattering<|control11|><|separator|>
  25. [25]
    [PDF] Modeling and Rendering Architecture from Photographs by Paul ...
    This thesis presents an approach for modeling and rendering existing architectural scenes from sparse sets of still photographs. The modeling approach, which ...Missing: life childhood
  26. [26]
    The Campanile Movie - Paul Debevec
    This page provides images and links relating to the Campanile movie, a short film directed by Paul Debevec made in the spring of 1997 that used image-based ...
  27. [27]
    Why the Campanile movie, 25 years on, is still so important - fxguide
    Oct 5, 2022 · In 1997, Paul Debevec presented The Campanile Movie at the SIGGRAPH Electronic Theatre. This landmark short film was done as a capstone to Paul's Ph.D. work.Missing: roles | Show results with:roles
  28. [28]
    Recovering high dynamic range radiance maps from photographs
    We present a method of recovering high dynamic range radiance maps from photographs taken with conventional imaging equipment.
  29. [29]
    [PDF] Recovering High Dynamic Range Radiance Maps from Photographs
    We present a method of recovering high dynamic range radiance maps from photographs taken with conventional imaging equip-.
  30. [30]
  31. [31]
    High Dynamic Range Imaging - ScienceDirect.com
    Covers all the areas of high dynamic range imaging including capture devices, display devices, file formats, dynamic range reduction, and image-based lighting.Missing: 2005 | Show results with:2005
  32. [32]
    Acquiring the reflectance field of a human face - ACM Digital Library
    We present a method to acquire the reflectance field of a human face and use these measurements to render the face under arbitrary changes in lighting and ...
  33. [33]
    [PDF] The Light Stages and Their Applications to Photoreal Digital Actors
    Paul Debevec∗ ... Although only a 2D process, the results exhibit correct diffuse and specular reflection, subsurface scattering, self-shadowing, ...
  34. [34]
    The Parthenon - ICT Vision & Graphics Lab
    The Parthenon is a short computer animation which visually reunites the Parthenon and its sculptural decorations, separated since the early 1800s.
  35. [35]
    The Parthenon | ACM SIGGRAPH 2004 Computer animation festival
    The Parthenon. Author: Paul Debevec, USC Institute for Creative ... SIGGRAPH '04: ACM SIGGRAPH 2004 Computer animation festival. Page 188.
  36. [36]
    The Digital Emily Project - ICT Vision & Graphics Lab
    Emily came by our institute to get scanned in our Light Stage 5 device on the afternoon of March 24, 2008. The image to the left shows Emily in the light stage ...
  37. [37]
    The Digital Emily project: photoreal facial modeling and animation
    This course describes how high-resolution face scanning, advanced character rigging, and performance-driven facial animation were combined to create Digital ...
  38. [38]
    Digital ira | ACM SIGGRAPH 2013 Computer Animation Festival
    Video performance capture drives a facial blendshape model made from high-resolution facial scans, compressed and realistically rendered in a reproducible ...
  39. [39]
    Driving High-Resolution Facial Scans - ICT Vision & Graphics Lab
    Digital Ira: Creating a Real-Time Photoreal Digital Actor, SIGGRAPH Realtime Live, 2013. Light Stage 5. Temporal Upsampling of Performance Geometry using ...
  40. [40]
    Recreating the Sodium Vapor Matting Process - ACM Digital Library
    Jul 25, 2024 · ... Debevec & Pueringer SIGGRAPH 2024 Poster "Recreating the Sodium Vapor Matting Process".
  41. [41]
    Lightstage LLC | Intellectual Property & Industry Research Alliances
    Light Stage systems efficiently capture how an actor's face appears when lit from every possible lighting direction.
  42. [42]
    Light Stage - USC Stevens Center for Innovation
    The Light Stages have been used to create photo-real digital actors in films such as “Spider-Man 2,” “The Curious Case of Benjamin Button,” “Avatar,” “The ...
  43. [43]
    The 82nd Scientific & Technical Awards 2009 | 2010 - Oscars.org
    Feb 20, 2010 · To Paul Debevec, Tim Hawkins, John Monos and Dr. Mark Sagar for the design and engineering of the Light Stage capture devices and the image ...
  44. [44]
    74th Engineering, Science & Technology Emmy Awards
    Sep 30, 2022 · Dr. Paul E. Debevec received the Charles F. Jenkins Lifetime Achievement Award, which honors a living individual whose ongoing contributions ...
  45. [45]
    Significant New Researcher Award - ACM SIGGRAPH
    ACM SIGGRAPH is pleased to present the 2024 Significant New Researcher Award ... 2001 Paul E. Debevec.
  46. [46]
    2002 | MIT Technology Review
    Paul Debevec. Entertainment · Shawn Fanning. Internet and Web · Justin Frankel. Internet and Web · Vinay Gidwaney. Software · Robert Guttman. Software · Ramesh ...
  47. [47]
    SMPTE Progress Medal Historical List Recipients | Society of Motion ...
    Paul E. Debevec In recognition of his achievements and ongoing work in pioneering techniques for illuminating computer-generated objects based on ...
  48. [48]
    SMPTE Announces Annual Awards Recipients for 2017
    Aug 23, 2017 · SMPTE is presenting the 2017 Progress Medal to Paul E. Debevec in recognition of his achievements and ongoing work in pioneering techniques ...