Program Abstracts

Posted on Aug 19, 2017



Opening Reception and Screening
La Vérité
New restoration courtesy of Sony Pictures Repertory

Reception:  6:30pm
Screening: 7:30pm





Recovering Early Optical Sound:
Joseph Tykociner’s 1922 Composite Sound-on-Film System
Joshua Harris, University of Illinois at Urbana-Champaign
Simon Lund, Cineric, Inc.

Simon Lund of Cineric will present recent work handling motion picture sound tracks. These efforts include the removal of cross-modulation distortion through post-processing of audio files, wet-gate scanning of optical soundtracks, and the handling of non-standard track configurations.

As a case study, Joshua Harris, Head of the Media Preservation Unit at the University of Illinois, will present the results of a recent National Film Preservation Foundation (NFPF) funded project to fully preserve and decode audio from the first successful public demonstration of a composite sound-on-film system. Reproduction of UI Professor Joseph Tykociner’s 1922 demonstration had been attempted only once, in the 1950s, and that attempt failed due to many of the same challenges Cineric encountered recently, including nonstandard track placement and variable frame rates. Tykociner’s films presented a variety of challenges in capturing, preserving, and reproducing these artifacts of early motion picture technological innovation.

The Digital Post-It: Cataloguing Unstructured Metadata for Preservation in Distributed Databases Using Open Standards
Randal Luckow, Home Box Office
Steve Kochak, Digital Preservation Laboratories

Migrating physical elements to homogeneous file-based structures can often leave out important pieces of legacy “physical metadata” that may prove important sometime in the future. New metadata regarding the decisions preservationists made at a given point in time is often generated but rarely documented. Associating these new categories of metadata with digital image and audio files can be problematic.

Meanwhile, as preservationists increase the number of metadata fields, databases are quickly becoming burdened with expansive and redundant metadata that was never planned for, leading to slow searches and new challenges for database backups and archiving. Systems and workflows often become “tightly coupled”, so that one aspect of the system is heavily dependent on another to function. What if a central database could be archived in a distributed format and associated with the preservation essence? What open standards are available that could allow us to do this? Finally, if metadata were encapsulated in preservation objects, what would the ramifications be for the overall portability of an archive?

Restoring The Ballad of Gregorio Cortez:
The Result of a Long Partnership Between an Archive and a Festival

Josef Lindner, Academy Film Archive

In 2002, the Academy Film Archive began collaborating with the Los Angeles Latino International Film Festival, with the aim of presenting at least one archival screening each year at the festival. The initial project had been brought to the Academy by Professor José Sanchez-H of California State University, Long Beach: Mina Alaska (1968), by the legendary Bolivian filmmaker Jorge Ruiz. In the years that followed, the Archive and LALIFF presented multiple features, some as new 35mm prints, like the Oscar-nominated Macario (1960), and some as new archival preservations, including Cita en la Frontera (Argentina, 1940) and Los Pequeños Gigantes (Mexico, 1960).

This last title had great significance to both the Academy and Edward James Olmos, the co-founder of LALIFF. For the Academy, the elements had been donated by the actress Jean Rouverol, the widow of Hugo Butler. Butler had an established connection to the Academy: he had been nominated for an Academy Award for the screenplay of Edison, the Man (1940), and he would later present his father Frank Butler with the Oscar for the screenplay of Going My Way (1944). Subsequently, Hugo Butler fell victim to the Hollywood blacklist and had to move to Mexico to find work. Under the name Hugo Mozo he made Los Pequeños Gigantes, which then made its way to an East Los Angeles theater. At the festival screening, Edward James Olmos described how moved he was to see young Spanish-speaking characters actually portrayed on the big screen.

All of this led to the idea of screening the seminal film The Ballad of Gregorio Cortez (1982). While Olmos had his own personal print, it had been heavily used and was fairly worn. The initial research suggested this would be a difficult photochemical restoration, as the film had been shot in Super 16mm but only conformed in 35mm. As no 35mm preprint element could be located or accessed by the Archive, the project languished for many years. By 2015, digital technology made the reconstruction of the film from the Super 16mm technically feasible, and it became financially possible with a partial grant from the National Endowment for the Arts. The 36 rolls / 16 hours of unconformed original Super 16mm negative were transferred in HD, and using Mr. Olmos’ personal print as a guide, the individual shots could be scanned in 2K. With the original 35mm mag track as a source, the audio restoration was relatively easy, and thus the film could finally be seen in its original theatrical quality.

This presentation will cover both the specific challenges of restoring The Ballad of Gregorio Cortez as well as the collaborative work on Latino cinema that finally led to the film’s resurrection.

Artificial Intelligence for Automatically Repairing Vertical Scratches
Mike Inchalik, PurePix Images
Alexander Petukhov, University of Georgia
Inna Kozlov, Algosoft Tech USA

We are dedicated to inventing and commercializing fully automated, high-quality restoration technologies that make it practical to restore the millions of hours of moving images archived around the world before they become unrecoverable. While today’s manual and even semi-automated tools can deliver exceptional results, their enormous cost in funds, time, and skilled manpower makes them impractical for more than a small fraction of titles around the globe.

Digital scratch repair has traditionally been one of the most difficult restoration tasks, often requiring extensive human inspection to verify what is and is not a scratch, and often requiring a skilled artist to select and paint in the pixels used for the repair. In the past, automated approaches to scratch repair have had only limited success: scratches can be intermittent, their look and intensity can vary greatly even within the same long scratch, and they can be confused with real vertical lines and edges in the image.
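To make the difficulty concrete, a deliberately naive automated detector of the kind described above can be sketched in a few lines. This is an illustrative heuristic of our own, not the presenters’ neural-network method; the function name and thresholds are invented for the example. It flags columns whose mean brightness deviates sharply from a median-filtered estimate of the column profile:

```python
import numpy as np

def scratch_columns(frame: np.ndarray, window: int = 15,
                    thresh: float = 20.0) -> np.ndarray:
    """Return indices of columns whose mean brightness deviates from a
    median-filtered (scratch-free) estimate of the column profile."""
    col_means = frame.mean(axis=0)
    pad = window // 2
    padded = np.pad(col_means, pad, mode="edge")
    # running median over the column profile approximates the background
    background = np.array([np.median(padded[i:i + window])
                           for i in range(col_means.size)])
    return np.where(np.abs(col_means - background) > thresh)[0]

# Synthetic 64x64 frame with a bright one-pixel vertical scratch at column 20
frame = np.full((64, 64), 100.0)
frame[:, 20] = 230.0
cols = scratch_columns(frame)
```

A heuristic like this finds the synthetic scratch, but it would also flag any genuine vertical edge in the picture and miss intermittent or faint scratches, exactly the failure modes that motivate the learned approach described in this abstract.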

In our presentation, we will describe a revolutionary and automated image processing approach to detecting and repairing different types of vertical scratches. During this last year of research, we discovered a way to apply and extend state-of-the-art algorithms from the fields of “Big Data” and “Deep Learning Neural Networks” to deliver a unique and remarkably better approach to restoring scratched archival material. We will show how Neural Networks can be designed and “taught” to automatically tune all of the key parameters required to accurately recognize vertical image scratches.

We will discuss the development process, because while the end results are very encouraging, there were many false starts, challenges, and dead ends along the way. Among the issues we will cover: the lack of “absolutely correct data” (what really was behind the scratch), errors in the training database, and even finding the right neural network topology for thin scratches all presented real complications.

In the end, what did we learn? Even after choosing an appropriate deep learning neural network model and training it across an enormous amount of corrupted footage with different types of scratches, we found that these AI (artificial intelligence) technologies alone were not good enough. We needed to combine them with wavelet-based technology to deliver outstanding quality improvement and dramatically reduce restoration artifacts.

We believe that the power and effectiveness of these new technical advancements, used in combination, give real hope that the gap between the capabilities of manual and of unsupervised, automated digital film restoration is narrowing quickly. During the presentation, we will provide an overview of the technology and share before-and-after results.


HDR Video Mastering for Classic Cinema
Bill Baggelaar, Sony Pictures Entertainment

With the advent of HDR TV and the wave of new theatrical and television titles that are being mastered in this new format, it is only a matter of time before we turn our eyes towards re-envisioning our classic titles in this new medium.

While we have been working on the development of High Dynamic Range for several years now, mainly for in-home television viewing and more recently for the cinema, there have been many misunderstandings about what exactly HDR means and how it relates to content, especially classic cinema.

With an expanded palette of color and light, HDR technologies allow us to bring out more detail that was in the original negatives. The new HDR masters can represent values in the original film that could not previously be expressed due to the limitation of film and television distribution technologies. HDR technology provides a visual experience that can more closely approximate – on today’s screens and for today’s viewers – the look and feel of an original theatrical presentation.

This does not necessarily mean that the images are “brighter”; rather, HDR expands overall contrast, with better black levels and shadow detail as well as highlights that can be used for effect. Maintaining saturation at bright levels, and deep color saturation more in line with film, are among the main advantages for classic film titles.


The Troop: Redux – An ACES Reformatting and Archiving Case Study Project
Marcus Dillistone, Film Director
Andy Maltz, Science and Technology Council, AMPAS
Michael Pogorzelski, Academy Film Archive

The Academy Color Encoding System (ACES) is widely used in motion picture, television, gaming, and other content creation workflows to maintain color consistency and to provide a broad visual palette that supports new technologies such as high dynamic range displays. ACES also includes standards and best practices for the long-term archiving of digital content. In 2016, the Academy Science and Technology Council and the Academy Film Archive began a project to explore the implications of reformatting film-originated content in an ACES context, as well as to provide a proof of implementation for the new ACES archiving standards being published by SMPTE. The content for this project is ‘The Troop’, a 1999 short that received a Royal Premiere. The Troop conveys the kinetic magnificence of the British Army’s horse-drawn saluting battery, The King’s Troop.

This session will cover specifics about the ACES-based workflow, film scanning, color correction for standard and high dynamic range displays, and SMPTE/ACES archiving specs and tools.


Capacitance Cavalcade:
The Sorrowful Spinning Saga of RCA’s SelectaVision CED Videodisc

Jayson Wall, N-I Broadcasting

In the early-to-middle part of the 20th century, RCA was one of the most forward-thinking companies around. With groundbreaking advancements in radio, electric turntables, and television, RCA was in many ways the Apple Computer of its day, bringing innovative modern technology into the households of America. As color television slowly took hold in the Eisenhower-era homes of the late 1950s, RCA was determined to introduce the “next big technology” to the public, and landed on the idea of “personalized television”. As a huge manufacturer of LPs and 45s, RCA began development of an economical system that would allow consumers to play audio and video from vinyl discs on their televisions.

After 23 years of sporadic and unsuccessful attempts, RCA finally took consumers back to Jurassic-age technology in 1981, investing hundreds of millions of dollars into a system based on the gramophone. Born at a time when VCR sales were exploding, the CED (Capacitance Electronic Disc) VideoDisc system quickly turned out to be the wrong video system, for the wrong market, at the wrong time. Retail sales were much lower than expected, and after three years on the market the format entered the media graveyard. It was a disaster of epic proportions on the level of the Edsel or “New” Coke, resulting in a $600 million write-off for RCA; but contrary to popular belief, it was not responsible for the downfall of the company.

Today the RCA CED VideoDisc is all but forgotten, outside of the broken machines and thousands of CED discs littering thrift store shelves. This presentation will take an entertaining look at the rise and fall of the RCA VideoDisc, along with the historical comedy of errors caused by the revolving door of RCA leadership as they delivered the industry’s first huge home entertainment format flop… first in MONO, then in STEREO. Footage played directly from a CED disc on an original player may also be presented.

Analyzing Image Bit Depth in Digital Archive Deliverables
Sean Vilbert, Paramount Digital Archive
Miki Fukushima, Paramount Digital Archive

One of the most crucial technical attributes of every digital image we archive is bit depth. Bit depth defines the number of bits allocated to each pixel. In computing, each bit is either 0 or 1; the larger the number of bits, the more shades of color each pixel can represent. By this calculation, an 8-bit image can hold 256 tones (2, the number of possible bit values, raised to the 8th power, the number of bits). A 16-bit image can hold 65,536 tones (2 to the 16th), and so on. However, this does not necessarily mean that a pixel utilizes the entire 256- or 65,536-tone range; 256 or 65,536 is just the size of the container. So the question we want to ask is: how many of those tones are actually used to construct the details of captured objects, and how many are just noise?

In order to investigate further, we decided to analyze color tone usage with our customized toolsets by iterating over each pixel in an image and isolating its color into red, green, and blue channels. This way, the information in each pixel is translated into x, y coordinates and three values representing red, green, and blue. After extracting each color value, we converted it into binary. The output at this stage is either 0 or 1 for each bit layer of a single pixel: 0 means there is no information, 1 means there is. The last step is to visualize this binary output to make it accessible to humans. As an 8-bit container is compatible with most commonly used display monitors, we mapped the value 0 to 0 and the value 1 to 255, the maximum shade of an 8-bit container. The reasoning behind this particular mapping strategy is to emphasize the existence of information in each bit layer of a single pixel.

In our presentation, we will demonstrate our case studies and show the output of the above experiments using some of our actual Paramount digital image deliverables with different technical attributes (e.g., digitally produced digital intermediates in LOG encoding vs. linear P3, born-digital assets vs. digitized film-born assets). The aim of this presentation is not only to determine the best bit depth for different digital preservation scenarios, but also to show other archives an open-source way to run the same kinds of tests, ensuring we capture the most information in the course of our archiving work.
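The decomposition described above can be sketched in a few lines (a minimal illustration under our own naming, not Paramount’s actual toolset): split one channel into bit planes and remap each plane 0 to 0 and 1 to 255 so it can be viewed on an ordinary 8-bit display.

```python
import numpy as np

def bit_planes(channel: np.ndarray, depth: int = 8) -> np.ndarray:
    """Decompose a single-channel image into its bit planes.

    Returns an array of shape (depth, H, W) where plane i holds the
    i-th bit of every pixel, remapped 0 -> 0 and 1 -> 255 so each
    plane is viewable as an ordinary 8-bit image.
    """
    planes = np.empty((depth,) + channel.shape, dtype=np.uint8)
    for i in range(depth):
        # shift bit i into the lowest position, mask it, scale to 0/255
        planes[i] = ((channel >> i) & 1) * 255
    return planes

# Example: a 2x2 "red channel" with values spanning the 8-bit range
red = np.array([[0, 1], [128, 255]], dtype=np.uint8)
planes = bit_planes(red)
# planes[0] is the least significant bit plane, planes[7] the most
```

A plane that looks like pure salt-and-pepper noise suggests that bit layer carries no picture information, which is exactly the distinction between used tones and noise that this abstract sets out to measure.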



New restoration from Criterion,
courtesy of MGM/Park Circus
Lee Kline, Criterion




FILMIC Virtualization Model for Digital Motion Picture Film Preservation: Harvesting More That Is FILMIC in Digital Film Preservation
Jim Lindner, Media Matters LLC

The FILMIC project is an international, multi-disciplinary, and open research project. The goal of the project is to create preservation quality virtual data representations of motion picture film as digital objects.  The virtual representation of motion picture film as data includes condition and physical metadata in addition to scanned image information that is based on spectral curves instead of only RGB.  The FILMIC project is:

  •  A re-think of Digital Film Preservation that provides a clear path for future innovation over a very long time horizon
  •  An investigation of film content preservation in purely digital terms
  •  A collaborative effort offering opportunities for:
      •  Archives
      •  Vendors (both manufacturers and service providers)
      •  Scholars and researchers
      •  Stakeholders who have high-value content

The presentation will briefly discuss the overall project but mainly focus on accomplishments and status in several projects now underway.


Active Digital Preservation: Combining Digital Asset
and Preservation Management Workflows
Linda Tadic, Digital Bedrock

Digital content must be actively managed to be preserved for future use. Formats and software come and go; there is no “store and ignore” storage medium, so files must be migrated to new media over time; and the bit health of any digital asset requires scheduled checks. In addition, all of this work must be supported by complex metadata.

The Open Archival Information System Reference Model (OAIS) (ISO 14721) is the current data preservation workflow standard. It provides recommendations on how to create “packages” of data so that digital content can be preserved and understood over time, but its application in the archives and libraries space has been to create relatively static packages. Data packages are created, set aside with scheduled fixity checks, and only retrieved when needed. But data is preserved because it needs to be used, and the supporting technical environment can (and will) change over time, meaning the package so carefully created today might not be usable in the future due to obsolescence. To be future-proofed, one must track metadata on the digital object’s purpose, as well as its relationships with other files. Obsolescence vulnerability factors must be monitored over time, alongside bit-level preservation. The most efficient means of managing this information is a detailed database, where every bit of metadata about the digital files is indexed and events are tracked.

This presentation will describe key problems with preserving digital media, and offer a new data management approach.

Problems in preserving digital content include:

  • Changing formats, software, and related obsolescence factors
  • Storage and media migration as storage nears end of life
  • Managing relationships between files
  • Tracking provenance
  • Managing complex metadata on file creation and characteristics
  • Running scheduled fixity checks indefinitely
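The last item, scheduled fixity checking, reduces to recomputing a file’s checksum and comparing it against the stored value. A minimal sketch (illustrative only, not Digital Bedrock’s implementation; names are ours):

```python
import hashlib
from pathlib import Path

def fixity_check(path: Path, expected_sha256: str) -> bool:
    """Recompute a file's SHA-256 and compare it to the stored value.

    A scheduled job would run this over every managed file and flag
    any mismatch for repair from a replica copy.
    """
    h = hashlib.sha256()
    with path.open("rb") as f:
        # read in 1 MiB chunks so arbitrarily large files fit in memory
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256

# Example: write a tiny file and verify it against its known digest
sample = Path("sample.bin")
sample.write_bytes(b"preserve me")
digest = hashlib.sha256(b"preserve me").hexdigest()
ok = fixity_check(sample, digest)
```

In a production system the expected digest would come from the preservation database, and each check would itself be logged as an event on the file.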

Digital Bedrock presents one approach to solving these problems. This service provider performs managed digital preservation for clients, through a system developed to be flexible enough to accept all types of data in any structure. The result is a hybrid, OAIS-compliant digital asset management application that accommodates both traditional preservation workflows (OAIS) and managed digital asset management actions. The system will be demonstrated, including illustrations of:

  • Ingesting and managing DPX files as individual frames (not tarballs)
  • Extensive technical and embedded metadata extraction. This unstructured data becomes structured, and therefore indexable, when ingested into the system. Clients can also search this data on their individual portals, enabling them to find specific files based on color space, camera used, frame rate, etc.
  • Obsolescence vulnerability checks. The Digital Object Obsolescence Database (DOOD) (patent pending) monitors vulnerabilities in formats, codecs, software, operating systems, and hardware.
  • Metadata on file relationships, original creation environments, and events on the files (request, delivery, transcode, fixity check, etc.)
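Frame-level DPX ingest of the kind listed above begins with inspecting each file’s header. A minimal sketch of reading the SMPTE 268M magic number and image-data offset (illustrative only, not Digital Bedrock’s code):

```python
import struct

def dpx_endianness(header: bytes) -> str:
    """Return the byte order indicated by a DPX header's magic number.

    Per SMPTE 268M, the first four bytes are ASCII 'SDPX' for a
    big-endian file and 'XPDS' for a little-endian one.
    """
    if header[:4] == b"SDPX":
        return "big"
    if header[:4] == b"XPDS":
        return "little"
    raise ValueError("not a DPX file")

def dpx_image_offset(header: bytes) -> int:
    """Read the 32-bit offset-to-image-data field that follows the magic."""
    fmt = ">I" if dpx_endianness(header) == "big" else "<I"
    return struct.unpack(fmt, header[4:8])[0]

# Example: a synthetic big-endian header whose image data starts at byte 8192
header = b"SDPX" + struct.pack(">I", 8192)
offset = dpx_image_offset(header)
```

The many other header fields (version string, film and television descriptors, per-element packing) are where the embedded metadata described above lives; a real ingest pipeline would parse and index all of them.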

As part of this demonstration, an excerpt from This is Cinerama (1952) will be screened, followed by a display of how the file is indexed and represented in the Digital Bedrock system.

Case Study: Restoring Behind the Door (1919)
Robert Byrne, San Francisco Silent Film Festival

Irvin Willat’s infamous 1919 film Behind the Door is relatively unknown even to the most devoted movie fans. Reviewers at the time of its release described the brutal tragedy as “a tremendous drama powerful in its perfection and production quality,” “exceptional in every detail,” “an opus in brutality,” and “an intermezzo of gory revenge.” A century later, the few who have seen the surviving elements of the film are no less enthusiastic. Historian Kevin Brownlow called the film “the most outspoken of all the [WWI] vengeance films,” while the Cinefest catalog described it as “one of the most beautiful and eerie photoplays.”

Until last year, Behind the Door was notoriously difficult to see, and impossible to see in its complete form. The physical remains of the Thomas Ince production consist of a single incomplete set of edited tinting rolls, a small selection of outtakes, a small roll of shots from the estate of actor Hobart Bosworth, and a previously inaccessible Russian-language version held by Gosfilmofond, the Russian national film archive. That changed when the San Francisco Silent Film Festival came to an agreement with the Library of Congress and Gosfilmofond of Russia to work together to combine all of the known film sources into a definitive reconstruction and restoration of the film.

The project to restore Behind the Door faced technical challenges on nearly every front, including: reconstructing the original continuity, interweaving elements from the disparate sources, creating replacement intertitles, bridging narrative gaps, replicating Irvin Willat’s intricate color scheme, and the usual challenges of editing, image restoration, grading, etc.

The technical processes employed a digital intermediate workflow, with the final result culminating in a new 35mm preservation negative, 35mm prints, and the usual digital renderings. Along the way, no small amount of historical research and technical creativity was required to faithfully restore Irvin Willat’s original vision to the screen.

A 20th Anniversary Salute to the National Film Preservation Foundation
Jeff Lambert, National Film Preservation Foundation

The National Film Preservation Foundation is the nonprofit organization created by the U.S. Congress to help save America’s film heritage. The NFPF supports activities nationwide that preserve American films and improve film access for study, education, and exhibition. It has supported film preservation in cultural institutions in all 50 states, the District of Columbia, and Puerto Rico, efforts that have resulted in preserving more than 2,350 historically and culturally significant films. Films preserved through NFPF programs range from Thomas Edison one-reelers, industrial films, and cartoons to home movies, silent-era films, and avant-garde animation. The Reel Thing joins with NFPF Executive Director Jeff Lambert to present an overview of the foundation’s work over the last twenty years, including screening selections from films that have been preserved and a conversation about the foundation’s current and future activities.

The 1920s Western Electric Sound Recording System’s Role in Developing the Architecture for Modern Film Sound
Nicholas Bergh, Endpoint Audio Labs

The recent PBS American Epic Sessions television special provided an opportunity to demonstrate the first electric sound recording system developed by Western Electric. Although the American Epic series only discussed the system in the context of the music recording industry, this same electric recording system was also responsible for the sound revolution in the motion picture industry starting in 1926. After showing a clip from American Epic Sessions demonstrating the sound of the restored working system, this presentation will discuss its parallel history in the motion picture industry.

Of all the eras in the history of film sound, the “dawn of sound” era of the late 1920s has received the most research and discussion. However, the tendency to focus on the business/studio side of this era has created some confusion about the actual recording technology at the heart of it all. Behind many of the early marketing names (Vitaphone, Movietone, etc.) is the same basic system that would forge the modern studio lots and their approach to film sound for decades to come. Even the early RCA Photophone system, usually presented as a counterpoint to the Western Electric system, can be shown in these early years to be more a copy of the Western Electric system than something completely unique.

A better understanding of this initial Western Electric system helps not only in understanding the restoration of film sound elements from this early era, but also film sound elements and workflows in general for the decades that followed. The system became the architecture that future sound developments would primarily just embellish and improve upon. Even today in 2017, strong remnants of this first system remain in modern film production.

Cinema Framerates: A Progress Report
Jonathan Erland, Pickfair Institute for Cinematic Studies

Jonathan Erland will present a progress report on cinema frame rates. He’ll recap some of the work of the Pickfair Institute for Cinematic Studies and review recent efforts to restore variable frame rates to the artistic palette of the cinema.

Case Study: Restoring Howard Hawks’ Scarface (1932)
Peter Schade, NBCUniversal
Wojtek Janio, MTI Film

The first part of this presentation will cover NBCUniversal’s commitment to film preservation, the restoration-title selection process, and a summary of notable screenings and preservation partners with whom NBCU has worked. The presentation will also include a brief overview of the evaluated film elements and the processes by which Scarface was scanned, color corrected, and digitally restored. In the second part of the presentation, Wojtek will explain the technical and ethical approach to the restoration. The only available materials for this feature were several different third-generation prints. The scans revealed that the material was badly scratched, blurry, very unstable, and dirty. It was also flickering badly and had a strange, unnatural grain structure.

The Rules of Restoration: Issues, Ethics and Choices in Digital Restoration
Wojtek Janio, MTI Film

The presentation starts by questioning the very goal of restoration: what should the final result look like, and what is our target? What is the actual purpose of restoration itself? Should we take different approaches for different movies, or perhaps produce a universal set of rules to follow? During his presentation, Wojtek Janio will try to answer those questions and raise many other controversial ones, mostly regarding the technical side of restoration, but also the ethical dilemmas that every restoration professional encounters.




New restoration courtesy of NBCUniversal