Digital Rocks: How Hollywood Killed Celluloid

    Digital Rocks: How Hollywood Killed Celluloid

    By Will Tavlin

    Page 1 of 6

    HOLLYWOOD IS TO THE CINEMA what the United States is to the world. Its power is outsize. India, China, Nigeria, Russia — among many other countries — each release more movies annually than the five-member studio syndicate based in Los Angeles. But no other film sector carries more influence, generates more money for its investors, plays on as many screens, or dictates with such vainglory its vision for a world order. Also like the United States: when Hollywood drops a bomb, the world pays attention.

    On December 3, 2020, WarnerMedia CEO Jason Kilar announced that all of Warner Bros.’s 2021 titles would be sent to theaters and to HBO Max, WarnerMedia’s streaming service, simultaneously. To theater owners, this was a declaration of war. For decades, the Hollywood studios had gradually condensed the theatrical window — the block of time in which theaters had exclusive exhibition rights — in order to take advantage of the lucrative home video market. But to unilaterally do away with it for a year of new releases was unprecedented.

    A narrative soon emerged. Journalists, talking heads, and Twitter users all agreed that the twin phenomena of Covid-19 and advancements in home theater technologies had made Warner’s decision inevitable. “Theater owners,” the tech journalist Kara Swisher argued in the New York Times, “[have] not yet grasped the depth of the digital revolution, which has only accelerated during the pandemic.” Warner Bros., Swisher wrote, had “finally shattered Hollywood’s way of doing business, perhaps for all time.”

    But Kilar had not ushered in the film industry’s digital revolution. That revolution had long been underway, and it had little to do with streaming services. The story began in 2002, when executives from the largest Hollywood studios met to discuss their latest concern: digital prosumer technologies. Cheap digital cameras and home computers were everywhere, and the studios were worried. Their business model — which depended on maintaining a viselike grip on the distribution of their films — produced dizzying profits for their shareholders. Digital technologies could threaten all that. This was the year after Napster brought record companies to their knees. Who knew what chaos might unfold? The studios formed a little-known joint corporation, Digital Cinema Initiatives (DCI), to research the matter.

    Eventually DCI scrubbed celluloid film almost entirely from the film industry, ushering in the most significant technological shift since the introduction of sound. The digital revolution transformed nearly every aspect of filmmaking for Hollywood and independent filmmakers alike. Netflix’s rise from a fringe DVD rental service to dominant streamer — in whose steps studios like Warner Bros. are now desperate to follow — was just one of many outcomes that unfolded in DCI’s wake: archiving film assets became prohibitively expensive; independent theaters withered; thousands of projectionists lost their jobs.

    This revolution was invisible, and it was designed to be that way. Its success depended on audiences never noticing at all.
    Last edited by Geoff Jones; 04-01-2022, 08:59 PM.

  • #2
    Page 2 of 6

    AT FIRST THERE WERE the moguls: Adolph Zukor of Paramount; Carl Laemmle of Universal; Louis B. Mayer of MGM; William Fox of Fox. These men forged companies whose power resided in their role as distributors. In an industry known for technological and cultural upheaval, this has remained consistent: the distributors’ unique ability to put pictures on America’s tens of thousands of screens gives them the power to shape how movies get made around the world.

    Hollywood as we know it today began taking shape in 1914. America’s appetite for movies was growing, and in order to fulfill demand, the largest film distributors began opening and expanding production units of their own. They built out their burgeoning networks of movie theaters, too. These vertically integrated entities — such as Paramount, Warner Bros., and Fox — became known as the majors. Over the next decade and a half, they pioneered new distribution strategies at home and abroad that consolidated their hold on the film industry.

    As early as 1916, the majors, led by Paramount, began renting their films to independent theaters on an all-or-nothing basis: if an independent exhibitor wanted to book Mary Pickford’s films, they’d have to also rent Paramount’s entire film slate. Under the arrangement, known as block booking, the majors forced theaters to take on films they knew nothing about. Block booking was predictably regressive for independent theaters, but guaranteed that every film the majors produced would be profitable. By the late 1920s, Hollywood dominated overseas, too, supplying anywhere from 60 to 90 percent of all films shown in foreign countries. The decades that followed were Hollywood’s golden years. Each studio would eventually release up to seventy films annually, and American theaters clocked ninety million admissions a week.

    But by midcentury, a number of developments imperiled the majors’ profits. One of these threats was domestic. For years, the Justice Department had litigated the majors’ distribution model — in which they produced, distributed, and exhibited films — for violating antitrust law. In 1948, the Supreme Court finally forced the majors to sign the Paramount Consent Decrees. In doing so, the studios agreed to sell off their theaters and no longer block book films.

    Another threat came from international markets. Like other American manufacturing industries following World War II, the majors faced an overcompetitive market for their goods. Countries across Europe and Asia, aggressively rebuilding their manufacturing base out of the rubble of war, decided they needed robust film industries of their own. The Italian neorealists, Ingmar Bergman, and Akira Kurosawa made inroads into America’s newly built art house theaters with their sophisticated, sexy, and violent films, but more importantly, they challenged Hollywood’s hegemony in their home markets at a time when Hollywood was collecting 40 percent of its profits from global box office receipts.

    Meanwhile, the rise of television began to break the industry’s monopoly on moving image entertainment, as the number of households with television sets shot up from three million to fifty million over the course of the 1950s. The result was that the “old assembly-line method of making movies,” as Robert Sklar writes in his 1975 book Movie-Made America, “gradually came to an end.” Deindustrialization soon followed: film output and ticket sales dropped, while un- and underemployment in Hollywood spiked; Paramount, Warner Bros., and United Artists sold themselves to an auto parts conglomerate, parking lot conglomerate, and insurance conglomerate, respectively; MGM and Fox auctioned off their props and wardrobes. Hollywood’s empty sound stages presaged the factories in Rust Belt cities that would empty out thirty years later.

    In the decades following World War II, the majors tried to win back audiences by throwing innovations at the screen, seeing what stuck, and repeating whatever made money over and over again. They developed new widescreen technologies such as CinemaScope, VistaVision, and Panavision, and gimmicks like stereoscopic 3D. The more successful ideas were low tech. After 1969’s Easy Rider netted an unexpected 16,000 percent profit at the box office, executives clamored to distribute a new generation of young filmmakers, dubbed the New Hollywood, who generated more hits with The Last Picture Show and The Godfather.

    But any successes that Hollywood found in these years were temporary. No single style or star could reverse the long-term trends of declining film production and ticket sales. Executives wouldn’t find a real solution to the overcompetitive market until 1975, when Steven Spielberg’s Jaws delivered Hollywood’s analogue to the multinational manufacturing companies that globalized production in the face of declining output growth and overcapacity: the blockbuster.

    The novelty of Jaws was its saturation release model. Before Jaws, distributors released films on a platform basis, opening in large cities first and gradually expanding to smaller markets depending on demand. Instead, executives at Universal, Jaws’s studio, booked the film on hundreds of screens at the same time, and spent an unprecedented $2.5 million on marketing — almost double the average marketing budget of the time, with $700,000 dedicated to a television ad blitz that ensured that every American knew the film existed. The strategy paid off: Jaws smashed every box office record. Suddenly, for the handful of studios who commandeered massive marketing and distribution apparatuses, diminished output no longer mattered. The majors would steamroll the competition by force.

    Saturation release was also innovative because of the way it leveraged profit-sharing agreements between distributors and exhibitors. Since the introduction of sound, distributors had been renting their films to independent theaters for a sliding percentage of box office ticket sales, meaning that the longer a film played in theaters, the higher the percentage of ticket sales the exhibitor kept. Blockbusters, thanks to their onslaught of advertising, made the majority of their money in the first two weeks before rapidly tailing off, ensuring distributors a higher percentage of the box office gross.

    The largest theater chains welcomed the saturation release model. In the 1980s, companies like American Multi-Cinema (AMC) and General Cinema Corporation (GCC) began supersizing their growing network of multiscreen theaters by taking advantage of economies of scale. For most of the 20th century, movie theaters had screened films by using a changeover projector system in which a projectionist alternated film reels between two side-by-side projectors. This process required considerable attention. Projectionists could never operate more than one screen at a time — a losing proposition for the exhibition magnates who wanted to build theaters with dozens of screens and reduce labor costs. So multiplex chains turned to new automated platter systems. Platter projectors didn’t have changeovers because projectionists stitched the film’s reels together onto a single horizontal platter prior to the screening. They could then thread the film from one projector to the next in an adjacent theater, a process known as “interlocking.” With platters, a single film could potentially play on all the multiplex’s screens — and a single projectionist could operate all of them.

    Blockbusters comported well with the multiplex model. Movies like Raiders of the Lost Ark, Beverly Hills Cop, and Ghostbusters offered reactionary tales about the virtues of American firepower, lying cops, and deregulation bathed in the warm glow of family entertainment. Presented as an escape from politics, blockbusters delighted audiences and broke box office records. The projectionists who screened them found their hours scaled back and their jobs more difficult. At the largest multiplexes, projectionists were sometimes forced to monitor more than twenty screens at once. Projection quality suffered. Misaligned projectors, out-of-focus images, and deafening or muted sound levels became the multiplex norm. Interlocked film prints were rigged between walls, sometimes into lobbies and over popcorn machines, and were exposed to more harm. A single misaligned roller could engrave an emulsion scratch across an entire print. A misthreaded film could bunch up in the platter and melt.

    Meanwhile, theaters were losing control over their programming. As studios produced fewer films each year — and designed more of them to be blockbusters — they disincentivized smaller exhibitors from booking films from independent distributors. No mom-and-pop single-screen theater risked turning down a Star Wars, Jurassic Park, or Titanic, which every other theater in the country was screening — even though these blockbusters usually came with ruthless terms: a minimum number of play dates between eight and twelve weeks, nonrefundable guarantees to rent the films, and, as compensation for the crowds of moviegoers buying high-margin sodas and snacks, up to 90 percent of the box office gross reserved for the distributor. For multiplex conglomerates, blockbusters were the only kind of movie that could be interlocked and projected onto more than one screen at the same time. As the industry journalists Dade Hayes and Jonathan Bing wrote in their 2004 book Open Wide, the blockbuster era restructured film distribution such that exhibitors became “little more than the keepers of infrastructure and capacity, controlling little beyond seating, ticket prices, and concessions.”


    • #3
      Page 3 of 6

      BY THE TURN OF THE MILLENNIUM, blockbuster directors like George Lucas and James Cameron had become reliable boosters of digital film technologies in the press and at industry conventions. To them, celluloid film was a clunky and expensive technology. Its supporters were analog nostalgists who stifled progress and who, like the aesthetes who’d shunned talkies decades earlier, refused to acknowledge the new reality that lay before the industry. The digital revolution had arrived, and it was a matter of simple economics: digital storage was limitless and therefore cheaper and faster to process, project, and store than its unwieldy analog equivalent, celluloid.

      The threats to the majors were severe. LA Weekly called digital filmmaking a “democratizing medium,” a point that must have made studio executives nervous. The problem was elaborated by Lucas in a 2006 interview with Time. He proselytized that digital’s affordability would overthrow the Hollywood power structure under which filmmakers were beholden to corporate behemoths to distribute their films. With digital, claimed Lucas, anyone could make a film and “go directly to the theater and say, ‘Hey I got a movie. Will you book this for three weeks?’ And the theater doesn’t have any costs involved.” One trade publication confirmed the dilemma to studio executives more bluntly: “If digital projectors were in wide use with an independent middleman providing easy transmission to theaters, film producers would in theory be able to release films theatrically without studios.”

      The film historian David Bordwell covers much of what followed in Pandora’s Digital Box, a 2012 collection of his comprehensive blog posts about the majors’ attempt to steer digital exhibition to their liking. After the majors — then 20th Century Fox, Paramount, Warner Bros., Universal, Sony, and Disney/Buena Vista — formed DCI in 2002 in order to take control of a global digital exhibition system, DCI spent the next three years designing the technology that would replace film reels, creating standards for digital projection, developing an antipiracy system, and generating a funding plan for theaters to buy digital projectors. All the while they made sure that as distributors they would maintain their leverage over exhibitors. While DCI took input from the National Association of Theatre Owners (a group known, amazingly, as NATO), they were careful not to cede any ground, and their final plan, released in 2005, gave studios unprecedented control over the exhibition process.

      In the new system, projectionists screened a film with a Digital Cinema Package (DCP). The DCP was a collection of files on an industrial-grade hard drive. Projectionists downloaded the DCP onto a proprietary server and entered a passcode — sent separately by the studio — that decrypted the files for the duration of the theater’s rental period. The system made piracy impossible. But it also made studios more involved in the projection process. Unlike 35mm film, which could be viewed directly by projecting light through a film print, the DCP required patented technology to interpret its files. On its own the DCP was a useless brick. The significance was what it portended. For studios, the DCP guaranteed that they’d have full control over the circulation of their films in perpetuity, while permanently yoking theaters to their technological whims. Theaters might not have realized it then, but this was the first step toward a new exhibition paradigm that would no longer include them.
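
A loose sketch of the arrangement described above, assuming nothing about the real DCI key-delivery spec: the content on the drive is encrypted, the key arrives separately, and the key is only honored inside the rental window. The names and the use of Python's cryptography library here are stand-ins for illustration only.

```python
# Illustrative toy model of "encrypted files plus a time-limited, separately
# delivered key." Real DCPs use the DCI/SMPTE key-delivery machinery, not this.
from datetime import datetime, timezone
from cryptography.fernet import Fernet  # pip install cryptography

# --- at the studio ---
key = Fernet.generate_key()
encrypted_reel = Fernet(key).encrypt(b"...picture and sound essence...")

# Sent to the theater separately from the hard drive.
rental = {
    "key": key,
    "not_before": datetime(2021, 3, 1, tzinfo=timezone.utc),
    "not_after": datetime(2021, 3, 21, tzinfo=timezone.utc),
}

# --- at the theater's server ---
def play(encrypted_reel, rental):
    """Refuse to decrypt outside the rental window; otherwise unlock the files."""
    now = datetime.now(timezone.utc)
    if not (rental["not_before"] <= now <= rental["not_after"]):
        raise PermissionError("outside the rental window, the drive stays a brick")
    return Fernet(rental["key"]).decrypt(encrypted_reel)
```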

      For the majors, the benefits of digital cinema were manifold. DCPs were some 90 percent cheaper to distribute than multi-reel film canisters, and even multinational studios are beholden to postal costs. At a time when Hollywood had become wholly dependent on global box office receipts, digital distribution simplified the process of releasing films across continents and digital platforms in one fell swoop.

      The largest exhibitors realized they had something to gain, too. In part they were tricked by digital boosters like Cameron, whose baroque hearts-and-minds propaganda vehicle Avatar had recently become the highest-grossing film of all time. With serious faces these boosters claimed that new digital 3D systems would attract audiences in droves. (More accurately, Avatar’s digital 3D technology would perform like 1952’s stereoscopic 3D, fading as quickly as it appeared.) Audience numbers aside, theater chains identified another benefit of digital projection: they no longer needed to pay unionized projectionists. Anyone who knew how to operate a computer could theoretically screen a DCP. Some theaters opted to get rid of their projection booths altogether, installing theater management systems in closets and using the old space for building out concession areas.

      Not all exhibitors, however, were satisfied with the digital technology being air-dropped on top of them. The cost of converting a single screen to digital was between $50,000 and $100,000. The big three theater chains (AMC, Regal, and Cinemark) were able to secure financing to convert their screens because of their enormous capital reserves, access to Wall Street money, and economies of scale. For independent theaters, especially rural ones, the outlook was bleak. “All of the movie theaters in the Adirondacks were going to close,” Sally Wagshaw, owner of the State Theater in Tupper Lake, New York, told me. “No one was going to be able to take on a hundred-thousand-dollar loan and stay in business.” Wagshaw, along with several independent theater owners in the Adirondacks, launched a group fundraising effort. They succeeded. But the projectors they purchased are far more expensive to maintain. Thirty-five millimeter film projectors need only $1,000 to $2,000 per year in maintenance, use easily sourceable mechanical parts, and can last several decades with proper upkeep. Digital projectors require as much as $10,000 per year for maintenance, use proprietary digital parts that can take up to a week to install (during which they’re inoperable), and are estimated to last only ten years.
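
To put those maintenance and lifespan figures side by side, here is a rough back-of-envelope comparison. The purchase price below is simply the midpoint of the $50,000 to $100,000 conversion range quoted above, and every number is illustrative rather than a quote from any vendor.

```python
# Rough cost comparison for one screen over time, using the figures quoted
# above. Illustrative only; real costs vary widely by venue and vendor.

def total_cost(purchase, annual_maintenance, years, lifespan):
    """Purchase price paid once per equipment lifespan, plus yearly upkeep."""
    replacements = -(-years // lifespan)  # ceiling division: units bought
    return purchase * replacements + annual_maintenance * years

YEARS = 30  # a horizon a well-kept 35mm machine can plausibly outlast

# 35mm projector: already installed, ~$1,500/year upkeep, multi-decade life
film = total_cost(purchase=0, annual_maintenance=1_500, years=YEARS, lifespan=YEARS)

# Digital conversion: ~$75,000 per screen, up to ~$10,000/year upkeep,
# estimated ten-year lifespan
digital = total_cost(purchase=75_000, annual_maintenance=10_000, years=YEARS, lifespan=10)

print(f"35mm over {YEARS} years:    ${film:,}")     # $45,000
print(f"digital over {YEARS} years: ${digital:,}")  # $525,000
```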

      But the studios had left independent theaters like Wagshaw’s no other choice. In 2011, 20th Century Fox notified theaters that by the end of 2013 none of its titles would be circulated on 35mm film. Other studios soon followed suit. Their decision to stop striking 35mm release prints sent shock waves through the film manufacturing industry. Overnight, annual orders for some forty billion feet of print stock evaporated. Film costs spiked. That same year, as a critical mass of theaters in the United States and Europe began projecting films digitally, the three major manufacturers of 35mm film cameras — Arri, Panavision, and Aaton — announced that they’d produce only digital ones. Meanwhile, Kodak, one of the last producers of film stock in the world, and the largest, burned through its cash reserves at a frightening clip. They declared bankruptcy in 2012, and although they emerged a year later with their film production units still intact, the damage was done.

      Celluloid filmmaking wasn’t dead, despite the outpouring of eulogies from film critics, nor was it by any means unaffordable. But it was increasingly passed over by studio executives who questioned the efficacy of the medium they themselves had willfully dismantled, and whose CGI-heavy superhero blockbusters — movies, after all, about the supremacy of computers — jibed better with a fully digital workflow. By late 2013, digital had overtaken 35mm as the dominant shooting format for top-grossing films.


      • #4
        Page 4 of 6

        ON THE BRIGHT SIDE, digital cinema would make filmmaking cheaper and more accessible for directors. Or so the argument went.

        George Lucas, the most vocal advocate of digital filmmaking for its affordability, had an inkling of this in 1969. That year, the directors Albert and David Maysles and Charlotte Zwerin made Gimme Shelter, a documentary that followed the Rolling Stones for the final weeks of their 1969 US tour, which culminated at the Altamont Free Concert. The Maysles brothers and Zwerin hired several camera operators to shoot at Altamont, one of whom happened to be a 25-year-old Lucas.

        That night, two concertgoers were killed in hit-and-run accidents and one person drowned in an irrigation canal while tripping on LSD. Meredith Hunter, an 18-year-old Black man from Berkeley, was stabbed to death by the Hells Angels after brandishing a gun in full view of the stage. The murder was captured by one of the camera operators, but not by Lucas. Rumor has it that his camera jammed after he’d shot only 100 feet of film. As Albert Maysles later recalled to a journalist, “None of the stuff he shot turned out at all.”

        A recent graduate of the University of Southern California, Lucas was well known in those days for his disdain for Hollywood studios. He considered himself an antiestablishment director who hated not only Hollywood’s monopolistic distribution system but their trade unions, too. Now he would add celluloid film to the list. Over the next three decades, Lucas would position himself as a singular driver of the digital era.

        After the success of Star Wars, Lucas invested massive sums of money into upgrading his digital effects house, developed one of the first digital editing systems, and cofounded the digital animation studio Pixar. In the 1990s, he pushed Sony and Panavision to build out digital cameras and lenses and lobbied Texas Instruments and theater chains to project digitally, even when there was virtually no demand for such technologies.

        Lucas liked how digital technologies made filmmaking a highly controllable affair — unlike the chaotic scenes he’d watched unfold at Altamont. Digital cameras, lacking film canisters that needed to be reloaded every eleven minutes, enabled Lucas to shoot for as long as he wanted. No more jammed cameras. For Attack of the Clones, the first Hollywood feature film shot entirely with a digital cinema camera, producer Rick McCallum bragged that Lucas shot a hoggish equivalent of twenty thousand feet of film per day, compared with an average feature’s three hundred to five hundred feet per day, and saved millions of dollars in the process. Digital gave Lucas total command of the frame, allowing him to wholly rework his vast quantities of footage in the editing room. In postproduction for The Phantom Menace, performers were copied from some shots and pasted into others; actors who blinked on a cut were made to keep their eyes open; cast members who turned their head to Lucas’s disliking were turned the other way. Lucas’s editor claimed that there wasn’t a single shot in the film that he and the director hadn’t manipulated. “We could totally redirect the picture in the cutting room,” he explained. As Lucas once complained of the analog medium, “You can’t manipulate it enough.” Going back to celluloid, for Lucas, was out of the question: “It would be like going back and scratching things on rocks.”

        Despite the millions Lucas saved by forgoing film stock for digital storage, there was nothing cheap about Attack of the Clones. Estimates today put the film’s production budget at $115 million — and this is likely a lowball figure. Lucas’s bet that digital technologies could bring down the cost of filmmaking was only ever true for productions fully committed to guerrilla filmmaking and shot with prosumer cameras, such as 1999’s The Blair Witch Project, whose initial budget was only $60,000. Lucas, like most filmmakers shooting digitally today, opted for expensive digital cameras that were designed to re-create celluloid’s rich color science and wide dynamic range. As many filmmakers have found, these energy- and data-intensive cameras aren’t guaranteed to save directors money so much as shift costs to other places in production.

        According to Darius Marder, the director of 2019’s Sound of Metal, the high up-front cost of film stock fosters an ethic of economy on set that digital productions — which incentivize directors to shoot vast quantities of footage they might not need — lack. “When you shoot on film, you have to know the law,” Marder said in an interview. “You have to know what you’re going for and you have to be willing to swing for it and not think you can just shoot forever.” More footage captured on set means more data to be processed, edited out, and archived in perpetuity. As Marder added, shooting longer entails another expense: “Overtime. There’s nothing more expensive on a film set than overtime.”

        As David Diliberto, the postproduction supervisor for several Coen brothers films, explained in an interview, the same technologies that enabled Lucas to direct his films from the editing room have also given rise to the attitude of “we can fix that in post.” He recounted working with digital filmmakers who wanted visual effects added to shots that were never budgeted to have them and that could have easily been addressed on set. In one case, a cinematographer wanted an extra’s shirt changed from bright red to a slightly duller red. In others, directors wanted sections of backgrounds replaced or wide shots cropped into close-ups. “Today it seems like visual effects build and build and build,” said Diliberto. The slightest visual effects can cost thousands of dollars. The problem is, Diliberto emphasized, “postproduction budgets don’t have a lot of money.” The lion’s share is usually reserved for shooting.

        Perhaps all this contributed to why in 2019, while in preproduction for their film Dark Waters, the director Todd Haynes and cinematographer Ed Lachman insisted on shooting with celluloid. Lachman believed the film’s plot, which follows a lawyer unearthing a criminal conspiracy waged by DuPont Chemical, resonated with celluloid’s chemical processes. As he told American Cinematographer, “The depth of film grain, which is affected by the chemical process of developing, and the crossover contamination of contrasting colors in the negative’s silver halides, bring attributes to an image that I find extremely difficult to create in digital capture.” But executives of the film’s studio, Participant Media, resisted, worried about the price of celluloid and the possibility that it would slow down production time. Lachman, in response, completed a series of screen tests with 35mm, Super 16mm, and digital 2K, proving that the film could be shot with celluloid in the same amount of time for the same amount of money as digital. The executives would not budge. In the end the film, an exposé of corporate malfeasance, was compromised by corporate stubbornness. Lachman and Haynes shot Dark Waters digitally.

        Hollywood executives prefer the high costs of a film that is reshaped in post, that has 5,000 percent more footage, for a simple reason: digital filmmaking offers more opportunities for studio executives to control the picture after it’s been shot.

        Lucas, a studio executive himself, knew this well. For the 2011 Blu-ray remaster of 1977’s Star Wars, later renamed A New Hope, the mogul went back and changed several scenes. For one notable sequence, in which Luke Skywalker is ambushed by a race of beings who live in the sand, Lucas added digital rocks to a shot of R2-D2. The effect accentuates the idea that the robot is hiding from the bad guys. Yet, as one blogger complained of the edit, it created a continuity error because those rocks were no longer there in the next scene. Lucas’s compulsion to go back and change his films made them worse. But digital filmmaking’s capacity to capture massive quantities of raw data on set that could be wholly reworked in post was irresistible.

        The digital image, as it was in the hands of Lucas, can be “preserved forever” as a profit-generating engine and manipulated endlessly, tweaked to the mercurial tastes of all future fans. And for Lucas it will be there years down the line, whenever he wants to add more rocks.


        • #5
          Page 5 of 6

          IN 2007, as digital was gaining ground in shooting and exhibition, the Science and Technology Council at the Academy of Motion Picture Arts and Sciences released a report that sounded the alarm on another problem that digital advocates had overlooked or ignored: ballooning archiving costs.

          Over the course of the 20th century, as it became known that thousands of cinema’s earliest films had been lost or destroyed, studios and collecting institutions began to prioritize film preservation. The Hollywood majors developed a storage strategy known as “save everything.” Like a building under construction, a movie produces a lot of scaffolding, and under “save everything,” studios archived not only masters but original camera negatives, answer prints used for timing, duplicate negatives used for fades and other opticals, screen tests, publicity stills, and deleted scenes. A single two-hour movie shot on 35mm, which fills six cans of film, could produce as many as 178 cans of film, which studios dutifully stored in temperature- and humidity-controlled vaults and working libraries. The system worked because film — the prints of which are made today with polyester — is what’s known as a “store and ignore” technology: if placed in a properly controlled environment, it can survive for a hundred years, perhaps many more.

          In their report, titled The Digital Dilemma, Academy researchers warned that digitally stored assets would not last for anywhere near a hundred years. Many of these assets, the report noted, have “longevities of thirty years or less, and are vulnerable to heat, humidity, static electricity and electromagnetic fields. The digital contents can be degraded by accumulating unnoticed statistically occurring ‘natural’ errors, by corruption induced by processing or communication errors, or by malicious viruses or human action.” Magnetic hard drives, meanwhile, “are designed to be ‘powered on and spinning,’ and cannot just be stored on a shelf for long periods of time.”

          And there’s the issue, forever plaguing all digital technologies, of obsolescence: both file formats and the hardware through which they’re accessed are designed to be updated or replaced by their manufacturers, meaning “digital ‘permanence’” requires an “ongoing and systemic preservation process” that includes regularly scheduled data migrations, in which data is periodically transported from old digital storage onto new storage.

          All of this quickly adds up in costs. The report’s authors wrote that long-term archiving of digital cinema was spectacularly more expensive than its analog equivalent. A digital master in 4K, Hollywood’s standard resolution for high-quality digital assets, cost 1,100 percent more to store than a 35mm film master. The difference for storing all of a film’s source material was even greater. An average two-hour movie shot and archived on film cost only $486 per year to store. Its 4K equivalent? A staggering $208,569 per year. The declining cost of digital storage doesn’t apply here. As technology advances and video codecs increase in quality, so too do the quantities of storage, and data migrations of assets and their backups get larger and more expensive.
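
For a sense of scale, the source-material figures above work out as follows. This is a simple projection that holds the report's per-year numbers constant and ignores inflation, migrations, and the growth in data sizes the paragraph describes.

```python
# Annual storage costs for a two-hour feature's full source material,
# per The Digital Dilemma (2007 figures).
film_per_year = 486         # shot and archived on 35mm
digital_per_year = 208_569  # shot and archived as 4K digital files

print(f"digital runs roughly {digital_per_year / film_per_year:.0f}x film per year")

# Cumulative cost if those rates simply held steady.
for years in (10, 50, 100):
    print(f"{years:>3} years: film ${film_per_year * years:>9,}   "
          f"digital ${digital_per_year * years:>12,}")
# 100 years: about $48,600 on film versus roughly $20.9 million digital
```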

          In their conclusion, the Academy called into question the long-term viability of the emerging digital archiving system and charged that studio executives had been quick to accept digital technologies with no forethought or regard for the “most fundamental needs of motion picture production and preservation.” More grim was their assessment that “today, no media, hardware, or software exists that can reasonably assure long-term accessibility to digital assets.” As of 2022, this is still true.

          All the issues that studios face in archiving digital materials — systems failure, hardware and software obsolescence, and data migration — are compounded for independent filmmakers, who can’t amortize costs over one hundred years. Hollywood partially answered The Digital Dilemma by continuing to strike YCM separation masters for all their films, even the digital ones. In this process, a film lab separates a picture’s yellow, cyan, and magenta elements onto three separate 35mm film strips. This is widely acknowledged by archivists to be the most stable archival format; all three color records fade synchronously over time, meaning future restorations can capture a film’s original qualities more accurately. Independents, unfortunately, can’t afford this. It costs around $100,000 per film — roughly the entire cost of many low-budget features today.

          For independents working on small budgets, archiving has always been a lesser concern than getting films distributed and seen. But prior to digital filmmaking, directors could archive their films by throwing their negatives into a refrigerator. As the Academy concluded in a sequel report published in 2012, independent filmmakers were resolutely failing to meet the financial demands of digital archiving. Surveying 150 independent film professionals, the Academy found that 81 percent of interviewees stored some of their content on hard drives, but only 20 percent had temperature and humidity controls. Thirty percent had no climate control whatsoever, and 15 percent didn’t even know what controls they had.

          “It may be that we are accepting short-term advantages with long-term loss,” one filmmaker told Academy researchers. “I am very concerned that the next generation will not have the rich archive of historical and classical films that we have today.”


          • #6
            Page 6 of 6

            IN 2010, JOHN GAJDA, a projectionist based in Tennessee, was looking for a new job. Gajda had been a projectionist since 1990, when he got his first job at Rice Lake Cinemas in Wheaton, Illinois, amid a multiplex building boom. For the next twenty years, he worked in the country’s largest multiplexes screening Hollywood’s biggest films. But when AMC, Regal, and Cinemark started ditching their film projectors for digital ones, everything changed. Projectionists across the country were getting laid off. “I saw the writing on the wall,” Gajda recalled. “All the union jobs were demolished. All the work they had at multiplexes was minimum wage.”

            Gajda got a job as a field service technician with Ballantyne Strong, a company that had manufactured 35mm film projectors for a century. In the wake of digital filmmaking, Ballantyne was reinventing itself. The company stopped building film projectors and began servicing digital ones. Ballantyne understood that film and digital projection couldn’t coexist. JPMorgan and Blackstone, the investment groups that had provided the capital for the big three theater chains to convert their screens, were making sure of it.

            The financing obtained by the theaters that then represented more than 50 percent of the US and Canada’s box office receipts — AMC, Regal, and Cinemark — was complex. The companies set up a limited liability corporation called Kasima (short for “Kicking and Screaming into the Modern Age”) that purchased digital projectors with loans from JPMorgan and Blackstone. Kasima repaid the loans by leasing the projectors back to AMC, Regal, and Cinemark and charging a virtual print fee (VPF): a payment of around $800 from the majors to Kasima every time AMC, Regal, and Cinemark screened a studio’s DCP. Everyone benefited in this arrangement: the majors avoided paying theaters directly, which could have permanently depressed the price of their rental fees; the theaters acquired digital projectors at a manageable price while keeping much of the overhead and risk off their own books; and JPMorgan and Blackstone securitized their loans, worth over $400 million, and sold them off to investors.
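
As a back-of-envelope illustration of how virtual print fees covered the hardware, the arithmetic looks roughly like this. The per-screen cost is drawn from the conversion range quoted earlier in the article, and the booking cadence is a pure assumption, not Kasima's actual contract terms.

```python
# Rough illustration: how many studio bookings it takes for VPFs to pay
# off one converted screen. All numbers are assumptions.
conversion_cost = 75_000   # midpoint of the $50K-$100K per-screen range
vpf_per_booking = 800      # approximate fee a studio paid per DCP engagement
bookings_per_year = 26     # hypothetical: a new studio title every two weeks

bookings_needed = conversion_cost / vpf_per_booking
print(f"{bookings_needed:.0f} bookings, "
      f"about {bookings_needed / bookings_per_year:.1f} years at that pace")
# -> roughly 94 bookings, a bit under four years
```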

            There was a catch. As far as JPMorgan and Blackstone were concerned, as long as 35mm film projection was alive, any rogue independent with a film reel could threaten their investment. In their financing agreements, the investment groups stipulated that all the big three’s 35mm film projectors had to be removed or made inoperable. Regal hired Ballantyne Strong to take care of their film projectors. Gajda spent the first year of his job as a service technician removing and shipping film projectors from Regal’s 650 locations to a parking lot in Omaha, near Ballantyne’s headquarters. “We had a 35mm graveyard,” said Gajda. “Dumpsters and dumpsters of projectors, some of them brand new.” Some projectors were parted out. Most were hauled to the scrapper.

            For the thousands of independent theaters with repertory programs, getting rid of their film projectors made no sense. But the majors merely shrugged. Bill Hill, a projectionist since 1975, described to me the lengths to which studios went to restrict access to their archives for exhibitors who didn’t play ball. In 2011, while working at the Savannah Film Festival, Hill was supposed to screen a 35mm print of Stanley Kubrick’s Barry Lyndon. “Warner Bros. refused to give it to us,” said Hill. “They said, ‘We have a Blu-ray of it.’ But the Blu-ray was in the wrong aspect ratio!” This meant that sections of the film’s original frame had been lopped off. Hill eventually tracked down a DVD in the correct ratio. He screened Barry Lyndon to a sold-out audience in the 1,237-seat Lucas Theatre in a format that was designed for a thirty-inch screen. The film’s celebrated photography, much of which was shot by candlelight with lenses designed for NASA, was reduced to pixels. “The cinematic experience was completely lost for those people,” Hill recalled. “It was just gone.” Stories like Hill’s are the norm. The majors, once known for never leaving money on the table, understood it was more profitable in the long run to have every exhibitor on the same page. “Why bother to prepare and ship a DCP to a theater that yields a box office take of less than $300 per day, from which the distributor gets about only half?” Bordwell observed in 2012. “Distributors willing to cut overhead and back-office operations may well regard the small houses and circuits as simply a pain in the neck.”

            Compared with many of his colleagues, Gajda was lucky. Over the next ten years, he worked his way up to a management position for Ballantyne’s West Coast division. But in February 2020, as Covid shut down theaters worldwide, the company began losing business. Gajda was laid off, along with half of Ballantyne’s employees. Most of the film industry would soon join them.

            By October, NATO announced that nearly 70 percent of all small and midsize theaters in the US were in danger of closing permanently, with one hundred thousand potential theater jobs erased. The federal government’s December stimulus provided a crucial $15 billion lifeline to live entertainment, some of which went to movie theaters. Still, audiences aren’t expected to return to prepandemic levels until 2024. And these are just the beginning of theaters’ problems.

            Michael Karagosian, a film industry consultant who worked on VPF contracts, told me that VPFs were temporary agreements intended only to fund the initial purchase of digital equipment. The problem is that digital projectors require considerably more expensive maintenance than 35mm ones. Most theaters will need to replace their projectors in the next couple of years. The last VPF agreements expired in 2020. AMC, Regal, and Cinemark are rumored to have secured extensions because of Covid, but this will only be a brief reprieve. From here on out, theaters are on their own.

            The problems go deeper. In July 2020, Trump’s Justice Department announced that their request to repeal the Paramount Consent Decrees, filed a year prior, had been granted. Distributors can once again own the means of exhibition. Block booking, illegal since 1948, is back on the table. In her decision, US District Court Judge Analisa Torres stated that block booking was no longer relevant. With new models of distribution having emerged since 1948, such as streaming, she argued, “there is less danger that a block booking licensing agreement would create a barrier to entry that would foreclose independent movie distributors from sufficient access to the market.” Judge Torres is correct that block booking is outdated, in the sense that the majors found a way around it: by producing fewer films, most of which were blockbusters, they effectively preempted theaters’ ability to choose what they want to show. But implicit in her opinion is the assumption that there is no difference between watching a film in a theater, at home, or on your phone. It suggests, as Netflix’s CEO Ted Sarandos has argued, that the cinematic qualities of a movie don’t matter. At the heart of a film is a “story,” which can be consumed on any platform and from any location. This is the logic behind Hollywood’s multibillion dollar push into “content.” “Content,” the formal vehicle for “storytelling,” can play more seamlessly on the twenty billion internet-connected devices around the world, unlike the mere two hundred thousand cinema screens that constrain “movies.” It can be watched from a bus, a couch, or a bed. Like most entertainment, “content” appeals as widely as possible in order to maximize profits. As Netflix’s cofounder Reed Hastings has admitted, the company’s main competitor is not HBO Max or Amazon but the biological need for sleep.

            Unlike the ever-growing fatberg of content, the cinema’s primary concern is to see the world clearly. This is best done in a theater, where the images are large, the sound is mixed, and a projectionist ensures with skill and attentiveness that films are screened as their makers intended. And yet, for the majority of Americans, seeing a movie theatrically has been a miserable experience for decades. Most Americans go to multiplexes, and most multiplexes were built in asphalt parks or inside malls on the outskirts of suburban sprawl. To get to them, audiences must drive through traffic, sometimes pay for parking, and sit through thirty minutes of ads and a lengthy blockbuster that costs well over $20 with concessions — an experience that Sarandos unironically believes is “elitist.”

            What projectionists know better than anyone else is that the cinema doesn’t happen in multiplexes, at least not necessarily. It happens in auditoriums, around which exhibitors have built lobbies, concession areas, and marquees; the cinema can exist in any space with a projector, projectionist, screen, film, and audience. When Lucas claimed that digital cinema would allow filmmakers to sell their films directly to theaters, he correctly intuited that Hollywood was preventing freer, more radical kinds of exhibition from existing. Today we are told by tech fetishists like Kara Swisher that streaming is a democratizing force that will liberate audiences from the Hollywood studios whose reliance on theatrical release is outdated and arbitrary. For these subscribers, it is easier to imagine paying a handful of corporations a monthly fee of $14.99 than, for instance, every public library in the United States having its own movie theater. The latter is the kind of exhibition paradigm that Hollywood, in shifting the industry from film to digital on its own terms, sought to foreclose. Netflix, in pioneering the streaming platform, has gifted Hollywood a digital exhibition model that appeals widely to the millions of Americans cornered by multiplexes, without forcing the studios to expand their film slates beyond IP-driven blockbusters.

            Although streaming platforms are unconstrained by the traditional model of making profits on individual films, the studios continue to distribute only a very small fraction of the films made by American and foreign independent filmmakers. When they do, it’s only for the most established independents — whom Netflix elects to call “creators” — who command especially large production budgets, such as Alfonso Cuarón with Roma, Martin Scorsese with The Irishman, or Jane Campion with The Power of the Dog.

            Tellingly, the vast majority of digital films that Hollywood produces bear little resemblance to the guerrilla, made-for-nothing filmmaking that digital cinema once promised. Around the turn of the millennium, a number of directors made pathbreaking films that offered a vision of what future digital cinema might look like: Agnès Varda’s The Gleaners and I, Lars von Trier’s Dancer in the Dark, Spike Lee’s Bamboozled, and David Lynch’s Inland Empire, to name just a few. Shot on prosumer camcorders, these films cast a wary eye on the spectacle of entertainment and pointed to the detritus produced by a society so assured of its own greatness. Digital filmmaking, as it exists today on streaming platforms and in AMC theaters, demands the most expensive equipment and the highest technical standards, which produce the same generic look.

            Disney’s Marvel Studios, the greatest champion of digital cinema today, spends hundreds of millions of dollars on CGI that is designed to be spectacular but whose purpose is abject. Marvel’s producers are known for using CGI in every frame, replacing props and set pieces, giving characters new hairstyles, and ironing away wrinkles from actors’ foreheads and creases in their suits. Like Lucas’s editor who bragged about manipulating every frame in The Phantom Menace, Marvel mistrusts the camera’s ability to capture the realism that unfolds on set. The studio wants to tame what André Bazin described as “the irrational power of the photograph” that is the basis of cinema: the image’s capacity to misbehave in the future, no matter how hard one tries to direct it in the present. Behind every instance of Marvel’s CGI is the hope for the blockbuster’s continued supremacy.

            In light of the repeal of the consent decrees, it’s clear that the myth of digital filmmaking as a cheaper, faster medium than celluloid has become the remaining big studios’ alibi for achieving what had been taken away from them: a monopolization of all filmmaking activity — control over production, distribution, and exhibition. Bordwell predicted that in converting to digital, exhibitors were opening themselves up to a power grab. He noted that the perpetual capital-intensive demands of digital exhibition would engender a situation in which “only the permanently well-funded” would be able to exhibit films. Today we know exactly who these players are: the Hollywood majors, Amazon, and Netflix.

            But in the final turn of the screw, the majors no longer seem intent on maintaining theatrical exhibition, nor the tech companies on saving it. Why would they, when global streaming revenue is poised to reach $85 billion by 2025, more than double what Hollywood was getting in global box office receipts?

            If a Hollywood studio or tech giant purchases a bankrupt AMC or Regal in the coming years, it won’t be to lock out independent distributors — they’ve already informally achieved that through the blockbuster system — but to wind down the theatergoing experience altogether, and to steer, as they did with digital exhibition, a new distribution system to their liking, now centered on streaming.

            What this might entail: a mass closure of smaller-market theaters, an even shorter theatrical window, and higher ticket prices as theatergoing is repackaged as a premium experience. In other words, a moviegoing experience whose sole purpose is to qualify studio films for Academy Awards and make the price of a Disney+ subscription look like a good deal.

            More likely, they’ll let theaters wither away on their own. Speaking from Disney’s Burbank studio, CEO Bob Chapek inaugurated the company’s 2020 Investor Day by showing off a 1957 diagram of the company’s business structure, drawn by Walt Disney himself. At the center of the drawing, around which all the company’s squidlike appendages revolved, was “theatrical films.” As Chapek went on to explain, the company’s future now revolved around Disney+. Hollywood already has its alibi. “This one is for the fans,” claimed WarnerMedia CEO Kilar while announcing that all seventeen of Warner Bros.’s 2021 titles would be sent to streaming. Just like the myth that digital filmmaking is cheaper, it’s a narrative that hits all the notes of democratic fellow feeling.

            In reality, it’s for the stockholders. “[This] is about the survival of a telecom mammoth,” the filmmaker Denis Villeneuve wrote in a letter to Variety. “One that is currently bearing an astronomical debt of more than $150 billion.” Villeneuve, whose blockbuster Dune was affected by the move, was referring to AT&T, parent company to both Warner Bros. and HBO. In 2021 HBO Max grew its subscriber base by over 20 percent.

            Gajda, for his part, is following the money. In November 2020 he got a new job with Best Buy’s Geek Squad. He now assists with home theater installations.


            • #7
              The link to the article above does not work. Use this instead:

              https://www.nplusonemag.com/issue-42...digital-rocks/

              The link above was incorrect: http://https//www.nplusonemag.com/is...digital-rocks/


              • #8
                This is a more thoughtful and less gaffe-strewn article on this topic than the norm. I totally agree with the author that the stadium seat McMultiplex on a mega-mall just off the freeway is in decline. Pretty much all the new installs I've done since starting at MiT in 2017 have been high end combined movie theater and restaurant operations (e.g. Cinepolis and Alamo Drafthouse), quirky arthouse venues, mom-and-pop indies in small remote towns, performing arts centers, college/university auditoria, and high end home theaters. But there have been enough of those to keep me busy, and they all seem to be investing in equipment and building healthy audiences back again. If the author is right that Hollywood is squeezing theaters out, I suspect the response will be for theaters to look elsewhere for content, something that digital cinema makes a lot easier than it was in the past. I've seen some of the small town houses playing Hallmark and Pure Flix stuff, political documentaries, and streamed sports and musical theatre, and of course the arthouses and universities play foreign and other niche interest stuff. So it seems that the shift to digital might initially have enabled Hollywood to strangle the exhibitor, but paradoxically, has now given exhibitors access to content that the cost and time lag of 35mm distribution previously ruled out.

                I'm not sure where he got the figure of $10K a year to maintain a typical DCI projector from, though, unless he's including bulbs and electricity into that figure (which 35mm projectors also consumed). I'm sure that the parts and labor to maintain a typical Series 2 machine over a realistic 10-year service life don't come to anything like $10K a year if consumables that 35mm projectors also required are excluded.


                • #9
                  Originally posted by Leo Enticknap View Post
                  This is a more thoughtful and less gaffe-strewn article on this topic than the norm. I totally agree with the author that the stadium seat McMultiplex on a mega-mall just off the freeway is in decline...

                  I'm not sure where he got the figure of $10K a year to maintain a typical DCI projector from, though, unless he's including bulbs and electricity into that figure (which 35mm projectors also consumed). I'm sure that the parts and labor to maintain a typical Series 2 machine over a realistic 10-year service life don't come to anything like $10K a year if consumables that 35mm projectors also required are excluded.
                  It is a well-written opinion piece, not an academic study. The article has no footnotes, et cetera, to validate any of its conclusions.


                  • #10
                    I'm not sure where he got the figure of $10K a year to maintain a typical DCI projector from, though, unless he's including bulbs and electricity into that figure (which 35mm projectors also consumed). I'm sure that the parts and labor to maintain a typical Series 2 machine over a realistic 10-year service life don't come to anything like $10K a year if consumables that 35mm projectors also required are excluded.
                    I would put our cost at somewhere between $2000 and $2500, when you factor in bulb + filters + power + technician visit. Maybe a little less if our tech can remote-in to fix issues and/or I can do some of the small fixes myself.

                    This writer also gets wrong the percentage theaters pay to the studios -- like most writers, he says it's "up to 90%." (At least he doesn't claim that we cough up some of the popcorn money, as some writers have in the past.) I guess I'm glad that that myth persists because it justifies concession prices -- at least partially.


                    • #11
                      Does it include write-off of the projector? Does it include extended service warranties? Does it include the server too? Does it include an average for replacement parts outside of warranty? It's hard to say anything particular about this $10K figure without considering the whole picture, which may also look very different between operations...


                      • #12
                        The article claims specifically that $10K is the annual maintenance cost, not the total cost of ownership (TCO). Agreed completely with Marcel that $10K a year may be a realistic TCO figure, including depreciation of the asset, the cost of maintenance, and the cost of the consumables needed to run it (chiefly bulbs and electricity). The author gets enough of the technical concepts right that it would surprise me if he confused the cost of maintenance with the TCO, but that certainly would explain how he came up with that figure.
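
As a rough guess at how someone could land on a number like $10K a year if they rolled everything into it, here's the kind of arithmetic I'd imagine; these figures are mine, purely for illustration, not a quote from anyone's service contract.

```python
# Guesswork total cost of ownership for one digital screen, per year.
# This is not a maintenance figure; it folds in depreciation and consumables.
depreciation = 75_000 / 10  # projector + server amortized over ~10 years
bulbs = 2 * 900             # a couple of xenon lamps a year (35mm burned these too)
service = 1_500             # tech visits, filters, odd parts
electricity = 1_200         # ballpark for lamp, electronics, and exhaust

print(f"${depreciation + bulbs + service + electricity:,.0f} per year")  # ~$12,000
```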


                        • #13
                          This writer also gets wrong the percentage theaters pay to the studios -- like most writers, he says it's "up to 90%."
                          Mike, I know you've bristled at articles that use this percentage before, but on the big blockbusters, there are deals that are indeed those uber lopsided percentages, especially for the big chains. True, not all and not in every situation, but I would think often enough that when reporters are writing about the exhibitor's plight, they are going to quote the most dramatic numbers to demonstrate their point. Whenever theatre owners want to complain about terms that on the face of it do sound draconian, they will point to the most unfair-sounding terms that, say, Disney has been known to impose -- like being forced to play a title for many more weeks than it will pull in patrons, leaving the theatre to play to near-empty shows day in and day out, with not nearly enough of a gross to pay for the electricity to light the projector lamp.

                          I think whether the cost of operating a digital projector is $10k, $5k or $2k, the author's point is well taken -- that the promise of converting to a digital platform has not resulted in the massive savings for exhibition that were predicted early on and, quite frankly, touted as the carrot to bring exhibitors on board, which was essential if the change to digital was to become the accepted universal standard.

                          I remember when I was in HS we used to get a science magazine every month -- the name of which I can't recall -- which explained the new advances in science. There was an issue about the new miracle energy source, uranium, and how it was so energy dense that when harnessed and used in an atomic reactor, it could provide energy for whole cities so cheaply that they wouldn't even need to charge for it! Similar kind of thing here; digital may have saved the cost of making prints, but it certainly didn't bring exhibitors any substantial savings in the total operational cost of their business. Like atomic energy, its wildly positive predictions of massive cost savings were greatly and artificially exaggerated. And the proponents of the change never mentioned the downside -- how devastating this sea change would be, closing hundreds of smaller operators and devastating art house operations, as in the "We have no prints and no DCPs -- run a DVD" syndrome.
                          Last edited by Frank Angel; 04-03-2022, 02:00 PM.


                          • #14
                            I still contend the biggest consequence of getting rid of physical film prints and going digital is that it effectively removed any visible quality differences between what a commercial cinema offers and what someone can see on the TV screen in their home. I don't need to be reminded about differences in color bit depth or the slight differences in overall image resolution or data compression levels. In the end, the movie product someone can see on their TV screen at home bares very little difference from the product offered in commercial cinemas. One has to wait only a very short amount of time for that far less expensive home version. Commercial cinemas no longer have any exclusivity at offering the best image and sound quality. Since it's all digital it's possible for it to be equaled (for the most part) at home.

                            Is any work being done at all on the commercial level to move projection in cinemas above the 4K barrier? Digital cinema cameras that can shoot 6K and 8K have been around for several years now. Computing horsepower has improved to such a degree that rendering a movie's digital intermediate in native 4K should no longer be a problem, especially since it has been 18 fucking years since the first movie with a 4K DI was produced. I mean, holy fucking shit, 8K digital intermediates should be very common by now. But the Hollywood system has gotten so badly lazy. 2K was the default right up until just recently. 4K is only starting to become a go-to standard now that so many Goddamned UHD TV sets are in service within homes. The people running the movie industry are doing commercial cinemas no favors at all. They're pretty much enemies to each other at this point.


                            • #15
                              Mike, I know you've bristled at articles that use this percentage before, but on the big blockbusters, there are deals that are indeed those uber lopsided percentages, especially for the big chains.
                              I'm not saying the 90% figure doesn't exist, but these articles never mention the house allowance, or "house nut" as it's called. The theater pays 90% of what's left after the allowance is deducted, so it comes out to 55 or 60% in the end. I've never heard of an exhibitor actually having to pay 90% of the whole gross. But no general-audience writer wants to explain all that to the readers because it would cause them to tune out or fall asleep, so they simplify it. It's OK though because it leaves sympathy with the exhibitor.

                              In the end, the movie product someone can see on their TV screen at home bears very little difference from the product offered in commercial cinemas.
                              < -- (I fixed that typo for ya, Bobby)

                              I guess you're right, if ALL YOU'RE CONSIDERING IS PICTURE QUALITY. There is a lot more to the proper movie experience than that. I stood in the back of our theatre today for a while. There's no way any home system can compare to the size, spectacle and overall awesomeness of what you get in a theater. If you're in a crappy shoebox cinema, OK, but if you're listening to a decent sound system played at a good volume in a dark room with a great big screen, and you're out on the town with friends..... no TV can compare. It's apples and oranges.

                              People who wanna watch TV are gonna watch TV. It's just too bad they don't care enough about their entertainment to want to get it the best way they can. That's the biggest problem we face... both in the public and the industry.... people who don't care.
