September 20, 2017 § Leave a comment
As the countdown to “Blade Runner 2049” continues, it’s worth remembering that the original “Blade Runner” wasn’t met with the kind of reverence it enjoys now. When it came out in 1982, I was living in Berkeley and saw it in a packed theater on what I’m pretty sure was opening night. From the first scene–explosions over an ominous-looking Los Angeles–I knew “Blade Runner” was a masterpiece. I loved the dystopian future it depicted, from the constant rain to the Japanese-influenced motifs. I loved the fact that Deckard was an updated Raymond Chandler detective who lived in a famous Frank Lloyd Wright house. I loved the fact that the climactic chase scene was filmed in the Bradbury Building, George Herbert Wyman’s 1893 iron-and-glass masterpiece that, like the film itself, was years ahead of its time.
I was surprised, to put it mildly, when the critics didn’t share my enthusiasm. Janet Maslin, though she praised the movie’s special effects, called “Blade Runner” “a mess, at least as far as its narrative is concerned.” On their TV show “At the Movies,” Gene Siskel called it “a waste of time,” while Roger Ebert gave it a thumbs up only for the effects. Twenty-five years later, Ebert reappraised it positively, in part because the once-futuristic lighted billboards had become a reality: “the story benefits…by seeming more to inhabit its world than be laid on top of it.” (Siskel died in 1999, so there’s no way of knowing whether he would have changed his mind.) The Hollywood Reporter called it “a Felliniesque journey into Dante’s Inferno, with Mickey Spillane in tow,” though it also called it “mesmerizing.” Thanks to its decidedly mixed critical reception, “Blade Runner” was a box office dud.
The film’s reputation started changing with the release of Ridley Scott’s director’s cut in 1992. Shorn of its voice-over narration, “Blade Runner” gained a new following and began to be regarded as a science fiction classic. The lack of narration–tacked onto the original because some thought the story confusing–gives the film greater dynamism, as does additional footage that seems to affirm the theory that Deckard himself is a replicant. In 2007, the Final Cut, which I haven’t seen, expanded the unicorn dream sequence, remastered the haunting Vangelis score and added three scenes.
On October 6th, we’ll finally get the sequel: “Blade Runner 2049,” starring Ryan Gosling and Harrison Ford. Directed by Denis Villeneuve, it looks worthy of the original and will draw a massive audience of fans, including me. As for the critical reception, it’s safe to assume a much better response than the original received in 1982.
September 18, 2017 § Leave a comment
Throughout his long career, which began in TV and movie Westerns and dramas in the 1950’s, Stanton was described as a character actor, one who excelled in secondary roles. The character actor label followed him even after he became a leading man in two 1984 films: “Repo Man” and “Paris, Texas.” It was, I suppose, a reference to his anti-movie star looks: a slight build and a gaunt, weathered face that became more fascinating as he aged. Stanton’s nose, always prominent, came to dominate his face as the rest of his features receded. Increasingly, his visage looked less made of flesh than carved from wood.
But his looks had little to do with his acting. His method relied heavily on observation and stillness, two qualities that elude most actors and almost all movie stars. Lucky is a man of few words, yet there’s never a moment when Stanton isn’t fully thinking his thoughts and feeling his feelings. We see Lucky going through his day–getting dressed, shaving, brushing his teeth, exercising, venturing out to talk with fellow residents of his tiny town–quotidian activities made profound by Stanton’s acting. I wouldn’t have minded if the film had gone on in this way, like a more mundane “Groundhog Day,” but the screenwriters Logan Sparks and Drago Sumonja had a bigger theme in mind: mortality, and Lucky’s reconciliation with it. Filling out the story are colorful characters played by David Lynch, Beth Grant, Ed Begley, Jr. and Ron Livingston, but the film–and every scene in it–belongs to Harry Dean Stanton. For me the highlight is his unexpected, perfect singing of “Volver” (“To Return”) at a child’s birthday party. More than a showcase for Stanton’s musical talents–he was an accomplished singer and guitarist–“Volver” is a song about memory, love, the passage of time and the inevitability of death:
To Return with a withered brow/the snows of time silvered my temples
To feel life’s a puff of breath/that twenty years are nothing
Through his performance of “Volver,” Lucky gains an acceptance of his own end, a moment of grace that affects everyone in the scene. Days later, I’m still thinking about it.
September 12, 2017 § Leave a comment
In the years since our interview, Hargobind married, closed his business and moved with his wife Dalveer to New York. Soon afterwards, he was diagnosed with brain cancer. The last time I saw him was in 2015, during a visit to Los Angeles while he was in remission. More surgeries followed, and today he came to the end of his life after a brave two-and-a-half-year battle.
Though he became a New Yorker, I will always think of Hargobind in Hollywoodland, a place he loved. In addition to local history, he learned about the wildlife and was able to identify birds by their calls. He led so many people up the Hollywoodland stairs that he grew noticeably thinner and more muscular, yet he was always respectful of us residents. I was lucky to be among his and Dalveer’s friends, a group that spans the world and today remembers him fondly.
September 9, 2017 § Leave a comment
It’s hard not to be puzzled by recent director changes in the “Star Wars” series. The latest to be fired is Colin Trevorrow, who was supposed to direct “Star Wars: Episode IX.” He follows Phil Lord and Chris Miller, who were fired during production on the untitled Han Solo film (and replaced by Ron Howard), and Josh Trank, who was either supposed to direct the Han Solo movie or the Luke Skywalker one–no one seems to know for sure. Then there’s Gareth Edwards, who received directorial credit for “Rogue One” but was relieved during production by its screenwriters, Chris Weitz and Tony Gilroy. Because Weitz and Gilroy did extensive reshoots and fixed the third act, they are widely credited with the success of “Rogue One.”
Producer Kathleen Kennedy’s formula for directors seems to be: find them at Sundance, give them a bigger picture and then move them up at warp speed to “Star Wars.” Hence Trevorrow, whose successful indie film “Safety Not Guaranteed” led him to direct “Jurassic World.” The critical and box office failure of his latest, “The Book of Henry,” sealed his fate on “Star Wars: Episode IX,” but shouldn’t his modest resume have disqualified him in the first place? Phil Lord and Chris Miller’s animated hit “Cloudy With a Chance of Meatballs” led them to “21 Jump Street” and “The Lego Movie,” but none of the three predicted a successful transition to the world’s biggest franchise. Josh Trank’s indie hit “Chronicle” gave him the director spot on “Fantastic Four”–again, not an obvious pathway to “Star Wars” glory.
As everyone knows, there has never been an experienced woman director of a “Star Wars” film, much less one with as little directing experience as these men. Women with critically successful first films tend to spend years trying to finance their second, not juggling action film offers from major studios as their male peers do. Often they wind up directing TV shows–hardly the purgatory it used to be, but not the same as having their name on the poster of the big summer movie. I can think of three excellent women directors with long resumes off the top of my head–Kathryn Bigelow, Mimi Leder and Ava DuVernay. All have successful action and effects films to their credit, but were they even considered for “Star Wars”? It’s easy to say they wouldn’t want the oversight that comes with the job, but I wonder if any of them were asked.
As it happens, Ava DuVernay–fresh off her Oscar nomination for “13th”–is already directing the forthcoming “A Wrinkle in Time” for Disney, which owns “Star Wars.” She is the favorite candidate of many fans for “Star Wars: Episode IX” and a likely choice. Let’s hope Kathleen Kennedy thinks so too.
August 1, 2017 § Leave a comment
My first job after college was an odd mix of performing arts and social work. Each week, San Francisco’s theaters, dance and opera companies and orchestras would funnel their unsold tickets to my firm, which would distribute them to social service groups that could fill seats on short notice. Our clients were low-income seniors, recovering alcoholics, pregnant teenage girls and the mentally ill, all of them living or receiving care in facilities.
It was a job that put me in daily phone contact with theater managers, social workers and, in the case of the seniors and alcoholics, the clients themselves. First came the tickets, then the matching of shows to clients: nothing violent for the mental patients, for example, and nothing depressing for the pregnant girls. The alcoholics were the most manipulative, missing their call-in deadlines and refusing all opera, symphony and ballet. The mental patients were the most frequent no-shows and disrupters, while the pregnant girls never wanted to see anything. The seniors were by far the most reliable, and game for almost anything. Consequently, they were our most frequent clients.
Each group was escorted by one of our volunteers, local culture enthusiasts who lacked the income to see live events. They went to most of the performances, but occasionally my boss would assign me a show she thought I should see. One of these was a revival of Sam Shepard’s “True West” at the Magic Theater, which had launched the first production of the play in 1980. I had never seen a Shepard play before and probably would have been impressed by any of them, but seeing “True West” in the small theater where it originated was an indelible experience. As the brothers Austin and Lee argued, drank, fought and switched personalities, I knew I was seeing a work of genius.
Sam Shepard’s side career as an actor certainly garnered him more fame and money than his career as a playwright, and like most people I’ve seen a lot of his movies. Most put him in supporting roles, where his handsome, lone cowboy looks and quiet charisma had maximum impact. Each time he appeared onscreen, whether in “Frances,” “Crimes of the Heart,” “The Assassination of Jesse James by the Coward Robert Ford,” or any number of others, I felt a wave of excitement followed by a sense of relief. Whenever Shepard showed up, the film always got better.
Hearing of his death yesterday made me wonder who would fill his shoes, both as a playwright and actor. In the former category, comparisons to Williams and O’Neill are easier than those to younger American playwrights, whose work seems paler and less universal. And in an era when young American actors stay boyish throughout their careers, the Sam Shepard roles increasingly go to Australian, Irish, Scottish or English actors. Shepard himself was often compared to Gary Cooper, a Montana native he resembled both physically and stylistically, but who will remind us of Sam Shepard?
June 29, 2017 § Leave a comment
“Funeral Parade of Roses,” by the late Toshio Matsumoto, premiered in Japan in 1969 but was not seen in the United States until 1970, probably because of its depictions of gay sex, drug use and violence. Matsumoto, who for most of his career was an academic and an experimental filmmaker, sets his story in the demimonde of Tokyo’s Shinjuku district. His characters are gangsters, filmmakers, student rioters and trans women. Most of the action takes place in a gay club whose gangster owner, Gonda (Yoshio Tsuchiya), is pitting Leda, the “mama” (Osamu Ogasawara), against a younger rival, Eddie (Peter), who aspires to succeed her as the hostess. As the lover of both Leda and Eddie, Gonda sets in motion a tragedy of Greek proportions.
“Funeral Parade of Roses” is at once an art film, a black comedy, a feature film, soft core porn, a film-within-a-film, a horror flick, a political commentary and a retelling of “Oedipus Rex”–and I’ve probably missed a few genres. It references Man Ray’s photographs and French cinema, and is beautiful, messy and brilliant. The film was a major influence on Stanley Kubrick, who borrowed from it in “A Clockwork Orange” and “Eyes Wide Shut.” Despite being nearly fifty years old, “Funeral Parade of Roses” received a wildly enthusiastic reception from a mostly young audience at Cinefamily the night I saw it. It will be released on DVD and deserves its praise.
“Oh, Lucy” directed and co-written by Atsuko Hirayanagi, was well received at Cannes this year. The story of a 55-year-old single woman in Tokyo who unexpectedly changes her life, the film deals in Japanese themes (suicide by train, office ladies, yakuza) as well as universal ones (workplace politics, alienation and family relationships).
Setsuko (Shinobu Terajima) is a hoarder and office drone who witnesses a suicide on the tracks on her way home one evening–in fact, the man bids her goodbye before jumping. Soon afterwards, her niece Mika persuades Setsuko to buy a package of English lessons from her. Despite having no interest in learning English, Setsuko hands over the money and goes to the class which, oddly, is held in a yakuza establishment in Shinjuku.
“I’m a hugger,” says the American teacher, John (Josh Hartnett). He promptly wraps her in an embrace, christens her Lucy and makes her wear a blonde wig, all of which he claims will help her to learn English. Galvanized by his method, Lucy develops a crush on John as well as a tentative friendship with a fellow student, Tom (Koji Yakusho). When John abruptly disappears along with Mika, Lucy wastes no time in flying to Los Angeles to find him, accompanied by her estranged sister Ayako, Mika’s mother. Once they find John, they set out on a road trip to San Diego in search of Mika. There, liberated and unmoored, Lucy wreaks havoc on everyone around her. A black comedy that gets progressively darker before its hopeful ending, “Oh, Lucy” is as unpredictable and indelible as its heroine. It’s well worth seeing.
April 9, 2017 § 1 Comment
Those who’ve seen my documentary, “Under the Hollywood Sign,” will remember my interview with the musician Alan Brackett, a longtime Hollywoodland resident who also contributed the song that accompanies the end credits. Brackett has just published an illuminating memoir, Almost Famous: Journey to the Summer of Love, about his early life in Santa Barbara, where he was a child performer, and his subsequent musical career in Los Angeles during the 1960’s.
“I believe I helped kill [folk music] with…over-exposure,” he writes refreshingly. Brackett isn’t kidding: before founding the seminal psychedelic band the Peanut Butter Conspiracy, he was a successful folk musician, most notably in the Hillside Singers, a quartet that toured the country during the height of the folk craze in the early 1960’s, when he was still a teenager.
The other reason for folk’s demise, of course, was the British Invasion, whose seismic influence Brackett grasped as he enlisted in the Marines in 1964, ahead of being drafted. After six months of service he returned to a changed world, musically and socially: the 60’s had begun in earnest. His new band (first called The Young Swingers, then The Ashes) played rock, and after a few more incarnations and personnel changes became the Peanut Butter Conspiracy in 1966. The band signed with Columbia, cut an album and quickly became famous. Brackett, who played bass, was its main songwriter.
PBC had a woman as its lead singer, Barbara “Sandi” Robison, which probably contributed to its rivalry with the Jefferson Airplane, led first by Signe Anderson and then by Grace Slick. (Beyond that, the Airplane’s drummer, Spencer Dryden, had been a member of The Ashes.) In an affecting aside, Brackett talks about manager Bill Graham’s reaction to the PBC’s getting better reviews than the Jefferson Airplane did: he kept the band off any bill that included the Airplane, effectively cutting off the PBC’s chances to play festivals and large venues across the country.
While “Almost Famous” will appeal most to those who remember the Peanut Butter Conspiracy and its heyday, anyone can appreciate the whirlwind atmosphere of the late 1960s music scene. Within a few months of its founding, the PBC not only had a major label recording contract but was billed with every famous band and musician of the day. The Doors, the Association, Iron Butterfly and the Byrds are a few of the bands Brackett knows well, and Bob Dylan, Janis Joplin, Elvis Presley and Frank Zappa enliven his anecdotes. His memories are all the more affecting because many of these musicians are gone, along with the Los Angeles they inhabited so brightly.
“Almost Famous” has some drawbacks: it’s heavy on childhood reminiscences and light on Brackett’s later life, including a stint in music publishing and a longer career as a Hollywood prop master. It also could have benefitted from a cleanup of the spelling, punctuation and grammar. Nevertheless, the book is a valuable account of an important time in American culture, and well worth reading.