By Richard Schickel
It begins sometime in early December, in a screening room near you, with a handful of middle-aged men and women impatiently awaiting the start of a new movie. “Bad year,” one of these critics muses. “Can’t remember a worse one,” someone else offers. To which I never counter with the logical comment, “Sure you can.”
It seems superfluous. Movies are the products of human ingenuity, or the lack thereof, which means, of course, that the vast majority of them are pretty bad and a substantial minority of them are just plain god-awful. This annual chorus of complaint rises in volume as the various critics groups announce their prizes and individual writers submit their 10-best lists. It climaxes, as it did earlier this week, with the announcement of the Academy Award nominations, which are presented—and reported upon in the press—with a certain objectivity. Mild surprise is rendered at various errors and omissions (What, no directorial nomination for Christopher Nolan?), dubious predictions about the outcome offered up. This year, the prizes will be presented on Feb. 27. And I promise you that by March 1 we will be unable to recall most of the winners, although there are always the display ads reminding us of who won the prize for best editing.
Thus do a few movies “enter history.” Except that history—especially movie history—has a way of moving to its own peculiar rhythms. For example, who knew, at the time, that the best movie of 1950 was “In a Lonely Place”? The Oscar that year went to “All About Eve,” which was OK, I suppose, though the film is so brittle that it tends to fall tinkling at your feet when you re-encounter it now.
As 2010 drew to a close, “The Social Network” looked like a sure Oscar thing. I couldn’t understand that at all—a bunch of unpleasant people arguing over the inordinate profits of an Internet institution that was of no consequence to me—or, I would have thought, to anyone else over the age of 40. I guessed that the raves were arising from aging reviewers eager to prove their hipness and thus preserve their jobs from eager young things willing to work at half price.
This week’s very routine nominations hinted that I was not entirely wrong in my suppositions. “The Social Network” did all right—eight nods, most of them in significant categories—but three other films did better, and the coverage I read was notably reserved in its comments on the picture. I’m not saying it peaked early. I’m saying that the academy elders are not a group that spends a lot of time friending each other on Facebook. My guess is that they think “Network’s” protagonists are a bunch of snotty little twerps, not unlike some of their own sons and daughters, who after college moved back into the house to whine away a couple of years before entering the job market.
So it goes in this year of staggering irrelevance in Oscarland. Did you really profit by your exposure to the psychosexual hysteria of “Black Swan”? Or feel the triumph of the human spirit watching a guy cut off his arm to save his own life in “127 Hours”? Or thrill to the gnomic power of “Inception”? Or think that the always enjoyable Jeff Bridges was all that much better than John Wayne, who joyously subverted his own image in the original version of “True Grit”?
I’m not saying that the films nominated for best picture did not offer some incidental pleasures. For example, we all like Annette Bening, an obviously nice woman, a good and decently ambitious actress, whose successful taming of Warren Beatty’s libido also excites Hollywood’s admiration. But her work in the tepid “The Kids Are All Right” is more solid than stirring. I’ll take her in “The Grifters” or “Bugsy” every time. By the same token I’ll take Mark Ruffalo—a consistently underrated actor—in “Kids” over his competitors. His charm is very insinuating.
But still, this is The Year of Living Safely—within disguised genre limits, within emotionally predictable ranges. Except, perhaps, for “The King’s Speech.” It has the most nominations (12) and is therefore the front-runner for the best-picture prize. Yet support for the picture seems rather hangdog. It’s a bit like arguing for “Mrs. Miniver” as a Major Motion Picture experience.
The virtues of “The King’s Speech” are so—I don’t know—traditional. Or seemingly so: royalty brought down from its high horse and humanized as it struggles with its damned stammer; a nicely evoked reconstruction of another time and place; a sense that triumphing over a kingly speech impediment may have been a previously unknown hinge of fate. Watching it, you sometimes get a regressive feeling; it’s almost as if we were back in the 1930s, when about half the American movies were set in England or its colonies, and C. Aubrey Smith was the leader of the local “English colony,” captaining cricket teams in Santa Monica.
But I think “The King’s Speech” is better than that. Its acting is very formal—100 percent wool and several yards wide. And it has a lot of live-wire English eccentricity about it. It may look a little bit stuffed-shirt at first glance, but there is plenty of cross-class sniping. And it does impart the feeling that the bonding that takes place between Colin Firth’s George VI and Geoffrey Rush’s cheeky speech therapist may predict, sometime in the future, a genuine loosening of England’s class structure and strictures. And you really can’t beat the divine Helena Bonham Carter sitting on the king’s chest in aid of curing his stammer.
Practically speaking, the success—so far—of this movie reflects the fact that academy membership skews old, as they say. The members are ever ready to stand up for what people used to call “The Tradition of Quality”—otherwise known as stodginess. But sometimes the stodgy is preferable to the dodgy. When it comes to the academy, it is always unwise to vote against amiability. Or likability. Or the merely well-spoken—which, all that stammering aside, “The King’s Speech” defiantly is.
Photo caption: He’s not buying it, folks. Our resident reviewer isn’t keen on Darren Aronofsky’s accolade magnet, “Black Swan.”