The Best Movies of 2014

February 25th, 2015

So, what did you think of the Oscar ceremony?  I’m always amazed at how little the show does with the actual movies it is supposed to be honoring.  (E. Haig has more to say on that subject in his review.)  As for the awards themselves, I find myself almost never satisfied.


This year, I suppose I can’t complain a whole lot about the award for best picture.  “Birdman” was my second-best film of the year, though it was a distant second to my top choice.  But most of the other films on my top ten list (see below) either didn’t win anything or weren’t even nominated.  Of course, my choices are based on entirely subjective factors that only start with what I perceive to be artistic excellence.  Other factors I consider are the noteworthiness of the film, the emotional reaction it elicits in me, the extent to which it stays with me weeks and months after I’ve seen it, and whether I feel it will stand the test of time.


And as movie years go, I have to say 2014 wasn’t a great one.  Only a few films qualify as truly great in my mind, and many that were projected to be extraordinary came up short, in some instances far short.


Among those I would put in that far-short category were Christopher Nolan’s “Interstellar” and Angelina Jolie’s “Unbroken.”  The latter was a worthy attempt to tell the dramatic story from Laura Hillenbrand’s book, but it leaned too heavily on the prison camp torture scenes.  And the former was just a mess, with an incoherent plot (even for a science fiction film).


Other films that I found less than compelling were Wes Anderson’s “The Grand Budapest Hotel,” Paul Thomas Anderson’s “Inherent Vice,” and “Whiplash.”  For those, I came away either shrugging (“Budapest”), puzzled (“Vice”), or just plain angry (“Whiplash”).  I was also disappointed by (despite admiring in part) “The Theory of Everything” and “The Imitation Game.”  Both are quality productions, but neither resonated with me.


I saw six films that were very good, but not quite good enough to make my top ten.  They were “Two Days, One Night,” with Marion Cotillard as the laid-off worker fighting to keep her job; Tim Burton’s “Big Eyes,” starring Amy Adams and Christoph Waltz; “Still Alice,” which was elevated by Julianne Moore’s revelatory performance; “Selma,” which succeeded in capturing the history of the Civil Rights movement, if not all of its passion; “Locke,” with Tom Hardy alone in a car for 90 minutes; and “The Railway Man,” with Colin Firth portraying the horrors of a World War II Japanese prison camp more effectively than “Unbroken” did.


With all that I’ve said to this point, my top ten films should be understood to be an entirely subjective list.  I don’t claim to be a professional critic, and I certainly have no training as a filmmaker or experience in the film industry.  As with all other forms of artistic expression, one person’s joy might be another’s displeasure.  Some will see films on this list they didn’t find all that impressive; some might even wonder if they saw the same movie.  And even I might think less of them if I saw them again.  The films don’t change, but our perception of them might, due to any number of factors, not least perhaps being our emotional mood that day.


With that heavy dose of caveats, here’s my list, starting with number ten and working down to number one:


  10. “Into the Woods” – This Disney production worked on many levels, weaving several fairy tales into a meaningful musical for adults. Meryl Streep again showed she can handle just about any role, and she was supported by a solid cast.


  9. “American Sniper” – The Chris Kyle biography was heavy on hagiography, but Clint Eastwood’s direction provided much to contemplate beyond the heroic portrayal. It’s the Iraq War version of Audie Murphy’s “To Hell and Back.”


  8. “Wild” – Reese Witherspoon made hiking interesting, and the backstory of why her character needed to hike 1,100 miles of the Pacific Crest Trail was effectively conveyed in Nick Hornby’s screenplay from Cheryl Strayed’s memoir.


  7. “Citizenfour” – Laura Poitras’s documentary of the Edward Snowden revelations is the scariest exposé of the fearsome power of a secret government since “All the President’s Men,” and that it is told as the events were developing makes it all the more stirring.


  6. “Gone Girl” – As dark a film (effectively directed by David Fincher) as could be made from an equally dark novel, with a resolution that is unsettling, especially because it is so believable. Ben Affleck more than holds his own against a masterful performance by Rosamund Pike.


  5. “Foxcatcher” – Another dark story, this one true, highlighted by an amazing performance by Steve Carell, excellent supporting performances by Mark Ruffalo and Channing Tatum, and great direction by Bennett Miller in bringing the tragic tale to life.


  4. “Love is Strange” – John Lithgow and Alfred Molina are so wonderful as the married couple in this film that the beautiful story their lives revolve around is icing on the cake. This film deserves far more praise and recognition than it received in its all-too-short release.


  3. “Mr. Turner” – Mike Leigh knows how to make real movies about real people, and in this complete portrait of the eccentric nineteenth century British painter, J.M.W. Turner, he has created another masterpiece, one of rare beauty and humanity.


  2. “Birdman or (The Unexpected Virtue of Ignorance)” – Alejandro Iñárritu’s film is so singular in its depiction of live theater, and yet touches such universal truths about human needs, dreams and delusions, that it is compelling viewing at the same time that it is wonderfully entertaining.


  1. “Boyhood” – Twelve years in the making, all with the same core actors, and telling a very personal story of a family and the son who grows into a young man, and it all works wonderfully. This film is a great credit to the vision of Richard Linklater.  It stands above all the others for me.


E. Haig’s Review of the Academy Awards Ceremony

February 25th, 2015

As an entertaining variety show, this year’s Academy Awards telecast (last Sunday on ABC) was heavy on music and included a fair number of humorous quips, lots of glamour (especially in the dresses many of the female presenters wore), and a few scattered snippets from actual movies.  The end result was very much a mixed bag, not just in terms of content, but with regard to the quality of the overall production as well.

The proceedings started off well enough with host Neil Patrick Harris performing a clever song and dance routine on the magic of movies.  He used back screen images of real films and was joined in part by Anna Kendrick and Jack Black.  Mr. Black provided the dark side of the industry in the lyrics he sang, while Mr. Harris and Ms. Kendrick were joyful and upbeat.  It set the stage for some of the tone that was apparent throughout the night, with the collective industry trying hard to prove that charges of latent racism were unfounded and that everyone was politically correct and family focused.

But too much of a good thing often makes for a bad show, and this awards ceremony teetered on the verge of being bad for much of the overly long (three-and-a-half hours plus) telecast.  The problems started with the early acceptance speeches, which were heavily laden with thanks to family members.  Few, if any, winners failed to mention spouses, children, and parents in the thanks they gave, and some (best supporting actor J.K. Simmons, in particular) sounded as if the family connection was everything for them.  Yes, it’s a nice sentiment, but it doesn’t necessarily make for compelling viewing, especially when the award winners are complete unknowns anyway (as many of the early awards go to technicians rather than true stars).

And when they weren’t thanking their families, many award winners saw fit to make political statements.  Patricia Arquette (best supporting actress for “Boyhood”) got the ball rolling with a rambling, read-from-paper speech that ended with a call for equal pay for women.  The President of the Academy, Cheryl Boone Isaacs, supported freedom of expression in her remarks.  Laura Poitras, winner for “Citizenfour” (best documentary), opposed government intrusion on privacy.  And Alejandro Iñárritu, in accepting the award for best picture (he also won for the screenplay of “Birdman” and for directing the film), closed the festivities by hoping the United States would reform its immigration laws in favor of Mexican immigrants.

And then there was the singing.  Some of it was okay; some of it was good; very little of it was memorable.  And yet there was so much of it.  Did the show really need the tribute to “The Sound of Music,” with Lady Gaga (who, however good a performer she may be, is not a great singer) working through a medley of the tunes from that movie?  Was that entire segment just a way to honor Julie Andrews, who then presented the award for best movie score?

The other songs of note were Glen Campbell’s “I’m Not Gonna Miss You,” sung by Tim McGraw (a sadly truthful prediction of what Mr. Campbell’s Alzheimer’s disease would cause him to lose), and “Glory” from the movie “Selma,” sung by John Legend and Common (the rap artist).  The reception to that song marked another overdone aspect of the show.  As everyone knew, the Academy had taken a lot of heat for the seemingly all-white slate of major award nominees.  Mr. Harris noted it in his opening jokes (most of which were less than side-splitting), and it was evident in the number of non-white presenters of the many awards over the course of the evening.  And so it came as no surprise that Mr. Legend and Common received three standing ovations: one after they sang their song, one after it won the award for best song, and one after their thank-you speeches, which lauded the work of Martin Luther King and (yet more political speech) called for the struggle to continue.

As for Mr. Harris, his main contribution, other than that opening song, was a clever bit where he appeared on stage in his underpants after he locked himself out of his dressing room (a take on the scene from “Birdman,” where much the same thing happens to the character played by Michael Keaton).  It was mildly amusing, but probably came across as just weird if you hadn’t seen the movie.  Less successful was an extended bit where he made much of his own predictions for the night.  He had them locked in a safe on the stage and repeatedly made sure everyone knew they were sealed in that safe.  Then, at the end of the night, he retrieved them and showed them to be accounts of many of the speeches that had been delivered earlier.  For all the buildup, it kind of fell flat.

As did much of the evening, which included precious few actual movie clips.  Short bits (not even full scenes) totaling 30 seconds were shown of each of the eight films nominated for best picture as they were introduced.  Short (very short) scenes were shown of each of the acting nominees.  And there were the scenes of “The Sound of Music” for that tribute.  Otherwise, it was as if the night intended to honor the best from the movie industry had been scheduled for another night.  We keep hoping that one year the Academy will actually produce an Oscar ceremony that really celebrates the cinematic arts its industry produces.  Clearly, this wasn’t that year.


Contemplating Brian Williams and the Lies We Tell

February 19th, 2015

Brian Williams may never deliver the evening news on NBC or any other network again.  If that proves true, he’ll go down as a major figure in American journalism who “misremembered” himself into oblivion.

As the dust settles around Mr. Williams’s six month suspension from his position as chief news anchor for NBC’s nightly news broadcasts, many are wondering how a man who had secured the top job in his profession could have been so careless in reporting a personal anecdote.  The answer really isn’t hard to understand in terms of the human condition writ large.  Lying, in all its variant forms, is as common a trait as honesty, and far more widely practiced.

The kind of lying in which Mr. Williams engaged is even more common still.  In the vernacular, it is referred to by a term synonymous with cattle droppings.  The classic example of this form of prevarication is the fisherman’s tale of the fish he caught and then released.  In truth, the little sea dweller might have been a healthy five-pounder, but in the tale the proud fisherman recounts to friends later, it was a whopper, weighing at least forty pounds.

The fisherman might even expand his story into a veritable “Old Man and the Sea” saga, making himself out to be something of a hero in the process for battling long and hard to pull his catch aboard before then releasing the captured prey back into its natural home.  His friends will be dutifully impressed, unless they have told a few such tales themselves, in which case they’ll smile and dismiss the story for what it is: B.S. (the family-friendly version of that cattle dropping term).

Williams’s report of his harrowing experience as a passenger in a helicopter that was struck by an RPG while on patrol in Iraq in 2003 was a classic fisherman’s tale.  In fact, he had been in a helicopter that was a good 30 minutes behind the one that was hit, according to military personnel who recalled the incident.  Williams’s claim that he “misremembered” the details of the incident is the kind of half-apology that never cuts it, but it’s not unusual.  Hillary Clinton offered a similar “excuse” after misreporting an incident she claimed to have experienced during her husband’s administration.  That B.S. got her in trouble in the 2008 campaign and could be raised again if/when she runs next year (or later this year, as rumors suggest).

Both Williams and Ms. Clinton are guilty of classic B.S., the kind that in social circles is most often ignored or, at worst, considered gauche, and hardly grounds for seriously diminishing one’s view of the prevaricator.  Even people who are habitual bull s—tters usually escape social shunning, and many are still admired or at least appreciated for other qualities.

But in Mr. Williams’s case (and to a lesser extent Ms. Clinton’s) the B.S. raises unappealing questions about his trustworthiness in doing what is his basic job, to wit: accurately and honestly reporting the news.  In an earlier time, the revered Walter Cronkite was studied microscopically by would-be critics to see if he ever allowed a false report to pass from his lips.  Even his facial expressions were viewed with something approaching a compulsive degree of attention.

But Cronkite, an absolute paragon of trustworthiness, was always above reproach.  Williams isn’t in his league, nor for that matter are most of the cable news anchors whose political biases often cause them to omit parts of a story, creating a false impression (the essence of a lie) of real events.

Ms. Clinton’s fate is less likely to be controlled by her false claim that she escaped enemy fire in Bosnia.  First of all, she’s a politician, not a news anchor.  Politicians are expected to prevaricate, which is not to excuse it, but what politician doesn’t “color the truth” in making campaign promises, if not in describing his or her accomplishments?  The ones who get in trouble are those who go beyond typical campaign promising and résumé puffing to outright lying about illicit or criminal behavior.  Bill Clinton got impeached for trying to artfully lie his way out of an illicit affair.  Others have similarly crossed the line in trying to avoid the opprobrium of their mis- and malfeasances.

But lying in its many forms and for its many reasons is part of the human condition.  It is recognized in the very first pages of the Bible, when Cain lies to God about his murder of his brother.  The writers of that story assuredly understood the penchant for lying that even in the early days of civilized society was clearly rampant.  And so it has been that throughout the existence of our species lying has been a commonly accepted, if oft condemned, part of our being.  Rare are the individuals who, like the young George Washington, “cannot tell a lie.”

Think about the oath that witnesses take in court.  Why must they swear to God or, more commonly now, under penalty of perjury, to “tell the truth, the whole truth, and nothing but the truth”?  Isn’t the requirement of that oath based on the recognition that we will be inclined (or at least susceptible) to lie, if we don’t swear to be honest?

People lie.  They lie for all kinds of reasons and in pursuit of all kinds of goals.  People lie to self-aggrandize; they lie to get out of trouble; they lie to gain advantage in relationships; they lie to avoid expressing honest feelings; they lie to protect loved ones; they lie to accommodate the demands of bosses or supervisors; they lie to hide their true identities; they lie, sometimes, just because it makes for a better conversation.

Lying in social circles, and even in political ones, is easy; it’s accepted, and, so long as the result of the lie isn’t directly harmful to anyone, it usually isn’t condemned.

The more interesting question, one that gets folks like Mr. Williams in real trouble, is whether telling the truth is equally harmless.

On Trusting the Human Condition

February 8th, 2015

The drive home from my gym every day requires me to take a left turn from a main thoroughfare (Fair Oaks Boulevard in Carmichael) onto a residential street that leads to my house.  In making that turn, I often have to wait for oncoming traffic (moving from west to east) to pass.  Those cars often whip around the bend in the road at that point at speeds well in excess of the posted forty-mile-per-hour limit.

Usually, I pay the cars that dart by to my left no mind, waiting instead for a break in the traffic to make my turn from the turning lane where I station my car.  The turn itself is not an issue; it’s a basic Driver’s Ed left turn.

But every now and then, as I wait for the moment when all is clear, I contemplate the admittedly remote possibility that one of those oncoming speedy vehicles will lose control or otherwise veer into my turning lane.  It isn’t a completely irrational concern.  Serious accidents do occur at that point, more often due to the risk undertaken in turning left onto Fair Oaks as opposed to turning off of Fair Oaks (if you get the picture).

I could avoid this risk of a life-threatening collision by taking a more circuitous route home from my gym, but I don’t, because . . . well, let’s see, why don’t I take that more circuitous route?  It would take me an extra minute or two to get home, so that can’t really be the reason.  It’s probably because the odds of one of the oncoming cars veering into my lane are low, not infinitesimally low, but low enough that I accept the risk.

In the end, I’m trusting my fellow drivers to be reasonable, to be appropriately cautious when they whip around a bend in the road just a tenth of a mile from my intersection and to be appropriately attentive to the possibility that a car might be waiting to turn into their path.  And, of course, my trust is buttressed by the fact that any driver who isn’t appropriately cautious and attentive risks serious injury to him or herself as well as to me if he or she should veer into my turn lane.

But still I wonder about the calculus of my decision to take on the risk.  Accidents, after all, do happen, and, as is also axiomatic in Driver’s Ed, defensive driving is the best antidote to being a victim of those accidents.  So, maybe I should take that more circuitous route on my drive home from the gym.  Maybe I should just take more responsibility for my own fate and leave less to chance.

And yet, I won’t, because, despite my recognition that those accidents that do happen are most often caused by drivers who are less than appropriately cautious and attentive, I trust that most drivers are and that their personal safety amps up their level of caution and attentiveness just enough to work the calculus in favor of risking the accident to save the two minutes.

If all of this relatively trivial contemplation sounds odd, consider how often all of us engage in the same innate reasoning.  Much of human activity is based on the trust we have in each other.  We trust our doctors to use their professional training and expertise correctly in diagnosing a symptom.  We trust police officers to use discretion in the exercise of their duties and in the use of force.  We trust airline pilots to stay alert to all kinds of potential dangers in flying the planes we board.  We trust teachers to promote good standards of conduct and strong work ethics in our children.  We trust the cashier to accurately punch in the code for the items we purchase at our local supermarket.  We trust the mechanic to keep our car in working order.

And on and on it goes.  In almost all forms of human interaction, we rely on each other in a kind of implicit social compact to do the right thing, to refrain from doing the wrong thing, to be, in essence, responsible.

We so rely and place our trust in spite of the undeniable fact of the human condition, which is that we are very much imperfect beings, fully susceptible to the kind of conduct that puts all of us at risk, all the time, of the “accidents” that will, inevitably, happen.

And, of course, if we expand our view beyond the accidents that the human condition inevitably creates, we confront the less forgivable, but still prevalent, aspects of our species’ condition that flow from our innate selfishness.  We can be mendacious and even downright evil, either because we lapse into irrationality or because we never acquire a moral bearing at an early age.

Thus, we can lie, cheat and steal with minimal restraint, save for the laws and regulations that our society imposes on us.  Temptations abound in our daily lives, and rationalizations are readily available.  When the store clerk mistakenly gives me a ten dollar bill instead of a single for my change, do I return it or keep it?  Will anyone know if I keep it?  Doesn’t it make up for the many times when I get short-changed?

Such thoughts are also part of the human condition.  Thus did Adam bite the apple; so did Eden sink to grief.

I’m not sure where all of this takes me.  I’m certainly not seeking to decry everything that makes us humans what we are.  We are also capable of kindness, charity and empathy; and the ability to love selflessly is what, in large part, makes life worth living.

We’re a mixture of capabilities and tendencies that have been consistent throughout the millennia of our existence.  In the end, we have survived and progressed because, ultimately, our trust in each other has been well-founded.

Which is why, despite the ever-present anxiety that comes from thinking too hard, I’ll continue to take the risk on that left turn to get home from my gym.