Category Archives: Popular Culture

Fear of Identity Erosion

A few weeks ago, I finally got around to watching Sound and Fury, the Academy Award-nominated 2000 documentary about two families struggling over whether their deaf children should receive cochlear implants. These tiny electronic devices are surgically implanted, and will usually improve hearing in deaf patients, but—it is feared by the families featured in Sound and Fury—this improvement will come at the expense of “deaf culture.”

The film is an absorbing exploration of what we mean by culture and identity, and how critically important these concepts are to us. Because here’s the thing—the parents of one of the children being considered for cochlear implants (who are themselves deaf) choose not to have the operation, even though their child has asked for it, and even though it will in all likelihood significantly improve their young daughter’s hearing.

Why? Because improved hearing will negatively affect their daughter’s inclusion in the deaf tribe. I use that word advisedly, because it seems that is what identification comes down to for nearly all of us—inclusion in a group, or tribe. We identify ourselves via gender, language, race, nation, occupation, family role, sexual orientation, etc.—ever narrower groupings—until we arrive at that final, fairly specific definition of who we are. And we value these labels enormously. We will fight wars over these divisions, enact discriminatory laws, and cleave families apart, all in order to preserve them.

And here’s the other point that the film makes abundantly clear: technology forces change. I’m told that American Sign Language (ASL) is the equivalent of any fully developed spoken language, even to the point where there are separate dialects within ASL. The anxiety felt by the parents of the deaf daughter about the loss of deaf culture is entirely justified—to the extent that cochlear implant technology could potentially eradicate ASL, and this language (like any other language) is currently a central component of deaf culture. With the steady advance of implant technology, the need for deaf children to learn ASL could steadily decrease, to the point where the language eventually atrophies and dies. And with it deaf culture?

Possibly, yes, at least in terms of how deaf culture is presently defined. To their credit, it seems that the parents featured in Sound and Fury eventually relented, granting their child the surgery, but they did so only after fierce and sustained resistance to the idea. And so it goes with ‘identity groupings.’ We are threatened by their erosion, and we will do all manner of irrational, at times selfish and destructive things to prevent that erosion.

My friend Rafi, in a recent and fascinating blog post, announced that this year, he and his family will mostly forego the Passover rituals which have for so long been a defining Jewish tradition. He writes that, after a sustained re-reading and contemplation of ‘The Haggadah,’ the text meant to be read aloud during the Passover celebrations, he found the message simply too cruel, too “constructed to promote fear and exclusion.” “I’m done with it,” he announces.

Well, at the risk of offending many Jewish people in many places, more power to him. He does a courageous and generous thing when he says no more “us and them,” no more segregation, no more division.

All cultures, all traditions can bring with them a wonderful richness—great music, food, dance, costumes, all of it. But they can also bring insecurity, antipathy and conflict, conflict which often results directly in human suffering.

Everyone benefits from knowing who they are and where they came from culturally. But no one should fear revising traditions; no one should slavishly accept that all cultural practices or group identities must continue exactly as they are and have been. Technology may force change upon you, but regardless, recognize that change, whatever its source, is relentless. Anyone who thinks they can preserve cultural traditions perfectly intact within that relentless context of change is fooling themselves. Nor should anyone think that all cultural traditions are worth preserving.

New identities are always possible. Acceptance and inclusion are the goals, not exclusion and fear. It takes time, careful thought, and sometimes courage, but every human being can arrive at a clear individual understanding of who they are and what is important to them. Choose traditions which welcome others and engender the greater good. Reject those which don’t. If you can do this, and I don’t mean to diminish the challenge involved, you’ll know who you are, and you’ll undoubtedly enjoy a rich cultural life.

Storytelling 3.0 – Part 2

We tend to forget—at least I do—that, in the history of storytelling, movies came before radio. By about 15 years. The first theatre devoted exclusively to showing motion picture entertainment opened in Pittsburgh in 1905. It was called The Nickelodeon. The name became generic, and by 1910, about 26 million Americans visited a nickelodeon every week. It was a veritable techno-entertainment explosion.

The thing is, anyone at all—if they could either buy or create the product—could rent a hall, then charge admission to see a movie. To this very day, you are free to do this.

When radio rolled around—about 1920—this arrangement was obviously not on. It’s a challenge to charge admission to a radio broadcast. In fact, the first radio broadcasts were intended to sell radios; this was their original economic raison d’être.

Sadly, it very quickly became illegal to broadcast without a government-granted license. (Oddly enough, the first licensed radio broadcast also originated from Pittsburgh.) And almost as quickly, sponsorship became a part of radio broadcasting. The price of admission was the passive audio receipt of an advertisement for a product or service.

An exhibit in the Henry Ford Museum, furnished as a 1930s living room, commemorating Orson Welles’ radio broadcast of H. G. Wells’ The War of the Worlds. (Maia C photo)

Radio shows were much easier and cheaper to produce than movies, and they weren’t always communal in the way movies were; that is, they were not always a shared experience. (Although they could be—many a family sat around the radio in the middle of the 20th century, engrossed in stories about Superman or The Black Museum.)

More importantly, as with book publishing, the gatekeepers were back with radio, and they were both public and private. No one could operate a radio station without a government license, and no one could gain access to a radio studio without permission from the station owner.

Then came television with the same deal in place, only more so. TV shows were more expensive to produce, but like radio, they lent themselves to a more private viewing, and access to the medium for storytellers was fully restricted, from the outset. As with radio, and until recently, TV was ‘free;’ the only charge was willing exposure to an interruptive ‘commercial.’

With the advent of each of these storytelling mediums, the experience has changed, for both storyteller and audience member. Live theatre has retained some of the immediate connection with an audience that began back in the caves (for my purposes, the storyteller in theatre is the playwright), and radio too has kept some of that immediacy, given that so much of it is still produced live. But the true face-to-face storytelling connection is gone with electronic media, and whenever the audience member is alone, as opposed to in a group, the experience is qualitatively different. The kind of community that is engendered by electronic media—say, fans of a particular TV show—is inevitably more isolated, more disparate than that spawned within a theatre.

The first commercial internet providers came into being in the late 1980s, and we have since lived through a revolution as profound as Gutenberg’s. Like reading, the internet consumer experience is almost always private, but like the movies, access to the medium is essentially unrestricted, for both storyteller and story receiver.

And that, in the end, is surprising and wonderful. Economics aside for a moment, I think it’s undeniably true that never, in all our history, has the storyteller been in a more favorable position than today.

What does this mean for you and me? Well, many things, but let me climb onto an advocacy box for a minute to stress what I think is the most significant benefit for all of us. Anyone can now be a storyteller, in the true sense of the word, that is, a person with a story to tell and an audience set to receive it. For today’s storyteller, because of the internet, the world is your oyster, ready to shuck.

Everyone has a story to tell, that much is certain. If you’ve been alive long enough to gain control of grunt and gesture, you have a story to tell. If you have learned to set down words, you’re good to go on the internet. And I’m suggesting that all of us should. Specifically, what I’m advocating is that you write a blog: a real, regular blog like this one, or something as marvelously simple as my friend Rafi’s. Sure, tweeting or updating your Facebook page is mini-blogging, but no, you can do better than that.

Start a real blog—lots of sites offer free hosting—then keep it up. Tell the stories of your life, past and present; tell them for yourself, your family, your friends. Your family for one will be grateful, later if not right away. If you gain an audience beyond yourself, your family and friends, great, but it doesn’t matter a hoot. Blog because you now can; it’s free and essentially forever. Celebrate the nature of the new storytelling medium by telling a story, your story.

Storytelling 3.0 – Part 1

Leighton Cooke photo

With apologies to Marshall McLuhan, when it comes to story, the medium is not the message. Yet the medium certainly affects reception of the message. As I’ve written earlier in this blog, storytelling began even before we had language. Back in our species’ days in caves, whether it was the events of the day’s hunt, or what was to be discovered beyond the distant mountain, I’m quite certain our ancestors told stories to one another with grunt and gesture.

Once we began to label things with actual words, oral language developed rapidly and disparately, into many languages. The medium was as dynamic as it’s ever been. It was immediate, face-to-face, and personal. Stories became ways in which we could explain things, like how we got here, or why life was so arbitrary, or what the bleep that big bright orb was which sometimes rose in the night sky and sometimes didn’t. Or stories became a way in which we could scare our children away from wandering into the forest alone and getting either lost or eaten.

Then, some 48 centuries back, in Egypt, it occurred to some bright soul that words could be represented by symbols. Hieroglyphics—among the first writing systems—appeared. The art of communication has never been the same. The great oral tradition of storytelling began to wane, superseded by written language, a medium that is both more rigid and more exclusive. To learn to read and understand, as opposed to listen and understand, was more arduous, difficult enough that it had to be taught, and then not until a child was old enough to grasp the meaning and system behind written words.

It was not until about 1000 BC that the Phoenicians developed a more phonetic alphabet, which in turn became the basis for the Greek, Hebrew and Aramaic alphabets, and thus the alphabet I use to type this word. The Phoenician alphabet was wildly successful, spreading quickly into Africa and Europe, in part because the Phoenicians were so adept at sailing and trading all around the Mediterranean Sea. More importantly though, it was successful because it was much more easily learned, and it could be adapted to different languages.

We are talking a revolutionary change here. Prior to this time, written language was, to echo Steven Roger Fischer in A History of Writing, an instrument of power, used by the ruling class to control access to information. The larger population had been, for some 18 centuries—and to employ a modern term—illiterate, and thus royalty and the priesthood had been able to communicate secretively and exclusively among themselves, to their great advantage. It’s not hard to imagine how the common folk back then must have at times regarded written language as nearly magical, as comprised of mysterious symbols imbued with supernatural powers.

We are arriving at the nub of it now, aren’t we? Every medium of communication, whether it be used for telling stories or not, brings people together, but some media do it better than others. Stories build communities, and this is a point not lost on writers as divergent as Joseph Conrad and Rebecca Solnit. In his luminous preface to The Nigger of the ‘Narcissus’, published in 1897, Conrad writes that the novelist speaks to “the subtle but invincible conviction of solidarity that knits together the loneliness of innumerable hearts.” For a story to succeed, we must identify with the characters in it, and Solnit writes in 2013, in The Faraway Nearby, that by identification we mean that “I extend solidarity to you, and who and what you identify with builds your own identity.”

Stories are powerful vehicles, with profound potential benefits for humanity. But they can also bring evil. As Solnit has also written, stories can be used “to justify taking lives, even our own, by violence or by numbness and the failure to live.” The Nazis had a story to tell, all about why life was difficult, and who was to blame, and how we might make life better.

The content of the story matters; the intent of the storyteller matters. And the medium by which the story is told has its effect. As storytelling media have evolved through time, stories have come to be received differently, and by different people. Sometimes that’s a good thing; sometimes it isn’t.

To be continued…

Handprints in the Digital Cave

There are now more than 150 million blogs on the internet. 150 million! That’s as if every second American were writing a blog; or, by this imaginary measure, as if every single Russian were blogging, plus about another seven million people besides.

The explosion seems to have come back in 2003, when, according to Technorati, there were just 100,000 “web-logs.” Six months later there were a million. A year later there were more than four million. And on it has gone. Today, according to Blogging.com, more than half a million new blog posts go up every day.

doozle photo

Why do bloggers blog? Well, it’s not for the money. I’ve written on numerous occasions in this blog about how the digital revolution has undermined the monetization of all manner of modern practices, whether it be medicine, music or car mechanics. And writing, as we all know, is no different. Over the last year or so, I slightly revised several of my blog posts to submit them to Digital Journal, a Toronto-based online news service which prides itself on being “a pioneer” in revenue-sharing with its contributors. I’ve submitted six articles thus far and seen them all published. My earnings to date: $4.14.

It ain’t a living. In fact, Blogging.com tells us that only eight per cent of bloggers make enough from their blogs to feed their families, and that more than 80 per cent never make as much as $100 from their blogging.

Lawrence Lessig, Harvard law professor and regular blogger since 2002, writes in his book Remix: Making Art and Commerce Thrive in a Hybrid Economy that, “much of the time, I have no idea why I [blog].” He goes on to suggest that, when he blogs, it has to do with the “RW” (Read/Write) ethic made possible by the internet, as opposed to the “RO” (Read/Only) media ethic predating the internet. For Lessig, the introduction of the capacity to ‘comment’ was a critical juncture in the historical development of blogs, enabling an exchange between bloggers and their blog readers that, to this day, Lessig finds both captivating and “insanely difficult.”

I’d agree with Lessig that the interactive nature of blog writing is new and important and critical to the growth of blogging, but I’d also narrow the rationale down some. The final click in posting to a blog comes atop the ‘publish’ button. Now some may view that term as slightly pretentious, even a bit of braggadocio, but here’s the thing. It isn’t. That act of posting is very much an act of publishing, now that we live in a digital age. That post goes public, globally so, and likely forever. How often could that be said about a bit of writing ‘published’ in the traditional sense, on paper?

Sure, that post is delivered into a sea of online content where it will likely be flooded over, immediately and unread, but nevertheless that post now has a potential readership of billions, and its existence is essentially permanent. If that isn’t publishing, I don’t know what is.

I really don’t care much if anyone reads my blog. As many of my friends and family members like to remind me, I suck at promoting my blog, and that’s because, like too many writers, I find the act of self-promotion uncomfortable. Neither do I expect to ever make any amount of money from this blog. I blog as a creative outlet, and in order to press my blackened hand against the wall of the digital cave. And I take comfort in knowing that my handprint’s chances of surviving through the ages are far greater than those of our ancestors’ handprints, pressed as they were onto actual cave walls, gritty and very soon again enveloped in darkness.

I suspect that there are now more people writing—and a good many of them writing well, if not brilliantly—than at any time in our history. And that is because of the opportunity to publish on the web. No more hidebound gatekeepers to circumvent. No more expensive and difficult distribution systems to navigate. Direct access to a broad audience, at basically no cost, and in a medium that in effect will never deteriorate.

More people writing—expressing themselves in a fully creative manner—than ever before. That’s a flipping wonderful thing.

Guns

The Gaiety Theatre became a church, then a parking lot. (pinkmoose photo)

The game was derived directly from ‘the westerns’ we watched every Saturday afternoon at the Gaiety Theatre in downtown Grande Prairie, wherein the final act of every movie consisted of the good guy and bad guys (the baddies always outnumbered our hero) running around and shooting at one another. “Guns” we called it. “Let’s play guns!” we would shout, and soon we’d be lurking/sneaking around the immediate neighbourhood houses, blasting away at one another with toy weapons, inciting many an argument as to whether I had or had not “Got ya!” If indeed you were struck by an imaginary bullet, a dramatic tumble to the ground was required, followed by rapid expiration.

Let no one ever doubt the influential power of the ascendant mass medium of the day. As I’ve written elsewhere on this blog, I grew up without television, but those Saturday matinees were more than enough to have us pretending at the gun violence that is all too real in the adult world. Video games seem an even more powerful enactment of the gun fantasy that can grip children, but the difference may be marginal. I doubt that movies have lost much influence over young people today, and I further suspect that at least one gun still appears in the majority of Hollywood movies. Check out how many of today’s movie ads or posters feature menacing men with guns, usually prominent in the foreground. Sex sells, but so, it seems, do guns.

And of course the rest of the world, including those of us in Canada, looks with horror upon the pervasive, implacable gun culture in the U.S., wondering how it is that even the slaughter of twenty elementary school children isn’t enough to curb the ready availability of guns. Because, from a rational perspective, the facts are incontrovertible: more guns do not mean greater safety, quite the opposite. You are far more likely to die of a gunshot in the U.S. than you are in any other developed country. There are roughly 90 guns in the U.S. for every 100 people. The next closest is Serbia, at 58 per 100. In Canada it’s about 30. Australia, 15. Russia, 9. And a higher rate of mental illness does not mean greater gun violence. It’s pure and it’s simple: more guns mean more gun violence, more people being shot and killed.

But we are, by and large, not rational animals, and no amount of logical argument is going to convince members of the gun lobby that gun ownership should be restricted. It’s an emotional and psychological attachment that cannot be broken without causing increased resentment, anger, anxiety and a sense of humiliating diminution. Guns are fetishes to those who desire them, sacred objects that allow the owner to feel elevated in status, elevated to a position of greater independence and potency. After all, a gun will allow you to induce fear in others.

And yes, the American obsession with guns has historical roots, the revolution and the second amendment to the constitution and all that, but, as Michael Moore so brilliantly pointed out in the animated sequence in Bowling for Columbine, much more essentially it has to do with fear. People enamored of gun ownership feel threatened; without a gun they feel powerless in the face of threats from people they view as dangerously different from themselves. And nothing but nothing empowers like a gun.

You might think that people who love guns do not wish to play with them. Guns are not toys to these people, you might say; they are genuine tools used to protect their owners, mostly from all those other people out there who also own guns. But just down the road from where we live on Galiano is a shooting range. On quiet Sunday afternoons we invariably hear the sound of gunfire echoing through the trees, as gun aficionados shoot repeatedly at targets, trying to do exactly the same thing over and over again: hit the bull’s eye. Those people are indeed playing with their guns; they are recreating with their guns. Why? Because it makes them feel better.

Successful movie genres are manifestations of broadly felt inner conflicts; in the case of westerns those conflicts are around issues of freedom and oppression. And the western may still be the most successful of all movie genres, remaining dominant from the very birth of dramatic film (The Great Train Robbery, 1903), right through to the 1970s (McCabe and Mrs. Miller, 1971). The problem is that the western offered ‘gunplay’ as the answer to oppression, and therefore the suggestion that everyone should have a gun. But once everyone has a gun, everyone is afraid. And once you are afraid, no one is taking away your gun.

Requiem for a Cinema Pioneer

The great Quebec filmmaker Michel Brault died last month, and while he and his career were fully appreciated in his home province—Premier Pauline Marois attended his funeral on October 4, and the flag at the Quebec City Parliament building flew at half-mast for the occasion—we in English-speaking North America know too little of the profound contribution this film artist made to cinema.

Especially in the realm of documentary, Brault’s influence can hardly be overstated. He was among the very first to take up the new lightweight film cameras that began appearing in the late 1950s, and when he co-shot and co-directed the short film Les Raquetteurs (The Snowshoers) for the National Film Board of Canada in 1958, documentary filmmaking was forever changed. The 15-minute film focused on a convention of cheery snowshoers in rural Quebec, employing a fluid, hand-held shooting style, synchronous sound, and no voice-over narration whatsoever. The dominant documentary visual style in previous years had been the ponderous look made necessary by the bulk of 35 mm cameras, a style frequently accompanied by somber ‘voice of God’ narration. Subject matter was often ‘exotic’ and distant: Inuit people in the Canadian Arctic, say, or dark-skinned Natives in Papua New Guinea. Reenactment was, almost of necessity, the preferred manner of recording events.

In 1960, the French anthropologist-filmmakers Jean Rouch and Edgar Morin were shooting Chronique d’un été (Chronicle of a Summer) in Paris, turning their cameras for the first time upon their own ‘tribe.’ When they saw Les Raquetteurs, they immediately fired their cameraman and brought Brault in to complete the work. Rouch went on to label Chronique “cinéma vérité” (literally ‘truth cinema’), and an entire new genre of documentary film began to appear everywhere in the West.

Robert Drew and his associates (chief among them D. A. Pennebaker, Richard Leacock and Albert Maysles) took up the cause in the United States, labeling their work ‘direct cinema’ and delivering films like Primary, about the 1960 Wisconsin primary election between Hubert Humphrey and the largely unknown John F. Kennedy, and Don’t Look Back, about a young folksinger named Bob Dylan on his 1965 tour of the United Kingdom. Both films would have a marked impact upon the subsequent rise of these two pivotal political/cultural figures.

Brault himself was slightly less grandiose in describing the filmic techniques he pioneered, saying, “I don’t know what truth is.  We can’t think we’re creating truth with a camera.  But what we can do is reveal something to viewers that allows them to discover their own truth.”

He would later turn to fictional filmmaking, writing and directing, among other works, Les Ordres in 1974, a smoldering indictment of the abuse of power which transpired during the ‘October Crisis’ of 1970 in Quebec.  Les Ordres was scripted, but the script was based upon a series of interviews done with a number of people who were in fact arrested and imprisoned during the crisis.  As such, it was considered ‘docudrama,’ another area where Brault’s influence was seminal.  Brault won the Best Director Award at the Cannes Film Festival in 1975 for Les Ordres, and he remains the only Canadian to have ever done so.

These days, with video cameras in every smart phone and tablet, the idea that we should turn our cinematic attention to our own people is taken for granted, as every police department now teaches its members. But in Brault’s early career, that we should observe, at close quarters, those immediately around us, and do so in an unobtrusive but sustained way, then make that prolonged cinematic observation available to the public, well, that was an almost revolutionary notion. We could stay close to home, and let the camera reveal what it would. The process did not necessarily present ‘the truth,’ certainly not in any genuinely objective way, but observational documentary filmmaking granted us new understanding, new insight into people both with and without power. And we were the better for it.

If the goal is to leave a lasting impression, to press a permanent handprint onto the wall of the cave where we live, Michel Brault can rest in peace.  He made his mark.

Your Good Side

It’s an oft-told story that my friends, family members and students have heard too many times, but it was surgical in its impact on me over the years, and so I think it bears repeating.

I was standing in a long lineup for hours, waiting for a booth to open and begin selling tickets to a Bob Dylan concert. I had somehow been chosen by my friends to go alone and buy tickets for the bunch of us, and so I stood among a group of strangers who inevitably got talking.

The fellow I talked with the most was tall, with an impressive mustache and bad teeth, engaging in a funny and oddly insightful way.  He was telling me at one point about a co-vivant relationship between a professor friend of his and a younger woman, a relationship which soured with time.  The turn of phrase he used caused me to laugh out loud at the time, and think more about it later:

“They were still at that phase where they were showing one another their good sides.”

Sad but true, I thought. On that first date we are shining in our virtue, our willingness to behave in the most admirable, unselfish ways. Love blooms, issuing forth all manner of florid songs and poems about the very paragon of beauty and refinement that our lover is.

Fast forward to when we have been living together for a year, when all the foibles and flaws have been fully exposed.  She now knows that you squeeze the toothpaste tube in the middle and refuse to ever put a new toilet paper roll on the holder; you now know that she is a slob who leaves underwear lying all over the bedroom floor and spends hours every day on the phone with her mother.  It’s an arc of change that indeed seems inevitable.  We are many-sided creatures, and so, inexorably, we reveal all sides, including the dark one, to those who come to know us intimately.

Many years later, I wouldn’t disagree with that sentiment, but I’d also suggest we can contend with the slide.  We can resist the tendency to arrive at two separate standards of behavior: one for those who know us best, and one for everyone else.

The latter standard is of course the one we should aspire to, the one where we don our very best cloak of behavior in an attempt to make the best possible first impression.

Kurt Vonnegut (Rashawerakh photo)

It’s a daunting prospect, but the great American writer Kurt Vonnegut Jr. set down what are in fact some encouraging words in this regard.  In the introduction to Mother Night, he wrote: “We are what we pretend to be, so we must be careful about what we pretend to be.”

You see, there is the matter of will in this gloomy revelatory fate, and it offers what must be the most constructive strategy in the face of it. We can all go about pretending we’re still on that first date. In the grand (or not-so-grand) tradition of ‘What would ____ do?’, we can ask, ‘What would I do if we had just met?’

With sufficient effort, I’d suggest that—with due deference to Abraham Lincoln’s inescapable maxim that we can’t fool all the people all the time—we can in fact fool most of the people most of the time. If you pretend to be a good person most of the time, happily, most people will think you are.

Here’s another relevant Vonnegut near-aphorism (the guy was brilliant at them), from my personal favorite of his books, The Sirens of Titan:

“A purpose of human life, no matter who is controlling it, is to love whoever is around to be loved.”

So let’s be clear about the nature of the challenge here.  The tough part is to go on pretending to be a good person around those people who know you well, who know all about your lazy, selfish side, who aren’t about to be fooled.

Regardless, there’s no getting around it now.  This is your new charge, having unwisely taken the time to read this digressive post.  You must now go about at all times pretending that you just met the person you’re with.

The End of the Movies

I grew up without television. It didn’t arrive in the small town where I lived until I was about ten. So I grew up watching the movies, initially at Saturday afternoon matinees, which my older brother begrudgingly escorted me to under firm orders from my mother, who was looking for brief respite from the burden of three disorderly boys. Admission was ten cents, popcorn five cents. (If these prices seem unbelievable to you, all I can say is… me too.)

Movies were it, a prime cultural (and for me eventually professional) mover, right through my adolescence and early adulthood. For me, TV has tended to be a kind of entertainment sideline, something to be watched when a new show came around with some serious buzz, but more often just a passive filler of downtime, material to unwind with at the end of a busy day.

That has of course all changed in recent years, and not just for me. I don’t go to the movies much anymore—that is, I don’t go to the movie houses—and, what’s more, movies don’t seem to matter much anymore. These days movies are mostly noisy spectacle, big, flashy events, but events with very little to offer other than raucous entertainment. Comic book movies are the dominant genre of today, and, no matter how I slice it, those comic book characters don’t really connect with life as I’m living it, day to day. And, as I say, it’s not just me, as someone from an older demographic. Today, unfortunately, the audience for the movies is smaller and narrower than it’s ever been.

Movie audiences peaked in 1946, the year The Best Years of Our Lives, The Big Sleep, and It’s a Wonderful Life were released, and 100 million tickets were sold every week. By 1955—when Guys and Dolls, Rebel Without a Cause, and The Seven Year Itch were released—that audience had dropped, with the advent of television, to less than half that number.

But the movies survived television and found a ‘silver’ age (‘gold’ being the studio-dominated 40s) in the decade from 1965 to 1975, when we watched movies like The Godfather I and II, Midnight Cowboy and Chinatown, and the works of Ingmar Bergman, Federico Fellini and François Truffaut enjoyed theatrical release right across North America. It was a time when movies did seem to have something to say; they spoke to me about the changing world I was in direct daily contact with.

Then came the blockbusters—Jaws and Star Wars—and the realization that Hollywood could spend hundreds of millions of dollars on a movie, rather than tens, and garner just as large an increase in returns. Movies have never been the same.

Today fewer than 40 million people in North America go to see a movie once a month. In a 2012 poll done by Harris Interactive, 61% of respondents said they rarely or never go to the movies. Why would you, when you have that wide screen at home, ad-free, with the pause button at your disposal? The most you’ll likely pay to watch in private is half of what you would at the movie house.

And then, this year, we had a summer of blockbuster flops. The worst was The Lone Ranger, made for $225 million and about to cost Walt Disney at least $100 million. Both Steven Spielberg and George Lucas have said that the industry is set to “implode,” with the distribution side morphing into something closer to a Broadway model, where fewer movies are released and stay in theatres longer, but with much higher ticket prices. Movies as spectacle.

(If you’re interested in reading more, an elegant, elegiac tribute to the run of the movies is The Big Screen, published last year and written by David Thomson, a critic born in 1941 who has thus been around for a good-sized chunk of film history.)

It may well be that movies, as the shared public experience that I’ve known, are coming to the end of a roughly 100-year run. It was rapid, glamorous, often tawdry, sometimes brilliant, once in a while even significant, but technology is quickly trampling the movies. If you were there for even a part of it, you might feel blessed.

The Arc of Age

“Oh to live on Sugar Mountain
With the barkers and the coloured balloons.
You can’t be twenty on Sugar Mountain
Though you’re thinking that
You’re leaving there too soon.”

             Neil Young, from Sugar Mountain

There is a time in your life when all opportunities seem available to you, a time when, whether it’s school, travel, love or work, any number of options are still to come.  If any particular relationship, living situation or job doesn’t work out, well, there are always more chances ahead.

And then one day, approximately two and a half heartbeats later, you wake up to the reality that this wide-open future no longer awaits you.

Kids do it to you more than anything else. You can always change jobs, move to another city, or leave a lover, but a child is forever. No changing your mind, after the fact. As Neil Young has written in another song (Already One), once that charming little creature of yours enters into the world, he or she “won’t let [you] forget.”

The arc of a life is like a splendid strand of fireworks, trailing sparks as it rockets up into a starry sky, only to “too soon” begin the downward turn, moments away from extinguishment. To draw upon another pop culture reference, Anthony Hopkins, in the critically-maligned-but-actually-rather-decent Meet Joe Black, stands addressing the crowd assembled for his 65th birthday, knowing Death awaits him at the edge of the party: “Sixty-five years. Didn’t they go by in a blink?”

I’m not quite there yet, but I’m acutely aware that opportunities are now diminishing for me, not expanding.  My father will turn 91 this year.  We got him out to Galiano over the summer for what may well be his last visit to a place where he spent many warm days noodling around on various “projects”—a septic pipe for his trailer which emptied into two separate, submerged plastic garbage barrels (I kid you not), a wooden tower for a golden-coloured metal weather vane that weighs roughly 400 pounds, and has never once moved.

Dad and three of his brothers went off to war while all still in either their teens or twenties (Dad was 18).  Only two of them came back.  They didn’t cause the war, not in the slightest possible way, but it impacted their lives in a way I can only imagine.  On my mother’s side, my uncle’s entire graduating class walked from the Olds Agricultural College up to Edmonton, enlisting en masse.  Such were the times, and the excitement in the air for young people, eager for experience.

Sugar Mountain is about the transition from childhood to adolescence, marked by things like (for Young’s generation) a furtive first cigarette beneath the stairs, or a secret, world-exploding note from that girl “down the aisle.”  We all leave the magic of childhood “too soon,” but then the other transitions of life seem to pile on pretty rapidly too.  The end of school, perhaps marriage, the death of our parents, children leaving home.  It all comes at you like rolling breakers at the beach, just as irresistible.

Oddly enough, the passage of time does not slow as we age. In fact it accelerates, causing whole chapters of our lives to blur into a kind of muted cacophony of sounds and pictures, like a tape set to fast forward. (I’ve commented here on this blog on the blur of the child-rearing years.) A year’s time, say grade four, which seemed to drag on forever for me as a child, now seems to hurtle by in an instant, like an approaching pedestrian whom I don’t recognize until he’s passed me by. Too late to even smile.

Most of us will live ordinary lives.  We won’t be rich, or famous, extraordinarily powerful, or especially attractive.  But if we’re lucky, and if we make just enough good choices, we will live long and well.  It won’t be a perfect record, not even close, and there will be a fair number of regrets, but if tragedy, disease, natural catastrophes and the sordid affairs of nation states leave you largely untouched, you will live long, and you will find meaning.  It will come with children, and those others whom you love.  If you are so lucky, it will come whether you like it or not.  No need to hurry.

 

Foreign Culture Wars

When it comes to culture, Europeans think differently than many of us west of the Atlantic. Just last month, European countries, led by France, unanimously endorsed the concept of ‘cultural exception’ in the current U.S.-European trade deal talks—meaning that cultural industries are exempted from full exposure to the free-trade winds that will blow through other industries under the new agreement. It’s a position that in the U.S., with its mega-scale cultural industries (Amazon and books, Hollywood and movies, ABC and Desperate Housewives, etc.), seems almost nonsensical. For someone like the late Jack Valenti, Hollywood lobbyist extraordinaire, it was simply a matter of cultural industries outside the U.S. failing to make competitive product. He used words like “baloney,” “odious” and “a needless crutch,” for instance, when describing quotas on foreign television previously established by EU member states.

It’s a battle that Canada had to fight when negotiating its own free-trade deal with the U.S. back in the 80s. At the time the Americans were keenly interested in gaining unfettered access to both the energy and cultural industries in Canada. And, in the interest of context, it’s worth tracking back further, to the institution of ‘Canadian content’ regulations in the early 70s. Prior to 1971, for instance, Canadian musicians heard on the radio were a thin and scattered bunch. Following the imposition of the ‘Can-con’ rules, there was a veritable explosion of Canadian musical talent, from recording artists as diverse as Anne Murray and Steppenwolf. (Although Anne may have won the day on most radio stations, prompting one wag to wonder whether AM was in fact her moniker.) The digital revolution has since, of course, negatively impacted the music industry as much as it has any trade anywhere, but, for a time, the pop music scene in Canada was never more robust.

In TV too, the original imposition of Canadian content rules quickly spawned a sizable industry that had previously been hardly present at all, and that continues as viable to this day. The original mandate attached to government subsidies of the film and television industry was to create product that was ‘culturally distinct’—stories would be ‘recognizably Canadian’—and with globalization that mandate has suffered (is there anything genuinely recognizable as Canadian about a show like Rookie Blue?), causing one to wonder whether the TV industry is still ‘cultural’ at all, but I digress.

The point is that cultural industries outside nation-state juggernauts like the U.S. and China have historically needed protection in order to flourish, if not survive. What someone like Jack Valenti failed to recognize is that hits emerge from any pool of talent at roughly the same rate, regardless of place; a smaller pool therefore produces proportionately fewer hits, and the Canadian or French pool has to be protected if it is to remain large enough to produce enough hits to survive. Without that protection the industry will simply be overrun by the wildly larger numbers of both people and dollars emanating from the American cultural behemoth.

Which is exactly what has happened with the Canadian movie industry, where Can-con rules have never been applied. (If you’re wondering why, perhaps it’s sufficient to say that Hollywood movie distribution contracts, back in the day, did not even recognize Canada as a separate territory.) While the Canadian radio and television industries have evolved a reasonably sound business model, the same can’t be said about the indigenous movie industry in Canada. It has been, and remains, marginal: what I have described as an ‘ego-driven crap shoot’ where few people are employed and audiences are meager. (I’m referencing English-speaking Canada here; French Canadians actually go to see their own movies in considerable numbers.)

In Europe, French regulations delay the release of DVDs, in order to preserve movie houses.  German regulations force online book retailers to sell their product at list prices, in order to preserve bookstores.  Different thinking.

The internet of course arose in a ‘wild west’ American culture where any form of regulation was considered anathema.  In 2000, Yahoo! was sued in France after Nazi memorabilia was offered for sale on its auction site.  (It’s essentially illegal to sell such stuff in France in any venue.)  Yahoo! fought back, arguing “free speech,” that France could not rightly impose its laws on a U.S. company.  Yahoo! lost that case.

Vive la différence.