Tag Archives: accelerated change

The Last Post

In all likelihood, this is the last post on this site. The blog has run for precisely three years, this post aside, and was, in good part, a deliberate but modest exercise for me. During its first year (2013), I set myself the task of writing a post every week, and did just that, before tailing off to a more intermittent schedule. As I wrote in an earlier article, I have blogged as a creative outlet, for myself, because I actually enjoy the art and craft of writing, especially when I can do so on my own schedule.

Photo: Thomas Hawk

Maybe the writing and posting is little more than the piteous human impulse to leave something behind, after we’re so soon gone: a small stack of notes, initials carved in the trunk of a tree, ‘handprints on the wall of the digital cave.’

My approach has of course meant that the size of the audience for this blog has been limited, to say the least, but I’m not too fussed about that. Its final value for me has lain elsewhere.

Maybe one day it will be appreciated as one small record kept during times which were changing as quickly as they have ever changed for humankind. The disruption of the digital revolution was in high gear back in 2012-13, and it seems to me that it has slowed some in more recent years. Robotic cars are coming on rather more slowly than did smart phones.

These days, it feels more like we are living in a time of reckoning with that technical, social, economic disruption, a time when many people are looking for someone to blame for the more difficult circumstances they suddenly find themselves living in. And, sadly, there are always politicians willing to step up and seize the opportunity afforded by those searchings, politicians like Donald Trump and Marine Le Pen. Clearly there is a price to be paid when change is not so well managed by those with control of the social and economic levers. If we don’t get effective progressive change then we get reactionary change, and reactionary change is doomed to fail, at least in the long run.

The most impactful change has of course been economic, the result of globalization in a capitalist society which makes having more money, as opposed to less, ever so much more convenient and status-boosting. Median incomes have stalled in the West, given global competition; jobs have disappeared, the kinds of jobs available have changed, and it is so much easier to blame immigration—the visible signs of change—than it is to blame, say, automation, which has been so much more likely to have undermined your economic well-being.

What does it mean for the future? It’s always hard to say. Events are by their very nature unpredictable, and unforeseen events can quickly sway historical outcomes in big ways. As the human species continues to overrun the planet, we are going to have to wrestle certain problems—overpopulation, ecological damage (especially climate change), economic inequality—to the ground, or we are in for a rough ride.

Can we do so? It’s certainly possible. All it takes is the right choices, collectively and individually, made by the right people at the right times. Simple, right?

No, not so simple. But then, what we can do individually is restricted, so that makes it a little easier. Educate yourself, sit back, see the bigger picture, make choices for the greater good, rationally rather than out of frustration, resentment, anger or any of those other emotions which we then look to rationalize. Try to be aware of when you are acting selfishly, blaming others, speaking only to those from your own ‘tribe’, however that may be defined, whether by class, race, religion or nationality. Like it or not, we are all in this together. That colony on Mars will never be our salvation.

Maybe, just maybe, this blog has helped someone other than me to do these things, to maintain a wider perspective, clarify, stay calm and choose wisely. If so, bonus. Great. If not, that’s okay too. It’s helped me.

The Leisure Revolution

Yuval Noah Harari, in his 2014 book Sapiens: A Brief History of Humankind, has an interesting take on the ‘agricultural revolution’: you know, where, way back, we learned to plant crops and domesticate animals. He calls it “history’s biggest fraud.” Not in the sense that it didn’t happen. It did, leading to an increase in food supply and, consequent to that, the growth of cities. His contention is that it did not lead to a better life for humankind, neither a healthier diet nor more leisure time.

Instead it led to a less stimulating life with the increased likelihood of starvation and disease. The starvation came about as the result of periodic natural disasters, like drought, devastating the crops we came to depend upon, and the disease came about because urban conditions are much better for spreading illness than are rural ones. As to leisure time, Harari asserts that our hunter-gatherer ancestors ambled about a wide, natural territory, often able to harvest a diverse and abundant food supply, and to do so in fewer hours than it took the average farmer to feed his family a few centuries later.

Rutger Bregman, in his 2016 book, Utopia for Realists: The Case for a Universal Basic Income, Open Borders, and a 15-hour Workweek, makes a similar argument about the industrial revolution. It did not lead to a more leisurely life. Bregman estimates that in 1300 the average English farmer worked about 1500 hours a year in order to make a living, whereas the typical factory worker in Victorian England worked twice that many hours just to survive. The 70-hour workweek was the norm in a city like Manchester in 1850, no weekends, no vacations.

Eventually things improved. By about 1940, in the West, the 40-hour, five-day workweek was fairly standard, a change led, surprisingly enough, by none other than Henry Ford.

And then, things truly began to look up. In 1956 Richard Nixon promised Americans a four-day workweek, and by 1965, prognosticators everywhere were predicting vast increases in leisure time, with workweeks of as little as 14 hours. There was considerable hand-wringing about the coming, perplexing problem of boredom, idle hands given to inflamed immorality and violence.

It all came to a grim, grinding halt in the 1980s. Today the average guy in the U.S. works 8.4 hours per work day, or 42 hours per week. That’s very little changed in the last 50 years.

The digital revolution has brought us an accelerated life, new, not always better forms of community, grotesque economic inequality, and, unlike the industrial revolution, persistent unemployment. (Many people, like weavers, were put out of work by the industrial revolution, but then it went on to deliver a slew of different types of employment, like shop foremen.) And so far, for those people still working, it hasn’t done much for additional leisure time.

The other factor in why many of us are busier these days is what Bregman cites as “the most important development of the last decades”: the feminist revolution. While in some countries the workload for individuals has decreased slightly, families these days are living a blur, because women are now contributing about 40% of the family income, and working full time to do so.

It seems that, with the coming of the digital revolution, we’ve gone and done it to ourselves again. And here’s a disconcerting note: surveys show that many people today would rather earn more than work less—so that they can live the lifestyle they’ve always dreamed of. They’d rather have that bigger house, newer car, more fashionable outfit, and dream vacation than they would more leisure time. We might call this the consumer revolution, and it’s largely a post-WWII phenomenon.

What’s to be done? Well, it’s not in fact that mysterious. Economic answers come with things like a guaranteed annual income and a progressive tax regime that effectively redistributes wealth. And there is very solid evidence as to the validity of these economic remedies, much of it to be found in Bregman’s book.

But just as relevant to the modern leisure deficit is the fact that, as indicated above, we chose these outcomes. Not always consciously, and often incrementally, without realizing the ensuing consequences, but nevertheless we had and still have choice in these matters.

We can choose to live more simply, with less stuff. We can choose to buy a smaller home, drive an older car, purchase clothing at a second-hand store, and grow a garden.

Don’t like your job? Feeling like a wage slave? Have other interests you’d love to pursue?

It’s a choice.

The Cowboy Rides Away

To say that the cowboy is iconic in North American culture is hardly sufficient. Mythic hero is more accurate, but it’s important to remember that the cowboy was real, not supernatural like Hercules or Spiderman. The reality was that, for a brief period, essentially from 1860 to 1900, there were a great number of horses and cattle running free in the American frontier, most of them having been abandoned by retreating Mexicans. With the arrival of the railroad following the Civil War, the ‘roundup’ and sale of these cattle became possible, leading to the beef industry that employed a great many ‘cowboys.’ The cattle were herded to railheads of course, but not too quickly, because if you did that the cattle lost weight, and they were sold for slaughter by the pound.

Thus the cowboy’s life was one of outdoors ambling on horseback, as part of a collaborative team of men who camped early for the night, gathered around fires to share a meal, tell stories, and maybe even sing songs. It’s a lifestyle with easily apparent appeal, although here’s what the reclusive American writer Trevanian had to say about the broader charm of the cowboy:

“It is revealing of the American culture that its prototypic hero is the cowboy: an uneducated, boorish, Victorian migrant agricultural worker.” 

The Great Train Robbery
The original black hat.

When the American film industry moved to California in the early part of the 20th century, there were by then plenty of unemployed cowboys knocking about, men who could ride, rope and sometimes shoot with the best of them—just one more coincidental reason why the western movie became incredibly popular. And it is truly difficult to overstate the popularity and therefore the influence of the western movie. Arguably the first dramatic movie was a western—The Great Train Robbery in 1903—and the genre was dominant right through until the 70s, when it faded away, despite accomplished late entries like The Wild Bunch and McCabe and Mrs. Miller.

I’ve argued elsewhere that the western movie was so successful, over such a long period of time (still longer than any other genre), that it created a ‘conventional form’ and a set of audience expectations which, long after the expiration of the genre itself, offer moviemakers who can reinvent the form within a new context (i.e. The Matrix or Drive) an unparalleled opportunity to go boffo at the box office.

The influence of cowboy culture in popular music is scarcely less significant. Cole Porter knocked it right out of the park in 1934 with a sublime rhyme scheme in the cowpoke paean Don’t Fence Me In:

I want to ride to the ridge where the West commences

And gaze at the moon till I lose my senses.

I can’t look at hobbles and I can’t stand fences.

The song has been covered by everyone from Ella Fitzgerald to The Killers. And almost 40 years later, James Taylor waxed nearly as lyrical (rhyming “Boston” with “frostin’”) in maybe his best song, Sweet Baby James:

There is a young cowboy; he lives on the range.

His horse and his cattle are his only companions.

He works in the saddle and he sleeps in the canyons…

More than anything else, the cowboy represents freedom, a largely solitary life free of long-term obligations, tight schedules or immediate bosses. Too often in the westerns the cowboy’s love interest represented civilization, settling down and responsibility, and so too often, at the end of the story, the cowboy rode away from the girl, off into the sunset to resume a life of independent rambling (although it’s worth noting that in a couple of the defining westerns, High Noon and Stagecoach, the hero did choose the girl, and they rode off together in a buckboard).

It’s no surprise that the cowboy’s allure arose alongside the maturing of the industrial revolution, when incomes were rising but often as the result of work fettered to a factory system of mechanical drudgery. Are we any more free in the age of the digital revolution, with its increased pace and unrelenting connectivity? Well, not so’s you’d notice.

In the digital age, the cowboy hero seems a complete anachronism, more irrelevant than ever, but I think it’s worth remembering that, although the cowboy almost always resorted to a gun to resolve his conflicts with the bad guys—and the impact of that implicit message upon American society can hardly be overestimated either (see Guns)—he did so reluctantly, in defence of the little guy being oppressed by powerful villains, who were often corporate-types.

Today the cowboy is gone for good from our cultural landscape, and I’m not suggesting he should be brought back. But in our world of ever more powerful corporate interests, we could all use some of his individual pluck. The economic wheels of our day are rolling along just fine; the ecological and moral ones, not so much. Sadly, too much of the cowboy’s good is gone with him.

An Education?

The conference was titled, “The Next New World.” It took place last month in San Francisco, and was hosted by Thomas Friedman, columnist for The New York Times and author of The World Is Flat. Friedman has been writing about the digital revolution for years now, and his thinking on the matter is wide-ranging and incisive.

In his keynote address, Friedman describes “an inflection” that occurred coincident with the great recession of 2008—the technical transformations that began with the personal computer, continued with the internet, and are ongoing with smart phones and the cloud. Friedman is not the first to note that this transformation is the equivalent of what began in 1450 with the invention of the printing press, the so-called Gutenberg revolution. The difference is that the Gutenberg revolution took 200 years to sweep through society. The digital revolution has taken two decades.

Friedman and his co-speakers at the conference are right in articulating that today’s revolution has meant that there is a new social contract extant, one based not upon high wages for middle skills (think auto manufacturing or bookkeeping), but upon high wages for high skills (think data analysis or mobile programming). Everything from driving cars to teaching children to milking cows has been overtaken by digital technology in the last 20 years, and so the average employee is now faced with a workplace where wages and benefits don’t flow from a commitment to steady long term work, but where constant innovation is required for jobs that last an average of 4.6 years. As Friedman adds—tellingly I think—in today’s next new world, “no one cares what you know.” They care only about what you can do.

Friedman adds in his address that the real focus of the discussions at the conference can be abridged by two questions: “How [in this new world] does my kid get a job?” and, “How does our local school or university need to adapt?”

All well and good. Everyone has to eat, never mind grow a career or pay a mortgage. What bothers me however, in all these worthwhile discussions, is the underlying assumption that the education happening at schools and universities should essentially equate to job training. I’ve checked the Oxford; nowhere does that esteemed dictionary define education as training for a job. The closest it comes is to say that education can be training “in a particular subject,” not a skill.

I would contend that what a young person knows, as opposed to what they can do, should matter to an employer. What’s more, I think it should matter to all of us. Here’s a definitional point for education from the Oxford that I was delighted to see: “an enlightening experience.”

A better world requires a better educated populace, especially women. For the human race to progress (perhaps survive), more people need to understand the lessons of history. More people have to know how to think rationally, act responsibly, and honour compassion, courage and commitment. None of that necessarily comes with job training for a data analyst or mobile programmer.

And maybe, if the range of jobs available out there is narrowing to ever more specific, high technical-skills work, applicable to an ever more narrow set of industries, then that set of industries should be taking on a greater role in instituting the needed training regimes. Maybe as an addendum to what can be more properly termed ‘an education.’

I’m sure that Friedman and his conference colleagues would not disagree with the value of an education that stresses knowledge, not skills. And yes, universities have become too elitist and expensive everywhere, especially in America. But my daughter attends Quest University in Squamish, British Columbia, where, in addition to studying mathematics and biology, she is obliged to take courses in Rhetoric, Democracy and Justice, and Global Perspectives.

Not exactly the stuff that is likely to land her a job in Silicon Valley, you might say, and I would have to reluctantly agree. Certainly those courses will make her a better citizen, something the world is in dire need of, but I would also argue that a degree in “Liberal Arts and Sciences” does in fact better qualify her for that job: those courses will teach her how to better formulate an argument, better understand the empowerment (and therefore the greater job satisfaction) that comes with the democratic process, and better appreciate the global implications of practically all we do workwise these days.

Damn tootin’ that education in liberal arts and sciences better qualifies her for that job in Silicon Valley. That and every other job out there.

What We Put Up With

The sky was new. It was a thick, uniform, misty grey, but I was told there were no clouds up there. I’d never seen this before, and was skeptical. How could this be? It was the humidity, I was told. It got like that around here on hot summer days.

The year was 1970; I was 17, part of a high school exchange program that had taken me and a fair number of my friends to the Trenton-Belleville area of southern Ontario. We’d been squired about in buses for days, shuffling through various museums and historic sites, sometimes bored, sometimes behaving badly (my buddy Ken, blowing a spliff in the washroom cubicle at the back of the bus, would surely be considered bad form), sometimes, not often, left to our own devices. On this day we’d been driven to the sandy shores of Lake Ontario, where what was shockingly, appallingly new, much newer than the leaden sky, was out there in the shallow water.

Small signs were attached to stakes standing in the water, just offshore. They read, “Fish for Fun.”

I couldn’t believe it. How could this be allowed to happen? How could people put up with this? As a kid from a small town in northern Alberta, I’d never seen anything like it.

It was a kind of accelerated future shock, as if I had been suddenly propelled forward in time to a new, meta-industrialized world where this was the accepted reality. In this cowardly new world, lakes would be so polluted that eating fish caught in them was unsafe (at 17, I’d caught my share of fish, and always eaten them), and this was how people dealt with the problem. With a lame attempt at cheery acquiescence.

When I think about it, my 17-year-old self would have had a great deal of trouble believing numerous of the realities that we live with today. Setting aside all the literally incredible changes wrought by the digital revolution—where we walk around with tiny computers in our hand, able to instantly send and/or receive information from anywhere in the world—here are a few more mundane examples of contemporary realities that would have had me shaking my teenage head in utter disbelief:

  • Americans buy more than 200 bottles of water per person every year, spending more than $20 billion in the process.
  • People everywhere scoop up their dog’s excrement, deposit it into small plastic bags that they then carry with them to the nearest garbage receptacle. (Here’s a related—and very telling—factoid, first pointed out to me in a top-drawer piece by New York Times columnist David Brooks: there are now more American homes with dogs than there are homes with children.)
  • On any given night in Canada, some 30,000 people are homeless. One in 50 of them is a child.

There are more examples I could give of current actualities my teen incarnation would scarcely have believed, but, to backtrack for a moment in the interests of fairness, pollution levels in Lake Ontario are in fact lower today than they were in 1970, although the lake can hardly be considered pristine. As the redoubtable Elizabeth May, head of Canada’s Green Party, points out in a recent statement, many of the worst environmental problems of the 70s have been effectively dealt with—toxic pesticides, acid rain, depletion of the ozone layer—but only because worthy activists like her fought long and hard for those solutions.

jronaldlee photo

The fact is that we are a remarkably adaptable species, able to adjust to all manner of hardships, injustice and environmental degradation, so long as those changes come about slowly, and we are given to believe there’s not much we as individuals can do about it. Never has the metaphor of the frog in the slowly heating pot of water been more apropos than it is to the prospect of man-made climate change, for instance.

It’s not the cataclysmic changes that are going to get us. It’s the incremental ones.


Where Have All the Dollars Gone

Sir Robert Borden addressing the troops, 1917
Biblioarchives

In March of this year, lawyers acting on behalf of the Canadian government asserted that the government has no special obligation to Afghan military veterans as a result of a pledge made by Prime Minister Robert Borden back in 1917, on the eve of The Battle of Vimy Ridge. Borden assured the soldiers then preparing for battle in France (more than 10,000 of them would be killed or wounded) that none of them would subsequently “have just cause to reproach the government for having broken faith with the men who won and the men who died.”

The March court filing came about as a result of Canadian soldiers earlier suing over the government’s new “veterans charter,” which changes the pension-for-life settlements provided to soldiers from previous wars to a system where Afghan veterans receive lump sum payments for non-economic losses, such as losing limbs. It’s not difficult for any of us to understand that this change is all about our government saving money.

Also in March of this year, the Vancouver School Board announced a budget shortfall of $18.2 million. Reluctantly, the Board is considering an array of cutbacks, including school closures, ending music programs, and keeping schools closed for the entire week surrounding Remembrance Day. My kids have now moved on past public schools, but I clearly remember, for all the years they were enrolled in Vancouver schools, a steady and discouraging series of annual cutbacks: librarians disappearing; field trips becoming rare events indeed; at one point, even the announcement that the temperature in schools would be turned down.

I lack the expertise to offer any detailed economic analysis as to why our governments are these days unable to meet obligations to veterans and school children that they were able to meet in the past, but here’s one bit of crude economic breakdown that causes even greater wonder. The Gross Domestic Product per capita in Canada in 1960 averaged $12,931 US; in 2012 it averaged $35,992 US, adjusted for inflation. In other words, the country is today producing nearly three times as much in the way of goods and services per citizen as it was back in 1960, presumably enabling the government to collect far more in the way of taxes, per person, than it could five-plus decades ago. And yet we can no longer support veterans and school children in the way we did back then.
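For the record, the “nearly three times as much” figure is easy to check against the post’s own inflation-adjusted numbers (a rough sketch only; real fiscal capacity depends on far more than GDP per capita):

```python
# Canada's GDP per capita in US dollars, adjusted for inflation,
# as quoted above.
gdp_1960 = 12_931
gdp_2012 = 35_992

# Ratio of 2012 output per citizen to 1960 output per citizen.
ratio = gdp_2012 / gdp_1960
print(round(ratio, 2))  # -> 2.78, i.e. close to a threefold increase
```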

A large part of the explanation is of course that governments these days are addressing a whole lot of social concerns that they weren’t in 1960, never mind in 1917, everything from drug and alcohol treatment centres, to the parents of autistic children, to modern dance troupes. It may well be that this growing list of demands outstrips the three-times-bigger ratio of available government funds. It may even be enough for one to lament what Washington Post columnist Charles Krauthammer (an example of that rare beast, the genuinely thoughtful conservative pundit) calls “the ever-growing Leviathan state.” It may… or it may not.

One theory has it that, in the post-war decades, right up until the 70s, the Canadian economy was legitimately growing, more products, more services, more jobs. Since the 80s, any increase in or even maintaining of profit ratios (and thus disposable incomes) has come as the result of increased ‘efficiency,’ fewer people producing more goods and services through better technology and less waste. (More cutbacks anyone?) If that’s true, then things are only going to get worse, as these finite efficiencies produce ever-diminishing returns.

Whatever the final explanation, it’s not likely a simple one, and whatever the economic answer, it’s not likely to be easily achieved. Too often a current government has only to promise fewer taxes for voters to flock in their direction, regardless of known scandal, evident mean-spiritedness, or obvious cronyism. We tend to assume that the ensuing government cutbacks won’t arrive at our door. And so long as they don’t we remain generally unrepentant for our self-centeredness. The moment they do—the moment an alcoholic, or autistic child, or modern dancer appears inside our door—our attitudes tend to shift.

Thus, as we stand witnessing the waning of Western economic domination (see DEP, from the archives of this blog), it seems we can be sure of only one thing: it’s a matter of priorities. If school-age children and wounded veterans are not our priority, then who is?


Facetime

Last month the city of Nelson, BC, said no to drive-thrus. There’s only one in the town anyway, but city councillors voted to prevent any more appearing. Councillor Deb Kozak described it as “a very Nelson” thing to do.

Nelson may be slightly off the mean when it comes to small towns—many a draft dodger settled there back in the Vietnam War era, and pot-growing allowed Nelson to better weather the downturn of the forest industry that occurred back in the 80s—but at the same time, dumping on drive-thrus is something that could only happen in a smaller urban centre.

The move is in support of controlling carbon pollution, of course: no more idling cars lined up down the block (hello, Fort McMurray?!). But what I like about it is that the new by-law obliges people to get out of their cars, to enjoy a little facetime with another human being, instead of leaning out their car windows, shouting into a tinny speaker mounted on a plastic sign.

For all the degree of change being generated by the digital revolution, and for all the noise I’ve made about that change in this blog, there are two revolutions of recent decades that have probably had greater effect: the revolution in settlement patterns that we call urbanization, and the revolution in economic scale that we call globalization. Both are probably more evident in smaller cities and towns than anywhere else.

Grain elevators, Milestone, Saskatchewan, about 1928

Both of my parents grew up in truly small prairie towns; my mother in Gilbert Plains, Manitoba, present population about 750; my father in Sedgewick, Alberta, present population about 850. Sedgewick’s population has dropped some 4% in recent years, despite a concurrent overall growth rate in Alberta of some 20%. Both these towns were among the hundreds arranged across the Canadian prairies, marked off by rust-coloured grain elevators rising above the horizon, set roughly every seven miles along the rail lines. That spacing meant no farm was much more than half that distance from an elevator, a haul gauged doable by horse and wagon for all the surrounding farmers.

I grew up in Grande Prairie, Alberta, a town which officially became a city while I still lived there. The three blocks of Main Street that I knew were anchored at one end by the Co-op Store, where all the farmers shopped, and at the other by the pool hall, where all the young assholes like me hung out. In between were Lilge Hardware, operated by the Lilge brothers, Wilf and Clem, Joe’s Corner Coffee Shop, and Ludbrooks, which offered “variety” as “the spice of life,” and where we as kids would shop for board games, after saving our allowance money for months at a time.

Grande Prairie is virtually unrecognizable to me now; that is, it looks much like every other small and large city across the continent: the same ‘big box’ stores surround it as surround Prince George, Regina, and Billings, Montana, I’m willing to bet. Instead of Lilge Hardware, Joe’s Corner Coffee Shop and Ludbrooks we have Walmart, Starbucks and Costco. This is what globalization looks like when it arrives in your own backyard.

80% of Canadians live in urban centres now, as opposed to less than 30% at the beginning of the 20th century. And those urban centres now look pretty much the same wherever you go, once the geography is removed. It’s a degree of change that snuck up on us far more stealthily than has the digital revolution, with its dizzying pace, but it’s a no less disruptive transformation.

I couldn’t wait to get out of Grande Prairie when I was a teenager. The big city beckoned with diversity, anonymity, and vigour. Maybe if I was young in Grande Prairie now I wouldn’t feel the same need, given that I could now access anything there that I could in the big city. A good thing? Bad thing?

There’s no saying. Certain opportunities still exist only in the truly big centres of course, cities like Tokyo, New York or London. If you want to make movies it’s still true that you better get yourself to Los Angeles. But they’re not about to ban drive-thrus in Los Angeles. And that’s too bad.

Exponential End

Computers are now more than a million times faster than they were when the first hand calculator appeared back in the 1960s. (An engineer working at Texas Instruments, Jack Kilby, had invented the first integrated circuit, or microchip, in 1958.) This incredible, exponential increase was predicted by ‘Moore’s Law,’ first formulated in 1965: the number of transistors in a semiconductor doubles approximately every two years.

Another way to state this Law (which is not a natural ‘law’ at all, but an observational prediction) is to say that each generation of transistors will be half the size of the last. This is obviously a finite process, with an end in sight. Well, in our imaginations at least.
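The arithmetic behind that “million times faster” figure lines up neatly with Moore’s Law. Here’s a quick sketch (assuming a clean doubling every two years, which real chips only ever approximated):

```python
# Count how many doublings it takes to pass a millionfold speedup,
# then convert doublings to years at Moore's Law's two-year cadence.
doublings = 0
speedup = 1
while speedup < 1_000_000:
    speedup *= 2
    doublings += 1

years = doublings * 2  # two years per doubling

print(doublings)  # 20 doublings (2**20 = 1,048,576) clears a million
print(years)      # 40 years -- roughly the span from the 1960s onward
```

Which is why, four decades or so after the first hand calculators, the millionfold figure is just what the Law would have forecast.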

The implications of this end are not so small. As we all know, rapidly evolving digital technology has hugely impacted nearly every sector of our economy, and with those changes has come disruptive social change, but also rapid economic growth. The two largest arenas of economic growth in the U.S. in recent years have been Wall Street and Silicon Valley, and Wall Street has prospered on the manipulation of money, via computers, while Silicon Valley (silicon being the wafer material upon which semiconductors are usually built) has prospered upon the growing ubiquity of computers themselves.

Intel has predicted that the end of this exponential innovation will come anywhere between 2013 and 2018. Moore’s Law itself predicts the end at 2020. Gordon Moore himself—he who formulated the Law—said in a 2005 interview that, “In terms of size [of transistors] you can see that we’re approaching the size of atoms, which is a fundamental barrier.” Well, in 2012 a team working at the University of New South Wales announced the development of the first working transistor consisting of a single atom. That sounds a lot like the end of the line.

In November of last year, a group of eminent semiconductor experts met in Washington to discuss the current state of semiconductor innovation, as well as its worrisome future. These men (alas, yes, all men) are worried about the future of semiconductor innovation because it seems that there are a number of basic ideas about how innovation can continue past the coming ‘end,’ but none of these ideas has emerged as more promising than the others, and any one of them is going to be very expensive. We’re talking a kind of paradigm shift, from microelectronics to nanoelectronics, and, as is often the case, the early stages of a fundamentally new technology are much more costly than the later stages, when the new technology has been scaled up.

And of course research dollars are more difficult to secure these days than they have been in the past. Thus the additional worry that the U.S., which has for decades led the world in digital innovation, is going to be eclipsed by countries like China and Korea that are now investing more in R&D than is the U.S. The 2013 budget sequestration cuts have, for instance, directly impacted certain university research budgets, causing programs to be cancelled and researchers to be laid off.

Bell Labs 1934

One of the ironies of the situation, for all those of us who consider corporate monopoly abhorrent, became evident when a speaker at the conference mentioned working at Bell Labs back in the day when Ma Bell (AT&T) operated as a monopoly and funds at the Labs were virtually unlimited. Among the technologies originating at Bell Labs are the transistor, the laser, and the UNIX operating system.

It’s going to be interesting, because the need is not going away. The runaway train of broadband appetite, for instance, is not slowing down; by 2015, it’s estimated, there will be 16 times as much video clamoring to get online as there is today.

It’s worth noting that predictions of Moore’s Law lasting only about another decade have been made for the last 30 years. And futurists like Ray Kurzweil and Bruce Sterling believe that exponential innovation will continue on past the end of its current course, due in large part to a ‘Law of Accelerating Returns,’ leading ultimately to ‘The Singularity,’ where computers surpass human intelligence.

Someone should tell those anxious computer scientists who convened last November in Washington: not to worry. Computers will solve this problem for us.

Groovy

“Slow down, you move too fast
You’ve got to make the morning last
Just kicking down the cobble stones
Looking for fun and feelin’ groovy.”

               The 59th Street Bridge Song

Paul Simon wrote the above lyrics during the nanosecond in history when it was in fact cool to use the word ‘groovy.’  (How is it that much, much older words, like ‘cool,’ or ‘hip’ can remain cool and hip seemingly forever, while a perfectly good word like ‘groovy’ immediately lapses into full blown dorkdom?)

He wrote the song in 1966, when the hippie counterculture was flourishing (1968 saw it begin to sour), when themes of ‘dropping out,’ and going ‘back to the land’ were ascendant among young people.  (Both Bruce Cockburn and Canned Heat were “going to the country.”)  Some of those young people left the city to form rural communes, which almost always disintegrated in a matter of months, as individual goals and disparate personalities clashed with the communal ideal.  Reality can bite down hard on those who believe that the peaceful serenity of the natural world can easily be reflected in the messy functionings of humankind grouped together, even where they share a common purpose.

Carl Honoré, a Canadian living in London, referenced The 59th Street Bridge Song in the opening passage of his 2004 book, In Praise of Slow.  In the text he suggests that “The Slow movement is on the march,” that people everywhere are steadily joining the ranks of those practicing slower work, sex, food, medicine, even weightlifting.  In closing the book he asks, “When will the Slow movement turn into a Slow revolution?”

Well, from a point in time almost ten years later, the answer would seem to be ‘not yet,’ and ‘not any time soon either.’  Today, technical innovation continues to drive change in a way that makes the pace of 2004—no YouTube, no iPhone—look almost placid.

Saint Iscariot photo

No, slow is not easy to attain these days, nor, for that matter, was it back in the sixties, not in any successful, final sense at least.  Slow has to be a deliberate choice of course—say, to leave that demanding job and pay the price in both dollars and status—but there is something counterintuitive about going slower that should be recognized by all those looking to step off today’s fast train.  It may be nicely summed up in a quote that Honoré serves up via Edward Abbey, the cantankerous American author and environmentalist:

“Life is already too short to waste on speed.”

If you want to expand your life, include in it more by way of experience, fulfillment, payoff, it’s not to be done by going faster.  Speed is the mortal enemy of memory, and even on Galiano, I have to remind myself, when I arrive and set about the myriad of tasks always awaiting, that if I try to do too much, stay too busy, I will almost instantly find myself at the departure point.  When that happens, it feels like I just got caught in a revolving door, whirled around a few times, then immediately dumped right back where I began.  Like I never did exit onto the other, island side.

As in all things, the challenge is one of balance, and the key commodity here is what I call engagement.  There is very little to be gained by ‘dropping out’ entirely; it’s an act of defeat, of surrender.  There are many, many fascinating components to stay abreast of in today’s world, and the very best thing about the internet may be that it makes such engagement easier.  You can be a part of a whole plethora of communities, without ever leaving home.

Stay engaged.  Never stop learning.  Keep looking for fun in new knowledge, skills and experiences.  But don’t kid yourself; we are all on a fast train which is hurtling toward oblivion.  If you want to hasten the journey, stay busy.  If you want to remember the trip, expand the experience and consciously enjoy it more often, step off once in a while, kick a few cobble stones, see if you can conjure up a little groovy.


The Arc of Age

“Oh to live on Sugar Mountain
With the barkers and the coloured balloons.
You can’t be twenty on Sugar Mountain
Though you’re thinking that
You’re leaving there too soon.”

             Neil Young, from Sugar Mountain

There is a time in your life when all opportunities seem available to you, a time when, whether it’s school, travel, love or work, any number of options are still to come.  If any particular relationship, living situation or job doesn’t work out, well, there are always more chances ahead.

And then one day, approximately two and a half heartbeats later, you wake up to the reality that this wide-open future no longer awaits you.

Kids do it to you more than anything else.  You can always change jobs, move to another city, or leave a lover, but a child is forever.  No changing your mind, after the fact.  As Neil Young has written in another song (Already One), once that charming little creature of yours enters the world, he or she “won’t let [you] forget.”

The arc of a life affair is like a splendid strand of fireworks, trailing sparks as it rockets up into a starry sky, only to “too soon” begin the downward turn, moments away from extinguishment.  To draw upon another pop culture reference, Anthony Hopkins, in the critically-maligned-but-actually-rather-decent Meet Joe Black, stands addressing the crowd assembled for his 65th birthday, knowing Death awaits him at the edge of the party: “Sixty-five years.  Didn’t they go by in a blink?”

I’m not quite there yet, but I’m acutely aware that opportunities are now diminishing for me, not expanding.  My father will turn 91 this year.  We got him out to Galiano over the summer for what may well be his last visit to a place where he spent many warm days noodling around on various “projects”—a septic pipe for his trailer which emptied into two separate, submerged plastic garbage barrels (I kid you not), a wooden tower for a golden-coloured metal weather vane that weighs roughly 400 pounds, and has never once moved.

Dad and three of his brothers went off to war while all still in either their teens or twenties (Dad was 18).  Only two of them came back.  They didn’t cause the war, not in the slightest possible way, but it impacted their lives in a way I can only imagine.  On my mother’s side, my uncle’s entire graduating class walked from the Olds Agricultural College up to Edmonton, enlisting en masse.  Such were the times, and the excitement in the air for young people, eager for experience.

Sugar Mountain is about the transition from childhood to adolescence, marked by things like (for Young’s generation) a furtive first cigarette beneath the stairs, or a secret, world-exploding note from that girl “down the aisle.”  We all leave the magic of childhood “too soon,” but then the other transitions of life seem to pile on pretty rapidly too.  The end of school, perhaps marriage, the death of our parents, children leaving home.  It all comes at you like rolling breakers at the beach, just as irresistible.

Oddly enough, the passage of time does not slow as we age.  In fact it accelerates, causing whole chapters of our lives to blur into a kind of muted cacophony of sounds and pictures, like a tape set to fast forward.  (I’ve commented here on this blog on the blur of the child-rearing years.)  That year’s time, say grade four, which seemed to drag on forever for me as a child now seems to hurtle by in an instant, like an approaching pedestrian whom I don’t recognize until he’s passed me by.  Too late to even smile.

Most of us will live ordinary lives.  We won’t be rich, or famous, extraordinarily powerful, or especially attractive.  But if we’re lucky, and if we make just enough good choices, we will live long and well.  It won’t be a perfect record, not even close, and there will be a fair number of regrets, but if tragedy, disease, natural catastrophes and the sordid affairs of nation states leave you largely untouched, you will live long, and you will find meaning.  It will come with children, and those others whom you love.  If you are so lucky, it will come whether you like it or not.  No need to hurry.