Tag Archives: accelerated change

Future Imperfect

“You’re welcome to le Carré—he hasn’t got any future.”

—A publisher who rejected John le Carré’s The Spy Who Came in from the Cold, which would go on to be described by Publishers Weekly as “the best spy novel of all time.”

When it comes to predicting the future, we all make mistakes.  As we age, we hope to make them slightly less often, but, let me assure you, we never escape them entirely.  Some of us, however, are in positions of authority which make the dimensions of our prognosticating blunders truly spectacular.  Infamous examples abound, especially in the cultural realm…

“Who the hell wants to hear actors talk?”

—Harry Warner of Warner Brothers, dismissing the idea of ‘talkies’ in 1925.

“Guitar groups are on the way out… The Beatles have no future in show business.”

—Dick Rowe, Decca Recording executive, snubbing The Beatles in 1962.

So too in the realm of technological future-telling.  Tim Wu, in his highly entertaining The Master Switch, recounts how in 1877 Western Union [Telegraph] was the most powerful information corporation on the planet, exclusive owners of the only continent-wide communications network.  The Bell [Telephone] Company was at the time a new and struggling tech firm with few customers and even fewer investors.  Such was the financial duress felt by Bell that the company’s President offered Western Union all of Bell’s patents for $100,000.  William Orton, Western Union’s President, declined the offer.  A company memo circulated a year earlier summed up Western Union’s take on the admittedly primitive Bell technology: “This ‘telephone’ has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us.”

Lest you think that the pace of technical innovation was invariably slower in the 19th century, it should be noted that, less than a year later, Western Union recognized the error of that take and embarked upon a furious development effort of its own, commissioning a promising young inventor named Thomas Edison to come up with a better phone.  The effort would turn out to be strangely inopportune (proof that luck too always plays a part in determining the future), when, just as Bell launched a patent-infringement lawsuit, it was discovered that Jay Gould, the Robber Baron king mentioned elsewhere on this blog, was secretly buying up shares of Western Union in preparation for a hostile takeover.  Western Union was suddenly obliged to view its telephone dust-up as a “lesser skirmish, one it no longer had the luxury of fighting.”  The company settled out of court with Bell on less than favorable terms, and Bell soon re-emerged as the American Telephone and Telegraph Company (AT&T), which would become the most successful communications company of the 20th century.

In a ‘look back’ article published earlier this year, U.S. News revisited its own predictive report from 1967 entitled “The Wondrous World of 1990.”  The predictions made in the 1967 piece range from wide misses—a manned Mars landing, a cure for the common cold—to the remarkably prescient—a “checkless, cashless” economy, an “automated” (Google?) car.  (A blithe addendum notes how, if the driver of the robotic car does not accelerate as instructed, “the [computerized] roadway takes over control.”)

More broadly, two prophecies stand out for me in the 1967 article, two points central to the themes of this blog.  One is a miss; the other a palpable hit.  The miss discusses how, “Production and wealth will rise faster than population, so that incomes will climb steadily.”  This in turn would mean that the typical 1967 worker, who was then putting in about 2000 hours a year on the job, would, by 1990, see those hours drop to 1700 or less.  “The four-day week will arrive,” trumpets the article.

If only they had gotten that right.

The hit relates to, “Underlying the transformation to come is a quickening in the tempo of development out of scientific discoveries already made.”  One Dr. Richard G. Folsom, then President of Rensselaer Polytechnic Institute, is quoted: “The magnitude of change will expand, even explode.”

That much they did get right.

 

Luddites Unite!

‘Luddite’ has in recent years come to function as a generic pejorative, describing an unthinking, head-in-the-sand type afraid of all new forms of technology.  It’s an unfair use of the term.

‘Luddite’ originates with a short-lived (1811 to 1817) protest movement among British textile workers, men who went about, usually under cover of darkness, destroying the weaving frames then being introduced to newly emerging ‘factories.’  These were the early days of the industrial revolution, and the new manufacturing facilities being attacked were supplanting the cottage-based industry the Luddites were a part of, leading to widespread unemployment, and therefore genuine hardship.  It was an age long before the existence of any kind of social safety net, times when employers were free to hire children, and they did so, at reduced wages, since the new machines were much easier to operate, requiring little of the skill possessed by the adult artisans being left behind.  (Just as causative in the sufferings of these newly unemployed were the prolonged Napoleonic wars that the British government was incessantly engaged in back then, at great economic expense.)  My point being that the Luddites were not opposed to technology per se; they were simply striking back at machinery which was making their lives well and truly miserable.  People were literally starving.

The term ‘Neo-Luddite’ has emerged in our day, referring, in author Kirkpatrick Sale’s words, to “a leaderless movement of passive resistance to consumerism and the increasingly bizarre and frightening technologies of the Computer Age.”  The vast majority of the people involved in this modern-day movement eschew violence, counting among their members prominent academics like the late Theodore Roszak, and eminent men of letters like Wendell Berry.  Again, the movement is not anti-technology per se, only anti-certain-kinds-of-technology—that is, technology which might be described as anti-community.

Back in the 1970s, I read, quite avidly I might add, Ivan Illich’s book Tools for Conviviality, which, from its very title, you might construe to be Neo-Luddite in its intent.  And you’d be largely right.  Illich condemned the use of machines like the very ones the British Luddites were smashing back in the 19th century—factory-based, single-purpose machines meant first of all to generate greater monetary profit for their owners.  The interesting thing is that Illich considered the then-ubiquitous pay phone a properly convivial tool.  Anyone could use it, as often or seldom as they chose, to their own end, and the phone facilitated communication between individuals—that is, community.

It’s not hard to see where all this is going.  Illich’s book was directly influential upon Lee Felsenstein, considered by many to be the father of the personal computer.  Felsenstein was a member of the legendary Homebrew Computer Club, which first met in Silicon Valley in 1975, spawning the founders of various microcomputer companies, including Steve Wozniak of Apple.  The original ethos espoused by members of the Club stressed peer-to-peer support, open-source information, and the autonomous operation of an individually owned machine.

Were he still alive, Ivan Illich would undoubtedly think of the personal computer, and of the smartphone, as convivial tools.  But Illich had another concern associated with current technology—the rise of a managerial class of experts, people who were in a position to co-opt technical knowledge and expertise, and eventually control industries like medicine, agriculture and education.  Would the Lords of the computer age—those who control Google, Facebook, Apple—be considered by Illich to be members of a new managerial elite?

It’s not easy to say.  I suspect Illich would indeed think of the CEOs of companies like Toyota, General Electric, and Royal Dutch Shell as members of a managerial elite, ultimately alienating workers from their own employment.  But what of Larry Page, the CEO of Google, the company whose motto is famously “Don’t be evil”?  Is he too one of the new internet overlords?

The Neo-Luddites are right in saying that what we must all do is carefully discriminate among new forms of technology.  We must consider the control, the intent, the final gain associated with each type.  Convivial technology adds to our independence, as well as our efficiency.  It informs and empowers the user, not an alternate owner, nor the cloud-based controller of the medium.  If we could all make the distinction between convivial and non-convivial technology, it might make Luddites of us all.

 

Rich and Famous

Back in 1968, Andy Warhol notably said, “In the future, everyone will be world-famous for 15 minutes.”  Back then, the gates to fame were securely guarded by the sober keepers of what was referred to as ‘mass media.’  Few had access to any form of media beyond a ‘photocopier,’ and so it took great skill or achievement, or spectacularly bad luck or choices to gain a remote audience of more than a handful.

I think of Michael James Brody Jr., who in 1970 announced he would be giving away one million dollars, and who was then of course immediately engulfed in media attention, including an appearance on The Ed Sullivan Show, where he prophetically sang a less than distinguished version of Bob Dylan’s “You Ain’t Goin’ Nowhere.”  (Proving that opportunistic might be a more accurate descriptor for the mass media of the day than discriminating.)  Indeed, Brody quickly faded from the public eye, committing suicide in 1973, his life a sad comment on Warhol’s original pronouncement.

These days anyone who can turn on a computer has access to an international medium, and the average teenager on Facebook has more than 500 ‘friends’ providing an instant audience.  The average 22-year-old in Britain has more than 1000 Facebook friends.  Certainly it amounts to a ‘network’ of sorts, encompassing plenty of people who can’t be considered friends in any genuine sense, but who nevertheless, as the YouTube slogan formerly suggested, allow the individual to resemble a minor-league ‘broadcaster.’

Numbers still count of course.  Google AdSense does not come sniffing around any blog without a serious number of daily clicks.  (Google makes searching for an accurate take on this number remarkably unproductive.)  So when it comes to money, big dogs still rule the kennel, and in that sense not much has changed.  But in other important ways nearly everything has changed.  Now any ordinary mortal can ‘share’ everything from the breakup of her most recent relationship to, famously, unwisely, his participation in last night’s riot.  And with these changes, the very conception of privacy seems to have morphed for current 20somethings.  (The average 50something has roughly one-fiftieth as many Facebook friends as the 20something.)  Any smart phone now knows precisely where we are at all times, and, if we wish, it will happily notify all our friends of as much whenever they happen to be in the neighbourhood.  More ominously, if Eli Pariser in The Filter Bubble is right, facial recognition technology will soon advance to the point where whoever it may be—the government, your employer, your husband—will be able to search for you wherever security cameras may have observed you, which is just about everywhere, isn’t it?  The prospect represents a virtual paradigm shift in our public/private lives.  As Pariser writes, “The ability to search by face will shatter many of our illusions about privacy and anonymity.”

Personally, I’ve never quite grasped the attraction of fame, at any scale, whether it be via The New York Times or Facebook.  Money, sure; it’s highly convenient.  Power, again sure, if you’re able to contend with its corrupting capacity.  Fame can obviously facilitate these other, more ostensibly desirable ends, but fame in the sense that you won’t be able to go out in public without being recognized, that strangers might approach you, looking for some sort of buzz of interaction—the very idea that anonymity will be gone for you—I just don’t see the payoff in that.

In June of 1968, Valerie Solanas, a marginal figure in Andy Warhol’s notorious ‘Factory’ scene, tried to kill him.  She very nearly did, and Warhol had to wear a surgical corset for the rest of his life.  Maybe it’s just my perverse take on things, but the attempt seems to be the apogee of the dark side of fame.  A murder attempt is obviously not the sort of attention anyone needs, but then, the need for widespread attention seems to me to be something all of us should regard with suspicion.  As David Bowie has suggested, “Fame, what you get is no tomorrow.”

 

It Started With the Telephone

I once heard an elderly woman speaking about the arrival of telephones in the remote utopian community of Sointula, BC, where she grew up.  (The area had been settled by a group of Finns who rowed over to the village’s location on Malcolm Island in 1901, escaping the brutal drudgery of Nanaimo BC’s coal mines.)  The onset of phones in Sointula meant that no longer did one have to ‘drop by’ to communicate with a neighbour; a little ‘face time’ was no longer necessary.  Something indefinable had changed in her community with the advent of the telephone, mused the woman, wistfully; something was never quite the same.

The use of telegraph wires and Morse code of course preceded that of phones, but it seems to me that the telegram wasn’t guilty of the same social isolation.  It wasn’t in your home, for one thing, and there was another human intermediary involved as well, if only to tap out the dots and dashes that your brief missive was translated into.  The telegraph was almost a ‘public’ utility.

Nope, it was the arrival of phones that effectively began the physical separation of communicating human beings that we now see fully manifest in cyberspace.  Now we are all, in the words of author Sherry Turkle, “alone together” in a virtual world where your physical presence is only periodically, infrequently a part of the communication process.  Now we are all sitting in millions of tiny private theatres, connected, but only electronically, and usually not to our neighbours, maybe even not to our family.

We thought of it as ‘hermetic’ when the process first began evolving back in the 80s, when VHS tapes and ‘home theatre’ set-ups began replacing the social experience of going out to the movie houses.  We may have made slight ‘clucking’ sounds back then, shaking our heads in mild regret at the passing of the communal event, but the truth is I didn’t regret it, certainly not once larger format TV screens began appearing.  I didn’t miss the blaring of ads before the film, nor the people whispering two rows behind me, nor the cell phones chiming in the darkness.  But it’s different now; regardless of my original guilty intent, there is definitely something meaningful that’s missing these days, something to do with community.

The web has undoubtedly increased the number of communities on the planet, but the increase is of course in the number of virtual communities, and with the growing number and strength of web communities has come a steady erosion of real world communities.  The more we are an energetic member of Avaaz, the online activist association, or a regular voter on American Idol, or a frequent updater of our LinkedIn profiles, the less likely it is we are a contributing member of our real-world community, that realm of people just outside our front door.

And there is something qualitatively different about person-to-person communication; we all know this.  The full sonic range of the voice, the subtleties of the body language, the tactile wonder of touch.  It’s the difference between a spectacularly flowing 70 mm Omnimax, helicopter-shot image of a wild Rocky Mountain valley in summer, and actually standing in that valley, the sun on your face and the breeze in your hair, smelling the alpine meadow flowers.  The first is indeed amazing, but the experiential gap between the two is nevertheless vast.

The web can be part of the solution of course.  MeetUp.com is a remarkable resource for putting people who share an interest—everyone from dog walkers to Spanish speakers to Drupal aficionados—in the same room.  But for each of us individually, it’s all just one more reason to disconnect daily, to go for a coffee, a beer or a walk with a friend.

It’s no accident that the words communication and community share so many letters.  Both can be reduced to electronic words, sounds or images and still be effective, certainly efficient, but in the final, genuine, ‘full-throated’ analysis, they’ll remain digital, as opposed to real.

 

 

Fear of Technology

On Galiano Island, mobile phone coverage is at best spotty.  Around Sturdies Bay, where the ferry lands, coverage spills over from one of the nearby islands and the tiny graph in the upper corner of my cell phone window shows four bars, but just two kilometres down the road that signal is almost non-existent.  Our cabin lies in a complete dead zone.  A couple of years ago a phone company was proposing to erect a tower on the ridge above our place, but ‘opposition grew’ among the locals.  My neighbour came by, wearing a battered straw cowboy hat, holding a petition opposing the tower.  He knew the phone company claimed the tower would be entirely safe, but he felt they couldn’t be trusted.

At the co-op where I live with my family in Vancouver, a similar spate of opposition arose against the addition of ‘smart meters’ to the building complex’s electrical system.  Those meters would send out a signal, much like a mobile device does, and many of my neighbours felt that signal might be harmful to human health.  According to BC Hydro, the corporation installing the new meters, exposure to radio frequency during the 20-year life span of a smart meter is equivalent to the exposure of a single 30-minute cell phone call.  That would suggest I should be far more concerned about the cell phone tower proposed on Galiano than about the smart meter installed in my Vancouver home.  Regardless, my Vancouver neighbours were at least obliquely aware of such safety claims on behalf of BC Hydro, but they were not about to be swayed in their opposition to the meters.  They too brought round a petition.

My neighbours are quite right to not trust the corporate agenda.  As the documentary The Corporation has so nimbly pointed out, that agenda is about one thing and one thing only, to the exclusion of all else—the maximization of profit, and the resultant increase in share price.  That focus is amoral, in effect sociopathic, but then one can hardly expect it to be otherwise.  Corporations exist in a world of other corporations.

And certainly there is no shortage of examples where the corporate agenda had a direct and deleterious effect on human health.  From big mining to big pharma to big finance, corporations have regularly pursued profits at the expense of our collective wellbeing, there’s little doubt of that much.

Thomas Edison himself once opposed the high-tension, alternating-current grid that would eventually electrify America.  Go figure, I know—this from the father of the electric light bulb—but here’s what he said: “…I have always consistently opposed high-tension and alternating systems of electric lighting… not only on account of danger, but because of their general unreliability and unsuitability for any general system of distribution.”*

In 1891, at a village meeting in Bradford, Vermont, there was a contentious vote taken regarding a proposal to purchase an electric light plant for the purpose of replacing the local gas street lamps.  The vote was not in favour.  Here’s what Larry Coffin, President of the Bradford Historical Society, wrote in his blog about the successful opposition at that meeting: “That opposition seems to have come largely from those who disapproved of a government-owned enterprise, although there were those who were just opposed to change.”

The fear of new technology is indeed linked to a more generalized fear of change.  Change, especially when it’s cloaked in the ‘hardware’ of unfamiliar technology, makes us uneasy, makes us aware that the coming situation may well be open to exploitation by others, exploitation which might put us at a disadvantage, or do us harm.

And petitions are rather like referendums, which I’ve written about elsewhere on this site.  They bring out ‘the opposition’ in us, opposition that comes with the empowerment of opposing, whether it be out of fear, or resentment, or simple contrariness.  We oppose because it makes us feel safer, or more influential, or that we have at least temporarily beaten back the forces that seek to gain advantage upon us.  It’s an attitude that rarely contributes to the greater good, one that is rarely healthy.

I didn’t sign either petition.

 

*Source: Edison, Thomas A., “The Dangers of Electric Lighting,” North American Review, November 1889, pp. 630, 632, 633.

 

The Apostate

Jaron Lanier is an interesting case.  He’s an author, musician, composer and computer scientist who, back in the 80s, was a pioneer in virtual reality software and hardware.  (The company he founded after leaving Atari in 1985 was the first to sell VR goggles and gloves.)

Most interestingly, these days Lanier is a sharp critic of what he terms, “digital Maoism,” the open-source groupthink which says that not only all information, but all creative content should be free and subject to appropriation (for mashups, for instance).  As you might expect, he’s been subject to considerable blowback from the online community for taking this position, and Lanier freely admits that he was once a card-carrying member of this same club, believing that any musicians or journalists who were going broke because of the digital revolution were simply dinosaurs, unable to adapt to a new environment that would soon provide them other means of financial support, if only they were a little patient and properly innovative.  The problem is, as Lanier writes in his 2010 book You Are Not a Gadget, “None of us was ever able to give the dinosaurs any constructive advice about how to survive.”

And so, currently, we have a world where creators—be they artists, musicians, writers or filmmakers—face massive competition and constant downward pressure on what they can charge for their product.  This while a few of what Lanier labels the “Lords of the Clouds”—those very able but still very lucky entrepreneurs who were at the right place at the right time with the right idea (think the owners of YouTube and Google)—have amassed huge fortunes.

These conditions have delivered a new feudal world where, according to Lanier, we again have starving peasants and rich lords, where formerly middle-class creators struggle to survive in the face of competition from what he adroitly describes as those people ‘living in their van,’ or those who are mere hobbyists, creating art as an after-work pastime, or perhaps because they can pay their monthly bills with an inheritance.  Important artists find themselves, like Renaissance artists of old, looking to rely on the beneficence of wealthy patrons.  “Patrons gave us Bach and Michelangelo,” rues Lanier, “but it’s unlikely patrons would have given us Vladimir Nabokov, the Beatles, or Stanley Kubrick.”

There’s little doubt that the digital revolution has been fairly disastrous for the creative community, at least once you combine it with the global economic tanking that took place in 2008-09.  (See last week’s post: ‘DEP.’)  As is so often the case, however, the picture is not so simple.  Another huge factor in the plethora of creative product out there at rock bottom prices is the advent of new production technology.  It’s now a whole lot easier than it was back in the 1970s to make a movie, or record some music, or publish your book.  The means of production have evolved to where just about anyone can get their hands on those means and begin creating, then distributing.  More supply, less [or the same] demand means lower prices; the invisible, emotionally indiscriminate hand of capitalism at work.  The former gatekeepers—the major record labels, publishing houses and movie studios—have lost their decisive positions at the entryway, and this in the end has to be a good thing.  It’s just that the change has not come without a flip side, one Lanier does a nice job of illuminating.

Back in 1990, Francis Coppola was interviewed by his wife during the making of Hearts of Darkness: A Filmmaker’s Apocalypse, a documentary she was shooting about the extraordinary travails Coppola had faced in completing his career-defining opus Apocalypse Now.  Coppola had this to say about the future of filmmaking: “My great hope is that … one day some little fat girl in Ohio is gonna be the new Mozart and make a beautiful film with her father’s camera, and for once the so-called ‘professionalism’ about movies will be destroyed forever, and it will become an art form.”

Be careful what you wish for.

 

Referendum Politics

An old friend once said to me that she thought voting should be a privilege, rather than a right.  She felt citizens should be educated on the issues before they would qualify to vote.  With that, presumably, would come the government requirement to take a course, complete a quiz, or somehow prove that you as potential voter were sufficiently informed to be eligible to step into the voting booth.

It’s a bit much for me, involving a bit too much faith in the benevolence of government, but, on the other hand, it’s not hard to empathize with the sentiment.  Anyone who has made any sort of sustained investigation into the illegality of soft drugs, for instance, will soon come to the conclusion that the U.S. ‘war on drugs’ is a colossal waste of police and legal resources, a policy which pitchforks money to organized crime, fills up jails with non-violent offenders, and delivers scant results in terms of decreased drug use.

And yet, until very recently—maybe—a majority of American voters favored retaining laws prohibiting marijuana use.  Why?  Well, two reasons, I think.  First of all, emotion: the historical residue of the hysteria generated by ridiculous government campaigns of decades past touting the dangers of “reefer madness!”  Secondly, the simple fact that these people aren’t well informed about the issue.  They haven’t studied the facts.  They haven’t seen how much money is spent eradicating marijuana fields, taking down grow ops, busting teenagers, jailing small-time dealers.  They haven’t considered how much money flows to gangs, when it could be flowing in taxes to depleted government coffers.  They may be vaguely aware that the prohibition of alcohol back in the 1920s didn’t work out that well, giving rise to the American Mafia, but they haven’t really had to examine the parallels between those events and the prohibition against marijuana.  Why have the majority of Americans viewed marijuana prohibition as a good thing?  They don’t know any better.

It’s just one example, but it raises the question of whether ‘direct democracy’ is a good thing.  The digital revolution is fast delivering us the means to hold a referendum on every issue, voting from our smart phones, tablets and laptops.  Should we go there?  If we did, we could probably eliminate the need for those noxious politicians squabbling in cantankerous legislatures.  Then we could institute, just as my friend suggested, online courses which a prospective voter would be obligated to complete before casting her vote on any particular proposed law.  Tempted?

The more germane question, here and now, is whether an elected official is compelled to vote ‘the will of the people.’  Setting aside for a second the reality of a ‘party whip’ dictating to said official how he will vote, should our rep be free to vote according to his own personal assessment of the proposition, or should he be obliged to vote in line with what polls show is the view of the majority of his constituents?

Personally, I’m a believer in representative democracy, where we send our best and brightest to debate, study and confer on the issues of the day, and then vote according to their soundest judgment.  Referendums are a mug’s game.  If we are to see progressive change in our society, we’re better off avoiding them.  Why?  For one specific reason: voting ‘no’ empowers; voting ‘yes’ does not.  We can frame the referendum question as carefully as we like, crafting it like obsessed ad men, but the fact is that the number of voters out there who feel at least mild resentment toward politicians dwarfs the number who may be uninformed about any particular issue.  These folks are generally not terribly happy with their lives, and the easiest place to direct the blame is toward the government.

Thus, when the opportunity arises to ‘stick one’ to the government, they’re going to take it; they’re going to vote no to change.  Voting no means that the power still resides with you—maybe I’ll vote yes next time, if you’re nicer to me in the meantime—but voting yes means you no longer hold any leverage.  The power has been passed on to people who may never care to seek your input again.

As I keep saying, change is constant; new problems will always arise, so we need change to contend with those problems—new solutions for new problems.  And referendums will always make that difficult.  They’re a political cop-out.  They amount to politicians dodging their responsibility.

 

 

 

Chop Wood, Carry Water, Write Blog

I was ten years old when my father got his first calculator.  It was 1963, and a few years before he had left his position as City Engineer in the just-barely-big-enough-to-be-called-a-city I grew up in to start his own engineering and surveying business.  The calculator was the size of, say, a slightly elongated 300-page hardcover book, but the thing was, it could do trigonometry, instantly calculating sines, cosines and tangents (and therefore distances) that my father had been laboriously calculating ‘by hand’ until then.

This was significant for my family because my father worked long hours in those days.  He left early, came home for lunch, kicked back in his recliner for a brief nap, returned to work, showed up again just ahead of dinner, and then usually headed back to the office after dinner for a few more hours of mental toil.  It was particularly hard on my mother, who was left to contain me and my two brothers.  The arrival of the calculator, my mother announced, meant that we would all see more of my father.

It didn’t happen.  Any let-up in my father’s work schedule came only years later as the result of a growing business, and the hiring of staff.

The term I remember from the 70s for this same naive hope was ‘cybernetics.’  It’s a word that seems to have meant many different things to many different people over the years, but the meaning I recall being touted was one which suggested that, given the incredible speed and efficiency evolving via modern science and technology, we would all soon be enjoying far greater amounts of leisure time.

It didn’t happen.  You may have noticed.

What a gloriously well-intentioned crock it all was, and another lesson in how poorly we predict the future direction and impact of new technology.  (No one, for instance, foresaw the rise of social media ten years ago.)  We know now that technology—especially digital technology—doesn’t save us time; it simply accelerates our lives.  It closes the gap between what we can do now and what would have previously taken us longer to get to.  With an instant calculation, or instant information, or instant communication, that task which we would formerly have had overnight, or maybe two weeks, to anticipate and ponder is immediately upon us, demanding the doing.

The fact is, if you’d like more time on your hands, get off the grid.  Escape electronic technology altogether.  Chop wood, carry water, save time.

Until recently, the place on Galiano Island where I write this was off the communications grid.  No phone (no cell phone coverage), no TV, no internet.  Electricity and radio, that was it.  To come here was to immerse yourself in the pre-digital age.  That experience became the inspiration for this blog.

Thus, this is the inaugural post for a blog intended to be about our changing times, about the accelerated change we are all living with on a daily basis.  Ken Auletta, who writes a column on communications technology for The New Yorker, has said that we are now experiencing a greater degree of change than at any time in our entire history, and I suspect he’s dead right about that.  (It took radio 38 years to reach an audience of 50 million; the internet just four.)  And I further think that this condition isn’t given quite enough attention these days.

Make no mistake, the digital revolution has many, many positives.  The often-mentioned benefits of social media for the Arab Spring protestors should suffice as an example.  But so too do we have the catchphrase ‘digital Darwinism’ accompanying the ‘e-revolution,’ as if to say that any who are not gleefully on board the digital train are doomed to an embarrassed extinction.  These purveyors of doom employ the word ‘exciting’ more often than any other in discussing the future of the revolution underway (Jeff Bezos, founder of Amazon, tells us, grinning, that we are still in “day one” of the revolution), but let us make no mistake about this fact either: the burgeoning ubiquity of digital technology is driven by money.  Those who wax so enthusiastic about all of us participating in the social media game are themselves coaching from the sidelines.  They want to win all right, but the winner’s payoff for them is exactly as Aaron Sorkin’s brilliant screenplay for The Social Network would have us note: “Baby, you’re a rich man too.”  Those of us out there battling on the field are playing for very different stakes.  And for those on the field, the game is not without injury.

It’s not fashionable to say so, but the digital revolution warrants our skepticism, a critical rather than an eager or unthinking reception, and this blog is intended to facilitate exactly that.

The blogger’s woodpile.