Tag Archives: the future

Interstellar Dreams

In a recent article in Aeon magazine, Elon Musk tells us that he figures it will take about a million people to properly colonize Mars. He has in mind a design for a giant spaceship, the “Mars Colonial Transporter,” to facilitate the task.

And lest you think that Mr. Musk is just another techno-geek keener with a shaky grip on reality, no. This is the guy who sold PayPal to eBay for $1.5 billion, then went on to successfully compete with corporate behemoth General Motors by designing and marketing the Tesla electric car. Currently he heads up SpaceX, a startup dedicated to said colonization of Mars, and a company that has a contract with NASA to transport astronauts to the International Space Station. He’s the real deal.

Musk sees the colonization of the red planet as a stepping stone to exploration of the rest of our solar system, and ultimately interstellar space. He imagines the million colonists in place within a century, the first bunch taking up residence there around 2040.

As a species, we have been journeying out beyond the horizon for about as long as we’ve been mobile. Always willing, despite obvious dangers, to explore unknown territories, then ‘settle’ them, before allowing others to move on again, into the alien. This urge to migrate, to reconnoiter strange lands and then inhabit them is one of the true hallmarks of humankind. No other species has spread so far and wide on the planet, and done it with such aplomb.

And so, for us, outer space is of course “the next frontier.”

The obstacles this time are no less considerable than they were on terra firma. Mars once had a thicker atmosphere, and probably surface water too, but these days it’s a distinctly harsh environment; exposed to it, you’d last less than 30 seconds. Colonists’ quarters there will be close, and extremely stress-inducing. It will be a bleak, constricted adventure, and very few will care to go, given that it’s a one-way ticket.

Getting there, however, is relatively easy compared to interstellar travel. The nearest star system, Alpha Centauri, is a little over four light-years away. Sounds encouraging—if we could even approach the speed of light, the trip might take less than four years as the astronauts experience it, if Einstein was right about high speed slowing the traveller’s clock. The problem is the energy needed for the journey; it seems physically impossible for the spaceship to carry enough onboard fuel. Scientists have imagined ‘solar sails’ that would be pushed along by streaming sunlight—riding a solar wind, if you will. Then there’s the need for enough food for the trip, the immense psychological pressure of isolation lasting that long, the health problems that come with weightlessness, the difficulty of communication with home, exposure to hazardous radiation, and more. Again, scientists have ideas for meeting all these challenges, but they are highly theoretical; none is anywhere near practical realization.
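To put rough numbers on that time-dilation claim, here is a minimal back-of-the-envelope sketch, assuming the conventional ~4.37 light-year distance to the Alpha Centauri system and a few purely illustrative cruise speeds:

```python
import math

DISTANCE_LY = 4.37  # Earth to the Alpha Centauri system, in light-years (approximate)

def trip_times(v):
    """One-way trip durations (Earth-frame years, ship-frame years) at speed v, a fraction of c."""
    earth_years = DISTANCE_LY / v
    gamma = 1.0 / math.sqrt(1.0 - v ** 2)   # Lorentz factor
    ship_years = earth_years / gamma        # proper time experienced on board
    return earth_years, ship_years

for v in (0.5, 0.9, 0.99):
    earth, ship = trip_times(v)
    print(f"at {v:.2f}c: {earth:.1f} yr on Earth, {ship:.1f} yr on board")
```

Even at 90% of light speed the crew would age only a couple of years en route—though, as noted, nothing remotely capable of such speeds is on the horizon.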

And of course there is the possibility of robotic exploration of space, but that’s not the same, is it? Where’s the adventure in that? No robot can ever be a hero, not without a lot of misplaced anthropomorphism.

No, for all intents and purposes, our days of exploration are over. There are no more truly wild places left upon Mother Earth, and our chances of sallying forth into outer space, at least for the indefinite future, are essentially nil. As William Gibson has pointed out, no one will speak of ‘the twenty-second century’ the way we used to speak of the twenty-first.

It’s a necessary, perhaps mythic shift in consciousness with consequences yet to be determined. Obviously it behooves us to take good care of the planet, given that it’s the only abode any of us will ever have. But it also suggests that we should better appreciate the miraculous coincidence of life on ‘the pale blue dot.’ Just as interstellar travel may never happen, so too we may never discover life elsewhere in the universe.

This is it folks. We’re staying home tonight, and likely forever. Fate will find us where we are.

 

Let the Machines Decide

The GPS device in my car knows the speed limit for the road I’m driving on, and displays that information for me on its screen. Nice. Nobody needs another speeding ticket. But what if my ‘smart car’ refused to go over that limit, even if I wanted it to? You know, the wife shouting from the backseat, about to give birth, the hospital four blocks away, that sort of thing.

David Hilowitz photo

It’s a scenario not far removed from reality. Google’s robotic car has inspired many futurists to imagine a computer that controls not only the speed of your car but also where it goes, diverting you away from congestion points toward alternate routes to your destination. Evgeny Morozov has been watching these futurists, and in a recent article in The Observer he suggests that computers may soon be in a position to usurp many functions we have traditionally assigned to government. “Algorithmic regulation,” he calls it. We can imagine government bureaucrats joining the unemployment line to fill out a form that will allow a computer to judge whether they are worthy of benefits or not.

Examples of machines making decisions previously assigned to humans are already easily found. If the ebook downloaded to my Kobo has a hold placed on it, the Vancouver Public Library’s computer will unceremoniously retrieve it from my e-reader upon its due date, regardless of whether I have just 10 more pages to read, and would be willing to pay the overdue fine in order to do so.

But Morozov’s cautionary critique is about a wider phenomenon, and it’s largely the ‘internet of things’ that is fuelling his concern. The internet of things is, most pointedly, the process by which digital chips migrate out of electronic devices and into things we have until now tended to consider inanimate, non-electronic objects—things like your door, or your mattress. It may well be that, in the future, a computer somewhere will be informed when you don’t spend the night at home.

Maybe you spent the night on a friend’s couch, after one too many. Maybe you ate some greasy fast food that night too. And maybe you haven’t worked out at your club’s gym for more than six months now. The data-gathering upshot of this arguably unhealthy behaviour is that you may be considered higher risk by a life insurance company, and so offered a higher premium.
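To make the worry concrete, here is a toy sketch of how that kind of algorithmic premium-setting might work. Every signal, weight and dollar figure below is invented for illustration; nothing here describes any real insurer’s model:

```python
# Hypothetical sketch only: scale a base premium by a surcharge for each
# behaviour flag an insurer's data feed happens to report.

def premium(base, signals, surcharges):
    """Apply a multiplicative surcharge for every flagged behaviour signal."""
    factor = 1.0
    for name, flagged in signals.items():
        if flagged:
            factor *= 1.0 + surcharges.get(name, 0.0)
    return base * factor

signals = {"nights_away_from_home": True,
           "fast_food_purchases": True,
           "gym_visits_lapsed": True}
surcharges = {"nights_away_from_home": 0.02,   # illustrative weights, not data
              "fast_food_purchases": 0.05,
              "gym_visits_lapsed": 0.08}

print(f"${premium(1000.0, signals, surcharges):.2f}")  # a $1,000 base becomes ~$1,157
```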

Presumably there is a human being at the end of this theoretical decision-making chain, but I think we’ve all learned that it’s never safe to assume digital tech won’t take over any particular role—and whatever the final decision on your insurance risk turns out to be, it will certainly be informed by data collected by digital machines.

The most chilling note struck in Morozov’s piece comes, for me, when he quotes Tim O’Reilly, technology publisher and venture capitalist, referring to precisely this industry: “I think that insurance is going to be the native business model for the internet of things.”

Now isn’t that heartening. Corporate insurance as the business model of the near future.

The gist of what is alarming about the prospect of digital machines taking increasing control of our lives is that it suggests that the ‘depersonalization’ we have all been living through for the last three-plus decades is only the beginning. It’s “day one,” as Jeff Bezos likes to say about the digital revolution. It suggests that we can look forward to feeling like a true speck of dust in the infinite cosmic universe of corporate society, with absolutely no living being to talk to should we ever wish to take an unnecessary risk, diverge from the chosen route, or pay the fine instead.

For all the libertarian noise that folks from Silicon Valley make about efficiency and disruption, let no one be fooled: the slick algorithmic regulation that replaces decisions made by people, whether government bureaucrats or not, may be more objective, but it will not bring greater freedom.

The Age of Surveillance

“Today’s world would have disturbed and astonished George Orwell.”
—David Lyon, Director, Surveillance Studies Centre, Queen’s University

When Orwell wrote 1984, he imagined a world where pervasive surveillance was visual, achieved by camera. Today’s surveillance is of course much more about gathering information, but it is every bit as all-encompassing as that depicted by Orwell in his dystopian novel. Whereas individual monitoring in 1984 was at the behest of a superstate personified as ‘Big Brother,’ today’s omnipresent watching comes via an unholy alliance of business and the state.

Most of it occurs when we are online. In 2011, Max Schrems, an Austrian studying law in Silicon Valley, asked Facebook to send him all the data the company had collected on him. (Facebook was by no means keen to meet his request; as a European, Schrems was able to take advantage of the fact that Facebook’s European headquarters are in Dublin, and Ireland has far stricter privacy laws than we have on this side of the Atlantic.) He was shocked to receive a CD containing more than 1,200 pages of PDFs. The information tracked every login, chat message, ‘poke’ and post Schrems had ever made on Facebook, including those he had deleted. Additionally, a map showed the precise locations of all the photos tagging Schrems that a friend had posted from her iPhone while they were on vacation together.

Facebook accumulates this dossier of information in order to sell your digital persona to advertisers, as do Google, Skype, YouTube, Yahoo! and just about every other major corporate entity operating online. If ever there was a time when we wondered how, and if, the web would become monetized, we now know the answer. The web is an advertising medium, just as television and radio are; it’s just that the advertising is ‘targeted’ at you via a comprehensive individual profile that these companies have collected and happily offered to their advertising clients, in exchange for their money.

How did our governments become involved? Well, the 9/11 terrorist attacks kicked off their participation most definitively. Those horrific events provided the rationale for governments everywhere to begin monitoring online communication, and to pass laws making it legal wherever necessary. And now it seems they routinely ask the Googles and Facebooks of the world to hand over the information they’re interested in, and the Googles and Facebooks comply, without ever telling us they have. In one infamous instance, Yahoo! complied with a Chinese government request to provide information on two dissidents, Wang Xiaoning and Shi Tao, and this complicity led directly to the imprisonment of both men. Sprint has actually automated a system to handle requests from government agencies for information—one that charges a fee, of course!

It’s all quite incredible, and we consent to it every time we toggle that “I agree” box under the “terms and conditions” of privacy policies we will never read. The terms of service you agree to on Skype, for instance, allow Skype to change those terms any time it wishes, without notifying you or asking your permission.

And here’s the real rub in today’s ‘culture of surveillance’: we have no choice in the matter. Use of the internet is, for almost all of us, no longer a matter of socializing or seeking entertainment; it is where we work, where we carry out the myriad tasks necessary to keep our daily lives functioning. The choice not to create an online profile—one that can then be sold by the corporations that happen to own the sites we operate within—is about as realistic as the choice to never leave home. Because here’s the other truly disturbing thing about surveillance in the coming days: it’s not going to remain within the digital domain.

Coming to a tree near you? BlackyShimSham photo

In May of this year Canadian Federal authorities used facial recognition software to bust a phony passport scheme being operated out of Quebec and BC by organized crime figures. It seems Passport Canada has been using the software since 2009, but it’s only become truly effective in the last few years. It’s not at all difficult to imagine that further advances in this software will soon have security cameras everywhere able to recognize you wherever you go. Already such cameras can read your car’s license plate number as you speed over a bridge, enabling the toll to be sent to your residence, for payment at your convenience. Thousands of these cameras continue to be installed in urban, suburban and yes, even rural areas every year.

Soon enough, evading surveillance will be nearly impossible, whether you’re online or walking in the woods. Big Brother meets Big Data.

Exponential End

Computers are now more than a million times faster than they were when the first hand calculator appeared back in the 1960s. (An engineer working at Texas Instruments, Jack Kilby, had demonstrated the first integrated circuit—the semiconductor chip—in 1958.) This incredible, exponential increase was predicted by ‘Moore’s Law,’ first formulated in 1965: the number of transistors on a chip doubles approximately every two years.

Another way to state this Law (which is not a natural ‘law’ at all, but an observational prediction) is to say that each generation of transistors will be half the size of the last. This is obviously a finite process, with an end in sight.  Well, in our imaginations at least.
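For a sense of how quickly that doubling compounds into the “million times faster” figure, here is a small illustrative calculation; the dates are rough markers, not industry data:

```python
# Back-of-the-envelope compounding of Moore's Law: transistor counts
# doubling roughly every two years.

def moore_factor(years, doubling_period=2.0):
    """Total growth factor after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# From the first hand calculators (late 1960s) to the early 2010s:
print(f"{moore_factor(2013 - 1967):,.0f}x")   # roughly an 8-million-fold increase
```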

The implications of this end are not so small. As we all know, rapidly evolving digital technology has hugely impacted nearly every sector of our economy, and with those changes has come disruptive social change, but also rapid economic growth. The two largest arenas of economic growth in the U.S. in recent years have been Wall Street and Silicon Valley: Wall Street has prospered on the manipulation of money via computers, while Silicon Valley (silicon being the wafer upon which a semiconductor chip is usually built) has prospered on the growing ubiquity of computers themselves.

Intel has predicted that the end of this exponential innovation will come anywhere between 2013 and 2018. Moore’s Law itself predicts the end at about 2020. Gordon Moore himself—he who formulated the Law—said in a 2005 interview that “in terms of size [of transistors] you can see that we’re approaching the size of atoms, which is a fundamental barrier.” Well, in 2012 a team working at the University of New South Wales announced the development of the first working transistor consisting of a single atom. That sounds a lot like the end of the line.

In November of last year, a group of eminent semiconductor experts met in Washington to discuss the current state of semiconductor innovation, as well as its worrisome future. These men (alas, yes, all men) are worried because, while there are a number of basic ideas about how innovation might continue past the coming ‘end,’ none has emerged as more promising than the others, and any one of them is going to be very expensive. We’re talking about a kind of paradigm shift, from microelectronics to nanoelectronics, and, as is often the case, the early stages of a fundamentally new technology are much more costly than the later stages, when the technology has been scaled up.

And of course research dollars are more difficult to secure these days than they have been in the past. Thus the additional worry that the U.S., which has for decades led the world in digital innovation, is going to be eclipsed by countries like China and Korea that are now investing more in R&D than the U.S. is. The 2013 budget sequestration cuts have, for instance, directly impacted certain university research budgets, causing programs to be cancelled and researchers to be laid off.

Bell Labs 1934

One of the ironies of the situation, for those of us who consider corporate monopoly abhorrent, became evident when a speaker at the conference recalled working at Bell Labs back in the day when Ma Bell (AT&T) operated as a monopoly and funds at the Labs were virtually unlimited. Among the technologies originating at Bell Labs are the transistor, the laser, and the UNIX operating system.

It’s going to be interesting, because the need is not going away. The runaway train that is broadband appetite, for instance, is not slowing down; by 2015 it’s estimated that there will be 16 times as much video clamoring to get online as there is today.

It’s worth noting that predictions of Moore’s Law lasting only about another decade have been made for the last 30 years. And futurists like Ray Kurzweil and Bruce Sterling believe that exponential innovation will continue on past the end of its current course, due in large part to a ‘Law of Accelerating Returns,’ leading ultimately to ‘The Singularity,’ where computers surpass human intelligence.

Someone should tell those anxious computer scientists who convened last November in Washington: not to worry. Computers will solve this problem for us.

Cars

I drove a car just as soon as I was legally able to.  Couldn’t wait.  A learner’s permit was obtainable in Alberta at age 14 back then, so within days of my 14th birthday I was happily out on the road, behind the wheel of a freedom machine.  I owned my first car, a light blue Volkswagen Fastback, by the time I was 18.

epSos.de photo

My own son, who is now 24, has never owned a car, and professes no interest in doing so.  It was my suggestion, not his, that he obtain a driver’s license, since I believed, perhaps naively, that it enhanced his chances for gainful employment.  My son’s cousin, same age, similarly has no interest in driving.  His friend Mendel, a year younger, has never bothered with the driver’s license.

They all own mobile devices of course, and if they ever had to choose between a car and a smart phone it would not be a difficult choice, and the auto industry would not be the beneficiary.

Times change.  And yet, more than ever, Canada is a suburban, car-dependent nation.  Two-thirds of us live in suburban neighbourhoods and three-quarters of us still drive to work, most of the time alone.  Vancouver, where I spend most of my time, now has the worst traffic congestion in all of North America, this year finally overtaking perennial frontrunner Los Angeles.

If ever a technology is in need of a revolution it has to be cars.  As uber venture capitalist (and Netscape co-founder) Marc Andreessen has been pointing out of late, most cars sit idle most of the time—like 90% of the time.  And the actual figure for occupancy on car trips is 1.2 persons per journey.

Car co-ops, and car-sharing companies like Zipcar or Car2Go, point the way.  Many people have begun sharing, rather than owning, a car.  But if you take the numbers mentioned above and add in the coming phenomenon of the Google robot car, the potential transportation picture becomes truly intriguing.

Driverless cars are now legal on public roads in Nevada, California and Florida.  Since 2011, there have been two collisions involving Google’s robot cars.  In one incident, the car was under human control at the time; in the other the robotic car was rear-ended while stopped at a traffic light.  We might assume that a human was driving the car that rear-ended the robot.

What if no one owned a car?  What if you could simply order up a driverless car ride on your smart phone any time, anywhere?  Your robot car would arrive at your door; it might stop to pick someone else up en route, but it would then drop you off at the entrance to wherever you’re going.  You would pay a fee for this service of course, but it would be minor in comparison to what you now pay to own and regularly drive a car.

And of course the need for cars would nosedive, because these robotic cars would be in use nearly all of the time, say 90% of the time.  Car numbers would plummet, meaning traffic congestion would be a thing of the past.  And it keeps going: garages, driveways, parking lots would begin to disappear.  Our urban landscape, which has essentially been designed to accommodate cars, would begin to transform.  A lot more green space would become available.
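A crude, purely hypothetical bit of arithmetic shows why the fleet would shrink so dramatically if utilization jumped from roughly 10% (today’s privately owned cars) to the 90% imagined above; the city size is invented for illustration:

```python
# Hypothetical fleet arithmetic behind "car numbers would plummet".
# Every input is an illustrative assumption, not data from this post.

PRIVATE_UTILIZATION = 0.10   # privately owned cars in use ~10% of the time
SHARED_UTILIZATION = 0.90    # an on-demand robotic fleet, busy ~90% of the time
private_fleet = 1_000_000    # a notional city with a million private cars

# Assume the total hours of car travel demanded stays the same; only utilization changes.
car_hours = private_fleet * PRIVATE_UTILIZATION
shared_fleet = car_hours / SHARED_UTILIZATION

print(f"shared fleet needed: {shared_fleet:,.0f} cars "
      f"({shared_fleet / private_fleet:.0%} of today's fleet)")
```

On those assumptions, roughly one shared robotic car could do the work of nine privately owned ones.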

And I haven’t even mentioned the reduction in carbon pollution that would ensue with the reduction in cars, carbon pollution being a problem which just may threaten the stability of civilization in the coming years.

Cars have been with us for about 100 years now.  Our relationship with them over that period has at times been tender, at times belligerent, at times top-down, sun-in-your-face, wind-in-your-hair fabulous, at times utterly savage.  For those people who love cars, who fuss over them, restore them, take them out for careful drives only on sunny Sunday afternoons—I hope those people keep their cars, as an expensive hobby.  For the rest of us, those who use cars simply to get from A to B, for whom cars are just a form of convenient transport, the days when we need to own one are disappearing.  For my money, the sooner the better.

The End of the Movies

I grew up without television.  It didn’t arrive in the small town where I lived until I was about ten.  So I grew up watching the movies, initially at Saturday afternoon matinees, which my older brother begrudgingly escorted me to under firm orders from my mother, who was looking for brief respite from the burden of three disorderly boys.  Admission was ten cents, popcorn five cents.  (If these prices seem unbelievable to you, all I can say is… me too.)

Movies were it, a prime cultural (and for me eventually professional) mover, right through my adolescence and early adulthood.  For me, TV has tended to be a kind of entertainment sideline, something to be watched when a new show came around with some serious buzz, but more often just a passive filler of downtime, material to unwind with at the end of a busy day.

That has of course all changed in recent years, and not just for me.  I don’t go to the movies much anymore—that is, I don’t go to the movie houses—and, what’s more, movies don’t seem to matter much anymore.  These days movies are mostly noisy spectacle: big, flashy events, but events with very little to offer other than raucous entertainment.  Comic book movies are the dominant genre of today, and, no matter how I slice it, those comic book characters don’t really connect with life as I’m living it, day to day.  And, as I say, it’s not just me, as someone from an older demographic.  Today, unfortunately, the audience for the movies is smaller, and narrower, than it’s ever been.

Movie audiences peaked in 1946—the year The Best Years of Our Lives, The Big Sleep, and It’s A Wonderful Life were released—when 100 million tickets were sold every week.  By 1955, when Guys and Dolls, Rebel Without A Cause, and The Seven Year Itch were released, the advent of television had cut that audience to less than half.

But the movies survived television and found a ‘silver’ age (‘gold’ being the studio-dominated 40s) in the decade from 1965 to 1975, when we watched movies like The Godfather I and II, Midnight Cowboy and Chinatown, and the works of Ingmar Bergman, Federico Fellini and Francois Truffaut enjoyed theatrical release right across North America.  It was a time when movies did seem to have something to say; they spoke to me about the changing world I was in direct daily contact with.

Then came the blockbusters—Jaws and Star Wars—and the realization that Hollywood could spend hundreds of millions of dollars on a movie, rather than tens of millions, and garner a correspondingly large increase in returns.  Movies have never been the same.

Today fewer than 40 million people in North America go to see a movie once a month.  In a 2012 poll done by Harris Interactive, 61% of respondents said they rarely or never go to the movies.  Why would you, when you have that wide screen at home, ad-free, with the pause button at your disposal?  The most you’ll likely pay to watch in private is half of what you would at the movie house.

And then, this year, we had a summer of blockbuster flops.  The worst was The Lone Ranger, made for $225 million and likely to cost Walt Disney at least $100 million.  Both Steven Spielberg and George Lucas have said that the industry is set to “implode,” with the distribution side morphing into something closer to a Broadway model: fewer movies released, staying in theatres longer, but with much higher ticket prices.  Movies as spectacle.

(If you’re interested in reading more, an elegant, elegiac tribute to the run of the movies is The Big Screen, published last year and written by David Thomson, a critic born in 1941 who has thus been around for a good-sized chunk of film history.)

It may well be that movies, as the shared public experience I’ve known, are coming to the end of a roughly 100-year run.  It was rapid, glamorous, often tawdry, sometimes brilliant, once in a while even significant, but technology is quickly trampling the movies.  If you were there for even a part of it, you might feel blessed.

The Arc of Age

“Oh to live on Sugar Mountain
With the barkers and the coloured balloons.
You can’t be twenty on Sugar Mountain
Though you’re thinking that
You’re leaving there too soon.”

—Neil Young, from Sugar Mountain

There is a time in your life when all opportunities seem available to you, a time when, whether it’s school, travel, love or work, any number of options are still to come.  If any particular relationship, living situation or job doesn’t work out, well, there are always more chances ahead.

And then one day, approximately two and a half heartbeats later, you wake up to the reality that this wide-open future no longer awaits you.

Kids do it to you more than anything else.  You can always change jobs, move to another city, or leave a lover, but a child is forever.  No changing your mind, after the fact.  As Neil Young has written in another song (Already One), once that charming little creature of yours enters the world, he or she “won’t let [you] forget.”

The arc of a life affair is like a splendid strand of fireworks, trailing sparks as it rockets up into a starry sky, only to “too soon” begin the downward turn, moments away from extinguishment.  To draw upon another pop culture reference, Anthony Hopkins, in the critically-maligned-but-actually-rather-decent Meet Joe Black, stands addressing the crowd assembled for his 65th birthday, knowing Death awaits him at the edge of the party: “Sixty-five years.  Didn’t they go by in a blink?”

I’m not quite there yet, but I’m acutely aware that opportunities are now diminishing for me, not expanding.  My father will turn 91 this year.  We got him out to Galiano over the summer for what may well be his last visit to a place where he spent many warm days noodling around on various “projects”—a septic pipe for his trailer which emptied into two separate, submerged plastic garbage barrels (I kid you not), a wooden tower for a golden-coloured metal weather vane that weighs roughly 400 pounds, and has never once moved.

Dad and three of his brothers went off to war while all still in either their teens or twenties (Dad was 18).  Only two of them came back.  They didn’t cause the war, not in the slightest possible way, but it impacted their lives in a way I can only imagine.  On my mother’s side, my uncle’s entire graduating class walked from the Olds Agricultural College up to Edmonton, enlisting en masse.  Such were the times, and the excitement in the air for young people, eager for experience.

Sugar Mountain is about the transition from childhood to adolescence, marked by things like (for Young’s generation) a furtive first cigarette beneath the stairs, or a secret, world-exploding note from that girl “down the aisle.”  We all leave the magic of childhood “too soon,” but then the other transitions of life seem to pile on pretty rapidly too.  The end of school, perhaps marriage, the death of our parents, children leaving home.  It all comes at you like rolling breakers at the beach, just as irresistible.

Oddly enough, the passage of time does not slow as we age.  In fact it accelerates, causing whole chapters of our lives to blur into a kind of muted cacophony of sounds and pictures, like a tape set to fast forward.  (I’ve commented here on this blog about the blur of the child-rearing years.)  A year’s time—say, grade four—which seemed to drag on forever for me as a child, now seems to hurtle by in an instant, like an approaching pedestrian whom I don’t recognize until he’s passed me by.  Too late to even smile.

Most of us will live ordinary lives.  We won’t be rich, or famous, extraordinarily powerful, or especially attractive.  But if we’re lucky, and if we make just enough good choices, we will live long and well.  It won’t be a perfect record, not even close, and there will be a fair number of regrets, but if tragedy, disease, natural catastrophes and the sordid affairs of nation states leave you largely untouched, you will live long, and you will find meaning.  It will come with children, and those others whom you love.  If you are so lucky, it will come whether you like it or not.  No need to hurry.

 

Marx Was Right

Those politicos who chant the competition-as-salvation mantra, especially those in America, may find it hard to believe, but not so long ago many prominent U.S. businessmen and politicians were singing the praises of corporate monopoly.  Incredibly, given America’s current climate of opinion—where the word government, never mind socialism, seems a dirty word—just 100 years ago it was widely believed that there were four basic industries with “public callings”—telecommunications, transportation, banking and energy—that were best instituted as government-sanctioned monopolies.  The most successful of the corporate entities to occupy this place of economic privilege was the American Telephone and Telegraph Company (AT&T), and here’s what its then president, Theodore Vail, had to say about the social value of competition: “In the long run… the public as a whole has never benefited by destructive competition.”

Groucho’s older brother Karl (kidding)

Karl Marx may have been wrong about many things, including what best motivates the average human being, but he was certainly not wrong when he suggested that capitalism tends directly toward monopoly.  How could it not, when the most durable means of defeating the competition will always be to simply eliminate it?  In 1913, AT&T had been remarkably successful in doing just that, and its monopoly would survive undiminished until 1982, when the Reagan administration oversaw the breakup of AT&T into the seven so-called ‘Baby Bells.’

(Before you conclude that it’s only right-thinking, right-leaning governments, like Reagan’s, that can properly control corporate America, know that it was also a Republican administration, under President Taft, that condoned the ascendancy to monopoly by AT&T in 1913.)

Tim Wu, in his book The Master Switch (cited last week in this blog), has postulated “the cycle” as continuously operative in the communications industries (all the way from telegraph to TV), whereby technical innovation gives birth to an initially wide-open trade, but where soon enough corporate consolidation leads to singular business empires.  It’s worth noting that by 2006, AT&T had, via some truly brutal business practices, essentially reunited its pre-breakup empire, leaving only two of the Baby Bells, Verizon and Qwest, still intact and independent.

The latest example of the tendency toward monopoly in Canada can be seen readily at play in the federal government’s efforts to boost competition against the oligopoly of this country’s big three telephone providers: Telus, Bell and Rogers.  Evidence suggests that, prior to the government’s most recent intervention—reserving wireless spectrum in 2008 for new companies like Mobilicity, Wind and Public Mobile—Canadians paid some of the highest mobile phone charges in the world.  Since their entry into the marketplace, these three rookie players have—what a surprise—struggled to prosper, even survive, in the face of fierce competition from the triad of telecom veterans.  All three ‘Canadian babies’ are now said to be up for sale, and the feds, to their credit, stepped in earlier this year to block a takeover of Wind Mobile by Telus Corp.

Former Baby Bell Verizon—now referred to, in comparison to Canadian telecoms, as “giant” or “huge”—is reported to be circling Canada’s wireless market, rumoured to be considering a bid on either Wind Mobile or Mobilicity.  Facilitating this move—and setting off alarm bells (no pun intended) near the Canadian cultural core—is a recent legislative relaxation of formerly stringent foreign ownership rules, allowing foreign takeovers of telecoms with less than 10 per cent of the market.

Wu’s book asks whether the internet will succumb to the same cycle of amalgamation that so many other electronic media have.  His answer: too soon to tell, but history teaches us to keep a wary eye.  And if you consider Apple’s cozy relationship with AT&T over the iPhone, or the fact that Google and Verizon have courted each other, you’d have to agree with his concern.  Wu concludes his book by advocating what he terms “The Separations Principle”: an enforced separation of “those who develop information, those who control the network infrastructure on which it travels, and those who control the tools or venues of access” to that information.

The internet, given its decentralized construction, is not easy to consolidate, but no one should feel confident that today’s corporate titans won’t try.  Nor should we underestimate their ability to succeed in that effort.

 

Future Imperfect

“You’re welcome to Le Carre—he hasn’t got any future.”

—A publisher who rejected John le Carré’s The Spy Who Came In From the Cold, which would go on to be described by Publishers Weekly as “the best spy novel of all time.”

When it comes to predicting the future, we all make mistakes.  As we age, we hope to make them slightly less often, but, let me assure you, we never entirely escape them.  Some of us, however, are in positions of authority which make the dimensions of our prognosticating blunders truly spectacular.  Infamous examples abound, especially in the cultural realm…

“Who the hell wants to hear actors talk?”

—Harry Warner of Warner Brothers, dismissing the idea of ‘talkies’ in 1925.

“Guitar groups are on the way out… The Beatles have no future in show business.”

—Dick Rowe, Decca Recording executive, snubbing The Beatles in 1962.

So too in the realm of technological future-telling.  Tim Wu, in his highly entertaining The Master Switch, recounts how in 1877 Western Union [Telegraph] was the most powerful information corporation on the planet, exclusive owners of the only continent-wide communications network.  The Bell [Telephone] Company was at the time a new and struggling tech firm with few customers and even fewer investors.  Such was the financial duress felt by Bell that the company’s President offered Western Union all of Bell’s patents for $100,000.  William Orton, Western Union’s President, declined the offer.  A company memo circulated a year earlier summed up Western Union’s take on the admittedly primitive Bell technology: “This ‘telephone’ has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us.”

Lest you think that the pace of technical innovation was invariably slower in the 19th century, it should be noted that, less than a year later, Western Union recognized the error of that assessment and embarked upon a furious development effort of its own, commissioning a promising young inventor named Thomas Edison to come up with a better phone.  The effort would prove strangely inopportune (proving that luck, too, always plays a part in determining the future) when, just as Bell launched a patent-infringement lawsuit, it was discovered that Jay Gould—the robber-baron king mentioned elsewhere on this blog—was secretly buying up shares of Western Union in preparation for a hostile takeover.  Western Union was suddenly obliged to view its telephone dust-up as a “lesser skirmish, one it no longer had the luxury of fighting.”  The company settled out of court with Bell on less than favorable terms, and Bell soon re-emerged as the American Telephone and Telegraph Company (AT&T), which would become the most successful communications company of the 20th century.

In a ‘look back’ article published earlier this year, U.S. News revisited its own predictive report from 1967, entitled “The Wondrous World of 1990.”  The predictions made in the 1967 piece range from wide misses—a manned Mars landing, a cure for the common cold—to the remarkably prescient—a “checkless, cashless” economy, an “automated” (Google?) car.  (A blithe addendum notes how, if the driver of the robotic car does not accelerate as instructed, “the [computerized] roadway takes over control.”)

More broadly, two prophecies stand out for me in the 1967 article, two points central to the themes of this blog.  One is a miss; the other a palpable hit.  The miss: “Production and wealth will rise faster than population, so that incomes will climb steadily.”  This in turn would mean that the typical 1967 worker, who was then putting in about 2,000 hours a year on the job, would, by 1990, see those hours drop to 1,700 or fewer.  “The four-day week will arrive,” trumpets the article.

If only they had gotten that right.

The hit: “Underlying the transformation to come is a quickening in the tempo of development out of scientific discoveries already made.”  One Dr. Richard G. Folsom, then president of Rensselaer Polytechnic Institute, is quoted: “The magnitude of change will expand, even explode.”

That much they did get right.