Tag Archives: technology

The Last Post

In all likelihood, this is the last post on this site. The blog has run for precisely three years, this post aside, and was, in good part, a deliberate but modest exercise for me. During its first year (2013), I set myself the task of writing a post every week, and did just that, before tailing off to a more intermittent schedule. As I wrote in an earlier article, I have blogged as a creative outlet, for myself, because I actually enjoy the art and craft of writing, especially when I can do so on my own schedule.

Photo: Thomas Hawk

Maybe the writing and posting is little more than the piteous human impulse to leave something behind, after we’re so soon gone: a small stack of notes, initials carved in the trunk of a tree, ‘handprints on the wall of the digital cave.’

My approach has of course meant that the size of the audience for this blog has been limited, to say the least, but I’m not too fussed about that. Its final value for me has lain elsewhere.

Maybe one day it will be appreciated as one small record kept during times which were changing as quickly as they have ever changed for humankind. The disruption of the digital revolution was in high gear back in 2012-13, and it seems to me that it has slowed some in more recent years. Robotic cars are coming on rather more slowly than did smart phones.

These days, it feels more like we are living in a time of reckoning with that technical, social, economic disruption, a time when many people are looking for someone to blame for the more difficult circumstances they suddenly find themselves living in. And, sadly, there are always politicians willing to step up and seize the opportunity afforded by those searchings, politicians like Donald Trump and Marine Le Pen. Clearly there is a price to be paid when change is not so well managed by those with control of the social and economic levers. If we don’t get effective progressive change then we get reactionary change, and reactionary change is doomed to fail, at least in the long run.

The most impactful change has of course been economic, the result of globalization in a capitalist society which makes having more money, as opposed to less, ever so much more convenient and status-boosting. Median incomes have stalled in the West, given global competition; jobs have disappeared, the kinds of jobs available have changed, and it is so much easier to blame immigration—the visible signs of change—than it is to blame, say, automation, which is far more likely to have undermined your economic well-being.

What does it mean for the future? It’s always hard to say. Events are by their very nature unpredictable, and unforeseen events can quickly sway historical outcomes in big ways. As the human species continues to overrun the planet, we are going to have to wrestle certain problems—overpopulation, ecological damage (especially climate change), economic inequality—to the ground, or we are in for a rough ride.

Can we do so? It’s certainly possible. All it takes is the right choices, collectively and individually, made by the right people at the right times. Simple, right?

No, not so simple. But then, what we can do individually is restricted, so that makes it a little easier. Educate yourself, sit back, see the bigger picture, make choices for the greater good, rationally rather than out of frustration, resentment, anger or any of those other emotions which we then look to rationalize. Try to be aware of when you are acting selfishly, blaming others, speaking only to those from your own ‘tribe’, however that may be defined, whether by class, race, religion or nationality. Like it or not, we are all in this together. That colony on Mars will never be our salvation.

Maybe, just maybe, this blog has helped someone other than me to do these things, to maintain a wider perspective, clarify, stay calm and choose wisely. If so, bonus. Great. If not, that’s okay too. It’s helped me.

London 1897

1897 was Queen Victoria’s Diamond Jubilee year; she had been reigning over her Queendom for 60 years, keeping Edward, her eldest son and thus heir, sidelined until he was by then in his late 50s. The current British monarch has of course done the same thing for even longer to her eldest son Charles, who is now 67.

Traffic outside the Bank of England, London, 1897

Economic inequality was pervasive, glaring and much resented in Victoria’s England, fed by proceeds from the largest empire the world has ever known. (It occupied nearly a quarter of the earth’s total land mass.) Accurate data is hard to come by, but here’s how George Bernard Shaw came to describe the situation:

“It is in this phase of capitalistic development, attained in Great Britain in the 19th century, that Socialism arises as a revolt against a distribution of wealth that has lost all its moral plausibility. The inequalities [have] become monstrous.”

Such conditions gave rise not only to socialism (think Bernie Sanders), but to communism, and Scotland Yard security forces kept a vigilant eye upon London’s Communist Working Men’s club, where various firebrands regularly called for the overthrow of the existing government. Even more worrisome was the Epicerie Francaise, where international anarchists met and called for radical action of all sorts. French security forces had opened a Special Branch in London just to monitor the Epicerie.

In April of that year, a bomb exploded in the London underground, killing one and injuring more. The perpetrator was never identified, but most people blamed ‘foreigners,’ probably Italians. ‘Immigrants,’ we might call them today, probably Muslims.

In May, Guglielmo Marconi (a rich, well-connected Italian) sent the first ever wireless telecommunication over open sea when he transmitted, “Are you ready?” from Flat Holm Island to the Welsh coast, a distance of 6 kilometres. Unlike his scientific counterparts who favored the free exchange of knowledge (open source?), Marconi was the prototype of today’s successful start-up entrepreneur: brilliant, hardworking, but ever concerned that his competitors (and there were many) would steal his technology and exploit it commercially before he was able to.

Telegraphy was an established industry by this time, having been first introduced commercially in 1837. It had sped up the exchange of information unimaginably, from the top speed of a human or animal to virtually immediate, over vast distances. In freeing information from the movement of any physical object, telegraphy had revolutionized the global economy, as well as the media, that is journalism. Critics, however, complained that the telegram had resulted in the standardization of language, stripping it of its regional distinction and flair.

Marconi’s new wireless technology was disruptive and much resented. When he later succeeded in transmitting a wireless message across the Atlantic Ocean, he promised that he would undercut the cost of sending a telegram via ocean-bottom cable by 60%.

In the heyday of its popularity as a medium, the average telegram was about 12 words, or about 60 characters.

Electric cars made their first appearance in August of 1897, as London taxis. They disappeared from the roads two years later. Their biggest flaw was likely the excessive weight of their batteries.

Meanwhile, over in America, September saw the Sheriff and his men from Luzerne County, Pennsylvania fatally gun down 19 striking mineworkers, while injuring many others. The murdered men were immigrants, all were unarmed and all had been shot in the back; several had suffered multiple wounds. Protests ensued, sometimes violent, and the Sheriff and his deputies were eventually arrested, only to be later acquitted.

The French expression, ‘Plus ça change…’ is an abbreviated version of a maxim usually translated into English as, ‘The more things change, the more they stay the same.’ Indeed, some things change (especially technology; the fastest motorcar in London in 1897 topped out at about 35 miles per hour), and some things don’t. One website defines the expression as the “resigned acknowledgment of the fundamental immutability of human nature and institutions.” Touché.

So there it is for you. A little historical perspective on today’s turbulent times. Never a bad thing.

Death As A Process

We are all hurtling toward oblivion. And none of us want to talk about it, much less think about it.

Alex Proimos photo

The real problem, however, is that, although we are all careening toward our own personal extinction, modern medicine is doing a bang-up job of forestalling the moment. Average life expectancy back in classical Greece was under 30 years; life expectancy in many countries today is over 80. Globally, over the last 200 years, life expectancy has essentially doubled, and the trend continues. A recent Lancet study tells us that life expectancy for men and women has increased by about six years in just the past two decades. It is said that the first person who will live to age 200 has now been born!

‘What’s the problem?’ you may ask. Longer life = a good thing. No?

Well, no and yes. A healthy, meaningful life, free of pain, sure. But, as many of us have seen, the final years, under a miraculous contemporary medical regime, can be contrary to all three of those descriptors.

We used to, more often than not, die at home. Not anymore, although almost all of us will say that we’d prefer to. And again, the trend continues; one study says that in the U.K., by 2030 fewer than one in ten will die at home (and that includes a ‘nursing home’). When the end comes, we are very likely to be within the walls of a cool, clinical institution.

But again, I don’t think that’s the worst of it. We used to die far more precipitously. We got old, we got sick, we died, like dropping off the earthly plane. Now, as described in Being Mortal, Atul Gawande’s excellent book on this untidy business, the pattern of our death is typically a prolonged series of much shorter drop-offs. We develop heart disease; there are effective drugs for that. Our legs go; here’s an electric wheelchair that can spin around inside an elevator. Cancer crops up; begin chemotherapy. Today’s medical model is an interventionist one; if the problem can be addressed it will be, or at least it should be. And so our lives are repeatedly extended, and each time, the quality is not quite what it was.

What’s more, the final expiry itself is no longer definitive. Our demarcation of death used to be based upon the heart and lungs stopping their involuntary movement. Then, back in the late 60s, given the interventionist aplomb of doctors, we switched to ‘brain dead.’ But now, even that definition isn’t working for us. In a recent National Geographic article, brain death is broken down into five separate stages. (The first is short-term memory loss, and if that’s true, I’m dying as I write this.)

Just above, I used the word “moment” in referring to death, but hang on. As quoted in the same article, Sam Parnia, in his book Erasing Death, refutes that notion explicitly: death is “a process, not a moment.” And doctors can now resuscitate our dying selves well along into that process, up to 30 minutes in with adults, much longer for children, long after we would have been ‘left for dead,’ just a few decades ago.

It’s all very disorderly and difficult, and something we all need to think about, vis-à-vis our own short lives. As my mother said several times in referring to particularly decrepit friends, “We can live too long.” And yet, as a friend of mine once said, with unsettling accuracy, “We cling to life.” (Well, not my mother. She wasn’t in pain, but, dying of cancer, she asked for, and would have taken, if it had been provided for her, a “euthanasia pill.”)

And a final point here. It is often the family members of the dying, not the dying themselves, who prompt the intervention. We cling not only to life, but to our connection to the dying. And if one thing is clear to me in all this messiness, it’s that the decision to intervene should rest with the dying, not the interventionists, whoever they may be.

Ask your aging loved ones what they want, what they fear, when the end comes. Make sure that you have a ‘Living Will’ in place, that a ‘Do Not Resuscitate’ sign will be hung on the end of the hospital bed where you will likely expire, if that is your desire. Make your wishes known to your family members before you’re incapacitated and the decision has to go to them.

If nothing else, go out on your own terms.


Closing the Digital Lid

I began teaching a new course last week, as so many other teachers everywhere did, and, as is my wont, I asked my students for ‘lids down’ on the laptops which inevitably appear on their desks as they first arrive and sit down. The rationale of course is that their computers are open in order for them to “take notes,” but we can all be rightly skeptical of that practice. The online distractions are simply too many and varied for that to be consistently true, given the perfect visual block that the flipped-up lids present to us instructors stranded on the back side of that web portal.

It’s interesting to note that recent research indicates that students who take notes longhand, as compared to on their laptops, fare better in recalling the substance of the course material than do their keyboarding counterparts. And the longhanders score better not only in factual recall; conceptually they also respond more accurately and substantively to after-class questions, avoiding what the researchers refer to as the keyboarders’ “shallower processing.”

It’s a contentious issue among educators of course. Some suggest that we instructors should ‘embrace’ the digital realm in our classrooms, allowing students to tweet as we speak, ask questions anonymously, fact check, all that. A richer, more vibrant educational environment is the result, say these internet enthusiasts.

It depends upon class size, and certainly I wouldn’t object to laptops or handhelds open and operating during any kind of educational ‘field trip,’ but I came to the lids down position long before I heard about the recent research I’ve just mentioned, and I did so out of what may be seen as an old-school notion: common courtesy.

My classes are small—as writing classes they need to be—and I am always looking for what I refer to as ‘engagement in the process.’ Regardless of the quality of the writing produced, I’m looking for students to listen carefully at all times, to me as well as to their fellow students, to think, process, and respond with ideas that may or may not be helpful to the group process. That just isn’t happening, or at least not as well as it could be happening, if students are in two places at once. Except of course they are not in two places at once; their attention is simply bouncing rapidly back and forth between those two places. What we describe as multitasking.

In that sense I’m looking for more than just common courtesy, but respectful attention is nevertheless at the heart of what I’m asking for in a classroom. Anything less is simply rude.

We’re all familiar with moments like this:


babycakes romero photo

Where the so-called ‘digital divide’ has nothing to do with separate generations or genders; it’s the sad loss of a potential conversation, and I very much consider my classroom process a group conversation.

Or how about this image, taken from the CNN election night coverage:

CNN laptops

This is more precisely what I’m on about. These folks are gathered as pundits to discuss and enlighten the audience on the events of the evening, and clearly, as part of that endeavor, they can be expected to listen to one another, with their varied insights and political leanings, and we in the audience can be expected to profit by that exchange. But, with lids up, we may be sure that each pundit is periodically checking the screen while their fellow analyst is speaking. Why? I’m assuming it’s because they wish to check in on the very latest election data as it flows in. But this is CNN headquarters, where the data flowing all around them couldn’t be more up-to-the-minute!

If you’re going to engage in a conversation with someone, group or otherwise, then do that, engage: listen carefully and respond thoughtfully. Not with just your own talking points, but with a reasoned response to what has just been said by your conversational partner.

Online addiction continues to engulf us. My own personal survey indicates that more than half of those of us walking outside are either staring into the virtual void or at least carrying the tool which connects us to that space. At a bus stop or in the subway car the great majority of us are guilty. And so it becomes increasingly difficult for us to unplug when we find ourselves a member of a group meant to communicate face to face.

When it comes to conversation and common courtesy, I guess it’s like what an old professor once said to me about common sense: ‘Not so common.’

The Role of Government

It’s the statistic that got everyone’s attention. A recently released study by Oxfam, the international agency dedicated to combatting poverty and injustice, warns that the richest 1% of the planet’s citizens will soon possess more than the remaining 99%.

The nation’s representatives?
Michael Riffle photo

In an interesting related factoid, The Upshot (a ‘data-driven’ undertaking from The New York Times) reports that the richest 1% of Americans, on average and after excluding capital gains, have seen their incomes increase by $97,000 since 2009; the 99% have seen their average income fall by $100 in that time.

In Canada the situation is less dire, but the trend is in the same direction. In the 1980s, as reported by the Broadbent Institute, the top 1% of Canadians received 8% of all national income; that figure has now risen to 14%.

In that same article in The Upshot, writer Justin Wolfers, professor of economics at the University of Michigan, wonders why it is that “robust employment growth over recent years” has not generated more broadly based income growth in America.

Well, surely part of the answer has to be the structural changes wrought in the economy by the digital revolution. The London taxi drivers currently protesting the arrival of the Uber app are just the latest in a now long line of workers who have found themselves displaced by hi-tech changes in their industry. And those workers, once displaced, rarely find themselves able to land alternate employment at higher wages. As has been pointed out by authors like Erik Brynjolfsson and Andrew McAfee, the people not being displaced by computers—once we get past the coders themselves—tend to be folks like waiters, gardeners and daycare workers; not exactly the sorts pulling down the big bucks.

And the other major factor of course has to be the whole trickle-down, anti-regulatory economic wave that began to swell back in the days of Reagan/Thatcher, and which continues to roll over us today. The financial crash of 2008 is the most obvious example of what economic deregulation can mean to all of us, but, more generally, as times have toughened in the Western economies (that is as we have seen the onset of globalization), people have tended to increasingly resent the hand of government in their pockets. Neo-cons have encouraged this attitude at every turn, and so the back doors have been increasingly left open, allowing the rich to sneak into the kitchen, then scoop up ever larger portions of the economic pie.

The single greatest triumph of the Republican Party in America has been their ability to convince a great many white, working-class Americans that the Party has their backs, when very few propositions could be further from the truth.

We have seen, in recent decades, a steadily growing anti-government sentiment provide steadily growing opportunity for the rich to get ever richer. And let’s be very clear about one thing. The growing bank accounts of the mega-rich are not the best means for growing the economy, for easily apparent reasons. Those guys simply don’t have to spend their money the way we poorer people do, just to stay ahead of the monthly bills. Here’s a TD Bank study that makes this point.

Now no one should rightly go about saying more government is the answer to all our socio-economic woes. Anybody who has ever dealt with a government office in a time of acute need knows that these bureaucracies can be inefficient, self-serving and sometimes obnoxious, even vindictive. But greater government management of the current economy? Well, how much more evident could that need be?

Robert Reich’s formula for government intervention.


It comes down to some fairly old-fashioned ideas like a guaranteed annual income, higher minimum wages, and a more progressive income tax regime. Scary stuff for a whole lot of people. But if you’re one of them, if you’re one of those people who finds the idea of more government anathema, an outrageous infringement upon your economic freedom, you should recognize that if your opinion prevails, then what you see now is what you will see later.

Only worse, if that can be imagined.


Interstellar Dreams

In a recent article in Aeon magazine, Elon Musk tells us that he figures it will take about a million people to properly colonize Mars. He has in mind a design for a giant spaceship, the “Mars Colonial Transporter,” to facilitate the task.

And lest you think that Mr. Musk is just another techno-geek keener with a shaky grip on reality, no. This is the guy who sold PayPal to eBay for $1.5 billion, then went on to successfully compete with corporate behemoth General Motors by designing and marketing the Tesla electric car. Currently he heads up SpaceX, a startup dedicated to said colonization of Mars, a company that has a contract with NASA to transport astronauts to the International Space Station. He’s the real deal.

Musk sees the colonization of the red planet as a stepping stone to exploration of the rest of our solar system, and ultimately interstellar space. He imagines the million colonists in place within a century, the first bunch taking up residence there around 2040.

As a species, we have been journeying out beyond the horizon for about as long as we’ve been mobile. Always willing, despite obvious dangers, to explore unknown territories, then ‘settle’ them, before allowing others to move on again, into the alien. This urge to migrate, to reconnoiter strange lands and then inhabit them is one of the true hallmarks of humankind. No other species has spread so far and wide on the planet, and done it with such aplomb.

And so, for us, outer space is of course “the next frontier.”

The obstacles this time are no less considerable than they were on terra firma. Mars once had a substantial atmosphere, and probably surface water too, but these days it’s a distinctly harsh environment; exposed to it you’d last less than 30 seconds. Colonists’ quarters there will be close, and extremely stress-inducing. It will be a bleak, constricted adventure, and very few will care to go, given that it’s a one-way ticket.

Getting there, however, is relatively easy, compared to interstellar space travel. The nearest star system, Alpha Centauri, is just over four light years away. Sounds encouraging—if we could even approach the speed of light, the trip might take less than four years of the astronauts’ own time, if Einstein was right about speed shortening time. The problem is the energy needed for the journey; it seems physically impossible that the spaceship could carry enough onboard fuel. Scientists have imagined ‘solar sails’ which would capture the streaming energy of the sun, a solar wind, if you will. Then there’s the need for enough food for the trip, the immense psychological pressure of isolation lasting that long, the health problems that come with weightlessness, the difficulty of communication with home, exposure to hazardous radiation, and more. Again scientists have ideas to meet all these challenges, but they are highly theoretical. None of them are anywhere near practical realization.
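To put a rough number on that time-shortening effect, here is a back-of-the-envelope calculation; the 0.9c cruising speed is purely illustrative, far beyond anything any proposed propulsion system could actually deliver:

```latex
% Illustrative figures only: constant cruise speed v = 0.9c, distance d ≈ 4.37 light years
t_{\text{Earth}} = \frac{d}{v} = \frac{4.37\ \text{ly}}{0.9c} \approx 4.9\ \text{years}
\qquad
t_{\text{ship}} = t_{\text{Earth}} \sqrt{1 - \tfrac{v^{2}}{c^{2}}} \approx 4.9 \times 0.44 \approx 2.1\ \text{years}
```

So the crew themselves would age only a couple of years, while everyone back home waited closer to five; the catch, as above, is that nothing we can build comes anywhere near that speed.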

And of course there is the possibility of robotic exploration of space, but that’s not the same, is it? Where’s the adventure in that? No robot can ever be a hero, not without a lot of misplaced anthropomorphism.

No, for all intents and purposes, our days of exploration are over. There are no more truly wild places left upon Mother Earth, and our chances of sallying forth into outer space, at least for the very indefinite future, are essentially nil. As William Gibson has pointed out, no one will speak of ‘the twenty-second century’ the way we used to of the twenty-first.

It’s a necessary, perhaps mythic shift in consciousness with consequences yet to be determined. Obviously it behooves us to take good care of the planet, given that it’s the only abode any of us will ever have. But it also suggests that we should better appreciate the miraculous coincidence of life on ‘the pale blue dot.’ Just as interstellar travel may never happen, so too we may never discover life elsewhere in the universe.

This is it folks. We’re staying home tonight, and likely forever. Fate will find us where we are.


Let the Machines Decide

The GPS device in my car knows the speed limit for the road I’m driving on, and displays that information for me on its screen. Nice. Nobody needs another speeding ticket. But what if my ‘smart car’ refused to go over that limit, even if I wanted it to? You know, the wife shouting from the backseat, about to give birth, the hospital four blocks away, that sort of thing.

David Hilowitz photo

It’s a scenario not far removed from reality. Google’s robotic car has inspired many futurists to imagine a computer that controls not only the speed of your car, but also where it goes, diverting your car away from congestion points toward alternate routes to your destination. Evgeny Morozov is among these futurists, and in a recent article in The Observer, he suggests that computers may soon be in a position to usurp many functions that we have traditionally assigned to government. “Algorithmic regulation,” he calls it. We can imagine government bureaucrats joining the unemployment line to fill out a form that will allow a computer to judge whether they are worthy of benefits or no.

Examples of machines making decisions previously assigned to humans are already easily found. If the ebook downloaded to my Kobo has a hold placed on it, the Vancouver Public Library’s computer will unceremoniously retrieve it from my e-reader upon its due date, regardless of whether I have just 10 more pages to read, and would be willing to pay the overdue fine in order to do so.

But Morozov’s cautionary critique is about a wider phenomenon, and it’s largely the ‘internet of things’ which is fuelling his concern. The internet of things is most pointedly about the process which will see digital chips migrate out of electronic devices and into things which we have until now tended to consider inanimate, non-electronic objects, things like your door, or your mattress. It may well be that in future a computer somewhere will be informed when you don’t spend the night at home.

Maybe you spent the night on a friend’s couch, after one too many. Maybe you ate some greasy fast food that night too. And maybe you haven’t worked out at your club’s gym for more than six months now. The data-gathering upshot of this at least arguably unhealthy behavior is that you may be considered higher risk by a life insurance company, and so proffered a higher premium.
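To make that mechanism concrete, here is a minimal, entirely hypothetical sketch of the sort of rule-based scoring an insurer might run over ‘internet of things’ data. Every field name, threshold and surcharge below is invented for illustration; no actual insurer’s model is being described.

```python
# Hypothetical sketch of "algorithmic regulation" applied to life-insurance pricing.
# All field names, thresholds and surcharges are invented for illustration only.

BASE_PREMIUM = 100.0  # monthly premium in dollars (illustrative)

def risk_multiplier(profile: dict) -> float:
    """Turn behavioural data reported by connected devices into a premium multiplier."""
    multiplier = 1.0
    if profile.get("nights_away_from_home_last_month", 0) > 4:
        multiplier += 0.05   # flagged: irregular nights away from home
    if profile.get("fast_food_purchases_last_month", 0) > 8:
        multiplier += 0.10   # flagged: diet
    if profile.get("days_since_last_gym_visit", 0) > 180:
        multiplier += 0.15   # flagged: sedentary
    return multiplier

profile = {
    "nights_away_from_home_last_month": 6,
    "fast_food_purchases_last_month": 12,
    "days_since_last_gym_visit": 200,
}
print(f"Quoted premium: ${BASE_PREMIUM * risk_multiplier(profile):.2f}")
# Quoted premium: $130.00
```

The point of the sketch is that no human being needs to look at any of it; the ‘decision’ is just arithmetic applied to whatever the devices happen to report.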

Presumably there is a human being at the end of this theoretical decision-making chain, but I think we’ve all learned that it’s never safe to assume that digital tech won’t take over any particular role, and whatever the imagined final decision taken as to your insurance risk, it will certainly be informed by data collection done by digital machines.

The most chilling note struck in Morozov’s piece comes, for me, when he quotes Tim O’Reilly, technology publisher and venture capitalist, referring to precisely this industry: “I think that insurance is going to be the native business model for the internet of things.”

Now isn’t that heartening. Corporate insurance as the business model of the near future.

The gist of what is alarming about the prospect of digital machines taking increasing control of our lives is that it suggests that the ‘depersonalization’ we have all been living through for the last three-plus decades is only the beginning. It’s “day one,” as Jeff Bezos likes to say about the digital revolution. It suggests that we can look forward to feeling like a true speck of dust in the infinite cosmic universe of corporate society, with absolutely no living being to talk to should we ever wish to take an unnecessary risk, diverge from the chosen route, or pay the fine instead.

For all the libertarian noise that folks from Silicon Valley make about efficiency and disruption, let no one be fooled: the slick algorithmic regulation that replaces decisions made by people, whether government bureaucrats or not, may be more objective, but it will not bring greater freedom.

Dark Matter

“The internet as we once knew it is officially dead.”
—Ronald Deibert, in Black Code

Although born of the military (see Origins, from the archives of this blog), in its infancy the internet was seen as a force for democracy, transparency and the empowerment of individual citizens. The whole open-source, ‘information wants to be free’ advocacy ethos emerged and was optimistically seen by many as heralding a new age of increased ‘bottom up’ power.

Mike Licht photo

And to a considerable extent this has proven to be the case. Political and economic authority has been undermined, greater public transparency has been achieved, and activist groups everywhere have found it easier to organize and exert influence. In more recent years, however, the dark, countervailing side of the internet has also become increasingly apparent, and all of us should be aware of its presence, and perhaps we should all be afraid.

Certainly Ronald Deibert’s 2013 book Black Code: Inside the Battle for Cyberspace should be required reading for anyone who still thinks the internet is a safe and free environment in which to privately gather information, exchange ideas, and find community. Deibert is Director of the Citizen Lab at the Munk School of Global Affairs, University of Toronto, and in that role he has had ample opportunity to peer into the frightening world of what he terms the “cyber-security industrial complex.” In an economy still operating under the shadow of the great recession, this complex is a growth industry that is estimated to now be worth as much as $150 billion annually.

It consists of firms like UK-based Gamma International, Endgame, headquartered in Atlanta, and Stockholm-based telecommunications giant Ericsson. What these companies offer are software products that will bypass nearly all existing anti-virus systems to:

  • Monitor and record your emails, chats and IP communications, including Skype, once thought to be the most secure form of online communication.
  • Extract files from your hard drive and send them to the owners of the product, without you ever knowing it’s happened.
  • Activate the microphone or camera in your computer for surveillance of the room your computer sits in.
  • Pinpoint the geographic location of your wireless device.

These products can do all this and more, and they can do it in real time. Other software packages offered for sale by these companies will monitor social media networks, on a massive scale. As reported by the London Review of Books, one such company, ThorpeGlen, recently mined a week’s worth of call data from 50 million mobile-phone users in Indonesia. They did this as a kind of sales demo of their services.

The clients for these companies include, not surprisingly, oppressive regimes in countries like China, Iran and Egypt. And to offer some sense of why this market is so lucrative, The Wall Street Journal reported that a security hacking package was offered for sale in Egypt by Gamma for $559,279 US. Apparently the system also comes with a training staff of four.

Some of these services would be illegal if employed within Canada, but, for instance, if you are an Iranian émigré living in Canada who is active in opposition to the current Iranian regime, this legal restriction is of very little comfort. Those people interested in whom you’re corresponding with do not reside in Canada.

And even in countries like the US and Canada, as Edward Snowden has shown us, the national security agencies are not to be trusted to steer clear of our personal affairs. As Michael Hayden, former Director of the CIA, told documentary filmmaker Alex Gibney, “We steal secrets,” and none of us should be naïve enough to believe that the CIA, if they should have even the remotest interest, won’t steal our personal secrets.

All of us have to get over our collective fear of terrorist attacks and push back on the invasion of our privacy currently underway on the web. The justification for this invasion simply isn’t there. You are about as likely to die in a terrorist attack as you are as the result of a piano falling on your head.

Neither should any of us assume that, as we have ‘done nothing wrong,’ we need not be concerned with the vulnerability to surveillance that exists for all the information about us stored online. Twenty years ago, if we had thought that any agency, government or private, was looking to secretly tap our phone line, we would have been outraged, and then demanded an end to it. That sort of intervention took a search warrant, justified in court. It should be no different on the web.

An Education?

The conference was titled, “The Next New World.” It took place last month in San Francisco, and was hosted by Thomas Friedman, columnist for The New York Times and author of The World Is Flat. Friedman has been writing about the digital revolution for years now, and his thinking on the matter is wide-ranging and incisive.

In his keynote address, Friedman describes “an inflection” that occurred coincident with the great recession of 2008—the technical transformation that began with the personal computer, continued with the internet, and is ongoing with smart phones and the cloud. Friedman is not the first to note that this transformation is the equivalent of what began in 1450 with the invention of the printing press, the so-called Gutenberg revolution. The difference is that the Gutenberg revolution took 200 years to sweep through society. The digital revolution has taken two decades.

Friedman and his co-speakers at the conference are right in articulating that today’s revolution has meant that there is a new social contract extant, one based not upon high wages for middle skills (think auto manufacturing or bookkeeping), but upon high wages for high skills (think data analysis or mobile programming). Everything from driving cars to teaching children to milking cows has been overtaken by digital technology in the last 20 years, and so the average employee is now faced with a workplace where wages and benefits don’t flow from a commitment to steady long-term work, but where constant innovation is required for jobs that last an average of 4.6 years. As Friedman adds—tellingly I think—in today’s next new world, “no one cares what you know.” They care only about what you can do.

Friedman adds in his address that the real focus of the discussions at the conference can be distilled into two questions: “How [in this new world] does my kid get a job?” and, “How does our local school or university need to adapt?”

All well and good. Everyone has to eat, never mind grow a career or pay a mortgage. What bothers me however, in all these worthwhile discussions, is the underlying assumption that the education happening at schools and universities should essentially equate to job training. I’ve checked the Oxford; nowhere does that esteemed dictionary define education as training for a job. The closest it comes is to say that education can be training “in a particular subject,” not a skill.

I would contend that what a young person knows, as opposed to what they can do, should matter to an employer. What’s more, I think it should matter to all of us. Here’s a definitional point for education from the Oxford that I was delighted to see: “an enlightening experience.”

A better world requires a better educated populace, especially women. For the human race to progress (perhaps survive), more people need to understand the lessons of history. More people have to know how to think rationally, act responsibly, and honour compassion, courage and commitment. None of that necessarily comes with job training for a data analyst or mobile programmer.

And maybe, if the range of jobs available out there is narrowing to ever more specific, high technical-skills work, applicable to an ever more narrow set of industries, then that set of industries should be taking on a greater role in instituting the needed training regimes. Maybe as an addendum to what can be more properly termed ‘an education.’

I’m sure that Friedman and his conference colleagues would not disagree with the value of an education that stresses knowledge, not skills. And yes, universities have become too elitist and expensive everywhere, especially in America. But my daughter attends Quest University in Squamish, British Columbia, where, in addition to studying mathematics and biology, she is obliged to take courses in Rhetoric, Democracy and Justice, and Global Perspectives.

Not exactly the stuff that is likely to land her a job in Silicon Valley, you might say, and I would have to reluctantly agree. But then I would argue that it should better qualify her for that job. Certainly those courses will make her a better citizen, something the world is in dire need of, but a degree in “Liberal Arts and Sciences” does in fact better qualify her for that job, because those courses will teach her how to better formulate an argument, better understand the empowerment (and therefore the greater job satisfaction) that comes with the democratic process, and better appreciate the global implications of practically all we do workwise these days.

Damn tootin’ that education in liberal arts and sciences better qualifies her for that job in Silicon Valley. That and every other job out there.

The Age of Surveillance

“Today’s world would have disturbed and astonished George Orwell.”
—David Lyon, Director, Surveillance Studies Centre, Queen’s University

When Orwell wrote 1984, he imagined a world where pervasive surveillance was visual, achieved by camera. Today’s surveillance is of course much more about gathering information, but it is every bit as all-encompassing as that depicted by Orwell in his dystopian novel. Whereas individual monitoring in 1984 was at the behest of a superstate personified as ‘Big Brother,’ today’s omnipresent watching comes via an unholy alliance of business and the state.

Most of it occurs when we are online. In 2011, Max Schrems, an Austrian studying law in Silicon Valley, asked Facebook to send him all the data the company had collected on him. (Facebook was by no means keen to meet his request; as a European, Schrems was able to take advantage of the fact that Facebook’s European headquarters are in Dublin, and Ireland has far stricter privacy laws than we have on this side of the Atlantic.) He was shocked to receive a CD containing more than 1,200 pages of data. The information tracked every login, chat message, ‘poke’ and post Schrems had ever made on Facebook, including those he had deleted. Additionally, a map showed the precise locations of all the photos tagging Schrems that a friend had posted from her iPhone while they were on vacation together.

Facebook accumulates this dossier of information in order to sell your digital persona to advertisers, as do Google, Skype, YouTube, Yahoo! and just about every other major corporate entity operating online. If ever there was a time when we wondered how and if the web would become monetized, we now know the answer. The web is an advertising medium, just as are television and radio; it’s just that the advertising is ‘targeted’ at you via a comprehensive individual profile that these companies have collected and happily offered to their advertising clients, in exchange for their money.

How did our governments become involved? Well, the 9/11 terrorist attacks kicked off their participation most definitively. Those horrific events provided the rationale for governments everywhere to begin monitoring online communication, and to pass laws making it legal wherever necessary. And now it seems they routinely ask the Googles and Facebooks of the world to hand over the information they’re interested in, and the Googles and Facebooks comply, without ever telling us they have. In one infamous incident, Yahoo! complied with a Chinese government request to provide information on two dissidents, Wang Xiaoning and Shi Tao, and this complicity led directly to the imprisonment of both men. Sprint has now actually automated a system to handle requests from government agencies for information, one that charges a fee of course!

It’s all quite incredible, and we consent to it every time we toggle that “I agree” box under the “terms and conditions” of privacy policies we will never read. The terms of service you agree to on Skype, for instance, allow Skype to change those terms any time they wish, without notifying you or asking your permission.

And here’s the real rub on today’s ‘culture of surveillance’: we have no choice in the matter. Use of the internet is, for almost all of us, no longer a matter of socializing, or of seeking entertainment; it is where we work, where we carry out the myriad tasks necessary to maintain the functioning of our daily life. The choice to not create an online profile that can then be sold by the corporations which happen to own the sites we operate within is about as realistic as is the choice to never leave home. Because here’s the other truly disturbing thing about surveillance in the coming days: it’s not going to remain within the digital domain.

Coming to a tree near you?
BlackyShimSham photo

In May of this year Canadian Federal authorities used facial recognition software to bust a phony passport scheme being operated out of Quebec and BC by organized crime figures. It seems Passport Canada has been using the software since 2009, but it’s only become truly effective in the last few years. It’s not at all difficult to imagine that further advances in this software will soon have security cameras everywhere able to recognize you wherever you go. Already such cameras can read your car’s license plate number as you speed over a bridge, enabling the toll to be sent to your residence, for payment at your convenience. Thousands of these cameras continue to be installed in urban, suburban and yes, even rural areas every year.

Soon enough, evading surveillance will be nearly impossible, whether you’re online or walking in the woods. Big Brother meets Big Data.