Tag Archives: digital revolution

The Last Post

In all likelihood, this is the last post on this site. The blog has run for precisely three years, this post aside, and was, in good part, a deliberate but modest exercise for me. During its first year (2013), I set myself the task of writing a post every week, and then did that, before tailing off to a more intermittent schedule. As I wrote in an earlier article, I have blogged as a creative outlet, for myself, because I actually enjoy the art and craft of writing, especially when I can do so on my own schedule.

Photo: Thomas Hawk

Maybe the writing and posting is little more than the piteous human impulse to leave something behind, after we’re so soon gone: a small stack of notes, initials carved in the trunk of a tree, ‘handprints on the wall of the digital cave.’

My approach has of course meant that the size of the audience for this blog has been limited, to say the least, but I’m not too fussed about that. Its final value for me has lain elsewhere.

Maybe one day it will be appreciated as one small record kept during times which were changing as quickly as they have ever changed for humankind. The disruption of the digital revolution was in high gear back in 2012-13, and it seems to me that it has slowed some in more recent years. Robotic cars are coming on rather more slowly than did smart phones.

These days, it feels more like we are living in a time of reckoning with that technical, social, economic disruption, a time when many people are looking for someone to blame for the more difficult circumstances they suddenly find themselves living in. And, sadly, there are always politicians willing to step up and seize the opportunity afforded by those searchings, politicians like Donald Trump and Marine Le Pen. Clearly there is a price to be paid when change is not so well managed by those with control of the social and economic levers. If we don’t get effective progressive change then we get reactionary change, and reactionary change is doomed to fail, at least in the long run.

The most impactful change has of course been economic, the result of globalization in a capitalist society which makes having more, as opposed to less, money ever so much more convenient and status boosting. Median incomes have stalled in the West, given global competition; jobs have disappeared, the kinds of jobs available have changed, and it is so much easier to blame immigration—the visible signs of change—than it is to blame, say, automation, which has been so much more likely to have undermined your economic well-being.

What does it mean for the future? It’s always hard to say. Events are by their very nature unpredictable, and unforeseen events can quickly sway historical outcomes in big ways. As the human species continues to overrun the planet, we are going to have to wrestle certain problems—overpopulation, ecological damage (especially climate change), economic inequality—to the ground, or we are in for a rough ride.

Can we do so? It’s certainly possible. All it takes is the right choices, collectively and individually, made by the right people at the right times. Simple, right?

No, not so simple. But then, what we can do individually is restricted, so that makes it a little easier. Educate yourself, sit back, see the bigger picture, make choices for the greater good, rationally rather than out of frustration, resentment, anger or any of those other emotions which we then look to rationalize. Try to be aware of when you are acting selfishly, blaming others, speaking only to those from your own ‘tribe’, however that may be defined, whether by class, race, religion or nationality. Like it or not, we are all in this together. That colony on Mars will never be our salvation.

Maybe, just maybe, this blog has helped someone other than me to do these things, to maintain a wider perspective, clarify, stay calm and choose wisely. If so, bonus. Great. If not, that’s okay too. It’s helped me.

The Leisure Revolution

Yuval Noah Harari, in his 2014 book Sapiens: A Brief History of Humankind, has an interesting take on the ‘agricultural revolution’; you know, where, way back, we learned to plant crops and domesticate animals. He calls it “history’s biggest fraud.” Not in the sense that it didn’t happen. It did, leading to an increase in food supply and, consequent to that, the growth of cities. His contention is that it did not lead to a better life for humankind, neither a healthier diet nor more leisure time.

Instead it led to a less stimulating life with the increased likelihood of starvation and disease. The starvation came about as the result of periodic natural disasters, like drought, devastating the crops we came to depend upon, and the disease came about because urban conditions are much better for spreading illness than are rural ones. As to leisure time, Harari asserts that our hunter-gatherer ancestors ambled about a wide, natural territory, often able to harvest a diverse and abundant food supply, and to do so in fewer hours than it took the average farmer to feed his family a few centuries later.

Rutger Bregman, in his 2016 book, Utopia for Realists: The Case for a Universal Basic Income, Open Borders, and a 15-hour Workweek, makes a similar argument about the industrial revolution. It did not lead to a more leisurely life. Bregman estimates that in 1300 the average English farmer worked about 1500 hours a year in order to make a living, whereas the typical factory worker in Victorian England worked twice that many hours just to survive. The 70-hour workweek was the norm in a city like Manchester in 1850, no weekends, no vacations.

Eventually things improved. By about 1940, in the West, the 40-hour, five-day workweek was fairly standard, a change led, surprisingly enough, by none other than Henry Ford.

And then, things truly began to look up. In 1956 Richard Nixon promised Americans a four-day workweek, and by 1965, prognosticators everywhere were predicting vast increases in leisure time, with workweeks of as little as 14 hours. There was considerable hand-wringing about the coming, perplexing problem of boredom, idle hands given to inflamed immorality and violence.

It all came to a grim, grinding halt in the 1980s. Today the average guy in the U.S. works 8.4 hours per work day, or 42 hours per week. That’s very little changed in the last 50 years.

The digital revolution has brought us an accelerated life, new, not always better forms of community, grotesque economic inequality, and, unlike the industrial revolution, persistent unemployment. (Many people, like weavers, were put out of work by the industrial revolution, but then it went on to deliver a slew of different types of employment, like shop foremen.) And so far, for those people still working, it hasn’t done much for additional leisure time.

The other factor in why many of us are busier these days is what Bregman cites as “the most important development of the last decades.” The feminist revolution. While in some countries at least the workload for individuals has decreased slightly, families these days are living a blur, because these days women are contributing about 40% of the family income, and working full time to do so.

It seems that, with the coming of the digital revolution, we’ve gone and done it to ourselves again. And here’s a disconcerting note: surveys show that many people today would rather earn more than work less—so that they can live the lifestyle they’ve always dreamed of. They’d rather have that bigger house, newer car, more fashionable outfit, and dream vacation than they would more leisure time. We might call this the consumer revolution, and it’s largely a post-WWII phenomenon.

What’s to be done? Well, it’s not in fact that mysterious. Economic answers come with things like a guaranteed annual income and a progressive tax regime that effectively redistributes wealth. And there is very solid evidence as to the validity of these economic remedies, much of it to be found in Bregman’s book.

But just as relevant to the modern leisure deficit is the fact that, as indicated above, we chose these outcomes. Not always consciously, and often incrementally, without realizing the ensuing consequences, but nevertheless we had and still have choice in these matters.

We can choose to live more simply, with less stuff. We can choose to buy a smaller home, drive an older car, purchase clothing at a second-hand store, and grow a garden.

Don’t like your job? Feeling like a wage slave? Have other interests you’d love to pursue?

It’s a choice.

Closing the Digital Lid

I began teaching a new course last week, as so many other teachers everywhere did, and, as is my wont, I asked my students for ‘lids down’ on the laptops which inevitably appear on their desks as they first arrive and sit down. The rationale of course is that their computers are open in order for them to “take notes,” but we can all be rightly skeptical of that practice. The online distractions are simply too many and varied for that to be consistently true, given the perfect visual block that the flipped-up lids present to us instructors stranded on the back side of that web portal.

It’s interesting to note that recent research indicates that students who take notes longhand, as compared to on their laptops, fare better in recalling the substance of the course material than do their keyboarding counterparts. And the longhanders score better not only in factual recall; conceptually they also respond more accurately and substantively to after-class questions, avoiding what the researchers refer to as the keyboarders’ “shallower processing.”

It’s a contentious issue among educators of course. Some suggest that we instructors should ‘embrace’ the digital realm in our classrooms, allowing students to tweet as we speak, ask questions anonymously, fact check, all that. A richer, more vibrant educational environment is the result, say these internet enthusiasts.

It depends upon class size, and certainly I wouldn’t object to laptops or handhelds open and operating during any kind of educational ‘field trip,’ but I came to the lids down position long before I heard about the recent research I’ve just mentioned, and I did so out of what may be seen as an old-school notion: common courtesy.

My classes are small—as writing classes they need to be—and I am always looking for what I refer to as ‘engagement in the process.’ Regardless of the quality of the writing produced, I’m looking for students to listen carefully at all times, to me as well as to their fellow students, to think, process, and respond with ideas that may or may not be helpful to the group process. That just isn’t happening, or at least not as well as it could be happening, if students are in two places at once. Except of course they are not in two places at once; their attention is simply bouncing rapidly back and forth between those two places. What we describe as multitasking.

In that sense I’m looking for more than just common courtesy, but respectful attention is nevertheless at the heart of what I’m asking for in a classroom. Anything less is simply rude.

We’re all familiar with moments like this:

babycakes romero photo

Where the so-called ‘digital divide’ has nothing to do with separate generations or genders; it’s the sad loss of a potential conversation, and I very much consider my classroom process a group conversation.

Or how about this image, taken from the CNN election night coverage:

CNN laptops

This is more precisely what I’m on about. These folks are gathered as pundits to discuss and enlighten the audience on the events of the evening, and clearly, as part of that endeavor, they can be expected to listen to one another, with their varied insights and political leanings, and we in the audience can be expected to profit by that exchange. But, with lids up, we may be sure that each pundit is periodically checking the screen while their fellow analyst is speaking. Why? I’m assuming it’s because they wish to check in on the very latest election data as it flows in. But this is CNN headquarters, where the data flowing all around them couldn’t be more up-to-the-minute!

If you’re going to engage in a conversation with someone, group or otherwise, then do that, engage: listen carefully and respond thoughtfully. Not with just your own talking points, but with a reasoned response to what has just been said by your conversational partner.

Online addiction continues to engulf us. My own personal survey indicates that more than half of those of us walking outside are either staring into the virtual void or at least carrying the tool which connects us to that space. At a bus stop or in the subway car the great majority of us are guilty. And so it becomes increasingly difficult for us to unplug when we find ourselves a member of a group meant to communicate face to face.

When it comes to conversation and common courtesy, I guess it’s like what an old professor once said to me about common sense: ‘Not so common.’

Discrepancies

Pete Muller photo

This photograph was published in the July 2015 issue of National Geographic magazine. It was taken in a village in southern Guinea, during the recent Ebola outbreak which had its epicentre in that part of Africa.

The young girl sitting on the blanket looks distinctly uneasy. Before her, the caption tells us, a traditional “healer” is preparing to exorcise the “malign spirits” which may have caused the girl to contract the Ebola virus. We see the healer’s face encrusted in white; a bit of green vegetation is wrapped around one wrist; he carries a kind of sceptre, a decorated stick.

What’s most remarkable about the photo is to be seen in the background, among the small group of villagers who have gathered to watch the exorcism—two young men hold up their phones, videoing the process.

The elements of the discrepancy seem almost too much to set side by side, and yet, there they are. A rankly superstitious practice which tracks right back to a mention in the Dead Sea Scrolls (i.e. before Christianity), smack up against the latest in 21st century communications technology. How is this possible?

The fact is today’s world is rife with such discrepancies; it’s only that they’re usually further removed from one another. In whole villages in rural Afghanistan not one person may be able to read and write. In the Helsinki metropolitan area, with a population of almost one and a half million, you will be hard pressed to find anyone over the age of 15 who cannot read and write. (The literacy rate in Afghanistan is 28%, among females less than 13%; in Finland the rate is 100%.)

Carlos Slim, the Mexican business mogul, has a net worth of more than $77 billion U.S. The average hotel receptionist in Mexico brings home $4260 U.S. in pay over the course of one year.

In California, it is illegal for mental health providers to engage in “reparative therapy” for LGBT minors. In Uganda, you may be sent to jail for up to 14 years for failing to report a suspected homosexual.

More than half of new lawyers in Canada are women. In Saudi Arabia, women cannot drive a car, vote, or leave home except in the company of a male chaperone.

In all these cases, the divergence is just too great. And no one, anywhere, should attempt to justify these differences via the notion of ‘culture.’ They remain in place because it is to the advantage of the privileged group that they do so.

Does digital technology close these gaps, or drive them ever wider? The answer is complex. Certainly those phones held up by the two young men in Guinea offer them opportunities for information-gathering and commerce that are unprecedented historically, potentially meaning that their lives are ‘lifted’ economically, educationally, socially. But at the same time, the very persistence of superstition, illiteracy, and poverty means that, if those two young men rise up, the gap between them and those next to them who believe in the power of exorcism will grow.

The rising tide of digital technology most assuredly does not lift all boats, any more than the growing wealth of the economic elite trickles down, in any effective way, to those living at the bottom of the financial hill. Any time the separation between two sets of people grows too great, whether it be the Mayan priests ruling over Palenque in the 7th century, or Marie Antoinette and her husband ruling over France during the final years of the 18th century, it does not bode well for us.

In today’s global village, the discrepancies which exist internationally present problems on a scale not seen before, and I mean that quite literally, because we are more aware of these problems than we have ever been before. We no longer have to wait for an emissary to return to court, after a year-long mission, to know about the conditions of a far-off land and its people. But, at the same time, today’s problems are of a distressingly familiar order.

Those at the peak of today’s societal pyramid are doing just fine, thanks. What’s called for are measures to assure that the pyramid does not get any higher, that it in fact flattens, delivering greater equality of rights, education, health care, and economic opportunity to all people everywhere.

I’m sounding frighteningly socialistic to some I know, but the lessons of history are there for all of us to observe, and we ignore them at our peril. It is in our own interests to help those being left out or behind, wherever they live, because the discrepancies of today’s world are a threat to us all.

The Cowboy Rides Away

To say that the cowboy is iconic in North American culture is hardly sufficient. Mythic hero is more accurate, but it’s important to remember that the cowboy was real, not supernatural like Hercules or Spiderman. The reality was that, for a brief period, essentially from 1860 to 1900, there were a great number of horses and cattle running free in the American frontier, most of them having been abandoned by retreating Mexicans. With the arrival of the railroad following the Civil War, the ‘roundup’ and sale of these cattle became possible, leading to the beef industry that employed a great many ‘cowboys.’ The cattle were herded to railheads of course, but not too quickly, because if you did that the cattle lost weight, and they were sold for slaughter by the pound.

Thus the cowboy’s life was one of outdoors ambling on horseback, as part of a collaborative team of men who camped early for the night, gathered around fires to share a meal, tell stories, and maybe even sing songs. It’s a lifestyle with easily apparent appeal, although here’s what the reclusive American writer Trevanian had to say about the broader charm of the cowboy:

“It is revealing of the American culture that its prototypic hero is the cowboy: an uneducated, boorish, Victorian migrant agricultural worker.” 

The Great Train Robbery
The original black hat.

When the American film industry moved to California in the early part of the 20th century, there were by then plenty of unemployed cowboys knocking about, men who could ride, rope and sometimes shoot with the best of them—just one more coincidental reason why the western movie became incredibly popular. And it is truly difficult to overestimate the popularity and therefore the influence of the western movie. Arguably the first dramatic movie was a western—The Great Train Robbery in 1903—and the genre was dominant right through until the 70s, when it died with nevertheless accomplished films like The Wild Bunch and McCabe and Mrs. Miller.

I’ve argued elsewhere that the western movie was so successful, over such a long period of time (still longer than any other genre), that it created a ‘conventional form’ along with a set of audience expectations that, long after expiration of the genre itself, offers moviemakers who can reinvent the form within a new context (i.e. The Matrix or Drive) an unparalleled opportunity to go boffo at the box office.

The influence of cowboy culture in popular music is scarcely less significant. Cole Porter knocked it right out of the park in 1934 with a sublime rhyme scheme in the cowpoke paean Don’t Fence Me In:

I want to ride to the ridge where the West commences

And gaze at the moon till I lose my senses.

I can’t look at hobbles and I can’t stand fences.

The song has been covered by everyone from Ella Fitzgerald to The Killers. And almost 40 years later, James Taylor waxed nearly as lyrical (rhyming “Boston” with “frostin”) in maybe his best song, Sweet Baby James:

There is a young cowboy; he lives on the range.

His horse and his cattle are his only companions.

He works in the saddle and he sleeps in the canyons…

More than anything else, the cowboy represents freedom, a largely solitary life free of long-term obligations, tight schedules or immediate bosses. Too often in the westerns the cowboy’s love interest represented civilization, settling down and responsibility, and so too often, at the end of the story, the cowboy rode away from the girl, off into the sunset to resume a life of independent rambling (although it’s worth noting that in a couple of the defining westerns, High Noon and Stagecoach, the hero did choose the girl, and they rode off together in a buckboard).

It’s no surprise that the cowboy’s allure arose alongside the maturing of the industrial revolution, when incomes were rising but often as the result of work fettered to a factory system of mechanical drudgery. Are we any more free in the age of the digital revolution, with its increased pace and unrelenting connectivity? Well, not so’s you’d notice.

In the digital age, the cowboy hero seems a complete anachronism, more irrelevant than ever, but I think it’s worth remembering that, although the cowboy almost always resorted to a gun to resolve his conflicts with the bad guys—and the impact of that implicit message upon American society can hardly be overestimated either (see Guns)—he did so reluctantly, in defence of the little guy being oppressed by powerful villains, who were often corporate-types.

Today the cowboy is gone for good from our cultural landscape, and I’m not suggesting he should be brought back. But in our world of ever more powerful corporate interests, we could all use some of his individual pluck. The economic wheels of our day are rolling along just fine; the ecological and moral ones, not so much. Sadly, too much of the cowboy’s good is gone with him.

Full Circle

There’s some interesting reading to be found in a paper released by the Canadian Media Production Association last week. It’s titled, Content Everywhere: Securing Canada’s Place in the Digital Future, and it offers up an effective survey of the current media landscape. At first glance, suffice it to say that recent trends continue:

* Video progressively rules on the internet—YouTube now has more than one billion unique viewers every month, with 100 hours of video uploaded every minute.

* ‘Cord cutting,’ that is, escaping the tyranny of cable ‘bundling,’ continues for consumers—an American who owns an iPad now has a 65% likelihood of being a member of the cord cutter tribe.

* As the market penetration of the so-called OTTs (‘Over The Top’ online streamers like Netflix, Amazon and Hulu) continues to grow—one of the OTTs now reaches almost half of all American households; over 60% of the 18–24 demographic—they are moving increasingly into the financing of original content.

The ‘old boys’, the established television networks, know all about these trends of course, and so they have, in recent years, moved actively, if still hesitantly, into the digital realm. In Canada, Bell Media launched Crave TV in 2014, Rogers and Shaw finally birthed Shomi, and CBC now has an online comedy channel called Punchline. (Conventional TV’s great strength of course remains the provision of live events, mostly sports, but also news, and the odd award show, although it’s interesting to note that ratings for the Oscars this year were down about 15%.)

Ben Templesmith photo

Overall, the evolving picture is of the online media industry maturing, in all the good and bad that that entails. Perhaps most disconcerting is a subtitle within the paper which reads: “Many things about OTT look like TV.” AOL greenlit 16 original series in 2014, all of them featuring major celebrities or movie stars. Pitch meetings with the big-league OTTs are usually booked through agents or entertainment lawyers these days. And we can all be sure that when David Fincher, after House of Cards, pitches his new series, he’ll be strolling into the Netflix offices past a long line of waiting, lesser-known producers who once hoped that the web would provide them with new and different opportunities. Sigh.

And of course, as the paper points out, creators for the web face a unique set of additional challenges, even as the process morphs into something distressingly familiar. Chief among them are ‘discoverability’ and an overcrowded marketplace. The gatekeepers for the online game may no longer be the same, but the smaller players still face a huge disadvantage when it comes to putting bums in the seats. They simply don’t have the resources to compete with the big guys at marketing, or at hiring the talent which comes with a built-in audience.

And finally, if you’re a Canadian hoping to succeed with online content, you face an added problem with financing, because as slow as the big broadcasters have been to move into the online space, the established ‘legacy’ funders, like Telefilm Canada and the tax credit programs, have been even more lead-footed. Because online revenues have been so difficult to realize, these agencies have been extra adept at shuffling their feet and avoiding eye contact whenever, for instance, documentary filmmakers with an online-only audience in mind have come calling.

I’m reminded of the final scenes in George Orwell’s classic Animal Farm, when the pigs move into the farmhouse, begin to walk upright and wear clothes. Or of Daron Acemoglu and James Robinson’s incisive explanation of Why Nations Fail, describing how it is that, following revolutions, tyrants like Robert Mugabe replace tyrants like Ian Smith, how Joseph Stalin replaces Tsar Nicholas II. The digital revolution may not have yet completed itself, not yet come right round in what Acemoglu and Robinson term “the vicious circle,” but the streets have gone quiet again. It may be that no one has been sent off to a “knacker” or to the gulag, but if you were among those who dreamed of a better world, or maybe even who manned an online barricade, well, purchase a ticket and get in line. It seems that all along, the digital revolution was for sale, to the highest bidder.

The Role of Government

It’s the statistic that got everyone’s attention. A recently released study by Oxfam, the international agency dedicated to combatting poverty and injustice, warns that the richest 1% of the planet’s citizens will soon possess more than the remaining 99%.

The nation’s representatives?
Michael Riffle photo

In an interesting related factoid, The Upshot (a ‘data-driven’ undertaking from The New York Times) reports that the richest 1% of Americans, on average and after excluding capital gains, have seen their incomes increase by $97,000 since 2009; the 99% have seen their average income fall by $100 in that time.

In Canada the situation is less dire, but the trend is in the same direction. In the 1980s, as reported by the Broadbent Institute, the top 1% of Canadians received 8% of all national income; that figure has now risen to 14%.

In that same article in The Upshot, writer Justin Wolfers, professor of economics at the University of Michigan, wonders why it is that “robust employment growth over recent years” has not generated more broadly based income growth in America.

Well, surely part of the answer has to be the structural changes wrought in the economy by the digital revolution. The London taxi drivers currently protesting the arrival of the Uber app are just the latest in a now long line of workers who have found themselves displaced by hi-tech changes in their industry. And those workers, once displaced, rarely find themselves able to land alternate employment at higher wages. As has been pointed out by authors like Erik Brynjolfsson and Andrew McAfee, the people not being displaced by computers—once we get past the coders themselves—tend to be folks like waiters, gardeners and daycare workers; not exactly the sorts pulling down the big bucks.

And the other major factor of course has to be the whole trickle-down, anti-regulatory economic wave that began to swell back in the days of Reagan/Thatcher, and which continues to roll over us today. The financial crash of 2008 is the most obvious example of what economic deregulation can mean to all of us, but, more generally, as times have toughened in the Western economies (that is as we have seen the onset of globalization), people have tended to increasingly resent the hand of government in their pockets. Neo-cons have encouraged this attitude at every turn, and so the back doors have been increasingly left open, allowing the rich to sneak into the kitchen, then scoop up ever larger portions of the economic pie.

The single greatest triumph of the Republican Party in America has been their ability to convince a great many white, working-class Americans that the Party has their backs, when very few propositions could be further from the truth.

We have seen, in recent decades, a steadily growing anti-government sentiment provide steadily growing opportunity for the rich to get ever richer. And let’s be very clear about one thing. The growing bank accounts of the mega-rich are not the best means for growing the economy, for easily apparent reasons. Those guys simply don’t have to spend their money the way we poorer people do, just to stay ahead of the monthly bills. Here’s a TD Bank study that makes this point.

Now no one should rightly go about saying more government is the answer to all our socio-economic woes. Anybody who has ever dealt with a government office in a time of acute need knows that these bureaucracies can be inefficient, self-serving and sometimes obnoxious, even vindictive. But greater government management of the current economy? Well, how much more evident could that need be?

Robert Reich’s formula for government intervention.


It comes down to some fairly old-fashioned ideas like a guaranteed annual income, higher minimum wages, and a more progressive income tax regime. Scary stuff for a whole lot of people. But if you’re one of them, if you’re one of those people who finds the idea of more government anathema, an outrageous infringement upon your economic freedom, you should recognize that if your opinion prevails, then what you see now is what you will see later.

Only worse, if that can be imagined.

Just Like Yesterday

Meet the new boss.

  Same as the old boss.

from Won’t Get Fooled Again, by Pete Townshend

The guardians of the old media have found a brilliant way to exploit the denizens of the new. By dangling the carrot of access to television—a mature industry where recognition and revenue remain solidly in place—the executives who stand at the gates to TV can cause the multitudes who populate the online realm—an industry where revenue is dispersed very unevenly and recognition is highly fragmented—to work tirelessly to promote their exclusive brands. It’s perfect.

In recent times, those clever folks who control TV have seized upon the online competition as a way to shamelessly advance their corporate brands. Offer those who create content for the web—especially of course those who operate within the social media arena—the chance to create for TV, and those creators will toil doggedly, nearly interminably, on your behalf, and they will do so without a cent of actual remuneration, and on the slimmest of chances at success. How great is that?

The Canadian Broadcasting Corporation has just concluded a nation-wide contest in which 285 comedy creator-teams submitted video ‘teasers’ in pursuit of a single, albeit lucrative, prize: $500,000 toward the production of a half-hour TV special. The competition ran over ten weeks, and the teaser was only the beginning of the work demanded of these aspirants. Each week, in addition to the endless online ‘sharing’ these teams were obliged to do—if they were to have any realistic expectation of prevailing in the contest—they had to produce a new video ‘mission’ on a specified theme (‘The Do Over,’ ‘The Differentiator,’ etc.).

Likewise Telus, a corporation with a more regional territory (Alberta and BC), has just run the ‘Storyhive’ competition, in which hundreds of applicants chased 15 grants of $10,000, leading finally to one winner gaining $50,000 toward the production of content for Optik TV, the television service owned by Telus.

It’s a truly prodigious amount of work done by talented people on behalf of others for absolutely no monetary recompense. The competitions are won, of course, via online voting solicited by the contestants, and don’t think it’s anything like a democratic, one-email-address, one-vote mechanism. No, visitors to the relevant site (where you must of course register) ‘earn’ votes by repeated visits or, more germanely, by online promotion of the corporate site. For CBC and Telus it’s win, win, win; for 99%+ of the contestants it’s lose, lose, lose. And, if it’s necessary to drive home the point of this losing game: in the Telus competition, in winnowing the pitched projects down to the final 15, not one iota of critical adjudication is applied; the cut is determined entirely by online voting. In other words, at least until that first significant selective step, Telus does not care one whit about the actual creative quality of the submissions; it cares only about the quantity of online traffic it is able to generate.

Let me be very clear about my take on this process. It’s manipulative, exploitive, and vile. The folks behind it should be ashamed of themselves.

Tau Zero photo

But, as with so many of the changes wrought by the digital revolution, this obnoxious game is not about to go away. The television executives who invented it have mined gold for themselves, and they couldn’t care less that almost all of the losing contestants have nothing good to say about them or their contest. Those losers are simply collateral damage in the war for online traffic, and thus advertising dollars.

It’s odd and slightly unsettling that (as described in a recent article in The New Yorker) KingBach, a top star on Vine, an online video platform where each clip lasts a grand total of six seconds, dreams of making it on TV and in the movies, where fewer people will watch him.

Welcome to the new world of mass media, which looks altogether too much like the same old world. The ‘young adult’ demographic still watches far more TV than they do online video. YouTube will make less than $4 billion in advertising this year; CBS will earn more than $8 billion.

Pete Townshend’s prayers may well have been in vain.

Let the Machines Decide

The GPS device in my car knows the speed limit for the road I’m driving on, and displays that information for me on its screen. Nice. Nobody needs another speeding ticket. But what if my ‘smart car’ refused to go over that limit, even if I wanted it to? You know, the wife shouting from the backseat, about to give birth, the hospital four blocks away, that sort of thing.

David Hilowitz photo

It’s a scenario not far removed from reality. Google’s robotic car has inspired many futurists to imagine a computer that controls not only the speed of your car but also where it goes, diverting you away from congestion points toward alternate routes to your destination. Evgeny Morozov is among these futurists, and in a recent article in The Observer, he suggests that computers may soon be in a position to usurp many functions that we have traditionally assigned to government. “Algorithmic regulation,” he calls it. We can imagine government bureaucrats joining the unemployment line, there to fill out a form that will allow a computer to judge whether or not they are worthy of benefits.

Examples of machines making decisions previously assigned to humans are already easily found. If the ebook downloaded to my Kobo has a hold placed on it, the Vancouver Public Library’s computer will unceremoniously retrieve it from my e-reader upon its due date, regardless of whether I have just 10 more pages to read, and would be willing to pay the overdue fine in order to do so.

But Morozov’s cautionary critique is about a wider phenomenon, and it’s largely the ‘internet of things’ which is fuelling his concern. The internet of things is most pointedly about the process that will see digital chips migrate out of electronic devices and into those things which we have until now tended to consider inanimate, non-electronic objects, things like your door, or your mattress. It may well be that in future a computer somewhere will be informed when you don’t spend the night at home.

Maybe you spent the night on a friend’s couch, after one too many. Maybe you ate some greasy fast food that night too. And maybe you haven’t worked out at your club’s gym for more than six months now. The data-gathering upshot of this at least arguably unhealthy behavior is that a life insurance company may consider you a higher risk, and so proffer you a higher premium.

Presumably there is a human being at the end of this theoretical decision-making chain, but I think we’ve all learned that it’s never safe to assume that digital tech won’t take over any particular role, and whatever the final decision taken as to your insurance risk, it will certainly be informed by data collected by digital machines.

The most chilling note struck in Morozov’s piece comes, for me, when he quotes Tim O’Reilly, technology publisher and venture capitalist, referring to precisely this industry: “I think that insurance is going to be the native business model for the internet of things.”

Now isn’t that heartening. Corporate insurance as the business model of the near future.

The gist of what is alarming about the prospect of digital machines taking increasing control of our lives is that it suggests that the ‘depersonalization’ we have all been living through for the last three-plus decades is only the beginning. It’s “day one,” as Jeff Bezos likes to say about the digital revolution. It suggests that we can look forward to feeling like a true speck of dust in the infinite cosmic universe of corporate society, with absolutely no living being to talk to should we ever wish to take an unnecessary risk, diverge from the chosen route, or pay the fine instead.

For all the libertarian noise that folks from Silicon Valley make about efficiency and disruption, let no one be fooled: the slick algorithmic regulation that replaces decisions made by people, whether government bureaucrats or not, may be more objective, but it will not bring greater freedom.

An Education?

The conference was titled, “The Next New World.” It took place last month in San Francisco, and was hosted by Thomas Friedman, columnist for The New York Times and author of The World Is Flat. Friedman has been writing about the digital revolution for years now, and his thinking on the matter is wide-ranging and incisive.

In his keynote address, Friedman describes “an inflection” that occurred coincident with the great recession of 2008—the technical transformations that began with the personal computer, continued with the internet, and are ongoing with smart phones and the cloud. Friedman is not the first to note that this transformation is the equivalent of what began in 1450 with the invention of the printing press, the so-called Gutenberg revolution. The difference is that the Gutenberg revolution took 200 years to sweep through society. The digital revolution has taken two decades.

Friedman and his co-speakers at the conference are right in articulating that today’s revolution has meant that there is a new social contract extant, one based not upon high wages for middle skills (think auto manufacturing or bookkeeping), but upon high wages for high skills (think data analysis or mobile programming). Everything from driving cars to teaching children to milking cows has been overtaken by digital technology in the last 20 years, and so the average employee is now faced with a workplace where wages and benefits don’t flow from a commitment to steady long-term work, but where constant innovation is required for jobs that last an average of 4.6 years. As Friedman adds—tellingly, I think—in today’s next new world, “no one cares what you know.” They care only about what you can do.

Friedman adds in his address that the real focus of the discussions at the conference can be distilled into two questions: “How [in this new world] does my kid get a job?” and “How does our local school or university need to adapt?”

All well and good. Everyone has to eat, never mind grow a career or pay a mortgage. What bothers me however, in all these worthwhile discussions, is the underlying assumption that the education happening at schools and universities should essentially equate to job training. I’ve checked the Oxford; nowhere does that esteemed dictionary define education as training for a job. The closest it comes is to say that education can be training “in a particular subject,” not a skill.

I would contend that what a young person knows, as opposed to what they can do, should matter to an employer. What’s more, I think it should matter to all of us. Here’s a definitional point for education from the Oxford that I was delighted to see: “an enlightening experience.”

A better world requires a better educated populace, especially women. For the human race to progress (perhaps survive), more people need to understand the lessons of history. More people have to know how to think rationally, act responsibly, and honour compassion, courage and commitment. None of that necessarily comes with job training for a data analyst or mobile programmer.

And maybe, if the range of jobs available out there is narrowing to ever more specific, high technical-skills work, applicable to an ever more narrow set of industries, then that set of industries should be taking on a greater role in instituting the needed training regimes. Maybe as an addendum to what can be more properly termed ‘an education.’

I’m sure that Friedman and his conference colleagues would not disagree with the value of an education that stresses knowledge, not skills. And yes, universities have become too elitist and expensive everywhere, especially in America. But my daughter attends Quest University in Squamish, British Columbia, where, in addition to studying mathematics and biology, she is obliged to take courses in Rhetoric, Democracy and Justice, and Global Perspectives.

Not exactly the stuff that is likely to land her a job in Silicon Valley, you might say, and I would have to reluctantly agree. But certainly those courses will make her a better citizen, something the world is in dire need of, and I would also argue that a degree in “Liberal Arts and Sciences” does in fact better qualify her for that job, because those courses will teach her how to better formulate an argument, better understand the empowerment (and therefore the greater job satisfaction) that comes with the democratic process, and better appreciate the global implications of practically all we do workwise these days.

Damn tootin’ that education in liberal arts and sciences better qualifies her for that job in Silicon Valley. That and every other job out there.