The Robots Are Coming! The Robots Are Coming!

“Technology will get to everybody eventually.”

Jaron Lanier said the above during an interview with a writer from Salon.com, the news and entertainment website, while on tour for his latest book, Who Owns the Future?  Lanier is the internet apostate I wrote about earlier this year, a man who once championed open-source culture but who now argues that digital technology is economically undermining the entire middle class.

He offers a startling example in the ‘Prelude’ to the new book:

“At the height of its power, the photography company Kodak employed more than 140,000 people and was worth $28 billion. They even invented the first digital camera. But today Kodak is bankrupt, and the new face of digital photography has become Instagram. When Instagram was sold to Facebook for a billion dollars in 2012, it employed only thirteen people.”

Lanier is suggesting that the musicians, video store clerks and journalists who have already seen their livelihoods erased or eroded by the internet are just the canaries in the coal mine, members of the first wave of economic casualties.  Soon driverless Google cars will be taking down taxi drivers, caregivers will be replaced by robots, and all diagnoses of illness will be arrived at online.  The digital revolution is coming for us all, and it’s not a matter of if, but when.

The same chilling note is struck in a terrific article by Kevin Drum in the May/June 2013 issue of Mother Jones.  Drum points out that the development of Artificial Intelligence (AI) has been steady but not spectacular since the first programmable computers appeared in the 1940s.  That’s because the human brain is an amazingly complex processor, and even today, after more than seven decades of exponential increase in the power of computers, they are still operating at about one thousandth of the power of a human brain.

The thing is, exponential is the key word here.  Anyone who has ever looked at an exponential growth curve plotted on a graph knows that for a long time the line runs fairly flat, but with the doubling effect that comes with exponential growth, the curve eventually begins a very steep climb.  Many people believe that we’re now at the base of that sharp rise.  Henry Markram, a neuroscientist working at the Swiss Federal Institute of Technology in Lausanne, thinks he will be able to successfully model the human brain by 2020.
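To see why the doubling effect matters so much, it helps to run the arithmetic.  The sketch below is my own back-of-the-envelope illustration, not Drum’s calculation: it takes the one-thousandth figure above at face value and assumes a doubling period of roughly eighteen months, the pace usually attributed to Moore’s law.

```python
# Back-of-the-envelope sketch of the doubling argument (my assumptions, not Drum's figures).
import math

gap = 1000            # the brain is taken to be ~1000x more powerful than today's computers
doubling_years = 1.5  # assumed doubling period: roughly the Moore's-law pace

doublings_needed = math.ceil(math.log2(gap))   # 2**10 = 1024, so about 10 doublings
years_needed = doublings_needed * doubling_years

print(f"Doublings needed to close a {gap}x gap: {doublings_needed}")
print(f"At one doubling every {doubling_years} years: about {years_needed:.0f} years")
# Prints about 10 doublings, roughly 15 years: a long flat stretch, then a short, steep climb.
```

Under those assumptions the gap closes in something like fifteen years; change the assumptions and the date moves, but not the shape of the curve.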

Markram may or may not be right in that prediction, but again, if he is wrong, it won’t be a question of if, only of when.  And when we combine that eventual reality with Lanier’s telling Kodak-to-Instagram employment factoid above, there appear to be grounds for genuine concern.  Finally, after many years of dire (or celebratory) predictions, labor may be about to go into real oversupply.  If these ideas are at all accurate, robots will soon be displacing human employment just about everywhere you look.  Accountants, teachers, architects, the last of the assembly-line workers, even writers: we’re all vulnerable.  As Drum sees it, capital, not labor, will be the commodity in short supply in the near future, and that bodes well only for those folks who already have plenty of capital.

One of the conditions that follows from an oversupply of labor and an increased demand for capital is, of course, that wealth will flow from those earning salaries to those holding the capital.  The proverbial rich will get richer, the poor poorer.  And this condition is already with us and growing, especially in the U.S.—an escalating income inequality between the 99 per cent and the 1 per cent.

History tells us that times of high unemployment are dangerous times for us all, often leading to unrest that in turn leads to illusory socio-economic solutions—communism, fascism, anti-immigration laws, and the like.  What to do?  Well, anticipate the problem, first of all.  Our leaders need to make some contingency plans.

A tax on capital?  Not likely any time soon, certainly not in America.  But if indeed the coming economic reality is that more and more people will be without work, while a select few citizens will be ever more wealthy, the concept of ‘income redistribution’ needs to come into play, one way or another.  And orderly, democratic economic reform beats the hell out of rioting in the streets.

 

Oligarchs of the Internet

Steve Jobs had no use for philanthropy.  There is no record of him having made any charitable donations during his lifetime, despite his immense wealth.  Jobs never signed Bill Gates and Warren Buffett’s ‘billionaire’s pledge’ to give at least half of his fortune to charity, as have more than 100 other exceptionally wealthy individuals from around the globe.  He also condoned sweatshop conditions—for children—at Apple manufacturing sites in China.  Apple employs about 700,000 people via subcontractors, according to The New York Times, but almost none of them work in the U.S.  Steve had no problem with any of this.

(His wife, Laurene Powell Jobs, has a better record than Steve when it comes to giving back, having emerged from his shadow after his death to contribute actively to a number of worthy causes, especially education.)

Mark Zuckerberg did sign Buffett’s pledge, but it’s also the case that last year Facebook spent nearly $2.5 million lobbying in Washington against tougher privacy laws, and for immigration reform that would allow the employment of immigrant IT workers at lower wages.  Like Jobs, Zuckerberg is tax-averse, and Facebook actually succeeded in paying no taxes last year, despite profits of more than a billion dollars.

How about Google, the ‘Don’t be evil’ corporation?  Sergey Brin and his wife have been generous, particularly in giving to the battle against Parkinson’s disease (Brin carries a gene mutation associated with the disease), but a recent article in Wired magazine strikes a disturbing note.  The piece recounts how, in 2009, a low-life drug dealer named David Whitaker was looking for leniency from US federal authorities after being busted for selling steroids and human growth hormones online from a base of operations in Mexico.  He told the FDA that he had marketed his sometimes-phony drugs in the US using Google AdWords, something supposedly expressly prohibited by Google’s policies.  All he had needed to do, it seems, was work directly with Google reps to tailor his website into something more ‘educational’—no ‘buy now’ buttons, no photos of drugs, that sort of thing—and, even though he made no attempt to conceal the true nature of his business, Google was happy to help.  The feds were initially skeptical, but they set Whitaker up in a new, bogus operation as a sting, to test whether he was telling the truth.  This time his venture would include sales of RU-486, the abortion pill, which is normally taken only under the supervision of a medical doctor.

A few months later it was abundantly clear that Whitaker was indeed being truthful.  The feds took legal action, and in 2011 Google settled out of court, forfeiting $500 million.  A brief statement from the company admitted that, “With hindsight, we shouldn’t have allowed these ads on Google in the first place.”

Jay Gould

Great success brings great size, and with size comes a kind of corporate momentum that inevitably stresses sales over principles.  It’s not an across-the-board phenomenon—the post-CEO Bill Gates being the obvious exception—but it’s clear that many of today’s internet lords are not cut from cloth any different from that of the notorious robber barons of the past.  Men like Jay Gould, who in 1869 attempted to corner the market in gold, hoping that the rise in the gold price would drive up the price of wheat, prompting western farmers to sell, which would in turn send breadstuffs flowing eastward and boost freight business for the Erie Railroad, which Gould was trying to take control of.  The ploy may have been complex, even ingenious, but it also brought Gould infamy, eventually forcing him out of an ownership position with the railroad.

Today’s web oligarchs enjoy much higher approval ratings than did their 19th-century corporate predecessors.  It’s been reported that some members of Occupy Wall Street stopped to mourn at the impromptu memorial site set up outside the Apple Store in Manhattan following Steve Jobs’ death.

We serfs of the imperial internet realm can do better.  Great product does not always mean a worthy producer, and the ends never justify the means.  We can demand more from the web-based corporations and corporation heads who profit so handsomely from our purchases and our use of their services.  We can, at the very least, expect generosity.


Luddites Unite!

‘Luddite’ has in recent years come to function as a generic pejorative, describing an unthinking, head-in-the-sand type afraid of all new forms of technology.  It’s an unfair use of the term.

‘Luddite’ originates with a short-lived (1811 to 1817) protest movement among British textile workers, men who went about, usually under cover of darkness, destroying the weaving frames then being introduced to the newly emerging ‘factories.’  These were the early days of the industrial revolution, and the new manufacturing facilities under attack were supplanting the cottage-based industry the Luddites were a part of, leading to widespread unemployment, and therefore to genuine hardship.  It was an age long before the existence of any kind of social safety net, a time when employers were free to hire children, and they did so, at reduced wages, since the new machines were much easier to operate, requiring little of the skill possessed by the adult artisans being left behind.  (Just as much to blame for the suffering of these newly unemployed workers were the prolonged and costly Napoleonic wars in which the British government was then incessantly engaged.)  My point is that the Luddites were not opposed to technology per se; they were simply striking back at machinery that was making their lives well and truly miserable.  People were literally starving.

The term ‘Neo-Luddite’ has emerged in our day, referring, in author Kirkpatrick Sale’s words, to “a leaderless movement of passive resistance to consumerism and the increasingly bizarre and frightening technologies of the Computer Age.”  The vast majority of the people involved in this modern-day movement eschew violence, counting among their number prominent academics like the late Theodore Roszak and eminent men of letters like Wendell Berry.  Again, the movement is not anti-technology per se, only anti-certain-kinds-of-technology, that is, technology which might be described as anti-community.

Back in the 1970s I read, quite avidly I might add, Ivan Illich’s book Tools for Conviviality, which, from its very title, you might construe to be Neo-Luddite in intent.  And you’d be largely right.  Illich condemned the use of machines like the very ones the British Luddites were smashing back in the 19th century—factory-based, single-purpose machines meant first of all to generate greater monetary profit for their owners.  The interesting thing is that Illich considered the then-ubiquitous pay phone a properly convivial tool.  Anyone could use it, as often or as seldom as they chose, to their own ends, and the phone facilitated communication between individuals, that is, community.

It’s not hard to see where all this is going.  Illich’s book was directly influential upon Lee Felsenstein, considered by many to be the father of the personal computer.  Felsenstein was a member of the legendary Homebrew Computer Club, which first met in Silicon Valley in 1975 and spawned the founders of various microcomputer companies, including Steve Wozniak of Apple.  The original ethos espoused by members of the Club stressed peer-to-peer support, open-source information, and the autonomous operation of an individually owned machine.

Were he still alive, Ivan Illich would undoubtedly think of the personal computer and the smartphone as convivial tools.  But Illich had another concern associated with modern technology—the rise of a managerial class of experts, people who were in a position to co-opt technical knowledge and expertise, and eventually to control industries like medicine, agriculture and education.  Would the lords of the computer age—those who control Google, Facebook, Apple—be considered by Illich to be members of a new managerial elite?

It’s not easy to say.  I suspect Illich would indeed think of the CEOs of companies like Toyota, General Electric, and Royal Dutch Shell as members of a managerial elite, ultimately alienating workers from their own work.  But what of Larry Page, the CEO of Google, the company whose motto is “Don’t be evil”?  Is he too one of the new internet overlords?

The Neo-Luddites are right in saying that what we must all do is carefully discriminate among new forms of technology.  We must consider the control, the intent, the final gain associated with each type.  Convivial technology adds to our independence, as well as our efficiency.  It informs and empowers the user, not an alternate owner, nor the cloud-based controller of the medium.  If we could all make the distinction between convivial and non-convivial technology, it might make Luddites of us all.

 

The Storytelling Arc

Storytelling may be the most ancient art of all.  Before we as a species even had language, before we smeared charcoal goop on the palms of our hands and pressed them against the cave wall, we told one another stories.  I’ve written about this in a book entitled The Tyranny of Story, describing a scene where our cave-dwelling ancestors related the events of the day’s hunt while they sat around the fire that evening.  In the morning the men had set off as a hunting party, leaving the old, the women and the children behind.  Later that day they returned, carrying a large, now dead, formerly very dangerous beast, while also carrying one of their own party, seriously injured.  Another man limped behind the others, less seriously wounded.

Is there any way that those who remained behind didn’t want to hear exactly how it all happened?  Who first saw the brute?  Who struck the first blow?  Who struck the killing blow?  How were the men injured?  I can’t imagine that, in those days before words, never mind a written language, the men of the hunting party didn’t act out the critical moments of the successful hunt, leaping about, gesturing, uttering grunts not so far removed from words, while the others of their tribe watched, transfixed, and the fire cast exaggerated shadows of their ‘acting’ upon the cave walls.  How else was theatre, which is to say storytelling, born?

And so we’ve been at it, telling one another stories, for so long that it may well now be in our genes, as some sort of ancestral memory, making us all story experts, at least as members of the audience.  We now don’t hesitate to pass judgment on any story.  It’s one of those things—like the proper functioning of the public school system, or the correct treatment for the common cold—that we are all unquestioned authorities on.

Until the Internet, for most storytellers the audience hardly expanded beyond the other members of the tribe seated around the fire.  Once you chose a medium more complex than the air between you and a live audience, whether it was a newspaper, book, television show or movie, there were gatekeepers whose job it was to judge whether your story was worthy of an expanded audience.  It would, after all, cost money to print, publish, broadcast or project your story, other people’s money, and before you, the wannabe storyteller, were granted access to that larger audience, it had to be determined that your story would draw an audience broad enough to return a profit on that investment.

It is perhaps the single best thing about the coming of the web that these gatekeepers can now be effectively circumvented.  As Robert Tercek has observed in a 2011 TEDx talk, for decades now, “we’ve outsourced our storytelling to professionals.”  We’ve had little choice in the matter; we’ve been obliged to allow professional storytellers in the book, television and movie industries to tell our stories for us.  But no more.  With the coming of the Internet, we’ve all been given the opportunity ‘to reclaim the power of the personal narrative.’

This is no small thing.  Telling stories is how we try to lend meaning to our lives, how we attempt to make sense of the random chaos of experience, then pass along any insights we have gained in that attempt.  As filmmaker Hal Hartley has said, “When people tell stories to one another, they really want to talk about the meaning of life.”  I tell my students, when I can, to celebrate this opportunity, to seize it, to tell their own stories however and whenever they can, whether it be with animated drawings, words, video images or sounds.

Let me tell you too.  Grab a camera, pick up your laptop, plug in a microphone.  Tell a story, then share it, potentially, with me and millions of other people.  Take advantage of the new, truly wonderful and interactive opportunity now available to us all to share our stories.  If you do, you’ll be celebrating not only this new, marvelous opportunity; you’ll also be celebrating yourself.  As you might expect, you’ll never feel better.


Surfing the Information Sea

Not long ago I was online looking for a pair of waterproof sandals.  My family was heading off to Costa Rica for a wedding the following week, and I’d read that the hiking trails through many of the national parks in that tropical country could get wet and muddy.  I didn’t buy or order anything in the end, but, sure enough, whenever I was online during the next few days, there were ads staring back at me for just that kind of footwear.  God bless the folks at Google.

This is nothing terribly new.  The change began back in December of 2009, when Google quietly, unceremoniously began customizing our search results according to whatever information it could garner about us by tracking our online activities.  We’re talking about everything here, from our social media pursuits to our political leanings to our ‘window’ shopping.  It’s a whole new era, one in which the giants of the online world—Google, Apple, Facebook, Microsoft—are all engaged in a furious race to gather as much data on us as they can, so that they can then sell it to advertisers.

They’ve created what MoveOn.org board president Eli Pariser calls a “filter bubble” around each of us, and the implications of this process are profound.  You’d like to think that there’s an element of objectivity involved when you search for any particular information online.  You’re looking to discover the best electric lawnmower, or the dates of the Wars of the Roses, or how to make yogurt at home, and you assume that Google will simply bring you unbiased info from the most popular or most authoritative sites.  Don’t be too sure.  We are all increasingly living within our own unique information bubble, as determined by the Google algorithm, and that determination is made with money in mind.
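To make the mechanism a little more concrete, here is a deliberately toy sketch of what personalized ranking looks like in principle.  It is my own illustration, not Google’s actual algorithm: every name and number in it is invented, and a real system draws on vastly more signals.

```python
# A toy illustration of personalized ranking, NOT Google's actual algorithm.
# The idea: a result's final rank blends its general relevance with how well
# it matches a profile built from the user's tracked interests.

def personalized_rank(results, user_profile, personalization_weight=0.5):
    """Re-rank results by mixing base relevance with profile affinity."""
    def score(result):
        base = result["relevance"]  # what an 'objective' ranking might use
        affinity = sum(user_profile.get(topic, 0.0) for topic in result["topics"])
        return (1 - personalization_weight) * base + personalization_weight * affinity
    return sorted(results, key=score, reverse=True)

# Invented example data: two users searching "best lawnmower" see different orders.
results = [
    {"url": "consumer-reviews.example", "relevance": 0.9, "topics": ["reviews"]},
    {"url": "electric-mower-shop.example", "relevance": 0.6, "topics": ["shopping", "electric"]},
]
gadget_lover = {"shopping": 0.8, "electric": 0.7}
print([r["url"] for r in personalized_rank(results, gadget_lover)])  # the shop comes first
print([r["url"] for r in personalized_rank(results, {})])            # no profile: relevance wins
```

The point of the toy example is simply this: once a profile enters the scoring, two people typing the same query no longer see the same web.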

It’s an unsettling prospect, especially when combined with what Nicholas Carr first suggested back in 2008 in an article in The Atlantic magazine—that Google is also making us stupid.  Carr feels that online reading has essentially become an ADD (Attention Deficit Disorder) experience, one in which we skim, multitask, and skip from one information bit to another, all in a disorderly process very different from the “deep reading” we used to do in the past.  Feeling guilty?  I confess.  I think I may have developed an online reading technique that has me reading mostly just the opening sentence of each paragraph in whatever article or post I arrive at on the net.  ‘Surfing’ may be an old but still perfect descriptor of the process—staying up on the surface of the article, moving fast, perhaps enjoying the ride, but rarely stopping to ponder or examine what lies beneath.

Carr quite rightly points out that science has in recent times shown that our brain is not a static entity, even in adulthood.  It regularly rewires itself according to the stimuli or exercise we give it.  The likelihood, therefore, is that when I read online the way I do, I am training my brain to be skittish, incapable of sustained attention, and very easily bored.  The intellectual laziness made possible by the web means that my knowledge base remains shallow, if fairly diverse.

When the Internet first arrived in our lives, it seemed too good to be true, and I guess it was.  At least it was too good to last.  A medium that originally seemed entirely open and accessible, free of central control, there simply to accommodate the free flow of ideas and information, has since those heady days steadily closed in upon itself.  And it has done so under the sustained pressure of commercial capitalism, with all its many rewards and penalties.  Alas, just as in the days before cyberspace, it seems there is no free surfboard ride.  As we skim the surface of the information sea, like a surfer riding toward both rocks and sand, we should keep our heads up.  Maybe we should even consider jumping off, into the deeper water, before we hit the shore.