The New Digital Age: Reshaping the Future of People, Nations and Business

by Eric Schmidt & Jared Cohen

First reviewed August 2013

I know in my heart the internet has changed since 2013, but to me it has calcified to right around there. Better than being mentally stuck in the late 90s, I suppose.

This remains one of my favorite pieces of writing I’ve done.


I can live with top ten lists filled with nonsense and online puffery touting the secret grandiosity of a regional cuisine—those types of articles are obvious click-bait, simple to avoid. In the decade since the Rovian base-revving false dichotomy—“With us or against us”—it can sometimes be difficult to tell the difference between legitimate dissent from popular sentiment and opportunistic nose-tweaking in a “Controversial Opinion Piece.” No, the real scourge of the internet is the insidious self-discovery examination essay.

Cobbled together from bits and pieces of all the most appealing and least substantial internet dreck, a good self-discovery piece will leave its reader feeling certain it applies to them and that they—amongst the hundreds of thousands reading and forwarding it—are particularly special. Just such an article from the Huffington Post recently made the social media rounds—“Signs that you might be an introvert.” It contained a whirlwind of the most benign character traits coupled with vaguely complimentary societal cues: “Do you have a penchant for philosophical conversations and a love of thought-provoking books and movies?”; “Introverts observe and take in a lot of information, and they think before they speak, leading them to appear wise to others.”

It’s the online journalism equivalent of a palm reading; print enough mild flattery and genteel pleasantries and some portion of your target audience is going to jump on it. “Why yes, I do tend to notice more than others! What a great article, this is so me.” “When describing the way that introverts think, Jung explained that they’re more interested in ideas and the big picture rather than facts and details. Of course, many introverts excel in detail-oriented tasks—but they often have a mind for more abstract concepts as well.” Just in case you’re detail-oriented. Or not. One way or another, this article is going to cover you. Unless you’re detail-oriented enough to notice that calling “introverts” “abstract big-picture thinkers,” with enough breathing room to allow for excellence in “detail-oriented tasks,” paints such a broad swath of inclusionary blather as to be meaningless. Are you good at anything and desperately seeking consolation for occasionally feeling uncomfortable? You might just be an introvert!

So perhaps The New Digital Age deserves credit for extracting and presenting the very essence of its subject matter: an indulgent, optimistic catch-all for a possible future direction for the Internet. Reflecting the brightest hopes of those reading it with nary a smudge to be found, The New Digital Age is an infomercial for the glory of the ‘inevitable’ digital utopia:

Haircuts will finally be automated and machine-precise. And cell phones, tablets and laptops will have wireless recharging capabilities, rendering the need to fiddle with charging cables an obsolete nuisance.

Those pesky haircuts. I can already see the fumblings of a non-union actor mashing together some cables in a hopeless, tangled mess. I couldn’t imagine anything more antiseptic and bland than a machine-encoded “perfect” haircut—if you want to try it now, go get yourself a Flowbee (cutting-edge 1988 technology) and then apologize to your barber when he or she has to fix what the robot did.

Perhaps you’re interested in how technology can make your life easier outside of machine-precise haircuts. All that time not at the salon gives you more time to connect to things that really matter:

By relying on integrated systems, which will encompass both the professional and the personal sides of our lives, we’ll be able to use our time more effectively each day—whether that means having the time to have a “deep think,” spending more time preparing for an important presentation or guaranteeing that a parent can attend his or her child’s soccer game without distraction.

Or maybe you’d like technology to keep you available all the time so you never disconnect from the other things that really really matter:

In the West, a mother could take a break from watching her child’s soccer game to explore a live global map (interactive and constantly updated) on her iPad, displaying who needs what and where. She would be able to independently decide whom to fund on the basis of individuals’ stories or perceived need levels.

This is the same sort of unreflective doublespeak and optimistic pandering that the “Introvert” article contained; a sense that—no matter how incompatible some concepts may be—there is a way to flatten out the definitions to encompass whatever baggage a reader carries. Technology allows for no distractions at the soccer game, unless there is a single person anywhere on the globe who might need something that can be crowd-funded. If this were an independent film, the camera would pan down to the mother’s iPad screen so the audience could read the display: a pixelated version of her kid during his soccer game, with the “need” field reading “Little Billy: PARENTAL ATTENTION.”

But such is life in The New Digital Age; what one hand giveth:

Just imagine the implications of these burgeoning mobile or tablet-based learning platforms for a country like Afghanistan, which has one of the lowest rates of literacy in the world….students stuck in school systems that teach narrow curriculums or only rote memorization will have access to a virtual world that encourages independent exploration and critical thinking.

The other hand taketh away:

It is, after all, much easier to blame a single product or company for a particularly evil application of technology than to acknowledge the limitations of personal responsibility...People have a responsibility as consumers and individuals to read a company’s policies and positions on privacy and security before they willingly share information.

If you want technology to be the driving force emancipating the downtrodden and shattering the shackles of ignorance, it merits an explanation of why the onus is on the end user to start out as a sophisticated consumer on par with a multinational conglomerate. But we never get any further discussion of how to square the idea that technology can free the teeming masses from their ignorance, but only after the “I accept” button has been clicked. It smacks vaguely of colonial imperialism or noblesse oblige; a Digital Man’s Burden. A digression—what fascinating data-scraping algorithms might distinguish the EULA-proof proto-technological naïf from the tech-savvy wunderkind—never materializes. Nor does any recognition that, to begin distinguishing the layman from the literati, data-scraping would have to pick apart personal information. No machine solution can “assist” without first accessing information; how, then, can machine solutions be the answer to how and what information to share?

People will have access to ubiquitous wireless Internet networks that are many times cheaper than they are now. We’ll be more efficient, more productive and more creative.

Oh. Well that clears it right up. Everything will be better, and the Oxford comma will be forgotten. Got it.

I’m being flippant. And it’s not completely fair to cherry-pick two sections from the same chapter of the same book and juxtapose them for my own amusement, simply because they are logically incompatible and/or laden with parent-child soccer imagery. “Why yes, I can see myself sitting at my child’s sporting event; this book really speaks to me!” Whatever. If you picked The New Digital Age up, you knew it was speculative. C’mon. Of course it is, it’s a forward-facing technologist’s manifesto by major corpo-politico field leaders.

And there are some cool tidbits and neat theories, which are sometimes palatably presented as theories and not definitive proclamations from the nascent Gootopia (it is not easy to portmanteau “Google” and “utopia.” By design, I assume).

Sometimes programs to secure sensitive information rely on 10 million lines of code while attackers can penetrate them with only 125 lines. “What we observed in cybersecurity,” Regina Dugan, a senior vice-president at Google and former director of DARPA (Defense Advanced Research Projects Agency), said, “is that we needed to create the equivalent of an adaptive immune system in computer security architecture.” Computers can continue to look and operate in similar ways, but there will have to be unique differences among them developed over time to protect and differentiate each system. What that means is that an adversary now has to write one hundred and twenty-five lines of code against millions of computers—that’s how you shift the asymmetry.

That’s cool. Let’s talk about how that’s going to happen! Yeah, Google, spill it about the histamine-blocking compu-bio-net servers that you’ve got cooking!
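Since the book won’t spill it, here is my own toy sketch of the asymmetry-shifting idea: a little simulation, entirely my invention and not anything from the book or from Google, in which giving every machine a randomized internal layout turns an attacker’s one-size-fits-all payload into a lottery ticket.

```python
import random

# A toy model of "shifting the asymmetry" via software diversity.
# In a software monoculture every machine is identical, so one
# hardcoded exploit works everywhere. With per-machine randomization
# (think ASLR writ large), the same single payload only fires when
# its baked-in guess happens to match a machine's secret layout.
# Everything here (names, numbers) is illustrative, not from the book.

LAYOUT_SPACE = 2 ** 16  # toy-sized space of possible layouts


def build_fleet(n_machines: int, diversified: bool) -> list[int]:
    """Return each machine's secret layout value."""
    if diversified:
        return [random.randrange(LAYOUT_SPACE) for _ in range(n_machines)]
    return [0x4142] * n_machines  # identical layouts: a monoculture


def attack(fleet: list[int], payload_guess: int) -> float:
    """Fraction of machines compromised by one fixed payload."""
    hits = sum(1 for layout in fleet if layout == payload_guess)
    return hits / len(fleet)


if __name__ == "__main__":
    random.seed(0)  # deterministic demo
    monoculture = build_fleet(100_000, diversified=False)
    diversified = build_fleet(100_000, diversified=True)
    print(f"monoculture: {attack(monoculture, 0x4142):.0%} compromised")
    print(f"diversified: {attack(diversified, 0x4142):.4%} compromised")
```

Run it and the attacker’s 125 lines go from owning the whole fleet to owning a rounding error; the open question, which the book leaves open, is how real systems get that diversity without breaking everything else.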

Someone found guilty of insider trading could be temporarily barred from all forms of e-commerce: no trading, online banking or buying things on the Internet. Or someone subjected to a restraining order would be restricted from visiting the social-networking profiles of the targeted person and his or her friends, or even searching for his or her name online.

Oh, okay, some fresh speculation instead. I guess we’ve moved past that cool stuff and will never hear about it again. But let’s really dig into the quagmire you’ve shoved the judicial system into with your offhanded remark: no more buying my vitamins or groceries or video games online because I broke SEC regulations?

Consider the impact of basic mobile phones for a group of Congolese fisherwomen today. Whereas they used to bring their daily catch to the market and watch it slowly spoil as the day progressed, now they keep it on the line, in the river, and wait for calls from customers. Once an order is placed, a fish is brought out of the water and prepared for the buyer. There is no need for an expensive refrigerator, no need for someone to guard it at night, no danger of spoiled fish losing their value (or poisoning customers), and there is no unnecessary overfishing.

I hope you’re not in the Congo, insider traders, because you sure aren’t getting fish without some e-commerce. And that’s contemporary; in The New Digital Age I have been led to believe that the future will be all online, all the time. Banning all e-commerce in the future sounds like a death sentence. What cruel barbarism the future brings! But I don’t make the speculative, ill-planned rules, I just try to, you know, think them through.

Speaking of the barren wasteland that digitalism is turning the Congolese markets into, what happens to the joy of browsing and perusing when even physical transactions have been reduced to the abstract e-commerce “on demand” style of procurement? This is the type of baseless speculative futurism of which I could use more. Perhaps the real fish stay in the river but the market stall is brimming with holographic representations, complete with some fake aromas modern florists and the Subway Sandwich franchise use to simulate the scents of heady bouquets and heavy bread loaves, respectively. “To smell this fish, please click ‘I accept.’ By accepting, you have given permission for all data about your fish-selecting to be entered into a database so that in the future, only the fish we think you might enjoy will be presented. Unless one of those Congolese fisherwomen pays us a small advertising fee; then we’ll cram whatever shit she dredges out of the river into your face first. Algorithms!” I can’t imagine a better future.

Look, I try to write serious reviews, I really do. But I have a hard time taking anything so pie-in-the-sky seriously. I mean, the GUI designers I know want to punch whatever art department was responsible for Minority Report and its cool-looking—but functionally absurd—spawn of motion-controlled interface abominations. The practical and the actual rarely overlap; keyboard commands and shortcuts are so much faster than mouse-driven interfaces, but who’s going to learn keyboard commands? The only context in which I’ve ever heard that “power users” spurn the mouse was a discussion of the NCIS set design; the dorky science girl’s computer didn’t have a mouse in at least one scene. There are now about 600,000 words on the internet dedicated to defending why that’s not a flaw but an intentional piece of visual character development: she’s a power user; the computer mouse is for babies; maybe she’s running Unix; and so on.

I’m not saying there’s no place for futurism. I’m asking you please not to simply wave a crystal ball around my face and tell me it’s magic. Don’t try to trick me into thinking being an introvert is a cool thing because all the negatives normally associated with that personality type are either inapplicable in my case or are secretly positives.

And don’t tell me you know where the future is headed. Even if you were the CEO of Google.

When formulating policies, technology companies will, like governments, increasingly have to factor in all sorts of domestic and international dynamics, such as the political risk environment, diplomatic relationships between states, and the rules that govern citizens’ lives. The central truth of the technology industry—that technology is neutral but people are not—will periodically be lost amid all the noise. But our collective progress as citizens in the digital age will hinge on our not forgetting it.

Foreboding. And vague. Not vague in the manner of Dr. Kranzberg’s lyrical bon mot, “Technology is neither good nor bad; nor is it neutral.” It’s just…what is this saying? Tech companies have a lot of stuff to deal with…so, uh…did anyone in the audience recently lose something that starts with an “S”?

David Dinaburg