[ Reprinted from Wired 08/2010. We'll take this down if Wired objects.

gopher@floodgap.com ]


The Web Is Dead. Long Live the Internet


By Chris Anderson and Michael Wolff

August 17, 2010 | 9:00 am | Wired September 2010

http://www.wired.com/magazine/2010/08/ff_webrip/all/1


[ Chart omitted. Sources: Cisco estimates based on CAIDA publications,

Andrew Odlyzko ]


Two decades after its birth, the World Wide Web is in decline, as

simpler, sleeker services -- think apps -- are less about the

searching and more about the getting. Chris Anderson explains how this

new paradigm reflects the inevitable course of capitalism. And Michael

Wolff explains why the new breed of media titan is forsaking the Web

for more promising (and profitable) pastures.


[ This article is split into two interleaved halves. Look for the

alternating "Blame Us" / "Blame Them" headers. -- Ed. ]


Who's to Blame:

Us

As much as we love the open, unfettered Web, we're abandoning it for

simpler, sleeker services that just work.

by Chris Anderson


You wake up and check your email on your bedside iPad -- that's one

app. During breakfast you browse Facebook, Twitter, and The New York

Times -- three more apps. On the way to the office, you listen to a

podcast on your smartphone. Another app. At work, you scroll through

RSS feeds in a reader and have Skype and IM conversations. More apps.

At the end of the day, you come home, make dinner while listening to

Pandora, play some games on Xbox Live, and watch a movie on Netflix's

streaming service.


You've spent the day on the Internet -- but not on the Web. And you

are not alone.


This is not a trivial distinction. Over the past few years, one of the

most important shifts in the digital world has been the move from the

wide-open Web to semiclosed platforms that use the Internet for

transport but not the browser for display. It's driven primarily by

the rise of the iPhone model of mobile computing, and it's a world

Google can't crawl, one where HTML doesn't rule. And it's the world

that consumers are increasingly choosing, not because they're

rejecting the idea of the Web but because these dedicated platforms

often just work better or fit better into their lives (the screen

comes to them, they don't have to go to the screen). The fact that

it's easier for companies to make money on these platforms only

cements the trend. Producers and consumers agree: The Web is not the

culmination of the digital revolution.


A decade ago, the ascent of

the Web browser as the center of the computing world appeared

inevitable. It seemed just a matter of time before the Web replaced PC

application software and reduced operating systems to a "poorly

debugged set of device drivers," as Netscape cofounder Marc Andreessen

famously said. First Java, then Flash, then Ajax, then HTML5 --

increasingly interactive online code -- promised to put all apps in

the cloud and replace the desktop with the webtop. Open, free, and out

of control.


But there has always been an alternative path, one that saw the Web as

a worthy tool but not the whole toolkit. In 1997, Wired published a

now-infamous "Push!" cover story, which suggested that it was time

to "kiss your browser goodbye." The argument then was that "push"

technologies such as PointCast and Microsoft's Active Desktop would

create a "radical future of media beyond the Web."


"Sure, we'll always have Web pages. We still have postcards and

telegrams, don't we? But the center of interactive media --

increasingly, the center of gravity of all media -- is moving to a

post-HTML environment," we promised nearly a decade and a half ago. The

examples of the time were a bit silly -- a "3-D furry-muckers VR

space" and "headlines sent to a pager" -- but the point was altogether

prescient: a glimpse of the machine-to-machine future that would be

less about browsing and more about getting.


Who's to Blame:

Them

Chaos isn't a business model. A new breed of media moguls is bringing

order -- and profits -- to the digital world.

by Michael Wolff


An amusing development in the past year or so -- if you regard

post-Soviet finance as amusing -- is that Russian investor Yuri

Milner has, bit by bit, amassed one of the most valuable stakes on the

Internet: He's got 10 percent of Facebook. He's done this by

undercutting traditional American VCs -- the Kleiners and the

Sequoias who would, in days past, insist on a special status in

return for their early investment. Milner not only offers better terms

than VC firms, he sees the world differently. The traditional VC has a

portfolio of Web sites, expecting a few of them to be successes -- a

good metaphor for the Web itself, broad not deep, dependent on the

connections between sites rather than any one, autonomous property. In

an entirely different strategic model, the Russian is concentrating

his bet on a unique power bloc. Not only is Facebook more than just

another Web site, Milner says, but with 500 million users it's "the

largest Web site there has ever been, so large that it is not a Web

site at all."


According to Compete, a Web analytics company, the top 10 Web

sites accounted for 31 percent of US pageviews in 2001, 40 percent in

2006, and about 75 percent in 2010. "Big sucks the traffic out of

small," Milner says. "In theory you can have a few very successful

individuals controlling hundreds of millions of people. You can become

big fast, and that favors the domination of strong people."


Milner sounds more like a traditional media mogul than a Web

entrepreneur. But that's exactly the point. If we're moving away from

the open Web, it's at least in part because of the rising dominance of

businesspeople more inclined to think in the all-or-nothing terms of

traditional media than in the come-one-come-all collectivist

utopianism of the Web. This is not just natural maturation but in many

ways the result of a competing idea -- one that rejects the Web's

ethic, technology, and business models. The control the Web took from

the vertically integrated, top-down media world can, with a little

rethinking of the nature and the use of the Internet, be taken back.


This development -- a familiar historical march, both feudal and

corporate, in which the less powerful are sapped of their reason for

being by the better resourced, organized, and efficient -- is perhaps

the rudest shock possible to the leveled, porous, low-barrier-to-entry

ethos of the Internet Age. After all, this is a battle that seemed

fought and won -- not just toppling newspapers and music labels but

also AOL and Prodigy and anyone who built a business on the idea that

a curated experience would beat out the flexibility and freedom of the

Web.


Illustration: Dirk Fowler


Blame Us:

As it happened, PointCast, a glorified screensaver that could

inadvertently bring your corporate network to its knees, quickly

imploded, taking push with it. But just as Web 2.0 is simply Web 1.0

that works, the idea has come around again. Those push concepts have

now reappeared as APIs, apps, and the smartphone. And this time we

have Apple and the iPhone/iPad juggernaut leading the way, with tens

of millions of consumers already voting with their wallets for an

app-led experience. This post-Web future now looks a lot more

convincing. Indeed, it's already here.


The Web is, after all, just one of many applications that exist on the

Internet, which uses the IP and TCP protocols to move packets around.

This architecture -- not the specific applications built on top of it

-- is the revolution. Today the content you see in your browser --

largely HTML data delivered via the http protocol on port 80 --

accounts for less than a quarter of the traffic on the Internet ...

and it's shrinking. The applications that account for more of the

Internet's traffic include peer-to-peer file transfers, email, company

VPNs, the machine-to-machine communications of APIs, Skype calls,

World of Warcraft and other online games, Xbox Live, iTunes,

voice-over-IP phones, iChat, and Netflix movie streaming. Many of the

newer Net applications are closed, often proprietary, networks.
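

To make that layering concrete, here is a minimal sketch (ours, not

the article's) of "HTML data delivered via the http protocol on port

80": a plain HTTP request spoken over a raw TCP socket in Python. The

host is a placeholder; the transport underneath is the same TCP/IP

that carries the Skype calls, game traffic, and streams named above.

```python
# Minimal sketch: the Web as just one application protocol riding on TCP/IP.
# Assumes Python 3; "example.com" is a placeholder host that answers plain HTTP.
import socket

HOST, PORT = "example.com", 80

request = (
    "GET / HTTP/1.1\r\n"
    f"Host: {HOST}\r\n"
    "Connection: close\r\n"
    "\r\n"
).encode("ascii")

with socket.create_connection((HOST, PORT)) as sock:
    sock.sendall(request)           # the "Web" part: one HTTP request ...
    chunks = []
    while True:
        data = sock.recv(4096)      # ... the "Internet" part: TCP moving the bytes
        if not data:
            break
        chunks.append(data)

print(b"".join(chunks).split(b"\r\n", 1)[0].decode())   # e.g. "HTTP/1.1 200 OK"
```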


And the shift is only accelerating. Within five years, Morgan Stanley

projects, the number of users accessing the Net from mobile devices

will surpass the number who access it from PCs. Because the screens

are smaller, such mobile traffic tends to be driven by specialty

software, mostly apps, designed for a single purpose. For the sake of

the optimized experience on mobile devices, users forgo the

general-purpose browser. They use the Net, but not the Web. Fast beats

flexible.


This was all inevitable. It is the cycle of capitalism. The story of

industrial revolutions, after all, is a story of battles over

control. A technology is invented, it spreads, a thousand flowers

bloom, and then someone finds a way to own it, locking out others. It

happens every time.


Take railroads. Uniform and open gauge standards helped the industry

boom and created an explosion of competitors -- in 1920, there were

186 major railroads in the US. But eventually the strongest of them

rolled up the others, and today there are just seven -- a regulated

oligopoly. Or telephones. The invention of the switchboard was another

open standard that allowed networks to interconnect. After telephone

patents held by AT&T's parent company expired in 1894, more than 6,000

independent phone companies sprouted up. But by 1939, AT&T controlled

nearly all of the US's long-distance lines and some four-fifths of its

telephones. Or electricity. In the early 1900s, after the

standardization to alternating current distribution, hundreds of small

electric utilities were consolidated into huge holding companies. By

the late 1920s, the 16 largest of those commanded more than 75 percent

of the electricity generated in the US.


Indeed, there has hardly ever been a fortune created without a

monopoly of some sort, or at least an oligopoly. This is the natural

path of industrialization: invention, propagation, adoption, control.


Now it's the Web's turn to face the pressure for profits and the

walled gardens that bring them. Openness is a wonderful thing in the

nonmonetary economy of peer production. But eventually our tolerance

for the delirious chaos of infinite competition finds its limits. Much

as we love freedom and choice, we also love things that just work,

reliably and seamlessly. And if we have to pay for what we love, well,

that increasingly seems OK. Have you looked at your cell phone or

cable bill lately?


As Jonathan L. Zittrain puts it in The Future of the Internet -- And

How to Stop It, "It is a mistake to think of the Web browser as the

apex of the PC's evolution." Today the Internet hosts countless closed

gardens; in a sense, the Web is an exception, not the rule.


Blame Them:

The truth is that the Web has always had two faces. On the one hand,

the Internet has meant the breakdown of incumbent businesses and

traditional power structures. On the other, it's been a constant power

struggle, with many companies banking their strategy on controlling

all or large chunks of the TCP/IP-fueled universe. Netscape tried to

own the homepage; Amazon.com tried to dominate retail; Yahoo, the

navigation of the Web.


Google was the endpoint of this process: It may represent open systems

and leveled architecture, but with superb irony and strategic

brilliance it came to almost completely control that openness. It's

difficult to imagine another industry so thoroughly subservient to one

player. In the Google model, there is one distributor of movies, which

also owns all the theaters. Google, by managing both traffic and sales

(advertising), created a condition in which it was impossible for

anyone else doing business in the traditional Web to be bigger than or

even competitive with Google. It was the imperial master over the

world's most distributed systems. A kind of Rome.


In an analysis that sees the Web, in the description of Interactive

Advertising Bureau president Randall Rothenberg, as driven by "a bunch

of megalomaniacs who want to own the entirety of the world," it is

perhaps inevitable that some of those megalomaniacs began to see

replicating Google's achievement as their fundamental business

challenge. And because Google so dominated the Web, that meant

building an alternative to the Web.


Enter Facebook. The site began as a free but closed system. It

required not just registration but an acceptable email address (from a

university, or later, from any school). Google was forbidden to search

through its servers. By the time it opened to the general public in

2006, its clublike, ritualistic, highly regulated foundation was

already in place. Its very attraction was that it was a closed system.

Indeed, Facebook's organization of information and relationships

became, in a remarkably short period of time, a redoubt from the Web

-- a simpler, more habit-forming place. The company invited developers

to create games and applications specifically for use on Facebook,

turning the site into a full-fledged platform. And then, at some

critical-mass point, not just in terms of registration numbers but of

sheer time spent, of habituation and loyalty, Facebook became a

parallel world to the Web, an experience that was vastly different and

arguably more fulfilling and compelling and that consumed the time

previously spent idly drifting from site to site. Even more to the

point, Facebook founder Mark Zuckerberg possessed a clear vision

of empire: one in which the developers who built applications on top

of the platform that his company owned and controlled would always be

subservient to the platform itself. It was, all of a sudden, not just

a radical displacement but also an extraordinary concentration of

power. The Web of countless entrepreneurs was being overshadowed by

the single entrepreneur-mogul-visionary model, a ruthless paragon of

everything the Web was not: rigid standards, high design, centralized

control.


Striving megalomaniacs like Zuckerberg weren't the only ones eager to

topple Google's model of the open Web. Content companies, which depend

on advertising to fund the creation and promulgation of their wares,

appeared to be losing faith in their ability to do so online. The Web

was built by engineers, not editors. So nobody paid much attention to

the fact that HTML-constructed Web sites -- the most advanced form of

online media and design -- turned out to be a pretty piss-poor

advertising medium.


For quite a while this was masked by the growth of the audience share,

followed by an ever-growing ad-dollar share, until, about two years

ago, things started to slow down. The audience continued to grow at a

ferocious rate -- about 35 percent of all our media time is now spent

on the Web -- but ad dollars weren't keeping pace. Online ads had

risen to some 14 percent of consumer advertising spending but had

begun to level off. (In contrast, TV -- which also accounts for 35

percent of our media time -- gets nearly 40 percent of ad dollars.)


Blame Us:

Monopolies are actually even more likely in highly networked markets

like the online world. The dark side of network effects is that rich

nodes get richer. Metcalfe's law, which states that the value of a

network increases in proportion to the square of connections, creates

winner-take-all markets, where the gap between the number one and

number two players is typically large and growing.
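

As a back-of-the-envelope sketch (ours, with made-up numbers): if

value grows with the square of the user count, a five-to-one lead in

users becomes a twenty-five-to-one lead in value.

```python
# Metcalfe's law as described above: network value ~ (number of users) squared.
# User counts are hypothetical, purely to show the winner-take-all gap.
def metcalfe_value(users: int) -> int:
    return users ** 2

leader, runner_up = 500_000_000, 100_000_000
print(metcalfe_value(leader) / metcalfe_value(runner_up))   # 25.0: 5x the users, 25x the value
```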


So what took so long? Why wasn't the Web colonized by monopolists a

decade ago? Because it was in its adolescence then, still innovating

quickly with a fresh and growing population of users always looking

for something new. Network-driven domination was short-lived.

Friendster got huge while social networking was in its infancy, and

fickle consumers were still keen to experiment with the next new

thing. They found another shiny service and moved on, just as they

had abandoned SixDegrees.com before it. In the expanding universe of

the early Web, AOL's walled garden couldn't compete with what was

outside the walls, and so the walls fell.


But the Web is now 18 years old. It has reached adulthood. An entire

generation has grown up in front of a browser. The exploration of a

new world has turned into business as usual. We get the Web. It's part

of our life. And we just want to use the services that make our life

better. Our appetite for discovery slows as our familiarity with the

status quo grows.


Blame human nature. As much as we intellectually appreciate openness,

at the end of the day we favor the easiest path. We'll pay for

convenience and reliability, which is why iTunes can sell songs for 99

cents despite the fact that they are out there, somewhere, in some

form, for free. When you are young, you have more time than money, and

LimeWire is worth the hassle. As you get older, you have more money

than time. The iTunes toll is a small price to pay for the simplicity

of just getting what you want. The more Facebook becomes part of your

life, the more locked in you become. Artificial scarcity is the

natural goal of the profit-seeking.


Blame Them:

What's more, there was the additionally sobering and confounding fact

that an online consumer continued to be worth significantly less than

an offline one. For a while, this was seen as inevitable right-sizing:

Because everything online could be tracked, advertisers no longer had

to pay to reach readers who never saw their ads. You paid for what you

got.


Unfortunately, what you got wasn't much. Consumers weren't motivated

by display ads, as evidenced by the share of the online audience that

bothered to click on them. (According to a 2009 comScore study, only

16 percent of users ever click on an ad, and 8 percent of users

accounted for 85 percent of all clicks.) The Web might generate some

clicks here and there, but you had to aggregate millions and millions

of them to make any money (which is what Google, and basically nobody

else, was able to do). And the Web almost perversely discouraged the

kind of systematized, coordinated, focused attention upon which brands

are built -- the prime, or at least most lucrative, function of media.


What's more, this medium rendered powerless the marketers and agencies

that might have been able to turn this chaotic mess into an effective

selling tool -- the same marketers and professional salespeople who

created the formats (the variety shows, the 30-second spots, the soap

operas) that worked so well in television and radio. Advertising

powerhouse WPP, for instance, with its colossal network of

marketing firms -- the same firms that had shaped traditional media by

matching content with ads that moved the nation -- may still represent

a large share of Google's revenue, but it pales next to the greater

population of individual sellers that use Google's AdWords and AdSense

programs.


Blame Us:

There is an analogy to the current Web in the first era of the

Internet. In the 1990s, as it became clear that digital networks were

the future, there were two warring camps. One was the traditional

telcos, on whose wires these feral bits of the young Internet were

being sent. The telcos argued that the messy protocols of TCP/IP --

all this unpredictable routing and those lost packets requiring

resending -- were a cry for help. What consumers wanted were

"intelligent" networks that could (for a price) find the right path

and provision the right bandwidth so that transmissions would flow

uninterrupted. Only the owners of the networks could put the

intelligence in place at the right spots, and thus the Internet would

become a value-added service provided by the AT&Ts of the world, much

like ISDN before it. The rallying cry was "quality of service" (QoS).

Only telcos could offer it, and as soon as consumers demanded it, the

telcos would win.


The opposing camp argued for "dumb" networks. Rather than cede control

to the telcos to manage the path that bits took, argued its

proponents, just treat the networks as dumb pipes and let TCP/IP

figure out the routing. So what if you have to resend a few times, or

the latency is all over the place. Just keep building more capacity --

"overprovision bandwidth" -- and it will be Good Enough.


On the underlying Internet itself, Good Enough has won. We stare at

the spinning buffering disks on our YouTube videos rather than accept

the Faustian bargain of some Comcast/Google QoS bandwidth deal that we

would invariably end up paying more for. Aside from some corporate

networks, dumb pipes are what the world wants from telcos. The

innovation advantages of an open marketplace outweigh the limited

performance advantages of a closed system.


But the Web is a different matter. The marketplace has spoken: When it

comes to the applications that run on top of the Net, people are

starting to choose quality of service. We want TweetDeck to

organize our Twitter feeds because it's more convenient than the

Twitter Web page. The Google Maps mobile app on our phone works better

in the car than the Google Maps Web site on our laptop. And we'd

rather lean back to read books with our Kindle or iPad app than lean

forward to peer at our desktop browser.


At the application layer, the open Internet has always been a fiction.

It was only because we confused the Web with the Net that we didn't

see it. The rise of machine-to-machine communications -- iPhone apps

talking to Twitter APIs -- is all about control. Every API comes with

terms of service, and Twitter, Amazon.com, Google, or any other

company can control the use as they will. We are choosing a new form

of QoS: custom applications that just work, thanks to cached content

and local code. Every time you pick an iPhone app instead of a Web

site, you are voting with your finger: A better experience is worth

paying for, either in cash or in implicit acceptance of a non-Web

standard.
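

In practice, the machine-to-machine pattern looks like the sketch

below (ours; the endpoint and token are hypothetical placeholders, not

any real provider's API): an app presenting a credential that the

platform issued under its terms of service and can revoke at will.

```python
# Sketch of an app calling a provider's HTTP API. Assumes Python 3 with the
# third-party "requests" library installed; the URL and token are placeholders,
# not any real provider's endpoint or credential.
import requests

API_URL = "https://api.example.com/v1/timeline"
TOKEN = "credential-issued-under-the-providers-terms-of-service"

resp = requests.get(
    API_URL,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
resp.raise_for_status()   # the provider, not the open Web, decides whether this works
print(resp.json())
```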


Blame Them:

One result of the relative lack of influence of professional

salespeople and hucksters -- the democratization of marketing, if you

will -- is that advertising on the Web has not developed in the subtle

and crafty and controlling ways it did in other mediums. The

ineffectual banner ad, created (indeed by the founders of this

magazine) in 1994 -- and never much liked by anyone in the marketing

world -- still remains the foundation of display advertising on the

Web.


And then there's the audience.


At some never-quite-admitted level, the Web audience, however

measurable, is nevertheless a fraud. Nearly 60 percent of people find

Web sites from search engines, much of which may be driven by SEO, or

"search engine optimization" -- a new-economy acronym that refers to

gaming Google's algorithm to land top results for hot search terms. In

other words, many of these people have been essentially corralled into

clicking a random link and may have no idea why they are visiting a

particular site -- or, indeed, what site they are visiting. They are

the exact opposite of a loyal audience, the kind that you might

expect, over time, to inculcate with your message.


Web audiences have grown ever larger even as the quality of those

audiences has shriveled, leading advertisers to pay less and less to

reach them. That, in turn, has meant the rise of junk-shop content

providers -- like Demand Media -- which have determined that the

only way to make money online is to spend even less on content than

advertisers are willing to pay to advertise against it. This further

cheapens online content, makes visitors even less valuable, and

continues to diminish the credibility of the medium.


Even in the face of this downward spiral, the despairing have hoped.

But then came the recession, and the panic button got pushed. Finally,

after years of experimentation, content companies came to a disturbing

conclusion: The Web did not work. It would never bring in the bucks.

And so they began looking for a new model, one that leveraged the

power of the Internet without the value-destroying side effects of the

Web. And they found Steve Jobs, who -- rumor had it -- was working on

a new tablet device.


Now, on the technology side, what the Web has lacked in its

determination to turn itself into a full-fledged media format is

anybody who knew anything about media. Likewise, on the media side,

there wasn't anybody who knew anything about technology. This has been

a fundamental and aching disconnect: There was no sublime integration

of content and systems, of experience and functionality -- no clever,

subtle, Machiavellian overarching design able to create that

codependent relationship between audience, producer, and marketer.


Blame Us:

In the media world, this has taken the form of a shift from

ad-supported free content to freemium -- free samples as marketing

for paid services -- with an emphasis on the "premium" part. On the

Web, average CPMs (the price of ads per thousand impressions) in key

content categories such as news are falling, not rising, because

user-generated pages are flooding Facebook and other sites. The

assumption had been that once the market matured, big companies would

be able to reverse the hollowing-out trend of analog dollars turning

into digital pennies. Sadly that hasn't been the case for most on the

Web, and by the looks of it there's no light at the end of that

tunnel. Thus the shift to the app model on rich media platforms like

the iPad, where limited free content drives subscription revenue

(check out Wired's cool new iPad app!).
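

The arithmetic behind that squeeze is simple; the figures below are

hypothetical, chosen only to show why falling CPMs hollow out

ad-supported content.

```python
# CPM arithmetic as the term is used above: revenue = impressions / 1000 * CPM.
# All figures are hypothetical, purely to illustrate the squeeze.
impressions = 10_000_000                  # page impressions sold in a month
cpm_analog, cpm_web = 20.00, 2.00         # dollars per thousand impressions
print(impressions / 1000 * cpm_analog)    # 200000.0 -- "analog dollars"
print(impressions / 1000 * cpm_web)       # 20000.0  -- "digital pennies"
```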


The Web won't take the sequestering of its commercial space easily.

The defenders of the unfettered Web have their hopes set on HTML5 --

the latest version of Web-building code that offers applike

flexibility -- as an open way to satisfy the desire for quality of

service. If a standard Web browser can act like an app, offering the

sort of clean interface and seamless interactivity that iPad users

want, perhaps users will resist the trend to the paid, closed, and

proprietary. But the business forces lining up behind closed platforms

are big and getting bigger. This is seen by many as a battle for the

soul of the digital frontier.


Zittrain argues that the demise of the all-encompassing, wide-open Web

is a dangerous thing, a loss of open standards and services that are

"generative" -- that allow people to find new uses for them. "The

prospect of tethered appliances and software as service," he warns,

"permits major regulatory intrusions to be implemented as minor

technical adjustments to code or requests to service providers."


But what is actually emerging is not quite the bleak future of the

Internet that Zittrain envisioned. It is only the future of the

commercial content side of the digital economy. Ecommerce continues to

thrive on the Web, and no company is going to shut its Web site as an

information resource. More important, the great virtue of today's Web

is that so much of it is noncommercial. The wide-open Web of peer

production, the so-called generative Web where everyone is free to

create what they want, continues to thrive, driven by the nonmonetary

incentives of expression, attention, reputation, and the like. But the

notion of the Web as the ultimate marketplace for digital delivery is

now in doubt.


The Internet is the real revolution, as important as electricity; what

we do with it is still evolving. As it moved from your desktop to your

pocket, the nature of the Net changed. The delirious chaos of the open

Web was an adolescent phase subsidized by industrial giants groping

their way in a new world. Now they're doing what industrialists do

best -- finding choke points. And by the looks of it, we're loving it.


Editor in chief Chris Anderson (canderson@wired.com) wrote about

the new industrial revolution in issue 18.02.


Blame Them:

Jobs perfectly fills that void. Other technologists have steered clear

of actual media businesses, seeing themselves as renters of systems

and third-party facilitators, often deeply wary of any involvement

with content. (See, for instance, Google CEO Eric Schmidt's insistence

that his company is not in the content business.) Jobs, on the

other hand, built two of the most successful media businesses of the

past generation: iTunes, a content distributor, and Pixar, a movie

studio. Then, in 2006, with the sale of Pixar to Disney, Jobs became

the biggest individual shareholder in one of the world's biggest

traditional media conglomerates -- indeed much of Jobs' personal

wealth lies in his traditional media holdings.


In fact, Jobs had, through iTunes, aligned himself with traditional

media in a way that Google has always resisted. In Google's open and

distributed model, almost anybody can advertise on nearly any site and

Google gets a cut -- its interests are with the mob. Apple, on the

other hand, gets a cut any time anybody buys a movie or song -- its

interests are aligned with the traditional content providers. (This

is, of course, a complicated alignment, because in each deal, Apple

has quickly come to dominate the relationship.)


So it's not shocking that Jobs' iPad-enabled vision of media's future

looks more like media's past. In this scenario, Jobs is a mogul

straight out of the studio system. While Google may have controlled

traffic and sales, Apple controls the content itself. Indeed, it

retains absolute approval rights over all third-party applications.

Apple controls the look and feel and experience. And, what's more, it

controls both the content-delivery system (iTunes) and the devices

(iPods, iPhones, and iPads) through which that content is consumed.


Since the dawn of the commercial Web, technology has eclipsed content.

The new business model is to try to let the content -- the product, as

it were -- eclipse the technology. Jobs and Zuckerberg are trying to

do this like old-media moguls, fine-tuning all aspects of their

product, providing a more designed, directed, and polished experience.

The rising breed of exciting Internet services -- like Spotify,

the hotly anticipated streaming music service; and Netflix, which lets

users stream movies directly to their computer screens, Blu-ray

players, or Xbox 360s -- also pull us back from the Web. We are

returning to a world that already exists -- one in which we chase the

transformative effects of music and film instead of our brief

(relatively speaking) flirtation with the transformative effects of

the Web.


After a long trip, we may be coming home.


Michael Wolff (michael@burnrate.com) is a new contributing editor

for Wired. He is also a columnist for Vanity Fair and the founder of

Newser, a news-aggregation site.

