The Slow-Motion Internet

Google’s growth plans depend in part on whether it can make the entire Web faster.

Feb 22, 2011

The Internet is no longer fast enough for Google.

To see why, try the Chrome netbook. It’s a prototype device that exemplifies one of the company’s visions for the future: the idea that we can do nearly all our computing online, accessing information anywhere on a whim. This netbook has a pared-down operating system that’s essentially a powerful Web browser. It stores almost no files or software. Almost everything you can do on the device requires an Internet connection.

When I got my hands on the Chrome netbook, I understood why Google (one of our TR50 companies) finds the idea compelling. I liked the convenience at first—I always had the files I needed, because the machine forced me to store them remotely, “in the cloud.” But one day, I waited minutes for my Web word processor to open a file. I couldn’t look at the computer’s task manager and solve the problem—I just had to stare at a spinning wheel. Another time, my favorite streaming radio station took forever to load. Once it did start playing songs, the connection hiccupped, giving the effect of a skipping CD. Before long, I gave up using the netbook and went back to a computer that could work offline.

These types of failures mean the Internet isn’t ready to deliver what Google envisions, which is for networked devices to feel as fluid and easy as PCs that do their computing “locally.” And it’s not just the idea of relying exclusively on the Web that’s in jeopardy: even on an everyday computer, it can be slow and frustrating to do anything important in a Web application. Online apps like the free spreadsheet program from Google sometimes feel sluggish—there can be a lag before a number you’ve entered shows up on your screen. That is a big problem for Google, because its hopes ride on the prospect that we will all live more thoroughly Internet-connected lives. The company looks forward to a day when, instead of depending on software that resides on desktop computers (often sold by Microsoft), we turn to programs that are run remotely (often by Google).

The Need for Speed
Even slight slowdowns online frustrate people and cost companies money.

Google has used its dominance of Web search to build an extremely profitable advertising business; it had $8.5 billion in net income last year on $29.3 billion in revenue. But the company knows it can’t rely on search forever. Perhaps someone else—Microsoft or a startup—will build an even better search engine. Facebook is pursuing its own vision of a Web—one closed off from Google—that revolves around social connections and personal preferences. Or an unexpected threat might arise. Google has launched many products aimed at capturing more of the time people spend online—not only on PCs but also on new types of devices that range from smart phones to Internet-enabled TVs—but none has yielded significant revenue. This is why Eric Schmidt, who plans to step aside as CEO this spring, tried to push the company in new directions. He told employees to think of Google as a company making software for mobile devices, running on an Internet pervasive and fast enough to blend into any daily activity.

But that can’t happen just yet. Like the steady drip of a corrosive fluid, repeated encounters with a slow website eat away at a person’s willingness to use Web apps. Bill Coughran, a senior vice president of engineering at Google who oversees its “Make the Web Faster” initiative, says the company fears that the growth of online services could hit a wall if the Web is “too slow or too insecure.”

Google’s solution is brilliant and ambitious: speed up the whole Web—not just the sites Google runs. That means changing many things that aren’t even in Google’s control—everything from the way websites are built to the fiber that brings the Internet into people’s homes. And it may be more than even Google’s vast resources and world-class engineers can manage.

The Anatomy of the Web

When people use a website, requests for data have to travel from the browser on their computers to the servers that host that site. The servers determine what to send back. The code that describes how to load the page travels back to the browser; it may include instructions to fetch certain items, such as images or video, which require sending even more messages.
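
As a rough sketch of that request-and-fetch cycle, the short Python snippet below downloads a page’s HTML and then lists the extra resources a browser would have to request separately. The URL and the choice of Python are illustrative assumptions, not anything from Google.

```python
# A minimal sketch of how a browser-style client loads a page:
# one request for the HTML, then one more request per referenced resource.
# The URL is hypothetical; real browsers also parse CSS, fonts, and more.
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.parse import urljoin

class ResourceFinder(HTMLParser):
    """Collect the extra resources (images, scripts) a page asks for."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.resources.append(attrs["src"])
        elif tag == "script" and "src" in attrs:
            self.resources.append(attrs["src"])

page_url = "http://www.example.com/"  # hypothetical site
html = urlopen(page_url).read().decode("utf-8", errors="replace")

finder = ResourceFinder()
finder.feed(html)

# Each item printed here would trigger another round trip to a server.
for resource in finder.resources:
    print("fetch:", urljoin(page_url, resource))
```

Every line that loop prints corresponds to another trip across the network, which is where many of the delays described below creep in.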

Each of these messages involves a complicated, interconnected nest of hardware and software that is often outdated or poorly designed, or at the very least congested. The routes go through various kinds of physical infrastructure, from high-speed lines that make up the backbone of the Internet to the cables, phone wires, and wireless signals that deliver a site to its intended user.

Performance problems can happen anywhere in that process. The servers hosting a site might be slow. The browser might not handle the code efficiently. The code might be hard to process. On top of that, the back-and-forth negotiation of sending information and determining whether it has arrived is governed by protocols designed decades ago. They were not crafted for the level of speed and interactivity required by modern Web applications that are meant to replace software traditionally run on a PC.

People turn out to be sensitive to the slightest delays. An internal study showed Google that introducing a delay of 100 to 400 milliseconds when displaying search results led users to conduct 0.2 to 0.6 percent fewer searches, and the number of searches continued to fall as the weeks went by. Once normal speed was restored, it took time for people to resume their earlier searching habits.

Browsing Web pages “should be like changing the channel on the TV,” says Arvind Jain, a director of engineering at Google and the technical lead for the Make the Web Faster initiative. The project got under way about two years ago, at the behest of Google cofounder Larry Page (who will replace Schmidt as CEO). There are problems with “every component” of the Web, says Jain, who speaks with a casual hubris that is quintessentially Google. “We realized we have to fix all of them.”

The Web in Google’s Image

To get started, Jain teamed up with a small group, including Richard Rabbat, who serves as product manager for the initiative. Rabbat tends to criticize the Internet in a joking tone. But he is as convinced as Jain that it is unacceptably slow, especially on mobile devices.

Rabbat grew up in Lebanon, where Internet access was limited by war and a poor economy. As an undergraduate, he shared access with many other people through a satellite communication system called a VSAT, or very small aperture terminal. He vividly remembers how much time he spent waiting for pages to load, waiting to get the information he needed.

When their project began, Rabbat and Jain sat down and mapped out things Google could do to help make the Web faster at every level, including the company’s own sites. Expectations at the top of the company ran high. Leonidas Kontothanassis, tech lead for Google’s office in Cambridge, Massachusetts, recalls with a broad grin that three minutes before the team that works on Google’s ad network headed into a goal-setting meeting one day, they got a message on an internal planning site. The team had been looking at ways to make the ad network run twice as fast. The message indicated that Larry Page wanted to see them make it 10 times as fast. “We had brainstormed stuff we could do to make ads faster, but those nice small wins became irrelevant at that moment,” Kontothanassis recalls. The team had to change its approach completely, questioning fundamental aspects of the Internet rather than searching for places to tweak.

Meanwhile, other Google engineers were working on the company’s Web browser, which is also called Chrome and came out well before the prototype netbook with the same name. The company designed and built the browser with an eye to the problems that have arisen with the popularity of Web applications. Web apps are programmed largely in JavaScript, through ad hoc techniques that engineers have developed in response to demands for new website capabilities. One of Google’s chief innovations for the browser was a new engine for processing JavaScript faster than ever before. The browser’s code was opened to the public—among other reasons, in hopes that people would have ideas for making it even faster.

Next, because Google’s true goal was to improve all browsers, it launched an ad campaign built around the idea that a faster browser is a better one. Since then, the rival browsers Firefox, Safari, Opera, and Internet Explorer have all sped up significantly and given speed top billing in their marketing materials.

If browsers needed to get faster, though, so did websites themselves. In April 2010, Google engaged a powerful weapon to force other sites to improve their performance: it announced that it would begin taking site speed into account when ranking pages in its search engine. As everyone on the Internet knows, if your site is not visible in the first page of Google’s search results, you barely exist.

The best solutions of all, Rabbat and Jain realized, would spread with as little human intervention as possible. As Rabbat puts it, “Instead of telling people what the problems are, can we just fix it for them automatically?” Late in 2010, Google released a free tool that website administrators can download. It analyzes sites and automatically fixes problems that are slowing them down. For example, it changes the way sites handle images in order to make them load more efficiently. The team tested the tool on a representative set of Web pages and found that it typically made a site two to three times faster. Less than three months after its launch, the tool had been installed on more than 30,000 servers.
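
Google’s downloadable tool (most likely its mod_pagespeed module for the Apache Web server, though the article doesn’t name it) rewrites pages on the server itself. The sketch below only illustrates one kind of fix it describes, recompressing an oversized photograph so a browser has fewer bytes to pull down; the Pillow library and the file names are assumptions of this example, not part of Google’s tool.

```python
# Illustrative only: shrink and recompress an image so a page loads faster.
# This mimics one class of fix described above; it is not Google's tool.
# Requires the Pillow library (pip install Pillow); file names are hypothetical.
from PIL import Image

MAX_WIDTH = 800  # assume the page never displays images wider than this

img = Image.open("hero_photo.png")  # hypothetical original upload

# Scale the image down if it is wider than the page will ever show it.
if img.width > MAX_WIDTH:
    new_height = int(img.height * MAX_WIDTH / img.width)
    img = img.resize((MAX_WIDTH, new_height))

# Re-encode as JPEG at a moderate quality setting: usually far smaller
# than a full-size PNG photograph, with little visible difference.
img.convert("RGB").save("hero_photo_optimized.jpg", "JPEG", quality=80)
```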

Pushing Deeper

Next the company hopes to reach even further—into the Internet’s fundamental architecture. Google has proposed a new protocol, SPDY (pronounced “speedy”), which it says could make Internet communication twice as fast as it is under today’s protocols. The current protocols weren’t designed for anything near the bandwidth that’s available now. The one known as TCP, for example, is set up to make sure no information gets lost. It cautiously increases its transfer rate little by little once a connection is open, testing the water the whole way. If it detects a problem, it cuts its transfer rate in half; as a result, TCP rarely takes advantage of all the bandwidth it has available. Another issue is that many Web pages today are designed so that information loads in sequence—an image here, an ad there, a video there. If all the pieces could be loaded in parallel, the page would reach users much more quickly. SPDY isn’t meant to replace TCP; it’s a proposed substitute for HTTP, which is a protocol built on top of TCP. But Google says the improvements inherent in SPDY would make up for some of the shortcomings of TCP itself.
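
The cautious TCP behavior described above, ramping up a little at a time and cutting its rate in half at the first sign of trouble, can be modeled in a few lines. The toy simulation below (a simplified additive-increase, multiplicative-decrease loop, with invented numbers) is only meant to show why a connection rarely sits at full capacity; it is not how TCP or SPDY is actually implemented.

```python
# A toy model of the TCP congestion-avoidance behavior described above:
# the sender's window grows slowly, then is cut in half whenever a packet
# is lost. The numbers and the loss pattern are invented for illustration.

window = 1.0            # allowed packets "in flight" (arbitrary units)
loss_rounds = {8, 15}   # pretend the network drops packets on these rounds

for round_trip in range(1, 21):
    if round_trip in loss_rounds:
        window = max(1.0, window / 2)   # multiplicative decrease on loss
    else:
        window += 1.0                   # additive increase when all is well
    print(f"round {round_trip:2d}: window = {window:4.1f}")

# Because the window climbs one step at a time but falls by half,
# the connection rarely makes full use of the bandwidth available to it.
```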

But while everyone agrees that these old protocols slow things down, it won’t be easy to replace them. “The challenge isn’t so much technical as economic,” says Neil Cohen, senior director of product marketing for the content delivery network Akamai. Replacing the old standards would require updating users’ operating systems and changing servers, networking hardware, and other equipment all across the world.

In the meantime, Google plans to pressure Internet service providers until they offer connections that meet the standards the company expects and needs. In the coming years, Google will build and run a one-gigabit-per-second Internet connection for a community in the United States, its location yet to be announced. This is 20 times faster than what Verizon Communications generally delivers over its FiOS fiber-optic service—which is one of the fastest consumer plans today—and 100 times faster than what most Americans have. Google hopes that the project will yield technical information about what it takes to provide that level of service, and that it will encourage consumers to demand higher speeds.
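
To put those multiples in concrete terms, the short calculation below estimates how long a single large download would take at each tier. The speeds follow the article’s 20-times and 100-times figures, and the 700-megabyte file size is an arbitrary example.

```python
# Rough download-time comparison using the article's multiples:
# 1 Gbps for Google's test network, one-twentieth of that (~50 Mbps) for a
# fast consumer fiber plan, and one-hundredth (~10 Mbps) for a typical
# connection. The 700 MB file size is an arbitrary example.

FILE_SIZE_MB = 700
speeds_mbps = {
    "Google test network (1 Gbps)": 1000,
    "Fast consumer fiber (about 1/20)": 50,
    "Typical U.S. broadband (about 1/100)": 10,
}

for name, mbps in speeds_mbps.items():
    seconds = FILE_SIZE_MB * 8 / mbps   # megabytes -> megabits, then divide
    print(f"{name}: about {seconds:,.0f} seconds")
```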

But even with higher connection speeds, software would need to be redesigned to take full advantage of the greater capacity. And building the necessary infrastructure would be grueling, expensive, and time-consuming. In 2010, Verizon said it would finish existing FiOS construction projects but launch no new ones; not even its relatively fast service will reach many customers. Google may offer very high speeds to some test communities, but it is unlikely to become an Internet service provider on any significant scale.

Moreover, some of the problems with the way infrastructure is deployed are not within Google’s power to resolve. “In many cases, the failure of the system happens at intermediary stages,” says Tom Hughes-Croucher, a performance expert at Joyent, a provider of cloud-computing infrastructure. For example, he says, even with an improved protocol like SPDY, an Internet service provider’s misconfigured server could slow the Web experience for thousands of people. Slowdowns are common in South America and Africa because they have few local data centers; pretty much all information must travel farther to reach users. “It’s a policy thing,” Hughes-Croucher says. “It’s something for governments to solve.”

Finally, even if Google’s projects pan out, the company could still be stymied. For one thing, Google’s overall market power—which gives it the ability to push other companies into pursuing its goals—has come under fire, particularly in a European Union inquiry into whether Google exerts unfair control over the rankings in its search engine.

Regardless, Google’s confidence in its power to change the Web seems boundless.

I asked Rabbat and Jain what will happen if some of Google’s speed projects don’t succeed. They looked at each other in confusion, and then Rabbat started laughing. “We have not explored the failure scenario,” he said. He repeated the sentence a few times.

He collected himself and leaned forward, speaking seriously now. “We believe we can do this,” he said. “With the size of Google, people will listen—they will try out options we suggest.”

Jain nodded fervently. “That’s exactly right. Everyone wants this, not just Google. Everyone understands that this work will benefit everyone. It won’t be a Google success. It will be a success in the Internet as a whole.”

Erica Naone is Technology Review’s editor for Web and social media.