Millions of people on phones, tablets and computers were tracking teams and watching video from the event, a once-every-four-years sports spectacle that garners intense online interest.
Much of that Web traffic pulsed in and out of Yahoo's data center.
Built in 2007, Yahoo's building is filled with long rows of servers whose tiny LED lights flicker as data flows in and out of the center.
Such buildings are the physical manifestation of the digital world of zeroes and ones, the basic molecules of online activity.
The activity inside allows people to check
At the same time
Since then, five other companies, including Dell and Intuit, have joined the
Most data center owners don't allow visitors inside their buildings, citing security and customer-protection concerns.
The tour, which occurred in May, covered the entire building during normal daytime working hours. Some questions were not answered, such as the number of servers and total number of CPUs inside the facility.
While protective of what happens inside the data center, the global tech company wanted to open the doors to showcase innovations and its skilled workforce, Philion said.
"We're proud of our facility here. And importantly, we're growing our team," she said. "We want to get the word out what we do, what it looks like and also spread the word that we're hiring."
There are practically no windows, except for a panel of glass along the front wall near the main entry.
The company employs just more than 40 workers. Most of them gather during lunch hours at a common dining area near the front of the building.
It operates with a handful of managers; most of the staff consists of technicians, computer operators and facility support workers.
The cooling coop
Data farms, or data centers, consume immense amounts of energy. Most of that power runs the computers and network devices inside. But companies like Yahoo have been working to shrink the share spent on everything else, chiefly cooling.
In a console room at one end of the large main building, network managers watch displays that keep them aware of how data is transferred in and out of the center.
In the building's Server Room One, a technician holds a laptop computer and examines information on the screen. The laptop is plugged via cable into one of the servers sitting on a rack. He's diagnosing a problem that could be either the server starting to fail or a networking bottleneck that's keeping it from operating.
Every server on the rack is labeled by a numeric code to help locate it among the vast number of devices spread across the two buildings.
Because Web traffic can be rerouted easily when hardware or networking problems pop up, most failures go unnoticed by users.
"You might only notice, if you're trying to load a page, a very, very short momentary interruption or delay," he said.
Companies that do most of their business online find themselves constantly managing a growing flood of customer data: personal information, videos, photos, spreadsheets, blogs and business backups.
The online customer doesn't want any delay in finding or using that data, Schuyleman added.
"It's no longer OK for a page to load slowly. People used to be patient while it took some time for the old
"The new servers use less power, they have bigger drives and you're maximizing your energy," Huck said.
One of the tour leaders was
In the building's Server Room One, the company uses the standard data center cooling method of chilling the room. To keep the servers running properly, the entire room is kept cold.
In Server Room Two, Page said, the company takes a different approach. Instead of cooling the entire room, it creates enclosed areas that wall off the heat-producing servers from everything else.
Each enclosed aisle is roughly 15 feet long by 9 feet tall and 4 feet wide. The racks of servers pump heat into the enclosed aisle, where fans pull that heat out and circulate it through the rest of the uncooled main area of the room.
The first cooling system, Page said, is like chilling a glass of milk by cooling the whole room. "Think of this (second) way as placing a glass of milk inside a refrigerator instead of cooling the whole room," he said.
The cooling upgrade was incorporated into the design of the second building, added in 2011.
That building has the name Yahoo Computing Coop. The engineer who helped design the system said he drew inspiration from studying a backyard chicken coop's upward-sloping roof.
The Coop building has a series of wall louvers that allow fresh air to enter the data center from the outside, taking advantage of the region's cool mornings and evenings.
That air then flows through two rows of cabinets on the main floor, and as the air warms, it rises upward toward the slanted ceilings of the building.
The heat is funneled out through a long, narrow, chicken-coop-inspired cupola on the roof.
On most days, the
Philion said the
PUE, or power usage effectiveness, is the ratio of a facility's total power draw to the power consumed by its computing equipment. Data center operators try to push that score as close as possible to 1.0, which would mean all the power the facility uses goes to the server equipment, with none consumed by cooling systems.
In the first generation of data centers, a 2.0 PUE was considered normal.
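The arithmetic behind those scores is simple division; a brief sketch, using hypothetical power figures rather than Yahoo's actual numbers:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# A first-generation data center: as much power overhead (mostly cooling)
# as the servers themselves consume.
print(pue(total_facility_kw=2000, it_equipment_kw=1000))  # 2.0

# A facility relying on outside-air cooling carries far less overhead.
print(pue(total_facility_kw=1100, it_equipment_kw=1000))  # 1.1
```

A perfect 1.0 would mean zero overhead: every watt entering the building reaches a server.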
Not anymore, Philion said.
"Our team is continually working to ensure that our PUE is as low as possible."
(c)2014 The Spokesman-Review (Spokane, Wash.)