ROB Magazine

Sweden’s ISP Bahnhof AB’s Pionen Data Centre lies 30 metres below Stockholm, in a former civil-defence nuclear shelter. (Photo: Christoph Morlinghaus)


In early August, one small link in Bell's vast national network of cables suffered a 30-minute outage that sporadically stopped customers from accessing some websites, proving again that, in order to function, the Internet—that diffuse network of networks—still relies on a huge collection of physical things.

One of these is a drab, brown building at 151 Front St. West in downtown Toronto, which a senior wireless carrier executive once joked would be the ideal site for a terrorist attack, since a strike would shut down much of the country's communications, both wired and wireless.

"When you walk by the building, you can't tell what we are," says Doug Riches, a British electrical engineer who manages the ultra-secure co-location facility. "We like it that way." Canadians should be glad for the tight security; if you access the Internet from within this country's borders, it's certain your data has flowed through one of the 100,000 strands of fibre-optic cable strewn throughout this building's telecommunications suites, which house millions of dollars' worth of equipment. Companies like Cogeco and Rogers co-locate and join their networks together in this neutral location, instead of going through the wider Internet. If you're using Google—and who isn't?—your data is sure to have passed through a corner of one suite in particular: the Toronto Internet Exchange, or Torix—a non-profit association that runs a caged-off area on the sixth and seventh floors at 151 Front. Here, about 160 companies plug into each other directly.

So why do companies want to co-locate? Getting a spot in one of these facilities helps speed up the connection between two websites and reduces wait times, as well as minimizing breaks in continuity on extremely sensitive services like voice over IP. Jon Nistor, a systems engineer and president of Torix, says that having a presence in Torix also increases the resiliency of a company's connection, should something go awry in the physical world.

Companies that are co-located at Torix were totally fine during Bell's outage. "Torix is basically an interconnection point," Nistor says, as he walks around the cage, gesturing at servers covered in labels on pieces of tape. Torix pays 151 Front for space, and charges companies for access as well as supplier and contractor costs, though its goal isn't profit; it's simply to improve the Internet.



If the Big Data revolution can be compared to the Industrial Revolution, then massive server farms are the factories where the next historical shift is taking place. These huge, humming buildings house tens of thousands of servers and allow people in nearby regions to access their stored information more quickly.

They also consume enormous amounts of energy, making them expensive to operate. Consumer-facing tech giants like Google, Microsoft and Facebook are trying to locate, power and cool their data centres in the most efficient, eco-friendly way possible—by sucking in outside air and piping in water to chill the rows of servers, as well as using hydroelectricity. Canada, a country with both a lot of water and a lot of cold, should be the perfect home for the type of high-tech facility that, say, Google just built in a repurposed Finnish paper mill. Right?

Guess again. No global tech giant is rushing into our country. Is our water not wet enough? Our cold not cold enough?

As always, it's a numbers game. Facebook, for example, has server farms in the United States in order to be close to that country's 309 million people, as well as to the more than 108 million in Mexico. Those farms can service Canada's population of 34 million without straining any fibre.

"I don't really think 'Canada versus the U.S. versus Mexico'; I view North America as kind of North America," says Tom Furlong, vice-president of Facebook's site operations. When it comes to latency—the computing term for system delays such as the time that elapses between the moment a user types the address of a Facebook page in their Internet browser and the moment that page is sent to the user—Furlong says the trip your data takes across the 49th parallel is "not that relevant....When you start crossing large bodies of water, that latency starts to get felt."

Furlong says Facebook's new data centre in Sweden will noticeably improve performance for its European customers, who have had to make do with an undersea cable connecting them to Facebook's East Coast server farm.

But even if Furlong views North America as a single entity, others don't. Indeed, some companies look at the U.S. and see a country of angsty, national-security-obsessed yahoos—and want to host their sensitive data elsewhere. "There's a belief that U.S. legislation has a lower threshold for search and seizure," says Strahan McCarten, the director of hosting and data centre services at Bell Canada. Bell operates about seven facilities across the country, ranging from 10,000 to 200,000 square feet. Companies pay Bell to use its data centres for a number of reasons, including Canadian privacy laws.

Yet those server farms are almost all in or near Canada's biggest cities. That's because these buildings not only require 24-7 on-site security, but they also have to be close to skilled electrical engineers and plumbers, as well as specialists in IT-related fields; also, clients sometimes need to visit their data. Though you might save cash by putting a heat-producing mega-facility on an ice floe, there isn't much point if you have to fly someone in from Calgary or Edmonton every time you need to replace an LED.



The massive data centres dotting the planet guard their secrets closely. Many have 24-hour security, backup generators, biometric eye scans and perimeter fences. Inside its data centres, Google employs what it calls "The Crusher," which drives an oversized, rounded steel arrowhead through the middle of servers that have died. That twisted heap of metal is then fed through "The Shredder," from whence it emerges as a pile of unreadable green-and-silver shards.

That might prevent secrets from being stolen by dumpster divers, but The Crusher is powerless to keep safe the terabytes of vital data that still live in the cloud. More and more confidential information is moving online every day; tax, health care and insurance data will go soon, if they're not there already.

"The trend is not going to reverse," says Nirav Mehta, director of identity and data protection at RSA. "There will be a migration." Vital information long ago moved out of locked filing cabinets and onto password-protected computers. Next stop: the cloud.

When it comes to security, not all cloud computing models are the same. Whereas in "public" cloud services like Gmail, anyone can set up an account, "private" cloud services use security precautions like RSA's to restrict access, usually just to employees. If you access your work network from home, there's a good chance you've used, or at least seen, RSA's SecurID two-factor authentication keys and software. SecurID generates a fresh numeric code every 60 seconds; to get in, users must correctly enter the current code along with their personal password.
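SecurID's own algorithm is proprietary, but the open TOTP standard (RFC 6238) works on the same principle: a secret shared between the token and the server, combined with the current 60-second time window, yields a short-lived code that both sides can compute independently. A minimal sketch of that open scheme—not RSA's actual implementation:

```python
import hashlib
import hmac
import struct
import time


def totp(secret: bytes, at: float, step: int = 60, digits: int = 6) -> str:
    """Derive a time-based one-time code (RFC 6238-style TOTP).

    The counter is the number of `step`-second intervals since the Unix
    epoch, so client and server compute the same code as long as their
    clocks roughly agree—no transmission of the code is needed.
    """
    counter = int(at) // step
    msg = struct.pack(">Q", counter)  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation: take 4 bytes at an offset given by the low
    # nibble of the last digest byte, then reduce to `digits` digits.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)


secret = b"secret-provisioned-on-the-token"
print(totp(secret, time.time()))  # the code the user types alongside a password
```

Because the code changes every window, a stolen code is useless within a minute or so—which is the whole point of pairing it with a static password.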

Many services are a hybrid of the two models, keeping the most sensitive information and tasks in the private cloud. No security is infallible, however, which is why execs considering how to operate in the cloud feel a chill when they hear about events like the hack that compromised 70 million Sony PlayStation users worldwide (see below).

"There are some data sets that are best kept inside the company," says Microsoft Canada's national technology officer John Weigelt. "There have been some high-profile failures."

So how do you decide what parts of your business ought to be in the cloud? The nature of the data determines where it fits best. For data that needs to be widely accessible, like large databases of passwords (think Sony or Google), RSA has come up with technology that allows companies to break apart the data and store chunks of it separately, so that a breach wouldn't yield anything usable. For other types of data, Mehta envisions "community clouds" between like-minded organizations, such as health care providers and insurance companies.

Ultimately, the level of security will depend on the sensitivity of the information. Proprietary data such as CEOs' e-mails are best locked down in private clouds—if they're accessible from outside the company's offices at all.

But too much security can get in the way of cloud computing's real purpose: making it easier to get things done. Mehta is always aware of the fine line. In June, when he tried to check into a hotel in Tokyo, his credit card was frozen. "I had called [my credit card company] in advance, and it still failed," Mehta says. "It took me two or three phone calls to solve it."