The algorithm method: Programming our lives away

How it does that remains a closely guarded secret.

That's the thing about algorithms. Most are what engineers call "black boxes": they convert inputs into outputs without revealing how they do it. The good ones are enormously valuable. Google's is the spine of a business that currently generates about $28-billion a year in revenues. Its secrets are not for sharing.

In many cases, we surrender our authority to algorithms willingly, even eagerly. We know that Google collects massive amounts of data about us whenever we search, or use Gmail or any of the other products the company offers. We know we have the right to remain anonymous, but most of us don't. We accept the deal that Google is offering: Tell us more about yourself and our algorithm will deliver more relevant search results and ads for products you might want to buy.

But often, the data being fed into algorithms are being collected without our explicit consent and used in ways we are not even aware of.

They know you might quit your job - even before you do

Some companies now use algorithms to make decisions around hiring and firing. At Google, which boasts that "almost every [personnel] decision is based on quantitative analysis," engineers have developed an algorithm to identify those employees most likely to leave to work for a competitor or strike out on their own. Employee surveys, peer reviews, evaluations, promotion and pay histories feed into the algorithm. It helps us "get inside people's heads even before they know they might leave," a Google manager told The Wall Street Journal last year.

At IBM, programmers put together mathematical models of 50,000 of the company's tech consultants. They crunched massive amounts of data on them - how many e-mails they sent, who got them, who read the documents they wrote - and used this information to help assess the employees' effectiveness and deploy their skills in the most cost-efficient way.

It's easy to understand the appeal for employers. The advantages for employees are less clear. Workers have no idea how the algorithm weighs the factors it is measuring or whether the data it uses are accurate. Getting turned down for a promotion because a secret black box has determined you are not suitable, or has predicted you may soon decide to leave the company, is hardly a model of transparency.

The Google algorithm for ranking websites, the company's most valuable piece of intellectual property, is equally opaque. Securing a good placement on the search results page confers enormous economic advantage. The algorithm looks at more than 200 factors in determining who gets those rankings, but Google does not disclose what those factors are or how they are weighted. And the company tweaks its algorithm an average of once a day, mostly to stay ahead of spammers.

An online business may find itself suddenly going from No. 1 in a search ranking to No. 20, simply because Google has decided to make a change to its algorithm.
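Google does not publish its ranking formula, but the mechanics described above can be sketched in miniature. In this toy example, the factor names, weights and sites are all invented; it shows only how a small change to undisclosed weights can reshuffle a ranking, not how Google's actual system works.

```python
# Toy illustration: rank pages by a weighted sum of hidden factors,
# then "tweak the algorithm" by changing the weights. Every factor
# and weight here is hypothetical.
def rank(sites, weights):
    """Return sites ordered best-first by a weighted factor score."""
    score = lambda site: sum(weights[f] * site[f] for f in weights)
    return sorted(sites, key=score, reverse=True)

sites = [
    {"name": "A", "links": 0.9, "keywords": 0.2},
    {"name": "B", "links": 0.3, "keywords": 0.9},
]

# Before the tweak, link-heavy site A wins.
print([s["name"] for s in rank(sites, {"links": 0.8, "keywords": 0.2})])  # ['A', 'B']
# After the tweak, keyword-heavy site B jumps ahead.
print([s["name"] for s in rank(sites, {"links": 0.2, "keywords": 0.8})])  # ['B', 'A']
```

A real system weighs hundreds of factors rather than two, but the principle is the same: the business being ranked never sees the weights, only the outcome.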

Credit-card companies also now use sophisticated algorithms to help them assess customers. The companies understandably want to predict who will ultimately turn out to be a bad risk. And they use the motherlode of data gleaned from our credit-card purchases to determine whose card will be cancelled, whose credit limit will be raised or lowered and numerous other decisions.

Last year, The New York Times Magazine profiled J.P. Martin, who in 2002, while working as an executive at Canadian Tire, built a mathematical model to analyze a year's worth of transactions made using the company's popular credit card.

He found that people who bought carbon-monoxide detectors for their homes or premium birdseed or those little felt pads that stopped chair legs from scratching the floor almost never missed a credit-card payment. Those people apparently felt a sense of responsibility and commitment to their homes and wildlife that also extended to their monthly bills. On the other hand, people who bought chrome-skull car accessories or frequented a Montreal establishment called Sharx Pool Bar were more likely to miss repeated payments.

Most consumers are unaware that their ability to gain access to credit is now being determined by variables such as what time they are logging in to the card company's website (late-night logins could be a sign of anxiety-induced sleeplessness) or whether they are using the card to pay for groceries or therapy sessions. As the research head of one large firm that analyzes credit risk said, "We may look at 300 different characteristics just to predict their delinquency risk."
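None of these models are public, but the idea behind them can be sketched. The signals and weights below are invented for illustration, loosely echoing the Canadian Tire findings reported above; they are not taken from any real card issuer.

```python
# Hypothetical purchase-signal risk scoring. Signals and weights are
# invented; real models weigh hundreds of characteristics.
RISK_SIGNALS = {
    "late_night_logins": 0.4,          # possible anxiety-induced sleeplessness
    "chrome_skull_accessories": 0.3,
    "pool_hall_charges": 0.3,
}
PROTECTIVE_SIGNALS = {
    "carbon_monoxide_detector": 0.5,   # signs of commitment to the home
    "premium_birdseed": 0.3,
    "felt_furniture_pads": 0.2,
}

def delinquency_risk(profile):
    """Crude net score: positive means more predicted delinquency risk."""
    risk = sum(w for s, w in RISK_SIGNALS.items() if s in profile)
    safe = sum(w for s, w in PROTECTIVE_SIGNALS.items() if s in profile)
    return risk - safe

print(delinquency_risk({"late_night_logins", "pool_hall_charges"}))   # 0.7, riskier
print(delinquency_risk({"premium_birdseed", "felt_furniture_pads"}))  # -0.5, safer
```

The customer, of course, never sees which signals are in the dictionary or how heavily each one counts.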
