Epagogix's computer program treats screenplays as mathematical propositions. It looks at hundreds of factors and can predict with impressive accuracy how much money the film will gross at the box office. Other companies are developing similar programs for producing hit songs.
This might be good news for record companies and movie studios, but bad news for fans of creativity, innovation and risk-taking.
In journalism, the judgment of editors is also yielding to the authority of algorithms. "I don't give people what they want," Esquire editor David Granger boasted this year at a forum in Toronto. "I want to give them what they never could have expected."
But editors such as Mr. Granger can misjudge their readers' appetites and interests, and ultimately lose money. Meanwhile, their corporate bosses notice that the same massive databases being used to determine our creditworthiness could also predict what stories we will read or watch on TV.
Want to eliminate the guesswork about what audiences want? Just look at the topics they are searching for and serve up stories to match.
Short-circuiting our more complicated desires
Christopher Anderson, who teaches at the City University of New York, is not convinced that using search engines to "give people what they want" is really a step forward.
"I think it's dangerous when you boil down what people want to a simple mathematical formula," Mr. Anderson says. "The real danger of these algorithms is that they're reducing the scope of what 'want' means. Want is complicated, and it's more complicated than clicking on a link."
Jaron Lanier agrees. He's a Silicon Valley veteran and a pioneer in the development of virtual reality who has now become one of the Web's most persistent critics. His book, You Are Not a Gadget: A Manifesto, published this year, calls for "a new digital humanism" to counteract the trend toward "cybernetic totalism."
Mr. Lanier urges readers not to succumb to an ideology being peddled by the gurus of Silicon Valley that seeks to devalue human creativity. He believes that they are asking us to abandon our faith in ourselves and, instead, to put our trust "in the crowd, in the algorithms that remove the risks of creativity in ways too sophisticated for any mere person to understand."
They want us to believe, he concludes, "that the computer is evolving into a life form that can understand people better than people can understand themselves."
So how do we fight back? How do we achieve the appropriate level of governance over algorithms that Mr. Shirky insists is necessary? Opening up the black box will not be easy, given the legal protections that proprietary algorithms enjoy.
It would be nice to think that there is some way of verifying whether the information being fed into the algorithm is accurate, but the mountain of data that now exists about each of us - pulled from our e-mails, searches and online purchases - is so enormous, it is hard to imagine how that would be done.
For Mr. Anderson, the first step in reining in the power of algorithms is simply to think more deeply and more critically about them. We have to understand, he says, that while we may create tools to serve us, we often wind up being the servants of those tools.
"Algorithms are not wrong in and of themselves," he says. "But they are wrong if we give them god-like status. If we assume that algorithms stand outside the world, then we are giving them too much credit.
"It's so easy to say, 'Well, they're science. They're mathematical. They're true.' I think nothing can stand outside the world in that way, including our formulas."
Ira Basen is a Toronto-based writer and broadcaster. His documentary, Engineering Search, will air on CBC Radio 1 on Dec. 5.