
Peter Tieleman, the director of the Centre for Molecular Simulation at the University of Calgary, says he has to divide his projects across different computer systems and forgo projects that cannot be done in Canada. Chris Bolin

Scientists across Canada who need access to fast and powerful supercomputers to conduct their federally funded research say they are falling behind their international competitors, or having to switch to less ambitious projects because the country's digital research infrastructure is insufficient to meet their needs.

The problem is so acute, those affected say, that fixing it will require Ottawa to pour tens of millions more into Canada's national computer capacity each year, and rethink how the system is supported in the long term.

What's needed is not simply more data storage but also computational capacity – the ability to crunch through reams of calculations to do things like analyze genetic variants across an entire population or simulate the Earth's climate. The challenge is exacerbated by the rapid turnover of digital technology, and by growing demand from researchers anxious to leverage the power of big data and high-performance computing to make breakthroughs.

Compute Canada, the organization tasked with supporting university-based researchers with their digital needs, says that the growing reliance on computation in many areas of science means that it is no longer able to provide its biggest users with the digital muscle they need to operate in the world's top tier. The organization projects that more modest users of its services could be running into similar barriers in the next couple of years.

Since only researchers with federal funding can apply to Compute Canada, the digital bottleneck means the government is, in some cases, paying for science that it can't support.

Meanwhile, the nation's digital infrastructure faces a coming tsunami of demand through a range of large-scale research projects that the government has already committed to.

"We're alarmed," said Compute Canada's president, Mark Dietrich. "Important big science initiatives threaten to overwhelm our already stretched capabilities."

For Peter Tieleman, a University of Calgary chemist who draws on Compute Canada's resources more than any other individual scientist to model the behaviour of cell membranes and their interactions with molecules including candidate drugs, the limitations mean that he has to divide his projects across different computer systems and forgo investigations that simply cannot be done in Canada.

"Personalized medicine is only going to be personalized if we're able to analyze data at the level of the individual," said Dr. Tieleman, referring to a trend in medical research to tailor potential treatments for disease to a patient's unique genetic characteristics.

In recent weeks, Compute Canada has faced ire and distress from researchers over its allocation of computer resources for 2016. Many applicants received far less than expected, while others have been cut off entirely.

"It's the hardest year we've had – and last year was already bad," said Dugan O'Neil, chief science officer for the organization.

Among those whose allocation was reduced to zero is Scott Ormiston, a professor of mechanical engineering at the University of Manitoba. As a consequence, one of Dr. Ormiston's graduate students, who is developing software to reveal fluid and gas interactions in pipes – a topic that is relevant for averting nuclear accidents – may be unable to complete his PhD, already overdue because of diminished computer resources.

"I don't know what to do. All my plans are in jeopardy," said the student, Foad Hassaninejadfarahani, whose funding has now run out.

Although there is widespread agreement that Canada's research computing and data management tools need an upgrade, opinions vary about whether advanced computing is a pressing problem.

"It's not something that, today, is keeping me awake at night," said Feridun Hamdullahpur, who chairs the Leadership Council for Digital Infrastructure, a university-led group that reports to the federal government.

Dr. Hamdullahpur, who is president of the University of Waterloo, said he is not aware of any researchers on his campus who have been hampered by a lack of access to computational capacity, though he added that the future might bring such constraints.

The council, which meets Thursday in Ottawa, is scheduled to hear from Compute Canada about the organization's concerns.

Others say the problem is long-standing and has already cost Canada its ability to attract top researchers in some fields.

"It's a pretty terrible situation," said Robert Thacker, a cosmologist at Saint Mary's University in Halifax who uses supercomputers to study the formation of galaxies and the large-scale structure of the universe.

Dr. Thacker said that the last time his group was able to publish computationally based work on par with that going on in the United States and Europe was in 2008. He added that his department recently lost its bid to attract a rising star in the field to a faculty position in part because of Canada's limited computational resources. Instead, the researcher, a Canadian scientist who was working abroad, accepted an offer in the U.S.

Sabrina Foran, a spokesperson for the federal Ministry of Innovation, Science and Economic Development, said the government is working to develop a new digital research infrastructure strategy. A funding boost to Compute Canada was announced last year by the Conservative government in the lead-up to the federal election, but scientists say this will merely replace aging systems without adding more capacity.
