Canadian Car of the Year Awards

Over the course of five days, 80 journalists tested 58 brand-new vehicles from 18 different manufacturers. Michelle Siu

BEST NEW SMALL CARS

BEST NEW FAMILY VEHICLES

BEST NEW LUXURY CARS

BEST NEW SPORTS/PERFORMANCE AND PRESTIGE VEHICLES

BEST NEW SUVs AND PICKUPS

*****

The logistics are so complex, they can be difficult to fathom: 58 brand-new vehicles from 18 different manufacturers in 12 classes, evaluated by some 80 automotive journalists from across Canada, gathered in one place for five days.

This, in a big nutshell, is what the AJAC Canadian Car of The Year (CCOTY) TestFest is all about. But all the facts and figures barely scratch the surface of this annual event, which this year moved to Niagara Falls from its previous location, Niagara-on-the-Lake. Home base was the Legends on the Niagara Golf Course, a sprawling facility that offered the space necessary to accommodate the vehicles and the people involved in TestFest.

The parking lot was transformed into an autocross course for testing of a more spirited nature. At a separate off-road facility, pickup trucks and SUVs were put through their paces.

However, I spent little time at any of these locations, for reasons that will soon become apparent.

The initial plan was to participate in the subjective testing of the various entries, driving all the vehicles in a given category back-to-back on the same day, on the same roads, in the same conditions. This approach is a big part of what makes TestFest unique in the world of automotive awards. There are many awards out there, more than you can shake a stick shift at, but few dictate that voting journalists drive all entries, one right after the other, in the same place.

This is important because, truth be told, there are few bad vehicles on the market these days. Modern vehicles are – across the board – better built, higher-performing, safer, cleaner and more efficient than ever before. Next year, they'll be better than this year; the year after that, they'll be better than next year.

But in any given class of vehicle, there will always be a class leader – and the best way to determine which one is at the top of the heap is to drive all entries within a class one after the other. In fact, I don't believe it's possible to decide on a class leader – in as objective a way as possible, that is – unless this exact approach is adopted.

I know whereof I speak: I've driven a vehicle early in the year and come away convinced that it's best-in-class – absolutely imperious from tip to tail – then driven an even better vehicle right after it at TestFest later that same year. I've lost count of the number of times this has happened, but each year is guaranteed to produce at least one surprise.

The TestFest score sheet includes 16 subjective categories to vote on; these categories cover such diverse areas as design, ergonomics, handling, ride comfort and entertainment features. There are a further seven categories that are entirely objective in nature, such as interior volume, fuel efficiency and tailpipe emissions.

All the subjective categories are scored by the journalists on a scale from 0 to 10 – naturally, 10 is the highest – and all the objective measures are converted into a score using that same 0-to-10 scale. The scores are weighted according to their importance to the given category; for example, acceleration is more critical in a sports car than it is in, say, a pickup truck.
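To illustrate how a weighted scheme like this might combine individual ratings, here is a minimal Python sketch. The category names, weights and numbers are purely hypothetical assumptions for illustration; AJAC's actual categories and weightings are not spelled out here.

```python
# Minimal sketch of a weighted 0-to-10 scoring scheme.
# All category names, weights and ratings below are hypothetical --
# they are not AJAC's actual values.

def weighted_score(ratings, weights):
    """Combine 0-10 category ratings into one weighted score."""
    total_weight = sum(weights.values())
    return sum(ratings[cat] * weights[cat] for cat in weights) / total_weight

# Acceleration counts for more in a sports car class than in a pickup class.
sports_car_weights = {"acceleration": 3.0, "handling": 3.0, "ride_comfort": 1.0}
pickup_weights = {"acceleration": 1.0, "handling": 1.5, "ride_comfort": 2.5}

ratings = {"acceleration": 9.0, "handling": 8.5, "ride_comfort": 6.0}
print(round(weighted_score(ratings, sports_car_weights), 2))  # 8.36
print(round(weighted_score(ratings, pickup_weights), 2))      # 7.35
```

The same set of ratings produces a different result in each class, which is the point of the weighting: the final number reflects what matters most for that kind of vehicle.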

If this sounds like a complex scoring system, it is – but it has been fine-tuned over the past 26 years and is designed to be as mathematically rigorous as possible. Reviewing vehicles can be a decidedly subjective process for a wide variety of reasons, so using comparative data and converting opinions into numerical measures is a worthy pursuit.

This year, for the first time at TestFest, I focused exclusively on "the objective" as I was asked by the CCOTY committee to assist with the performance testing of the vehicles. There were three categories to measure: acceleration from 0-100 km/h, passing speed from 80-120 km/h and braking distance from 100 km/h to a full stop.

So I became one-third of the three-man team that spent a day-and-a-half driving up and down the same 1.3-kilometre stretch of closed road not far from the golf course. If this sounds like a tedious way to spend 12 hours – well, that's understandable. However, it was interesting to see first-hand how the vehicles fared in these three measures.

Each of us had a GPS-based performance meter at our disposal, so the technological side of the equation was covered. But the road was not perfectly flat for the entire stretch; it featured some minor changes in elevation. All the testing is conducted in both directions to compensate for wind and variances in the road, but it's still important to find the flattest sections possible.

I volunteered to work on classes that contained a number of the slower vehicles; these vehicles, of course, take longer to get up to speed, so recording all the numbers on the flat sections proved trickier than expected.

On the first day of testing, intermittent rain also disrupted the process; the critical aspect of performance testing is to strive for identical conditions, as varying dampness affects the grip needed for acceleration and braking.

In the final analysis, I drove back and forth along that same stretch of tarmac a dizzying number of times; it took me close to five hours to record the performance numbers for just six vehicles.

Fortunately, the second and final day went more smoothly: the road was dry from start to finish and my (newfound) familiarity with the performance meter and the testing process paid dividends. Another eight or so hours of driving that same road and the performance figures for another 11 vehicles were in the books.

By the end, I had become so accustomed to the scenery that I could predict, within a tenth or two, the acceleration times for each vehicle without needing to look at the readout on the performance meter. I can also report that across repeated runs with each vehicle, the results were remarkably consistent – varying by just a tenth of a second or two here and a metre or two of braking distance there.

To obtain the final braking performance score for each vehicle, the two most consistent results in each direction, four numbers in total, are averaged together. The same math applies to the 0-100 km/h acceleration times and the 80-120 km/h passing speed times.
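As a rough sketch of that averaging step in Python: I'm assuming here that "most consistent" means the pair of runs in each direction closest to one another in value, which is my reading rather than AJAC's published definition.

```python
# Sketch: average the two most consistent runs in each direction.
# "Most consistent" is interpreted here as the pair of runs with the
# smallest difference between them -- an assumption, not AJAC's stated rule.
from itertools import combinations

def two_most_consistent(runs):
    """Return the pair of results closest to each other in value."""
    return min(combinations(runs, 2), key=lambda pair: abs(pair[0] - pair[1]))

def final_result(runs_one_way, runs_other_way):
    """Average two consistent runs from each direction -- four numbers in total."""
    kept = two_most_consistent(runs_one_way) + two_most_consistent(runs_other_way)
    return sum(kept) / len(kept)

# Hypothetical 100-0 km/h braking distances in metres, run in both directions.
northbound = [38.2, 37.9, 39.1]
southbound = [38.6, 38.5, 40.0]
print(round(final_result(northbound, southbound), 1))  # 38.3
```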

At some point, the final performance numbers for all 58 vehicles will be fed into the system and assigned scores from 0 to 10. These scores will then be combined with the subjective ratings from all the journalists to produce a final overall score for each vehicle. It's an exhaustive process, but a necessary one to help ensure the credibility of the results.
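The article doesn't say exactly how a raw measurement becomes a score on that scale; one plausible approach is linear scaling between the best and worst results within a class, sketched below under that assumption.

```python
# Sketch: convert raw performance numbers into 0-10 scores by linear scaling
# within a class. The scaling formula is an assumption; AJAC's exact method
# is not described here.

def scale_to_ten(value, best, worst):
    """Map a raw result to 0-10, where the best result earns 10 and the worst 0."""
    if best == worst:
        return 10.0
    return 10.0 * (value - worst) / (best - worst)

# Hypothetical 0-100 km/h times in seconds for one class: lower is better,
# so the quickest time is "best" and the slowest is "worst".
times = [6.8, 7.4, 8.1]
best, worst = min(times), max(times)
print([round(scale_to_ten(t, best, worst), 1) for t in times])  # [10.0, 5.4, 0.0]
```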

The vehicles with the highest scores in each class are, of course, declared the winners; these will be announced Dec. 3. Class winners then become eligible to win the overall Canadian Car of The Year and Canadian Utility Vehicle of The Year; both will be revealed Feb. 13 during the media preview day before the opening of the Canadian International Auto Show in Toronto.

2014 Canadian Car of The Year awards

Oct. 21-25: TestFest: AJAC members test-drive 58 different models in 12 categories.

Dec. 3: The winners in each of the 12 categories are announced.

Jan. 28, 2014: Best New Technology award winners announced in two categories.

Feb. 13, 2014: 2014 Canadian Car and Utility Vehicle of the Year announced.
