
Katie Jodrell fractured her spine just 10 days before her wedding. She is shown with the prototype surgical navigation system that helped physicians repair her spine in time to make it to the altar.

Seconds after Katie Jodrell fell almost three-and-a-half metres at a building site, her legs went numb and a great shaft of pain cut through her body. It was the last week of July, just 10 days before she was supposed to walk down the aisle to marry her fiancé, Ryan.

"Right away I knew there was something very wrong and I thought the worst, that I was paralyzed," recalls Katie, a construction framer who lives in Port Hope, Ont. "Within half an hour the feeling started to come back to my legs and I was trying to think positive. But in the back of my mind I was thinking, 'How can this be happening just before my wedding?'"


Katie Jodrell, whose spine was repaired using surgical navigation technology, is shown at Sunnybrook with husband, Ryan.
PHOTOGRAPH TIM FRASER


Katie was rushed to Northumberland Hills Hospital in Cobourg, Ont., where she learned she had broken two vertebrae in her spine. From Northumberland she was airlifted to Sunnybrook for spinal surgery that would involve attaching two metal rods and 10 screws to her spine.

Just before the operation, Sunnybrook neurosurgeon Dr. Victor Yang asked Katie if she would allow the surgical team to use a relatively new, light-based technology that could map her spine in 3-D to give doctors a view of hard-to-see areas.

"He explained everything very clearly, and I understood that this technology was going to be better and that it would be safe," says Katie. "Obviously I wanted the best and safest route possible."

Katie agreed to Dr. Yang's request. Ten days later, with her surgical dressing peeking up from the back of her strapless white gown, Katie walked – rather gingerly – across the floorboards of a light-filled barn to say "I do" to her groom.

■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■

WHAT DR. YANG OFFERED to Katie before her surgery was a chance to take part in clinical trials for an innovative surgical navigation prototype system, a technology he started developing in 2009 with an engineering team at Ryerson University. Dr. Yang, who is also an electrical and computer engineer, was completing his neurosurgery residency when he saw the limitations of modern imaging technology in spinal surgery, where hardware has to be attached to bone and pinpoint accuracy is critical.

"Modern surgical treatment is tailored to get you back on your feet as soon as possible – the longer you stay in bed the higher the chance of complications – and we do that by putting in hardware such as screws and bolts inside the body," he says. "But when a patient goes through this type of surgery, there's a risk that they can come out worse if any of the hardware gets into the spinal cord."


The light-based system that helps map the patient's spine or brain to a high degree of accuracy.
PHOTOGRAPH TIM FRASER


Dr. Victor Yang and his team operate on a patient's spine using the prototype surgical navigation system.
PHOTOGRAPH TIM FRASER


That risk is based largely on the fact that surgeons don't have a direct line of sight into the pedicle bones on the back of the spinal column. These small stumps, which form the strongest part of the vertebrae, are used to anchor metal in spinal surgery. A surgeon looking down at a patient's spine can't see the pedicles.

This visual barrier is often addressed by taking a computed tomography image – also known as a CT or CAT scan – of the patient's spine before surgery. The problem with this approach, says Dr. Yang, is that patients are usually lying face up during the scan. When they're turned face down on the operating table, the spine shifts – and so do the pedicles.

"So the surgeon needs to match points on the scan image to points on the patient's spine, and may need to pick up to 100 points," says Dr. Yang. "This takes time because it involves picking each point one at a time."
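The point-matching Dr. Yang describes is a form of rigid registration: finding the rotation and translation that best align the points touched on the patient with their counterparts on the scan. A minimal two-dimensional, least-squares sketch of that idea (illustrative only; the real system works in 3-D with many more points, and this is not 7D Surgical's actual algorithm):

```python
import math

def register_points(scan_pts, patient_pts):
    """Least-squares rigid registration of matched point pairs (2-D sketch).

    Finds the rotation and translation that best map points picked on the
    preoperative scan onto the matching points on the patient's spine.
    """
    n = len(scan_pts)
    # Centroids of each point set.
    cx_s = sum(p[0] for p in scan_pts) / n
    cy_s = sum(p[1] for p in scan_pts) / n
    cx_p = sum(p[0] for p in patient_pts) / n
    cy_p = sum(p[1] for p in patient_pts) / n
    # Optimal rotation angle from the centered correspondences.
    num = den = 0.0
    for (sx, sy), (px, py) in zip(scan_pts, patient_pts):
        sx, sy = sx - cx_s, sy - cy_s
        px, py = px - cx_p, py - cy_p
        num += sx * py - sy * px   # sum of cross products
        den += sx * px + sy * py   # sum of dot products
    theta = math.atan2(num, den)
    # Translation that lines the rotated scan centroid up with the patient's.
    c, s = math.cos(theta), math.sin(theta)
    tx = cx_p - (c * cx_s - s * cy_s)
    ty = cy_p - (s * cx_s + c * cy_s)
    return theta, (tx, ty)

def apply_transform(theta, t, pt):
    """Rotate a point by theta and translate it by t."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * pt[0] - s * pt[1] + t[0], s * pt[0] + c * pt[1] + t[1])
```

Doing this by hand, one point at a time, is exactly the laborious step the new system automates.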

Instead of taking this laborious point-matching approach – known in the medical field as co-registration – some doctors choose to bring a CT scanner into the operating room so they can take an image while the patient is on the table. The resulting images are very accurate, says Dr. Yang.

"But the difficulty lies in moving the equipment in," he says. "It takes 15 minutes to half an hour to set up and get the patient ready. That's 15 to 30 minutes that the patient is lying down with a breathing tube in her mouth, and that the medical staff is spending in the O.R."

Even the accurate images from an intraoperative CT scan can become imprecise once the pedicle screws are inserted, because the pushing motion causes the spine to move again. Taking another scan is an option, but this adds even more time to the procedure. And since CT machines generate images based on the absorption of radiation by different body parts, additional scans mean more radiation exposure for the patient.

"At the end of the day, whichever approach you take, using existing technology adds time to the procedure and isn't necessarily the best for the patient," says Dr. Yang.

A skilled surgeon can confidently estimate the pedicle location based on their experience and knowledge of anatomy, but there's still a risk of missing the target site, says Dr. Yang. Even the slightest deviation can lead to serious injury for the patient.

"We, as engineers, want to minimize that," says Dr. Yang. "So we asked ourselves: Is it possible to have technology in real time in the O.R. that takes seconds to set up, adjusts its images as the spine moves, does not expose the patient to radiation and allows surgeons to reduce the probability of getting to the spinal cord and other structures that we don't want to injure?"

■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■

WITH THE PROTOTYPE surgical navigation system, Dr. Yang and his research and development team met all of these criteria. Unlike CT machines, which are large and are typically not a standard part of O.R. equipment, the prototype fits seamlessly into the operating room. In fact, unless it's pointed out, a visitor to the O.R. won't even know there's a system in the room. That's because it's designed to look and function as a conventional overhead light above the operating table.

"Some of the difficulty with other equipment is that they're not actually part of the operating room, so it's a bit of a disruption when they're brought in," says Dr. Yang. "With this technology, you just have a light that surgeons are used to having in the O.R., and it's designed in a way that a surgeon can't even recognize when the light is active; it's working in the background all the time."

Where CT machines take an image of a body part during a particular moment in time, the navigation system delivers real-time images to a computer screen beside the operating table. Special LED lights inside the device project binary, barcode-like patterns onto the exposed body part. These patterns, which are invisible to the human eye, are captured by optical cells in the navigation system and analyzed by software, which uses curves and deformations in the patterns to visually recreate the surface of the anatomy. The resulting 3-D image is then mapped against the preoperative scan on the computer screen.
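Binary pattern sequences of this kind are commonly decoded as Gray codes: each pixel's on/off history across the projected patterns identifies which projector stripe lit it, and the stripe's apparent shift on the anatomy yields depth by triangulation. A simplified sketch of those two steps (a generic structured-light technique, not 7D Surgical's actual software; the geometry and numbers are hypothetical):

```python
def decode_gray(bit_images):
    """Decode a stack of binary structured-light images.

    bit_images[k][i] is the k-th pattern's on/off value at camera pixel i,
    most significant bit first. Each pixel's bit sequence is a Gray code
    identifying the projector stripe that illuminated it.
    """
    n_pix = len(bit_images[0])
    columns = []
    for i in range(n_pix):
        value = 0
        prev = 0
        for img in bit_images:
            prev ^= img[i]              # Gray-to-binary, one bit at a time
            value = (value << 1) | prev
        columns.append(value)
    return columns

def depth(cam_x, proj_col, baseline=0.3, focal=800.0):
    """Toy stereo-style triangulation between camera and projector.

    The camera-pixel vs. projector-stripe disparity encodes depth
    (real systems use full calibrated geometry, not this shortcut).
    """
    return baseline * focal / (cam_x - proj_col)
```

The deformation of the stripes across the bumps of the spine is what the disparity term captures; software turns thousands of such measurements into the 3-D surface shown on screen.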

This all happens within milliseconds. For surgeons, looking at the system-generated image on the computer screen is just like looking through a camera lens; what they're seeing is live.

"The images on screen show an exact representation of the patient's anatomy," says Dr. Todd Mainprize, head of neurosurgery at Sunnybrook. "It's extremely accurate. Unlike manual co-registration, this technology automatically matches thousands of points to make a more accurate representation, and it does it very quickly. This is a very big step forward in the field."

To further guide surgeons during a procedure, Dr. Yang and his team designed markers that mount onto surgical tools and pinpoint their location on a patient's anatomy. As the surgeon moves the tool over the anatomy's surface, the computer image on the screen tracks the movements. "Now we have submillimetre accuracy in getting to where we want to go during the procedure," says Dr. Yang.
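Conceptually, tracking a marked tool means knowing the marker's pose in the patient's coordinate frame plus the tool tip's fixed offset from the marker; composing the two gives the tip's position. A toy sketch of that composition (all names, frames, and values here are hypothetical):

```python
def tip_in_patient_frame(marker_pose, tip_offset):
    """Map a tool tip into patient coordinates.

    marker_pose = (R, t): the marker's tracked rotation (3x3 matrix as
    row lists) and position in the patient frame. tip_offset is the
    tip's fixed position relative to the marker, known from a one-time
    calibration of the tool.
    """
    R, t = marker_pose
    # tip_patient = R * tip_offset + t
    return tuple(
        sum(R[r][c] * tip_offset[c] for c in range(3)) + t[r]
        for r in range(3)
    )
```

Because the pose updates continuously, the on-screen tool position follows the surgeon's hand in real time; the accuracy of the whole chain is what the submillimetre figure refers to.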

This accuracy has profound implications for surgeons and their patients, says Dr. Mainprize. "If we're doing a procedure on the spine and we're off by a millimetre, the patient can become paralyzed," he says. Without the need for CT scans during surgery, doctors can also work faster. While this can translate into greater efficiencies for hospitals and the health-care system, the real beneficiaries are the patients: the less time a patient spends in surgery, the lower the risk of complications, says Dr. Mainprize.

"In patients getting spinal instrumentation, infection can be a serious complication and the procedure might need to be repeated," he says.

Dr. Albert Yee, orthopaedic surgeon and associate scientist at Sunnybrook and one of the study investigators working with the Ryerson University research team, says not having to use radiation during surgery is a plus for patients. While technologies exist today that allow doctors to do minimally invasive scans during surgery, they tend to generate a lot of radiation, he says.

As spinal instrumentation and other types of surgery become more common with Canada's aging population, this innovative surgical navigation system could be a game changer for doctors and patients, says Dr. Mainprize. Recent studies link computer-aided and image-guided surgeries with a lower risk of complications. One study, which looked at more than 2,400 endoscopic sinus surgeries, found that operations performed without guiding technologies were three times more likely to result in major complications than image-guided procedures.

■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■ ■


LAST YEAR, from March to December, Dr. Yang and his team ran the first phase of clinical trials for the prototype surgical navigation technology. Doctors at Sunnybrook used the device on 40 patients, including Katie. About half the group underwent spinal surgery while the other half had procedures on different areas of the brain.

The second phase of clinical trials is now underway. In the meantime, 7D Surgical Inc., a joint venture between Sunnybrook and Ryerson University, is commercializing the navigation system.

After five years in development, Dr. Yang's innovation is getting closer to full realization. 7D Surgical plans to develop and market Dr. Yang's technology and make it available to other hospitals in the country. Dr. Yang's long-term goal is to deploy these surgical navigation technologies in operating rooms around the world.

"I think we just happen to be at this convergence in technology, where LED technology and rapid prototyping technology have matured enough to make technological advancements such as 7D Surgical possible and cost-effective," he says. "This whole lighting unit was produced with 3-D printing. Five years ago we didn't have 3-D printing and it would have cost us millions of dollars to build this prototype."

The timing couldn't have been better for Katie. She continues to recover today and feels grateful for the leading-edge care she received at Sunnybrook.

"I've never had a hospital stay before and the treatment I got at Sunnybrook was just outstanding," she says. "Being offered this technology was a bonus."


The happy moment: Katie Jodrell and new husband, Ryan, just days after her surgery.
PHOTOGRAPH TIM FRASER


This content was produced by The Globe and Mail's advertising department, in consultation with Sunnybrook. The Globe's editorial department was not involved in its creation.
