Rating a patient’s mood on a scale from 1 to 10 may fit clinical guidelines. And yet, says Dr. Persaud, “how does the number a patient reports relate to how they’re functioning and how much they’re suffering?”
When I added a sleep app to my pedometer, I was surprised to see that at first I was sleeping more restfully than I imagined – which created a tension between two understandings of reality that I haven’t yet resolved. I became more regular in my sleeping habits (no compulsive reading of the latest Jack Reacher thriller until 2 a.m.) but I also found myself tensing up as my week of self-monitoring progressed, sleeping more fitfully as I thought about the all-seeing app resting on the mattress inches from my head.
Dr. Persaud studies the concept of disease-mongering, the fabrication of imagined sickness – and a marketable cure – through the manipulation of numbers. So he’s skeptical of apps that generate data to tell you what you should be able to figure out on your own.
“You should know whether you feel well-rested in the morning; you shouldn’t have to check your iPhone to find out how you slept.”
And yet we continue to check our devices, as if they knew best.
We also rely on our devices to make connections with those who are plugged into their own. But those connections, too, are more complex than they first appear. Once we begin to share data, it’s available to be sampled and commercialized much more widely.
“If people undervalue their own privacy,” says Christopher Berry, “and you offer them utility, and that utility is free, then those people become the product. But I don’t think people get this.”
Perhaps that’s harmless enough – certainly it’s happening all the time on social media. Mr. Berry is more concerned about the consequences of increasingly intimate data grabs – of personal genome tests, for example.
“I might know that I have an augmented risk for heart attack or Alzheimer’s and to me that type of connection is unbelievable. … And I think people would change the way they’re living and become better if they knew that type of information. But the ethical constraint is having that sit on a server in the United States, accessible and storable by the NSA and people I didn’t give permission to. I totally want all the utility but I don’t want any of the radioactivity that goes with it.”
Electronic spying and self-quantifying, no surprise, turn out to be part of the same technological package.
At the most extreme end of the Quantified Life are the Feltron Annual Reports, an aesthetically beguiling collection of personal inputs that tell you where a meticulous infographer named Nicholas Felton spent the past year – indexed by activity, diet, location, people encountered (graphed by frequency/intimacy), photos taken (2,801 by iPhone), apparel worn (bow tie accomplished at 7:20 p.m., June 28, in Brooklyn).
How much utility is there in these numbers for anyone other than Nicholas Felton? And yet the people who collect data study him almost as if he were a heightened version of themselves – with the same fascination and attention to extreme detail that ancient Greeks would have recognized in the painstakingly rendered sculpture of a god or hero.
Do I see a kinship here – me trudging along with my pedometer while some quantified Apollo is adjusting his bow tie for the world? At a basic level, there’s a form of awareness, an attention to self that could veer off into narcissism for sure, but can also be a kind of appreciation – knowledge through contemplation.
Perhaps we don’t need to make grand claims for the power of data or obsess over its existential implications: There’s simply pleasure and beauty and vicarious humanity in such a fully formed version of an individual.
“Words can be used to describe someone’s history,” Mr. Berry concedes, “but actual behaviour is far more interesting.”