Thick Data

I've been running regularly each week since my daughter started school in February. Well, jogging as best I can. And I was reminded this morning of what converted me after never having the slightest bit of interest before. In fact, I was probably anti-running, and definitely very judgemental. But that's all changed now and I've flipped to being a running evangelist. Like yoga, it really is active meditation.

This morning I could have used many excuses not to do it. Spring has brought beautiful weather in Adelaide, but just as it does every year, it also brings hayfever. This morning I feel as though I've been hit in the face by Rocky Balboa (I don't really know much about real life boxing). I had a bit of a mental breakthrough yesterday and I'm really keen to get working on my literature review, and running would delay that (side note: why do mental breakthroughs always come at the end of the day's work?). And it's my birthday. Who wants to run on their birthday? But to be honest, birthdays kind of lost their shine for me when I woke up to the news on my 20th birthday. Day and night news coverage of 9/11, followed by annual reminders, is pretty sobering. But it's a beautiful day today and my daughter was happy to ride her scooter to school, so I thought I might as well make the most of it and jog home.

I usually run along the river, but I got swooped by a magpie the other week and now I'm scared. So today I took the easier flatter option of running around the streets. As I usually do, I listened to a podcast, only to be interrupted part way by flat batteries in my headphones. So the rest of the way home was just me and my thoughts. And that's when I was reminded of the meditative nature of running. I've been so caught up in trying to make the most of every opportunity to learn lately that all of my exercise time has been spent consuming and interpreting other people's content. Giving your mind space to breathe is also really important.

This isn't one of the outcomes I had in mind when I started Couch to 5K. When I started, my goal was to be able to run for at least 30 minutes without stopping. Then it was to run 5K without stopping (I'm slow!), then 45 minutes, and so on. But in May, once I was running regularly enough to justify buying gadgets, I got myself a Garmin running watch. Then I started getting more interested in the numbers, particularly pace and VO2 max. I didn't even know what VO2 max was before that (and truthfully, I still don't have a solid understanding).

For a time those numbers were looking pretty good. Until winter hit, and then the numbers started going backward. And the numbers are still going backward.

Just looking at the numbers can be pretty disheartening, to be honest. I mean, it's not like I was training specifically to improve them, but the decline has been a bit annoying. What I need to remember is that those outcomes aren't really indicative of whether or not I've been successful in my running. The reason they don't indicate success is that they don't reflect my goal.

My goal is to exercise for at least 30 minutes, three times a week regularly, and to get at least two runs in a week as part of that. And I have been successful in achieving that goal all but one week since May. That's despite all the snotty viruses we've had and my recent war with pollen.

Now, if I were to focus on the numbers instead, I might feel like it's all a waste of time. That I might as well wait until spring has passed and I can breathe clearly again, and things will be good again. Take an all or nothing approach. This would obviously be stupid, but it's actually a pretty common way to react to situations like this. If you let the outcome data outweigh the impact on lived experience, there's a very real risk this will happen.

This is where the thick data bit comes in. Yesterday I had a meeting with an experienced researcher who is known for her strong opinions to talk about my research. And I was really nervous, because even though I feel like I'm starting to be credible in what I'm talking about, I still have self-doubt. Especially in the world of pharmacy, where everyone seems to be obsessed with quantitative research. But she reminded me that I should never apologise for doing qualitative work. Good quantitative data is important, yes, but qualitative data is thick data. It helps you to gain a deep understanding of what's going on. Where the problems lie. To tell a story that's meaningful. It's important and difficult work to understand and interpret lived experiences. It is essential to making sense of the numbers.

There's a really great example of this in the book Range, by David Epstein. It's been written about other places too, but I'll use any opportunity to plug this book that I can because it's so good. He tells the story of the Challenger disaster, which I've mentioned briefly in an earlier post but I want to explore it again from a slightly different angle.

The Challenger decision was not a failure of quantitative analysis. NASA's real mistake was to rely on quantitative analysis too much.

How often do we do this in relation to medicines? All the bloody time! When we're talking with patients, our obsession with our clinical indicators, and our assumptions around them, often means we neglect to explore the actual lived experience of the person themselves and to encourage them on the steps they've taken toward reaching their goals. We almost reinforce the all or nothing approach. In my running example, sure, my numbers suck, but I've been putting in consistent effort over time and there have been multiple positive impacts on my lived experience. I can think of many examples of dealing with patients when I haven't been considerate of their effort. We have a tendency to think of patient behaviour as dichotomous: compliant or non-compliant. But reality is far more complex than that, far more dynamic, and it can't be reduced to numbers or singular outcomes.

We do it on broader scales as well. Our obsession with systematic reviews is the perfect example of this. How many systematic reviews are conducted only to reach the same cliche conclusion that "studies were too heterogeneous and of low quality to draw conclusions"? Ok, fair enough, but that's not terribly helpful. Last year Trisha Greenhalgh and colleagues challenged this in their perspective piece Time to challenge the spurious hierarchy of systematic over narrative reviews? It's a really thought-provoking piece about the assumption that systematic is better. They basically call bullshit on this. Choose the review methodology that best suits your research question and conduct a robust review.

The first time I realised this to be true, that quantitative RCTs weren't the be all and end all, was when I heard David Currow speak in 2011. He is a professor in palliative care who heads up the Palliative Care Clinical Studies Collaborative (PaCCSC), a national research network working to build a high quality evidence base for palliative care, facilitate RCTs in palliative care, and so on. I remember attending a talk he was giving sharing their latest study's findings on ketamine, and he talked about study design. I distinctly recall him saying that the question defines the methods. It's the essence of high quality research and it has stuck with me.

The worst thing about the Challenger disaster was that they had the other data available; it just didn't pass their threshold of what was deemed credible. They had photographs that concerned the engineers. One of the engineers later testified to the commission investigating the disaster about being reluctant to insist that the decision makers pay attention to his concerns:

"I was asked to quantify my concerns, and I said I couldn't ... I had no data to quantify it, but I did say I knew that it was away from goodness."

It was a cultural issue at NASA that led to them ignoring key information, and that resulted in the deaths of seven people. Cultural issues are just as important to acknowledge and address as determining the number needed to treat or the effect size. It might be difficult to do, but we've got to start somewhere, right?
