Oration in the Internet Age

My generation is faced with an odd burden. We came of age with Facebook and Twitter, taking the existence of blogging and online publishing for granted. We grew up with the twenty-four-hour news cycle, with content overload, with the internet as a simple fact of life. Anyone could reach anyone, anywhere, anytime. We have infinite voices. We have perfect access. I think it’s something about this possibility that burdens us with the illusion of a voice. We all believe that we have something great to share with the world. We all have a message we want to spread and feel capable of carrying it to the ends of the earth. That is, if only we had something to say.

Take a look at Twitter. Or Facebook. Or Pinterest. Or any social media site. Really look, just for a minute. Take in the thoughts of your friends, exes, high school nemeses, or complete strangers. Think about the power they’re exercising, power any ancient revolutionary could only dream of. They are sending their thoughts out into a medium available to billions of people. Odds are that those thoughts will be preserved forever in the deep, dark spaces of internet archives. They are speaking to posterity. And what is their message? Cat memes, thinly veiled attacks on friends, poorly constructed political invective, and brags about how much they drank last night. Two- or three-word commemorations of an old friend’s birthday, the laziest of gestures. Somehow, the greatest of publishing power has led to the exposure of the dregs of our brains. Only our most thoughtless thoughts are preserved for posterity. Anything deep or sensitive or real, anything with feeling, is locked away in fear of ridicule. And yet we feel, in our heart of hearts, that we have something to say, if only we could come up with the right words.

Maybe it is this overwhelming flood of raw feeling published daily in digital form that intimidates us. One can imagine that, in the old days, the battle was with the publishers. A well-worded piece in the pages of the New York Times, or read over one of the three television channels, was an event, discussed in bars and ballrooms across the nation. One struggled with the gatekeepers, but once inside, influence was almost guaranteed. Now, even the most brilliant of pieces falls on deaf ears. A carefully researched exposé of local government corruption garners fewer views than a video of a pretty girl taking a pratfall. A GIF-able, easily digested list of “12 Reasons the 90s Ruled!” will always be shared more often than a thoughtful reflection on facing an uncertain future in the twilight of youth. We can shout and shout and shout into the ocean’s depths, pouring our hearts out on the waves, and count ourselves lucky to see a ripple.

Writing about the internet on the internet is always cause for pause. This is another paradox by which I’ve found myself constantly confounded. I complain about the facile “Happy birthday!” posted on a Facebook wall, but regularly send exactly this message. I bemoan my generation’s lack of meaning and its overshadowing entitlement to be heard, but in doing so I deconstruct. I do not build or create. I cry foul at the deluge of internet content and its basely democratic popularity, all while purporting to offer something superior. I am the problem. I am the fear. I am the listlessness. I am the meaninglessness. I am the internet.

But even as I ponder this problem, I am reminded of another story, that of Demosthenes and his stunted speech. In order to strengthen his voice, he would wade out into the water and practice shouting above the waves, letting his voice echo across the shore. Perhaps this is what we need: to take the great ocean not as a threat and a hindrance, but as an opportunity and a training ground.


Accidental Organs

At some point in all of our lives, we pestered our parents with the terribly uncomfortable question: “Where do babies come from?” Children have a tendency to take most things in their world for granted. The food in refrigerators (and the refrigerators themselves), the existence of cars and the evening thunderstorm are, at the beginning of our lives, magical occurrences. The wonder of childhood is the period of transition between this unthinking acceptance and a growing understanding of the world around us. The innocent observation of childhood has always been highly prized in our society, often to the point that we forget the striving that accompanies it. As children, we not only appreciate but dissect. And nowhere is that more apparent than in the extremely practical question of where all our new brothers and sisters are coming from and the more deeply existential one of how we ourselves came into existence.
In the beautifully written Never Let Me Go, Kazuo Ishiguro asks a similarly uncomfortable question: “Where do organs come from?” The answer, in short, is “From people, of course.” But the heartbreaking conflict of his book lies in its science fiction element. The narrator and almost every other character in the book are clones, born for the express purpose of later giving their organs to ordinary, everyday citizens. This looming fate haunts every page of the book, a dwarf crouching at the armpit of the reader whispering, over and over again, “All happiness is fleeting.” It makes for a difficult read since, apart from this looming threat of “donations,” the lives of these characters are painfully ordinary. They are sometimes petty, often shortsighted; they fall in love, are unfaithful and bored and brilliant, all in turn. In other words, they are utterly human.
The central conflict of Never Let Me Go is that, despite the obvious humanity of the narrator and her compatriots, the “normal” people in the book are simply unable to view them through this lens. The trappings of humanity demonstrated by the clones disturb them at every turn. But they cannot turn away from the resources the clones provide. “Ask people to go back to the time before, with cancer, neurodegenerative disease, they will simply say no.” As much as human beings love novelty, we also struggle with it. It takes us back to a childlike place, both pleased and unreasoning. I’ve wondered often about the rise of advanced electronics and the advanced waste that accompanies them, not to mention the human suffering often involved in their creation. I examine the things in my life and am often uncomfortable with the answer to “Where does this come from?” Because the answer is “From people, of course.” Every resource expended on my iPhone and the satellite that brings it YouTube is an investment of human capital. But if you asked us to go back, back to the time before GPS, 10,000 songs in our pocket and god knows how many pixels in our living room? Honestly, I think we’d just say no.


Hard work

There are a few companies whose advertising departments I really admire. Gatorade is one of them. Admittedly, sports are so deeply ingrained in the American psyche that half of their work is done for them.  They still come out with some real masterpieces though, like this.

I (and many of the other athletes I’ve talked to) love this ad because it’s the antidote to the “movie montage” view of sports. The trope is so omnipresent that I won’t link to more than a few examples, but there’s always this or this. The basic idea sold by these montages is that the hero dedicates themselves to the cause, labors away for one to two minutes to some inspirational music and emerges an unstoppable warrior. To anyone who’s ever played sports seriously, this idea is laughable. It glosses over the aches, the early mornings, the social events missed, the NSAIDs, the extra reps, the tubs full of ice. It makes it seem like the only part that matters is deciding, in a single moment of time, that you’re going to be great. In reality, greatness is a practice formed in a thousand unglamorous moments, mostly with no soundtrack.

I’ve felt this way about sports for years. I’m embarrassed to admit, however, that this philosophy only spilled over into the rest of my life recently. In every young mind, there’s a constant struggle between viewing success as a product of talent and viewing it as a product of hard work. One is a state of being, the other a state of becoming. The two options drive two distinct ways of looking at the world: a fear of failure or a need for achievement. Talent, on the one hand, is inherent, infallible and unstoppable. Talented people don’t come up short, since they’re already in possession of the thing that makes them great. This makes failure your worst nightmare, since any time you fall short, it implies that you don’t have “it.” Hard work, on the other hand, is a process. It’s not something you have. It’s something you choose, over and over and over again.

For most of my life, I believed in talent. Hence, the trick was to project completion at all times. “What you do, do well” was interpreted as “Avoid anything you might do poorly.” I couldn’t imagine trying hard and failing. More precisely, I couldn’t risk it. I didn’t want to expose myself as a talentless fraud. I’m ashamed to say I skated through most of my education, telling myself that I did almost as well as my peers with a fifth of the work (nobody had introduced me to the Pareto principle yet). Even in college, when I was fully immersed in a humanities curriculum I loved, I never really bought into the idea of hard work. I spent countless hours thinking and reading and writing, but only if I was interested. When I was bored, or tired, or simply lazy, my efforts tailed off and I reverted to scrambling for deadlines.

In many ways, our educational system supports this. There’s nothing teachers and professors love more than a high-potential underachiever. These students constantly project the image that they’re one gifted educator away from greatness. Teachers love this idea. It’s like the surgeon who believes a patient is one operation away from walking again. They dream about stumbling onto this kind of person. Do you know who doesn’t? Bosses. It took about two weeks at the software start-up that hired me after I graduated to realize that my boss didn’t give a damn how great a programmer I could be, if only I applied myself. He wanted his database problem fixed. And he wanted it done by Monday. When I left the company a year later to go back to graduate school, my boss liked me enough to keep sending me consulting work on the side. But it wasn’t because he thought I was inherently gifted. It was because, when a release was coming up, I was there Saturday morning and at 10 pm on Tuesday night. It was because I worked my ass off.

This is a hard transition to make. A lot of my peers balk because it sounds overwhelmingly cynical. Fortunately, I’m not just saying “work hard, because people only care about what they can get out of you” (although that’s probably more true than we’d like to admit). Rather, I’m trying to say that believing in hard work will not only make you more successful, it’ll make you a better, happier person. It all comes down to how you process failure. If you believe in talent, failure is the end. You do whatever you can to convince yourself you didn’t actually fail. You blame your boss or your co-workers or your spouse. You change the rules. You move the goalposts. At the end of the day, though, you know it’s not real. But if you believe in hard work, failure isn’t the last thing that happens. Failure is how we learn we’re not there yet. Failure is how we discover opportunities to become great. Failure is the beginning of success.

So do it now. Work hard. And be proud of it.  


Educate the Young

Shameless plug: David Mayer & Co over at Educate the Young were kind enough to give me a shout-out for some of the patient safety work I did after attending the Telluride Patient Safety Summer Camp last year.  Great organization, great staff and great experience.  Can’t speak highly enough of these guys. 


Delivery vs. Content: Lectures and Scalability

A couple of weeks ago, I was browsing through Tim Ferriss’s blog and began reading about a novel start-up incubator called Y Combinator. One of its founders, Paul Graham, is a pretty interesting guy. In his life, he’s spent significant time working as a programmer, painter, investor and, most importantly for me, essayist. He could also lay a decent claim to having invented a little thing called the “web app.” So, you know, a pretty bright guy.

He publishes a lot of his writing on his website, and one piece in particular, titled “Writing and Speaking,” caught my attention. The essay itself is worth reading, but his argument can be summed up this way:

1. Any presentation is allotted a finite amount of preparation time. 
2. That time is split between developing content and practicing delivery. 
3. Consequently, well-polished speakers often have presentations poor in content.
4. Similarly, those with the best content are often unimpressive speakers.  
5. Therefore, don’t focus on being a great speaker. Content always wins.  

At a glance, the first premise seems weak. Preparation for important presentations tends to balloon à la Parkinson’s law. However, Graham’s underlying point, I think, is that it shouldn’t. Unless you want to talk for a living, you give lectures to support your main line of work, whether it’s research, software, clinical practice, or whatever else. This even applies, I would argue, to teaching, where large group lectures are given in hopes of developing a smaller subset of interested students (i.e. majors, graduate students, research assistants, sub-interns, etc.).

My disagreement with Graham lies more in his conclusion and, principally, in the particular context for which he tries to leverage it. As a Platonist, I understand his implicit critique of rhetoric. Great speakers are, by and large, great convincers.  But for all their persuasive power, they rarely instill much knowledge.  The problem is that acquiring knowledge of any kind is work.  A lot of work. And before people are willing to invest substantial effort into something, they need to be convinced that it’s important. Think of the early political career of Willie Stark in All the King’s Men. Or the meteoric rise of Barack Obama’s political career.  Rhetoric is important because knowledge rarely emerges directly from ignorance. First, you need to arrive at true belief and that, unfortunately, is a product of persuasion. And persuasion rarely relies on content.

Now, if you’re lecturing principally to a graduate seminar, things are different. Those in attendance are already educated in your field and convinced that the subject is important. But what if you’re in a giant lecture hall? The bigger the church, the less likely any particular audience member is already part of the choir. Consequently, most of them aren’t already invested. People who don’t care get bored fast. There’s something more interesting on Buzzfeed or Netflix. My rule of thumb: if you can’t introduce yourself to everyone at your talk and remember their names, persuasion is important. The same goes for multiple iterations of the same talk. Pitching a hospital-wide quality improvement idea to the staff? Selling a high-volume product? Trying to pick up people at bars? Delivery will eclipse content. But if you’re presenting to your CMO, selling someone an EMR or trying to find a life partner? Paul Graham is right. There are two “t”s in content. Cross both of them.


Becoming a doctor from philosophy

Clancy Martin recently published an article in the Atlantic titled “Playing with Plato,” in which he both reviews Rebecca Goldstein’s new book Plato at the Googleplex (which I have not read) and argues that the philosophical questions confronted by the historical philosophers are still relevant to modern-day life (a point for which I have great sympathy). To me, however, the most interesting part of his article was the first sentence:

“When I was 21, I was trying to decide whether to become a doctor or a philosophy professor.”

He goes on to explain that, during this process, he got two conflicting pieces of advice. From his business-minded brother: “Be practical. Books are dangerous things. Just because it’s on paper, you think it’s true.” And from his “New Age guru” father: “Be a professor. You’ll never be rich, but you’ll be doing what you love: reading and writing. You get summers off. It’s a good life.” In the end, Martin followed his father’s advice and, by all accounts, has achieved a fair bit of professional success.

Understanding why I was fascinated by this brief introduction to Martin’s article requires a little background: when I was 22, I walked away from four years of undergraduate study (read: obsession) and an offer to attend a Ph.D. program in Boston to return to school and become a physician.  It wasn’t an easy choice. I suffered through more than one sleepless night. I sought a lot of advice. Much of it was in line with what Martin relays: “You’ll never find a job” versus “Do what makes you happy.”  In the end, I found that, unlike Martin, neither of these lines of reasoning was, well, reasonable.  

What eventually caused me to walk away from academic philosophy was, ironically, philosophy itself. For me, books were indeed dangerous. But they didn’t ruin my sense of practicality and worldliness, as Martin’s brother feared. Rather, they shattered the idyllic vision I had of my career, lounging in the quad reading Spinoza and taking research trips to Greece. I imagined that what drove me was the possibility of changing young minds via the lectern and the pen. But the more I thought and the more I struggled, the more I realized that my motivation to enter academic philosophy was exactly as Martin’s father suggested: I wanted to spend my life reading and writing and I wanted someone to pay me for it. Instead of being a noble pursuit, entering academic philosophy was the most selfish thing I could imagine doing.

Why did this bother me so much? Most careers are chosen for selfish reasons; they offer money or power or respect or some other end. Time and time again, however, I returned to Plato’s allegory of the cave and its enduring mystery: why, after climbing to the surface and seeing the beauty of the sun, would the philosopher return to the darkness of the cave and the danger of trying to free others? To me, this is similar to the great paradox of Buddhism: the quest for detachment from the worldly cycle of desire and suffering requires us to be moved with compassion for those around us.

My mentor, Frank Harrison, helped me see that the answer, first, last and always, is love. The kind of love that drives us upwards towards Martin’s “eternal idea” should also drive us outwards towards our fellow man, who, in the words of Marcus Aurelius, “participate in the same intelligence and same portion of divinity” as we do. 

So why couldn’t my work with students and colleagues be my expression of this philosophical love? Because, by and large, they would be exactly like me: upper-middle class Caucasian males. Instead of practicing love for the other, I’d be practicing self-love under another name. Further, a long, hard look at academic philosophy led me to conclude that being a philosophy professor and being a philosopher were distinct enterprises. Often, they can even be opposed. Recent events at the University of Colorado, Northwestern and Miami have only reinforced my belief that modern academic philosophy is a troubled system.

All this being said, I will not pretend for a moment that leaving philosophy for medicine was an act of self-immolation. The decision was difficult, but the human body is fascinating and medicine is challenging.  I find the work fulfilling and am consequently no martyr. But this, I would argue, is the great genius of the Greeks and, perhaps, the idea we should most strive to reclaim: serving yourself and serving others are not mutually exclusive, but rather walk together like two feet. In the end, I left academic philosophy because I believed that I would not be happy there. I believed that becoming a physician would make me a better, wiser person. And, trying my best to love wisdom, what other choice could I make?   


ICD-10: The epidemiologist, the observer and the vanishingly rare

I first heard about the ICD-10 when I was working at a small start-up, trying to develop an EMR for a string of dialysis clinics. It was always spoken of with a certain gravity, like the ominous visit from an aunt that nobody in the family likes, but feels obligated to see. Practical (read: business) people hate ICD-10. It’s giant and unwieldy. Doctors think it’ll be an excuse to bilk them out of payments. They dread the day they get a “false coding” note for a broken-arm visit because they didn’t specify whether the patient fell off their bicycle or down a flight of stairs. So who’s driving this?

I can only assume that it’s research. ICD-10 must be an epidemiologist’s dream. Want to prove something inane, like the fact that waterskiing accidents are more common in the summer? ICD-10 is your tool. If you can collate all the insurance billing from the entire country, you can begin to pull out these vanishingly rare instances and analyze them. Admittedly, as this article points out, some of the events ICD-10 tries to capture are so vanishingly rare that they actually, well, vanish. They’re literally unheard of or actually impossible. But what about some of the other widely panned codes, like falling off a chicken coop? Theoretically, we could begin to perform real-time monitoring of safety conditions in all kinds of industries. These events are rare, which means that if we see a cluster of them occurring in a particular geographic area, an investigation might be warranted. Maybe building inspectors aren’t performing their inspections. Maybe a certain company isn’t enforcing proper safety standards. Again, theoretically, the giant index of ICD-10 codes could drive meaningful data collection and intervention.
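
To make that concrete, here’s a minimal sketch of what such monitoring might look like: count occurrences of a rare external-cause code per county in a claims extract and flag counties whose counts are improbable under a Poisson baseline. The file layout, column names, code prefix and baseline rate are all my assumptions for illustration, not any real claims schema.

```python
# Hypothetical sketch of rare-event cluster monitoring from claims data.
# Assumed (not real) schema: a CSV with 'icd10' and 'county' columns, and
# a known national baseline rate of events per county per period.
import csv
from collections import Counter
from math import exp

def poisson_tail(k, lam):
    """P(X >= k) for X ~ Poisson(lam), summed directly."""
    term = exp(-lam)           # P(X = 0)
    cdf = 0.0
    for i in range(k):
        cdf += term
        term *= lam / (i + 1)  # P(X = i+1) from P(X = i)
    return 1.0 - cdf

def flag_clusters(claims_csv, code_prefix, baseline_rate, alpha=0.001):
    """Flag counties where a rare code appears improbably often."""
    counts = Counter()
    with open(claims_csv, newline="") as f:
        for row in csv.DictReader(f):
            if row["icd10"].startswith(code_prefix):
                counts[row["county"]] += 1
    return {county: n for county, n in counts.items()
            if poisson_tail(n, baseline_rate) < alpha}

# Hypothetical usage with an illustrative code prefix:
# clusters = flag_clusters("claims_2015.csv", "Y92.7", baseline_rate=0.2)
```

The statistics here are deliberately naive (a fixed per-county baseline, no population adjustment); the point is only that, with reliable coding, this kind of surveillance is a short script, not a moonshot.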

The problem is the observer. Inter-observer variability is a problem in all sorts of medical fields, from reading chest x-rays to interpreting physical exam findings. For the ICD-10 to be useful for research, you need these rare events coded correctly. And, with the endless array of options, the chances of this happening seem, to me, vanishingly small. Maybe there’s a good technical solution, where an EMR scans the HPI and offers a variety of appropriate billing codes (writing “chicken coop” should be a dead giveaway). Validating and implementing this for all 155,000 codes is, however, a monumental task at best. Such an undertaking can (and should) be done by those who created the codes in the first place. Unfortunately, something tells me they tend to be less than thorough.
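
For what it’s worth, the kernel of the “dead giveaway” idea is easy to sketch; the hard part is exactly the scale and validation above. The phrase-to-code table below is invented for illustration, and the mappings would all need clinical validation:

```python
# Hypothetical sketch of an HPI scanner that suggests external-cause codes
# from trigger phrases. The phrase -> code table is illustrative only; a
# real system would need the full, validated ICD-10 code set.
import re

SUGGESTIONS = {
    r"chicken coop":              "Y92.72 Chicken coop as place of occurrence",
    r"fell (off|from) .*bicycle": "V18 Pedal cycle rider, noncollision accident",
    r"water.?skis? .*fire":       "V91.07 Burn due to water-skis on fire",
}

def suggest_codes(hpi_text):
    """Return candidate codes whose trigger phrase appears in the HPI."""
    return [code for pattern, code in SUGGESTIONS.items()
            if re.search(pattern, hpi_text, flags=re.IGNORECASE)]

# Example:
print(suggest_codes("Pt fell while repairing the roof of a chicken coop."))
# -> ['Y92.72 Chicken coop as place of occurrence']
```

Keyword matching like this would surface candidates for the clinician to confirm, not assign codes automatically; even so, maintaining and validating patterns across the whole code set is the monumental task in question.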
