December 21, 2006
'Wild Harmonies: A Life of Music and Wolves' — by Hélène Grimaud
I finished her above-titled autobiography last evening.
Ms. Grimaud is the "Wild Child" of classical music, having broken all the rules and thumbed her nose at the French music establishment as a teenager to take her chances her way.
The video above, part of a master class given by her, is in her native French and subtitled in German, so if you're not adept in either of those languages you'll simply have to make do with the music, which happens to be the piano part from the Schumann violin sonata.
I enjoyed the book, which wanders hither and yon through wolf lore and history, fable and fact, as Ms. Grimaud recounts a young life half earthbound, half fantastic, with endless internal questioning and depression mitigated by music's blessing.
It took a chance encounter with a wolf in the early 1990s in Tallahassee, Florida, where she'd moved from Paris in her early 20s, to open the way to her spiritual flowering.
Most interesting and sui generis.
bookofjoe goes to China
You could look it up.
[via a Chinese reader who asked to remain anonymous]
World's Longest Tongue*
Henry Fountain described the magnificent tongue of the Andean bat Anoura fistulata in a story that appeared in the December 19, 2006 New York Times Science section.
Long story (and tongue) short: The bat's tongue (above, unfurled to imbibe from a test tube filled with sugar water and below, taking nectar from a flower) is one and a half times its body length.
The article follows.
- For an Andean Nectar Feeder, a Tongue That Wags the Bat
Imagine your tongue was nine feet long.
That’s about how long it would be if you were a certain kind of nectar-feeding bat.
The bat, Anoura fistulata, found in the Ecuadorean Andes, has a tongue one and a half times its body length. For the 2-inch-plus bat, that comes out to a 3.3-inch tongue. Proportionally, this is the longest tongue of any mammal, and the second longest (next to the chameleon) among vertebrates.
Most nectar-feeding bats have long, slim tongues, the better to reach the sweet liquid contained in the flowers they pollinate. But Nathan Muchhala, a University of Miami doctoral student who has studied these bats, known as glossophagines, said he noticed something different about A. fistulata, which he first described as a new species last year.
“Its tongue seemed longer — about twice as long,” he said. “At that point I knew something was up.” A report on the record tongue was published in the journal Nature.
Glossophagines have a retractor muscle in the chest, but the tongue itself (which is pumped full of blood to extend it) begins at the base of the oral cavity. A. fistulata, however, keeps the actual tongue in its chest: it begins between the heart and sternum.
Mr. Muchhala wondered why this particular bat’s tongue was so long, and found the answer in his earlier studies in the Andes of the types of flowers that these and other nectar-feeding bats were visiting. He remembered a particular flower, Centropogon nigricans, with a nectar-containing neck, called a corolla, that is the same length as the tongue.
“Once I found out the tongue was so long, I started to connect it to that,” he said.
Studies of this and other bats showed that while they all carried pollen from many flower species, only A. fistulata was carrying this pollen from C. nigricans. And the flower was found only within the bat’s range on the western slopes of the Andes. All of this points to the possibility that the bat’s tongue and the flower evolved together.
“It’s still a little dicey issue,” Mr. Muchhala said. “I can’t really prove it.” But it seems a likely explanation for the similar length of tongue and corolla.
Brock Fenton, a bat specialist at the University of Western Ontario who was not involved in the research, said the long tongue was “an astonishing adaptation,” but not totally surprising given what scientists are learning about bats.
“We’re going to end up with more and more examples of this kind of specialization,” he said.
Here's a link to the abstract of the Nature article, which follows.
- Nectar bat stows huge tongue in its rib cage
Bats of the subfamily Glossophaginae (family Phyllostomidae) are arguably the most specialized of mammalian nectarivores, and hundreds of neotropical plants rely on them for pollination. But flowers pollinated by bats are not known to specialize for bat subgroups (unlike flowers that have adapted to the length and curvature of hummingbird bills, for example), possibly because the mouthparts of bats do not vary much compared with the bills of birds or the probosces of insects. Here I report a spectacular exception: a recently described nectar bat that can extend its tongue twice as far as those of related bats and is the sole pollinator of a plant with corolla tubes of matching length.
Voodoo Word — by Mike Larsson
Gareth described it as "an electrified plushy that spits out nastygrams onto a computer screen whenever you stab it with a straight pin."
Prediction: Larsson's gonna need a truck to carry his money to the bank once he starts selling these.
'Dead Enough?: The Paradox of Brain Death' — by Pauline W. Chen
This past Tuesday, December 19, she emailed me not to threaten me with legal action for posting it but, rather, to thank me for doing so.
That's a refreshing change.
But I digress.
She was kind enough to call my attention to an earlier essay of hers — "Dead Enough?: The Paradox of Brain Death" — which appeared in the Fall 2005 issue of the Virginia Quarterly Review.
That piece went on to become a co-winner of the 2005 Staige D. Blackford Prize in Nonfiction and a finalist for a 2006 National Magazine Award.
- Dead Enough?: The Paradox of Brain Death
A perfect liver on ice has a nearly radiant sheen, a wet brown anterior so flawless that it reflects light. The broad, curved surfaces meet in sharp juxtaposition at the periphery, creating fine edges that would slice your fingertips if they were made of steel and not hepatocytes.
The surgeon lifts the organ out of the basin. It nestles into her cupped palms, the edges curling over her gloved fingers. She lowers it gently into the nook created by the flapping right diaphragm and the curved scaffolding of the rib cage. With bright blue sutures, she stitches the infrahepatic vena cava, the suprahepatic vena cava, and the portal vein, and then releases three vascular clamps in quick succession. Dammed blood bursts through the newly sewn connections and fills the liver like a balloon, changing its color to a deep plum. The organ hardens for a few minutes, shocked by the sudden ingress of warm blood; but once the oxygenated blood of the hepatic artery pulsates in, the liver begins to soften again.
A few minutes later, the surgeon raises the large, floppy front surface of the liver with the backside of her left hand. A tangle of vessels now rigid with blood emerges from a central port deep within the liver’s parenchyma. Just to the right, almost lost within the fat and folds of liver, is a tiny, pencil-thin duct. The surgeon, with her most delicate of forceps, grasps the edge of this duct and holds it up to view. The hollow is nearly invisible, but concentrating hard enough, you begin to see it. Pushing forth from the darkness and coalescing along the edge is the first sign of the liver’s function: a single, golden droplet of bile.
Livers like these are what drive the Lazarean awakenings of liver transplant patients. Their divine perfection reverses death-like comas, arrests the voluminous flows of mortal hemorrhage, and restores life force in patients who are all but dead. The more flawless these livers are, the more miraculously they work, their quality determined by, as much as anything else, the state of the donor. Small, seemingly insignificant differences in donors—a systolic blood pressure of 60 rather than 90, a cardiac arrest of 20 minutes instead of 10, a blood pH of 7.0 and not 7.4—determine which recipients will emerge from end-stage liver disease and which ones will die on the operating room table. Whether you are transplanting a liver or a heart or a kidney or a pancreas, the better the donor is, the better your patient’s outcome will be. And the very best of those donors is one who, when under the knife, is as close to alive as possible.
* * * *
I am a transplant and cancer surgeon and in my office, stashed among folders containing notes and old operative reports from my residency, two fellowships, and practice, is a 9 × 12 manila envelope that bulges with small white stickers. Each sticker bears the name and age of one of the 150 brain-dead patients whose organs I have procured.
People in my business refer to these donors as “cadaveric,” as much to emphasize their deceased status as to differentiate them from the living relatives and friends who give a kidney, a lung, or a part of a liver. Cadaveric donors make up the majority of organ donors in this country and are, as the adjective indicates, dead. Like their counterparts in hospital morgues, they possess all the official paperwork necessary to prove their status, including the appropriate certificates and scrawled physician notes that detail the time and date of death.
There is one notable difference between cadaveric donors and their mortuary peers, however: the brain-dead hardly seem dead. They are warm, with skin that seems capable of blushing. Their chests rise and fall so convincingly with each breath that the ventilator at the head of the bed seems like superfluous machinery rather than the source of the air that fills their lungs. If you look closely at the clear tubing that drains their bladders, you see droplets of warm condensation clinging to the inside walls from fresh streams of body-temperature urine.
After the first 20 times or so, harvesting organs becomes routine in the way that driving a car can be. Like a living patient, a brain-dead donor who has scarring from previous operations, extra layers of fat, or some anatomic variation of the norm can be difficult to work on. But the procedure is generally the same. The determination of brain death, usually made by the patient’s physicians, is a fait accompli by the time the procurement begins; and the donor will remain connected to a ventilator and life-supporting medications until you have opened the belly and chest and dissected out the organs and important blood vessels. When everyone in the room is ready, you shut a vascular clamp around the aorta and ask the anesthesiologists to disconnect the ventilator. You pack sterile, crushed ice quickly into body cavities to decrease any damage to the organs, and your assistant scissors across the vena cava, letting the blood run through suction tubing into wastebasket-sized, clear plastic canisters on the floor of the operating room. As the preservation fluid begins to fill the now emptying blood vessels and organs, you stand back for a few moments and watch the heart contract, writhe, and then fall absolutely still.
One winter about 15 years ago, long before I had any idea of what kind of doctor I would become, I came home from medical school wearing a fist-sized blue and white button on my coat lapel; it proclaimed that I was not taking my organs to heaven. When I walked through the door, my mother, who was sitting at her usual seat in the kitchen, looked me over for a moment and then frowned. I thought it was the coat, a used man’s black overcoat; instead, she pointed to the button. That weekend she badgered me incessantly; the button had obviously perturbed more than my mother’s sense of fashion propriety. “How can you be sure,” she asked me over and over, “that the doctors who want your organs will know you are dead enough?”
At the time, with only rudimentary medical education, I was not really able to answer her question, so I laughed instead and told her she was being irrational. To emphasize my point, I kept the button on my coat for several months, or at least until my next visit back home. But in truth, I had found my mother’s question unsettling.
When I am in social situations nowadays, I see people’s smiles freeze when they discover my profession. The lighthearted chatter stops, and their eyes become nystagmic, as if they are searching in the air around me for something to say. The seasoned socialites recover quickly and begin some polite discussion about a distant sick relative or friend or about a news story concerning organ transplants. Eventually, though, all of these conversations will arrive at the same point, as if the inner ruminations of every stranger I have met converge on a single thought. When they finally open their mouths to ask that familiar question, I feel for a moment that I am a medical student once again who is wearing the coat with the button and facing her mother in the kitchen. “What about you?” these strangers will say. “Have you signed your donor card?”
* * * *
On December 2, 1967, in Cape Town, South Africa, 25-year-old Denise Ann Darvall and her mother stopped by a bakery to pick up a birthday cake. As they made their way back across the street to their car, a truck hit them, killing the mother instantly. Denise, who had some residual signs of life, was rushed to Groote Schuur Hospital just up the road, where she was diagnosed with severe brain injury. Within a day of the accident, Dr. Christiaan Barnard, a young cardiothoracic surgeon who had set his sights on performing the first human-to-human heart transplant in the world, declared Darvall's death, procured her heart with her father's consent, and then stitched it into 55-year-old South African grocer Louis Washkansky.
Almost immediately, newspaper editorials began to question this surgical switching of parts, noting that a corpse had to be “cannibalized” for the living and that the donor’s heart continued to function although she was dead. One health official at the time said, “I have a horrible vision of ghouls hovering over an accident victim with long knives unsheathed, waiting to take out his organs as soon as he is pronounced dead.” The apparent success of this transplant had created a new moral dilemma: would doctors rush to declare one person dead in order to save another?
A year later, in an attempt to address the difficult implications of organ procurement, Dr. Robert Ebert, then dean at Harvard’s medical school, and Dr. Henry Beecher, a Harvard anesthesiologist well known for his critical work on the exploitation of human research subjects, formed an ad hoc committee to study the “hopelessly unconscious patient.” The committee published their report, “A Definition of Irreversible Coma,” in 1968 in the Journal of the American Medical Association.
At first glance, the report simply describes characteristics of what would soon be known as “brain death.” What is odd, however, given the controversy of the times and the sizable body of research that already existed on brain death, is that the report includes precious few references to either. Instead, the connection between brain function and death is described as if it were a well-accepted observation.
By turning the link between brain function and death into a given, the Harvard committee transformed a disputable academic and philosophical issue into an objective, incontrovertible, medical fact. Publication in one of the nation’s most prestigious, peer-reviewed, medical journals made defining death according to brain function morally acceptable; and the imprimatur of Harvard secured that transformation. The Harvard report remains one of the most frequently cited papers on brain death.
In the decade that followed publication of the report, there were fewer arguments over the ethics of defining death based on brain function and more about the criteria for determining brain death. The debate resulted in a dazzling array of options. One medical reviewer counted as many as 30 different recommendations by as many official medical societies. Some said brain death involved only the brain stem, which controls basic physical functions like breathing, swallowing, and sleep-wake cycles. Others cautioned that such a limitation could lead to defining as dead those patients who were “awake” inside but who had damaged brain stem function. They cited as an example patients with “locked in” syndrome, who are fully conscious and able to feel pain but can only blink. These groups believed instead that the brain stem and the cortex, the part of the brain that is responsible for abstract thought, memory, and self-awareness, had to be irreparably damaged for a patient to be diagnosed as brain-dead.
In 1981, the President’s Commission for the Study of Ethical Problems in Medicine and Biomedical and Behavioral Research proposed the concept of “whole brain death,” defining the moment of death as the point at which the brain had irreversibly lost total function. With this definition in hand, the commission drafted the Uniform Determination of Death Act, a piece of legislation that was immediately supported by both the American Medical Association and the American Bar Association and eventually adopted by most state legislatures.
In the nearly 25 years since the commission’s report, diagnosing brain death has become relatively straightforward based on three criteria. First, the patient must be in a severe coma not caused by a potentially reversible condition such as hypothermia. Second, brain stem reflexes, tested with various bedside exams, must be absent. Third, the patient, once disconnected from the ventilator, must make no spontaneous attempts to breathe and must have blood tests with corroborating elevations in carbon dioxide. While the real-life application of these criteria can be complicated by human variation, the results obtained are easily replicated when performed by experienced physicians. In the last two decades, then, it has become possible to diagnose brain death with unfailing accuracy.
* * * *
There is a belief in medicine that the pathologist has the final say and that in death there is truth. The truth in brain death is mind-boggling. While normal brain tissue is firm, a brain that has been dead shows progressive autolysis, a form of biological self-destruction. “It will almost be like soup,” Dr. Harry Vinters, chief of neuropathology at UCLA, recently explained to me. He is the co-author of a major textbook on the pathology of the brain and has performed almost a hundred autopsies on the brain-dead. “It really depends on how long they have been on the ventilator. If they have been on the ventilator for two days, then the brain is grey and softened. But if, for example, a family has had difficulty deciding what they want to do and the patient has been kept on the ventilator for two to three weeks, then there’s tremendous autolysis. The brain gets very swollen, soft, and mushy.” The nerve tissue can become so friable that fragments of brain from the head will break off and float down the spinal column. “Sometimes I’ll be looking at a slide of the spinal cord,” says Vinters, “and I’ll see fragments of cerebellum floating around in the specimen.”
I think about this finality of brain death when strangers ask me about my donor card. I also think about the transplant recipients that donor organs have saved. “Of course I have signed my card,” I will tell them. I have no qualms about giving my organs away.
The question that no one has ever asked me, however, is what I would do if I had to decide about donation for someone else in my family.
One of the most difficult things a doctor can do is to tell a family that the seemingly living body in front of them is actually dead. One trauma surgeon recently commented to me, “It’s always been hard for me to explain to a family that the brain-dead loved one is indeed dead, even though the EKG is beeping, the chest is going up and down, and the patient looks pink and asleep. The difficulty comes when the family asks for ‘one more day’ or to wait until an uncle arrives from out of town.” Families wonder if there has been a mistake, or if the patient is merely in some kind of temporary comatose state and, like a computer in standby mode, could come back to life any moment with a touch on the right button.
I have to admit that despite all I know about brain death, I still have my moments of uncertainty. More than once, when I have pulled my scalpel across the warm, pliable skin of a donor and seen the exuberant reds of well-oxygenated bleeding, my mother’s old nagging doubts have insinuated themselves into my forebrain. I have found myself thinking about the donors’ lives and asking the nurses who have met their families what they were like. As I push aside the still contracting intestines and inadvertently brush my hands against the remnants of the previous day’s meal within, I cannot help but think about that last meal and whether the donors and the people they were with had any inkling of the near future. And as I finally watch the last frenzied energy of their hearts dissipate into stillness, I am sorry that I am the one to witness the coming of that eternal silence and that those family members had to walk away from their loved one for the last time with the rhythmic bleats of the heart monitor still ringing in their ears.
More than once, as I have procured organs, I have had my doubts. But it was not that, like my mother, I was afraid that these donors were not dead enough; it was that I regretted having to keep them so alive.
Dr. Chen's book, "Final Exam — A Surgeon's Reflections on Mortality" (top), will be published by Knopf next month.
Hey, joe — Where'd all the gadgets and stuff go?
Not to fret.
I too have noticed the past few days' predominance here of news-related items and concomitant dearth of ruby chopstick holders and their ilk.
That's because with Christmas now just around the corner — what, four days from today — the zillions of companies that send me their catalogs, from which I occasionally extract a nugget amidst the dross, have ceased and desisted, figuring that everyone who needed one or ten catalogs had by now received them and placed their orders in time for holiday delivery.
I expect the flood to resume once the new year begins.
In the meantime I'll just spackle and patch and make do.
Durian Durian — Best new band of 2006*
In my Bizarro World* dreams.
The King will never die.
The photo below
was taken at Changi Airport in Singapore.
- What is a Durian?
Well... simply put, Durian is a fruit: a big, green, thorny fruit. But wait, it is not just another exotic and expensive fruit from Southeast Asia. In fact, it is considered "King of the Fruit" throughout the region. Personally, I think that is the understatement of the millennium, since we Asians are humble people. In fact, the actual status of Durian is "THE GOD OF ALL FRUIT!" No kidding! Yeah, yeah, I know, some (unadventurous) people would rather die than smell the STINK of a durian. Heck, a few countries even BAN the presence of durian in selected public spots due to its offensive smell (that, to me, is a hopeless rule, just like outlawing farting when we know it is such a pleasure and EVERYBODY secretly does it!). Look here, I could go on and on, but let's hear it from non-biased sources, okay?
The photo below,
of a sign present in most SMRTs (Singapore Mass Rapid Transit passenger compartments), was taken by Steve Bennett.
'The Blog Mob' — by Joseph Rago
Rago is the assistant editorial features editor at the Wall Street Journal.
He came out from behind his editorial desk yesterday swinging, delivering a no-holds-barred editorial page salvo: blow after well-aimed, punishing blow that hit home, sent me reeling back against the ropes, and then, with yet another series of well-aimed punches, staggered me and put me on the canvas for the count.
I briefly considered throwing in the towel for good at that point, but as you can see I managed to get my battered psyche up off the canvas and limp back to the dressing room, where my crack research team will see if they can patch me up well enough to come out swinging in the future.
Here's Rago's superb takedown of bloggers and all they purport and/or wish to be.
- The Blog Mob
Blogs are very important these days. Even Mahmoud Ahmadinejad has one. The invention of the Web log, we are told, is as transformative as Gutenberg's press, and has shoved journalism into a reformation, perhaps a revolution.
The ascendancy of Internet technology did bring with it innovations. Information is more conveniently disseminated, and there's more of it, because anybody can chip in. There's more "choice" — and in a sense, more democracy. Folks on the WWW, conservatives especially, boast about how the alternative media corrodes the "MSM," for mainstream media, a term redolent with unfairness and elitism.
The blogs are not as significant as their self-endeared curators would like to think. Journalism requires journalists, who are at least fitfully confronting the digital age. The bloggers, for their part, produce minimal reportage. Instead, they ride along with the MSM like remora fish on the bellies of sharks, picking at the scraps.
More success is met in purveying opinion and comment. Some critics reproach the blogs for the coarsening and increasing volatility of political life. Blogs, they say, tend to disinhibit. Maybe so. But politics weren't much rarefied when Andrew Jackson was president, either. The larger problem with blogs, it seems to me, is quality. Most of them are pretty awful. Many, even some with large followings, are downright appalling.
Every conceivable belief is on the scene, but the collective prose, by and large, is homogeneous: A tone of careless informality prevails; posts oscillate between the uselessly brief and the uselessly logorrheic; complexity and complication are eschewed; the humor is cringe-making, with irony present only in its conspicuous absence; arguments are solipsistic; writers traffic more in pronouncement than persuasion . . .
The way we write affects both style and substance. The loquacious formulations of late Henry James, for instance, owe in part to his arthritis, which made longhand impossible and led him to dictate his writing to a secretary. In this aspect, journalism as practiced via blog appears to be a change for the worse. That is, the inferiority of the medium is rooted in its new, distinctive literary form. Its closest analogue might be the (poorly kept) diary or commonplace book, or the note scrawled to oneself on the back of an envelope — though these things are not meant for public consumption. The reason for a blog's being is: Here's my opinion, right now.
The right now is partially a function of technology, which makes instantaneity possible, and also a function of a culture that valorizes the up-to-the-minute above all else. But there is no inherent virtue to instantaneity. Traditional daily reporting — the news — already rushes ahead at a pretty good clip, breakneck even, and suffers for it. On the Internet all this is accelerated.
The blogs must be timely if they are to influence politics. This element — here's my opinion — is necessarily modified and partly determined by the right now. Instant response, with not even a day of delay, impairs rigor. It is also a coagulant for orthodoxies. We rarely encounter sustained or systematic blog thought — instead, panics and manias; endless rehearsings of arguments put forward elsewhere; and a tendency to substitute ideology for cognition. The participatory Internet, in combination with the hyperlink, which allows sites to interrelate, appears to encourage mobs and mob behavior.
This cross-referential and interactive arrangement, in theory, should allow for some resolution to divisive issues, with the market sorting out the vagaries of individual analysis. Not in practice. The Internet is very good at connecting and isolating people who are in agreement, not so good at engaging those who aren't. The petty interpolitical feuding mainly points out that someone is a liar or an idiot or both.
Because political blogs are predictable, they are excruciatingly boring. More acutely, they promote intellectual disingenuousness, with every constituency hostage to its assumptions and the party line. Thus the right-leaning blogs exhaustively pursue second-order distractions — John Kerry always providing useful material — while leaving underexamined more fundamental issues, say, Iraq. Conservatives have long taken it as self-evident that the press unfavorably distorts the war, which may be the case; but today that country is a vastation, and the unified field theory of media bias has not been altered one jot.
Leftward fatuities too are easily found: The fatuity matters more than the politics. If the blogs have enthusiastically endorsed Joseph Conrad's judgment of newspapering — "written by fools to be read by imbeciles" — they have also demonstrated a remarkable ecumenicalism in filling out that same role themselves.
Nobody wants to be an imbecile. Part of it, I think, is that everyone likes shows and entertainments. Mobs are exciting. People also like validation of what they already believe; the Internet, like all free markets, has a way of gratifying the mediocrity of the masses. And part of it, especially in politics, has to do with conservatives. In their frustration with the ancien régime, conservatives quite eagerly traded for an enlarged discourse. In the process they created a counterestablishment, one that has adopted the same reductive habits they used to complain about. The quarrel over one discrete set of standards did a lot to pull down the very idea of standards.
Certainly the MSM, such as it is, collapsed itself. It was once utterly dominant yet made itself vulnerable by playing on its reputed accuracy and disinterest to pursue adversarial agendas. Still, as far from perfect as that system was, it was and is not wholly imperfect. The technology of ink on paper is highly advanced, and has over centuries accumulated a major institutional culture that screens editorially for originality, expertise and seriousness.
Of course, once a technosocial force like the blog is loosed on the world, it does not go away because some find it undesirable. So grieving over the lost establishment is pointless, and kind of sad. But democracy does not work well, so to speak, without checks and balances. And in acceding so easily to the imperatives of the Internet, we've allowed decay to pass for progress.
It's the verbal equivalent of smelling salts: gets your attention in a hurry — at least, if you're me it does.
"Written by fools to be read by imbeciles" — ouch.
Joseph Conrad sure had a way with words, what?