Ga-ga now and then

DAB Matula Thoughts June 7, 2019


2172 words

[Above: Nesbit reception at 2019 AUA Annual Meeting in Chicago. Ice sculpture.]

 

One.             

Senior medical students are getting ready this month for the next big stage in their lives and careers, just as I did in June of 1971, heading west from Buffalo to Los Angeles to start nine years of training at UCLA. I don’t recall much of the drive along the evolving interstate highway system, a vision of President Eisenhower only 20 years earlier, but the exhilaration of beginning something totally new with surgical residency under William P. Longmire certainly dominated my thoughts on the road. The conjoined intellectual and physical capabilities of surgery as a profession excited me. The first day of internship, in line to check in, I met fellow intern Doug McConnell and quickly befriended John Cook, Erick Albert, Ed Pritchett, Ron Busuttil, Arnie Brody, John Kaswick, Dave Confer, and the rest of our 18 at the bottom of the UCLA training pyramid. Over the five-year process, we learned the knowledge base, skills, and professionalism of surgery through experience, teaching, study, and role models. In the blink of an eye 1971 has become 2019, and suddenly I’m near the end of my career.

Reading Arrowsmith and the recent story of the Theranos debacle in John Carreyrou’s Bad Blood, I saw those protagonists wanting to change the world. My hopes in 1971 were not so grand; I just wanted to find my own relevance and hoped to become good in my career. Most people similarly want to make their mark in one way or another, through job, family, art, or community. Some, however, actually intend to change the larger world, although their idea of “change” may be someone else’s deformation.

Last month a large cohort of our University of Michigan urology residents, faculty, nurses, PAs, and staff met in Chicago at the annual AUA national meeting to learn, teach, exchange ideas, network, enjoy reunion, and circulate word of our new chair Ganesh Palapattu. The Michigan brand was strong with hundreds of presentations from our faculty, residents, and alumni. The MUSIC and Nesbit Alumni sessions were great gathering points. [Below: UM podium events with alumni Cheryl Lee, Jens Sønksen, Barry Kogan, and Julian Wan.]

Cheryl has been back in Ann Arbor this week as visiting professor.

A group of our residents and one incipient PGY1 were ga-ga at the AUA Museum booth. [Below in front: Juan Andino, Catherine Nam; back row: Adam Cole, Scott Hawken, Rita Jen, Ella Doerge, senior faculty member, Colton Walker, Matt Lee, Kyle Johnson, Udit Singhal.]

 

Two.

The word surgery derives from the Greek kheirourgos, “working by hand”; the term moved through Latin, Old French, and Anglo-French to become surgien in the 13th century. The epicenter of that world was the doctor/patient duality, based on an essential transaction as old as humanity, with exchange of information, discovery of needs, and provision of remedies and skills. The knowledge base and tools are far better since Hippocratic times, but the professional ideals are much the same. It seemed pretty awesome to my 21-year-old self that I might one day be able to fix things with my hands like Drs. Longmire and Rick Fonkalsrud. History mattered to our UCLA professors, who insisted that trainees know the back stories of each disorder and treatment.

New interns arriving next month, called PGY1s for their postgraduate year status, may have parallel thoughts to those of mine 48 years ago as they start their journeys. Pyramidal training models no longer exist – PGY1s can reasonably expect to complete their programs. Their experiences will be replete with contemporary expectations, notably patient safety, value propositions, clinical outcome assessments, co-morbidities, social determinants of disease, personal well-being, attention to patient experience, and teamwork with diversity, equity, and inclusion. Acronyms have proliferated, tools are more powerful, and regulation grows more burdensome. Nevertheless, essential transactions remain at the center of health care with needs of patients addressed by the knowledge, skills, and kindness of healthcare providers, one patient and one provider at a time.

While taking pride in the labels doctor, physician, surgeon, nurse, and physician assistant, we realize now that teams of providers with many types of expertise congregate around each single patient, whether immediately and physically as “bedside teams” (in clinics as well), sequentially, or virtually (with office staff, coders, laboratories, or electronically). Teams offer exquisitely specialized expertise and the “wisdom of crowds,” although patients often find no single person in charge of their care.

 

Three.

Patient safety was a given when I was a resident, wrapped up in regular Morbidity and Mortality conferences without explicit use of the phrase. Around that time a young graduate student in sociology, Charles Bosk, embedded himself in an academic surgical team for 18 months to discover how surgery was learned, practiced, and lived at an unnamed “Pacific Hospital.” The result was his 1979 book, Forgive and Remember: Managing Medical Failure. Bob Bartlett, my friend and colleague in the Surgery Department, introduced me to it a few years later. A second edition in 2003 was reviewed by Williamson. [Williamson R. J Royal Soc Med. 97(3):147-148, 2004.]

Patient safety has grown since my internship from an obvious but unarticulated expectation to a distinct field of study modeled after other industries, notably aviation. Health care has learned much from other professions, such as the concepts of safety culture, standardization of procedures, and checklists, although health care is more multidimensional and nuanced than those other worlds. Bosk recently reflected on health care’s exceptionality in a Lancet article, “Blind spots in the science of safety,” written with Kirstine Pedersen, concluding:

“There is a science of safety to reduce preventable adverse outcomes. But health care also has an irreducibly relational, experiential, and normative element that remains opaque to safety science. The contribution of a kind and reassuring word; a well delivered and appropriately timed disclosure of a bad diagnosis; or an experience-based evaluation of a small but important change in a patient’s condition – all are difficult, if not impossible to capture in a performance metric. Accomplishing safety and avoiding harm depend on discretion, effective teamwork, and local knowledge of how things work in specific clinical settings. Finally, the successful practice of a science of safety presupposes in theory what is most difficult to achieve in practice: a stable functioning team capable of wisely adapting general guidelines to specific cases.” [Bosk CL, Pedersen KZ, “Blind spots in the science of safety.” The Lancet 393:978-979, 2019.]

 

Four.

The Michigan Urology Centennial is nearly here and the process of writing our departmental history has elicited many names and stories. Bookends demarcating any era may be discretionary choices and our starting point could easily be debated. Perhaps the first “urologic” procedure of Moses Gunn initiated this specialty at Michigan in the 1850s, or the first faculty appointments with the term lecturer on genitourinary surgery, held by Cyrenus Darling (1902), or clinical professor of genitourinary surgery, held by Ira Dean Loree (1907), might qualify. Unquestionably, though, the arrival of Hugh Cabot in the autumn of 1919 brought modern urology with its academic components to the University of Michigan. Cabot was the first to use the 20th-century terminology, urology, at UM and he was Michigan’s celebrity in the field. He literally brought Modern Urology to Ann Arbor, as that was the name of his two-volume state-of-the-art textbook of 1918, repeated in a second edition in 1924. Cabot probably didn’t anticipate becoming Medical School dean when he left Boston two years earlier, but his advancement was hardly accidental. A number of other prominent faculty members were well-positioned to replace Dean Victor Vaughan, but Cabot played his political cards well and won the job.

Frederick George Novy (1864-1957) was the strongest competitor. Born and raised in Chicago, Novy obtained a B.S. in chemistry from the University of Michigan in 1886. His master’s thesis was “Cocaine and its derivatives” in 1887. Teaching bacteriology as an instructor, his Ph.D. thesis in 1890 was “The toxic products of the bacillus of hog cholera.” After an M.D. in 1891 he followed in the footsteps of his teacher Victor Vaughan as assistant professor of hygiene and physiological chemistry. Visiting key European centers in 1894 and 1897, Novy brought state-of-the-art bacteriology to Ann Arbor, rising to full professor in 1904 and first chair of the Department of Bacteriology. His studies of trypanosomes and spirochetes, laboratory culture techniques, anaerobic organisms, and the tubercle bacillus were widely respected. Our colleague Powel Kazanjian wrote a first-rate book on Novy.

 

Five.

Paul de Kruif (1890-1971), one of Novy’s students, bears particular mention. [Above: de Kruif, courtesy Bentley Library.]  de Kruif came from Zeeland, Michigan, to Ann Arbor for a bachelor’s degree in 1912 and then a Ph.D. in 1916. He joined the U.S. Mexican Expedition (“the Pancho Villa Expedition”) against Mexican revolutionary paramilitary forces in 1916 and 1917, then saw service in France with the Sanitary Corps, investigating the gas gangrene prevalent in the trenches of WWI. de Kruif returned to Michigan as assistant professor in 1919 working in Novy’s laboratory, publishing a paper on streptococci and complement activation.

Novy helped de Kruif secure a prestigious position at the Rockefeller Institute in 1920, to study mechanisms of respiratory infection. While there de Kruif wrote an anonymous chapter on modern medicine for Harold Stearns’s Civilization in 1922. The 34 chapters were mainly written by prominent authors, including H.L. Mencken, Ring Lardner, and Lewis Mumford, so how de Kruif, a young bacteriologist (and non-physician), came to be included in this compilation is a mystery. de Kruif’s 14-page chapter, however, caused the biggest stir, skewering contemporary medical practice and doctors for “a mélange of religious ritual, more or less accurate folk-lore, and commercial cunning.” de Kruif viewed medical practice as unscientific “medical Ga-Ga-ism,” but his article was sophomoric at best.

Once de Kruif was revealed as the author, the Rockefeller Institute fired him in September 1922. The newly unemployed bacteriologist came in contact with a newly prominent author, Sinclair Lewis (1885-1951), praised for Main Street (1920) and Babbitt (1922). Lewis was ready for his next novel and two friends, Morris Fishbein and H.L. Mencken, persuaded him to focus on medical research. Lewis, son and grandson of physicians, knew little of medical research, so Fishbein, editor of JAMA, connected Lewis to de Kruif. A bond and collaboration ensued for Arrowsmith (1925), in which a central character, Max Gottlieb, was modeled on Novy. Lewis gave de Kruif 25% of the royalties for the collaboration, but held back on sharing authorship, claiming that it might hurt sales. At the time de Kruif thought his share generous, but later became somewhat embittered as book sales soared with Lewis as sole author. [Henig RM. The life and legacy of Paul de Kruif. Alicia Patterson Foundation.]

Arrowsmith was selected for the 1926 Pulitzer Prize, but Lewis refused the $1,000 award, explaining his refusal in a letter to the Pulitzer Committee:

“… I invite other writers to consider the fact that by accepting the prizes and approval of these vague institutions we are admitting their authority, publicly confirming them as the final judges of literary excellence, and I inquire whether any prize is worth that subservience.”

Four years later, however, Lewis accepted the $46,350 Nobel Prize. His Nobel lecture was “The American Fear of Literature.”

Leaving the lab behind, de Kruif became a full-time science writer, one of the first in that new genre of journalism. His Microbe Hunters, published in 1926, became a classic and inspired me when I read it as an early teenager, unaware of the controversies around it. [Chernin E. “Paul de Kruif’s Microbe Hunters and an outraged Ronald Ross.” Rev Infec Dis. 10(3):661-667, 1988.] Arrowsmith was re-published in 2001 by Classics of Medicine Library and Michigan’s Howard Markel provided the introduction. [Markel H. “Prescribing Arrowsmith.”]

 

Ga-ga notes

de Kruif’s adjective ga-ga for American medicine in the 1920s was intended to mean foolish, infatuated, or wildly enthusiastic. It can also denote someone no longer in possession of full mental faculties, or a dotard. (Dotard recently came into play in the peculiar rhetoric of the North Korean and American leaders.) The ga-ga origin may be early 20th-century French for a senile person, based on gâteux, a variant of gâteur and hospital slang for “bed-wetter.” Gâteau, of course, is also French for “cake,” and gâteaux is its plural. de Kruif himself was negatively ga-ga with his criticism of medical specialism. Lady Gaga brings the term to a new level of consciousness and a new generation.

The past week was big on three continents for those who go ga-ga over historic anniversaries. Two hundred years ago, on 31 May 1819, Walt Whitman was born on Long Island. His Leaves of Grass, among much else, had the intriguing phrase “I am large, I contain multitudes,” a prescient reminder of our cellular basis, microbiome, or the plethora of information that leads to TMI (“too much information”) or burnout. Seventy-five years ago, on 4 June 1944, Operation Overlord at Normandy, France, initiated the Allied invasion of Nazi-occupied Europe. Thirty years ago, on 4 June 1989, protests in a large city square between the Forbidden City and the Mausoleum of Mao Zedong turned violent and are now referred to as the June Fourth Incident in the People’s Republic of China.

 

David A. Bloom

University of Michigan, Department of Urology, Ann Arbor

Matula Thoughts  May 3, 2019

Sensations

 

2180 words: twenty minutes to read, five to skim, or seconds to delete if TMI.

 

Appreciation. Leonardo da Vinci reverberates strongly, even five hundred years after his death on 2 May 1519. The Lancet commemorated yesterday’s anniversary with a cover picture of that great polymath, who encompassed astonishing ideas, insights, and talents, leaving for posterity a multitude of works that amaze and delight. Anatomy, physiology, engineering, and visual art are just a few of the intellectual arenas his senses explored and his hands worked in. Walt Whitman later wrote that we “contain multitudes…,” and you can fill in what multitudes in particular might follow: atoms, cells, thoughts, physical creations, emotions, or other possibilities. da Vinci exemplified, better than most of us, that human potential for trying to make sense of the world.

 

One.             


Spring hits our senses. We can’t easily describe in words the perfumes of flowers or the pleasant rich scent of mulch, but we surely know them. Odors are important sensory inputs, although we don’t usually notice them much, as they are less important to us than to most other creatures. [Above: azaleas, spring 2019.]

Dogs, for example, discern far more olfactory notes than we do, and that is probably a good thing. Dogs sequester significant cerebral space and energy for distinctions of specific urine scents or fecal aromas to understand who is in the neighborhood, skills essential to millennia of canine culture, while humans have found other ways to evaluate their fellows and territories. [Below: Molly’s spring inspection.]


We surely would be confused by having to keep track of hundreds of scent variations. In fact, even a small amount of effluent odor from one of our neighbors is generally regarded as too much information. [Below: mulch delivery at Smithsonian Institution, Spring 2019.]


Smell used to be important in medical diagnosis. Uroscopy relied on the smell, color, sediment, feel, and taste of urine for clues to disease and prognosis. Historically, urine was inspected by all five senses (including its taste and the sound of its stream), but now patients are told to leave a sample in the privacy of a bathroom for a medical assistant to label and send to a laboratory. Doctors rarely come close to the stuff. Even so, for any good diagnostician, a necrotic wound, uremic breath, fecal odor, or hint of tobacco are valuable bits of information, relevant not just to a specific disease but also to the life and comorbidities of a patient. These and other points of data add to the medical gaze that transcends visual clues and once inspired the meme of clever detectives. That gaze has now been replaced by the digital gaze of checklists, smart phrases, and drop-down menus.


Last month we commented on the first of the medical detectives in The Murders in the Rue Morgue, wherein Edgar Allan Poe in 1841 described how diagnostic senses could be marshaled in a process he called ratiocination to figure out crimes. The tale reflected on the odor of urine and the double entendre of a name when detective Dupin explained to the narrator (Poe) how he seemed to read his mind, by making deductions from facial expressions:

“Perdidit antiquum litera sonum.

I had told you that this was in reference to Orion, formerly written Urion; and, from certain pungencies connected with this explanation, I was aware that you could not have forgotten it.”

The Latin phrase referred to the loss or attrition of a word’s older meaning or sound. Orion referred to the celestial constellation (Poe called it a nebular cosmogony), and its similarity to urine became a play on words that Dupin noticed had popped into the narrator’s mind as he looked up at the constellation and smiled when the wordplay and associations came to mind. [Above: 1895 facsimile of Poe’s original manuscript for “The Murders in the Rue Morgue.” Susan Jaffe Tane collection at Cornell University. Public domain. Wikipedia.]

 

Two.

Five classic senses taught in my childhood – smell, sight, taste, hearing, and touch – have been updated to seven for my grandchildren with the addition of the vestibular sense and proprioception. Technology extends the senses further, outsourcing them and merging their inputs to provide unprecedented amounts of information about the world around us and within us. Microscopy and telescopy carry sight far beyond the unaided eye, while modern imaging with CT scans, MRI, and radioisotope labeling visualizes our own living interior bodies. Sound, too, allows inspection of our interiors, thanks to the 1880 discovery by Pierre Curie and his brother Jacques of the piezoelectric principle in crystals that underlies ultrasonography. Extending the seven “basic” senses through technology, we see the world in new ways, although at the cost of diminished acuity of our original senses.

Today’s versions of the medical gaze and the detective’s ratiocination are powerful: the sum total of sensory inputs (enhanced by technology) and the mental heuristics of scientific thinking. Intellect integrates the physical senses. This larger sense, the sense of making sense of everything, is the wisdom, judgment, and mental capacity that creates meaning from immediate or recalled sensory input. This may be the most important and defining human sense, but even it is challenged by impending extension or replacement with so-called artificial intelligence.

 

Three.


Incidental or relevant? Recently, I was asked to comment on a paper regarding incidental findings of renal cysts in children, and that got me thinking how far ultrasonography has come in my career. Genitourinary imaging by ultrasonography came of age as a practical urologic tool in the 1980s. I recall those early days when, at Walter Reed Army Medical Center, we experimented with crude B-mode ultrasonography to interrogate testes for tumors or viability. Coincidentally, it was around that time, 1981 to be specific, when Gordon Sumner wrote the lyrics to a song called “Too Much Information” (TMI):

“Too much information running through my brain,

Too much information driving me insane…”

The world is even more replete with information since Sting and The Police recorded that song on their album Ghost in the Machine. Yet one might argue that TMI is a sophomoric complaint, as if the infinite information in the cosmos should be curated for our personal capacity of the moment. The actual problem is not too much information, but too little human capacity for processing it, and our technologies have made this situation worse.


Perhaps this is the essence of abstract art, which Eric Kandel explored in Reductionism in Art and Brain Science (Columbia University Press, 2016), explaining that functional MRI shows human brains process representational art differently, in different cerebral pathways, than abstract art. Representational art gives viewers very specific images that relate to things immediately understandable. [Below: American Gothic by Grant Wood (1930), courtesy Art Institute of Chicago.]

 

“Abstract art” seems to contain less information (perhaps less craft – or even no craft, at first glance) than representational works. Kandel finds that abstractions can in fact contain far more, calling on you to search everything you know to understand the piece. Abstract artworks invite you to inspect the world to discover their meaning, although a particular artist may not necessarily know or understand the world any better than you. The artist, however, creates a door for you to imagine the world differently than you did a moment before viewing the work. Abstract images may open up, in an informational sense, far more than a given representational scene that you readily comprehend. Abstraction is a window into far larger and stranger worlds of information, associations, and imaginations. [Below: Composition No. 10, 1939-1942, Piet Mondrian. Private collection. Wikipedia.]



[Above: UM Silver Club members attend Meet Me at UMMA program at the University of Michigan Museum of Art. Image courtesy of UM Silver Club. The untitled painting is by Mark Bradford, 2005.]

 

Four.

The Shannon number, named for UM graduate Claude Shannon (1916-2001), represents a lower bound of the game-tree complexity of chess: 10^120. This is an enormous number, unimaginably large, given that the number of atoms in the observable universe is estimated at 10^80. The point here is that human imagination (and in this instance, for only one human game), in a measurable sense, is far larger than the real world. Walt Whitman (1819-1892) may not have known the celestial math, but he wasn’t exaggerating when he wrote Song of Myself.

“Do I contradict myself?

Very well then I contradict myself,

(I am large, I contain multitudes.)”

[Whitman W. Song of Myself. Section 51, 1892 version.]

Whitman imagined that he, and each of us, is unimaginably large in imagination. This is sensory overload at its utmost; it is, ironically, unimaginable, far beyond TMI.
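Shannon’s back-of-envelope arithmetic behind that number is easy to reproduce. A minimal sketch in Python, assuming his original rough figures (about 10^3 plausible move pairs per full move, over a typical game of about 40 full moves):

```python
# Shannon's lower-bound estimate of chess game-tree complexity:
# roughly 10^3 plausible move pairs per full move (about 30 options
# for White times about 30 for Black), over a typical 40-move game.
move_pair_choices = 10 ** 3
full_moves = 40

shannon_number = move_pair_choices ** full_moves   # (10^3)^40 = 10^120
atoms_in_universe = 10 ** 80  # common order-of-magnitude estimate

# Python integers have arbitrary precision, so the exact values hold.
print(shannon_number == 10 ** 120)                      # True
print(shannon_number // atoms_in_universe == 10 ** 40)  # True
```

Even after dividing out every atom in the observable universe, a factor of 10^40 remains — the measurable sense in which a single human game already “contains multitudes.”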

Whether an incidentaloma discovered by ultrasonography, computed tomography, or magnetic resonance imaging is important to the well-being of a person or is too much information (TMI) is one of the dilemmas of modern medicine. The quality and precision of ultrasound interrogation reveal increasingly tiny anatomic details, anomalies, and imperfections that may cause great anxiety for patients, regularly driving parents of children with simple renal cysts to near-insanity with unnecessary worry. While technology seemed to promise humans better control of their lives, it may be just the opposite, whereby technology becomes the ruling agent. [Below: the promise of technology, Life Magazine, September 10, 1965.]


 

Five.

An article and a book expand these considerations of gaze, ratiocination, and information. Roger Kneebone, in The Lancet, offered perspectives on “Looking and Seeing,” comparing a physician’s observational skills to those of an experienced entomologist, Erica McAlister at the Natural History Museum in London. The article begins with these resonating sentences, quoted with his permission:

“Medicine depends upon observation. Yet we are changing the way we look and that alters what we see. As a medical student, I was schooled according to a rigid mantra. Inspection, palpation, percussion, auscultation – always in that order … The aim, I think, was to ensure that we directed our attention to the person in front of us, that we didn’t jump to conclusions before assembling all the information we needed. That fell by the wayside as we turned into junior doctors. Nobody seemed interested in what we had seen or how we described it. Instead, it was all about blood tests, x-rays, scans – all about results.” [Kneebone R. “Looking and seeing.” The Lancet. 393:1091, 2019.]

Kneebone says it beautifully. The last word in his phrase could easily be data as well as results. The result becomes a proxy for the patient. The physicians of the next generation have learned excellent keyboard skills, data collection, acronyms du jour, and navigation of electronic health records with drop-down menus, checklists, and cut-and-paste artistry. The artful skills taught to me and Kneebone – inspection, palpation, percussion, and auscultation – seem rendered obsolete by data. One worries whether the talents to navigate technology and its data come at the expense of the medical gaze, the medical sniff, and the ratiocination Edgar Allan Poe and Arthur Conan Doyle brought forth in their detectives. The model of the astute clinician is giving way to Watson – not Conan Doyle’s Watson, but IBM’s Watson.

Information or data, if you prefer, is a false deity. We may use data but should not worship it. Too many leaders say “show me the data,” believing that data will perfectly direct essential actions. Data should inform key decisions, of course, but data needs human wisdom for good decisions – using, tweaking, discarding, or reformulating data for human needs, not for the self-serving “needs” of algorithms. Self-learning algorithms can accomplish much, but can never replace human wisdom.

The book of relevance is Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again, by Eric Topol, reviewed by Indra Joshi in The Lancet; I look forward to seeing if it convinces me of its promise. [Joshi I. “Waiting for Deep Medicine.” The Lancet. 393:1193-1194, 2019.] The concern with “artificial intelligence” is its easy confusion with human wisdom, the wisdom of crowds that tends to bend toward truth and overarching human values. Self-learning algorithms that constitute AI are ultimately constructed by individuals with their own values, biases, and agendas. Furthermore, they are susceptible to intrusion and perversion. Finlayson et al. warned of this recently in “Adversarial attacks on medical machine learning: emerging vulnerabilities demand new conversations.” [Finlayson SG, et al. Science. 363:1287–1289, 2019.]

 

Short story.

Truth is often stranger than fiction. Poe’s story in 1841 revealed that the perpetrator of The Murders in the Rue Morgue was an orangutan smuggled to Paris by a sailor. The murders were unintentional; the escaped animal was frightened and responding as its genes, shaped by millions of years of environmental selection, prescribed. Most readers probably found that part of the story a bit outrageous; it didn’t quite make sense that a sailor could or would smuggle such an animal. But truth can be as strange as fiction or stranger: the Associated Press recently reported that Russian tourist Andrei Zhestkov was discovered on the Indonesian resort island of Bali on March 22 trying to smuggle a two-year-old drugged orangutan in a rattan basket to Russia. The smuggler also had seven live lizards in a suitcase. [Mike Ives. New York Times, March 25, 2019.]


 

Thanks for reading Matula Thoughts.

David A. Bloom

University of Michigan, Department of Urology, Ann Arbor

Spring

Matula Thoughts April 5, 2019


Spring considerations

20 minutes to read, two minutes to scan, one second to delete.

2341 words

Note of Passage

Mark C. McQuiggan, University of Michigan triple graduate, passed away last month leaving his beloved wife Carolyn (Brunk). Mark was the son of the late Dr. Mark R. McQuiggan and Dr. Catherine (Corbeille) McQuiggan, internists who had trained at the Mayo Clinic and worked together in an office in Detroit’s Fisher Building. Mark C. was born on May 15, 1933 and was 85 years old at the time of his death. He was thoroughly a Michigan Man with a BS from LS&A in 1954, an MD in 1958, and urology residency under Reed Nesbit, completed in 1964. Mark’s co-residents were Karl Schroeder and Dick Bourne, and other particular friends from residency were Clair and Clarice Cox and Dick and Jane Dorr.  Mark practiced urology with excellence and devotion in Southfield, Michigan, on the staff of North Detroit General Hospital and Ascension Providence Hospital. Mark and Carolyn were lovely and loyal presences at our yearly Nesbit Society Alumni Reunions. (Below: Mark in October, 2010, at the Nesbit Scientific Session.) Mark loved the University of Michigan, and Michigan Urology, along with Michigan athletics. Michigan Urology will miss Mark, who seemed to always have a smile and was a wonderful link to Michigan Urology’s past.

Urology at Michigan undergoes its own passage, this being the transition to Ganesh Palapattu as chair, who is already bringing exciting and substantive change to the department around the fortuitous time of the Michigan Urology Centennial. He is continuing the weekly Urology What’s New, aimed at departmental specifics, along with this monthly set of Matula Thoughts on first Fridays, simultaneously available on the website matulathoughts.org.

 

One. 

April brings spring, so welcome after a rough winter’s polar vortices reached down to our geography and innermost bodily cores. Flowering dogwoods, photographed last year (above), will return soon and that’s much of the attraction of photography – preservation of meaningful moments with fidelity to the momentary truth. We want to hold on to things we value as best we can and photography allows us to keep them, in a way, by replication. Words can also replicate those moments and truths with fidelity and beauty.

Last spring this column referred to Dr. William Carlos Williams and his book, Spring and All, a title mysterious in its promise. [Above: Williams and Ezra Pound at their last meeting, photographed by Richard Avedon in July 1958, Wikipedia.] The central piece in Williams’ collection, On the Road to the Contagious Hospital, speaks to facilities that have faded away: the leprosaria, tuberculosis sanitaria, and other such places. New diseases and antibiotic-resistant resurgence of the old ones may resurrect those institutions. Leprosy, by the way, is not a disease of the past. The Lancet recently had a photoessay, “Picturing health: new face of leprosy.” The authors noted: “… leprosy impairs and society disables.” [Kumar A, Lambert S, Lockwood DNJ. The Lancet, 393:629-638, 2019.]

The University of Michigan once had its own contagious hospital after the citizens in Ann Arbor in 1914 voted for a bond issue of $25,000 for an isolation hospital to be maintained by the university. [Below: UM Contagious Disease Hospital, courtesy Bentley Library.] It was placed on a ridge behind the Catherine Street Hospital and looked over the Huron River. Horace Davenport’s book (Not Just Any Medical School, 1999) tells how in the first year the 24-bed hospital housed patients with chicken pox, diphtheria, necrotizing ulcerative gingivitis (Vincent’s angina), pneumonia, tuberculosis (TB), and whooping cough. [Davenport HW. Not Just Any Medical School. University of Michigan Press. 1999.]

 

Two.

Photography, as a neologism meaning drawing by light, may have had a number of separate origins between 1834 and 1839. Earlier methods of capturing images, by camera obscura or as shadow images on silver nitrate-treated papers, were novelties that didn’t scale up in utility until Louis Daguerre announced his sensational process on January 7, 1839. The rest is the history of the Kodak moment, motion pictures, Polaroids, and now the cell phone camera with its albums of thousands of pictures and videos.

Anesthesia, in contrast to photography, had a specific origin in time, place, and originator. Anesthesia was the neologism of Oliver Wendell Holmes in Boston, 1846. Just as photography was coming of age, medical practitioners were starting to bring science and new technology to their art. Large metropolitan hospitals, notably the Napoleonic legacies in France, afforded large numbers of patients that inquisitive physicians studied and compared. Evolving tools of measurement and investigation enabled new clinical skills, and a slowly growing sense of hygiene brought a greater level of safety to medical care.

Professor Pierre-Charles-Alexandre Louis (1787-1872) in Paris was among the best of these physicians, and his comparison of patients with pulmonary TB treated with leeches against those left untreated was one of the earliest clinical trials. Young people from around the world came to Paris for weeks, months, or years to watch Louis at work. He stressed the idea of critical clinical observation (including the medical gaze), measurement, and analysis to improve understanding of disease and therapy, forming a Society of Clinical Observation that many young American trainees joined.

The idea of clinical material as the milieu for medical education, and of the improvement of health care through careful observation, inquiry, and research, received as great a boost from Louis as from anyone. The medical gaze went beyond a quick visual glance. Deep inspection by an experienced physician was something new, a gaze that would discover clues to a diagnosis, an understanding of co-morbidities, and other facts relevant to the case, the story, and the truth of a clinical situation.

The medical gaze, like the photograph, was novel and they complemented each other. Photography became a teaching and documentary tool. The informed gaze discovered a condition, an attitude, or a moment that the photograph could replicate and preserve. The medical gaze also inspired a new genre in literature – bringing the idea of astute medical discovery by observation, listening, and reasoning to crime solving.

One wonders whether the medical gaze, once a prized clinical skill, has now been supplanted by modern imaging tests, laboratory studies, biomarkers, and checklists. That raises a further question: will tomorrow’s masters of those technologies and processes in turn quickly succumb to nonhuman purveyors of “artificial intelligence”?

 

Three.

The Murders in the Rue Morgue, Edgar Allan Poe’s famous short story of 1841, initiated a new genre of crime literature built on the clever reasoning, which Poe called “ratiocination,” necessary to solve crimes. [Poe 1809-1849, above] Curiously, Poe’s story included a brief speculation on uroscopic clues, specifically the odor of urine.

This scientific crime-solver genre continues to gather cultural momentum. The picture above, made in the last year of Poe’s life, is the “Annie” daguerreotype, the most famous of the eight known Poe daguerreotypes, named for Mrs. Annie Richmond of Lowell, Massachusetts, who commissioned and owned it. Poe was just a little ahead of his time with ratiocination, his take on the medical gaze, whereby careful observation and trained reasoning could discover the truth of a situation. Over the decades up to the fin de siècle, a growing scientific corpus of knowledge, bringing new technology, would expand the medical gaze into a powerful capacity to produce data and evidence for both health care and criminal investigation.

Future detective author Arthur Conan Doyle (1859-1930) was barely ten years old when Preston B. Rose started teaching Ann Arbor medical students urinalysis and scientific methods of forensic investigation in the Chemical Laboratory just behind the University of Michigan Medical School. Only 17 years later, as a 27-year-old ophthalmologist with a struggling practice, Conan Doyle created a powerful blend of ratiocination and scientific analysis in an intellectual superhero, Sherlock Holmes. The detective was modeled on Dr. Joseph Bell, a real-life medical role model from Doyle’s student days, and the surname Doyle selected coincided with that of a real-life medical superhero, Oliver Wendell Holmes, one of the most prominent Americans abroad who studied with Louis in Paris, as explained in David McCullough’s book The Greater Journey. After returning to Boston, Holmes presented one of the first convincing hypotheses of the germ theory to explain puerperal fever. [Below: Sir Arthur Ignatius Conan Doyle by English photographer Herbert Rose Barraud. Carbon print on card mount. Courtesy of the National Portrait Gallery, London.]


 

Four.

Holmes embraced the new technology of photography, writing essays about it, making his own pictures, inventing a popular handheld stereoscope viewer, and studying human ambulation with photographs. In the June 1859 issue of The Atlantic Monthly, Holmes commented on the improbable feat of fixing an actual moment in time on a single surface:

“This is just what the Daguerreotype has done. It has fixed the most fleeting of our illusions, that which the apostle and the philosopher and the poet have alike used as the type of instability and unreality. The photograph has completed the triumph, by making a sheet of paper reflect images like a mirror and hold them as a picture.”

It is a universal truth that pictures tell stories more immediately than words, and we humans have been practicing this art since cave-dwelling days, inspired by beauty in the natural world, fantasies, or unnatural horrors. Photography offers realistic images of faces, scenes, or situations, and complements the older visual arts of drawing or painting.

Earlier, in the inaugural Atlantic Monthly (above) Holmes had written:

“The next European war will send us stereographs of battles. It is asserted that a bursting shell can be photographed… We are looking into stereoscopes as pretty toys, and wondering over the photograph as a charming novelty; but before another generation has passed away, it will be recognized that a new epoch in the history of human progress dates from the time when He who

Never but in uncreated light

Dwelt from eternity –

Took a pencil of fire from the hand of the ‘angel standing in the sun,’ and placed it in the hands of a mortal.”

[“The stereoscope and the stereograph,” Atlantic Monthly, November, 1857.]

 

Five.

Guernica. Pablo Picasso (1881-1973), while living in Paris, was commissioned by the Spanish Republican government to make a work in response to the destruction of Guernica. This Basque town in northern Spain was bombed for two hours on 26 April 1937 by German and Italian warplanes supporting the Spanish Nationalists. [Above: Picasso working on the mural. Wikipedia.] The town sat at a major crossroads 10 kilometers from the front lines, between the Republican retreat and the Nationalist advance on Bilbao. The target was a minor war-materials factory outside of town. The bombers missed the factory, but destroyed the town.

Picasso completed the large oil painting on canvas in June 1937, after 35 days of work. The specific disputes of the Republicans and Nationalists, and the justifications of their supporters and suppliers, are nowhere evident in the mural, only the grotesque mangled forms and anguished expressions of the victims. Guernica may be Picasso’s greatest work and one of mankind’s iconic images of the horror of war. The event itself was minuscule in the grand scale of 20th century conflict, but Picasso made it a transcendent moment for humanity.

No single painting, photograph, or narrative can capture the full and terrible story of Guernica, although together they give a fuller sense of the horror than any one work alone. [Above: Museo Reina Sofia, Madrid, Spain. ©Picasso. Below: ruined Guernica. German Federal Archives.]


In 1955 Picasso commissioned three full-size tapestry reproductions of the work from Jacqueline de la Baume Durrbach and her husband René, weavers in southern France. Nelson Rockefeller purchased one of these, and it hangs on loan at the United Nations at the entrance to the Security Council room. A blue curtain strategically covered the tapestry during televised press conferences of Colin Powell and John Negroponte on 5 February 2003. [Kennedy M. “Picasso tapestry of Guernica heads to UK.” London: The Guardian, 26 January 2009.] Picasso entrusted Guernica itself to the Museum of Modern Art in New York, pending re-establishment of liberty and democracy in Spain. After Spain became a democratic constitutional monarchy in 1978, the painting was ceded to Spain in 1981, although not without dissent that the ruling system was still not quite the republic stipulated by the artist in his will.

 

Short bits.

Morbidity and Mortality (M&M) conferences, discussed here last month, brought M&M candy to mind. The story goes that the Spanish Civil War inspired Forrest Mars, Sr. to create an American version of the British confection Smarties. Mars was working in England in the candy business at that time, estranged from his father, Frank Mars of Mars candy fame. Forrest had created the Mars Bar in Slough in 1932 and was looking for another product. Rowntree’s of York, maker of Chocolate Beans since 1882, had recently tweaked the name to Milk Chocolate Beans in 1937, and changed it to Smarties the following year. These oblate spheroids were sold in cylindrical cardboard tubes, with a colorful lid that contained a random alphabet letter, designed to encourage children to learn. The chocolate center was protected by a shell of hardened sugar syrup to prevent melting, a convenience enjoyed by soldiers in the Spanish Civil War.

The Spanish Civil War (17 July 1936 – 1 April 1939) engendered strong international sympathies, involving anarchists, communists, nationalists, aristocratic groups, and religious factions, although it largely came to be viewed as a contest between democracy and fascism. British volunteers, likely including George Orwell, carried Milk Chocolate Beans and Smarties into battle, and Forrest Mars might have noticed. Just as likely, one of his children brought some home.

Returning to the U.S. and working with Bruce Murrie, son of Hershey Chocolate’s president, Mars developed their button-shaped variant, patented it on 3 March 1941, and began manufacture that year in New Jersey. The name M&M derived from Mars and Murrie, with a small “m” stamped on each button. The first big customer was the U.S. Army, and during WWII M&Ms were sold exclusively to the military. “Melts in your mouth, not in your hand,” was first used as a tagline in 1949. Peanut M&Ms were introduced in 1954, and the rest is history.

Thanks for reading Matula Thoughts

David A. Bloom, M.D.

University of Michigan, Department of Urology, Ann Arbor

 

 

Matula Thoughts March 1, 2019

 

DAB What’s New Mar 1, 2019


Stories

1999 words

[Above: childbirth fever pamphlet 1855 – a fatal complication. Below: M&M complications conference at UM Urology.]

One.             

M&Ms.  Once a month our department gathers at 7 AM on a Thursday for Morbidity and Mortality (M&M) conference, as is typical of most surgical training programs. This recurring touchpoint integrates the triple mission of medical academia so we can learn from the serious complications inherent to our work, improve the quality of that work, and discover new avenues of investigation. Typically, residents or fellows tell the story of a complication or a death, the faculty members involved consider “what might have been done differently,” others share their experiences and thoughts, and sometimes a literature-based short presentation is offered. Complications are classified by the Clavien system. [Above: January 2019 M&M with Priyanka Gupta discussing the new complications entry system.] These conferences fine-tune our mutual relevance, allowing regular inspection of our complications, discussion from the perspective of quality improvement, and calibration of individual work with that of colleagues.

When I was a resident, grand rounds centered on the chair, whose every opinion mattered. A resident’s performance could make or break progression through residency and chances for fellowships or good jobs. The chair critiqued everyone else and molded the department in his image (always a “his” during my training), much like the Autocrat of the Breakfast-Table in Oliver Wendell Holmes’s 1858 essays. Those of us who made it through the process naturally carried a deep respect and even fondness for the chair, while others were not quite so enamored. Things have changed, especially in big departments, with decentralization to divisions and teams much more the order of the day; while structure is still necessarily hierarchical (the buck must stop somewhere), a more democratic flavor now rules at M&M conferences and grand rounds.

Although chairs are no longer the center of departmental universes, they set much of the tone and represent the team administratively to the rest of the institution. Departments improve when leadership rotates carefully, as it has in our case, and today it’s official: we welcome Ganesh Palapattu to our chair position, and Brent Hollenbeck as vice chair of the University of Michigan Department of Urology.

 

Two.

The Clavien-Dindo system, described in 2004 by Zurich surgeons Pierre-Alain Clavien and Daniel Dindo, assigns grades to surgical complications: Grade I events are small deviations from the normal expected operative or postoperative course; Grade II events are atypical medication needs, including blood transfusion and total parenteral nutrition; Grade III complications require surgical, endoscopic, or radiologic intervention, with or without anesthesia; Grade IV complications are life-threatening; and Grade V is death. [PA Clavien et al. Ann Surg. 250:187-196, 2009.] Our M&M conferences focus on Clavien grade III or greater complications, mainly to identify learning opportunities: what could we do better, personally, or in our teams and systems? Human activities are inevitably susceptible to periodic errors and negative outcomes, but medical complications are serious disappointments and sometimes tragedies for patients and their families. Each complication is a story, often a complex one. Faculty and residents must learn from them, grieve over them, and learn to deal with the adversity. Just as importantly, surgeons must move on to take care of the next patient. The seminal book Forgive and Remember by Bosk, discussed on these pages in the past, is worth renewed attention. [Bosk CL. Forgive and Remember. Managing Medical Failure. University of Chicago Press. 1979.]
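For readers who maintain complications databases, the grading scheme and the “grade III or greater” review threshold described above can be sketched in a few lines of code. This is an illustrative sketch only; the dictionary and function names are invented for this example and do not represent any official implementation of the Clavien-Dindo system.

```python
# Sketch of the Clavien-Dindo grades as described in the text.
# Names (CLAVIEN_DINDO, GRADE_ORDER, needs_mm_review) are hypothetical.

CLAVIEN_DINDO = {
    "I": "Small deviation from the expected operative or postoperative course",
    "II": "Atypical medication needs, e.g. transfusion or parenteral nutrition",
    "III": "Requires surgical, endoscopic, or radiologic intervention",
    "IV": "Life-threatening complication",
    "V": "Death",
}

GRADE_ORDER = ["I", "II", "III", "IV", "V"]


def needs_mm_review(grade: str) -> bool:
    """Flag complications of Clavien grade III or greater for M&M review."""
    return GRADE_ORDER.index(grade) >= GRADE_ORDER.index("III")


for grade in GRADE_ORDER:
    flag = "-> M&M review" if needs_mm_review(grade) else ""
    print(f"Grade {grade}: {CLAVIEN_DINDO[grade]} {flag}")
```

A real registry would also handle the IIIa/IIIb and IVa/IVb sub-grades of the published classification; the simple ordering above only captures the conference’s main triage rule.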

Getting “the story” right is a universal necessity, whether from personal points of view, social perspectives, or occupational demands. Journalists, teachers, politicians, engineers, lawyers, and physicians need to understand stories and ascertain truth. Surgeons need to know a patient’s story from the diagnostic perspective in order to come to operative solutions, and if complications occur, then it is imperative to understand those stories, for only then can the practice of medicine improve.

 

Three.

The idea of what is “right” – that is, what can be proven true or is generally accepted as correct – is surprisingly complex, requiring a socially shared sense of “truth” and factual reliability. A person’s ability to adhere to truth is a matter of integrity, and we expect higher levels of integrity from physicians, scientists, and engineers than from many other occupations. Yet shouldn’t we expect integrity in all responsible occupations, from chefs to politicians? When is it forgivable to tamper with the public trust for personal gain or malicious reasons, and what are the boundaries of the First Amendment? These tough questions are beyond solution in Matula Thoughts, but they should be considered and discussed by all members of society.

It is a fact, as this line is written, that it is not raining outside my window, but that fact will change with time and environment. Some facts are difficult to ascertain, and people sometimes have legitimate misconceptions of reality, uncertainty being intrinsic to humanity. Deliberate misrepresentation of reality, however, is corrosive to any social group and to society at large. Deliberate misrepresentation is expected in the products of fiction and the entertainment industry, but not in their business dealings. Misrepresentation in business, politics, religion, and the like erodes trust, which is essential to a healthy society. When stories become propaganda, or opinions masquerade as journalism, free speech is abused. Misrepresentation in medicine and science is worse still; it amounts to a social crime.

These last charges are tricky, running contrary to the First Amendment and the cherished idea of free speech. Yet “yelling fire” in a theater, or its equivalent on social media, is too dangerous for society to tolerate. Democratic societies have yet to figure out where and how to draw the line between deliberate misrepresentation and free speech, and the hyper-pace of contemporary social media exacerbates the dilemma. If the self-ordained “rights” of the First Amendment are to be preserved, they must be stewarded to serve the public rather than individuals, factions, or ideologies.

Then, too, there is the matter of the “backstory,” the history, conditions, and other narratives leading up to a particular story and the circumstances that frame it. In health care the backstory includes co-morbidities, while in the field of economics such circumstances are dismissed as externalities. Although stories are simpler and easier to “understand” when stripped of complicating and confounding details, stripped-down stories rarely convey the whole truth.

 

Four.

It is hard to escape the name Oliver Wendell Holmes in American history. There were two of them, the first an iconic American physician (1809-1894) and the second, his son, an iconic Supreme Court justice (1841-1935). Both lives and careers centered on stories and truth.

Medical practice is a highly social profession and business. Socialization of practitioners with specialized knowledge and experience, sharing their stories, is a route to progress and today’s M&M conferences are programmed opportunities for this teamwork. Medical education, standards of practice, quality improvement, and research have been built around socialization since ancient times of Mediterranean and Asian medical practice, medieval professional guilds, and doctors in the early days of the United States.

One sparkling example was The Boston Society for Medical Improvement, doctors who wanted to share ideas and ascertain truths. Established in 1828 by John Spooner with 11 members, the Society grew to 35 by 1838. Meetings were held the second and fourth Monday of each month, originally in Spooner’s rented room on Washington Street. A cabinet keeper managed a collection of specimens contributed by the members. Only “elite” practicing physicians of Boston were eligible, so in 1835 a younger set of physicians formed their own Boston Society for Medical Observation, echoing the terminology of Professor Louis in Paris, under whom Holmes studied. The two competing Boston groups ultimately merged in 1894.

The picture above, from the Countway Library Center for the History of Medicine, shows the Boston Society for Medical Improvement in 1853: sitting – George Bethune, Oliver Wendell Holmes, Samuel Cabot, Jonathan Mason Warren, William Coale, James Gregerson; standing – Charles Ware, Robert Hooper, Le Baron Russell, Samuel Parkman. Samuel Cabot was the grandfather of Arthur Tracy Cabot and Hugh Cabot, two of the most influential urologists in the transitional fin de siècle between the end of the late 19th century and early 20th. Hugh Cabot’s arrival in Ann Arbor in autumn 1919 defines the Michigan Urology centennial.

 

Five.

Puerperal fever & a murder. At a summer meeting of the Boston Society for Medical Improvement in 1842, JBS Jackson asked fellow members for their opinions on the possible contagiousness of puerperal fever. Jackson was concerned by the death of a colleague after treating an infected woman, and he knew of other infections among subsequent patients the deceased physician had treated before he died. Holmes, a member of the original French Society of Medical Observation during his study in Paris a decade earlier, took up Jackson’s question and presented his own independent research, “The contagiousness of puerperal fever,” to the Society on February 13, 1843. The presentation was commemorated in a 1940 painting by Dean Cornwell, That Mothers Might Live (below).


The New England Quarterly Journal of Medicine and Surgery published Holmes’s talk in April and it was reprinted as a pamphlet (top, lead picture). Holmes was certain that “obstetricians, nurses, and midwives were active agents of the infection, carrying the dreaded disease from the bedside of one mother to the next.” This was among the earliest good evidence for germ theory of disease.

Holmes was dean of Harvard Medical School when he figured in the sensational 1849 murder case of wealthy Bostonian George Parkman. Parkman had studied medicine abroad, receiving an MD in Aberdeen, Scotland, and studying further in France with a particular interest in mental illness, so the Parkman identified in the Boston Society for Medical Improvement was likely his relative. After returning home, the murdered Parkman never practiced medicine, instead managing family property, which made him ineligible for the Medical Improvement Society, although he was an admired friend of Holmes.

John Webster was also from an affluent family and had studied abroad. Back in Boston, Webster became professor of chemistry and geology at the medical school, but he ran into debt often and borrowed extensively, including from George Parkman. In an argument over a debt, Webster killed Parkman in his medical school office on November 23, 1849, dismembered the body, and hid it in a locked basement restroom. An astute custodian, Ephraim Littlefield, concerned about the popular missing Bostonian, broke into the room and discovered the remains on November 30, 1849. Holmes testified persuasively at the 12-day trial, and Webster was executed by hanging on August 30, 1850. Holmes dedicated his 1850 introductory lecture to the medical school class to Parkman’s memory. [Below: OW Holmes c. 1879.]

Holmes enjoyed stories, although happier ones than that of his murdered friend. He wrote poetry and books of fiction and nonfiction. A founder of The Atlantic Monthly, he contributed to it regularly and mingled with Boston’s literary set, including J. Elliot Cabot, James Russell Lowell, Ralph Waldo Emerson, and Henry Wadsworth Longfellow. Holmes popularized the term Boston Brahmin and was certainly one himself. The Autocrat of the Breakfast-Table, a collection of essays Holmes wrote for The Atlantic in 1857-1858, was published in book form in 1858. The stories are one-sided dialogues between a genial, anonymous author and the other residents of a New England boarding house. It is, perhaps, more than coincidence that the fictional detective imagined 40 years later by Dr. Arthur Conan Doyle would share the Holmes surname.

 

Short story. Frédéric François Chopin, born this day in 1810, six months after Holmes, lived only 39 years. Although numerous photographs exist of Holmes, only two exist of the great Polish composer and virtuoso pianist. [Below: top, Chopin c. 1847, http://commons.wikimedia.org/wiki/File:Chopin1847_R_SW.jpg]

Photography as a technology was new and rare during the early lives of these two men, but Holmes’s luck of longevity gave him greater opportunity as a subject. [Above: Chopin c. 1849. Daguerreotype by Louis-Auguste Bisson.]

 

Thanks for reading Matula Thoughts.

David A. Bloom

University of Michigan, Department of Urology, Ann Arbor

 

 

February 1, 2019

DAB What’s New Feb 1, 2019


Sands of time, transition, & short thoughts on rules
3996 words


 

One.


February, the shortest month, begins today, this Friday, and its periodic extra day comes next year on a Saturday. Although 2019 is only a month old, the sands of time have slipped away for one iteration of Michigan Urology, and the metaphorical hourglass reloads today for Michigan Urology version 8, which will refresh our department. Regental privilege requires formal action to name the next urology chair, although most of us know the party in question, who begins today as acting chair. Ganesh Palapattu will do an excellent job leading the faculty, residents, and staff – the parties who will actually do the refreshing. Our new chair will face challenges and, if history is any guide, our team will support him fully through the next chapter of the Michigan Urology journey. In that context, this is a good time to examine the past and re-articulate our history, as Richard Feynman (1918-1988), the American theoretical physicist, once wrote:

“Why repeat all this? Because there are new generations born every day. Because there are great ideas developed in the history of man, and these ideas do not last unless they are passed purposefully and clearly from generation to generation.” [Feynman RP. The Meaning of it All. Thoughts of a Citizen Scientist. 1998.]

It may be a long stretch from the “great ideas developed in the history of man” to a modest history of Michigan Urology, but I hope you will allow Matula Thoughts some slack and accept this belief in regularly re-articulating the past for each cohort of our successors.


I first met Ganesh when I was visiting professor at UCLA, my urology alma mater, and he was a resident under Jean deKernion, a wonderful urologist, leader, and friend. As a visiting professor at a number of places, I often tossed out ideas for papers, but Ganesh was perhaps the only one over the years who took the bait and completed a paper with me. His career took him to Johns Hopkins, the University of Rochester, and then Baylor in Houston at Tim Boone’s program. At great loss to Tim, but with his consent and blessing, Ganesh and his lab, with Alex Zaslavsky, came to Michigan at the start of my term as chair. Ganesh is well prepared. He is a terrific teacher, effective leader, and excellent surgeon, and he has led our largest urology section, uro-oncology, very well. When a need is identified he steps up – he was among the first to volunteer in Flint at the Hamilton Community Health Network clinic when that opportunity materialized. His lab has done well, earning a 2% score on its latest grant submission. Ganesh will be thoughtful, consensus-building, and creative as he leads Michigan Urology in its mission (education, research, and clinical care) and our essential deliverable – kind and excellent patient-centered care. [Above: Ganesh with Anu. Below: with Kirtan and Elina.]

 

Two.


Anticipating the centennial of Michigan Urology, we’ve been working on a new volume of our story, previously written by the late John Konnak and urological scholar Dev Pardanani nearly 20 years ago. It is impossible to understand the urology story in Ann Arbor without a larger sense of the story of our state, our specialty, and our university. It might be said that melodies of the past haunt the reveries of our stories, to tweak Hoagy Carmichael’s phrase. So our story properly begins around 11,000 years ago, well before Hippocrates and the known roots of medical practice, with the Mound Builder and Woodland cultures that populated our geographical area after the last glaciers receded. The Holcombe beach site near Lake Saint Clair has evidence of Paleo-Indian settlement in that era, and by the 17th century Huron, Odawa, Potawatomi, and Iroquois people inhabited the region. Dates are difficult to ascertain, but legend, archeology, and solar-eclipse history suggest that an Iroquois Confederacy of Five Nations had formed around the Great Lakes by then. Those people surely suffered from urological problems and undoubtedly tried many remedies to ease their pains, although the ailments either dissipated or claimed the poor sufferers’ lives. [Above: Painting by Roy Lichtenstein, 1965. Below: Map of Five Nations, De Lisle, 1718. Darlington Collection, University of Pittsburgh.]


French explorers, beginning with Étienne Brûlé, around 1610, Samuel de Champlain, and later René-Robert Cavelier de La Salle, attempted to colonize the regional home of the Cayuga, Mohawk, Oneida, Onondaga, and Seneca who comprised the Iroquois Five Nations. The Tuscarora joined the confederacy in 1722 to become the Six Nations that eventually were overwhelmed by Europeans.

 

Three.

Prelude to UM. Detroit, a settlement town in the western territory of the young United States, was initially referred to as the straits. Michigan became a distinct territory, carved from the Northwest Territory by congressional act on 30 June 1805. First governor William Hull and presiding judge Augustus B. Woodward described its history in their first report, tracing the French penetration of Lake Michigan, the “Ouisconsin” River, and the Mississippi down to its mouth, defaulting to the French feudal (seigniorial) system of property ownership by aristocratic right, with no sensitivity to the Native American perspective:

“Prior to this era the settlements of the strait had commenced, and Detroit claims an antiquity of fifteen years superior to the city of Philadelphia. The few titles granted by the government of France were of three French acres in front, on the bank of the river, by forty feet in depth, subject to the feudal and seignoral conditions, which usually accompanied titles in France.” [Michigan Historical Collections. 36:107, 1908.]

The claim in the report refers obliquely to La Salle, who buried an engraved plate and cross near what is now Venice, Louisiana, on April 9, 1682, to assert France’s ownership of the territory. Hull and Woodward didn’t have all their facts in order regarding Philadelphia, also founded in 1682 but a month earlier, on March 4, when William Penn made it the capital of Pennsylvania Colony. Great Britain assumed the French possessions after the 1763 Treaty of Paris ended the Seven Years’ War. Another Treaty of Paris, in 1783, ended the Revolutionary War, and the territory that would become Michigan was acquired from Canada by the United States. The Hull and Woodward report tells of the sad circumstances of Detroit in June of 1805, just after it had burned to the ground:

“It was the unfortunate fate of the new government to commence its operations in a scene of the deepest public and private calamity. By the conflagration of Detroit, which took place on the morning of the 11th of June, all the buildings of that place, both public and private, were entirely consumed; and the most valuable part of the personal property of the inhabitants was lost. On the arrival of the new government [Woodward arrived Saturday June 29 and Hull on Monday July 1]. A part of the people were found encamped on the public grounds, in the vicinity of the town, and the remainder were dispersed through the neighboring settlements of the country; both on the British and the American side of the boundary… The place which bore the appellation of the town of Detroit was a spot of about 2 acres of ground, completely covered with buildings, and combustible material…” [Central Michigan University. Clarke Historical Library. 1805. Hull.]

Detroit rebounded from the fire and was on the upswing when the War of 1812 broke out and the town, indefensible, surrendered to the British on 6 August. An attempt to regain Detroit by General William Henry Harrison failed in January 1813, but on 10 September Commodore Perry’s fleet of nine small ships defeated six heavily armed Royal Navy ships on Lake Erie and returned the city to the United States. One quarter of the recruited American sailors were African American. The British retreated up the Thames River in Canada, where the decisive Battle of the Thames on 5 October turned the tide against Great Britain and Tecumseh’s Confederacy (recounted here in Matula Thoughts last year). This story is a prelude to the University of Michigania, organized in Detroit in 1817.

 

Four.

New Year resolutions have faded into memory by now for all but the most resolute of us, although it’s worth reflecting that resolutions and intentions reflect the best versions of our imperfect selves. Franklin Delano Roosevelt, an architect of some of the best of modern American society, was particularly good with his public words, few more noteworthy than in his First Inaugural Address on March 4, 1933, during the depth of the Great Depression: “So, first of all, let me assert my firm belief that the only thing we have to fear is fear itself – nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.” Yet, no more or less imperfect than most of us today, FDR sometimes succumbed to fear himself, as he did early in WWII with Executive Order 9066 of February 19, 1942, authorizing the Secretary of War to prescribe “Military Areas”:

“Whenever he or any designated Commander deems such action necessary or desirable, to prescribe military areas in such places and of such extent as he or the appropriate Military Commander may determine, from which any or all persons may be excluded, and with respect to which, the right of any person to enter, remain in, or leave shall be subject to whatever restrictions the Secretary of War or the appropriate Military Commander may impose in his discretion. The Secretary of War is hereby authorized to provide for residents of any such area who are excluded there from, such transportation, food, shelter, and other accommodations as may be necessary, in the judgment of the Secretary of War or the said Military Commander, and until other arrangements are made, to accomplish the purpose of this order. The designation of military areas in any region or locality shall supersede designations of prohibited and restricted areas by the Attorney General under the Proclamations of December 7 and 8, 1941, and shall supersede the responsibility and authority of the Attorney General under the said Proclamations in respect of such prohibited and restricted areas.” [Below: FDR at Yalta. DG Chandor portrait at SAAM, Washington.]


The Executive Order quickly became actual law on March 21, 1942, when Roosevelt signed Public Law 503, put forth by Congress after a 30-minute discussion in the House and an hour in the Senate, thus evicting 122,000 men, women, and children of Japanese ancestry (two thirds of them American citizens) from their West Coast homes to incarceration camps. Americans of German and Italian ancestry were similarly targeted, but in much smaller numbers. Another Executive Order, number 9102, signed 18 March 1942, created the War Relocation Authority (WRA) to manage the forced relocation and internment. Milton Eisenhower was its first director, but only for a few months. His successor, Dillon Myer, asked Eisenhower whether he should take the job and was told:

“Dillon, if you can sleep and still carry on the job my answer would be yes. I can’t sleep and do this job. I had to get out of it.” [NYT 3 May 1965.] [Oral history interview with Dillon S. Myer. Harry S. Truman Presidential Library.]

Ultimately, 18 Civilian Assembly Centers, 10 Relocation Centers of the WRA, 9 Justice Department Centers (with German-American and Italian-American detainees), 3 Citizen Isolation Centers (for “problem inmates”), 3 Federal Bureau of Prisons sites (mainly for draft resisters), 18 U.S. Army facilities, and 7 Immigration and Naturalization Services facilities were involved in detentions. The Japanese American Memorial to Patriotism During WWII revisits this sad story with Nina Akamu’s Golden Crane sculpture of two Japanese cranes caught in barbed wire. Semicircular granite walls name the ten main WRA internment camps, and The Archipelago on the open perimeter along Louisiana Avenue near D Street in Washington, DC, symbolizes the Japanese Islands and the five generations of Japanese Americans affected by the war. [Below: Two Cranes. DAB January, 2018.]


 

Five.

Hourglasses turn the ephemeral notion of time into physical reality. The grains of sand are elementary chemicals assembling by physical rules into worthy objects, stardust like ourselves. The laws of chemistry and physics that created stardust are durable and universal. Human rules are mutable, and we hope that representational government and good leaders bend them toward fairness, allowing redress when rules are improper, archaic, wrong-headed, or harmful to the public good. All sorts of rules – federal, state, local, professional, organizational, sectarian, familial, and personal – constrain us, and sometimes they seem to come out of the blue, as with presidential directives. Lincoln’s Emancipation Proclamation, considered here last month, and FDR’s Executive Order 9066 raise the issue of these curious sidebars of American law. A report of the Library of Congress, Congressional Research Service, by legislative attorney John Contrubis (updated March 9, 1999) explains the origin and usage of these two “Presidential instruments.”


The Constitution provides no explicit authority for executive orders and proclamations, although Article II states: “the executive power shall be vested in a President of the United States,” “the President shall be Commander in Chief of the Army and Navy of the United States,” and “he shall take care that the laws be faithfully executed.” Dogmatic originalism might then argue to exclude the Air Force from presidential authority, or stipulate that a president execute all laws faithfully to their letter (rather than broadly interpreting Constitutional intent), or that a president must be a “he.” Such pedantic exercises unnaturally infuse human rules with an immutability similar to the natural laws of chemistry and physics.


As humans, we elevate some of our laws to higher truths, such as belief in human liberty, the sanctity of life, equality of opportunity, and the right to pursue happiness, recognizing that these “self-evident truths” are perhaps on a higher plane than laws of prohibition, zoning, speed limits, or executive orders. Executive orders are legally binding directives given by the president to federal agencies in the executive branch, while executive proclamations may be ceremonial, policy announcements, celebrations (Mother’s Day), or statements of a condition (e.g. national mourning for the death of George H.W. Bush). Clearly there is overlap between orders and proclamations; the Emancipation Proclamation was as much an order as a proclamation. [Above: Emancipation Proclamation, Clements Library, University of Michigan. Below: 1914 Proclamation of Woodrow Wilson designating Mother’s Day.]


 

Six.

Lysenko. Civic laws can cast long shadows that undermine education and science, setting human laws and policies at odds with the natural world. The Trofim Lysenko (1898-1976) story is a cautionary tale. That Russian biologist rejected Mendelian genetics and proposed his own theory of environmentally-acquired inheritance, offering experimental results of improved crop yields by his methods (unverified by others) and convincing Joseph Stalin to embrace Lysenkoism nationally. Soviet scientists who opposed the idea were dismissed from their posts, if not killed as “enemies of the state.” [Fitzpatrick S. Stalin’s Peasants: Resistance and Survival in the Russian Village after Collectivization. Oxford University Press. 1994. p. 4-5.] Forced collectivization and famine followed in the 1930s, but Lysenko’s political power consolidated, and in 1940 he became director of the Institute of Genetics of the USSR Academy of Sciences. In 1948, scientific dissent from Lysenko’s theory was outlawed.

After Stalin died in 1953, Nikita Khrushchev retained Lysenko in his post, but scientific opposition resurfaced and his agricultural influence declined. In 1964, Andrei Sakharov (1921-1989), physicist and architect of the Soviet thermonuclear bomb, but later a Soviet dissident and Nobel Peace Prize recipient (1975), denounced Lysenko before the USSR Academy of Sciences, saying:

“He is responsible for the shameful backwardness of Soviet biology and of genetics in particular, for the dissemination of pseudo-scientific views, for adventurism, for the degrading of learning, and for the defamation, firing, arrest, even death, of many genuine scientists.” [Norman L, Qing NL, Yuan JL. Biography of Andrei Sakharov, dissent period. The Seevak Website Competition.] [Cohen BM. The descent of Lysenko. The Journal of Heredity. 56:229-233, 1965.] [Cohen BM. The demise of Lysenko. The Journal of Heredity. 68:57, 1977.]

Lysenko died in Moscow in 1976 with only brief mention in the daily national newspaper. His politically enforced pseudo-science had tragic consequences for millions of people in Soviet Russia. Lysenko wasn’t the first to consider the effects of environment on inheritance; Lamarck (1744-1829) had that thought much earlier. Open scientific give and take has since shown that Mendelian and other genetic processes are indeed influenced, if not largely regulated, by epigenetic factors. Science works well, but not when corrupted by ideology.

 

Seven.


Too bad gerrymanders aren’t mythical creatures. These Homo sapiens look-alikes actually exist, grabbing and abusing transient authority to distort reality and fairness for political advantage. Democracy as expressed in our origin document, the Declaration of Independence, is built upon shared belief in fairness, but when fairness is seriously undermined, authoritarianism creeps back into public life – authority of a political party, authority of a leader, authority of a particular ideology, authority of a religion, or authority of a class of people. History shows this human propensity again and again in tribalism, kingdoms, monarchies, dictators, cults, single-party nations, etc. The word gerrymander came from Elkanah Tisdale’s 1812 cartoon in the Boston Centinel, showing the district created by the Massachusetts Legislature to favor incumbent Democratic-Republican candidates over the Federalists. [Above: Tisdale’s creature in the Centinel, 1812. Below: Michigan districts.]

 


Eradication of the gerrymander is one of democracy’s existential necessities. The problem is exacerbated by algorithmically-targeted misinformation made possible by personal data mining, a perversion of free speech dramatized in the Netflix film Brexit.

 

Eight.


History Hall. Along the passages connecting University Hospital, Frankel Cardiovascular Center, Rogel Cancer Center, and Medical Sciences I buildings are pictures of most of the Medical School graduating classes. Even as faculty and staff walk briskly through them, discussing their work, the decorative walls and the light from the glass tunnel are pleasant and even refreshing. If you have a chance to linger briefly and look, the pictures take you through a history of paradigm changes, economic booms and busts, great discoveries, inspiring leaders, wars, bad actors, duds, and all the other stuff of 170 years. Each student and faculty member in the class pictures is an individual summation of countless personal dramas and stories. [Above: David Fox and Joe McCune.]

Maybe stepping aside as chair (I don’t think of it so much as “stepping down” or a loss, and I am truly pleased to have Ganesh Palapattu pick up the challenges, present and ahead) gives me too much time for lingering walks and gratuitous thoughts. Framed by all the larger problems of the world (geopolitical conflict, terrorism, poverty, widening inequality, economic unpredictability, environmental degradation, infectious diseases, and other existential threats), one must wonder: can we humans successfully control our own destiny? If so, some structure and rules are obviously necessary for 7 billion people on a small planet, but will the structures and rules revert to ancient painful models of authoritarian rule and pyramidal hierarchy, or could they tilt toward libertarian, laissez-faire, or anarchistic models, although those have never proven successful at large scale?

The question is not merely rhetorical; it is existential, and an answer needs to be found between those extremes, within some central range. How we find, set, and reset that optimal place in our laws is the ultimate political question. Representational democracy, terribly imperfect as it is, seems to offer the best framework to balance individual freedom and happiness with optimization of societal function, human destiny, and planetary sustainability. This same dilemma of governance, structure, and rule-setting is recapitulated in localities and large organizations, even that of Michigan Medicine. These may seem strange Matula Thoughts for the moment, and solutions are beyond the wisdom of this writer, but with 7 billion points of wisdom, good answers should abound. Lingering walks through history halls can help.

 

Nine.

Academic urology at Michigan effectively began in the autumn of 1919 when Hugh Cabot came to Ann Arbor, and for that reason we begin a year of centennial celebration with our Nesbit Alumni Reunion October 3-5, 2019. Cabot’s 11 years at Michigan were transformative, but disruptive and (yes) often authoritarian, leading the regents to dismiss him in February, 1930, “…in the interests of greater harmony.” His next phase of work was at the Mayo Clinic where he focused on large issues of health care, such as testifying to Congress in favor of multispecialty group practice against the position of the AMA. Cabot’s final book, The Patient’s Dilemma, written in 1940, concludes with reflections on the problems that democratic systems have in planning the future. “It may well be – if we preserve our sense of humor – that we may suspect that the phrases ‘long distance planning’ and the ‘democratic process’ are in fact contradictions of terms.” While allowing for individual freedoms of opinions and rights to change them and exercise them through voting, Cabot explains that a democratic society that cannot make long term plans and carry them out is reduced to an “absurdity.” Cabot ends the book thus:

“…we have an immense body of opinion, part of which is in this country, a handsome part of it elsewhere, which continues in spite of discouragements, to believe that there is in all human beings an inherent and irresistible desire for certain freedoms which can be obtained only under democracy. Such a view seems to me based upon irrefutable evidence going back to the beginnings of the world. Its validity I cannot doubt. Once we admit this premise, once we admit that we believe that there are in democracy certain inherent benefits essential to progressive civilization, then we are driven to the conclusion that though long distance planning under democracy is beset with many vicissitudes, nevertheless such plans must be made and, by dint of good temper and the laws of the cosmos, they may come to fruition.”  [Cabot H. The Patient’s Dilemma: The Quest for Medical Security in America. 1940.]

 

Ten.

Stardust, Hoagy Carmichael’s popular song, came to his mind in 1927 when visiting his alma mater, Indiana University, where he had earned a bachelor’s degree in 1925 and law degree in 1926. Mitchell Parish added lyrics in 1929 and the song has been recorded by Bing Crosby (1931), Nat King Cole (1956), and Willie Nelson (1978) among many others. The music and the lyrics are equally compelling, with Parish linking “the purple dust of twilight time,” the stars, and memories of a lover: “And now my consolation is in the stardust of a song.”

The original title was two words, Star Dust. Astronomers have learned much about the topic since Hoagy’s day: the elements of stardust heavier than hydrogen and helium, up to iron, required stellar furnaces for their creation, but heavier elements required the greater manufacturing complexity of supernovae. That life is literally made of stardust is not just a figure of speech; the stardust of a song is a lyrical metaphor of a higher order. Lying somewhere between cosmic stardust and its human incarnation are the daily work and politics of humanity, and these have been the focus of matulathoughts.org.

I came to Ann Arbor in 1984 from Walter Reed and the U.S. Army at the invitation of Section Head Ed McGuire, who very positively impacted the world of urology and me. I inherited the stewardship of Michigan Urology from another great urologist and our inaugural chair, Jim Montie. Previous leaders of urology at Michigan educated superb urologists, from Nobel Prize winner Charles Huggins and Reed Nesbit, the first section head, through Jack Lapides, who trained another splendid cohort including Hugh Solomon, whom we often see at Grand Rounds. [Below, Hugh and Jim.] Following Jack, we had Ed, Joe Oesterling, Bart Grossman, and then Jim. They all brought things to the table, so to speak.


My appreciation is profound to our faculty, staff, Nesbit alumni, and friends of the department. You have made my time as chair a joy. Sandy Heskett has been with me from the start of my administrative duties in Allen Lichter’s dean’s office and she has somehow dissolved the problems of each day and kept our department as well as your old chair on track. Jack Cichon and Malissa Eversole have been incomparable in their work and loyalty to our team. Thanks, too, to my colleagues and friends on the faculty, in the Dean’s office, and on central campus. It has been a great run for me, but it isn’t over yet.

We appreciate your interest and will be back on the first Friday of next month at this website, matulathoughts.org. Meanwhile, we encourage any comments from you.

David A. Bloom
University of Michigan, Department of Urology, Ann Arbor