DAB What’s New May 5, 2017
Ideas, evidence, & anniversaries
Ideas and evidence, that is, information indicating whether ideas or propositions are true, have been accumulating at increasing rates over the past dozen millennia of human progress. Michelangelo’s Hand of God in the Creation of Adam illustrates this concept beautifully, with the sagittal brain embodying mankind’s divine creative spark (Sistine Chapel fresco, c. 1511). [See Meshberger. JAMA. 264:1837, 1990.] The University of Michigan has been a significant player for the past 2 centuries of that narrative. The university launched its bicentennial celebration last month, the Medical School had its 150th anniversary (sesquicentennial) 17 years ago, and in a few years the Urology Department will have its own centennial. These are not just self-congratulatory moments, but worthy celebrations given the impact of each of these three entities.
Long preceding our particular institution, universities began in medieval Europe as ecclesiastical places of learning, teaching, and study. Mostly shedding their sectarian roots over ensuing centuries, universities became, in turn, technical schools, research centers, professional schools, and now giant enterprises of academia that also aggregate sophisticated athletic teams, musical societies, technology transfer businesses, and health systems. Most fundamentally, universities teach the next generation of society and address the world’s problems, generating new ideas and finding evidence to arbitrate which facts are true facts (in the terminology, once again, of Don Coffey). Universities are humanity’s best bet as honest brokers for tomorrow to teach our successors, build better societies, and pursue truth.
The University of Michigan, Medical School, and Urology Department have much to celebrate. The university originated as a small school in Detroit in 1817, the Medical School began in 1850 in an Ann Arbor classroom for 92 students, and Michigan Urology claims the 1920 arrival of Hugh Cabot (below) for its birth. Cabots were big figures in American medicine. Older cousin Arthur Tracy Cabot was one of America’s first genitourinary specialists and a founding member of the American Association of Genitourinary Surgeons, and Hugh’s brother Richard was a celebrated Boston internist. Hugh Cabot’s life was deeply impacted by military service in France during WWI. Returning to Boston in 1917 and unfulfilled in his private practice, Cabot jumped at the chance to come to Michigan as full-time surgery chair. He quickly became dean and in 1926 opened a modern hospital (1000 beds) with a multispecialty academic medical practice that defined 20th century medicine. Cabot’s first 2 urology trainees were Charles Huggins and Reed Nesbit. One would win a Nobel Prize and the other would shape the future of clinical and academic urology, in addition to succeeding Cabot as the urologist of record in Ann Arbor. [McDougal, Spence, Bloom, Uznis. Hugh Cabot. Urology. 50:648, 1997.]
Humans are natural historians and find it pleasing, useful, or sobering to rewind the past with anniversaries, centennials, or other markers that inform, inspire, or caution. For example, on today’s date in 1864 the Battle of the Wilderness began, a time when our Medical School was fairly new. The Civil War was much on the minds of Michigan medical students then, who would go off to fight for the north or south after graduation. Wilderness was the first battle of Lt. General Ulysses S. Grant’s 1864 Virginia Overland Campaign and, although tactically inconclusive with heavy losses on both sides, it thrust Grant into a national spotlight carrying him eventually into the White House.
The disabilities and deaths of the Civil War affected most people and families in the United States. Wars, with countless traumatic crises for soldiers and civilians, perversely stimulate improvements in healthcare. Infection and antisepsis were not understood in 1864, and even minor wounds from musket balls or the more accurate Minié ball, prominent in the Crimean War and American Civil War, became lethal long after the instant of injury because of subsequent sepsis. [Above: Battle of the Wilderness; near Todd’s Tavern, Orange County, Virginia, May 6, 1864. Imagined scene in the Civil War Print Series by Louis Kurz and Alexander Allison c. 1887.] Fifty years later antiseptic technique was commonplace and the surgical repertoire had expanded greatly by the time the U.S. entered WWI, ridiculously claimed as “the war to end all war.” Such horrendous conflict, however, not only gets repeated, but grows ever more horrendous as technology expands weaponry. The experiences of medical personnel like Cabot in WWI translated into new knowledge, skills, specialties, and systems that refined health care in the world that followed, until the next wars.
Michigan’s Medical School had been open for 11 years when the Civil War began, and the 2 years of lectures needed to produce an MD hadn’t changed much. Dogma filled the curriculum, with little evidence for medical practice beyond personal experiences. The educational process was two-dimensional, consisting of faculty vs. students in classrooms. The lectures included concepts as ancient as Hippocratic and Galenic theories of little use in the real world. Medical students had only a simplistic understanding of trauma based on gross anatomy, lacking any sense of physiology, infectious disease, or cellular response to injury. Trauma care was mainly a matter of bandaging and crude orthopedic management. Anesthesia was rudimentary and surgical options beyond amputation were few. Most of what was taught in medical school as fact at the time would vanish under the scrutiny of science, and emerging medical disciplines enlarged the curriculum in length and content. A UM hospital in 1869 (initially a dormitory for patients undergoing surgery in the medical school – shown below) opened a third dimension of inpatient clinical experience at bedsides as medical subspecialties began to form. Laboratory instruction, in emerging biosciences, provided a fourth dimension of medical education as a verifiable conceptual basis of health care was assembled.
Successive hospital iterations offered increasingly complex clinical experiences for medical students as well as patients, and by the time of the 1910 Flexner report didactic classroom and laboratory experiences were equivalent to patient care experiences in the Medical School curriculum time and budget. An outpatient building in 1953 added a fifth dimension of ambulatory care that, in its own turn over the next 50 years, would exceed the scale of inpatient experience as medical specialties required more outpatient learning than bedside education. To maintain a clinical and scientific footprint for 700 medical students, 200 Ph.D. candidates, and 1100 residents and fellows, it became evident that a new dimension of statewide clinical opportunities and affiliations would be necessary. This has been happening over the past 15 years with Livonia, East Ann Arbor, Brighton, Northville, a growing number of professional service agreements, and regional affiliations such as MidMichigan and MetroHealth that create opportunities for “population health management” for the University of Michigan Health System (now Michigan Medicine), representing a sixth dimension of health care education. In many respects, this new paradigm is as big a leap into the future as that first university hospital was in 1869.
Just as during the Civil War, WW1, WW2, Korea, or Vietnam (on the minds of my school cohort), national and international conflicts will affect today’s medical students who are in jeopardy, after graduation, of being thrust into action using their newfound knowledge and skills in dire circumstances of armed conflict.
Part – whole dilemma. One difficulty in healthcare today is the matter of deploying specialties for the care of patients, while keeping the whole of the patient in perspective. The specialties formed as 20th century ideas and evidence enriched the practice of medicine and the curriculum of medical schools. New areas of focused practice led to a new layer of education for medical students after graduation, known as residency training. Parallel and complementary subspecialties and epistemologies similarly formed in the sister healthcare sciences, such as nursing, pharmacy, sociology, psychology, public health, and engineering here at Michigan and around the world. In 1933 the American Board of Medical Specialties (ABMS) began to consolidate emerging medical specialties to assure the public of the training, qualifications, and professionalism of medical specialists. By 1984 Human Genetics had been added to the specialty roster and 24 medical specialties were in play, as medical practice was becoming increasingly complex and fragmented. The ABMS then stopped adding new boards and chose to manage new areas of practice through subspecialty certification or joint certification of emerging areas of practice among specific boards. This seems to have worked out well so far, with 150 areas of specialties and subspecialties now in practice. [Above: residents James Tracey, Parth Shah, and Rita Jen sorting out the work for the day after morning conference.]
No single person can successfully manage this proliferation of knowledge, skills, and technology on behalf of patients, so all parts of a given health care team must work together. The idea of a primary care gate-keeper is not working well as a coordinator of care or as a focal point to ration care. This is the “part-whole” dilemma; that is, how to reconcile the parts with the whole. We also see this socially and politically in managing a multicultural society. The same issue plays out in universities among competing and collaborating disciplines. Sociobiologist E.O. Wilson makes the case that interdisciplinarity is how the most important work for the human future is likely to take place. [EO Wilson. Consilience.] Interdisciplinarity in the Twentieth Century, the subtitle of a book by Harvey Graff, examines the part-whole relationship in universities, reviewed by Peled from McGill who concluded:
“Graff emphasizes the dynamic interdependence between knowledge, scientific epistemologies, and (inter) disciplinarity, while remaining wary of proposing any simple definitions. Instead, he stresses the importance of egalitarian exchanges and the role of history and the humanities in the study of interdisciplinarity. Although Undisciplining Knowledge provides insightful answers to largely unexplored questions, its main contribution lies in refining and reframing these questions for the benefit of historians of science and interdisciplinary researchers.” [Undisciplining Knowledge. Interdisciplinarity in the Twentieth Century. HJ Graff. Johns Hopkins University Press. 2015. Yael Peled. The domain of the disciples. Science. 350:168, 2015.]
Note the phrases “egalitarian exchanges” and “the role of history and the humanities.” Interdisciplinarity today may seem novel and groundbreaking, but it will likely transform into new fields of work and knowledge in the near future just as history shows in Michigan’s Medical School curriculum.
Evidence. The Stratton Brothers Trial began on this day in May, 1905, the first occasion for fingerprint evidence to obtain conviction in a murder trial. Alfred Stratton (born 1882) and his brother Albert (born 1884) were the first people convicted of murder based on fingerprint evidence. The case, otherwise known as the Mask Murders (stocking-top masks left at the crime scene – below), the Deptford Murders (the location), or the Farrow Murders (the last name of the victims), initiated the interdisciplinarity of law and science (now, forensic science). A smudge on the empty cashbox looked suspicious to Detective Inspector Charles Collins, who wrapped up the box and took it to the newly established Fingerprinting Bureau at Scotland Yard. Alfred’s right thumb was a perfect match. The conviction resulted in the execution of the brothers on May 23 at HM Prison, Wandsworth. Fingerprints are synonymous with unequivocal identification, truth for which no alternative explanation can be accepted. The truth matters for criminal law.
[Stratton masks. Courtesy of The Line Up website. Article & image: Robert Walsh (http://www.the-line-up.com/).]
Tolerance of deliberate untruth corrodes a free society. We cherish free speech, but we cannot be indifferent to deliberate falsehood. Just as evidence replaces dogma with verifiable information, deceitful claims must be challenged by testable facts. Few have expanded on this topic with greater clarity than Harry Frankfurt, although it seems that misdirection of facts is becoming more prevalent. [Frankfurt. On Bullshit. Princeton University Press. 2005.] Propaganda, lies, and plagiarism are breaches of the important social norm of truth and should irritate us enough to call them out: to recognize them, understand how they corrode professionalism, use them as teaching opportunities, and reaffirm our own standards.
Not every crime has its fingerprints, but just as the internet offers plagiarists opportunity to harvest cyberspace, the internet gives readers strong investigative tools. Science magazine earlier this year dedicated an issue to the matter of how evidence should inform public policy and contained an introduction to the discussion called “A matter of fact” by David Malakoff [Science 355:563, 2017].
“This is a worrying time for those who believe government policies should be based on the best evidence. Pundits claim we’ve entered a postfactual era. Viral fake news stories spread alternative facts. On some issues, such as climate change and childhood vaccinations, many scientists worry that their hard-won research findings have lost sway with politicians and the public, and feel their veracity is under attack. Some are taking to the internet and even to the streets to speak up for evidence. But just how should evidence shape policy? And why does it sometimes lose out?”
What we take as facts or truth is susceptible to change or even error. In fact, evolution is built on error. Missense is the phenomenon in which a single nucleotide substitution (that is, a point mutation) changes the genetic code such that an amino acid is produced that is different from the one specified by the original genetic code. The ultimate protein built of the amino acids may be dysfunctional or nonfunctional, as in sickle-cell disease, where the sixth codon of the hemoglobin beta chain is changed from GAG to GTG, substituting valine for glutamic acid. Random error, or perhaps “purposeful missense” from a creationistic point of view, is the mechanism of evolution and diversity.
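The GAG-to-GTG substitution can be illustrated in a few lines of Python. This is a minimal sketch: the two codon-table entries are standard biochemistry, but the helper function and variable names are illustrative, not from any particular library.

```python
# Sketch of the sickle-cell missense substitution: one base changes,
# and the encoded amino acid changes with it.

CODON_TABLE = {
    "GAG": "Glu",  # glutamic acid -- normal beta-globin codon
    "GTG": "Val",  # valine -- the sickle-cell variant
}

def translate_codon(codon: str) -> str:
    """Return the amino acid encoded by a single DNA codon."""
    return CODON_TABLE[codon]

normal, variant = "GAG", "GTG"

# A point mutation changes exactly one nucleotide (here, A -> T).
diffs = [i for i, (a, b) in enumerate(zip(normal, variant)) if a != b]

print(f"{len(diffs)} base changed: {translate_codon(normal)} -> {translate_codon(variant)}")
```

Running the sketch shows a single base change turning glutamic acid into valine, the small random error with large downstream consequence that the paragraph describes.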
We expect integrity in most transactions in society and we are justly offended when this expectation is not fulfilled. The privileges of professional occupations are based on their fulfillment of this public trust, and few professions are older or more essential than the health sciences. Error and imperfection represent the honest “missense” of humanity’s work, but deliberate deceit is another story breaking a universal taboo.
Transgressions against the public trust are especially reviled in medicine and science. A spectrum of transgressions exists, from a casual moment of dishonesty all the way to fraud, theft, and other criminality. Plagiarism sits in the middle of the spectrum. Some plagiarism is merely poor scholarship, but most often plagiarism is outright theft. Once someone falls into the plagiarism trap, it is difficult to distinguish among its variants. Self-plagiarism revolves around repeating one’s own work while representing it as new. Of course, we all repeat our own ideas and words over time, but if you write a book chapter the publisher may claim ownership of your words, so you must be careful not to repeat wholesale your own paragraphs or illustrations in later articles, especially if the newer article is to be perceived as genuinely “up-to-date.” Still, this differs from the deceit of stealing someone else’s work.
Scientific misconduct with deliberate plagiarism, fabrication, and falsification of data is a big problem, not so much in scale and prevalence – for I believe we have only occasional bad actors in our midst – but more because of its effect of distorting truth and corroding the public trust, as Jeffrey Morris examined in Science last year. [Morris. After the fall. Science. 354:408, 2016.]
Gaslighting. On May 4, 1944 MGM released a movie called Gaslight, starring Charles Boyer, Ingrid Bergman, Joseph Cotten, May Whitty, and Angela Lansbury. The story, based on a 1938 Patrick Hamilton play, concerns a woman whose husband manipulates her into believing she is insane in order to distract her from his criminal activities. One of his deceptions is causing gaslights to flicker, making his wife think her vision is unsteady. Fiction became reality as the gaslighting metaphor found use in everyday speech for forms of manipulation through denial, misdirection, contradiction, and outright deceit to delegitimize or destabilize a target. Florence Rush (1918-2008), an American social worker and feminist theorist, applied gaslighting in her work as a pioneer in studies on childhood sexual abuse. (She also introduced the concept of the sandwich generation.)
Plagiarism is one form of gaslighting, the deception being the authenticity of ideas, statements, or evidence. The assumption of truth is a bedrock expectation in healthcare. Once abused, trust is rightfully difficult to restore. For example, the trainee who fudges a laboratory report during rounds may momentarily escape with the untruth, but the intoxicating bad habit gets repeated and ultimately discovered. The same goes for plagiarism or overt research fraud, where the likelihood of discovery increases exponentially over time because perpetrators invariably repeat the offense and the longer the evidence sits in public space, the more likely it will be recognized for what it is.
Paul Simon’s 1986 song, All Around the World or the Myth of Fingerprints, challenged the metaphor of universal individuality with a great tune, but a cynical lyric. Steve Berlin of Los Lobos claimed that Simon never gave the band due credit for the music that they had previously created and played when helping Simon on the Graceland album. After the band saw “words and music by Paul Simon” on the album 6 months later, they contacted Simon, who said “Sue me, see what happens.” They didn’t. [Chad Childers. Rock Cellar magazine. July 23, 2012.]
Case reports. When I was a medical student and resident, case reports were foundational parts of medical education, expanding the generalities of systemic and organ-based learning and offering personal stories of medical detective-work. Some case studies illuminated classic presentations of disease, others were exceptions that proved a rule, and some were exotic conditions that surprised and educated us. Case studies, coming from reputable sources, carried a sense of authenticity – they were accepted as true facts beginning with the earliest medical journals such as The Lancet. In time, with the emergence of technology, defined areas of study (the disciplines, departments, specialties), the scientific method, and randomized controlled trials offered higher levels of rigor.
Case studies also provided many of us early chances to study an illuminating case, present at conferences, and even publish. Medical journals were once heavily dependent on case reports. Evolving technology added illuminating images to 20th century specialty journals. Whereas relatively few students and residents had access to million-dollar biologic labs or enormous data sets, any ambitious resident could find an interesting clinical story to expand upon and present.
In my early faculty years ivory towers began to sneer at case reports as journals marginalized and eliminated them. Hypothesis-driven research, sophisticated laboratory studies, clinical trials, and health services research dominate current medical journals. Electronic media, by threatening the business plans of medical journals, have challenged their very purpose and identity, leading many publications to retreat to imagined core functions or pander to readership surveys that represent very weak science themselves.
A few journals have, however, maintained a place for single case stories or recently restored them. Case reports are a renewed feature in The Lancet. That journal and JAMA also embrace art, commentary, and relevant news that expand their interest for many readers. A recent paper in Academic Medicine gives a strong argument for the educational value of case reports. [CD Packer, RB Katz, CL Iacopetti, JD Krimmel, MK Singh. A case suspended in time: the educational value of case reports. Academic Medicine. 92:152, 2017.]
I don’t think I’m so different from most of my colleagues in wanting medical journals that curate relevant facts and issues broadly. Anything related to sustenance of the human condition from our medical perspective should be fair game for our journals, including new evidence, ideas, technologies, therapies, understanding of health and disease, environmental threats, controversies, health care economics, educational matters, medical humanities, and art. Focus and balance are necessary for editors and boards, but the strong journals of our times (The Lancet, JAMA, NEJM, or Science, for example) seem to get it pretty much right for their readerships.
What Archie Cochrane learnt from a single case was the title of a recent article in The Lancet in its recurring section called “The art of medicine.” [Brian Hurwitz. The Lancet. 389:594-595, 2017.] The title of the article is ironic given that this Scottish physician (1909-1988) had extraordinary belief in randomized controlled trials that led to the Cochrane Library database of systematic reviews, The UK Cochrane Centre in Oxford, and the international Cochrane Collaboration. Yet, there in The Lancet, I found this article on what Archie learned from a single case. An illuminating single case can be a powerful tool, in medicine, in the broader scope of journalism, and in political speeches. Ronald Reagan was probably the first US president to use this tool in public addresses, as for example in the Pointe du Hoc speech on June 6, 1984, the 40th anniversary of D-Day at Normandy, when he alluded to stories of a leader (Lord Lovat), a bagpiper (Bill Millin), Canadians, Poles, US Army 2nd Ranger Battalion soldiers shooting ropes up over the cliff face, as well as Americans back home ringing the Liberty Bell in Philadelphia, going to church at 4 AM in Georgia, or praying on porches in Kansas. Reagan (and speechwriter Peggy Noonan) understood that the specific instance of a particular story illuminates a much larger reality.
Scientific experimentation, including the randomized controlled trial, offers a high level of rigor and verifiability in accruing new knowledge, and largely has replaced stories of individual clinical experiences; however, the work-in-progress of medical education shouldn’t be so highfalutin as to deny entirely the value of carefully presented case studies.
New rules. Last month we held a retreat for faculty, residents, and advanced practice providers (pictured above and below at Michigan League). We heard ideas and facts from Vice Deans David Spahlinger and Carol Bradford, along with strategic plans from our divisions and associate chairs who oversee the components of our missions. It became clear that our department is nearly the right size for our mission and obligations, although we will need about 10 more FTEs over the next 3 years to reach and maintain that size. Mission, essential deliverables, markets, professionalism, and work-life balance were discussed. My term as chair will come to a close and we expect to announce a search committee this summer. Once replaced, I hope to remain on the faculty in a meaningful way for a few years, just as my predecessors Ed McGuire and Jim Montie did. Jim, by the way, was unable to join us due to grandparenting privileges keeping him in Europe at the time, but he sent a short and inspiring video that explained how “culture eats strategy.” Jim’s ten pieces of advice for academic medicine, slightly rephrased below, ring very true.
a. Faculty have a higher purpose other than personal success; academic success is not a “win at all cost” endeavor. Academic medicine is not the Hunger Games.
b. Expert and empathetic clinical care is the highest priority.
c. Urology’s culture is embraced and preserved by faculty and inculcated in fellows, residents, and staff.
d. We share respect for colleagues, fellows & residents, and staff.
e. Academic productivity is important.
f. Referring physicians are highly valued and respected.
g. Try to make UM better, even at some sacrifice.
h. A team is necessary and one with diverse thoughts and backgrounds is always better.
i. Salary should be sufficient to ensure faculty are not being taken advantage of (actually or in perception).
j. Innovation is the lifeblood of outstanding academic medicine.
Jim called his list “Thoughts for living in Michigan Urology.” He also added a question for the new paradigm of Michigan Medicine: “How does Michigan Urology integrate UM affiliates into the Urology Department? Don’t wait for the institution to solve it. Decide what vision you have and move to implementing it. Get to know the people at these other hospitals and practices.”
These are our thoughts for May, a month in which the redbuds have been amazing in and around Ann Arbor.
David A. Bloom
University of Michigan, Department of Urology, Ann Arbor