Michael Baumgaertner, a professor of orthopedic trauma at Yale University, made one of the most important discoveries in modern orthopedic surgery. For decades, fixation failure in intertrochanteric hip fractures was an elusive problem, and it took generations of accumulated experience to identify the variable most strongly associated with implant failure in this common injury: the tip-apex distance. There are certainly multiple variables associated with fixation success, but his discovery has had the most profound impact. Today we take this elementary information for granted, but arriving at such a seemingly simple concept required decades of cases and numerous failures. I have had the opportunity to hear him lecture; his humility and his understanding of our cognitive limitations are profound. He wisely said that he "feels sorry for his patients 30 years ago, and feels sorry for his patients today," understanding that each successive decade of orthopedic practice, with its rapidly improving knowledge and technology, gave him an entirely new fact pattern with which to treat his patients more successfully.

Ego impairs our ability to learn and creates an environment unfavorable to meaningful progress. There are four categories of cognition: known-knowns, known-unknowns, unknown-knowns and unknown-unknowns. Our intention is to accumulate more known-knowns and to minimize the volume of unknown-unknowns, but this is a continually fluid process. The more we understand a topic, the more esoteric and elusive mastery of the subject becomes.
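To make Baumgaertner's metric concrete: the tip-apex distance (TAD) is the sum of the distances from the tip of the lag screw to the apex of the femoral head on the anteroposterior and lateral radiographs, each corrected for radiographic magnification, with a combined value under roughly 25 mm predicting a low risk of screw cut-out. A minimal sketch of the calculation; the measurement values are hypothetical, not taken from the original series:

```python
def tip_apex_distance(x_ap, x_lat, d_true, d_ap, d_lat):
    """Combined tip-apex distance in millimetres.

    x_ap, x_lat -- tip-to-apex distance measured on the AP and lateral films
    d_true      -- known true diameter of the lag screw (mm)
    d_ap, d_lat -- screw diameter as measured on each film, used to
                   correct for radiographic magnification
    """
    return x_ap * (d_true / d_ap) + x_lat * (d_true / d_lat)

# Hypothetical measurements; a combined TAD under ~25 mm was associated
# with a low risk of lag-screw cut-out.
print(f"TAD = {tip_apex_distance(12.0, 14.0, 8.0, 9.0, 9.5):.1f} mm")
```

The point of the anecdote stands: a two-line formula took decades of failures to find.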
The Dunning-Kruger Effect is a cognitive bias describing the mismatch between confidence and competence. People with little competence in a domain tend to be overconfident in it. As competence increases, confidence paradoxically decreases, because recognizing nuance and complexity leaves less room for certainty and absolutism. Complex systems are often simplified to facilitate understanding, but oversimplification fails to appreciate the intricacy of a given system or domain. As an individual, or an entire research field, gains more knowledge about a system, there are often more questions than answers.
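A toy simulation can make this miscalibration concrete. In Kruger and Dunning's original data, self-assessments regressed toward a flattering mean: the least competent quartile overrated itself the most, while the most competent quartile slightly underrated itself. The linear model and coefficients below are illustrative assumptions, not values fitted to their data:

```python
import numpy as np

rng = np.random.default_rng(0)
competence = rng.uniform(0, 1, 100_000)  # true skill, on a percentile scale
# Illustrative assumption: self-ratings are compressed toward a flattering
# anchor, so the gap between confidence and competence shrinks with skill.
confidence = 0.55 + 0.25 * competence + rng.normal(0, 0.05, competence.size)

for lo, hi in [(0.00, 0.25), (0.25, 0.50), (0.50, 0.75), (0.75, 1.00)]:
    q = (competence >= lo) & (competence < hi)
    gap = confidence[q].mean() - competence[q].mean()
    print(f"actual skill {lo:.0%}-{hi:.0%}: "
          f"mean self-rating {confidence[q].mean():.0%} (gap {gap:+.0%})")
```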
It is imperative to understand that all facts have a half-life. If I say, "the earth is round," that fact has a nearly infinite half-life. If I say, "my daughter wears a 2T shirt," the half-life of that fact is far shorter. This concept is captured by the Lindy Effect, which holds that the expected remaining longevity of a technology or an idea is proportional to its current age. It separates signal from background noise, but only the passage of time can expose which is which. As a result, we must be willing to unlearn or modify our thought processes and remain open to the idea that previous conceptions may actually be wrong. Learning is a temporal process in which mastery is never fully achieved. As time passes, our understanding of dynamic systems becomes more accurate, because short-lived facts will change. For example, we understand that subjects exposed to an average of 20 mSv of whole-body radiation are more likely to develop malignancy. Longitudinal studies and econometric analyses were necessary to recognize that radiation exposure was the most significant variable leading to leukemia and solid tumors years after the exposure itself. Biological systems can resist acute stressors, and the manifestation of disease may not become evident until years or decades after exposure. The longer the interval from exposure to disease manifestation, the more difficult it is to correlate the two variables.
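The Lindy Effect's proportionality claim can be made precise: if lifetimes follow a power-law (Pareto) distribution, the expected remaining life of an idea that has already survived to age t is proportional to t. A minimal Monte Carlo sketch, assuming a purely illustrative Pareto shape parameter of 2:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 2.0                                      # illustrative Pareto shape
lifetimes = 1.0 + rng.pareto(alpha, 10_000_000)  # total lifetimes T >= 1

# For a Pareto tail, E[T - t | T > t] = t / (alpha - 1): the longer
# something has already lasted, the longer it is expected to keep lasting.
for age in (2.0, 4.0, 8.0, 16.0):
    remaining = lifetimes[lifetimes > age] - age
    print(f"age {age:>4.0f}: mean remaining life {remaining.mean():5.1f} "
          f"(theory {age / (alpha - 1):.1f})")
```

Under this assumption, an idea that has survived forty years is expected to survive roughly forty more, while last year's fashionable result carries no such warranty.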
Our brains subconsciously extrapolate from even a trivial observation, making inferences that can never be fully corroborated. The Nobel Prize-winning physicist Richard Feynman, in his book "Surely You're Joking, Mr. Feynman!", gives an example: when presented with a brick, we reasonably assume that it is solid throughout, but we can never actually see inside it to prove this. Even if we shatter the brick into 1,000 pieces, we are only observing new surfaces, never the inside of the brick. We can therefore reasonably infer that the brick is solid in its entirety, but we cannot prove it with absolute certainty. Through this example, he demonstrates that even when presented with a trivial fact, "this is a brick," we make inferences based on previous experience without appreciating the complexity hiding in such perceived triviality. The analogy itself is inconsequential, but it points to a level of complexity that is not overtly obvious. As factual information grows in complexity and importance, avoiding false inferential reasoning becomes more critical.
Richard Feynman was a theoretical physicist, awarded the Nobel Prize in Physics in 1965 for his work in quantum electrodynamics. He had originally worked out an equation for his theory, but laboratory tests failed to reproduce his predictions. Feynman took many sabbaticals in his career, several of which had nothing to do with physics. When experiment did not corroborate his hypothesis, he lost his passion for physics and traveled abroad, learning to play new instruments and to make particular sounds with his voice. Upon his return to America, he had a difficult time rediscovering his passion for physics. While he was away, it was discovered that a physical constant he had relied upon in his calculations had been slightly mismeasured. Because he was out of the country at the time, he was not made aware that a correction had been published. After a colleague informed him of the correction, the recalculation of his theory was consistent with the laboratory findings, work that ultimately led to his Nobel Prize. His theory had been correct all along, but reliance on previously published information, later found to be slightly incorrect, made all the difference.
Developing fact patterns and understanding complex systems is a dynamic process, continually refreshed with new information. Anyone who has used a compound microscope knows there is a coarse adjustment and a fine adjustment to bring the object of interest into focus. As we accumulate knowledge and experience, randomized controlled trials are akin to the coarse adjustment, while trial-and-error, pattern recognition and personal anecdote become our fine adjustment. Both are vitally necessary when trying to understand the world in which we live and operate. We make inferences from our observations, and the variability of those inferences diminishes as the number of observed events increases.
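That last observation is ordinary sampling statistics: the spread of an estimate shrinks roughly as 1 over the square root of the number of observations. A minimal sketch, assuming a hypothetical complication rate of 30%:

```python
import numpy as np

rng = np.random.default_rng(0)
true_rate = 0.30  # hypothetical complication rate we are trying to estimate

for n in (10, 100, 1_000, 10_000):
    # 500 independent "careers," each observing n cases
    estimates = rng.binomial(n, true_rate, size=500) / n
    print(f"n = {n:>6}: spread (SD) of the estimates = {estimates.std():.4f}")
```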
Successful science is a function of the passage of time, appropriate interpretation of data and accurate analysis. It is imperative to avoid confirmation and desirability bias, which inaccurately affirm an individual's desired outcome. All information must be considered: affirmative data and, more importantly, negative data. Feynman once said, "The first principle is that you must not fool yourself — and you are the easiest person to fool." Only the passage of time will provide the information needed to understand how a biologic system responds to a specific environmental or pharmacologic exposure. Universal, mandatory treatment of a highly complex biological system, without the appropriate passage of time, is deeply concerning policy. Mandating this treatment in children, who face an infinitesimal risk of developing severe viral illness, is malicious. If we are to make a truly informed decision regarding consent to receive a treatment, ALL information should be disseminated. Delaying the release of this information for 55 years should elicit more skepticism and outrage than it currently does.