If you’ve ever taken a human anatomy or osteology class, you’ve probably worked with real human skeletal remains. So, where did they come from, and should we continue to use them?
https://anatomypubs.onlinelibrary.wiley.com/doi/10.1002/ar.24868
Let me paint the picture for you. It’s a Thursday afternoon, and my entry-level anthropology lab is buzzing with anxiety (and, yes, utter horror) about the upcoming exam on the anatomy and pathologies of the human skull. Although I’m an anthropology minor, I somehow managed to put off taking my one required lab until the first semester of my junior year, so I counted myself among the nervous group of (mostly) freshmen getting ready for our first college practical exam. I did, however, have one small advantage: as a minor, I already knew the ins and outs of the lab spaces, and even had a passing familiarity with the CU Boulder assemblage of human remains that we use for study. So, I was significantly less flabbergasted than my peers when our TA informed us that the skulls we had been using regularly for classwork and studying were real human remains.
While many people would share my classmates’ reactions (horror, disgust, morbid curiosity) at this revelation, the truth is that many of us have worked directly with real human remains at some point in our scientific education, sometimes as far back as high school. But even though the acquisition and use of human skeletal remains in the classroom is a fairly common practice, the history behind these individuals (who they were, where they came from, and how they arrived in an academic setting) is often shocking and unethical, rooted in greed, classism, and utter disregard for the wishes of the deceased and their living communities. So, without further ado, here is the (albeit very abbreviated) wild history of human skeletal remains in academia, and where we may go from here. For this article, I will mostly be referencing Amber R. Comer’s article, “The evolving ethics of anatomy: Dissecting an unethical past in order to prepare for a future of ethical anatomical practice.”
The history of human remains used for academic purposes goes all the way back to the first half of the third century BCE, when Herophilus of Chalcedon, working in Alexandria, and his apprentice, Erasistratus, began practicing dissections of recently deceased humans, what we now call cadavers (Comer 2022). Despite widespread religious disapproval of the posthumous study of human bodies at the time, Herophilus and Erasistratus cemented their legacies as founding fathers of anatomy. Most notably, their observations provided foundational knowledge about the cardiovascular, nervous, digestive, and reproductive systems (Comer 2022). However, once the ethics of Herophilus’ practices began to be debated on a wider scale, the practice died out. In all, human dissection in ancient Egypt and Greece lasted only about 30 to 40 years.
Next up, the Middle Ages. While people in the Middle Ages were, for the most part, vehemently opposed to the dissection of human remains due to widespread religious beliefs, things began to change in 1240. The Holy Roman Emperor, Frederick II, decided that medical students needed to be exposed to these practices once again, requiring every student and medical professional to observe a dissection at least every five years. This shift came as the church moved away from holding the intact body as sacred, no longer spreading the belief that the body was needed for the soul to make it to heaven (Hosek 2024). That didn’t mean, however, that all opinions on this macabre practice suddenly switched. Although dissection became sanctioned by the church, it was still heavily criticized by the public. It took hundreds of years for the practice to become “normal” in the public eye, so “normal,” in fact, that the 16th century brought with it an interesting turn of events.
When you think of Leonardo da Vinci, you probably think of the Mona Lisa, or his early concepts of what would one day become the airplane. What most people don’t know, however, is that this OG “Renaissance man” also regularly engaged in grave robbing, and paid others to rob graves for him, for the purposes of his art. “The Vitruvian Man” is used today as a diagram of anatomical position in the study of human anatomy and locomotion, and we can say with near certainty that this piece of da Vinci’s repertoire was based on individuals whose graves he robbed (Comer). They were exhumed and put on display without their permission, or even the permission of their communities or kin.
Shortly after da Vinci, a man named Andreas Vesalius entered the scene. Vesalius, inspired by da Vinci’s drawings, used his own observations of human anatomy to create one of the first scientific books on the subject, “De Humani Corporis Fabrica.” While his work included important scientific discoveries, he never engaged with the concept of consent. In fact, most of his (government-sanctioned) dissections were performed on the bodies of executed prisoners, whom many believed to have relinquished their rights to their bodies when they committed their crimes (Comer).
As wild as this idea seems to us today, it was anything but short-lived. In 1540, Henry VIII, representing both the church and the government of England, created the Company of Barber-Surgeons and ruled that four executed prisoners would be turned over to the company each year for anatomical research. By 1626, Oxford University was working with the Oxford sheriff to claim as many executed individuals as possible for this purpose, and by 1636, all graduates were required to have participated in at least two dissections during their time at the university (Comer). Eventually, the Murder Act of 1752 allowed public anatomical dissection of anyone convicted of murder, cementing the idea that the use of the body for science was punitive, and even humiliating. Ethical objections were refuted with an “ends justify the means” mindset: the knowledge gained from these studies, the argument went, made the “barbaric” practice itself worth it (Comer, Hosek).
In the 19th century, the death penalty began to fall out of favor, and cadavers once again became difficult to come by. Eventually, medical students, physicians, and even entrepreneurs who realized they could make good money supplying cadavers began to rob graves again. The rising rates of grave robberies (and subsequent arrests for said robberies) brought about the Anatomy Act of 1832, which allowed the bodies of those who died in workhouses, asylums, and prisons to be turned over to universities for anatomical research. Obviously, this act targeted certain populations of people, and they weren’t the rich, white medical students who would end up performing the dissections (Comer, Hosek). It took the classist trends that had long targeted prisoners to a whole new level, solidifying the role of classism, racism, and ableism in the history of anatomy, and it created a fear of dissection in poor communities that would last for centuries. Shortly after came the introduction of free burials as an incentive for “donating” one’s body: educational institutions would offer to bury the bodies (or what was left of them) for free after they finished the dissection (Comer). This targeted poor communities who could not ordinarily afford a funeral or burial service for their loved ones.
The 20th century and the Holocaust brought a new level of experimentation on human remains. I feel that the details of these experiments do not need to be rehashed here, so I will just sum it up by stressing the point that the scientific discoveries made during this time period on human remains were unambiguously non-consensual, and the trauma inflicted on the kin of these individuals is still staggering today.
Today, anatomists are still coming to terms with the implications of our field’s history. In many places, at least in the US, cadavers are now exclusively donated with the informed consent of the person or their kin. They are treated with respect, and many schools even hold memorial services for the individuals used in educational settings. On the whole, the practice of cadaver use (limited, since a body can only serve education for a short time after death) has been brought up to speed with modern ethics of informed consent and respect for the individual.
Where we have the most moral issues today is with bones. Human skeletal remains, obviously, last a very long time, so many of the specimens used in modern education have been in circulation for a very long time. These are the remains you have probably handled in class: an articulated skeleton, a skull, or even teeth used for dentistry. The 1990s brought the introduction of NAGPRA, the Native American Graves Protection and Repatriation Act. This act focuses on gaining permission to study, or, most often, returning, the remains of individuals who were part of Native American tribes. In the 30 years since it passed, NAGPRA has brought about the repatriation, and in many cases the identification, of hundreds of thousands of individuals, from museums to universities. You can even check to see how NAGPRA has made a difference in specific counties in the US here. This act has been monumental in beginning to move anatomy and bioarchaeology forward in an ethical and respectful way.
However, not every individual in these institutions was part of a Native American tribe. So, what happens to all of the other individuals? Relatively older skeletal remains can usually be traced back to India. Over roughly 180 years, up until its ban on the international export of skeletal remains in 1985, India is estimated to have shipped the remains of over 1.5 million individuals around the world. So, if lab bones arrived in the US before the mid-80s, they are most likely from India, but we will never be able to tell who they were or exactly where they came from, due to the sheer number of people exploited for this purpose. More recent skeletal remains in labs, meanwhile, were most likely distributed by the Smithsonian museums, whose collections hold hundreds of thousands of poorly documented individuals, making it impossible to repatriate them or learn anything about them either. Lastly, we have archaeological remains that are not covered by NAGPRA. Many universities across the country are working hard to track down the origins of these individuals so they can be repatriated, but it is a slow, ongoing process.
So, where do anatomy and bioarchaeology go from here? Do we continue to study the remains of those we can’t identify? Do we even continue to use the books and knowledge built on individuals who were obtained unethically? If we don’t, was it all for nothing? These are the questions scientists are asking every day, and many believe there is no right answer. The (literal) skeletons in these institutions’ closets carry histories so deep that we can only hope to someday make restitution for them. Until then, we can only move forward with the intent of respect, consent, open minds, and, most importantly, the willingness to listen and learn from the communities from which we stole.