First Toy Mathematical Instrument Maker to the King
If there’s one thing monarchs like, it’s a title. Titles create order and structure and simultaneously confuse and dazzle Americans. After years of stalking royals, I’m familiar with most of them, from duchess to earl to Queen Clarice’s Order of the Rose. But in researching George III for this month’s series, I came across a new title that I just had to share with our readers: mathematical instrument maker to the King.
Stefanie mentioned in Week One of our series that George was the first British ruler to learn science as part of his official curriculum. His tutor, John Stuart, 3rd Earl of Bute, exposed him to “natural philosophy”, as science was called at the time. The Enlightenment era was just emerging, and with Stuart’s influence, George became an enthusiast of all things physics and math. This interest extended outside of the classroom. According to the Science Museum in London, “George III hoped to demonstrate that he governed the nation according to reason and virtue, so that his citizens would also aspire to these values. Furthermore, both Bute and the King believed that physically using instruments and undertaking mathematical exercises helped to cultivate the rational mind.”
If you know, you know.
Although George’s rational mind was besieged by illness in his later years, his passion for the natural sciences continued. He maintained an impressive collection of scientific instruments, now a popular installation at the Science Museum. And while instrument makers were a dime a dozen in London at the time, there were two men in particular that the king relied on to create his impressive contraptions: George Adams Sr. and his son, George Adams Jr.
On Fleet
It turns out George’s passion for science was very trendy at the time. In the eighteenth century, London went from being the capital of the world for bangers and mash to the epicenter of scientific instrument making. According to Gerard L’Estrange Turner, a leading historian on the subject, instrument makers enjoyed a status quite different from that of other craftsmen. They often published scholarly articles, gave lectures, and mingled with the upper echelons of society. As scientific knowledge grew, people craved demonstrations of the natural laws and principles they learned about. Scientific instrument makers delivered just that, combining the craftsmanship of luxury decor with the curiosity of basic science experiments. Think desktop pendulum, but make it fashion.
One of the famous Adams globes, which sold for over 17,000 USD in 2008. Christie’s
Although George was a customer of several instrument makers in the city, he was most loyal to a family business located on Fleet Street. George Adams opened his shop, specializing in globes and microscopes, in 1734, and worked his way up to instrument maker to the King. After he died in 1772, his son, George Jr., succeeded him. Jr. was also an optician and wrote textbooks to keep the business afloat amidst budget cuts. The Adams family business had a sizable impact on scientific exploration in its heyday, and many of their high-quality creations are still in existence. But their pièce de résistance was the stunning centerpiece of George’s collection: the philosophical table.
Table Talk
The philosophical table was designed to allow for multiple demonstrations of physics principles using the same instrument. I guess versatility never goes out of style. The table had so many applications, in fact, that George Adams Sr. published an entire book devoted to it, entitled A Description of An Apparatus for explaining the Principles of Mechanicks [sic] made for His Majesty King George the Third. It was a rather unimaginative title, but Adams’ creation was anything but. Pillars at the ends of the table anchored interchangeable instruments for investigating collisions and central force, and there was also an apparatus that demonstrated how a carriage works. Over the years, the Adams family would also contribute instruments that allowed investigation of air pressure and simple machines. Their inventions captivated the mind of King George, and they continue to inspire awe in museum goers.
King George’s stunning philosophical table remains in pristine condition as a popular museum attraction. Science Museum of London
A stunning illustration from the 1762 book published about the philosophical table and its applications. Science Museum of London
Gadgets and Gizmos Aplenty
I hated physics in college, but there is something to be said for how simply basic physics principles can be demonstrated. By dreaming up ways to illustrate scientific discoveries, the Adams duo made science accessible and engaging. In the end, George’s commitment to scientific learning could not save him from the grip of neurological illness, but his passion influenced a generation of experimenters and tinkerers. Not bad for a mad king.
Tapdrup, J. (2002, July 01). Adams of Fleet Street: Instrument Makers to King George III (review). Retrieved from https://muse.jhu.edu/article/33990
Turner, G. (1976). The London trade in scientific instrument-making in the 18th century. Vistas in Astronomy, 20, 173-182. doi:10.1016/0083-6656(76)90029-5
George III, King of Great Britain and Ireland from 1760-1820
George IV, King of Great Britain and Ireland from 1820-1830 (aka George Jr.)
Charlotte of Mecklenburg-Strelitz, Wife of George III
Princess Amelia, Daughter of George III and Charlotte
William Pitt the Younger, Prime Minister of the United Kingdom (Tory Party)
Charles James Fox, British Politician (Whig Party) and First British Foreign Secretary
If there is one thing we have all had enough of this past year, it’s party politics. So lucky for you, party politics is exactly what we will be focusing on this week! We aim to please here at ULTC… I would like to apologize in advance to my experts in 18th-century British politics – my discussion of the political consequences of George’s reign will be rudimentary. There is only so much room for discussion in this blog post without subjecting you all to War and Peace!
Whigging Out
Here are the basics: In the late 1600s, two political ideologies were emerging in England, with one of the main differentiators being religion. If you will remember back to our series on Henry VIII, England had historically been a Catholic kingdom until Henry rejected the Pope and was excommunicated. A century later, England had become a predominantly Protestant country; however, there were many who were still loyal to the Catholic religion, albeit mostly in secret. The opposing views affected more than just how people chose to pray in their own homes; they also had consequences for those in the line of succession. At this time, debate was heating up as to whether the future James II (then the Duke of York) should be allowed to inherit the crown, based solely on the fact that he was openly Catholic. The “Whigs”, as they came to be known, were against James, and the “Tories” supported his right to the crown.
By the time George III became king in 1760, the definitions of Whig and Tory had evolved and, under George’s reign, concrete parties with specific political views began to emerge. The Tory Party “broadly represented the interests of the country gentry, the merchant classes, and official administerial groups” and the Whig Party “came to represent the interests of religious dissenters, industrialists, and others who sought electoral, parliamentary, and philanthropic reforms”. (Britannica) In 1783, a 24-year-old from the Tory party became Prime Minister of Great Britain – William Pitt the Younger (not to be confused with his father, also named William Pitt, who had been Prime Minister as well). On the other side of the aisle, leading the Whigs, was Charles James Fox. They were the Alexander Hamilton and Aaron Burr of British politics, constantly butting heads and forming a rivalry that defined a generation of party politics. More importantly, George III loathed Fox, and the feeling was entirely mutual. It was no secret how they felt about each other, and so it was all the worse that George III’s eldest son and heir, George Jr., openly supported Fox and the Whig Party. Kind of similar to when I rooted for the Yankees for a period of time just to upset my father, a Red Sox fan. George and George Jr. disagreed on everything, and Fox did nothing to try to bridge this divide. Instead, he exacerbated it by supporting a motion to give Jr. a bigger yearly allowance, something the king was adamantly against seeing as his son had a propensity for gambling all his money away. In turn, George Jr. was unhappy with his father for denying him a meaningful military position, driving him further into the waiting arms of his father’s political nemesis.
A classic game of tug of war between the Tories and Whigs as they fought for influence over the crown. loc.gov.
Bros Before Geo’s
When George fell ill for the first time in 1788, it was widely believed that he would not survive. In response, the Government began the first (and, as it turned out, the first of many) plans to make George Jr. the regent until the inevitable happened and the king passed. While the Prince was pumped about the prospect of his upcoming promotion, he was pretty much the only one. The government’s ministers were getting nervous as George Jr. began gathering his Whig bros and promising them influential positions, which would mean a shift in power from the currently dominant Tories to the party led by the king’s enemy, Fox. Contentious debate broke out in the Government over two questions: (1) whether George Jr. should in fact be named regent (just because he was the heir to the throne didn’t mean he would automatically be the regent), and (2) what the extent of his authority would be. Unfortunately for George Jr., his father began to show signs of a miraculous recovery. On April 23, 1789, George III made a triumphant return to public life when he attended a church service, much to the delight of the people. And to the disappointment of Jr.
Oh look, it’s every bro I ever met at a frat house (aka George Jr.). historic-uk.com.
Frenemies
The year 1801 was a significant one in British history, as Great Britain (Scotland and England) joined with Ireland to form what we know today as the United Kingdom (minus Southern Ireland). This is why you will often see George III’s title written as “King of Great Britain and Ireland”, because for the first years of his reign, they were not one entity. It was not a happy union by any means and, like most issues at this time, religion was at the heart of the problem. Two hundred and seventy years after our boy Henry VIII had turned his back on the Catholic Church, the effects of his decision could still be seen front and center. For two centuries, Catholics had lacked the same rights as their fellow Protestant citizens – unable to own property, vote, or run for office. By the late 18th century, many of these laws had been dialed back a bit, but there was still a long way to go. When Great Britain absorbed Ireland, a country with a large Catholic population, the intolerance of the laws was even harder to ignore. There was one important man in the government who was a proponent of dramatically reversing the Catholic discrimination laws – William Pitt the Younger. Unfortunately for Pitt (and for Catholics), George III was “bitterly anti-Catholic” (Britannica). And so, Pitt found that he was in direct opposition to the man who was at one time his most important ally – the King himself.
As you can see from this diagram, there is a lot going on when it comes to the various names one might use to refer to George’s kingdom. brilliantmaps.com.
Why is all of this relevant to our “mad” king? In 1801, George III again fell critically ill, this time spending several days in a coma. And again, his family and his country prepared for his death. But George would not go easily. In less than a year’s time, it appeared as if he had made another recovery, and “in the summer of 1803 he was said to be perfectly fit again” (Hibbert, 321). Last week Riley presented several fascinating theories as to what could have been the cause of George’s mental and physical ailments, but George himself was convinced he knew the catalyst: the stress of Pitt’s campaign to restore Catholic rights had driven him to the brink of death. In fact, Pitt felt so bad about his apparent role in the King’s decline that he “swore he would no longer bring up the matter of Catholic rights again, because he feared the stress of it had contributed to the King’s illness” (Hibbert, 316). George suddenly found that men from both parties, who could rarely agree on anything, agreed that they pitied the King and did not wish to contribute in any way to additional health scares.
Mind Control
This theory that Pitt’s campaign was the catalyst of George’s decline lost its legs by 1804, when the King was again sick. Although he eventually made a physical recovery, those around him could not deny that his mind and personality were much changed – in particular, his habit of long and hurried speech was worse than ever. Even the Speaker of the House of Commons at the time claimed that the King’s “disorder had taken the ‘decided character of a complete mental derangement’” (Hibbert, 340). It had been a rough six years for George, and those close to him were feeling the strain as well. Queen Charlotte especially began to crack under the pressure, and the fight to control the narrative of the King’s health led to quarrels between George’s wife and their sons. And so “the Queen and most of her children had signed a declaration to the effect that the Cabinet, not the Royal Family, must make all the necessary decisions about His Majesty’s treatment in future” (Hibbert, 345). In short, Pitt was now in charge of making George’s important medical decisions.
George III holding an (almost) to-scale Napoleon. The French general eventually surrendered to the British in 1815. Metropolitan Museum of Art.
As George’s health continued to decline, his popularity rose. It was not only compassion and pity that endeared him to his people, but also British military victories against the French, who had declared war on Great Britain following the French Revolution. George, even in his sickness, was seen “as a stolid, reliable, honest, dependable monarch as well as a national symbol” (Hibbert, 390). But all of the popularity and riches in the world could not save George from the unimaginable heartbreak that was to deal him his final mental blow.
A Beautiful Mind
In 1810, George’s favorite daughter, Princess Amelia, died tragically of tuberculosis. George was inconsolable, and this time he would not make one of his miraculous recoveries. In fact, he periodically had to be reminded that Amelia had passed, for he often believed that she was still alive. It was abundantly clear at this point that someone must officially be named regent, and this time Jr. won the prize. On February 6, 1811, the Regency Act was passed. In what could be viewed as a sign of growth, George Jr. did not actually replace the government’s ministers with all of his Whig friends, as he had been planning on doing back in 1789. What led to this mature decision? Jr. claimed that he did so because he was, after all, “only acting for his father for the time being” (Hibbert, 398). As it turned out, this decision had profound effects on Britain’s history – the Whig Party at the time was ready to quit the war with France and essentially let Napoleon have his way with Europe. If George Jr. had brought these men into the fold, who knows if the British would have remained in the war long enough to defeat Napoleon in 1815!
A depiction of George III towards the end of his life – a far cry from the young and energetic man we were first introduced to. en.wikipedia.org.
The last decade of King George III’s life was sadly one of isolation. He was deaf and blind, and he had outlived his wife. When George passed in January of 1820, at the age of 81, the people of the United Kingdom celebrated. However, they weren’t celebrating his death; they celebrated his life and what he had accomplished in the midst of the relentless battles against his own mind and body. His missteps early in his reign were largely forgotten, and instead George was “widely seen as a symbol of the greatness of the British constitution” (Black, 413). Over the years, George has dropped off the map of British history as a result of the rise in interest in the Tudors and the Stuarts (guilty as charged). Unfortunately for this king, his name was thrown back into the limelight because of the emphasis on his “madness”, which over the years has been portrayed in numerous movies, plays, and television shows. What has been lost with time is the humanity of George – he was a devout family man who didn’t drink or gamble or indulge in parties and food like many of his contemporaries. He loved his children (even Jr. in the end) and was a domestic man at heart. He also loved his country, which was probably why he was involved in politics to the extent that he was, as he tried to bring the same morality and integrity to the government as he did to his own home. George was far from perfect, for sure, but he deserves to be remembered for more than his lowest moments. As do we all.
Black, Jeremy. George III: America’s Last King. Yale University Press, 2008.
Brooks, Rebecca Beatrice, et al. “What Was the Olive Branch Petition?” History of Massachusetts Blog, 7 Mar. 2020, historyofmassachusetts.org/what-was-the-olive-branch-petition/.
When Stefanie suggested a series on King George III, my only reference was the flamboyant, comical character from “Hamilton”. Arrogant, sure, but not suffering from a mental illness. But when I learned that for centuries, King George has been considered one of the most insane royals of all time, it gave new meaning to the lyric, “when you’re gone, I’ll go mad/so don’t throw away this thing we had.” Lin-Manuel Miranda, in all his genius, didn’t miss a thing, because indeed it was after the American colonies had broken free from Great Britain that George’s mind began to deteriorate.
Honestly, same. AZ Quotes
Although early psychohistorical analyses of George liked to point to 1765 as the onset of his bouts of mental illness, medical records clearly indicate that this was a purely physical ailment. The year 1765 was likely pinpointed as the beginning of his mental deterioration because of the tempting prospect that George’s madness led to the American Revolution and therefore dramatically altered the course of history. Scholars who adopted this view argued that George must have been on the verge of mental illness his whole life. In the historical records, they saw evidence of suppressed sexual desire, family drama, and mounting political pressure that must have precipitated a psychiatric episode.
However, that hypothesis began to change in the 1960s. Researchers who reviewed the relevant primary documents found that the earliest evidence of any psychiatric symptoms in George is from 1788, as Stefanie mentioned last week. Not only does this invalidate the tantalizing but unsupported claim that George’s madness led to the American Revolution, it also calls into question the portrayal of George as a fragile neurotic. If his mental health was really so delicate as to be triggered by external pressures, the revolt of the financially lucrative colonies probably would have fit the bill. But by all accounts, George seemed perfectly healthy and sane until five years after the war ended. The debate over just what caused George’s mental deterioration continues to this day.
I Ain’t Here For a Long Time, I’m Here for an Enzyme
In 1966, Dr. Ida Macalpine and her son Dr. Richard Hunter published a paper in the British Medical Journal that has dominated the theories surrounding George’s madness for over half a century. By combing through correspondence and daily reports from the royal physicians, Macalpine and Hunter established a characteristic pattern in the evolution of George’s five episodes. Importantly, his mental symptoms were preceded by physical ones: a cold turned into stomach pains, which turned into sensory disturbances and generalized weakness. Sometimes he had tremors or problems holding his gaze. In addition, he would have profound psychological symptoms such as mood swings, hallucinations, insomnia, loss of inhibitions, and increased/rapid speech. Because the mental changes were always accompanied by bodily, or somatic, symptoms, the authors rejected the hypothesis that George was suffering from manic depression or simply a personality disorder. Macalpine and Hunter focused particularly on his abdominal discomfort, nerve pain, and psychological changes and saw a clear diagnosis. “Reviewed in this light,” they wrote, “the symptomology and course of the royal malady reads like a text-book case” of something called acute intermittent porphyria (AIP).
Like many of the diseases we’ve discussed on this blog, AIP has genetic origins. Unlike those we have covered, however, AIP doesn’t mainly affect the brain, but rather the liver. Patients with AIP have mutations in a gene that encodes an enzyme called porphobilinogen deaminase, or PBGD for short. Enzymes are proteins in the body that help facilitate chemical reactions. A familiar example is lactase, an enzyme which helps break down the lactose found in dairy products. In the case of PBGD, this reaction is the third step in an eight-step process to make something called heme, which is needed for the transport of oxygen in your bloodstream. Some people who have mutations in PBGD never have issues with heme production. However, stress, substance abuse, or changes in nutrition can cause PBGD to stop working periodically in genetically predisposed individuals, leading to acute episodes. This causes a buildup of heme precursors called porphyrins, which can be toxic, as well as a heme deficiency. The analogy here would be that people who are lactose intolerant are deficient in the lactase enzyme, so when they eat dairy, the lactose doesn’t get broken down the way that it should, leading to gastrointestinal discomfort, as the paternal side of my family can attest. However, while lactose intolerant folks can take a Lactaid pill containing the enzyme they need, people with AIP aren’t so lucky, and their enzymatic deficiency can cause major problems.
This diagram shows the eight-step process needed to make heme. Different types of porphyria, listed on the right, result from deficiencies in the different enzymes needed for each step. Stepwards
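The bottleneck logic at work here (one impaired step starves everything downstream while material piles up behind it) can be sketched with a toy simulation. To be clear, this is purely illustrative: the pipeline model, capacities, and numbers below are invented for the sake of the analogy, not real enzyme kinetics.

```python
def run_pathway(step_capacity, substrate_in, n_ticks=100):
    """Push substrate through a linear pipeline of enzymatic steps.

    Each step converts at most step_capacity[i] units per tick.
    Returns (pools of material waiting before each step, total product made).
    """
    pools = [0.0] * len(step_capacity)
    product = 0.0
    for _ in range(n_ticks):
        pools[0] += substrate_in              # fresh substrate enters the pathway
        for i, cap in enumerate(step_capacity):
            converted = min(pools[i], cap)    # a step can only process so much per tick
            pools[i] -= converted
            if i + 1 < len(pools):
                pools[i + 1] += converted     # feed the next step
            else:
                product += converted          # the final step yields heme
    return pools, product

healthy = [10.0] * 8          # all eight enzymes at full capacity
aip = healthy.copy()
aip[2] = 2.0                  # step 3 (PBGD) impaired

pools_h, heme_h = run_pathway(healthy, substrate_in=5.0)
pools_a, heme_a = run_pathway(aip, substrate_in=5.0)

print(f"healthy: heme={heme_h}, backlog before step 3={pools_h[2]}")
print(f"AIP:     heme={heme_a}, backlog before step 3={pools_a[2]}")
```

In this toy run, the impaired pathway ends up with less heme while precursors accumulate in front of the slow step, mirroring the two simultaneous problems described above: toxic buildup and product deficiency.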
Urine For It
Indeed, George did have the characteristic symptoms of AIP: muscle pain and weakness, digestive disturbances, insomnia, and psychiatric changes. Muscle pain, abdominal discomfort, and mental changes are all due to signaling changes in the nervous system and cell death. In addition, one of the more famous and striking symptoms is dark or red colored urine due to the presence of porphyrins, which confer purple-ish pigmentation. While discoloration of the urine isn’t needed for a diagnosis of AIP, Macalpine and Hunter cited several doctors’ reports that described George’s urine as “blue”, “red”, or “bilous”, strengthening their argument. In addition, they argue that his younger sister, Caroline Matilda of Denmark and Norway, also had symptoms of porphyria at a similar age, supporting the idea that George had inherited this disease.
A urine sample from a control on the left compared to that of a patient with AIP on the right. Lin et al, 2008.
So clearly this enzymatic deficiency is bad news for the body, but how exactly does the buildup of porphyrins in the liver lead to profound psychological disturbances and neuropathic pain? If you want the answer to this question, you’ll have to get in line behind hundreds of scientists who have been trying to figure this out for decades. Although there is still a lot of research to be done, the current literature points to a combination of energetic disturbances and an imbalance in the ions neurons use to communicate, disrupting neural signaling and leading to cell death. Based on the biological role of PBGD, there are two potential reasons for this: either the precursors that accumulate are themselves toxic to the nervous system, or the lack of heme caused by their accumulation causes nervous system damage. While there is evidence that a heme precursor called ALA can interfere with inhibitory neuronal signaling, possibly leading to toxicity, researchers have pointed out that ALA levels are high even when patients aren’t exhibiting symptoms. That makes the theory that symptoms are caused by ALA accumulation hard to believe. On the other hand, it is known that heme plays important roles in the nervous system, especially in the generation of cellular energy, and thus heme deficiency could lead to neuronal death. In all likelihood, both precursor accumulation and heme deficiency contribute to the neurological and psychiatric symptoms of AIP.
It’s worth noting that Macalpine and Hunter later analyzed the medical histories of more royals in George’s family line and adjusted their diagnosis to a similar but slightly milder form of porphyria called variegate porphyria. Despite different enzymes being affected in these two diseases, the downstream biological consequences are the same and the physiological implications are similar. George had all the hallmark symptoms, blue urine included. An open and shut case, I thought, maybe even the easiest I have covered on this blog. But in the 2010s, some new researchers came onto the scene, and they weren’t buying what Macalpine and Hunter were selling.
This figure details the known downstream effects of excess porphyrin on the left and heme deficiency on the right, which could underlie the psychiatric symptoms of AIP. Pischik and Kauppinen, 2009.
Manic At the Disco
When critics of the porphyria hypothesis started to evaluate the evidence, they were quick to point out that several physicians with expertise in porphyria had doubted the diagnosis from the beginning. In fact, I found several letters to the editor objecting to the diagnosis with varying shades of outrage. They all agreed that the diagnosis of variegate porphyria was inappropriate because it does not cause severe symptoms like the ones George experienced unless patients are on specific medications. But even then, the AIP theory rang hollow. One physician noted that George experienced diarrhea, while he had only ever seen porphyric patients presenting with constipation. Another claimed that the records of changes in George’s urine color were problematic, because the urine should be normal colored when passed but change “upon standing”, which was apparently inconsistent with the historical accounts of the king. This doctor was especially testy with the British Medical Journal, which he believed was publishing Macalpine’s work for media attention. “I have my suspicions, too, about you, Mr. Editor,” he said, “as a result of your putting out a special supplement of the royal malady complete with vivid purple-coloured cover!”
Despite these objections, the porphyria diagnosis stuck, and no one was able to suggest a convincing alternative. Then in 2010, Timothy Peters and Allan Beveridge, two researchers from the United Kingdom, undertook a thorough reanalysis of the relevant historical records. They turned to the DSM and argued that George met the diagnostic criteria for a manic episode. His episodes were far longer than the one-week minimum required, and he exhibited symptoms that they found consistent with grandiosity, decreased need for sleep, racing thoughts, distractibility, increased/rapid speech, sexual disinhibition, and anxious restlessness. Based on this, Peters and Beveridge argued that George was not suffering from porphyria of any kind, but rather from bipolar disorder.
Bipolar disorder is characterized by both periods of mania and depression. VeryWell Mind.
Back in our very first series, on Charles VI, we talked about bipolar disorder, or manic depression, in contrast with schizophrenia. Patients with manic depression experience alternating periods of mania, marked by increased activity, loss of inhibition, and an inflated sense of self, and periods of depression. Peters and Beveridge observed these symptoms in the historical records and note that the length of George’s episodes is consistent with the 4.5-month average of a manic phase. They further offered alternative explanations for the physical symptoms that George experienced, but they don’t rule out the possibility that bipolar disorder itself was at the center of these symptoms. They cite research showing that physical symptoms including digestive disturbances and muscle pain, like the ones George experienced, were reported by 42% of a group of over 300 manic depressive patients. They also suggest that his symptoms could be mania resulting from the neurological effects of an infection, which might be a better way to reconcile his physical and psychiatric symptoms.
Write it, Regret it
The new theory of bipolar disorder is certainly intriguing, but it needed something more if it was going to dethrone porphyria as the prevailing diagnosis of King George. That’s when Peters reached out to a team from the University of London. In a brilliant move, the researchers decided to focus on one symptom of bipolar disorder that they wouldn’t need to evaluate through secondhand sources: pressured speech. This rapid, non-stop manner of speaking is frequently seen in bipolar patients. Its exact neurobiological cause is unknown, but seems to reflect the distracted and overactive thought pattern in patients. Dr. Kay Redfield Jamison, a psychiatrist who wrote about her own experiences with bipolar disorder in her best-selling memoir, An Unquiet Mind, described it this way: “For those who are manic, or those who have a history of mania, words move about in all directions possible, in a three-dimensional ‘soup’, making retrieval more fluid, less predictable.”
Peters and his new crew used machine learning to systematically analyze letters written by George to his political advisors between 1760 and 1810. The computer was able to extract different features of his writing and then compare the letters written during manic episodes with those written when George was lucid. In addition, they considered political stressors happening at the time each letter was written that could explain why his language or writing style changed. They found that during manic episodes, George’s vocabulary became strikingly restricted, his sentences became shorter, and he began to use more coordinating phrases, which had the effect of connecting unrelated ideas. Importantly, these changes weren’t seen when George was experiencing political pressure outside the context of a manic episode, suggesting they were not due to stress. The paper grabbed headlines across the globe.
George’s letters may have hinted at madness but his exquisite penmanship did not. St. George’s University, London
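For curious readers, the kinds of features in question (vocabulary breadth, sentence length, the rate of coordinating words) are simple enough to compute yourself. Here is a minimal sketch in Python; the feature names, the word list, and the two sample "letters" are my own inventions for illustration, not the study's actual pipeline:

```python
import re

# Coordinating words of the kind the analysis flagged; this list is illustrative.
COORDINATORS = {"and", "but", "or", "so", "yet", "for", "nor"}

def letter_features(text):
    """Compute simple lexical features of a letter."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    if not words or not sentences:
        return None
    return {
        # Type-token ratio: unique words / total words (vocabulary breadth)
        "type_token_ratio": len(set(words)) / len(words),
        # Average sentence length in words
        "avg_sentence_len": len(words) / len(sentences),
        # Share of words that are coordinating conjunctions
        "coordinator_rate": sum(w in COORDINATORS for w in words) / len(words),
    }

lucid = ("I have received your letter. The matter of the treaty requires "
         "careful thought. We shall discuss it on Tuesday.")
manic = ("And the war and the ships and the men and the war again, "
         "and so it goes and goes and goes!")

print(letter_features(lucid))
print(letter_features(manic))
```

On these two toy letters, the "manic" sample shows a much lower type-token ratio and a much higher coordinator rate, which is the direction of change the researchers reported in George's manic-period letters.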
The authors claim that this is evidence that George was experiencing manic episodes, not psychiatric symptoms due to porphyria. However, I don’t know if I am convinced. For one, while pressured speech is a hallmark of bipolar disorder and mania in general, there is little to no research on how written language changes during a manic episode. So it’s unclear if the changes in George’s letters are consistent with mania. But most of all, I keep coming back to the odd physical symptoms that accompanied all of George’s episodes, in particular that each started with flu-like symptoms. Peters mentioned the possibility of secondary mania as a result of infection as an aside in one of his papers, but that might be the most convincing theory I’ve heard: the only one able to link the somatic to the psychological in a logical way.
Rule of Law
Unfortunately I can’t give you a tidy diagnosis tied up in a bow this month. Whether it was porphyria or bipolar disorder or something else entirely that robbed George of his ability to rule is unclear. But I think George is the perfect example of why medical history is so fascinating and, every once in a while, grabs widespread attention. The possibility that a measly enzyme or a bad cold had the power to change the course of history is tantalizing. We started ULTC for just that reason: to explore moments where the world was nudged in one direction or another by the biology playing out inside a single person – moments when the powers of nature collide with the power of human authority. Next week, Stefanie will show us exactly how George’s mysterious ailments shaped history. See you then!
Lin, C. S., Krishnan, A. V., Lee, M., Zagami, A. S., You, H., Yang, C., . . . Kiernan, M. C. (2008). Nerve function and dysfunction in acute intermittent porphyria. Brain, 131(9), 2510–2519. doi:10.1093/brain/awn152
Macalpine, I., & Hunter, R. (1966). The “insanity” of King George III: A classic case of porphyria. British Medical Journal, 1(5479), 65–71. doi:10.1136/bmj.1.5479.65
Peters, T. J., & Beveridge, A. (2010). The madness of King George III: A psychiatric re-assessment. History of Psychiatry, 21(1), 20–37. doi:10.1177/0957154X09343825
Rentoumi, V., Peters, T., Conlin, J., & Garrard, P. (2017). The acute mania of King George III: A computational linguistic analysis. PLoS ONE, 12(3). doi:10.1371/journal.pone.0171626
George III, King of Great Britain and Ireland from 1760-1820
George II, King of Great Britain and Ireland from 1727-1760
Frederick, Prince of Wales, Father of George III
George IV, King of Great Britain and Ireland from 1820-1830 (aka George Jr.)
Charlotte of Mecklenburg-Strelitz, Wife of George III
Princess Amelia, Daughter of George III and Charlotte
If you grew up in the United States school system (and we have many international readers, so not all of you did!), then your early American history lessons taught you all about how the colonists rose up against the British and won their freedom from the tyrant on the throne across the pond. The tyrant was King George III, and over two centuries after his death, he is most known for two things, neither of which is an ideal claim to fame: 1. Losing the American colonies and 2. Being mad. However, if you have been with Uneasy Lies the Crown over the last year, you know there is always more to the story. For George this is especially true – there was much more to this king than what was included in your 4th grade textbooks.
Appall of My Eye
Our story starts on June 4, 1738 with the birth of George in London. He was the oldest son of Frederick, Prince of Wales, and Princess Augusta, and the grandson of George II, the reigning King of Great Britain. While sons were generally prized possessions during this time, Prince Frederick was absolutely loathed by his parents, ensuring that little George grew up amid constant family tension. They hated Frederick so much that his mother, Caroline, once said (out loud, to other people), “I wish the ground would open this moment and sink the monster to the lowest hole in hell” (Hibbert). Right about now I am feeling really great about that one time my mom told me she was tired of putting up with my “crappy shenanigans”.
Despite the constant tension between father and son, Frederick was actually a pretty good father to his own children and specifically stressed the importance of education. In fact, George III was the first British monarch to study science as part of his curriculum! We stan an enlightened king! So it must have been devastating for George when he lost his father in 1751, before he was even 13 years old.
The Philosophical Table, one of the many pieces on display at the George III Collection at London’s Science Museum. www.sciencemuseum.org.uk.
Four Score and Seven Years War
Frederick’s death was not only tragic but also historically significant for the future of the British monarchy. Because Frederick was the Prince of Wales when he died, his oldest son, George, inherited the title and became the new Prince of Wales and heir to the throne. It would be nearly another decade before George became king, but even after those years of preparation, he was perhaps not ready to inherit the kingdom in the condition it was in. George was crowned King George III in 1760, and at that time Great Britain was four years into the Seven Years’ War, or the French and Indian War as we call it in the States. In Europe, Great Britain was fighting alongside Prussia and Hanover against Russia, France, Austria, Sweden and Saxony. In the American colonies, Great Britain and France were at war over who would control the land south of Canada and north of Florida. Naturally, the Native Americans who lived on that land were involved in the bloodshed, hence the name of the conflict that today’s Americans grow up using. By 1763, the war had concluded, bringing an end to the eventful first three years of George III’s reign.
The Seven Years’ War, or the French and Indian War as it played out in the colonies. history.com.
We’re Not Gonna Take It
If George thought an end to the Seven Years’ War meant an end to Britain’s war woes, then he was sorely mistaken. By the mid-1760s, tension in the American colonies was ramping up and several decisions by the British government only pissed off the colonists even more. Laws like the Stamp Act and the Townshend Acts were desperate attempts to dig the kingdom out of the massive debt it had accumulated during the war with France by collecting increased taxes from the colonies. And we all know what happened in 1773 with the Tea Act – Boston Harbor has never been the same. But even with these hugely unpopular orders, the king was not seen by the colonists as the source of their woes – that honor belonged to Parliament, which for many years was seen as the real enemy. That favorable opinion was swiftly reversed in 1775 when George rejected the Olive Branch Petition, a document in “which the colonists pledged their loyalty to the crown and asserted their rights as British citizens” (History of Mass). Much like my ex-boyfriend, George was not interested in compromising. The message from Britain was clear – if the colonists did not stand down, there would be war. Six years later, when the British surrendered at Yorktown, the king found himself a pretty unpopular guy both in the colonies and at home, where George’s subjects were not impressed that they had spent precious resources and lives to fight their own people and come away with nothing. He was also seen as somewhat of a laughingstock by his fellow monarchs, with Catherine the Great claiming that “rather than sign the separation of thirteen provinces, like my brother George, I would have shot myself” (Black).
Honestly, I am so desperate to socialize these days that I have FOMO just looking at this painting of the Boston Tea Party. history.com.
Mind Over Matter
Trouble on the outside had plagued George since he had taken the throne, but in the summer of 1788 trouble of a different and more concerning kind began to brew – trouble of the mind. At the age of 50, George was laid low with some kind of stomach illness, and although he physically recovered relatively soon, those around the king noticed that something was off. People who knew George were aware that he could be eccentric at times, but now his behavior seemed much stranger. He was constantly moving and spoke as if in a hurry, while being overly friendly and communicative with people he didn’t know. His recovery was short-lived and the stomach issues came back even worse, now bringing with them “agonizing cramp(s) in the legs and a rash on his arms” (Hibbert). And again, his behavior became increasingly alarming. He slept little, became angry and violent (even attacking his own son at one point), found himself unable to stop talking for hours at a time, and reminisced about women he used to court long before he was married. He even hallucinated that London had been flooded and that his pillow was his long-dead son. Servants were so desperate to control him that at times they would tie him down to his bed. George’s beloved wife Charlotte became desperate as well, and the government began to make plans for a regency led by their eldest son, George Jr., the Prince of Wales. Remember, a regency is when the government appoints someone to make decisions in the monarch’s place in the event that he or she is too young or otherwise not capable of ruling (sickness, traveling outside the country, etc.). However, the Regency Act was never passed, as the king began to show signs of recovery in early 1789, much to the delight of the British people. Unfortunately for George, he recovered just in time to face a brand new threat to his crown: unrest was taking shape in France and the consequences would reach far and wide.
George’s many maladies have been represented in pop culture for years, including the play “The Madness of King George III”, which was also adapted into a movie. dailymail.co.uk.
The Blind Side
On July 14, 1789, in a scene chillingly similar to the events of the January 6th, 2021 Capitol riot, revolutionaries in Paris stormed a building called the Bastille, “a royal fortress and prison that had come to symbolize the tyranny of the Bourbon monarchs” (History.com). It was the beginning of years of violence and bloodshed in France, during which the French king and queen lost their heads and the monarchy was abolished. Luckily for those who wished to maintain the status quo in Great Britain, George was mentally stable at the outbreak of the turbulence and he was largely seen as an “object of compassion in his collapse” and “a symbol of the old English order for which the country was fighting” (Britannica). But the heightened tension and stress clearly took a toll on the now 63-year-old George, and in 1801, he fell ill again. This time, his illness was so severe he went into a coma. Again the king defied the odds and seemed to recover, but declined yet again in 1804. By now his body was frail and there was no way to hide his age and the “strange and uncharacteristic conduct” (Hibbert) he displayed. In fact, he could no longer stand to be around his own wife whom he had loved so much and relied on in his earlier days of sickness. If that were not enough, George’s eyesight was also rapidly declining and he eventually became blind.
King George III’s final mental break came in 1810 and was believed to be the result of the death of his beloved daughter, Princess Amelia. The need for a regent could no longer be denied, and in February of 1811, George Jr. was officially made regent. By now the king had “retreated into a fantasy world in which the past was largely forgotten, the dead were alive, and the alive were dead” (Hibbert). Given the condition of his mind and body (he was now deaf in addition to being blind), it is truly amazing that George III lived for almost another decade, even outliving his wife Charlotte, who died in 1818. On January 29, 1820, King George III finally found peace at the incredible age of 81. There were numerous times when it was assumed he would not make it through the night; nevertheless, George reigned over Great Britain for 59 years. The only British monarchs to wear the crown longer were Queen Victoria (63 years) and Queen Elizabeth II (68 years and counting). Those were 59 years that would have been stressful and difficult for even the most competent of men. Next week Riley will wade through the many theories of what plagued “America’s last king” as we attempt to understand the rollercoaster of ailments he fought for over three decades.
Charlotte of Mecklenburg-Strelitz, as portrayed in Netflix’s Bridgerton. There is some evidence to suggest that Charlotte was (very) distantly related to nobility from North Africa. people.com.
Black, Jeremy. George III: America’s Last King. Yale University Press, 2008.
Brooks, Rebecca Beatrice, et al. “What Was the Olive Branch Petition?” History of Massachusetts Blog, 7 Mar. 2020, historyofmassachusetts.org/what-was-the-olive-branch-petition/.
In episode 7 of the most recent season of Netflix’s “The Crown”, Princess Margaret goes to therapy for her depression and learns that two of her first cousins, Katherine and Nerissa Bowes-Lyon, were institutionalized for mental disabilities. I can tell you from experience that stranger things have been revealed by a good therapy sesh, but what made this realization so unusual was that as far as Margaret knew, both Katherine and Nerissa had died. As “The Crown” renewed interest in Princess Di, the subject of this month’s series, we thought it would also be appropriate to investigate what is fact and what is fiction about the Bowes-Lyon sisters and the disease that afflicted them.
Katherine and Nerissa as depicted in season 4 of “The Crown”, paying homage to the Queen on their television set. Hospital workers who cared for them over the years recalled similar instances of bowing and showing interest in the royal family. Radio Times.
Cheaper By the Cousin
Katherine and Nerissa were the daughters of John Herbert Bowes-Lyon, the brother of Queen Elizabeth’s mother. As “The Crown” correctly depicted, the girls were placed in Royal Earlswood Hospital in 1941 because of their “mental retardation”: neither learned how to talk and it has widely been reported that they only reached the cognitive level of three- or four-year-olds. Three of their cousins, Idonea, Etheldra, and Rosemary, also suffered from the same apparent genetic disease and were admitted to the hospital. According to the 1963 edition of Burke’s Peerage, a published genealogy of the aristocracy, Nerissa died in 1940 and her sister passed away in 1961. But in 1987, British tabloids discovered that this was not the case. Katherine would not pass away until 2014, at the age of 87. Nerissa had died only a year previously, at the age of 66, and was buried with nothing but a plastic tag to mark her gravesite.
The plastic marker over Nerissa’s burial site. Oprah Magazine.
Immediately, rumors swirled that this was an intentional cover-up by the royal family. “The Crown” depicts a conversation in which the Queen Mother tells Princess Margaret that they had to lie because the potential for genetic defects in the royal line could erode confidence in the monarchy. Katherine and Nerissa’s cousin, Lord Clinton, denied these rumors back in 1987. He claimed their mother, Fenella, had inaccurately reported the girls’ deaths to Burke’s Peerage because of old age and confusion. But doubters were quick to point out that she had supplied the publication with specific dates of death, indicating more than the “vague” error Lord Clinton tried to portray.
Despite a large network of relatives, no one attempted to correct the Peerage or even visit the sisters in the hospital. It is clear that someone knew Fenella’s report was inaccurate: the royal household was footing the bill for the Bowes-Lyon sisters’ care, even if Queen Elizabeth and her immediate family did not. Many allege that the Queen Mother learned the sisters were alive in 1982, but never contacted Burke’s Peerage or made a visit to Royal Earlswood. Despite the royal family’s repeated protest that family members visited Katherine and Nerissa, no records of any visits were found after Fenella died in 1966.
Rare photos of the sisters. Up News Info.
Katherine would live in the hospital until it was shut down in 1996 due to poor conditions. The hospital administrator alleged that Queen Elizabeth and her mother, as next of kin, declined to meet with him to determine where Katherine would live next. While we can only speculate about the royal family’s motives, the neglect of Katherine, Nerissa, and their cousins was tragic and indicative of the dark side of royal politics.
The X Files
The 1987 reports about the five cousins sparked interest in what could have caused such a crippling disease. An article from that year in The Age newspaper detailed the hypothesis of Professor David Danks, director of the Murdoch Institute for Birth Defects, an Australian institute founded with support from the Murdoch family. Danks noted that there were no males in the Bowes-Lyon family who suffered from the mysterious disease. He therefore suspected that the defect in question was a sex-linked disease in which “male victims died in early childhood.” Danks was unwilling, however, to suggest a specific disorder, as there were “about 20 gene diseases” that could explain the Bowes-Lyon illness.
Because the name of the game is to learn about the brain here on ULTC, I will be less scrupulous than Danks and throw out one hypothesis: Rett syndrome. Rett syndrome is a developmental disorder causing cognitive defects, impaired brain growth, motor difficulties (including the inability to speak), and autistic-like behavior. The disease is caused by mutations in a gene called MECP2, which controls the expression of many other genes. Many of the genes controlled by the MeCP2 protein are important for brain development, explaining the downstream neurodevelopmental impairments.
This campaign by Cure Rett highlights the range of experiences of children who suffer from Rett syndrome. Grace for Rett.
Importantly, MECP2 is located on the X chromosome. If you remember your high school biology class, you know that biological sex is determined by the combination of sex chromosomes an individual has. Although there are aberrations in sex chromosome complement, the general rule is that females are XX and males are XY, with the Y chromosome being much smaller and containing fewer genes. Therefore, mutations in genes on the X chromosome tend to affect males more, because they don’t have the benefit of another X chromosome to pick up the slack. Rett syndrome is no exception. MECP2 mutations generally cause death in males shortly after birth, so Rett syndrome is primarily a disease affecting females, which would explain why no boys in the Bowes-Lyon family were affected. Additionally, although most MECP2 mutations occur randomly, there are instances in which women carry one X chromosome with a mutation but do not show symptoms – what are referred to as asymptomatic carriers. It is possible that Fenella and her sister were two such carriers, explaining the odd pattern of inheritance that puzzled Danks. Research is ongoing, but there is currently no cure for Rett syndrome or any interventions beyond physical and occupational therapy to preserve mobility.
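The inheritance logic can be sketched as a toy Punnett-square enumeration for an asymptomatic carrier mother and an unaffected father. This is a deliberate simplification of my own – real MECP2 genetics involves X-inactivation and mostly de novo mutations – but it shows why only girls would appear affected in such a family:

```python
from itertools import product

# "X*" marks the X chromosome carrying the mutation.
mother = ["X*", "X"]   # asymptomatic carrier
father = ["X", "Y"]    # unaffected

children = list(product(mother, father))

for m, f in children:
    sex = "daughter" if f == "X" else "son"
    if f == "Y":
        # a boy's only X comes from his mother; there is no backup copy
        status = "affected (usually dies in infancy)" if m == "X*" else "unaffected"
    else:
        status = "carrier or affected" if m == "X*" else "unaffected"
    print(f"{m}{f}: {sex}, {status}")
```

Each pregnancy carries a 1-in-4 chance of an affected boy who would not survive infancy – consistent with Danks’ observation that no affected males appeared in the family.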
Got So Far to Go
Rett syndrome is just one example of X-linked diseases causing mental disabilities. The true cause of Katherine, Nerissa, Idonea, Etheldra, and Rosemary’s suffering remains undetermined. But thanks to “The Crown” and other projects like the 2011 documentary “The Queen’s Hidden Cousins”, these women are no longer unknown to the world. To me, the Bowes-Lyon cousins serve as a reminder that genetic diseases are the great equalizer, affecting blue bloods and blue collars alike. Moreover, the neglect of Katherine and Nerissa shows me that we still have a lot of work to do to remove stigma surrounding mental disabilities and make sure that everyone is treated with dignity, regardless of their cognitive abilities.
Every time Diana chose to speak out about her eating disorder or her unhappy marriage, she was doing more than just garnering the attention of her husband and his family – Diana was slowly breaking down the way the British monarchy had operated for hundreds of years. When Andrew Morton’s book was published in 1992, “it shattered the myth of the royal family” and its “image as the ‘perfect’ family” (337, Morton). This image was the very thing the Queen wanted to preserve as she pushed back against her son’s desire to end his unhappy marriage. And I’m sure it irked the royal establishment that this perceived smear campaign was coming from one of their own, a British aristocrat.
In fact, Diana was “the first Englishwoman to marry an heir to the throne for 300 years” (royal.uk). For the past three centuries the Prince of Wales had married a foreign-born princess, ensuring that the Queen of England was not actually English. It was an unprecedented break in tradition, compounded by the fact that Diana was a “commoner”, and it cleared the way for Charles’ own son William to do the same. Today it is hard to imagine the Queen of England not being from England! However, a member of the royal family marrying a commoner was not unprecedented, and it also clearly was not a recipe for success. The marriages of Queen Elizabeth’s sister Margaret and her husband Antony Armstrong-Jones, the Queen’s daughter Anne and Captain Mark Phillips, and Prince Andrew and Fergie had all ended in divorce. The difference this time was that Charles would one day be king (or so he thought, he probably wasn’t banking on his mother living forever…) and none of the aforementioned figures attracted the public’s attention like Diana did.
Even though Andrew and Fergie were the picture of happiness on their wedding day, their marriage did not last. townandcountrymag.com.
A Whole New World
Transformation in the British monarchy was certainly inevitable, and things had been changing course long before Diana arrived on the scene. By the 20th century, the crown was merely ornamental in Great Britain, and events such as the World Wars had irreversibly changed the role of monarchies across Europe. The rise of television and tabloids meant that it was harder for the royal family to hide during scandals and easier for the public to voice their displeasure. When news broke that Diana had died, the worldwide reaction was palpable. It wasn’t just that people were sad. They were also angry, and many felt that the Queen and her family did not show the proper level of grief due to the mother of the future king and his brother. Not surprisingly, as we often see after the death of public figures, “it was conveniently forgotten that [Diana] was for a time widely seen as a destructive influence upon the whole fabric of the British monarchy” (321, Morton). Diana’s death ensured that her legacy would always be a positive one, and the tragic circumstances surrounding her fatal crash changed the way the royal family approached their relationship with the media.
Today it is not uncommon to see the royal family splashed across the front page of every tabloid in England. nytimes.com.
Today, the royal family understands and accepts that their everyday lives are open to the public, as it is quite literally their job to attend public engagements and support local charities. However, the new generation of royals has made it very clear that there is a line that they are not willing to allow the media to cross. In 2017, William and Kate won a five-year legal battle with a French tabloid over the printing of photos of Kate topless during a private vacation. More recently, Harry and Meghan have battled British tabloids over the publishing of a letter written by Meghan to her father. Before Harry and Meghan stepped down from their royal duties and moved to America, Harry was extremely outspoken about the parallels he saw between how his mother had been treated in the media and how his wife was being treated. Again, they stressed that although styled as princes and princesses, they were first and foremost human beings with the right to the same privacy and respect afforded to other citizens. And although Prince Charles and his family were often worried about the negative effects that Diana’s behavior would have on the monarchy, one could argue that her popularity pumped new life into an establishment that was in danger of becoming obsolete.
A Picture Is Worth A Thousand Words
The stories of any one of the number of monarchs we have covered in ULTC over the last year will show you that Diana was far from the first royal to suffer from mental and physical illness. But what made Diana a groundbreaking figure was her willingness to speak about her struggles openly and a desire to relate to people on a human level. The effects of her efforts can be seen through the work of her children – William and Harry are both great supporters of mental health organizations, with William and Kate launching an initiative called Heads Together, which encourages change in the way society talks about and approaches mental health. Today Diana is memorialized for a new generation in books, TV shows, documentaries, commemorative Beanie Babies and the popularity of her kids. But there are priceless and meaningful lessons we can take from Diana’s life beyond her impeccable fashion sense and iconic hairstyle.
Before William and Harry’s infamous rift, they banded together to honor their mother through the Heads Together organization. thesun.co.uk.
Her determination to break from the mold of tradition encouraged Diana to shed light on difficult and oftentimes controversial issues. One of Diana’s greatest passions was the care of people suffering from AIDS, and in the 1980s the world did not have the same knowledge of HIV and AIDS that we do today. It was even thought that by simply touching an infected person, or sharing the same toilet, someone could be exposed. The Princess of Wales went a step further than simple gestures and words when she “opened the UK’s first purpose built HIV/Aids unit that exclusively cared for patients infected with the virus” (bbc.com). In 1997, seven months before her fatal accident, Diana traveled to Angola and was photographed walking through a live minefield. Not only did that moment open many peoples’ eyes to this deadly practice, but “her trip was credited with boosting the campaign for a global landmine treaty signed later that year” (bbc.com). Amidst the troubles she was experiencing in her own personal life, Diana made time for people in need, determined not only to use her position for good, but to show her two young boys that there was more to being royal than the glitz and the glamour.
Diana rocked a face shield decades before the rest of the world. Except instead of trying to avoid a virus, she was walking through Angola’s minefields. time.com.
In the end, Diana was not perfect, and that is exactly what made her the People’s Princess.
In past series here on ULTC, I have had to rely on historical documents and secondhand sources to piece together the inner workings of our subjects’ minds. But this month, I don’t have to formulate an armchair diagnosis or scour JSTOR; I can go straight to the source. Princess Diana is the most modern royal we have covered on the blog, but she’s also the most accessible. As a result, we have her own words about her battles with mental illness, words that shattered the “keep calm and carry on” tradition of the British monarchy and forever changed the way people think and talk about mental health.
Diana breaking the silence in her (in)famous interview with Martin Bashir. Vanity Fair
Although Diana’s experiences with mental illness were covered in the 1992 biography by Andrew Morton, she had to deny her involvement in the project because of her royal status. So, the first time Diana publicly talked about her mental health was in her 1995 BBC tell-all interview with Martin Bashir. Breaking royal protocol, she aired the dirty laundry of her failing marriage with Prince Charles and the psychological effects of life in the royal family. As Stefanie told you last week, the BBC reopened an investigation into Bashir in December 2020 over allegations that he coerced Diana into the interview by forging documents suggesting that the Queen had hired people to spy on her. Even so, the contents of the interview remain achingly relatable to anyone who has suffered from mental illness. Her words will serve as guideposts as we explore the neurobiology behind what she suffered.
“I was crying out for help, but giving the wrong signals, and people…decided that was the problem – Diana was unstable.” What were these signals, and what do they reveal about the royal who changed the monarchy forever?
A Secret Disease
“I had bulimia for a number of years. And that’s like a secret disease. You inflict it upon yourself because your self-esteem is at a low ebb, and you don’t think you’re worthy or valuable. You fill your stomach up four or five times a day – some do it more – and it gives you a feeling of comfort. It’s like having a pair of arms around you, but it’s…temporary. Then you’re disgusted at the bloatedness of your stomach, and then you bring it all up again. And it’s a repetitive pattern which is very destructive to yourself… it was my escape mechanism, and it worked for me, at that time.”
First we should distinguish between bulimia and anorexia, the two most well known eating disorders. The latest edition of the DSM outlines the diagnostic criteria for anorexia as restricted food intake leading to “significantly low body weight in the context of age, sex, developmental trajectory, and physical health”, fear of gaining weight, and “disturbance in the way in which one’s body weight or shape is experienced, undue influence of body weight or shape on self-evaluation, or persistent lack of recognition of the seriousness of the current low body weight.” Anorexia can further be categorized as a restricting type, in which restricted food intake or excessive exercise leads to weight loss, or binge/purge type, in which patients binge eat or use laxatives, self-induced vomiting, or diet pills to lose weight. In contrast, a bulimia diagnosis requires weekly binge eating episodes, followed by behaviors to offset weight gain (similar to the binge/purge subtype of anorexia I mentioned before), persisting for at least 3 months. The difference here is that people suffering from bulimia are not restricting their food intake and therefore do not show the drastic weight loss characteristic of anorexia. As Diana noted in her interview with Bashir, “the thing about bulimia is your weight always stays the same, whereas with anorexia you visibly shrink. So you can pretend the whole way through. There’s no proof.”
However, despite the differences between anorexia and bulimia, clinical data suggest that they are related: 25–30% of people diagnosed with bulimia were previously diagnosed with anorexia. In addition, anorexia and bulimia share genetic risk factors, which helps explain why both disorders run in families. This is consistent with the fact that Diana’s older sister, Sarah, struggled with anorexia.
The grave consequences of an eating disorder on the entire body explain why eating disorders have the highest mortality rate of any mental illness. Disordered eating is not fashionable. It is deadly. Wikipedia
Both Diana and Sarah’s eating disorders emerged during their teen years, which is typical of eating disorders. According to Diana, her battle with bulimia started shortly after she began dating Prince Charles and he made an offhand comment calling her “chubby”. This highlights the complicated etiology of eating disorders. If you think back to your middle and high school days, I am sure you remember the heightened self consciousness and desire to fit in and be liked that you experienced with the onslaught of pubertal hormones. These social factors certainly play an important role in the onset of bulimia and anorexia. However, as I mentioned before, there are also genetic factors at play, shattering the misconception of eating disorders as vain attempts to achieve societal beauty norms. Think of similar dynamics we have talked about before on the blog: an environmental or social trigger setting off a genetic predisposition. In this case, that trigger was Prince Charles putting his foot in his mouth and commenting on the figure of his teenage bride-to-be.
To a large extent, the neurobiology of eating disorders remains a black box. Anorexia and bulimia are far more prevalent in women, and onset coincides with puberty, so there could be a hormonal driver. Eating disorders also often emerge after a period of decreased food intake, so they could be related to the physiological effects of acute malnutrition. But because the dominant symptom of bulimia is binge eating, researchers have long been interested in whether patients have alterations in the signals that control eating and satiety.
One of these signals is called leptin. Leptin is expressed in fat tissue throughout the body, and its levels fluctuate with the amount of energy stores available. Leptin binds to receptors on neurons in an area of the brain called the arcuate nucleus of the hypothalamus. Once bound, it activates neurons that produce anorexigenic signals, the ones that tell the body you aren’t hungry, and suppresses neurons that produce orexigenic signals, the ones that tell you it’s time to eat. The overall effect is that appetite is suppressed. A landmark 2000 paper from a team at Harvard Medical School found that leptin levels were lower in women with bulimia than in female controls. This could explain why people with bulimia are less likely to feel full or satisfied after eating, making them more likely to engage in binge eating behaviors. Leptin is also involved in regulating many neuroendocrine hormones, hormones that act in the brain, that are similarly involved in maintaining body weight and controlling appetite.
Excitingly, in recent years, researchers have found that these neuroendocrine molecules also play a role in the reward pathway in the brain that is responsible for the pleasurable feelings that you get from your favorite foods. Therefore, the physical and emotional experience of eating may be altered in people suffering from eating disorders.
Patients with active (BN) or recovering from (BN-R) bulimia have significantly lower levels of leptin in the blood than healthy controls. Jimerson et al.
As Diana mentioned, there was a large emotional component to her disordered eating behaviors. She would binge eat to feel a sense of comfort when experiencing negative emotions and then purge because of the feelings of disgust this behavior elicited. It has been suggested that people with bulimia eat to distract themselves from stress. Anyone who has ever hit the Ben & Jerry’s post-breakup has experienced this. And indeed, the neuroendocrine stress response is hyperactive in many people with eating disorders, which can increase feelings of stress and decrease the response to food intake, culminating in overeating. Diana’s high-profile lifestyle and loveless family life likely faded into the background briefly while she ate, offering relief, only to reemerge and precipitate purging.
Diana’s openness about her eating disorder famously caused what psychologists and media referred to as the Diana Effect. After going public with her struggles with bulimia, the number of women in the UK receiving treatment for eating disorders doubled. In 1997, clinicians noted that bulimia rates began to decrease, but then began to tick back up after her death that year. We can debate whether there are alternate explanations for this trend, but what’s clear is that Diana’s honesty shined a light on an illness that often keeps its victims in the dark.
Baby Blues
“Then I was unwell with postnatal depression, which no one ever discusses, postnatal depression, you have to read about it afterwards, and that in itself was a bit of a difficult time. You’d wake up in the morning feeling you didn’t want to get out of bed, you felt misunderstood, and just very, very low in yourself.”
As if opening up about her eating disorder wasn’t enough, Princess Diana also got candid about postpartum depression, another taboo mental disorder affecting women. Whether you have children or not, I am sure you can appreciate that childbirth is a stressful experience. You experience the most intense of biological changes for 9 months and then push the human body to its absolute limit by delivering a baby, who will now keep you up at all hours as you fulfill his or her every need. So it’s not surprising that the postpartum period can be a time of increased sensitivity.
However, some women, roughly one in nine according to the Office on Women’s Health, will experience depression-like symptoms for more than two weeks after giving birth. Postpartum depression is more common in women with a history of mood disorders, like Diana. In addition to the classical symptoms of depression that we have covered on the blog, women may experience difficulty connecting with their new babies. In the most extreme cases, they may think about hurting their own child. Sadly, as many as 50% of postpartum depression cases are estimated to go undiagnosed, likely because of stigma or the belief that postpartum mood changes are normal.
The joy of William’s birth was clouded by postpartum depression. Pinterest
The classical hypothesis is that sudden hormone changes after birth drive postpartum depression. This is similar to mood changes seen in premenstrual syndrome (PMS — definitely not me, never heard of it…), except on a much larger scale. In PMS and postpartum depression, estrogen and progesterone levels decline. However, in pregnancy, these hormones are present at concentrations more than 1000 times higher than in the menstrual cycle. Thus, it makes sense that postpartum depression would be PMS on steroids. In fact, researchers have found that giving rodents large amounts of estrogen and progesterone and then rapidly withdrawing them can induce depression-like behaviors, supporting this theory.
There are many other fascinating areas of research regarding the neurobiology of postpartum depression, like alterations in stress hormones, neurotransmitters, and even the immune system. However, I wanted to share one subfield that I stumbled upon while researching for this post: the role of epigenetics. We’ve talked a lot about the genetics of various diseases. A mutation or variant of a gene causes a change in the protein it codes for, leading to some biological alteration that contributes to disease. This is like changing the text in a document. Epigenetics, on the other hand, affects genes without changing the gene sequence in the DNA. Think of this as font style, size, bolding, italics, and underlining: alterations to a document that don’t change any of the words on the page.
Epigenetic changes come from the addition of various chemical groups to the DNA. These additions cause the DNA to change shape, or conformation. Some of these conformational changes cause the DNA to loosen, making more room for proteins that promote gene expression to bind. In this case, expression of certain genes in the area of the epigenetic change will increase. Other times, the conformational change causes the DNA to scrunch up, preventing the expression-regulating proteins from binding their targets, and gene expression will decrease. Epigenetics is a fascinating field that is of interest to researchers of pretty much every disease because it bridges the gap between environment and biology, nature and nurture. Environmental exposures can produce epigenetic changes, altering a person’s physiology without altering their DNA sequence.
This graphic shows how different epigenetic modifications change DNA structure. Notice that the DNA near the acetyl group is open and the gene is active. The DNA near the methyl group is tightly wound, so the gene is inactive. Eshragi et al., 2018
A 2014 study by authors from the University of Maryland and Johns Hopkins (shoutout blue jays) examined pregnancy-induced epigenetic changes in women with mood disorders. They identified two genes that were more likely to be silenced by epigenetic changes in women who went on to develop postpartum depression than in those who did not. One of these genes, HP1BP3, was previously shown to be involved in maternal behavior. Mice lacking this gene were less likely to retrieve their pups in a separation test, and their pups were less likely to survive. So is the epigenetic silencing of HP1BP3 responsible for defects in parenting behavior in women suffering from postpartum depression? Time will tell! It is likely just one spoke in a complex wheel of physiological mechanisms, but understanding how epigenetic changes contribute to postpartum depression could help identify women who are at risk so that they can get help before experiencing the same struggles as Diana.
On the left, you can see that fewer pups survive one week after birth when the mom lacks the HP1BP3 gene (KO) than pups of normal mice (WT), because of impaired maternal behavior. On the right, researchers showed that women with postpartum depression (PPD) have increased epigenetic changes on the HP1BP3 gene, suggesting that decreased expression of this gene might lead to reduced maternal behavior in women with postpartum depression. Garfinkel et al. and Guintivano et al.
A Cry for Help
“When no one listens to you, or you feel no one’s listening to you, all sorts of things start to happen. For instance you have so much pain inside yourself that you try and hurt yourself on the outside because you want help, but it’s the wrong help you’re asking for. People see it as crying wolf or attention-seeking, and they think because you’re in the media all the time you’ve got enough attention. But I was actually crying out because I wanted to get better in order to go forward and continue my duty and my role as wife, mother, Princess of Wales.”
As Stefanie told you last week, when things got bleaker in Diana and Charles’ marriage, Diana became more desperate. She began to engage in nonsuicidal self-injury (NSSI), in which someone hurts themselves without suicidal intent. Diana described throwing herself down some stairs while pregnant and cutting herself. These are obviously serious actions, but she admitted that she had no intention of taking her own life; rather, she was “actually crying out”. Studies have found a higher rate of NSSI in patients with eating disorders (around 35-45%), especially those with bulimia, than in the general population. One study found that NSSI in eating disorder patients was particularly associated with adverse childhood experiences, including traumatic divorces like that of Diana’s parents.
The motivations for engaging in self-harm vary, but a common one is that the injury can temporarily relieve persistent negative emotions. This is similar to the explanation Diana gave of her bulimia when she said that purging was like “having a pair of arms around you” for a brief moment. Some imaging studies of people who engage in NSSI have found increased activation of the amygdala in response to neutral stimuli. The amygdala is a brain region involved in identifying threatening stimuli and triggering fear responses like fight or flight. Therefore, a popular hypothesis is that NSSI patients have hyperactive fear responses that self-injury can somehow blunt. This could be due to the activation of a separate brain circuit responsible for the internal experience of reward. Researchers found that teen girls who engage in self-harm experienced an increased sense of “relief” after being exposed to a painful cold stimulus. The subjective report of relief was correlated with activity in the striatum, a brain region involved in the reward pathway. This suggests that self-injury could act as a neurological reward, much like a drug, briefly producing a sensation of relief or improved mood.
Strength in the Weakness
Diana’s own recollections took us through 3 disorders and an array of biological mechanisms, from epigenetics to brain circuitry to endocrine disturbances. Our neuroscientific understanding of bulimia, postpartum depression, and NSSI remains incomplete, and as much as the world knows about Diana, we similarly only understand her partially. Had she not bravely opened up about her mental health, the world would have been oblivious to her suffering. Outwardly, Diana was vibrant and beautiful, working passionately for people on the fringes of society. Her candor forged a path for talking openly about mental illness and very well may have saved lives by inspiring others to seek treatment. But her mark on mental health isn’t the only legacy she left. Next week, Stefanie will discuss Diana’s long-term impact on the monarchy, and you won’t want to miss it!
If you or someone you know is struggling with depression, self-harm, or suicidal ideation, call the Suicide Prevention Lifeline at 1-800-273-8255.
If you or someone you know is struggling with an eating disorder, contact the National Eating Disorder Association helpline at 800-931-2237.
Berner, L. A., Brown, T. A., Lavender, J. M., Lopez, E., Wierenga, C. E., & Kaye, W. H. (2019). Neuroendocrinology of reward in anorexia nervosa and bulimia nervosa: Beyond leptin and ghrelin. Molecular and Cellular Endocrinology, 497, 110320. doi:10.1016/j.mce.2018.10.018
Garfinkel, B. P., Arad, S., Neuner, S. M., Netser, S., Wagner, S., Kaczorowski, C. C., . . . Orly, J. (2016). HP1BP3 expression determines maternal behavior and offspring survival. Genes, Brain and Behavior, 15(7), 678-688. doi:10.1111/gbb.12312
Guintivano, J., Arad, M., Gould, T. D., Payne, J. L., & Kaminsky, Z. (2014). Antenatal prediction of postpartum depression with blood DNA methylation biomarkers. Comprehensive Psychiatry, 55(8). doi:10.1016/j.comppsych.2014.08.017
Jimerson, D. C., Mantzoros, C., Wolfe, B. E., & Metzger, E. D. (2000). Decreased serum leptin in bulimia nervosa. Journal of Clinical Endocrinology & Metabolism, 85(12), 4511-4514. doi:10.1210/jc.85.12.4511
Kaye, W. (2009). Neurobiology of anorexia and bulimia nervosa. Physiology & Behavior, 94(1), 121-135. doi:10.1016/j.physbeh.2007.11.037
Payne, J. L., & Maguire, J. (2019). Pathophysiological mechanisms implicated in postpartum depression. Frontiers in Neuroendocrinology, 52, 165-180. doi:10.1016/j.yfrne.2018.12.001
Schreiner, M. W., Klimes-Dougan, B., Begnel, E. R., & Cullen, K. R. (2015). Conceptualizing the neurobiology of non-suicidal self-injury from the perspective of the Research Domain Criteria Project. Neuroscience & Biobehavioral Reviews. Retrieved from https://www.sciencedirect.com/science/article/pii/S0149763415002456
Vieira, A. I., Ramalho, S., Brandão, I., Saraiva, J., & Gonçalves, S. (2016). Adversity, emotion regulation, and non-suicidal self-injury in eating disorders. Eating Disorders, 24(5), 440-452. doi:10.1080/10640266.2016.1198205
Zhou, Y., & Rui, L. (2013). Leptin signaling and leptin resistance. Frontiers of Medicine, 7(2), 207-222. doi:10.1007/s11684-013-0263-5
The problem with fairy tales is as we grow up, we begin to realize that they are not always as they appear. For me, and for many women of my generation, this was especially true of the story of Princess Diana. I was not even 7 years old when she passed away, but I knew she was a beautiful princess and the whole world mourned her death. Over the last two decades the image I had of a perfect woman with the perfect life has been shattered for me and for many others as we continue to learn that Diana was a real human being who had very real problems. As we explore the life of Diana this month, I will try to remain as objective as possible. As the historian on this blog, I have researched and written about numerous men and women, but Diana is the first subject we have tackled who lived in the modern world. She was someone I grew up with, whose children I have woken up in the middle of the night to watch get married. My job here is not to convince you that Charles was the world’s biggest tool (because that’s a given), but to present the facts so that our resident neuroscientist can unearth what was really going on beneath the tiaras and ball gowns. Do you know the real story of Diana, Princess of Wales? Let’s find out.
Growing Pains
Before Diana was a princess, and even before she was styled as “Lady” Diana, she was Diana Frances Spencer. Born on July 1, 1961 in Norfolk, England, she was one of four surviving children, with two older sisters and a younger brother (an older brother had died in infancy the year before Diana was born). While technically a “commoner” when she married into the Windsors, Diana came from a family of wealthy aristocrats who could trace their lineage back through hundreds of years of English royalty. Her father was Viscount Althorp (a viscount is ranked between a baron and an earl) and when Diana’s grandfather died, Viscount Althorp inherited his late father’s title – Earl Spencer. From that moment on, when Diana was 14 years old, she was titled Lady Diana Spencer. All of this to say that the Spencers grew up around the royal family, mixing with their children and attending important events. Both of her grandmothers were ladies-in-waiting to the Queen Mother, Queen Elizabeth’s mom. The concept of royalty was nothing new to Diana.
The ranking of titles in the British aristocracy. Diana’s father was a Viscount until his father died, when he inherited the Earldom. avictorian.com.
But despite the wealth and opportunities that surrounded Diana as she was growing up, she found herself plagued by insecurities and destructive family drama. When Diana was seven, her parents went through a nasty divorce and “the children became pawns in a bitter and acrimonious battle which turned mother against daughter and husband against wife” (Morton). Divorce is bad enough for children in the best of circumstances, but divorce amongst aristocrats in the 1960s meant that the lives of the Spencer children were put under a societal microscope, especially because their mother ended up marrying her alleged lover just one month after the divorce was final. Everywhere Diana looked, the broken marriage had taken its toll – her brother crying alone in his room, the cold shoulder from her father, her mother’s depression. Years later, the trauma of the divorce still haunted the siblings, with Diana and her sister Sarah both suffering from eating disorders (something Riley will cover extensively next week).
According to Diana, “it was a very unhappy childhood…very unstable, the whole thing” (Morton). But throughout these trying years of her youth, Diana remained a well-liked student and friend, who loved dancing and animals, excelled at diving, and had no desire at all to be a member of the royal family that she was so often mixing and mingling with.
Young Diana with her brother Charles. There was a significant age gap between Diana and Charles and their two older sisters, so unfortunately the younger two took the brunt of the divorce. harpersbazaar.com.
Not a Girl, Not Yet a Woman
When I was 16, I was being reprimanded for trying to get hot chocolate at my local 7-11 with my friends without my parents’ permission. Diana was 16 years old when she first met the man who would make her a princess. When Diana was introduced to Charles, he was dating her older sister Sarah. Any crush that Diana may have harbored at this time would have been a schoolgirl crush, not only because Charles was the heir to the throne but also because he was 12 years older than her. So it was another two years before Charles began to court Diana as a serious candidate for marriage, a prospect that was both flattering and terrifying. As soon as the public got a whiff of the new couple, the tabloids descended like hounds on Diana, and they would remain bloodthirsty for photos of and information about her for the remainder of her life. Lady Diana was a part-time kindergarten teacher, part-time house cleaner, and full-time teenager when she was engaged to Charles in February 1981 at the age of 19. As Martin Bashir said in his infamous 1995 interview with Diana, she went from “being Lady Diana Spencer to the most photographed, the most talked-about, woman in the world”. Everything she wore, every trip she took, everything she did, and every public word she said would be acutely judged for the next 16 years.
Diana and Charles at their engagement announcement in 1981. When reporters asked if they were in love, Charles infamously answered “whatever in love means”. parade.com.
Happily Never After
Unfortunately for Diana and Charles, there really was no honeymoon period in their relationship. Although it is no secret now, the public did not know at the time about Camilla Parker Bowles and her affair with the Prince of Wales. But Diana knew. As she famously said, “there were three [people] in the marriage, so it was a bit crowded”. The stress triggered by knowledge of this clandestine relationship, coupled with extreme loneliness and a feeling of abandonment by the royal family, sent Diana into a downward spiral that manifested itself as bulimia nervosa, or simply, bulimia. During the height of her illness, Diana would make herself sick four to five times a day, describing it as “a profound release of tension [that] in some hazy way gave her a sense of control over herself and a means of releasing the anger she felt” (Morton). It was a cry for help, but no one answered the call. And perhaps most hurtfully, Diana’s new family was aware of her struggles and still declined to offer sympathy or assistance. The situation only got worse when Diana became pregnant with her first child, William, and made the first of several suicide attempts. Although these attempts were not classified as “serious”, in the sense that they weren’t meant to actually take her life, they were of course concerning, as they displayed the princess’s sense of desperation. When she was three months pregnant, Diana threw herself down a flight of stairs, where she was found by her mother-in-law, Queen Elizabeth. Thankfully mother and baby were not seriously injured. This was followed over the next few years by additional incidents of self-harm that included cutting her wrists and throwing herself against a glass cabinet. And all the while the world saw a beautiful, happy, and confident young woman, oblivious to the internal darkness the Princess of Wales was battling every day of her life.
By the time Harry was born in 1984, Diana and Charles’ brief marriage was a disaster. It certainly didn’t help when Charles greeted the birth of their second son with disappointment that he wasn’t a girl (quite the opposite of the reaction we are used to from royals!). Diana felt completely isolated from her extended family, and Charles was very much in love with Camilla, who was still married herself. But for royals, and more importantly for the heirs to the throne, divorce was not a serious option. So Diana threw herself into raising her two beloved sons and fulfilling her royal obligations, all while in the deep throes of bulimia. In fact, it was not until 1988-89 that Diana finally received serious help for her eating disorder and felt as if she was finally in control of her life. As she found physical strength, she also found the strength to pursue causes that she was passionate about, most famously the plight of those suffering from AIDS.
When Diana was photographed in 1987 shaking hands with an AIDS patient, it was a groundbreaking moment. There was a lot of misinformation about HIV and AIDS, and Diana helped to portray patients as human beings in need of care and compassion and not to be feared. whotime.com.
Dirty Diana
To make matters worse within the royal family, the marriage between Charles’ brother Prince Andrew and Diana’s friend “Fergie” was clearly beyond repair after a short five years. Charles and Diana knew their marriage was not in a much better place, but the options available to Andrew were not open to Charles. In 1992, amid the scandal of the breakdown of Andrew and Fergie’s marriage, the palace was hit with another bomb – a biography of Diana that had been written secretly with her assistance. That book was instrumental in the research for this blog because it provides something that we as historians are not always able to access when studying historical figures – her thoughts and feelings in her own words. Diana’s suspicions of Charles and Camilla, her state of mind as she succumbed to bulimia and episodes of self-harm, the isolation and betrayal she felt from the royal family – all of it was suddenly out in the open for the world to read. That same year, recordings of conversations between Diana and her alleged lover were leaked to the press, exposing a side of Diana that she had not been ready to share. Not surprisingly, the Prince and Princess of Wales found it impossible to continue the charade of their marriage, and in December of 1992 their separation was announced to the public. But if the royal couple believed that formalizing the separation would bring them some relief, they were disappointed. The spotlight from the media burned even more brightly on their every move until Diana announced in December 1993, exactly a year after the separation announcement, that she would be stepping away from public life. According to Diana, when she became Princess of Wales she was “not aware of how overwhelming [the] attention would become; nor the extent to which it would affect both [her] public duties and [her personal] life, in a manner that [had] been hard to bear” (Morton).
In 1995 the world was once again invited to listen to Diana in her own words when she sat down for a candid interview with BBC journalist Martin Bashir. Twenty-five years later, this interview is being investigated under the belief that Diana was blackmailed into agreeing to do it. Whatever the origin of the interview, the effect was once again a shock to the monarchy and its squeaky-clean public image. On July 15, 1996 (exactly one year before our own Queen Riley was born), Diana and Charles’ divorce was officially announced and Her Royal Highness, Diana Princess of Wales, reverted back to the name that had served her in much simpler times, Lady Diana.
Gone But Not Forgotten
Most of us know about the tragedy that took place on August 31, 1997, when Diana and two of her companions were killed in a car crash in Paris (ironically, on the day of Riley’s baptism). The driver of their car had been extremely intoxicated, and the group had been attempting to flee from throngs of paparazzi on motorcycles when they drove into a barrier inside a tunnel. Diana’s death shook not only her country but the entire world as it mourned the loss of the “People’s Princess”. More than two decades later, I have the benefit of hindsight as I observe the legacy Diana left behind, but at the time the overwhelming reaction was anger and disgust at the establishment that had utterly failed the Princess of Wales. Next week, Riley will take a deep and meaningful dive beneath the fairytale persona that Diana and the royal family fed to the world for so long as we continue to explore how her challenges and struggles shaped the modern British monarchy.
Kensington Palace after the death of Diana. rd.com.
We’ve talked a lot about subpar medical practices throughout history on this blog, and it seems like Ivan the Terrible’s notorious temper was caused by mercury ointments he used to soothe his joint pain. Mercury was the buzziest medical ingredient of the time, and while we aren’t rushing to apply known neurotoxins to our extremities today, we still often fall victim to pseudoscience.
Instead of witch doctors or slick salesmen, today’s snake oils come from more refined sources: the former Bachelor contestant slinging Sugar Bear Hair pills on your Instagram; your clean-eating coworker who swears that golden milk lattes keep her out of the doctor’s office; or the lifestyle blogger who claims their trip to an oxygen bar was life changing. Well-marketed, aesthetically pleasing, and associated with an aspirational lifestyle, “wellness” is renewing interest in natural remedies and new age medicine. That is not a bad thing on its own, but the lack of evidence behind the supposedly miraculous effects these wellness products have on the body, combined with their eyebrow-raising prices, raises red flags.
No, these products will not make you look like the Kardashians. Marie Claire.
Before you start buying collagen peptide and charcoal powder for everyone on your holiday shopping list, let’s take a look at one of the most controversial wellness trends: crystals.
GOOP I Did it Again
To understand how crystals became so big, we have to look to the woman who put them on the mainstream map. Gwyneth Paltrow’s lifestyle blog and store, goop, popularized the use of crystals, long favored in new age and alternative healing circles, for a whole new demographic: her base of well-off, health-conscious women. The online goop store features crystals in many forms under the “wellness” category: reusable straws, water bottles, jewelry, and more, intermingled with a disturbing assortment of sex toys and herbal supplements.
In trying to understand what exactly the fuss is with these geological formations, I had to read a mind blowing amount of BS. According to supposedly science-focused shaman Colleen McCann, this is why crystals are so healing (I have to copy and paste because it just must be read in full):
“This is where science and mysticism intersect: Crystals are millions of years old and were forged during the earliest part of the earth’s formation. I think of crystals as a timeless database of knowledge, because they retain all the information they have ever been exposed to. Crystals absorb information—whether a severe weather pattern, or the experience of an ancient ceremony—and pass it to anyone that comes into contact with them.
Scientifically, crystals are the most orderly structure that exists in nature, meaning they have the lowest amount of entropy (a measurement of disorder). Crystals are structured in such a way that they respond to the inputs of all different energies around them, so they oscillate, emitting specific vibratory frequencies. The way they are balanced, the frequencies they emit, and their ability to store a tremendous amount of information makes crystals essential to modern technologies. This is why there are crystals in computers, TVs, cell phones, satellites, and so on.”
LOL. Reddit.
That was a lot, so let’s break it down. First, real crystals are old (though many for sale are fakes). And while weather certainly shapes the physical characteristics of crystals, the idea that these geodes are able to store information and then transfer that energy is purely mystical. Second, quartz crystals do have a highly ordered structure, in which each silicon atom is connected to four oxygen atoms, forming what’s called a tetrahedron. Each tetrahedron is then connected to four other tetrahedra. And entropy is indeed a measurement of thermodynamic order, in particular, of how energy is distributed among the molecules making up a material. However, we shouldn’t think about entropy in anthropomorphized terms of order and chaos. In other words, contact with something with low entropy won’t make you less “disorganized”. Finally, crystals are a valuable component in modern electronics. When placed in a circuit, they change shape and produce an electrical signal with a constant frequency. However, this is a specific property of crystals in electric circuits, not of the decorative one that sits on your desk. And even if decorative crystals did produce oscillations, their supposed health benefit would presume that your body emits waves of energy in accordance with your physiological state, which is another new age theory not backed by science.
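For the physics-curious, the “lowest amount of entropy” claim can be made precise without any mysticism. Entropy has a standard statistical definition, Boltzmann’s formula (a textbook fact, not something from the shaman’s write-up):

```latex
S = k_B \ln \Omega
```

Here $k_B$ is Boltzmann’s constant and $\Omega$ is the number of microscopic arrangements of atoms consistent with the material’s macroscopic state. A crystal lattice allows very few such arrangements, hence low entropy. In other words, low entropy is a bookkeeping statement about how a crystal’s atoms can be arranged, not a quality it can somehow transmit to whoever holds it.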
A quartz crystal (labeled WF 10191) oscillator favored in electronics for its small size and ability to produce constant electrical signals. Wikipedia
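For the nerds in the back: that “constant frequency” is exactly why a quartz crystal keeps your watch on time. The classic watch crystal vibrates at 32,768 Hz, which is exactly 2 to the 15th power, so halving the signal fifteen times yields precisely one pulse per second. A toy sketch (the frequency is the real number; the code is just an illustration, not how a watch chip actually runs):

```python
# A classic watch crystal oscillates at 32,768 Hz -- exactly 2**15.
CRYSTAL_HZ = 32_768

# A watch chip runs the signal through a chain of 15 divide-by-two
# counters; each stage halves the frequency.
freq = CRYSTAL_HZ
for _ in range(15):
    freq //= 2

print(freq)  # 1 Hz: one pulse per second, i.e. one tick of the second hand
```

That clean power-of-two math is the whole reason watchmakers settled on 32,768 Hz in the first place.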
Despite the dearth of evidence, goop, under the scientific direction of MIT PhD Gerda Endemann, has promoted the healing powers of crystals. Earlier this year, the website settled a lawsuit for $145,000 for making unfounded health claims. A particularly infamous assertion was that a $66 jade egg placed, bear with me, in one’s vagina, could be used to “balance hormones, regulate menstrual cycles, prevent uterine prolapse, and increase bladder control”. The FDA would beg to differ. So now when you go on goop’s website, you are frequently met with disclaimers that, “These statements have not been evaluated by the Food and Drug Administration. This product is not intended to diagnose, treat, cure, or prevent any disease.” But critics have pointed out that the blog has continued to skirt the line of misleading readers, and is especially problematic because it wedges pseudoscience between accurate articles, like those about the skin microbiome and intermittent fasting.
The infamous egg, still available for sale on goop.com with instructions to, “Keep it in or on a space that is sacred to you or has good vibes.” RIP to my targeted ads.
In response to these criticisms, Dr. Endemann said in an interview with MIT’s Undark, “I mean, I’m not saying that every single thing that would have been ever published at Goop that I’ve even seen — I think is perfect. But I think it’s fun.” I would argue that when it comes to people’s health, something had better be more than fun before you throw your PhD behind it.
Of Quartz
Last month, the craze over energies and rocks made it into mainstream science when a group of scientists from the University of Pittsburgh published a paper in Science of the Total Environment proposing that COVID could be caused by magnetic disturbances and therefore could be prevented by wearing jade amulets. Basically, some lab mice died unexpectedly, and when Dr. Moses Bility did an autopsy, he noticed changes in these mice similar to those seen in people who vape. He hypothesized that metallic particles in the lungs (seen in people who vape) interact with magnetic fields, causing biochemical disturbances that damage organs. Then more of his mice died, also unexpectedly, in the spring at the same time that COVID cases spiked. So Bility made the huge leap that magnetic changes related to the spring equinox caused disturbances in the body, leading to disease that was attributed to the coronavirus (which he also claims just exists in our DNA but isn’t actually making us sick). He then made an even bigger leap that jade amulets, used in traditional Chinese medicine, could thus be used to prevent COVID by blocking the effects of magnetic fields on the body’s chemistry.
Can’t confirm that ancient Chinese jade amulets protect against COVID but can confirm they are stunning. Pinterest.
I think this just instinctively sounds wrong, but luckily, some very smart people have pointed out what’s actually wrong with this theory: magnetic fields from the earth’s core wouldn’t be strong enough to have the proposed biological effects, and jade wouldn’t be able to offset magnetic fields powerful enough to cause these types of health problems. Johns Hopkins’ very own Dr. Kenneth Witwer was one of the first people to criticize the paper, pointing out that the mice that got sick were restricted to a specific area of the animal facility. Had they really been killed by magnetic changes in the earth’s core, mice in every area of the facility should have been affected. It’s therefore much more likely that some sort of infection caused the localized illness seen in Bility’s mice. The paper has been retracted and the first author plans to resubmit a heavily revised version, but I’m sure it’s already made the rounds on Facebook.
Jaded
Lest we think we are smarter than the mercury acolytes of Ivan the Terrible’s age, we need only to log onto Pinterest to remember how often we cling to remedies without any supporting evidence. I’m not against homeopathic medicine or a healthy curiosity about alternative medicine (peppermint essential oil is a must for my migraines), but I am against pseudoscience. So I just urge you to do your homework on anything you do for your health and to keep jade out of all your orifices.
Lastly, ULTC is officially taking our holiday break! We wish you and yours a happy and healthy holiday. Subscribe and follow us so that you know as soon as we’re back. Can’t wait to see you in the new year!
Ivan Vasilyevich, aka Ivan the Terrible, Tsar of Russia from 1547-1584
Tsarevich Ivan Ivanovich, Son of Ivan the Terrible, AKA Ivan Jr. (no picture available so use your imagination!)
Feodor I, Youngest Son of Ivan the Terrible and Tsar of Russia from 1584-1598
Boris Godunov, Tsar of Russia and Feodor’s Successor from 1598-1605
Tsar Michael I, First Tsar of the Romanov House, from 1613-1645
The Boyars, the Russian Aristocratic Class
The Oprichniki, Ivan the Terrible’s Personal Soldiers
A Lot of Work To Kiss Your Sister…
As we explored a couple of weeks ago (do you remember that long ago? It’s not like anything significant has happened since then…), Ivan had a penchant for war and a desire to expand Russian access to trade routes. This meant collecting more land, specifically areas that belonged to Russia’s European neighbors. The most significant conflict of Ivan’s reign was known as the Livonian War, and it spanned a quarter of a century, from 1558 to 1583. The prize up for grabs was the area that is now present-day Estonia and Latvia, and the opponents were essentially Russia vs. everyone – in particular, Poland, Denmark, Norway, Sweden and Lithuania. And it wasn’t so much the land that was desired, but the access it afforded to the Baltic Sea. The beginning years of the war were successful for Ivan, but as drama unfolded at home, the tides turned until he was eventually forced to surrender. As a result, Ivan returned all of the territory Russia had gained over the previous 24 years. The “Livonian War had proved fruitless for Russia, which was exhausted by the long struggle” (Britannica). Not to mention bankrupt. But what the Livonian War did do was set a precedent for Russia’s involvement in European affairs, and this would be a main focus of Peter the Great’s rule 100 years later.
For your convenience, a little geography refresher…because I’m not going to sit here and pretend like I knew where Latvia and Estonia are.
Failure to Launch
Amid the failures of the Livonian War came a failure with a far more immediate impact. When Ivan created the oprichniki, essentially his personal bodyguard army, he was attempting to create a source of power and authority that was loyal to him alone, partly in an effort to feel safer and more stable in his own empire after he withdrew from society. Unfortunately, all it managed to do was create more instability and piss everyone off. In fact, “Ivan’s reign of terror eventually resulted in the weakening of all levels of the aristocracy” (Britannica) – not hard to imagine seeing as so many boyars perished under the sword of Ivan’s army. Not surprisingly, Ivan’s policies over the seven years of violence didn’t endear any of the survivors to their emperor. The oprichniki were eventually dissolved after they failed to do a key part of their job: defend the empire from enemies like the Tatars. Ivan’s short-lived bodyguards managed to do a heck of a lot of damage in a short amount of time. As the empire reeled from the violence and instability they caused, it distracted from the war in the West and gave Russia’s opponents an advantage that eventually led to Ivan’s defeat.
Center of Attention
As the first Russian Tsar, whatever decisions Ivan made were bound to have far-reaching consequences and to set a precedent for the empire moving forward. Historians agree that the most significant result of Ivan’s reign was the centralization of the Russian government. Even though Ivan had been crowned in 1547 as the “caesar” of all of Russia, the empire at the time was still made up of hundreds of territories that looked to their respective members of the aristocracy for leadership and governance. Imagine the United States in the early days of independence; each state was operating according to its own whims and rules. After the founding fathers got their shit together, wrote the constitution and appointed GW as President, there was a central point of government that the states ultimately answered to. In the same way, Ivan’s insecurities and paranoia drove him to pull the authority and powers of the government close to his chest. By the end of his reign, there was no doubt that Russia was an empire with a powerful tsar at its center.
Today, Russia’s government could not be more centralized. But it wasn’t always that way! wsj.com.
Fall Out Boy
When Ivan flipped his lid and beat his oldest son to death, the world did a collective facepalm. Here at ULTC, we have profiled many kings and queens who were desperate for sons to carry on their legacy, and here was Ivan with an adult son who was perfectly suited to succeed him as Tsar. And then he killed him. Henry Tudor is rolling over in his grave at the thought. But, he had another son, you say! There was a backup plan! Unfortunately, Ivan’s younger son Feodor was never taken seriously as a potential heir: first, because he of course had an older brother; second, because he was often sick; and third, for lack of a better word, because he was kind of dumb (their words, not mine).
Russia to Feodor ^
He also had no interest whatsoever in being Tsar, and so when he ascended to the throne, he left the decision making to his brother-in-law Boris Godunov. Feodor was Tsar for 14 years (in other words, Boris made decisions on his behalf for 14 years) before he died at the age of 41 with no children. And so, 50 short years after Russia had crowned its first Tsar, it found itself with an empty throne. The Rurik Dynasty, which had shown so much promise, had been snuffed out as a consequence of Ivan Jr.’s untimely death.
Trouble Maker
Since Boris had really been ruling Russia for the past decade and a half, it made sense to the group of men tasked to elect a new ruler that they should just make Boris’s title official. He was crowned Tsar in 1598 and was actually not a bad ruler, all things considered. The problem with Boris is that he harbored the same suspicion of the boyars as Ivan the Terrible, and his policies against them similarly led to constant fighting among the aristocracy. In 1605, after being sick for quite some time, Boris died. He had a son, Feodor II, who succeeded him to the throne for a matter of weeks before he and his mother were murdered by those who weren’t happy with the current rule in Russia. The next eight years were known as the “Time of Troubles” as fighting and violence continued amid the uncertainty of who would rule. Stability was finally restored in 1613 when a man (or I should say boy – he was only 16 years old) from our favorite Russian dynasty was made tsar – Michael Romanov.
The “Time of Troubles”, depicted here, looks more like Game of Thrones. en.wikipedia.org.
As we know, the Romanovs would rule in Russia for the next 300 years. Who knows if the Romanov dynasty would ever have ascended to the throne if Ivan Jr. had lived and produced heirs of his own. It’s true that Ivan the Terrible’s paranoia and insecurities led to some pretty awful decisions for his people, but they also led to the centralization of the government, quite a feat for an empire that large without modern technology. Ivan surely earned his unflattering nickname, but once the dust settled, Russia emerged as one of the world’s major powerhouses.