Shell-Shock: A History of the Changing Attitudes to War Neurosis [1997] – ★★★★
“…They broke his body and his mind/And yet They made him live,/And They asked more of My Mother’s Son/Than any man could give...” (from Rudyard Kipling’s poem The Mother’s Son).
“…Men who went out to battle, grim and glad;/ Children, with eyes that hate you, broken and mad…” (Siegfried Sassoon, October 1917).
This is an insightful book about the history of “shell-shock”, a type of post-traumatic stress disorder suffered by soldiers after prolonged combat. Anthony Babington is neither a medical professional nor strictly a trained historian, but his book still provides a thought-provoking overview of a much-misunderstood illness. From wars described by Herodotus (484-425 BC) to the Gulf War of 1990/91, the account touches on every major conflict to explain how “shell-shock” and combat stress were perceived and treated through history.
Babington’s story starts on 12 March 1915 (during the First World War), when the British Army’s Lance-Sergeant Walton, 26 years of age, was court-martialled for desertion and executed by his own regiment two months later. One “small” detail, however: Walton had been suffering from a severe nervous breakdown from the moment of his desertion and, in all likelihood, was too sick even to understand the nature of his actions.
Writing in 1676, Swiss physician Johannes Hofer referred to a mysterious illness affecting soldiers. They exhibited signs of despair and “nostalgia”. In the 19th century, this was described as having “a soldier’s heart”, a condition whereby a person suffers from breathlessness, racing pulse and exhaustion. A person may also display “absence of imagination, no curiosity about the future and no recollection of ‘the past stirring events in which they had taken part’, as well as obsessive thoughts, mental anguish and nervous mannerisms” [Babington, 1997: 44].
Babington’s book is strong in providing an overview of the contributions of various medical professionals to demystifying the strange illness that had started to plague soldiers from the wars of the 1870s onwards. For example, in 1904 Dr Paul Jacoby, working in Russia, saw a large number of soldiers who had fought in either the Franco-Prussian War of 1870/71 or the Russo-Turkish War of 1877/78 exhibiting strange symptoms of “insanity”. He was one of the first to “attribute the high incidence of derangement” to the fact that “the privations and fatigue of active service produces a nervous tension caused by ever-present danger and frequent mental shocks” [Babington, 1997: 21]. Jacoby was also one of the first to call for “a special psychiatric service” to be established so that soldiers suffering from this disease could receive specialised care, noting that “recent advances of weapon technology had added greatly to the nervous strain of the combatants” [Babington, 1997: 34].
Babington makes an interesting observation here about the greater prevalence of shell-shock in modern times. In the past, a battle hardly ever lasted three days, and a three days’ battle was an unusual one. Gettysburg “endured three days”, but by the “middle of the First World War, [any battle] could last months”. Needless to say, prolonged strain and the expectation of being killed over a long period put much pressure on the psyche, causing increased anxiety, paranoia, depression and hopelessness. A new war policy also entered the battlefield during the First World War – “gathering every man and gun, and wearing down the enemy by constant and, if possible, ceaseless attacks”. Such attrition warfare can undoubtedly have a severe psychological and emotional impact on attackers and defenders alike.
Charles Samuel Myers (1873 – 1946) was one of the first British physicians to document shell-shock, and Thomas William Salmon (1876 – 1927), a leader of the mental hygiene movement in America, made a study of shell-shock, too. Dr Salmon was of the opinion that the high rate of “insanity” during previous wars was “due, in part at least, to the failure to recognise the true nature of severe neurosis” [Babington, 1997: 21]. The sad fact was that, when the first British soldiers suffering from war neurosis started returning to England in September 1914, many of them were regarded as “insane”, and no special arrangements were made for them: they were simply sent to ordinary hospitals. At this point, war neurosis was still treated as a blast concussion and not as a purely psychological illness.
In fact, the British were most puzzled by “shell-shock” and reluctant to admit it in their personnel, shaming soldiers who complained of it and branding them “cowards” and “ones without character or resilience” (this can be contrasted with the more understanding and humane approach taken by the US in this period [1997: 105]). The shortage of men during the First World War was another factor that encouraged the ignoring of psychiatric symptoms – if you still had four functioning limbs, off to war you went, again. As late as May 1916, Myers, then a Consulting Psychologist for the Army, wrote that “from a military standpoint, a deserter was either ‘insane’ and destined for the ‘mad house’, or responsible and should be shot” [Babington, 1997: 57]. It is for this reason that cases of “shell-shock” were so severely under-reported, and it is particularly painful to read the accounts of so-called deserters – men in fact suffering from extreme forms of “shell-shock” – being executed for their “betrayals” and inability to perform their duties.
It was only in 1920 that the British government set up an inquiry “to get to the bottom of the hysteria” plaguing soldiers in recent wars, and uneasiness about the past executions for desertion started to spread among the public. The book then provides an account of the Second World War, when there was greater awareness of the disorder, with specially trained medical staff allocated to deal with psychiatric casualties. Their prevalence did not abate, however. Babington writes: “during the first ten days’ fighting after the landing in Normandy, more than 10 percent of the British battle casualties were psychiatric. In the struggle to break out of the bridgehead, the figure increased to 20 percent” [Babington, 1997: 156].
The Minds of Billy Milligan [1981/2018] – ★★★★1/2
This non-fiction book comes from Daniel Keyes, the writer of the classic sci-fi novel Flowers for Algernon [1959]. The Minds of Billy Milligan tells the amazing story of Billy Milligan, the first man in US history to successfully plead the insanity defence in court on the basis of his proven multiple personality disorder and therefore be held not responsible for his major crimes (three counts of robbery and rape). Billy Milligan had twenty-four personalities (or “people”) living inside him, competing for the spotlight (or consciousness) at any one time; some of them developed when he was a toddler suffering from trauma. This is no fiction: numerous eminent psychiatrists who observed Milligan for years testified repeatedly to his condition, and the chances that Milligan could somehow have faked all twenty-four personalities over so many years are close to zero. His personalities were truly different people, observed to have different body temperatures, handwriting, accents, vocabulary, speech patterns, mannerisms, IQs, skills, knowledge, experience and even brain waves. Daniel Keyes traces Milligan’s case, beginning with his arrest and childhood and culminating with Milligan being dragged from one hospital to another, battling public prejudice. This is a mind-blowing account of a most remarkable case of a disorder that lies at the very heart of the mystery of the human mind and consciousness.
The story of Billy Milligan can be said to be unique in the world. He is the first person ever to have his multiple personality condition studied extensively and for a prolonged period in a controlled setting. The findings beggar belief. Milligan was found to have such people “living inside him” as Ragen, a twenty-three-year-old colour-blind Yugoslavian man and an expert in munitions; Arthur, a man speaking with a distinctive upper-class British accent who also, incidentally, can read and write fluent Arabic; Philip, “a dangerous thug” with a Brooklyn accent; a little girl named Christene, aged three, called “the corner child”; and “the Teacher”, who represents the “sum of all twenty-three alter egos fused into one…who taught the others everything they’ve learned”. The shocking transformations of Milligan from one person to another were a sight to behold: “it struck him how the change of personality caused a definite facial alteration. Arthur’s tight-jawed, pressed-lipped, heavy-lidded gaze that made him appear arrogant had given way to Billy’s wide-eyed, hesitant expression. He seemed weak and vulnerable. In place of Danny’s fear and apprehension, Billy showed bewilderment” [Keyes, 1981/2018: 120]. One doctor set himself the task of fusing all of Milligan’s personalities together, attempting to establish “lines of communication” between them so that the need for each would eventually be reduced to zero. That attempt was only partially successful.
Even if it were possible for Milligan to change the so-called “psychological” characteristics of his personalities at will, it remains a mystery how he was able to change some of his physical/bodily/medical characteristics – something which, most probably, only a few shamans or practitioners of Tibetan Buddhism are capable of doing at will. These physical characteristics include the presence of nystagmus (a condition which causes repetitive and uncontrolled eye movements) in one of Milligan’s personalities and proven dyslexia in another. Some of his personalities also showed high levels of anxiety that would be difficult to fake on demand, including profuse sweating and symptoms close to a seizure. Even more of a mystery is how a person could go, in a fraction of a second, from having a very fast pulse and visible extreme anxiety to being completely relaxed, with a very low pulse and an extraordinary self-confidence verging on complete boredom. Dr George Harding was one person who tried to solve this puzzle.
Another question is why some people develop this very rare disorder, whose very existence is still contested by some. The rule seems to be that people develop it only when they have suffered extreme, prolonged and repeated abuse (mental, physical and/or sexual) in childhood, being victims of unbelievably sadistic behaviour at a very young age; the aggravating factor is that the abuse was perpetrated by a person who should have been the child’s guardian and protector (a father/mother figure). The famous case of Shirley Ardell Mason (aka Sybil) supports this conclusion, as does the case of the Australian woman Jenny Haynes (see the documentary about her – Woman with 2,500 Personalities (warning – distressing content)). It only makes sense, then, that someone that young would develop a private protective psychological mechanism to deal with the unbelievable trauma, in order “to survive” psychologically. This capability testifies to the human brain’s innate flexibility and adaptability. The brain would then be powerful enough to create a completely separate identity (or identities) to take on all the unbelievable pain and hurt which the core personality is simply unable to endure. If one cannot change the outside reality, there exists a possibility to change the inward one. Events that happen to people in their childhood sadly echo persistently throughout adult life (whether consciously, subconsciously or even unconsciously).
The case of Milligan also has implications for the study of consciousness. Milligan often talked in the book about his personalities “holding the consciousness” at any single moment in time, but some personalities were also able to be minimally conscious of the actions of others even when they were not “in the spotlight” or active in real life. That may provide some support for the idea that consciousness is not assigned to one specific part of the brain, but may be “spread out” and “ever-present” across the brain or even other human faculties.
The Minds of Billy Milligan is the kind of non-fiction book that can put any fictional account to shame in terms of imagination. The case of Milligan sounds so unreal that no fiction writer would even attempt to imagine it. Milligan’s case also showcases the extent of the human brain’s flexibility and adaptability, shedding light on the wonder that is the human mind. Though the first half of the book is clearly stronger than the second, Daniel Keyes’ narrative still grips like only a powerful thriller can.
When Brains Dream: Exploring The Science & Mystery of Sleep [2021] - ★★★
This book was one of my most anticipated reads of 2021. In it, the authors, professors at the Université de Montréal and Harvard Medical School respectively, set out to explain the “dreaming brain”, starting with the early research into dreams done by some Greek philosophers before turning to the dreaming theories of Freud and Jung. The authors then go on to explain REM (rapid eye movement) sleep and its discovery by Aserinsky and Kleitman in 1953. What follow are explanations of some well-known theories about dreaming, for example those that relate to (i) memory; (ii) evolutionary advantage (“role-play”); (iii) problem-solving; (iv) creativity and (v) emotions. Antonio Zadra and Robert Stickgold then move on to sleep disorders, talking about narcolepsy and sleep paralysis. The major issue with these sections is that they are filled with too many obvious statements that could have been edited out. There are so many of them that the book often reads like a sleep-and-dreams manual for “complete dummies”. For example, I certainly did not pick up this new book to find out to my “amazement” that “our brain and mind never rest” [Zadra/Stickgold, 2021: 270], that “just about everyone dreams”, that “the Frozen characters Olaf and Elsa don’t dream” [2021: 82] or that “absence of dream recall clearly is not proof of the absence of dreaming” [Zadra/Stickgold, 2021: 52], but that is what I found inside. The authors constantly refer to future chapters in the book, and, most probably, an up-to-date college textbook on psychology would provide a more interesting and insightful overview of the topic.
I have lucid-dreamed spontaneously since childhood and consider lucid dreaming (a state in which a person is aware that he or she is dreaming while having a dream) to be very important to our understanding of dreams and consciousness, but the problem is that the authors hardly offer any explanation of it or talk about its causes at length; rather, they offer some techniques for starting to lucid-dream, techniques which belong more inside some new-age self-help book than inside such a serious non-fiction work as When Brains Dream, penned by two eminent professors. (Accurate) dream recall and vividness of dreams are important for the development of lucid dreaming (I have always had both), but from my own personal experience I can also suggest monitoring one’s thoughts during the day, day-dreaming (within reason, of course), connecting with one’s inner self (interpret this as broadly as possible) and becoming more aware of one’s feelings during the day. The authors do nevertheless give some insight into lucid dreaming: “When brains dream lucidly”, they write, “frontal regions that are associated with self-reflective awareness during waking, but that are normally turned off during REM sleep, become more active” [Zadra/Stickgold, 2021: 233].
Finally, Professor Zadra and Professor Stickgold offer their own theory of the nature of dreaming in this book. The so-called “NEXT UP” theory “suggests that the function of dreaming is to explain the past and predict the future, to discover what’s next up in our lives. This is the brain’s task while we dream” [Zadra/Stickgold, 2021: 269, 270]. To achieve that, “the dreaming brain attempts only to show us what has been and what might be”; how it does so, we cannot yet fully explain [Zadra/Stickgold, 2021: 270]. The researchers then note that “dreaming is a unique form of sleep-dependent memory evolution, one that extracts new knowledge from existing information through the discovery and strengthening of…often previously unexplored associations” [Zadra/Stickgold, 2021: 271]. Of course, they cannot prove any of that empirically, but the theory sounds logical, even if “limiting”. Undoubtedly, our brain works ceaselessly during the night, processing events from our lives, sorting memories, calculating, thinking, trying to come to terms with unpleasant events or come up with internal solutions, etc. Perhaps the dreaming brain really does try to predict the future and open our minds to the multitude of possibilities available to us in real life by finding and presenting new opportunities through imagining, cataloguing, eliminating and making (unlikely) associations, but is that ALL it does? Knowing how complex our brains are, the explanation is a little simplistic: our brain probably does a million other things besides while we sleep, and dreams have a mountain of other causes we cannot even imagine. The professors’ theory also seems an “easy way out”. The researchers claim that their theory is new and inventive, but it is not – it is a mish-mash of other existing theories all put together: memory, evolutionary (the researchers’ “memory-evolution”) and maybe even Freudian theories (the researchers’ “associations” wording).
Though When Brains Dream is an engaging account of the dreaming brain that summarises well the research in this field so far, the book is laden with obviousness and, unfortunately, focuses too much on the “why” question as opposed to the “how”. The “relaxed” style of the book baffles rather than saying anything insightful or concrete about dreaming, and the researchers’ own theory of dreaming is hardly more than a clever “conglomeration” of all the others.
Gracefully Insane: The Rise and Fall of America's Premier Mental Hospital [2001] - ★★★1/2
This non-fiction book is about McLean Hospital in New England, “one of America’s oldest and most prestigious mental hospitals” [Beam, 2001: 1], whose residents once included mathematician John Nash and authors Susanna Kaysen (Girl, Interrupted [1993]) and Sylvia Plath (The Bell Jar [1963]). Comprising beautiful Tudor mansions set in a picturesque area, the institution became the first mental hospital in Boston and has been called a “cultural museum”. It also inspired Dennis Lehane’s thriller Shutter Island [2003], and is an unusual mental hospital in many respects. McLean was known not only for its celebrity patients and “moral treatment”, but also for its patient rooms furnished with every comfort, its tennis courts, extensive gardens and free-standing cottages for its aristocratic clientele. From the hospital’s founding in 1811 to the late 1990s, Alex Beam traces the history of the institution, emphasising the contributions of different individuals to its development and how changes in the treatment of mental disorders over the two centuries impacted the running, structure and organisation of McLean. We read both the doctors’ and the patients’ accounts.
Beam’s book may be chaotic and disjointed, but it is interesting, especially in its insights into the rise and decline of various popular treatments for mental disorders (and how McLean responded to various “medical treatment” trends), including lobotomy, electroshock therapy, hydrotherapies, psychoanalysis and drugs. The book ends with an overview of the 1970s-90s, which “have been a time of trouble for full-service mental hospitals” [Beam, 2001: 233], since “the world has given up on long-term, residential mental health care” in favour of “psychopharmacology...quick diagnoses [and] rapid drug prescriptions”.
Mad in America: Bad Science, Bad Medicine and the Enduring Mistreatment of the Mentally Ill [2002] – ★★★★1/2
Robert Whitaker opens his book with this quote from David Cohen: “We are still mad about the mad. We still don’t understand them and that lack of understanding makes us mean and arrogant, and makes us mislead ourselves, and so we hurt them”. His book is an engaging overview of the methods used to treat mentally ill patients through the centuries (starting in the pre-1750s period and continuing to the present day), and of how changes in societal attitudes and perceptions, as well as in the politics and business of psychiatry, impacted that treatment. “Scientific” and “therapeutic” approaches to treating the mentally ill competed with each other for centuries, and Whitaker shows how the politics of each period ultimately dictated what mentally ill patients were supposed “to need”, with the mentally ill often caught in a trap between doctors’ ambitions to make a mark in science and businesses’ ambitions to earn money.
The book is divided into four sections: (a) The Original Bedlam (1750 – 1900); (b) The Darkest Era (1900 – 1950); (c) Back to Bedlam (1950 – 1990s) and (d) Mad Medicine Today (1990s – present).
The Original Bedlam (1750 – 1900)
The story starts circa 1796, in a period when psychiatry was finally “waking up” from the “chain-the-mentally-ill” and “patients-as-a-spectacle” mentality and realising that patients in psychiatric institutions needed more humane medical treatment. Before that, mentally ill patients were held in terrible conditions, and Bethlehem (Bedlam) Hospital in London testifies to that. “Like all wild animals, lunatics needed to be dominated and broken” [2002: 7] was the opinion of the time. Unflinchingly, Whitaker goes through the horrific and shocking arsenal of “treatments” for the mentally ill of that era, talking about the Bath of Surprise, Spinning Therapy and the Tranquiliser Chair. For example, the reasoning behind the Drowning Therapy was the following: “if a patient was nearly drowned and then brought to life, he would take a fresh start, leaving his disease behind” [2002: 17].
Benjamin Rush [1745 – 1813], “the father of American psychiatry”, was a proponent of kinder treatment for the mentally ill in Philadelphia at that time, but even he thought that a circulatory disorder in the body was the cause of all madness [2002: 17] and, accordingly, was in favour of bleeding his patients severely to “fix” it. A glimpse of hope at that time was the exemplary role of the Quakers in caring (gently and without medical intrusions) for the mentally ill in Philadelphia, as well as the influence of Philippe Pinel [1745 – 1826] and his promotion of “moral treatment” in Europe.
The Darkest Era (1900 – 1950)
Robert Whitaker states that “at the beginning of the twentieth century, the generous attitude towards the mentally ill disappeared in American society” [2002: 41]. The first half of the twentieth century was all about the rise of eugenics, the segregation mentality and sterilisation efforts in psychiatry. The 1940s also saw the rise of shock treatments for the mentally ill, which were considered “quick, easy, reliable, and cheap” [Whitaker, 2002: 98]. However, it was also clear early on that these produced “a more profound, lasting trauma” and “changes akin to suffering a concussive head injury” [2002: 102]. Whitaker writes: “[shock treatments were] a form of brain damage, but that was not how [they] were presented to the public” [2002: 103]. The public were made to believe that shock treatment was safe, effective and painless, and that any memory loss was only temporary. It was anything but benevolent: before the introduction of paralysing drugs, up to forty percent of all patients broke bones during the treatment, and the treatments were said to be used to “quieten the ward and insure good citizenship” [2002: 106].
If shock treatment sounds awful, the rise of the procedure which became known as the prefrontal lobotomy may sound even more so. Dr Walter Freeman [1895 – 1972] and Dr James Watts [1904 – 1994] pioneered a method by which an instrument was drilled/inserted into a patient’s brain to cure them of their disorders. Even though the procedure gained “medical approval”, it was also deemed akin to a “partial euthanasia” and “the removal of a patient’s soul”, because it induced an unprecedented state of passivity and a regression to childhood.
Back to Bedlam (1950 – 1990s)
The latter half of the twentieth century was all about drugs as a cure for mentally ill patients, and the drug chlorpromazine (Thorazine or Largactil) had all the attention. That was the time when the shift occurred from the asylum to community care, and schizophrenia was the illness of interest. Whitaker writes how pharmaceutical companies had the most to gain from promoting all sorts of antipsychotics as safe for consumption; drugs were even prescribed for the elderly, for children with learning difficulties and simply for people who had mild stress in their daily jobs: “19 million prescriptions were written annually” [2002: 205]. And all that was happening at a time when the same patients were reporting dangerous side-effects from those drugs and severe addiction. A glimpse of hope in that period was probably the study conducted by American psychiatrist Dr Loren Mosher [1933 – 2004], who opened the Soteria house for the mentally ill; his results showed that a controlling atmosphere and the over-use of drugs hindered patients’ recovery, and that more attention should be paid to therapeutic and benign treatments, as well as to an atmosphere of kindness. However, Mosher’s results were generally ignored, and testing various drugs on patients without their consent continued to be the norm.
Mad Medicine Today (1990s – present)
“The transformation of chlorpromazine from a drug that induced a chemical lobotomy into a safe, anti-schizophrenic drug took a decade”, but “by the mid-1980s, it was no longer possible to ignore the many drawbacks of neuroleptics” [2002: 258], reports the author. So, what did pharmaceutical companies do? They invented new, “safer” drugs, and their prime goal was to outperform their business competitors. Whitaker talks in this section about the proliferation of drugs, such as risperidone, which causes mania where none was before, and about hasty and negligently conducted preliminary trials on those drugs [2002: 286].
“With the new drugs presented to the public as wonderfully safe, American psychiatrists [were] inviting an ever greater number of patients into the madness tent” and “evidence of the harm caused by the drugs was simply allowed to pile up, then pushed away in the corner where it wouldn’t be seen” [2002: 289], writes Whitaker. He convincingly shows how politics and business became the biggest winners in a game where patients’ care and needs were hardly prime considerations, and where corruption in the sector was ever-present (for example, see this documentary on corrupt mental hospitals of the 1990s).