Monthly Archives: November 2009

Hey, we just cured Asperger’s!

We must have, right? Because it’s not going to be in the new edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM), which is used by just about every psychiatrist to diagnose what exactly someone is suffering from.

So Asperger’s has just…disappeared! Woohoo! That’s progress, my friends. Here’s Sacha Baron-Cohen’s cousin, Simon Baron-Cohen, who heads up the Autism Research Centre at Cambridge University, to tell us more:

The Short Life of a Diagnosis

The Diagnostic and Statistical Manual of Mental Disorders, published by the American Psychiatric Association, is the bible of diagnosis in psychiatry, and is used not just by doctors around the world but also by health insurers.

Changing any such central document is complicated. It should therefore come as no surprise that a committee of experts charged with revising the manual has caused consternation by considering removing Asperger syndrome from the next edition, scheduled to appear in 2012. The committee argues that the syndrome should be deleted because there is no clear separation between it and its close neighbor, autism.

The experts propose that both conditions should be subsumed under the term “autism spectrum disorder,” with individuals differentiated by levels of severity. It may be true that there is no hard and fast separation between Asperger syndrome and classic autism, since they are currently differentiated only by intelligence and onset of language. Both classic autism and Asperger syndrome involve difficulties with social interaction and communication, alongside unusually narrow interests and a strong desire for repetition, but in Asperger syndrome, the person has good intelligence and language acquisition.

The question of whether Asperger syndrome should be included or excluded is the latest example of dramatic changes in the history of the diagnostic manual. The first manual, published in 1952, listed 106 “mental disorders.” The second (1968) listed 182, and famously removed homosexuality as a disorder in a later printing. The third (1980) listed 265 disorders, taking out “neurosis.” The revised third version (1987) listed 292 disorders, while the current fourth version cut the list of disorders back to 283.

This history reminds us that psychiatric diagnoses are not set in stone. They are “manmade,” and different generations of doctors sit around the committee table and change how we think about “mental disorders.”

This in turn reminds us to set aside any assumption that the diagnostic manual is a taxonomic system. Maybe one day it will achieve this scientific value, but a classification system that can be changed so freely and so frequently can’t be close to following Plato’s recommendation of “carving nature at its joints.”

Part of the reason the diagnostic manual can move the boundaries and add or remove “mental disorders” so easily is that it focuses on surface appearances or behavior (symptoms) and is silent about causes. Symptoms can be arranged into groups in many ways, and there is no single right way to cluster them. Psychiatry is not at the stage of other branches of medicine, where a diagnostic category depends on a known biological mechanism. An example of where this does occur is Down syndrome, where surface appearances are irrelevant. Instead the cause — an extra copy of Chromosome 21 — is the sole determinant to obtain a diagnosis. Psychiatry, in contrast, does not yet have any diagnostic blood tests with which to reveal a biological mechanism.

So what should we do about Asperger syndrome? Although originally described in German in 1944, the first article about it in English was published in 1981, and Asperger syndrome made it only into the fourth version of the manual, in 1994. That is, the international medical community took 50 years to acknowledge it. In the last decade thousands of people have been given the diagnosis. Seen through this historical lens, it seems a very short time frame to be considering removing Asperger syndrome from the manual.

We also need to be aware of the consequences of removing it. First, what happens to those people and their families who waited so long for a diagnostic label that does a good job of describing their profile? Will they have to go back to the clinics to get their diagnoses changed? The likelihood of causing them confusion and upset seems high.

Second, science hasn’t had a proper chance to test if there is a biological difference between Asperger syndrome and classic autism. My colleagues and I recently published the first candidate gene study of Asperger syndrome, which identified 14 genes associated with the condition.

We don’t yet know if Asperger syndrome is genetically identical or distinct from classic autism, but surely it makes scientific sense to wait until these two subgroups have been thoroughly tested before lumping them together in the diagnostic manual. I am the first to agree with the concept of an autistic spectrum, but there may be important differences between subgroups that the psychiatric association should not blur too hastily.

Simon Baron-Cohen, the director of the Autism Research Centre at Cambridge University, is the author of “The Essential Difference.”

Cloud thinking

Check out this great article by John Brockman, editor of Edge magazine (the best ideas publication out there, to my mind). OK, it opens pretentiously…in fact, could you imagine a more pretentious opening?…but there are still some interesting thoughts in this article.

At a dinner in the mid-sixties, the composer John Cage handed me a copy of Norbert Wiener’s book, Cybernetics. He was talking about “the mind we all share” in the context of “the cybernetic idea”. He was not talking about Teilhard de Chardin, the Noosphere, or any kind of metaphysics.

The cybernetic idea was built from Turing’s Universal Machine in the late thirties; Norbert Wiener’s work during World War II on automatic aiming and firing of anti-aircraft guns; John von Neumann’s theory of automata and its applications (mid-forties); and Claude Shannon’s landmark paper founding information theory in 1948.

What exactly is “the cybernetic idea”? Well, it’s not to be confused with the discipline of cybernetics, which hit a wall, and stopped evolving during the 1950s. And it’s not your usual kind of idea. The cybernetic idea is an invention. A very big invention. The late evolutionary biologist Gregory Bateson called it the most important idea since the idea of Jesus Christ.

The most important inventions involve the grasping of a conceptual whole, a set of relationships which had not been previously recognized. This necessarily involves a backward look. We don’t notice it. An example of this is the “invention” of talking. Humans did not notice that they were talking until the day someone said, “We’re talking.” No doubt the first person to utter such words was considered crazy. But that moment was the invention of talking, the recognition of pattern which, once perceived, had always been there.

So how does this fit in with the cybernetic idea?

It’s the recognition that reality itself is communicable. It’s the perception that the nonlinear extension of the brain’s experience — the socialization of mind — is a process that involves the transmission of neural pattern — electrical, not mental — that’s part of a system of communication and control that functions without individual awareness or consent.

This cybernetic explanation tears apart the fabric of our habitual thinking. Subject and object fuse. The individual self decreates. It is a world of pattern, of order, of resonances. It’s an undone world of language, communication, and pattern. By understanding that the experience of the brain is continually communicated through the process of information, we can now recognize the extensions of man as communication, not as a means for the flow of communication. As such they provide the information for the continual process of neural coding.

How is this playing out in terms of the scenarios presented by Frank Schirrmacher in his comments about the effect of the Internet on our neural processes? Here are some random thoughts inspired by the piece and the discussion:

Danny Hillis once said that “the web is the slime mold of the Internet. In the long run, the Internet will arrive at a much richer infrastructure, in which ideas can potentially evolve outside of human minds. You can imagine something happening on the Internet along evolutionary lines, as in the simulations I run on my parallel computers. It already happens in trivial ways, with viruses, but that’s just the beginning. I can imagine nontrivial forms of organization evolving on the Internet. Ideas could evolve on the Internet that are much too complicated to hold in any human mind.” He suggested that “new forms of organization that go beyond humans may be evolving. In the short term, forms of human organization are enabled.”

Schirrmacher reports on Gerd Gigerenzer’s idea that “thinking itself somehow leaves the brain and uses a platform outside of the human body. And that’s the Internet and it’s the cloud. And very soon we will have the brain in the cloud. And this raises the question of the importance of thoughts. For centuries, what was important for me was decided in my brain. But now, apparently, it will be decided somewhere else.”

John Bargh notes that research on the prediction and control of human judgment and behavior has become democratized. “This has indeed produced (and is still producing) an explosion of knowledge of the IF-THEN contingencies of human responses to the physical and social environment … we are so rapidly building a database or atlas of unconscious influences and effects that could well be exploited by ever-faster computing devices, as the knowledge is accumulating at an exponential rate.” The import of Bargh’s thinking is that the mere existence of a social network becomes an unconscious influence on human judgment and behavior.

George Dyson traces how numbers have changed from representing things, to meaning things, to doing things. He points out that the very activity involved in the socialization of mind means that “we have network processes (including human collaboration) that might actually be ideas.”

What does all this add up to?

Schirrmacher is correct when he points out that in this digital age we are going through a fundamental change which includes how our brains function. But the presence or absence of free will is a trivial concern next to the big challenge confronting us: to recognize the radical nature of the changes that are occurring and to grasp an understanding of the process as our empirical advances blow apart our epistemological bases for thinking about who and what we are. “We’re talking.”