The parasite, which is excreted by cats in their feces, is called Toxoplasma gondii (T. gondii or Toxo for short) and is the microbe that causes toxoplasmosis, the reason pregnant women are told to avoid cats' litter boxes. Since the 1920s, doctors have recognized that a woman who becomes infected during pregnancy can transmit the disease to the fetus, in some cases resulting in severe brain damage or death. T. gondii is also a major threat to people with weakened immunity: in the early days of the AIDS epidemic, before good antiretroviral drugs were developed, it was to blame for the dementia that afflicted many patients at the disease's end stage. Healthy children and adults, however, usually experience nothing worse than brief flu-like symptoms before quickly fighting off the protozoan, which thereafter lies dormant inside brain cells. Or at least that's the standard medical wisdom.
But if Flegr is right, the "latent" parasite may be quietly tweaking the connections between our neurons, changing our response to frightening situations, our trust in others, how outgoing we are, and even our preference for certain scents. And that's not all. He also believes that the organism contributes to car crashes, suicides, and mental disorders such as schizophrenia. When you add up all the different ways it can harm us, says Flegr, "Toxoplasma might even kill as many people as malaria, or at least a million people a year."
In a 2008 paper called "The Seductive Allure of Neuroscience Explanations," a group from Yale University demonstrated that including neuroscientific information in explanations of psychological phenomena makes the explanations more appealing, even if that information is irrelevant.
Explanations of psychological phenomena seem to generate more public interest when they contain neuroscientific information. Even irrelevant neuroscience information in an explanation of a psychological phenomenon may interfere with people's abilities to critically consider the underlying logic of this explanation.
I don't know if I buy this. Perhaps if the authors had explained their results relative to how the human brain functions...
Nishimoto and two other research team members served as subjects for the experiment, because the procedure requires volunteers to remain still inside the MRI scanner for hours at a time.
They watched two separate sets of Hollywood movie trailers, while fMRI was used to measure blood flow through the visual cortex, the part of the brain that processes visual information. On the computer, the brain was divided into small, three-dimensional cubes known as volumetric pixels, or "voxels."
"We built a model for each voxel that describes how shape and motion information in the movie is mapped into brain activity," Nishimoto said.
The brain activity recorded while subjects viewed the first set of clips was fed into a computer program that learned, second by second, to associate visual patterns in the movie with the corresponding brain activity.
Brain activity evoked by the second set of clips was used to test the movie reconstruction algorithm. This was done by feeding 18 million seconds of random YouTube videos into the computer program so that it could predict the brain activity that each film clip would most likely evoke in each subject.
Finally, the 100 clips that the computer program decided were most similar to the clip that the subject had probably seen were merged to produce a blurry yet continuous reconstruction of the original movie.
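The pipeline described above (fit a per-voxel encoding model, predict the activity a library of candidate clips would evoke, then rank candidates against the observed activity and average the best matches) can be sketched in a few lines of numpy. This is a toy illustration only: the dimensions and variable names are hypothetical, the real study used nonlinear motion-energy features and far larger data, and this sketch substitutes a simple least-squares linear model for the lab's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the real study used thousands of voxels
# and 18 million seconds of candidate video).
n_train, n_features, n_voxels = 200, 10, 50

# 1. Training: learn, for each voxel, how movie features map to activity.
train_features = rng.normal(size=(n_train, n_features))   # shape/motion features per second
true_weights = rng.normal(size=(n_features, n_voxels))    # unknown "ground truth" mapping
train_activity = train_features @ true_weights + 0.1 * rng.normal(size=(n_train, n_voxels))

# One linear encoding model per voxel, fit jointly by least squares.
weights, *_ = np.linalg.lstsq(train_features, train_activity, rcond=None)

# 2. Prediction: for a library of candidate clips, predict the brain
# activity each clip would most likely evoke.
n_candidates = 1000
candidate_features = rng.normal(size=(n_candidates, n_features))
predicted_activity = candidate_features @ weights         # (n_candidates, n_voxels)

# 3. Reconstruction: given activity actually recorded while the subject
# watched an unknown clip, rank candidates by how well their predicted
# activity matches the observation, then merge the top 100.
observed = candidate_features[123] @ true_weights          # pretend this was measured
scores = np.array([np.corrcoef(p, observed)[0, 1] for p in predicted_activity])
top100 = np.argsort(scores)[::-1][:100]
reconstruction = candidate_features[top100].mean(axis=0)   # blurry average of best matches
```

Averaging the top 100 matches rather than taking the single best one is what produces the characteristically blurry but continuous reconstructions.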
The kicker: "the breakthrough paves the way for reproducing the movies inside our heads that no one else sees, such as dreams and memories". First time-travelling neutrinos and now this...what a time to be alive. (via ★essl)
They found that engaging in brief (10 minute) conversations in which participants were simply instructed to get to know another person resulted in boosts to their subsequent performance on an array of common cognitive tasks. But when participants engaged in conversations that had a competitive edge, their performance on cognitive tasks showed no improvement.
"We believe that performance boosts come about because some social interactions induce people to try to read others' minds and take their perspectives on things," Ybarra said. "And we also find that when we structure even competitive interactions to have an element of taking the other person's perspective, or trying to put yourself in the other person's shoes, there is a boost in executive functioning as a result."
I've noticed this effect with myself but I always thought it was the result of my introversion, i.e. competitive conversations are more stressful and sap energy and mental function more quickly than normal conversations. I know a couple of people who enjoy competitive conversation and I've largely steered clear of those interactions since realizing I always felt so blah afterwards.
Tatiana and Krista Hogan are conjoined twins who not only share a bit of each other's skulls but also parts of their brains. So are they two people with two brains & personalities or one person with one brain and two (split) personalities?
Adding to the conundrum, of course, are their linked brains, and the mysterious hints of what passes between them. The family regularly sees evidence of it. The way their heads are joined, they have markedly different fields of view. One child will look at a toy or a cup. The other can reach across and grab it, even though her own eyes couldn't possibly see its location. "They share thoughts, too," says Louise. "Nobody will be saying anything," adds Simms, "and Tati will just pipe up and say, 'Stop that!' And she'll smack her sister." While their verbal development is delayed, it continues to get better. Their sentences are two or three words at most so far, and their enunciation is at first difficult to understand. Both the family, and researchers, anxiously await the children's explanation for what they are experiencing.
Corkin first met Henry at Brenda Milner's lab in Montreal in 1962, and over the years, as the mining of his mind has continued, she's witnessed firsthand how Henry continues to give up riches, broadening our understanding of how memory works. But she's also keenly aware of Henry's enduring mysteries, has documented things about him that nobody can quite explain, not yet.
For example, Henry's inability to recall postoperative episodes, an amnesia that was once thought to be complete, has revealed itself over the years to have some puzzling exceptions. Certain things have managed, somehow, to make their way through, to stick and become memories. Henry knows a president was assassinated in Dallas, though Kennedy's motorcade didn't leave Love Field until more than a decade after Henry left my grandfather's operating room. Henry can hear the incomplete name of an icon -- "Bob Dy ..." -- and complete it, even though in 1953 Robert Zimmerman was just a twelve-year-old chafing against the dead-end monotony of small-town Minnesota. Henry can tell you that Archie Bunker's son-in-law is named Meathead.
How is this possible?
The piece is written by the grandson of the doctor who removed a portion of Molaison's brain in an effort to cure his epilepsy.
As I note in How We Decide, this data directly contradicts the rational models of microeconomics. Consumers aren't always driven by careful considerations of price and expected utility. We don't look at the electric grill or box of chocolates and perform an explicit cost-benefit analysis. Instead, we outsource much of this calculation to our emotional brain, and rely on relative amounts of pleasure versus pain to tell us what to purchase.
They then put the participants, one by one, in a dark anechoic chamber which shields all incoming sounds and deadens any noise made by the participant. The room had a 'panic button' to stop the experiment but apparently no-one needed to use it.
Our brains have Oprah neurons, Aniston neurons, Eiffel Tower neurons, and Saddam neurons that fire when we see pictures or hear the names of these people and places.
Yet "Oprah neuron" might be a misnomer. The same neuron also fired, albeit much more weakly, to Whoopi Goldberg in one patient. Similarly, Luke Skywalker neurons also responded to Yoda, and those famous Jennifer Aniston neurons flashed to her former Friends co-star Lisa Kudrow. Such connections could explain how our brain relates two abstract concepts, Quian Quiroga says.
And then the Skywalker neurons said, "these aren't the memories you're looking for". Ba doomp.
The fact that both of these important brain networks become active together suggests that mind wandering is not useless mental static. Instead, Schooler proposes, mind wandering allows us to work through some important thinking. Our brains process information to reach goals, but some of those goals are immediate while others are distant. Somehow we have evolved a way to switch between handling the here and now and contemplating long-term objectives. It may be no coincidence that most of the thoughts that people have during mind wandering have to do with the future.
This jibes well with the picture of the absentmindedness typical of some brilliant people.
If I ever write a book, it might have something to do with the two minds that govern creative expertise: the instinctual unconscious mind (the realm of relaxed concentration) and the thinking mind (the realm of deliberate practice). The tension between these two minds is both the key to and fatal flaw of human creativity. From the world of sports1, here's Rockies pitcher and college physics major Jeff Francis describing the interplay of the minds on the mound:
Even though I do understand the forces and everything, there's a separation when I'm pitching. If I throw a good pitch, I know what I did to do it, but there has to be a separation between knowing what I did and knowing why what I did helped the ball do what it did, if that makes any sense at all. If I thought about it on the mound, I'd be really mechanical and trying to be too perfect instead of doing what comes naturally.
But you don't need to be a physics major to wrestle with the consequences of the conflict between the two minds. After an injury and subsequent surgery, Francis' instinctual mind works to protect his body from further injury:
Francis repeatedly pulled the ball back in preparation to throw. But as he flashed his arm forward, his hand would, mind unaware, bring the ball back toward his ear rather than at full extension. It was his body essentially shortening the axis of his arm to decrease the force on his shoulder, protecting him from pain. And Francis could not stop it.
After his 10th pitch and first muffled groan of pain, he stopped.
"It's hurting you?" Murayama said.
"Yeah," Francis said.
"I can tell. You're getting out ahead of your arm. Slow down, stay back a little more."
"Does it look like I'm scared to throw a little?"
"Are you scared?"
To fully recover and regain his former effective pitching motion, Francis will utilize his thinking mind to retrain his unconscious mind through deliberate practice to ignore the injury potential. (thx, adriana)
 Most of the examples I've cited over the years deal with sports, mostly because professional athletes are among the most trained, scrutinized, studied, and optimized creative workers in the world. For a lot of other professions and endeavors, the data and scrutiny just isn't as evident. ↩
Since I don't use Adderall or Provigil, it took me a few days to get through this New Yorker article about neuroenhancing drugs. The main takeaway? Like cosmetic body modification in the 80s, mind modification through prescription chemical means is already commonplace for some and will soon be for many.
Chatterjee worries about cosmetic neurology, but he thinks that it will eventually become as acceptable as cosmetic surgery has; in fact, with neuroenhancement it's harder to argue that it's frivolous. As he notes in a 2007 paper, "Many sectors of society have winner-take-all conditions in which small advantages produce disproportionate rewards." At school and at work, the usefulness of being "smarter," needing less sleep, and learning more quickly are all "abundantly clear." In the near future, he predicts, some neurologists will refashion themselves as "quality-of-life consultants," whose role will be "to provide information while abrogating final responsibility for these decisions to patients." The demand is certainly there: from an aging population that won't put up with memory loss; from overwrought parents bent on giving their children every possible edge; from anxious employees in an efficiency-obsessed, BlackBerry-equipped office culture, where work never really ends.
The article is full of wonderful vocabulary. Like the "worried well": those people who are healthy but go to the doctor anyway to see if they can be made more healthy somehow. Being concerned about how good you've got it and attempting to do something about it seems to be another one of those uniquely American phenomena caused by an overabundance of free time & disposable income and the desire to overachieve. See also the impoverished wealthy, the dumb educated, and the fat fit.
Henry Molaison -- more widely known as H.M. -- died last week at 82. Molaison was an amnesiac and the study of his condition revealed much about the workings of the human brain. After a surgery in 1953, he lost the ability to form new long-term memories and couldn't remember anything that happened afterward for more than 20 seconds or so.
Living at his parents' house, and later with a relative through the 1970s, Mr. Molaison helped with the shopping, mowed the lawn, raked leaves and relaxed in front of the television. He could navigate through a day attending to mundane details -- fixing a lunch, making his bed -- by drawing on what he could remember from his first 27 years.
Molly Birnbaum was training to be a chef in Boston when she got hit by a car and lost her sense of smell. Soon after, she moved to New York.
Without the aroma of car exhaust, hot dogs or coffee, the city was a blank slate. Nothing was unbearable and nothing was especially beguiling. Penn Station's public restroom smelled the same as Jacques Torres's chocolate shop on Hudson Street. I knew that New York possessed a further level of meaning, but I had no access to it, and I worked hard to ignore what I could not detect.
In the first year of my recovery, I regularly visited both a neurologist and neuropsychologist who both disputed this claim. They told me that smell and taste, although related, are essentially exclusive. If anything, my neuropsychologist told me, smell is more integrated with memory.
In my experience, I've found this to be true: I have not lost my love of food; in fact, I feel like my appreciation for flavor combinations has been heightened. Milk does not taste like a "viscous liquid" to me and ice cream is certainly more than just "freezing." Similarly, a good wine is more than tasting the acids, a memorable dessert is more than simply sweet, and french fries do not taste like salty nothing-sticks.
After monitoring the daily schedule of the children for several months, Belton came to the conclusion that their lack of imagination was, at least in part, caused by the absence of "empty time," or periods without any activity or sensory stimulation. She noticed that as soon as these children got even a little bit bored, they simply turned on the television: the moving images kept their minds occupied. "It was a very automatic reaction," she says. "Television was what they did when they didn't know what else to do."
The problem with this habit, Belton says, is that it kept the kids from daydreaming. Because the children were rarely bored -- at least, when a television was nearby -- they never learned how to use their own imagination as a form of entertainment. "The capacity to daydream enables a person to fill empty time with an enjoyable activity that can be carried on anywhere," Belton says. "But that's a skill that requires real practice. Too many kids never get the practice."
But television isn't the default network that Lehrer is referring to:
Every time we slip effortlessly into a daydream, a distinct pattern of brain areas is activated, which is known as the default network. Studies show that this network is most engaged when people are performing tasks that require little conscious attention, such as routine driving on the highway or reading a tedious text. Although such mental trances are often seen as a sign of lethargy -- we are staring haplessly into space -- the cortex is actually very active during this default state, as numerous brain regions interact. Instead of responding to the outside world, the brain starts to contemplate its internal landscape. This is when new and creative connections are made between seemingly unrelated ideas.
Not surprisingly, the players were significantly better at predicting whether or not the shot would go in. While they got it right more than two-thirds of the time, the non-playing experts (i.e., the coaches and writers) only got it right 44 percent of the time.
It's thought that the brains of the players act as though they are actually taking the shot.
In other words, when professional basketball players watch another player take a shot, mirror neurons in their pre-motor areas might light up as if they were taking the same shot. This automatic empathy allows them to predict where the ball will end up before the ball is even in the air.
Jonah Lehrer, author of Proust Was a Neuroscientist, has a piece in the New Yorker this week (not online1) about how the process of insight works in the brain. The main takeaway is that insight comes easiest when the brain is relaxed and not focused on too much detail, so that it can look for more general associations between seemingly disparate ideas.
Kounios tells a story about an expert Zen meditator who took part in one of the C.R.A. insight experiments. At first, the meditator couldn't solve any of the insight problems. "This Zen guy went through thirty or so of the verbal puzzles and just drew a blank," Kounios said. "He was used to being very focussed, but you can't solve these problems if you're too focussed." Then, just as he was about to give up, he started solving one puzzle after another, until, by the end of the experiment, he was getting them all right. It was an unprecedented streak. "Normally, people don't get better as the task goes along," Kounios said. "If anything, they get a little bored." Kounios believes that the dramatic improvement of the Zen meditator came from his paradoxical ability to focus on not being focussed, so that he could pay attention to those remote associations in the right hemisphere. "He had the cognitive control to let go," Kounios said. "He became an insight machine."
I try not to miss any of Atul Gawande's New Yorker articles, but his piece on itching from this week's issue is possibly the most interesting thing I've read in the magazine in a long time. He begins by focusing on a specific patient for whom compulsive itching has become a very serious problem. (Warning, this quote is pretty disturbing...but don't let it deter you from reading the article.)
...the itching was so torturous, and the area so numb, that her scratching began to go through the skin. At a later office visit, her doctor found a silver-dollar-size patch of scalp where skin had been replaced by scab. M. tried bandaging her head, wearing caps to bed. But her fingernails would always find a way to her flesh, especially while she slept.
One morning, after she was awakened by her bedside alarm, she sat up and, she recalled, "this fluid came down my face, this greenish liquid." She pressed a square of gauze to her head and went to see her doctor again. M. showed the doctor the fluid on the dressing. The doctor looked closely at the wound. She shined a light on it and in M.'s eyes. Then she walked out of the room and called an ambulance. Only in the Emergency Department at Massachusetts General Hospital, after the doctors started swarming, and one told her she needed surgery now, did M. learn what had happened. She had scratched through her skull during the night -- and all the way into her brain.
From there, Gawande pulls out to tell us about itching/scratching (the two are inseparable), then about a recent theory of how our brains perceive the world ("visual perception is more than ninety per cent memory and less than ten per cent sensory nerve signals"), and finally about a fascinating therapy initially developed for those who experience phantom limb pain called mirror treatment.
Among them is an experiment that Ramachandran performed with volunteers who had phantom pain in an amputated arm. They put their surviving arm through a hole in the side of a box with a mirror inside, so that, peering through the open top, they would see their arm and its mirror image, as if they had two arms. Ramachandran then asked them to move both their intact arm and, in their mind, their phantom arm-to pretend that they were conducting an orchestra, say. The patients had the sense that they had two arms again. Even though they knew it was an illusion, it provided immediate relief. People who for years had been unable to unclench their phantom fist suddenly felt their hand open; phantom arms in painfully contorted positions could relax. With daily use of the mirror box over weeks, patients sensed their phantom limbs actually shrink into their stumps and, in several instances, completely vanish. Researchers at Walter Reed Army Medical Center recently published the results of a randomized trial of mirror therapy for soldiers with phantom-limb pain, showing dramatic success.
Crazy! Gawande documents and speculates about other applications of this treatment, including using virtual reality representations instead of mirrors and utilizing multiple mirrors for treatment of M.'s itchy scalp. Anyway, read the whole thing...highly recommended.
"The main functional characteristic of mirror neurons is that they become active both when the monkey makes a particular action (for example, when grasping an object or holding it) and when it observes another individual making a similar action." In other words, these peculiar cells mirror, on our inside, the outside world; they enable us to internalize the actions of another. They collapse the distinction between seeing and doing.
This suggests that when I watch Kobe glide to the basket for a dunk, a few deluded cells in my premotor cortex are convinced that I, myself, am touching the rim. And when he hits a three pointer, my mirror neurons light up as if I've just made the crucial shot. They are what bind me to the game, breaking down that 4th wall separating fan from player. I'm not upset because my team lost: I'm upset because it literally feels like I lost, as if I had been on the court.
'They like toys more that are associated with someone who has spoken their language. They prefer to eat foods offered to them by a native speaker compared to a speaker of a foreign language. And older children say that they want to be friends with someone who speaks in their native accent.' Accents and vernacular, far more than race, seem to influence the people we like. 'Children would rather be friends with someone who is from a different race and speaks with a native accent versus somebody who is their own race but speaks with a foreign accent.'
The disease apparently altered circuits in their brains, changing the connections between the front and back parts and resulting in a torrent of creativity. "We used to think dementias hit the brain diffusely," Dr. Miller said. "Nothing was anatomically specific. That is wrong. We now realize that when specific, dominant circuits are injured or disintegrate, they may release or disinhibit activity in other areas. In other words, if one part of the brain is compromised, another part can remodel and become stronger."
Some of Adams' work can be seen here...her portrait of pi contains a touch of synesthesia. (thx, cory)
This talk by neuroanatomist Jill Bolte Taylor was universally considered the best talk at the TED conference last month. In it, she describes the lessons she learned from studying her stroke from inside her own head as it was happening.
And in that moment my right arm went totally paralyzed by my side. And I realized, "Oh my gosh! I'm having a stroke! I'm having a stroke!" And the next thing my brain says to me is, "Wow! This is so cool. This is so cool. How many brain scientists have the opportunity to study their own brain from the inside out?"
Proust Was a Neuroscientist is the story of how eight writers and artists anticipated our contemporary understanding of the human brain. From the preface:
This book is about artists who anticipated the discoveries of neuroscience. It is about writers and painters and composers who discovered truths about the human mind -- real, tangible truths -- that science is only now rediscovering. Their imaginations foretold the facts of the future.
I enjoyed the book quite a bit so I sent the author, Jonah Lehrer, a few questions via email. Here's our brief conversation.
Jason Kottke: Your exploration of the intersection of neuroscience and culture begins with Proust; you were reading Swann's Way while doing research in a neuroscience lab. Where did the idea come from for a collection of people who anticipated our modern understanding of the human brain? How did you find those other stories?
Jonah Lehrer: The lab I was working in was studying the chemistry of memory. The manual labor of science can get pretty tedious, and so I started reading Proust while waiting for my experiments to finish. After a few hundred pages of melodrama, I began to realize that the novelist had these very modern ideas about how our memory worked. His fiction, in other words, anticipated the very facts I was trying to uncover by studying the isolated neurons of sea slugs. Once I had this idea about looking at art through the prism of science, I began to see connections everywhere. I'd mutter about the visual cortex while looking at a Cezanne painting, or think about the somatosensory areas while reading Whitman on the "body electric". Needless to say, my labmates mocked me mercilessly.
I'm always a little embarrassed to admit just how idiosyncratic my selection process was for the other artists in the book. I simply began with my favorite artists and tried to see what they had to say about the mind. The first thing that surprised me was just how much they had to say. Virginia Woolf, for instance, is always going on and on about her brain. "Nerves" has to be one of her favorite words.
Kottke: Which of your characters did you know the least about beforehand? Even a seeming polymath like yourself must have a blind spot or two.
Lehrer: Definitely Gertrude Stein. I actually found her through William James, the great American psychologist and philosopher. She worked in his Harvard lab, published a few scientific papers on "automatic writing," and then went to med-school at Johns Hopkins before dropping out and moving to Paris to hang out with Picasso. So I knew she had this deep background in science, but I had only read snippets of her work. I then proceeded to fall asleep to the same page of "The Making of Americans" for a month.
Kottke: Are there other characters that you considered for inclusion? If so, why weren't they included?
Lehrer: Lots of people were left on the cutting room floor. I had a long digression on Edgar Allan Poe and mirror neurons. (See, for instance, "The Purloined Letter," where Poe has detective Dupin reveal his secret for reading the minds of criminals: "When I wish to find out how wise, or how stupid, or how good, or how wicked is any one, or what are his thoughts at the moment, I fashion the expression of my face, as accurately as possible, in accordance with the expression of his, and then wait to see what thoughts or sentiments arise in my mind or heart, as if to match or correspond with the expression.") I also had a chapter on Coleridge and the unconscious, but I think that chapter was really just me wanting to write about opium. But, for the most part, I can't really say why some chapters survived the editing process and others didn't. I certainly mean no disrespect to Poe. If they let me write a sequel, I'll find a way to include him.
Kottke: I noticed that three out of the eight main characters in the book are women. Surveying the usually cited big thinkers of the 19th and 20th centuries, it would have been easy to write this book with all male characters. Is there an implicit statement in there that science would be better off with a greater percentage of women participating?
Lehrer: While I certainly agree with the idea that the institution of science would benefit from more female scientists, I didn't choose these female artists for that reason. I don't think you need any ulterior motive to fall in love with the work of Virginia Woolf and George Eliot. Their art speaks for itself. That said, I think the psychological insights of women like Woolf were rooted, at least in part, in their womanhood. Woolf, for instance, rebelled against the stodgy old male novelists of her day. Their fiction, she complained, was all about "factories and utopias". Woolf wanted to invert this hierarchy, so that the "task of the novelist" was to "examine an ordinary mind on an ordinary day." There's something very domestic about her modernism, so that the grandest epiphanies happen while someone is out buying flowers or eating a beef stew. Women might not be able to write novels about war or politics, but they could find an equal majesty by exploring the mind.
Plus, I think Woolf learned a lot about the brain from her mental illness. As a woman, she was subjected to all sorts of terrible psychiatric treatments, which made her rather skeptical of doctors. (In Mrs. Dalloway, she refers to the paternalistic Dr. Bradshaw as an "obscurely evil" person, whose insistence that the mental illness was "physical, purely physical" causes a suicide.) Introspection was Woolf's only medicine. "I feel my brains, like a pear, to see if it's ripe," she once wrote. "It will be exquisite by September."
Kottke: Are there other books/media out there that share a third culture kinship with yours? I received a copy of Lawrence Weschler's Everything That Rises: A Book of Convergences for Christmas...that seems to fit. Steven Johnson's books. Anything else you can recommend?
Lehrer: I've stolen ideas from so many people it's hard to know where to begin. Certainly Weschler and Johnson have both been major influences. I've always worshipped Oliver Sacks; Richard Powers has more neuroscience in his novels than most issues of Nature; I just saw Olafur Eliasson's new show at SFMOMA and that was rather inspiring. I could go on and on. It's really an exciting time to be interested in the intersection of art and science.
But I'd also recommend traveling back in time a little bit, before our two cultures were so divided. We don't think of people like George Eliot as third-culture figures, but she famously described her novels as "a set of experiments in life." Virginia Woolf, before she wrote Mrs. Dalloway, said that in her new novel the "psychology should be done very realistically." Whitman worked in Civil War hospitals and corresponded for years with the neurologist who discovered phantom limb syndrome. (He also kept up with phrenology, the brain science of his day.) Or look at Coleridge. When the poet was asked why he attended so many lectures on chemistry, he gave a great answer: "To improve my stock of metaphors". In other words, trying to merge art and science isn't some newfangled idea.
Thanks, Jonah. You can read more of Lehrer's writing at his frequently updated blog, The Frontal Cortex.
The experimenters used functional magnetic resonance imaging (fMRI) to scan the brains of Harvard and other Boston-area students while showing them pictures of other college-age people whom the researchers randomly described as either liberal northeastern students or conservative Midwest fundamentalist Christian students.
The study concludes that the secret to getting along with someone you perceive as an outsider is to find some common ground, so that your brain will accept them as someone with similar circumstances.
This is not new advice. Yet it is heartening to see that it is firmly grounded in distinct patterns of neural activity. There may be a brain basis for reacting with prejudice toward those who seem different. But there's also a brain basis for overriding those differences and seeing outsiders as more like us.
In other words, a civilized society depends not on the people who are currently the most civilized, but on those who are most willing to accept change as social or cultural groupings change, split, or coalesce. Inevitably this means reasonable people rather than faithful people.
First, our brains consist of material particles. Second, these particles, in certain arrangements, produce subjective thoughts and feelings. Third, physical properties alone cannot account for subjectivity. (How could the ineffable experience of tasting a strawberry ever arise from the equations of physics?) Now, Nagel reasoned, the properties of a complex system like the brain don't just pop into existence from nowhere; they must derive from the properties of that system's ultimate constituents. Those ultimate constituents must therefore have subjective features themselves -- features that, in the right combinations, add up to our inner thoughts and feelings. But the electrons, protons and neutrons making up our brains are no different from those making up the rest of the world. So the entire universe must consist of little bits of consciousness.
Dude! Note: the timestamp on this post is exactly 4:20 pm ET. You know what to do.
From an article on human memory that includes profiles of a woman who remembers everything she's done in her life since age 11 and a man who remembers almost nothing after 1960:
The metaphors we most often use to describe memory -- the photograph, the tape recorder, the mirror, the hard drive -- all suggest mechanical accuracy, as if the mind were some sort of meticulous transcriber of our experiences. And for a long time it was a commonly held view that our brains function as perfect recorders -- that a lifetime of memories are socked away somewhere in the cerebral attic, and if they can't be found it isn't because they've disappeared, but only because we've lost access to them.
That's not the case, of course. A better metaphor for human memory might be that of an almost-saturated sponge trying to sop up spilled water on a counter. The sponge gets some of the water up but also loses some of its already-captured liquid and you just sort of smear the watery mess all over until the counter is completely wet but appears less waterlogged than it was. At least, that's how *my* memory works.
...a neurologically based phenomenon in which stimulation of one sensory or cognitive pathway leads to automatic, involuntary experiences in a second sensory or cognitive pathway.
For some people, this means that numbers are associated with colors...5 is blue, 2 is red, etc. In a recent experiment, a person with synesthesia was found to experience colors associated with numbers even though he was colorblind...colors he had never actually seen with his eyes.
That may seem strange, but what it really means is that the subject had problems with his retina that left him able to distinguish only an extremely narrow range of wavelengths when looking at most images in the world -- his brain was fine, but his eyes weren't quite up to the job. But when he saw certain numbers, he experienced colors that he otherwise never saw.
The set of abilities that allows you to select behavior that's appropriate to the situation, inhibit inappropriate behavior and focus on the job at hand in spite of distractions. Executive function includes basic functions like processing speed, response speed and working memory, the type used to remember a house number while walking from the car to a party.
Interestingly, physical and not mental exercise is the best way to improve your brain's executive function. (via joel)
To control the simulated aircraft, the neurons first receive information from the computer about flight conditions: whether the plane is flying straight and level or is tilted to the left or to the right. The neurons then analyze the data and respond by sending signals to the plane's controls. Those signals alter the flight path and new information is sent to the neurons, creating a feedback system.
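The loop described above -- sense the plane's state, respond, let the response change the state, repeat -- is a classic closed feedback system. Here's a minimal sketch of that loop in Python, with the dish of neurons stood in for by a simple proportional controller; all function names and constants are illustrative assumptions, not details from the actual experiment.

```python
def neuron_response(tilt_deg, gain=0.5):
    """Stand-in for the neurons: read the current tilt, signal a correction."""
    return -gain * tilt_deg  # push back against the tilt


def simulate(initial_tilt_deg, steps=50):
    """Run the feedback loop: sense tilt -> respond -> controls update tilt."""
    tilt = initial_tilt_deg
    for _ in range(steps):
        correction = neuron_response(tilt)  # signal sent to the plane's controls
        tilt += correction                  # controls alter the flight path
        # ...and the new tilt is fed back to the "neurons" on the next pass
    return tilt


# Starting 20 degrees off level, the loop settles toward straight and level.
print(round(simulate(20.0), 6))
```

The point of the sketch is just the architecture: the controller never sees the whole flight, only the latest reading, yet the repeated sense-and-correct cycle is enough to stabilize the plane.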
FYI, this story is a couple of years old...if that matters to you.
Is she spinning clockwise or counterclockwise? Or both...and how is that even possible? It's a left-brain vs. right-brain test...which way she spins for you determines which side of your brain is more dominant. (Tip: if you're having trouble getting her to switch directions, focus on a point a couple of inches below her feet...that seems to do it for me.)
Tony Wright, horticulturalist, broke the unofficial world record by going without sleep for more than 11 days. His trick: when the left side of his brain tired, he switched to the right side, then back again after the left had recovered, and so on.
At some level, talk therapy has always been an exercise in replaying and reinterpreting each person's unique life story. Yet Mr. Adler found that in fact those former patients who scored highest on measures of well-being -- who had recovered, by standard measures -- told very similar tales about their experiences.
They described their problem, whether depression or an eating disorder, as coming on suddenly, as if out of nowhere. They characterized their difficulty as if it were an outside enemy, often giving it a name (the black dog, the walk of shame). And eventually they conquered it.
"The story is one of victorious battle: 'I ended therapy because I could overcome this on my own,'" Mr. Adler said. Those in the study who scored lower on measures of psychological well-being were more likely to see their moods and behavior problems as a part of their own character, rather than as a villain to be defeated. To them, therapy was part of a continuing adaptation, not a decisive battle.
The article goes on to describe the benefits of thinking about past events in the third person rather than in the first person:
In a 2005 study reported in the journal Psychological Science, researchers at Columbia University measured how student participants reacted to a bad memory, whether an argument or failed exam, when it was recalled in the third person. They tested levels of conscious and unconscious hostility after the recollections, using both standard questionnaires and students' essays. The investigators found that the third-person scenes were significantly less upsetting, compared with bad memories recalled in the first person.
"What our experiment showed is that this shift in perspective, having this distance from yourself, allows you to relive the experience and focus on why you're feeling upset," instead of being immersed in it, said Ethan Kross, the study's lead author. The emotional content of the memory is still felt, he said, but its sting is blunted as the brain frames its meaning, as it builds the story.
But things like eating disorders and mental illness aren't external forces and thinking about a bad memory as if it happened to a third party is not the truth. The standard model of the happy, smart, successful human being is someone who knows more, works hard, and has found, or at least is heading toward, their own personal meaning of life. But often that's not the case. Self-deceit (or otherwise willfully forgetting seemingly pertinent information) seems to be important to human growth.
The researchers studied 84 female housekeepers from seven hotels. To determine whether the placebo effect plays a role in the benefits of exercise, the researchers investigated whether subjects' mind-set (in this case, their perceived level of exercise) could inhibit or enhance the health benefits of exercise independent of any actual exercise. Women in four of the hotels were told that their regular work was enough exercise to meet the requirements for a healthy, active lifestyle, while the women in the other three hotels were told nothing.
Four weeks later, the researchers returned to assess any changes in the women's health. They found that the women in the informed group had lost an average of 2 pounds, lowered their blood pressure by almost 10 percent, and were significantly healthier as measured by body-fat percentage, body mass index, and waist-to-hip ratio. These improvements were significantly greater than those in the control group and were especially remarkable given the time period of only four weeks.
Just by thinking they were exercising, these women gained extra benefit from their usual routines. This mind-over-body effect reminded me of Allen Iverson's training routine, which relies on a technique called psychocybernetics:
"Let me tell you about Allen's workouts," says Terry Royster, his bodyguard from 1997 until early 2002. "All the time I have been with him, I never seen him lift a weight or stand there and shoot jumper after jumper. Instead, we'll be on our way to the game and he'll be quiet as hell. Finally, he'll say, 'You know now I usually cross my man over and take it into the lane and pull up? Well, tonight I'm gonna cross him over and then take a step back and fade away. I'm gonna kill 'em with it all night long.' And damned if he didn't do just that. See, that's his workout, when he's just sitting there, thinking. That's him working on his game."
What Iverson is doing is tricking his conscious self into thinking that he's done something that he hasn't, that he's practiced a move or shot 100 perfect free throws in a row. I think, therefore I slam. (I wonder if Iverson pictures himself in the first or third person in his visualizations.)
Carol Dweck's research looks at the difference between thinking of talent or ability as innate as opposed to something that can be developed:
At the time, the suggested cure for learned helplessness was a long string of successes. Dweck posited that the difference between the helpless response and its opposite -- the determination to master new things and surmount challenges -- lay in people's beliefs about why they had failed. People who attributed their failures to lack of ability, Dweck thought, would become discouraged even in areas where they were capable. Those who thought they simply hadn't tried hard enough, on the other hand, would be fueled by setbacks.
For some people, the facade they've created for themselves can come crashing down suddenly, as with stage fright:
He describes the sense of acute self-consciousness and loss of confidence that followed as "stage dread," a sort of "paradigm shift." He says, "It's not 'Look at me - I'm flying.' It's 'Look at me - I might fall.' It would be like playing a game of chess where you're constantly regretting the moves you've already played rather than looking at the ones you're going to play." Fry could not mobilize his defenses; unable to shore himself up, he took himself away.
"Until now, it's been assumed that people with high capacity visual working memory had greater storage but actually, it's about the bouncer - a neural mechanism that controls what information gets into awareness," Vogel said.
The findings suggest that despite the brain's astonishing ability to archive a lifetime of memories, one of its prime functions is, paradoxically, to forget. Our sensory organs continually deluge us with information, some of it unpleasant. We wouldn't get through the day -- or through life -- if we didn't repress much of it.
Perhaps the way to true personal achievement and happiness is through lying to yourself instead of being honest, loafing instead of practicing, and purposely forgetting information. There are plenty of self-help books on the market...where are the self-hurt books?
Update: Wired has an update on Adams' condition. Apparently a few days after he wrote the blog post above, Adams had a relapse and waited almost two more years for a surgical procedure that helped him.
As you get smarter, how you use your memory changes. "Verbatim memory is often a property of being a novice. As people become smarter, they start to put things into categories, and one of the costs they pay is lower memory accuracy for individual differences."