These didn’t track as AI-generated at first…and then I tried to read the text — THE STANFORD PRESERIBENT. You can see the whole set on Bluesky (if you have access).
The late psychologist Mihaly Csikszentmihalyi identified and popularized the concept of flow and also did research around the linked ideas of creativity and happiness. In his book Creativity: Flow and the Psychology of Discovery and Invention, he listed 10 pairs of contradictory traits that creative people tend to have.
1. Creative individuals have a great deal of physical energy, but they are also often quiet and at rest.
2. Creative individuals tend to be smart, yet also naive at the same time.
3. A third paradoxical trait refers to the related combination of playfulness and discipline, or responsibility and irresponsibility.
4. Creative individuals alternate between imagination and fantasy at one end, and a rooted sense of reality at the other.
5. Creative people seem to harbor opposite tendencies on the continuum between extroversion and introversion.
6. Creative individuals are also remarkably humble and proud at the same time.
7. Creative individuals to a certain extent escape this rigid gender role stereotyping [of ‘masculine’ and ‘feminine’].
8. Creative people are both traditional and conservative and at the same time rebellious and iconoclastic.
9. Creative persons are very passionate about their work, yet they can be extremely objective about it as well.
10. The openness and sensitivity of creative individuals often exposes them to suffering and pain yet also a great deal of enjoyment.
I don’t know if this is comforting or what, but psychologist Steven Taylor published a book two months before the start of the Covid-19 pandemic called The Psychology of Pandemics that predicted many of the behaviors we’ve been seeing over the past 18+ months, including masking backlash, the acceptance of conspiracy theories, vaccine resistance, and wholesale denial that the pandemic is even happening.
Taylor would know because he predicted it. He wrote a remarkable little book back in 2019 called “The Psychology of Pandemics.” Its premise is that pandemics are “not simply events in which some harmful microbe ‘goes viral,’” but rather are mass psychological phenomena about the behaviors, attitudes and emotions of people.
The book came out pre-COVID and yet predicts every trend and trope we’ve been living for 19 months now: the hoarding of supplies like toilet paper at the start; the rapid spread of “unfounded rumors and fake news”; the backlash against masks and vaccines; the rise and acceptance of conspiracy theories; and the division of society into people who “dutifully conform to the advice of health authorities” — sometimes compulsively so — and those who “engage in seemingly self-defeating behaviors such as refusing to get vaccinated.”
He has no crystal ball, he says, it’s just that all of this has happened before. A lot of people believed the Spanish flu pandemic of 1918 was spread by the Germans through Bayer aspirin. It’s all based on basic psychology as to how people react to health emergencies.
The denialists and refuseniks today are engaging in what the psychology field calls “psychological reactance.” It’s “a motivational response to rules, regulations, or attempts at persuasion that are perceived as threatening one’s autonomy and freedom of choice,” the book describes. Think what happens when someone says “Eat your broccoli.”
Following onto that is what psychologists term “motivated reasoning.” That’s when people stick with their story even if the facts obviously are contrary to it, as a form of “comforting delusion,” Taylor says. The book covers “unrealistic optimism bias,” in which people in pandemics are prone to convincing themselves that it can’t or won’t happen to them.
The book almost wasn’t even released at all — Taylor’s publisher told him the book was “interesting, but no one’s going to want to read it”.
I grew up in Wisconsin, and have lived in Iowa, Minnesota, and New York. Except for a two-year stint in the Bay Area, I’ve experienced winter — real winter, with lots of snow, below-freezing temperatures, and little daylight — every year of my life and never had a problem with it. So I was surprised when my last two Vermont winters put me on my ass. In winter 2017-18, I was depressed and anxious, wasn’t getting out of bed in the morning, spent endless time on my phone doing nothing, and had trouble focusing on my work. And I didn’t realize what it was until the first nice spring day came, 70 and sunny, and it hit me: “holy shit, I’ve been depressed because of winter.” I felt wonderful for the next 5 months, like a completely different person. Then last year, I was so anxious that it would happen again that all of those symptoms came back worse, starting basically a week into fall.
Nothing helped: I tried getting outside more, spent more time with friends, got out to meet new people, traveled to warm places, took photos of VT’s beautiful winter landscapes, spent time in cities, cut back on alcohol, and prioritized sleep. Last year I skied more than ever before and enjoyed it more than I ever had. Didn’t matter. This stuff worked during the spring and summer but my winter malaise was seemingly impenetrable. The plan for this fall was to try a SAD lamp, therapy, maybe drugs, and lots more warm travel. But then something interesting happened.
Sometime this fall — using a combination of Stoicism, stubbornness, and a sort of magical thinking that Jason-in-his-30s would have dismissed as woo-woo bullshit — I decided that because I live in Vermont, there is nothing I can do about it being winter, so it was unhelpful for me to be upset about it. I stopped complaining about it getting cold and dark, I stopped dreading the arrival of snow. I told myself that I just wasn’t going to feel like I felt in the summer and that’s ok — winter is a time for different feelings. As Matt Thomas wrote, I stopped fighting the winter vibe and tried to go with it:
Fall is a time to write for me as well, but it also means welcoming — rather than fighting against — the shorter days, the football games, the decorative gourds. Productivity writer Nicholas Bate’s seven fall basics are more sleep, more reading, more hiking, more reflection, more soup, more movies, and more night sky. I like those too. The winter will bring with it new things, new adjustments. Hygge not hay rides. Ditto the spring. Come summer, I’ll feel less stress about stopping work early to go to a barbecue or movie because I know, come autumn, I’ll be hunkering down. More and more, I try to live in harmony with the seasons, not the clock.
Last night, I read this Fast Company piece on some research done by Kari Leibowitz about how people in near-polar climates avoid seasonal depression and it really resonated with this approach that I’d stumbled upon.
At first, she was asking “Why aren’t people here more depressed?” and if there were lessons that could be taken elsewhere. But once she was there, “I sort of realized that that was the wrong question to be asking,” she says. When she asked people “Why don’t you have seasonal depression?” the answer was “Why would we?”
It turns out that in northern Norway, “people view winter as something to be enjoyed, not something to be endured,” says Leibowitz, and that makes all the difference.
The people in the Norwegian communities Leibowitz studied got outside as much as they could — “there’s no such thing as bad weather, only bad clothing” — spent their time indoors being cozy, came together in groups, and marveled at winter’s beauty. I’d tried all that stuff my previous two winters but what seems to have moved the needle for me this year is a shift in mindset.
As I experienced firsthand Tromsø residents’ unique relationship to winter, a serendipitous conversation with Alia Crum, assistant professor of psychology at Stanford University, inspired me to consider mindset as a factor that might influence Tromsø residents’ sunny perspective of the sunless winter. Crum defines mindsets as the “lenses through which information is perceived, organized and interpreted.” Mindsets serve as an overarching framework for our everyday experiences — and they can profoundly influence how we react in a variety of situations.
Crum’s work has shown that mindsets significantly influence both our physical and mental health in areas as diverse as exercise, stress and diet. For example, according to Crum’s research, individuals can hold the mindset that stress is either debilitating (bad for your health and performance) or enhancing (motivating and performance-boosting). The truth is that stress is both; it can cause athletes to crumble under pressure and lead CEOs to have heart attacks, but it can also sharpen focus and critical thinking, giving athletes, CEOs and the rest of us the attention and adrenaline to succeed in high-pressure situations. According to Crum’s work, instead of the mere presence of stress, it is our mindset about stress — whether or not we perceive it as a help or a hindrance — that contributes most to health, performance and psychological outcomes.
This is the woo-woo bullshit I referred to earlier, the sort of thing that always brings to my mind the advice of self-help gurus embodied by The Simpsons’ Troy McClure urging his viewers to “get confident, stupid!” Is the secret to feeling happy really just to feel happy? It sounds ridiculous, right? This is the bit of the Fast Company piece that resonated with me like a massive gong:
But overall, mindset research is increasingly finding that it doesn’t take much to shift one’s thinking. “It doesn’t have to be this huge complicated thing,” says Leibowitz. “You can just consciously try to have a positive wintertime mindset and that might be enough to induce it.”
So how has this tiny shift in mindset been working for me so far? It’s only mid-November — albeit a mid-November where it’s already been 5°F, has been mostly below freezing for the past week, and with a good 6 inches of snow on the ground — but I have been feeling not only not bad, but actually good. My early fall had some seasonally-unrelated tough moments, but I’ve experienced none of last year’s pre-winter despondency. I’m looking forward to the start of skiing, especially since my kids are so jazzed up about it. I don’t currently have any trips planned (just got back from warm & sunny Mexico and am glad to be home even though the trip was great), but I’m definitely eager to start prepping for something in January. I’ve had more time for reading, watching some interesting TV, eating rich foods, making apple pie, and working. I went for a 6-mile walk in the freezing cold with a friend and it was delightful. And I’m already looking forward to spring and summer as well. It’s comforting to know that warmer weather and longer days are waiting for me in the distance, when I can do more of what I want to do and feel more like my true self. But in the meantime, pass the cocoa and I’ll see you on the slopes.
Francis Galton, a Victorian eugenicist and statistician, was obsessed with measuring reaction time as a proxy for general intelligence. In 1885, 1890, and 1892, he collected “data on the sensory, psychomotor, and physical attributes of 1,639 females and 4,849 males.” Eventually, though, reaction time gave way to other questionable measurements of generalized intelligence like IQ tests and scholastic aptitude scores, so most of us don’t keep track of our reaction times, if we’ve ever had them measured at all.
Here’s the thing, though: no one who has tried to repeat Galton’s experiments in the 20th and 21st centuries, across populations and with varying equipment and measurement procedures, has been able to get reaction times as fast as what Galton measured. IQ scores have generally risen over time; reaction times have slowed down. It’s a matter of milliseconds, but the effect is large: about 10 percent. It is quite possible that young adults in 19th century Great Britain were just plain faster than us.
What are we to make of this? Normally we wouldn’t put much weight on a single study, even one with 3000 participants, but there aren’t many alternatives. It isn’t as if we can have access to young adults born in the 19th century to check if the result replicates. It’s a shame there aren’t more intervening studies, so we could test the reasonable prediction that participants in the 1930s should be about halfway between the Victorian and modern participants.
And, even if we believe this datum, what does it mean? A genuine decline in cognitive capacity? Excess cognitive load on other functions? Motivational changes? Changes in how experiments are run or approached by participants? I’m not giving up on the kids just yet.
In a lesson for TED-Ed, David Dunning explains the Dunning-Kruger Effect, a cognitive bias in which people with lesser abilities tend to rate themselves as more proficient than they are.
Interestingly, this effect not only applies to those with lower abilities thinking they are better but also to experts who think they’re not exceptional. That is, the least & most skilled groups are both deficient in their ability to evaluate their skills.
In 1999, in the Journal of Personality and Social Psychology, my then graduate student Justin Kruger and I published a paper that documented how, in many areas of life, incompetent people do not recognize — scratch that, cannot recognize — just how incompetent they are, a phenomenon that has come to be known as the Dunning-Kruger effect. Logic itself almost demands this lack of self-insight: For poor performers to recognize their ineptitude would require them to possess the very expertise they lack. To know how skilled or unskilled you are at using the rules of grammar, for instance, you must have a good working knowledge of those rules, an impossibility among the incompetent. Poor performers — and we are all poor performers at some things — fail to see the flaws in their thinking or the answers they lack.
What’s curious is that, in many cases, incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge.
Confidence feels like knowledge. I feel like that simple statement explains so much about the world.
In closing, I’ll just note that thinking you’re impervious to the Dunning-Kruger Effect is itself an example of the Dunning-Kruger Effect in action. (via open culture)
The marshmallow test is a famous psychological experiment designed by Walter Mischel in the 1960s. Kids were given a single marshmallow but told they could have another if they refrained from eating the first one for 15 minutes. The results seemed to indicate a much greater degree of self-control amongst those children who were able to delay gratification, which led to better outcomes in their lives. From a New Yorker article about Mischel:
Once Mischel began analyzing the results, he noticed that low delayers, the children who rang the bell quickly, seemed more likely to have behavioral problems, both in school and at home. They got lower S.A.T. scores. They struggled in stressful situations, often had trouble paying attention, and found it difficult to maintain friendships. The child who could wait fifteen minutes had an S.A.T. score that was, on average, two hundred and ten points higher than that of the kid who could wait only thirty seconds.
Ultimately, the new study finds limited support for the idea that being able to delay gratification leads to better outcomes. Instead, it suggests that the capacity to hold out for a second marshmallow is shaped in large part by a child’s social and economic background — and, in turn, that that background, not the ability to delay gratification, is what’s behind kids’ long-term success.
If you’re poor, you might look at the promise of future food somewhat dubiously…and not because of a lack of self-control:
The failed replication of the marshmallow test does more than just debunk the earlier notion; it suggests other possible explanations for why poorer kids would be less motivated to wait for that second marshmallow. For them, daily life holds fewer guarantees: There might be food in the pantry today, but there might not be tomorrow, so there is a risk that comes with waiting. And even if their parents promise to buy more of a certain food, sometimes that promise gets broken out of financial necessity.
Leatrice Eiseman, Pantone Color Institute’s executive director, teaches an annual class on trend forecasting and the psychology of color. She joined Pantone after publishing her 1983 book “Alive With Color,” and she created the color clock concept.
Eiseman believes that our reaction to colors “goes beyond the psychological into the physiological” and that colors carry inherent messages that all humans innately understand — the whispers of that “ancient wisdom.” She doesn’t deny the important influence of memory and social factors on color perception, but often, she says, “our response is involuntary, and we simply have no control over it.”
Last October, Eiseman published her 10th book, “The Complete Color Harmony, Pantone Edition,” her boldest statement yet on the psychology of color — and one that might rightly be displayed in the self-help section. Consider a chapter titled, “Personal Colors: What Do They Say About You?” which offers a kind of chromatic horoscope that locates truths not in the cosmos but in the spectrum of visible light.
Cognitive biases are systematic ways in which people deviate from rationality in making judgements. Wikipedia maintains a list of such biases, and one example is survivorship bias, the tendency to focus on those things or people which succeed in an endeavor and discount the experiences of those which did not.
A commonly held opinion in many populations is that machinery, equipment, and goods manufactured in previous generations often is better built and lasts longer than similar contemporary items. (This perception is reflected in the common expression “They don’t make ‘em like they used to.”) Again, because of the selective pressures of time and use, it is inevitable that only those items which were built to last will have survived into the present day. Therefore, most of the old machinery still seen functioning well in the present day must necessarily have been built to a standard of quality necessary to survive. All of the machinery, equipment, and goods that have failed over the intervening years are no longer visible to the general population as they have been junked, scrapped, recycled, or otherwise disposed of.
Buster Benson recently went through the list of biases and tried to simplify them into some sort of structure. What he came up with is a list of four conundrums — “4 qualities of the universe that limit our own intelligence and the intelligence of every other person, collective, organism, machine, alien, or imaginable god” — that lead to all biases. They are:
1. There’s too much information.
2. There’s not enough meaning.
3. There’s not enough time and resources.
4. There’s not enough memory.
The 2nd conundrum is that the process of turning raw information into something meaningful requires connecting the dots between the limited information that’s made it to you and the catalog of mental models, beliefs, symbols, and associations that you’ve stored from previous experiences. Connecting dots is an imprecise and subjective process, resulting in a story that’s a blend of new and old information. Your new stories are being built out of the bricks of your old stories, and so will always have a hint of past qualities and textures that may not have actually been there.
For each conundrum in Benson’s scheme, there are categories of bias, 20 in all. For example, the categories related to the “not enough meaning” conundrum are:
1. We find stories and patterns even in sparse data.
2. We fill in characteristics from stereotypes, generalities, and prior histories whenever there are new specific instances or gaps in information.
3. We imagine things and people we’re familiar with or fond of as better than things and people we aren’t familiar with or fond of.
4. We simplify probabilities and numbers to make them easier to think about.
5. We project our current mindset and assumptions onto the past and future.
Benson’s whole piece is worth a read, but if you spend too much time with it, you might become unable to function because you’ll start to see cognitive biases everywhere.
In 2000, the BBC broadcast an hour-long documentary called Five Steps to Tyranny, a look at how ordinary people can do monstrous things in the presence of authority.
Horrific things happen in the world we live in. We would like to believe only evil people carry out atrocities. But tyrannies are created by ordinary people, like you and me.
[Colonel Bob Stewart:] “I’d never been to the former Yugoslavia before in my life, so what actually struck me about the country was how beautiful it was, how nice people were, and yet how ghastly they could behave.”
The five steps are:
“us” and “them” (prejudice and the formation of a dominant group)
obey orders (the tendency to follow orders, especially from those with authority)
do “them” harm (obeying an authority who commands actions against our conscience)
“stand up” or “stand by” (standing by as harm occurs)
exterminate (the systematic elimination of “them”)
To illustrate the “obey orders” step, the program revisits Stanley Milgram’s famous obedience experiment. The “teacher” is told to administer an electric shock every time the “learner” makes a mistake, increasing the level of shock each time. There were 30 switches on the shock generator marked from 15 volts (slight shock) to 450 volts (danger — severe shock).
The “learners” were in on the experiment and weren’t actually shocked but were told to react as if they were. The results?
65% (two-thirds) of participants (i.e. teachers) continued to the highest level of 450 volts. All the participants continued to 300 volts.
The program also shows how real-life tyrannies have developed in places like Rwanda, Burma, and Bosnia. From a review of the show in The Guardian:
But there is no doubt about the programme’s bottom line: tyrannies happen because ordinary people are surprisingly willing to do tyranny’s dirty work.
Programmes like this can show such things with great vividness — and there is news footage from Bosnia, or from Rwanda, or from Burma to back it up with terrible clarity. It isn’t clear why the majority is so often compliant, but the implication is that democracy should always be grateful to the protesters, the members of the awkward squad, the people who challenge authority.
But don’t take it for granted that the awkward squad must be a force for good: in Germany, in the 1920s, Hitler was an outsider, a protester, a member of the awkward squad. When he came to power in 1933, he found that German medical professors and biologists had already installed a racial ideology for him, one which had already theorised about the elimination of sick or disabled German children, and the rejection of Jewish professionals as agents of pollution.
Zimbardo himself offers this final word in the program:
For me the bottom line message is that we could be led to do evil deeds. And what that means is to become sensitive to the conditions under which ordinary people can do these evil deeds — what we have been demonstrating throughout this program — and to take a position of resisting tyranny at the very first signs of its existence.
By collecting troves of data on how users play their games, developers have mastered the science of applied addiction. And with the rise of “freemium” games that rely on micro-transactions, they have good reason to deploy the tools of behavioral psychology to inspire purchases.
To maximize the efficacy of a coercive monetization model, you must use a premium currency, ideally with the ability to purchase said currency in-app. Making the consumer exit the game to make a purchase gives the target’s brain more time to figure out what you are up to, lowering your chances of a sale. If you can set up your game to allow “one button conversion”, such as in many iOS games, then obviously this is ideal. The same effect is seen in real world retail stores where people buying goods with cash tend to spend less than those buying with credit cards, due to the layering effect.
Purchasing in-app premium currency also allows the use of discounting, such that premium currency can be sold for less per unit if it is purchased in bulk. Thus a user that is capable of doing basic math (handled in a different part of the brain that develops earlier) can feel the urge to “save money” by buying more. The younger the consumer, the more effective this technique is, assuming they are able to do the math. Thus you want to make the numbers on the purchase options very simple, and you can also put banners on bigger purchases telling the user how much more they will “save” on big purchases to assist very young or otherwise math-impaired customers.
Having the user see their amount of premium currency in the interface is also much less anxiety generating, compared to seeing a real money balance. If real money was used (no successful game developer does this) then the consumer would see their money going down as they play and become apprehensive. This gives the consumer more opportunities to think and will reduce revenues.
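To make the “save money by buying more” framing concrete, here’s a minimal sketch of the bulk-discount arithmetic the passage describes. The bundle prices and currency amounts below are invented for illustration; they don’t come from any particular game:

```python
# Hypothetical premium-currency bundles: (price in USD, "gems" granted).
# These numbers are made up for illustration only.
BUNDLES = [
    (0.99, 100),
    (4.99, 550),
    (19.99, 2400),
    (99.99, 13000),
]

def per_unit_cost(price, gems):
    """Dollars paid per gem for a given bundle."""
    return price / gems

# The smallest bundle sets the baseline price per gem; larger bundles are
# then framed as a percentage "saved" relative to that baseline.
baseline = per_unit_cost(*BUNDLES[0])

for price, gems in BUNDLES:
    unit = per_unit_cost(price, gems)
    savings = (1 - unit / baseline) * 100
    print(f"${price:>6.2f} -> {gems:>6} gems   "
          f"{unit * 100:.3f}¢/gem   banner: save {savings:.0f}%")
```

The “savings” in the banner only exist relative to the smallest bundle’s per-unit price, which is exactly the comparison the store screen nudges you to make.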
On the topic of in-app purchases, Griffiths says, “The introduction of in-game virtual goods and accessories (that people pay real money for) was a psychological masterstroke.”
“It becomes more akin to gambling, as social gamers know that they are spending money as they play with little or no financial return,” he continues. “The one question I am constantly asked is why people pay real money for virtual items in games like FarmVille. As someone who has studied slot machine players for over 25 years, the similarities are striking.”
Griffiths argues that the real difference between pure gambling games and some free-to-play games is the fact that gambling games allow you to win your money back, adding an extra dimension that can potentially drive revenues even further.
Candy Crush Saga was actually designed by an economist to demonstrate how people don’t understand the concept of sunk cost.
Update: In 2009, Chris Anderson wrote a book called Free: The Future of a Radical Price in which he argued that freemium was going to be an important business model.
The online economy offers challenges to traditional businesses as well as incredible opportunities. Chris Anderson makes the compelling case that in many instances businesses can succeed best by giving away more than they charge for. Known as “Freemium,” this combination of free and paid is emerging as one of the most powerful digital business models. In Free, Chris Anderson explores this radical idea for the new global economy and demonstrates how it can be harnessed for the benefit of consumers and businesses alike. In the twenty-first century, Free is more than just a promotional gimmick: It’s a business strategy that is essential to a company’s successful future.
Social: Social introversion is the closest to the commonly held understanding of introversion, in that it’s a preference for socializing with small groups instead of large ones. Or sometimes, it’s a preference for no group at all — solitude is often preferable for those who score high in social introversion. “They prefer to stay home with a book or a computer, or to stick to small gatherings with close friends, as opposed to attending large parties with many strangers,” Cheek said. But it’s different from shyness, in that there’s no anxiety driving the preference for solitude or small groups.
I took the quiz at the bottom of the article and I’m a mix of roughly equal parts social, restrained, and anxious introversion with a dash of thinking.
Psychologists Dacher Keltner and Paul Ekman served as scientific consultants during the production of Pixar’s Inside Out. Keltner studies the origins of human emotion and Ekman pioneered research of microexpressions. In this NY Times piece, they discuss the science behind the movie.
Those quibbles aside, however, the movie’s portrayal of sadness successfully dramatizes two central insights from the science of emotion.
First, emotions organize — rather than disrupt — rational thinking. Traditionally, in the history of Western thought, the prevailing view has been that emotions are enemies of rationality and disruptive of cooperative social relations.
…
Second, emotions organize — rather than disrupt — our social lives. Studies have found, for example, that emotions structure (not just color) such disparate social interactions as attachment between parents and children, sibling conflicts, flirtations between young courters and negotiations between rivals.
I’ve thought about Inside Out every day since I saw it. Pixar clearly did their homework on the emotional stuff and it paid off.
From filmmaker Adam Curtis, a four-part documentary series on “how those in power have used Freud’s theories to try and control the dangerous crowd in an age of mass democracy”. Here’s part one:
This is a powerful and arresting documentary series — I ended up watching all four episodes back to back in a marathon effort. It was that gripping. I had felt similarly about his more recent documentary about the rise of neoconservatism and Arab fundamentalism and the similarity in their techniques for recruiting followers (and their mutual need of each other in that project) — but ‘The Century of the Self’ (TCS from now on) is much grander in its scope. It seeks to analyse the different conceptions of the self in the twentieth century, and how these conceptions were ultimately used by corporations to manipulate consumers into purchasing their products. Curtis takes large swipes at corporate capitalism in this documentary, but his target is even wider than this — he seeks to tell a story about the relationship between the differing conceptions of individualism and the capitalist, democratic institutions (corporations and governments) which organise themselves around these conceptions.
The Marshmallow Test was developed by psychologist Walter Mischel to study self-control and delayed gratification. From a piece about Mischel in the New Yorker:
Once Mischel began analyzing the results, he noticed that low delayers, the children who rang the bell quickly, seemed more likely to have behavioral problems, both in school and at home. They got lower S.A.T. scores. They struggled in stressful situations, often had trouble paying attention, and found it difficult to maintain friendships. The child who could wait fifteen minutes had an S.A.T. score that was, on average, two hundred and ten points higher than that of the kid who could wait only thirty seconds.
The world’s leading expert on self-control, Walter Mischel has proven that the ability to delay gratification is critical for a successful life, predicting higher SAT scores, better social and cognitive functioning, a healthier lifestyle and a greater sense of self-worth. But is willpower prewired, or can it be taught?
In The Marshmallow Test, Mischel explains how self-control can be mastered and applied to challenges in everyday life — from weight control to quitting smoking, overcoming heartbreak, making major decisions, and planning for retirement. With profound implications for the choices we make in parenting, education, public policy and self-care, The Marshmallow Test will change the way you think about who we are and what we can be.
Here’s a video of the test in action:
Update: A recent study showed that the environment in which the test is performed is important.
Now a new study demonstrates that being able to delay gratification is influenced as much by the environment as by innate ability. Children who experienced reliable interactions immediately before the marshmallow task waited on average four times longer — 12 versus three minutes — than youngsters in similar but unreliable situations.
You don’t know what you would do unless you’re in that situation.
That’s Philip Zimbardo’s¹ introduction to this fascinating and deeply disturbing video, depicting a real-world instance of Stanley Milgram’s experiment on obedience to authority figures². In the video, you see a McDonald’s manager take a phone call from a man pretending to be a police officer. The caller orders the manager to strip search an employee. And then much much worse.
The video is NSFW and if you’re sensitive to descriptions and depictions of sexual abuse, you may want to skip it. And lest you think this was an isolated incident featuring exceptionally weak-minded people, the same caller was alleged to have made several other calls resulting in similar behavior. (via mr)
¹ Zimbardo conducted the notorious Stanford prison experiment in 1971.
² Milgram’s experiment focused on a person in authority ordering someone to deliver (fake) electric shocks to a third person. Some participants continued to deliver the shocks as ordered even when the person being shocked yelled in pain and complained of a heart condition.
According to developmental psychologist Richard Tremblay, violent criminals are basically toddlers who never grew up and never outgrew their tendency to use physical aggression to get what they want.
The study tracked behavior in 1,037 mostly disadvantaged Quebec schoolboys from kindergarten through age 18. The boys fell into four distinct trajectories of physical aggression.
The most peaceable 20 percent, a “no problem” group, showed little physical aggression at any age; two larger groups showed moderate and high rates of aggression as preschoolers. In these three groups violence fell through childhood and adolescence, and dropped to almost nothing when the boys reached their 20s.
A fourth group, about 5 percent, peaked higher during toddlerhood and declined far more slowly. Their curve was more plateau than hill.
As they moved into late adolescence and young adulthood, their aggression grew ever more dangerous, and it tailed off late. At age 17 they were four times as physically aggressive as the moderate group and committed 14 times as many criminal infractions. It’s these chronically violent individuals, Dr. Tremblay says, who are responsible for most violent crime.
In a 2008 paper called The Seductive Allure of Neuroscience Explanations, a group from Yale University demonstrated that including neuroscientific information in explanations of psychological phenomena makes the explanations more appealing, even if the neuroscientific info is irrelevant.
Explanations of psychological phenomena seem to generate more public interest when they contain neuroscientific information. Even irrelevant neuroscience information in an explanation of a psychological phenomenon may interfere with people’s abilities to critically consider the underlying logic of this explanation.
I don’t know if I buy this. Perhaps if the authors had explained their results relative to how the human brain functions…
After the end of the first day, I said, “There’s nothing here. Nothing’s happening.” The guards had this antiauthority mentality. They felt awkward in their uniforms. They didn’t get into the guard mentality until the prisoners started to revolt. Throughout the experiment, there was this conspiracy of denial — everyone involved was in effect denying that this was an experiment and agreeing that this is a prison run by psychologists.
There was zero time for reflection. We had to feed the prisoners three meals a day, deal with the prisoner breakdowns, deal with their parents, run a parole board. By the third day I was sleeping in my office. I had become the superintendent of the Stanford county jail. That was who I was: I’m not the researcher at all. Even my posture changes — when I walk through the prison yard, I’m walking with my hands behind my back, which I never in my life do, the way generals walk when they’re inspecting troops.
The Polar explorations were a huge mistake of the human race, an indication that the twentieth century was a mistake in its entirety. They are one of the indicators.
I think psychology and self-reflection is one of the major catastrophes of the twentieth century. A major, major mistake. And it’s only one of the mistakes of the twentieth century, which makes me think that the twentieth century in its entirety was a mistake.
Herzog backs this up with some intriguing counter-history:
The Spanish Inquisition had one goal, to eradicate all traces of Muslim faith on the soil of Spain, and hence you had to confess and proclaim the innermost deepest nature of your faith to the commission. And almost as a parallel event, explaining and scrutinizing the human soul, into all its niches and crooks and abysses and dark corners, is not doing good to humans.
We have to have our dark corners and the unexplained. We will become uninhabitable in a way an apartment will become uninhabitable if you illuminate every single dark corner and under the table and wherever—you cannot live in a house like this anymore. And you cannot live with a person anymore—let’s say in a marriage or a deep friendship—if everything is illuminated, explained, and put out on the table. There is something profoundly wrong. It’s a mistake. It’s a fundamentally wrong approach toward human beings.
But lest you think that Herzog’s rejection of the ethics of the Inquisition comes from an embrace of spiritual tolerance:
I think there should be holy war against yoga classes. It detours us from real thinking.
I said to my friend Gavin Craig the other day that with folks like Herzog, you almost have to approach them as if they’re characters in a play. Instead of asking yourself whether you like them personally or agree with the things they say, take a step back and try to admire how they’re drawn.
The article about Dan McLaughlin’s quest to go from zero-to-PGA Tour through 10,000 hours of deliberate practice got linked around a bunch yesterday. Several people who pointed to it made a typical mistake: Malcolm Gladwell wrote about the 10,000 hours theory in his book, but he did not come up with it. It is not “Gladwell’s theory” and McLaughlin is not “testing Gladwell”. The 10,000 hours theory was developed and popularized by Dr. Anders Ericsson (here for instance) — who you may have heard of from this Freakonomics piece in the NY Times Magazine — before it became a pop culture tidbit through Gladwell’s inclusion of Ericsson’s work in Outliers.
Psychology professor Daryl Bem ran some common psychology experiments backwards and detected statistically significant results that could indicate that people somehow can, uh, see into the future.
In one experiment, students were shown a list of words and then asked to recall words from it, after which they were told to type words that were randomly selected from the same list. Spookily, the students were better at recalling words that they would later type.
In another study, Bem adapted research on “priming” — the effect of a subliminally presented word on a person’s response to an image. For instance, if someone is momentarily flashed the word “ugly”, it will take them longer to decide that a picture of a kitten is pleasant than if “beautiful” had been flashed. Running the experiment back-to-front, Bem found that the priming effect seemed to work backwards in time as well as forwards.
In the “ultimatum game”, for example, people are given $100 and told to offer some of it to someone else; if the other person accepts, each keeps their portion, but if they reject the offer, nobody gets anything. On average, Americans offer just under half, which seems to say much about human notions of fairness, or the fear of making an insultingly low offer. But many cultures behave differently; the Machiguenga of Peru prefer to keep more cash or, if on the other side of the deal, to accept whatever is offered. Another example: speakers of the Mayan language of Tzeltal are among several groups more likely to describe things as east or west of each other, not on the left or right. Academics would bristle, the researchers note, if journals were renamed with titles such as Journal of Personality and Social Psychology of American Undergraduate Psychology Students. But perhaps they should be.
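As an aside, the ultimatum game’s payoff rule is simple enough to write down. Here is a minimal sketch of the rule exactly as the passage describes it; the $100 stake, the offer amount, and the function name are illustrative choices, not anything from the underlying studies:

```python
def ultimatum_payoffs(stake, offer, accepted):
    """Payoffs (proposer, responder) for one round of the ultimatum game.

    The proposer offers `offer` out of `stake`; if the responder accepts,
    each keeps their share, otherwise both get nothing.
    """
    if not 0 <= offer <= stake:
        raise ValueError("offer must be between 0 and the stake")
    if accepted:
        return stake - offer, offer
    return 0, 0

# Example: a typical American-style offer of just under half of $100.
print(ultimatum_payoffs(100, 45, accepted=True))   # (55, 45)
print(ultimatum_payoffs(100, 45, accepted=False))  # (0, 0)
```

The interesting part, of course, is not the rule but the behavior: a purely “rational” responder would accept any nonzero offer, yet real people routinely reject offers they consider unfair.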
The continued reports from Chile about those miners trapped in the mine are kind of fascinating. Here’s an article about the battle between the miners and the doctors, psychologists, and government officials attempting to manage them from afar.
In an effort to dominate the miners, the team of psychologists led by Mr Iturra has instituted a series of prizes and punishments. When the miners behave well, they are given TV and mood music. Other treats — like images of the outside world — are being held in reserve, as either a carrot or a stick should the miners become unduly feisty.
In a show of strength, the miners have at times refused to listen to the psychologists, insisting that they are well. “When that happens, we have to say, ‘OK, you don’t want to speak with psychologists? Perfect. That day you get no TV, there is no music — because we administer these things,’” said Dr Diaz. “And if they want magazines? Well, then they have to speak to us. This is a daily arm wrestle.”
A two-part series (one, two) on using psychological techniques to improve your creativity.
Interviews with 22 Nobel Laureates in physiology, chemistry, medicine and physics as well as Pulitzer Prize-winning writers and other artists have found a surprising similarity in their creative processes (Rothenberg, 1996).
Called ‘Janusian thinking’ after the two-faced Roman god Janus, it involves conceiving of multiple simultaneous opposites. Integrative ideas emerge from juxtapositions, which are usually not obvious in the final product, theory or artwork.
Physicist Niels Bohr may have used Janusian thinking to conceive the principle of complementarity in quantum theory (that light can be analysed as either a wave or a particle, but never simultaneously as both).
Among primates, only humans masturbate. Why is that? Perhaps it’s our big…brains.
Go on, put this article aside, take a five minute break and put my challenge to the test (don’t forget to close your office door if you’re reading this at work): Just try to masturbate successfully — that is, to orgasmic completion — without casting some erotic representational target in your mind’s eye. Instead, clear your mind entirely, or think of, I don’t know, an enormous blank canvas hanging in an art gallery. And of course no porn or helpful naked co-workers are permitted for this task either.
How’d it go? Do you see the impossibility of it? This is one of the reasons, incidentally, why I find it so hard to believe that self-proclaimed asexuals who admit to masturbating to orgasm are really and truly asexual. They must be picturing something, and whatever that something is gives away their sexuality.
If I were given carte blanche to write about any topic I could, it would be about how much our ignorance, in general, shapes our lives in ways we do not know about. Put simply, people tend to do what they know and fail to do that which they have no conception of. In that way, ignorance profoundly channels the course we take in life. And unknown unknowns constitute a grand swath of everybody’s field of ignorance.
This is part one of a five-part series in which we hear from David Dunning about the Dunning-Kruger Effect.
When people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden: Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it.
A fascinating 10-minute animated talk by Philip Zimbardo about the different “time zones” or “time perspectives” that people can have and how the different zones affect people’s world views.
The six different time zones are:
- Past positive: focus is on the “good old days”, past successes, nostalgia, etc.
- Past negative: focus on regret, failure, all the things that went wrong
- Present hedonistic: living in the moment for pleasure and avoiding pain, seek novelty and sensation
- Present fatalistic: life is governed by outside forces, “it doesn’t pay to plan”
- Future: focus is on learning to work rather than play
- Transcendental Future: life begins after the death of the mortal body
Find out which time zone you’re in by taking this survey.
The early meetings were stormy. “You oughta worship me, I’ll tell you that!” one of the Christs yelled. “I will not worship you! You’re a creature! You better live your own life and wake up to the facts!” another snapped back. “No two men are Jesus Christs. … I am the Good Lord!” the third interjected, barely concealing his anger.
Factors that influence conformity include mood, group size, authority, and social approval.
People use conformity to ingratiate themselves with others. Conforming also makes people feel better about themselves by bolstering self-confidence. Some people have a greater need for liking from others so are more likely to conform.
Have you noticed that nonconformers are less likely to care what other people think of them? Nonconformity and self-confidence go hand-in-hand.