
kottke.org posts about science

The Dunning-Kruger Effect: we are all confident idiots

posted by Jason Kottke   Jun 27, 2018

In a lesson for TED-Ed, David Dunning explains the Dunning-Kruger Effect, a cognitive bias in which people with lesser abilities tend to rate themselves as more proficient than they are.

Interestingly, this effect not only applies to those with lower abilities thinking they are better but also to experts who think they’re not exceptional. That is, the least & most skilled groups are both deficient in their ability to evaluate their skills.

Dunning also wrote a longer piece for Pacific Standard on the phenomenon.

In 1999, in the Journal of Personality and Social Psychology, my then graduate student Justin Kruger and I published a paper that documented how, in many areas of life, incompetent people do not recognize — scratch that, cannot recognize — just how incompetent they are, a phenomenon that has come to be known as the Dunning-Kruger effect. Logic itself almost demands this lack of self-insight: For poor performers to recognize their ineptitude would require them to possess the very expertise they lack. To know how skilled or unskilled you are at using the rules of grammar, for instance, you must have a good working knowledge of those rules, an impossibility among the incompetent. Poor performers — and we are all poor performers at some things — fail to see the flaws in their thinking or the answers they lack.

What’s curious is that, in many cases, incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge.

Confidence feels like knowledge. I feel like that simple statement explains so much about the world.

See also Errol Morris’ series for the NY Times about humanity’s unknown unknowns.

In closing, I’ll just note that thinking you’re impervious to the Dunning-Kruger Effect is itself an example of the Dunning-Kruger Effect in action. (via open culture)

James Hansen’s 1988 climate predictions have proved to be remarkably accurate

posted by Jason Kottke   Jun 25, 2018

In 1988, Dr. James Hansen testified in front of Congress about the future dangers of climate change caused by human activity. That same year, Hansen and his team at the Goddard Institute for Space Studies released a study detailing three scenarios for possible future warming. Their middle-of-the-road prediction has proved remarkably accurate over the past 30 years.

Hansen Warming Trend

Changes in the human effects that influence Earth’s global energy imbalance (a.k.a. ‘anthropogenic radiative forcings’) have in reality been closest to Hansen’s Scenario B, but about 20-30% weaker thanks to the success of the Montreal Protocol in phasing out chlorofluorocarbons (CFCs). Hansen’s climate model projected that under Scenario B, global surface air temperatures would warm about 0.84°C between 1988 and 2017. But with a global energy imbalance 20-30% lower, it would have predicted a global surface warming closer to 0.6-0.7°C by this year.

The actual 1988-2017 temperature increase was about 0.6°C. Hansen’s 1988 global climate model was almost spot-on.
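That adjustment is easy to check against the numbers quoted above. A back-of-the-envelope sketch (the figures come straight from the excerpt; nothing else is assumed):

```python
# Hansen's Scenario B projected 0.84 deg C of warming for 1988-2017.
scenario_b = 0.84

# Actual forcings came in 20-30% weaker than Scenario B assumed,
# so scale the projection down accordingly.
adjusted_low = scenario_b * (1 - 0.30)
adjusted_high = scenario_b * (1 - 0.20)

print(f"adjusted projection: {adjusted_low:.2f}-{adjusted_high:.2f} deg C")
# -> adjusted projection: 0.59-0.67 deg C, vs. ~0.6 deg C observed
```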

Scientists have known this was happening for decades and have been telling our government officials about it for more than 30 years. Our present inaction on a national level on this is shameful and “the global poor, the disenfranchised, the young, and the yet-to-be-born” will soon pay the price.

See also a brief history of America’s shameful inaction on climate change.

There’s no scientific or genetic basis for race

posted by Jason Kottke   Jun 21, 2018

Elizabeth Kolbert writing for National Geographic: There’s No Scientific Basis for Race — It’s a Made-Up Label.

“What the genetics shows is that mixture and displacement have happened again and again and that our pictures of past ‘racial structures’ are almost always wrong,” says David Reich, a Harvard University paleogeneticist whose new book on the subject is called Who We Are and How We Got Here. There are no fixed traits associated with specific geographic locations, Reich says, because as often as isolation has created differences among populations, migration and mixing have blurred or erased them.

She also observes that there’s more genetic diversity within Africa than in all the other continents combined (which is what happens when the rest of the world’s population descends from a relatively small group that left Africa 60,000 years ago).

How the Earth’s continents will look 250 million years from now

posted by Jason Kottke   Jun 13, 2018

Speaking of Pangaea, this video shows how the present-day continents formed from the Pangaea supercontinent, which existed about 240 million years ago, and then projects what the Earth’s surface might look like 250 million years in the future, if the tectonic plates continue to move in predictable ways.

I hope this explanation is helpful. Of course all of this is scientific speculation, we will have to wait and see what happens, but this is my projection based on my understanding of the forces that drive plate motions and the history of past plate motions. Remember: “The past reveals patterns; Patterns inform process; Process permits prediction.”

Look at how quickly India slams into the Asian continent…no wonder the Himalayas are so high.1 And it’s interesting that we’re essentially bookended by two supercontinents, the ancient Pangaea and Pangaea Proxima in the future.

  1. Though they may not be able to grow much more. Erosion and gravity work to keep the maximum height in check.

Flat Earthers and the double-edged sword of American magical thinking

posted by Jason Kottke   Jun 12, 2018

Alan Burdick recently wrote a piece for The New Yorker about the “burgeoning” flat Earth movement, a group of people who believe, against simple & overwhelming evidence, that the Earth is not spherical1 but flat.

If you are only just waking up to the twenty-first century, you should know that, according to a growing number of people, much of what you’ve been taught about our planet is a lie: Earth really is flat. We know this because dozens, if not hundreds, of YouTube videos describe the coverup. We’ve listened to podcasts — Flat Earth Conspiracy, The Flat Earth Podcast — that parse the minutiae of various flat-Earth models, and the very wonkiness of the discussion indicates that the over-all theory is as sound and valid as any other scientific theory. We know because on a clear, cool day it is sometimes possible, from southwestern Michigan, to see the Chicago skyline, more than fifty miles away — an impossibility were Earth actually curved. We know because, last February, Kyrie Irving, the Boston Celtics point guard, told us so. “The Earth is flat,” he said. “It’s right in front of our faces. I’m telling you, it’s right in front of our faces. They lie to us.”

John Gruber remarked on Burdick’s piece by saying:

In recent years I’ve begun to feel conflicted about the internet. On the one hand, it’s been wonderful in so many ways. I’ve personally built my entire career on the fact that the internet enables me to publish as a one-person operation. But on the other hand, before the internet, kooks were forced to exist on the fringe. There’ve always been flat-earther-types denying science and John Birch Society political fringers, but they had no means to amplify their message or bond into large movements.

Another way to put this is that all the people who bought those News of the World-style magazines from the grocery checkout — UFO sightings! Elvis lives! NASA faked the Moon landing! new treatment lets you live 200 years! etc.! — were able to find each other, organize, and mobilize because of the internet. And then they decided to elect one of themselves President.

I recently downloaded the audiobook of Kurt Andersen’s Fantasyland: How America Went Haywire: A 500-Year History and am looking forward to listening to it on my summer roadtrip. Here’s part of the synopsis:

In this sweeping, eloquent history of America, Kurt Andersen shows that what’s happening in our country today — this post-factual, “fake news” moment we’re all living through — is not something new, but rather the ultimate expression of our national character. America was founded by wishful dreamers, magical thinkers, and true believers, by hucksters and their suckers. Fantasy is deeply embedded in our DNA.

Over the course of five centuries — from the Salem witch trials to Scientology to the Satanic Panic of the 1980s, from P. T. Barnum to Hollywood and the anything-goes, wild-and-crazy sixties, from conspiracy theories to our fetish for guns and obsession with extraterrestrials — our love of the fantastic has made America exceptional in a way that we’ve never fully acknowledged. From the start, our ultra-individualism was attached to epic dreams and epic fantasies — every citizen was free to believe absolutely anything, or to pretend to be absolutely anybody.

Gruber’s point about the internet being a double-edged sword appears to be echoed here by Andersen about American individualism. Sure, this “if people disagree with you, you must be doing something right” spirit is responsible for the anti-vaxxer movement, conspiracy theories that 9/11 was an inside job & Newtown didn’t happen, climate change denialism, and anti-evolutionism, but it also gets you things like rock & roll, putting men on the Moon, and countless discoveries & inventions, including the internet.

Update: The Atlantic published an excerpt of Fantasyland last year:

I first noticed our national lurch toward fantasy in 2004, after President George W. Bush’s political mastermind, Karl Rove, came up with the remarkable phrase reality-based community. People in “the reality-based community,” he told a reporter, “believe that solutions emerge from your judicious study of discernible reality … That’s not the way the world really works anymore.” A year later, The Colbert Report went on the air. In the first few minutes of the first episode, Stephen Colbert, playing his right-wing-populist commentator character, performed a feature called “The Word.” His first selection: truthiness. “Now, I’m sure some of the ‘word police,’ the ‘wordinistas’ over at Webster’s, are gonna say, ‘Hey, that’s not a word!’ Well, anybody who knows me knows that I’m no fan of dictionaries or reference books. They’re elitist. Constantly telling us what is or isn’t true. Or what did or didn’t happen. Who’s Britannica to tell me the Panama Canal was finished in 1914? If I wanna say it happened in 1941, that’s my right. I don’t trust books — they’re all fact, no heart … Face it, folks, we are a divided nation … divided between those who think with their head and those who know with their heart … Because that’s where the truth comes from, ladies and gentlemen — the gut.”

Whoa, yes, I thought: exactly. America had changed since I was young, when truthiness and reality-based community wouldn’t have made any sense as jokes. For all the fun, and all the many salutary effects of the 1960s — the main decade of my childhood — I saw that those years had also been the big-bang moment for truthiness. And if the ’60s amounted to a national nervous breakdown, we are probably mistaken to consider ourselves over it.

(thx, david)

  1. More properly, the Earth is an oblate spheroid.

An AI learned to see in the dark

posted by Jason Kottke   Jun 05, 2018

Cameras that can take usable photos in low-light conditions are very useful but very expensive. A new paper presented at this year’s IEEE Conference on Computer Vision and Pattern Recognition shows that training an AI to process low-light photos taken with a normal camera can yield amazing results. Here’s an image taken with a Sony a7S II, a really good low-light camera, and corrected in the traditional way:

AI image in the dark

The colors are off and there’s a ton of noise. Here’s the same image, corrected by the AI program:

AI image in the dark

Pretty good, right? The effective ISO on these images has to be 1,000,000 or more. A short video shows more of their results:

It would be great to see technology like this in smartphones in a year or two.

Willpower, wealth, and the marshmallow test

posted by Jason Kottke   Jun 04, 2018

The marshmallow test is a famous psychological experiment designed by Walter Mischel in the 1960s. Kids were given a single marshmallow but told they could have another if they refrained from eating the first one for 15 minutes. The results seemed to indicate a much greater degree of self-control amongst those children who were able to delay gratification, which led to better outcomes in their lives. From a New Yorker article about Mischel:

Once Mischel began analyzing the results, he noticed that low delayers, the children who rang the bell quickly, seemed more likely to have behavioral problems, both in school and at home. They got lower S.A.T. scores. They struggled in stressful situations, often had trouble paying attention, and found it difficult to maintain friendships. The child who could wait fifteen minutes had an S.A.T. score that was, on average, two hundred and ten points higher than that of the kid who could wait only thirty seconds.

But Mischel only tested ~90 kids from a single preschool. Researchers from UC Irvine and NYU recently redid the test with a larger group of kids who were more representative of the general population, and found that household income was a big factor in explaining both the ability to delay gratification and later outcomes.

Ultimately, the new study finds limited support for the idea that being able to delay gratification leads to better outcomes. Instead, it suggests that the capacity to hold out for a second marshmallow is shaped in large part by a child’s social and economic background — and, in turn, that that background, not the ability to delay gratification, is what’s behind kids’ long-term success.

If you’re poor, you might look at the promise of future food somewhat dubiously…and not because of a lack of self-control:

The failed replication of the marshmallow test does more than just debunk the earlier notion; it suggests other possible explanations for why poorer kids would be less motivated to wait for that second marshmallow. For them, daily life holds fewer guarantees: There might be food in the pantry today, but there might not be tomorrow, so there is a risk that comes with waiting. And even if their parents promise to buy more of a certain food, sometimes that promise gets broken out of financial necessity.

An explainer video from 1923 about Einstein’s theory of relativity

posted by Jason Kottke   May 29, 2018

In 1923, Inkwell Studios1 released a 20-minute animated explanation of Albert Einstein’s theory of relativity, perhaps one of the very first scientific explainer videos ever made. Films were still silent in those days and the public’s scientific understanding was limited (the discovery of Pluto was 7 years in the future, and penicillin 5 years away), so the film is almost excruciatingly slow by today’s standards. But if you squint hard enough, you can see the great-grandparent of YouTube channels like Kurzgesagt, Nerdwriter, TED Ed, minutephysics, and the 119,000+ videos returned for an “einstein relativity explained” search on YouTube. (via open culture)

  1. Inkwell later became Fleischer Studios, which made cartoons like Betty Boop, Popeye, and the first animated Superman series. They also introduced the bouncing ball as a technique for singing along to on-screen lyrics.

A brief history of fingerprints

posted by Jason Kottke   May 29, 2018

Smudge Art

Chantel Tattoli’s piece for The Paris Review, The Surprising History (and Future) of Fingerprints, is interesting throughout, but these two things leapt from the screen (italics mine):

It is true that every print is unique to every finger, even for identical twins, who share the same genetic code. Fingerprints are formed by friction from touching the walls of our mother’s womb. Sometimes they are called “chanced impressions.” By Week 19, about four months before we are issued into the world, they are set.

WHAT?! Is this true? A cursory search shows this might indeed be the case, although it looks as though there’s no established scientific consensus around the process.

Also, Picasso was fingerprinted as a suspect in the theft of the Mona Lisa from the Louvre:

When French authorities interrogated Pablo Picasso, in 1911, at the Palais de Justice about the theft of the Mona Lisa from the Louvre that August, he was clad in his favorite red-and-white polka-dot shirt. Picasso cried. He begged forgiveness. He was in possession of two statuettes filched from the museum, but he hadn’t taken her.

“In possession of”? Turns out a pal of Picasso’s lifted the statuettes from the museum, which was notoriously easy to steal from, and sold them to the artist, who knew exactly what he was buying.

True to Pieret’s testimony, Picasso kept two stolen Iberian statues buried in a cupboard in his Paris apartment. Despite the artist’s later protestations of ignorance there could be no mistaking their origins. The bottom of each was stamped in bold: PROPERTY OF THE MUSÉE DU LOUVRE.

Fingerprint art by Evan Roth. (via @claytoncubitt)

Global warming blankets

posted by Jason Kottke   May 24, 2018

Using simple graphic representations of annual temperatures (like this one posted by climate scientist Ed Hawkins), people are knitting and crocheting blankets that show just how warm the Earth has gotten over the past few decades. See Katie Stumpf’s blanket, for example.

Global Warming Blankets

According to climate scientist (and crocheter) Ellie Highwood, these blankets are a subset of “temperature blankets” made to represent, for example, daily temperatures over the course of a year in a particular location. The blanket she crocheted used NOAA data of global mean temperature anomalies for a 101-year period ending 2016.

I then devised a colour scale using 15 different colours each representing a 0.1 °C data bin. So everything between 0 and 0.099 was in one colour for example. Making a code for these colours, the time series can be rewritten as in the table below. It is up to the creator to then choose the colours to match this scale, and indeed which years to include. I was making a baby sized blanket so chose the last 100 years, 1916-2016.

If you read her post, she provides instructions for making your own global warming blanket.
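Her binning scheme is simple enough to sketch in a few lines of Python. The 0.1 °C bins and 15 colours follow her description; the starting point of the anomaly range and the example values below are my own assumptions, not hers:

```python
def colour_bin(anomaly, minimum=-0.5, bin_width=0.1, n_bins=15):
    """Map a temperature anomaly (deg C) to a colour index 0..n_bins-1.

    Everything between minimum and minimum + bin_width gets colour 0,
    the next 0.1 deg C gets colour 1, and so on. Out-of-range years
    are clamped to the first or last colour.
    """
    # truncation toward zero is fine here: anomalies below `minimum`
    # get clamped to 0 anyway
    idx = int((anomaly - minimum) / bin_width)
    return max(0, min(n_bins - 1, idx))

# One crocheted row per year, coloured by that year's anomaly:
anomalies = [-0.32, 0.05, 0.61, 0.94]   # example NOAA-style values, deg C
rows = [colour_bin(a) for a in anomalies]
```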

P.S. You might think that with the Earth’s atmosphere getting warmer on average, these blankets would ironically be less necessary than they would have been 50 years ago. But climate change is also responsible for more extreme winter weather events — think global weirding in addition to global warming. So keep those blankets handy!

Degrees of Uncertainty

posted by Jason Kottke   May 17, 2018

Degrees of Uncertainty is an upcoming documentary by Neil Halloran that “uses data-driven animation to explore the topic of global warming”. It’s based on this XKCD comic of A Timeline of Earth’s Average Temperature.

Halloran is a creator of the excellent The Fallen of World War II interactive documentary, so I’m looking forward to seeing what he does with the topic of climate change.

Can bacteriophages rescue us from drug-resistant bacteria?

posted by Jason Kottke   May 14, 2018

Last month when I posted a video comparing the sizes of various microorganisms, I noted the weirdness of bacteriophages, which are bacteria-killing viruses that look a bit like a 20-sided die stuck on the top of a sci-fi alien’s body.

Bacteriophages are really real and terrifying…if you happen to be a bacteria. Bacteriophages attack by attaching themselves to bacteria, piercing their outer membranes, and then pumping them full of bacteriophage DNA. The phage replicates inside of the bacteria until the bacteria bursts and little baby bacteriophages are exploded out all over the place, ready to attack their own bacteria.

I couldn’t find a good explainer (video or text) about these organisms, but over the weekend, Kurzgesagt rode to the rescue with this video. In the second part of the video, they discuss whether bacteriophages might form the basis of an effective treatment for antibiotic-resistant infections.

The Finkbeiner test for gender bias in science writing

posted by Jason Kottke   Apr 27, 2018

In a 2013 piece, Christie Aschwanden suggested a test in the spirit of the Bechdel test for avoiding gender bias in profiles written about scientists who are women.

To pass the Finkbeiner test, the story cannot mention:

- The fact that she’s a woman
- Her husband’s job
- Her child care arrangements
- How she nurtures her underlings
- How she was taken aback by the competitiveness in her field
- How she’s such a role model for other women
- How she’s the “first woman to…”

Aschwanden named the test after her colleague Ann Finkbeiner, who wrote that she was going to write a piece about an astronomer without mentioning that she, the astronomer, was a woman.

Meanwhile I’m sick of writing about [gender bias in science]; I’m bored silly with it. So I’m going to cut to the chase, close my eyes, and pretend the problem is solved; we’ve made a great cultural leap forward and the whole issue is over with.

And I’m going to write the profile of an impressive astronomer and not once mention that she’s a woman. I’m not going to mention her husband’s job or her child care arrangements or how she nurtures her students or how she was taken aback by the competitiveness in her field. I’m not going to interview her women students and elicit raves about her as a role model. I’m going to be blindly, aggressively, egregiously ignorant of her gender.

I’m going to pretend she’s just an astronomer.

(via @john_overholt)

An AI can realistically “paint in” missing areas of photographs

posted by Jason Kottke   Apr 26, 2018

This video, and the paper it’s based on, is called “Image Inpainting for Irregular Holes Using Partial Convolutions” but it’s actually straight-up witchcraft! Researchers at NVIDIA have developed a deep-learning program that can automagically paint in areas of photographs that are missing. Ok, you’re saying, Photoshop has been able to do something like that for years. And the first couple of examples were like, oh that’s neat. But then the eyes are deleted from a model’s portrait and the program drew new eyes for her. Under close scrutiny, the results are not completely photorealistic, but at a glance it’s remarkably convincing. (via imperica)
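The “partial convolution” in the paper’s title refers to a convolution that only looks at valid (non-hole) pixels, renormalizes by how many of them it saw, and passes an updated hole mask to the next layer. Here’s a minimal single-channel NumPy sketch of that idea — not NVIDIA’s actual implementation, just the core operation:

```python
import numpy as np

def partial_conv(x, mask, w, b=0.0):
    """Sketch of a 'partial convolution': convolve only over valid
    (mask == 1) pixels, renormalize for the holes, and propagate an
    updated mask for the next layer."""
    kh, kw = w.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    new_mask = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            m = mask[i:i + kh, j:j + kw]
            valid = m.sum()
            if valid > 0:
                patch = x[i:i + kh, j:j + kw] * m   # zero out hole pixels
                scale = (kh * kw) / valid           # renormalize for holes
                out[i, j] = (w * patch).sum() * scale + b
                new_mask[i, j] = 1.0                # any valid pixel -> valid output
    return out, new_mask
```

With no holes at all, this reduces to an ordinary convolution; as layers stack, the valid region of the mask grows until the hole is “painted in”.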

How to harvest nearly infinite energy from a spinning black hole

posted by Jason Kottke   Apr 23, 2018

Well, this is a thing I didn’t know about black holes before watching this video. Because some black holes spin, it’s possible to harvest massive amounts of energy from them, even when all other energy sources in the far far future are gone. This process was first proposed by Roger Penrose in a 1971 paper.

The Penrose process (also called Penrose mechanism) is a process theorised by Roger Penrose wherein energy can be extracted from a rotating black hole. That extraction is made possible because the rotational energy of the black hole is located not inside the event horizon of the black hole, but on the outside of it in a region of the Kerr spacetime called the ergosphere, a region in which a particle is necessarily propelled in locomotive concurrence with the rotating spacetime. All objects in the ergosphere become dragged by a rotating spacetime. In the process, a lump of matter enters into the ergosphere of the black hole, and once it enters the ergosphere, it is forcibly split into two parts. For example, the matter might be made of two parts that separate by firing an explosive or rocket which pushes its halves apart. The momentum of the two pieces of matter when they separate can be arranged so that one piece escapes from the black hole (it “escapes to infinity”), whilst the other falls past the event horizon into the black hole. With careful arrangement, the escaping piece of matter can be made to have greater mass-energy than the original piece of matter, and the infalling piece has negative mass-energy.

This same effect can also be used in conjunction with a massive mirror to superradiate electromagnetic energy: you shoot light into a spinning black hole surrounded by mirrors, the light is repeatedly sped up by the ergosphere as it bounces off the mirror, and then you harvest the super-energetic light. After the significant startup costs, it’s basically an infinite source of free energy.
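“Nearly infinite” does have a hard ceiling, though, and it’s worth quantifying. A standard result for a Kerr black hole of mass M and dimensionless spin a* is that only the rotational energy above the “irreducible mass” can ever be extracted:

```latex
M_{\mathrm{irr}}^{2} = \frac{M^{2}}{2}\left(1 + \sqrt{1 - a_*^{2}}\,\right),
\qquad
\frac{\Delta E_{\max}}{M c^{2}} = 1 - \frac{M_{\mathrm{irr}}}{M}
\;\le\; 1 - \frac{1}{\sqrt{2}} \approx 0.29
```

So even a maximally spinning black hole gives up at most about 29% of its mass-energy this way; the rest stays locked up in the irreducible mass. For a black hole of millions of solar masses, though, 29% is still a staggering amount of energy.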

How to reduce opioid addiction

posted by Jason Kottke   Apr 18, 2018

This morning I ran across news from two different studies about reducing deaths from opioid overdoses and they both had the same solution: medication-assisted treatment. First, from a study involving inmates in Rhode Island correctional facilities:

The program offers inmates methadone and buprenorphine (opioids that reduce cravings and ease withdrawal symptoms), as well as naltrexone, which blocks people from getting high.

The data set is small but the results are encouraging: there were fewer overdose deaths of former inmates after the program was implemented in 2016.

In the 90s, France used a similar program to cut heroin overdose deaths by 79%:

In 1995, France made it so any doctor could prescribe buprenorphine without any special licensing or training. Buprenorphine, a first-line treatment for opioid addiction, is a medication that reduces cravings for opioids without becoming addictive itself.

With the change in policy, the majority of buprenorphine prescribers in France became primary-care doctors, rather than addiction specialists or psychiatrists. Suddenly, about 10 times as many addicted patients began receiving medication-assisted treatment, and half the country’s heroin users were being treated. Within four years, overdose deaths had declined by 79 percent.

“What do census tracts with highest concentrations of particular populations look like?”

posted by Jason Kottke   Apr 18, 2018

The use of satellite imagery has revolutionized many areas of science and research, from archaeology to tracking human rights abuses to (of course) climate science. This vantage point makes possible observations that simply can’t be made at ground level.

In what she calls “a work in progress”, Jia Zhang, a PhD candidate at MIT Media Lab, used census data to collect chunks of satellite images from areas with the highest concentrations of white, black, Asian, and Native American & Alaska Native people. The result is striking (but perhaps not surprising):

Census Satellite

I’m looking forward to seeing more of Zhang’s work in this area.

Alan Turing was an excellent runner

posted by Jason Kottke   Apr 17, 2018

Alan Turing Runner

Computer scientist, mathematician, and all-around supergenius Alan Turing, who played a pivotal role in breaking secret German codes during WWII and developing the conceptual framework for the modern general purpose computer, was also a cracking good runner.

He was a runner who, like many others, came to the sport rather late. According to an article by Pat Butcher, he did not compete as an undergraduate at Cambridge, preferring to row. But after winning his fellowship to King’s College, he began running with more purpose. He is said to have often run a route from Cambridge to Ely and back, a distance of 50 kilometers.

It’s also said Turing would occasionally run to London for meetings, a distance of 40 miles. In 1947, after only two years of training, Turing ran a marathon in 2:46. He was even in contention for a spot on the 1948 British Olympic team before an injury held him to fifth place at the trials. Had he competed and run his personal best time, he would have finished 15th.
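To put a 2:46 marathon in perspective, it works out to roughly 6:20 per mile sustained over 26.2 miles. A quick sketch of the arithmetic:

```python
def pace_per_mile(hours, minutes, miles=26.219):
    """Return (min, sec) pace per mile for a marathon finishing time."""
    total_minutes = hours * 60 + minutes
    pace = total_minutes / miles          # minutes per mile, as a float
    return int(pace), round((pace - int(pace)) * 60)

print(pace_per_mile(2, 46))   # Turing's 2:46 -> (6, 20), about 6:20/mile
print(pace_per_mile(2, 44))   # Ketterle's 2:44 -> (6, 15)
```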

As the photo above shows, Turing had a brute force running style, not unlike the machine he helped design to break Enigma coded messages. He ran, he said, to relieve stress.

“We heard him rather than saw him. He made a terrible grunting noise when he was running, but before we could say anything to him, he was past us like a shot out of a gun. A couple of nights later we caught up with him long enough for me to ask who he ran for. When he said nobody, we invited him to join Walton. He did, and immediately became our best runner… I asked him one day why he punished himself so much in training. He told me ‘I have such a stressful job that the only way I can get it out of my mind is by running hard; it’s the only way I can get some release.’”

I found out about Turing’s running prowess via the Wikipedia page of non-professional marathon runners. Turing is quite high on the list, particularly if you filter out world class athletes from other sports. Also on the list, just above Turing, is Wolfgang Ketterle, a Nobel Prize-winning physicist who ran a 2:44 in Boston in 2014 at the age of 56.

A high-resolution tour of the Moon from NASA

posted by Jason Kottke   Apr 09, 2018

Using imagery and data that the Lunar Reconnaissance Orbiter spacecraft has collected since 2009, NASA made this video tour of the Moon in 4K resolution. This looked incredible on my iMac screen.

As the visualization moves around the near side, far side, north and south poles, we highlight interesting features, sites, and information gathered on the lunar terrain.

See also The 100-megapixel Moon and A full rotation of the Moon.

A great list of science books written by women

posted by Jason Kottke   Apr 06, 2018

Scientist and educator Joanne Manaster has compiled a growing list of science books written by women (with a rule of one book per author). Some of the books and authors featured are:

Hidden Figures by Margot Lee Shetterly.

Biomimicry by Janine Benyus.

My Life with the Chimpanzees by Jane Goodall.

Silent Spring by Rachel Carson.

Black Hole Blues and Other Songs from Outer Space by Janna Levin.

The Autistic Brain by Temple Grandin.

Me, Myself, and Why: Searching for the Science of Self by Jennifer Ouellette.

The Confidence Game by Maria Konnikova.

The Invention of Nature by Andrea Wulf.

The Sixth Extinction by Elizabeth Kolbert.

The Immortal Life of Henrietta Lacks by Rebecca Skloot.

Code Girls by Liza Mundy.

Grunt: The Curious Science of Humans at War by Mary Roach.

The Human Age by Diane Ackerman.

Manaster is soliciting suggestions on Twitter for authors she may have missed.

What makes a tree a tree? Scientists still aren’t sure…

posted by Jason Kottke   Apr 05, 2018

Broccoli Tree

In Knowable Magazine, Rachel Ehrenberg writes about the tricky business of understanding what a tree is. Trees are tall, woody, long-lived and have tree-like genes, right? Not always…

If one is pressed to describe what makes a tree a tree, long life is right up there with wood and height. While many plants have a predictably limited life span (what scientists call “programmed senescence”), trees don’t, and many persist for centuries. In fact, that trait — indefinite growth — could be science’s tidiest demarcation of treeness, even more than woodiness. Yet it’s only helpful to a point. We think we know what trees are, but they slip through the fingers when we try to define them.

Ehrenberg then suggests that we should think about tree-ness as a verb rather than a noun.

Maybe it’s time to start thinking of tree as a verb, rather than a noun - tree-ing, or tree-ifying. It’s a strategy, a way of being, like swimming or flying, even though to our eyes it’s happening in very slow motion.

This reminds me of one of Austin Kleon’s strategies for How to Keep Going: “forget the noun, do the verb”. Hey, it seems to be working for the trees. (via @robgmacfarlane)

Carl Sagan’s tools for critical thinking and detecting bullshit

posted by Jason Kottke   Apr 02, 2018

In his 1995 book The Demon-Haunted World, astrophysicist Carl Sagan presented a partial list of “tools for skeptical thinking” which can be used to construct & understand reasoned arguments and reject fraudulent ones.

Wherever possible there must be independent confirmation of the “facts.”

Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.

Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.

Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.

Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.

Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.

If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.

Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.

Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle — an electron, say — in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.

I found this via Open Culture, which remarked on Sagan’s prescient remarks about people being “unable to distinguish between what feels good and what’s true”.

Like many a science communicator after him, Sagan was very much concerned with the influence of superstitious religious beliefs. He also foresaw a time in the near future much like our own. Elsewhere in The Demon-Haunted World, Sagan writes of “America in my children’s or grandchildren’s time…. when awesome technological powers are in the hands of a very few.” The loss of control over media and education renders people “unable to distinguish between what feels good and what’s true.”

This state involves, he says, a “slide… back into superstition” of the religious variety and also a general “celebration of ignorance,” such that well-supported scientific theories carry the same weight or less than explanations made up on the spot by authorities whom people have lost the ability to “knowledgeably question.”

Yeeeeeeeep.

Update: After I posted this, a reader let me know that Michael Shermer has been accused by several women of sexually inappropriate & predatory behavior and rape at professional conferences. I personally believe women, and I further believe that if Shermer was actually serious about rationality and his ten rules for critical thinking listed above, he wouldn’t have pulled this shit in the first place (nor tried to hamfistedly explain it away). I’ve rewritten the post to remove the references to Shermer, which actually made it more succinct and put the focus fully on Sagan, which was my intention in the first place (the title remains unchanged). (via @dmetilli)

A comparison of the sizes of various microorganisms, cells, and viruses

posted by Jason Kottke   Apr 02, 2018

Microorganisms are so small compared to humans that you might be tempted to think that they’re all about the same size. As this video shows, that is not at all the case. The rhinovirus and poliovirus are 0.03 micrometers (μm) wide, a red blood cell is 8 μm, a neuron 100 μm, and a frog’s egg 1 mm. That’s a span of 5 orders of magnitude, about the same ratio as the height of a human to the thickness of the Earth’s atmosphere.
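If you want to sanity-check that orders-of-magnitude claim, it’s a quick back-of-the-envelope calculation. Here’s a sketch in Python; the 1.7 m human height and 100 km atmosphere thickness (roughly the Kármán line) are my own assumed figures, not from the video:

```python
import math

# Sizes mentioned in the video, converted to meters
sizes = {
    "rhinovirus": 0.03e-6,    # 0.03 μm
    "red blood cell": 8e-6,   # 8 μm
    "neuron": 100e-6,         # 100 μm
    "frog egg": 1e-3,         # 1 mm
}

# Span from smallest to largest, in orders of magnitude (powers of ten)
span = math.log10(sizes["frog egg"] / sizes["rhinovirus"])
print(f"virus to frog egg: {span:.1f} orders of magnitude")  # ~4.5

# Comparison: a ~1.7 m human vs. a ~100 km atmosphere (assumed figures)
human_to_sky = math.log10(100_000 / 1.7)
print(f"human to atmosphere: {human_to_sky:.1f} orders of magnitude")  # ~4.8
```

Both ratios land in the neighborhood of five powers of ten, which is the point of the comparison.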

Watching the animation, you might have noticed the T4 bacteriophage, which looks like a cross between the aliens in Arrival and a lunar lander. Can’t be real, right? Bacteriophages are really real and terrifying…if you happen to be a bacterium. Bacteriophages attack by attaching themselves to bacteria, piercing their outer membranes, and then pumping them full of bacteriophage DNA. The phage replicates inside the bacterium until it bursts and little baby bacteriophages are exploded out all over the place, ready to attack bacteria of their own.

Facial recognition AIs have a hard time with dark skin

posted by Jason Kottke   Mar 26, 2018

For her Gender Shades project, MIT researcher Joy Buolamwini fed over 1000 faces of different genders and skin tones into three AI-powered facial recognition systems from Microsoft, IBM, and Face++ to see how well they could recognize different kinds of faces.

The systems all performed well overall, but recognized male faces more readily than female faces and performed better on lighter skinned subjects than darker skinned subjects. For instance, 93.6% of gender misclassification errors by Microsoft’s system were of darker skinned people.
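An audit like this boils down to tallying error rates per subgroup instead of reporting one aggregate accuracy number, which can hide exactly this kind of disparity. Here’s a minimal sketch of the idea in Python, using made-up illustrative data (not the Gender Shades dataset or code):

```python
from collections import defaultdict

# Hypothetical (true_gender, predicted_gender, skin_tone) records —
# purely illustrative, not results from the actual study.
results = [
    ("female", "female", "lighter"),
    ("female", "male",   "darker"),
    ("male",   "male",   "darker"),
    ("male",   "male",   "lighter"),
    ("female", "male",   "darker"),
    ("female", "female", "lighter"),
]

errors = defaultdict(int)
totals = defaultdict(int)
for true, pred, tone in results:
    totals[tone] += 1
    if true != pred:
        errors[tone] += 1

# Per-group misclassification rate
for tone in sorted(totals):
    rate = errors[tone] / totals[tone]
    print(f"{tone}: {errors[tone]}/{totals[tone]} misclassified ({rate:.0%})")

# Share of all errors falling on the darker-skinned group —
# the same kind of figure as the 93.6% statistic quoted above
total_errors = sum(errors.values())
print(f"share of errors, darker-skinned: {errors['darker'] / total_errors:.0%}")
```

The aggregate accuracy here is a respectable-sounding 67%, but splitting by subgroup reveals that every single error falls on one group.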

Gender Shades

Her message near the end of the video is worth heeding:

We have entered the age of automation overconfident yet underprepared. If we fail to make ethical and inclusive artificial intelligence, we risk losing gains made in civil rights and gender equity under the guise of machine neutrality.

A world-historical theory of tool use

posted by Tim Carmody   Mar 16, 2018

Early tools

I love reading and rereading about the origin of humanity. I love that it’s not settled science: we’re still making new discoveries about when humans first left Africa, how and when we interbred with other hominins, and what makes us human in the first place. It’s just the coolest story, which is also every story.

Popular Science has a really nice new primer on the current state of research on early humanity. Embedded in it is a series of studies on tool use by early humans in Kenya that caught my attention. Basically, the tools got smaller and more portable, the materials used were more exotic (sourced from farther away), and they were decorated with pigments.

“That’s where there’s a similarity to technology in recent times; things start out big and clunky and they get small and portable,” says Richard Potts, head of the Smithsonian’s Human Origins Program and a co-author of the papers. “The history [of] technology has been the same ever since.”

I wonder, though, if all three vectors hold up across history: greater portability, greater range of materials, and greater decorative value.

I suspect the null hypothesis would be that technologies that work tend to stay roughly the same over time. (For most of early human history, our tools didn’t change up that much, which is exactly why the burst of activity in east Africa is noteworthy.) You need something to shake things up: either sudden availability of new materials, or a deprivation of old ones (like the Bronze Age collapse, which eventually helped usher in the Iron Age).

As it turns out, that’s exactly what happened.

“One of the things we see is that around 500,000 years ago in the rift valley of southern Kenya, all hell breaks loose. There’s faulting that occurs, and earthquake activity was moving the landscape up and down. The climate record shows there is a stronger degree of oscillation between wet and dry. That would have disrupted the predictability of food and water, for those early people,” Potts says. “It’s exactly under those conditions that almost any organism—but especially a hunter-gatherer human, even an early one—would begin to expand geography of obtaining food or obtaining resources. It’s under those conditions that you begin to run into other groups of hominins and you become aware of resources beyond your usual boundaries.”

“Oh my god!” People’s reactions to looking at the Moon through a telescope.

posted by Jason Kottke   Mar 15, 2018

Wylie Overstreet and Alex Gorosh took a telescope around the streets of LA and invited people to look at the Moon through it. Watching people’s reactions to seeing such a closeup view of the Moon with their own eyes, perhaps for the first time, is really amazing.

Whoa, that looks like that’s right down the street, man!

I often wonder what the effect is of most Americans not being able to see the night sky on a regular basis. As Sriram Murali says:

The night skies remind us of our place in the Universe. Imagine if we lived under skies full of stars. That reminder we are a tiny part of this cosmos, the awe and a special connection with this remarkable world would make us much better beings — more thoughtful, inquisitive, empathetic, kind and caring. Imagine kids growing up passionate about astronomy looking for answers and how advanced humankind would be, how connected and caring we’d feel with one another, how noble and adventurous we’d be.

Gorgeous 8K video of the aurora borealis dancing in the skies during a lunar eclipse

posted by Jason Kottke   Mar 14, 2018

8K resolution. Time lapse. 360º view. Aurora borealis. Lunar eclipse. I’m not really sure how you could pack much more into this video. Probably best experienced with some sort of VR rig, but for those of us without access to such a thing, watching it several times on a large screen while dragging the view around is a more than adequate substitute. If seeing the aurora borealis in person wasn’t already on your bucket list, it is now. Dang. (via the kid should see this)

Physics giant Stephen Hawking dead at age 76

posted by Jason Kottke   Mar 14, 2018

Lego Stephen Hawking

Stephen Hawking, who uncovered the mysteries of black holes and, with A Brief History of Time, did more to popularize science than anyone since the late Carl Sagan, has died at his home in Cambridge at age 76. From an obituary in The Guardian:

Hawking once estimated he worked only 1,000 hours during his three undergraduate years at Oxford. In his finals, he came borderline between a first- and second-class degree. Convinced that he was seen as a difficult student, he told his viva examiners that if they gave him a first he would move to Cambridge to pursue his PhD. Award a second and he threatened to stay. They opted for a first.

Those who live in the shadow of death are often those who live most. For Hawking, the early diagnosis of his terminal disease, and witnessing the death from leukaemia of a boy he knew in hospital, ignited a fresh sense of purpose. “Although there was a cloud hanging over my future, I found, to my surprise, that I was enjoying life in the present more than before. I began to make progress with my research,” he once said. Embarking on his career in earnest, he declared: “My goal is simple. It is a complete understanding of the universe, why it is as it is and why it exists at all.”

From Dennis Overbye’s obit in the NY Times:

He went on to become his generation’s leader in exploring gravity and the properties of black holes, the bottomless gravitational pits so deep and dense that not even light can escape them.

That work led to a turning point in modern physics, playing itself out in the closing months of 1973 on the walls of his brain when Dr. Hawking set out to apply quantum theory, the weird laws that govern subatomic reality, to black holes. In a long and daunting calculation, Dr. Hawking discovered to his befuddlement that black holes — those mythological avatars of cosmic doom — were not really black at all. In fact, he found, they would eventually fizzle, leaking radiation and particles, and finally explode and disappear over the eons.

Nobody, including Dr. Hawking, believed it at first — that particles could be coming out of a black hole. “I wasn’t looking for them at all,” he recalled in an interview in 1978. “I merely tripped over them. I was rather annoyed.”

That calculation, in a thesis published in 1974 in the journal Nature under the title “Black Hole Explosions?,” is hailed by scientists as the first great landmark in the struggle to find a single theory of nature — to connect gravity and quantum mechanics, those warring descriptions of the large and the small, to explain a universe that seems stranger than anybody had thought.

The discovery of Hawking radiation, as it is known, turned black holes upside down. It transformed them from destroyers to creators — or at least to recyclers — and wrenched the dream of a final theory in a strange, new direction.

“You can ask what will happen to someone who jumps into a black hole,” Dr. Hawking said in an interview in 1978. “I certainly don’t think he will survive it.

“On the other hand,” he added, “if we send someone off to jump into a black hole, neither he nor his constituent atoms will come back, but his mass energy will come back. Maybe that applies to the whole universe.”

Dennis W. Sciama, a cosmologist and Dr. Hawking’s thesis adviser at Cambridge, called Hawking’s thesis in Nature “the most beautiful paper in the history of physics.”

Roger Penrose, the eminent mathematician and physicist who collaborated with Hawking on discoveries related to black holes and the genesis of the universe, wrote a lengthy scientific obituary for Hawking in The Guardian.

Following his work in this area, Hawking established a number of important results about black holes, such as an argument for its event horizon (its bounding surface) having to have the topology of a sphere. In collaboration with Carter and James Bardeen, in work published in 1973, he established some remarkable analogies between the behaviour of black holes and the basic laws of thermodynamics, where the horizon’s surface area and its surface gravity were shown to be analogous, respectively, to the thermodynamic quantities of entropy and temperature. It would be fair to say that in his highly active period leading up to this work, Hawking’s research in classical general relativity was the best anywhere in the world at that time.

And then there was that time Hawking threw a party for time travellers but didn’t advertise it until after the party was over (to ensure only visitors from the future would show up).

Tonight is perhaps a good night to watch Errol Morris’ superb documentary on Hawking (with a wonderful Philip Glass soundtrack) or build a version of Hawking out of Lego.

The Winter Olympics, male & female physiology, and socially constructed bodies

posted by Jason Kottke   Feb 19, 2018

This is a fascinating thread by Milena Popova about the differing performances of male and female athletes at the Winter Olympics. As they point out, humans are sexually dimorphic but the story doesn’t end there. Bodies are also socially constructed.

Physiology is a thing, but physiology is shaped and mediated by our social context.

Look back at those pictures of “women”. Those petite, delicate bodies, those faces we process as “beautiful”. Those are the qualities that globally dominant Western cultures associate with “femininity”.

And sport is one of the institutions that fiercely guards and reproduces dominant ideas about gender, masculinity and femininity. This plays out differently in different sports.

Generally, men and women compete separately. And for the purposes of sport “men” and “women” are defined as people whose bodies were assigned male or female at birth and whose gender matches that assignment.

The obvious example here is South African runner Caster Semenya. But Popova continues with a more subtle (and admittedly speculative) situation:

Now, what really gets me is snowboarding. Because on the face of it that’s not a sport that’s judged on the same gendered criteria of artistry and aesthetics as figure skating or gymnastics.

You’d think under all the skiing gear, helmets, scarves and goggles, it would be quite hard to perform femininity.

And still, as my friend whom I made watch slope style and half pipe for the first time in her life last night pointed out, the body types of the men and women riders are really rather different. You can tell even under all the gear.

And that translates to performance. Women get an amplitude of about 3m above the half pipe, men about 4-5m. The best women do 1080s (three revolutions), the best men 1440s (four revolutions).

But much like any other subculture snowboarding reproduces hierarchical structures. Moves are named after people, some people find it easier to access than others (hint: it’s a massively expensive sport), some people set trends.

One of the structures it reproduces is a gendered hierarchy. It’s a very masculine culture. Women find it harder to access the sport, find it harder to be taken seriously as athletes in their own right rather than “just hangers-on”.

And I have the sneaky suspicion that because the people with the most subcultural capital tend to be men and they decide whom they will admit and accept to the community, there are certain looks and body types of women who find it less hard (not easy!) to gain access.

And those happen to be the body types that may find it harder to do 1440s and to get 5m amplitude above the half pipe.

Another example from figure skating is Surya Bonaly, a French figure skater who landed a backflip on one skate in a performance at the 1998 Olympics. While backflips weren’t banned because of Bonaly’s relative ease in performing them (as claimed here), her athletic style was outside the norm in women’s figure skating, in which traditional femininity is baked right into the rules & judging. This was also a factor in Tonya Harding’s career (as depicted in I, Tonya).

Anyway, super interesting to think about.

Photo of a single atom wins science photo contest

posted by Jason Kottke   Feb 13, 2018

Single Atom Photo

The UK’s Engineering and Physical Sciences Research Council just announced the winner of their annual science photography contest: a photo of a single strontium atom suspended in an electric field taken by David Nadlinger. The atom is that tiiiiny dot in the middle of the photo above.

‘Single Atom in an Ion Trap’, by David Nadlinger, from the University of Oxford, shows the atom held by the fields emanating from the metal electrodes surrounding it. The distance between the small needle tips is about two millimetres.

When illuminated by a laser of the right blue-violet colour the atom absorbs and re-emits light particles sufficiently quickly for an ordinary camera to capture it in a long exposure photograph. The winning picture was taken through a window of the ultra-high vacuum chamber that houses the ion trap.

Laser-cooled atomic ions provide a pristine platform for exploring and harnessing the unique properties of quantum physics. They can serve as extremely accurate clocks and sensors or, as explored by the UK Networked Quantum Information Technologies Hub, as building blocks for future quantum computers, which could tackle problems that stymie even today’s largest supercomputers.