
kottke.org posts about history

Medieval History with Africa at the Center, Europe at the Margins

[Image: the ruins of Fasilides Castle in Gondar, Ethiopia]

A new history of late medieval Ethiopia and its interactions with Europe by historian Verena Krebs does something a little unusual, at least for a professor at a European university: it treats the Horn of Africa as the center of civilization it was, and Europeans as the inhabitants of far-flung satellite states, which is how Ethiopians could not help but see them.

It’s not that modern historians of the medieval Mediterranean, Europe and Africa have been ignorant about contacts between Ethiopia and Europe; the issue was that they had the power dynamic reversed. The traditional narrative stressed Ethiopia as weak and in trouble in the face of aggression from external forces, especially the Mamluks in Egypt, so Ethiopia sought military assistance from their fellow Christians to the north—the expanding kingdoms of Aragon (in modern Spain), and France. But the real story, buried in plain sight in medieval diplomatic texts, simply had not yet been put together by modern scholars. Krebs’ research not only transforms our understanding of the specific relationship between Ethiopia and other kingdoms, but joins a welcome chorus of medieval African scholarship pushing scholars of medieval Europe to broaden their scope and imagine a much more richly connected medieval world.

The Solomonic kings of Ethiopia, in Krebs’ retelling, forged trans-regional connections. They “discovered” the kingdoms of late medieval Europe, not the other way around. It was the Africans who, in the early-15th century, sent ambassadors out into strange and distant lands. They sought curiosities and sacred relics from foreign leaders that could serve as symbols of prestige and greatness. Their emissaries descended onto a territory that they saw as more or less a uniform “other,” even if locals knew it to be a diverse land of many peoples. At the beginning of the so-called Age of Exploration, a narrative that paints European rulers as heroes for sending out their ships to foreign lands, Krebs has found evidence that the kings of Ethiopia were sponsoring their own missions of diplomacy, faith and commerce.

In fact, it would probably be accurate to say that Ethiopia (which over its long history has included areas now in Eritrea, Somalia, Sudan, Yemen, and parts of Saudi Arabia) was undergoing its own Renaissance, complete with the rediscovery of a lost classical kingdom, Aksum (sometimes called Axum).

Around the third century, Aksum was considered an imperial power on the scale of Rome, China, and Persia. It was in part to better understand the historical legacy of Aksum that east Africans circa 1400 reestablished trading ties and military partnerships with their old Roman trading partners — or, in their absence, the Germanic barbarians who’d replaced them.

Europe, Krebs says, was for the Ethiopians a mysterious and perhaps even slightly barbaric land with an interesting history and, importantly, sacred stuff that Ethiopian kings could obtain. They knew about the Pope, she says, “But other than that, it’s Frankland. [Medieval Ethiopians] had much more precise terms for Greek Christianity, Syriac Christianity, Armenian Christianity, the Copts, of course. All of the Orthodox and Oriental Orthodox churches. But everything Latin Christian [to the Ethiopians] is Frankland.”


Election Days I Have Known


I was born on November 3, 1979, which means that Election Day 2020 in the United States was also my 41st birthday. It was a very strange birthday. But I believe that anyone born in the first week of November who lives and/or votes in this country often finds themselves celebrating strange birthdays. Their memories and experiences of those days are different and more vivid, and they hopelessly entwine the political, the civic, and the personal.

I was very nearly born on November 2nd. That was my mother’s 28th birthday. She went into labor at lunch with my aunt and my grandmother. She was enjoying time away from my older brother and sister (then two and five) and didn’t want to change her plans. When she got home, she was well into labor but didn’t tell my father. My godmother, whom my whole family calls my Aunt Joette but who is not, strictly speaking, my aunt, and her husband, my Uncle Mike (same deal), came over to visit.

Uncle Mike somehow picked up that my mother was having contractions, timed them in his head, and told her when it was time to go to the hospital. He also offered to watch my siblings while my father drove her there. He even cleaned and vacuumed the house, with my brother clinging to his leg. Aunt Joette, who was 26 but already had three children of her own, hopped in the back seat of my parents’ Thunderbird. My mom, now seriously uncomfortable, told my dad to punch it. He drove through Detroit at over 100 mph to Hutzel Hospital, where I would be born. My godmother, always terrified of expressways and fast driving, has never ridden in a car with my father since. My dad still says he has never made such good time downtown.

But once they got to the hospital, everything stopped. My parents say it was the first sign of how stubborn I could be. (Frankly, this trait is overdetermined in my family.) My mother was in labor for more than 24 hours. Her doctors prepared for an emergency C-section before I arrived, about an hour and a half before the end of November 3. Since 1979 is an odd-numbered year, there wasn’t a federal election that week. But it would have been a good Election Day story if there had been one.

My mother’s father’s birthday was October 29th. In Detroit, October 30th is Devil’s Night. October 31st is Halloween. November 1st is All Saints’ Day and Día de los Muertos; this holiday is a pretty big deal in southwest Detroit’s Mexicantown, where Uncle Mike, Aunt Joette, and my cousins Rachel, Nikki, and Miguel went to church at Holy Redeemer.

November 1st is also my mother’s brother’s birthday. My Uncle Chris is exactly fourteen years younger than my mother and fourteen years older than me. He turned 30 two days before I turned 16, and called my mother the day in between, and since my mom wasn’t home, he and I talked for about half an hour. With leap days included, he is just one day closer in age to my mother than he is to me. My mother’s birthday (and All Souls’ Day) is November 2nd, and mine is November 3rd. It’s a lot of birthdays and holidays in just a few days. My parents’ wedding anniversary is February 5th, which explains why I was born nine months later; the others in my family can plausibly be blamed on cold Michigan nights. This has always made the week of Election Day a pretty big deal in our family.


The first President elected in my lifetime was Ronald Reagan, on November 4, 1980. I don’t remember this very well, but I have seen pictures of my first birthday party the day before. My parents, like a surprisingly large number of Americans, both voted for independent candidate John Anderson, supposedly moved by the worry that he might be driven to bankruptcy by his campaign debts if he didn’t receive enough of the vote. Either they’re misremembering or they were suckers, because by September, Anderson had already qualified for matching funds. Anyways, they both worked multiple jobs and had three small children, and ready access to reliable political information was not very good 40 years ago either.

I remember Reagan as President, but do not especially remember his reelection on November 6, 1984. I do remember my 5th birthday party extremely well. It was at McDonald’s, and my friends from kindergarten Andrew and Norman were invited. Ronald McDonald was there, I ate at least six Chicken McNuggets (which I still love), and Andrew gave me the He-Man action figure Jitsu, a bad guy with a golden hand that did a karate chop. He was kind of an evil knockoff of Fisto. Another of my uncles also gave me a copy of Jitsu, and I was excited about returning it and picking out a different He-Man character, but my younger brother took the second Jitsu out of its box, so we had two Jitsus, which is at least one too many. My brother was only three, but I was very upset with him.


In 1988, my mom and I were pulling for Jesse Jackson, and both of us were pissed off when he didn’t win the nomination. (I’m still mad about this, actually.) This is when I start to remember Phil Hartman’s Ronald Reagan, Dana Carvey’s George H.W. Bush, and Jon Lovitz’s undersung Michael Dukakis (“I can’t believe I’m losing to this guy”).

My parents were not around very much—my sister is really the one who raised me while the two of them worked, and I’ve always thought of her as an equal parent as well as a beloved sibling—but they indulged my watching late night television and asking questions about the conventions at a young age. George H.W. Bush was elected on November 8th. (See, it’s not really always the first Tuesday in November, because for whatever reason, November 1st doesn’t count.)

[Note, in lieu of art: I do not currently possess any photos of myself from age 12 or 13, which is right and just.]

Bill Clinton was elected President on November 3, 1992, my thirteenth birthday. In my junior high’s mock election, held the day before, Ross Perot won in a landslide. (We’d moved to the suburbs by this point.) Why were mostly-white suburban tweens entranced by Perot, who had simply nothing in his history or character to appeal to them, besides perhaps a funny voice? Some of it felt like a collective prank, a joke on the fact that the school was pretending to let us decide something we actually had no choice about: “Let’s all vote for Perot, and see what happens.”

I think some people were moved by the idea that something, anything unexpected might happen. It’s like why little kids are fascinated by dinosaurs: here are these creatures, older and bigger than your parents, older and bigger than anything, who once ruled the world. They all died, and anything, no matter how powerful or seemingly inevitable, can die again and be replaced by something new. I don’t know; maybe that was why my parents, still in their late twenties, wanted to vote for Anderson. Maybe that was part of why so many of my friends in college voted for Ralph Nader. Longing for change becomes something more than rational when so many external things determine your life.

Anyways, Bill Clinton was a deeply flawed President and remains a deeply flawed human being. Still, given the choices, I’m happy with how it actually turned out.

November 3rd was also the day of my last junior high football game. I was a starting defensive tackle, and our team was undefeated. So were our cross-town rivals at Wilkinson. They ran all over us that day: our defense, which had shut out all but one team we played and gone games without giving up any yards, could not stop Jason Byrd, a big, fast, 14-year-old athlete most of us knew from little league baseball. He died in 1997.


In 2000, I turned 21. Four days later, I voted for the first time. I did not vote in the 1998 midterms, even though Michigan’s governor was on the ballot, because my college town made it difficult for students to register to vote, and because, having just turned 19, I briefly did not believe electoral politics could create genuine change. I was also lazy, and foolish, and preoccupied with many other things.

But by 2000, I’d had a change of heart, and voted for Al Gore. I now think Gore would have been a better President than I believed then; I underestimated him partly because of the incredibly guarded, talking-out-of-both-sides-of-his-mouth campaign that he ran, and partly because I did not foresee the disaster of the Bush years. I thought things would carry on mostly like they had, and that Bush, while dim and disengaged, would be a relatively benign conservative like I thought his father had been. I also thought he wouldn’t win anyways. I was a real putz. I had studied so much history but had no idea of what history had in store for us.

That year, my senior year in college, I lived in a big co-op house with fourteen other people. The house was right around the corner from my favorite bar, where I’d rung in my 21st birthday at midnight. We all watched the election results on CBS—I don’t know who chose that network, but Dan Rather had a lot of homespun idioms he used to introduce all the tosses and turns.

After the news came in that Florida had been called for Gore, then moved to toss-up, then called for Bush, then nobody was sure, my worst roommate, the one who let her dog shit all over the living room carpet, who installed her own private air conditioner even though we all split the utility bills, who everyone hated and nobody could figure out how she’d moved into the house or how to get her to leave, was openly celebrating a Bush win and taunting the rest of us (pretty even split Gore/Nader). I had one thought: I need a drink.


Bush won again, defeating John Kerry just after Election Day, November 2, 2004. For me, this was the biggest gut-punch election of my life. I followed it closely, I watched all the debates, I participated in antiwar, anti-Bush, and anti-Cheney demonstrations, and I met up with other young people involved in politics in Philadelphia, where I’d moved in 2002. I had thought Gore would win, but I was convinced Kerry would—even after the midterm losses in 2002, even after the bombs fell on Iraq the day after my first son was born.

The next day, my 25th birthday, I walked around the city in a haze. I had to get groceries. The new Trader Joe’s on Market St had opened, right across the street from Center City’s small but lively porno district. I talked to my parents and each of my siblings on the phone, but I don’t remember what any of us said. Ohio was close, and there would be recounts, but it was over. Maybe Kerry would have been a good President, maybe he wouldn’t have, but at that moment, every possibility felt foreclosed upon. This is what they want, I thought. It wasn’t for the first time and certainly would not be the last.


For years, I thought Barack Obama was elected on my birthday in 2008. I even told people in the run-up to the 2020 election, just weeks ago, no, it’s OK, it’s actually good luck: Barack Obama won on my birthday. It’s not true. He was elected on Tuesday, November 4th, the day after.

But those few days all feel like one day, in the best sense. My younger son had just turned one year old in September and had been walking (for some value of “walking”) since August, and his mother and I hadn’t properly slept yet. On November 3rd, my friends Matt Thompson and Robin Sloan celebrated the fifth anniversary of their blog Snarkmarket by asking me to join them as the site’s third author. I was scrambling to finish my doctoral dissertation in comparative literature, and to send out applications for Assistant Professor and Visiting Fellow jobs that were rapidly disappearing thanks to the economic collapse. (More jobs I applied to cancelled their searches than gave me outright Nos.)

It was a total disaster. And yet somehow, the best thing had happened. Obama was the only Presidential candidate I’d supported in the primary who’d ever made it to the general election (that’s still true, by the way). He was the first Presidential candidate I’d voted for who’d won, and I had a young, multiracial family living in the birthplace of the Declaration of Independence who were counting on him. Health care for everyone, an end to the war in Iraq, real progress for Black and Latinx (we didn’t use the X then, but I will now) and Middle Eastern and South and Central Asian people seemed imminent. I was now 29, and even though I professed to know better, to have made myself properly jaded, properly paranoid, properly realistic about the limits of elected officials, the military, corporations, and the American people… I found myself quite carried away, like so many others. Meanwhile, the slaughterhouse continued its work.


By November 2012, I was separated. I was living in Manhattan’s Hell’s Kitchen, but still registered to vote in Philadelphia. I had completely bounced out of academia, but somehow wound up with a series of very good jobs writing for technology magazines and websites. My son and his mom had just moved from Philadelphia to Atlanta after Philly’s public schools fell apart, making it much harder for me to take the train to see them.

My 33rd birthday was nevertheless my favorite ever. I was visiting Washington, DC for a few days, and all my friends in the area gathered to have brunch. Some of them knew each other, and some of them didn’t. We swapped stories about our “formative nerd texts,” the books that shaped our obsessions at an early age. (My answer: Calvin and Hobbes.) I had a crush on someone again, one of the first since my wife, and I didn’t know what to do with it. On the way back to New York, I stopped in Philadelphia to vote for Barack Obama again. (He won.) The city I’d lived in for a decade began to feel less and less like home. When I finally got from Penn Station to my apartment, I felt twin waves of longing and relief.

I turned 37 on November 3, 2016. My mother had just turned 65. My uncle, whom I remember as eternally 30, turned 51. By then, all of my grandparents had died. I had moved back in with my parents in metro Detroit the year before, partly to help out after my father’s heart attack, and partly because I had no place else to go.

I did not want to celebrate my birthday. I did not want to see or be seen by anyone. I closed off my wall on Facebook well before November. I stopped posting on Twitter a month before Election Day. Even though all the polls and polling averages, which had been so successful in 2008 and 2012 by controlling for known problems, had predicted until shortly before the end that Hillary Clinton would likely sail to an easy victory, I could feel what was coming.

I felt it in the part of my brain that can recognize a rattlesnake in the grass. There was nothing statistical about it at all, nothing deductive, just pure anticipation. Certain other primates have a word that means “snake,” and everyone in their band knows what it means. When they hear that word, the monkeys run for the trees. My brain was screaming that word, and it was running for the trees.

I thought, I will vote. And if she wins and he loses, then I will have helped stop this. And then I can kill myself.

Of course, it didn’t work out that way.


This Election Day and my 41st birthday were even more unusual. It was my first birthday without my sister, Kelly. She had also been living with my parents after many years in New York, and she died suddenly in April from a pulmonary embolism caused by COVID-19. She was 45 years old.

She would have been so good at figuring out how to celebrate my birthday and my mom’s, to keep us all safe and still have fun. She would have been so happy to vote to turn Michigan blue again. She would have had my nieces and nephews rolling with laughter at the funeral she didn’t have. She was my parent, and she was my sister. And she had a whole life to live that had nothing to do with me, but still shared with me, that she would tell me about on long telephone calls and late-night talks. And if she loved you, family or friend, she loved every part of you: she loved your parents and partners and children. She was the only person in my family who could befriend every generation, who could tell the third cousins apart, who knew what your second cousins’ kids wanted for Christmas without having to ask.

I feel like I lost all three people she was: sister, parent, protagonist. I still have so much I want to ask her about. I think I know, but I will never know.

We couldn’t celebrate my or my mother’s birthday with my brothers and their families, so my mother, my father, and I tried to make an even bigger celebration ourselves. Her birthday bled into mine, as it always does. We bought a Grand Traverse Baking Company cherry crumb pie, which was delicious. We had all already dropped off our ballots in October, so on Election Day (my birthday), we ordered carry-out from my favorite Lebanese restaurant. I bought a bottle of Jameson Irish whiskey, but didn’t drink any of it. I was in a good mood all night (the pie definitely helped), even as Michigan and Pennsylvania remained uncounted, as Georgia remained uncertain, as the blue mirage turned into a red mirage and back again.

Even now, although nearly all the votes have been counted (and Georgia’s, my son’s adopted home state, have been counted twice), Election Day is somehow not yet over. We knew it would be Election Week; few of us knew it would be Election Month.

Yet that means somehow my birthday is not yet over; it has metastasized to cover all of Scorpio season, perhaps through Thanksgiving and after. And that means I am still only on the verge of turning 41, still 40, still waiting for the clock to turn over to start this next part of my life, a second half if I am lucky, a final third if everything goes chalk. I wasn’t born until late in the night on November 3, 1979, and I proved even before I was here that I can wait a very long time.

Still, I would like this to end, and end properly, even if I have to march on Michigan’s capitol building with the family I have left to see it out. Everyone is dying again; they have never stopped dying, and I would like to end that too.

I have no fantasies about Joe Biden or Kamala Harris. I don’t see them as avatars of hope like I saw Barack Obama, or as neoliberal schemers determined to betray their base. But I cannot survive (too many of us cannot survive) the petty fascist death cult of the Republican party under Trump. It has been building to this for generations, but it has now achieved its worst form yet. I will use every tool at my disposal, including the Democratic party, to crush them and drive them from power, like St. Patrick did Ireland’s snakes in legend.

They let New Orleans drown; they poisoned Flint; they let the police and bigots fashioning themselves as police murder Black people throughout the country, and then said it was the victims’ fault. They let my sister die and called it a rounding error. They have always been my enemies, ever since I was a little boy watching Ronald Reagan on television and realized what he was, even though I didn’t know the words for it. This was a man who would let us all die and (if he said anything at all) call our deaths noble and brave and necessary, if it would suit his vision of his own power, and perhaps enrich people I would never know.

Snake. The word I was looking for, that I already knew at four years old, was snake.

My father is terrified of snakes: he says that this is because in Ireland, where his parents were born, they have none. Snake is also what my mother’s people, the Ojibwe or Lake Superior Chippewa, called the Dakota and Lakota peoples when they fought them in what’s now Wisconsin: Sioux is a Chippewa word. (Literally, nadouessioux, or more properly natowessiwak, means “little snakes.”)

I know that if we want elections worth the name in 2022 or 2024 or any year afterwards, we have to win. The GOP, despite their hold on state legislatures, the courts, and at worst a 50/50 split in the US Senate, are fighting like they will never win a fairly counted, fairly administered, unsuppressed election again. And they might be right. But Democrats have to fight too. For once, Democrats have to forget that they’ve won and continue to fight.

I would like back everything that I have lost. But until the end of time and the return of the Messiah (and yes, I do mean Gritty), none of us can ever have that. All we can look forward to are more birthdays, more yahrzeits, and—I hope—more Election Days.

Maybe they will even become a holiday. Wouldn’t that be beautiful?


The history and futures of work since the 50s

[Image: office work in the 1950s]

A quite interesting and well-designed retrospective on the history of work at Atlassian, looking at every decade starting with the 1950s: what office work looked like, the technological innovations of the era, and how the future of work was imagined during each decade.

In a 1964 interview with the BBC, science fiction writer Arthur C. Clarke nailed almost all of his predictions for the year 2014. He predicted the use of wireless communications, making us “in instant contact with each other, wherever we may be,” as well as robotic surgery; his one miss was predicting that workers would no longer commute to their offices and would travel “only for pleasure.”

On when office work and hopes of lifelong employment started losing some of their potential and appeal:

Employees increasingly had doubts about the value of long-term company loyalty and started putting their own needs and interests above their employers’. “Office Space” debuted in 1999 and humorously brought this idea to life, satirizing the banal, everyday work of office denizens and their incompetent, overbearing bosses.


The Beginning of Recorded Sound


Centuries of Sound is a podcast that creates mixtapes by year. So far, that’s pretty standard. The main difference is that CoS’s mixtapes begin in 1853.

That’s as early as we’ve been able to recover recorded audio, mostly from technology that did not work particularly well at the time. The technology of the 1850s recorded sound, but couldn’t reliably reproduce it.

The real start date for us is nearly a quarter of a century [before Thomas Edison], in the studio of French printer and bookseller Édouard-Léon Scott de Martinville. The year was 1853 or 1854, and he was working on engravings for a physiology textbook, in particular a diagram of the internal workings of the human ear. What if, he thought, we could photograph sounds in the way we do images? (photography was a quarter-century old at this point) He began to sketch a device, a way of mimicking the inner workings of the human ear in order to make lines on a piece of paper.

I cover a plate of glass with an exceedingly thin stratum of lampblack. Above I fix an acoustic trumpet with a membrane the diameter of a five franc coin at its small end—the physiological tympanum (eardrum). At its center I affix a stylus—a boar’s bristle a centimeter or more in length, fine but suitably rigid. I carefully adjust the trumpet so the stylus barely grazes the lampblack. Then, as the glass plate slides horizontally in a well formed groove at a speed of one meter per second, one speaks in the vicinity of the trumpet’s opening, causing the membranes to vibrate and the stylus to trace figures.

Firstsounds.org did the most work in deciphering these early paper recordings, and that story is well told by the radio show Studio 360.

It even has a perfect name, what these people do: archeophony.
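The basic idea behind that recovery work is simple enough to sketch, though the real thing is far harder. Here’s a toy version in Python (my illustration, to be clear, not Firstsounds’ actual process, which among other things has to correct for the wobble of a hand-cranked recording): assume you’ve already traced the stylus line off a scanned phonautogram as a series of displacement values, then treat those displacements as audio samples and write them out as a playable file.

import wave

import numpy as np

# Toy "archeophony": turn a traced phonautogram line into playable audio.
# Assumes the hard part (extracting stylus displacements from the scan,
# at an even effective sample rate) has already been done.

def trace_to_wav(displacements, out_path="phonautogram.wav", rate=44100):
    samples = np.asarray(displacements, dtype=float).copy()
    samples -= samples.mean()                 # center the trace on zero
    peak = np.abs(samples).max() or 1.0
    pcm = np.int16(samples / peak * 32767)    # normalize to 16-bit range

    with wave.open(out_path, "wb") as f:
        f.setnchannels(1)    # mono
        f.setsampwidth(2)    # 16-bit samples
        f.setframerate(rate)
        f.writeframes(pcm.tobytes())

# Stand-in "trace": one second of a 440 Hz wiggle.
t = np.linspace(0, 1, 44100, endpoint=False)
trace_to_wav(np.sin(2 * np.pi * 440 * t))

That a scratched line on lampblacked paper is, in principle, already a sound file waiting for a decoder is the whole miracle here.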

Here, then, is Centuries of Sound’s mix of all the recorded audio up to 1860 that they’ve been able to recreate from those early, not-at-the-time-reproducible pre-Edison audio signal recordings.

I wish I had known about this when I was still writing my dissertation (which was, in part, on paper and multimedia in the 1900s). It would have made many things much easier.


Black is Beautiful photography show and monograph

[Kwame Brathwaite, untitled photo of Black women in a convertible]

Photographer Kwame Brathwaite is best known for his images of black superstars in the 1970s (Muhammad Ali training for the Rumble in the Jungle, the Jackson 5 on their first tour in Africa, Bob Marley at home in Kingston). A new exhibition highlights earlier work from his archives and positions him as an influential figure in the burgeoning Black is Beautiful movement. The now 81-year-old has his first book coming out in May, after a six-decade career: Kwame Brathwaite: Black is Beautiful.

[Kwame Brathwaite: Nomsa Brath modeling Congolese fabrics at Stern’s department store, 1963]

Brathwaite co-organized a fashion show in Harlem that became iconic: Naturally ’62: The Original African Coiffure and Fashion Extravaganza Designed to Restore Our Racial Pride and Standards. The show used the slogan “Black is Beautiful,” and Brathwaite’s imagery and ideals helped push the phrase into the zeitgeist. Artsy has a beautiful slideshow of the Grandassa models and this:

The participants, known as the Grandassa models, were not professionals in the fashion world, which reinforced Brathwaite’s political and artistic vision. They were dark-skinned and their hair was unprocessed; they wore African-inspired garments full of lush colors, waxed cotton prints, and elaborate patterns.

[Kwame Brathwaite: portrait of Sikolo Brathwaite, AJASS, 1968]

The FT has a great piece with more context on Kwame’s history and work.

[Kwame Brathwaite: self-portrait, AJASS, 1964]

Black is Beautiful: The Photography of Kwame Brathwaite opens April 11 at the Skirball Center in Los Angeles.


Mary(s) Seacole tells the powerful story of forgotten black women

I was lucky enough to see Marys Seacole last week at Lincoln Center’s Claire Tow Theater and immediately regretted not seeing it sooner so I could tell everyone I know about this important show. The friend who brought me called it “woke theater,” but I’d describe it as humane activism. It whispered when it could have shouted and shouted when it could have whispered, and blew me away with its sensitivity and power when addressing race, womanhood, colonialism, and interconnection.

At least now I know about Brooklyn-based playwright Jackie Sibblies Drury so I can tell you not to miss her next production.

Do read up on the fascinating life of Jamaican nurse/businesswoman Mary Seacole, who spent significant time tending wounded soldiers during the Crimean War, but who has long been overlooked in British colonial history. And please tell me about other emerging playwrights I should know about!


A Genealogy of Blue

[Image: cover of Blue: The History of a Color]

Even colors have histories, and what vibrant histories they are. French historian Michel Pastoureau’s Blue: The History of a Color (he’s also done histories of red, green, and black) is capably reviewed by Jesse Russell in the Claremont Review of Books in an essay called “The Colors of Our Dreams.” Russell offers the following luminous details.

Blue was once little-known in the Western palette. Homer’s sea was “wine dark”; blue would not be used as water’s color until the seventeenth century. It has evolved from its original association with warmth, heat, barbarism, and the creatures of the underworld, to its current association with calm, peace, and reverie. Like the unruly green, the Romans associated blue with the savage Celtae and Germani, who used the woad herb’s rich leaves for their blue pigments. These northern barbarians also painted themselves blue before war and religious rituals. The ancient Germans, according to Ovid, even dyed their whitening hair blue.

The Romans, in contrast, preferred the color red—the Latin word “coloratus” was synonymous with that for red, “ruber.” The Romans and Greeks did import lapis lazuli, the exquisite blue rock, from exotic locales such as China, Iran, and Afghanistan. But neither used the barbaric blue for important figures or images, saving it for the backgrounds for white and red figures. Even the Greek words for blue, like the names of colors in the Bible, largely were meant to evoke certain states or feelings as opposed to exact visual colors. Blue, like green, was the color of death and barbarism. The nobler colors—white, red, and black—were preferred.

[Image: cover of Miles Davis’s Kind of Blue]

Blue’s fortunes changed in the Middle Ages when it became associated with both the heavens and heaven, and particularly an association with the Virgin Mary. French royalty adopted blue as their official color; and in modernity, the introduction of indigo from the Americas and the invention of Prussian blue in the early 18th century helped cement blue (along with white and red) as part of a tripartite color scheme that gave us the flags of Great Britain, the United States, and France.

And then along came Goethe:

By the mid-nineteenth century, blue became a Romantic symbol of melancholy. Among those guilty of luring the moody young to dress in blue was Wolfgang von Goethe who, in The Sorrows of Young Werther, depicted his title character in a blue coat. This, coupled with Werther’s untimely death, inspired a craze for blue coats and a mania for suicide among melancholy European youth. Werther’s blue jacket was matched by the blue flower in Novalis’s unfinished posthumous piece Heinrich von Ofterdingen, which narrates the tale of a medieval troubadour who seeks out the flower as a symbol of the authentic life of beauty and art. Young, melancholic Frenchmen were doubly encouraged in their swooning by the closeness shared by the French word for blue flower, “ancolie,” and the ending of “mélancholie.”

From Romanticism’s murky forest a host of verbal expressions bloomed, linking blue with odd, melancholic reverie. Fairy tales were known as “blue tales”; to be terribly drunk in German became known as “being blue” or “Blau sein”; and the “blue devils,” from which we get the great American expression (and musical genre) “the blues,” meant to be afflicted with a lingering sadness.

[Image: cover of Joni Mitchell’s Blue]

Blue has a curious oscillation between conservatism and rebellion, perhaps especially in France, but throughout the world:

The navy blazer, a sign of conservativism and preppy formality in the twentieth century, was once a mark of the avant garde Westerner, adorned in what became known as “sportswear.” Aspiring radicals wore blue jeans, made from denim dyed with indigo, but ultimately derived by Levi Strauss from the pants made from tent canvas for California prospectors. Eventually, jeans became leisurewear for Americans from the East Coast who wanted to dress like the cowboys of the increasingly tame “wild west.” As the tides of early twentieth-century fashionable rebellion swelled, blue jeans were given the stamp of haute couture in a famous 1935 edition of Vogue, and, after World War II, were a symbol of rebellion and nonconformity—especially in newly liberated Europe. But in the West, jeans eventually became blasé (but comfortable) everyday wear when everyone—even conservative squares—started wearing them. This did not stop blue jeans from becoming symbols of rebellion in Communist countries during the heady days of glasnost and perestroika, and later in the Muslim world a symbol of youthful rebellion.

Taken together, the genealogy of blue is a history of finding meaning in difference, whether it was the Germanic blue facing off against the Roman red, the vibrant blue jacket against the staid black coat, or the heavenly Marian apparition set off against the profane, multicolored world below.

(Via The Browser.)


Rebecca Solnit on The West and climate change

This article reminded me of the powerful story Rebecca Solnit told at Pop-Up Magazine a couple years ago. She presented photos taken by Mark Klett and Byron Wolfe at Lake Powell over several years, unfolding a compelling narrative of climate change as told through one geological place.

[Photo: Lake Powell, 2012]

She also tells a cautionary tale of the West in her companion piece, later published in California Sunday magazine.

Glen Canyon Dam is a monument to overconfidence 710 feet high, an engineering marvel and an ecological mistake. The American West is full of these follies: decommissioned nuclear power plants surrounded by the spent radioactive waste that will remain dangerous for 100,000 years; the bomb-torn land of military testing and training sites; the Nevada Test Site itself, cratered and contaminated by the explosion of a thousand nuclear devices. Las Vegas and Phoenix, two cities that have grown furiously in recent decades, are monuments to the conviction that stable temperatures and fossil fuel and water could be counted upon to persist indefinitely.

You can regard the enormous projects of this era as a continuation of the Second World War. In the West, this kind of development resembled a war against nature, an attempt to conquer heat, dryness, remoteness, the variability of rainfall and river flow — to triumph over the way water limits growth. As the environmental writer Bill deBuys put it: “Thanks to reservoirs large and small, scores of dams including colossi like Hoover and Glen Canyon, more than 1,000 miles of aqueducts and countless pumps, siphons, tunnels and diversions, the West had been thoroughly re-rivered and re-engineered. It had acquired the plumbing system of a giant water-delivery machine. … Today the Colorado River, the most fully harnessed of the West’s great waterways, provides water to about 40 million people and irrigates nearly 5.5 million acres of farmland.” Along the way, so many parties sip and gulp from the Colorado that little water reaches Mexico.

[Photo: Glen Canyon Dam]

It’s a longread, so I recommend finding someone with a voice as soothing and clear as Solnit’s to read it aloud to you (and then hold you when you realize what it all means).

N.B. Pop-Up’s spring tour tickets go on sale on April 9. It’s always an interesting show. Sometimes funny, sometimes sad, always poignant and well-produced. And never recorded so you must be there in person!


The Last Days of Walter Benjamin’s Life

[Image: Walter Benjamin’s library card]

This Aeon essay by Giorgio van Straten, “Lost in Migration,” is excerpted from a book titled In Search of Lost Books, which explains its fascination with a book that has fascinated many people: the manuscript Walter Benjamin carried in a briefcase at the end of his life, which has never been identified or located and probably did not survive him.

I’ve always been a little turned off by the obsession with this manuscript among Benjamin fans and readers. There’s something so shattering to me about the end of Benjamin’s life, and how he died, that it feels not just trivial, but almost profane to geek out over the imaginary contents of a book he might have left behind. I feel the same way about dead musicians. It’s all just bad news.

Luckily, though, this essay does contain a compelling and concise account of the end of Benjamin’s life.

First Benjamin fled Paris, which had been bombed and was about to be invaded by the German army, for Marseilles:

Benjamin was not an old man - he was only 48 years old - even if the years weighed more heavily at the time than they do now. But he was tired and unwell (his friends called him ‘Old Benj’); he suffered from asthma, had already had one heart attack, and had always been unsuited to much physical activity, accustomed as he was to spending his time either with his books or in erudite conversation. For him, every move, every physical undertaking represented a kind of trauma, yet his vicissitudes had over the years necessitated some 28 changes of address. And in addition he was bad at coping with the mundane aspects of life, the prosaic necessities of everyday living.

Hannah Arendt repeated with reference to Benjamin remarks made by Jacques Rivière about Proust:

He died of the same inexperience that permitted him to write his works. He died of ignorance of the world, because he did not know how to make a fire or open a window.

before adding to them a remark of her own:

With a precision suggesting a sleepwalker his clumsiness invariably guided him to the very centre of a misfortune.

Now this man seemingly inept in the everyday business of living found himself having to move in the midst of war, in a country on the verge of collapse, in hopeless confusion.

From Marseilles he hoped to cross into Spain on foot, since, as a German refugee, he did not have the proper exit papers.

The next morning he was joined soon after daybreak by his travelling companions. The path they took climbed ever higher, and at times it was almost impossible to follow amid rocks and gorges. Benjamin began to feel increasingly fatigued, and he adopted a strategy to make the most of his energy: walking for 10 minutes and then resting for one, timing these intervals precisely with his pocket-watch. Ten minutes of walking and one of rest. As the path became progressively steeper, the two women and the boy were obliged to help him, since he could not manage by himself to carry the black suitcase he refused to abandon, insisting that it was more important that the manuscript inside it should reach America than that he should.

A tremendous physical effort was required, and though the group found themselves frequently on the point of giving up, they eventually reached a ridge from which vantage point the sea appeared, illuminated by the sun. Not much further off was the town of Portbou: against all odds they had made it.

Spain had changed its policy on refugees just the day before:

[A]nyone arriving ‘illegally’ would be sent back to France. For Benjamin this meant being handed over to the Germans. The only concession they obtained, on account of their exhaustion and the lateness of the hour, was to spend the night in Portbou: they would be allowed to stay in the Hotel Franca. Benjamin was given room number 3. They would be expelled the next day.

For Benjamin that day never came. He killed himself by swallowing the 15 morphine tablets he had carried with him in case his cardiac problems recurred.

This is how one of the greatest writers and thinkers of the twentieth century was lost to us, forever.


The European Parenthesis

[Image: map of Byzantium]

There’s an idea in media history and media theory called “The Gutenberg Parenthesis.” The basic idea of it is simple: the dominance of fixed, printed text is a historical blip in a broader history of much more mutable, orally-driven media forms. You find versions of this idea in Walter Ong and Marshall McLuhan, but it’s being re-thought for digital technology by folks like L.O. Sauerberg and Thomas Pettitt. And one of the implications is that if you want to understand media today, you have to understand media before Gutenberg and print. “The future is medieval” is one formulation of this.

A similar idea can be applied to world history, and it has been by J.C. Sharman in his book Empires of the Weak: The Real Story of European Expansion and the Creation of the New World Order. Here the focus isn’t print technology, which Europe borrowed and adapted from Asia 500 years ago, but European domination of the rest of the world, which, Sharman contends, really only got going a little more than 200 years ago and is questionable today.

This is from a review of Sharman’s book by Alan Mikhail, titled “When Asia Ruled the World”:

In Sharman’s account, the dominance of the West (note Europe’s easy baton-pass to the United States), roughly from the Enlightenment to World War II, represents a historical blip in the last millennium. And, perhaps more important, today we seem to be on the cusp of a return to a more regular state of affairs, where the large states of Asia will again be the globe’s hegemons.

To make this provocative argument, Sharman finds the early modern period, conventionally dated from 1500 to 1800, the most fruitful for thinking about where we are headed. In those centuries, the enormous empires of the East — the Qing, the Ottomans and the Mughals — were the most formidable states on earth. Territory equaled power, and those states held the most land…

Asia’s enormous land-based empires didn’t much care about their coastlines and tolerated — more than they succumbed to — the Europeans nibbling on their shores in what were desperate, highly risky and ultimately temporary ventures. Until approximately 1750, Europeans — even in Europe, thanks to the Ottomans — held no military advantage over other powers.

But how then to explain the undeniable fact that Europeans dominated the globe from the turn of the 19th century to World War I? Sharman reasons that it was a combination of internal fractures within the Qing and Ottoman Empires, as well as the inclination of Europeans to think that empire building was the route to national sovereignty: in other words, almost a kind of vanity project.

The future, contends Sharman, is medieval: one in which Asia dominates the planet, and Europe and the West are at the periphery of global power and influence. There’s a lot that’s going to change over the next century; global climate change is certainly going to shift the balance of power and the fight for survival worldwide. But the idea that we’re coming out of a historical aberration rather than arriving at a necessary outcome is well worth thinking about.


More on Ancient Scripts and the History of Writing

[Image: The World’s Writing Systems]

One post last week that y’all loved was The Evolution of the Alphabet. I loved it too; anything breaking down the history of writing in ways that are (get it) decipherable is irresistible to me. But since then, even more great links on the history of writing have come in. To which I say, it is our duty, nay—our pleasure—to round those links up.

First, a riff on Jason’s post from the man himself, Talking Points Memo’s Josh Marshall. Josh, like me, is obsessed with the history of writing. He recommends two books (Empires of the Word: A Language History of the World by Nicholas Ostler and The Writing Revolution: Cuneiform to the Internet by Amalia E. Gnanadesikan) and adds this reflection:

Historians of writing believe that our current alphabet originated as a sort of quick-and-dirty adaptation of Egyptian hieroglyphics into a simpler and more flexible way of writing. You take a small number of hieroglyphic characters representing specific things, decide to use them not for their meaning but for their sound and then use this as a way to encode the sound of words in almost any language. In this particular case it was to encode a Semitic language related to and ancestral to Hebrew and Phoenician. It was likely devised by soldiers or traders operating either in Egypt or between Egypt and what’s now Israel and Jordan.

This basic A B C D formulation is the foundation of the writing systems for not only all languages that use the Latin alphabet but also those which use the Greek, Cyrillic and Arabic alphabets along with numerous others. What is particularly fascinating is that most historians of writing believe that this invention - the alphabet, designed by and for sub-literate Semites living on the borderlands of Egypt about 4,000 years ago - is likely the origin point of all modern alphabets. In some cases, it’s a direct lineal descent as in Canaanite to Greek to Latin to our modern alphabet. But the creators of the alphabets that now dominate South Asia (originating 2500 to 3000 years ago) also seem to have borrowed at least the idea of the alphabet from these Semitic innovators, though others believe they are an indigenous creation.

The deep history of these letters we are now communicating through is like the DNA - or perhaps rather the record of the DNA - of human cognition and thought, processed through language and encoded into writing.
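The mechanism Marshall is describing (the “acrophonic principle,” in the scholarship) is at bottom an encoding trick, and it’s small enough to mock up. The reconstructed sign names below are standard; the code itself is just my illustration, not anything from his post.

# A toy sketch of the acrophonic move Marshall describes: take a
# picture-sign, name it in your own Semitic language, then reuse the
# sign for only the first sound of that name. The sign names below are
# standard reconstructions; the rest is invented illustration.

sign_names = {
    "ox":    "ʾalp",   # ʾ → eventually Greek alpha, Latin A
    "house": "bayt",   # b → beta, B
    "water": "maym",   # m → mu, M
    "head":  "raʾsh",  # r → rho, R
}

# Each sign now stands for the first sound of its name.
alphabet = {name[0]: picture for picture, name in sign_names.items()}

# "Spelling" means stringing pictures together by sound, ignoring what
# they depict. (It's a consonantal script, so vowels go unwritten.)
print([alphabet[sound] for sound in "rb"])   # ['head', 'house']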

The second link comes from linguist Gretchen McCulloch. It’s The World’s Writing Systems, a site that doesn’t focus narrowly on our updated Latin alphabet and its antecedent forms, but on every system of writing that ever is or has been. It lets you search, browse, sort, and generally geek out to your heart’s content. It also lets you know whether the scripts are supported by Unicode (a surprising number are not), and links you to Wikipedia entries about them. So you can easily read about the Cypriot Syllabary, an Iron Age script and descendant of Linear A that was eventually replaced by the Greek alphabet.

Differences between Cypriot syllabary and Linear B
The main difference between the two lies not in the structure of the syllabary but in the use of the symbols. Final consonants in the Cypriot syllabary are marked by a final, silent e. For example, the final consonants n, s, and r are noted by using ne, se, and re. Groups of consonants are created using extra vowels. Diphthongs such as ae, au, eu, and ei are spelled out completely. In addition, nasal consonants that occur before another consonant are omitted completely.

See, you just learned something!
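Those spelling rules are concrete enough to sketch as a tiny function. This is purely a toy of mine: it works on Latin transliterations rather than actual Cypriot signs, and it skips some rules (like dropping nasals before consonants), but it shows how a consonant-plus-vowel-only syllabary chews up a word.

# A toy sketch of the Cypriot spelling rules quoted above, applied to
# Latin transliterations of Greek words. Illustration only.

VOWELS = set("aeiou")

def cypriot_syllables(word):
    syllables = []
    i = 0
    while i < len(word):
        ch = word[i]
        if ch in VOWELS:
            syllables.append(ch)                  # bare vowel sign
            i += 1
        elif i + 1 < len(word) and word[i + 1] in VOWELS:
            syllables.append(ch + word[i + 1])    # ordinary CV sign
            i += 2
        elif i + 1 == len(word):
            syllables.append(ch + "e")            # final consonant + silent e
            i += 1
        else:
            # Consonant cluster: borrow the next vowel ahead as the
            # "extra" vowel. (The real rules are more involved.)
            extra = next((c for c in word[i + 1:] if c in VOWELS), "e")
            syllables.append(ch + extra)
            i += 1
    return syllables

# 'ptolis' ("city") comes out as the attested spelling po-to-li-se.
print(cypriot_syllables("ptolis"))   # ['po', 'to', 'li', 'se']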

Now, many of the Aegean writing systems (including Linear A) are still undeciphered. For that, you want classicist Anna P. Judson’s “A very short introduction to the undeciphered Aegean writing systems” from her blog, “It’s All Greek To Me.” (Hat tip here to the polymath sportswriter Zito Madu.)

Here’s what Judson has to say about Linear A (which, unlike Linear B, wasn’t used to write Greek but a still-unidentified language conventionally called Minoan):

It’s generally agreed that at least some Linear A signs, and quite plausibly the majority of them, can be ‘read’, since they are likely to have had similar sound-values to their Linear B equivalents (Linear B was adapted directly from Linear A in order to write in Greek); but it’s still not possible to identify the language involved or to understand any of its grammatical features, the meanings of most words, etc. As an example, the word AB81-02, or KU-RO if transliterated using Linear B sound-values, is one of the few words whose meaning we do know: it appears at the end of lists next to the sum of all the listed numerals, and so clearly means ‘total’. But we still don’t actually know how to pronounce this word, or what part of speech it is, and we can’t identify it with any similar words in any known languages.

The most promising set of inscriptions for analysing linguistic features is the so-called ‘libation formula’ - texts found on stone vases used in religious rituals (‘libation tables’), which are probably dedications (so probably say something like “Person X gives/dedicates/offers this object/offering to Deity Y”), and across which similar elements often recur in the same position in the text. In principle, having a ‘formula’ of this kind should let us identify grammatical elements via the slight variations between texts - e.g. if a particular variation in one word seemed to correlate with the number of dedicators listed, we might be able to infer that that was a verb with singular or plural marking. Unfortunately, there simply aren’t enough examples of these texts to establish this kind of linguistic detail - every analysis conducted so far has identified a different element as being the name of the donor, the name of the deity, the verb of offering, etc., so it’s still not possible to draw any certain conclusions from this ‘formula’.
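The KU-RO deduction, by the way, is basically arithmetic, and it takes very little machinery to reproduce the logic. Here’s a toy version with tablets I’ve invented; only the method (look for a recurring footer word whose number always equals the sum of the list above it) reflects the actual argument.

# Toy reconstruction of the KU-RO argument. The tablets and numbers are
# invented; the method is the point: one recurring word at the end of
# list-tablets always sits beside the sum of the listed numerals.

tablets = [
    {"entries": [3, 7, 2], "footer_word": "KU-RO", "footer_number": 12},
    {"entries": [10, 5],   "footer_word": "KU-RO", "footer_number": 15},
    {"entries": [1, 1, 4], "footer_word": "KU-RO", "footer_number": 6},
]

always_the_sum = all(
    t["footer_number"] == sum(t["entries"]) for t in tablets
)
print("KU-RO always equals the sum of the list:", always_the_sum)
# If that holds on every tablet we have, "total" is the natural reading,
# even though we still can't pronounce the word or parse its grammar.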

Cretan Hieroglyphic and its variants are even less well understood than Linear A! Some of them are only attested in single inscriptions! God, writing isn’t a smooth series of adaptations leading to a clear final goal! Writing is a total mess! How did anyone ever make sense of it at all?

But they did; and that’s how and why we’re all here, communicating with each other on these alphanumeric encoding machines to this very day.


America’s Last Chance at a Gospel Archive


This is a fascinating longread from the Oxford American, on Baylor University professor Robert Darden’s efforts to create a comprehensive archive (physical and digital) of American gospel music:

In 2001, when Darden set out to write the first comprehensive history of black gospel music, from its African origins into the twenty-first century, he came across citations to many fundamental songs—early recordings by the Staple Singers; “There Is Not a Friend Like Jesus” by the Roberta Martin Singers; “Peace in the Valley” by the Southern Sons— but found that the recordings were long gone, never to be heard again. So he sought what could be salvaged. In Chicago, he climbed to the top floor of a tenement building, where he listened in awe as a woman sang lines to old freedom songs. In Memphis, he swayed and clapped in the pews of Al Green’s church. But when he finished People Get Ready!: A New History of Black Gospel Music in 2004, having painstakingly laid out the history of the genre, he almost couldn’t bear that many of its core components were lost to us.

After Darden finished the book, he got in touch with some of his old contacts from Billboard, gospel scholars and collectors like Bob Marovich, Opal Nations, and Ray Funk. He wanted to determine how much of black gospel music from its golden age was lost or unavailable. They estimated seventy-five percent.

Ask them why, and the answer gets complicated. “Part of it is racism,” Darden says. “Part of it is economic.” Part of it has to do with the consolidation in the music industry (some record companies hold the copyrights to these songs, but, lacking financial incentive, don’t make them available in any form). And the last part, as he sees it, is the religious aspect of this music. Marovich put it to me this way: “When I was growing up, there was always, in our neighborhood, a couple of guys in white shirts and black ties that wanted to talk to you about Jesus. And you wanted to run the opposite direction from those guys… . Gospel is a little frightening to the unknowledgeable.”

In February of 2005, Darden wrote an op-ed in the New York Times lamenting the loss of these treasures from gospel’s golden age: “It would be more than a cultural disaster to forever lose this music,” he writes. “It would be a sin.” The apparent imbalance of that remark stuck with me. By any honest standard, we sin regularly. A cultural disaster seems like a much more grievous affair. But I also had the feeling that he was onto something—that the loss of this music was a moral failing born out of a history of oppression and neglect. He explained to me that when he wrote that, he had in mind Jim Wallis’s (at the time controversial) claim that racism was America’s original sin.

One of the songs referenced in the article as the archival project’s unofficial anthem is “Old Ship of Zion,” recorded by The Mighty Wonders. I found it on YouTube.

There’s a beauty to these recordings quite apart from their historical importance. John Lewis has said that “without music the civil rights movement would have been like a bird without wings.” Some antebellum spirituals are said to have held instructions for slaves to escape. You can also find the roots of almost every major American popular music in the various strands of gospel, from country to hip-hop. It would be very American to lose these strands, but it’s the better kind of American to bring our collective resources and technology to bear to save and recirculate these records for the public good.


On the history of lavender

Edith’s post on the long history of lavender as a relaxant reminded me that you can now vape herbs just as you’ve always been able to smoke them.

There’s a long history of lavender being used to combat anxiety and other feelings of distress. In 1551, for instance, naturalist William Turner, in his nature guide Herball, wrote that “flowers of lavender, quilted in a cap, comfort the brain very well.” And herbalist John Parkinson, in his 1640 Theatrum Botanicum, wrote that lavender is of “especiall good use for all griefes and paines of the head and brain,” as well as for “the tremblings and passions of the heart” — and not just drunk as a tonic but “even applied to the temples, or to the nostrils to be smelt unto.”


The Library of Congress’s Collection of Early Films

National Screening Room, a project by the Library of Congress, is a collection of early films (from the late 19th century through most of the 20th), digitized by the LOC for public use and perusal. Sadly, it’s not made clear which of the films are in the public domain, and so free to remix and reuse, but it’s still fun to browse the collection for a look at cultural and cinematic history.

There’s a bunch of early Thomas Edison kinetoscope films, including a kiss between actors May Irwin and John C. Rice that reportedly brought the house down in 1896.

There are also two 1906 documentaries of San Francisco, one from shortly before the earthquake and another just after (the devastation is really remarkable, and the photography, oddly beautiful).

There’s a silent 1926 commercial for the first wave of electric refrigerators, promoted by the Electric League of Pittsburgh, promising an exhibition with free admission! (wow guys, thanks)

There are also 33 newsreels made during the 40s and 50s by All-American News, the first newsreels aimed at a black audience. As you might guess from the name and the dates, it’s pretty rah-rah, patriotic, support-the-war-effort stuff, but there are also some slice-of-life stories and examples of economic cooperation among working-to-middle-class black families at the time.

I hope this is just the beginning, and we can get more and more of our cinematic patrimony back into the public commons where it belongs.


How Precision Engineering Made Modernity Possible

Antikythera-Mechanism-Exploded.jpg

Simon Winchester, author of The Professor and the Madman, has a new book out called The Perfectionists: How Precision Engineers Created the Modern World. It’s the history of a concept, which makes it tricky to write about, but it’s an uncommonly generative and illustrative concept. James Gleick shares this anecdote in his book review.

North Wales, “on a cool May day in 1776.” The Age of Steam was getting underway. So was the Industrial Revolution—almost but not quite the same thing. In Scotland, James Watt was designing a new engine to pump water by means of the power of steam. In England, John “Iron-Mad” Wilkinson was improving the manufacture of cannons, which were prone to exploding, with notorious consequences for the sailors manning the gun decks of the navy’s ships. Rather than casting cannons as hollow tubes, Wilkinson invented a machine that took solid blocks of iron and bored cylindrical holes into them: straight and precise, one after another, each cannon identical to the last. His boring machine, which he patented, made him a rich man.

Watt, meanwhile, had patented his steam engine, a giant machine, tall as a house, at its heart a four-foot-wide cylinder in which blasts of steam forced a piston up and down. His first engines were hugely powerful and yet frustratingly inefficient. They leaked. Steam gushed everywhere. Winchester, a master of detail, lists the ways the inventor tried to plug the gaps between cylinder and piston: rubber, linseed oil-soaked leather, paste of soaked paper and flour, corkboard shims, and half-dried horse dung—until finally John Wilkinson came along. He wanted a Watt engine to power one of his bellows. He saw the problem and had the solution ready-made. He could bore steam-engine cylinders from solid iron just as he had naval cannons, and on a larger scale. He made a massive boring tool of ultrahard iron and, with huge iron rods and iron sleighs and chains and blocks and “searing heat and grinding din,” achieved a cylinder, four feet in diameter, which as Watt later wrote “does not err the thickness of an old shilling at any part.”

By “an old shilling” he meant a tenth of an inch, which is a reminder that measurement itself—the science and the terminology—was in its infancy. An engineer today would say a tolerance of 0.1 inches.

The corresponding concept of “tolerance” turns out to be equally important. The ancient world was certainly capable of creating complex machinery (see the Antikythera Mechanism above), and the early modern period put together the scientific method and new ways of conceptualizing the universe. But it’s the Industrial Revolution that created — or was created by — the notion that machines could be made in parts that fit together so closely that they could be interchangeable. That’s what got our machine age going, which in turn enabled guns and cars and transistors and computers and every other thing.
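To make the idea concrete, here’s a minimal sketch of what a tolerance check amounts to in modern terms. It’s written in Python, and the names and numbers are illustrative, borrowed from the Watt/Wilkinson anecdote above rather than from Winchester’s book:

```python
# Illustrative only: a tolerance is a bound on how far a measured
# dimension may stray from the nominal spec before a part stops
# being interchangeable with its siblings.

NOMINAL_BORE_IN = 48.0  # Watt's four-foot cylinder, in inches
TOLERANCE_IN = 0.1      # "the thickness of an old shilling"

def within_tolerance(measured_in, nominal_in=NOMINAL_BORE_IN, tol_in=TOLERANCE_IN):
    """True if a measured bore is close enough to spec to interchange."""
    return abs(measured_in - nominal_in) <= tol_in

print(within_tolerance(48.08))  # True: Wilkinson-grade boring
print(within_tolerance(48.30))  # False: steam gushes everywhere
```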


How A 1979 Email Chain Letter Helped Give Birth to Our Social Internet

Spocks-death.jpg

The Internet connects various bodies of knowledge and enables all sorts of private communication and coordination. It’s also clear in 2018, and has been for twenty-five years, that the Internet supports a variety of social media: public and semi-public communication for entertainment and cultural purposes.

Vint Cerf, co-inventor of the TCP/IP protocol and general internet pioneer, traces the emerging social use of the Internet to an unlikely candidate: a 1979 chain email from MIT’s Artificial Intelligence labs, titled “SF-LOVERS,” that asked Cerf and his colleagues at DARPA and elsewhere on ARPANET, the research network that preceded today’s internet, to weigh in on their favorite science fiction authors.

Because the message had gone out to the entire network, everybody’s answers could then be seen and responded to by everybody else. Users could also choose to send their replies to just one person or a subgroup, generating scores of smaller discussions that eventually fed back into the whole.

About 40 years later, Cerf still recalls this as the moment he realized that the internet would be something more than every other communications technology before it. “It was clear we had a social medium on our hands,” he said.

The thread was a hit. It also created what might be thought of as the first online social network. Though individuals had been connected via this internet before, this was the first time they were using it for social interactions and, importantly, building a larger community identity through these personal connections. After SF-LOVERS came YUMYUM, another chain email that debated the quality of restaurants in the new Silicon Valley. (In-house gourmet chefs were still decades away.) Then WINE-TASTERS appeared, its purpose self-evident. The socialization also inspired more science with HUMAN-NETS, a community for researchers to discuss the human factors of these proto-online communities.

In the 1980s, these chain emails saw the first use of spoiler alerts, for the death of Spock in The Wrath of Khan (oops, sorry if I spoiled that for you), and emoticons: “:-)” to indicate a joke and “:-(” to indicate a non-joke. (I’d say the semantics of those have drifted over the years.)

And here we are, almost forty years later, still doing the same shit, only with slightly more sophistication and bandwidth, on the commercial successors to those early email threads.


Contemporary Versions of WPA Public Service Announcement Posters

For Topic, MGMT. Design created a series of 2018 updates to the iconic WPA public service announcement posters from the 1930s and 1940s. Some of the posters directly imitate specific WPA posters; others take a looser inspiration.

For example, here’s a “Don’t Mix ‘Em” WPA poster:

WPA-1934-1.jpeg

And here’s the corresponding 2018 version:

WPA-2018-1.jpeg

And the rest of the contemporary series:

WPA-2018-2.jpeg

WPA-2018-3.jpeg

WPA-2018-4.jpeg

WPA-2018-5.jpeg

In the accompanying essay, the designers write that “we noticed that the advertising of the 1930s and ’40s seemed far less cynical or manipulative than it is today… Today’s distribution methods have created a relentless flood of messages, putting a torrent of information in the palm of your hand. How the public values, rejects, or embraces this version of public information is up to them.”

The other obvious difference is the overall mood of the messages. The WPA posters are direct, imperative, and point towards solutions, even when they’re being particularly grim about it. The contemporary versions are ironic, diffident, and uncertain about solutions — or at least, uncertain about solutions that can be reduced to a bold-type message across a poster. (Except “Don’t Send Dick Pics.” That one, they’ve got nailed.)

At the same time, there’s a yearning for that level of clarity, aesthetically if not intellectually. All of this seems frustrating but basically honest about the mood and limitations of this political moment.


Rituals of democracy

At The Atlantic, Yoni Appelbaum argues that the decline in democracy isn’t just about voter disenfranchisement, gerrymandering, and radical inequality, but also about the decline of smaller civic institutions. The 19th century saw a boom in democratic governance of public/private associations, not just local and federal government:

By the latter half of the 19th century, more and more of these associations mirrored the federal government in form: Local chapters elected representatives to state-level gatherings, which sent delegates to national assemblies. “Associations are created, extended, and worked in the United States more quickly and effectively than in any other country,” marveled the British statesman James Bryce in 1888. These groups had their own systems of checks and balances. Executive officers were accountable to legislative assemblies; independent judiciaries ensured that both complied with the rules. One typical 19th-century legal guide, published by the Knights of Pythias, a fraternal order, compiled 2,827 binding precedents for use in its tribunals.

The model proved remarkably adaptable. In business, shareholders elected boards of directors in accordance with corporate charters, while trade associations bound together independent firms. Labor unions chartered locals that elected officers and dispatched delegates to national gatherings. From churches to mutual insurers to fraternities to volunteer fire companies, America’s civic institutions were run not by aristocratic elites who inherited their offices, nor by centrally appointed administrators, but by democratically elected representatives…

This nation of presidents—and judges, representatives, and recording secretaries—obsessed over rules and procedures. Offices turned over at the end of fixed terms; new organizations were constantly formed. Ordinary Americans could expect to find themselves suddenly asked to join a committee or chair a meeting. In 1876, an army engineer named Henry Robert published his Pocket Manual of Rules of Order for Deliberative Assemblies, and it improbably became a best seller; within four decades, more than 500,000 copies were in print. It was, a Boston newspaper declared, “as indispensable as was the Catechism in more ecclesiastical times.”

We were, as the University of Georgia’s Walter Hill said in 1892, “a nation of Presidents.” And the decline of those traditions, those rituals of democracy, tracks a corresponding decline in respect for and interest in political democracy. That, at least, is Appelbaum’s take.

I’m less sure. I think the history and sociology of these voluntary associations is fascinating, and deserves to be part of what we talk about when we talk about democracy writ large. But I think (and I’d guess this would be taken as a friendly amendment) we also have to think about the transformations in other institutions we know are connected to democracy, like the media, public schools, and public institutions like libraries. These are places that don’t necessarily have elected officers, but that likewise help preserve certain rituals of democracy we all have to learn in order to interact in a democratic society. How we read, how we think, how we share space — all of this matters, the anthropology as much as the law.

In other words, there’s a formal and an informal side to democracy, and neither one is necessarily more important than the other. Not to mention that elections and officers are features of only a certain kind of democracy; there are other, more radical forms of democracy available to us — all of which were experimented with in the 18th and 19th centuries too, by anarchists, socialists, and other people who didn’t take the US’s electoral model as having definitively solved the question of what a democratic society might look like.

The broader truth I take from this is that, to borrow from Aristotle, we are what we do repeatedly. If we’re not a nation of Presidents any more, it means we’ve become a different kind of democracy, not necessarily an un-democracy. And that’s significant. We’ve mutated into something else: a different kind of mass democracy, linked together by a different set of institutions with a different set of principles, leading to a different set of possibilities.


Werner’s Nomenclature of Colours

While Isaac Newton and the 17th century were more decisive for understanding the physics of color, you can’t beat the late 18th and early 19th century for a broader, subtler, more humanistic sense of the science of colors. The playwright and polymath J.W. von Goethe built up his Theory of Colours by collecting almost 18,000 meteorological and mineralogical specimens, with an emphasis on subtle distinctions between colors and their psychological perception in nature, rather than wavelengths of light.

Another phenomenal collection of naturalist examples is Abraham Gottlob Werner’s Nomenclature of Colours, first published in 1814. An 1821 edition recommends it for “zoology, botany, chemistry, mineralogy, and morbid anatomy.” At My Modern Met, Kelly Richman-Abdou writes:

Nomenclature of Colours served as a must-have reference for artists, scientists, naturalists, and anthropologists alike. The exquisitely rendered guide showcases the earth’s rich range of color by separating it into specific tones. Illustrated only by a small swatch, each handwritten entry is accompanied by a flowery name (like “Arterial Blood Red” and “Velvet Black”) as well as an identifying number. What the book is truly known for, however, is its poetic descriptions of where each tone can be found in nature.

Werner was a German mineralogist who created the system of color classification in the book to help distinguish between his own samples. His Scottish collaborators Patrick Syme and Robert Jameson were a painter and naturalist, respectively, who adapted the system into the book format in which it exists today. As you might guess, each color in the book includes a name, a swatch, and examples from the animal, vegetable, and mineral world showing where each color is found in nature.

werners-nomenclature-of-colours-4.jpg

Probably the most famous user of Werner’s book was Charles Darwin, who used it to help describe animals and other bits of the natural world in his books and journals. But if you think about it, before photography, anything that let naturalists describe what they were seeing in something resembling a universal vocabulary had to be essential. Essential enough that they were willing to produce the book by hand, with no real way to print in color.

Amazon sells a pocket-sized facsimile edition of the book. It may not be as handy as a color wheel for painting a room, but might be handier if you’re identifying bird eggs or a rare bit of stone.


Writing as bureaucracy vs. writing as magic

Chinese oracle bone script.jpg

Michael Erard pokes away at the “administrative hypothesis,” the idea that ancient writing had its origin in accounting bureaucracies and existed primarily as a function of state power. There’s just as much evidence, he argues, that states and proto-states co-opted already-existing symbols used by pre-state farmers to keep tallies and mark time, and more provocatively, by priests who used writing as a script for prophecy, narrative, and magic spells.

Over and over, what we see is that writing is more like gunpowder than like a nuclear bomb. In each of the four sites of the independent invention of writing, there’s either no evidence one way or the other, or there’s evidence that a proto-writing pre-dated the administrative needs of the state. Even in Mesopotamia, a phonetic cuneiform script was used for a few hundred years for accounting before writing was used for overtly political purposes. As far as the reductive argument that accountants invented writing in Mesopotamia, it’s true that writing came from counting, but temple priests get the credit more than accountants do. ‘Priests invented writing’ is a reduction I can live with - it posits writing as a tool for contacting the supernatural realm, recording the movement of spirits, inspecting the inscrutable wishes of divinities.

It’s a complex argument, because it has at least two parts:

1) writing wasn’t invented by states (even writing for accounting purposes);
2) writing has been invented for reasons other than an accounting function.

So most of Erard’s examples argue against one part of the most robust version of the administrative hypothesis, rather than refuting it outright. This is hardly a knockout blow, but it makes for some notable asterisks. (I wish there were more here about China.)


Rethinking “The Great Migration”

Union Terminal Colored Waiting Room.jpg

At CityLab, Brentin Mock makes a compelling case for rethinking the causes and consequences of black Americans’ 20th century relocation from the rural South to the industrialized north.

“The Great Migration” makes it sound like a bunch of people just packed up their bags headed for better jobs and homes—no different than the recent trend of Amazon-ian and Apple-American tech nerds moving in droves from Silicon Valley to greener, more affordable pastures in the former Rust Belt. In reality, the stakes for African Americans in the 20th century were much grimmer and urgent—they were moving to save their lives, as Bryan Stevenson, the racial justice advocate behind the lynching memorial and museum, regularly emphasizes. It probably should be called The Great Massive Forced Exodus.

When you look even closer, the idea of a single migration gets even messier, since black Americans weren’t free from lynching and other forms of violence, legal or extralegal, even after they reached the north. (The Autobiography of Malcolm X, among other books, tells this story very well.)

Race riots, redlining, and white flight, followed by gentrification and police harassment, continue to have the same effect of alternately pushing and constraining the black population around the country. So what you have is a kind of continual, whirling diaspora, shaped by similar forces but taking on different forms, that continues to and through the present.


Partners in prewar Greenwich Village

1940_ED_Greenwich_Village_overlay.jpg

My friend, the historian Dan Bouk, has made a fascinating find in the 1940 U.S. census. Over 200,000 people are listed as “partners” in census accountings of households, despite the fact that the census instructions make no particular allowance for “partners” as a category.

What’s more, partners show up disproportionately in neighborhoods where we know gay people settled: Greenwich Village, parts of the Upper East Side, and so forth.

All we can say for sure about Brand and Grant—and the other 11 partnerships that Davis recorded—is that they shared a sink. It seems likely that at least some of these households did live together as lesbian or gay couples. But it could also be that we find so many partnerships in this gay neighborhood, because the presence of queer folk indicated or amplified social and cultural spaces in which people could live openly in odd (that is, unusual), but non-sexual arrangements. The census didn’t ask about sexual identity or sexual behavior, and so we cannot know from the census alone.

As for those other partnerships, five were women living with women, the pairs always within a few years of each other in age. They included a pair of doctors, two travel agents (who may have been business partners, if nothing else), an editor and a secretary, and a secretary and a stenographer who both worked for the YMCA(!). Two of the partnerships were male-male, including a police detective who lived with a patrolman.

As Bouk writes of his own (romantic) partner, “I thought we were being very modern, that this was a new sort of relationship.” But partners seem to have been around and declared for a very long time.


The reaction time problem

reaction time over time.jpg

Francis Galton, a Victorian eugenicist and statistician, was obsessed with measuring reaction time as a proxy for general intelligence. In 1885, 1890, and 1892, he collected “data on the sensory, psychomotor, and physical attributes of 1,639 females and 4,849 males.” Eventually, though, reaction time gave way to other questionable measurements of generalized intelligence like IQ tests and scholastic aptitude scores, so most of us don’t keep track of our reaction times, if we’ve ever had them measured.

Here’s the thing, though — no one who’s tried to repeat Galton’s experiments in the 20th and 21st centuries, across populations, varying the equipment used and the measurement process, etc., has been able to get reaction times as fast as the ones Galton measured. IQ scores have generally risen over time; reaction times have slowed down. It’s a matter of milliseconds, but the effect is large: about 10 percent. It is quite possible that young adults in 19th-century Great Britain were just plain faster than we are.

Tom Stafford, writing at Mind Hacks, has some helpful caveats:

What are we to make of this? Normally we wouldn’t put much weight on a single study, even one with 3000 participants, but there aren’t many alternatives. It isn’t as if we can have access to young adults born in the 19th century to check if the result replicates. It’s a shame there aren’t more intervening studies, so we could test the reasonable prediction that participants in the 1930s should be about halfway between the Victorian and modern participants.

And, even if we believe this datum, what does it mean? A genuine decline in cognitive capacity? Excess cognitive load on other functions? Motivational changes? Changes in how experiments are run or approached by participants? I’m not giving up on the kids just yet.


Rosa Parks’s Arrest Warrant

A courthouse intern named Maya McKenzie, working on a housecleaning project, turned up a slew of rarely seen original documents from the Montgomery Bus Boycott. They include Rosa Parks’s arrest warrant and court records, as well as a bond posted for Martin Luther King Jr. on charges of conspiracy, and more.

“A lot of times in our schools, when we teach about the movement, it’s all centered around one person, one figure, but what this does is open up that world to give the back story, to let them know that there were so many people that were involved,” said Quinton T. Ross Jr., the president of Alabama State, a historically black university, where a professor once used a mimeograph machine to run off thousands of fliers announcing the boycott.

montgomery-documents-4-superJumbo.jpg

What’s odd is that the records appeared to have already been gathered together, but not for any clear reason. They had never been made public.


Cooking Babylonian stews, the oldest recipes ever found

The Yale Babylonian Collection has four cuneiform tablets that contain the world’s oldest known food recipes — nearly four thousand years old. Scholars think the recipes weren’t everyday cuisine, but dishes prepared for royal houses, because they’re 1) fairly complex and 2) written down. A Yale-Harvard team decided to cook three of the recipes (two lamb stews, one vegetarian) for an event at NYU called “An Appetite for the Past.”

The undertaking was not without its challenges, says [Yale curator Agnete] Lassen. “Not only were some of the ingredients that were used during this time period not available, but two of the tablets are poorly preserved — there are big holes in them. Some of these terms that appear in the Akkadian original are difficult to translate because these are words that don’t appear very often in the other texts that we have and that makes it very difficult to decipher them.”

“Having an understanding of what the food is supposed to feel and taste like is very important,” says Lassen. “We didn’t know what we were looking for. When we were recreating one of the recipes I kept thinking they were doing this wrong, ‘this is not how I would make this.’ And then when it had boiled for a while it suddenly transformed itself into something delicious.”

I wonder which of our recipes will still survive in four thousand years, and what historians of the future will make of the people who ate this food.

Update: University of Chicago Press has a book of ancient Mesopotamian recipes. “Offering everything from translated recipes for pigeon and gazelle stews, the contents of medicinal teas and broths, and the origins of ingredients native to the region, this book reveals the cuisine of one of history’s most fascinating societies.” I’ve never eaten pigeon, but people I know who have say it’s delicious. (I don’t know anyone who’s eaten gazelle.) Via Amy Drummond.


Against political analogies

It’s a common and (on its face) persuasive rhetorical move: take something that’s happening now and map it onto the past. Better yet, take something atrocious that’s happening now and show how it maps onto something atrocious in the past, ideally one that affected the very people who are now supporting the atrocities. “See?” this trope says: “what you’re doing to other people is exactly what was done to you.”

That’s the basic structure of “resistance genealogy,” as seen in clashes over immigration: Tomi Lahren’s great-great-grandfather forged citizenship papers; Mike Pence’s family benefited from “chain migration”; James Woods’ ancestors fled famine and moved to Britain as refugees; and so on.

Rebecca Onion argues, convincingly, that this doesn’t work:

The chasm between the life and experiences of a white American, even one who’s descended from desperate immigrants of decades past, and the life of this Honduran mother is the entire point of racist anti-immigration thought. Diminishment of the human qualities of entering immigrants (“unskilled” and “unmodern” immigrants coming from “shithole” countries) reinforces the distance between the two. People who support the Trump administration’s immigration policies want fewer Honduran mothers and their 18-month-olds to enter the country. If you start from this position, nothing you hear about illiterate Germans coming to the United States in the 19th century will change your mind.

Besides underestimating racism, resistance genealogy flattens out history, and assumes that if people only knew more about patterns of historical racism, they might be convinced or at least shamed into changing how they talk about it. Everything we’ve seen suggests that isn’t the case.

I’m going to take this one step further and say this is a weakness in most resorts to historical and political analogies deployed as a tool to understand or persuade people about the present.

For example, consider Donald Trump saying, regarding immigrants trying to enter the United States, “these aren’t people, these are animals.” This is a disgusting thing to say and a disgusting way to think — and not just because German Nazis and Rwandan perpetrators of genocide used similar language in different contexts, and regardless of whether he was using it to refer to immigrants in general or to members of a specific gang. It’s bad, it’s racist, it’s shitty, and you really don’t need the added leverage of a historical analogy in order to see why. But that leverage is tempting, because it shows off how much we know, it underlines the stakes, and it converts bad into ultra-bad.

This hurts me to say, because I love history and analogies both. But there’s a limit to how much they can tell us and how well they work. And playing “gotcha!” is usually well beyond the limits of both.


Why humans need stories

Tablet V of the Epic of Gilgamesh, The Sulaymaniyah Museum, Iraq

There’s a tendency these days to disregard the idea of “storytelling.” Like so many terms, it’s been overused, its meaning stretched to within an inch of its life. We watch a lot of Netflix and obsess over some stories in the news, but we don’t read as many books and we don’t gather around the fire to tell stories so much. Still, stories have been part of our lives forever. In Our fiction addiction: Why humans need stories, the author takes us through some of the oldest stories we tell and why evolutionary theorists are studying them.

One common idea is that storytelling is a form of cognitive play that hones our minds, allowing us to simulate the world around us and imagine different strategies, particularly in social situations. “It teaches us about other people and it’s a practice in empathy and theory of mind,” says Joseph Carroll at the University of Missouri-St Louis. […]

Providing some evidence for this theory, brain scans have shown that reading or hearing stories activates various areas of the cortex that are known to be involved in social and emotional processing, and the more people read fiction, the easier they find it to empathise with other people. […]

Crucially, this then appeared to translate to their real-life behaviour; the groups that appeared to invest the most in storytelling also proved to be the most cooperative during various experimental tasks - exactly as the evolutionary theory would suggest. […]

By mapping the spread of oral folktales across different cultural groups in Europe and Asia, some anthropologists have also estimated that certain folktales - such as the Faustian story of The Smith and the Devil - may have arrived with the first Indo-European settlers more than 6,000 years ago, who then spread out and conquered the continent, bringing their fiction with them.

The author also notes that “we have no firm evidence of storytelling before the advent of writing,” but then goes on to write about the paintings in Lascaux, which seem to be telling stories, so he’s aware of some candidates. Randomly, today I also happened on this piece about Australia’s ancient language shaped by sharks, which tells the beautiful history of the Yanyuwa people and their relationship with the tiger shark. They’ve been “dreaming,” telling stories, for 40,000 years!

This forms one of the oldest stories in the world, the tiger shark dreaming. The ‘dreaming’ is what Aboriginal people call their more than 40,000-year-old history and mythology; in this case, the dreaming describes how the Gulf of Carpentaria and rivers were created by the tiger shark.

And then there’s this incredible aspect of their culture:

What’s especially unusual about Yanyuwa is that it’s one of the few languages in the world where men and women speak different dialects. Only three women speak the women’s dialect fluently now, and Friday is one of few males who still speaks the men’s. Aboriginal people in previous decades were forced to speak English, and now there are only a few elderly people left who remember the language.


Connecticut silk

Did you know Connecticut nearly had a thriving silk industry? Atlas Obscura has a short history of that silk adventure, from mulberry trees to attics, speculative bubbles, and lumpy thread.

By 1826, three out of every four households in Mansfield, Connecticut, were raising silkworms, and that same year Congress commissioned a report on the potential for a U.S. silk industry. By 1840, Connecticut outpaced other states in raw silk production by a factor of three. Within the next two decades, however, the industry would collapse, leaving the country to wonder what went wrong.

Silk

One of the biggest triumphs for the early industry was figuring out how to adapt sericulture to cold weather. Such tactics included keeping silkworms warm by raising them in attics, and figuring out how to feed them in cold weather.

Leaves

The product they ended up with was adequate for sewing thread, but not strong enough for the industrial-silk-manufacturing infrastructure that Connecticut had begun to build. According to one scathing assessment, “Connecticut women in 70 years have not improved their knowledge of reeling.” Another issue, Stockard says, was the expectation that women could reel silk “whenever leisure from other duties permitted.” In other words, women were supposed to wedge a high-skill, time-intensive task into their already full workloads, and the result was sub-par silk.

(Via @justinpickard.)


Twitter history walk threads

Paul Cooper Norfolk church walk

One of my favourite styles of Twitter thread in recent months has been the “history walk”: people pick something they want to see, usually a ruin or forgotten place, and document their walk there and the things they discover along the way. Admittedly, I don’t have that many examples, but the few I have seen are fantastic enough to make the form a favourite.

The first, from just this past weekend, has Paul Cooper setting out into the Norfolk countryside in search of the ruins of the church of St Mary’s, which “local folklore claims as the resting place of the Somerton Witch.” I’m including a few pictures below, but read the whole thread; it’s packed with historical tidbits.

(The first picture above is of Neolithic mines, which dot the landscape like lunar craters. The deepest could reach as much as 60 feet.)

Paul Cooper Norfolk church walk

Paul Cooper Norfolk church walk

Paul Cooper Norfolk church walk 4

Paul Cooper Norfolk church walk 5

If you’re a history buff, you should also check out Paul’s whole feed; he regularly posts long threads like these on various historical topics.

The second example, also in the English countryside, is this one by @gawanmac:

I saw this on an OS map and couldn’t not investigate. A place of worship symbol in the middle of bloody nowhere on the edge of a wood. It was a foggy, atmospheric day up on the North Downs, so I decided to walk three sides of a square through the wood to reach it.

gawanmac North Downs church walk 1

gawanmac North Downs church walk 2

gawanmac North Downs church walk 3

gawanmac North Downs church walk 4


history of the entire world, i guess

You may remember Bill Wurtz from his video history of Japan, which I called “the most entertaining history of anything I have ever seen”. I still stand by that, but his new video on the history of the Earth from before the Big Bang — “a long time ago, actually never, also now, nothing is nowhere” — to the present day is just as good. On the development of the ozone layer billions of years ago:

Hey, can we go on land?

NO.

Why?

The Sun is a deadly lazer.

Oh, ok.

Not anymore, there’s a blanket.