
kottke.org posts about computing

Silicon Cowboys, a documentary film on the history of Compaq Computer

Silicon Cowboys

Silicon Cowboys is an upcoming documentary about Compaq Computer, one of the first companies to challenge IBM with a compatible computer.

Launched in 1982 by three friends in a Houston diner, Compaq Computer set out to build a portable PC to take on IBM, the world’s most powerful tech company. Many had tried cloning the industry leader’s code, only to be trounced by IBM and its high-priced lawyers. SILICON COWBOYS explores the remarkable David vs. Goliath story, and eventual demise, of Compaq, an unlikely upstart who altered the future of computing and helped shape the world as we know it today. Directed by Oscar(R)-nominated director Jason Cohen, the film offers a fresh look at the explosive rise of the 1980’s PC industry and is a refreshing alternative to the familiar narratives of Jobs, Gates, and Zuckerberg.

There’s no trailer yet, but the film is set to debut at SXSW in March. The first season of Halt and Catch Fire had a lot of influences, but the bare-bones story was that of Compaq.

Many reviews mention the similarity of the characters to Apple founders Steve Jobs and Steve Wozniak, but the trio of managers from Texas Instruments who left to form Compaq in the early 80s are a much closer fit. The Compaq Portable was the first 100% IBM compatible computer produced.


Computer Show

From Adam Lisagor’s Sandwich Video comes Computer Show, a present-day send-up of a personal computing show set in 1983. The guests and their products are contemporary and real, but the hosts are stuck in 1983 and don’t really know what the web is, what Reddit is, what links are, or anything like that.

“Computer Show” is a technology talk show, set in 1983. The dawn of the personal computing revolution. Awkward hair and awkward suits. Primitive synths and crude graphics. VHS tapes. No Internet. But there’s a twist.

The guests on this show are tech luminaries: experts, founders, thinkers, entrepreneurs…from 2015. They are real, and they are really on “Computer Show” to talk about their thing. Will it go well? Can they break through to the host Gary Fabert (played by Rob Baedeker of the SF-based sketch mainstay Kasper Hauser) and his rotating cast of co-hosts, who know of neither iPhone nor website nor Twitter nor…hardly anything?

The first episode, featuring Lumi, is embedded above and here’s the second episode with Reddit cofounder Alexis Ohanian.

Hopefully they’ll get to make more.

Update: In an interview with Inc., Lisagor shares how Computer Show came about.

So it just became clear to Roxana and Tony, there was something here. Roxana had the idea to make something in this universe. To produce a show like “The Computer Chronicles”.

One day, over Slack, Roxana asked me for the contact info of a producer I know. When I asked why, she told me she had this idea, and also asked if I’d maybe be a contributor on it. When I got more info out of her (she tends to be a little private about her personal projects), she explained that she had this idea of a tech talk show set in the early ’80s, where the hosts would interview guests from the modern day, and I just about flipped out and lost my mind I was so excited.

Update: While we not-so-patiently await new episodes of Computer Show, at least we can buy the t-shirt.


New York Historical Society’s Silicon City exhibit

Early NYC Computing

The Silicon City exhibit at the New York Historical Society takes a look at the long history of computing in NYC.

Every 15 minutes, for nearly a year, 500 men, women, and children rose majestically into “the egg,” Eero Saarinen’s idiosyncratic theater at the 1964 World’s Fair. It was very likely their first introduction to computer logic. Computing was not new. But for the general public, IBM’s iconic pavilion was a high profile coming out party, and Silicon City will harness it to introduce New York’s role in helping midwife the digital age.

The exhibit opens on November 13, 2015 and runs through next April. The museum is using Kickstarter to help bring the Telstar satellite back to NYC for the exhibit.


A 1968 computer art contest

Computer Art 1968

From the August 1968 issue of Computers and Automation magazine, the results of their Sixth Annual Computer Art Contest (flip to page 8).

Computer Art 1968

Computer Art 1968

It’s also worth paging through the rest of the magazine just for the ads.

Update: Looks like The Verge saw this post and did a followup on the history of the Computer Art Contest.

In any given issue, Computers and Automation devoted equal time to the latest methods of database storage and grand questions about the future of their “great instrument,” but the Computer Art Contest was soon a regular event. A look back through old issues of the journal (available at Internet Archive) shows how the fledgling discipline of computer art rapidly evolved. At the time, computers were specialized tools, most commonly used by individuals working in research labs, academia, or the military, and this heritage shows. Both the first and second prizes for the inaugural 1963 competition went to designs generated at the same military lab.


The computer collector

Lonnie Mimms has a gigantic collection of vintage computers, software, and peripherals. You don’t realize the scope of the collection until you see him walking around the Apple pop-up exhibit he built inside of an abandoned CompUSA.


Simple CPU

Very quickly, here’s how a computer works at the simplest level.

D Latch

Want to see how computers store data? This next device is called a ‘D-Latch’. It holds a binary bit. The top switch is the value to be stored, the bottom switch enables storage. Eight of these devices can be used to store a byte in memory.
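Just to make that concrete, here's a quick Python sketch of the same idea (my own toy model, not anything from the Simple CPU site): a latch that copies its data input to its output only while the enable input is high, and eight of them ganged together holding a byte.

```python
class DLatch:
    """Level-sensitive D latch: output follows D while enable is high, holds otherwise."""
    def __init__(self):
        self.q = 0  # the stored bit

    def update(self, d, enable):
        if enable:       # transparent: output follows the data input
            self.q = d
        return self.q    # enable low: the previous value is held

class ByteRegister:
    """Eight D latches sharing one enable line store a byte, as described above."""
    def __init__(self):
        self.latches = [DLatch() for _ in range(8)]

    def write(self, value, enable=True):
        bits = [(value >> i) & 1 for i in range(8)]
        for latch, bit in zip(self.latches, bits):
            latch.update(bit, enable)

    def read(self):
        return sum(latch.q << i for i, latch in enumerate(self.latches))

reg = ByteRegister()
reg.write(0b10110010)                 # enable high: the byte is latched
reg.write(0b00000000, enable=False)   # enable low: the stored byte is unchanged
print(bin(reg.read()))                # -> 0b10110010
```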


Halt and Catch Fire

Halt And Catch Fire

I’ve been hearing some good things about Halt and Catch Fire, which is three episodes into its first season on AMC. The show follows a group of 80s computer folk as they attempt to reverse engineer the IBM PC. The first episode is available online in its entirety.

Many reviews mention the similarity of the characters to Apple founders Steve Jobs and Steve Wozniak, but the trio of managers from Texas Instruments who left to form Compaq in the early 80s are a much closer fit. The Compaq Portable was the first 100% IBM compatible computer produced. Brian McCullough recently did a piece on Compaq’s cloning of the IBM PC for the Internet History Podcast.

The idea was to create a computer that was mostly like IBM-PC and mostly ran all the same software, but sold at a cheaper price point. The first company to pursue this strategy was Columbia Data Products, followed by Eagle Computer. But soon, most of the big names in the young computer industry (Xerox, Hewlett-Packard, Digital Equipment Corporation, Texas Instruments, and Wang) were all producing PC clones.

But all of these machines were only mostly PC-compatible. So, at best, they were DOS compatible. But there was no guarantee that each and every program or peripheral that ran on the IBM-PC could run on a clone. The key innovation that Canion, Harris and Murto planned to bring to market under the name Compaq Computer Corporation would be a no-compromises, 100% IBM-PC compatibility. This way, their portable computer would be able to run every single piece of software developed for the IBM-PC. They would be able to launch their machine into the largest and most vibrant software ecosystem of the time, and users would be able to use all their favorite programs on the road.

My dad bought a machine from Columbia Data Products; I had no idea it was the first compatible to hit the market. My uncle had a Compaq Portable that he could take with him on business trips. I played so much Lode Runner on both of those machines. I wonder if that disk of levels I created is still around anywhere… (via @cabel)

Update: I’m all caught up, five episodes into the season, and I’m loving it.


Turing Test passed for the first time

A supercomputer running a program simulating a 13-year-old boy named Eugene has passed the Turing Test at an event held at London’s Royal Society.

The Turing Test is based on 20th century mathematician and code-breaker Turing’s 1950 famous question and answer game, ‘Can Machines Think?’. The experiment investigates whether people can detect if they are talking to machines or humans. The event is particularly poignant as it took place on the 60th anniversary of Turing’s death, nearly six months after he was given a posthumous royal pardon.

If a computer is mistaken for a human more than 30% of the time during a series of five minute keyboard conversations it passes the test. No computer has ever achieved this, until now. Eugene managed to convince 33% of the human judges that it was human.
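The pass criterion is just arithmetic: the fraction of judges fooled versus the 30% threshold. A trivial Python sketch (the judge counts below are made up for illustration; the event's actual tallies weren't in the report):

```python
# Hypothetical verdicts: True means the judge thought the chatbot was human.
verdicts = [True] * 10 + [False] * 20   # 10 of 30 judges fooled (invented split)

fooled_rate = sum(verdicts) / len(verdicts)
print(f"{fooled_rate:.0%} of judges fooled")              # -> 33% of judges fooled
print("Passes the 30% criterion:", fooled_rate > 0.30)    # -> True
```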

I’m sure there will be some debate as members of the AI and computing communities weigh in over the next few days, but at first blush, it seems like a significant result. The very first Long Bet concerned the Turing Test, with Mitch Kapor stating:

By 2029 no computer - or “machine intelligence” - will have passed the Turing Test.

and Ray Kurzweil opposing. The stakes are $20,000, but the terms are quite detailed, so who knows if Kurzweil has won.

Update: Kelly Oakes of Buzzfeed dumps some cold water on this result.

Of course the Turing Test hasn’t been passed. I think it’s a great shame it has been reported that way, because it reduces the worth of serious AI research. We are still a very long way from achieving human-level AI, and it trivialises Turing’s thought experiment (which is fraught with problems anyway) to suggest otherwise.


The Macintosh is 30 years old today

Apple is celebrating the 30th anniversary of the Macintosh with a special subsite.

Incredible that the Mac is still around; the 90s were a dire time for Apple and it’s amazing to see the current fantastic iMacs and MacBooks that came after some epically bad mid-90s machines. Here’s Steve Jobs introducing the original Mac in 1984 (a snippet of the full introduction video):

Steven Levy writes about covering the introduction of the Mac for Rolling Stone.

First, I met the machine. From the instant the woman running the demo switched on that strange-looking contraption (inspired in part by the Cuisinart food processor), I knew the Macintosh would change millions of lives, including my own. To understand that, you must realize how much 1984 really was not like 2014. Until that point, personal computers were locked in an esoteric realm of codes and commands. They looked unfriendly, with the letters of text glowing in sickly phosphorescence. Even the simplest tasks required memorizing the proper intonations, then executing several exacting steps.

But the Macintosh was friendly. It opened with a smile. Words appeared with the clarity of text on a printed page - and for the first time, ordinary people had the power to format text as professional printers did. Selecting and moving text was made dramatically easier by the then-quaint mouse accompanying the keyboard. You could draw on it. This humble shoebox-sized machine had a simplicity that instantly empowered you.

Here’s the piece Levy wrote for Rolling Stone.

If you have had any prior experience with personal computers, what you might expect to see is some sort of opaque code, called a “prompt,” consisting of phosphorescent green or white letters on a murky background. What you see with Macintosh is the Finder. On a pleasant, light background (you can later change the background to any of a number of patterns, if you like), little pictures called “icons” appear, representing choices available to you. A word-processing program might be represented by a pen, while the program that lets you draw pictures might have a paintbrush icon. A file would represent stored documents - book reports, letters, legal briefs and so forth. To see a particular file, you’d move the mouse, which would, in turn, move the cursor to the file you wanted. You’d tap a button on the mouse twice, and the contents of the file would appear on the screen: dark on light, just like a piece of paper.

Levy has also appended a never-seen-before transcript of his interview with Steve Jobs onto the Kindle version of Insanely Great, a book Levy wrote about the Mac.

Dave Winer participated on a panel of developers on launch day.

The rollout on January 24th was like a college graduation ceremony. There were the fratboys, the insiders, the football players, and developers played a role too. We praised their product, their achievement, and they showed off our work. Apple took a serious stake in the success of software on their platform. They also had strong opinions about how our software should work, which in hindsight were almost all good ideas. The idea of user interface standards was at the time controversial. Today, you’ll get no argument from me. It’s better to have one way to do things, than have two or more, no matter how much better the new ones are.

That day, I was on a panel of developers, talking to the press about the new machine. We were all gushing, all excited to be there. I still get goosebumps thinking about it today.

MacOS System 1.1 emulator. (via @gruber)

iFixit did a teardown of the 128K Macintosh.

Jason Snell interviewed several Apple execs about the 30th anniversary for Macworld. (via df)

What’s clear when you talk to Apple’s executives is that the company believes that people don’t have to choose between a laptop, a tablet, and a smartphone. Instead, Apple believes that every one of its products has particular strengths for particular tasks, and that people should be able to switch among them with ease. This is why the Mac is still relevant, 30 years on - because sometimes a device with a keyboard and a trackpad is the best tool for the job.

“It’s not an either/or,” Schiller said. “It’s a world where you’re going to have a phone, a tablet, a computer, you don’t have to choose. And so what’s more important is how you seamlessly move between them all…. It’s not like this is a laptop person and that’s a tablet person. It doesn’t have to be that way.”

Snell previously interviewed Steve Jobs on the 20th anniversary of the Mac, which includes an essay that Jobs wrote for the very first issue of Macworld in 1984:

The Macintosh is the future of Apple Computer. And it’s being done by a bunch of people who are incredibly talented but who in most organizations would be working three levels below the impact of the decisions they’re making in the organization. It’s one of those things that you know won’t last forever. The group might stay together maybe for one more iteration of the product, and then they’ll go their separate ways. For a very special moment, all of us have come together to make this new product. We feel this may be the best thing we’ll ever do with our lives.

Here’s a look inside that first Macworld issue.

As always, Folklore.org is an amazing source for stories about the Mac told by the folks who were there.

Susan Kare designed the icons, the interface elements, and fonts for the original Macintosh. Have a look at her Apple portfolio or buy some prints of the original Mac icons.

Stephen Fry recounts his experience with the Mac, including the little tidbit that he and Douglas Adams bought the first two Macs in Europe (as far as he knows).

I like to claim that I bought the second Macintosh computer ever sold in Europe in that January, 30 years ago. My friend and hero Douglas Adams was in the queue ahead of me. For all I know someone somewhere had bought one ten minutes earlier, but these were the first two that the only shop selling them in London had in stock on the 24th January 1984, so I’m sticking to my story.

Review of the Mac in the NY Times from 1984.

The Next Web has an interview with Daniel Kottke (no relation) and Randy Wigginton on programming the original Mac.

TNW: When you look at today’s Macs, as well as the iPhone and the iPad, do you see how it traces back to that original genesis?

Randy: It was more of a philosophy - let’s bring the theoretical into now - and the focus was on the user, not on the programmer. Before then it had always been let’s make it so programmers can do stuff and produce programs.

Here, it was all about the user, and the programmers had to work their asses off to make it easy for the user to do what they wanted. It was the principle of least surprise. We never wanted [the Macintosh] to do something that people were shocked at. These are things that we just take for granted now. The whole undo paradigm? It didn’t exist before that.

Like Daniel says, it’s definitely the case that there were academic and business places with similar technology, but they had never attempted to reach a mass market.

Daniel: I’m just struck by the parallel now, thinking about what the Mac did. The paradigm before the Mac in terms of Apple products was command-line commands in the Apple II and the Apple III. In the open source world of Linux, I’m messing around with Raspberry Pis now, and it terrifies me, because I think, “This is not ready for the consumer,” but then I think about Android, which is built on top of Linux. So the Macintosh did for the Apple II paradigm what Android has done for Linux.

A week after Jobs unveiled the Mac at the Apple shareholders meeting, he did the whole thing again at a meeting of the Boston Computer Society. Time has the recently unearthed video of the event.


Is Google’s quantum computer even quantum?

Google and NASA recently bought a D-Wave quantum computer. But according to a piece by Sophie Bushwick published on the Physics Buzz Blog, there isn’t scientific consensus on whether the computer is actually using quantum effects to calculate.

In theory, quantum computers can perform calculations far faster than their classical counterparts to solve incredibly complex problems. They do this by storing information in quantum bits, or qubits.

At any given moment, each of a classical computer’s bits can only be in an “on” or an “off” state. They exist inside conventional electronic circuits, which follow the 19th-century rules of classical physics. A qubit, on the other hand, can be created with an electron, or inside a superconducting loop. Obeying the counterintuitive logic of quantum mechanics, a qubit can act as if it’s “on” and “off” simultaneously. It can also become tightly linked to the state of its fellow qubits, a situation called entanglement. These are two of the unusual properties that enable quantum computers to test multiple solutions at the same time.

But in practice, a physical quantum computer is incredibly difficult to run. Entanglement is delicate, and very easily disrupted by outside influences. Add more qubits to increase the device’s calculating power, and it becomes more difficult to maintain entanglement.
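If you want to see "on and off simultaneously" and entanglement in the smallest possible example, here's a state-vector sketch in Python (purely my own illustration of the two properties the excerpt mentions; it says nothing about how D-Wave's machine actually works):

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)   # |0>, a bit that's "off"
one = np.array([0, 1], dtype=complex)    # |1>, a bit that's "on"

# Superposition: equal parts |0> and |1>, so a measurement gives each result half the time.
plus = (zero + one) / np.sqrt(2)
print("P(off), P(on) =", round(abs(plus[0]) ** 2, 2), round(abs(plus[1]) ** 2, 2))

# Entanglement: the two-qubit Bell state (|00> + |11>) / sqrt(2). The two qubits'
# outcomes are perfectly correlated, and the state can't be written as a product
# of two independent single-qubit states.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
probs = {label: round(float(abs(amp)) ** 2, 2)
         for label, amp in zip(["00", "01", "10", "11"], bell)}
print(probs)   # {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}
```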

(via fine structure)


Google’s new quantum computer

Google’s got themselves a quantum computer (they’re sharing it with NASA) and they made a little video about it:

I’m sure that Hartmut is a smart guy and all, but he’s got a promising career as an Arnold Schwarzenegger impersonator hanging out there if the whole Google thing doesn’t work out.


The Apollo Guidance Computer

A 30-minute documentary from the 60s on the Apollo Guidance Computer.


Richard Feynman and The Connection Machine

I will read stories about Richard Feynman all day long and this one is no exception. Danny Hillis remembers his friend and colleague in this piece originally written for Physics Today (original here).

Richard arrived in Boston the day after the company was incorporated. We had been busy raising the money, finding a place to rent, issuing stock, etc. We set up in an old mansion just outside of the city, and when Richard showed up we were still recovering from the shock of having the first few million dollars in the bank. No one had thought about anything technical for several months. We were arguing about what the name of the company should be when Richard walked in, saluted, and said, “Richard Feynman reporting for duty. OK, boss, what’s my assignment?” The assembled group of not-quite-graduated MIT students was astounded.

After a hurried private discussion (“I don’t know, you hired him…”), we informed Richard that his assignment would be to advise on the application of parallel processing to scientific problems.

“That sounds like a bunch of baloney,” he said. “Give me something real to do.”

So we sent him out to buy some office supplies. While he was gone, we decided that the part of the machine that we were most worried about was the router that delivered messages from one processor to another. We were not sure that our design was going to work. When Richard returned from buying pencils, we gave him the assignment of analyzing the router.

For more Hillis, I recommend Pattern on the Stone and for more Feynman, you can’t go wrong with Gleick’s Genius.


Don’t mess with Texas’s old computers

As recently as last year, a liquid filtration company in Texas was still using a computer built in 1948 to run all of its accounting work.

Sparkler’s IBM 402 is not a traditional computer, but an automated electromechanical tabulator that can be programmed (or more accurately, wired) to print out certain results based on values encoded into stacks of 80-column Hollerith-type punched cards.

Companies traditionally used the 402 for accounting, since the machine could take a long list of numbers, add them up, and print a detailed written report. In a sense, you could consider it a 3000-pound spreadsheet machine. That’s exactly how Sparkler Filters uses its IBM 402, which could very well be the last fully operational 402 on the planet. As it has for over half a century, the firm still runs all of its accounting work (payroll, sales, and inventory) through the IBM 402. The machine prints out reports on wide, tractor-fed paper.
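For a sense of what "a 3000-pound spreadsheet machine" was actually doing, here's the same kind of job in a few lines of Python (my own sketch with an invented card layout; the real 402 was "programmed" by wiring a plugboard, not by software):

```python
def card(account, cents):
    """Lay out one 80-column card: account name in cols 1-20, amount in cols 21-30."""
    return f"{account:<20}{cents:010d}".ljust(80)

# A tiny deck of punched cards (the accounts and amounts are made up).
deck = [card("PAYROLL", 125000), card("SALES", 340050), card("INVENTORY", 78125)]

total = 0
print(f"{'ACCOUNT':<20}{'AMOUNT':>12}")
for c in deck:
    account = c[:20].strip()
    amount = int(c[20:30])   # the punched numeric field
    total += amount
    print(f"{account:<20}{amount / 100:>12.2f}")
print(f"{'TOTAL':<20}{total / 100:>12.2f}")
```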

Here’s what one of the computer’s apps looks like:

IBM 402 apps

Objects in motion tend to stay in motion.


Do we live in a computer simulation?

In 2003, British philosopher Nick Bostrom suggested that we might live in a computer simulation. From the abstract of Bostrom’s paper:

This paper argues that at least one of the following propositions is true: (1) the human species is very likely to go extinct before reaching a “posthuman” stage; (2) any posthuman civilization is extremely unlikely to run a significant number of simulations of their evolutionary history (or variations thereof); (3) we are almost certainly living in a computer simulation. It follows that the belief that there is a significant chance that we will one day become posthumans who run ancestor-simulations is false, unless we are currently living in a simulation. A number of other consequences of this result are also discussed.

The gist appears to be that if The Matrix is possible, someone has probably already invented it and we’re in it. Which, you know, whoa.

But researchers believe they have devised a test to check if we’re living in a computer simulation.

However, Savage said, there are signatures of resource constraints in present-day simulations that are likely to exist as well in simulations in the distant future, including the imprint of an underlying lattice if one is used to model the space-time continuum.

The supercomputers performing lattice quantum chromodynamics calculations essentially divide space-time into a four-dimensional grid. That allows researchers to examine what is called the strong force, one of the four fundamental forces of nature and the one that binds subatomic particles called quarks and gluons together into neutrons and protons at the core of atoms.

“If you make the simulations big enough, something like our universe should emerge,” Savage said. Then it would be a matter of looking for a “signature” in our universe that has an analog in the current small-scale simulations.

If it turns out we’re all really living in an episode of St. Elsewhere, I’m going to be really bummed. (via @CharlesCMann)


The invention of social computing

I’m going to link again to Errol Morris’ piece on his brother’s role in the invention of email…the final part was posted a few hours ago…the entire piece is well worth a read. As is the case with many of his movies, Morris uses the story of a key or unique individual to paint a broader picture; in this instance, as the story of his brother’s involvement with an early email system unfolds, we also learn about the beginnings of social computing.

Fernando Corbato: Back in the early ’60s, computers were getting bigger. And were expensive. So people resorted to a scheme called batch processing. It was like taking your clothes to the laundromat. You’d take your job in, and leave it in the input bins. The staff people would prerecord it onto these magnetic tapes. The magnetic tapes would be run by the computer. And then, the output would be printed. This cycle would take at best, several hours, or at worst, 24 hours. And it was maddening, because when you’re working on a complicated program, you can make a trivial slip-up - you left out a comma or something - and the program would crash. It was maddening. People are not perfect. You would try very hard to be careful, but you didn’t always make it. You’d design a program. You’d program it. And then you’d have to debug it and get it to work right. A process that could take, literally, a week, weeks, months -

People began to advocate a different tactic, which came to be called time-sharing. Take advantage of the speed of the computer and have people at typewriter-like terminals. In principle, it seemed like a good idea. It certainly seemed feasible. But no manufacturer knew how to do it. And the vendors were not terribly interested, because it was like suggesting to an automobile manufacturer that they go into the airplane business. It just was a new game. A group of us began to create experimental versions of time-sharing, to see if it was feasible. I was lucky enough to be in a position to try to do this at MIT. And we basically created the “Compatible Time Sharing System,” nicknamed CTSS from the initials, that worked on the large mainframes that IBM was producing. First it was going to be just a demo. And then, it kept escalating. Time-sharing caught the attention of a few visionary people, like Licklider, then at BBN, who picked up the mantle. He went to Washington to become part of one of the funding agencies, namely ARPA. ARPA has changed names back and forth from DARPA to ARPA. But it’s always the same thing.
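The difference Corbato is describing is easy to sketch. In batch processing, each job runs to completion before the next one starts; in time-sharing, the one machine hands out short slices so everyone at a terminal sees steady progress. A toy Python model (mine, not CTSS):

```python
from collections import deque

# (user, units of work) -- the numbers are invented for illustration
jobs = [("alice", 3), ("bob", 5), ("carol", 2)]

def batch(jobs):
    """Each job runs to completion before the next one even starts."""
    t = 0
    for user, work in jobs:
        t += work
        print(f"[batch]  t={t}: {user}'s job finishes")

def time_share(jobs, quantum=1):
    """Round-robin: every job gets a short slice of the machine in turn."""
    queue, t = deque(jobs), 0
    while queue:
        user, remaining = queue.popleft()
        t += min(quantum, remaining)
        remaining -= quantum
        if remaining > 0:
            queue.append((user, remaining))   # back of the line, more work to do
        else:
            print(f"[shared] t={t}: {user}'s job finishes")

batch(jobs)        # carol waits for alice and bob to finish completely
time_share(jobs)   # carol's short job is done long before bob's big one
```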

And it was this shift from batch processing to time-sharing that accidentally kickstarted people using computers in a social way…programming together, sending notes to each other, etc.

Robert Fano: Yes, the computer was connected through telephone lines to terminals. We had terminals all over the MIT campus. People could also use CTSS from other locations through the teletype network. CTSS was capable of serving about 20 people at a time without their being aware of one another. But they could also communicate with each other. A whole different view of computers was generated.

Before CTSS, people wrote programs for themselves. The idea of writing programs for somebody else to use was totally alien. With CTSS, programs and data could be stored in the common memory segment and they were available to the whole community. And that really took off. At a certain point, I started seeing the whole thing as a system that included the knowledge of the community. It was a completely new view. It was a remarkable event. In retrospect, I wish I had gotten a very smart social psychologist on the premises to look at and interpret what was happening to the community, because it was just unbelievable.

There was a community of people using the computer. They got to know each other through it. You could send an e-mail to somebody through the system. It was a completely new phenomenon.

It seems completely nutty to me that people using computers together - which is probably 100% of what people use computers for today (email, Twitter, Facebook, IM, etc.) - was an accidental byproduct of a system designed to let a lot of people use the same computer separately. Just goes to show, technology and invention work in unexpected ways sometimes…and just as “nature finds a way” in Jurassic Park, “social finds a way” with technology.


Report on personal computers from 1982

This James Fallows article from the July 1982 issue of The Atlantic Monthly is a wonderful technological time capsule. Fallows purchased a PC early in the 80s for use as a word processor.

For a while, I was a little worried about what they would come up with, especially after my father-in-law called to ask how important it was that I be able to use both upper- and lower-case letters. But finally, for a total of about $4,000, Optek gave me the machinery I have used happily to this day.

In the early days of personal computing, there were many competing machines, processors, and operating systems manufactured by a number of companies. The PC Fallows bought was a crazy-quilt of a machine - the monitor was made by Ball Corporation (the canning supplies company) and the printer was a converted IBM Selectric typewriter - and was soon obsolete.

If I had guessed right, my brand, the Processor Technology SOL, would have caught on, and today I’d have the equivalent of a Mercedes-Benz instead of a Hupmobile. I’d be able to buy new programs at the computer store, and I’d be able to plug in to all the over-the-phone services. But I guessed wrong, and I’m left with a specimen of an extinct breed. When I need new programs, I try to write them myself, and when I have a breakdown, I call the neighborhood craftsman, Leland Mull, who lovingly tends the dwindling local population of SOL-20s.


Nature’s quantum computers

One of the big bummers about quantum computing is the cold temperatures required (hundreds of degrees below zero). However, a number of researchers believe that certain algae and bacteria perform quantum calculations at room temperature.

The evidence comes from a study of how energy travels across the light-harvesting molecules involved in photosynthesis. The work has culminated this week in the extraordinary announcement that these molecules in a marine alga may exploit quantum processes at room temperature to transfer energy without loss. Physicists had previously ruled out quantum processes, arguing that they could not persist for long enough at such temperatures to achieve anything useful.

(via mr)


Bacterial computing

Scientists have created a really fast bacterial computer that can solve, among other things, a specialized case of the travelling salesman problem.

Programming such a computer is no easy task, however. The researchers coded a simplified version of the problem, using just three cities, by modifying the DNA of Escherichia coli bacteria. The cities were represented by a combination of genes causing the bacteria to glow red or green, and the possible routes between the cities were explored by the random shuffling of DNA. Bacteria producing the correct answer glowed both colours, turning them yellow.
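What the bacteria are doing with random DNA shuffling is, in software terms, a search over the possible orderings of the cities. For three cities the whole search fits in a few lines of Python (my own illustration; the distances are invented, and in the real experiment the "answer" is read off by colony color rather than computed):

```python
from itertools import permutations

# Invented distances between the three cities in the toy instance.
dist = {("A", "B"): 4, ("A", "C"): 7, ("B", "C"): 5}

def leg(a, b):
    return dist.get((a, b), dist.get((b, a)))

def route_length(order):
    """Total distance of visiting the cities in this order."""
    return sum(leg(a, b) for a, b in zip(order, order[1:]))

for order in sorted(permutations("ABC"), key=route_length):
    print("-".join(order), route_length(order))
# The shortest route plays the role of the colony that glows yellow.
```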

But just as vacuum tube and silicon chip-based computers became capable of more abstract calculations, perhaps the bacteria computer will follow the same developmental trajectory.


Following up on why HAL sings “Daisy,

Following up on why HAL sings “Daisy, Daisy” in 2001: A Space Odyssey, Lee Hartsfeld found a 1961 record with the Bell Labs recording on it at a junk shop for $10.


Why does HAL sing “Daisy, Daisy” in 2001: A Space Odyssey?

In 1962, Arthur C. Clarke was touring Bell Labs when he heard a demonstration of a song sung by an IBM 704 computer programmed by physicist John L. Kelly. The song, the first ever performed by a computer, was called “Daisy Bell”, more commonly known as “Bicycle Built for Two” or “Daisy, Daisy”. When Clarke collaborated with Stanley Kubrick on 2001: A Space Odyssey, they had HAL sing it while Dave powered him down.

A clip of a 1963 synthesized computer speech demonstration by Bell Labs featuring “Daisy Bell” was included on an album for the First Philadelphia Computer Music Festival. You can listen to it (it’s the last track) and the rest of the album at vintagecomputermusic.com. (via mark)

Update: A reader just reminded me that HAL may have been so named because each letter is off by one from IBM, although Arthur C. Clarke denies this. (thx, justin)


Boxes and Arrows has an interview with

Boxes and Arrows has an interview with Adam Greenfield on his new book, Everyware. “Increasingly invisible but present everywhere in our lives, [computing] has moved off the desktop and out into everyday life, affecting almost every one of us, whether we’re entirely aware of it or not.”


Khoi Vinh reports on computer technology in

Khoi Vinh reports on computer technology in Vietnam. They’re wired for broadband and Windows still dominates.


The $100 Laptop being designed by the MIT

The $100 Laptop being designed by the MIT Media Lab was recently unveiled. It’s bright green and has a hand-crank for recharging the battery, flash memory, USB ports, networking, etc. The target audience is children in third-world countries.


George Dyson visits Google on the 60th

George Dyson visits Google on the 60th anniversary of John von Neumann’s proposal for a digital computer. A quote from a Googler - “We are not scanning all those books to be read by people. We are scanning them to be read by an AI.” - highlights a quasi-philosophical question about Google Print…if a book is copied but nobody reads it, has it actually been copied? (Or something like that.)


Interesting rumination on the possibility of flash

Interesting rumination on the possibility of flash memory-based computers. “In two years I have a feeling that Jobs will announce an Intel-flash iBook that will be the thinnest laptop ever made, boasting the best battery life of any current machine”.


Biologists are beginning to simulate living things

Biologists are beginning to simulate living things by computer, molecule by molecule. They’re starting with E. coli, but they’ve still got a long way to go.


Apple introduces a touch-sensitive squeezable mouse

Apple introduces a touch-sensitive squeezable mouse.


As We May Think by Vannevar Bush

As We May Think by Vannevar Bush. This influential essay, which introduces Bush’s Memex concept, was published 60 years ago this month.


The top 500 supercomputers in the world

The top 500 supercomputers in the world.