Quotes

Alan Moore: interview on mtv.com

I have a theory, which has not let me down so far, that there is an inverse relationship between imagination and money. Because the more money and technology that is available to [create] a work, the less imagination there will be in it.

Tadhg Kelly: Stories, Structure, Abstraction and Games

And that's why Chess and Go remain as enduringly popular as they are, and why soccer is the most popular game on earth. Robustness and elegance are the key driving forces here, and they are in direct opposition to the brittleness and complexity, the defining traits of story.

Bill Tozier: Diverse themes observed at GECCO 2006

What one wants is to be able to talk with a diverse club of smart people, arrange to do short one-off research projects and simulations, publish papers or capture intellectual property quickly and easily, and move on to another conversation. Quickly. Easily. For a living. Can't do that in industry. Can't do that in the Academy. Yet in my experience, scientists and engineers all want it. Maybe even a few mathematicians and social scientists do, too.

Luiz Henrique de Figueiredo: lua-l

 > I found myself wishing to have a continue keyword [...]
 > I can't recall an official reason why it isn't in the language.
Lua evolves by answering "why?" not "why not?".

David Hestenes and Garret Sobczyk: Clifford Algebra to Geometric Calculus: A Unified Language for Mathematics, p xii

Klein's seminal analysis of the structure and history of mathematics brings to light two major processes by which mathematics grows and becomes organized... The one emphasizes algebraic structure while the other emphasizes geometric interpretation. Klein's analysis shows one process alternately dominating the other in the historical development of mathematics. But there is no necessary reason that the two processes should operate in mutual exclusion. Indeed, each process is undoubtedly grounded in one of the two great capacities of the human mind: the capacity for language and the capacity for spatial perception. From the psychological point of view, then, the fusion of algebra with geometry is so fundamental that one could well say, 'Geometry without algebra is dumb! Algebra without geometry is blind!'

Richard Hamming: The Unreasonable Effectiveness of Mathematics (1980)

The Postulates of Mathematics Were Not on the Stone Tablets that Moses Brought Down from Mt. Sinai. It is necessary to emphasize this. We begin with a vague concept in our minds, then we create various sets of postulates, and gradually we settle down to one particular set. In the rigorous postulational approach, the original concept is now replaced by what the postulates define. This makes further evolution of the concept rather difficult and as a result tends to slow down the evolution of mathematics. It is not that the postulation approach is wrong, only that its arbitrariness should be clearly recognized, and we should be prepared to change postulates when the need becomes apparent.

Richard Hamming: The Art of Doing Science and Engineering (1997)

Education is what, when, and why to do things. Training is how to do it.

In science, if you know what you are doing, you should not be doing it. In engineering, if you do not know what you are doing, you should not be doing it.

Dan Bricklin: interview on Triumph of the Nerds

People who saw [VisiCalc] and went and got it... Like an accountant, I remember showing it to one around here and he started shaking and said, "That's what I do all week. I could do it in an hour." ... I meet these people now, they come up to me and say, "I gotta tell you, you changed my life. You made accounting fun."

Dan Bricklin and Bob Frankston: interview on Triumph of the Nerds

DB: You know, looking back at how successful a lot of other people have been [as a result of our work], it's kind of sad that we weren't as successful...

BF: It would be very nice to be gazillionaires, but you can also understand that part of the reason was that that's not what we were trying to be.

DB: We were kids of the Sixties and what did you want to do? You wanted to make the world better, and you wanted to make your mark on the world and improve things, and we did it. So by the mark of what we would measure ourselves by, we were very successful.

Richard Hamming: You and Your Research

Somewhere around every seven years make a significant, if not complete, shift in your field. Thus, I shifted from numerical analysis, to hardware, to software, and so on, periodically, because you tend to use up your ideas. When you go to a new field, you have to start over as a baby. You are no longer the big mukity muk and you can start back there and you can start planting those acorns which will become the giant oaks. ...

You need to get into a new field to get new viewpoints, and before you use up all the old ones. You can do something about this, but it takes effort and energy. It takes courage to say, "Yes, I will give up my great reputation." For example, when error correcting codes were well launched, having these theories, I said, "Hamming, you are going to quit reading papers in the field; you are going to ignore it completely; you are going to try and do something else other than coast on that."

Sol Stein: Stein on Writing

Nonfiction conveys information. Fiction conveys emotion.

C.A.R. Hoare: The Emperor's Old Clothes

I note with fear and horror that even in 1980, language designers and users have not learned this lesson [mandatory run-time checking of array bounds]. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law.

C.A.R. Hoare: The Emperor's Old Clothes

I conclude that there are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult. It demands the same skill, devotion, insight, and even inspiration as the discovery of the simple physical laws which underlie the complex phenomena of nature. It also requires a willingness to accept objectives which are limited by physical, logical, and technological constraints, and to accept a compromise when conflicting objectives cannot be met. No committee will ever do this until it is too late.

Alan Kay: How Simply and Understandably Could The "Personal Computing Experience" Be Programmed?

When I first prepared this particular talk... I realized that my usual approach is usually critical. That is, a lot of the things that I do, that most people do, are because they hate something somebody else has done, or they hate that something hasn't been done. And I realized that informed criticism has completely been done in by the web. Because the web has produced so much uninformed criticism. It's kind of a Gresham's Law -- bad money drives the good money out of circulation. Bad criticism drives good criticism out of circulation. You just can't criticize anything.

George Orwell: Politics and the English Language

What is above all needed is to let the meaning choose the word, and not the other way around. In prose, the worst thing one can do with words is surrender to them. When you think of a concrete object, you think wordlessly, and then, if you want to describe the thing you have been visualising you probably hunt about until you find the exact words that seem to fit it. When you think of something abstract you are more inclined to use words from the start, and unless you make a conscious effort to prevent it, the existing dialect will come rushing in and do the job for you, at the expense of blurring or even changing your meaning. Probably it is better to put off using words as long as possible and get one's meaning as clear as one can through pictures and sensations. Afterward one can choose -- not simply accept -- the phrases that will best cover the meaning, and then switch round and decide what impressions one's words are likely to make on another person.

Mark Kennedy: Carrying a Sketchbook

I don't know what people expect to see when they look in a sketchbook, but they always seem mighty disappointed. I think people expect to see what they would see in a Hollywood version of a sketchbook. Whenever someone is sketching from life in a movie, it's always supposed to look tossed off and effortless, but it's really some totally finished and labored-over drawing that some artist spent hours rendering.

Any real sketchbook is full of misfires, false starts and stumbles, with a few successes sprinkled here and there. If you were capable of doing a perfect drawing every time, you wouldn't need to carry a sketchbook!

Dan Piponi: The Essence of Quantum Computing

What I am saying is in direct contradiction with what is said by some of the founding fathers of quantum computing. Actually, I think that's a good thing, it means that whether I'm right or wrong, I must be saying something non-trivial.

Joe Armstrong: A History of Erlang

It was during this conference that we realised that the work we were doing on Erlang was very different from a lot of mainstream work in telecommunications programming. Our major concern at the time was with detecting and recovering from errors. I remember Mike, Robert and I having great fun asking the same question over and over again: "what happens if it fails?" -- the answer we got was almost always a variant on "our model assumes no failures." We seemed to be the only people in the world designing a system that could recover from software failures.

John Napier: Hands

With the eye, the hand is our main source of contact with the physical environment. The hand has advantages over the eye because it can observe the environment by means of touch, and having observed it, it can immediately proceed to do something about it. The hand has other great advantages over the eye. It can see around corners and it can see in the dark.

Christopher Alexander: foreword to Richard Gabriel's "Patterns Of Software"

In my life as an architect, I find that the single thing which inhibits young professionals, new students most severely, is their acceptance of standards that are too low. If I ask a student whether her design is as good as Chartres, she often smiles tolerantly at me as if to say, "Of course not, that isn't what I am trying to do.... I could never do that."

Then, I express my disagreement, and tell her: "That standard must be our standard. If you are going to be a builder, no other standard is worthwhile. That is what I expect of myself in my own buildings, and it is what I expect of my students." Gradually, I show the students that they have a right to ask this of themselves, and must ask this of themselves. Once that level of standard is in their minds, they will be able to figure out, for themselves, how to do better, how to make something that is as profound as that.

Neil Postman: Amusing Ourselves to Death

[This argument] fixes its attention on the forms of human conversation, and postulates that how we are obliged to conduct such conversations will have the strongest possible influence on what ideas we can conveniently express. And what ideas are convenient to express inevitably become the important content of a culture.

Neil Postman: Amusing Ourselves to Death

We must remember that Galileo merely said that the language of nature is written in mathematics. He did not say everything is. And even the truth about nature need not be expressed in mathematics. For most of human history, the language of nature has been the language of myth and ritual. These forms, one might add, had the virtues of leaving nature unthreatened and of encouraging the belief that human beings are part of it. It hardly befits a people who stand ready to blow up the planet to praise themselves too vigorously for having found the true way to talk about nature.

Steven Johnson: Everything Bad Is Good For You

Now, I have no doubt that playing today's games does in fact improve your visual intelligence and your manual dexterity, but the virtues of gaming run far deeper than hand-eye coordination. When I read these ostensibly positive accounts of video games, they strike me as the equivalent of writing a story about the merits of the great novels and focusing on how reading them can improve your spelling.

Paul Hawken: The Ecology of Commerce

The purpose of all these suggestions is to end industrialism as we know it. Industrialism is over, in fact; the question remains how we organize the economy that follows. Either it falls in on us, and crushes civilization, or we reconstruct it and unleash the imagination of a more sustainable future into our daily acts of commerce. Protecting our industries because we want to be pro-business and pro-jobs will have the same level of effectiveness as did the Soviet effort to maintain its industries in the 1970s and 1980s.

Bjarne Stroustrup: interview in MIT Tech Review

Q: In The Design and Evolution of C++, you claim that Kierkegaard was an influence on your conception of the language. Is this a joke?

A: A bit pretentious, maybe, but not a joke. A lot of thinking about software development is focused on the group, the team, the company. This is often done to the point where the individual is completely submerged in corporate "culture" with no outlet for unique talents and skills. Corporate practices can be directly hostile to individuals with exceptional skills and initiative in technical matters. I consider such management of technical people cruel and wasteful. Kierkegaard was a strong proponent for the individual against "the crowd" and has some serious discussion of the importance of aesthetics and ethical behavior.

Stewart Brand: Environmental Heresies

The best way for doubters to control a questionable new technology is to embrace it, lest it remain wholly in the hands of enthusiasts who think there is nothing questionable about it.

Alan Moore: interview for "Authors on Anarchism"

In the future, we would have to be prepared for a situation in which we have firstly, no currency, and secondly, as a result of that, no government. So there are ways in which technology itself and the ways in which we respond to technology -- the ways in which we adapt our culture and our way of living to accommodate breakthroughs and movements in technology -- might give us a way to move around government. To evolve around government to a point where such a thing is no longer necessary or desirable. That is perhaps an optimistic vision, but it's one of the only realistic ways I can see it happening. ...

I really don't think that a violent revolution is ever going to provide a long-term solution to the problems of the ordinary person. I think that is something that we had best handle ourselves, and which we are most likely to achieve by the simple evolution of western society. But that might take quite a while, and whether we have that amount of time is, of course, open to debate.

Sheik Ahmed Zaki Yamani

The Stone Age didn't end for lack of stone, and the Oil Age will end long before the world runs out of oil.

Pavel Kobel: lua-l

Business demanding promise from [open-source] project is like business demanding promise from forest. If you like trees, you must do other thing to conserve.

Noam Chomsky: response to interview question regarding alternatives to capitalism

I think that, what used to be called centuries ago "wage slavery," is intolerable. And I don't think that people ought to be forced to rent themselves in order to survive. I think that the economic institutions ought to be run democratically, by their participants, by the communities in which they exist, and so on. And I think, basically, through various kinds of free association.

Kai Krause: Software is merely a Performance Art

I used to think "Software Design" is an art form.

I now believe that I was half-right:
it is indeed an art, but it has a rather short half-life:
Software is merely a performance art!

A momentary flash of brilliance, doomed to be overtaken by the next wave, or maybe even by its own sequel. Eaten alive by its successors. And time...

This is not to denigrate the genre of performance art: anamorphic sidewalk chalk drawings, Goldsworthy pebble piles or Norwegian carved-ice-hotels are admirable feats of human ingenuity, but they all share that ephemeral time limit: the first rain, wind or heat will dissolve the beauty, and the artist must be well aware of its fleeting glory.

For many years I have discussed this with friends that are writers, musicians, painters and the simple truth emerged: one can still read the words, hear the music and look at the images....

Their value and their appeal remains, in some cases even gain by familiarity: like a good wine it can improve over time. You can hum a tune you once liked, years later. You can read words or look at a painting from 300 years ago and still appreciate its truth and beauty today, as if brand new. Software, by that comparison, is more like Soufflé: enjoy it now, today, for tomorrow it has already collapsed on itself. Soufflé 1.1 is the thing to have, Version 2.0 is on the horizon.

It is a simple fact: hardly any of my software even still runs at all!

Richard Doherty: Diary of a Disaster: General Magic Goes Poof!

I'm visiting Woz and his daughter Suzanne, who is in the hospital after an emergency appendectomy, when another visitor asks if a certain friend has been told about the surgery. Woz proudly whips out his Magic Link to get her address and number. Before the device can retrieve the data, however, Suzanne produces the number from an address book in her handbag.

Seymour Papert: Mindstorms: children, computers, and powerful ideas

In many schools today, the phrase "computer-aided instruction" means making the computer teach the child. One might say the computer is being used to program the child. In my vision, the child programs the computer and, in doing so, both acquires a sense of mastery over a piece of the most modern and powerful technology and establishes an intimate contact with some of the deepest ideas from science, from mathematics, and from the art of intellectual model building. ...

Two fundamental ideas run through this book. The first is that it is possible to design computers so that learning to communicate with them can be a natural process, more like learning French by living in France than like trying to learn it through the unnatural process of American foreign-language instruction in classrooms. Second, learning to communicate with a computer may change the way other learning takes place. The computer can be a mathematics-speaking and an alphabetic-speaking entity. We are learning how to make computers with which children love to communicate. When this communication occurs, children learn mathematics as a living language. Moreover, mathematical communication and alphabetic communication are thereby both transformed from the alien and therefore difficult things they are for most children into natural and therefore easy ones. The idea of "talking mathematics" to a computer can be generalized to a view of learning mathematics in "Mathland"; that is to say, in a context which is to learning mathematics what living in France is to learning French.

Steven Johnson: Emergence: The Connected Lives of Ants, Brains, Cities, and Software

Cities bring minds together and put them into coherent slots. ... Ideas and goods flow readily within these clusters, leading to productive cross-pollination, ensuring that good ideas don't die out in rural isolation. The power unleashed by this data storage is evident in the earliest large-scale human settlements... By some accounts, grain cultivation, the plow, the potter's wheel, the sailboat, the draw loom, copper metallurgy, abstract mathematics, exact astronomical observation, the calendar -- all of these inventions appeared within centuries of the original urban populations. It's possible, even likely, that more isolated groups or individuals had stumbled upon some of those technologies at an earlier date, but they didn't become part of the collective intelligence of civilization until there were cities to store and transmit them.

Ken Robinson: TED 2006 talk

University professors... live in their heads. ... They're disembodied, in a kind of literal way. They look upon their body as a form of transport for their heads. It's a way of getting their heads to meetings.

J. Yee: email

Last night I went to a baby shower where a good number of the attendees were babies themselves. I kept thinking how ridiculous it is that people pour so much time and energy into supporting a single life, when there are so many others that need more support.

Will Wright: interview in Designing Interactions

We noticed that when we were designing The Sims, a certain degree of abstraction in the game is very beneficial. You don't actually get very close to the characters. You can't quite see their facial expressions, but everybody in their mind is imagining the facial expressions on the characters.

In computer game design, you're dealing with two processors. You've got the processor in front of you on the computer and you've got the processor in your head, and so the game itself is actually running on both. There are certain things that the computer is very good at, but there are other things that the human imagination is better at.

Wikileaks editors (anonymous): Wikileaks: About

Considering corporations as analogous to a nation state reveals the following properties:

  1. The right to vote does not exist except for share holders (analogous to land owners) and even there voting power is in proportion to ownership.
  2. All power issues from a central committee.
  3. There is no balancing division of power. There is no fourth estate. There are no juries and innocence is not presumed.
  4. Failure to submit to any order may result in instant exile.
  5. There is no freedom of speech.
  6. There is no right of association. Even love between men and women is forbidden without approval.
  7. The economy is centrally planned.
  8. There is pervasive surveillance of movement and electronic communication.
  9. The society is heavily regulated, to the degree many employees are told when, where and how many times a day they can go to the toilet.
  10. There is little transparency and freedom of information is unimaginable.
  11. Internal opposition groups are blackbanned, surveilled and/or marginalized whenever and wherever possible.

While having a GDP and population comparable to Belgium, Denmark or New Zealand, most corporations have nothing like their quality of civic freedoms and protections. Internally, some mirror the most pernicious aspects of the 1960s Soviet system. This is even more striking when the regional civic laws the company operates under are weak (such as in West Papua or South Korea); there, the character of these corporate tyrannies is unobscured by their surroundings.

Aaron Hertzmann: Machine Learning for Computer Graphics: A Manifesto and Tutorial

It is a truism that artificial intelligence research can never become successful, because its successes are not viewed as AI.

Chaim Gingold: Miniature Gardens & Magic Crayons

Will Wright points out that while playing games, people engage a game in their head, and what counts is this mental world. "So what we're trying to do as designers is build up these mental models in the player. The computer is just an incremental step, an intermediate model to the model in the player's head." His explanation of this concept works like this: somebody walks into a game store and looks at the cover of your game's box. Based on the front of the box, they start playing a game in their head, and if that game is interesting, they'll pick up the box and look at the back. They then play a new game in their head, closer to the one you've designed. If they like that game, then they'll buy the game and take it home.

Bret Victor: email (9/3/04)

Interface matters to me more than anything else, and it always has. I just never realized that. I've spent a lot of time over the years desperately trying to think of a "thing" to change the world. I now know why the search was fruitless -- things don't change the world. People change the world by using things. The focus must be on the "using", not the "thing". Now that I'm looking through the right end of the binoculars, I can see a lot more clearly, and there are projects and possibilities that genuinely interest me deeply.

Joe Armstrong: erlang-questions mailing list

The real principle is "let some other process fix the error". The "let it fail" philosophy is a consequence of this. ... look to make a fault-tolerant system you need TWO computers not ONE right ... and If you've got TWO computers you need to start thinking about distributed programming *whether you like or not* and if you're going to do distributed computing then you'll have to think about the following ...

Malcolm Gladwell: Group Think

[The] point is not that innovation attracts groups but that innovation is found in groups: that it tends to arise out of social interaction -- conversation, validation, the intimacy of proximity, and the look in your listener's eye that tells you you're onto something. ...

When [Erasmus Darwin, James Watt, Joseph Priestley, etc.] were not meeting, they were writing to each other with words of encouragement or advice or excitement. This was truly -- in a phrase that is invariably and unthinkingly used in the pejorative -- a mutual-admiration society. ...

What were they doing? Darwin, in a lovely phrase, called it "philosophical laughing," which was his way of saying that those who depart from cultural or intellectual consensus need people to walk beside them and laugh with them to give them confidence. ...

We divide [groups] into cults and clubs, and dismiss the former for their insularity and the latter for their banality. The cult is the place where, cut off from your peers, you become crazy. The club is the place where, surrounded by your peers, you become boring. Yet if you can combine the best of those two -- the right kind of insularity with the right kind of homogeneity -- you create an environment both safe enough and stimulating enough to make great thoughts possible.

Doug McIlroy: talk on the history of computing at Bell Labs

This machine ran for a good number of years, probably six, eight. And it is said that it never made an undetected error. What that means is that it never made an error that it did not diagnose itself and stop. Relay technology was very very defensive. The telephone switching system had to work. It was full of self-checking.

Ted Koppel: interview in Frontline: News War

To the extent that we're now judging journalism by the same standards that we apply to entertainment -- in other words, give the public what it wants, not necessarily what it ought to hear, what it ought to see, what it needs, but what it wants -- that may prove to be one of the greatest tragedies in the history of American journalism. ...

In the very early days of television news, the FCC still had teeth, and still used them every once in a while. And there was that little paragraph, section 315 of the FCC code, that said: "You shall operate in the public interest, convenience, and necessity." And what that meant was, you had to have a news division that told people what was important out there.

Andy Barnes: interview in Frontline: News War

The idea that all of the world should be measured in dollars to stockholders is actually a relatively new idea. It used to be that we thought that businesses had their purpose. Your purpose was to be making newspapers or fountain pens or whatever. And now we act as though the only purpose of a business was to enrich the people who trade it on Wall Street.... Of course you've got to have profit, of course you've got to support your ownership. But that's not why we're doing it. We're doing it because publishing a newspaper is a crucial thing to be doing.

danah boyd: Facebook's "Privacy Trainwreck": Exposure, Invasion, and Drama

i started wondering if social media is dangerous ... If gossip is too delicious to turn your back on and Flickr, Bloglines, Xanga, Facebook, etc. provide you with an infinite stream of gossip, you'll tune in. Yet, the reason that gossip is in your genes is because it's the human equivalent to grooming. By sharing and receiving gossip, you build a social bond between another human. Yet, what happens when the computer is providing you that gossip asynchronously? I doubt i'm building a meaningful relationship with you when i read your MySpace, CuteKitten78. You don't even know that i'm watching your life. Are you really going to be there when i need you?

montessori.edu: FAQ

Q: I recently observed a Montessori classroom for a day. I was very very impressed, but I [noticed] there doesn't seem to be any opportunities for pretend play...

A: When Dr. Montessori opened the first Children's House it was full of pretend play things. The children never played with them as long as they were allowed to do real things -- i.e. cooking instead of pretending to cook. It is still true.

Adam Cadre: My first political donation

Everyone's interested in the leaders of the country. Intelligent people are interested in the actual functioning of the government, what policies various candidates plan to put into practice, how those policies will affect the lives of the citizenry... but there just aren't that many intelligent people. A lot of people are stupid. To them the government is just a sort of reality show. To them politicians are just celebrities who show up in different timeslots from the actors and sports stars. The beauty of constitutional monarchy is that it gives the stupid people their reality show, but farms it out to a powerless royal family so the real government can get on with its work.

Neal Stephenson: Snow Crash

At the time, both of them were working on avatars. He was working on bodies, she was working on faces. She was the face department, because nobody thought that faces were all that important -- they were just flesh-toned busts on top of the avatars. She was just in the process of proving them all desperately wrong. But at this phase, the all-male society of bitheads that made up the power structure of Black Sun Systems said that the face problem was trivial and superficial. It was, of course, nothing more than sexism, the especially virulent type espoused by male techies who sincerely believe that they are too smart to be sexists.

Adam Cadre: some of my evaluative patterns

Non-fiction also tends to fall into the trap of failing to communicate with the reader. All too many writers, especially in academia, act as if they are programmers in 1980 trying to fit an entire videogame into 4K. Write to communicate; don't just densely encode information for storage.

Michael Rivero

Most people prefer to believe their leaders are just and fair even in the face of evidence to the contrary, because once a citizen acknowledges that the government under which they live is lying and corrupt, the citizen has to choose what he or she will do about it. To take action in the face of a corrupt government entails risks of harm to life and loved ones. To choose to do nothing is to surrender one's self-image of standing for principles. Most people do not have the courage to face that choice. Hence, most propaganda is not designed to fool the critical thinker but only to give moral cowards an excuse not to think at all.

Dan Cook: Mixing Games and Applications

Why do games have such a radically different learning curve than advanced applications? It turns out that games are carefully tuned machines that hack into human beings' most fundamental learning processes. Games are exercises in applied psychology at a level far more nuanced than your typical application. ...

Implicit in this description of interactivity is the fact that users change. More importantly, the feedback loops we, as designers, build into our games directly change the user's mind... The person that starts using a game is not the same person that finishes the game. Games and the scaffold of skill atoms describe in minute detail how and what change occurs.

This is a pretty big philosophical shift from how application design is usually approached. We tend to imagine that users are static creatures who live an independent and unchanging existence outside of our applications. We merely need to give them a static set of pragmatic tools and all will be good.

Games state that our job is to teach, educate and change our users. We lead them on an explicitly designed journey that leaves them with functioning skills that they could not have imagined before they started using our application. Our games start off simple and slowly add complexity. Our apps must adapt along the user's journey to reflect their changing mental models and advanced skills. Failure to do so results in a mismatch that results in frustration, boredom and burnout.

Clay Shirky: Gin, Television, and Social Surplus

This is something that people in the media world don't understand. Media in the 20th century was run as a single race -- consumption... But media is actually a triathlon, it's three different events. People like to consume, but they also like to produce, and they like to share. ...

And this is the other thing about the size of the cognitive surplus we're talking about. It's so large that even a small change could have huge ramifications. Let's say that everything stays 99 percent the same, that people watch 99 percent as much television as they used to, but 1 percent of that is carved out for producing and for sharing. The Internet-connected population watches roughly a trillion hours of TV a year. That's about five times the size of the annual U.S. consumption. One per cent of that is 100 Wikipedia projects per year worth of participation.

Bill Budge: interview on Computer Chronicles (1984)

[Now] I'm just putting bumpers [all over the pinball board]. This is a favorite of really young kids. They like to just grab a whole bunch of bumpers, fill the board up with them and put a ball on there. A pinball aficionado would gasp, but little kids don't really build pinball machines, they just sort of build "things" with this. ...

[Q: What do you want to do next?] I want to extend the idea of a construction set. This one was hard to do when I started, because there are lots of combinations of things you can't really predict, when you're making a kit, when you're making a "metagame". I'd like to extend the idea even further, and the problem there is then designing the "parts box". In pinball, it's a small set of parts. You don't really have to worry about thinking up abstractions. In a general construction set, it's not clear what the parts should be. It's almost like you're inventing a new language for representing specifications for programs.

Guy Steele: 50 in 50

We must remember that, strictly speaking, "formal" does not mean merely "rigorous", but "according to form". Meaning need be ascribed only to the result of a formal process. It is not needed to guide the process itself. We ascribe meaning to intermediate formal states primarily, nay solely, to reassure ourselves.

John Holt: How Children Fail

Our way of scoring was to give the groups [of fourth-graders] a point for each correct prediction. Before long they were thinking more of ways to get a good score than to make the beam balance. We wanted them to figure out how to balance the beam, and introduced the scoring as a matter of motivation. But they out-smarted us, and figured out ways to get a good score that had nothing to do with whether the beam balanced or not.

... Betty figured out that the way to get a good score is to put the weights in what you know is a wrong place, and then have everyone on your team say it is wrong. Thus, they will each get a point for predicting correctly.

... A couple years later, when I put a balance beam and some weights on a table at the back of my class, and just left it there without saying anything about it or trying to "teach" it, most of the children in the class, including some very poor students, figured out just by messing around with it how it worked.

John Holt: How Children Fail

[I told the fourth-graders] I was thinking of a number between 1 and 10,000. ... They still cling stubbornly to the idea that the only good answer is a yes answer. This, of course, is the result of miseducation in which "right answers" are the only ones that pay off. They have not learned how to learn from a mistake, or even that learning from mistakes is possible. If they say, "Is the number between 5,000 and 10,000?" and I say yes, they cheer; if I say no, they groan, even though they get exactly the same amount of information in either case. The more anxious ones will, over and over again, ask questions that have already been answered, just for the satisfaction of hearing a yes.

John Holt: How Children Learn

[Describing a science-fiction-like photograph of a research lab.] Why did the magazine want such a picture?... Because it makes science look like a powerful and forbidding mystery, not for the likes of you and me. Because it tells us that only people with expensive and incomprehensible machines can discover the truth, about human beings or anything else, and that we must believe whatever they tell us. Because it turns science from an activity to be done into a commodity to be bought. Because it prevents ordinary human beings from being the scientists, the askers of questions and seekers and makers of answers that we naturally and rightfully are, and makes us instead into science consumers and science worshippers.

Fabien: lua-l

A language is an interface between programmers and hardware, so it has social/psychological/pedagogical features which are just as important as its formal properties. If a language can't be efficiently ported on regular hardware, it's the language that sucks, not the hardware. Similarly, if it doesn't interface properly with its communities of coders (fails to build up standard coding practices, good libraries, trust...), the language sucks, not the people. Ergo Lisp sucks. Many Lisp zealots dismiss the language's failures as "merely social", but that's missing the purpose of a language entirely: failing socially is just as bad as failing technically.

Tycho: Penny Arcade

I was going through a thread earlier this morning full of those who couldn't understand why people keep buying the 360, and part of it is almost certainly because of people like my sister. She doesn't define herself spiritually by her console choice and doesn't track hardware failure rates. She is only a "gamer" during the time the console is on, the same way she ceases being a "toaster" once her toast is complete. She is utterly devoid of the received wisdom we amass as enthusiasts, and the joy she wrings from the medium is not diminished as a result.

John Taylor Gatto: The Six-Lesson Schoolteacher

This is another way I teach the lesson of dependency. Good people wait for a teacher to tell them what to do. This is the most important lesson of all, that we must wait for other people, better trained than ourselves, to make the meanings of our lives.

John Taylor Gatto: The Underground History of American Education

If you obsess about conspiracy, what you'll fail to see is that we are held fast by a form of highly abstract thinking fully concretized in human institutions which has grown beyond the power of the managers of these institutions to control. If there is a way out of the trap we're in, it won't be by removing some bad guys and replacing them with good guys.

Amish Information Systems: Last one. Romance.

My romantic entanglements tended to be quantum in nature -- i.e. they happened at a distance and were undetectable to outside observers.

Steven Strogatz: Nonlinear Dynamics and Chaos, p 175

This example [phase-space sketch of a nonlinear system] shows how far we can go with pictures -- without invoking any difficult formulas, we were able to extract all the important features of the pendulum's dynamics. It would be much more difficult to obtain these results analytically, and much more confusing to interpret the formulas, even if we could find them.

Banksy

The thing I hate the most about advertising is that it attracts all the bright, creative and ambitious young people, leaving us mainly with the slow and self-obsessed to become our artists. Modern art is a disaster area. Never in the field of human history has so much been used by so many to say so little.

Dan Bricklin: The Cornucopia of the Commons

What we see here is that increasing the value of the database by adding more information is a natural by-product of using the tool for your own benefit. No altruistic sharing motives need be present. ...

I believe that you can help predict the success of a particular UI used to build a shared database based on how much normal, selfish use adds to the database.

Dan Bricklin: Systems without guilt where every contribution is appreciated

In a good system, just doing what you normally would do to help yourself helps everybody. Even helping a bit once in a while (like typing in the track names of a CD nobody else had ever entered) benefited you and the system. Instead of making you feel bad for "only" doing 99%, a well designed system makes you feel good for doing 1%. People complain about systems that have lots of "freeloaders". Systems that do well with lots of "freeloading" and make the best of periodic participation are good.

So, here we have another design criterion for a type of successful system: Guiltlessness. Not only should people just need to do what's best for them when they help others, they need to not need to always do it.

Lawrence Lessig: Remix

[Regarding motivations in a sharing vs commercial economy] Even the thee-regarding [selfless] motivations need not be descriptions of self-sacrifice. I suspect that no one contributes to Wikipedia despite hating what he does, solely because he believes he ought to help create free knowledge. We can all understand people in the commercial economy who hate what they do but do it anyway ("he's just doing it for the money"). That dynamic is very difficult to imagine in the sharing economy. In the sharing economy, people are in it for the thing they're doing, either because they like the doing, or because they like doing such things. Either way, these are happy places. People are there because they want to be.

Clay Shirky: Newspapers and Thinking the Unthinkable

For a long time, longer than anyone in the newspaper business has been alive in fact, print journalism has been intertwined with these economics. The expense of printing created an environment where Wal-Mart was willing to subsidize the Baghdad bureau. This wasn't because of any deep link between advertising and reporting, nor was it about any real desire on the part of Wal-Mart to have their marketing budget go to international correspondents. It was just an accident. Advertisers had little choice other than to have their money used that way, since they didn't really have any other vehicle for display ads.

The old difficulties and costs of printing forced everyone doing it into a similar set of organizational models; it was this similarity that made us regard Daily Racing Form and L'Osservatore Romano as being in the same business. That the relationship between advertisers, publishers, and journalists has been ratified by a century of cultural practice doesn't make it any less accidental.

The competition-deflecting effects of printing cost got destroyed by the internet, where everyone pays for the infrastructure, and then everyone gets to use it. And when Wal-Mart, and the local Maytag dealer, and the law firm hiring a secretary, and that kid down the block selling his bike, were all able to use that infrastructure to get out of their old relationship with the publisher, they did. They'd never really signed up to fund the Baghdad bureau anyway.

Adam Cadre: WALL-E

So [workers replaced by automation] have to go find other jobs. Not because the society needs any additional production -- it was already doing fine on that count -- but because of ideology. An odd facet of this ideology is that no one really cares much whether you're contributing to the greater good so long as you're performing some kind of labor!

Charles Bloom: Waffling

I've always been very dubious about the idea of learning from people who have been successful. There's this whole cult of worshipping rich people, reading interviews with them, getting their opinions on things, trying to learn what made them successful. I think it's mostly nonsense. The thing is, if you just look at who the biggest earners are, it's almost entirely luck. ...

The point is if you just look at successful business people, they will probably be confident, decisive, risk takers, aggressive at seizing opportunities, aggressive about growing the business quickly, etc. That doesn't mean that those are the right things to do. It just means that those are variance-increasing traits that give them a *chance* to be a big success.

Lewis Hyde: The Gift, p 11

This, then, is how I use "consume" to speak of a gift -- a gift is consumed when it moves from one hand to another with no assurance of anything in return. There is little difference, therefore, between its consumption and its movement. A market exchange has an equilibrium or stasis: you pay to balance the scale. But when you give a gift there is momentum, and the weight shifts from body to body.

Chip Morningstar: Habitat Chronicles: Smart people can rationalize anything

You can't sell someone the solution before they've bought the problem.

Alan Kay: The Early History of Smalltalk

All of the elements eventually used in the Smalltalk user interface were already to be found in the sixties, as different ways to access and invoke the functionality provided by an interactive system. The two major centers of ideas were Lincoln Labs and RAND corp, both ARPA funded. The big shift that consolidated these ideas into a powerful theory and long-lived examples came because the LRG [Learning Research Group] focus was on children. Hence, we were thinking about learning as being one of the main effects we wanted to have happen. Early on, this led to a 90 degree rotation of the purpose of the user interface from "access to functionality" to "environment in which users learn by doing." This new stance could now respond to the echoes of Montessori and Dewey, particularly the former, and got me, on rereading Jerome Bruner, to think beyond the children's curriculum to a "curriculum of the user interface."

The particular aim of LRG was to find the equivalent of writing -- that is, learning and thinking by doing in a medium -- our new "pocket universe." For various reasons I had settled on "iconic programming" as the way to achieve this, drawing on the iconic representations used by many ARPA projects in the sixties. My friend Nicholas Negroponte, an architect, was extremely interested in how environments affected peoples' work and creativity. He was interested in embedding the new computer magic in familiar surroundings. I had quite a bit of theatrical experience in a past life, and remembered Coleridge's adage that "people attend 'bad theatre' hoping to forget, people attend 'good theatre' aching to remember." In other words, it is the ability to evoke the audience's own intelligence and experiences that makes theatre work.

Putting all this together, we want an apparently free environment in which exploration causes desired sequences to happen (Montessori); one that allows kinesthetic, iconic, and symbolic learning -- "doing with images makes symbols" (Piaget & Bruner); the user is never trapped in a mode (GRAIL); the magic is embedded in the familiar (Negroponte); and which acts as a magnifying mirror for the user's own intelligence (Coleridge). It would be a great finish to this story to say that having articulated this, we were able to move straightforwardly to the design as we know it today. In fact, the UI design work happened in fits and starts in between feeding Smalltalk itself, designing children's experiments, trying to understand iconic construction, and just playing around. In spite of this meandering, the context almost forced a good design to turn out anyway.

James Herndon: How to Survive in Your Native Land, p 36

This drove us out of our minds, and it drove us out of our minds every day... Unaccountably, the course was not, as we'd thought, a course where students would get to do all the things we'd thought up for them to do, but instead a course where they could steadfastly refuse to do everything and then complain that there was nothing to do. ...

We never quite accepted the notion that the real curriculum of the course was precisely the question What Shall We Do In Here? and that it was really an important question and maybe the only important question.

Keith Johnstone: Impro: Improvisation and the Theatre, p 149

We don't know much about Masks in this culture ... because this culture is usually hostile to trance states. We distrust spontaneity, and try to replace it by reason: the Mask was driven out of theatre in the same way that improvisation was driven out of music. ... Education itself might be seen as primarily an anti-trance activity.

I see the Mask as something that is continually flaring up in this culture, only to be almost immediately snuffed out. No sooner have I established a tradition of Mask work somewhere than the students start getting taught the 'correct' movements, just as they learn a phoney 'Commedia dell' Arte' technique.

Charles Bloom: Intolerance

Just the mathematics of being single are depressing. You have to flirt with ten girls to get a date with one. You have to go on ten dates to find someone you want to have something long term with. You have to have ten long term relationships to find the one that works. It's unbearable.

Adam Cadre: Fatal abstraction

I hate abstraction. Here are some examples.

Kent Beck and Ward Cunningham: A Laboratory For Teaching Object-Oriented Thinking

Note that the [CRC] cards are placed such that View and Controller are overlapping (implying close collaboration) and placed above Model (implying supervision.) We find these and other informal groupings aid in comprehending a design. Parts, for example, are often arranged below the whole. ...

The ability to quickly organize and spatially address index cards proves most valuable when a design is incomplete or poorly understood. We have watched designers repeatedly refer to a card they intended to write by pointing to where they will put it when completed.

Randall B. Smith and David Ungar: Programming as an Experience: The Inspiration for Self

We now believe that when features, rules, or elaborations are motivated by particular examples, it is a good bet that their addition will be a mistake. The second author once coined the term "architect's trap" for something similar in the field of computer architecture; this phenomenon might be called "the language designer's trap."

If examples cannot be trusted, what do we think should motivate the language designer? Consistency and malleability. When there is only one way of doing things, it is easier to modify and reuse code. When code is reused, programs are easier to change and, most importantly, shrink. When a program shrinks, its construction and maintenance require fewer people, which allows for more opportunities for reuse to be found. Consistency leads to reuse, reuse leads to conciseness, conciseness leads to understanding.

David Hestenes: Reforming the Mathematical Language of Physics

Mathematics is taken for granted in the physics curriculum -- a body of immutable truths to be assimilated and applied. The profound influence of mathematics on our conceptions of the physical world is never analyzed. The possibility that mathematical tools used today were invented to solve problems in the past and might not be well suited for current problems is never considered. ...

The point I wish to make by citing these two examples [Newton and Einstein] is that without essential mathematical concepts the two theories would have been literally inconceivable. The mathematical modeling tools we employ at once extend and limit our ability to conceive the world. Limitations of mathematics are evident in the fact that the analytic geometry that provides the foundation for classical mechanics is insufficient for General Relativity.

David Hestenes: Reforming the Mathematical Language of Physics

Early in my career, I naively thought that if you give a good idea to competent mathematicians or physicists, they will work out its implications for themselves. I have learned since that most of them need the implications spelled out in utter detail.

W. Daniel Hillis: Richard Feynman and The Connection Machine

The last project that I worked on with Richard [Feynman] was in simulated evolution. ... When I got back to Boston I went to the library and discovered a book by Kimura on the subject, and much to my disappointment, all of our "discoveries" were covered in the first few pages. When I called back and told Richard what I had found, he was elated. "Hey, we got it right!" he said. "Not bad for amateurs."

In retrospect I realize that in almost everything that we worked on together, we were both amateurs. In digital physics, neural networks, even parallel computing, we never really knew what we were doing. But the things that we studied were so new that no one else knew exactly what they were doing either. It was amateurs who made the progress.

Freeman Dyson: interview in OMNI magazine

Q: You must be aware that some of your colleagues take a jaundiced view of your ideas ... Does it bother you to know that they're out there, muttering about "Dyson's crazy ideas"?

A: Not at all. Keep in mind, I'm also a perfectly respectable physicist, and the speculation is a hobby. It's become well known, but I've grown used to the idea that people very often become famous for accidental reasons. It's amusing to think that someday all my "serious" work will probably be a footnote in a textbook, when everybody remembers what I did on the side! Anyway, what do I have to lose? I have tenure here, and no one expects much from a theoretical physicist once he's past fifty anyway!

Joe Armstrong: interview: Joe Armstrong and Simon Peyton Jones discuss Erlang and Haskell

We can't stop our systems and globally check they are consistent and then relaunch them. We incrementally change bits and we recognize that they are inconsistent under short time periods and we live with that. Finding ways of living with failure, making systems that work, despite the fact they are inconsistent, despite the fact that failures occur. So our error models are very sophisticated.

When I see things like Scala or I see on the net there's this kind of "Erlang-like semantics", that usually means mailboxes and message boxes. It doesn't mean all the error handling, it doesn't mean the live code upgrade. The live upgrade of code while you are running a system needs a lot of deep plumbing under the counter -- it's not easy.

Dan Roam: The Back of the Napkin: Solving Problems and Selling Ideas with Pictures, p 133

The opposite of "simple" is not "complex," but rather "elaborate." ... One of the most important virtues of visual thinking is its ability to clarify things so that the complex can be better understood, but that does not mean that all good visual thinking is about simplification. The real goal of visual thinking is to make the complex understandable by making it visible -- not by making it simple. Whether that goal demands a simple picture, an elaborate one, or an intentionally complex one is almost always determined by the audience and its familiarity with the subject being addressed.

C.A.R. Hoare: Retrospective: An Axiomatic Basis for Computer Programming

I expected that research into the axiomatic method would occupy me for my entire working life; and I expected that its results would not find widespread practical application in industry until after I reached retirement age. These expectations led me in 1968 to move from an industrial to an academic career. And when I retired in 1999, both the positive and the negative expectations had been entirely fulfilled.

John Allison: Mild terror at 5pm

I grew up with Roald Dahl and Tove Jansson and Richmal Crompton and Ronald Searle. They were all masters of world-building and immersive stories. They never spoke down to readers and I can read a lot of their work as an adult with the same pleasure. If I want to keep doing this for the rest of my working life, I have to make something lasting like that.

Wikipedia: Hermann Grassmann

[Grassmann's theory of linear algebra] was a revolutionary text, too far ahead of its time to be appreciated. Grassmann submitted it as a Ph.D. thesis, but Möbius said he was unable to evaluate it and forwarded it to Ernst Kummer, who rejected it without giving it a careful reading. Over the next 10-odd years, Grassmann wrote a variety of work applying his theory, in the hope that these applications would lead others to take his theory seriously. ...

In 1862, Grassmann published a thoroughly rewritten second edition of A1, hoping to earn belated recognition for his theory of extension, and containing the definitive exposition of his linear algebra. It fared no better than A1, even though A2's manner of exposition anticipates the textbooks of the 20th century.

Disappointed at his inability to be recognized as a mathematician, Grassmann turned to historical linguistics. ... These philological accomplishments were honored during his lifetime.

Graham Nelson: Natural Language, Semantic Analysis and Interactive Fiction

The general reaction of experienced IF writers to early drafts of Inform 7 was a two-stage scepticism. First: was this just syntactic sugar, that is, a verbose paraphrase of the same old code? ... Second: perhaps this was indeed a fast prototyping tool for setting up the map and the objects, but would it not then grind into useless inflexibility when it came to coding up innovative behaviour -- in fact, would it be fun for beginners but useless to the real task at hand? It sometimes seemed to those of us working on Inform that an experienced IF author, shown Inform 7 for the first time, would go through the so-called Five Stages of Grief: Denial, Anger, Bargaining, Depression, and Acceptance. The following comment is typical of the Bargaining stage:

I would like to see it be as easy as possible to mix Inform 6 and Inform 7 code. [...] I also wonder if it might be possible to allow the user access to the Inform 6 code that the Inform 7 pre-processor creates. I can imagine some people wanting to use Inform 7 to lay out the outline of their game -- rooms, basic objects therein, and so on -- quickly, and then do the heavy lifting, so to speak, in Inform 6.

Matt Knox: Interview with an Adware Author

Most things don't have to be perfect. In particular, things involving human interactions don't have to be perfect, because groups of humans have all these self-regulations built in. If you and I have an agreement and you screwed me over badly, you've always got in the back of your mind the nagging worry that I'm going to show up on your doorstep with a club and kill you. Because of that, people don't tend to screw each other too much, right? At least, they try not to. One danger, perhaps, of moving towards an algorithmically driven society is that the algorithms aren't scared of us showing up and beating them up. The algorithms will do whatever it is that they are designed to do.

Michael Chabon: The Amazing Adventures of Kavalier & Clay, p 265

A surprising fact about the magician Bernard Kornblum was that he believed in magic. Not in the so-called magic of candles, pentagrams, and bat wings. Not in the kitchen enchantments of Slavic grandmothers with their herbiaries and parings from the little toe of a blind virgin tied up in a goatskin bag. Not in astrology, theosophy, chiromancy, dowsing rods, séances, weeping statues, werewolves, wonders, or miracles. All these Kornblum had regarded as fakery far different -- far more destructive -- than the brand of illusion he practiced, whose success, after all, increased in direct proportion to his audiences' constant, keen awareness that, in spite of all the vigilance they could bring to bear, they were being deceived.

Richard Feynman: Surely You're Joking, Mr. Feynman, p 92

In these discussions one man would make a point. Then Compton, for example, would explain a different point of view. He would say it should be this way, and he was perfectly right. Another guy would say, well, maybe, but there's this other possibility we have to consider against it.

So everybody is disagreeing, all around the table. I am surprised and disturbed that Compton doesn't repeat and emphasize his point. Finally, at the end, Tolman, who's the chairman, would say, "Well, having heard all these arguments, I guess it's true that Compton's argument is the best of all, and now we have to go ahead."

It was such a shock to me to see that a committee of men could present a whole lot of ideas, each one thinking of a new facet, while remembering what the other fella said, so that, at the end, the decision is made as to which idea was the best -- summing it all up -- without having to say it three times. These were very great men indeed.

Richard Feynman: Surely You're Joking, Mr. Feynman, p 92

[As a new professor] at Cornell, I'd work on preparing my courses, and I'd go over to the library a lot and read through the Arabian Nights and ogle the girls that would go by. But when it came time to do some research, I couldn't get to work. I was a little tired; I was not interested; I couldn't do research! This went on for what I felt was a few years ... I simply couldn't get started on any problem: I remember writing one or two sentences about some problem in gamma rays and then I couldn't go any further. I was convinced that from the war and everything else (the death of my wife) I had simply burned myself out.

... Then I had another thought: Physics disgusts me a little bit now, but I used to enjoy doing physics. Why did I enjoy it? I used to play with it. I used to do whatever I felt like doing -- it didn't have to do with whether it was important for the development of nuclear physics, but whether it was interesting and amusing for me to play with. When I was in high school, I'd see water running out of a faucet growing narrower, and wonder if I could figure out what determines that curve. I found it was rather easy to do. I didn't have to do it; it wasn't important for the future of science; somebody else had already done it. That didn't make any difference: I'd invent things and play with things for my own entertainment.

So I got this new attitude. Now that I am burned out and I'll never accomplish anything, I've got this nice position at the university teaching classes which I rather enjoy, and just like I read the Arabian Nights for pleasure, I'm going to play with physics, whenever I want to, without worrying about any importance whatsoever.

Within a week I was in the cafeteria and some guy, fooling around, throws a plate in the air. As the plate went up in the air I saw it wobble, and I noticed the red medallion of Cornell on the plate going around. It was pretty obvious to me that the medallion went around faster than the wobbling.

I had nothing to do, so I start to figure out the motion of the rotating plate. I discover that when the angle is very slight, the medallion rotates twice as fast as the wobble rate -- two to one. ...

I went on to work out equations of wobbles. Then I thought about how electron orbits start to move in relativity. Then there's the Dirac Equation in electrodynamics. And then quantum electrodynamics. And before I knew it (it was a very short time) I was "playing" -- working, really -- with the same old problem that I loved so much, that I had stopped working on when I went to Los Alamos: my thesis-type problems; all those old-fashioned, wonderful things.

It was effortless. It was easy to play with these things. It was like uncorking a bottle: Everything flowed out effortlessly. I almost tried to resist it! There was no importance to what I was doing, but ultimately there was. The diagrams and the whole business that I got the Nobel Prize for came from that piddling around with the wobbling plate.

Clay Shirky: A Rant About Women

Not caring works surprisingly well. Another of my great former students, now a peer and a friend, saw a request from a magazine reporter doing a tech story and looking for examples. My friend, who'd previously been too quiet about her work, decided to write the reporter and say "My work is awesome. You should write about it."

The reporter looked at her work and wrote back saying "Your work is indeed awesome, and I will write about it. I also have to tell you you are the only woman who suggested her own work. Men do that all the time, but women wait for someone else to recommend them." My friend stopped waiting, and now her work is getting the attention it deserves.

Steven Levy: Hackers: Heroes of the Computer Revolution

Some planners would visit Homebrew and be turned off by the technical ferocity of the discussions, the intense flame that burned brightest when people directed themselves to the hacker pursuit of building. Ted Nelson, author of Computer Lib, came to a meeting and was confused by all of it, later calling the scruffily dressed and largely uncombed Homebrew people "chip-monks, people obsessed with chips. It was like going to a meeting of people who love hammers." Bob Albrecht rarely attended, later explaining that "I could understand only about every fourth word those guys were saying . . . they were hackers." Jude Milhon, the woman with whom Lee remained friends after their meeting through the Barb and their involvement in Community Memory, dropped in once and was repelled by the concentration on sheer technology, exploration, and control for the sake of control. She noted the lack of female hardware hackers, and was enraged at the male hacker obsession with technological play and power. She summed up her feelings with the epithet "the boys and their toys," and like Fred Moore worried that the love affair with technology might blindly lead to abuse of that technology.

Ethereal Bligh

I once heard Murray Gell-Mann lecture at the Santa Fe Institute on "creativity" and scientific discoveries. He utilized one very memorable metaphor. It was of ideas as particle energy states -- that they will find the locally lowest stable "well" to settle in. But that local low may not be the regional or global low, of course, and it takes an increase of energy to move the idea up out of the local low in order for it to find its way to something deeper. Here the idea was the deeper the well, the "truer" the idea. And Gell-Mann's point was that it's a contrary thing to go up that well...that an essential characteristic of creativity (and intellectual discovery) is to do the unlikely, the counter-intuitive. Not always be contrary and go uphill, of course, that's worse than useless (something the cranks don't understand). But just enough at the right times to open up new vistas that were previously unimagined.
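
A minimal sketch of this idea in code, reading Gell-Mann's metaphor as the acceptance rule of simulated annealing: go downhill freely, but occasionally accept an uphill move so the search can escape a shallow local well and settle in a deeper one. The energy landscape and cooling schedule below are invented for illustration; nothing here comes from the lecture itself.

    import math, random

    def anneal(energy, neighbor, x, steps=10000, t0=1.0):
        # Accept every downhill move; accept an uphill move with
        # probability exp(-dE/T), so the search can climb out of a
        # shallow local well in search of a deeper one.
        best = x
        for i in range(steps):
            t = t0 * (1 - i / steps) + 1e-9   # simple linear cooling
            y = neighbor(x)
            dE = energy(y) - energy(x)
            if dE < 0 or random.random() < math.exp(-dE / t):
                x = y
            if energy(x) < energy(best):
                best = x
        return best

    # Toy landscape: shallow well near x = 2, deeper well near x = -2.3.
    f = lambda x: 0.1 * x**4 - x**2 + 0.5 * x
    print(anneal(f, lambda x: x + random.uniform(-0.5, 0.5), x=2.0))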

Douglas Adams: Speech at Digital Biota 2

Now imagine an early man surveying his surroundings at the end of a happy day's tool making.... Man the maker looks at his world and says 'So who made this then?' Who made this? ... Early man thinks, 'Well, because there's only one sort of being I know about who makes things, whoever made all this must therefore be a much bigger, much more powerful and necessarily invisible, one of me and because I tend to be the strong one who does all the stuff, he's probably male'. And so we have the idea of a god. Then, because when we make things we do it with the intention of doing something with them, early man asks himself, 'If he made it, what did he make it for?' Now the real trap springs, because early man is thinking, 'This world fits me very well. Here are all these things that support me and feed me and look after me; yes, this world fits me nicely' and he reaches the inescapable conclusion that whoever made it, made it for him.

This is rather as if you imagine a puddle waking up one morning and thinking, 'This is an interesting world I find myself in - an interesting hole I find myself in - fits me rather neatly, doesn't it? In fact it fits me staggeringly well, must have been made to have me in it!' This is such a powerful idea that as the sun rises in the sky and the air heats up and as, gradually, the puddle gets smaller and smaller, it's still frantically hanging on to the notion that everything's going to be alright, because this world was meant to have him in it, was built to have him in it; so the moment he disappears catches him rather by surprise.

Andreas Rossberg: comment on Lambda the Ultimate

[Regarding the Coverity static analysis tool for C to "find bugs in the real world"]: It's still depressing what incredible amounts of intellectual and monetary resources are wasted on problems many (most?) of which wouldn't even exist if people just used civilized languages.

[Reply from vrijz: These tools are intended to analyze existing code with little or no additional effort. To the extent that untyped, memory-unsafe languages are more convenient than languages with strong typing and safety guarantees, static analysis tools are useful.]

Yes, agreed. But my worry is that seemingly helpful tools like the one described may have the perverse effect of prolonging the life of broken code bases and languages - especially if these tools are pragmatically tuned to fit the mindset of the more ignorant among their users, as described in the article. Maybe it would be better if these artifacts collapsed sooner rather than later?

Also, obviously, people continue starting new projects in utterly inadequate languages, and the availability of tools like this will likely feed the belief that that is a good idea.

Douglas Adams: Dirk Gently's Holistic Detective Agency, p 54

He felt a tug of sadness that someone who had seemed so shiningly alive within the small confines of a university community should have seemed to fade so much in the light of common day.

why the lucky stiff: Twitter

when you don't create things, you become defined by your tastes rather than ability. your tastes only narrow & exclude people. so create

Chip and Dan Heath: Switch: How to Change Things When Change Is Hard, p 82

You need a gut-smacking goal, one that appeals to both [the rational and the emotional mind]. ...

Goals in most organizations, however, lack emotional resonance. Instead, SMART goals -- goals that are Specific, Measurable, Actionable, Relevant, and Timely -- have become the norm. A typical SMART goal might be "My marketing campaign will generate 4500 qualified sales leads for the sales group by the end of Q3'09." SMART goals presume the emotion; they don't generate it.

The specificity of SMART goals is a great cure for the worst sins of goal setting -- ambiguity and irrelevance. ("We are going to delight our customers every day in every way!") But SMART goals are better for steady-state situations than for change situations, because the assumptions underlying them are that the goals are worthwhile. If you accept that generating 4500 leads for the sales force is a great use of your time, the SMART goal will be effective. But if a new boss, pushing a new direction, assigns you the 4500-leads goal even though you've never handled lead generation before, then there might be trouble. SMART goals presume the emotion; they don't generate it.

Henry David Thoreau: Civil Disobedience

Cast your whole vote, not a strip of paper merely, but your whole influence.

Bruce Sterling: The Hacker Crackdown: Law and Disorder on the Electronic Frontier

Technologies in their "Goofy Prototype" stage rarely work very well. They're experimental, and therefore half-baked and rather frazzled. The prototype may be attractive and novel, and it does look as if it ought to be good for something-or-other. But nobody, including the inventor, is quite sure what. Inventors, and speculators, and pundits may have very firm ideas about its potential use, but those ideas are often very wrong.

The natural habitat of the Goofy Prototype is in trade shows and in the popular press. Infant technologies need publicity and investment money like a tottering calf needs milk.

Skaven: FAQ

If we want to get philosophical about it, one could say that the more you produce, the more the uniqueness of your work suffers. As people change over time, so does their work. Their work represents samples from a continuum of their personal development. More work along the way just gives a higher sample rate of this continuum, and won't necessarily introduce anything new. An increased volume of production does not always explore new possibilities; it may only dwell on the ones already explored.

Chris Crawford: The History of Thinking

Thus, writing changed the way we thought. Don't think of it as a means of recording ideas, think of it as an instrument for exploring and examining ideas. Western civilization grew from the heady exploitation of this instrument of thinking. Indeed, the written page can be thought of as "artificial cortex", a technological means of augmenting the expansion of the sequential-processing portions of the brain. We humans were so impatient to grow more cortex, we went ahead and concocted an artificial version: paper and ink.

Neil Postman: Technology and Society (talk)

(Paraphrased) questions to ask of a new technology:

  • What is the problem to which this technology is the solution?
  • Whose problem is it?
  • What new problems might result from solving this problem?
  • Which people and institutions might be harmed by this solution?
  • How does the new technology change our language, and what are the implications of that?
  • Which people and institutions gain economic or political power because of the technological change?

David Foster Wallace: Life and Work

In the day-to-day trenches of adult life, there is actually no such thing as atheism. There is no such thing as not worshipping. Everybody worships. The only choice we get is what to worship. And an outstanding reason for choosing some sort of God or spiritual-type thing to worship -- be it J.C. or Allah, be it Yahweh or the Wiccan mother-goddess or the Four Noble Truths or some infrangible set of ethical principles -- is that pretty much anything else you worship will eat you alive. If you worship money and things -- if they are where you tap real meaning in life -- then you will never have enough. Never feel you have enough. It's the truth. Worship your own body and beauty and sexual allure and you will always feel ugly, and when time and age start showing, you will die a million deaths before they finally plant you... Worship power -- you will feel weak and afraid, and you will need ever more power over others to keep the fear at bay. Worship your intellect, being seen as smart -- you will end up feeling stupid, a fraud, always on the verge of being found out. And so on.

Look, the insidious thing about these forms of worship is not that they're evil or sinful; it is that they are unconscious. They are default-settings. They're the kind of worship you just gradually slip into, day after day, getting more and more selective about what you see and how you measure value without ever being fully aware that that's what you're doing.

Iris Chang: suicide letter

When you believe you have a future, you think in terms of generations and years. When you do not, you live not just by the day -- but by the minute.

Daniel Fontijne: Gaigen 2: a Geometric Algebra Implementation Generator

We consider geometric algebra to be the high-level "object-oriented" language for encoding geometry, whereas -- in this context -- linear algebra is more akin to assembly language.

John Lienhard: Engines of Our Ingenuity, #622: Ignaz Philipp Semmelweis

On a hunch, [Semmelweis] sets up a policy. Doctors must wash their hands in a chlorine solution when they leave the cadavers. Mortality from puerperal fever promptly drops to two percent. Now things grow strange. Instead of reporting his success at a meeting, Semmelweis says nothing....

As outside interest grows, we begin to understand Semmelweis's silence. The hospital director feels his leadership has been criticized. He's furious. He blocks Semmelweis's promotion. The situation gets worse. Viennese doctors turn on this Hungarian immigrant.... Finally, he goes back to Budapest...

Finally, in 1861, he writes a book on his methods. The establishment gives it poor reviews. Semmelweis grows angry and polemical. He hurts his own cause with rage and frustration.

Brad Templeton: Voluntary Taxes

In this county, a proposition... asks for a $29 levy on all properties to pay for medical programs for children. How could anybody vote against that? (I have not examined this proposition in detail, but generally when you see "motherhood" propositions on the ballot, particularly bonds, they have been put there by politicians who have other projects they know would not be popular. So they arrange a ballot proposition to raise money for something nobody could be against, which normally they would have had to spend general revenue on, and this frees up general revenue so they can spend it with less accountability.)

Richard Hamming: The Art of Doing Science and Engineering, p vi

Teachers should prepare the student for the student's future, not for the teacher's past.

Noam Chomsky: talk

[In the U.S.], "libertarian" means "extreme advocate of total tyranny"... It means power ought to be given into the hands of private unaccountable tyrannies. Even worse than state tyrannies, because there the public has some kind of role. But the corporate system, especially as it has evolved in the twentieth century, is pure tyranny. Completely unaccountable. You're inside one of these institutions, you take orders from above, you hand it down below... there's nothing you can say, tyrannies do what they feel like, they're global in scale. This is the extreme opposite of what has been called "libertarian" everywhere in the world since the Enlightenment.

Joe Armstrong: interview

I think I had come to [Ericsson's research lab] something like two years after it had started... Our view of the world was, yes, we'll solve problems and then we'll push them into projects and we will improve Ericsson's productivity. This view of the world wasn't yet tinged by any contact with reality. So we thought it would be easy to discover new and useful stuff and we thought that once we had discovered new and useful stuff then the world would welcome us with open arms. What we learned later was, it wasn't all that easy to discover new stuff. And it's incredibly difficult to get people to use new and better stuff.

LiberianRedditor: How can I find out if I am the only Redditor in Liberia?

I visited a Liberian friend yesterday who teaches a computer class; they teach Microsoft Office by describing it, and drawing on pieces of paper. He was excited that he was able to obtain a mouse and bring it in for people to see, so that their understanding wouldn't be as abstract.

Julian Assange

Non-conformity is not the adoption of some pre-existing alternative subculture.

Alex Kolesar and Joseph Kovell: No Need for Bushido FAQ

Q: What happened to the art/writing?

A: It got better.

Stewart Brand: Long Now talk

It seems like most people ask: "How can I throw my life away in the least unhappy way?"

Douglas Engelbart: interview

I got this wild dream in my head about what would help mankind the most, to go off and do something dramatic, and I just happened to get a picture of how, if people started to learn to interact with computers, in collective ways of collaborating together, and this was way back in the early 50s, so it was a little bit premature. So anyways, I had some GI bill money left still so I could just go after that, and up and down quite a bit through the years, and I finally sort of gave up.

Will Wright: interview

Or somebody might actually initiate a sequence of actions on their computer in a very creative way and the computer might recognize that, send it up to the server, and say: "Wow, that was an interesting sequence, and that person likes doing comedy romances. Let's try that on ten other people tomorrow. If those ten people respond well, let's try it on a hundred the next day." So it could be that the things aren't just randomly discovered, but they're also observed from what the players did specifically.

Douglas Adams: talk

Everybody puts little hidden jokes in stuff from time to time. There are quite a few little jokes in my books which only the person they're directed at would get. It's a bit like people waving at complete strangers out of buses... just being friendly and saying hi.

The stuff at the beginning of Long Dark Tea Time Of The Soul about the harpsichord and the bailiffs was a joke at the expense of my great friend Michael Bywater, on whom the character of DG was to a certain extent based.

For instance in Life, The Universe And Everything I describe the way that the robot waiters and guests behave in the BistroMath ship. One of the guest robots keeps feeling under tables, insulting people and going on about some woman or other... I called him an AutoRory. Old friend of mine called Rory McGrath. That's exactly what he used to be like in restaurants. Don't know if he still is because I haven't gone to restaurants with him for a while, for obvious reasons. Not only did Rory get the joke. Anybody who had ever been to a restaurant with him or even just IN a restaurant with him got it.

Joshua Allen: Fireland

The site was a bunch of random pieces of writing. Fiction masquerading as non-fiction and vice versa, etc. Basically the same shit I'm doing today.

There have been many times when I've regretted investing so much time and energy in the internet. When I was embarrassed by the whole thing. But it occurred to me that just about every good thing I have in my life today has stemmed, directly or indirectly, from that site.

Matt Groening: interview at Mother Jones

[Re fighting FOX for creative control over Futurama] You can't believe what babies people are. It's really like being in junior high school. [With] the bullies, and every step of the way, any time I've been gracious, that has been -- it's seen as a sign of weakness. And every time I've yelled back, I've been treated with respect. That's just not very good psychology. The other thing is, it's just astonishing to have this lesson repeated over and over again: You can't expect people to behave in their own best interest. It's in Fox's best interest for this show to be a success, but they'd rather mess with the show and have them fail, than allow creators independence and let them succeed.

Paul Ford: The Web Is a Customer Service Medium

One can spend a lot of time defining a medium in terms of how it looks, what it transmits, wavelengths used, typographic choices made, bandwidth available. I like to think about media in terms of questions answered.

Kevin Kelly: My Life Countdown

I am now 55 years old. Like a lot of people in middle age my late-night thoughts bend to contemplations about how short my remaining time is. Even with increasing longevity there is not enough time to do all that I want. Nowhere close. My friend Stewart Brand, who is now 69, has been arranging his life in blocks of 5 years. Five years is what he says any project worth doing will take. From moment of inception to the last good-riddance, a book, a campaign, a new job, a start-up will take 5 years to play through. So, he asks himself, how many 5 years do I have left? He can count them on one hand even if he is lucky. So this clarifies his choices. If he has less than 5 big things he can do, what will they be?

Tom Stoppard: Arcadia

It is a defect of God's humour that he directs our hearts everywhere but to those who have a right to them.

Christopher Alexander: interview in Stewart Brand's "How Buildings Learn"

I think people have lost confidence in themselves. To a large extent, that's been done by members of my profession. The building has become the province of the architect; it's sort of his plaything. And the architects have worked quite hard to convince users that they don't know anything about architecture... It's even reached the point where people have interior decorators come choose their own wallpaper, for god's sake.

So this lack of confidence, which has been fostered in the population, is a manipulation that has actually been caused partly by the media, but largely by the [architecture] profession. It tremendously endangers the fabric of society, because if people lose confidence in themselves to that degree, then the adaptation of the environment, to common sense and to everyday use, disappears.

Jonathan Blow: talk: Video Games and the Human Condition

Game design is kind of a game by itself. I've made a bunch of puzzle games, and I've found that looking at a situation and saying, "How do I make an interesting puzzle out of this," is itself a really interesting puzzle. So there's this huge irony, that the companies that are making these social games that have basically no gameplay value in them are actually themselves playing a much more interesting game than the game that they're making for you to play. The game they're playing is this huge multidimensional optimization problem, trying to gather data and make the best decision and all that, but the game they're making for you to play is like clicking on a cow a bunch of times and you get some gold.

Jonathan Blow: talk: Design Reboot

When millions of people buy our game, we are pumping a (mental) substance into the (mental) environment. This is a public mental health issue. We have the power to shape humanity. How will we use it?

C.A.R. Hoare: The Emperor's Old Clothes

At first I hoped that such a technically unsound project would collapse, but I soon realized it was doomed to success. Almost anything in software can be implemented, sold, and even used given enough determination. There is nothing a mere scientist can say that will stand against the flood of a hundred million dollars. But there is one quality that cannot be purchased in this way - and that is reliability. The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay. ...

[Ada] has been initiated and sponsored by one of the world's most powerful organizations, the United States Department of Defense. Thus it is ensured of an influence and attention quite independent of its technical merits, and its faults and deficiencies threaten us with far greater dangers. For none of the evidence we have so far can inspire confidence that this language has avoided any of the problems that have afflicted other complex language projects of the past.

Alan Kay: The Early History of Smalltalk

A twentieth century problem is that technology has become too "easy". When it was hard to do anything whether good or bad, enough time was taken so that the result was usually good. Now we can make things almost trivially, especially in software, but most of the designs are trivial as well. This is inverse vandalism: the making of things because you can. Couple this to even less sophisticated buyers and you have generated an exploitation marketplace similar to that set up for teenagers. A counter to this is to generate enormous dissatisfaction with one's designs using the entire history of human art as a standard and goal. Then the trick is to decouple the dissatisfaction from self worth -- otherwise it is either too depressing or one stops too soon with trivial results.

Wikipedia: Inverse problem

The field of inverse problems was first discovered and introduced by the Soviet-Armenian physicist Viktor Ambartsumian. ... [His] paper was published in 1929 in the German physics journal Zeitschrift für Physik and remained in oblivion for a rather long time. Describing this situation after many decades, Ambartsumian said, "If an astronomer publishes an article with a mathematical content in a physics journal, then the most likely thing that will happen to it is oblivion."

Jay Rosen: PressThink Basics: The Master Narrative in Journalism

Individual reports we can summarize, index, and criticize... but there is no reliable index to replicating patterns in news coverage. Your local newscaster may tell you, "here's a list of stories we're working on for NewsFour at 11:00," but there is nowhere listed the story forms from which this repetitive content flows. A given work of journalism will have an author's byline, but in some measure the author is always "journalism" itself and its peculiar habits of mind. You can't interview that guy.

Alan Kay: Doing With Images Makes Symbols

Jacques Hadamard, the famous French mathematician, in the late stages of his life, decided to poll his 99 buddies, who made up together the 100 great mathematicians and physicists on the earth, and he asked them, "How do you do your thing?" They were all personal friends of his, so they wrote back depositions. Only a few, out of the hundred, claimed to use mathematical symbology at all. Quite a surprise. All of them said they did it mostly in imagery or figurative terms. An amazing 30% or so, including Einstein, were down here in the mudpies [doing]. Einstein's deposition said, "I have sensations of a kinesthetic or muscular type." Einstein could feel the abstract spaces he was dealing with, in the muscles of his arms and his fingers...

The sad part of [the doing -> images -> symbols] diagram is that every child in the United States is taught math and physics through this [symbolic] channel. The channel that almost no adult creative mathematician or physicist uses to do it... They use this channel to communicate, but not to do their thing. Much of our education is founded on those principles, that just because we can talk about something, there is a naive belief that we can teach through talking and listening.

William Thurston: On proof and progress in mathematics

When a significant theorem is proved, it often (but not always) happens that the solution can be communicated in a matter of minutes from one person to another within the subfield. The same proof would be communicated and generally understood in an hour talk to members of the subfield. It would be the subject of a 15- or 20-page paper, which could be read and understood in a few hours or perhaps days by members of the subfield.

Why is there such a big expansion from the informal discussion to the talk to the paper? One-on-one, people use wide channels of communication that go far beyond formal mathematical language. They use gestures, they draw pictures and diagrams, they make sound effects and use body language. Communication is more likely to be two-way, so that people can concentrate on what needs the most attention. With these channels of communication, they are in a much better position to convey what's going on, not just in their logical and linguistic facilities, but in their other mental facilities as well.

In talks, people are more inhibited and more formal. Mathematical audiences are often not very good at asking the questions that are on most people's minds, and speakers often have an unrealistic preset outline that inhibits them from addressing questions even when they are asked.

In papers, people are still more formal. Writers translate their ideas into symbols and logic, and readers try to translate back.

Steven Johnson: Where Good Ideas Come From

Technological and scientific advances rarely break out of the adjacent possible; the history of cultural progress is, almost without exception, a story of one door leading to another door... But of course, every now and then an idea does occur to someone that teleports us forward a few rooms, skipping some exploratory steps in the adjacent possible. But those ideas almost always end up being short-term failures... we call them "ahead of their time". ...

Babbage had most of [his Analytical Engine] sketched out by 1837, but the first true computer to use this programmable architecture didn't appear for more than a hundred years. While the Difference Engine engendered an immediate series of refinements and practical applications, the Analytical Engine effectively disappeared from the map. Many of the pioneering insights that Babbage had hit upon in the 1830s had to be independently rediscovered by the visionaries of World War II-era computer science.

Howard Rheingold: The Millennium Whole Earth Catalog

If you want to maintain independence in the era of large institutions and think fresh thoughts in the age of mass media, you are going to need good tools.

Mike Birkhead: Depth vs Breadth in Combat Design

Depth is the Knowledge of How, and breadth is the Knowledge of Why.

Chris Hecker: talk, NYU Game Center Lecture Series

Q: At the [NYU] Game Center, we're interested in the role of the university as an alternate place for thinking about games... What in your opinion are some of the big interesting problems that students should be working on?

A: My advice for students is... I question the question. I don't think there are problems that students should be working on. I think students should be making games that are interesting and push the boundaries, and those will generate the problems.

Clay Shirky: Why We Need the New News Environment to be Chaotic

News has to be free, because it has to spread. The few people who care about the news need to be able to share it with one another and, in times of crisis, to sound the alarm for the rest of us. Newspapers have always felt a tension between their commercial and civic functions, but when a publication drags access to the news itself over to the business side, as with the paywalls at The Times of London or the Tallahassee Democrat, they become Journalism as Luxury. In a future dominated by Journalism as Luxury, elites will still get what they need (a tautology in market economies), but most communities will suffer; imagine Bell, California times a thousand, with no Ruben Vives to go after the politicians.

Sebastian Deterding: Don't Play Games With Me! Promises and Pitfalls of Gameful Design

Dozens of psychological studies have consistently shown that giving expected extrinsic rewards for an activity (e.g. "If you do x, I will give you y amount of cash/points/...") often reduces intrinsic motivation of people to do it. The first reason is that people feel controlled by the person giving the rewards, reducing their sense of autonomy... Secondly, giving a reward for an activity sends a strong social signal that you don't consider the activity worth doing for its own sake.

Steven Johnson: Interface Culture

If you live your entire life under the spell of television, the mental world you inherit from the TV -- the supremacy of images over text, the passive consumption, a preference for live events over historical contemplation -- seems like second nature to you. Only when another medium rolls into view does the television's influence become perceptible. When those paradigm shifts arrive only once every few centuries, you have to be a genuine visionary or a lunatic to see beyond the limits of the form. McLuhan, of course, was a little of both.

Steven Johnson: Interface Culture

Looking back now... what strikes you about the early days of the desktop metaphor is how many people resisted the idea, and how many simply didn't get it at all. The viability of the graphic interface is so far beyond question now that it's difficult to remember that there was ever a dispute about it. But if you sift through the original reviews of the Mac and the Lisa... you can't help but be struck by how hard a time the critics had wrapping their minds around the new paradigm.

Some of the reviews of the graphic interface struck the ridiculous real-men-don't-do-windows chord... as in this wag from Creative Computing magazine:

Icons and a mouse will not make a non-literate person literate. Pointing at pictures can last only so long. Sooner or later you must stop pointing and selecting, and begin to think and type.

The opposition now seems completely out of place to us, accustomed as we are to the way spatial metaphors can augment thought -- but to those first critics, the visual language seemed like child's play, or a cartoon. Other reviews missed the point altogether, dismissing the Mac as a tool that only artists and designers would have use for, as though the machine's major innovation was MacPaint's spray can and not the interface itself. Consider the editorial from Forbes, dated February 13, 1984:

[The Macintosh's] best features are for computer novices: MacPaint, a program that creates graphic designs of stunning complexity, and MacWrite, a word-processing program that goes to ingenious lengths to set up the screen to look like a typewriter. Both are controlled by the machine's "mouse," which moves the cursor without the user's touching the keyboard. Such simplicity is not aimed at big corporations. The average middle manager has little need for the graphics capability of MacPaint. Most managers have a hard enough time writing reports, without having to worry about designing them as well.

The ease with which the author dismisses the brilliance of those original programs ("such simplicity") is breathtaking, of course, but even more arresting is how the graphic interface itself flies completely below his radar. There's not even a passing reference to the potential virtues of organizing information visually... There's a puzzling literalness to the language: the author sees a graphic interface and immediately assumes that it must be useful only for graphic artists. The broader conceptual liberation promised by the graphic interface doesn't even occur to him.

Sir James Lighthill: discussion following The Recently Recognized Failure of Predictability in Newtonian Dynamics

Q: Do you regard the chaos [within Newtonian mechanics] as immutable, forever remaining inexplicable; and that no new data, no more exact observations or no future theory will ever be able to explain it? I have in mind that the history of science has revealed time and time again a state of affairs where observed phenomena have been seen as irrational, inexplicable and 'chaotic' according to received theory and accepted laws of science but that subsequent refinement of the data and/or new hypotheses, by offering a new explanatory schema, have revealed that a new order lay unperceived within the older chaos....

A: Perhaps I should make it clear that the results I described are not 'scientific theories'. They are mathematical results, based upon rigorous 'proof' in the mathematical sense. They are not capable of alteration therefore.

Admittedly the history of science confirms that our understanding of natural laws is constantly being further refined. Newtonian dynamics is itself an illustration of this because we have long recognized it as only an approximation to the true laws of mechanics...

My lecture, however, was about the mathematical properties of systems assumed to obey exactly the laws of Newtonian dynamics. The behaviour of such systems had long been thought to be completely predictable but is now known, for a certain proportion of such systems, to be 'chaotic' in a well defined sense.

Alan Kay: Programming and Scaling

Leonardo could not invent a single engine for any of his vehicles. Maybe the smartest person of his time, but he was born in the wrong time. His IQ could not transcend his time. Henry Ford was nowhere near Leonardo, but he happened to be born in the right century, a century in which people had already done a lot of work in making mechanical things...

Knowledge, in many many cases, trumps IQ. Why? This is because there are certain special people who invent new ways of looking at things. Henry Ford was powerful because Isaac Newton changed the way Europe thought about things. One of the wonderful things about the way knowledge works is if you can get a supreme genius to invent calculus, those of us with more normal IQs can learn it. So we're not shut out from what the genius does. We just can't invent calculus by ourselves, but once one of these guys turns things around, the knowledge of the era changes completely.

Alan Kay: Programming and Scaling

So we've got this present, it comes out of one set of things in the past that we're vaguely aware of, and gives rise to an incremental future. But the truth is that the past is vast. It's enormous! There are billions of people contributing to the past. And every time we think the present is real, we cannot see the rest of the past. So we have to destroy the present.

Once you get rid of it, it's a scary situation, because you said, "I'm not going to have anything based on the past." Of course that's not possible; you're just trying. But sometimes you get a little feeling. And this is not an idea; it's just a feeling. It's like an odor of perfume. But the fun thing is that little feeling can actually lead you to look in the past in different places than you normally do, and you can bring those up to that feeling. And once you do that, that feeling starts expanding into a vision, and the vision expands into an actual idea... Some of the most creative people I know actually operate this way. This is where those ideas come from that are not just incremental to the present. They come out of vague, even muscular sensations, that you have to go chasing to find out what they are. If you try to get the idea too early, it can only be in terms of the present.

B.N. Delone: Mathematics: Its Content, Methods, and Meaning, p 193

The inventors of the infinitesimal analysis [calculus] were already in possession of Descartes' method [of analytic geometry]. Whether it was a question of tangents or normals to curves, or of maxima or minima of functions considered geometrically, or of the radius of curvature of a curve at a given point, etc., the equation of the curve was considered first, by the method of Descartes, and then the equations of the normal, the tangent, and so forth, were found. Thus infinitesimal analysis, namely the differential and integral calculus, would have been inconceivable without the preliminary development of analytic geometry.

Stewart Brand: interview on Marketplace

In [Whole Earth Catalog] I focused on individual empowerment, [but in Whole Earth Discipline] the focus is on the aggregate effects of humans on things like climate. And some of these issues are of such scale that you got to have the governments doing things like making carbon expensive. Or making coal expensive to burn and putting all that carbon into the atmosphere. And individuals can't do that, individual communities can't do that. It takes national governments.

Stewart Brand: foreword to Unbounding the Future: the Nanotechnology Revolution

[Nanotechnology] will arrive piecemeal and prominently, but the consequences will arrive at a larger scale and often invisibly.

Perspective from within a bursting revolution is always a problem because the long view is obscured by compelling immediacies and the sudden traffic of people new to the subject, some seizing opportunity, some viewing with alarm. Both optimists and pessimists about new technologies are notorious for their tunnel vision.

The temptation always is to focus on a single point of departure or a single feared or desired goal. Sample point of departure: What if we can make anything out of diamond? Sample feared/desired goal: What if molecular-scale medicine lets people live for centuries?

We're not accustomed to asking, What would a world be like where many such things are occurring? Nor do we ask, What should such a world be like?

Steven Levy: Insanely Great

Since gathering the wealth of Croesus was not Engelbart's goal, one might reasonably assume that his achievements brought him satisfaction. When I suggested as much, Engelbart curtly gestured to his system... He wanted this system, his system, everywhere. But he had no control over the future. His vision was at the mercy of those he inspired.

Steven Levy: Bill and Andy's Excellent Adventure II

Bill [Atkinson]'s problem with his employer's oversight was not so much ego, as a matter of his deeply ingrained sense of fairness. Bill has a radar for the personal angle, and the idea of one person gaining an unearned edge over another is loathsome to him. ... He thinks that "business as usual" is no excuse for not doing what's right.

The second thing crucial to Bill is his need to get his products out into the world. He bears scars from those times when a project of his failed to reach the public. He loved the idea that Apple bundled his MacPaint with every Macintosh, and he was crushed when the company decided that his post-Mac project, a flat-pad communicating computer called Magic Slate, was too esoteric a product to begin developing in 1985. He went into a depression, not working for months, until one night he wandered out of his house in the Los Gatos hills, stared at the star-filled sky, and had an epiphany: In the face of the awesome celestial epic, what was the point of being depressed? All you could do, really, was use your abilities to do what you could to make a little part of the universe better. And Bill Atkinson went back into the house and began using his abilities to work on a new project that would become known as HyperCard.

Dan Bricklin: interview

Q: Do you ever feel that the fame of VisiCalc has overshadowed some of your more recent accomplishments?

A: It had better. VisiCalc was a pretty big thing to have done, and I'm very happy that I had the opportunity to make such a big contribution to the world.

John Gall: Systemantics

A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system.

William S. Anglin: Mathematics and history

Mathematics is not a careful march down a well-cleared highway, but a journey into a strange wilderness, where the explorers often get lost. Rigour should be a signal to the historian that the maps have been made, and the real explorers have gone elsewhere.

Frank Lantz: re Ian Bogost

He probably would be happier if he hadn't made Cow Clicker, but he doesn't want to be happy. I don't know what his goal in life is, but it's not to be happy. He's definitely better off having made this thing that has made him so unhappy.

Gerald Jay Sussman: We Really Don't Know How To Compute! (40:30)

I'm only pushing this idea, not because I think it's the right answer. I'm trying to twist us, so we say, "This is a different way to think." We have to think fifty-two different ways to fix this problem. I don't know how to make a machine that builds a person out of a cell. But I think the problem is that we've been stuck for too long diddling with our details. We've been sitting here worrying about our type system, when we should be worrying about how to get flexible machines and flexible programming.

Trudy Cooper and Doug Bayne: Ask Oglaf anything

Writing's the exact opposite of jerking off -- hurts while you're doing it, but afterwards you're proud of the little mess you made.

Paul Baran: interview with Stewart Brand

You say, "My God, one day this is how we're going to build all our networks." It's such a wild thought that you ask, "Am I fooling myself?"

I took out the briefing charts and went around to present this idea [of digital packet switching], and got dumped all over. People said it wouldn't work because of this reason or that reason. I would study the problem and come back. A good wire-brushing like this was necessary. You see, these ideas were crazy. We were in an analog world. The image of a computer was a great big room with parts failing all the time. I said, "You can build computers in shoe box size. It's already happening on board airplanes."

E. T. Jaynes: Probability Theory, chapter 5: Queer uses for probability theory

Issuing reports of sensational data defeats its own purpose. For if the prior probability of deception is greater than that of ESP, then the more improbable the alleged data are on the null hypothesis of no deception and no ESP, the more strongly we are led to believe, not in ESP, but in deception...

Laplace perceived this phenomenon long ago... He notes that those who make recitals of miracles, "decrease rather than augment the belief which they wish to inspire; for then those recitals render very probable the error or the falsehood of their authors. But that which diminishes the belief of educated men often increases that of the uneducated, always avid for the marvelous."

Indeed, the [author] found himself a victim of this phenomenon... We applied Bayesian analysis to estimation of frequencies of nonstationary sinusoidal signals... We found -- as was expected on theoretical grounds -- an improved resolution over the previously used Fourier transform methods.

If we had claimed a 50% improvement, we would have been believed at once, and other researchers would have adopted this method eagerly. But in fact we found orders of magnitude improvement in resolution. It was, in retrospect, foolish of us to mention this at the outset, for in the minds of others the prior probability that we were irresponsible charlatans was greater than the prior probability that a new method could possibly be that good; and we were not at first believed.
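
A minimal sketch of the odds argument with invented numbers: three hypotheses compete to explain a sensational report, and as the report becomes more improbable under the no-deception, no-ESP null, the posterior mass flows to whichever alternative carries the larger prior -- deception, not ESP. The priors and likelihoods below are illustrative assumptions, not Jaynes's figures.

    # Hypothetical priors: deception is rare, genuine ESP far rarer still.
    priors = {"null": 1 - 1e-3 - 1e-9, "deception": 1e-3, "esp": 1e-9}

    def posterior(p_data_given_null):
        # Assume either alternative would explain the report equally well.
        like = {"null": p_data_given_null, "deception": 1.0, "esp": 1.0}
        joint = {h: priors[h] * like[h] for h in priors}
        z = sum(joint.values())
        return {h: round(joint[h] / z, 6) for h in joint}

    # The more improbable the data on the null, the more we are led to
    # believe in deception; ESP stays negligible because its prior is tiny.
    for p in (1e-2, 1e-6, 1e-12):
        print(p, posterior(p))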

Karl Popper: Realism and the Aim of Science

My subject does not exist because subject matters in general do not exist. There are no subject matters; no branches of learning -- or, rather, of inquiry: there are only problems, and the urge to solve them. A science such as botany or chemistry is, I contend, merely an administrative unit. University administrators have a difficult job anyway, and it is a great convenience to them to work on the assumption that there are some named subjects, with chairs attached to them to be filled by the experts in these subjects. It has been said that the subjects are also a convenience to the student. I do not agree: even serious students are misled by the myth of the subject. And I should be reluctant to call anything that misleads a person a convenience to that person.

James Clerk Maxwell: Scientific Papers of James Clerk Maxwell, Vol II

Mathematicians may flatter themselves that they possess new ideas which mere human language is as yet unable to express. Let them make the effort to express these ideas in appropriate words without the aid of symbols, and if they succeed they will not only lay us laymen under a lasting obligation, but, we venture to say, they will find themselves very much enlightened during the process, and will even be doubtful whether the ideas as expressed in symbols had ever quite found their way out of the equations into their minds.

Arturo Bejar: State Bundles for Persistence

Do you, Programmer,
take this Object to be part of the persistent state of your application,
to have and to hold,
through maintenance and iterations,
for past and future versions,
as long as the application shall live?

Alan Kay: The Early History of Smalltalk

New ideas go through stages of acceptance, both from within and without. From within, the sequence moves from "barely seeing" a pattern several times, then noting it but not perceiving its "cosmic" significance, then using it operationally in several areas, then comes a "grand rotation" in which the pattern becomes the center of a new way of thinking, and finally, it turns into the same kind of inflexible religion that it originally broke away from. From without, as Schopenhauer noted, the new idea is first denounced as the work of the insane, in a few years it is considered obvious and mundane, and finally the original denouncers will claim to have invented it.

Tevis Thompson: Saving Zelda

If Zelda is to reclaim any of the spirit that Miyamoto first invested in its world... it needs to make most of the map accessible from the beginning. No artificial barriers to clumsily guide Link along a set course... Link must be allowed to enter areas he's not ready for. He must be allowed to be defeated, not blocked, by the world and its inhabitants.

This world, dangerous, demanding exploration, must also be mysterious. This means: illegible, at least at first... How can you truly explore if you know how everything works already? How can you ever be surprised if every "secret" is conspicuously marked as such?

The point of a hero's adventure... is not to make you feel better about yourself. The point is to grow, to overcome, to in some way actually become better. If a legendary quest has no substantial challenge, if it asks nothing of you except that you jump through the hoops it so carefully lays out for you, then the very legend is unworthy of being told, and retold.

To do this, Hyrule must become more indifferent to the player. It must aspire to ignore Link. Zelda has so far followed a spirit of indulgence in its loving details, a carefully crafted adventure that reeks of quality and just-for-you-ness. But a world is not for you. A world needs a substance, an independence, a sense that it doesn't just disappear when you turn around (even if it kinda does). It needs architecture, not level design with themed wallpaper, and environments with their own ecosystems (which were doing just fine before you showed up). Every location can't be plagued with false crises only you can solve, grist for the storymill.

Richard Hamming: One Man's View of Computer Science (1969)

This brings me to another distinction, that between undirected research and basic research. Everyone likes to do undirected research and most people like to believe that undirected research is basic research. I am choosing to define basic research as being work upon which people will in the future base a lot of their work. After all, what else can we reasonably mean by basic research other than work upon which a lot of later work is based? I believe experience shows that relatively few people are capable of doing basic research. While one cannot be certain that a particular piece of work will or will not turn out to be basic, one can often give fairly accurate probabilities on the outcome... What determines whether or not a piece of work has much chance to become basic is not so much the question asked as it is the way the problem is attacked.

Danny Hillis: quoted in "What Technology Wants" by Kevin Kelly, p142

There might be tens of thousands of people who conceive the possibility of the same invention at the same time. But less than one in ten of them imagines how it might be done. Of these who see how to do it, only one in ten will actually think through the practical details and specific solutions. Of these only one in ten will actually get the design to work for very long. And finally, usually only one of all those many thousands with the idea will get the invention to stick in the culture.

Hans Christian Von Baeyer: Warmth Disperses and Time Passes: The History of Heat, p38

In 1823, [Sadi Carnot] was ready to publish what he had discovered. Before putting pen to paper, he had adopted two guidelines, each admirable in its own right, but fatal in combination: By neatly canceling each other out, they condemned his book to almost total oblivion. First, he decided to address himself to the general public rather than an audience of scientists and engineers. This decision establishes the book, which much later assumed its rightful place among the classics of science, as the last member of a noble tradition. Galileo himself had started the trend by writing in popular Italian instead of Latin, by keeping mathematical details to a minimum, and by perfecting a lively literary style. Galileo's writings were enormously influential, but after the time of Newton another genre, densely mathematical in content and highly professional in tone, had become predominant, particularly in physics.

Carnot's second guideline, and the essence of his greatness, was to embrace generality. Inspired by his father, who had written a successful book on the analysis of simple mechanical machines, Carnot undertook to develop a general theory of steam engines that would rise above the practical questions of design and materials that were of immediate interest to engineers.

A popular explanation of the advantages of the steam engine, or a general treatise of the theory of extracting work from heat, might have made its mark. But the public was too unsophisticated to understand a general theory, and the technical people too contemptuous to bother with what seemed to be a popularization of a complex subject. By trying to address two audiences at once, Carnot excluded both. His Reflections on the Motive Power of Fire received only one, albeit enthusiastic, review, and a decade later, three years after its author's death at thirty-six, one single citation in a science text alone bore the burden of keeping his memory alive.

Alan Kay: The Early History of Smalltalk

Even very young children can understand and use interactive transformational tools. The first ones are their hands! They can readily extend these experiences to computer objects and making changes to them. They can often imagine what a proposed change will do and not be surprised at the result... They can answer any question whose answer requires the application of just one of these tools. But it is extremely difficult for them to answer any question that requires two or more transformations. Yet they have no problem applying sequences of transformations, exploring "forward." It is for conceiving and achieving even modest goals requiring several changes that they almost completely lack navigation abilities.

It seems that what needs to be learned and taught is how to package up transformations in twos and threes in a manner similar to learning a strategic game like checkers. The vague sense of a "threesome" pointing towards one's goal can be a set up for the more detailed work that is needed to accomplish it.

Richard Gabriel: The Design of Parallel Programming Languages

John [McCarthy]'s world is a world of ideas, a world in which ideas don't belong to anyone, and when an idea is wrong, just the idea - not the person - is wrong. A world in which ideas are like young birds, and we catch them and proudly show them to our friends. The bird's beauty and the hunter's are distinct....

Some people won't show you the birds they've caught until they are sure, certain, positive that they - the birds, or themselves - are gorgeous, or rare, or remarkable. When your mind can separate yourself from your bird, you will share it sooner, and the beauty of the bird will be sooner enjoyed. And what is a bird but for being enjoyed?

Charles Babbage: quoted in "The Information" by James Gleick, p104

[Babbage] was developing a sour view of the Englishman's attitude toward technological innovation: "If you speak to him of a machine for peeling a potato, he will pronounce it impossible: if you peel a potato with it before his eyes, he will declare it useless, because it will not slice a pineapple."

Daniel Dennett: Darwin's Dangerous Idea, p346

I don't know about you, but I am not initially attracted by the idea of my brain as a sort of dung heap in which the larvae of other people's ideas renew themselves, before sending out copies of themselves in an informational diaspora.... Who's in charge, according to this vision -- we or our memes?

Tycho: Penny Arcade

Like most readers, I had functionally consigned [our game] to the furnace. I had let it float away on one of those little lantern boats in a way that brought me closure, if no one else. Insufficient. Fucking insufficient.

You have to get back on the horse. Somehow, and I don't know how this kind of thing starts, we have started to lionize horseback-not-getting-on: these casual, a priori assertions of inevitable failure, which is nothing more than a gauze draped over your own pulsing terror. Every creative act is open war against The Way It Is. What you are saying when you make something is that the universe is not sufficient, and what it really needs is more you. And it does, actually; it does. Go look outside. You can't tell me that we are done making the world.

James Gleick: The Information, p400

As a duplicating machine, the printing press not only made texts cheaper and more accessible; its real power was to make them stable... All forms of knowledge achieved stability and permanence, not because paper was more durable than papyrus but simply because there were many copies.

Alfred North Whitehead: An Introduction to Mathematics (1910), p20

From the earliest epoch (2634 BC) the Chinese had utilized the characteristic property of the compass needle, but do not seem to have connected it with any theoretical ideas. The really profound changes in human life all have the ultimate origin in knowledge pursued for its own sake.... The importance which the science of electromagnetism has since assumed in every department of human life is not due to the superior practical bias of Europeans, but to the fact that in the West electrical and magnetic phenomena were studied by men who were dominated by abstract theoretic interests.

Chris Hecker: interview

Q: Are you ever afraid of someone stealing your thunder, especially when you've been quite open about development and showing off your games for some time now - i.e. if a game comes out that happens to have the same puzzle hook as The Witness or the same kind of competitive aspects as Spy Party?

A: I think anybody really good is going to want to do their own thing. Anybody who's not really good, you don't have to worry too much about.

Jon Gertner: The Idea Factory: Bell Labs and the Great Age of American Innovation

It was curious, in a way, who they were, these men coming to Bell Labs in New York. Most had been trained at first-rate graduate schools like MIT and Chicago and Caltech; they had been flagged by physics or chemistry or engineering professors at these places and their names had been quietly passed along to Mervin Kelly or someone else at the Labs. But most had been raised in fly-speck towns, intersections of nowhere and nowhere, places with names like Chickasa or Quaker Neck or Petoskey, towns like the one Kelly had come from, rural and premodern like Gallatin, towns where their fathers had been fruit growers or merchants or small-time lawyers. Almost all of them had found a way out -- a high school teacher, oftentimes, who noticed something about them, a startling knack for mathematics, for example, or an insatiable curiosity about electricity, and had tried to nurture this talent with extra assignments or after-school tutoring, all in the hope (never explained to the young men but realized by them all, gratefully, many years later) that the students could be pushed toward a local university and away from the desolation of a life behind a plow or a cash register.

The young Bell Labs recruits had other things in common. Almost all had grown up with a peculiar desire to know more about the stars or the telephone lines or (most often) the radio, and especially their makeshift home wireless sets. Almost all of them had put one together themselves, and in turn had discovered how sound could be pulled from the air.

Will Wright: Gaming Reality

We have this limited bubble of experience. We can only have so many experiences in our lifetime to build models from, and we're abstracting from that data. We've found, through evolution, two ways to get more data, to build more elaborate models of the world. One is to have toy experiences, little counterfeit experiences. The other one is to learn from the experience of others. When somebody tells you a story, you can actually learn from that story, incorporate it into your model of the world to make your model more accurate based upon that data that you got from somebody else. So over time, we have come to call one of these things "play" and the other one "storytelling". These are both fundamentally educational technologies that allow us to build more elaborate models of the world around us, by supplanting our limited experience with other experiences.

Charles Geschke: quoted in "Dealers of Lightning" by Michael Hiltzik, p273

The typical posture and demeanor of the Xerox executives, and all of them were men, was this -- [arms folded sternly across the chest]. But their wives would immediately walk up to the machines and say, "Could I try that mouse thing?" That's because many of them had been secretaries -- users of the equipment. These guys, maybe they punched a button on a copier one time in their lives, but they had someone else do their typing and their filing. So we were trying to sell to people who really had no concept of the work this equipment was actually accomplishing.

Alan Kay: The Center of "Why?"

Living organisms are shaped by evolution to survive, not necessarily to get a clear picture of the universe. For example, frogs' brains are set up to recognize food as moving objects that are oblong in shape. So if we take a frog's normal food -- flies -- paralyze them with a little chloroform and put them in front of the frog, it will not notice them or try to eat them.

It will starve in front of its food! But if we throw little rectangular pieces of cardboard at the frog it will eat them until it is stuffed! The frog only sees a little of the world we see, but it still thinks it perceives the whole world.

Now, of course, we are not like frogs! Or are we?

Alan Kay: quoted in "Dealers of Lightning" by Michael Hiltzik

It's almost impossible for most people to see technology as the tool rather than the end. People get trapped in thinking that anything in the environment is to be taken as a given. It's part of the way our nervous system works. But it's dangerous to take it as a given because then it controls you, rather than the other way around. That's McLuhan's insight, one of the bigger ones in the twentieth century. Zen in the twentieth century is about taking things that have been rendered invisible by this process and trying to make them visible again.

Thomas Kuhn: Reflection on my Critics

[Wikipedia's paraphrase] Kuhn expressed the opinion that his critics' readings of his book were so inconsistent with his own understanding of it that he was "...tempted to posit the existence of two Thomas Kuhns," one the author of his book, the other the individual who had been criticized in the symposium by "Professors Popper, Feyerabend, Lakatos, Toulmin and Watkins."

Nina Paley: interview about "Sita Sings the Blues"

Q: Do you think that women directors bring a distinctive perspective and if so, how would you describe it? How would this story be different if told by a man? Or would a man not tell this story?

A: This story was told by me as an individual. An individual brings their individual characteristics and experience to a story. I happen to be a woman, but I'm a specific woman, not womankind in general. I can't tell you how other women would direct a particular film, or other men. We're all unique.

Richard Hamming: The Art of Doing Science and Engineering, ch 4

In the beginning we programmed in absolute binary... Finally, a Symbolic Assembly Program was devised -- after more years than you are apt to believe during which most programmers continued their heroic absolute binary programming. At the time [the assembler] first appeared I would guess about 1% of the older programmers were interested in it -- using [assembly] was "sissy stuff", and a real programmer would not stoop to wasting machine capacity to do the assembly.

Yes! Programmers wanted no part of it, though when pressed they had to admit their old methods used more machine time in locating and fixing up errors than the [assembler] ever used. One of the main complaints was when using a symbolic system you do not know where anything was in storage -- though in the early days we supplied a mapping of symbolic to actual storage, and believe it or not they later lovingly pored over such sheets rather than realize they did not need to know that information if they stuck to operating within the system -- no! When correcting errors they preferred to do it in absolute binary.

FORTRAN was proposed by Backus and friends, and again was opposed by almost all programmers. First, it was said it could not be done. Second, if it could be done, it would be too wasteful of machine time and capacity. Third, even if it did work, no respectable programmer would use it -- it was only for sissies!

Max Planck

New scientific ideas never spring from a communal body, however organized, but rather from the head of an individually inspired researcher who struggles with his problems in lonely thought and unites all his thought on one single point which is his whole world for the moment.

Max Planck

A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.

David Hestenes: Notes for a Modeling Theory of Science, Cognition, and Instruction

While science is a search for structure, mathematics is the science of structure.

John D. Cook: Most published research results are false

Here's an example that shows how p-values can be misleading. Suppose you have 1,000 totally ineffective drugs to test. About 1 out of every 20 trials will produce a p-value of 0.05 or smaller by chance, so about 50 trials out of the 1,000 will have a "significant" result, and only those studies will publish their results. The error rate in the lab was indeed 5%, but the error rate in the literature coming out of the lab is 100%!
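
Cook's arithmetic is easy to check by simulation. Here is a minimal sketch (mine, not from Cook's post), assuming 1,000 trials of a drug with no effect at all, two groups of 50 noise measurements per trial, and a two-sample t-test:

    # Sketch (not from Cook's post): 1,000 trials of an ineffective drug;
    # count how many reach "significance" by chance alone.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    false_positives = 0
    for _ in range(1000):
        control = rng.normal(0, 1, 50)  # placebo group: pure noise
        treated = rng.normal(0, 1, 50)  # "drug" group: same distribution
        _, p = stats.ttest_ind(control, treated)
        if p < 0.05:
            false_positives += 1

    # Expect roughly 50. If only these trials are published, every
    # published result is a false positive: 100% error in the literature.
    print(false_positives)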

Freeman Dyson: quoted in Richard Feynman: No Ordinary Genius

[The Feynman diagram approach to quantum electrodynamics] was combining this very pictorial approach with strict adherence to quantum mechanics. And that's what made it so original. Quantum mechanics is generally regarded as a theory of waves. Feynman was able to do it by ignoring the wave aspect completely. The pictures show you just particles traveling along in straight lines. These then were translated into mathematics, but in a very simple fashion, so that once you had the geometrical picture, it was simple to go straight to the answer. And that made his methods very powerful, as compared to the conventional way of doing things, which is much more analytical.

Clay Shirky: Napster, Udacity, and the Academy

Every college provides access to a huge collection of potential readings, and to a tiny collection of potential lectures. We ask students to read the best works we can find, whoever produced them and where, but we only ask them to listen to the best lecture a local employee can produce that morning. Sometimes you're at a place where the best lecture your professor can give is the best in the world. But mostly not. And the only thing that kept this system from seeming strange was that we've never had a good way of publishing lectures. ...

The fight over [massive open online courses] is really about the story we tell ourselves about higher education: what it is, who it's for, how it's delivered, who delivers it. The most widely told story about college focuses obsessively on elite schools and answers a crazy mix of questions: How will we teach complex thinking and skills? How will we turn adolescents into well-rounded members of the middle class? Who will certify that education is taking place? How will we instill reverence for Virgil? Who will subsidize the professor's work?

Carver Mead: "Feynman as a Colleague", chapter in "Feynman and Computation"

Gordon Moore asked me whether tunneling would be a major limitation on how small we could make transistors in an integrated circuit. That question took me on a detour that was to last nearly 30 years.... I decided to make the question the subject of a talk. As I prepared for this event, I began to have serious doubts about my sanity. My calculations were telling me that, contrary to all the current lore in the field, we could scale down the technology such that everything got better. The circuits got more complex, they ran faster, and they took less power -- WOW! The more I looked at the problem, the more I was convinced that the result was correct, so I went ahead and gave the talk. That talk provoked considerable debate, and at the time most people didn't believe the result. But by the time the next workshop rolled around, a number of other groups had worked through the problem for themselves, and we were pretty much all in agreement.

Back in 1959, Feynman had given a lecture entitled "There's Plenty of Room at the Bottom". That talk had made a big impression on me... I became completely absorbed with how the exponential increase in complexity of integrated circuits would change the way that we think about computing. The viewpoint of the computer industry at the time was an outgrowth of the industrial revolution; it was based on what was then called "the economy of scale." A 1000-horsepower engine cost only four times as much as a 100-horsepower engine. Therefore, the cost per horsepower became less as the engine was made larger. It was more cost effective to make a few large power plants than to make many small ones. Efficiency considerations favored the concentration of technology in a few large installations. The same was evidently true of computing. IBM was particularly successful following this strategy.

But as I looked at the physics of the emerging technology, it didn't work that way at all. The time required to move data was set by the velocity of light and related electromagnetic considerations, so it was far more effective to put whatever computing was required where the data were located. Efficiency considerations thus favored the distribution of technology, rather than the concentration of technology. The economics of information technology were the reverse of those of mechanical technology. I gave numerous talks on this topic and, at that time, what I had to say was contrary to what the industry wanted to hear.

Douglas Hofstadter: interview about "I Am a Strange Loop"

Q: Will the mind one day understand itself?

Depends on what you mean by understand itself. If you mean in broad-principle terms if we will come to understand things, yeah, I don't see why not. For example, I like to look back at Freud. I don't know when it was that he first published his ideas about the ego, the id and the superego, and I don't know how much truth there is to those ideas, but it was a big leap even if it wasn't completely correct, because nobody had ever spoken of the abstract architecture of a human soul or a human self. It's as if he were saying that a self can be thought of in an abstract way, the way a government is thought of, with a legislative branch, a judicial, an executive, and he was making guesses at what the architecture of a human self is. And maybe they were all wrong, but it doesn't matter; the point is it was a first stab. Like the Bohr atom, it was a wonderful intuitive leap.

Freeman Dyson: The Scientist as Rebel

My message is that science is a human activity, and the best way to understand it is to understand the individual human beings who practice it. Science is an art form and not a philosophical method. The great advances in science usually result from new tools rather than from new doctrines. ... Every time we introduce a new tool, it always leads to new and unexpected discoveries, because Nature's imagination is richer than ours.

John Holt: How Children Fail

Knowledge, learning, understanding, are not linear. They are not little bits of facts lined up in rows or piled up one on top of another. A field of knowledge, whether it be math, English, history, science, music, or whatever, is a territory, and knowing it is not just a matter of knowing all of the items in the territory, but of knowing how they relate to, compare with, and fit in with each other... It is the difference between knowing the names of all the streets in a city and being able to get from any place, by any desired route, to any other place.

Why do we talk and write about the world and our knowledge of it as if they were linear? Because that is the nature of talk. Words come out in single file, one at a time; there's no other way to talk or write. So in order to talk about it, we cut the real undivided world into little pieces, and make these into strings of talk, like beads on a necklace. But we must not be fooled; these strings of talk are not what the world is like. Our learning is not real, not complete, not accurate, above all not useful, unless we take these word strings and somehow convert them in our minds into a likeness of the world, a working mental model of the universe as we know it. Only when we have made such a model, and when there is at least a rough correspondence between that model and reality, can it be said of us that we have learned something.

John Holt: How Children Fail

I gave Marjorie 2 rods, and asked how many differently shaped rectangles she could make by putting them together. She saw there was only one... With 4 rods, there were two possible rectangles, a 1 x 4 and a 2 x 2. And so we worked our way up to 20, finding the factors of each number along the way... At no time on the way up to 20 did it occur to her that she could solve the problem by making use of what little she knew about factors. Given 10 rods, she did not think, "I can make a rectangle 5 rods long and 2 wide"; she had to work by trial and error each time. But she did get progressively quicker at seeing which combinations were possible and which were not.

I did not see until later that this increased quickness and skill was the beginning, the seed of a generalized understanding. An example comes to mind that was repeated many times. When the children had 12 rods, they made a 6 x 2 rectangle. Then they divided that rectangle in half and put the halves together to make a 4 x 3 rectangle. As they worked, their attack on the problem became more economical and organized. They were a long way from putting their insights and understandings into words, but they were getting there. The essential is that this sort of process not be rushed.

John Holt: How Children Learn

I went back to the [sliding pieces] puzzle many times, hoping that I would find some fresh approach to it; but my mind kept moving back into the little groove it had made for itself. I tried to make myself forget my supposed proof that the problem was impossible. No use. Before long I would be back at the business of trying to find the flaw in my reasoning. I never found it. Like many other people, in many other situations, I had reasoned myself into a box. Looking back at the problem, I saw my great mistake. I had begun to reason too soon, before I had allowed myself enough "Messing About," before I had built a good enough mental model of the ways in which those pieces moved, before I had given myself enough time to explore all of the possible ways in which they could move. The reason some of the children were able to do the puzzle was not that they did it blindly, but that they did not try to solve it by reason until they had found by experience what the pieces could do. Because their mental model of the puzzle was complete, it served them; because mine was incomplete, it failed me.

David Hawkins: Messing About in Science

When the mind is evolving the abstractions which will lead to physical comprehension, all of us must cross the line between ignorance and insight many times before we truly understand.

Loren Eiseley: The Man Who Saw Through Time

We of today have difficulty in realizing that the world of Bacon and Shakespeare was only semiliterate, steeped in religious contention, with its gaze turned backward in wonder upon the Greco-Roman past. Oswald Spengler justly remarks that human choice is only possible within the limitations and idea-forms of a given age. More than three hundred years ago, Francis Bacon would have understood him. Bacon's world horribly constricted his ability to exert his will upon it. At the same time he would have had a slight reservation. "Send out your little book upon the waters," he would have countered, "and hope. Your will may be worked beyond you in another and more favorable age."

Carver Mead: keynote at Caltech EE centennial

My first experience with electronics was with vacuum tubes. And when I was a high school kid, I read about this "transistor" thing the Bell guys had invented. What I didn't realize was that Bill Shockley had basically, out of whole cloth, invented the semiconductor physics viewpoint and way of looking at things, and some of the devices that came along. And that's what we've been living on ever since. Absolutely fundamental to everything we do. Wasn't seen that way at the time, but that's what it turned out to be.

Carver Mead: keynote at Caltech EE centennial (closing)

Those things are going to be important. But -- the same way that Royal Sorensen didn't see information as a thing he could put his life work into -- right now, today, we can't see the thing, at all, that's going to be the most important 100 years from now.

Jerome Bruner: Toward a Theory of Instruction

The first response of educational systems under such acceleration [of societal technology] is to produce technicians and engineers and scientists as needed, but it is doubtful whether such a priority produces what is required to manage the enterprise. For no specific science or technology provides a metalanguage in terms of which to think about a society, its technology, its science, and the constant changes that these undergo with innovation. Could an automotive engineer have foreseen the death of small-town America with the advent of the automobile? He would have been so wedded to his task of making better and better automobiles that it would never have occurred to him to consider the town, the footpath, leisure, or local loyalty. Somehow, if change is to be managed, it requires men with skills in sensing continuity and opportunity for continuity. ...

A further speculation about preparation for change is that we are bound to move toward instruction in the sciences of behavior and away from the study of history... It has to do with the need for studying the possible rather than the achieved -- a necessary step if we are to adapt to change.

Jerome Bruner: Toward a Theory of Instruction, p 63

What is now a problem is how to "detach" the notations that the child has learned from the concrete, visible, manipulable embodiment to which it refers -- the wood. For if the child is to deal with mathematical properties he will have to deal with symbols per se, else he will be limited to the narrow and rather trivial range of symbolism that can be given direct (and only partial) visual embodiment. Concepts such as x² and x³ may be given a visualizable referent, but what of xⁿ?

Alan Kay: interview

The thing that traumatized me occurred a couple years later, when I found an old copy of Life magazine that had the Margaret Bourke-White photos from Buchenwald. This was in the 1940s -- no TV, living on a farm. That's when I realized that adults were dangerous. Like, really dangerous. I forgot about those pictures for a few years, but I had nightmares. But I had forgotten where the images came from. Seven or eight years later, I started getting memories back in snatches, and I went back and found the magazine. That probably was the turning point that changed my entire attitude toward life. It was responsible for getting me interested in education. My interest in education is unglamorous. I don't have an enormous desire to help children, but I have an enormous desire to create better adults.

Hippolyte Taine: On Intelligence (quoted in Jacques Hadamard: The Psychology of Invention in the Mathematical Field)

You may compare the mind of a man to the stage of a theatre, very narrow at the footlights but constantly broadening as it goes back. At the footlights, there is hardly room for more than one actor. ... As one goes further and further away from the footlights, there are other figures less and less distinct as they are more distant from the lights. And beyond these groups, in the wings and altogether in the background, are innumerable obscure shapes that a sudden call may bring forward and even within direct range of the footlights. Undefined evolutions constantly take place throughout this seething mass of actors of all kinds, to furnish the chorus leaders who in turn, as in a magic lantern picture, pass before our eyes.

Alan Kay: Turing Award talk (45:54)

I have a zillion prejudices. I love parallelism. I think because I learned to program plugboards before I learned to program a computer... I love hardware like the B5000... I love Lisp... JOSS was the most beautiful programming language ever done. It could hardly do anything, but it did it beautifully. It's an interesting challenge to take something of this beauty and try to scale it. You combine these two together and you got the original Logo... I love APL. All of these systems can be done in a different way. Basically, the love of these things is because these guys got to some special kernel. I love what Engelbart did. I love spreadsheets. I love HyperCard.

Suppose you amalgamate all of these wonderful things into a simple system that regular people could use.

James Gleick: Genius: The Life and Science of Richard Feynman, p 387

At one point Goodstein remarked, "You know, it's amazing that Watson made this great discovery even though he was so out of touch with what everyone in his field was doing."

Feynman held up the paper he had been writing on. Amid scribbling and embellishments he had inscribed one word: DISREGARD.

"That's what I've forgotten," he said.

Saul Griffith: profile in Wired

We write all of our own tools, no matter what project we're building. Pretty much anything that we're doing requires some sort of design tool that didn't exist before. In fact, the design tools that we write to do the projects that we're doing are a sort of product in and of themselves.

I think in reality, today, if you use the same tools as everyone else, you kind of build the same products. If you write your own tools, you can sort of see new things, design new things.

Charles Bloom: Some GDC Observations

I saw one really amazing game at GDC that stood out from the rest. It had all the players instantly smiling and laughing. It was fun for kids and adults. It created a feeling of group affinity. Everyone around wanted to join in. It was even beneficial to the body. It was an inflatable ball.

William Tozier: Down is just the most common way out

Nobody would believe me if I came right out and said that I create the field to suit the work I want to do. On the fly; not from whole cloth, but from the chunks of other fields as needed. Nor will they believe you, when you are cured of your profession and start to merely do what's called for to make yourself useful.

Kieran Egan: Why Education is Difficult and Contentious (29:49)

We don't actually think about our institutions. We think through them. We take for granted the institutions that surround us, and they frame the ways we think about the world.

W. D. Niven: preface to "The Scientific Papers of James Clerk Maxwell" (1890)

One striking characteristic [of James Clerk Maxwell as an undergraduate] was remarked by his contemporaries. Whenever the subject admitted of it he had recourse to diagrams, though his fellow students might solve the question more easily by a train of analysis. Many illustrations of this manner of proceeding might be taken from his writings, but in truth it was only one phase of his mental attitude towards scientific questions, which led him to proceed from one distinct idea to another instead of trusting to symbols and equations.

Hal Abelson: interview

Q: So it’s fine to say, everybody should learn a little bit about how to program and this way of thinking because it’s valuable and important. But then maybe that’s just not realistic. Donald Knuth told me that he thinks two percent of the population have brains wired the right way to think about programming.

A: That same logic would lead you to say that one percent of the US's population is wired to understand Mandarin. The reasoning there is equivalent.

William Gibson: interview, 9/4/2012

[re "cyberpunk"] Once's there's a label for it, it's all over.

Ward Cunningham: interview in Dr. Dobbs

The basic [software design] patterns have done their job. And then of course, you should ask, "What was that job?" And this is the thing that surprised me. The job was really to take C++, which was a fairly static language, and show people how to write dynamic programs in a static language. That's what most of the patterns in that book were about. And in the process, patterns extended the life of C++ by a decade, which is not what I thought would happen.

What I thought would happen is people, when they learned these patterns, would look at them and say, "Wow, these patterns are hard in C++ and they're easy in Smalltalk. So if I want to think in terms of these patterns, I might as well use a language where they're easily expressed." And extend the life of Smalltalk by a decade. But the opposite happened.

Wikipedia: Ivan Illich

According to a contemporary review in The Libertarian Forum, "[Ivan] Illich's advocacy of the free market in education is the bone in the throat that is choking the public educators." It is important to note, though, that Illich's opposition was not merely to publicly funded schooling, as with the libertarians, but to schooling as such; the disestablishment of schools was for him not a means to a free market in educational services, but a deschooled society, which was a more fundamental shift... He actually opposed advocates of free-market education as "the most dangerous category of educational reformers."

Gerald M. Weinberg: The Psychology of Computer Programming

How many programmers [today] learn to write programs by reading programs? ... With the advent of terminals, things are getting worse, for the programmer may not even see his own program in a form suitable for reading. In the old days, programmers would while away the time by reading each other’s programs. Some even went so far as to read programs from the program library -- which in those days was still a library in the old sense of the term.

John Herschel: A Preliminary Discourse on the Study of Natural Philosophy (1831)

For example, the words -- square, circle, a hundred, etc. -- convey to the mind notions so complete in themselves, and so distinct from everything else, that we are sure when we use them we know the whole of our own meaning. It is widely different with words expressing natural objects and mixed relations.

Take, for instance, IRON. Different persons attach very different ideas to this word. One who has never heard of magnetism has a widely different notion of IRON from one in the contrary predicament. The vulgar, who regard this metal as incombustible, and the chemist, who sees it burn with the utmost fury, and who has other reasons for regarding it as one of the most combustible bodies in nature; -- the poet, who uses it as an emblem of rigidity; and the smith and the engineer, in whose hands it is plastic, and moulded like wax into every form; -- the jailer, who prizes it as an obstruction, and the electrician who sees in it only a channel of open communication by which -- that most impassable of objects -- air may be traversed by his imprisoned fluid, have all different, and all imperfect, notions of the same word.

The meaning of such a term is like a rainbow, -- every body sees a different one, and all maintain it to be the same.

Danny Hillis: The Power of Conviction

To an architect, imagination is mostly about the future. To invent the future, one must live in it, which means living (at least partly) in a world that does not yet exist. Just as a driver whizzing along a highway pays more attention to the front window than the rear, the architect steers by looking ahead. This can sometimes make them seem aloof or absent-minded, as if they are someplace else. In fact, they are. For them, the past is a lesson, the present is fleeting; but the future is real. It is infinite and malleable, brimming with possibility.

Marc Ettlinger: Quora: What are some English language rules that native speakers don't know, but still follow?

Almost everything we know about our native languages is what's called implicit knowledge. Stuff we don't know that we know, or stuff that we can't really describe, but we can do anyway. Like maybe riding a bike, or walking.

Carver Mead: Collective Electrodynamics, p 113

We can view nature as being continuous in both space and time. This picture of nature is what Einstein wanted most. But to arrive at this picture, we had to give up the one-way-direction of time, and allow coupling to everything on the light cone. This, too, was okay with Einstein. So why was he so hung up on local causality? Why do all the textbooks state that the coupling of states unified by a light cone is a violation of relativity? In science, as in all other aspects of human endeavor, each age has its blind spots, and what is impossible to comprehend in one generation seems natural and obvious to another. So, after only one generation, Zeh could say, "There are no quantum jumps, nor are there particles."

Ted Nelson: I Think I Know Who Satoshi Is, 3:35

Also like Satoshi [Nakamoto], I do not bother much with conventional academic publishing, and similarly count on posterity to understand and prove me right, since the present world doesn't get it. To work alone in this way takes insight, determination, and chutzpah, and no clinician could distinguish our condition from paranoia. Only posterity can sort it out. As Woody Allen says, posterity is the religion of the intellectuals.

Hermann Grassmann: Die Ausdehnungslehre, 1862 (4,I,II; 10), quoted in Crowe’s History of Vector Analysis, p 89

For I remain completely confident that the labor which I have expended on the science presented here and which has demanded a significant part of my life as well as the most strenuous application of my powers, will not be lost. It is true that I am aware that the form which I have given the science is imperfect and must be imperfect. But I know and feel obliged to state (though I run the risk of seeming arrogant) that even if this work should again remain unused for another seventeen years or even longer, without entering into the actual development of science, still that time will come when it will be brought forth from the dust of oblivion and when ideas now dormant will bring forth fruit. I know that if I also fail to gather around me in a position (which I have up to now desired in vain) a circle of scholars, whom I could fructify with these ideas, and whom I could stimulate to develop and enrich further these ideas, nevertheless there will come a time when these ideas, perhaps in a new form, will arise anew and will enter into living communication with contemporary developments. For truth is eternal and divine, and no phase in the development of truth, however small may be the region encompassed, can pass on without leaving a trace; truth remains, even though the garment in which poor mortals clothe it may fall to dust.

Michael Crowe: History of Vector Analysis

It was around this time that the ideas of the founders of non-Euclidean geometry, Nicholas Lobachevski and Janos Bolyai, were becoming known. It is important to realize that Hamilton, by creating the first extensive and consistent algebraic system that departed from at least one of the standard properties of traditional mathematics [commutative multiplication] issued in a development that was probably as significant for algebra as the non-Euclidean systems were for geometry. Perhaps the most significant message carried by Hamilton’s creation is that it is legitimate for mathematicians to create new algebraic systems that break traditional rules.
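
For concreteness (this gloss is mine, not Crowe's), the rule Hamilton abandoned shows up directly in the defining relations of the quaternion units:

    \[
    i^2 = j^2 = k^2 = ijk = -1, \qquad
    ij = k = -ji, \quad jk = i = -kj, \quad ki = j = -ik.
    \]

So ij and ji differ in sign: the order of the factors matters, which is exactly the break with commutative multiplication described above.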

Gerald Jay Sussman: We Really Don't Know How To Compute! (24:21)

In the future, it's going to be the case that computers are so cheap and so easy to make, that you can make them the size of a grain of sand, complete with a megabyte of RAM. You're going to buy them by the bushel. You can pour them into your concrete, you buy your concrete by the megaflop, and you have a wall that's smart. So long as you can get the power to them, and they can do something, that's going to happen. Remember, your cells are pretty smart... they seem to talk to each other and do useful things.

Benoit Mandelbrot: The Fractal Geometry of Nature, p 21

In a letter to Dedekind, at the very beginning of the 1875-1925 crisis in mathematics, Cantor is overwhelmed by amazement at his own findings, [exclaiming] "to see is not to believe". And, as if on cue, mathematics seeks to avoid being misled by the graven images of monsters. What a contrast between the rococo exuberance of pre- or counterrevolutionary geometry, and the near-total visual barrenness of the works of Weierstrass, Cantor, or Peano! In physics, an analogous development threatened since about 1800, since Laplace's Celestial Mechanics avoided all illustration. And it is exemplified by the statement by P. A. M. Dirac (in the preface of his 1930 Quantum Mechanics) that nature's "fundamental laws do not govern the world as it appears in our mental picture in any very direct way, but instead they control a substratum of which we cannot form a mental picture without introducing irrelevancies."

The wide and uncritical acceptance of this view has become destructive. In particular, in the theory of fractals, "to see is to believe."

Benoit Mandelbrot: The Fractal Geometry of Nature, p 22

Graphics is wonderful for matching models with reality... A formula can relate to only a small aspect of the relationship between model and reality, while the eye has enormous powers of integration and discrimination. True, the eye sometimes sees spurious relationships which statistical analysis later negates, but this problem arises mostly in areas of science where samples are very small. In the areas we shall explore, samples are huge.

William Cleveland: Visualizing Data, p 270

Sometimes, when visualization thoroughly reveals the structure of a set of data, there is a tendency to underrate the power of the method for the application. Little effort is expended in seeing the structure once the right visualization method is used, so we are misled into thinking nothing exciting has occurred. The [previous example] might be such a case. The intensive visualization showed a linearity in hardness, a nonlinearity in tensile strength, an interaction between hardness and tensile strength, and three aberrant observations... It might be thought that anyone analyzing these data would uncover these properties. This is not the case. In the original treatment, the analysis got it wrong. They operated within a paradigm of numerical methods and probabilistic inference for data analysis, and not intensive visualization. They missed the nonlinearity. They missed the interaction. They missed the outliers. In other words, they missed most of the structure in the data.

Herbert Simon: The Sciences of the Artificial, p 153

All mathematics exhibits in its conclusions only what is already implicit in its premises... Hence all mathematical derivation can be viewed simply as change in representation, making evident what was previously true but obscure.

This view can be extended to all of problem solving -- solving a problem simply means representing it so as to make the solution transparent. If problem solving could actually be organized in these terms, the issue of representation would indeed become central. But even if it cannot -- if this is too exaggerated a view -- a deeper understanding of how representations are created and how they contribute to the solution of problems will become an essential component in the future theory of design.

M Mitchell Waldrop: The Dream Machine: J.C.R. Licklider & the Revolution That Made Computing Personal, p 12

Considering all that happened later, Lick's youthful passion for psychology might seem like an aberration, a sideline, a twenty-five-year-long diversion from his ultimate career in computers. But in fact, his grounding in psychology would prove central to his very conception of computers. Virtually all the other computer pioneers of his generation would come to the field in the 1940s and 1950s with backgrounds in mathematics, physics, or electrical engineering, technological orientations that led them to focus on gadgetry -- on making the machines bigger, faster, and more reliable. Lick was unique in bringing to the field a deep appreciation for human beings: our capacity to perceive, to adapt, to make choices, and to devise completely new ways of tackling apparently intractable problems. As an experimental psychologist, he found these abilities every bit as subtle and as worthy of respect as a computer's ability to execute an algorithm. And that was why to him, the real challenge would always lie in adapting computers to the humans who used them, thereby exploiting the strengths of each.

M Mitchell Waldrop: The Dream Machine, p 20

[Norbert Wiener made his many contributions] in a style that left his more conventional colleagues shaking their heads. Instead of treating mathematics as a formal exercise in the manipulation of symbols, Wiener worked by intuition, often groping his way toward a solution by trying to envision some physical model of the problem. He considered mathematical notation and language necessary evils at best -- things that tended to get in the way of the real ideas.

M Mitchell Waldrop: The Dream Machine, p 340

In trying to divide the two labs [CSL and SSL at Xerox PARC] between basic and applied computer research, [George Pake] was being led badly astray by his background as a physicist. At the time, he remembers, "I was a bit baffled. I kept looking for these underlying principles of computer science" -- the analogs of Newton's laws of motion, say -- "and I couldn't find them. There were certainly some deep ideas -- for example, information theory, or the Turing machine. But those ideas had not led to a large body of theory as there was in physics." The upshot was that his plan for the separation of the two labs just didn't work, because "both ended up doing applications."

M Mitchell Waldrop: The Dream Machine, p 419

Mead and Conway had pioneered a way of teaching integrated circuit design via an elegant and very general set of design principles. Draft chapters of their textbook, Introduction to VLSI Systems, had been circulating since 1977 and had already been used in courses at Caltech, Berkeley, Carnegie Mellon, and MIT. (The book itself, which was published in 1979, would go on to become a bible for VLSI professionals.)

But just as important, Mead and Conway had conceived the notion of a "silicon foundry." The idea was that students in an IC design course would each prepare a chip layout, specified in a standard chip description language, and then send it over the Arpanet to a "silicon broker" -- originally PARC and then later Hewlett-Packard. The broker, in turn, would compile dozens of individual designs and then arrange with a chip manufacturer to have them all etched onto a single silicon wafer, so that the cost could be shared. Finally, the chips would be cut apart, packaged individually, and sent back to the students for testing and experimentation...

MOSIS would flourish into the 1990s. And chip innovation would flourish along with it. MOSIS supported design experiments for advanced architectures such as Intel's Cosmic Cube and the Connection Machine from Thinking Machines, Inc. It supported experiments in reduced-instruction-set computing at Stanford and Berkeley, thereby providing a proof-of-concept for a number of cutting-edge commercial chips of the late 1980s and 1990s. It even supported the development of the "graphics engine" chip by Stanford's James Clark, who would soon be applying his expertise as a cofounder of Silicon Graphics, Inc. And most of all, the MOSIS project produced an awful lot of people trained in VLSI design. Indeed, you could argue that MOSIS was as much responsible as any other single factor for the explosion in microchip technology during the 1980s and 1990s.

William Thurston: Mathematical Education

People are much smarter when they can use their full intellect and when they can relate what they are learning to situations or phenomena which are real to them.

The natural reaction, when someone is having trouble understanding what you are explaining, is to break up the explanation into smaller pieces and explain the pieces one by one. This tends not to work, so you back up even further and fill in even more details.

But human minds do not work like computers: it is harder, not easier, to understand something broken down into all the precise little rules than to grasp it as a whole. It is very hard for a person to read a computer assembly language program and figure out what it is about...

Studying mathematics one rule at a time is like studying a language by first memorizing the vocabulary and the detailed linguistic rules, then building phrases and sentences, and only afterwards learning to read, write, and converse. Native speakers of a language are not aware of the linguistic rules: they assimilate the language by focusing on a higher level, and absorbing the rules and patterns subconsciously. The rules and patterns are much harder for people to learn explicitly than is the language itself.

Ian Bogost: On the Manifesto for a Ludic Century

When you think about it, it's curious to pen a manifesto for a ludic century to come in the twenty-first century, when the manifesto itself was such a staple of twentieth-century thought... The modern manifesto as a written prescription that makes manifest certain principles really starts with the political manifestos of Marx, Engels, Bellegarrigue, and others in the mid-19th century. The artistic manifestos of Symbolism, Futurism, Dadaism, Surrealism, and others followed this lead, proclaiming clear, direct, and unyielding principles for creative practice. So, perhaps there is one fundamental challenge for the Manifesto for a Ludic Century: would a truly ludic century be a century of manifestos? Of declaring simple principles rather than embracing systems? Or, is the Ludic Manifesto meant to be the last manifesto, the manifesto to end manifestos, replacing simple answers with the complexity of "information at play?"

Douglas Hofstadter: Analogy as the Core of Cognition

I believe that all communication is via analogy. Indeed, I would describe communication this way: taking an intricate dance that can be danced in one and only one medium, and then, despite the intimacy of the marriage of that dance to that medium, making a radically new dance that is intimately married to a radically different medium, and in just the same way as the first dance was to its medium...

Imagine taking the most enthralling basketball game you ever watched... and giving a videotape of that game to a “soccer choreographer,” who will now stage all the details of an artificial soccer game that is in some sense analogous to your basketball game.

Doug Engelbart: quoted in "The Engelbart Hypothesis" by Valerie Landau and Eileen Clegg

My boss gave me quite a lecture one day. He said, "Look, here's eight pages you've gone through to describe this thing you want to do and it's still all faint. Bill has just written this proposal, on one page, very concise, clear, describing exactly what he wants to do with his research." The model proposal was very detailed in an intellectual domain that was already all thoroughly beaten out. What he was proposing was a very narrow research question pursuing a tiny sub-domain.

I tried to explain to my boss that I was interested in opening up an entirely new approach for which there is no vocabulary. Later, people used the term "paradigm shift" to describe a fundamental change in assumptions and thinking. If you're really dealing with something in a different paradigm, the vocabulary of almost everything you're trying to say is different. You have to somehow establish the terms as stepping-stones to arrive at what you're trying to say. And people aren't used to it taking that long for you to get the picture to them. That has been the basic problem ever since, when trying to describe the framework Augmentation System and the Bootstrap Strategy.

Stuart Russell: quoted in "The Man Who Would Teach Machines to Think" by James Somers

A lot of the stuff going on [in AI] is not very ambitious. In machine learning, one of the big steps that happened in the mid-'80s was to say, "Look, here’s some real data -- can I get my program to predict accurately on parts of the data that I haven’t yet provided to it?" What you see now in machine learning is that people see that as the only task.

Danny Hillis: quoted in "Clock Of The Long Now" by Stewart Brand, p 48

If you're going to do something that's meant to be interesting for ten millennia, it almost has to have been interesting for ten millennia. Clocks and other methods of measuring time have interested people for a very long time.

Rob Pike: Systems Software Research is Irrelevant

To be a viable computer system, one must honor a huge list of large, and often changing, standards: TCP/IP, HTTP, HTML, XML, CORBA, Unicode, POSIX, NFS, SMB, MIME, POP, IMAP, X, ... A huge amount of work, but if you don’t honor the standards you’re marginalized. Estimate that 90-95% of the work in Plan 9 was directly or indirectly to honor externally imposed standards. At another level, instruction architectures, buses, etc. have the same influence. With so much externally imposed structure, there’s little slop left for novelty.

David Hestenes: interview

The first discovery is one of the highlights of my life. And it gave me strong motivation and direction for my research. That discovery was recognition that the Pauli matrices could be reinterpreted as vectors, and their products had a geometric interpretation. I was so excited that I went and gave a little lecture about it to my father. Among other things, I said, “Look at this identity σ1 σ2 σ3 = i, which appears in all the quantum mechanics books that discuss spin. All the great quantum physicists, Pauli, Schroedinger, Heisenberg and even Dirac as well as mathematicians Weyl and von Neumann, failed to recognize its geometric meaning and the fact that it has nothing to do with spin. When you see the Pauli sigmas as vectors, then you can see the identity as expressing the simple geometric fact that three orthogonal vectors determine a unit volume. Thus there is a geometric reason for the Pauli algebra...

My father gave me the greatest compliment of my life... “You have learned the difference between a mathematical idea and its representation by symbols. Many mathematicians never learn that!”
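
The identity is quick to verify. A minimal check (editor's sketch, not from the interview; assumes NumPy, and the matrices are just the standard Pauli matrices):

    import numpy as np

    # The standard Pauli matrices.
    s1 = np.array([[0, 1], [1, 0]], dtype=complex)
    s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
    s3 = np.array([[1, 0], [0, -1]], dtype=complex)

    # s1 s2 s3 = i * identity -- read geometrically, three orthogonal
    # unit vectors multiplying to the unit (pseudoscalar) volume.
    print(np.allclose(s1 @ s2 @ s3, 1j * np.eye(2)))   # True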

John Herivel: The Background to Newton's Principia, quoted by David Hestenes

When 1666 closed, Newton was not in command of the results that have made his reputation deathless, not in mathematics, not in mechanics, not in optics. What he had done in all three was to lay foundations, some more extensive than others, on which he could build with assurance, but nothing was complete at the end of 1666, and most were not even close to complete. Far from diminishing Newton's stature, such a judgment enhances it by treating his achievement as a human drama of toil and struggle rather than a tale of divine revelation. "I keep the subject constantly before me," he said, "and wait 'till the first dawnings open slowly, by little and little, into a full and clear light."

David Hestenes: Modeling Games in the Newtonian World

We marvel at the ingenuity of Newton's mathematical arguments, in part, because his methods were so unwieldy that no one since has fully mastered them. Before anyone could improve on Newton's performance, the world had to wait half a century for the development of better mathematical tools and techniques, primarily by Euler.

Douglas Hofstadter: Surfaces and Essences, p 452

Geniuses do not deliberately set off with the goal of concocting a wild-sounding analogy between some brand-new phenomenon, shimmering and mysterious, and some old phenomenon, conceptually distant and seemingly unrelated; rather, they concentrate intensely on some puzzling situation that they think merits deep attention, carefully circling around it, looking at it from all sorts of angles, and finally, if they are lucky, finding a viewpoint that reminds them of some previously known phenomenon that the mysterious new one resembles in a subtle but suggestive manner. Through such a process of convergence, a genius comes to see a surprising new essence of the phenomenon. This is high-level perception; this is discovery by analogy.

Carver Mead: interview

Once angels were the explanation, but now, for us, it is a "force," or "field." But these are all constructs of the human mind to help us to work with and visualize the regularities of nature. When we grasp onto some regularity, we give it a name, and the temptation is always to think that we really understand it. But the truth is that we're still not even close.

John Tukey: Prim-9

Just being able to see projections on all coordinate pairs can, as we have just seen, be very helpful, but it is not enough. To be able to get reasonably to all two-dimensional projections means either a way to call for what we want, or a way to move about. Since we usually do not know just what we want, and when we do, we would find it hard to learn to call for it, what we need is a way to move about.
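
What "all coordinate pairs" and "moving about" amount to, in miniature (editor's sketch, assuming NumPy; Prim-9 itself did this interactively):

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(size=(200, 4))       # 200 points in 4 dimensions

    # Projections on all coordinate pairs: keep two columns, drop the rest.
    pairs = [data[:, [i, j]] for i in range(4) for j in range(i + 1, 4)]

    # "Moving about": rotate the cloud a little, then project again,
    # reaching two-dimensional views that lie between the coordinate pairs.
    theta = 0.1
    rot = np.eye(4)
    rot[0, 0] = rot[1, 1] = np.cos(theta)
    rot[0, 1], rot[1, 0] = -np.sin(theta), np.sin(theta)
    view = (data @ rot)[:, :2]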

E. T. Jaynes: Probability in Quantum Theory

For all these years it has seemed obvious to me -- for the same reasons that it did to Einstein and Schrödinger -- that the Copenhagen interpretation is a mass of contradictions and irrationality and that, while theoretical physics can of course continue to make progress in the mathematical details and computational techniques, there is no hope of any further progress in our basic understanding of Nature until this conceptual mess is cleared up.

Because this position seems to arouse fierce controversy, let me stress our motivation: if quantum theory were not successful pragmatically, we would have no interest in its interpretation. It is precisely because of the enormous success of the QM mathematical formalism that it becomes crucially important to learn what that mathematics means. To find a rational physical interpretation of the QM formalism ought to be considered the top priority research problem of theoretical physics; until this is accomplished, all other theoretical results can only be provisional and temporary.

This conviction has affected the whole course of my career. I had intended originally to specialize in Quantum Electrodynamics working with J. R. Oppenheimer; but this proved to be impossible. Whenever I look at any quantum-mechanical calculation, the basic craziness of what we are doing rises in my gorge and I have to stop and try to find some different way of looking at the problem, that makes physical sense. Gradually I came to see that the resolution cannot be found within the confines of the traditional thinking of physicists; the foundations of probability theory and the role of human information have to be brought in, and so I have spent many years trying to understand them in the greatest generality.

David Foster Wallace: Tense Present: Democracy, English, and the Wars over Usage

A distinctive feature of [A Dictionary of Modern American Usage] is that its author is willing to acknowledge that a usage dictionary is not a bible or even a textbook but rather just the record of one smart person's attempts to work out answers to certain very difficult questions.

Jeff Bezos: interview

Oftentimes, invention requires a long-term willingness to be misunderstood. You do something that you genuinely believe in, that you have conviction about, but for a long period of time, well-meaning people may criticize that effort. When you receive criticism from well-meaning people, it pays to say -- first of all, search yourself -- "Are they right?" And if they are, you need to adapt what you're doing. If they're not right, if you really have conviction that they're not right, you need to have that long-term willingness to be misunderstood. It's a key part of invention.

Valentino Braitenberg: Vehicles: Experiments in Synthetic Psychology

Our vehicles may move in water by jet propulsion. Or you may prefer to imagine them moving somewhere between galaxies... swimming around in water... little carts moving on hard surfaces... It doesn't matter. Get used to a way of thinking in which the hardware of the realization of an idea is much less important than the idea itself.

Neil Postman: Interview

I don't think any of us can do much about the rapid growth of new technology... However, it is possible for us to learn how to control our own uses of technology... The forum that I think is best suited for this is our educational system. If students get a sound education in the history, social effects and psychological biases of technology, they may grow to be adults who use technology rather than be used by it.

Ta-Nehisi Coates: The Champion Barack Obama

My mother's admonishings had their place. God forbid I ever embarrass her. God forbid I be like my [absent] grandfather, like the fathers of my friends and girlfriends and wife. God forbid I ever stand in front of these white folks and embarrass my ancestors, my people, my dead. And God forbid I ever confuse that creed, which I took from my mother, which I pass on to my son, with a wise and intelligent analysis of my community. My religion can never be science. This is the difference between navigating the world and explaining it. ...

Catharsis is not policy. Catharsis is not leadership. And shame is not wisdom.

Alan Kay: Powerful Ideas Need Love Too! (1995)

Now computers can be television-like, book-like, and "like themselves." Today's commercial trends in educational and home markets are to make them as television-like as possible. And the weight of the billions of dollars behind these efforts is likely to be overwhelming. It is sobering to realize that in 1600, 150 years after the invention of the printing press, the top two bestsellers in the British Isles were the Bible and astrology books! Scientific and political ways of thinking were just starting to be invented. The real revolutions take a very long time to appear, because as McLuhan noted, the initial content and values in a new medium are always taken from old media.

Now one thing that is possible with computers and networks, that could get around some of the onslaught of "infobabble," is the possibility of making media on the Internet that is "self teaching." Imagine a child or adult just poking around the Internet for fun and finding something--perhaps about rockets or gene splicing--that looks intriguing. If it were like an article in an encyclopedia, it would have to rely on expository writing (at a level chosen when the author wrote it) to convey the ideas. This will wind up being a miss for most netsurfers, especially given the general low level of reading fluency today. The computer version of this will be able to find out how old and how sophisticated is the surfer and instantly tailor a progression of learning experiences that will have a much higher chance of introducing each user to the "good stuff" that underlies most human knowledge. A very young child would be given different experiences than older ones--and some of the experiences would try to teach the child to read and reason better as a byproduct of their interest. This is a "Montessori" approach to how some media might be organized on the Internet: one's own interests provide the motivation to journey through an environment that is full of learning opportunities disguised as toys.

This new kind of "dynamic media" is possible to make today, but very hard and expensive. Yet it is the kind of investment that a whole country should be able to understand and make. I still don't think it is a real substitute for growing up in a culture that loves learning and thinking. But in such a culture, such new media would allow everyone to go much deeper, in more directions, and experience more ways to think about the world than is possible with the best books today. Without such a culture, such media is likely to be absolutely necessary to stave off the fast approaching next Dark Ages.

Adam Cadre: 2013.11 minutiae

Charles Barkley weighed in: "In a locker room and with my friends, we use racial slurs. [...] The language we use, sometimes it's homophobic, sometimes it's sexist, a lot of times it's racist. We do that when we're joking with our teammates." It seems to me that this is an important social divide you don't often hear about: between (a) those who have internalized that you're not supposed to be racist and sexist and homophobic anymore, and aren't, and (b) those who have internalized that you're not supposed to be racist and sexist and homophobic in public anymore, but take it as a given that secretly everyone is making racist and sexist and homophobic jokes in private with their friends.

Marshall McLuhan: The Gutenberg Galaxy

Man the tool-making animal, whether in speech or in writing or in radio, has long been engaged in extending one or another of his sense organs in such a manner as to disturb all of his other senses and faculties.

Sydney Brenner: interview

A Fred Sanger [born 1918] would not survive today’s world of science. With continuous reporting and appraisals, some committee would note that he published little of import between insulin in 1952 and his first paper on RNA sequencing in 1967 with another long gap until DNA sequencing in 1977. He would be labelled as unproductive, and his modest personal support would be denied. We no longer have a culture that allows individuals to embark on long-term—and what would be considered today extremely risky—projects.

Max Black: Models and Metaphors: Studies in Language and Philosophy, p 242

There will always be competent technicians who... can be trusted to build the highways... But clearing intellectual jungles is also a respectable occupation. Perhaps every science must start with metaphor and end with algebra; and perhaps without the metaphor there would never have been any algebra.

Gerald Jay Sussman: Robust Design through Diversity

A computational system is very much a dynamical system, with a very complicated state space, and a program is very much like a system of (differential or difference) dynamical equations, describing the incremental evolution of the state. One thing we have learned about dynamical systems over the past hundred years is that only limited insights can be gleaned by manipulation of the dynamical equations. We have learned that it is powerful to examine the geometry of the set of all possible trajectories, the phase portrait, and to understand how the phase portrait changes with variations of the parameters of the dynamical equations. This picture is not brittle: the knowledge we obtain is structurally stable.
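
A toy version of the contrast Sussman draws (editor's sketch; the logistic map stands in for "a program as a difference equation"):

    # The "program": a one-line difference equation, x' = r * x * (1 - x).
    def trajectory(r, x0=0.5, steps=60):
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1 - xs[-1]))
        return xs

    # Staring at the equation reveals little. Sweeping the parameter and
    # looking at the whole family of trajectories -- fixed point, then
    # cycle, then chaos -- is the phase-portrait view he describes.
    for r in (2.8, 3.2, 3.9):
        print(r, [round(x, 3) for x in trajectory(r)[-4:]])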

Kieran Egan: The Educated Mind: How Cognitive Tools Shape Our Understanding, p 76

The Greek alphabet, from which all alphabetic systems are derived, has particular characteristics for making us conscious of our language, or, rather, for determining the kind of consciousness of language that we develop...

Eric Havelock writes about "the superior technology of the Greek alphabet," which remains the "sole instrument of full literacy to the present day". That technology was superior to other scripts and to oral modes of communication, in Havelock's account, because it led to a conceptual revolution in ancient Greece in which "a reflexive syntax of definition, description, and analysis" was exploited by Plato, Aristotle, and other ancient Greeks and all their alphabetic successors. They generated the philosophic, scientific, historical, descriptive, legal, and moral forms of discourse that make up what we call the modern mind...

While many of their neighbors were using writing systems to mark pots of grain, wine, and olives, list kings and priests, and celebrate in stylized form victories over traditional enemies, the Greeks began to exploit their writing system in ways none of its inventors could have imagined... Among much else, it opened up what we call the historical period. Fluent literacy is not simply a matter of thinking and then writing the product of one's thoughts; the writing, rather, becomes a part of the process of thinking. Extended discursive writing is not an external copy of a kind of thinking that goes on in the head; it represents a distinctive kind of literate thinking.

Over the past thirty years, a number of scholars have argued with significant success that the fifth-century developments that had in the earlier part of this century been romantically referred to as the "Greek miracle" -- giving birth to democracy, logic, philosophy, history, drama, reflective introspection, and so on so suddenly -- were explainable in large part as an implication of the development and spread of alphabetic literacy... The developments were not simply in the new kinds of texts being produced in ancient Greece, such as Herodotus's Histories, but were somehow in the kind of thinking that went into writing and reading such texts, or listening to such texts being read or performed.

Kieran Egan: The Educated Mind: How Cognitive Tools Shape Our Understanding, p 93

We reinforce the image of the textbook, encyclopedia, or dictionary as the paradigm of the successful knower. It becomes important in such a climate of opinion to emphasize that books do not store knowledge. They contain symbolic codes that can serve us as external mnemonics for knowledge. Knowledge can exist only in living human minds.

Rosemary Simpson: 50 Years After "As We May Think": The Brown/MIT Vannevar Bush Symposium (1995)

The prescience of [Vannevar] Bush's vision is astonishing. He defined a goal, a strategy, and a research agenda that are still alive today. But, as Andy van Dam made clear in his opening remarks and the subsequent symposium speakers confirmed in their testimony, Bush turned out to be not so much predicting the future as creating it through the influence, both direct and indirect, of his compelling vision on major figures like Engelbart, Nelson, and the other symposium speakers.

Plato: Phaedrus

Socrates: I cannot help feeling, Phaedrus, that writing is unfortunately like painting; for the creations of the painter have the attitude of life, and yet if you ask them a question they preserve a solemn silence. And the same may be said of [written words]. You would imagine that they had intelligence, but if you want to know anything and put a question to one of them, the speaker always gives one unvarying answer. And when they have been once written down they are tumbled about anywhere among those who may or may not understand them, and know not to whom they should reply, to whom not: and, if they are maltreated or abused, they have no parent to protect them; and they cannot protect or defend themselves.

Matt Ridley: The Origins of Virtue: Human Instincts and the Evolution of Cooperation, p 48

When the 5,000-year-old mummified corpse of a fully equipped Neolithic man turned up in a melting glacier high in the Tyrolean Alps in 1991, the variety and sophistication of his equipment was astonishing... Dressed in furs under a woven grass cloak, equipped with a stone dagger with an ash-wood handle, a copper axe, a yew-wood bow, a quiver and fourteen cornus-wood arrows, he also carried a tinder fungus for lighting fires, two birch-bark containers, one of which contained some embers of his most recent fire, insulated by maple leaves, a hazel-wood pannier, a bone awl, stone drills and scrapers, a lime-wood-and-antler retoucheur for fine stone sharpening, an antibiotic birch fungus as a medicine kit and various spare parts. His copper axe was cast and hammered sharp in a way that is extremely difficult to achieve even with modern metallurgical knowledge. It was fixed with millimetre precision into a yew haft that was shaped to obtain mechanically ideal ratios of leverage.

This was a technological age. People lived their lives steeped in technology. They knew how to work leather, wood, bark, fungi, copper, stone, bone and grass into weapons, clothes, ropes, pouches, needles, glues, containers and ornaments. Arguably, the unlucky mummy had more different kinds of equipment on him than the hiker couple who found him. Archaeologists believe he probably relied upon specialists for the manufacture of much of his equipment, and perhaps also for the tattoos that had been applied to his arthritic joints.

Frank Lantz: Hearts and Minds (33:50)

The dilemma of quantitative, data-driven game design.... So here's an analogy: Imagine you have a friend who has trouble forming relationships with women, and he tells you, "I don't know what I'm doing wrong. I go on a date, and I bring a thermometer so I can measure their skin temperature. I bring calipers so I can measure their pupil, to see when it's expanding and contracting..." The point is, it doesn't even matter if these are the correct things to measure to predict someone's sexual arousal. If you bring a thermometer and calipers with you on a date, you're not going to be having sex...

There's really only one important question worth asking, which is: what is a life well-lived?... That's a question that can't be answered, but one thing we can say, with a lot of certainty, is that a life well-lived is not going to be a life in which every moment is scrutinized.

Ed Catmull: Keep Your Crises Small (23:51)

Initially, the films [our teams] put together, they're a mess. It's like everything else in life -- the first time you do it, it's a mess. Sometimes it's labeled "first time, it's a failure", but that's not even the right word to use. It's just like, you get the first one out, you learn from it, and the only failure is if you don't learn from it, if you don't progress.

Richard Rhodes: The Making of the Atomic Bomb, p 50

"Collision" [of alpha particles and nuclei] is misleading. What [Ernest] Rutherford had visualized, making calculations and drawing diagrammatic atoms on large sheets of good paper, was exactly the sort of curving path toward and away from a compact, massive central body that a comet follows in its gravitational pas de deux with the sun. He had a model made, a heavy electromagnet suspended as a pendulum on thirty feet of wire that grazed the face of another electromagnet set on a table. With the two grazing faces matched in polarity and therefore repelling each other, the pendulum was deflected into a parabolic path according to its velocity and angle of approach, just as the alpha particles were deflected. He needed as always to visualize his work.

Nikolai Luzin: The Evolution of Function, Part II

The modern understanding of function and its definition, which seems correct to us, could arise only after Fourier's discovery. His discovery showed clearly that most of the misunderstandings that arose in the debate about the vibrating string were the result of confusing two seemingly identical but actually vastly different concepts, namely that of function and that of its analytic representation. Indeed, prior to Fourier's discovery no distinction was drawn between the concepts of "function" and of "analytic representation," and it was this discovery that brought about their disconnection.
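
A standard illustration of the split Luzin describes (editor's gloss, not from the paper): the square wave that is +1 on (0, π) and -1 on (π, 2π) -- a function that before Fourier would have been written as two separate expressions -- is captured by the single trigonometric series

    f(x) = (4/π) · ( sin x + sin 3x / 3 + sin 5x / 5 + ··· )

One function; two utterly different analytic representations.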

Neil Postman: Five Things We Need to Know About Technological Change

Every technology has a prejudice. Like language itself, it predisposes us to favor and value certain perspectives and accomplishments. In a culture without writing, human memory is of the greatest importance, as are the proverbs, sayings and songs which contain the accumulated oral wisdom of centuries. That is why Solomon was thought to be the wisest of men. In Kings I we are told he knew 3,000 proverbs. But in a culture with writing, such feats of memory are considered a waste of time, and proverbs are merely irrelevant fancies. The writing person favors logical organization and systematic analysis, not proverbs. The telegraphic person values speed, not introspection. The television person values immediacy, not history...

Every technology has a philosophy which is given expression in how the technology makes people use their minds, in what it makes us do with our bodies, in how it codifies the world, in which of our senses it amplifies, in which of our emotional and intellectual tendencies it disregards. This idea is the sum and substance of what the great Catholic prophet, Marshall McLuhan, meant when he coined the famous sentence, “The medium is the message.”

Neil Postman: Five Things We Need to Know About Technological Change

Who, we may ask, has had the greatest impact on American education in this century? If you are thinking of John Dewey or any other education philosopher, I must say you are quite wrong. The greatest impact has been made by quiet men in grey suits in a suburb of New York City called Princeton, New Jersey. There, they developed and promoted the technology known as the standardized test, such as IQ tests, the SATs and the GREs. Their tests redefined what we mean by learning, and have resulted in our reorganizing the curriculum to accommodate the tests.

A second example concerns our politics... The radicals who have changed the nature of politics in America are entrepreneurs in dark suits and grey ties who manage the large television industry in America. They did not mean to turn political discourse into a form of entertainment. They did not mean to make it impossible for an overweight person to run for high political office. They did not mean to reduce political campaigning to a 30-second TV commercial. All they were trying to do is to make television into a vast and unsleeping money machine. That they destroyed substantive political discourse in the process does not concern them.

Noam Chomsky: talk at Georgetown University

The US media are alone in that you must meet the condition of concision. You gotta say things between two commercials, or in 600 words. And that's a very important fact, because the beauty of concision is that you can only repeat conventional thoughts.

Suppose I get up on Nightline, and I'm given two minutes, and I say [...], I don't need any evidence. Everybody just nods. On the other hand, suppose you say something that isn't just regurgitating conventional pieties. Suppose you say something that's the least bit unexpected or controversial... People will, quite reasonably, expect to know what you mean. Why did you say that? I never heard that before. If you said that, you better have a reason, better have some evidence. In fact, you better have a lot of evidence, because that's a pretty startling comment. You can't give evidence if you're stuck with concision. That's the genius of this structural constraint.

Alan Kay: The Power of the Context

Giving a professional illustrator a goal for a poster usually results in what was desired. If one tries this with an artist, one will get what the artist needed to create that day. Sometimes we make, to have, sometimes to know and express. The pursuit of Art always sets off plans and goals, but plans and goals don't always give rise to Art. If "visions not goals" opens the heavens, it is important to find artistic people to conceive the projects.

Thus the "people not projects" principle was the other cornerstone of ARPA/PARC’s success. Because of the normal distribution of talents and drive in the world, a depressingly large percentage of organizational processes have been designed to deal with people of moderate ability, motivation, and trust. We can easily see this in most walks of life today, but also astoundingly in corporate, university, and government research. ARPA/PARC had two main thresholds: self-motivation and ability. They cultivated people who "had to do, paid or not" and "whose doings were likely to be highly interesting and important". Thus conventional oversight was not only not needed, but was not really possible. "Peer review" wasn't easily done even with actual peers. The situation was "out of control", yet extremely productive and not at all anarchic.

"Out of control" because artists have to do what they have to do. "Extremely productive" because a great vision acts like a magnetic field from the future that aligns all the little iron particle artists to point to “North” without having to see it. They then make their own paths to the future.

Leigh Alexander: The Unearthing

Ian is a game design teacher and a professional skeptic. People call him a “curmudgeon”, but they don’t really understand how much love, how much actual faith, that kind of skepticism takes. On a pretty regular basis one of us will IM the other something like “help” or “fuck” or “people are terrible”.

Only when you fully believe in how wonderful something is supposed to be does every little daily indignity start to feel like some claw of malaise. At least, that’s how I explain Ian to other people.

Elaine Ou: Your Margin is My Opportunity

Your margin is my opportunity. –Jeff Bezos, Amazon

That’s probably what Lyft and Uber were saying to each other as they slashed their commissions to 0. How do you beat a company that doesn’t need to make money?

The 8 hours you need to sleep each night, are my opportunity. The time you spend with your family and friends, is my opportunity. If you’re not maxed out, if there’s still a shred of humanity left in you, then you’re just leaving your lunch on the table.

David Graeber: interview

[In socialist regimes], you couldn’t really get fired from your job. As a result you didn’t really have to work very hard... I have a lot of friends who grew up in the USSR, or Yugoslavia, who describe what it was like. You get up. You buy the paper. You go to work. You read the paper. Then maybe a little work, and a long lunch, including a visit to the public bath. If you think about it in that light, it makes the achievements of the socialist bloc seem pretty impressive: a country like Russia managed to go from a backwater to a major world power with everyone working maybe on average four or five hours a day. But the problem is they couldn’t take credit for it. They had to pretend it was a problem, “the problem of absenteeism,” or whatever, because of course work was considered the ultimate moral virtue. They couldn’t take credit for the great social benefit they actually provided. Which is, incidentally, the reason that workers in socialist countries had no idea what they were getting into when they accepted the idea of introducing capitalist-style work discipline. “What, we have to ask permission to go to the bathroom?” It seemed just as totalitarian to them as accepting a Soviet-style police state would have been to us.

That ambivalence in the heart of the worker’s movement remains... On the one hand, there’s this ideological imperative to validate work as virtue in itself. Which is constantly being reinforced by the larger society. On the other hand, there’s the reality that most work is obviously stupid, degrading, unnecessary, and the feeling that it is best avoided whenever possible. But it makes it very difficult to organize, as workers, against work.

John Conway: On Numbers and Games, epilogue

The Surreal Numbers [introduced in this book] have been the topic of many research papers and a number of books... Most of the authors who have written about them have chosen to define surreal numbers to be just their sign-sequences. This has the great advantage of making equality be just identity rather than an inductively defined relation, and also of giving a clear mental picture from the start. However, I think it has probably also impeded further progress. Let me explain why.

The greatest delight, and at the same time, the greatest mystery, of the Surreal numbers is the amazing way that a few simple "genetic" definitions magically create a richly structured Universe out of nothing. Technically, this involves in particular the facts that each surreal number is repeatedly redefined, and that the functions the definitions produce are independent of form. Surely real progress will only come when we understand the deep reasons why these particular definitions have this property? It can hardly be expected to come from an approach in which this problem is avoided from the start? ...

I believe the real way to make "surreal progress" is to search for more of these "genetic" definitions and seek to understand their properties.
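
For readers meeting them here for the first time, the flavor of those "genetic" definitions (editor's gloss, following standard presentations): a surreal number is a pair { L | R } of previously created numbers, with every member of L less than every member of R, and everything else is defined inductively from that:

    {   |   } = 0        (nothing on either side: the first day's creation)
    { 0 |   } = 1        {   | 0 } = -1        { 0 | 1 } = 1/2

The order relation, addition, and multiplication are given by similarly brief two-sided inductive definitions; that so little generates so much is the "magic" Conway is pointing at.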

Ian Bogost: Proteus: A Trio of Artisanal Game Reviews

Proteus is a game about being an island instead of a game about being on one... One explores Proteus less like one explores a wooded nature preserve and more like one explores a naked body -- by moving it through one's attention rather than by moving one's attention through it.

Clay Shirky: The Semantic Web, Syllogism, and Worldview (2003)

The people working on the Semantic Web greatly overestimate the value of deductive reasoning (a persistent theme in Artificial Intelligence projects generally.) The great popularizer of this error was Arthur Conan Doyle, whose Sherlock Holmes stories have done more damage to people's understanding of human intelligence than anyone other than Rene Descartes. Doyle has convinced generations of readers that what seriously smart people do when they think is to arrive at inevitable conclusions by linking antecedent facts.

This sentiment is attractive precisely because it describes a world simpler than our own. In the real world, we are usually operating with partial, inconclusive or context-sensitive information. When we have to make a decision based on this information, we guess, extrapolate, intuit, we do what we did last time, we do what we think our friends would do or what Jesus or Joan Jett would have done, we do all of those things and more, but we almost never use actual deductive logic.

As a consequence, almost none of the statements we make, even seemingly obvious ones, are true in the way the Semantic Web needs them to be true. Drew McDermott [noted] "It must be the case that a significant portion of the inferences we want [to make] are deductions, or it will simply be irrelevant how many theorems follow deductively from a given axiom set." ...

This [absurd syllogism from Dodgson] is the best critique of the Semantic Web ever published, as it illustrates the kind of world we would have to live in for this form of reasoning to work, a world where language is merely math done with words. ...

Statements in the Semantic Web work as inputs to syllogistic logic not because syllogisms are a good way to deal with slippery, partial, or context-dependent statements -- they are not, for the reasons discussed above -- but rather because syllogisms are things computers do well. If the world can't be reduced to unambiguous statements that can be effortlessly recombined, then it will be hard to rescue the Artificial Intelligence project. And that, of course, would be unthinkable.
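
The machinery Shirky has in mind, at its barest (editor's sketch with hypothetical toy triples, not any real Semantic Web stack):

    # Statements as (subject, relation, object) triples.
    facts = {("Socrates", "is_a", "man"), ("man", "is_a", "mortal")}

    def chain(facts):
        # Mechanically chain "is_a" links until nothing new appears.
        derived = set(facts)
        grew = True
        while grew:
            grew = False
            for a, r1, b in list(derived):
                for b2, r2, c in list(derived):
                    if r1 == r2 == "is_a" and b == b2 and (a, "is_a", c) not in derived:
                        derived.add((a, "is_a", c))
                        grew = True
        return derived

    print(("Socrates", "is_a", "mortal") in chain(facts))   # True
    # The chaining is equally happy to "prove" conclusions from slippery,
    # partial, or false premises -- which is exactly Shirky's point.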

Ian Bogost: What Do We Save When We Save the Internet?

The comments, and reading them, and not reading them. Knowing that response and reaction responds and reacts to someone’s preferred idea rather than the ideas proffered.

Douglas Adams: So Long, and Thanks for All the Fish, p 130

It is very easy to be blinded to the essential uselessness of [these products] by the sense of achievement you get from getting them to work at all.

In other words -- and this is the rock solid principle on which the whole of the Corporation’s Galaxy-wide success is founded -- their fundamental design flaws are completely hidden by their superficial design flaws.

Hal Abelson and Andrea diSessa: Turtle Geometry: The Computer as a Medium for Exploring Mathematics

We encourage you not to lose sight of the most important reason for a combined look at turtles and vectors: Turtle geometry and vector geometry are two different representations for geometric phenomena, and whenever we have two different representations of the same thing we can learn a great deal by comparing representations and translating descriptions from one representation into the other. Shifting descriptions back and forth between representations can often lead to insights that are not inherent in either of the representations alone.
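
A small instance of the translation exercise they recommend (editor's sketch in Python rather than Logo): the same square in both representations, with the closure condition showing up differently in each:

    import math

    # Turtle representation: intrinsic commands acting on (position, heading).
    def turtle_path(commands):
        x, y, heading = 0.0, 0.0, 0.0
        points = [(x, y)]
        for dist, turn in commands:
            x += dist * math.cos(heading)
            y += dist * math.sin(heading)
            heading += math.radians(turn)
            points.append((x, y))
        return points

    square = [(1, 90)] * 4      # FORWARD 1, LEFT 90, four times
    pts = turtle_path(square)

    # Vector representation: the same path as displacement vectors.
    vecs = [(x1 - x0, y1 - y0) for (x0, y0), (x1, y1) in zip(pts, pts[1:])]

    print(sum(turn for _, turn in square))         # turtle closure: turning totals 360
    print([round(sum(c), 9) for c in zip(*vecs)])  # vector closure: sum is (0, 0)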

George Lakoff and Rafael Núñez: Where Mathematics Comes From: How the Embodied Mind Brings Mathematics into Being

In recent years, there have been revolutionary advances in cognitive science... Perhaps the most profound of these new insights are the following:

1. The embodiment of mind. The detailed nature of our bodies, our brains, and our everyday functioning in the world structures human concepts and human reason. This includes mathematical concepts and mathematical reason.

2. The cognitive unconscious. Most thought is unconscious -- not repressed in the Freudian sense but simply inaccessible to direct conscious introspection. We cannot look directly at our conceptual systems and at our low-level thought processes. This includes most mathematical thought.

3. Metaphorical thought. For the most part, human beings conceptualize abstract concepts in concrete terms, using ideas and modes of reasoning grounded in the sensory-motor system. The mechanism by which the abstract is comprehended in terms of the concrete is called conceptual metaphor.

George Lakoff and Rafael Núñez: Where Mathematics Comes From: How the Embodied Mind Brings Mathematics into Being

Symbolic logic is not the basis of all rationality, and it is not absolutely true. It is a beautiful metaphorical system, which has some rather bizarre metaphors. It is useful for certain purposes but quite inadequate for characterizing anything like the full range of the mechanisms of human reason...

Mathematics as we know it is human mathematics, a product of the human mind, [using] the basic conceptual mechanisms of the embodied human mind as it has evolved in the real world. Mathematics is a product of the neural capacities of our brains, the nature of our bodies, our evolution, our environment, and our long social and cultural history.

Lewis Lapham: The Eternal Now: Introduction to "Understanding Media"

By eliminating the dimensions of space and time, the electronic forms of communication also eliminate the presumption of cause and effect. Typographic man assumed that A follows B, that people who made things -- whether cities, ideas, families, or works of art -- measured their victories (usually Pyrrhic) over periods of time longer than those sold to the buyers of beer commercials. Graphic man imagines himself living in the enchanted garden of the eternal now. If all the world can be seen simultaneously, and if all mankind's joy and suffering is always and everywhere present (if not on CNN or Oprah, then on the "Sunday Night Movie" or MTV), nothing necessarily follows from anything else. Sequence becomes merely additive instead of causative.

Joe Armstrong: On the road again

The Himba [of northern Namibia] have distinct names for different shades of green; we have no names for these shades, so we don’t “see” the different colors, but the Himba do. [This is analogous] to concepts in programming languages, how different words have acquired precise meanings which are misunderstood in different communities...

For me the words “concurrent,” “parallel” and “simultaneous” have completely different meanings, but many people think they mean the same thing. It’s like me seeing three shades of green, when the person I’m talking to sees one green.
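
One conventional way to draw the distinction (editor's sketch, not Armstrong's own definitions -- he worked in Erlang): concurrency is interleaved progress, parallelism is genuinely simultaneous execution:

    import asyncio
    import multiprocessing

    async def step(name):
        for i in range(3):
            print(name, i)
            await asyncio.sleep(0)      # yield: tasks interleave, never overlap

    def work(n):                        # CPU work that two cores can run at once
        return sum(range(n))

    async def concurrent_demo():
        # Concurrent: two tasks share one thread, taking turns.
        await asyncio.gather(step("a"), step("b"))

    if __name__ == "__main__":
        asyncio.run(concurrent_demo())
        # Parallel: two processes may execute at the same instant.
        with multiprocessing.Pool(2) as pool:
            print(pool.map(work, [10**6, 10**6]))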

Gregory Bateson: conversation with Stewart Brand in "Both Sides of the Necessary Paradox"

"Oh the damage that's been done to psychiatric thinking by the clinical bias. The clinical bias being, that there are good things and there are bad things. The bad things necessarily have causes. This is not so true of good things.

"No experimenter links up, say, the phenomena of schizophrenia with the phenomena of humor. Schizophrenia is clinical, and humor isn't even psychology, you know. The two of them are closely related, and closely related, both of them, to arts and poetry and religion. So you've got a whole spectrum of phenomena the investigation of any of which throws light on any other. The investigation of none of which is very susceptible to the experimental method."

"Because of non-isolatability?" I [Stewart Brand] think I'm ahead of him this time.

"Because the experiment always puts a label on the context in which you are. You can't really experiment with people, not in the lab you can't. It's doubtful you can do it with dogs. You cannot induce a Pavlovian nervous breakdown -- what do they call it, 'experimental neurosis' -- in an animal out in the field."

"I didn't know that!" I'm gleeful.

More of the Bateson chortle. "You've got to have a lab."

"Why?"

"Because the smell of the lab, the feel of the harness in which the animal stands, and all that, are context markers which say what sort of thing is going on in this situation; that you're supposed to be right or wrong, for example.

"What you do to induce these neuroses is, you train the animal to believe that the smell of the lab and similar things is a message which tells him he's got to discriminate between an ellipse and a circle, say. Right. He learns to discriminate. Then you make the discrimination a little more difficult, and he learns again, and you have underlined the message. Then you make the discrimination impossible.

"At this point discrimination is not the appropriate form of behavior. Guesswork is. But the animal cannot stop feeling that he ought to discriminate, and then you get the symptomology coming on. The one whose discrimination broke down was the experimenter, who failed to discriminate between a context for discrimination and a context for gambling."

"So," says I, "it's the experimenter's neurosis that --"

"-- Has now become the experimental neurosis of the animal. This whole context business has a Heisenberg hook in it much worse than the atoms ever thought of." ...

"In the field what happens?"

"None of this happens. For one thing, the stimuli don't count. Those electric shocks they use are about as powerful as what the animal would get if he pricked his leg on a bramble, pushing through.

"Suppose you've got an animal whose job in life is to turn over stones and eat the beetles under them. All right, one stone in ten is going to have a beetle under it. He cannot go into a nervous breakdown because the other nine stones don't have beetles under them. But the lab can make him do that, you see."

"Do you think we're all in a lab of our own making, in which we drive each other crazy?"

"You said it, not I, brother," chuckling. "Of course."

Frank Lantz: Life and Death and Middle Pair: Go, Poker and the Sublime (17:10)

Go is about thinking. Go is thought made visible to itself... When you study go, you must learn to read. And it's painful and dull... Black goes here, trying to capture white, so white goes here, black goes here, white goes here....

But you also must learn how to see, as if from a distance, the patterns that cannot be articulated in this discrete and finite way. Truly high-level play is about intuition and feel and wisdom, as well as this brute-strength tactical reading.

So go is something like a brightly-colored dye that is squirted into the fluid of our thoughts just at the point where they unfold into turbulence, at the threshold of these two ways of seeing, the discrete and the continuous. [reason and intuition]

We set up camp at the border of what is possible for our minds to compute, and then we push into the wilderness.

Doug Engelbart: interview

I started trying to reach out to make connections in domains of interest and concerns out there that fit along the vector I was interested in. I went to the information retrieval people. I remember one instance when I went to the Ford Foundation's Center for Advanced Study in Social Sciences... I was trying to explain what I wanted to do and one guy just kept telling me, "You are just giving fancy names to information retrieval. Why do that? Why don't you just admit that it's information retrieval and get on with the rest of it and make it all work?" He was getting kind of nasty. The other guy was trying to get him to back off.

Kieran Egan: The Educated Mind: How Cognitive Tools Shape Our Understanding, p 179

A problem for understanding intellectual development is that we have no adequate metaphors for it. You will perhaps recall Jonathan Miller's claim that it became possible to understand the function of the heart and the circulation of the blood only after the invention of pumps to clear mines of water. Everyone since the beginning of our species has felt that regular thump in the chest and has seen blood spurt when arteries are cut. Thinking of the heart as a pump enabled us to understand its function.

But we have no similarly useful metaphor to help us understand intellectual or cultural or educational development. The nearest processes are biological... Piaget's is perhaps the best known elaboration of a biology-based metaphor of development, and John Dewey elaborated a conception of education drawing on "growth".

I have tried to adhere to a nonbiological and rather vague use of "development". These kinds of understanding are only "somewhat" distinctive in that they are not wholly different forms of thought, mutually incomprehensible; they are not so much like different computer programs as like modules of a well-integrated program, focusing on different tasks but each able to comprehend the others. Well, that's not a great metaphor either; tempting though computer metaphors are for mental operations, they always seem to confuse as much as they clarify.

John Maynard Keynes: The General Theory of Employment, Interest and Money (1936)

This book's... main purpose is to deal with difficult questions of theory, and only in the second place with the applications of this theory to practice... Those, who are strongly wedded to what I shall call “the classical theory”, will fluctuate, I expect, between a belief that I am quite wrong and a belief that I am saying nothing new....

The composition of this book has been for the author a long struggle of escape, and so must the reading of it be for most readers if the author’s assault upon them is to be successful, -- a struggle of escape from habitual modes of thought and expression. The ideas which are here expressed so laboriously are extremely simple and should be obvious. The difficulty lies, not in the new ideas, but in escaping from the old ones, which ramify, for those brought up as most of us have been, into every corner of our minds.

Ian Bogost: interview about A Slow Year

As for flaws, the biggest risk is whether or not I’ll be able to make the game legible to players. Maybe I’m about to lose all my remaining credibility; it’s not my intention to be effete or obscurantist. Yet, I know that’s how some will receive the game. But, I really mean all this stuff. It’s not pretense.

Edwin Hutchins: Cognition in the Wild, p 367

The definition of cognition has been unhooked from interaction with the world. Research on games and puzzles has produced some interesting insights, but the results may be of limited generality. The tasks typically chosen for laboratory studies are novel ones that are regarded by subjects as challenging or difficult. D'Andrade has likened the typical laboratory cognitive tasks to feats of athletic prowess. If we want to know about walking, studying people jumping as high as they can may not be the best approach.

Such tasks are unrepresentative in another sense as well. The evolution of the material means of thought is an important component of culturally elaborated tasks. It permits a task that would otherwise be difficult to be re-coded and re-represented in a form in which it is easy to see the answer. This sort of development of material means is intentionally prohibited in puzzle tasks because to allow this sort of evolution would destroy the puzzling aspects of the puzzle. Puzzles are tasks that are preserved in the culture because they are challenging. If the performance mattered, we would learn to re-represent them in a way that removed the challenge. That would also remove their value as puzzles, of course.

The point is that the tasks that are "typical" in laboratory studies of thought are drawn from a special category of cultural materials that have been isolated from the cognitive processes of the larger cultural system. This makes these tasks especially unrepresentative of human cognition.

Jacob Bronowski: The Ascent of Man, p 70

Genghis Khan was a nomad and the inventor of a powerful war machine -- and that conjunction says something important about the origins of war in human history. Of course, it is tempting to close one's eyes to history, and instead to speculate about the roots of war in some possible animal instinct: as if, like the tiger, we still had to kill to live, or, like the robin redbreast, to defend a nesting territory.

But war, organized war, is not a human instinct. It is a highly planned and co-operative form of theft. And that form of theft began ten thousand years ago when the harvesters of wheat accumulated a surplus, and the nomads rose out of the desert to rob them of what they themselves could not provide. The evidence for that we saw in the walled city of Jericho and its prehistoric tower. That is the beginning of war.

A. H. Maslow: A Theory of Human Motivation

Emergency conditions are, almost by definition, rare in the normally functioning peaceful society. That this truism can be forgotten is due mainly to two reasons. First, rats have few motivations other than physiological ones, and since so much of the research upon motivation has been made with these animals, it is easy to carry the rat-picture over to the human being. Secondly, it is too often not realized that culture itself is an adaptive tool, one of whose main functions is to make the physiological emergencies come less and less often. In most of the known societies, chronic extreme hunger of the emergency type is rare, rather than common. In any case, this is still true in the United States. The average American citizen is experiencing appetite rather than hunger when he says "I am hungry." He is apt to experience sheer life-and-death hunger only by accident and then only a few times through his entire life.

Obviously a good way to obscure the 'higher' motivations, and to get a lopsided view of human capacities and human nature, is to make the organism extremely and chronically hungry or thirsty. Anyone who attempts to make an emergency picture into a typical one, and who will measure all of man's goals and desires by his behavior during extreme physiological deprivation is certainly being blind to many things. It is quite true that man lives by bread alone -- when there is no bread. But what happens to man's desires when there is plenty of bread and when his belly is chronically filled?

At once other (and 'higher') needs emerge and these, rather than physiological hungers, dominate the organism. And when these in turn are satisfied, again new (and still 'higher') needs emerge and so on. This is what we mean by saying that the basic human needs are organized into a hierarchy of relative prepotency.

Edwin Hutchins: Cognition in the Wild, p 107

[Regarding Edmond Gunter's predecessor of the slide rule] Again we have an artifact on which computations are performed by physical manipulation. However, there is an important difference between the astrolabe and Gunter's scale in this regard. In both cases the constraints of a represented world are built into the physical structure of the device, but in the case of Gunter's scale the represented world is not literally the world of experience. Instead it is a symbolic world: the world of logarithmic representations of numbers. The regularities of relations among entities in this world are built into the structure of the artifact, but this time the regularities are the syntax of the symbolic world of numbers rather than the physics of a literal world of earth and stars. The representation of symbolic worlds in physical artifacts, and especially the representation of the syntax of such a world in the physical constraints of the artifact itself, is an enormously powerful principle.
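
The principle in miniature (editor's gloss): since

    log(a·b) = log a + log b,

a scale that places each number at a distance proportional to its logarithm turns multiplication into the physical act of laying two lengths end to end -- the length to 2 plus the length to 8 lands on 16, because log 2 + log 8 = log 16. The syntax of arithmetic has become the geometry of the stick.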

Edwin Hutchins: Cognition in the Wild, p 115

Of all the many possible ways of representing position and implementing navigation computations in the Western tradition, the chart is the one in which the meaning of the expression of position and the meaning of the operations that produce that expression are most easily understood. As was noted above, lines of position could be represented as linear equations, and the algorithm applied to find their intersection could be that of simultaneous linear equations. As a physical analog of space, the chart provides an interface to a computational system in which the user's understanding of the form of the symbolic expressions (lines of position) is structurally similar to the user's understanding of the meanings of the expressions (relations among locations in the world).

In fact, the similarity is so close that many users find the form and the meaning indistinguishable. Navigators not only think they are doing the computations, they also invest the interpretations of events in the domain of the representations with a reality that sometimes seems to eclipse the reality outside the skin of the ship. One navigator jokingly described his faith in the charted position by creating the following mock conversation over the chart: "This little dot right here where these lines cross is where we are! I don't care if the bosun says we just went aground, we are here and there is plenty of water under the ship here." For the navigator, the ship is where the lines of position intersect.
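
The symbolic alternative Hutchins contrasts with the chart, made concrete (editor's sketch with hypothetical numbers): the same fix computed as simultaneous linear equations -- a representation in which watching someone work reveals almost nothing about the geometry:

    import numpy as np

    # Two lines of position, each as a*x + b*y = c (e.g. from two bearings).
    A = np.array([[1.0, -1.0],    # line 1:  x - y = 0
                  [1.0,  1.0]])   # line 2:  x + y = 4
    c = np.array([0.0, 4.0])

    fix = np.linalg.solve(A, c)   # their intersection: the ship's position
    print(fix)                    # [2. 2.]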

Edwin Hutchins: Cognition in the Wild, p 270

Simply being in the presence of others who are working does not always provide a context for learning from their actions. In the example above, the fact that the work was done in [a conversation] between the plotter and the bearing recorder opened it to other members of the team. In a similar way, the design of tools can affect their suitability for joint use or for demonstration and may thereby constrain the possibilities for knowledge acquisition. The interaction of a task performer with a tool may or may not be open to others, depending upon the nature of the tool itself. The design of a tool may change the horizon of observation of those in the vicinity of the tool.

For example, because the navigation chart is an explicit graphical depiction of position and motion, it is easy to "see" certain aspects of solutions. The chart representation presents the relevant information in a form such that much of the work can be done on the basis of perceptual inferences. Because the work done with a chart is performed on its surface -- all of the work is at the device's interface, as it were -- watching someone work with a chart is much more revealing of what is done to perform the task than watching someone work with a calculator or a computer.

The openness of a tool can also affect its use as an instrument of instruction. When the bearing recorder chooses a set of landmarks that result in lines of position with shallow intersections, it is easy to show him, on the chart, the consequences of his actions and the nature of the remedy required... Imagine how much more difficult it would be to explain the inadequacy of the landmark assignment if the lines of position were represented as equations to be punched into a calculator rather than as lines drawn on the chart.

David Graeber: What’s the Point If We Can’t Have Fun?

To exercise one’s capacities to their fullest extent is to take pleasure in one’s own existence, and with sociable creatures, such pleasures are proportionally magnified when performed in company... This does not need to be explained. It is simply what life is. We don’t have to explain why creatures desire to be alive. Life is an end in itself. And if what being alive actually consists of is having powers -- to run, jump, fight, fly through the air -- then surely the exercise of such powers as an end in itself does not have to be explained either. It’s just an extension of the same principle.

E. T. Jaynes: Notes on Present Status and Future Prospects (1991)

Every new conceptual idea (unlike a mathematical one) must go through a phase of facing opposition from two sides -- the entrenched Establishment who thinks that its toes are being stepped on, and a lunatic fringe that springs up, seemingly by spontaneous generation, out of the idea itself.

Those whose fame and fortune are based on their very real accomplishments using previous methods have a strong vested interest in them and will raise strenuous opposition to any attempt to replace them. This phenomenon has been very well documented in many cases...

In contrast to the Establishment which is protecting something that has some demonstrated value, the lunatic fringe has no vested interest in anything because it is composed of those who have never made any useful contribution to any field. Instead, they are parasites feeding on the new idea; while giving the appearance of opposing it, in fact they are deriving their sole sustenance from it, since they have no other agenda.

The Establishment and the lunatic fringe have the common feature that they do not understand the new idea, and attack it on philosophical grounds without making any attempt to learn its technical features so they might try it and see for themselves how it works. Many will not even deign to examine the results which others have found using it; they know that it is wrong, whatever results it gives.

There is no really effective way to deal with this kind of opposition; one can only continue quietly accumulating the evidence of new useful results, and eventually the truth will be recognized.

Peter Turchi: Maps of the Imagination: The Writer as Cartographer

In the early 1960s, James Lord agreed to pose for Alberto Giacometti for one afternoon, for a sketch. The sketch became a painting, and the session went on for eighteen days. Lord kept a record of Giacometti's process, which was cyclical.

Giacometti would stall, sometimes for hours, before beginning to work (some days he had to be coaxed out of a café or kept from destroying earlier drawings in his studio); when he finally sat at the canvas, he would either despair over his inability to do work of any merit or make optimistic noises; before long his tune would shift from optimism to despair, or from despair to talk of suicide.

Every day, he would erase, or paint over, the previous day's work. Typically, he would continue until the studio was almost completely dark; typically, at the end of the day he would deem the work a failure.

Oliver Heaviside: quoted in Paul Nahin's biography

A part of us lives after us, diffused through all humanity -- more or less -- and through all nature. This is the immortality of the soul. There are large souls and small souls... That of a Shakespeare or Newton is stupendously big. Such men live the best part of their lives after they are dead. Maxwell is one of these men. His soul will live and grow for long to come, and hundreds of years hence will shine as one of the bright stars of the past, whose light takes ages to reach us.

David Hestenes: interview

The point of Modeling Instruction is to organize the instruction in a way that enables students to take advantage of the conceptual tools that are available. And they have to be introduced to these tools...

Geometric Algebra and Calculus [were first] fully developed for application to advanced projects in physics and science. These more powerful tools are now ready to start moving down to the earlier grades. That's what I consider as the design for the classroom of the future. Modeling Instruction is concerned primarily with reforms that have to be done now.

Tycho: Fan Fiction

Brave to write it, all by itself, and then brave to show it. It is like opening your ribcage, and letting someone see the little bird you have inside. What if they don’t love the bird? It’s not like you can change it. I mean... That’s your bird.

DigDug2k: comment on Ian Bogost's article "The Secret History of the Robot Car"

I heard a talk (at Google) a few years ago by one of the Stanford guys [working on self-driving cars]. He spent a lot of it complaining about horrible defense contractors taking so long to get these working, then talked about how he went and grabbed off-the-shelf LIDAR and GPS systems to build their first prototype. Those defense contractors spent the last 30 years building those LIDAR and GPS systems. I think the breakthrough was probably those systems becoming cheap enough to use in commercial things like this.

Vincent Toups: Duckspeak Vs Smalltalk: The Decline of the Xerox PARC Philosophy at Apple Computers

HyperCard was, by comparison, much closer to the Dynabook ethos [than the iPad]. In a sense, the iPad is the failed "HyperCard Player" brought to corporate fruition, able to run applications but completely unsuited for developing them, both in its basic design (which prioritizes pointing and clicking as the mechanism of interaction), in the conceptual design of its software, and in the social and legal organization of its software distribution system.

Carver Mead: Semiconductors (13:10)

[In a digital computer] the continuous variables have to be represented, of course, by numbers -- strings of these discrete symbols. And they have to get processed in discrete chunks, one at a time. So there's no notion of locality or continuity that we have in the physical world. And much of what goes on in the physical world is really simplified a great deal by the continuity that exists there, and that continuity is lost when we've digitized the information. So, all the alternative hypotheses for the solution to one of these exponential problems have to be spelled out in these discrete symbols, without any natural continuity between them. And for that reason, we just go and work them, one at a time, one after another.

Carver Mead: comments after Dave Johannsen's reminiscences

We had to debug the Gerber plotter. So here we were, making a VLSI chip, sitting on the floor with our scope and so forth, with cards on the extender, and the cards have discrete transistors on them. So we were using three-generations-old technology to do the next-generation technology. And that's a lesson that you learn all the time, that if you're doing something new, you're always doing it with the old tools, or the old thought processes, or the old languages, or whatever. And that persists, no matter what you're doing.

Alec Resnick: Rendering Learners Legible

We should be suspicious any time we notice that classic trick of marketing something whose inverse is unimaginable -- after all, who wouldn’t want personalized education? If you cannot invert the reform and find something someone reasonable might disagree with, you have a platitude on your hands. And platitudes that front for reforms corrupt their language and often end up running defense for other, more obscure dynamics.

Theodore Roszak: From Satori to Silicon Valley (1985)

The reversionary and the technophiliac choices with which our society confronts us do not so readily combine; indeed, I suspect there is an insurmountable hostility between the large scale technology on which the computer industry is based and the traditional values that the counter culture wished to salvage. The military-industrial complex battens off the gigantism of advanced technology; it is not the ally of communal or organic values. Nor are the corporate leaders of the industrial world so easily outsmarted and outflanked as the Fullerite Technophiles always wanted to believe. Moneyed elites are no slouches when it comes to defending their interests. They can outspend their opposition; they can outwait and outwit their enemies by hiring the brains they need as well as the brute power.

It is sad in the extreme to know, as we now do, that before Ken Kesey and Timothy Leary brought the gospel of LSD to the streets, the CIA had long since undertaken an exhaustive run of experiments with the hallucinogens using human beings as guinea pigs to explore the possibilities of mind control. Similarly, it now seems abundantly clear that long before the personal computer has the chance to restore democratic values, the major corporations and the security agencies of the world will have used the technology to usher in a new era of advanced surveillance and control. As for the space rocket and satellite, we can be sure that by the time the L-5 Society has raised the funds for its first modest colony, the military will already be encamped on the high frontier armed with unheard of genocidal weaponry.

Theodore Roszak: From Satori to Silicon Valley (2000)

Gibson's cyber cowboys are marginalized bottom-dogs so sunk in narcotic fantasies and hallucinatory worlds that they have no Reality Principle to cling to. They inhabit a fictive zone where nothing is certain and everything can be manipulated. Anybody you meet might be a hologram. Not even their minds are their own. What the powers of modern technology have finally brought them is ecological doom in an empire of unlivable cities dominated by the high rise towers and emblazoned logos of the reigning corporate elite.

Jedediah Purdy: Anthropocene fever

For all the talk of crisis that swirls around the Anthropocene, it is unlikely that a changing Earth will feel catastrophic or apocalyptic... Some environmentalists still warn of apocalypse to motivate could-be, should-be activists; but geologic time remains far slower than political time, even when human powers add a wobble to the planet. Instead, the Anthropocene will be like today, only more so: many systems, from weather to soil to your local ecosystem, will be in a slow-perennial crisis. And where apocalyptic change is a rupture in time, a slow crisis feels normal. It feels, in fact, natural.

So the Anthropocene will feel natural. I say this not so much because of the controversial empirics-cum-mathematics of the climate-forecasting models as because of a basic insight of modernity that goes back to Rousseau: humanity is the adaptable species. What would have been unimaginable or seemed all but unintelligible 100 years ago, let alone 500 (a sliver of time in the evolutionary life of a species), can become ordinary in a generation. That is how long it took to produce ‘digital natives’, to accustom people to electricity and television, and so on for each revolution in our material and technological world. It takes a great deal of change to break through this kind of adaptability.

This is all the more so because rich-country humanity already lives in a constant technological wrestling match with exogenous shocks, which are going to get more frequent and more intense in the Anthropocene. Large parts of North America regularly experience droughts and heat waves that would devastate a simpler society than today’s US. Because the continent is thoroughly engineered, from the water canals of the West to the irrigation systems of the Great Plains to air conditioning nearly everywhere, these are experienced as inconvenience, as mere ‘news’. The same events, in poorer places, are catastrophes.

Hans Christian von Baeyer: Warmth Disperses and Time Passes: The History of Heat, p 32

Scientists at Cambridge University have recently re-created Joule's experiment, taking exquisite care to assure authenticity in every detail... The historical detectives made the unexpected and troubling discovery that the minuscule heating of the room caused by the physical exertions of the experimenters overwhelmed the expected rise in temperature of the water! And even after they overcame that difficulty, along with many other equally serious ones, they were not able to duplicate Joule's published results.

Blinded by our belief in the superiority of the present over the past, we might conclude that Joule cheated, but then we would miss the point. On the contrary, we must admit that he probably knew things that we don't know, and that cannot be communicated by technical papers. He was the Paganini of the thermometer, and capable of making that instrument perform unimaginable feats.

Merely reading it was high art: "And since constant practice enabled me to read off with the naked eye 1/20 of a division, it followed that 1/200 of a degree Fahrenheit was an appreciable temperature." Such precision with the naked eye was and is unheard of. Joule's friend and champion William Thomson called it magical. Besides his uncanny ability to read scales, Joule must have relied tacitly on a whole repertoire of tricks and habits that made him in his day the undisputed leader in the field of thermometry, and that we may never rediscover because they are no longer useful.

Theodore Roszak: The Cult of Information, p 75, 78

Papert likes to claim that Logo, because it is transparently interactive at every step, amounts to letting the child program the machine, rather than having the machine program the child... True, students write the program, but they must do so on the machine's terms. They must stay within the machine's language and logic... Students are free to call a square a box, and they can instruct the machine to turn the box so many degrees this way or that. But they cannot order the computer to put a Hobbit in the box, or to make the box grow wings and fly away to Middle Earth. Logo grants the children control over an experimental "microworld" in which to do their programming; but the microworld is not the full terrain of the human imagination... It is well suited to geometrical play but not to fantasy that oversteps those narrow boundaries. As I read Papert's words, I found myself haunted by the image of the prisoner who has been granted complete freedom to roam the "microworld" called jail: "Stay inside the wall, follow the rules, and you can do whatever you want." ...

Logo does not allow the artistic imagination to romp; the child who would like to draw a horse, or a space monster, or a clown that does not look like a collection of boxes and circles is going to be out of luck. Nor does Logo allow the hand to use a pencil to sweep and shade and dance across the page. Art, like everything else Logo teaches, comes down to fingers stroking a keyboard, a mind working out a program. In a perverse sense, this may be an excellent lesson in computer science, especially in the artificial intelligence research that underlies Logo; the children learn the grand reductive principle: If the computer cannot rise to the level of the subject, then lower the subject to the level of the computer. Possibly, in the Information Age, children will find themselves living in a society where that principle has become the rule in all fields touched by the computer. Logo might then be seen as a useful preparation for "real life."

Theodore Roszak: The Cult of Information, p 94

It is precisely because some ideas -- many ideas -- are brutal and deadly that we need to learn how to deal with them adroitly. An idea takes us into people's minds, ushers us through their experience. Understanding an idea means understanding the lives of those who created and championed it. It means knowing their peculiar sources of inspiration, their limits, their vulnerabilities and blind spots.

What our schools must offer the young is an education that lets them make that journey through another mind in the light of other ideas, including some that they have fashioned for themselves from their own experience. The mind that owns few ideas is apt to be crabbed and narrow, ungenerous and defensive in its judgements. "Nothing is more dangerous than an idea," Émile Chartier once said, "when it is the only one we have."

Tim Urban: Elon Musk: The World’s Raddest Man

A few people I spoke with referenced [Elon] Musk’s obsession with truth and accuracy. He’s fine with and even welcoming of negative criticism about him when he believes it’s accurate, but when the press gets something wrong about him or his companies, he usually can’t help himself and will engage them and correct their error. He detests vague spin-doctor phrases like “studies say” and “scientists disagree,” and he refuses to advertise for Tesla, something most startup car companies wouldn’t think twice about -- because he sees advertising as manipulative and dishonest.

Ian Bogost: Video Games Are Better Without Characters

What if games’ role in representation and identity lies not in offering familiar characters for us to embody, but in helping wrest us from the temptation of personal identification entirely? What if the real fight against monocultural bias and blinkeredness does not involve the accelerated indulgence of identification, but the abdication of our own selfish, individual desires in the interest of participating in systems larger than ourselves? What if the thing games most have to show us is the higher-order domains to which we might belong, including families, neighborhoods, cities, nations, social systems, and even formal structures and patterns? What if replacing militarized male brutes with everyone’s favorite alternative identity just results in Balkanization rather than inclusion? ...

Amidst arguments on Twitter and Reddit about whose favorite games are more valid, while we worry about the perfect distribution of bodies in our sci-fi fantasy, the big machines of global systems hulk down the roads and the waterways, indifferent. It is an extravagance to worry only about representation of our individual selves while more obvious forces threaten them with oblivion—commercialism run amok; climate change; wealth inequality; extortionate healthcare; unfunded schools; decaying infrastructure; automation and servitude. And yet, we persist, whether out of moralism or foolishness or youth, lining up for our proverbial enslavement. We’ll sign away anything, it would seem, so long as we’re still able to “express ourselves” with the makeshift tools we are rationed by the billionaires savvy enough to play the game of systems rather than the game of identities. ...

Only a fool would fail to realize that we are the Sims now meandering aimlessly in the streets of the power brokers’ real-world cities. Not people with feelings and identities at all, but just user interface elements that indicate the state of the system, recast in euphemisms like the Sharing Economy, such that its operators might adjust their strategy accordingly.

Norbert Wiener: The Human Use of Human Beings, p 36

Nature offers resistance to decoding, but it does not show ingenuity in finding new and undecipherable methods for jamming our communication with the outer world.

This distinction between the passive resistance of nature and the active resistance of an opponent suggests a distinction between the research scientist and the warrior or game player. The research physicist has all the time in the world to carry out his experiments, and need not fear that nature will in time discover his tricks and method and change her policy. Therefore, his work is governed by his best moments, whereas a chess player cannot make one mistake without finding an alert adversary ready to take advantage of it and to defeat him. Thus the chess player is governed more by his worst moments.

Alan Kay: Vannevar Bush Symposium talk, 7:04

Knowing more than your own field is really helpful in [thinking creatively]. I've always thought that one of the reasons the 1960s was so interesting is that nobody was a computer scientist back then. Everybody who came into it came into it with lots of other knowledge and interests. Then they tried to figure out what computers were, and the only places they could use for analogies were other areas. So we got some extremely interesting ideas from that.

And of course, the reason being educated is important is simply because you don't have any blue [orthogonal] contexts if you don't have any other kinds of knowledge to think with. Engineering is one of the hardest fields to be creative in, just because it's all about optimizing, and you don't optimize without being very firmly anchored to the context you're in. What we're talking about here is something that is not about optimization, but about rotating the point of view.

Tim Berners-Lee: Vannevar Bush Symposium talk, 7:20

So, I still have a dream that the web could be less of a television channel, and more of a sea of interactive shared knowledge...

The "World Wide Web" program, the original browser/editor, was in fact an editor, and you could make links as easily as you could follow them. And that was fundamental. There are two things which seem to me to be totally bizarre. One of them is the fact that you can't do that [now], that we've lost that. So in fact the thing is not interactive. I don't know if I can think of any hypertext experiments in research where you haven't been able to make links just as easily as following them. Authorship has always been right up there. And now, for some historical quirk, which I could go into, I have gone into, I won't go into, we have a whole bunch of things out there which are "browsers".

So that's something I'm a little embarrassed about. And the second thing I'm embarrassed about... When you made the links, and you edited the text on the screen, you didn't see any of these URLs and HTML and all that stuff. The weirdest thing for me, if you can imagine, is to see an advertisement in the "help wanted" of the Boston Globe, saying they want HTML writers, HTML programmers. I mean, give me a break! That's like asking somebody to come along with the skills to write a Microsoft Word file in binary. The whole thing is totally inappropriate.

Jamie Zawinski: resignation and postmortem

When we started [Netscape], we were out to change the world. And we did that. Without us, the change probably would have happened anyway, maybe six months or a year later, and who-knows-what would have played out differently. But we were the ones who actually did it. When you see URLs on grocery bags, on billboards, on the sides of trucks, at the end of movie credits just after the studio logos -- that was us, we did that. We put the Internet in the hands of normal people. We kick-started a new communications medium. We changed the world.

Ted Nelson: As We Will Think

[In the memex], the user may make connections between different parts of the things stored. He does this by associative indexing... By this associative technique he may create "trails", new documentary objects that are useful in new ways... These new structures, or trails, may be taken and given to other people... And they may be published...

It is strange that "As We May Think" has been taken so to heart in the field of information retrieval, since it runs counter to virtually all work being pursued under the name of information retrieval today. Such systems are principally concerned either with indexing conventional documents by content, or with somehow representing that content in a way that can be mechanically searched and deciphered.

This is indeed paradoxical... Bush did not think well of indexing... His real emphasis was on linkage, and new structures and activities that the automatic link-jump would make possible.

The n+1 Editors: Whatever Minutes

Civilization takes a turn. Not in the sense that talking on a cell phone while you pay for groceries is uncivilized, as in, uncouth, ignorant of the rules that still exist. The point is that it is decivilizing, undoing practices of civilization as fundamental as using silverware to eat. Or alternatively civilizing, if you like, because it doesn’t send us on a straight path backward (as if we were going to eat with our fingers or read by whale-oil light) but deflects us into something new that no one intended or wanted in advance.

Leah Hunt-Hendrix: So You Want to Be a "Radical" Philanthropist?

Singer-style philanthropy is palliative, an attempt to reduce suffering that leaves untouched the question of what generated the suffering in the first place, and what long-term solutions there might be to end its continual reproduction. It offers nets to help individual Africans avoid malaria while ignoring the structural, political, and economic reasons malaria is rampant. It prunes around the edges of a poison tree, rather than grasping at its roots. ...

The best philanthropy is the type that seeks to end the system that perpetually generates the need for philanthropy.

Frank Wilson: The Hand

It is genuinely startling to read [Sir Charles Bell's 1833 treatise, The Hand, Its Mechanism and Vital Endowments, as Evincing Design] now, because its singular message -- that no serious account of human life can ignore the central importance of the human hand -- remains as trenchant as when it was first published. This message deserves vigorous renewal as an admonition to cognitive science. Indeed, I would go further: I would argue that any theory of human intelligence which ignores the interdependence of hand and brain function, the historic origins of that relationship, or the impact of that history on developmental dynamics in modern humans, is grossly misleading and sterile.

Frank Wilson: The Hand

Since the Industrial Revolution, parents have expected that organized educational systems will tame and modernize their children and "prepare them for life." Such is the theory. But education -- ritualized, formal education, at least -- is not an all-purpose solution to the problem of inexperience and mental immaturity among the young. I was completely unprepared for the frequency with which I heard the people whom I interviewed [musicians, puppeteers, woodworkers, others whose careers depended on unusually refined hand control] either dismiss or actively denounce the time they had spent in school. Most of my interview subjects, although I never asked them directly, said quite forcefully that they had clarified their own thinking and their lives as a result of what they were doing with their hands. Not only were most of them essentially self-taught, but a few had engineered their personally unique repertoire of skills and expertise in open retreat from painful experiences in a school system that had dictated the form and content of their education in order to prepare them for a life modeled on conventional norms of success.

Kieran Egan: Getting It Wrong from the Beginning

But this is not a work of history. I do consider some historical figures, but only because it is sometimes easier to disinter the ideas that have been loaded with layers of complexity over the years by looking at their earlier appearance and then seeing how they have gradually transmuted into today's presuppositions. It is a way of trying to make strange what is so familiar that we find it hard to think about.

Leslie Lamport: The Future of Computing: Logic or Biology

The fundamental problem with approaching computer systems as biological systems is that it means giving up on the idea of actually understanding the systems we build. We can’t make our software dependable if we don’t understand it. And as our society becomes ever more dependent on computer software, that software must be dependable. ...

When people who can’t think logically design large systems, those systems become incomprehensible. And we start thinking of them as biological systems. Since biological systems are too complex to understand, it seems perfectly natural that computer programs should be too complex to understand.

We should not accept this... If we don’t, then the future of computing will belong to biology, not logic. We will continue having to use computer programs that we don’t understand, and trying to coax them to do what we want. Instead of a sensible world of computing, we will live in a world of homeopathy and faith healing.

Walter Vincenti: interview

Q: You were developing transonic theory after the sound barrier had already been broken. Hasn’t much of your historical study also involved engineering problems that were “solved” in a practical sense before they were understood theoretically?

A: Yes, and I think that’s a typical situation in technology. You have to look hard to find cases in which the theory is well worked out before the practice. Look at the steam engine and thermodynamics; that whole vast science got started because people were trying to explain and calculate the performance of the reciprocating steam engines that had been built.

Maciej Cegłowski: Web Design: The First Hundred Years, 19:39

These people are really smart -- they're smarter than I am -- but they've climbed into their heads and they've pulled the ladder up behind them. They're taking their graph paper and these exponential curves of how fast technology's growing, and scaring themselves silly. That's not a way to run an industry.

Carver Mead: The Universe and Us: An Integrated Theory of Electromagnetics and Gravitation (18:44)

We had started out [via de Broglie and Schrödinger] having a physical picture of the electron as a wave propagating around the proton... it all made perfect sense intuitively. But then you got some fancy mathematics that made it unnecessary to have the physical picture. And then Bohr argued that "We're above all that now. We don't need physical pictures. We don't need to use intuition." ...

Now don't get me wrong, there's nothing wrong with mathematics. But what got propagated was the notion that mathematics had become the guide to physical theory. ...

One of the things that I've developed through my life is an enormous respect for the power of mathematics. I know a lot of mathematicians, they're very bright, and one of the things I've come to realize is that you can, if you're good enough, develop a mathematics for any physical theory -- whether it's what nature does or not. So in fact if you say that mathematics is going to guide what physics does, all you're saying is that you've let go of the fact that what the real world does should be guiding what your physics is. Because the mathematics can express anything.

That's where we are today. Mathematics took over, and we now have essentially all our physics taught with increasingly sophisticated mathematics and less and less physical insight.

Doug Engelbart: The Augmented Knowledge Workshop (in "A History of Personal Workstations")

By 1959 I was lucky enough to get a small grant from the Air Force Office of Scientific Research that carried me for several years -- not enough for my full-time work, but by 1960 SRI began pitching in the difference.

It was remarkably slow and sweaty work. I first tried to find close relevance within established disciplines. For a while I thought that the emergent AI field might provide me with an overlap of mutual interest. But in each case I found that the people I would talk with would immediately translate my admittedly strange (for the times) statements of purpose and possibility into their own discipline's framework. When rephrased and discussed from those other perceptions, the "augmentation" pictures were remarkably pallid and limited compared to the images that were driving me.

For example, I gave a paper in 1960 at the annual meeting of the American Documentation Institute, outlining the probable effects of future personal-support use of computers. I discussed how a focus on personal support would change the role of their future systems and how such a change would enable more effective roles for the documentation and information specialists. I received no response at all at the meeting. One reviewer gave a very ho-hum description of the paper as the discussion of a (yet another) personal retrieval system. Later, during a visit to a high-caliber research outfit, an information-retrieval researcher got very hot under the collar because I wouldn't accept his perception that all that the personal-use augmentation support I was projecting amounted to, pure and simple, was a matter of information retrieval and why didn't I just join their forefront problem pursuits and stop setting myself apart?

Then I discovered a great little RAND report written by Kennedy and Putt [Administration of Research in a Research Corporation] that described my situation marvelously and recommended a solution. Their thesis was that when launching a project of inter- or new-discipline nature, the researcher would encounter consistent problems in approaching people in established disciplines. They wouldn't perceive your formulations and goals as relevant, and they would become disputative on the apparent basis that your positions were contrary to "accepted" knowledge or methods. The trouble, said these authors, was that each established discipline has its own "conceptual framework." The enculturation of young professionals with their discipline's framework begins in their first year of professional school. Without such a framework, tailored for the goals, values, and general environment of the respective discipline, there could be no effective, collaborative work. Furthermore, if such a conceptual framework did not already exist for a new type of research, then before effective research should be attempted, an appropriate, unique framework needs to be created. They called this framework-creation process the "Search Phase."

So, I realized that I had to develop an appropriate conceptual framework for the augmentation pursuit that I was hooked on. That search phase was not only very sweaty, but very lonely. In 1962, I published an SRI report entitled, "Augmenting Human Intellect: A Conceptual Framework." ... I can appreciate that these framework documents appear to many others as unusably general and vague. But, for me, the concepts, principles, and strategies embodied in that framework look better and better every year. The genesis of most of what was and is unique about the products of the augmentation work can be traced back to this framework.

Christopher Alexander: debate with Peter Eisenman

Let's just talk about the simple matter of making an arcade. I find in my own practical work that in order to find out what's really comfortable, it is necessary to mock up the design at full scale. This is what I normally do. So I will take pieces of lumber, scrap material, and I'll start mocking up. How big are the columns? What is the space between them? At what height is the ceiling above? How wide is the thing? When you actually get all those elements correct, at a certain point you begin to feel that they are in harmony.

Of course, harmony is a product not only of yourself, but of the surroundings. In other words, what is harmonious in one place will not be in another. So, it is very, very much a question of what application creates harmony in that place. It is a simple objective matter. At least my experience tells me, that when a group of different people set out to try and find out what is harmonious, what feels most comfortable in such and such a situation, their opinions about it will tend to converge, if they are mocking up full-scale, real stuff. Of course, if they're making sketches or throwing out ideas, they won't agree. But if you start making the real thing, one tends to reach agreement. My only concern is to produce that kind of harmony.

E. O. Wilson: Consilience: The Unity of Knowledge

I believe that in the process of locating new avenues of creative thought, we will also arrive at an existential conservatism. It is worth asking repeatedly: Where are our deepest roots? We are, it seems, Old World, catarrhine primates, brilliant emergent animals, defined genetically by our unique origins, blessed by our newfound biological genius, and secure in our homeland if we wish to make it so. What does it all mean?

This is what it all means: To the extent that we depend on prosthetic devices to keep ourselves and the biosphere alive, we will render everything fragile. To the extent that we banish the rest of life, we will impoverish our own species for all time. And if we should surrender our genetic nature to machine-aided ratiocination, and our ethics and art and our very meaning to a habit of careless discursion in the name of progress, imagining ourselves godlike and absolved from our ancient heritage, we will become nothing.

E. O. Wilson: The Meaning of Human Existence

I hereby cast a vote for existential conservatism, the preservation of biological human nature as a sacred trust. We are doing very well in terms of science and technology. Let’s agree to keep that up, and move both along even faster. But let’s also promote the humanities, that which makes us human, and not use science to mess around with the wellspring of this, the absolute and unique potential of the human future.

Alan Kay: The Power of Simplicity, 38:21

All the different ways companies have invented to kill that goose [that lays the golden eggs]. One of them is, just eat it. Forget about those eggs. ...

"Only one gold egg every twelve?" "I want gold coins rather than golden eggs." "I want platinum eggs." No! You can buy platinum with the gold from these eggs.

Make the goose a manager. Give the goose a deadline. Require the goose to explain to you how they're going to make the next egg.

This is just at the level of ridiculousness that's going on. Nobody who [thinks like this] has ever laid a golden egg. It's not their business. Their business is to count those golden eggs after they get laid.

David Graeber: Of Flying Cars and the Declining Rate of Profit

Might the cultural sensibility that came to be referred to as postmodernism best be seen as a prolonged meditation on all the technological changes that never happened? The question struck me as I watched one of the recent Star Wars movies. The movie was terrible, but I couldn’t help but feel impressed by the quality of the special effects. Recalling the clumsy special effects typical of fifties sci-fi films, I kept thinking how impressed a fifties audience would have been if they’d known what we could do by now -- only to realize, “Actually, no. They wouldn’t be impressed at all, would they? They thought we’d be doing this kind of thing by now. Not just figuring out more sophisticated ways to simulate it.”

That last word -- simulate -- is key. The technologies that have advanced since the seventies are mainly either medical technologies or information technologies -- largely, technologies of simulation. They are technologies of what Jean Baudrillard and Umberto Eco called the “hyper-real,” the ability to make imitations that are more realistic than originals. The postmodern sensibility, the feeling that we had somehow broken into an unprecedented new historical period in which we understood that there is nothing new; that grand historical narratives of progress and liberation were meaningless; that everything now was simulation, ironic repetition, fragmentation, and pastiche -- all this makes sense in a technological environment in which the only breakthroughs were those that made it easier to create, transfer, and rearrange virtual projections of things that either already existed, or, we came to realize, never would. Surely, if we were vacationing in geodesic domes on Mars or toting about pocket-size nuclear fusion plants or telekinetic mind-reading devices no one would ever have been talking like this. The postmodern moment was a desperate way to take what could otherwise only be felt as a bitter disappointment and to dress it up as something epochal, exciting, and new.

Mikey Dickerson: One Year After healthcare.gov

Our country is a place where we allocate our resources through the collective decisions that all of us make... We allocate our resources to the point where we have thousands of engineers working on things like picture-sharing apps, when we've already got dozens of picture-sharing apps. We do not allocate anybody to problems like... This is just a handful of things that I've been asked to staff in the last week or so and do not have adequate staff to do...

anonymous student: The Authoritarian Turn of Academia

Many fields have moved away from a materialist view of the world and toward a seemingly obsessive focus on discourse as the most important explanation for social problems like income inequality, racism, sexism, etc. The idea, to put it simply, is that the way we represent people, places, and things is as important — if not more important — than reality itself. ... Some may remember that in the 1990s, this formed the basis for a broad debate in academia over the merits of what was generally referred to as “postmodernism.” ...

I suspect the debate is over because of the frustrating inefficacy of a conversation in which disagreement is construed as oppression or violence. It’s simply easier for the many professors who do not share the research priorities of the cultural turn — for example, a sociologist studying the reasons for poor health conditions in Indian slums instead of the language surrounding poor health in an Indian slum — to simply defer to their discourse-focused colleagues on most issues rather than risk career-damaging accusations of “silencing” or “marginalizing” or “epistemological violence.”

Carver Mead: oral history

A research institution really has [three] roles that are complementary... The first one is... it’s a flywheel in the knowledge base. In other words, fads come and go so fast that it’s very easy to actually lose some knowledge before it surfaces in a useful way again.

A good example is right now [1996]. All the universities are trying to put back in place communications electronics — which is something I grew up with, doing radio and that kind of stuff. And then it all went out of fashion when the computer came in. But now we’ve got communications again, and all of a sudden people have decided, “Gee, what was that radio thing we used to do?” But all those guys have retired. Now we’ve got to hire somebody who knows something about radio...

Fortunately, we still have enough of the old guys around who did that and have been doing that all along. And they were kind of out of favor, and now they’re back in favor. So there’s this flywheel effect. It’s a result of the tenure system that we have people who know things that aren’t the hot rage right now but are very important. It’s often seen as dead wood or something, but no, it’s a flywheel. It’s a way to preserve a knowledge base so you don’t have to learn it from scratch all over again.

[Secondly], we are at the forefront of the research. We are the ones who are coming up with the new ideas, the new directions, who are fighting to get people to see that there really is something here. So we’re at the leading edge — or the bleeding edge, as they say.

But then the [third] thing we do... is that because we teach, we’re interested in the unification of knowledge. What I’m doing this year is a pure form of that. I’m not inventing anything, I’m just getting to where I can look at things in a way that makes things that were very complicated much simpler. So it doesn’t take so much specialization and work...

You see, there are always two things that happen when you have a very rapid increase in knowledge. There’s the sort of bifurcation phenomenon — you get a bunch of little subfields and disciplines and all of that stuff. But then there’s the thing that comes behind that, and it’s slower. And that is the assimilation of these things and their unification; so there’s a whole set of things that come from unifying principles. If we didn’t have that, there’d be no hope of ever having an education at all, because the knowledge base is doubling every year or two. In six years or so, there’d be absolutely no hope of getting anything.

John Cramer: The Transactional Interpretation of Quantum Mechanics

It should be emphasized that the [Transactional Interpretation] is an interpretation of the existing formalism of quantum mechanics rather than a new theory or revision of the quantum mechanical formalism. As such, it makes no predictions which differ from those of conventional quantum mechanics. It is not testable except on the basis of its value in dealing with interpretational problems. The author has found it to be more useful as a guide for deciding which quantum mechanical calculations to perform than as an aid to the performance of such calculations.

The main utility of the TI is as a conceptual model which provides the user with a way of clearly visualizing complicated quantum processes and of quickly analyzing seemingly "paradoxical" situations (e.g., Wheeler's delayed choice experiments, Herbert's paradox, the Hanbury-Brown-Twiss effect, and the Albert-Aharonov-D'Amato prediction) which would otherwise require elaborate mathematical analysis. It is a way of thinking rather than a way of calculating.

It may have value as a pedagogical tool for the teaching of quantum mechanics to students. It also seems to have considerable value in the development of intuitions and insights into quantum phenomena that up to now have remained mysterious.

Christopher Alexander: The Timeless Way of Building

Within this process, every individual act of building is a process in which space gets differentiated. It is not a process of addition, in which preformed parts are combined to create a whole, but a process of unfolding, like the evolution of an embryo, in which the whole precedes the parts, and actually gives birth to them, by splitting.

Alan Macdonald: Geometric Algebra 5

Pauli needed the mathematical structure of G3 [geometric algebra of 3-dimensional space] for his theory. Since G3 was not available, he devised an awkward isomorphic structure using the mathematical tools available to him: vectors, matrices, complex numbers.

Sam Hahn: Scaling Human Capabilities for Solving Problems that Threaten Our Survival

Ironically, Doug [Engelbart]'s own teams over the years have not sustained themselves to perform continuing, directed, coherent activity around his vision. Some say Doug is hard to work with. Others say the problem is people do not have the patience for Doug's long-term vision, so they take a small subset of his ideas and go off to make their fortunes. For whatever reason, there has not been a critical mass of people organized around his principles for solving complex global problems.

Though Doug's ideas are immortal — and have changed the world in terms of personal computing — Doug is human and has suffered from not being able to carry his big ideas forward. That leaves it to the rest of us, who believe in collective IQ, to figure it out.

Malcolm McCullough: Abstracting Craft: The Practiced Digital Hand

Almost any practiced person values her skill above and beyond what it is good for producing, as though there were psychological benefits to mastery itself.

For example, the circumstances of practice are often themselves a source of satisfaction. This is because skill is sentient: it involves cognitive cues and affective intent. It is also very habitual. In particular, it develops an intimate relation with certain contexts or tools, which makes it individual. No two people will be skilled alike; no machine will be skilled at all.

Of course the latter is debatable if we accept simple mechanical or deductive capacity as skill — but we are maintaining that there is a sentient component too. One way our sentient activity differs from the action of machines is play. We putter about in our studios. We enjoy being skilled. We experiment to grow more so. Skills beget more skills.

Carver Mead: oral history

John Bardeen was just the most unassuming guy. I remember the second seminar Bardeen gave at Caltech — I think it was just after he got his second Nobel Prize for the BCS theory, and it was some superconducting thing he was doing. He had one grad student working on it and they were working on this little thing, and he gave his whole talk on this little dipshit phenomenon that was just this little thing. I was sitting there in the front row being very jazzed about this, because it was great; he was still going strong.

So on the way out, people were talking and one of my colleagues was saying, “I can’t imagine, here’s this guy that has two Nobel Prizes and he’s telling us about this dipshit little thing.” I said, “Don’t you understand? That’s how he got his second Nobel Prize.” Because most people, one Nobel Prize will kill them for life, because nothing would be good enough for them to work on because it’s not Nobel Prize–quality stuff, whereas if you’re just doing it because it’s interesting, something might be interesting enough that it’s going to be another home run. But you’re never going to find out if all you think about is Nobel prizes.

Carver Mead: oral history

You do wonder a little, don’t you, about how long you can keep a presence like that with people who didn’t earn their stripes the way the earlier guys did, because there’s so much of what Intel is that is an outgrowth of having been in the early evolution of semiconductors.

Like Gordon [Moore]’s first job was winding the heating elements for a diffusion tube. I mean, there’s this real — and Andy [Grove], you know, taught the introductory course in physics and technology; his wonderful book that I taught out of. I was the first one to teach out of it.

But hands-on, real knowledge of the physics and the process was really what made the legacy of Intel, and it was that stuff that distinguished them from the people who you read about it in books, and that was true all the way up in the leadership. And you do wonder a little, don’t you — it’s a little like what happens with Microsoft when Bill [Gates] retires. You lose something very essential, and you’re not sure what you get in return.

George Gilder: Microcosm, p 170

[Carver] Mead, however, soon came to see the microprocessor as more a problem than a solution. Sometimes he would say in exasperation — or in his drive to make a point — that "the microprocessor set back the technology ten years"...

Mead believed that the industry was still avoiding the clear message of its technical medium. The designers were once again using VLSI just as they used the transistor and the integrated circuit, mostly to reduce the costs of executing old ideas.

[Ted] Hoff actually was boasting that he followed the scheme of DEC's PDP-8, a computer advanced for its time but ghastly in its waste of printed circuit board space on wires and glue logic. Yet the VLSI copy made by Hoff [the Intel 4004] was inferior in every functional respect to the original computer. It was as if VLSI was an obstacle to the designer rather than an open sesame for radically improved processors...

Avoiding through software, memories, and microprocessors the need to design new chips for new applications, Mead believed, would prove a Pyrrhic victory.

Lynn Conway: quoted in "Microcosm" by George Gilder, p 188

In electronics, a new wave comes through in bits and pieces... I was very aware of the difficulty of bringing forth a new system of knowledge by just publishing bits and pieces of it among traditional work and then waiting until after it has all evolved and someone writes a book about it. What we decided to do was to write about it while it is still happening.

Our method was to project ourselves ahead ten years, and then write the book as though reflecting back upon a decade. Then we would let the people in the community critique it, and let the book itself become the focal point for the creation of methods.

Dave Kosak: Psychonaut Tim Schafer on Taking Risks

[Tim Schafer] said sometimes you have to trick yourself into taking risks. When he was working on Full Throttle, the team agonized over what the main character should look like. They came up with character after character, and kept shooting them down, worried if people would like him or not. "That guy's too big, his jaw looks funny, he's too menacing, etc."

So, the team decided to table the issue and just draw the biker gang instead. That should be easy. They could look as cool or as weird as they wanted. They went nuts! Fat guys, skinny guys, crazy-looking guys, on and on. One of those characters somebody drew up really stuck out. He just looked interesting. "That's the main character!" Schafer remembers realizing. The lead character came about because the art team had tricked itself into taking risks.

Air Force Office of Scientific Research: brochure, August 1960

The Air Force's major activity for the support of new ideas

FOREWORD

A little over a decade ago the Air Force, as chief consumer of advanced technology, realized that applied research is essentially the middleman drawing heavily on supplies from the basic research of gifted scientists. But it faced the situation where its raw material was fast diminishing...

AFOSR AND BASIC RESEARCH

The Air Force Office of Scientific Research is charged with building a stockpile of knowledge which someday will provide the know-how for developing Air Force weapon systems of the future. In one sense, AFOSR may be thought of as a very inexpensive insurance policy for future defense. Stockpiling basic knowledge today may, and no doubt will, preclude such expensive "crash" programs as those brought on by World War II...

AFOSR operates on the premise that scientific advances cannot be ordered or scheduled, but that selection and emphasis from the infinite possible directions of basic research can foster those scientific areas most probably related to present and future Air Force needs. For this reason, AFOSR acts only on unsolicited, original research proposals offered by scientists in universities, nonprofit institutions, and industry. Selection is made essentially on the basis of originality, significance to science, scientific competence of the investigator, and relevance of the proposed research to Air Force needs.

A major objective of AFOSR's program in support of basic research is to become a vital partner in man's unending search for new scientific knowledge. The capability for carrying out this mission is tied directly to the many scientists working in laboratories throughout the free world. To obtain the maximum output of basic science, AFOSR makes every attempt to provide the appropriate environment and support for those inspired individuals who are capable of making first-class contributions. The kind of research sought is fundamentally a quality item.

Elaine Ou: Unable

Pilots have a hard enough time citing unable when facing down a life-or-death situation. It’s even harder on the surly bonds of earth, where death happens slowly. The employer who asks you to work weekends does not suffer the consequences of your failed marriage. The investor who sends you to Vegas isn’t the one gambling with the life of his company. Unable.

A control tower can’t see wake turbulence, icing conditions, or mechanical distress. According to NTSB investigators, no matter how ridiculous a tower directive, the cause of an accident always ultimately comes down to pilot error — for being unwilling to say Unable.

No Air Traffic Controller has ever died from pilot error.

Carver Mead: interview

It always worked out that when I understood something, it turned out to be simple. Take the connection between the quantum stuff and the electrodynamics in my book. It took me thirty years to figure out, and in the end, it was almost trivial. It's so simple that any freshman could read it and understand it. But it was hard for me to get there because all of this historical junk was in the way.

Lawrence Weschler: Mr. Wilson's Cabinet of Wonder, p 121

[Stephen Greenblatt, in Marvelous Possessions: The Wonder of the New World] goes on to ask about the function of all this marveling. Yes, Columbus was overwhelmed with all the wonder he was experiencing — the word itself recurs in his journals and dispatches so often that the King of Spain himself at one point suggested that Columbus should be called not Almirante, the admiral, but Admirans, the one who wonders.

But so much wonder was also a useful screen, for in his writings "Columbus tries to draw the reader toward wonder, a sense of the marvelous that in effect fills up the emptiness at the center of the maimed rite of possession." Greenblatt is referring to that moment, repeated time and again, when following an exchange of trinkets, Columbus claims title to the respective islands in the name of the King of Spain, and none of the native inhabitants contradict him, which he in turn takes for assent.

In the years after Columbus, the European sensibility's virtual debauch in the wonder of the New World allowed it to disguise, from itself, the unprecedented human decimation that was taking place over there, on the ground, at that very moment. Wonder-besotted Europeans were so bedazzled that they could simply fail to notice the carnage transpiring under their very eyes, in their very name.

Noam Chomsky: interview

[Q: Your ideas, your worldview, has never really gone mainstream... Does that depress you, that you're not living in any way kind of like the society you'd like to?]

I never anticipated living in Utopia. If I was in the mainstream, I'd begin to ask myself what I'm doing wrong. There has been (not because of me, but because of many people) notable progress over the years. Not uniform — there's regression as well — but in many respects, it's a more civilized world than it was before. ...

Look, you have two choices. You can say, "I'm a pessimist, nothing's gonna work, I'm giving up, I'll help ensure that the worst will happen." Or you can grasp onto the opportunities that do exist, the rays of hope that exist, and say, "Well, maybe we can make it a better world." It's not much of a choice.

Howard Rheingold: The Virtual Community, p 172

We are both interested in the possibility of adding video to computer conferencing. Part of that ontological untrustworthiness of cyberspace is the lack of body language and facial expression. Misunderstandings that tangle group communications and sour personal relationships online might be avoided if you could add a raised eyebrow or a playful tone of voice to the online vocabulary.

[After trying a video-wall conferencing system], Barlow told me that he found himself somewhat disappointed in that hope. Something seemed missing. Barlow told the computer scientist who was giving him the demonstration about his disappointment. The researcher, a native of India, smiled and told him that what the video does not transmit is "the prana," the life force, literally the breath of the other people.

Binyavanga Wainaina: Glory

We went back to class. Very excited. Heretofore our teachers had threatened us with straightforward visions of failure. Boys would end up shining shoes; girls would end up pregnant. Now there was a worse thing to be: a user of biogas. ...

Often, the formal side, out of its good nature or its panicked guilt, out of a feeling that the giant world of the urban poor is too pathetic to tolerate, pins its hopes and dreams on some revolutionary product. Biogas. A windup radio. A magic laptop. These pure products are meant to solve everything.

They almost always fail, but they satisfy the giver. To the recipients, the things have no context, no relationship to their ideas of themselves or their possibilities. A great salesman can spark a dialogue with you; in a matter of minutes, you come to make your own sense of his product, fitting it into your imagination, your life. You lead, the salesman follows. Whereas a pure product presents itself as a complete solution; a product built to serve the needs of the needy assumes the needy have measured themselves exactly as the product has measured them.

Barack Obama: interview with Jeffrey Goldberg in The Obama Doctrine

I believe that we have to avoid being simplistic. I think we have to build resilience and make sure that our political debates are grounded in reality. It’s not that I don’t appreciate the value of theater in political communications; it’s that the habits we — the media, politicians — have gotten into, and how we talk about these issues, are so detached so often from what we need to be doing that for me to satisfy the cable news hype-fest would lead to us making worse and worse decisions over time...

There’s a difference between resilience and complacency. [There's also a difference between making considered decisions and making rash, emotional ones.] What it means, actually, is that you care so much that you want to get it right and you’re not going to indulge in either impetuous or, in some cases, manufactured responses that make good sound bites but don’t produce results. The stakes are too high to play those games.

Rich Harris: Small modules: it’s not quite that simple

Discoverability is often cited as npm’s biggest flaw. Many blog posts — scratch that, entire websites — have been created to try and mitigate the difficulty of finding what you need on npm. Everyone has an idea about how to make it easier to find needles in the haystack, but no-one bothers to ask what all this hay is doing here in the first place.

T. R. Reid: The Chip, p 186

The slide rule was a simple instrument (at least, if you knew how logarithms work). To hear engineers tell it, that simplicity was part of its appeal. "It has a sort of honesty about it," Jack Kilby told me one day, reaching into the drawer of his desk and pulling out his old K & E. "With the slide rule, there're no hidden parts. There's no black box. There's nothing going on that isn't right there on the table."

To put it another way, the slide rule was not threatening. No one ever called the slide rule a "mechanical brain". Nobody ever declared that the slide rule was endowed with something called artificial intelligence. There were no movies about runaway slide rules named HAL seizing control of the spaceship or plotting to dominate mankind.

The slide rule, hanging at the ready from the belts of Fermi, Wigner, and Wernher von Braun, helped men create the first nuclear chain reaction and send rockets to the stratosphere. But the rule was always recognized as nothing more than a tool. It had no more "intelligence" than the yardstick or a screwdriver or any other familiar tool that extends human power.

Ivar Ekeland: The Best of All Possible Worlds, p 8

That day in the Pisa Duomo, Galileo sees the opposite: back and forth swings the great lantern, back and forth... Why should [motionlessness] be more natural than this symmetric motion, back and forth, back and forth, with a majestic regularity? What is there to prevent it going on forever? ...

Galileo's theory of the pendulum -- and we may use that word as the ancient Greeks did, equating theory to vision, because Galileo actually saw it that day in the cathedral, and all his subsequent work was to remember and understand what he had seen...

Ivar Ekeland: The Best of All Possible Worlds, p 85

Both these properties, predictability and stability, are special to integrable systems... Since classical mechanics has dealt exclusively with integrable systems for so many years, we have been left with wrong ideas about causality. The mathematical truth, coming from non-integrable systems, is that everything is the cause of everything else: to predict what will happen tomorrow, we must take into account everything that is happening today.

Except in very special cases, there is no clear-cut "causality chain," relating successive events, where each one is the (only) cause of the next in line. Integrable systems are such special cases, and they have led to a view of the world as a juxtaposition of causal chains, running parallel to each other with little or no interference.

Christopher Alexander: A City Is Not a Tree

Now, why is it that so many designers have conceived cities as trees when the natural structure is in every case a semilattice? Have they done so deliberately, in the belief that a tree structure will serve the people of the city better? Or have they done it because they cannot help it, because they are trapped by a mental habit, perhaps even trapped by the way the mind works -- because they cannot encompass the complexity of a semilattice in any convenient mental form, because the mind has an overwhelming predisposition to see trees wherever it looks and cannot escape the tree conception?

I shall try to convince you that it is for this second reason that trees are being proposed and built as cities -- that is, because designers, limited as they must be by the capacity of the mind to form intuitively accessible structures, cannot achieve the complexity of the semilattice in a single mental act.

Caroline Rose: Bouncing Balls

I should add that at Tymshare, in a group acquired from SRI, I’d worked for a while under Doug Engelbart and used a rather complicated mouse device he’d invented. It had struck me as an interesting experiment but a bit gimmicky. I’d also played Hangman on a computer at Xerox PARC with a friend who worked there; the graphics seemed pretty cool, but I thought of it only as a game/toy — otherwise why wouldn’t Xerox have been attempting to market it for more serious purposes? Much earlier I had used electronic messaging on the ARPAnet, mainly to communicate with fellow workers in the building, and at first thought it a silly substitute for walking down the hall to talk to someone in person. So you can see that the light about the potential for such innovations dawned very slowly on me.

Robert Garner: Xerox Star 8010 Final Demo

I was designing the [Ethernet controller board at Xerox]... The target was 20 Mb/s, and I designed the board for a predecessor machine called the D0, or the Dolphin. I laid out the board, and it didn't fit. So I went to Bob Metcalfe and I said, "You know, Bob, I can't get the chips to fit at 20 MHz." I opened up a parts catalog, and there was a Fairchild 10 MHz CRC chip. I went, tail between my legs, to Bob, and said, "If we run at 10 MHz, this all fits." That's why the Ethernet runs at 10 Mb/s.

Frank Ludolph: Demo at CHI '98 of Apple's Lisa Computer and Interface

Fred Brooks, in The Mythical Man-Month, said, "Be prepared to throw one away." Those of us who were working on the Lisa didn't realize we were working on the throwaway.

However, the Mac did pick up our interaction style and at least picked up the metaphor. But at each of these stages that you've heard about today -- Augment, Star, and the Lisa -- there were things that got dropped in the process of making the product something that could be afforded. I think it's worthwhile to occasionally go back and look at the systems. Normally what got carried forward was the visual stuff. What got dropped was the stuff that was hard to see.

Jay Forrester: talk at the Computer Museum in Boston, June 1980

[Whirlwind] started as an analog computer for determining the stability and control characteristics of large aircraft... It wasn't to train [pilots] for a known aircraft, but to take the equations of motion and the physical dimensions of an airplane and anticipate what its behavior would be, since at that time, many airplanes were being built that did not control properly and did not behave well after they were finished...

We took about a year to decide that any analog computer big enough to do that job would end up presenting you with a solution of its own idiosyncrasies rather than a solution of the problem at hand.

Scott Kim: Viewpoint: Toward a computer for Visual Thinkers

Twenty-five years after the invention of text editors, there are still no books or courses on text editor design. The lack is especially striking given that most of the use of personal computers is text editing. Computer science students design compilers to learn principles of programming; why not design text editors to learn principles of user interfaces?

Severo Ornstein: Initial Demo of the Mockingbird Composer's tool at Xerox PARC Forum

I went around and talked with a number of composers to see if the problems they were facing [capturing musical ideas composed at a piano] were the same. Most of them said that they themselves write using the piano as an instrument, but that nobody else does.

Theodore Roszak: From Satori to Silicon Valley

Does this mean that the reversionary wing of the movement was simply a light that failed? In one sense, obviously yes. The urban-industrial dominance is more tightly locked to the planet than ever; the search for viable alternatives has gone into a deep eclipse.

But a light that fails is still better than unchallenged darkness. For besides winning, there is also being right. And on another level where the historical clock measures out its story in millennia not minutes, the reversionaries may be regarded as prophetical voices that, though largely unheeded, spoke truth to power.

Barack Obama: interview with Doris Kearns Goodwin

Q: Do you ever wish you had been president in another era? Suppose you’d been around in Lincoln’s time, when your written word would be pamphletized, when everybody would be reading the entire speech and they’d be talking to each other about it. And Teddy Roosevelt was right for the era when punchy language worked. F.D.R. was perfect for conversational style on radio, J.F.K. and Reagan for the big TV networks. And you’re governing in the age of the Internet, with its divergent voices and sound bites.

A: It’s an interesting question. As I said earlier, there is a big part of me that has a writer’s sensibility. And so that’s how I think. That’s how I pursue truth. That’s how I hope to communicate truth to people. And I know that’s not how it is always received. Because it gets chopped up. Or if it’s too long, then it’s dismissed as being professorial, or abstract, or long-winded.

Alan Kay: The 40th Anniversary of the Dynabook

In a way, ideas only count for a little in computing, because you kind of have to implement the stuff. This is the part of the story that really makes me clutch at my throat, because every time you implement something, five years go away. You do learn something.

Chuck Thacker: The 40th Anniversary of the Dynabook

Interviewer: [PARC] was kind of a middle ground between academic research and product development research.

Well, no, we were much, much more like academics than like product developers. We really did not think of these things as products. What we were doing, the way I characterize it, was we were spending a lot of money to simulate the future. That's not the way you build a product.

Carver Mead: Caltech EE Centennial

What was there about [Caltech's EE department] that led to Caltech [alumni] being there when there's these big paradigm shifts? I would propose that the biggest gift that Royal Sorensen gave to Caltech is that he had a different view of engineering than other schools did. Engineering wasn't something you studied and learned and memorized and knew where to look up. Engineering was understanding things all the way to the bottom, no matter what field they were called, and being able to use that to build stuff and make it work.

MasterPip: Garbage men/women of Reddit, what do people do with their trash that frustrates you the most?

[Trash collectors saw] a noticeable decrease in tips when we switched to sideloaders. Why? Because customers think we sit on our ass all day being lazy and not out physically doing the work. I come home just as exhausted as I was when I was physically doing the work. The reason is that now it's all mental, and very frustrating work. My mind is going a mile a minute. I'm scanning my mirrors and cameras at all times, looking for overheads, grabbing cans, watching out for dogs, kids who weren't taught to stay away from the 20 ton truck, shitty drivers, and old people who don't give a fuck if they get run over because they think they have the right of way because they managed to live this long.

Shirley Manson: interview

People at record companies live in fear of being wrong. Music cannot thrive in that environment. It is an unruly art form. You can’t keep treating it like sausage meat. You have to let it morph and move and breathe.

Walter Ong: Orality and Literacy, p 67

Words and objects are never totally disjunct: words represent objects, and perception of objects is in part conditioned by the store of words into which perceptions are nested. Nature states no 'facts': these come only within statements devised by human beings to refer to the seamless web of actuality around them.

Walter Ong: Orality and Literacy, p 93

In ancient Greek culture Havelock discovers a general pattern of restricted literacy applicable to many other cultures: shortly after the introduction of writing, a 'craft literacy' develops. At this stage writing is a trade practiced by craftsmen, whom others hire to write a letter or document as they might hire a stone-mason to build a house, or a shipwright to build a boat. Such was the state of affairs in West African kingdoms, such as Mali, from the Middle Ages into the twentieth century.

At such a craft literacy stage, there is no need for an individual to know reading and writing any more than any other trade. Only around Plato's time in ancient Greece, more than three centuries after the introduction of the Greek alphabet, was this stage transcended when writing was finally diffused through the Greek population and interiorized enough to affect thought processes generally.

Mark Wilson: This $1,500 Toaster Oven Is Everything That's Wrong With Silicon Valley Design

This salmon had become more distracting to babysit than if I’d just cooked it on my own. This salmon had become a metaphor for Silicon Valley itself. Automated yet distracting. Boastful yet mediocre. Confident yet wrong. Most of all, the June is a product built less for you, the user, and more for its own ever-impending perfection as a platform. When you cook salmon wrong, you learn about cooking it right. When the June cooks salmon wrong, its findings are uploaded, aggregated, and averaged into a June database that you hope will allow all June ovens to get it right the next time. Good thing the firmware updates are installed automatically. ...

June is taking something important away from the cooking process: the home cook’s ability to observe and learn. The sizzle of a steak on a pan will tell you if it’s hot enough. The smell will tell you when it starts to brown. These are soft skills that we gain through practice over time. June eliminates this self-education. Instead of teaching ourselves to cook, we’re teaching a machine to cook. And while that might make a product more valuable in the long term for a greater number of users, it’s inherently less valuable to us as individuals, if for no other reason than that even in the best-case scenarios of machine learning, we all have individual tastes. And what averages out across millions of people may end up tasting pretty . . . average.

Cooking has always been a highly personal, multi-sensory experience, where trial and error is the only way to become the all-star cook most of us know as grandma. But as I put the salmon on the table 40 minutes later than projected, I had no idea what I should have done differently, other than to never have used June in the first place.

Barack Obama: interview with Destin Sandlin

I've learned to be pretty good at listening carefully to people who know a lot more than I do about a topic, and making sure that any dissenting voices are in the room at the same time. So, there's an initial presentation, and I will make sure I hear from everybody in the room. Does anybody disagree with the baseline facts? Is there any evidence that's inconsistent with what was just said? And if there is, then I want to hear that argument in front of me. And then what I'm pretty good at is asking questions, poking, prodding, testing propositions, seeing if they hold up.

Conal Elliott: The C language is purely functional

Sadly – and here is the real intent behind my post – many Haskell programmers believe that IO is necessary to do “real programming”, and they use Haskell as if it were C (relegating lots of work to IO). In other words, monadic IO has proved to be such a comfortable “solution” to I/O in a functional language, that very few folks are still searching for a genuinely (not merely technically) functional solution.

Before monadic IO, there was a lot of vibrant and imaginative work on functional I/O. It hadn’t arrived yet, but was still in touch with the Spirit of functional programming. With the invention and acceptance of monadic imperative programming, it’s like the Haskell community wandered into an opium den and are still lying there in a fog.

Tom Lehrer: interview

[Sadly, though, Lehrer is of the opinion that while satire may attract attention to an issue, it doesn't achieve a lot else.] The audience usually has to be with you, I'm afraid. I always regarded myself as not even preaching to the converted, I was titillating the converted. The audiences like to think that satire is doing something. But, in fact, it is mostly to leave themselves satisfied. Satisfied rather than angry, which is what they should be.

Matt Gaffney: 50.Qh6+!! The Move That Won Magnus Carlsen the World Chess Championship

In Game 10, Carlsen finally got his first win. It was a slow grind, the kind he’s famous for — not the most entertaining chess to watch, but effective, an illustration of the default style of millennials raised not so much on romantic notions of beautiful checkmates but on the cold assessments of their ubiquitous chess engines.

Nick Bilton: Silicon Valley Meets Its Biggest Creation

Six years ago... I was in a room with the engineers who would eventually build 3-D printers. While we sat there amid circuit boards and melting plastic, I asked the excited inventors what they would make with their new concoction. Their ideas were so cute and innocent, like customized iPhone cases and plastic wall hooks. Yet what was the first thing that people built once they got their hands on a 3-D printer? Guns. (Then came 3-D-printed bullets and semi-automatic machine guns.)

The same thing happened with Twitter. The product was conceived to organize a night out with your friends at, say, a rave. A decade later and we have a President-elect who lies, attacks people, and intentionally provokes outrage in 140-character bursts aimed directly at his 17.3 million followers.

If the tech elite had no idea how their innocent products could be undermined, then now is their opportunity to pause and think about the implications of their actions on the future. As companies in Silicon Valley build robots that can run as fast as a cheetah, fleets of cars and trucks that can drive themselves, artificial intelligence agents that can predict weather patterns and respond to global market changes, and flocks of drones that will deliver our packages, maybe it’s time to put more effort into thinking about how to prevent calamitous events from occurring on a larger scale.

Hafu: Real Future: A Female eSports Champion Speaks Out About Harassment

Interviewer: You're literally inviting people into your bedroom 14 hours a day. They must feel very intimately connected to you.

Yeah, they know to press where it hurts. And when you show where it hurts, people will just keep pressing.

Ian Bogost: Obama Was Too Good at Social Media

But these projects also affirm the dark underbelly of the social media era. The compression of complex ideas into tweetable sound-bites. The victory of sentiment and affect over reason and fact on the internet. The belief that large information archives can produce knowledge of the present, and of history, by exalting data correlation over all other methods of knowledge production. The tendency to privilege technological discourse over all other topics. The celebration of quick-draw contests and hackathons, pursued with an entrepreneur’s short-term attention, as the ultimate means of invention. The silent privileging of those with the time and resources to jump at the invitation to work for free, on no notice, over a period of weeks to curry favor and attention. ...

As Obama leaves office, the digital tools he quietly celebrated have also hollowed out American life. Surveillance capitalism has made data extraction, aggregation, resale, and speculation the hidden engine of wealth and progress (for those few fortunate enough to pursue rather than to be pursued by it). The ability to create and widely disseminate information as credible and accurate, no matter its relationship to reality. The obsession with immediacy and attention over longevity and conviction. The consolidation of media and information, particularly local media, in the hands of a few large companies with limited commitment to civic good. While the first social media presidency was busy tweeting and Snapchatting, supposedly for public engagement, it did precious little to address the impacts of these and other effects of technology on the American public as matters of public policy.

Michael Crowe: A History of Vector Analysis: The Evolution of the Idea of a Vectorial System, p 33

The rapidity with which new systems of ideas (as opposed to new experimental or technical results) are received is often exaggerated. Non-Euclidean geometry, Boolean algebra, Maxwell's theory, and even the calculus were only gradually appreciated and assimilated into scientific thought. Moreover complex numbers, the Grassmannian system, and the Gibbs-Heaviside (or modern) system of vector analysis were all slowly received.

When Hamilton and the young Tait express their surprise that the quaternion system was only slowly being appreciated, their statements are to be taken with caution, for they express a lament that is nearly universal among the proponents of new systems of ideas.

Adam Cadre: Election logorrhea

But in a country where we do pick our leaders based on who we "like", I suppose it is worth a look at why the voters found Hillary Clinton so goshdarn unlikeable... Some foresaw how this election would unfold years in advance.  I spent most of 2011 working as a junior screenwriter on a superhero movie... The head writer chose the plot elements, and there was one wrinkle I didn't quite get at the time... The setup is that our heroine has discovered that she has superpowers, inadvertently goes public by intervening in a local emergency, and becomes a media sensation.  At which point the villain, who has an identical power set, engineers an international crisis that allows him to heroically intervene — and declare that he's come to our world to apprehend our heroine, who is actually an interdimensional outlaw.  At which point the public automatically lines up behind the man. 

That much made sense to me: a big part of the reason I had agreed to devote a year of my life to this movie was that it seemed like a unique opportunity to shine a light on the sexism that even in our supposedly enlightened age continues to poison society.  We wanted to depict how women have to be twice as accomplished to be taken half as seriously, even by other women.  But then it came time to write a scene illustrating the public reaction to this controversy.  I thought it best to be subtle — no over-the-top misogynistic invective, but instead sound bites like "he seems more like what a superhero should be" and "I feel safer knowing he's the one protecting the planet" and "he won't just give the bad guys a time-out, he'll kick their ass!". 

But the head writer seemed to want to take subtlety even further.  He had a very precise idea of what people should say.  "I just don't trust her."  "There's just something I don't like about her."  I thought these sorts of lines were too vague and blunted the point we were trying to make, and I went along with them reluctantly.  But sure enough, five years passed and it was exactly the head writer's lines that I heard coming out of the mouths of people trying to explain their disdain for Hillary Clinton.

Seymour Papert: Computer Criticism vs. Technocentric Thinking

Technocentrism refers to the tendency to give a [centrality] to a technical object -- for example computers or LOGO. This tendency shows up in questions like "what is the effect of the computer on cognitive development?" or "does LOGO work?" ... Such turns of phrase often betray a tendency to think of "computers" and of "LOGO" as agents that act directly on thinking and learning; they betray a tendency to reduce what are really the most important components of educational situations -- people and cultures -- to a secondary, facilitating role.

The context for human development is always a culture, never an isolated technology. In the presence of computers, cultures might change and with them people's ways of learning and thinking. But if you want to understand (or influence) the change, you have to center your attention on the culture -- not on the computer.

J. Willard Gibbs: Quaternions and the Algebra of Vectors, quoted in Crowe’s History of Vector Analysis, p 199

There are two ways in which we may measure the progress of any reform. The one consists in counting those who have adopted the shibboleth of the reformers; the other measure is the degree in which the community is imbued with the essential principles of the reform.

Ben Watt: interview in the Guardian

Eden was recorded with that fierce, adolescent spirit that everybody had at that time. Self-awareness is a dangerous thing: by about the third or fourth record, people were throwing comparisons at us and you have to be very tough to withstand it. And by the end of the 90s, we were playing to 5,000 people a night. I'd stand on stage, looking out, thinking, "I don't want to be this big."

Graham Nelson: Prompter: A Domain-Specific Language for Versu

That’s what one group of programming-language designers do, but it’s not what I do. My own best-known language, Inform, has never placed higher than number 73 in popularity charts. For comparison, the 73rd most popular human language is Norwegian: but like Norwegian, which of course ranks as number 1 in Norway, Inform is valued within its own domain. It serves a community with specific needs which would otherwise be difficult to meet. And that’s the other reason to design a programming language: when there’s an entirely new domain to work in, one where conventional languages just won’t do.

Kevin Drum: Lead: America's Real Criminal Element

Experts often suggest that crime resembles an epidemic. But what kind? Karl Smith, a professor of public economics and government at the University of North Carolina-Chapel Hill, has a good rule of thumb for categorizing epidemics: If it spreads along lines of communication, he says, the cause is information. Think Bieber Fever. If it travels along major transportation routes, the cause is microbial. Think influenza. If it spreads out like a fan, the cause is an insect. Think malaria. But if it's everywhere, all at once — as both the rise of crime in the '60s and '70s and the fall of crime in the '90s seemed to be — the cause is a molecule.

Scott Kim: Viewpoint demo

The point of Viewpoint is not text, graphics, editing, or even visibility. Viewpoint challenges a deep belief in computer science, that the pixels on the screen are mere shadows of real data structures. Only by treating the screen itself as a first-class citizen will we be able to build computers that are truly for visual thinkers.

James Somers: Torching the Modern-Day Library of Alexandria

What became known as the Google Books Search Amended Settlement Agreement came to 165 pages and more than a dozen appendices. It took two and a half years to hammer out the details. Sarnoff described the negotiations as “four-dimensional chess” between the authors, publishers, libraries, and Google. “Everyone involved,” he said to me, “and I mean everyone—on all sides of this issue—thought that if we were going to get this through, this would be the single most important thing they did in their careers.”

Joan Didion: Slouching Towards Bethlehem

Of course the activists -- not those whose thinking had become rigid, but those whose approach to revolution was imaginatively anarchic -- had long ago grasped the reality which still eluded the press: we were seeing something important. We were seeing the desperate attempt of a handful of pathetically unequipped children to create a community in a social vacuum. Once we had seen these children, we could no longer overlook the vacuum, no longer pretend that the society’s atomization could be reversed.

This was not a traditional generational rebellion. At some point between 1945 and 1967 we had somehow neglected to tell these children the rules of the game we happened to be playing. Maybe we had stopped believing in the rules ourselves, maybe we were having a failure of nerve about the game. Maybe there were just too few people around to do the telling. These were children who grew up cut loose from the web of cousins and great-aunts and family doctors and lifelong neighbors who had traditionally suggested and enforced the society’s values. They are children who have moved around a lot, San Jose, Chula Vista, here. They are less in rebellion against the society than ignorant of it, able only to feed back certain of its most publicized self-doubts, Vietnam, Saran-Wrap, diet pills, the Bomb.

They feed back exactly what is given them. Because they do not believe in words... their only proficient vocabulary is in the society's platitudes. As it happens I am still committed to the idea that the ability to think for one's self depends upon one's mastery of the language, and I am not optimistic about children who will settle for saying, to indicate that their mother and father do not live together, that they come from "a broken home."

Viznut: Bringing magic back to technology

The magic we need more in today's technological world is of the latter kind. We should strive to increase deepness rather than outward complexity, human virtuosity rather than consumerism, flexibility rather than effortlessness. The mysteries should invite attempts at understanding and exploitation rather than blind reliance or worship; this is also the key difference between esoterica and superstition.

Mills Baker: Genera

I worry deeply that our systematizing is inevitable because when we are online we are in public: that these fora mandate performance, and worse, the kind of performance that asserts its naturalness, like the grotesquely beautiful actor who says, “Oh, me? I just roll out of bed in the morning and wear whatever I find lying about” as he smiles a smile so practiced it could calibrate the atomic clock. Every online utterance is an angling for approval; we write in the style of speeches: exhorting an audience, haranguing enemies, lauding the choir. People “remind” no one in particular of the correct ways to think, the correct opinions to hold. When I see us speaking like op-ed columnists, I feel embarrassed: it is like watching a lunatic relative address passers-by using the “royal we,” and, I feel, it is pitifully imitative. Whom are we imitating? Those who live in public: politicians, celebrities, “personalities.”

There is no honesty without privacy, and privacy is not being forbidden so much as rendered irrelevant; privacy is an invented concept, after all, and like all inventions must contend with waves of successive technologies or be made obsolete.

Nick Bilton: All Is Fair in Love and Twitter

[On the origins of Twitter:] As he listened to Dorsey talk, Glass would later recall, he stared out the window, thinking about his failing marriage and how alone he felt. Then he had an epiphany. This status thing wasn’t just about sharing what kind of music you were listening to or where you were, he thought. It could be a conversation. It wasn’t about reporting; it was about connecting. There could be a real business in that. He would certainly like such a service: his nights alone in his apartment, alone in his office, alone in his car, could feel less alone with a steady stream of conversation percolating online.

William Howarth: Reading Thoreau at 200

If Thoreau as American eco-hero peaked around the first Earth Day (1970), today he is derided ... by postmodern thinkers for whom nature is a suspect green blur. (I still recall one faculty meeting at which a tenured English professor dismissed DNA as all right, "if you believe in that sort of thing.")

Debbie Chachra: Gratitude for Invisible Systems

When we think about caring for our neighbors, we think about local churches, and charities -- systems embedded in our communities. But I see these [infrastructural and public safety] technological systems as one of the main ways that we take care of each other at scale. It’s how Americans care for all three hundred million of our neighbors, rich or poor, spread over four million square miles, embedded in global supply chains. ...

If I were to make a suggestion for how technology could be used to improve our democracy, I would want to make these systems more visible, understandable, and valued by the general public. Perhaps a place to start is with the system that is the ultimate commons -- our shared planet.

Mark Ferrari: 8 Bit & '8 Bitish' Graphics - Outside the Box

It's not so much that I expect you to run out and do palette-shift or color-cycling effects like this. It's just there was so much more to do with a palette that you could control color-by-color... There was so much more you could do because the environment was small enough that you could actually think about it that way.

And I think that that is probably part of the reason why 8-bit and 8-bit-ish art is still so relevant today -- because it was a mentally and creatively manageable space to work in. I'm not sure some of our other, much more cinematographic and three-dimensional things are the same.

Bob Taylor: interview at Computer History Museum

Q: What is the key to creating a lab that does groundbreaking research and ships products? Can a lab do both, and if so, how?

A: [chuckles] No, a lab cannot ship products. A lab can ship technology to a group whose business is to ship products. If the lab thinks it's shipping products, it's not doing research.

Bob Johnstone: Brilliant!: Shuji Nakamura And the Revolution in Lighting Technology, p 53

Early laser research was curiosity driven, in an era when big corporations like AT&T, GE, IBM, RCA, and Westinghouse were able to recruit the cream of the scientific crop. They lured scientists and engineers, not with stock options -- an incentive that in 1962 had yet to be thought up -- but with hefty salaries and freedom to pursue their instincts, wherever they led, unconstrained by commercial pressures.

For researchers in the 1960s, the central labs of corporations were much like the universities where they would normally have worked. Corporate scientists published papers, attended conferences, even took sabbatical years just like professors. Today, when research horizons in the corporate sector are mostly measured in months, and blue-sky research is once again the exclusive domain of academe, such freedom seems very old-fashioned. Back then, however, it was the norm.

Bob Johnstone: Brilliant!: Shuji Nakamura And the Revolution in Lighting Technology, p 107

Modifying the equipment was the key to his success... For the first three months after he began his experiments, Shuji tried making minor adjustments to the machine. It was frustrating work... Nakamura eventually concluded that he was going to have to make major changes to the system. Once again he would have to become a tradesman, or rather, tradesmen: plumber, welder, electrician -- whatever it took. He rolled up his sleeves, took the equipment apart, then put it back together exactly the way he wanted it. ...

Elite researchers at big firms prefer not to dirty their hands monkeying with the plumbing: that is what technicians are paid for. If at all possible, most MOCVD researchers would rather not modify their equipment. When modification is unavoidable, they often have to ask the manufacturer to do it for them. That typically means having to wait for several months before they can try out a new idea.

The ability to remodel his reactor himself thus gave Nakamura a huge competitive advantage. There was nothing stopping him; he could work as fast as he wanted. His motto was: Remodel in the morning, experiment in the afternoon. ...

Previously he had served a ten-year self-taught apprenticeship in growing LEDs. Now he had rebuilt a reactor with his own hands. This experience gave him an intimate knowledge of the hardware that none of his rivals could match. Almost immediately, Nakamura was able to grow better films of gallium nitride than anyone had ever produced before.

Italo Calvino: If on a Winter's Night a Traveler, p 176

I read in a book that the objectivity of thought can be expressed using the verb "to think" in the impersonal third person: saying not "I think" but "it thinks" as we say "it rains." There is thought in the universe -- this is the constant from which we must set out every time.

Hermann Hesse: Siddhartha

"But if you don't mind me asking: being without possessions, what would you like to give?"

"Everyone gives what he has. The warrior gives strength, the merchant gives merchandise, the teacher teachings, the farmer rice, the fisher fish."

"Yes indeed. And what is it now what you've got to give? What is it that you've learned, what you're able to do?"

"I can think. I can wait. I can fast."

"That's everything?"

"I believe, that's everything!"

"And what's the use of that? For example, the fasting-- what is it good for?"

"It is very good, sir. When a person has nothing to eat, fasting is the smartest thing he could do. When, for example, Siddhartha hadn't learned to fast, he would have to accept any kind of service before this day is up, whether it may be with you or wherever, because hunger would force him to do so. But like this, Siddhartha can wait calmly, he knows no impatience, he knows no emergency, for a long time he can allow hunger to besiege him and can laugh about it. This, sir, is what fasting is good for."

Gretchen Bakke: The Grid: The Fraying Wires Between Americans and Our Energy Future, p 89

By the time Carter took office, every home in America was its own miraculous technological node, built into a complexly woven support net of wires and pipes and ductwork. By 1976 everyone in America who wanted it had electricity, indoor plumbing, central heating, a refrigerator, and a phone...

Living in these homes and laboring in these workplaces changed us. It only took a generation after the end of the Depression for Americans to become consummately modern individuals, until as a nation we had lost working knowledge of a coal brazier, a kerosene lamp, a latrine, an ice box, a well, a mangler, or anything else more complicated than a switch, a button, an outlet, a socket, a tap, or a flusher.

And yet, it was also the case that almost no one had any idea how the replacement technology (a coal-burning power plant for a brazier, or a sewer treatment plant for an outhouse, or water purification plant for a well) worked. ... As long as the switches... worked, the benefit of all the rest having become distant undertakings running along long wires and through long pipes was that they were no longer our immediate concern.

Thus did America lose one kind of knowing -- that involved in managing a low-tech household -- without gaining another kind of knowing -- that of the distant complexity undergirding a high-tech household. ... One result of this was that we got to think about other things, do other things, and live longer, less disease-riddled lives. The problem was that all of these systems beyond the scope of daily understanding had consequences, and by the 1970s, these consequences had become almost impossible to ignore.

Gretchen Bakke: The Grid: The Fraying Wires Between Americans and Our Energy Future, p 106

America's first [wind] turbine engineers were aeronautical engineers who had opted out of working for Vietnam-era helicopter companies. As a result they designed their turbines with floppy flexible blades based on the aerodynamics of helicopters. It turns out that the blades you need on a helicopter are the exact opposite of the ones that make for a successful wind turbine. ...

Since the helicopter guys had the wrong theory, once they got to the trial and error phase of things they never got very good results without really understanding why. Their machines couldn't be tinkered into efficacy. ...

If Americans were mostly trained as engineers, the Danes were former blacksmiths. "They had a totally different relationship with metal," Cashman explains. "They spent their time fixing large-scale farm equipment, so machinery was their model... Their second to third prototypes were good. They could just manufacture them because they were all done on the right principles."

Dick Pashley: Flashback -- a Story of Flash Memory

Q: How did [Intel's] senior management finally get convinced to work on flash?

When we went up to the memory components division... we made this presentation, and we were actually shocked. We thought we had a memory that was ideal, and the response was basically sour grapes. We were told that Toshiba failed, so why should we invest in this? And the other reason was, ironically, it would cannibalize Intel's EPROM business.

And we're scratching our heads, going, "Isn't it better that we cannibalize our own business than someone else?" But we didn't say that. We went back to Santa Clara, and I set up a luncheon with Gordon Moore... And during [our] presentation, Gordon made a very astute observation. He said, "Flash is a disruptive technology." And clearly he understood the potential of what flash could offer.

About two weeks later, I got a phone call and I was asked if I wanted to leave the technical side of Intel and basically do a startup... I think Gordon deserves most of the credit for at least allowing the people at Intel to work on flash and turn it into a product.

Davey Wreden: Game of the Year

Each new receipt of [Game of the Year] does little more than set the new standard of my own satisfaction.

The most significant and direct outcome of being nominated is that I become more deeply unsettled by being omitted on subsequent lists.

Hayao Miyazaki: project proposal for "The Wind Rises"

I want to portray a devoted individual who pursued his dream head on. Dreams possess an element of madness, and such poison must not be concealed. Yearning for something too beautiful can ruin you. Swaying towards beauty may come at a price. Jiro [Horikoshi] will be battered and defeated, his design career cut short. Nonetheless, Jiro was an individual of preeminent originality and talent. This is what we will strive to portray in this film.

Paul Lewis: 'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia

McNamee chooses his words carefully. "The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences," he says. "The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models."

Jacob "Kujo" Lyons: interview

Because everybody watches the same videos online, everybody ends up looking very similar. The differences between individual b-boys, between crews, between cities/states/countries/continents, have largely disappeared. It used to be that you could tell what city a b-boy was from by the way he danced. Not anymore. But I've been saying these things for almost a decade, and most people don't listen, but continue watching the same videos and dancing the same way. It's what I call the "international style," or the "Youtube style".

Avery Pennarun: Highlights on "quality"

This [book] was very good, quite short, and easy to read, like Einstein's Relativity was easy to read. And you know it works, because his ideas were copied over and over, poorly, eventually into unrecognizability, ineffectiveness, and confusion, the way only the best ideas are.

Carver Mead: Oral History

[Being an undergrad at Caltech] was very hard for me. I worked very hard. I wasn’t as quick as the other students, because I kept trying to really understand. I didn’t want to just figure out how to [do things], because I didn’t come here for that. I came here because I really wanted to understand this stuff, and that’s much harder than just learning how to do things.

Tycho: The Dumbest Timeline

They made a kind of monster machine, with every possible lever thrown towards a caustic narcissism, and then they pretend to be fucking surprised when an unbroken stream of monsters emerges.

Fred Turner: interview

How people do sexuality has changed enormously with the introduction of new media. My wife and I have been married for thirty-plus years. When we were courting, we wrote beautiful handwritten letters on blue paper and mailed them long distance. You’d wait weeks for them. You’d fill in every little gap of the page. Now, we FaceTime. There's no withheld gratification.

Romance of the kind that I grew up with was something that took time. It required restraining your desires. It required thinking about another person. I mean, one of the most erotic things you can do with a person is think about them, right? Just think about them. That's different in a world where you can press a button and their face appears.

George W.S. Trow: Within the Context of No Context, p 62

The problem comes with a lie... The lie is in this -- that the teen-age alcoholic suffers from a problem in the foreground, a problem within a context, liable to solution within the frame of the context, subject to powers of arrangement near to the hand of the organizing power of the context.

The reality is this -- that the problem is the only context available to the people in the problem.

Shinichi Suzuki: Ability Development from Age Zero

We must think deeply about the fact that Casals, at ninety-one years old, even now practices the cello two hours every day so as not to be stagnant at even his high level of ability.

Even if someone becomes a fine person who does great works, he is not so exalted that he does not need to study. Rather the opposite is true, because he finds more and more problems to study and he has the will to grow higher and higher. He is in a world so advanced that we cannot even imitate him, but we learn that people train themselves more and reach for truer beauty.

Jonathan Blow: interview with Brian Moriarty, 1:54

[Brian Moriarty] I tell my students that as designers they should begin with some idea of their target audience, who they're designing for, and the kind of experience that they're trying to create for that person. Who are you designing for, what kind of experience were you trying to produce, did you achieve it, and how do you know?

For me, those two things -- the who it's for and what the experience is -- is really sort of just one thing. I have some picture of what I want the experience to be.... But then when it comes to who it's for, the answer is: the person who would appreciate that experience that I just outlined.

Alan Kay: talk at ATG System Software Seminar (1991)

One of the major distinctions between expert planning and novice planning is that novices plan (when they do plan) with the bricks. They try to make plans in terms of the bricks, and experts almost never do.

John Cleese: discussion about Life of Brian, 4:50

It's about closed systems of thought, whether they're political or theological or religious or whatever. Systems by which whatever evidence is given to the person he merely adapts it, fits it into his ideology. You show the same event to a Marxist and a Catholic, for example, both of them find they have explanations of it. It's what, to be pompous, Popper's on about with falsifiability of theories. Once you've got an idea that is whirring around so fast that no other light or contrary evidence can come in, then I think it's very dangerous.

Jerome Bruner: interview on Inside the Psychologist's Studio

Q: To what extent did you conceive of what you were doing as revolutionary?

A: The word never crossed my mind. It's sensible, that's not revolutionary. Who needs revolutionary?

Van Jacobson: NDN -- Why Bother?

NDN is not very "layerist", and that's personal philosophy but it's shared by other parties. Dave Clark and I... twenty years back, we wrote a paper on application-layer framing which is, basically, what you should put in the network is just muxing and demuxing and packets. Everything else should go in the application, and the fact that it wasn't in the application [in TCP/IP] is more an accident of history than an architectural choice.

It was actually intended to be in the application, [but] there were some issues with Multics -- the initial DARPA-funded implementation -- that caused it to get pushed into the kernel. The reason is when a packet came off the wire you had to deliver it to something, but most of the applications in Multics spent their entire life swapped out, because there wasn't enough memory on that time-sharing system, because memory was obviously expensive. And because of that, you wanted the receiver for the packets to be sitting somewhere that was always running. And because the communication lines cost way more than the computers, you wanted to keep the communication lines busy. They were leased lines and if you weren't using them, it was money going up in smoke. So you wanted to look at the packets coming in and say, "Is this okay? Am I missing something? Was it corrupted?"

So that caused a lot of protocol processing to migrate from the app into the kernel. It wasn't architectural.

In NDN, everything's in the app -- it has to be, because [packets are signed.] ... [Semantics live] in a library in the application, because some of us believe that's the one true way that undoes this accident of history that constrained IP.

Eric Havelock: The Literate Revolution in Greece and Its Cultural Consequences, p 88

The alphabet therewith made possible the production of novel or unexpected statement, previously unfamiliar and even "unthought." The advance of knowledge, both humane and scientific, depends upon the human ability to think about something unexpected -- a "new idea," as we loosely but conveniently say. Such novel thought only achieves completed existence when it becomes novel statement, and a novel statement cannot realize its potential until it can be preserved for further use.

Previous transcription, because of the ambiguities of the script, discouraged attempts to record novel statements. This indirectly discouraged the attempt to frame them even orally, for what use were they likely to be, or what influence were they likely to have, if confined within the ephemeral range of casual vernacular conversation? The alphabet, by encouraging the production of unfamiliar statement, stimulated the thinking of novel thought, which could lie around in inscribed form, be recognized, be read and re-read, and so spread its influence among readers.

It is no accident that the pre-alphabetic cultures of the world were also in a large sense the pre-scientific cultures, pre-philosophical and pre-literary. The power of novel statement is not restricted to the arrangement of scientific observation. It covers the gamut of the human experience. There were new inventible ways of speaking about human life, and therefore of thinking about it, which became slowly possible for man only when they became inscribed and preservable and extendable in the alphabetic literatures of Europe.

Anthony Siegman: oral history

And it's optics that really makes all of that possible. And the advent of the laser, besides enabling the [internet via fiber optics], enables us to just do so many kinds of tasks... incomparable scientific measurements and incomparable engineering...

It is very interesting that lasers as an economic market are a very minor one -- the worldwide market for lasers is a few billion dollars total. And that amount might buy you one semiconductor fabrication plant, a few billion dollars, you see. But the economic impact of lasers on daily life, on engineering... is just immense. It's just beyond compare...

The Optical Society is a small society by many standards, for example compared to IEEE, which has 150,000 members or something. But it has a bright future.

Severo Ornstein: Computing in the Middle Ages: A View from the Trenches

Many recent so-called computer histories, catering heavily to the public lust for a peep at the rich and famous, have explored, ad nauseam, the eccentricities of Bill Gates and Steve Jobs and their brethren -- to the point that they have become almost mythic figures. Their stories and their garages have become legendary. But these are Johnny-come-latelies who have achieved notoriety for the most part not for innovative science or even for engineering, but rather thanks to their extraordinary ability to exploit ideas pioneered by others, to turn them into financial empires. But what about the pioneers themselves, the ones who did the scientific and engineering groundwork on which these empires have been built? Where and who are they? Too often they appear only in fading photographs, wearing outmoded suits, usually standing in front of giant machinery that bears no apparent resemblance to the computers of today.

Telle Whitney: talk at Carver Mead's 80th birthday, 7:11

But what I also didn't realize was that this was the tail end of what was happening with the VLSI work. A few years after I got here, Carver called a meeting of all of his graduate students and he said, "Things are changing." And he asked for a commitment. He wanted us to commit to the work, commit to the science. ... He was asking for us to make a commitment to a different future, and he created the lab.

If you look at the students who were here during this period of time, it was less about a group working together, and more a collection of people that had interesting ideas. In hindsight, I think that Carver was starting to think, "Okay, what is next?"

Frank McCourt: Teacher Man

I took his year-long course on the History of English Literature... You could see he wanted us to know and understand how English literature had developed and the language along with it. He insisted we should know the literature the way a doctor knows the body.

Noam Chomsky: The Purpose of Education

You can't expect somebody to become a biologist by giving them access to the Harvard University biology library and saying, "Just look through it." That will give them nothing. The internet is the same, except magnified enormously. ...

The person who wins the Nobel Prize in biology is not the person who read the most journal articles and took the most notes on them. It's the person who knew what to look for. Cultivating that capacity to seek what's significant, always willing to question whether you're on the right track, that's what education is going to be about, whether it's using computers and internet, or pencil and paper and books.

Mike Engelhardt: interview

The point of simulation is so you understand your circuit better. You simulate the circuit so you know what you need to do to verify the design. What the simulator is doing is helping you develop intuition on how the circuit works. And there is no way to [overstate] the value of cultivating your intuition on how your circuit works. Intuition is the most important part of engineering.

Frank Oppenheimer: clip in "Exploratorium" by Jon Boorstin, 13:22

It shouldn't be a static thing; it should be one where people learn what's happening. And the only way to learn what's happening is to change what's happening. ...

They're not just getting somebody else dishing it out to them. There's enough in there so they actually made a connection themselves. Now, they don't go ahead and do a whole lot of other experiments to see if it's right, but that's hard to do in a museum.

Harry Gray: The Joy of Teaching and Research

That's when I started developing a way to teach molecular orbital theory, for the first time, to freshmen. It was an incredible experience, and I fell in love with teaching. I fell in love with teaching because I realized that I could somehow incorporate stuff I was doing in research into my teaching. Not at the level I was doing it in research, but things that were related directly to my research. I could develop the theory for freshmen at a level I felt they could understand and work with me. So that's how I got started in teaching, was to take the research I was doing and teach it in a certain way that they could appreciate. And they loved it. ...

Based on that course, I wrote a book. A couple of kids in the course had taken very good notes. They gave me their notes, and I wrote the book.

Richard Williams: The Animator's Survival Kit, p 17

In [Winsor] McCay's words: "I went into the business and spent thousands of dollars developing this new art. It required considerable time, patience and careful thought... This is the most fascinating work I have ever done -- this business of making animated cartoons live on the screen." ...

Later, as an older man being celebrated by the younger funny-cartoon animators in the business, McCay lashed out at them, saying that he had developed and given them a great new art form which they had cheapened and turned into a crude money-making business done by hack artists.

This well defines the endlessly uncomfortable relationship between the pioneering artist/idealist and the animation industry -- working to comfortable and predictable formulas.

Still doth the battle rage...

Eric Mazur: Confessions of a Converted Lecturer

You know, the students had a point. I was lecturing from my lecture notes. And if they had read the book, they would have seen that the book wasn't that different from the lecture notes.

Imagine I had been teaching Shakespeare rather than physics. Would I have asked the students to come to class and then said, "Today we're going to cover A Midsummer Night's Dream," and then opened the collected works of Shakespeare and started reading A Midsummer Night's Dream to them?

No, of course not. I would have said to them, "Next Wednesday we're going to discuss A Midsummer Night's Dream. If you haven't read A Midsummer Night's Dream, read it before coming to class." And as a student you would have known that if you had not read the play, you might as well stay in bed because you would not be able to contribute to the discussion.

But that's not what we do in the sciences. In the sciences we still mostly focus on "information transfer". And you know, unfortunately, Halliday and Resnick is a lot more boring than Shakespeare, so we have a disadvantage there too.

Patrick Brown: interview

One of the questions that I started thinking about... we need to understand how meat works. The same way that we might want to understand how does a cell work, how does a gene work, what are the molecular mechanisms that cause a normal cell to become a cancer cell, that kind of stuff. We need to understand, at the same kind of mechanistic level, how meat produces flavor, and what underlies the texture, and so forth.

Bob Taylor: interview

Doug [Engelbart] was also having problems. His manager, early on in the NASA support, came to see me, which was unusual. He came to see me in my office in Washington, and he said, "I want to talk to you about Doug." He said, "Why are you funding him?"

I said, "Because he's trying to do something that's very important, that nobody else is trying to do, and I believe in it."

Alan Kay: interview on Future School Now, 19:33

I don't believe there is such a thing as knowledge. What there are are processes inside people's heads. Knowledge isn't a substance just because it's a noun.

Carver Mead: panel: Demystifying VLSI Technology: Exploring Its Future Possibilities, 36:18

We went through the golden age, where we were doing neuromorphic circuits and systems, and we had access to fab. The fab was easily understood, so the conceptual base was reasonable. It was rich enough to do interesting things, but it was limited enough that you didn't have to do everything all over again, including your thinking about devices, when you went to the next process node. We went through probably twenty years that way.

That was the golden age from a research point of view, because MOSIS would turn the thing in a couple of weeks, or at most a month. So everybody was actually producing real artifacts, and finding out what they really did, and realizing that your simulation only got it right a fraction of the time, and there were things that went on that you hadn't anticipated and aren't built into the simulation. ...

Now ... getting into fab has to go through some big pile of software that somebody has copyrighted. By the time you're done, it's very hard to see what's down there and see how it's related to what you wanted to do. ...

The ability to actualize, in real physical form, the idea that you had, the concept, is priceless in educating students how to innovate. If you can't do that, you're at the mercy of a big simulator, which is a big pile of stuff that you don't understand. ...

That means that the kind of innovation that we're talking about isn't going to happen through that channel. I see it happening right now in integrated optics just because you have that connection. The same person that has the concept goes all the way to a physical device, and tests it and figures out what it's doing. That connection, you can't lose, or you end up with specialists. You don't do innovation very much with specialists. ...

[Integrated circuit design] has gotten all of this stuff between the designer and the silicon, and you can't see through it.

Alan Kay: LINC Twentieth Anniversary Symposium (1983), 1:44

When I was thinking about design yesterday afternoon, I went over to All Souls Church in the district and played organ for a couple of hours. ...

This instrument I played on is an interesting example of design. It was built about six or seven years ago. The action on the organ is completely mechanical, a reversion of pipe organ design back to a style originated in the 12th and 13th centuries.

One of the things that organ designers over the years discovered, after going to electric actions and pneumatic actions and all that, was that the old mechanical design was the best, because the organist can feel the pipe valves opening through his fingers. It's a more sensitive musical instrument.

The other problem you have in an organ is you have to change lots of stops. This instrument had about sixty different stops. The designers of this organ, instead of using mechanical stop changing, had a complete computer memory system to allow all the stop combinations to be set up and changed by just the touch of a button. You can't hear the stops being changed, except for the change in the timbre of the instrument.

I thought, well, that is tremendous. Here is the best of the old and the best of the new.

And then I discovered one horrible thing when I sat down to play, and that is that the bench was too tall. ... I think the best thing you can ever say about a designer, and this is definitely true about Wes, is that Wes designs the bench well too.