Quotes

Alan Moore: interview on mtv.com

I have a theory, which has not let me down so far, that there is an inverse relationship between imagination and money. Because the more money and technology that is available to [create] a work, the less imagination there will be in it.

Tadhg Kelly: Stories, Structure, Abstraction and Games

And that's why Chess and Go remain as enduringly popular as they are, and why soccer is the most popular game on earth. Robustness and elegance are the key driving forces here, and they are in direct opposition to the brittleness and complexity, the defining traits of story.

Bill Tozier: Diverse themes observed at GECCO 2006

What one wants is to be able to talk with a diverse club of smart people, arrange to do short one-off research projects and simulations, publish papers or capture intellectual property quickly and easily, and move on to another conversation. Quickly. Easily. For a living. Can't do that in industry. Can't do that in the Academy. Yet in my experience, scientists and engineers all want it. Maybe even a few mathematicians and social scientists do, too.

Luiz Henrique de Figueiredo: lua-l

 > I found myself wishing to have a continue keyword [...]
 > I can't recall an official reason why it isn't in the language.
Lua evolves by answering "why?" not "why not?".

David Hestenes and Garret Sobczyk: Clifford Algebra to Geometric Calculus: A Unified Language for Mathematics, p xii

Klein's seminal analysis of the structure and history of mathematics brings to light two major processes by which mathematics grows and becomes organized... The one emphasizes algebraic structure while the other emphasizes geometric interpretation. Klein's analysis shows one process alternately dominating the other in the historical development of mathematics. But there is no necessary reason that the two processes should operate in mutual exclusion. Indeed, each process is undoubtedly grounded in one of the two great capacities of the human mind: the capacity for language and the capacity for spatial perception. From the psychological point of view, then, the fusion of algebra with geometry is so fundamental that one could well say, 'Geometry without algebra is dumb! Algebra without geometry is blind!'

Richard Hamming: The Unreasonable Effectiveness of Mathematics (1980)

The Postulates of Mathematics Were Not on the Stone Tablets that Moses Brought Down from Mt. Sinai. It is necessary to emphasize this. We begin with a vague concept in our minds, then we create various sets of postulates, and gradually we settle down to one particular set. In the rigorous postulational approach, the original concept is now replaced by what the postulates define. This makes further evolution of the concept rather difficult and as a result tends to slow down the evolution of mathematics. It is not that the postulation approach is wrong, only that its arbitrariness should be clearly recognized, and we should be prepared to change postulates when the need becomes apparent.

Richard Hamming: The Art of Doing Science and Engineering (1997)

Education is what, when, and why to do things. Training is how to do it.

In science, if you know what you are doing, you should not be doing it. In engineering, if you do not know what you are doing, you should not be doing it.

Dan Bricklin: interview on Triumph of the Nerds

People who saw [VisiCalc] and went and got it... Like an accountant, I remember showing it to one around here and he started shaking and said, "That's what I do all week. I could do it in an hour." ... I meet these people now, they come up to me and say, "I gotta tell you, you changed my life. You made accounting fun."

Dan Bricklin and Bob Frankston: interview on Triumph of the Nerds

DB: You know, looking back at how successful a lot of other people have been [as a result of our work], it's kind of sad that we weren't as successful...

BF: It would be very nice to be gazillionaires, but you can also understand that part of the reason was that that's not what we were trying to be.

DB: We were kids of the Sixties and what did you want to do? You wanted to make the world better, and you wanted to make your mark on the world and improve things, and we did it. So by the mark of what we would measure ourselves by, we were very successful.

Richard Hamming: You and Your Research

Somewhere around every seven years make a significant, if not complete, shift in your field. Thus, I shifted from numerical analysis, to hardware, to software, and so on, periodically, because you tend to use up your ideas. When you go to a new field, you have to start over as a baby. You are no longer the big mukity muk and you can start back there and you can start planting those acorns which will become the giant oaks. ...

You need to get into a new field to get new viewpoints, and before you use up all the old ones. You can do something about this, but it takes effort and energy. It takes courage to say, "Yes, I will give up my great reputation." For example, when error correcting codes were well launched, having these theories, I said, "Hamming, you are going to quit reading papers in the field; you are going to ignore it completely; you are going to try and do something else other than coast on that."

Sol Stein: Stein on Writing

Nonfiction conveys information. Fiction conveys emotion.

C.A.R. Hoare: The Emperor's Old Clothes

I note with fear and horror that even in 1980, language designers and users have not learned this lesson [mandatory run-time checking of array bounds]. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law.

C.A.R. Hoare: The Emperor's Old Clothes

I conclude that there are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult. It demands the same skill, devotion, insight, and even inspiration as the discovery of the simple physical laws which underlie the complex phenomena of nature. It also requires a willingness to accept objectives which are limited by physical, logical, and technological constraints, and to accept a compromise when conflicting objectives cannot be met. No committee will ever do this until it is too late.

Alan Kay: How Simply and Understandably Could The "Personal Computing Experience" Be Programmed?

When I first prepared this particular talk... I realized that my usual approach is usually critical. That is, a lot of the things that I do, that most people do, are because they hate something somebody else has done, or they hate that something hasn't been done. And I realized that informed criticism has completely been done in by the web. Because the web has produced so much uninformed criticism. It's kind of a Gresham's Law -- bad money drives the good money out of circulation. Bad criticism drives good criticism out of circulation. You just can't criticize anything.

George Orwell: Politics and the English Language

What is above all needed is to let the meaning choose the word, and not the other way around. In prose, the worst thing one can do with words is surrender to them. When you think of a concrete object, you think wordlessly, and then, if you want to describe the thing you have been visualising you probably hunt about until you find the exact words that seem to fit it. When you think of something abstract you are more inclined to use words from the start, and unless you make a conscious effort to prevent it, the existing dialect will come rushing in and do the job for you, at the expense of blurring or even changing your meaning. Probably it is better to put off using words as long as possible and get one's meaning as clear as one can through pictures and sensations. Afterward one can choose -- not simply accept -- the phrases that will best cover the meaning, and then switch round and decide what impressions one's words are likely to make on another person.

Mark Kennedy: Carrying a Sketchbook

I don't know what people expect to see when they look in a sketchbook, but they always seem mighty disappointed. I think people expect to see what they would see in a Hollywood version of a sketchbook. Whenever someone is sketching from life in a movie, it's always supposed to look tossed off and effortless, but it's really some totally finished and labored-over drawing that some artist spent hours rendering.

Any real sketchbook is full of misfires, false starts and stumbles, with a few successes sprinkled here and there. If you were capable of doing a perfect drawing every time, you wouldn't need to carry a sketchbook!

Dan Piponi: The Essence of Quantum Computing

What I am saying is in direct contradiction with what is said by some of the founding fathers of quantum computing. Actually, I think that's a good thing: it means that whether I'm right or wrong, I must be saying something non-trivial.

Joe Armstrong: A History of Erlang

It was during this conference that we realised that the work we were doing on Erlang was very different from a lot of mainstream work in telecommunications programming. Our major concern at the time was with detecting and recovering from errors. I remember Mike, Robert and I having great fun asking the same question over and over again: "what happens if it fails?" -- the answer we got was almost always a variant on "our model assumes no failures." We seemed to be the only people in the world designing a system that could recover from software failures.

John Napier: Hands

With the eye, the hand is our main source of contact with the physical environment. The hand has advantages over the eye because it can observe the environment by means of touch, and having observed it, it can immediately proceed to do something about it. The hand has other great advantages over the eye. It can see around corners and it can see in the dark.

Christopher Alexander: foreword to Richard Gabriel's "Patterns Of Software"

In my life as an architect, I find that the single thing which inhibits young professionals, new students most severely, is their acceptance of standards that are too low. If I ask a student whether her design is as good as Chartres, she often smiles tolerantly at me as if to say, "Of course not, that isn't what I am trying to do.... I could never do that."

Then, I express my disagreement, and tell her: "That standard must be our standard. If you are going to be a builder, no other standard is worthwhile. That is what I expect of myself in my own buildings, and it is what I expect of my students." Gradually, I show the students that they have a right to ask this of themselves, and must ask this of themselves. Once that level of standard is in their minds, they will be able to figure out, for themselves, how to do better, how to make something that is as profound as that.

Neil Postman: Amusing Ourselves to Death

[This argument] fixes its attention on the forms of human conversation, and postulates that how we are obliged to conduct such conversations will have the strongest possible influence on what ideas we can conveniently express. And what ideas are convenient to express inevitably become the important content of a culture.

Neil Postman: Amusing Ourselves to Death

We must remember that Galileo merely said that the language of nature is written in mathematics. He did not say everything is. And even the truth about nature need not be expressed in mathematics. For most of human history, the language of nature has been the language of myth and ritual. These forms, one might add, had the virtues of leaving nature unthreatened and of encouraging the belief that human beings are part of it. It hardly befits a people who stand ready to blow up the planet to praise themselves too vigorously for having found the true way to talk about nature.

Steven Johnson: Everything Bad Is Good For You

Now, I have no doubt that playing today's games does in fact improve your visual intelligence and your manual dexterity, but the virtues of gaming run far deeper than hand-eye coordination. When I read these ostensibly positive accounts of video games, they strike me as the equivalent of writing a story about the merits of the great novels and focusing on how reading them can improve your spelling.

Paul Hawken: The Ecology of Commerce

The purpose of all these suggestions is to end industrialism as we know it. Industrialism is over, in fact; the question remains how we organize the economy that follows. Either it falls in on us, and crushes civilization, or we reconstruct it and unleash the imagination of a more sustainable future into our daily acts of commerce. Protecting our industries because we want to be pro-business and pro-jobs will have the same level of effectiveness as did the Soviet effort to maintain its industries in the 1970s and 1980s.

Bjarne Stroustrup: interview in MIT Tech Review

Q: In The Design and Evolution of C++, you claim that Kierkegaard was an influence on your conception of the language. Is this a joke?

A: A bit pretentious, maybe, but not a joke. A lot of thinking about software development is focused on the group, the team, the company. This is often done to the point where the individual is completely submerged in corporate "culture" with no outlet for unique talents and skills. Corporate practices can be directly hostile to individuals with exceptional skills and initiative in technical matters. I consider such management of technical people cruel and wasteful. Kierkegaard was a strong proponent for the individual against "the crowd" and has some serious discussion of the importance of aesthetics and ethical behavior.

Stewart Brand: Environmental Heresies

The best way for doubters to control a questionable new technology is to embrace it, lest it remain wholly in the hands of enthusiasts who think there is nothing questionable about it.

Alan Moore: interview for "Authors on Anarchism"

In the future, we would have to be prepared for a situation in which we have firstly, no currency, and secondly, as a result of that, no government. So there are ways in which technology itself and the ways in which we respond to technology -- the ways in which we adapt our culture and our way of living to accommodate breakthroughs and movements in technology -- might give us a way to move around government. To evolve around government to a point where such a thing is no longer necessary or desirable. That is perhaps an optimistic vision, but it's one of the only realistic ways I can see it happening. ...

I really don't think that a violent revolution is ever going to provide a long-term solution to the problems of the ordinary person. I think that is something that we had best handle ourselves, and which we are most likely to achieve by the simple evolution of western society. But that might take quite a while, and whether we have that amount of time is, of course, open to debate.

Sheik Ahmed Zaki Yamani

The Stone Age didn't end for lack of stone, and the Oil Age will end long before the world runs out of oil.

Pavel Kobel: lua-l

Business demanding promise from [open-source] project is like business demanding promise from forest. If you like trees, you must do other thing to conserve.

Noam Chomsky: response to interview question regarding alternatives to capitalism

I think that, what used to be called centuries ago "wage slavery," is intolerable. And I don't think that people ought to be forced to rent themselves in order to survive. I think that the economic institutions ought to be run democratically, by their participants, by the communities in which they exist, and so on. And I think, basically, through various kinds of free association.

Kai Krause: Software is merely a Performance Art

I used to think "Software Design" is an art form.

I now believe that I was half-right:
it is indeed an art, but it has a rather short half-life:
Software is merely a performance art!

A momentary flash of brilliance, doomed to be overtaken by the next wave, or maybe even by its own sequel. Eaten alive by its successors. And time...

This is not to denigrate the genre of performance art: anamorphic sidewalk chalk drawings, Goldsworthy pebble piles or Norwegian carved-ice-hotels are admirable feats of human ingenuity, but they all share that ephemeral time limit: the first rain, wind or heat will dissolve the beauty, and the artist must be well aware of its fleeting glory.

For many years I have discussed this with friends that are writers, musicians, painters, and the simple truth emerged: one can still read the words, hear the music and look at the images....

Their value and their appeal remains, in some cases even gain by familiarity: like a good wine it can improve over time. You can hum a tune you once liked, years later. You can read words or look at a painting from 300 years ago and still appreciate its truth and beauty today, as if brand new. Software, by that comparison, is more like Soufflé: enjoy it now, today, for tomorrow it has already collapsed on itself. Soufflé 1.1 is the thing to have, Version 2.0 is on the horizon.

It is a simple fact: hardly any of my software even still runs at all!

Richard Doherty: Diary of a Disaster: General Magic Goes Poof!

I'm visiting Woz and his daughter Suzanne, who is in the hospital after an emergency appendectomy, when another visitor asks if a certain friend has been told about the surgery. Woz proudly whips out his Magic Link to get her address and number. Before the device can retrieve the data, however, Suzanne produces the number from an address book in her handbag.

Seymour Papert: Mindstorms: children, computers, and powerful ideas

In many schools today, the phrase "computer-aided instruction" means making the computer teach the child. One might say the computer is being used to program the child. In my vision, the child programs the computer and, in doing so, both acquires a sense of mastery over a piece of the most modern and powerful technology and establishes an intimate contact with some of the deepest ideas from science, from mathematics, and from the art of intellectual model building. ...

Two fundamental ideas run through this book. The first is that it is possible to design computers so that learning to communicate with them can be a natural process, more like learning French by living in France than like trying to learn it through the unnatural process of American foreign-language instruction in classrooms. Second, learning to communicate with a computer may change the way other learning takes place. The computer can be a mathematics-speaking and an alphabetic-speaking entity. We are learning how to make computers with which children love to communicate. When this communication occurs, children learn mathematics as a living language. Moreover, mathematical communication and alphabetic communication are thereby both transformed from the alien and therefore difficult things they are for most children into natural and therefore easy ones. The idea of "talking mathematics" to a computer can be generalized to a view of learning mathematics in "Mathland"; that is to say, in a context which is to learning mathematics what living in France is to learning French.

Steven Johnson: Emergence: The Connected Lives of Ants, Brains, Cities, and Software

Cities bring minds together and put them into coherent slots. ... Ideas and goods flow readily within these clusters, leading to productive cross-pollination, ensuring that good ideas don't die out in rural isolation. The power unleashed by this data storage is evident in the earliest large-scale human settlements... By some accounts, grain cultivation, the plow, the potter's wheel, the sailboat, the draw loom, copper metallurgy, abstract mathematics, exact astronomical observation, the calendar -- all of these inventions appeared within centuries of the original urban populations. It's possible, even likely, that more isolated groups or individuals had stumbled upon some of those technologies at an earlier date, but they didn't become part of the collective intelligence of civilization until there were cities to store and transmit them.

Ken Robinson: TED 2006 talk

University professors... live in their heads. ... They're disembodied, in a kind of literal way. They look upon their body as a form of transport for their heads. It's a way of getting their heads to meetings.

J. Yee: email

Last night I went to a baby shower where a good number of the attendees were babies themselves. I kept thinking how ridiculous it is that people pour so much time and energy into supporting a single life, when there are so many others that need more support.

Will Wright: interview in Designing Interactions

We noticed that when we were designing The Sims, a certain degree of abstraction in the game is very beneficial. You don't actually get very close to the characters. You can't quite see their facial expressions, but everybody in their mind is imagining the facial expressions on the characters.

In computer game design, you're dealing with two processors. You've got the processor in front of you on the computer and you've got the processor in your head, and so the game itself is actually running on both. There are certain things that the computer is very good at, but there are other things that the human imagination is better at.

Wikileaks editors (anonymous): Wikileaks: About

Considering corporations as analogous to a nation state reveals the following properties:

  1. The right to vote does not exist except for shareholders (analogous to landowners) and even there voting power is in proportion to ownership.
  2. All power issues from a central committee.
  3. There is no balancing division of power. There is no fourth estate. There are no juries and innocence is not presumed.
  4. Failure to submit to any order may result in instant exile.
  5. There is no freedom of speech.
  6. There is no right of association. Even love between men and women is forbidden without approval.
  7. The economy is centrally planned.
  8. There is pervasive surveillance of movement and electronic communication.
  9. The society is heavily regulated, to the degree many employees are told when, where and how many times a day they can go to the toilet.
  10. There is little transparency and freedom of information is unimaginable.
  11. Internal opposition groups are blackbanned, surveilled and/or marginalized whenever and wherever possible.

While having a GDP and population comparable to Belgium, Denmark or New Zealand, most corporations have nothing like their quality of civic freedoms and protections. Internally, some mirror the most pernicious aspects of the 1960s Soviet system. This is even more striking when the regional civic laws the company operates under are weak (such as in West Papua or South Korea); there, the character of these corporate tyrannies is unobscured by their surroundings.

Aaron Hertzmann: Machine Learning for Computer Graphics: A Manifesto and Tutorial

It is a truism that artificial intelligence research can never become successful, because its successes are not viewed as AI.

Chaim Gingold: Miniature Gardens & Magic Crayons

Will Wright points out that while playing games, people engage a game in their head, and what counts is this mental world. "So what we're trying to do as designers is build up these mental models in the player. The computer is just an incremental step, an intermediate model to the model in the player's head." His explanation of this concept works like this: somebody walks into a game store and looks at the cover of your game's box. Based on the front of the box, they start playing a game in their head, and if that game is interesting, they'll pick up the box and look at the back. They then play a new game in their head, closer to the one you've designed. If they like that game, then they'll buy the game and take it home.

Bret Victor: email (9/3/04)

Interface matters to me more than anything else, and it always has. I just never realized that. I've spent a lot of time over the years desperately trying to think of a "thing" to change the world. I now know why the search was fruitless -- things don't change the world. People change the world by using things. The focus must be on the "using", not the "thing". Now that I'm looking through the right end of the binoculars, I can see a lot more clearly, and there are projects and possibilities that genuinely interest me deeply.

Joe Armstrong: erlang-questions mailing list

The real principle is "let some other process fix the error". The "let it fail" philosophy is a consequence of this. ... look to make a fault-tolerant system you need TWO computers not ONE right ... and if you've got TWO computers you need to start thinking about distributed programming *whether you like it or not* and if you're going to do distributed computing then you'll have to think about the following ...

Malcolm Gladwell: Group Think

[The] point is not that innovation attracts groups but that innovation is found in groups: that it tends to arise out of social interaction -- conversation, validation, the intimacy of proximity, and the look in your listener's eye that tells you you're onto something. ...

When [Erasmus Darwin, James Watt, Joseph Priestley, etc.] were not meeting, they were writing to each other with words of encouragement or advice or excitement. This was truly -- in a phrase that is invariably and unthinkingly used in the pejorative -- a mutual-admiration society. ...

What were they doing? Darwin, in a lovely phrase, called it "philosophical laughing," which was his way of saying that those who depart from cultural or intellectual consensus need people to walk beside them and laugh with them to give them confidence. ...

We divide [groups] into cults and clubs, and dismiss the former for their insularity and the latter for their banality. The cult is the place where, cut off from your peers, you become crazy. The club is the place where, surrounded by your peers, you become boring. Yet if you can combine the best of those two -- the right kind of insularity with the right kind of homogeneity -- you create an environment both safe enough and stimulating enough to make great thoughts possible.

Doug McIlroy: talk on the history of computing at Bell Labs

This machine ran for a good number of years, probably six, eight. And it is said that it never made an undetected error. What that means is that it never made an error that it did not diagnose itself and stop. Relay technology was very very defensive. The telephone switching system had to work. It was full of self-checking.

Ted Koppel: interview in Frontline: News War

To the extent that we're now judging journalism by the same standards that we apply to entertainment -- in other words, give the public what it wants, not necessarily what it ought to hear, what it ought to see, what it needs, but what it wants -- that may prove to be one of the greatest tragedies in the history of American journalism. ...

In the very early days of television news, the FCC still had teeth, and still used them every once in a while. And there was that little paragraph, section 315 of the FCC code, that said: "You shall operate in the public interest, convenience, and necessity." And what that meant was, you had to have a news division that told people what was important out there.

Andy Barnes: interview in Frontline: News War

The idea that all of the world should be measured in dollars to stockholders is actually a relatively new idea. It used to be that we thought that businesses had their purpose. Your purpose was to be making newspapers or fountain pens or whatever. And now we act as though the only purpose of a business was to enrich the people who trade it on Wall Street.... Of course you've got to have profit, of course you've got to support your ownership. But that's not why we're doing it. We're doing it because publishing a newspaper is a crucial thing to be doing.

danah boyd: Facebook's "Privacy Trainwreck": Exposure, Invasion, and Drama

i started wondering if social media is dangerous ... If gossip is too delicious to turn your back on and Flickr, Bloglines, Xanga, Facebook, etc. provide you with an infinite stream of gossip, you'll tune in. Yet, the reason that gossip is in your genes is because it's the human equivalent to grooming. By sharing and receiving gossip, you build a social bond between another human. Yet, what happens when the computer is providing you that gossip asynchronously? I doubt i'm building a meaningful relationship with you when i read your MySpace, CuteKitten78. You don't even know that i'm watching your life. Are you really going to be there when i need you?

montessori.edu: FAQ

Q: I recently observed a Montessori classroom for a day. I was very very impressed, but I [noticed] there doesn't seem to be any opportunities for pretend play...

A: When Dr. Montessori opened the first Children's House it was full of pretend play things. The children never played with them as long as they were allowed to do real things -- i.e. cooking instead of pretending to cook. It is still true.

Adam Cadre: My first political donation

Everyone's interested in the leaders of the country. Intelligent people are interested in the actual functioning of the government, what policies various candidates plan to put into practice, how those policies will affect the lives of the citizenry... but there just aren't that many intelligent people. A lot of people are stupid. To them the government is just a sort of reality show. To them politicians are just celebrities who show up in different timeslots from the actors and sports stars. The beauty of constitutional monarchy is that it gives the stupid people their reality show, but farms it out to a powerless royal family so the real government can get on with its work.

Neal Stephenson: Snow Crash

At the time, both of them were working on avatars. He was working on bodies, she was working on faces. She was the face department, because nobody thought that faces were all that important -- they were just flesh-toned busts on top of the avatars. She was just in the process of proving them all desperately wrong. But at this phase, the all-male society of bitheads that made up the power structure of Black Sun Systems said that the face problem was trivial and superficial. It was, of course, nothing more than sexism, the especially virulent type espoused by male techies who sincerely believe that they are too smart to be sexists.

Adam Cadre: some of my evaluative patterns

Non-fiction also tends to fall into the trap of failing to communicate with the reader. All too many writers, especially in academia, act as if they are programmers in 1980 trying to fit an entire videogame into 4K. Write to communicate; don't just densely encode information for storage.

Michael Rivero

Most people prefer to believe their leaders are just and fair even in the face of evidence to the contrary, because once a citizen acknowledges that the government under which they live is lying and corrupt, the citizen has to choose what he or she will do about it. To take action in the face of a corrupt government entails risks of harm to life and loved ones. To choose to do nothing is to surrender one's self-image of standing for principles. Most people do not have the courage to face that choice. Hence, most propaganda is not designed to fool the critical thinker but only to give moral cowards an excuse not to think at all.

Dan Cook: Mixing Games and Applications

Why do games have such a radically different learning curve than advanced applications? It turns out that games are carefully tuned machines that hack into human beings' most fundamental learning processes. Games are exercises in applied psychology at a level far more nuanced than your typical application. ...

Implicit in this description of interactivity is the fact that users change. More importantly, the feedback loops we, as designers, build into our games, directly change the user's mind... The person that starts using a game is not the same person that finishes the game. Games and the scaffold of skill atoms describe in minute detail how and what change occurs.

This is a pretty big philosophical shift from how application design is usually approached. We tend to imagine that users are static creatures who live an independent and unchanging existence outside of our applications. We merely need to give them a static set of pragmatic tools and all will be good.

Games state that our job is to teach, educate and change our users. We lead them on an explicitly designed journey that leaves them with functioning skills that they could not have imagined before they started using our application. Our games start off simple and slowly add complexity. Our apps must adapt along the user's journey to reflect their changing mental models and advanced skills. Failure to do so results in a mismatch that results in frustration, boredom and burnout.

Clay Shirky: Gin, Television, and Social Surplus

This is something that people in the media world don't understand. Media in the 20th century was run as a single race -- consumption... But media is actually a triathlon, it's three different events. People like to consume, but they also like to produce, and they like to share. ...

And this is the other thing about the size of the cognitive surplus we're talking about. It's so large that even a small change could have huge ramifications. Let's say that everything stays 99 percent the same, that people watch 99 percent as much television as they used to, but 1 percent of that is carved out for producing and for sharing. The Internet-connected population watches roughly a trillion hours of TV a year. That's about five times the size of the annual U.S. consumption. One percent of that is 100 Wikipedia projects per year worth of participation.

Bill Budge: interview on Computer Chronicles (1984)

[Now] I'm just putting bumpers [all over the pinball board]. This is a favorite of really young kids. They like to just grab a whole bunch of bumpers, fill the board up with them and put a ball on there. A pinball aficionado would gasp, but little kids don't really build pinball machines, they just sort of build "things" with this. ...

[Q: What do you want to do next?] I want to extend the idea of a construction set. This one was hard to do when I started, because there are lots of combinations of things you can't really predict, when you're making a kit, when you're making a "metagame". I'd like to extend the idea even further, and the problem there is then designing the "parts box". In pinball, it's a small set of parts. You don't really have to worry about thinking up abstractions. In a general construction set, it's not clear what the parts should be. It's almost like you're inventing a new language for representing specifications for programs.

Guy Steele: 50 in 50

We must remember that, strictly speaking, "formal" does not mean merely "rigorous", but "according to form". Meaning need be ascribed only to the result of a formal process. It is not needed to guide the process itself. We ascribe meaning to intermediate formal states primarily, nay solely, to reassure ourselves.

John Holt: How Children Fail

Our way of scoring was to give the groups [of fourth-graders] a point for each correct prediction. Before long they were thinking more of ways to get a good score than to make the beam balance. We wanted them to figure out how to balance the beam, and introduced the scoring as a matter of motivation. But they out-smarted us, and figured out ways to get a good score that had nothing to do with whether the beam balanced or not.

... Betty figured out that the way to get a good score is to put the weights in what you know is a wrong place, and then have everyone on your team say it is wrong. Thus, they will each get a point for predicting correctly.

... A couple years later, when I put a balance beam and some weights on a table at the back of my class, and just left it there without saying anything about it or trying to "teach" it, most of the children in the class, including some very poor students, figured out just by messing around with it how it worked.

John Holt: How Children Fail

[I told the fourth-graders] I was thinking of a number between 1 and 10,000. ... They still cling stubbornly to the idea that the only good answer is a yes answer. This, of course, is the result of miseducation in which "right answers" are the only ones that pay off. They have not learned how to learn from a mistake, or even that learning from mistakes is possible. If they say, "Is the number between 5,000 and 10,000?" and I say yes, they cheer; if I say no, they groan, even though they get exactly the same amount of information in either case. The more anxious ones will, over and over again, ask questions that have already been answered, just for the satisfaction of hearing a yes.

John Holt: How Children Learn

[Describing a science-fiction-like photograph of a research lab.] Why did the magazine want such a picture?... Because it makes science look like a powerful and forbidding mystery, not for the likes of you and me. Because it tells us that only people with expensive and incomprehensible machines can discover the truth, about human beings or anything else, and that we must believe whatever they tell us. Because it turns science from an activity to be done into a commodity to be bought. Because it prevents ordinary human beings from being the scientists, the askers of questions and seekers and makers of answers that we naturally and rightfully are, and makes us instead into science consumers and science worshippers.

Fabien: lua-l

A language is an interface between programmers and hardware, so it has social/psychological/pedagogical features which are just as important as its formal properties. If a language can't be efficiently ported on regular hardware, it's the language that sucks, not the hardware. Similarly, if it doesn't interface properly with its communities of coders (fails to build up standard coding practices, good libraries, trust...), the language sucks, not the people. Ergo Lisp sucks. Many Lisp zealots dismiss the language's failures as "merely social", but that's missing the purpose of a language entirely: failing socially is just as bad as failing technically.

Tycho: Penny Arcade

I was going through a thread earlier this morning full of those who couldn't understand why people keep buying the 360, and part of it is almost certainly because of people like my sister. She doesn't define herself spiritually by her console choice and doesn't track hardware failure rates. She is only a "gamer" during the time the console is on, the same way she ceases being a "toaster" once her toast is complete. She is utterly devoid of the received wisdom we amass as enthusiasts, and the joy she wrings from the medium is not diminished as a result.

John Taylor Gatto: The Six-Lesson Schoolteacher

This is another way I teach the lesson of dependency. Good people wait for a teacher to tell them what to do. This is the most important lesson of all, that we must wait for other people, better trained than ourselves, to make the meanings of our lives.

John Taylor Gatto: The Underground History of American Education

If you obsess about conspiracy, what you'll fail to see is that we are held fast by a form of highly abstract thinking fully concretized in human institutions which has grown beyond the power of the managers of these institutions to control. If there is a way out of the trap we're in, it won't be by removing some bad guys and replacing them with good guys.

Amish Information Systems: Last one. Romance.

My romantic entanglements tended to be quantum in nature -- i.e. they happened at a distance and were undetectable to outside observers.

Steven Strogatz: Nonlinear Dynamics and Chaos, p 175

This example [phase-space sketch of a nonlinear system] shows how far we can go with pictures -- without invoking any difficult formulas, we were able to extract all the important features of the pendulum's dynamics. It would be much more difficult to obtain these results analytically, and much more confusing to interpret the formulas, even if we could find them.

Banksy

The thing I hate the most about advertising is that it attracts all the bright, creative and ambitious young people, leaving us mainly with the slow and self-obsessed to become our artists. Modern art is a disaster area. Never in the field of human history has so much been used by so many to say so little.

Dan Bricklin: The Cornucopia of the Commons

What we see here is that increasing the value of the database by adding more information is a natural by-product of using the tool for your own benefit. No altruistic sharing motives need be present. ...

I believe that you can help predict the success of a particular UI used to build a shared database based on how much normal, selfish use adds to the database.

Dan Bricklin: Systems without guilt where every contribution is appreciated

In a good system, just doing what you normally would do to help yourself helps everybody. Even helping a bit once in a while (like typing in the track names of a CD nobody else had ever entered) benefited you and the system. Instead of making you feel bad for "only" doing 99%, a well designed system makes you feel good for doing 1%. People complain about systems that have lots of "freeloaders". Systems that do well with lots of "freeloading" and make the best of periodic participation are good.

So, here we have another design criterion for a type of successful system: Guiltlessness. Not only should people just need to do what's best for them when they help others, they need to not need to always do it.

Lawrence Lessig: Remix

[Regarding motivations in a sharing vs commercial economy] Even the thee-regarding [selfless] motivations need not be descriptions of self-sacrifice. I suspect that no one contributes to Wikipedia despite hating what he does, solely because he believes he ought to help create free knowledge. We can all understand people in the commercial economy who hate what they do but do it anyway ("he's just doing it for the money"). That dynamic is very difficult to imagine in the sharing economy. In the sharing economy, people are in it for the thing they're doing, either because they like the doing, or because they like doing such things. Either way, these are happy places. People are there because they want to be.

Clay Shirky: Newspapers and Thinking the Unthinkable

For a long time, longer than anyone in the newspaper business has been alive in fact, print journalism has been intertwined with these economics. The expense of printing created an environment where Wal-Mart was willing to subsidize the Baghdad bureau. This wasn't because of any deep link between advertising and reporting, nor was it about any real desire on the part of Wal-Mart to have their marketing budget go to international correspondents. It was just an accident. Advertisers had little choice other than to have their money used that way, since they didn't really have any other vehicle for display ads.

The old difficulties and costs of printing forced everyone doing it into a similar set of organizational models; it was this similarity that made us regard Daily Racing Form and L'Osservatore Romano as being in the same business. That the relationship between advertisers, publishers, and journalists has been ratified by a century of cultural practice doesn't make it any less accidental.

The competition-deflecting effects of printing cost got destroyed by the internet, where everyone pays for the infrastructure, and then everyone gets to use it. And when Wal-Mart, and the local Maytag dealer, and the law firm hiring a secretary, and that kid down the block selling his bike, were all able to use that infrastructure to get out of their old relationship with the publisher, they did. They'd never really signed up to fund the Baghdad bureau anyway.

Adam Cadre: WALL-E

So [workers replaced by automation] have to go find other jobs. Not because the society needs any additional production -- it was already doing fine on that count -- but because of ideology. An odd facet of this ideology is that no one really cares much whether you're contributing to the greater good so long as you're performing some kind of labor!

Charles Bloom: Waffling

I've always been very dubious about the idea of learning from people who have been successful. There's this whole cult of worshipping rich people, reading interviews with them, getting their opinions on things, trying to learn what made them successful. I think it's mostly nonsense. The thing is, if you just look at who the biggest earners are, it's almost entirely luck. ...

The point is if you just look at successful business people, they will probably be confident, decisive, risk takers, aggressive at seizing opportunities, aggressive about growing the business quickly, etc. That doesn't mean that those are the right things to do. It just means that those are variance-increasing traits that give them a *chance* to be a big success.

Lewis Hyde: The Gift, p 11

This, then, is how I use "consume" to speak of a gift -- a gift is consumed when it moves from one hand to another with no assurance of anything in return. There is little difference, therefore, between its consumption and its movement. A market exchange has an equilibrium or stasis: you pay to balance the scale. But when you give a gift there is momentum, and the weight shifts from body to body.

Chip Morningstar: Habitat Chronicles: Smart people can rationalize anything

You can't sell someone the solution before they've bought the problem.

Alan Kay: The Early History of Smalltalk

All of the elements eventually used in the Smalltalk user interface were already to be found in the sixties, as different ways to access and invoke the functionality provided by an interactive system. The two major centers of ideas were Lincoln Labs and RAND corp, both ARPA funded. The big shift that consolidated these ideas into a powerful theory and long-lived examples came because the LRG [Learning Research Group] focus was on children. Hence, we were thinking about learning as being one of the main effects we wanted to have happen. Early on, this led to a 90 degree rotation of the purpose of the user interface from "access to functionality" to "environment in which users learn by doing." This new stance could now respond to the echoes of Montessori and Dewey, particularly the former, and got me, on rereading Jerome Bruner, to think beyond the children's curriculum to a "curriculum of the user interface."

The particular aim of LRG was to find the equivalent of writing -- that is, learning and thinking by doing in a medium -- our new "pocket universe." For various reasons I had settled on "iconic programming" as the way to achieve this, drawing on the iconic representations used by many ARPA projects in the sixties. My friend Nicholas Negroponte, an architect, was extremely interested in how environments affected peoples' work and creativity. He was interested in embedding the new computer magic in familiar surroundings. I had quite a bit of theatrical experience in a past life, and remembered Coleridge's adage that "people attend 'bad theatre' hoping to forget, people attend 'good theatre' aching to remember." In other words, it is the ability to evoke the audience's own intelligence and experiences that makes theatre work.

Putting all this together, we want an apparently free environment in which exploration causes desired sequences to happen (Montessori); one that allows kinesthetic, iconic, and symbolic learning -- "doing with images makes symbols" (Piaget & Bruner); the user is never trapped in a mode (GRAIL); the magic is embedded in the familiar (Negroponte); and which acts as a magnifying mirror for the user's own intelligence (Coleridge). It would be a great finish to this story to say that having articulated this, we were able to move straightforwardly to the design as we know it today. In fact, the UI design work happened in fits and starts in between feeding Smalltalk itself, designing children's experiments, trying to understand iconic construction, and just playing around. In spite of this meandering, the context almost forced a good design to turn out anyway.

James Herndon: How to Survive in Your Native Land, p 36

This drove us out of our minds, and it drove us out of our minds every day... Unaccountably, the course was not, as we'd thought, a course where students would get to do all the things we'd thought up for them to do, but instead a course where they could steadfastly refuse to do everything and then complain that there was nothing to do. ...

We never quite accepted the notion that the real curriculum of the course was precisely the question What Shall We Do In Here? and that it was really an important question and maybe the only important question.

Keith Johnstone: Impro: Improvisation and the Theatre, p 149

We don't know much about Masks in this culture ... because this culture is usually hostile to trance states. We distrust spontaneity, and try to replace it by reason: the Mask was driven out of theatre in the same way that improvisation was driven out of music. ... Education itself might be seen as primarily an anti-trance activity.

I see the Mask as something that is continually flaring up in this culture, only to be almost immediately snuffed out. No sooner have I established a tradition of Mask work somewhere than the students start getting taught the 'correct' movements, just as they learn a phoney 'Commedia dell' Arte' technique.

Charles Bloom: Intolerance

Just the mathematics of being single are depressing. You have to flirt with ten girls to get a date with one. You have to go on ten dates to find someone you want to have something long term with. You have to have ten long term relationships to find the one that works. It's unbearable.

Adam Cadre: Fatal abstraction

I hate abstraction. Here are some examples.

Kent Beck and Ward Cunningham: A Laboratory For Teaching Object-Oriented Thinking

Note that the [CRC] cards are placed such that View and Controller are overlapping (implying close collaboration) and placed above Model (implying supervision). We find these and other informal groupings aid in comprehending a design. Parts, for example, are often arranged below the whole. ...

The ability to quickly organize and spatially address index cards proves most valuable when a design is incomplete or poorly understood. We have watched designers repeatedly refer to a card they intended to write by pointing to where they will put it when completed.

Randall B. Smith and David Ungar: Programming as an Experience: The Inspiration for Self

We now believe that when features, rules, or elaborations are motivated by particular examples, it is a good bet that their addition will be a mistake. The second author once coined the term "architect's trap" for something similar in the field of computer architecture; this phenomenon might be called "the language designer's trap."

If examples cannot be trusted, what do we think should motivate the language designer? Consistency and malleability. When there is only one way of doing things, it is easier to modify and reuse code. When code is reused, programs are easier to change and, most importantly, shrink. When a program shrinks, its construction and maintenance require fewer people, which allows for more opportunities for reuse to be found. Consistency leads to reuse, reuse leads to conciseness, conciseness leads to understanding.

David Hestenes: Reforming the Mathematical Language of Physics

Mathematics is taken for granted in the physics curriculum -- a body of immutable truths to be assimilated and applied. The profound influence of mathematics on our conceptions of the physical world is never analyzed. The possibility that mathematical tools used today were invented to solve problems in the past and might not be well suited for current problems is never considered. ...

The point I wish to make by citing these two examples [Newton and Einstein] is that without essential mathematical concepts the two theories would have been literally inconceivable. The mathematical modeling tools we employ at once extend and limit our ability to conceive the world. Limitations of mathematics are evident in the fact that the analytic geometry that provides the foundation for classical mechanics is insufficient for General Relativity.

David Hestenes: Reforming the Mathematical Language of Physics

Early in my career, I naively thought that if you give a good idea to competent mathematicians or physicists, they will work out its implications for themselves. I have learned since that most of them need the implications spelled out in utter detail.

W. Daniel Hillis: Richard Feynman and The Connection Machine

The last project that I worked on with Richard [Feynman] was in simulated evolution. ... When I got back to Boston I went to the library and discovered a book by Kimura on the subject, and much to my disappointment, all of our "discoveries" were covered in the first few pages. When I called back and told Richard what I had found, he was elated. "Hey, we got it right!" he said. "Not bad for amateurs."

In retrospect I realize that in almost everything that we worked on together, we were both amateurs. In digital physics, neural networks, even parallel computing, we never really knew what we were doing. But the things that we studied were so new that no one else knew exactly what they were doing either. It was amateurs who made the progress.

Freeman Dyson: interview in OMNI magazine

Q: You must be aware that some of your colleagues take a jaundiced view of your ideas ... Does it bother you to know that they're out there, muttering about "Dyson's crazy ideas"?

A: Not at all. Keep in mind, I'm also a perfectly respectable physicist, and the speculation is a hobby. It's become well known, but I've grown used to the idea that people very often become famous for accidental reasons. It's amusing to think that someday all my "serious" work will probably be a footnote in a textbook, when everybody remembers what I did on the side! Anyway, what do I have to lose? I have tenure here, and no one expects much from a theoretical physicist once he's past fifty anyway!

Joe Armstrong: interview: Joe Armstrong and Simon Peyton Jones discuss Erlang and Haskell

We can't stop our systems and globally check they are consistent and then relaunch them. We incrementally change bits and we recognize that they are inconsistent under short time periods and we live with that. Finding ways of living with failure, making systems that work, despite the fact they are inconsistent, despite the fact that failures occur. So our error models are very sophisticated.

When I see things like Scala or I see on the net there's this kind of "Erlang-like semantics", that usually means mailboxes and message boxes. It doesn't mean all the error handling, it doesn't mean the live code upgrade. The live upgrade of code while you are running a system needs a lot of deep plumbing under the counter -- it's not easy.

Dan Roam: The Back of the Napkin: Solving Problems and Selling Ideas with Pictures, p 133

The opposite of "simple" is not "complex," but rather "elaborate." ... One of the most important virtues of visual thinking is its ability to clarify things so that the complex can be better understood, but that does not mean that all good visual thinking is about simplification. The real goal of visual thinking is to make the complex understandable by making it visible -- not by making it simple. Whether that goal demands a simple picture, an elaborate one, or an intentionally complex one is almost always determined by the audience and its familiarity with the subject being addressed.

C.A.R. Hoare: Retrospective: An Axiomatic Basis for Computer Programming

I expected that research into the axiomatic method would occupy me for my entire working life; and I expected that its results would not find widespread practical application in industry until after I reached retirement age. These expectations led me in 1968 to move from an industrial to an academic career. And when I retired in 1999, both the positive and the negative expectations had been entirely fulfilled.

John Allison: Mild terror at 5pm

I grew up with Roald Dahl and Tove Jansson and Richmal Crompton and Ronald Searle. They were all masters of world-building and immersive stories. They never spoke down to readers and I can read a lot of their work as an adult with the same pleasure. If I want to keep doing this for the rest of my working life, I have to make something lasting like that.

Wikipedia: Hermann Grassmann

[Grassmann's theory of linear algebra] was a revolutionary text, too far ahead of its time to be appreciated. Grassmann submitted it as a Ph. D. thesis, but Möbius said he was unable to evaluate it and forwarded it to Ernst Kummer, who rejected it without giving it a careful reading. Over the next 10-odd years, Grassmann wrote a variety of work applying his theory, in the hope that these applications would lead others to take his theory seriously. ...

In 1862, Grassmann published a thoroughly rewritten second edition of A1, hoping to earn belated recognition for his theory of extension, and containing the definitive exposition of his linear algebra. It fared no better than A1, even though A2's manner of exposition anticipates the textbooks of the 20th century.

Disappointed at his inability to be recognized as a mathematician, Grassmann turned to historical linguistics. ... These philological accomplishments were honored during his lifetime.

Graham Nelson: Natural Language, Semantic Analysis and Interactive Fiction

The general reaction of experienced IF writers to early drafts of Inform 7 was a two-stage scepticism. First: was this just syntactic sugar, that is, a verbose paraphrase of the same old code? ... Second: perhaps this was indeed a fast prototyping tool for setting up the map and the objects, but would it not then grind into useless inflexibility when it came to coding up innovative behaviour -- in fact, would it be fun for beginners but useless to the real task at hand? It sometimes seemed to those of us working on Inform that an experienced IF author, shown Inform 7 for the first time, would go through the so-called Five Stages of Grief: Denial, Anger, Bargaining, Depression, and Acceptance. The following comment is typical of the Bargaining stage:

I would like to see it be as easy as possible to mix Inform 6 and Inform 7 code. [...] I also wonder if it might be possible to allow the user access to the Inform 6 code that the Inform 7 pre-processor creates. I can imagine some people wanting to use Inform 7 to lay out the outline of their game -- rooms, basic objects therein, and so on -- quickly, and then do the heavy lifting, so to speak, in Inform 6.

Matt Knox: Interview with an Adware Author

Most things don't have to be perfect. In particular, things involving human interactions don't have to be perfect, because groups of humans have all these self-regulations built in. If you and I have an agreement and you screwed me over badly, you've always got in the back of your mind the nagging worry that I'm going to show up on your doorstep with a club and kill you. Because of that, people don't tend to screw each other too much, right? At least, they try not to. One danger, perhaps, of moving towards an algorithmically driven society is that the algorithms aren't scared of us showing up and beating them up. The algorithms will do whatever it is that they are designed to do.

Michael Chabon: The Amazing Adventures of Kavalier & Clay, p 265

A surprising fact about the magician Bernard Kornblum was that he believed in magic. Not in the so-called magic of candles, pentagrams, and bat wings. Not in the kitchen enchantments of Slavic grandmothers with their herbiaries and parings from the little toe of a blind virgin tied up in a goatskin bag. Not in astrology, theosophy, chiromancy, dowsing rods, séances, weeping statues, werewolves, wonders, or miracles. All these Kornblum had regarded as fakery far different -- far more destructive -- than the brand of illusion he practiced, whose success, after all, increased in direct proportion to his audiences' constant, keen awareness that, in spite of all the vigilance they could bring to bear, they were being deceived.

Richard Feynman: Surely You're Joking, Mr. Feynman, p 92

In these discussions one man would make a point. Then Compton, for example, would explain a different point of view. He would say it should be this way, and he was perfectly right. Another guy would say, well, maybe, but there's this other possibility we have to consider against it.

So everybody is disagreeing, all around the table. I am surprised and disturbed that Compton doesn't repeat and emphasize his point. Finally, at the end, Tolman, who's the chairman, would say, "Well, having heard all these arguments, I guess it's true that Compton's argument is the best of all, and now we have to go ahead."

It was such a shock to me to see that a committee of men could present a whole lot of ideas, each one thinking of a new facet, while remembering what the other fella said, so that, at the end, the decision is made as to which idea was the best -- summing it all up -- without having to say it three times. These were very great men indeed.

Richard Feynman: Surely You're Joking, Mr. Feynman, p 92

[As a new professor] at Cornell, I'd work on preparing my courses, and I'd go over to the library a lot and read through the Arabian Nights and ogle the girls that would go by. But when it came time to do some research, I couldn't get to work. I was a little tired; I was not interested; I couldn't do research! This went on for what I felt was a few years ... I simply couldn't get started on any problem: I remember writing one or two sentences about some problem in gamma rays and then I couldn't go any further. I was convinced that from the war and everything else (the death of my wife) I had simply burned myself out.

... Then I had another thought: Physics disgusts me a little bit now, but I used to enjoy doing physics. Why did I enjoy it? I used to play with it. I used to do whatever I felt like doing -- it didn't have to do with whether it was important for the development of nuclear physics, but whether it was interesting and amusing for me to play with. When I was in high school, I'd see water running out of a faucet growing narrower, and wonder if I could figure out what determines that curve. I found it was rather easy to do. I didn't have to do it; it wasn't important for the future of science; somebody else had already done it. That didn't make any difference: I'd invent things and play with things for my own entertainment.

So I got this new attitude. Now that I am burned out and I'll never accomplish anything, I've got this nice position at the university teaching classes which I rather enjoy, and just like I read the Arabian Nights for pleasure, I'm going to play with physics, whenever I want to, without worrying about any importance whatsoever.

Within a week I was in the cafeteria and some guy, fooling around, throws a plate in the air. As the plate went up in the air I saw it wobble, and I noticed the red medallion of Cornell on the plate going around. It was pretty obvious to me that the medallion went around faster than the wobbling.

I had nothing to do, so I start to figure out the motion of the rotating plate. I discover that when the angle is very slight, the medallion rotates twice as fast as the wobble rate -- two to one. ...

I went on to work out equations of wobbles. Then I thought about how electron orbits start to move in relativity. Then there's the Dirac Equation in electrodynamics. And then quantum electrodynamics. And before I knew it (it was a very short time) I was "playing" -- working, really -- with the same old problem that I loved so much, that I had stopped working on when I went to Los Alamos: my thesis-type problems; all those old-fashioned, wonderful things.

It was effortless. It was easy to play with these things. It was like uncorking a bottle: Everything flowed out effortlessly. I almost tried to resist it! There was no importance to what I was doing, but ultimately there was. The diagrams and the whole business that I got the Nobel Prize for came from that piddling around with the wobbling plate.
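
For the curious, the two-to-one ratio falls out of the torque-free symmetric top. A minimal sketch, assuming an idealized thin uniform disk (so the moments of inertia satisfy I_3 = 2 I_1) tilted by a small angle \theta:

    % angular momentum L is fixed in space; the plate's symmetry axis
    % precesses ("wobbles") about L at the rate
    \Omega_{\mathrm{wobble}} = \frac{I_3 \, \omega_3}{I_1 \cos\theta}
    % with I_3 = 2 I_1 and \cos\theta \approx 1 for small tilt,
    \Omega_{\mathrm{wobble}} \approx 2 \, \omega_3

Here \omega_3 is the spin about the plate's own axis; retellings of the anecdote disagree about which of the two locked rates is the faster one, but the factor of two itself is robust.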

Clay Shirky: A Rant About Women

Not caring works surprisingly well. Another of my great former students, now a peer and a friend, saw a request from a magazine reporter doing a tech story and looking for examples. My friend, who'd previously been too quiet about her work, decided to write the reporter and say "My work is awesome. You should write about it."

The reporter looked at her work and wrote back saying "Your work is indeed awesome, and I will write about it. I also have to tell you you are the only woman who suggested her own work. Men do that all the time, but women wait for someone else to recommend them." My friend stopped waiting, and now her work is getting the attention it deserves.

Steven Levy: Hackers: Heroes of the Computer Revolution

Some planners would visit Homebrew and be turned off by the technical ferocity of the discussions, the intense flame that burned brightest when people directed themselves to the hacker pursuit of building. Ted Nelson, author of Computer Lib, came to a meeting and was confused by all of it, later calling the scruffily dressed and largely uncombed Homebrew people "chip-monks, people obsessed with chips. It was like going to a meeting of people who love hammers." Bob Albrecht rarely attended, later explaining that "I could understand only about every fourth word those guys were saying . . . they were hackers." Jude Milhon, the woman with whom Lee remained friends after their meeting through the Barb and their involvement in Community Memory, dropped in once and was repelled by the concentration on sheer technology, exploration, and control for the sake of control. She noted the lack of female hardware hackers, and was enraged at the male hacker obsession with technological play and power. She summed up her feelings with the epithet "the boys and their toys," and like Fred Moore worried that the love affair with technology might blindly lead to abuse of that technology.

Ethereal Bligh

I once heard Murray Gell-Mann lecture at the Santa Fe Institute on "creativity" and scientific discoveries. He utilized one very memorable metaphor. It was of ideas as particle energy states -- that they will find the locally lowest stable "well" to settle in. But that local low may not be the regional or global low, of course, and it takes an increase of energy to move the idea up out of the local low in order for it to find its way to something deeper. Here the idea was that the deeper the well, the "truer" the idea. And Gell-Mann's point was that it's a contrary thing to go up that well...that an essential characteristic of creativity (and intellectual discovery) is to do the unlikely, the counter-intuitive. Not always be contrary and go uphill, of course, that's worse than useless (something the cranks don't understand). But just enough at the right times to open up new vistas that were previously unimagined.

Douglas Adams: Speech at Digital Biota 2

Now imagine an early man surveying his surroundings at the end of a happy day's tool making.... Man the maker looks at his world and says 'So who made this then?' Who made this? ... Early man thinks, 'Well, because there's only one sort of being I know about who makes things, whoever made all this must therefore be a much bigger, much more powerful and necessarily invisible, one of me and because I tend to be the strong one who does all the stuff, he's probably male'. And so we have the idea of a god. Then, because when we make things we do it with the intention of doing something with them, early man asks himself, 'If he made it, what did he make it for?' Now the real trap springs, because early man is thinking, 'This world fits me very well. Here are all these things that support me and feed me and look after me; yes, this world fits me nicely' and he reaches the inescapable conclusion that whoever made it, made it for him.

This is rather as if you imagine a puddle waking up one morning and thinking, 'This is an interesting world I find myself in - an interesting hole I find myself in - fits me rather neatly, doesn't it? In fact it fits me staggeringly well, must have been made to have me in it!' This is such a powerful idea that as the sun rises in the sky and the air heats up and as, gradually, the puddle gets smaller and smaller, it's still frantically hanging on to the notion that everything's going to be alright, because this world was meant to have him in it, was built to have him in it; so the moment he disappears catches him rather by surprise.

Andreas Rossberg: comment on Lambda the Ultimate

[Regarding the Coverity static analysis tool for C to "find bugs in the real world"]: It's still depressing what incredible amounts of intellectual and monetary resources are wasted on problems many (most?) of which wouldn't even exist if people just used civilized languages.

[Reply from vrijz: These tools are intended to analyze existing code with little or no additional effort. To the extent that untyped, memory-unsafe languages are more convenient than languages with strong typing and safety guarantees, static analysis tools are useful.]

Yes, agreed. But my worry is that seemingly helpful tools like the one described may have the perverse effect of prolonging the life of broken code bases and languages -- especially if these tools are pragmatically tuned to fit the mindset of the more ignorant among their users, as described in the article. Maybe it would be better if these artifacts collapsed sooner rather than later?

Also, obviously, people continue starting new projects in utterly inadequate languages, and the availability of tools like this will likely feed the belief that that is a good idea.

Douglas Adams: Dirk Gently's Holistic Detective Agency, p 54

He felt a tug of sadness that someone who had seemed so shiningly alive within the small confines of a university community should have seemed to fade so much in the light of common day.

why the lucky stiff: Twitter

when you don't create things, you become defined by your tastes rather than ability. your tastes only narrow & exclude people. so create

Chip and Dan Heath: Switch: How to Change Things When Change Is Hard, p 82

You need a gut-smacking goal, one that appeals to both [the rational and the emotional mind]. ...

Goals in most organizations, however, lack emotional resonance. Instead, SMART goals -- goals that are Specific, Measurable, Actionable, Relevant, and Timely -- have become the norm. A typical SMART goal might be "My marketing campaign will generate 4500 qualified sales leads for the sales group by the end of Q3'09." SMART goals presume the emotion; they don't generate it.

The specificity of SMART goals is a great cure for the worst sins of goal setting -- ambiguity and irrelevance. ("We are going to delight our customers every day in every way!") But SMART goals are better for steady-state situations than for change situations, because the assumptions underlying them are that the goals are worthwhile. If you accept that generating 4500 leads for the sales force is a great use of your time, the SMART goal will be effective. But if a new boss, pushing a new direction, assigns you the 4500-leads goal even though you've never handled lead generation before, then there might be trouble. SMART goals presume the emotion; they don't generate it.

Henry David Thoreau: Civil Disobedience

Cast your whole vote, not a strip of paper merely, but your whole influence.

Bruce Sterling: The Hacker Crackdown: Law and Disorder on the Electronic Frontier

Technologies in their "Goofy Prototype" stage rarely work very well. They're experimental, and therefore halfbaked and rather frazzled. The prototype may be attractive and novel, and it does look as if it ought to be good for something-or-other. But nobody, including the inventor, is quite sure what. Inventors, and speculators, and pundits may have very firm ideas about its potential use, but those ideas are often very wrong.

The natural habitat of the Goofy Prototype is in trade shows and in the popular press. Infant technologies need publicity and investment money like a tottering calf needs milk.

Skaven: FAQ

If we want to get philosophical about it, one could say that the more you produce, the more the uniqueness of your work suffers. As people change over time, so does their work. Their work represents samples from a continuum of their personal development. More work along the way just gives a higher sample rate of this continuum, and won't necessarily introduce anything new. An increased volume of production does not always explore new possibilities; often it merely dwells on the already explored ones.

Chris Crawford: The History of Thinking

Thus, writing changed the way we thought. Don't think of it as a means of recording ideas, think of it as an instrument for exploring and examining ideas. Western civilization grew from the heady exploitation of this instrument of thinking. Indeed, the written page can be thought of as "artificial cortex", a technological means of augmenting the expansion of the sequential-processing portions of the brain. We humans were so impatient to grow more cortex, we went ahead and concocted an artificial version: paper and ink.

Neil Postman: Technology and Society (talk)

(Paraphrased) questions to ask of a new technology:

1. What is the problem to which this technology is the solution?
2. Whose problem is it?
3. What new problems might be created because we have solved the original problem?
4. Which people and what institutions might be most seriously harmed?
5. What changes in language are being enforced by the new technology?
6. What sort of people and institutions acquire special economic and political power from it?

David Foster Wallace: Life and Work

In the day-to-day trenches of adult life, there is actually no such thing as atheism. There is no such thing as not worshipping. Everybody worships. The only choice we get is what to worship. And an outstanding reason for choosing some sort of God or spiritual-type thing to worship -- be it J.C. or Allah, be it Yahweh or the Wiccan mother-goddess or the Four Noble Truths or some infrangible set of ethical principles -- is that pretty much anything else you worship will eat you alive. If you worship money and things -- if they are where you tap real meaning in life -- then you will never have enough. Never feel you have enough. It's the truth. Worship your own body and beauty and sexual allure and you will always feel ugly, and when time and age start showing, you will die a million deaths before they finally plant you... Worship power -- you will feel weak and afraid, and you will need ever more power over others to keep the fear at bay. Worship your intellect, being seen as smart -- you will end up feeling stupid, a fraud, always on the verge of being found out. And so on.

Look, the insidious thing about these forms of worship is not that they're evil or sinful; it is that they are unconscious. They are default-settings. They're the kind of worship you just gradually slip into, day after day, getting more and more selective about what you see and how you measure value without ever being fully aware that that's what you're doing.

Iris Chang: suicide letter

When you believe you have a future, you think in terms of generations and years. When you do not, you live not just by the day -- but by the minute.

Daniel Fontijne: Gaigen 2: a Geometric Algebra Implementation Generator

We consider geometric algebra to be the high-level "object-oriented" language for encoding geometry, whereas -- in this context -- linear algebra is more akin to assembly language.

John Lienhard: Engines of Our Ingenuity, #622: Ignaz Philipp Semmelweis

On a hunch, [Semmelweis] sets up a policy. Doctors must wash their hands in a chlorine solution when they leave the cadavers. Mortality from puerperal fever promptly drops to two percent. Now things grow strange. Instead of reporting his success at a meeting, Semmelweis says nothing....

As outside interest grows, we begin to understand Semmelweis's silence. The hospital director feels his leadership has been criticized. He's furious. He blocks Semmelweis's promotion. The situation gets worse. Viennese doctors turn on this Hungarian immigrant.... Finally, he goes back to Budapest...

Finally, in 1861, he writes a book on his methods. The establishment gives it poor reviews. Semmelweis grows angry and polemical. He hurts his own cause with rage and frustration.

Brad Templeton: Voluntary Taxes

In this county, a proposition... asks for a $29 levy on all properties to pay for medical programs for children. How could anybody vote against that? (I have not examined this proposition in detail, but generally when you see "motherhood" propositions on the ballot, particularly bonds, they have been put there by politicians who have other projects they know would not be popular. So they arrange a ballot proposition to raise money for something nobody could be against, which normally they would have had to spend general revenue on, and this frees up general revenue so they can spend it with less accountability.)

Richard Hamming: The Art of Doing Science and Engineering, p vi

Teachers should prepare the student for the student's future, not for the teacher's past.

Noam Chomsky: talk

[In the U.S.], "libertarian" means "extreme advocate of total tyranny"... It means power ought to be given into the hands of private unaccountable tyrannies. Even worse than state tyrannies, because there the public has some kind of role. But the corporate system, especially as it has evolved in the twentieth century, is pure tyranny. Completely unaccountable. You're inside one of these institutions, you take orders from above, you hand it down below... there's nothing you can say, tyrannies do what they feel like, they're global in scale. This is the extreme opposite of what has been called "libertarian" everywhere in the world since the Enlightenment.

Joe Armstrong: interview

I think I had come to [Ericsson's research lab] something like two years after it had started... Our view of the world was, yes, we'll solve problems and then we'll push them into projects and we will improve Ericsson's productivity. This view of the world wasn't yet tinged by any contact with reality. So we thought it would be easy to discover new and useful stuff and we thought that once we had discovered new and useful stuff then the world would welcome us with open arms. What we learned later was, it wasn't all that easy to discover new stuff. And it's incredibly difficult to get people to use new and better stuff.

LiberianRedditor: How can I find out if I am the only Redditor in Liberia?

I visited a Liberian friend yesterday who teaches a computer class; they teach Microsoft Office by describing it, and drawing on pieces of paper. He was excited that he was able to obtain a mouse and bring it in for people to see, so that their understanding wouldn't be as abstract.

Julian Assange

Non-conformity is not the adoption of some pre-existing alternative subculture.

Alex Kolesar and Joseph Kovell: No Need for Bushido FAQ

Q: What happened to the art/writing?

A: It got better.

Stewart Brand: Long Now talk

It seems like most people ask: "How can I throw my life away in the least unhappy way?"

Douglas Engelbart: interview

I got this wild dream in my head about what would help mankind the most, to go off and do something dramatic, and I just happened to get a picture of how, if people started to learn to interact with computers, in collective ways of collaborating together, and this was way back in the early 50s, so it was a little bit premature. So anyways, I had some GI bill money left still so I could just go after that, and up and down quite a bit through the years, and I finally sort of gave up.

Will Wright: interview

Or somebody might actually initiate a sequence of actions on their computer in a very creative way and the computer might recognize that, send it up to the server, and say: "Wow, that was an interesting sequence, and that person likes doing comedy romances. Let's try that on ten other people tomorrow. If those ten people respond well, let's try it on a hundred the next day." So it could be that the things aren't just randomly discovered, but they're also observed from what the players did specifically.

Douglas Adams: talk

Everybody puts little hidden jokes in stuff from time to time. There are quite a few little jokes in my books which only the person they directed at would get. It's a bit like people waving at complete strangers out of buses... just being friendly and saying hi.

The stuff at the beginning of Long Dark Tea Time Of The Soul about the harpsichord and the bailiffs was a joke at the expense of my great friend Michael Bywater, on whom the character of DG was to a certain extent based.

For instance in Life, The Universe And Everything I describe the way that the robot waiters and guests behave in the BistroMath ship. One of the guest robots keeps feeling under tables, insulting people and going on about some woman or other... I called him an AutoRory. Old friend of mine called Rory McGrath. That's exactly what he used to be like in restaurants. Don't know if he still is because I haven't gone to restaurants with him for a while, for obvious reasons. Not only did Rory get the joke. Anybody who had ever been to a restaurant with him or even just IN a restaurant with him got it.

Joshua Allen: Fireland

The site was a bunch of random pieces of writing. Fiction masquerading as non-fiction and vice versa, etc. Basically the same shit I'm doing today.

There have been many times when I've regretted investing so much time and energy in the internet. When I was embarrassed by the whole thing. But it occurred to me that just about every good thing I have in my life today has stemmed, directly or indirectly, from that site.

Matt Groening: interview at Mother Jones

[Re fighting FOX for creative control over Futurama] You can't believe what babies people are. It's really like being in junior high school. [With] the bullies, and every step of the way, any time I've been gracious, that has been -- it's seen as a sign of weakness. And every time I've yelled back, I've been treated with respect. That's just not very good psychology. The other thing is, it's just astonishing to have this lesson repeated over and over again: You can't expect people to behave in their own best interest. It's in Fox's best interest for this show to be a success, but they'd rather mess with the show and have them fail, than allow creators independence and let them succeed.

Paul Ford: The Web Is a Customer Service Medium

One can spend a lot of time defining a medium in terms of how it looks, what it transmits, wavelengths used, typographic choices made, bandwidth available. I like to think about media in terms of questions answered.

Kevin Kelly: My Life Countdown

I am now 55 years old. Like a lot of people in middle age my late-night thoughts bend to contemplations about how short my remaining time is. Even with increasing longevity there is not enough time to do all that I want. Nowhere close. My friend Stewart Brand, who is now 69, has been arranging his life in blocks of 5 years. Five years is what he says any project worth doing will take. From moment of inception to the last good-riddance, a book, a campaign, a new job, a start-up will take 5 years to play through. So, he asks himself, how many 5 years do I have left? He can count them on one hand even if he is lucky. So this clarifies his choices. If he has less than 5 big things he can do, what will they be?
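
Brand's arithmetic is easy to rerun; a throwaway sketch in Python, where the 85-year planning horizon is an assumption for illustration, not a number from the essay:

    # Stewart Brand's five-year blocks: how many projects fit
    # between a given age and a planning horizon?
    def blocks_left(age, horizon=85, block=5):
        return max(0, (horizon - age) // block)

    print(blocks_left(69))  # Brand at 69: 3 blocks -- countable on one hand
    print(blocks_left(55))  # Kelly at 55: 6 blocks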

Tom Stoppard: Arcadia

It is a defect of God's humour that he directs our hearts everywhere but to those who have a right to them.

Christopher Alexander: interview in Stewart Brand's "How Buildings Learn"

I think people have lost confidence in themselves. To a large extent, that's been done by members of my profession. The building has become the province of the architect; it's sort of his plaything. And the architects have worked quite hard to convince users that they don't know anything about architecture... It's even reached the point where people have interior decorators come choose their own wallpaper, for god's sake.

So this lack of confidence, which has been fostered in the population, is a manipulation that has actually been caused partly by the media, but largely by the [architecture] profession. It tremendously endangers the fabric of society, because if people lose confidence in themselves to that degree, then the adaptation of the environment, to common sense and to everyday use, disappears.

Jonathan Blow: talk: Video Games and the Human Condition

Game design is kind of a game by itself. I've made a bunch of puzzle games, and I've found that looking at a situation and saying, "How do I make an interesting puzzle out of this," is itself a really interesting puzzle. So there's this huge irony, that the companies that are making these social games that have basically no gameplay value in them are actually themselves playing a much more interesting game than the game that they're making for you to play. The game they're playing is this huge multidimensional optimization problem, trying to gather data and make the best decision and all that, but the game they're making for you to play is like clicking on a cow a bunch of times and you get some gold.

Jonathan Blow: talk: Design Reboot

When millions of people buy our game, we are pumping a (mental) substance into the (mental) environment. This is a public mental health issue. We have the power to shape humanity. How will we use it?

C.A.R. Hoare: The Emperor's Old Clothes

At first I hoped that such a technically unsound project would collapse, but I soon realized it was doomed to success. Almost anything in software can be implemented, sold, and even used given enough determination. There is nothing a mere scientist can say that will stand against the flood of a hundred million dollars. But there is one quality that cannot be purchased in this way - and that is reliability. The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay. ...

[Ada] has been initiated and sponsored by one of the world's most powerful organizations, the United States Department of Defense. Thus it is ensured of an influence and attention quite independent of its technical merits, and its faults and deficiencies threaten us with far greater dangers. For none of the evidence we have so far can inspire confidence that this language has avoided any of the problems that have afflicted other complex language projects of the past.

Alan Kay: The Early History of Smalltalk

A twentieth century problem is that technology has become too "easy". When it was hard to do anything whether good or bad, enough time was taken so that the result was usually good. Now we can make things almost trivially, especially in software, but most of the designs are trivial as well. This is inverse vandalism: the making of things because you can. Couple this to even less sophisticated buyers and you have generated an exploitation marketplace similar to that set up for teenagers. A counter to this is to generate enormous dissatisfaction with one's designs using the entire history of human art as a standard and goal. Then the trick is to decouple the dissatisfaction from self worth--otherwise it is either too depressing or one stops too soon with trivial results.

Wikipedia: Inverse problem

The field of inverse problems was first introduced by the Soviet-Armenian physicist Viktor Ambartsumian. ... [His] paper was published in 1929 in the German physics journal Zeitschrift für Physik and remained in oblivion for a rather long time. Describing this situation after many decades, Ambartsumian said, "If an astronomer publishes an article with a mathematical content in a physics journal, then the most likely thing that will happen to it is oblivion."

Jay Rosen: PressThink Basics: The Master Narrative in Journalism

Individual reports we can summarize, index, and criticize... but there is no reliable index to replicating patterns in news coverage. Your local newscaster may tell you, "here's a list of stories we're working on for NewsFour at 11:00," but there is nowhere listed the story forms from which this repetitive content flows. A given work of journalism will have an author's byline, but in some measure the author is always "journalism" itself and its peculiar habits of mind. You can't interview that guy.

Alan Kay: Doing With Images Makes Symbols

Jacques Hadamard, the famous French mathematician, in the late stages of his life, decided to poll his 99 buddies, who made up together the 100 great mathematicians and physicists on the earth, and he asked them, "How do you do your thing?" They were all personal friends of his, so they wrote back depositions. Only a few, out of the hundred, claimed to use mathematical symbology at all. Quite a surprise. All of them said they did it mostly in imagery or figurative terms. An amazing 30% or so, including Einstein, were down here in the mudpies [doing]. Einstein's deposition said, "I have sensations of a kinesthetic or muscular type." Einstein could feel the abstract spaces he was dealing with, in the muscles of his arms and his fingers...

The sad part of [the doing -> images -> symbols] diagram is that every child in the United States is taught math and physics through this [symbolic] channel. The channel that almost no adult creative mathematician or physicist uses to do it... They use this channel to communicate, but not to do their thing. Much of our education is founded on those principles, that just because we can talk about something, there is a naive belief that we can teach through talking and listening.

William Thurston: On proof and progress in mathematics

When a significant theorem is proved, it often (but not always) happens that the solution can be communicated in a matter of minutes from one person to another within the subfield. The same proof would be communicated and generally understood in an hour talk to members of the subfield. It would be the subject of a 15- or 20-page paper, which could be read and understood in a few hours or perhaps days by members of the subfield.

Why is there such a big expansion from the informal discussion to the talk to the paper? One-on-one, people use wide channels of communication that go far beyond formal mathematical language. They use gestures, they draw pictures and diagrams, they make sound effects and use body language. Communication is more likely to be two-way, so that people can concentrate on what needs the most attention. With these channels of communication, they are in a much better position to convey what's going on, not just in their logical and linguistic facilities, but in their other mental facilities as well.

In talks, people are more inhibited and more formal. Mathematical audiences are often not very good at asking the questions that are on most people's minds, and speakers often have an unrealistic preset outline that inhibits them from addressing questions even when they are asked.

In papers, people are still more formal. Writers translate their ideas into symbols and logic, and readers try to translate back.

Steven Johnson: Where Good Ideas Come From

Technological and scientific advances rarely break out of the adjacent possible; the history of cultural progress is, almost without exception, a story of one door leading to another door... But of course, every now and then an idea does occur to someone that teleports us forward a few rooms, skipping some exploratory steps in the adjacent possible. But those ideas almost always end up being short-term failures... we call them "ahead of their time". ...

Babbage had most of [his Analytical Engine] sketched out by 1837, but the first true computer to use this programmable architecture didn't appear for more than a hundred years. While the Difference Engine engendered an immediate series of refinements and practical applications, the Analytical Engine effectively disappeared from the map. Many of the pioneering insights that Babbage had hit upon in the 1830s had to be independently rediscovered by the visionaries of World War II-era computer science.

Howard Rheingold: The Millennium Whole Earth Catalog

If you want to maintain independence in the era of large institutions and think fresh thoughts in the age of mass media, you are going to need good tools.

Mike Birkhead: Depth vs Breadth in Combat Design

Depth is the Knowledge of How, and breadth is the Knowledge of Why.

Chris Hecker: talk, NYU Game Center Lecture Series

Q: At the [NYU] Game Center, we're interested in the role of the university as an alternate place for thinking about games... What in your opinion are some of the big interesting problems that students should be working on?

A: My advice for students is... I question the question. I don't think there are problems that students should be working on. I think students should be making games that are interesting and push the boundaries, and those will generate the problems.

Clay Shirky: Why We Need the New News Environment to be Chaotic

News has to be free, because it has to spread. The few people who care about the news need to be able to share it with one another and, in times of crisis, to sound the alarm for the rest of us. Newspapers have always felt a tension between their commercial and civic functions, but when a publication drags access to the news itself over to the business side, as with the paywalls at The Times of London or the Tallahassee Democrat, they become Journalism as Luxury. In a future dominated by Journalism as Luxury, elites will still get what they need (a tautology in market economies), but most communities will suffer; imagine Bell, California times a thousand, with no Ruben Vives to go after the politicians.

Sebastian Deterding: Don't Play Games With Me! Promises and Pitfalls of Gameful Design

Dozens of psychological studies have consistently shown that giving expected extrinsic rewards for an activity (e.g. "If you do x, I will give you y amount of cash/points/...") often reduces intrinsic motivation of people to do it. The first reason is that people feel controlled by the person giving the rewards, reducing their sense of autonomy... Secondly, giving a reward for an activity sends a strong social signal that you don't consider the activity worth doing for its own sake.

Steven Johnson: Interface Culture

If you live your entire life under the spell of television, the mental world you inherit from the TV -- the supremacy of images over text, the passive consumption, a preference for live events over historical contemplation -- seems like second nature to you. Only when another medium rolls into view does the television's influence become perceptible. When those paradigm shifts arrive only once every few centuries, you have to be a genuine visionary or a lunatic to see beyond the limits of the form. McLuhan, of course, was a little of both.

Steven Johnson: Interface Culture

Looking back now... what strikes you about the early days of the desktop metaphor is how many people resisted the idea, and how many simply didn't get it at all. The viability of the graphic interface is so far beyond question now that it's difficult to remember that there was ever a dispute about it. But if you sift through the original reviews of the Mac and the Lisa... you can't help but be struck by how hard a time the critics had wrapping their minds around the new paradigm.

Some of the reviews of the graphic interface struck the ridiculous real-men-don't-do-windows chord... as in this wag from Creative Computing magazine:

Icons and a mouse will not make a non-literate person literate. Pointing at pictures can last only so long. Sooner or later you must stop pointing and selecting, and begin to think and type.

The opposition now seems completely out of place to us, accustomed as we are to the way spatial metaphors can augment thought -- but to those first critics, the visual language seemed like child's play, or a cartoon. Other reviews missed the point altogether, dismissing the Mac as a tool that only artists and designers would have use for, as though the machine's major innovation was MacPaint's spray can and not the interface itself. Consider the editorial from Forbes, dated February 13, 1984:

[The Macintosh's] best features are for computer novices: MacPaint, a program that creates graphic designs of stunning complexity, and MacWrite, a word-processing program that goes to ingenious lengths to set up the screen to look like a typewriter. Both are controlled by the machine's "mouse," which moves the cursor without the user's touching the keyboard. Such simplicity is not aimed at big corporations. The average middle manager has little need for the graphics capability of MacPaint. Most managers have a hard enough time writing reports, without having to worry about designing them as well.

The ease with which the author dismisses the brilliance of those original programs ("such simplicity") is breathtaking, of course, but even more arresting is how the graphic interface itself flies completely below his radar. There's not even a passing reference to the potential virtues of organizing information visually... There's a puzzling literalness to the language: the author sees a graphic interface and immediately assumes that it must be useful only for graphic artists. The broader conceptual liberation promised by the graphic interface doesn't even occur to him.

Sir James Lighthill: discussion following The Recently Recognized Failure of Predictability in Newtonian Dynamics

Q: Do you regard the chaos [within Newtonian mechanics] as immutable, forever remaining inexplicable; and that no new data, no more exact observations or no future theory will ever be able to explain it? I have in mind that the history of science has revealed time and time again a state of affairs where observed phenomena have been seen as irrational, inexplicable and 'chaotic' according to received theory and accepted laws of science but that subsequent refinement of the data and/or new hypotheses, by offering a new explanatory schema, have revealed that a new order lay unperceived within the older chaos....

A: Perhaps I should make it clear that the results I described are not 'scientific theories'. They are mathematical results, based upon rigorous 'proof' in the mathematical sense. They are not capable of alteration therefore.

Admittedly the history of science confirms that our understanding of natural laws is constantly being further refined. Newtonian dynamics is itself an illustration of this because we have long recognized it as only an approximation to the true laws of mechanics...

My lecture, however, was about the mathematical properties of systems assumed to obey exactly the laws of Newtonian dynamics. The behaviour of such systems had long been thought to be completely predictable but is now known, for a certain proportion of such systems, to be 'chaotic' in a well defined sense.

Alan Kay: Programming and Scaling

Leonardo could not invent a single engine for any of his vehicles. Maybe the smartest person of his time, but he was born in the wrong time. His IQ could not transcend his time. Henry Ford was nowhere near Leonardo, but he happened to be born in the right century, a century in which people had already done a lot of work in making mechanical things...

Knowledge, in many many cases, trumps IQ. Why? This is because there are certain special people who invent new ways of looking at things. Henry Ford was powerful because Isaac Newton changed the way Europe thought about things. One of the wonderful things about the way knowledge works is if you can get a supreme genius to invent calculus, those of us with more normal IQs can learn it. So we're not shut out from what the genius does. We just can't invent calculus by ourselves, but once one of these guys turns things around, the knowledge of the era changes completely.

Alan Kay: Programming and Scaling

So we've got this present, it comes out of one set of things in the past that we're vaguely aware of, and gives rise to an incremental future. But the truth is that the past is vast. It's enormous! There are billions of people contributing to the past. And every time we think the present is real, we cannot see the rest of the past. So we have to destroy the present.

Once you get rid of it, it's a scary situation, because you said, "I'm not going to have anything based on the past." Of course that's not possible; you're just trying. But sometimes you get a little feeling. And this is not an idea; it's just a feeling. It's like an odor of perfume. But the fun thing is that little feeling can actually lead you to look in the past in different places than you normally do, and you can bring those up to that feeling. And once you do that, that feeling starts expanding into a vision, and the vision expands into an actual idea... Some of the most creative people I know actually operate this way. This is where those ideas come from that are not just incremental to the present. They come out of vague, even muscular sensations, that you have to go chasing to find out what they are. If you try to get the idea too early, it can only be in terms of the present.

B.N. Delone: Mathematics: Its Content, Methods, and Meaning, p 193

The inventors of the infinitesimal analysis [calculus] were already in possession of Descartes' method [of analytic geometry]. Whether it was a question of tangents or normals to curves, or of maxima or minima of functions considered geometrically, or of the radius of curvature of a curve at a given point, etc., the equation of the curve was considered first, by the method of Descartes, and then the equations of the normal, the tangent, and so forth, were found. Thus infinitesimal analysis, namely the differential and integral calculus, would have been inconceivable without the preliminary development of analytic geometry.

Stewart Brand: interview on Marketplace

In [Whole Earth Catalog] I focused on individual empowerment, [but in Whole Earth Discipline] the focus is on the aggregate effects of humans on things like climate. And some of these issues are of such scale that you got to have the governments doing things like making carbon expensive. Or making coal expensive to burn and putting all that carbon into the atmosphere. And individuals can't do that, individual communities can't do that. It takes national governments.

Stewart Brand: foreword to Unbounding the Future: the Nanotechnology Revolution

[Nanotechnology] will arrive piecemeal and prominently, but the consequences will arrive at a larger scale and often invisibly.

Perspective from within a bursting revolution is always a problem because the long view is obscured by compelling immediacies and the sudden traffic of people new to the subject, some seizing opportunity, some viewing with alarm. Both optimists and pessimists about new technologies are notorious for their tunnel vision.

The temptation always is to focus on a single point of departure or a single feared or desired goal. Sample point of departure: What if we can make anything out of diamond? Sample feared/desired goal: What if molecular-scale medicine lets people live for centuries?

We're not accustomed to asking, What would a world be like where many such things are occurring? Nor do we ask, What should such a world be like?

Steven Levy: Insanely Great

Since gathering the wealth of Croesus was not Engelbart's goal, one might reasonably assume that his achievements brought him satisfaction. When I suggested as much, Engelbart curtly gestured to his system... He wanted this system, his system, everywhere. But he had no control over the future. His vision was at the mercy of those he inspired.

Steven Levy: Bill and Andy's Excellent Adventure II

Bill [Atkinson]'s problem with his employer's oversight was not so much ego, as a matter of his deeply ingrained sense of fairness. Bill has a radar for the personal angle, and the idea of one person gaining an unearned edge over another is loathsome to him. ... He thinks that "business as usual" is no excuse for not doing what's right.

The second thing crucial to Bill is his need to get his products out into the world. He bears scars from those times when a project of his failed to reach the public. He loved the idea that Apple bundled his MacPaint with every Macintosh, and he was crushed when the company decided that his post-Mac project, a flat-pad communicating computer called Magic Slate, was too esoteric a product to begin developing in 1985. He went into a depression, not working for months, until one night he wandered out of his house in the Los Gatos hills, stared at the star-filled sky, and had an epiphany: In the face of the awesome celestial epic, what was the point of being depressed? All you could do, really, was use your abilities to do what you could to make a little part of the universe better. And Bill Atkinson went back into the house and began using his abilities to work on a new project that would become known as HyperCard.

Dan Bricklin: interview

Q: Do you ever feel that the fame of VisiCalc has overshadowed some of your more recent accomplishments?

A: It had better. VisiCalc was a pretty big thing to have done, and I'm very happy that I had the opportunity to make such a big contribution to the world.

John Gall: Systemantics

A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system.

William S. Anglin: Mathematics and history

Mathematics is not a careful march down a well-cleared highway, but a journey into a strange wilderness, where the explorers often get lost. Rigour should be a signal to the historian that the maps have been made, and the real explorers have gone elsewhere.

Frank Lantz: re Ian Bogost

He probably would be happier if he hadn't made Cow Clicker, but he doesn't want to be happy. I don't know what his goal in life is, but it's not to be happy. He's definitely better off having made this thing that has made him so unhappy.

Gerald Jay Sussman: We Really Don't Know How To Compute! (40:30)

I'm only pushing this idea, not because I think it's the right answer. I'm trying to twist us, so we say, "This is a different way to think." We have to think fifty-two different ways to fix this problem. I don't know how to make a machine that builds a person out of a cell. But I think the problem is that we've been stuck for too long diddling with our details. We've been sitting here worrying about our type system, when we should be worrying about how to get flexible machines and flexible programming.

Trudy Cooper and Doug Bayne: Ask Oglaf anything

Writing's the exact opposite of jerking off -- hurts while you're doing it, but afterwards you're proud of the little mess you made.

Paul Baran: interview with Stewart Brand

You say, "My God, one day this is how we're going to build all our networks." It's such a wild thought that you ask, "Am I fooling myself?"

I took out the briefing charts and went around to present this idea [of digital packet switching], and got dumped all over. People said it wouldn't work because of this reason or that reason. I would study the problem and come back. A good wire-brushing like this was necessary. You see, these ideas were crazy. We were in an analog world. The image of a computer was a great big room with parts failing all the time. I said, "You can build computers in shoe box size. It's already happening on board airplanes."

E. T. Jaynes: Probability Theory, chapter 5: Queer uses for probability theory

Issuing reports of sensational data defeats its own purpose. For if the prior probability of deception is greater than that of ESP, then the more improbable the alleged data are on the null hypothesis of no deception and no ESP, the more strongly we are led to believe, not in ESP, but in deception...

Laplace perceived this phenomenon long ago... He notes that those who make recitals of miracles, "decrease rather than augment the belief which they wish to inspire; for then those recitals render very probable the error or the falsehood of their authors. But that which diminishes the belief of educated men often increases that of the uneducated, always avid for the marvelous."

Indeed, the [author] found himself a victim of this phenomenon... We applied Bayesian analysis to estimation of frequencies of nonstationary sinusoidal signals... We found -- as was expected on theoretical grounds -- an improved resolution over the previously used Fourier transform methods.

If we had claimed a 50% improvement, we would have been believed at once, and other researchers would have adopted this method eagerly. But in fact we found orders of magnitude improvement in resolution. It was, in retrospect, foolish of us to mention this at the outset, for in the minds of others the prior probability that we were irresponsible charlatans was greater than the prior probability that a new method could possibly be that good; and we were not at first believed.
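
The arithmetic behind this is worth making explicit. A minimal worked version, with illustrative assumptions: write D for the sensational data, suppose the data are certain under either alternative (deception or ESP), and let them have probability \varepsilon under the null hypothesis. Bayes' rule, P(H_i \mid D) \propto P(D \mid H_i) P(H_i), then gives

    \frac{P(\mathrm{ESP} \mid D)}{P(\mathrm{deception} \mid D)}
      = \frac{P(D \mid \mathrm{ESP}) \, P(\mathrm{ESP})}
             {P(D \mid \mathrm{deception}) \, P(\mathrm{deception})}
      = \frac{P(\mathrm{ESP})}{P(\mathrm{deception})}

As \varepsilon \to 0, the null's posterior mass vanishes, but it is redistributed to the alternatives in proportion to their priors; if deception starts out the more plausible alternative, ever more improbable data only entrench it.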

Karl Popper: Realism and the Aim of Science

My subject does not exist because subject matters in general do not exist. There are no subject matters; no branches of learning -- or, rather, of inquiry: there are only problems, and the urge to solve them. A science such as botany or chemistry is, I contend, merely an administrative unit. University administrators have a difficult job anyway, and it is a great convenience to them to work on the assumption that there are some named subjects, with chairs attached to them to be filled by the experts in these subjects. It has been said that the subjects are also a convenience to the student. I do not agree: even serious students are misled by the myth of the subject. And I should be reluctant to call anything that misleads a person a convenience to that person.

James Clerk Maxwell: Scientific Papers of James Clerk Maxwell, Vol II

Mathematicians may flatter themselves that they possess new ideas which mere human language is as yet unable to express. Let them make the effort to express these ideas in appropriate words without the aid of symbols, and if they succeed they will not only lay us laymen under a lasting obligation, but, we venture to say, they will find themselves very much enlightened during the process, and will even be doubtful whether the ideas as expressed in symbols had ever quite found their way out of the equations into their minds.

Arturo Bejar: State Bundles for Persistence

Do you, Programmer,
take this Object to be part of the persistent state of your application,
to have and to hold,
through maintenance and iterations,
for past and future versions,
as long as the application shall live?

Alan Kay: The Early History of Smalltalk

New ideas go through stages of acceptance, both from within and without. From within, the sequence moves from "barely seeing" a pattern several times, then noting it but not perceiving its "cosmic" significance, then using it operationally in several areas, then comes a "grand rotation" in which the pattern becomes the center of a new way of thinking, and finally, it turns into the same kind of inflexible religion that it originally broke away from. From without, as Schopenhauer noted, the new idea is first denounced as the work of the insane, in a few years it is considered obvious and mundane, and finally the original denouncers will claim to have invented it.

Tevis Thompson: Saving Zelda

If Zelda is to reclaim any of the spirit that Miyamoto first invested in its world... it needs to make most of the map accessible from the beginning. No artificial barriers to clumsily guide Link along a set course... Link must be allowed to enter areas he's not ready for. He must be allowed to be defeated, not blocked, by the world and its inhabitants.

This world, dangerous, demanding exploration, must also be mysterious. This means: illegible, at least at first... How can you truly explore if you know how everything works already? How can you ever be surprised if every "secret" is conspicuously marked as such?

The point of a hero's adventure... is not to make you feel better about yourself. The point is to grow, to overcome, to in some way actually become better. If a legendary quest has no substantial challenge, if it asks nothing of you except that you jump through the hoops it so carefully lays out for you, then the very legend is unworthy of being told, and retold.

To do this, Hyrule must become more indifferent to the player. It must aspire to ignore Link. Zelda has so far followed a spirit of indulgence in its loving details, a carefully crafted adventure that reeks of quality and just-for-you-ness. But a world is not for you. A world needs a substance, an independence, a sense that it doesn't just disappear when you turn around (even if it kinda does). It needs architecture, not level design with themed wallpaper, and environments with their own ecosystems (which were doing just fine before you showed up). Every location can't be plagued with false crises only you can solve, grist for the storymill.

Richard Hamming: One Man's View of Computer Science (1969)

This brings me to another distinction, that between undirected research and basic research. Everyone likes to do undirected research and most people like to believe that undirected research is basic research. I am choosing to define basic research as being work upon which people will in the future base a lot of their work. After all, what else can we reasonably mean by basic research other than work upon which a lot of later work is based? I believe experience shows that relatively few people are capable of doing basic research. While one cannot be certain that a particular piece of work will or will not turn out to be basic, one can often give fairly accurate probabilities on the outcome... What determines whether or not a piece of work has much chance to become basic is not so much the question asked as it is the way the problem is attacked.

Danny Hillis: quoted in "What Technology Wants" by Kevin Kelly, p142

There might be tens of thousands of people who conceive the possibility of the same invention at the same time. But less than one in ten of them imagines how it might be done. Of these who see how to do it, only one in ten will actually think through the practical details and specific solutions. Of these only one in ten will actually get the design to work for very long. And finally, usually only one of all those many thousands with the idea will get the invention to stick in the culture.

Hans Christian Von Baeyer: Warmth Disperses and Time Passes: The History of Heat, p38

In 1823, [Sadi Carnot] was ready to publish what he had discovered. Before putting pen to paper, he had adopted two guidelines, each admirable in its own right, but fatal in combination: By neatly canceling each other out, they condemned his book to almost total oblivion. First, he decided to address himself to the general public rather than an audience of scientists and engineers. This decision establishes the book, which much later assumed its rightful place among the classics of science, as the last member of a noble tradition. Galileo himself had started the trend by writing in popular Italian instead of Latin, by keeping mathematical details to a minimum, and by perfecting a lively literary style. Galileo's writings were enormously influential, but after the time of Newton another genre, densely mathematical in content and highly professional in tone, had become predominant, particularly in physics.

Carnot's second guideline, and the essence of his greatness, was to embrace generality. Inspired by his father, who had written a successful book on the analysis of simple mechanical machines, Carnot undertook to develop a general theory of steam engines that would rise above the practical questions of design and materials that were of immediate interest to engineers.

A popular explanation of the advantages of the steam engine, or a general treatise of the theory of extracting work from heat, might have made its mark. But the public was too unsophisticated to understand a general theory, and the technical people too contemptuous to bother with what seemed to be a popularization of a complex subject. By trying to address two audiences at once, Carnot excluded both. His Reflections on the Motive Power of Fire received only one, albeit enthusiastic, review, and a decade later, three years after its author's death at thirty-six, one single citation in a science text alone bore the burden of keeping his memory alive.

Alan Kay: The Early History of Smalltalk

Even very young children can understand and use interactive transformational tools. The first ones are their hands! They can readily extend these experiences to computer objects and making changes to them. They can often imagine what a proposed change will do and not be surprised at the result... They can answer any question whose answer requires the application of just one of these tools. But it is extremely difficult for them to answer any question that requires two or more transformations. Yet they have no problem applying sequences of transformations, exploring "forward." It is for conceiving and achieving even modest goals requiring several changes that they almost completely lack navigation abilities.

It seems that what needs to be learned and taught is how to package up transformations in twos and threes in a manner similar to learning a strategic game like checkers. The vague sense of a "threesome" pointing towards one's goal can be a set up for the more detailed work that is needed to accomplish it.

Richard Gabriel: The Design of Parallel Programming Languages

John [McCarthy]'s world is a world of ideas, a world in which ideas don't belong to anyone, and when an idea is wrong, just the idea - not the person - is wrong. A world in which ideas are like young birds, and we catch them and proudly show them to our friends. The bird's beauty and the hunter's are distinct....

Some people won't show you the birds they've caught until they are sure, certain, positive that they - the birds, or themselves - are gorgeous, or rare, or remarkable. When your mind can separate yourself from your bird, you will share it sooner, and the beauty of the bird will be sooner enjoyed. And what is a bird but for being enjoyed?

Charles Babbage: quoted in "The Information" by James Gleick, p104

[Babbage] was developing a sour view of the Englishman's attitude toward technological innovation: "If you speak to him of a machine for peeling a potato, he will pronounce it impossible: if you peel a potato with it before his eyes, he will declare it useless, because it will not slice a pineapple."

Daniel Dennett: Darwin's Dangerous Idea, p346

I don't know about you, but I am not initially attracted by the idea of my brain as a sort of dung heap in which the larvae of other people's ideas renew themselves, before sending out copies of themselves in an informational diaspora.... Who's in charge, according to this vision -- we or our memes?

Tycho: Penny Arcade

Like most readers, I had functionally consigned [our game] to the furnace. I had let it float away on one of those little lantern boats in a way that brought me closure, if no one else. Insufficient. Fucking insufficient.

You have to get back on the horse. Somehow, and I don't know how this kind of thing starts, we have started to lionize horseback-not-getting-on: these casual, a priori assertions of inevitable failure, which is nothing more than a gauze draped over your own pulsing terror. Every creative act is open war against The Way It Is. What you are saying when you make something is that the universe is not sufficient, and what it really needs is more you. And it does, actually; it does. Go look outside. You can't tell me that we are done making the world.

James Gleick: The Information, p400

As a duplicating machine, the printing press not only made texts cheaper and more accessible; its real power was to make them stable... All forms of knowledge achieved stability and permanence, not because paper was more durable than papyrus but simply because there were many copies.

Alfred North Whitehead: An Introduction to Mathematics (1910), p20

From the earliest epoch (2634 BC) the Chinese had utilized the characteristic property of the compass needle, but do not seem to have connected it with any theoretical ideas. The really profound changes in human life all have their ultimate origin in knowledge pursued for its own sake.... The importance which the science of electromagnetism has since assumed in every department of human life is not due to the superior practical bias of Europeans, but to the fact that in the West electrical and magnetic phenomena were studied by men who were dominated by abstract theoretic interests.

Chris Hecker: interview

Q: Are you ever afraid of someone stealing your thunder, especially when you've been quite open about development and showing off your games for some time now - i.e. if a game comes out that happens to have the same puzzle hook as The Witness or the same kind of competitive aspects as Spy Party?

A: I think anybody really good is going to want to do their own thing. Anybody who's not really good, you don't have to worry too much about.

Jon Gertner: The Idea Factory: Bell Labs and the Great Age of American Innovation

It was curious, in a way, who they were, these men coming to Bell Labs in New York. Most had been trained at first-rate graduate schools like MIT and Chicago and Caltech; they had been flagged by physics or chemistry or engineering professors at these places and their names had been quietly passed along to Mervin Kelly or someone else at the Labs. But most had been raised in fly-speck towns, intersections of nowhere and nowhere, places with names like Chickasa or Quaker Neck or Petoskey, towns like the one Kelly had come from, rural and premodern like Gallatin, towns where their fathers had been fruit growers or merchants or small-time lawyers. Almost all of them had found a way out -- a high school teacher, oftentimes, who noticed something about them, a startling knack for mathematics, for example, or an insatiable curiosity about electricity, and had tried to nurture this talent with extra assignments or after-school tutoring, all in the hope (never explained to the young men but realized by them all, gratefully, many years later) that the students could be pushed toward a local university and away from the desolation of a life behind a plow or a cash register.

The young Bell Labs recruits had other things in common. Almost all had grown up with a peculiar desire to know more about the stars or the telephone lines or (most often) the radio, and especially their makeshift home wireless sets. Almost all of them had put one together themselves, and in turn had discovered how sound could be pulled from the air.

Will Wright: Gaming Reality

We have this limited bubble of experience. We can only have so many experiences in our lifetime to build models from, and we're abstracting from that data. We've found, through evolution, two ways to get more data, to build more elaborate models of the world. One is to have toy experiences, little counterfeit experiences. The other one is to learn from the experience of others. When somebody tells you a story, you can actually learn from that story, incorporate it into your model of the world to make your model more accurate based upon that data that you got from somebody else. So over time, we have come to call one of these things "play" and the other one "storytelling". These are both fundamentally educational technologies that allow us to build more elaborate models of the world around us, by supplanting our limited experience with other experiences.

Charles Geschke: quoted in "Dealers of Lightning" by Michael Hiltzik, p273

The typical posture and demeanor of the Xerox executives, and all of them were men, was this -- [arms folded sternly across the chest]. But their wives would immediately walk up to the machines and say, "Could I try that mouse thing?" That's because many of them had been secretaries -- users of the equipment. These guys, maybe they punched a button on a copier one time in their lives, but they had someone else do their typing and their filing. So we were trying to sell to people who really had no concept of the work this equipment was actually accomplishing.

Alan Kay: The Center of "Why?"

Living organisms are shaped by evolution to survive, not necessarily to get a clear picture of the universe. For example, frogs' brains are set up to recognize food as moving objects that are oblong in shape. So if we take a frog's normal food -- flies -- paralyze them with a little chloroform and put them in front of the frog, it will not notice them or try to eat them.

It will starve in front of its food! But if we throw little rectangular pieces of cardboard at the frog it will eat them until it is stuffed! The frog only sees a little of the world we see, but it still thinks it perceives the whole world.

Now, of course, we are not like frogs! Or are we?

Alan Kay: quoted in "Dealers of Lightning" by Michael Hiltzik

It's almost impossible for most people to see technology as the tool rather than the end. People get trapped in thinking that anything in the environment is to be taken as a given. It's part of the way our nervous system works. But it's dangerous to take it as a given because then it controls you, rather than the other way around. That's McLuhan's insight, one of the bigger ones in the twentieth century. Zen in the twentieth century is about taking things that have been rendered invisible by this process and trying to make them visible again.

Thomas Kuhn: Reflection on my Critics

[Wikipedia's paraphrase] Kuhn expressed the opinion that his critics' readings of his book were so inconsistent with his own understanding of it that he was "...tempted to posit the existence of two Thomas Kuhns," one the author of his book, the other the individual who had been criticized in the symposium by "Professors Popper, Feyerabend, Lakatos, Toulmin and Watkins."

Nina Paley: interview about "Sita Sings the Blues"

Q: Do you think that women directors bring a distinctive perspective and if so, how would you describe it? How would this story be different if told by a man? Or would a man not tell this story?

A: This story was told by me as an individual. An individual brings their individual characteristics and experience to a story. I happen to be a woman, but I'm a specific woman, not womankind in general. I can't tell you how other women would direct a particular film, or other men. We're all unique.

Richard Hamming: The Art of Doing Science and Engineering, ch 4

In the beginning we programmed in absolute binary... Finally, a Symbolic Assembly Program was devised -- after more years than you are apt to believe during which most programmers continued their heroic absolute binary programming. At the time [the assembler] first appeared I would guess about 1% of the older programmers were interested in it -- using [assembly] was "sissy stuff", and a real programmer would not stoop to wasting machine capacity to do the assembly.

Yes! Programmers wanted no part of it, though when pressed they had to admit their old methods used more machine time in locating and fixing up errors than the [assembler] ever used. One of the main complaints was when using a symbolic system you do not know where anything was in storage -- though in the early days we supplied a mapping of symbolic to actual storage, and believe it or not they later lovingly pored over such sheets rather than realize they did not need to know that information if they stuck to operating within the system -- no! When correcting errors they preferred to do it in absolute binary.

FORTRAN was proposed by Backus and friends, and again was opposed by almost all programmers. First, it was said it could not be done. Second, if it could be done, it would be too wasteful of machine time and capacity. Third, even if it did work, no respectable programmer would use it -- it was only for sissies!

Max Planck

New scientific ideas never spring from a communal body, however organized, but rather from the head of an individually inspired researcher who struggles with his problems in lonely thought and unites all his thought on one single point which is his whole world for the moment.

Max Planck

A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.

David Hestenes: Notes for a Modeling Theory of Science, Cognition, and Instruction

While science is a search for structure, mathematics is the science of structure.

John D. Cook: Most published research results are false

Here's an example that shows how p-values can be misleading. Suppose you have 1,000 totally ineffective drugs to test. About 1 out of every 20 trials will produce a p-value of 0.05 or smaller by chance, so about 50 trials out of the 1,000 will have a "significant" result, and only those studies will publish their results. The error rate in the lab was indeed 5%, but the error rate in the literature coming out of the lab is 100%!
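
Cook's arithmetic is easy to check by simulation. A minimal sketch in Python, assuming numpy and scipy's two-sample t-test (the trial sizes and variable names are illustrative, not from the source):

    # Test 1,000 drugs with zero true effect and count how many clear
    # p < 0.05 anyway -- the only ones that would reach the literature.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_drugs, n_subjects = 1000, 50

    significant = 0
    for _ in range(n_drugs):
        treatment = rng.normal(0.0, 1.0, n_subjects)  # ineffective by construction
        control = rng.normal(0.0, 1.0, n_subjects)
        _, p = stats.ttest_ind(treatment, control)
        if p < 0.05:
            significant += 1  # a false positive headed for publication

    print(f"{significant} of {n_drugs} ineffective drugs looked significant")
    # Expect roughly 50; since no drug works, every published "success" is
    # wrong -- the 100% error rate in the literature that Cook describes.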

Freeman Dyson: quoted in Richard Feynman: No Ordinary Genius

[The Feynman diagram approach to quantum electrodynamics] was combining this very pictorial approach with strict adherence to quantum mechanics. And that's what made it so original. Quantum mechanics is generally regarded as a theory of waves. Feynman was able to do it by ignoring the wave aspect completely. The pictures show you just particles traveling along in straight lines. These then were translated into mathematics, but in a very simple fashion, so that once you had the geometrical picture, it was simple to go straight to the answer. And that made his methods very powerful, as compared to the conventional way of doing things, which is much more analytical.

Clay Shirky: Napster, Udacity, and the Academy

Every college provides access to a huge collection of potential readings, and to a tiny collection of potential lectures. We ask students to read the best works we can find, whoever produced them and where, but we only ask them to listen to the best lecture a local employee can produce that morning. Sometimes you're at a place where the best lecture your professor can give is the best in the world. But mostly not. And the only thing that kept this system from seeming strange was that we've never had a good way of publishing lectures. ...

The fight over [massive open online courses] is really about the story we tell ourselves about higher education: what it is, who it's for, how it's delivered, who delivers it. The most widely told story about college focuses obsessively on elite schools and answers a crazy mix of questions: How will we teach complex thinking and skills? How will we turn adolescents into well-rounded members of the middle class? Who will certify that education is taking place? How will we instill reverence for Virgil? Who will subsidize the professor's work?

Carver Mead: "Feynman as a Colleague", chapter in "Feynman and Computation"

Gordon Moore asked me whether tunneling would be a major limitation on how small we could make transistors in an integrated circuit. That question took me on a detour that was to last nearly 30 years.... I decided to make the question the subject of a talk. As I prepared for this event, I began to have serious doubts about my sanity. My calculations were telling me that, contrary to all the current lore in the field, we could scale down the technology such that everything got better. The circuits got more complex, they ran faster, and they took less power -- WOW! The more I looked at the problem, the more I was convinced that the result was correct, so I went ahead and gave the talk. That talk provoked considerable debate, and at the time most people didn't believe the result. But by the time the next workshop rolled around, a number of other groups had worked through the problem for themselves, and we were pretty much all in agreement.

Back in 1959, Feynman had given a lecture entitled "There's Plenty of Room at the Bottom". That talk had made a big impression on me... I became completely absorbed with how the exponential increase in complexity of integrated circuits would change the way that we think about computing. The viewpoint of the computer industry at the time was an outgrowth of the industrial revolution; it was based on what was then called "the economy of scale." A 1000-horsepower engine cost only four times as much as a 100-horsepower engine. Therefore, the cost per horsepower became less as the engine was made larger. It was more cost effective to make a few large power plants than to make many small ones. Efficiency considerations favored the concentration of technology in a few large installations. The same was evidently true of computing. IBM was particularly successful following this strategy.

But as I looked at the physics of the emerging technology, it didn't work that way at all. The time required to move data was set by the velocity of light and related electromagnetic considerations, so it was far more effective to put whatever computing was required where the data were located. Efficiency considerations thus favored the distribution of technology, rather than the concentration of technology. The economics of information technology were the reverse of those of mechanical technology. I gave numerous talks on this topic and, at that time, what I had to say was contrary to what the industry wanted to hear.

Douglas Hofstadter: interview about "I Am a Strange Loop"

Q: Will the mind one day understand itself?

Depends on what you mean by understand itself. If you mean in broad-principle terms if we will come to understand things, yeah, I don't see why not. For example, I like to look back at Freud. I don't know when it was that he first published his ideas about the ego, the id and the superego, and I don't know how much truth there is to those ideas, but it was a big leap even if it wasn't completely correct, because nobody had ever spoken of the abstract architecture of a human soul or a human self. It's as if he were saying that a self can be thought of in an abstract way, the way a government is thought of, with a legislative branch, a judicial, an executive, and he was making guesses at what the architecture of a human self is. And maybe they were all wrong, but it doesn't matter; the point is it was a first stab. Like the Bohr atom, it was a wonderful intuitive leap.

Freeman Dyson: The Scientist as Rebel

My message is that science is a human activity, and the best way to understand it is to understand the individual human beings who practice it. Science is an art form and not a philosophical method. The great advances in science usually result from new tools rather than from new doctrines. ... Every time we introduce a new tool, it always leads to new and unexpected discoveries, because Nature's imagination is richer than ours.

John Holt: How Children Fail

Knowledge, learning, understanding, are not linear. They are not little bits of facts lined up in rows or piled up one on top of another. A field of knowledge, whether it be math, English, history, science, music, or whatever, is a territory, and knowing it is not just a matter of knowing all of the items in the territory, but of knowing how they relate to, compare with, and fit in with each other... It is the difference between knowing the names of all the streets in a city and being able to get from any place, by any desired route, to any other place.

Why do we talk and write about the world and our knowledge of it as if they were linear? Because that is the nature of talk. Words come out in single file, one at a time; there's no other way to talk or write. So in order to talk about it, we cut the real undivided world into little pieces, and make these into strings of talk, like beads on a necklace. But we must not be fooled; these strings of talk are not what the world is like. Our learning is not real, not complete, not accurate, above all not useful, unless we take these word strings and somehow convert them in our minds into a likeness of the world, a working mental model of the universe as we know it. Only when we have made such a model, and when there is at least a rough correspondence between that model and reality, can it be said of us that we have learned something.

John Holt: How Children Fail

I gave Marjorie 2 rods, and asked how many differently shaped rectangles she could make by putting them together. She saw there was only one... With 4 rods, there were two possible rectangles, a 1 x 4 and a 2 x 2. And so we worked our way up to 20, finding the factors of each number along the way... At no time on the way up to 20 did it occur to her that she could solve the problem by making use of what little she knew about factors. Given 10 rods, she did not think, "I can make a rectangle 5 rods long and 2 wide"; she had to work by trial and error each time. But she did get progressively quicker at seeing which combinations were possible and which were not.

I did not see until later that this increased quickness and skill was the beginning, the seed of a generalized understanding. An example comes to mind that was repeated many times. When the children had 12 rods, they made a 6 x 2 rectangle. Then they divided that rectangle in half and put the halves together to make a 4 x 3 rectangle. As they worked, their attack on the problem became more economical and organized. They were a long way from putting their insights and understandings into words, but they were getting there. The essential is that this sort of process not be rushed.
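
Read as arithmetic, Holt's rod game is simple factoring: the differently shaped rectangles makeable from n rods are exactly the factor pairs of n. A minimal sketch in Python (the function name is mine, purely illustrative):

    # The rectangles makeable from n unit rods correspond to factor pairs of n.
    def rectangles(n):
        """Return (width, height) pairs with width * height == n, width <= height."""
        return [(w, n // w) for w in range(1, n + 1) if n % w == 0 and w <= n // w]

    for n in (2, 4, 10, 12):
        print(n, rectangles(n))
    # 2 [(1, 2)]                    -- the single shape Marjorie saw
    # 4 [(1, 4), (2, 2)]            -- the two rectangles in the quote
    # 10 [(1, 10), (2, 5)]          -- "5 rods long and 2 wide"
    # 12 [(1, 12), (2, 6), (3, 4)]  -- the 6 x 2 and 4 x 3 the children made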

John Holt: How Children Learn

I went back to the [sliding pieces] puzzle many times, hoping that I would find some fresh approach to it; but my mind kept moving back into the little groove it had made for itself. I tried to make myself forget my supposed proof that the problem was impossible. No use. Before long I would be back at the business of trying to find the flaw in my reasoning. I never found it. Like many other people, in many other situations, I had reasoned myself into a box. Looking back at the problem, I saw my great mistake. I had begun to reason too soon, before I had allowed myself enough "Messing About," before I had built a good enough mental model of the ways in which those pieces moved, before I had given myself enough time to explore all of the possible ways in which they could move. The reason some of the children were able to do the puzzle was not that they did it blindly, but that they did not try to solve it by reason until they had found by experience what the pieces could do. Because their mental model of the puzzle was complete, it served them; because mine was incomplete, it failed me.

David Hawkins: Messing About in Science

When the mind is evolving the abstractions which will lead to physical comprehension, all of us must cross the line between ignorance and insight many times before we truly understand.

Loren Eiseley: The Man Who Saw Through Time

We of today have difficulty in realizing that the world of Bacon and Shakespeare was only semiliterate, steeped in religious contention, with its gaze turned backward in wonder upon the Greco-Roman past. Oswald Spengler justly remarks that human choice is only possible within the limitations and idea-forms of a given age. More than three hundred years ago, Francis Bacon would have understood him. Bacon's world horribly constricted his ability to exert his will upon it. At the same time he would have had a slight reservation. "Send out your little book upon the waters," he would have countered, "and hope. Your will may be worked beyond you in another and more favorable age."

Carver Mead: keynote at Caltech EE centennial

My first experience with electronics was with vacuum tubes. And when I was a high school kid, I read about this "transistor" thing the Bell guys had invented. What I didn't realize was that Bill Shockley had basically, out of whole cloth, invented the semiconductor physics viewpoint and way of looking at things, and some of the devices that came along. And that's what we've been living on ever since. Absolutely fundamental to everything we do. Wasn't seen that way at the time, but that's what it turned out to be.

Carver Mead: keynote at Caltech EE centennial (closing)

Those things are going to be important. But -- the same way that Royal Sorenson didn't see information as a thing he could put his life work into -- right now, today, we can't see the thing, at all, that's going to be the most important 100 years from now.

Jerome Bruner: Toward a Theory of Instruction

The first response of educational systems under such acceleration [of societal technology] is to produce technicians and engineers and scientists as needed, but it is doubtful whether such a priority produces what is required to manage the enterprise. For no specific science or technology provides a metalanguage in terms of which to think about a society, its technology, its science, and the constant changes that these undergo with innovation. Could an automotive engineer have foreseen the death of small-town America with the advent of the automobile? He would have been so wedded to his task of making better and better automobiles that it would never have occurred to him to consider the town, the footpath, leisure, or local loyalty. Somehow, if change is to be managed, it requires men with skills in sensing continuity and opportunity for continuity. ...

A further speculation about preparation for change is that we are bound to move toward instruction in the sciences of behavior and away from the study of history... It has to do with the need for studying the possible rather than the achieved -- a necessary step if we are to adapt to change.

Jerome Bruner: Toward a Theory of Instruction, p 63

What is now a problem is how to "detach" the notations that the child has learned from the concrete, visible, manipulable embodiment to which it refers -- the wood. For if the child is to deal with mathematical properties he will have to deal with symbols per se, else he will be limited to the narrow and rather trivial range of symbolism that can be given direct (and only partial) visual embodiment. Concepts such as x² and x³ may be given a visualizable referent, but what of xⁿ?

Alan Kay: interview

The thing that traumatized me occurred a couple years later, when I found an old copy of Life magazine that had the Margaret Bourke-White photos from Buchenwald. This was in the 1940s -- no TV, living on a farm. That's when I realized that adults were dangerous. Like, really dangerous. I forgot about those pictures for a few years, but I had nightmares. But I had forgotten where the images came from. Seven or eight years later, I started getting memories back in snatches, and I went back and found the magazine. That probably was the turning point that changed my entire attitude toward life. It was responsible for getting me interested in education. My interest in education is unglamorous. I don't have an enormous desire to help children, but I have an enormous desire to create better adults.

Hippolyte Taine: On Intelligence (quoted in Jacques Hadamard: The Psychology of Invention in the Mathematical Field)

You may compare the mind of a man to the stage of a theatre, very narrow at the footlights but constantly broadening as it goes back. At the footlights, there is hardly room for more than one actor. ... As one goes further and further away from the footlights, there are other figures less and less distinct as they are more distant from the lights. And beyond these groups, in the wings and altogether in the background, are innumerable obscure shapes that a sudden call may bring forward and even within direct range of the footlights. Undefined evolutions constantly take place throughout this seething mass of actors of all kinds, to furnish the chorus leaders who in turn, as in a magic lantern picture, pass before our eyes.

Alan Kay: Turing Award talk (46:00)

I have a zillion prejudices. I love parallelism. I think because I learned to program plugboards before I learned to program a computer... I love hardware like the B5000... I love Lisp... JOSS was the most beautiful programming language ever done. It could hardly do anything, but it did it beautifully. It's an interesting challenge to take something of this beauty and try to scale it. You combine these two together and you got the original Logo... I love APL. All of these systems can be done in a different way. Basically, the love of these things is because these guys got to some special kernel. I love what Engelbart did. I love spreadsheets. I love HyperCard.

Suppose you amalgamate all of these wonderful things into a simple system that regular people could use.

James Gleick: Genius: The Life and Science of Richard Feynman, p 387

At one point Goodstein remarked, "You know, it's amazing that Watson made this great discovery even though he was so out of touch with what everyone in his field was doing."

Feynman held up the paper he had been writing on. Amid scribbling and embellishments he had inscribed one word: DISREGARD.

"That's what I've forgotten," he said.

Saul Griffith: profile in Wired

We write all of our own tools, no matter what project we're building. Pretty much anything that we're doing requires some sort of design tool that didn't exist before. In fact, the design tools that we write to do the projects that we're doing are a sort of product in and of themselves.

I think in reality, today, if you use the same tools as everyone else, you kind of build the same products. If you write your own tools, you can sort of see new things, design new things.

Charles Bloom: Some GDC Observations

I saw one really amazing game at GDC that stood out from the rest. It had all the players instantly smiling and laughing. It was fun for kids and adults. It created a feeling of group affinity. Everyone around wanted to join in. It was even beneficial to the body. It was an inflatable ball.

William Tozier: Down is just the most common way out

Nobody would believe me if I came right out and said that I create the field to suit the work I want to do. On the fly; not from whole cloth, but from the chunks of other fields as needed. Nor will they believe you, when you are cured of your profession and start to merely do what's called for to make yourself useful.

Kieran Egan: Why Education is Difficult and Contentious (29:49)

We don't actually think about our institutions. We think through them. We take for granted the institutions that surround us, and they frame the ways we think about the world.

W. D. Niven: preface to "The Scientific Papers of James Clerk Maxwell" (1890)

One striking characteristic [of James Clerk Maxwell as an undergraduate] was remarked by his contemporaries. Whenever the subject admitted of it he had recourse to diagrams, though his fellow students might solve the question more easily by a train of analysis. Many illustrations of this manner of proceeding might be taken from his writings, but in truth it was only one phase of his mental attitude towards scientific questions, which led him to proceed from one distinct idea to another instead of trusting to symbols and equations.

Hal Abelson: interview

Q: So it’s fine to say, everybody should learn a little bit about how to program and this way of thinking because it’s valuable and important. But then maybe that’s just not realistic. Donald Knuth told me that he thinks two percent of the population have brains wired the right way to think about programming.

A: That same logic would lead you to say that one percent of the US's population is wired to understand Mandarin. The reasoning there is equivalent.

William Gibson: interview, 9/4/2012

[re "cyberpunk"] Once's there's a label for it, it's all over.

Ward Cunningham: interview in Dr. Dobbs

The basic [software design] patterns have done their job. And then of course, you should ask, "What was that job?" And this is the thing that surprised me. The job was really to take C++, which was a fairly static language, and show people how to write dynamic programs in a static language. That's what most of the patterns in that book were about. And in the process, patterns extended the life of C++ by a decade, which is not what I thought would happen.

What I thought would happen is people, when they learned these patterns, would look at them and say, "Wow, these patterns are hard in C++ and they're easy in Smalltalk. So if I want to think in terms of these patterns, I might as well use a language where they're easily expressed." And extend the life of Smalltalk by a decade. But the opposite happened.

Wikipedia: Ivan Illich

According to a contemporary review in The Libertarian Forum, "[Ivan] Illich's advocacy of the free market in education is the bone in the throat that is choking the public educators." Although it is important to note that Illich's opposition was not merely to publicly funded schooling, as with the libertarians, but to schooling as such; the disestablishment of schools was for him not a means to a free market in educational services, but a deschooled society, which was a more fundamental shift... He actually opposed advocates of free-market education as "the most dangerous category of educational reformers."

Gerald M. Weinberg: The Psychology of Computer Programming

How many programmers [today] learn to write programs by reading programs? ... With the advent of terminals, things are getting worse, for the programmer may not even see his own program in a form suitable for reading. In the old days, programmers would while away the time by reading each others’ programs. Some even went so far as to read programs from the program library -- which in those days was still a library in the old sense of the term.

John Herschel: A Preliminary Discourse on the Study of Natural Philosophy (1831)

For example, the words -- square, circle, a hundred, etc. -- convey to the mind notions so complete in themselves, and so distinct from everything else, that we are sure when we use them we know the whole of our own meaning. It is widely different with words expressing natural objects and mixed relations.

Take, for instance, IRON. Different persons attach very different ideas to this word. One who has never heard of magnetism has a widely different notion of IRON from one in the contrary predicament. The vulgar, who regard this metal as incombustible, and the chemist, who sees it burn with the utmost fury, and who has other reasons for regarding it as one of the most combustible bodies in nature; -- the poet, who uses it as an emblem of rigidity; and the smith and the engineer, in whose hands it is plastic, and moulded like wax into every form; -- the jailer, who prizes it as an obstruction, and the electrician who sees in it only a channel of open communication by which -- that most impassable of objects -- air may be traversed by his imprisoned fluid, have all different, and all imperfect, notions of the same word.

The meaning of such a term is like a rainbow, -- every body sees a different one, and all maintain it to be the same.

Danny Hillis: The Power of Conviction

To an architect, imagination is mostly about the future. To invent the future, one must live in it, which means living (at least partly) in a world that does not yet exist. Just as a driver whizzing along a highway pays more attention to the front window than the rear, the architect steers by looking ahead. This can sometimes make them seem aloof or absent-minded, as if they are someplace else. In fact, they are. For them, the past is a lesson, the present is fleeting; but the future is real. It is infinite and malleable, brimming with possibility.

Marc Ettlinger: Quora: What are some English language rules that native speakers don't know, but still follow?

Almost everything we know about our native languages is what's called implicit knowledge. Stuff we don't know that we know, or stuff that we can't really describe, but we can do anyway. Like maybe riding a bike, or walking.

Carver Mead: Collective Electrodynamics, p 113

We can view nature as being continuous in both space and time. This picture of nature is what Einstein wanted most. But to arrive at this picture, we had to give up the one-way-direction of time, and allow coupling to everything on the light cone. This, too, was okay with Einstein. So why was he so hung up on local causality? Why do all the textbooks state that the coupling of states unified by a light cone is a violation of relativity? In science, as in all other aspects of human endeavor, each age has its blind spots, and what is impossible to comprehend in one generation seems natural and obvious to another. So, after only one generation, Zeh could say, "There are no quantum jumps, nor are there particles."

Ted Nelson: Ted Nelson reveals the true identity of Bitcoin inventor, Satoshi, 3:35

Also like Satoshi [Nakamoto], I do not bother much with conventional academic publishing, and similarly count on posterity to understand and prove me right, since the present world doesn't get it. To work alone in this way takes insight, determination, and chutzpah, and no clinician could distinguish our condition from paranoia. Only posterity can sort it out. As Woody Allen says, posterity is the religion of the intellectuals.

Hermann Grassmann: Die Ausdehnungslehre, 1862 (4,I,II; 10), quoted in Crowe’s History of Vector Analysis, p 89

For I remain completely confident that the labor which I have expended on the science presented here and which has demanded a significant part of my life as well as the most strenuous application of my powers, will not be lost. It is true that I am aware that the form which I have given the science is imperfect and must be imperfect. But I know and feel obliged to state (though I run the risk of seeming arrogant) that even if this work should again remain unused for another seventeen years or even longer, without entering into the actual development of science, still that time will come when it will be brought forth from the dust of oblivion and when ideas now dormant will bring forth fruit. I know that if I also fail to gather around me in a position (which I have up to now desired in vain) a circle of scholars, whom I could fructify with these ideas, and whom I could stimulate to develop and enrich further these ideas, nevertheless there will come a time when these ideas, perhaps in a new form, will arise anew and will enter into living communication with contemporary developments. For truth is eternal and divine, and no phase in the development of truth, however small may be the region encompassed, can pass on without leaving a trace; truth remains, even though the garment in which poor mortals clothe it may fall to dust.

Michael Crowe: History of Vector Analysis

It was around this time that the ideas of the founders of non-Euclidean geometry, Nicholas Lobachevski and Janos Bolyai, were becoming known. It is important to realize that Hamilton, by creating the first extensive and consistent algebraic system that departed from at least one of the standard properties of traditional mathematics [commutative multiplication] issued in a development that was probably as significant for algebra as the non-Euclidean systems were for geometry. Perhaps the most significant message carried by Hamilton’s creation is that it is legitimate for mathematicians to create new algebraic systems that break traditional rules.

Gerald Jay Sussman: We Really Don't Know How To Compute! (24:21)

In the future, it's going to be the case that computers are so cheap and so easy to make, that you can make them the size of a grain of sand, complete with a megabyte of RAM. You're going to buy them by the bushel. You can pour them into your concrete, you buy your concrete by the megaflop, and you have a wall that's smart. So long as you can get the power to them, and they can do something, that's going to happen. Remember, your cells are pretty smart... they seem to talk to each other and do useful things.

Benoit Mandelbrot: The Fractal Geometry of Nature, p 21

In a letter to Dedekind, at the very beginning of the 1875-1925 crisis in mathematics, Cantor is overwhelmed by amazement at his own findings, [exclaiming] "to see is not to believe". And, as if on cue, mathematics seeks to avoid being misled by the graven images of monsters. What a contrast between the rococo exuberance of pre- or counterrevolutionary geometry, and the near-total visual barrenness of the works of Weierstrass, Cantor, or Peano! In physics, an analogous development threatened since about 1800, since Laplace's Celestial Mechanics avoided all illustration. And it is exemplified by the statement by P. A. M. Dirac (in the preface of his 1930 Quantum Mechanics) that nature's "fundamental laws do not govern the world as it appears in our mental picture in any very direct way, but instead they control a substratum of which we cannot form a mental picture without introducing irrelevancies."

The wide and uncritical acceptance of this view has become destructive. In particular, in the theory of fractals, "to see is to believe."

Benoit Mandelbrot: The Fractal Geometry of Nature, p 22

Graphics is wonderful for matching models with reality... A formula can relate to only a small aspect of the relationship between model and reality, while the eye has enormous powers of integration and discrimination. True, the eye sometimes sees spurious relationships which statistical analysis later negates, but this problem arises mostly in areas of science where samples are very small. In the areas we shall explore, samples are huge.

William Cleveland: Visualizing Data, p 270

Sometimes, when visualization thoroughly reveals the structure of a set of data, there is a tendency to underrate the power of the method for the application. Little effort is expended in seeing the structure once the right visualization method is used, so we are misled into thinking nothing exciting has occurred. The [previous example] might be such a case. The intensive visualization showed a linearity in hardness, a nonlinearity in tensile strength, an interaction between hardness and tensile strength, and three aberrant observations... It might be thought that anyone analyzing these data would uncover these properties. This is not the case. In the original treatment, the analysis got it wrong. They operated within a paradigm of numerical methods and probabilistic inference for data analysis, and not intensive visualization. They missed the nonlinearity. They missed the interaction. They missed the outliers. In other words, they missed most of the structure in the data.

Herbert Simon: The Sciences of the Artificial, p 153

All mathematics exhibits in its conclusions only what is already implicit in its premises... Hence all mathematical derivation can be viewed simply as change in representation, making evident what was previously true but obscure.

This view can be extended to all of problem solving -- solving a problem simply means representing it so as to make the solution transparent. If the problem solving could actually be organized in these terms, the issue of representation would indeed become central. But even if it cannot -- if this is too exaggerated a view -- a deeper understanding of how representations are created and how they contribute to the solution of problems will become an essential component in the future theory of design.

M Mitchell Waldrop: The Dream Machine: J.C.R. Licklider & the Revolution That Made Computing Personal, p 12

Considering all that happened later, Lick's youthful passion for psychology might seem like an aberration, a sideline, a twenty-five-year-long diversion from his ultimate career in computers. But in fact, his grounding in psychology would prove central to his very conception of computers. Virtually all the other computer pioneers of his generation would come to the field in the 1940s and 1950s with backgrounds in mathematics, physics, or electrical engineering, technological orientations that led them to focus on gadgetry -- on making the machines bigger, faster, and more reliable. Lick was unique in bringing to the field a deep appreciation for human beings: our capacity to perceive, to adapt, to make choices, and to devise completely new ways of tackling apparently intractable problems. As an experimental psychologist, he found these abilities every bit as subtle and as worthy of respect as a computer's ability to execute an algorithm. And that was why to him, the real challenge would always lie in adapting computers to the humans who used them, thereby exploiting the strengths of each.

M Mitchell Waldrop: The Dream Machine, p 20

[Norbert Wiener made his many contributions] in a style that left his more conventional colleagues shaking their heads. Instead of treating mathematics as a formal exercise in the manipulation of symbols, Wiener worked by intuition, often groping his way toward a solution by trying to envision some physical model of the problem. He considered mathematical notation and language necessary evils at best -- things that tended to get in the way of the real ideas.

M Mitchell Waldrop: The Dream Machine, p 340

In trying to divide the two labs [CSL and SSL at Xerox PARC] between basic and applied computer research, [George Pake] was being led badly astray by his background as a physicist. At the time, he remembers, "I was a bit baffled. I kept looking for these underlying principles of computer science" -- the analogs of Newton's laws of motion, say -- "and I couldn't find them. There were certainly some deep ideas -- for example, information theory, or the Turing machine. But those ideas had not led to a large body of theory as there was in physics." The upshot was that his plan for the separation of the two labs just didn't work, because "both ended up doing applications."

M Mitchell Waldrop: The Dream Machine, p 419

Mead and Conway had pioneered a way of teaching integrated circuit design via an elegant and very general set of design principles. Draft chapters of their textbook, Introduction to VLSI Systems, had been circulating since 1977 and had already been used in courses at Caltech, Berkeley, Carnegie Mellon, and MIT. (The book itself, which was published in 1979, would go on to become a bible for VLSI professionals.)

But just as important, Mead and Conway had conceived the notion of a "silicon foundry." The idea was that students in an IC design course would each prepare a chip layout, specified in a standard chip description language, and then send it over the Arpanet to a "silicon broker" -- originally PARC and then later Hewlett-Packard. The broker, in turn, would compile dozens of individual designs and then arrange with a chip manufacturer to have them all etched onto a single silicon wafer, so that the cost could be shared. Finally, the chips would be cut apart, packaged individually, and sent back to the students for testing and experimentation...

MOSIS would flourish into the 1990s. And chip innovation would flourish along with it. MOSIS supported design experiments for advanced architectures such as Intel's Cosmic Cube and the Connection Machine from Thinking Machines, Inc. It supported experiments in reduced-instruction-set computing at Stanford and Berkeley, thereby providing a proof-of-concept for a number of cutting-edge commercial chips of the late 1980s and 1990s. It even supported the development of the "graphics engine" chip by Stanford's James Clark, who would soon be applying his expertise as a cofounder of Silicon Graphics, Inc. And most of all, the MOSIS project produced an awful lot of people trained in VLSI design. Indeed, you could argue that MOSIS was as much responsible as any other single factor for the explosion in microchip technology during the 1980s and 1990s.

William Thurston: Mathematical Education

People are much smarter when they can use their full intellect and when they can relate what they are learning to situations or phenomena which are real to them.

The natural reaction, when someone is having trouble understanding what you are explaining, is to break up the explanation into smaller pieces and explain the pieces one by one. This tends not to work, so you back up even further and fill in even more details.

But human minds do not work like computers: it is harder, not easier, to understand something broken down into all the precise little rules than to grasp it as a whole. It is very hard for a person to read a computer assembly language program and figure out what it is about...

Studying mathematics one rule at a time is like studying a language by first memorizing the vocabulary and the detailed linguistic rules, then building phrases and sentences, and only afterwards learning to read, write, and converse. Native speakers of a language are not aware of the linguistic rules: they assimilate the language by focusing on a higher level, and absorbing the rules and patterns subconsciously. The rules and patterns are much harder for people to learn explicitly than is the language itself.

Ian Bogost: On the Manifesto for a Ludic Century

When you think about it, it's curious to pen a manifesto for a ludic century to come in the twenty-first century, when the manifesto itself was such a staple of twentieth-century thought... The modern manifesto as a written prescription that makes manifest certain principles really starts with the political manifestos of Marx, Engels, Bellegarrigue, and others in the mid-19th century. The artistic manifestos of Symbolism, Futurism, Dadaism, Surrealism, and others followed this lead, proclaiming clear, direct, and unyielding principles for creative practice. So, perhaps there is one fundamental challenge for the Manifesto for a Ludic Century: would a truly ludic century be a century of manifestos? Of declaring simple principles rather than embracing systems? Or, is the Ludic Manifesto meant to be the last manifesto, the manifesto to end manifestos, replacing simple answers with the complexity of "information at play?"

Douglas Hofstadter: Analogy as the Core of Cognition

I believe that all communication is via analogy. Indeed, I would describe communication this way: taking an intricate dance that can be danced in one and only one medium, and then, despite the intimacy of the marriage of that dance to that medium, making a radically new dance that is intimately married to a radically different medium, and in just the same way as the first dance was to its medium...

Imagine taking the most enthralling basketball game you ever watched... and giving a videotape of that game to a “soccer choreographer,” who will now stage all the details of an artificial soccer game that is in some sense analogous to your basketball game.

Doug Engelbart: quoted in "The Engelbart Hypothesis" by Valerie Landau and Eileen Clegg

My boss gave me quite a lecture one day. He said, "Look, here's eight pages you've gone through to describe this thing you want to do and it's still all faint. Bill has just written this proposal, on one page, very concise, clear, describing exactly what he wants to do with his research." The model proposal was very detailed in an intellectual domain that was already all thoroughly beaten out. What he was proposing was a very narrow research question pursuing a tiny sub-domain.

I tried to explain to my boss that I was interested in opening up an entirely new approach for which there is no vocabulary. Later, people used the term "paradigm shift" to describe a fundamental change in assumptions and thinking. If you're really dealing with something in a different paradigm, the vocabulary of almost everything you're trying to say is different. You have to somehow establish the terms as stepping-stones to arrive at what you're trying to say. And people aren't used to it taking that long for you to get the picture to them. That has been the basic problem ever since, when trying to describe the framework Augmentation System and the Bootstrap Strategy.

Stuart Russell: quoted in "The Man Who Would Teach Machines to Think" by James Somers

A lot of the stuff going on [in AI] is not very ambitious. In machine learning, one of the big steps that happened in the mid-'80s was to say, "Look, here’s some real data -- can I get my program to predict accurately on parts of the data that I haven’t yet provided to it?" What you see now in machine learning is that people see that as the only task.

Danny Hillis: quoted in "Clock Of The Long Now" by Stewart Brand, p 48

If you're going to do something that's meant to be interesting for ten millennia, it almost has to have been interesting for ten millennia. Clocks and other methods of measuring time have interested people for a very long time.

Rob Pike: Systems Software Research is Irrelevant

To be a viable computer system, one must honor a huge list of large, and often changing, standards: TCP/IP, HTTP, HTML, XML, CORBA, Unicode, POSIX, NFS, SMB, MIME, POP, IMAP, X, ... A huge amount of work, but if you don’t honor the standards you’re marginalized. Estimate that 90-95% of the work in Plan 9 was directly or indirectly to honor externally imposed standards. At another level, instruction architectures, buses, etc. have the same influence. With so much externally imposed structure, there’s little slop left for novelty.

David Hestenes: interview

The first discovery is one of the highlights of my life. And it gave me strong motivation and direction for my research. That discovery was recognition that the Pauli matrices could be reinterpreted as vectors, and their products had a geometric interpretation. I was so excited that I went and gave a little lecture about it to my father. Among other things, I said, “Look at this identity σ1 σ2 σ3 = i, which appears in all the quantum mechanics books that discuss spin. All the great quantum physicists, Pauli, Schroedinger, Heisenberg and even Dirac as well as mathematicians Weyl and von Neumann, failed to recognize its geometric meaning and the fact that it has nothing to do with spin. When you see the Pauli sigmas as vectors, then you can see the identity as expressing the simple geometric fact that three orthogonal vectors determine a unit volume. Thus there is geometric reason for the Pauli algebra...

My father gave me the greatest compliment of my life... “You have learned the difference between a mathematical idea and its representation by symbols. Many mathematicians never learn that!”
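
A quick check of the identity Hestenes describes, assuming the standard matrix representation of the Pauli sigmas (a marginal sketch, not Hestenes's own derivation):

    % The standard Pauli matrices:
    \sigma_1 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad
    \sigma_2 = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \quad
    \sigma_3 = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}

    % Multiplying out gives \sigma_1 \sigma_2 = i \sigma_3, and \sigma_3^2 = I, so:
    \sigma_1 \sigma_2 \sigma_3 = i \, \sigma_3^2 = i I

Read as geometry rather than matrix algebra: if the three sigmas are taken as orthonormal vectors, their product is the unit pseudoscalar, the oriented unit volume, whose square is -1 -- which is the geometric fact Hestenes says the identity expresses.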

John Herivel: The Background to Newton's Principia, quoted by David Hestenes

When 1666 closed, Newton was not in command of the results that have made his reputation deathless, not in mathematics, not in mechanics, not in optics. What he had done in all three was to lay foundations, some more extensive than others, on which he could build with assurance, but nothing was complete at the end of 1666, and most were not even close to complete. Far from diminishing Newton's stature, such a judgment enhances it by treating his achievement as a human drama of toil and struggle rather than a tale of divine revelation. "I keep the subject constantly before me," he said, "and wait 'till the first dawnings open slowly, by little and little, into a full and clear light."

David Hestenes: Modeling Games in the Newtonian World

We marvel at the ingenuity of Newton's mathematical arguments, in part, because his methods were so unwieldy that no one since has fully mastered them. Before anyone could improve on Newton's performance, the world had to wait half a century for the development of better mathematical tools and techniques, primarily by Euler.

Douglas Hofstadter: Surfaces and Essences, p 452

Geniuses do not deliberately set off with the goal of concocting a wild-sounding analogy between some brand-new phenomenon, shimmering and mysterious, and some old phenomenon, conceptually distant and seemingly unrelated; rather, they concentrate intensely on some puzzling situation that they think merits deep attention, carefully circling around it, looking at it from all sorts of angles, and finally, if they are lucky, finding a viewpoint that reminds them of some previously known phenomenon that the mysterious new one resembles in a subtle but suggestive manner. Through such a process of convergence, a genius comes to see a surprising new essence of the phenomenon. This is high-level perception; this is discovery by analogy.

Carver Mead: interview

Once angels were the explanation, but now, for us, it is a "force," or "field." But these are all constructs of the human mind to help us to work with and visualize the regularities of nature. When we grasp onto some regularity, we give it a name, and the temptation is always to think that we really understand it. But the truth is that we're still not even close.

John Tukey: Prim-9

Just being able to see projections on all coordinate pairs can, as we have just seen, be very helpful, but it is not enough. To be able to get reasonably to all two-dimensional projections means either a way to call for what we want, or a way to move about. Since we usually do not know just what we want, and when we do, we would find it hard to learn to call for it, what we need is a way to move about.

E. T. Jaynes: Probability in Quantum Theory

For all these years it has seemed obvious to me -- for the same reasons that it did to Einstein and Schrödinger -- that the Copenhagen interpretation is a mass of contradictions and irrationality and that, while theoretical physics can of course continue to make progress in the mathematical details and computational techniques, there is no hope of any further progress in our basic understanding of Nature until this conceptual mess is cleared up.

Because this position seems to arouse fierce controversy, let me stress our motivation: if quantum theory were not successful pragmatically, we would have no interest in its interpretation. It is precisely because of the enormous success of the QM mathematical formalism that it becomes crucially important to learn what that mathematics means. To find a rational physical interpretation of the QM formalism ought to be considered the top priority research problem of theoretical physics; until this is accomplished, all other theoretical results can only be provisional and temporary.

This conviction has affected the whole course of my career. I had intended originally to specialize in Quantum Electrodynamics working with J. R. Oppenheimer; but this proved to be impossible. Whenever I look at any quantum-mechanical calculation, the basic craziness of what we are doing rises in my gorge and I have to stop and try to find some different way of looking at the problem, that makes physical sense. Gradually I came to see that the resolution cannot be found within the confines of the traditional thinking of physicists; the foundations of probability theory and the role of human information have to be brought in, and so I have spent many years trying to understand them in the greatest generality.

David Foster Wallace: Tense Present: Democracy, English, and the Wars over Usage

A distinctive feature of [A Dictionary of Modern American Usage] is that its author is willing to acknowledge that a usage dictionary is not a bible or even a textbook but rather just the record of one smart person's attempts to work out answers to certain very difficult questions.

Jeff Bezos: interview

Oftentimes, invention requires a long-term willingness to be misunderstood. You do something that you genuinely believe in, that you have conviction about, but for a long period of time, well-meaning people may criticize that effort. When you receive criticism from well-meaning people, it pays to say -- first of all, search yourself -- "Are they right?" And if they are, you need to adapt what you're doing. If they're not right, if you really have conviction that they're not right, you need to have that long-term willingness to be misunderstood. It's a key part of invention.

Valentino Braitenberg: Vehicles: Experiments in Synthetic Psychology

Our vehicles may move in water by jet propulsion. Or you may prefer to imagine them moving somewhere between galaxies... swimming around in water... little carts moving on hard surfaces... It doesn't matter. Get used to a way of thinking in which the hardware of the realization of an idea is much less important than the idea itself.

Neil Postman: Interview

I don't think any of us can do much about the rapid growth of new technology... However, it is possible for us to learn how to control our own uses of technology... The forum that I think is best suited for this is our educational system. If students get a sound education in the history, social effects and psychological biases of technology, they may grow to be adults who use technology rather than be used by it.

Ta-Nehisi Coates: The Champion Barack Obama

My mother's admonishings had their place. God forbid I ever embarrass her. God forbid I be like my [absent] grandfather, like the fathers of my friends and girlfriends and wife. God forbid I ever stand in front of these white folks and embarrass my ancestors, my people, my dead. And God forbid I ever confuse that creed, which I took from my mother, which I pass on to my son, with a wise and intelligent analysis of my community. My religion can never be science. This is the difference between navigating the world and explaining it. ...

Catharsis is not policy. Catharsis is not leadership. And shame is not wisdom.

Alan Kay: Powerful Ideas Need Love Too! (1995)

Now computers can be television-like, book-like, and "like themselves." Today's commercial trends in educational and home markets are to make them as television-like as possible. And the weight of the billions of dollars behind these efforts is likely to be overwhelming. It is sobering to realize that in 1600, 150 years after the invention of the printing press, the top two bestsellers in the British Isles were the Bible and astrology books! Scientific and political ways of thinking were just starting to be invented. The real revolutions take a very long time to appear, because as McLuhan noted, the initial content and values in a new medium are always taken from old media.

Now one thing that is possible with computers and networks, that could get around some of the onslaught of "infobabble," is the possibility of making media on the Internet that is "self teaching." Imagine a child or adult just poking around the Internet for fun and finding something--perhaps about rockets or gene splicing--that looks intriguing. If it were like an article in an encyclopedia, it would have to rely on expository writing (at a level chosen when the author wrote it) to convey the ideas. This will wind up being a miss for most netsurfers, especially given the general low level of reading fluency today. The computer version of this will be able to find out how old and how sophisticated is the surfer and instantly tailor a progression of learning experiences that will have a much higher chance of introducing each user to the "good stuff" that underlies most human knowledge. A very young child would be given different experiences than older ones--and some of the experiences would try to teach the child to read and reason better as a byproduct of their interest. This is a "Montessori" approach to how some media might be organized on the Internet: one's own interests provide the motivation to journey through an environment that is full of learning opportunities disguised as toys.

This new kind of "dynamic media" is possible to make today, but very hard and expensive. Yet it is the kind of investment that a whole country should be able to understand and make. I still don't think it is a real substitute for growing up in a culture that loves learning and thinking. But in such a culture, such new media would allow everyone to go much deeper, in more directions, and experience more ways to think about the world than is possible with the best books today. Without such a culture, such media is likely to be absolutely necessary to stave off the fast approaching next Dark Ages.

Adam Cadre: 2013.11 minutiae

Charles Barkley weighed in: "In a locker room and with my friends, we use racial slurs. [...] The language we use, sometimes it's homophobic, sometimes it's sexist, a lot of times it's racist. We do that when we're joking with our teammates." It seems to me that this is an important social divide you don't often hear about: between (a) those who have internalized that you're not supposed to be racist and sexist and homophobic anymore, and aren't, and (b) those who have internalized that you're not supposed to be racist and sexist and homophobic in public anymore, but take it as a given that secretly everyone is making racist and sexist and homophobic jokes in private with their friends.

Marshall McLuhan: The Gutenberg Galaxy

Man the tool-making animal, whether in speech or in writing or in radio, has long been engaged in extending one or another of his sense organs in such a manner as to disturb all of his other senses and faculties.

Sydney Brenner: interview

A Fred Sanger [born 1918] would not survive today’s world of science. With continuous reporting and appraisals, some committee would note that he published little of import between insulin in 1952 and his first paper on RNA sequencing in 1967 with another long gap until DNA sequencing in 1977. He would be labelled as unproductive, and his modest personal support would be denied. We no longer have a culture that allows individuals to embark on long-term—and what would be considered today extremely risky—projects.

Max Black: Models and Metaphors: Studies in Language and Philosophy, p 242

There will always be competent technicians who... can be trusted to build the highways... But clearing intellectual jungles is also a respectable occupation. Perhaps every science must start with metaphor and end with algebra; and perhaps without the metaphor there would never have been any algebra.

Gerald Jay Sussman: Robust Design through Diversity

A computational system is very much a dynamical system, with a very complicated state space, and a program is very much like a system of (differential or difference) dynamical equations, describing the incremental evolution of the state. One thing we have learned about dynamical systems over the past hundred years is that only limited insights can be gleaned by manipulation of the dynamical equations. We have learned that it is powerful to examine the geometry of the set of all possible trajectories, the phase portrait, and to understand how the phase portrait changes with variations of the parameters of the dynamical equations. This picture is not brittle: the knowledge we obtain is structurally stable.

Kieran Egan: The Educated Mind: How Cognitive Tools Shape Our Understanding, p 76

The Greek alphabet, from which all alphabetic systems are derived, has particular characteristics for making us conscious of our language, or, rather, for determining the kind of consciousness of language that we develop...

Eric Havelock writes about "the superior technology of the Greek alphabet," which remains the "sole instrument of full literacy to the present day". That technology was superior to other scripts and to oral modes of communication, in Havelock's account, because it led to a conceptual revolution in ancient Greece in which "a reflexive syntax of definition, description, and analysis" was exploited by Plato, Aristotle, and other ancient Greeks and all their alphabetic successors. They generated the philosophic, scientific, historical, descriptive, legal, and moral forms of discourse that make up what we call the modern mind...

While many of their neighbors were using writing systems to mark pots of grain, wine, and olives, list kings and priests, and celebrate in stylized form victories over traditional enemies, the Greeks began to exploit their writing system in ways none of its inventors could have imagined... Among much else, it opened up what we call the historical period. Fluent literacy is not simply a matter of thinking and then writing the product of one's thoughts; the writing, rather, becomes a part of the process of thinking. Extended discursive writing is not an external copy of a kind of thinking that goes on in the head; it represents a distinctive kind of literate thinking.

Over the past thirty years, a number of scholars have argued with significant success that the fifth-century developments that had in the earlier part of this century been romantically referred to as the "Greek miracle" -- giving birth to democracy, logic, philosophy, history, drama, reflective introspection, and so on so suddenly -- were explainable in large part as an implication of the development and spread of alphabetic literacy... The developments were not simply in the new kinds of texts being produced in ancient Greece, such as Herodotus's Histories, but were somehow in the kind of thinking that went into writing and reading such texts, or listening to such texts being read or performed.

Kieran Egan: The Educated Mind: How Cognitive Tools Shape Our Understanding, p 93

We reinforce the image of the textbook, encyclopedia, or dictionary as the paradigm of the successful knower. It becomes important in such a climate of opinion to emphasize that books do not store knowledge. They contain symbolic codes that can serve us as external mnemonics for knowledge. Knowledge can exist only in living human minds.

Rosemary Simpson: 50 Years After "As We May Think": The Brown/MIT Vannevar Bush Symposium (1995)

The prescience of [Vannevar] Bush's vision is astonishing. He defined a goal, a strategy, and a research agenda that are still alive today. But, as Andy van Dam made clear in his opening remarks and the subsequent symposium speakers confirmed in their testimony, Bush turned out to be not so much predicting the future as creating it through the influence, both direct and indirect, of his compelling vision on major figures like Engelbart, Nelson, and the other symposium speakers.

Plato: Phaedrus

Socrates: I cannot help feeling, Phaedrus, that writing is unfortunately like painting; for the creations of the painter have the attitude of life, and yet if you ask them a question they preserve a solemn silence. And the same may be said of [written words]. You would imagine that they had intelligence, but if you want to know anything and put a question to one of them, the speaker always gives one unvarying answer. And when they have been once written down they are tumbled about anywhere among those who may or may not understand them, and know not to whom they should reply, to whom not: and, if they are maltreated or abused, they have no parent to protect them; and they cannot protect or defend themselves.

Matt Ridley: The Origins of Virtue: Human Instincts and the Evolution of Cooperation, p 48

When the 5,000-year-old mummified corpse of a fully equipped Neolithic man turned up in a melting glacier high in the Tyrolean Alps in 1991, the variety and sophistication of his equipment was astonishing... Dressed in furs under a woven grass cloak, equipped with a stone dagger with an ash-wood handle, a copper axe, a yew-wood bow, a quiver and fourteen cornus-wood arrows, he also carried a tinder fungus for lighting fires, two birch-bark containers, one of which contained some embers of his most recent fire, insulated by maple leaves, a hazel-wood pannier, a bone awl, stone drills and scrapers, a lime-wood-and-antler retoucheur for fine stone sharpening, an antibiotic birch fungus as a medicine kit and various spare parts. His copper axe was cast and hammered sharp in a way that is extremely difficult to achieve even with modern metallurgical knowledge. It was fixed with millimetre precision into a yew haft that was shaped to obtain mechanically ideal ratios of leverage.

This was a technological age. People lived their lives steeped in technology. They knew how to work leather, wood, bark, fungi, copper, stone, bone and grass into weapons, clothes, ropes, pouches, needles, glues, containers and ornaments. Arguably, the unlucky mummy had more different kinds of equipment on him than the hiker couple who found him. Archaeologists believe he probably relied upon specialists for the manufacture of much of his equipment, and perhaps also for the tattoos that had been applied to his arthritic joints.

Frank Lantz: Hearts and Minds (33:50)

The dilemma of quantitative, data-driven game design.... So here's an analogy: Imagine you have a friend who has trouble forming relationships with women, and he tells you, "I don't know what I'm doing wrong. I go on a date, and I bring a thermometer so I can measure their skin temperature. I bring calipers so I can measure their pupil, to see when it's expanding and contracting..." The point is, it doesn't even matter if these are the correct things to measure to predict someone's sexual arousal. If you bring a thermometer and calipers with you on a date, you're not going to be having sex...

There's really only one important question worth asking, which is: what is a life well-lived?... That's a question that can't be answered, but one thing we can say, with a lot of certainty, is that a life well-lived is not going to be a life in which every moment is scrutinized.

Ed Catmull: Keep Your Crises Small, 23:51

Initially, the films [our teams] put together, they're a mess. It's like everything else in life -- the first time you do it, it's a mess. Sometimes it's labeled "first time, it's a failure", but that's not even the right word to use. It's just like, you get the first one out, you learn from it, and the only failure is if you don't learn from it, if you don't progress.

Richard Rhodes: The Making of the Atomic Bomb, p 50

"Collision" [of alpha particles and nuclei] is misleading. What [Ernest] Rutherford had visualized, making calculations and drawing diagrammatic atoms on large sheets of good paper, was exactly the sort of curving path toward and away from a compact, massive central body that a comet follows in its gravitional pas de deux with the sun. He had a model made, a heavy electromagnet suspended as a pendulum on thirty feet of wire that grazed the face of another electromagnet set on a table. With the two grazing faces matched in polarity and therefore repelling each other, the pendulum was deflected into a parabolic path according to its velocity and angle of approach, just as the alpha particles were deflected. He needed as always to visualize his work.

Nikolai Luzin: The Evolution of Function, Part II

The modern understanding of function and its definition, which seems correct to us, could arise only after Fourier's discovery. His discovery showed clearly that most of the misunderstandings that arose in the debate about the vibrating string were the result of confusing two seemingly identical but actually vastly different concepts, namely that of function and that of its analytic representation. Indeed, prior to Fourier's discovery no distinction was drawn between the concepts of "function" and of "analytic representation," and it was this discovery that brought about their disconnection.

Neil Postman: Five Things We Need to Know About Technological Change

Every technology has a prejudice. Like language itself, it predisposes us to favor and value certain perspectives and accomplishments. In a culture without writing, human memory is of the greatest importance, as are the proverbs, sayings and songs which contain the accumulated oral wisdom of centuries. That is why Solomon was thought to be the wisest of men. In Kings I we are told he knew 3,000 proverbs. But in a culture with writing, such feats of memory are considered a waste of time, and proverbs are merely irrelevant fancies. The writing person favors logical organization and systematic analysis, not proverbs. The telegraphic person values speed, not introspection. The television person values immediacy, not history...

Every technology has a philosophy which is given expression in how the technology makes people use their minds, in what it makes us do with our bodies, in how it codifies the world, in which of our senses it amplifies, in which of our emotional and intellectual tendencies it disregards. This idea is the sum and substance of what the great Catholic prophet, Marshall McLuhan, meant when he coined the famous sentence, “The medium is the message.”

Neil Postman: Five Things We Need to Know About Technological Change

Who, we may ask, has had the greatest impact on American education in this century? If you are thinking of John Dewey or any other education philosopher, I must say you are quite wrong. The greatest impact has been made by quiet men in grey suits in a suburb of New York City called Princeton, New Jersey. There, they developed and promoted the technology known as the standardized test, such as IQ tests, the SATs and the GREs. Their tests redefined what we mean by learning, and have resulted in our reorganizing the curriculum to accommodate the tests.

A second example concerns our politics... The radicals who have changed the nature of politics in America are entrepreneurs in dark suits and grey ties who manage the large television industry in America. They did not mean to turn political discourse into a form of entertainment. They did not mean to make it impossible for an overweight person to run for high political office. They did not mean to reduce political campaigning to a 30-second TV commercial. All they were trying to do is to make television into a vast and unsleeping money machine. That they destroyed substantive political discourse in the process does not concern them.

Noam Chomsky: talk at Georgetown University

The US media are alone in that you must meet the condition of concision. You gotta say things between two commercials, or in 600 words. And that's a very important fact, because the beauty of concision is that you can only repeat conventional thoughts.

Suppose I get up on Nightline, and I'm given two minutes, and I say [...], I don't need any evidence. Everybody just nods. On the other hand, suppose you say something that isn't just regurgitating conventional pieties. Suppose you say something that's the least bit unexpected or controversial... People will, quite reasonably, expect to know what you mean. Why did you say that? I never heard that before. If you said that, you better have a reason, better have some evidence. In fact, you better have a lot of evidence, because that's a pretty startling comment. You can't give evidence if you're stuck with concision. That's the genius of this structural constraint.

Alan Kay: The Power of the Context

Giving a professional illustrator a goal for a poster usually results in what was desired. If one tries this with an artist, one will get what the artist needed to create that day. Sometimes we make to have, sometimes to know and express. The pursuit of Art always sets off plans and goals, but plans and goals don't always give rise to Art. If "visions not goals" opens the heavens, it is important to find artistic people to conceive the projects.

Thus the "people not projects" principle was the other cornerstone of ARPA/PARC’s success. Because of the normal distribution of talents and drive in the world, a depressingly large percentage of organizational processes have been designed to deal with people of moderate ability, motivation, and trust. We can easily see this in most walks of life today, but also astoundingly in corporate, university, and government research. ARPA/PARC had two main thresholds: self-motivation and ability. They cultivated people who "had to do, paid or not" and "whose doings were likely to be highly interesting and important". Thus conventional oversight was not only not needed, but was not really possible. "Peer review" wasn't easily done even with actual peers. The situation was "out of control", yet extremely productive and not at all anarchic.

"Out of control" because artists have to do what they have to do. "Extremely productive" because a great vision acts like a magnetic field from the future that aligns all the little iron particle artists to point to “North” without having to see it. They then make their own paths to the future.

Leigh Alexander: The Unearthing

Ian is a game design teacher and a professional skeptic. People call him a “curmudgeon”, but they don’t really understand how much love, how much actual faith, that kind of skepticism takes. On a pretty regular basis one of us will IM the other something like “help” or “fuck” or “people are terrible”.

Only when you fully believe in how wonderful something is supposed to be does every little daily indignity start to feel like some claw of malaise. At least, that’s how I explain Ian to other people.

Elaine Ou: Your Margin is My Opportunity

Your margin is my opportunity. –Jeff Bezos, Amazon

That’s probably what Lyft and Uber were saying to each other as they slashed their commissions to 0. How do you beat a company that doesn’t need to make money?

The 8 hours you need to sleep each night, are my opportunity. The time you spend with your family and friends, is my opportunity. If you’re not maxed out, if there’s still a shred of humanity left in you, then you’re just leaving your lunch on the table.

David Graeber: interview

[In socialist regimes], you couldn’t really get fired from your job. As a result you didn’t really have to work very hard... I have a lot of friends who grew up in the USSR, or Yugoslavia, who describe what it was like. You get up. You buy the paper. You go to work. You read the paper. Then maybe a little work, and a long lunch, including a visit to the public bath. If you think about it in that light, it makes the achievements of the socialist bloc seem pretty impressive: a country like Russia managed to go from a backwater to a major world power with everyone working maybe on average four or five hours a day. But the problem is they couldn’t take credit for it. They had to pretend it was a problem, “the problem of absenteeism,” or whatever, because of course work was considered the ultimate moral virtue. They couldn’t take credit for the great social benefit they actually provided. Which is, incidentally, the reason that workers in socialist countries had no idea what they were getting into when they accepted the idea of introducing capitalist-style work discipline. “What, we have to ask permission to go to the bathroom?” It seemed just as totalitarian to them as accepting a Soviet-style police state would have been to us.

That ambivalence in the heart of the worker’s movement remains... On the one hand, there’s this ideological imperative to validate work as virtue in itself. Which is constantly being reinforced by the larger society. On the other hand, there’s the reality that most work is obviously stupid, degrading, unnecessary, and the feeling that it is best avoided whenever possible. But it makes it very difficult to organize, as workers, against work.

John Conway: On Numbers and Games, epilogue

The Surreal Numbers [introduced in this book] have been the topic of many research papers and a number of books... Most of the authors who have written about them have chosen to define surreal numbers to be just their sign-sequences. This has the great advantage of making equality be just identity rather than an inductively defined relation, and also of giving a clear mental picture from the start. However, I think it has probably also impeded further progress. Let me explain why.

The greatest delight, and at the same time, the greatest mystery, of the Surreal numbers is the amazing way that a few simple "genetic" definitions magically create a richly structured Universe out of nothing. Technically, this involves in particular the facts that each surreal number is repeatedly redefined, and that the functions the definitions produce are independent of form. Surely real progress will only come when we understand the deep reasons why these particular definitions have this property? It can hardly be expected to come from an approach in which this problem is avoided from the start? ...

I believe the real way to make "surreal progress" is to search for more of these "genetic" definitions and seek to understand their properties.

Ian Bogost: Proteus: A Trio of Artisanal Game Reviews

Proteus is a game about being an island instead of a game about being on one... One explores Proteus less like one explores a wooded nature preserve and more like one explores a naked body -- by moving it through one's attention rather than by moving one's attention through it.

Clay Shirky: The Semantic Web, Syllogism, and Worldview (2003)

The people working on the Semantic Web greatly overestimate the value of deductive reasoning (a persistent theme in Artificial Intelligence projects generally.) The great popularizer of this error was Arthur Conan Doyle, whose Sherlock Holmes stories have done more damage to people's understanding of human intelligence than anyone other than Rene Descartes. Doyle has convinced generations of readers that what seriously smart people do when they think is to arrive at inevitable conclusions by linking antecedent facts.

This sentiment is attractive precisely because it describes a world simpler than our own. In the real world, we are usually operating with partial, inconclusive or context-sensitive information. When we have to make a decision based on this information, we guess, extrapolate, intuit, we do what we did last time, we do what we think our friends would do or what Jesus or Joan Jett would have done, we do all of those things and more, but we almost never use actual deductive logic.

As a consequence, almost none of the statements we make, even seemingly obvious ones, are true in the way the Semantic Web needs them to be true. Drew McDermott [noted] "It must be the case that a significant portion of the inferences we want [to make] are deductions, or it will simply be irrelevant how many theorems follow deductively from a given axiom set." ...

This [absurd syllogism from Dodgson] is the best critique of the Semantic Web ever published, as it illustrates the kind of world we would have to live in for this form of reasoning to work, a world where language is merely math done with words. ...

Statements in the Semantic Web work as inputs to syllogistic logic not because syllogisms are a good way to deal with slippery, partial, or context-dependent statements -- they are not, for the reasons discussed above -- but rather because syllogisms are things computers do well. If the world can't be reduced to unambiguous statements that can be effortlessly recombined, then it will be hard to rescue the Artificial Intelligence project. And that, of course, would be unthinkable.

Ian Bogost: What Do We Save When We Save the Internet?

The comments, and reading them, and not reading them. Knowing that response and reaction responds and reacts to someone’s preferred idea rather than the ideas proffered.

Douglas Adams: So Long, and Thanks for All the Fish, p 130

It is very easy to be blinded to the essential uselessness of [these products] by the sense of achievement you get from getting them to work at all.

In other words -- and this is the rock solid principle on which the whole of the Corporation’s Galaxy-wide success is founded -- their fundamental design flaws are completely hidden by their superficial design flaws.

Hal Abelson and Andrea diSessa: Turtle Geometry: The Computer as a Medium for Exploring Mathematics

We encourage you not to lose sight of the most important reason for a combined look at turtles and vectors: Turtle geometry and vector geometry are two different representations for geometric phenomena, and whenever we have two different representations of the same thing we can learn a great deal by comparing representations and translating descriptions from one representation into the other. Shifting descriptions back and forth between representations can often lead to insights that are not inherent in either of the representations alone.

George Lakoff and Rafael Núñez: Where Mathematics Comes From: How The Embodied Mind Brings Mathematics Into Being

In recent years, there have been revolutionary advances in cognitive science... Perhaps the most profound of these new insights are the following:

1. The embodiment of mind. The detailed nature of our bodies, our brains, and our everyday functioning in the world structures human concepts and human reason. This includes mathematical concepts and mathematical reason.

2. The cognitive unconscious. Most thought is unconscious -- not repressed in the Freudian sense but simply inaccessible to direct conscious introspection. We cannot look directly at our conceptual systems and at our low-level thought processes. This includes most mathematical thought.

3. Metaphorical thought. For the most part, human beings conceptualize abstract concepts in concrete terms, using ideas and modes of reasoning grounded in the sensory-motor system. The mechanism by which the abstract is comprehended in terms of the concrete is called conceptual metaphor.

George Lakoff and Rafael Núñez: Where Mathematics Comes From: How The Embodied Mind Brings Mathematics Into Being

Symbolic logic is not the basis of all rationality, and it is not absolutely true. It is a beautiful metaphorical system, which has some rather bizarre metaphors. It is useful for certain purposes but quite inadequate for characterizing anything like the full range of the mechanisms of human reason...

Mathematics as we know it is human mathematics, a product of the human mind, [using] the basic conceptual mechanisms of the embodied human mind as it has evolved in the real world. Mathematics is a product of the neural capacities of our brains, the nature of our bodies, our evolution, our environment, and our long social and cultural history.

Lewis Lapham: The Eternal Now: Introduction to "Understanding Media"

By eliminating the dimensions of space and time, the electronic forms of communication also eliminate the presumption of cause and effect. Typographic man assumed that A follows B, that people who made things -- whether cities, ideas, families, or works of art -- measured their victories (usually Pyrrhic) over periods of time longer than those sold to the buyers of beer commercials. Graphic man imagines himself living in the enchanted garden of the eternal now. If all the world can be seen simultaneously, and if all mankind's joy and suffering is always and everywhere present (if not on CNN or Oprah, then on the "Sunday Night Movie" or MTV), nothing necessarily follows from anything else. Sequence becomes merely additive instead of causative.

Joe Armstrong: On the road again

The Himba [of northern Namibia] have distinct names for different shades of green; we have no names for these shades, so we don’t “see” the different colors, but the Himba do. [This is analogous] to concepts in programming languages, how different words have acquired precise meanings which are misunderstood in different communities...

For me the words “concurrent,” “parallel” and “simultaneous” have completely different meanings, but many people think they mean the same thing. It’s like me seeing three shades of green, when the person I’m talking to sees one green.

Gregory Bateson: conversation with Stewart Brand in "Both Sides of the Necessary Paradox"

"Oh the damage that's been done to psychiatric thinking by the clinical bias. The clinical bias being, that there are good things and there are bad things. The bad things necessarily have causes. This is not so true of good things.

"No experimenter links up, say, the phenomena of schizophrenia with the phenomena of humor. Schizophrenia is clinical, and humor isn't even psychology, you know. The two of them are closely related, and closely related, both of them, to arts and poetry and religion. So you've got a whole spectrum of phenomena the investigation of any of which throws light on any other. The investigation of none of which is very susceptible to the experimental method."

"Because of non-isolatability?" I [Stewart Brand] think I'm ahead of him this time.

"Because the experiment always puts a label on the context in which you are. You can't really experiment with people, not in the lab you can't. It's doubtful you can do it with dogs. You cannot induce a Pavlovian nervous breakdown -- what do they call it, 'experimental neurosis' -- in an animal out in the field."

"I didn't know that!" I'm gleeful.

More of the Bateson chortle. "You've got to have a lab."

"Why?"

"Because the smell of the lab, the feel of the harness in which the animal stands, and all that, are context markers which say what sort of thing is going on in this situation; that you're supposed to be right or wrong, for example.

"What you do to induce these neuroses is, you train the animal to believe that the smell of the lab and similar things is a message which tells him he's got to discrimiate between an ellipse and a circle, say. Right. He learns to discriminate. Then you make the discrimination a little more difficult, and he learns again, and you have underlined the message. Then you make the discrimination impossible.

"At this point discrimination is not the appropriate form of behavior. Guesswork is. But the animal cannot stop feeling that he ought to discriminate, and then you get the symptomalogy coming on. The one whose discrimination broke down was the experimenter, who failed to discriminate between a context for discrimination and a context for gambling."

"So," says I, "it's the experimenter's neurosis that --"

"-- Has now become the experimental neurosis of the animal. This whole context business has a Heisenberg hook in it much worse than the atoms ever thought of." ...

"In the field what happens?"

"None of this happens. For one thing, the stimuli don't count. Those electric shocks they use are about as powerful as what the animal would get if he pricked his leg on a bramble, pushing through.

"Suppose you've got an animal whose job in life is to turn over stones and eat the beetles under them. All right, one stone in ten is going to have a beetle under it. He cannot go into a nervous breakdown because the other nine stones don't have beetles under them. But the lab can make him do that you see."

"Do you think we're all in a lab of our own making, in which we drive each other crazy?"

"You said it, not I, brother," chuckling. "Of course."

Frank Lantz: Life and Death and Middle Pair: Go, Poker and the Sublime (17:10)

Go is about thinking. Go is thought made visible to itself... When you study go, you must learn to read. And it's painful and dull... Black goes here, trying to capture white, so white goes here, black goes here, white goes here....

But you also must learn how to see, as if from a distance, the patterns that cannot be articulated in this discrete and finite way. Truly high-level play is about intuition and feel and wisdom, as well as this brute-strength tactical reading.

So go is something like a brightly-colored dye that is squirted into the fluid of our thoughts just at the point where they unfold into turbulence, at the threshold of these two ways of seeing, the discrete and the continuous. [reason and intuition]

We set up camp at the border of what is possible for our minds to compute, and then we push into the wilderness.

Doug Engelbart: interview

I started trying to reach out to make connections in domains of interest and concerns out there that fit along the vector I was interested in. I went to the information retrieval people. I remember one instance when I went to the Ford Foundation's Center for Advanced Study in Social Sciences... I was trying to explain what I wanted to do and one guy just kept telling me, "You are just giving fancy names to information retrieval. Why do that? Why don't you just admit that it's information retrieval and get on with the rest of it and make it all work?" He was getting kind of nasty. The other guy was trying to get him to back off.

Kieran Egan: The Educated Mind: How Cognitive Tools Shape Our Understanding, p 179

A problem for understanding intellectual development is that we have no adequate metaphors for it. You will perhaps recall Jonathan Miller's claim that it became possible to understand the function of the heart and the circulation of the blood only after the invention of pumps to clear mines of water. Everyone since the beginning of our species has felt that regular thump in the chest and has seen blood spurt when arteries are cut. Thinking of the heart as a pump enabled us to understand its function.

But we have no similarly useful metaphor to help us understand intellectual or cultural or educational development. The nearest processes are biological... Piaget's is perhaps the best known elaboration of a biology-based metaphor of development, and John Dewey elaborated a conception of education drawing on "growth".

I have tried to adhere to a nonbiological and rather vague use of "development". These kinds of understanding are only "somewhat" distinctive in that they are not wholly different forms of thought, mutually incomprehensible; they are not so much like different computer programs as like modules of a well-integrated program, focusing on different tasks but each able to comprehend the others. Well, that's not a great metaphor either; tempting though computer metaphors are for mental operations, they always seem to confuse as much as they clarify.

John Maynard Keynes: The General Theory of Employment, Interest and Money (1936)

This book's... main purpose is to deal with difficult questions of theory, and only in the second place with the applications of this theory to practice... Those, who are strongly wedded to what I shall call “the classical theory”, will fluctuate, I expect, between a belief that I am quite wrong and a belief that I am saying nothing new....

The composition of this book has been for the author a long struggle of escape, and so must the reading of it be for most readers if the author’s assault upon them is to be successful, -- a struggle of escape from habitual modes of thought and expression. The ideas which are here expressed so laboriously are extremely simple and should be obvious. The difficulty lies, not in the new ideas, but in escaping from the old ones, which ramify, for those brought up as most of us have been, into every corner of our minds.

Ian Bogost: interview about A Slow Year

As for flaws, the biggest risk is whether or not I’ll be able to make the game legible to players. Maybe I’m about to lose all my remaining credibility, it’s not my intention to be effete or obscurantist. Yet, I know that’s how some will receive the game. But, I really mean all this stuff. It’s not pretense.

Edwin Hutchins: Cognition in the Wild, p 367

The definition of cognition has been unhooked from interaction with the world. Research on games and puzzles has produced some interesting insights, but the results may be of limited generality. The tasks typically chosen for laboratory studies are novel ones that are regarded by subjects as challenging or difficult. D'Andrade has likened the typical laboratory cognitive tasks to feats of athletic prowess. If we want to know about walking, studying people jumping as high as they can may not be the best approach.

Such tasks are unrepresentative in another sense as well. The evolution of the material means of thought is an important component of culturally elaborated tasks. It permits a task that would otherwise be difficult to be re-coded and re-represented in a form in which it is easy to see the answer. This sort of development of material means is intentionally prohibited in puzzle tasks because to allow this sort of evolution would destroy the puzzling aspects of the puzzle. Puzzles are tasks that are preserved in the culture because they are challenging. If the performance mattered, we would learn to re-represent them in a way that removed the challenge. That would also remove their value as puzzles, of course.

The point is that the tasks that are "typical" in laboratory studies of thought are drawn from a special category of cultural materials that have been isolated from the cognitive processes of the larger cultural system. This makes these tasks especially unrepresentative of human cognition.

Jacob Bronowski: The Ascent of Man, p 70

Genghis Khan was a nomad and the inventor of a powerful war machine -- and that conjunction says something important about the origins of war in human history. Of course, it is tempting to close one's eyes to history, and instead to speculate about the roots of war in some possible animal instinct: as if, like the tiger, we still had to kill to live, or, like the robin redbreast, to defend a nesting territory.

But war, organized war, is not a human instinct. It is a highly planned and co-operative form of theft. And that form of theft began ten thousand years ago when the harvesters of wheat accumulated a surplus, and the nomads rose out of the desert to rob them of what they themselves could not provide. The evidence for that we saw in the walled city of Jericho and its prehistoric tower. That is the beginning of war.

A. H. Maslow: A Theory of Human Motivation

Emergency conditions are, almost by definition, rare in the normally functioning peaceful society. That this truism can be forgotten is due mainly to two reasons. First, rats have few motivations other than physiological ones, and since so much of the research upon motivation has been made with these animals, it is easy to carry the rat-picture over to the human being. Secondly, it is too often not realized that culture itself is an adaptive tool, one of whose main functions is to make the physiological emergencies come less and less often. In most of the known societies, chronic extreme hunger of the emergency type is rare, rather than common. In any case, this is still true in the United States. The average American citizen is experiencing appetite rather than hunger when he says "I am hungry." He is apt to experience sheer life-and-death hunger only by accident and then only a few times through his entire life.

Obviously a good way to obscure the 'higher' motivations, and to get a lopsided view of human capacities and human nature, is to make the organism extremely and chronically hungry or thirsty. Anyone who attempts to make an emergency picture into a typical one, and who will measure all of man's goals and desires by his behavior during extreme physiological deprivation is certainly being blind to many things. It is quite true that man lives by bread alone -- when there is no bread. But what happens to man's desires when there is plenty of bread and when his belly is chronically filled?

At once other (and 'higher') needs emerge and these, rather than physiological hungers, dominate the organism. And when these in turn are satisfied, again new (and still 'higher') needs emerge and so on. This is what we mean by saying that the basic human needs are organized into a hierarchy of relative prepotency.

Edwin Hutchins: Cognition in the Wild, p 107

[Regarding Edmond Gunter's predecessor of the slide rule] Again we have an artifact on which computations are performed by physical manipulation. However, there is an important difference between the astrolabe and Gunter's scale in this regard. In both cases the constraints of a represented world are built into the physical structure of the device, but in the case of Gunter's scale the represented world is not literally the world of experience. Instead it is a symbolic world: the world of logarithmic representations of numbers. The regularities of relations among entities in this world are built into the structure of the artifact, but this time the regularities are the syntax of the symbolic world of numbers rather than the physics of a literal world of earth and stars. The representations of symbolic worlds in physical artifacts, and especially the representation of the syntax of such a world in the physical constraints of the artifact itself, is an enormously powerful principle.
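
Hutchins's point about building the syntax of logarithms into physical lengths can be made concrete with a toy sketch (mine, not Hutchins's; the function names are illustrative). On Gunter's scale and its slide-rule descendants, a number's position along the rule is its logarithm, so laying two lengths end to end multiplies the numbers they represent:

    import math

    # Toy model of a logarithmic scale: a number "sits" at a distance
    # proportional to its logarithm, so lengths add where numbers multiply.

    def position(x):
        """Distance along the scale at which the number x sits."""
        return math.log10(x)

    def slide_rule_multiply(a, b):
        """Slide one scale along the other: add the two lengths,
        then read off the number at the combined position."""
        return 10 ** (position(a) + position(b))

    print(slide_rule_multiply(3, 7))  # ~21.0, up to floating-point error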

Edwin Hutchins: Cognition in the Wild, p 115

Of all the many possible ways of representing position and implementing navigation computations in the Western tradition, the chart is the one in which the meaning of the expression of position and the meaning of the operations that produce that expression are most easily understood. As was noted above, lines of position could be represented as linear equations, and the algorithm applied to find their intersection could be that of simultaneous linear equations. As a physical analog of space, the chart provides an interface to a computational system in which the user's understanding of the form of the symbolic expressions (lines of position) is structurally similar to the user's understanding of the meanings of the expressions (relations among locations in the world).

In fact, the similarity is so close that many users find the form and the meaning indistinguishable. Navigators not only think they are doing the computations, they also invest the interpretations of events in the domain of the representations with a reality that sometimes seems to eclipse the reality outside the skin of the ship. One navigator jokingly described his faith in the charted position by creating the following mock conversation over the chart: "This little dot right here where these lines cross is where we are! I don't care if the bosun says we just went aground, we are here and there is plenty of water under the ship here." For the navigator, the ship is where the lines of position intersect.

Edwin Hutchins: Cognition in the Wild, p 270

Simply being in the presence of others who are working does not always provide a context for learning from their actions. In the example above, the fact that the work was done in [a conversation] between the plotter and the bearing recorder opened it to other members of the team. In a similar way, the design of tools can affect their suitability for joint use or for demonstration and may thereby constrain the possibilities for knowledge acquisition. The interaction of a task performer with a tool may or may not be open to others, depending upon the nature of the tool itself. The design of a tool may change the horizon of observation of those in the vicinity of the tool.

For example, because the navigation chart is an explicit graphical depiction of position and motion, it is easy to "see" certain aspects of solutions. The chart representation presents the relevant information in a form such that much of the work can be done on the basis of perceptual inferences. Because the work done with a chart is performed on its surface -- all of the work is at the device's interface, as it were -- watching someone work with a chart is much more revealing of what is done to perform the task than watching someone work with a calculator or a computer.

The openness of a tool can also affect its use as an instrument of instruction. When the bearing recorder chooses a set of landmarks that result in lines of position with shallow intersections, it is easy to show him, on the chart, the consequences of his actions and the nature of the remedy required... Imagine how much more difficult it would be to explain the inadequacy of the landmark assignment if the lines of position were represented as equations to be punched into a calculator rather than as lines drawn on the chart.