[Hamming]: "I notice that if you have the door to your office closed, you get more work done today and tomorrow, and you are more productive than most. But 10 years later somehow you don’t know quite know what problems are worth working on; all the hard work you do is sort of tangential in importance. He who works with the door open gets all kinds of interruptions, but he also occasionally gets clues as to what the world is and what might be important. Now I cannot prove the cause and effect sequence because you might say, “The closed door is symbolic of a closed mind.” I don’t know. But I can say there is a pretty good correlation between those who work with the doors open and those who ultimately do important things, although people who work with doors closed often work harder. Somehow they seem to work on slightly the wrong thing - not much, but enough that they miss fame."[Hobart]: Working remote is a modern analog to Hamming’s closed-door policy: there’s an immediate productivity boost from reduced interruptions, but some of those interruptions are long-term course-corrections, and they’re valuable.
Hamming's whole talk is fantastic, describing how to do what he calls "great research":
And for the sake of describing great research I'll occasionally say Nobel-Prize type of work. It doesn't have to gain the Nobel Prize, but I mean those kinds of things which we perceive are significant things. Relativity, if you want, Shannon's information theory, any number of outstanding theories - that's the kind of thing I'm talking about.
Well I now come down to the topic, "Is the effort to be a great scientist worth it?" To answer this, you must ask people. When you get beyond their modesty, most people will say, "Yes, doing really first-class work, and knowing it, is as good as wine, women and song put together," or if it's a woman she says, "It is as good as wine, men and song put together." And if you look at the bosses, they tend to come back or ask for reports, trying to participate in those moments of discovery. They're always in the way. So evidently those who have done it, want to do it again. But it is a limited survey. I have never dared to go out and ask those who didn't do great work how they felt about the matter. It's a biased sample, but I still think it is worth the struggle. I think it is very definitely worth the struggle to try and do first-class work because the truth is, the value is in the struggle more than it is in the result. The struggle to make something of yourself seems to be worthwhile in itself. The success and fame are sort of dividends, in my opinion.
So what happens when you do good research, or even great research? Does everything suffer the Conan O'Brien fate?
Let us start with a simple observation, so basic as to almost be trite.
All knowledge only exists in people's heads.
In the limit, if great knowledge is written down in a book that nobody ever reads, then in some practical sense it may as well not have existed. Sometimes knowledge has to be rediscovered again and again after being forgotten. This happened repeatedly with the cure for scurvy, until vitamin C was finally isolated.
How does information get into people's heads? Well, they either have to read something, or get told it, or rediscover it themselves.
So far, so obvious.
For all the advances in technology, has our ability to read improved, or our ability to listen to conversation? Not obviously. Reading speed varies across people, but I've yet to come across anything indicating that it's improving over time. So let's assume that people's ability to read new source material is no better than in the past.
Now, as you look out on the world, you see that ever more people are doing research, and writing books and papers. Even if some large fraction of this is junk, and some proportion is active stupidity and anti-knowledge, the amount of genuine new knowledge is surely going up every year.
The number of hours of life you have in which to read it all, even just the most important bits, in order to make advances at the frontier, is a little higher than it used to be, but not much. And most of that increase in lifespan comes at ages long past those when you're likely to do any of Hamming's first-class work.
So how do people actually learn enough to advance knowledge?
Well, one way is to spend longer studying and become more specialised. The number of genuine polymaths making contributions across many different areas seems far smaller than in the days of the Royal Society. This is not a coincidence. Every now and then you get a von Neumann or a Frank Ramsey, but they are towering and rare geniuses.
The other fate of great research, which is less discussed, is that if it is not to be forgotten, it must be summarised.
How much debate and experiment went into establishing that matter is discrete, and made of atoms, rather than continuous? Or that atoms contain protons, neutrons and electrons? These were colossal contributions, made in painstaking ways by very smart people, resolving a debate that had gone back to the ancient Greeks and before. How do we reward such great work? They become the first sentence of a chemistry class. "Matter is made up of atoms". Boom. Next. There simply isn't time.

One can go back to first principles, and read the individual experiments of Dalton and others that established this - that certain gases tended to combine in fixed proportions, for instance. The Royal Society had the wonderful motto of "Nullius in verba" - take no man's word for it. This is a great aspirational attitude to have, but in practice one can't run all the experiments that make up all of human knowledge. You may well want to know what the experimental evidence actually is. But you probably will end up taking someone's word for it, somewhere, about how those experiments proceeded. How could it be otherwise? How many hours are there in a life?
True giants like Newton get their names permanently attached to the principles they came up with. But even this is rare. Knowledge of authorship is additional information that people have to carry around in their heads. Is it crucial to know who performed each experiment? Or could the time spent learning this be better spent learning more actual facts or principles about the world?
In the fullness of time, if you actually do great work, the praise of posterity will sooner or later be that your work becomes a sentence or two in a textbook summary, a contribution to the body of research that every scientist must ingest as fast as possible in order to spend the rest of their life advancing the frontiers of knowledge. Every page you write, every concept you advance, competes for space in the heads of readers, the pages of textbook authors, and the minutes of this short life. The competition is brutal and Darwinian. Knowledge must evolve to get condensed into shorter and crisper forms, or it risks simply being forgotten. As time passes, and the amount of new work grows, the probability of one or other of these outcomes goes to one.
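To put that last claim in rough symbols: suppose that in each "generation" of retelling, a given work has some fixed probability p > 0 of being condensed or dropped rather than transmitted whole. Under that simplifying assumption (p is purely illustrative, not measured), the chance it survives n generations in its original form is

\[
(1 - p)^n \longrightarrow 0 \quad \text{as } n \to \infty,
\]

so the probability that it has been either summarised or forgotten tends to one.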
In this respect, one of the great unappreciated works of public service is the effort of those who do the reading and summarising. Scott Alexander is extremely high on this list - his summaries of other people's books are fantastic, often way more pithy than the original, and include important editorial judgment on strengths and weaknesses. Mencius Moldbug did a similarly great service by reading and synthesizing a huge number of old primary sources that you and I would never have come across otherwise. I have a strong suspicion that over 99% of people currently living who have read Thomas Hutchinson's Strictures Upon the Declaration of Independence are no more than one degree removed from a Moldbug reader.
I think that, from this point of view, one should also not be ashamed of mostly reading the abstracts of papers. You can convert the number of hours left in your life to a number of hours spent reading, to a reading speed, to a total number of pages of text that you will be able to absorb before you die. What shall that text contain? Every paper or book you read in the original, in its entirety, takes something from the budget available to other great works. Do budget constraints not bind, even for speed readers?
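To make that budget concrete, here is a back-of-envelope sketch; every number in it (years remaining, daily reading time, reading speed, book length) is an illustrative assumption, not a fact about any particular reader:

```python
# A back-of-envelope lifetime reading budget.
# All figures below are illustrative assumptions.

years_left = 40          # assumed remaining reading years
hours_per_day = 1.0      # assumed daily time spent reading
pages_per_hour = 40      # assumed speed for dense non-fiction
pages_per_book = 400     # assumed length of a typical book

total_hours = years_left * 365 * hours_per_day
total_pages = total_hours * pages_per_hour
total_books = total_pages / pages_per_book

print(f"Lifetime reading budget: {total_pages:,.0f} pages "
      f"(~{total_books:,.0f} books)")
# -> Lifetime reading budget: 584,000 pages (~1,460 books)
```

A thousand-odd books sounds like a lot, until you set it against everything that has ever been published. The constraint binds.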
The other point worth noting is the disparity between fiction and non-fiction. Science can be summarised. History can be summarised. But fiction and poetry largely cannot, except by stripping out all the art and beauty that made them great. The idea of all of us reading only the CliffsNotes version of Shakespeare is simply too tragic to bear. But the result of this love is that works of fiction stand a much higher chance of being forgotten altogether.
If a man has a genetic mutation that is reproductively advantageous, in the short run he has more children, and all his traits get passed on. Then his children have more children, and the advantageous gene and the other tag-alongs also get passed on. But recombination shuffles the tag-alongs away a little more each generation, while selection preserves only the gene that mattered. Roll the tape forward 100 generations, and the only thing left of the original man is the advantageous gene itself. This gets selected on, and the rest gets forgotten.
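Population genetics makes the analogy precise. The statistical association (linkage disequilibrium, D) between the advantageous gene and any tag-along trait decays by a factor of (1 - r) each generation, where r is the recombination rate between the two loci:

\[
D_t = D_0 \, (1 - r)^t .
\]

Even for tightly linked loci with r = 0.01, only about 37% of the original association survives 100 generations, since 0.99^{100} ≈ 0.37; for unlinked traits (r = 0.5) it is gone within a handful of generations. The rates here are illustrative; the decay law is the textbook one.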
So too it shall be with memes. You may bequeath an entire volume, but after 100 generations of re-learning, only the crispest, shortest version shall remain. And that is your final contribution to posterity.