The living building

Buildings – even the most cement-filled – are organic; they change through interaction with the parasites that infest them (us, mostly). How often do architects consider this? Ask any scientist who moves into a new laboratory building and you’ll be met with eyerolls and exasperated stories. The new neuroscience institute that I work in is fantastic in many ways, but it has some extremely puzzling features, such as the need to repeatedly swipe an ID card to unlock almost every door in the lab. This is in contrast to my previous home, the Salk Institute, which was one long open space separated only by clear glass, allowing free movement and easy collaboration.

I mostly mention this because the video above – on How Buildings Learn – has a fantastic story at the beginning about MIT’s famous media lab:

I was at the Media Lab when it was brand new. In the three months I was there, the elevator caught fire, the revolving door kept breaking, every doorknob in the building had to be replaced, the automatic door-closer was stronger than people and had to be adjusted, and an untraceable stench of something horrible dead filled the lecture hall for months. This was normal.

In many research buildings, a central atrium serves to bring people together with open stairways, casual meeting areas, and a shared entrance where people meet daily. The Media Lab’s entrance cuts people off from each other; there are three widely separated entrances each huge and glassy; three scattered elevators; few stairs; and from nowhere can you see other humans in the five story space. Where people might be visible, they are carefully obscured by internal windows of smoked glass.

The first scientist…or natural philosopher

Nature has a review of a book on Aristotle:

Aristotle is considered by many to be the first scientist, although the term postdates him by more than two millennia. In Greece in the fourth century BC, he pioneered the techniques of logic, observation, inquiry and demonstration. These would shape Western philosophical and scientific culture through the Middle Ages and the early modern era, and would influence some aspects of the natural sciences even up to the eighteenth century…

Leroi, an evolutionary developmental biologist, visits the Greek island of Lesvos — where Aristotle made observations of natural phenomena and anatomical structures — and puts his own observations in dialogue with those of the philosopher. It was in the island’s lagoon of Kolpos Kalloni that Aristotle was struck by the anatomy of fish and molluscs, and started trying to account for the function of their parts. Leroi’s vivid descriptions of the elements that inspired Aristotle’s biological doctrines — places, colours, smells, marine landscapes and animals, and local lore — enjoin the reader to grasp them viscerally as well as intellectually.

But it is important to distinguish between natural philosophy and science. I have always thought of Francis Bacon as the first scientist due to his, y’know, inventing much of what we consider the scientific method. I don’t know the extent to which he codified existing ideas versus creating some sort of novel synthesis.

The history of the scientific method is of course a long gradient. Perhaps it began with another early innovator in scientific methodology, Ibn al-Haytham:

The prevailing wisdom at the time was that we saw what our eyes, themselves, illuminated. Supported by revered thinkers like Euclid and Ptolemy, emission theory stated that sight worked because our eyes emitted rays of light — like flashlights. But this didn’t make sense to Ibn al-Haytham. If light comes from our eyes, why, he wondered, is it painful to look at the sun? This simple realization catapulted him into researching the behavior and properties of light: optics…

But Ibn al-Haytham wasn’t satisfied with elucidating these theories only to himself, he wanted others to see what he had done. The years of solitary work culminated in his Book of Optics, which expounded just as much upon his methods as it did his actual ideas. Anyone who read the book would have instructions on how to repeat every single one of Ibn al-Haytham’s experiments.

A search for the science of the mind

More history of the scientists who wanted to understand the mind. Turns out, there was a lot of racism in early 20th century science – what a surprise.

He modelled the brain’s structure as though it was an archaeological site, the different levels supposedly reflecting evolutionary advances. The neocortex, shared by all mammals, controlled basic functions while the prefrontal area was the seat of more advanced abilities.

Investigating the perception of pain, Head had two cutaneous nerves on his left forearm severed. Every Friday for the next four years, he visited Rivers in his college rooms to chart the process of regeneration and the areas of acute sensitivity. Echoing Elliot Smith’s ideas about the evolutionary levels of the brain, Rivers and Head decided that the nervous system contained two layers: one older and more primitive; the other more subtle and localized. They speculated that the two systems “owed their origin to the developmental history of the nervous system. They reveal the means by which an imperfect organism has struggled towards improved functions and physical unity”. And this “could be seen as a metaphor for the triumph of civilization over savagery in human history”. Frederic Bartlett, a student of Rivers who went on to become a leading psychologist in the next generation, noted that this metaphor informed all Rivers’s later theories in physiology, psychology and anthropology. The structure of every human organ, every social institution, revealed cumulative layers of progressive development.

Psychology was looked down on by the Cambridge establishment, but Ludwig Wittgenstein was intrigued and regularly came to Mill Lane to work with Myers. “I had a discussion with Myers about the relations between Logic and Philosophy”, he wrote to Bertrand Russell. “I was very candid and I am sure he thinks that I am the most arrogant devil who ever lived . . . . I think he was a bit less confused after the discussion than before.” When the laboratory was opened to the public in 1913, Wittgenstein exhibited an apparatus for investigating the perception of rhythm. Perhaps influenced by Wittgenstein, Myers was moving away from biological determinism. The physiologists, he complained, “in their attempts to penetrate the reality of the known, were deliberately ignoring the knower”.

And some more on the history of the word ‘scientist’ (see previously):

Carrington had noticed the spread of a particular term related to scientific research [it was “scientist”]. He himself felt the word was “not satisfactory,” and he wrote to eight prominent writers and men of science to ask if they considered it legitimate. Seven responded. Huxley and Argyll joined a five-to-two majority when they denounced the term. “I regard it with great dislike,” proclaimed Argyll. Huxley, exhibiting his usual gift for witty dismissals, said that the word in question “must be about as pleasing a word as ‘Electrocution.’”

…The English academic William Whewell first put the word “scientist” into print in 1834 in a review of Mary Somerville’s On the Connexion of the Physical Sciences. Whewell’s review argued that science was becoming fragmented, that chemists and mathematicians and physicists had less and less to do with one another. “A curious illustration of this result,” he wrote, “may be observed in the want of any name by which we can designate the students of the knowledge of the material world collectively.” He then proposed “scientist,” an analogue to “artist,” as the term that could provide linguistic unity to those studying the various branches of the sciences.

…“Scientist” met with a friendlier reception across the Atlantic. By the 1870s, “scientist” had replaced “man of science” in the United States. Interestingly, the term was embraced partly in order to distinguish the American “scientist,” a figure devoted to “pure” research, from the “professional,” who used scientific knowledge to pursue commercial gains…For most British readers, however, the popularity of the word in America was, if anything, evidence that the term was illegitimate and barbarous.

Notes on the word ‘scientist’


The term ‘scientist’ was invented only in 1833, by the polymath William Whewell, who gave it a faintly pejorative odour, drawing analogies to ‘journalist’, ‘sciolist’, ‘atheist’, and ‘tobacconist’. ‘Better die … than bestialise our tongue by such barbarisms,’ scowled the geologist Adam Sedgwick. ‘To anyone who respects the English language,’ said T H Huxley, ‘I think “Scientist” must be about as pleasing a word as “Electrocution”.’ These men preferred to call themselves ‘natural philosophers’ and there was a real distinction. Scientists were narrowly focused utilitarian data-grubbers; natural philosophers thought deeply and wrote elegantly about the moral, cosmological and metaphysical implications of their work…

Charles Babbage, in designing his ‘difference engine’, anticipated all the basic principles of the modern computer – including ‘garbage in, garbage out’. In Reflections on the Decline of Science in England (1830) he accused his fellow scientists of routinely suppressing, concocting or cooking data. Such corruption (he confidently insisted) could be cleaned up if the government generously subsidised scientific research…

After his sketches of these forgotten bestsellers, Secord concludes with the literary bomb that blew them all up. In Sartor Resartus Thomas Carlyle fiercely deconstructed everything the popular scientists stood for. Where they were cool, rational, optimistic and supremely organised, he was frenzied, mystical, apocalyptic and deliberately nonsensical. They assumed that big data represented reality; he saw that it might be all pretence, fabrication, image – in a word, ‘clothes’. A century and a half before Microsoft’s emergence, Carlyle grasped the horror of universal digitisation: ‘Shall your Science proceed in the small chink-lighted, or even oil-lighted, underground workshop of Logic alone; and man’s mind become an Arithmetical Mill?’ That was a dig at the clockwork utilitarianism of both John Stuart Mill and Babbage: the latter called his central processing unit a ‘mill’.

Diffusers of useful knowledge. A review by Jonathan Rose of an excellent-sounding book, Visions of Science: Books and Readers at the Dawn of the Victorian Age.

Nietzsche on science

While searching for the appropriate epigraph for my thesis – y’know, important things – I found a lot of great Nietzsche quotes that vaguely relate to science:

Being deep and appearing deep.–Whoever knows he is deep, strives for clarity; whoever would like to appear deep to the crowd, strives for obscurity. For the crowd considers anything deep if only it cannot see to the bottom: the crowd is so timid and afraid of going into the water.

Profundity of thought belongs to youth, clarity of thought to old age.

There are no facts, only interpretations.

There cannot be a God because if there were one, I could not believe that I was not He.

It is my ambition to say in ten sentences what others say in a whole book.

The man of knowledge must be able not only to love his enemies but also to hate his friends.

Cause and effect: such a duality probably never exists; in truth we are confronted by a continuum out of which we isolate a couple of pieces, just as we perceive motion only as isolated points and then infer it without ever actually seeing it. The suddenness with which many effects stand out misleads us; actually, it is sudden only for us. In this moment of suddenness there are an infinite number of processes which elude us. An intellect that could see cause and effect as a continuum and a flux and not, as we do, in terms of an arbitrary division and dismemberment, would repudiate the concept of cause and effect and deny all conditionality.

Convictions are more dangerous enemies of truth than lies.

What are man’s truths ultimately? Merely his irrefutable errors.

What then is truth? A mobile army of metaphors, metonyms, and anthropomorphisms — in short, a sum of human relations, which have been enhanced, transposed, and embellished poetically and rhetorically, and which after long use seem firm, canonical, and obligatory to a people: truths are illusions about which one has forgotten that is what they are; metaphors which are worn out and without sensuous power; coins which have lost their pictures and now matter only as metal, no longer as coins.
We still do not know where the urge for truth comes from; for as yet we have heard only of the obligation imposed by society that it should exist: to be truthful means using the customary metaphors – in moral terms, the obligation to lie according to fixed convention, to lie herd-like in a style obligatory for all…

Nietzsche loved to pile endless epigrams into his books; he was essentially the greatest Twitter philosopher of all time. Not only was he fairly straightforward in how he presented his ideas, but he was a great stylist. Read, say, Twilight of the Idols and then Dostoevsky’s Notes from Underground and tell me they aren’t both products of similar minds.

Science is an aesthetic practice


One of my friends recently hired – or had foisted upon her – an undergraduate to help her do research. “How do you stay motivated?” he asked her. “Uhh,” she responded.

We’ve all been there. Doing science really sucks, a lot of the time. Add to that the common list of other drawbacks of this career: little money, tons of work, no real respect, etc. So why do we do it?

While I was on the microscope I was thinking these very thoughts, and decided to write an article about it on Medium. One of the points of writing this blog is to get better at writing, so I used the chance on Medium to do something a bit different.

The point of the article is that, while there are many reasons one could do science – and I know the reasons are probably pretty heterogeneous among my peers – the reason that I in particular do science is for aesthetic reasons. Essentially, the impulse to do science is the same impulse that drives someone to an appreciation of art. It is a cultivation of an appreciation of the beauty of the world. I’m not sure what motivates an Artist to create Art, though it seems that my few vain attempts have been about creating something beautiful for other people. But to me science is more like stumbling through an art gallery: it isn’t about creating something for other people, but rather finding something beautiful for myself. Once found, I can fit it into my mental blueprints of the world and reflect on it as I would a fine work of art. Perhaps that’s a bit selfish, but it’s what keeps me going.

Go read it, and please feel free to send any constructive criticism on it my way!

Monday Open Thread: The Six Problems of Systems Neuroscience

I was brainstorming experiments and decided to make a list of what I think are the fundamental questions in systems neuroscience:

  1. Sensory: How do we represent the world?
  2. Motor: How do we create an action?
  3. Decision: How do we choose among competing alternatives?
  4. Learning: How do we remain plastic in changing environments?
  5. Computation: What are the underlying algorithms and computations?
  6. Modulation: How does internal state affect the nervous system?

Can anyone think of other broad questions in systems neuroscience? Should one of these not be here? Most other questions I could think of fit within these; for instance, “How do we deal with external and internal noise?” would probably fall under Sensory or Learning. AYWNMBTTOF wrote a great post on what he considers the big questions of his field (taste), which I would subsume under Sensory.

I kind of hope this replaces the somewhat useless 23 Problems in Systems Neuroscience in terms of clarifying what we are studying.

Economics may be a science, but it is not one of the sciences

(Begin poorly-thought-out post:)

Raj Chetty wrote an article for the New York Times, which has been passed around the economics blogosphere, on why economics is a science:

What kind of science, people wondered, bestows its most distinguished honor on scholars with opposing ideas? “They should make these politically balanced awards in physics, chemistry and medicine, too,” the Duke sociologist Kieran Healy wrote sardonically on Twitter.

But the headline-grabbing differences between the findings of these Nobel laureates are less significant than the profound agreement in their scientific approach to economic questions, which is characterized by formulating and testing precise hypotheses. I’m troubled by the sense among skeptics that disagreements about the answers to certain questions suggest that economics is a confused discipline, a fake science whose findings cannot be a useful basis for making policy decisions.

He goes on to argue, strangely, that economics is a science because it is now primarily empirical. I’m not particularly interested in the argument over what counts as a “real” science – when I did physics, I remember people liked to make fun of biology for not being a real “hard” science, etc.

But I spend a lot of time talking to people across the scientific spectrum – physicists, biologists, psychologists, economists. And economists are consistently the outlier in what they think about and who they reference. They are simply not a part of the broader natural sciences community. Look at the interdisciplinary connectivity between fields in the picture above. There is a clear cluster in the center of social studies and a largely separate ring of the natural sciences. Here’s another way of viewing it:

No matter how you slice it, economics is just not part of the natural sciences community. It’s starting to edge there, with some hesitant links to neuroscience and genomics, but it’s not there yet. I find it all a bit baffling. Why has economics separated itself so much from the rest of the natural sciences?

Data to PhD students: you’re screwed

Enough people have sent me this beautifully depressing picture that I figured I’d put it up here for posterity. It would be nice to see it broken down by field – this figure covers science and engineering as a whole, but different fields within science and engineering have very different career paths! My friends with PhDs in biology are much more likely to try the academic route than my friends in bioengineering, for instance, and physics and chemistry PhDs seem to have more post-PhD career options than biologists (in my experience).

I also wonder how this varies with university; I know in Economics (and the humanities) that you’re much, much, much more likely to get a faculty position if you come from Harvard than if you come from, say, State University. I at one point went through a random list of neuroscience professors at top universities and found that a majority of the professors had come from the same few schools – which, sadly, did not include my own.

Nobel prize winners, who needs them?

Never let it be said that the shutdown isn’t affecting our ability to do science. I hear that researchers are locked out of NIH, but they’re not the only ones who have been shut down. Apparently 2012 Nobel Prize winner David Wineland has been found non-essential and furloughed:

Even if Stockholm’s prize committee found Wineland’s work groundbreaking, he was deemed expendable by the government last week.

“On the organization charts I’m just another worker, another non-essential,” said Wineland, sighing, during an interview from his home in Boulder…

“My experiments are completely stopped. It’s very challenging to stay ahead with competitive research when this happens; it just slows the research down,” said Wineland, a soft-spoken man with a white walrus mustache.

I think if my experiments were shut down like this I would just be pacing back and forth feeling nauseous. So many experiments rely on timing that months or years of planned experiments can go down the drain.

Not only is it interfering with the science itself, but the shutdown is beginning to interfere with conferences, too. Apparently satellite meetings of SFN are starting to be canceled.

Feel free to let us know what other effects the shutdown has been having on science.

Update: A good example from NPR