Monday, February 19, 2007

oh so popular on LibraryThing

I have just discovered something called LibraryThing--a website where people list and share their bookshelves! Nice. Anyway, the point of this is that one, yes just one, very special person has a copy of my recent book Middle World on his/her shelf. This means that, as LibraryThing helpfully tells me, there are 1,289,793 books more popular than Middle World!

I have made it. I am someone.

Someone pretty small and unimportant and unpopular, yes, but... someone!

Friday, February 09, 2007

fMRI and brain reading: mark beats guardian to top stories...

Ahem. I knew I was more on the ball than these well-funded journos at the ex-Manchester Guardian...
Compare their 'scientists scan brains' story with my post of several days earlier... 'the thought police'.

Seriously, as the Guardian article points out, there are probably important ethical issues... how far around the corner is thought-reading? MRI machines are certainly not very portable at the moment. But how long before every police station has one? And MRI experts are being called in to trials as expert witnesses...

What do I think about this? I'm not sure. Perhaps I need a scan to find out...?

Thursday, February 01, 2007

The theology of drag

Writing a lecture on the exciting subject of 'particle mechanics' the other day, I found myself digressing onto some rather spongy ground--the territory of religion, to be precise. And how a comet passing by in 1833 had unexpected consequences for the concept of a God-given eternal universe.

When a particle moves through a fluid, it inevitably suffers drag: friction slowing it down. For 'particle' read anything: a protein in a cell, your car on the motorway, a parachutist plunging from 5000 feet, a delayed 747 hauling a shedload of grim-faced holidaymakers as they finally escape the holding pens of Heathrow...

Or indeed, a comet in space. In 1833 observations were made of Encke's comet as it neared the sun. Those observations were compared with the comet's predicted path, calculated using Newton's celebrated laws of gravity. And the result was a big 'oops'. The comet did not quite agree with Newton: in fact it was plainly slowing down.

This indicated that even the distant region of the solar system through which the comet was passing contained enough gaseous material to cause drag--friction was stealing away some of the comet's energy and slowing it down. Obvious for a parachutist falling through the air or a protein in water perhaps--but even apparently 'empty' space was not actually empty.
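For the curious: the simplest textbook drag law--force proportional to speed, as for Stokes drag on a small particle in a fluid--gives an exponential decay of speed, which can be sketched in a few lines of Python. The numbers below are invented purely for illustration and have nothing to do with Encke's comet's actual orbit:

```python
# A minimal sketch of drag slowing a moving body, assuming the
# simplest (linear, Stokes-like) drag law F = -k*v.
# All numbers are invented for illustration.
import math

def speed_under_drag(v0, k, m, t):
    """Speed at time t for a body of mass m, initial speed v0,
    under linear drag F = -k*v, so v(t) = v0 * exp(-k*t/m)."""
    return v0 * math.exp(-k * t / m)

v0 = 100.0   # initial speed (arbitrary units)
k = 0.5      # drag coefficient (invented)
m = 10.0     # mass (invented)

for t in [0.0, 10.0, 20.0, 40.0]:
    print(t, round(speed_under_drag(v0, k, m, t), 2))
```

Of course a real comet's motion mixes gravity with drag (and, as we now know, jets of outgassing material), so this is only the friction-only cartoon of the story--but it shows the essential point: friction steadily steals speed, and nothing brings it back.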

At first sight this seems at best an interesting, but ultimately not earth-shattering finding. So there are a few gas molecules drifting around in space. Nice to know--but frankly, no big deal.

But the drag on the comet was actually of huge significance--not so much to science as to religion. The drag on the comet proved that even 'heavenly' objects were not subject to perfect, static, never-changing divine laws. If comets were slowed down, planets must be too. Ultimately, the earth and all the rest of the planets, losing their energy to friction with interplanetary gases, would finally run out of puff and spiral down into the sun.

The heavens were doomed by drag.

This was a problem for Christian models of the universe in the middle of the 19th century. Isaac Newton, like many scientists of his time, had derived the laws of gravity with a sort of theologically motivated template in mind. He wanted to show that the solar system--the heavens of God--was a stable, static, eternal system. A machine that God had set in motion at the beginning of time, and that would still be going round and round in exactly the same way umpteen eternities later.

And the laws Newton came up with did just that. They were static: they were universal and eternal. Forgetting relativity for a moment, they were right too: theological template or no, Newton was a real scientist who had no intention of going against the evidence. His static, eternal laws killed two birds with one stone: they gave a correct (near enough) mathematical structure to the way the heavens worked, and they satisfied a religious man's desire for a perfect, eternal, God-given solar system.

In a way then Newton made things easier for thinkers of the 18th and early 19th centuries: you could have science--and it didn't get much more scientific than Newton's law of gravity--and you could still have God. This was an important compromise--the sort of thing that, one wonders, might just have got people like Galileo off the hook. In the 19th century many of the big boys of physical science, such as William Thomson (later Lord Kelvin) and James Clerk Maxwell, also managed to have a very solid belief in the ultimate divinity of the cosmos--the God-given nature of the spectacular order and laws they themselves were uncovering as they lifted the skirts of nature. (Apologies if my metaphors are getting a bit uncomfortably mixed here...)

But. If a comet could slow down because of drag, and by extension then planets could slow down too, if indeed the solar system was doomed to be dragged down into final cataclysm--then the theology of the perfect heavens was shot to hell (so to speak). That magisterial work of Newton's, while not scientifically wrong, had had its theological carpet hauled out from under it.

(Incidentally, at the same time as Encke's comet was wheeling into view in the mid-1800s, the timelessness of Newton's mechanical universe was drawing trouble from quite another direction: engineers and scientists of the Industrial Revolution--such as that same William Thomson--were becoming uncomfortably aware that Newtonian timelessness did not agree with the laws of thermodynamics, the science of engines and energy. See my book Middle World for more on that story...)

Hence this cometary visit began to unravel the tidy compromise between Newtonian science and God-given universe. The drag on a comet meant that if you wanted to reconcile hard scientific law with fundamental theology you were going to have to do some rather more supple mental gymnastics.

So the middle of the 19th century was shaping up to be a torrid time for God... As if Darwin's ideas weren't bad enough, now a few tiny atoms of interstellar hydrogen were getting in the way of a hurtling misshapen lump of ice and causing a real headache for those of a theological disposition...

Wednesday, January 24, 2007

earth as engine

Thermodynamics was the scientific key to the Industrial Revolution: it set down the science of energy, heat, temperature and entropy; it defined at last, after a century of industrial stumbling-about-in-the-dark, the rules of how machines could be made to convert abundant (but not immediately useable) energy into useful work.

Funnily enough, while all these industrialists and scientists were struggling to understand the science of machines, they were actually standing on one. In the 1870s Kelvin (then just plain old Sir William Thomson) pointed out something peculiar about the tides of the earth.

Tides are driven by the varying gravitational attraction of the sun and moon as these and the earth swim gracefully like three big round swans through the heavens: the water on the earth's surface, not being nailed down securely to the planet, tends to slew around a bit.

Kelvin noticed, however, that there was something else that affected the exact motion of the oceans: the pressure of the air. As the tide of water sweeps in to the beach, it fights against air pressure weighing it down.

Hence changes in air pressure change the way the oceans shift. And it was known that air pressure varied between day and night--for the simple reason that the sun heats the air during the day and doesn't during the night. The upshot of this, Kelvin realised, was that the day-night cycle of radiated heat from the sun was converted into a day-night cycle of mechanical force (air pressure) which acted, just like a motor going round or a piston going up and down in a car engine, to cyclically force the motion of the oceans.

This is exactly what any engine does: for instance, a steam engine, the classic workhorse of the 19th century and the icon of the industrial revolution, uses heat from a fire to turn water into steam, generating pressure large enough to drive a piston in a cylinder--generating mechanical force. As the steam cools and condenses, air pressure drives the piston back into the cylinder. And then around again: heat, push out, cool, push in... Heat into mechanical work, all in a cyclic process.

Hence the cyclic heat-cool/push-pull of the air pressure on the oceans means the whole earth is a gigantic motor: converting, by means of air pressure, heat from the sun into a cycle of mechanical force on the water.

We live on the surface of a vast engine!

Thursday, January 11, 2007

Ensuring Best Practice amongst the Thought Police

I've been wondering about the revolutionary changes magnetic resonance imaging (MRI) brain scanners might bring to our society. My conclusions, to summarise for those who can't be bothered to read this whole post: the Thought Police will be able to catch all deviants; they will know when you are really rubbish at your job and have only been pretending all these years; and at last it will be easy to get a reliable builder.

First the deviants. Recent MRI experiments have spatially mapped brain activity in patients suffering from psychopathologies. Patients are shown pictures of happy people and pictures of frightened people. It turns out the brain of the psychopath (not sure that's the right word but anyway...) responds quite differently to 'normal' brains. Response to sight of a happy-looking person is markedly reduced compared to Johnny Normal; response to a frightened-looking person is totally different.

The potential consequence (likely to become reality pretty soon as we spiral deeper into our social paranoia, helped along by a feverishly 'security-conscious' government) of this sort of research is clear: why not scan everyone once a year for signs of deviation? Lock up those whose brains unwisely drift from the norm and start flashing in the wrong places in the MRI machine.

At last a way to actually achieve the dream of the Thought Police.

The second example of MRI brain-gazing concerns mathematicians (as far removed as you can get from psychopaths, of course). Experiments with MRI show that most people, if shown a number printed on a card, asked to memorise it, and then asked to recall it later, actually visualise the number again as they recall it. This even though, of course, the original visual stimulus is no longer there at the time of recall. However, the best mathematicians, as opposed to us ordinaries, apparently do not visualise as they go about their mathematical tasks: visualisation is bad for mathematics.

The implication is that you can tell a good mathematician because he or she isn't visualising--and most importantly this is something you can detect and measure directly, by MRI brain scan. You can tell, in other words, when the mathematician is doing his job properly.

There are presumably similar 'best practice' (apologies for using this horrible phrase) patterns of brain activity for other professions and activities. Which bits of the brain does your top physicist crank up when trying to make a recalcitrant oscilloscope do what it's supposed to instead of throwing out random data because the wires are loose...? Which mind-portions are flashing wildly as a doctor makes an insightful diagnosis, turning accepted medical science on its head in the best 'Casualty' tradition? What's the pattern in a politician's brain as he or she invents yet another vacuous 'initiative' instead of doing something useful? (All right, that last one's just a joke--of course politicians don't use brains.)

MRI brain scanning seems to offer the possibility, then, of direct measurement of performance. Once you have classified (and perhaps here is the rub, as they say) just what brain activity pattern corresponds to a job done well, you have a way to directly measure how well your candidate is doing.

This has huge potential consequences. At last we will not have to suffer the havoc wreaked by cowboy builders--just scan the chap's mind while he's scratching behind his ear and saying 'tsk tsk and then there's your overheads, mate...' At last no more navigationally-challenged cab-drivers: 'You're not taking me south of the river at this time of night, especially not when your frontal lobe's hardly even registering above noise level, driver.'

And then there's education. As a university teacher (at Nottingham, home of the invention of MRI no less), trying desperately to communicate the enlightening principles of my subject to the generation that is to come after me, I am constantly wondering if any of it is registering. Now all I have to do is scan my students' brains to see if they're learning to think the right way. We won't even need exams anymore.

Except for the teachers, of course, whose brain patterns we will examine judiciously before we give them jobs.

But none of us are safe, of course. George Bernard Shaw once said that everyone should be required to justify their existence, once every five years--on pain of death. Well, now we can do it--we can all be tested--and no amount of bullshit and use of fashionable buzzwords will help you. Because it's the lights in your brain that count.

Here's hoping I'm not in the wrong profession...

Tuesday, January 09, 2007

We are all wasters

I recently stumbled across a peculiar analogy between the science of heat and energy--so called thermodynamics--and the socio-economic basis of society: an analogy whose implication, I'm afraid, is not very flattering for modern society.

One of the first proponents of the importance of the idea of 'work' in physical science was William Whewell, in his book 'The mechanics of engineering', published in 1841. In modern parlance, 'work' equals force x distance: the 'work done' in moving an object, for instance, is the force required multiplied by the distance moved. Work measures the energy required to make something happen, essentially.

It's difficult to realise nowadays that this was a big idea in the 19th century. Scientists still had not grasped the basics of energy, temperature, heat, and efficiency--even though the industrial revolution, relying absolutely on the science of these things, was already in full swing.

Whewell's book actually uses not 'work' but the term 'labouring force'. This is a nice clue to one of Whewell's other preoccupations (shared no doubt with many a concerned participant in the industrial revolution): labour not as an abstract scientific concept, but as that stuff done by people.

Whewell distinguished between what he called 'productive labour' and 'unproductive labour'. In society, productive labour was that which went to actually producing something. Unproductive labour was that which went to serving someone--nothing concrete was actually produced, the labour involved (by your butler, your maid, your cook, your footman, your gardener, your groom...) simply dissipated into thin air.

In engineering terms, Whewell was making a distinction of vital importance to industry (and later to the science of energy, heat and work): the distinction between useful and 'wasted' effort. 'Productive labour' was energy that did something useful; 'unproductive labour' was waste heat.

The efficiency of a machine depended on the ratio of useful to wasted power. Central to the science of thermodynamics (as the smorgasbord of heat-temperature-energy-entropy science eventually came to be called) was this same distinction between directed and dissipated energy.
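That ratio is easy to state concretely. A hypothetical sketch in Python (the power figures are invented for illustration):

```python
# Efficiency as the ratio of useful to total power--
# Whewell's 'productive' vs 'unproductive' labour, in machine terms.
# The power figures below are invented.
def efficiency(useful_power, wasted_power):
    """Fraction of total power that goes into useful work."""
    total = useful_power + wasted_power
    return useful_power / total

# A machine delivering 30 kW of useful work while dissipating 70 kW:
print(efficiency(30.0, 70.0))  # 0.3
```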

The socio-economic roots of Whewell's terminology imply, then, an interesting and rather depressing conclusion. In the 19th century (and much of the 20th) Britain's economy was undoubtedly of a productive bent: masses of iron, steel, chemicals, textiles. Since the 3rd of May 1979 (T-day, if you will) we have of course been busily converting to the 'service economy': exchanging iron, steel and cloth for call-centres, consultancies, and those nowadays-ubiquitous people offering to wash your car for a fiver. (Whatever happened to the kids' pocket-money economy?)

By the analogy between Whewell's productive and unproductive labour and useful and wasted energy, then, our modern service economy is, basically, an almost entirely wasteful machine.

And we are all... well, wasters.

Oh dear.

Friday, January 05, 2007

Terribly inspiring lecturers...?

Is there an incompatibility between great visionary science and good teaching of science? I started to wonder about this yesterday as I read about a couple of the 'greats' of the past and their purported ineptitude in the lecture theatre. (Much of the detail below I owe to J. G. Crowther's 1968 book 'Scientific Types'.) Perhaps there are lessons we can learn here.

Take the great experimental physicist C. T. R. Wilson--inventor of the so-called cloud chamber. Wilson was a wiry, quiet-spoken Scot who has a good claim as the father of all particle physics experiments. His invention of 1911 allowed the tracks of atoms and sub-atomic particles to be seen directly for the first time. (This only a few years after 'proof' that atoms existed at all came from Jean Perrin's experiments on the Brownian motion of much larger particles in liquids--see Middle World for that whole story!)

But Wilson, it seems, was a terrible lecturer. He would stand facing the blackboard (ah, those were the days, 'I love the smell of chalkdust in the morning' etc), mumbling more or less inaudibly, and as he came to the key points his voice would drop even lower, as if forcing the audience to strain to hear might also force them to listen. (Maybe there's some logic in that, now I come to think of it...) After Wilson gave a Friday evening lecture at the Royal Institution the organiser took him aside and told him he had 'made just about every mistake it is possible to make when giving a lecture'.

And yet, funnily enough, many of Wilson's later-to-be-illustrious students at the Cavendish in Cambridge, while admitting how terrible a lecturer he was, also report how inspiring he was. Lawrence Bragg, for instance, later to be Nobelised for his perfection of the technique of using X-rays to measure the atomic structure of materials (work which, it has been said, led to the whole of molecular biology, starting with Crick & Watson's DNA helix): according to Bragg, Wilson's lectures 'were the best, and the delivery was the worst, of any lectures to which I have been...'

The great 'father of quantum physics' Niels Bohr was apparently a similar lecturing basket-case, being described in action as 'almost inaudible and unintelligible, yet in effect very inspiring... Bohr had an extraordinary power of conveying to his hearers that they were in the presence of profound insight, and in direct contact with the inner workings of nature, whether they could follow his argument or not...'

I also remember the famous story at Edinburgh, where I did my PhD, of how James Clerk Maxwell (the true Daddy of 19th century physics, second only, or rather third only, to Newton and Einstein in all the history of physical scientists) had been turned down for a job at Edinburgh--not because he was anything other than a frighteningly clever genius, but because he apparently wasn't very good at teaching. Good call, Edinburgh--down at the Cavendish Maxwell went on to create the theory of electromagnetism on which most of modern life is based (see David Bodanis's recent book). (At the same time Maxwell actually designed and built the Cavendish lab itself, too.)

Another example is Osborne Reynolds, one of the first engineering professors in the UK, whose work on liquid flow and turbulence was the basis of most 'big engineering' of the later Industrial Revolution--steam turbines, propeller-driven ships and aeroplanes, you name it. One of Reynolds' pupils in the 1870s at Owens College in Manchester (later Manchester Uni) was J. J. Thomson (subsequently to discover the electron--science really was a pretty small club of high achievers in those days...) Thomson described Reynolds' style: he would rush into the lecture room, open a textbook at random, declare a formula he found to be wrong, write it out on the blackboard, proceed from first principles to prove it wrong, and finally after many false starts and crossings-out announce at the end of the lecture that the formula was right after all.

Doesn't sound a very efficient way to learn--yet Thomson reportedly found Reynolds' lectures strangely educative: perhaps it was because Reynolds exposed the tangled undergrowth of much 'accepted' science, demonstrating that a real scientist never really accepts anything without trying to tear it apart first.

So what's the moral here? Perhaps the real eye-openers of science aren't the best people to be teaching 'the masses'--I mean those of us who are destined to be little more than footsoldiers in the War On Ignorance? I'm not sure. Teaching these days is in danger of getting caught in the politico-economic tug of war between tabloid government and cost-cutting business. Tony Blair wants to tell us what we should be teaching, and so do the business organisations. Shame that few on either end of this tugging rope have a real concept of what it means to try to understand nature--to see something in front of you and want to find out for yourself what it is and how it works and how it might be used.

What these 'terrible lecturers' such as C. T. R. Wilson, Niels Bohr, Osborne Reynolds, etc, had was the ability to inspire. The human brain--even the student brain, oversteeped in alcohol and playstationol though it sometimes is--is the most powerful learning technology we have. What we need to do as teachers is simply find out how to switch it on. I mean inspire it.

And thus start those brilliant brains on their own journey out into the mysterious world around us...