Wednesday, July 29, 2009

The Notion of Disputation Arenas

Over in the closed discussion group of the Lifeboat Foundation, there's been discussion of the concept of "Disputation Arenas" -- the notion that the art of argument badly needs an upgrade for modern times.

Back during the Middle Ages, there were occasional attempts to bring together the wisest members of disparate, bickering factions in order to hear out both sides.  The most famous of these disputations pitted Catholic prelates against prominent rabbis, and they were anything but fair - always aimed at a foregone conclusion.  Yet the rabbis came, nonetheless.  Why?  Because a little bit of light, in the darkness, is better than none at all.

And so, as the fellow who coined the term "Disputation Arenas," I have decided to post my response to the Lifeboat group here, for public tasting...

(For detailed background, see the lead article in the American Bar Association's Journal on Dispute Resolution (Ohio State University), v. 15, no. 3, pp. 597-618, Aug. 2000.)

My article is now available on my website: Disputation Arenas: Harnessing Conflict and Competition for Society's Benefit.

 -------------------

Regarding the basic notion behind Disputation Arenas...
...I never envisioned a single forum where "truth" would be decided. Rather, the notion was simply to empower already-existing enlightenment processes to do better at their task of pragmatic problem-solving. One of the core elements of the Enlightenment, after all, is argument... the harnessing of inter-human competition toward the discovery of errors and of ever more effective models of the world.

An aside -- while I deeply respect my pal Robin Hanson for his lively mind and far-reaching intellect, I never did understand his argument (with which I disagree) that disagreement, in itself, is inherently flawed and not solvable by argument. (Alliteration is intentional.)

Rather, what seems inherent is our human propensity -- nay, genius -- at self-delusion. It is the core human quandary, and one that puts the kibosh on all platonic notions of rule by simple reason. The very best of us fall for delusions -- and moreover, we have no clear way of determining which of us is "the best of us." The one method by which human beings can reliably be made aware of their delusions is through interaction with others -- (a crucial point that we need to make clear to burgeoning artificial intelligences!)

YOU are capable of noticing the delusions that I am too in-love-with to spot or correct. In pointing them out, you do me the service of reciprocal accountability (RA) -- or criticism -- a great boon, allowing self-improvement, and a boon which I'll be only too happy to serve back to you, in plenty. As a favor, of course.

The irony -- that competition thus overcomes our resistance to criticism, and thus fosters a form of (involuntary) cooperation -- is rich and thick and delicious as cake.

(As ironic as the fact that the most vociferous "defenders of competitive markets" are all too often those who do not get it, and strive always to harm the core process.  And yes, Cato Institute, I am talking about you.)

(See how all this fits into the Big Picture.  Those with immense patience and stamina might even try my way-over-caffeinated (but entertaining) talk at Google about "Discourse and Problem Solving in the 21st Century!")

Am I suggesting that Twitter and Second Life and Facebook are helping to lobotomize us, at a time when we really need technologies that might help bring out our best and most mature problem-solving skills?  Well, yes, though I am not invested in pessimism, like Bill Joy or Nicholas Carr.  I feel that these "attention spreading" systems might have some positive effects (perhaps even 1% as much as zealots like Clay Shirky envision!)  But only if they are augmented by other methodologies - mostly not invented yet - that also help us to rediscover focus.

The crux point is that current fads and trends DO enhance self-expression, vastly, but they also make it trivial to avoid criticism... or, rather, to avoid having to notice, respond to, or perform self-modification as a result of criticism.  Those who praise ONLY vastly-enhanced self-expression, while ignoring the other half of the Creative Cycle, may be very bright, but they are being zealous fools.

Parse this carefully.  The pessimist curmudgeons urge us to step back from the cliff of lobotomization-by-technology by renouncing some of these tools and restoring older ways -- a method that never, ever worked in the past.

Meanwhile, the fervid optimists cry out hosannas to Twitter.

Both sides are silly.  To guys like me, who are skeptical of every broad-brush generalization, who love technology but want it to empower pragmatic problem-solvers, it is clear what's missing. And, yes, the solution is more technology...  only much better balanced technology.

Hence - getting back on-topic - the key features of any Disputation Arena system must include not only excellent tools for argument-management, position-parsing, analysis and all that. It must also address the problem of how to get people (or advocacy groups) to come! And how to encourage an environment where ALL participants have to grudgingly acknowledge "Hm... I guess I need to take that into account."
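
(For the programmers out there: here's a toy sketch -- my own idle doodle in Python, not a spec for any real system -- of the bare bones such an arena might need, just to track claims, challenges, and those grudging on-the-record acknowledgments:)

```python
from dataclasses import dataclass, field

@dataclass
class Claim:
    author: str
    text: str
    challenges: list = field(default_factory=list)   # (critic, rebuttal) pairs
    concessions: list = field(default_factory=list)  # points acknowledged on the record

class Arena:
    """Toy disputation arena: every claim is open to challenge,
    and every challenge demands an on-the-record response."""

    def __init__(self):
        self.claims = []

    def post_claim(self, author, text):
        claim = Claim(author, text)
        self.claims.append(claim)
        return claim

    def challenge(self, claim, critic, rebuttal):
        claim.challenges.append((critic, rebuttal))

    def concede(self, claim, point):
        # the "Hm... I guess I need to take that into account" step
        claim.concessions.append(point)

    def owes_an_answer(self):
        # claims that have been challenged but never acknowledged anything
        return [c for c in self.claims if c.challenges and not c.concessions]
```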

Note that that is PRECISELY what happens in the four existing "accountability arenas"... markets, democracy, courts, and especially science. In the first two, it is filthy and inefficient, but also glorious, compared to all past, delusion-drenched civilizations. So, what I am asking for is not impossible.

It would, however, require some focus... and money... to implement. I can think of no more valuable thing for a billionaire to sponsor. But, then, I am not a billionaire and the delusion that I can tell billionaires what to do is... well, a rich one that's been subject to the criticism of life experience.

david brin

===

ADDENDUM

See a fascinating appraisal of the way that Amazon recently sent fingers into every Kindle device that had George Orwell's "Nineteen Eighty-Four" on it, removed the book and refunded the purchase price, without notification -- an act with ironic resonance in many ways. 

The author goes on to offer some interesting comparisons of the Kindle to the eReader.  What fascinates me is the extent to which we have allowed the new media to eliminate freedoms that we had in the era of videotape, audio cassettes and early computer disks.  True, copyright piracy is (generally) bad.  But the bloody inconvenience and blithering incomprehensibility of simply using a modern DVD player to watch a film that you already own - let alone record an episode of NOVA - is why I still keep three VCRs in the house.

Monday, July 20, 2009

Online events and other coolstuff

First: Looking back 40 years: On July 20, 1969: My brief essay in commemoration of the 40th anniversary of the first moon landing is now up on Tor.com. A surprising perspective on art, ambition and the problem of ennui.

Third Millennium Problem-Solving: My recent talk for the USENIX Conference is available for viewing online.  A bit nerdier than my usual speeches about the future, which are for more general audiences.  This bunch of techies seemed to really get into it! So I went a little long.

H+ asked David Brin, Ben Goertzel, J. Storrs Hall, Vernor Vinge, and others: "Is a Terminator-like scenario possible? And if so, how likely is it?"  Extrapolation! Peering into tomorrow!  What fun.

Here’s the latest compilation of my five-star-rated YouTube appearances.

COOL QUICKIES

A fascinating look at how your native language alters the way that you think.

See a terrific (and sfnally philosophical) comic strip: Dresden Codak.

UnscientificAmerica"Unscientific America: How Scientific Illiteracy Threatens Our Future" is co-authored by Chris Mooney and Sheril Kirshenbaum, and is now available in stores across the country and online. (See my review of the book.)

Curtis Wong, a Microsoft researcher I’ve had some cool exchanges with, has brought to life -- partly at Bill Gates’s encouragement -- “Project Tuva,” which will now bring you some of the greatest, most inspirational physics lectures of Richard Feynman.  (Remind me, some time, to tell you some of my own stories about the man: how Feynman once stole my date at a dance... well, for a while... and how he tricked me into becoming (alas) a physics major.)

FASCINATING

Researchers report that rapamycin, a compound first discovered in the soil of Easter Island, extended the expected lifespan of middle-aged mice by 28 to 38 percent. In human terms, this would be a greater increase in years of life than if cancer and heart disease were both cured and prevented.  (BTW “rapa” comes from Rapa Nui, the island’s real name. See EARTH.) (Thanks, Stefan.)

Monkeys that consumed 30 percent fewer calories than their average peers were one third as likely to get an age-related disease and were likely to live longer.   Yeah, yeah... I have heard it all before.  So why do we see almost ZERO sign of such an effect in humans? (Putting aside obesity, of course.)  After 4,000 years, we’d know by now if ascetic monks lived longer.

In fact, everybody has it bass-ackwards!  Semi-starvation triggers switches in mammals that say “delay your programmed burnout, in case better times give you a better chance to breed.”  But it doesn’t happen in humans - because we have ALREADY thrown all those switches!  Our lifespans are already HUGE for mammals.  We get three times as many heartbeats.  Because, for a million years, it benefited tribes to have some elders around as repositories of lore.  Result?  We are already picking all the low-hanging longevity fruit.  In the case of humans, further increases are gonna need some real sophisticated intervention.

Funny thing.  Not a single researcher in this topic has (to my knowledge) posited this “thrown switches” way of looking at things.  My theory is actually a hybrid of the two big models of ageing -- that it is programmed-in vs. that it is an accumulation of genetic and organic errors.  What I am saying is that it is clearly programmed-in for all mammalian species EXCEPT humans, who have already pegged and maxed-out all the dials.  For us, ageing really is about accumulated errors and running out of steam.  Which means that animal analogues and models are of very limited utility.

Watch this one to figure out the joke. Be sure to watch it to the "end"...as the stewardess walks away.

Friday, July 17, 2009

Does the Moon Beckon Us Back?

As the father of three teenagers, I share with millions of other boomers a head-scratching perplexity. Why don’t today’s youth care about outer space?

The easy answer would be to seize upon a simple nostrum -- about each era rejecting the obsessions of the one before it. But then, in that case, why is the very opposite true about popular music? Back in the hippie era, music divided the generations. But today? Well, my kids adore classic 60s and 70s Rock. In a surf shop or bike store, all I have to do is mention a few of the concerts that I snuck into, long ago, and the brash young fellers are at my feet, saying “tell us more, gramps!”

So why do they yawn, when we turn to the NASA Channel or tape the latest shuttle launch to show them after dinner, or when we talk about colonizing Mars?

Or when we brag about being members of a species that walked on the Moon? For certain, you don’t hear “astronaut” mentioned on any list of dream jobs.

Puzzling over this quandary, I was reminded of something Norman Mailer said when he wrote his Apollo-era tome Of a Fire on the Moon. Mailer had begun researching the book amid feelings of smug, intellectual hostility toward the crewcut engineers and fliers he encountered... only then his attitude shifted, when he realized, in a startled epiphany: “They were achieving not one, but two bona fide miracles.”

Feats that -- when Mailer really thought about it -- struck him as truly Biblical in proportion.

1. They were actually going to the Moon!

2. They were actually succeeding in making such an adventure boring.

Mailer’s insight came to mind, while I was talking to my kids about the 40th anniversary of the Apollo 11 landing. Of all the predictions* ever made about spaceflight, I figure the least imaginable outcome would have been ennui.

*(Speaking of predictions: in a 1959 Jeff Hawke comic strip, the writers forecast that the first human landing on the Moon would happen on 4 August 1969, missing the real-life date by only two weeks.)

Of course, policy has had a lot to do with it. Members of the astronaut corps were always willing to accept a level of calculated risk similar to -- if more carefully managed than -- that of the adventurous pioneers of aviation. Perhaps the public might also have accepted the kind of casualty rates that usually occur on a frontier -- they did in Lindbergh’s time. But politicians could not. They wanted promises of “routine access to space.” And so, the shuttle proved an expensive and awkward mix of overblown promises, lost opportunities, relentless nit-pickery and mind-numbing sameness. Not at all what we expected, back when my peers sat in dazed wonder, in the front row, watching Stanley Kubrick’s “2001: A Space Odyssey.”

Nor is that entirely a bad thing. As I point out elsewhere, we may have failed to build the magnificent, rolling space hotels and moonbases that frolic to Strauss waltzes. But our civilization is a better one than was depicted in that film. And if I had to choose...

Now consider a few other perspectives. For example: ever since the invention of the steam locomotive, human beings (or their machines) managed, with every passing year and decade, to keep traveling faster at an accelerating rate -- a curve that kept spiking ever more vertical, until we launched the Voyager space probes on their pell-mell fling past Jupiter and beyond the Solar System, in the late 1970s. Extrapolating that curve of ever-greater speed, some expected that we would, by 2010, dispatch probes to distant stars! We might easily have landed humans on Mars, using Freeman Dyson’s marvelous Orion-drive ships. It all appeared as inevitable and obvious as Moore’s Law of computer development seems to a different generation of techie-transcendentalists.

Only then, quite suddenly, the curve of acceleration abruptly stopped -- after 150 years. The Voyagers still represent, in many ways, a high water mark of humanity’s progress in space, culminating and concluding our raucous search for speed. At least, for now.

Indeed, millions now look at the Space Race obsession as a mark of earlier immaturity. Sure, we benefit from weather and communication satellites, and reconnaissance sats spread the worldwide strategic transparency that arguably saved all our lives during the Cold War. People are moderately proud of robotic space probes like Hubble and Cassini and Spirit and Opportunity. But when it comes to dreams of men and women venturing into the vacuum wastes, well, you can hardly even find that happening in movie sci-fi anymore, let alone in our real-life ambitions.

Certainly, when it comes to the actual Moon itself, I look with skepticism upon any thought of hurrying back there. My own graduate research advisor was the fellow who predicted there might be ice in lightless crater-bottoms, at the north or south lunar poles -- and if it turns out to be true, there may be something useful about the place, someday. But, despite a politician's grandiose boondoggle, it hardly seems a useful destination. Not compared to the riches that await us at near-Earth crossing asteroids, for example. Or that prime piece of real estate that has already caught the Russians' eye -- Phobos. Or the possible abode of life that is Europa.

And yet, in honor of this anniversary, I want to make two points, in defense of those quaint old missions to the Moon.

First, they serve as a backstop against the gloom and pessimism that seem to be preached by cynics of both right and left, at every turn. How many of the arguments for some ambitious enterprise or another begin with: “If we could go to the moon...” Damn right. If we could do that... well... we could do a heckuva lot of cool things, with some gumption, that is.

Second - I believe the Apollo missions helped to create some of the most important art in human history.

That's a bold and strange statement. But let me dare to define effective visual art as some work or representation that subtly changes human beings just by the sight of it, transforming hearts and minds without verbal or logical persuasion.

By that reckoning, the 20th century featured two hugely effective works of visual art, both of them gifts of physics! First, the terrifying image of the atom bomb altered forever our little-boy romantic attachment to war, beckoning us instead to grow up a bit in dealing with this new and awesome power to destroy. Defense became the business of serious grownups. Even (especially) among soldiers, war itself is now seen as evidence of failure - an urgent and risky measure arising out of inadequate diplomacy, preparation or deterrence.

The second image that changed us was a gift that arrived at the very end of one of the most difficult years any of us can remember - 1968 - a year that brought most Americans to the brink of exhaustion and despair.


Only then, a final token arrived -- like a gleam of hope shining at the bottom of Pandora’s Box... when the Apollo 8 astronauts brought home that first perfect image of the Earth, floating as a blue marble in space. A picture that moved even the most cynical hearts and changed forever our outlook towards this fragile oasis world.

I'm willing to argue that it was that image -- a work of art created purely by humanity’s scientific boldness and ambition -- that transformed us more than anything else. Perhaps making us better, more responsible citizens and world-managers. But also -- one can hope -- possibly sending us down roads that will make us more ready and more worthy, when the day comes for our children’s children to reverse things yet again, to once again resume chanting:

“Let’s go!”

====

For a somewhat expanded version of this essay... and other goodies(!)... drop in at the wonderful site TOR.com.

Monday, July 06, 2009

More Science

More from More Science High!  Continuing the cornucopia of interesting things....

I'm on the BBC World Service yet again, this time commenting on "geo-engineering"... or proposals to cool the Earth artificially to compensate for global warming.  I'll announce the podcast site once it's posted.  Till then, read this background article:

A GIANT inflatable tower could carry people to the edge of space without the need for a rocket, and could be completed much sooner than a cable-based space elevator, its proponents claim.  The team envisages assembling the structure from a series of modules constructed from Kevlar-polyethylene composite tubes, made rigid by inflating them with a lightweight gas such as helium.  My colleague Jeff Hecht has a cool article on this in New Scientist.  Of course, I described this system in SUNDIVER, back in 1979 -- the "Vanilla Needle," named after my friend Ron Finnila, who first mentioned the idea to me.  I even have extensive notes for a way-cool graphic novel that would have featured Jacob Demwa saving the huge, inflated needle.
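
(Back-of-envelope fans can check the "lightweight gas" part for themselves.  Here's a quick sketch -- my own arithmetic via the ideal gas law, not the authors' figures -- of how much weight each cubic meter of helium offsets at sea level:)

```python
# Net buoyant lift of helium vs. air at sea level (ideal gas law: rho = P*M/(R*T)).
P = 101_325.0    # pressure, Pa (1 atm)
T = 288.0        # temperature, K (about 15 C)
R = 8.314        # gas constant, J/(mol*K)
M_AIR = 0.0290   # molar mass of air, kg/mol
M_HE = 0.0040    # molar mass of helium, kg/mol

rho_air = P * M_AIR / (R * T)  # about 1.23 kg/m^3
rho_he = P * M_HE / (R * T)    # about 0.17 kg/m^3

# Each cubic meter of helium offsets roughly a kilogram of structure.
print(f"Net lift: {rho_air - rho_he:.2f} kg per cubic meter")
```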

Ah, but priority is difficult to establish.... Still, will someone add this to the Brin Prediction wiki, please?  Anyone know how to contact the authors? ;-) 

Breakthroughs in understanding how memories form in the brain.
Researchers report that “unpleasant memories are stored by the persistent action of the enzyme PKMzeta, a form of protein kinase C,” and that “these memories can be rapidly erased by injecting a PKMzeta inhibitor into the brain.” Using the inhibitor ZIP, they confirmed that “unpleasant long-term memories in the hippocampus, a region of the brain critical for storing spatial information, are rapidly erased.”  This raises many questions. If human memory can be erased like a computer's hard drive, what happens to the “overwritten” memories? Is there a biochemical equivalent to disk-restoration software?

A girl who looks and acts one or two years old is actually 16.  In an almost perfect real-life version of Harlan Ellison's famous short story "Jeffty Is Five," she seems not to suffer from dwarfism.  Albeit with some uneven dysfunctions, she has simply stayed two.  Science (performed gently, of course) is going to learn a LOT from this special person.

More intelligent people don't have more connections, but they have more efficiently placed connections (??). Other studies have shown that physical connections between brain regions via white matter -- which contains no neuronal cell bodies -- are also related to intelligence.

It seems the particles that Enrico Fermi dubbed neutrinos, meaning "little neutral ones", might stretch across billions of light years. The big bang produced huge numbers of "relic" neutrinos, which are quantum-mechanical superpositions of three different mass-energy states. In the early universe, all of these states would have moved at close to the speed of light. But according to calculations by George Fuller and Chad Kishimoto of the University of California, San Diego, as the universe expanded, the most massive of these states slowed down in the relic neutrinos, stretching them across the universe. This raises the possibility that only one of the neutrino's states could fall into a black hole. It's unclear what would happen to the others if this occurred, says Fuller. Wow. 

A cell phone that never needs recharging might sound too good to be true, but Nokia says it's developing technology that could draw enough power from ambient radio waves to keep a cell-phone handset topped up.  Ambient electromagnetic radiation -- emitted from Wi-Fi transmitters, cell-phone antennas, TV masts, and other sources -- could be converted into enough electrical current to keep a battery topped up. Hey, my sons just built crystal (diode) radios.  They were excited to hear a station, clear as a bell, without battery or wall power!  That is, till they found it was the K-Praise fundamentalist station... and no adjustment of the variable capacitor or coil would change it!  How can that be?  It appears that the diode, itself, is tuned to one station!  Help!
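
(In case the boys want to check their coil-and-capacitor math: a tuned circuit picks out the station whose frequency matches its LC resonance.  A quick sketch of the standard formula, with illustrative component values of my own choosing:)

```python
import math

def resonant_freq_hz(inductance_h, capacitance_f):
    # Standard LC resonance: f = 1 / (2 * pi * sqrt(L * C))
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

L = 250e-6   # a typical crystal-radio coil: 250 microhenries
C = 365e-12  # a common variable capacitor near maximum: 365 pF

# About 527 kHz -- the bottom edge of the AM broadcast band.
print(f"{resonant_freq_hz(L, C) / 1e3:.0f} kHz")
```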

Three researchers published a manifesto in Nature in 2001, declaring that the way to make a synthetic cell was to get a protocell and a genetic molecule to grow and divide in parallel, with the molecules being encapsulated in the cell. Simple fatty acids, of the sort likely to have been around on the primitive Earth, will spontaneously form double-layered spheres, much like the double-layered membrane of today’s living cells. These protocells will incorporate new fatty acids fed into the water, and eventually divide.

Living cells are generally impermeable and have elaborate mechanisms for admitting only the nutrients they need. But Dr. Szostak and his colleagues have shown that small molecules can easily enter the protocells. If they combine into larger molecules, however, they cannot get out, just the arrangement a primitive cell would need.  

Nucleotides consist of a sugar molecule, like ribose or deoxyribose, joined to a base at one end and a phosphate group at the other. Prebiotic chemists discovered with delight that bases like adenine will easily form from simple chemicals like hydrogen cyanide. But years of disappointment followed when the adenine proved incapable of linking naturally to the ribose.

Last month, John Sutherland, a chemist at the University of Manchester in England, reported in Nature his discovery of a quite unexpected route for synthesizing nucleotides from prebiotic chemicals. Instead of making the base and sugar separately from chemicals likely to have existed on the primitive Earth, Dr. Sutherland showed how under the right conditions the base and sugar could be built up as a single unit, and so did not need to be linked.

Another big breakthrough: researchers at Imperial College London have discovered that a mixture of left-handed and right-handed molecules can be converted to just one form by cycles of freezing and melting.

See a review of a book about the subtle ways even the simplest life forms "compute": Wetware: A Computer in Every Living Cell, by Dennis Bray.

Political side note.  See Russ Daggatt's excellent compilation of views about events in Iran.

Heck, while I'm at it: these are the best of the old Outer Limits, now available on Hulu!

"The Architects of Fear" is the episode that inspired the graphic novel WATCHMEN.  I hope soon they'll post season two... with the incredible Harlan Ellison story "The Demon With The Glass Hand."

http://www.hulu.com/watch/63098/the-outer-limits---original-the-architects-of-fear

http://www.hulu.com/watch/63087/the-outer-limits---original-the-sixth-finger

http://www.hulu.com/watch/63086/the-outer-limits---original-the-man-who-was-never-born

http://www.hulu.com/watch/63097/the-outer-limits---original-second-chance

http://www.hulu.com/watch/63091/the-outer-limits---original-the-bellero-shield

http://www.hulu.com/watch/63077/the-outer-limits---original-moonstone

http://www.hulu.com/watch/63076/the-outer-limits---original-fun-and-games

http://www.hulu.com/watch/63080/the-outer-limits---original-a-feasability-study

http://www.hulu.com/watch/63081/the-outer-limits---original-the-forms-of-things-unknown

Wednesday, July 01, 2009

The World Moves Ahead to More Cool Stuff

Time for my monthly cornucopia of cool (and non-political) news from the exciting world around us.
Brin-volved Items:

"FiRe CTO Design Challenge": Author, physicist, and host David Brin leads the challenge of "Water Beyond Tomorrow: Using Technology and Innovation to Provide San Diego (and the World) with Adequate Safe Water for Future Decades"  at this year’s “FiRe Conference (Future in Review). 

I had the honor of hosting and stimulating and challenging some of the brightest technology officers in modern business, including: Sophie Vandebroek, CTO, Xerox, and President, Xerox Innovation Group; Eric Openshaw, Vice Chair and U.S. Technology Leader, Deloitte; Per-Kristian (Kris) Halvorsen, SVP and Chief Innovation Officer, Intuit; Ty Carlson, Architect, SiArch Group, Microsoft; and Joe Burton, CTO, Cisco.

I was interviewed on the BBC World Service on the issue of “bombing” a lunar crater to discover whether there is ice on the moon.  The interviewers worried deeply about littering... but it turned into a delightful and fair-minded treatment of the topic.  If it is no longer up, I hope to post it at http://www.davidbrin.com

See an excellent and eye-opening article about The Participatory Panopticon, by Jamais Cascio, that includes an interview with David Brin about our ongoing rush toward a transparent society.


Non-Brinvolved Items:

See a fascinating interview with Robert Wright, one of the most important authors of our time, about his new book, The Evolution of God, about the roots of religion.

HPlus Magazine finally releases its new summer issue! It describes the already-existing brain/computer interfaces - and where they could take us - and explains Dartmouth-built robots whose artificial neurons can mimic the human learning process. There are 84 pages of online-only goodness, including laser-stimulated brain cells, artificial muscles, and an interview with NASA's director of research (who suggests robot exploration of Mars). And NPR's Moira Gunn assays the implications of the U.S.'s abrupt welcome for stem cell therapy.

Incredible!  The next interactive game technology:

See the blog of the production company making "The People Vs George Lucas” --  a full length film, due next year, riffing off my book STAR WARS ON TRIAL.

Think Link appears to address some serious deficits in the current, sad state of "discourse" online.  I envision combining it with a good reputation system.  The result could be a real step toward the kind of "disputation arenas" I described in the American Bar Association's Journal of Dispute Resolution.
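
(What do I mean by a reputation system? Here's one toy possibility -- my own sketch, nothing to do with Think Link's actual design -- borrowing the Elo update rule from chess: you gain standing when an opponent concedes your challenge, and lose a little when your own claim gets knocked down:)

```python
def update_reputation(rep, challenger, claimant, conceded, k=16.0):
    """Elo-style update after a disputed claim is resolved.
    rep maps participant -> rating; conceded is True if the claimant
    acknowledged the challenger's point.  Toy illustration only."""
    ra, rb = rep[challenger], rep[claimant]
    expected = 1.0 / (1.0 + 10 ** ((rb - ra) / 400.0))  # challenger's expected score
    actual = 1.0 if conceded else 0.0
    delta = k * (actual - expected)
    rep[challenger] = ra + delta
    rep[claimant] = rb - delta

rep = {"alice": 1500.0, "bob": 1500.0}
update_reputation(rep, "alice", "bob", conceded=True)
print(rep)  # alice: 1508.0, bob: 1492.0
```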

Somebody's thinking about What Comes After Email.  I have received several emails from people who think there are similarities to my Holocene Project... which I pitched at Google the same day that the patent was awarded, a while back.  Me?  At a first, hurried glance, I don’t see a whole lot of Holocene in Google Wave... but I can see that it would be vastly improved by incorporating Holocene concepts.  Alas, I have found that many bright fellows cannot see the hand in front of their face.  Ah well, I wish them well.  Opinions?

Stunning. And right now this volcano is affecting our sunsets and putting a temporary dip in global warming.

The issue of cyberwarfare.

A simple way has been found to convert plant cellulose into HMF, a basic building block for fuel, polyesters, and other petroleum-based chemicals... extracting HMF from plants by using a mixture of copper chloride and chromium chloride to break down the cellulose without creating unwanted byproducts. The chlorides didn’t degrade, which meant that the process could be repeated using the same chemicals, reducing the cost of creating HMF while yielding a product with fewer impurities.  While still a ways off from commercial applications, the process shows promise in creating an alternative to plastics.

Sundiver"Near-Term Beamed Sail Propulsion Missions: Cosmos-1 and Sun-Diver", James Benford and Gregory Benford, Beamed Energy Propulsion, AIP Conf. Proc. 664, pg. 358, A. Pakhomov, ed., 2003   Um...see my novel, "Sundiver?

Apropos of tweeting, I couldn't resist sharing this find of Laurie Morrow's!  Do have a look.

...and finally...

=== Are We Inherently Empathic? ===

New research from Vanderbilt University indicates the way our brain handles how we move through space—including being able to imagine literally stepping into someone else's shoes—may be related to how and why we experience empathy toward others.  

Empathy involves, in part, the ability to simulate the internal states of others. The authors hypothesized that our ability to manipulate, rotate and simulate mental representations of the physical world, including our own bodies, would contribute significantly to our ability to empathize.  The researchers compared performance on the test with how empathetic the subjects reported themselves to be. They found that higher self-reported empathy was associated with paying more attention to the right side of space. Previous research has found that the left side of the face is more emotionally expressive than the right side. Since the left side of the face would be on the right side of the observer, it is possible that attending more to the expressive side of people's faces would allow one to better understand and respond to their mental state. These findings could also point to a role of the left hemisphere in empathy.  (contributed by Stefan.)