Informative and Instructive Science News

Inside Movie Animation: Simulating 128 Billion Elements

 
Simulations based on physics help make this Disney character incredibly rat-like.

Ever wonder how animated films such as The Incredibles get hair, clothing, water, plants, and other details to look so realistic? Or how, like the lion in The Chronicles of Narnia, animated characters are worked into live-action films? If not, the animators would be pleased, since they don't want special effects to distract from the story. Behind the scenes, though, is a sophisticated combination of artistry, computation, and physics.

Traditionally, animation was hand drawn by artists who needed "some of the same magical eye that the Renaissance painters had, to give the impression that it's realistically illuminated," says Paul Debevec, a computer graphics researcher at the University of Southern California. Over the past decade or so, hand-painted animation has faded as physically based simulations have increasingly been used to achieve more realistic lighting and motion. Despite this movement toward reality in animated films, the physics of the real world remains a slave to expediency and art: Simplifications and shortcuts make the simulations faster and cheaper, and what the director wants trumps physical accuracy.

In one dramatic scene in the movie 300, which came out early in 2007, several ships collide violently -- their hulls splinter, masts break, sails tear, and the ships sink. Stephan Trojansky, who worked on 300 as visual effects supervisor for the German-based company ScanlineVFX, said just creating the ocean in that scene involved simulating 128 billion elements. “We probably created the highest fluid simulation detail ever used in visual effects,” he said.

"For the fracturing and splintering of the ships," he added, "we developed splintering technology. Wood doesn't break like a stone tower. It bends. To get realistic behavior, you have to take into account how the ship is nailed together. The physics involved is mainly equations that define where the material will break."

Animations of both fluids and solids—and of facial expressions and clothing, among other things—use various computational methods and a host of equations. But there is a tradeoff in the push for more realistic animations: Moving closer to reality requires more and more computer power and becomes increasingly expensive. Three methods of computer animation are commonly used: break the object being simulated into discrete elements, track sample points from the object, or create fixed cells in space.
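The second of those approaches, tracking sample points from the object, can be sketched in a few lines of Python. This is a toy illustration built on assumed numbers (simple gravity, a 24-frames-per-second timestep, semi-implicit Euler integration), not any studio's actual solver; it shows only the basic structure, in which forces update velocities and velocities move the points:

```python
# Toy "sample points" simulation: a handful of particles falling under
# gravity, advanced one film frame at a time. Production systems use
# the same structure with millions of points and better integrators.

GRAVITY = -9.8   # m/s^2 (illustrative)
DT = 1.0 / 24.0  # one 24 fps film frame per step

def step(particles):
    """Advance each (height, velocity) sample point by one frame."""
    out = []
    for y, v in particles:
        v += GRAVITY * DT              # forces update velocities
        y += v * DT                    # velocities move the points
        out.append((max(y, 0.0), v))   # crude floor collision
    return out

drops = [(5.0, 0.0), (7.0, 0.0), (9.0, 0.0)]  # starting heights in meters
for _ in range(24):                    # simulate one second of screen time
    drops = step(drops)
print([round(y, 2) for y, _ in drops])  # prints [0.0, 1.9, 3.9]
```

Each added sample point adds a fixed cost per frame, which is why a scene built from 128 billion elements is such a feat of computing muscle.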

Mark Sagar, of WETA Digital, a visual effects company in Wellington, New Zealand, specializes in simulating faces. One technique is motion capture, in which markers are placed on an actor's face, their positions are noted for different expressions, and the positions are then mapped onto an animated character. "For King Kong we mapped the actor's expressions onto a gorilla," said Sagar.

Simulating the face involves interpreting movement in terms of muscle, Sagar said. "We approximate the detailed mechanical properties of live tissue and its layers and layers. You have motion data and start working out what the driving forces are. Modeling realistic stretching of the skin requires a lot of finite elements—each a small patch of tissue," he said. "You compute and solve for forces at each point and then sum until you get a balanced equation. It's not sophisticated from an engineering standpoint but produces high-quality results."
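Sagar's recipe of computing forces at each point and iterating until they balance can be illustrated with the simplest possible stand-in for a strip of tissue: a one-dimensional chain of identical springs relaxed toward equilibrium. The stiffness, step size, and iteration count below are arbitrary illustrative choices, not values from any production system:

```python
# Relaxation solve for a 1-D chain of identical springs: nudge each
# free node along its net force until every node's force is ~zero.

K = 1.0          # spring stiffness (illustrative)
REST = 1.0       # spring rest length
ALPHA = 0.3      # relaxation step size

x = [0.0, 0.5, 0.9, 1.4, 4.4]   # node positions; both ends are pinned

for _ in range(2000):
    forces = [0.0] * len(x)
    for i in range(len(x) - 1):           # accumulate spring forces
        f = K * ((x[i + 1] - x[i]) - REST)
        forces[i] += f                    # each spring acts on both ends
        forces[i + 1] -= f
    for i in range(1, len(x) - 1):        # move only the free nodes
        x[i] += ALPHA * forces[i]

print([round(p, 3) for p in x])  # free nodes settle evenly spaced:
                                 # [0.0, 1.1, 2.2, 3.3, 4.4]
```

With identical springs the interior nodes end up evenly spaced between the pinned ends no matter where they start, which is the "balanced equation" in miniature; a face model does the same thing with thousands of two-dimensional tissue patches instead of five points on a line.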

Realistic motion is often too complicated for animators to do by hand, said Michael Kass, a researcher at Pixar Animation Studios. "The results can be awful and very expensive." In the original 1995 Toy Story, he said, "if you see a wrinkle in clothing, it's because an animator decided to put in a wrinkle at that point in time. After that we [at Pixar] decided to do a short film to try out a physically based clothing simulation."

The movement of clothing is computed as a solution to partial differential equations, he said. "You start with individual threads. What are their basic properties? Then you consider the bulk properties when [they're] woven. The main physical effects are stretching, shearing, and bending. To a certain degree, you can take real cloth and get actual measurements."
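The three effects Kass names map neatly onto the three spring families of the classic mass-spring cloth model from the graphics literature: structural springs between neighbors resist stretching, diagonal springs resist shearing, and skip-one springs resist bending. The sketch below lays those springs out on a tiny grid; it is the textbook model, not necessarily Pixar's production solver:

```python
# Classic mass-spring cloth setup on a small grid of point masses.
# "stretch" = structural springs, "shear" = diagonals, "bend" = skip-one.

W, H = 4, 4  # a tiny 4x4 grid of cloth particles

def springs(w, h):
    """Return (i, j, kind) index pairs for each spring family."""
    idx = lambda x, y: y * w + x
    out = []
    for y in range(h):
        for x in range(w):
            if x + 1 < w:  out.append((idx(x, y), idx(x + 1, y), "stretch"))
            if y + 1 < h:  out.append((idx(x, y), idx(x, y + 1), "stretch"))
            if x + 1 < w and y + 1 < h:        # diagonals resist shearing
                out.append((idx(x, y), idx(x + 1, y + 1), "shear"))
                out.append((idx(x + 1, y), idx(x, y + 1), "shear"))
            if x + 2 < w:  out.append((idx(x, y), idx(x + 2, y), "bend"))
            if y + 2 < h:  out.append((idx(x, y), idx(x, y + 2), "bend"))
    return out

counts = {}
for _, _, kind in springs(W, H):
    counts[kind] = counts.get(kind, 0) + 1
print(counts)  # prints {'stretch': 24, 'shear': 18, 'bend': 16}
```

Tuning the three stiffnesses independently is what lets the same solver imitate denim or silk, which is where Kass's measurements of real cloth come in.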

While animating clothing still presents problems, he said, “it's now part of a standard bag of tricks. Our simulations have become accurate enough that we can design garments with commercially available pattern-making software and then have them move largely as a tailor would expect in our virtual simulations."

Animating hair "is in many ways easier than clothing because it's like individual threads,” Kass said. “The difference is that clothing doesn't move like clothing unless the threads interact. In a real head of hair, the threads do interact, but you can get convincing motion without taking that into account."

Illumination is another area in which physics plays a key role in animation. For a long time, says Cornell University's Steve Marschner, "rendering skin was hard. It would look waxy or too smooth." The fix, he says, was to take into account that skin is translucent, which he and colleagues "figured out from looking at a different problem—rendering marble."

As with simulations of fluids, cloth, rigid bodies, and so on, incorporating translucency to model skin involves old physics. "In some cases we have to create the models from the ground up. But sometimes we find somebody in another branch of physics who has solved a similar problem and we can leverage what they've done." For skin translucency, "we were able to adapt a solution from medical physics, from a calculation of radiation distributions inside the skin that was used for laser therapy in skin diseases."

"One of the coolest things you see in a movie is when there is some sort of otherworldly beast or digital character that is sitting in the scene, roaming around, and it looks like it was really there," says Debevec. "The only way you can do that is by understanding the physics of light transport, respecting how light works in the real world, and then using computers to try to make up the difference from what was really shot."

For example, he says, in Narnia "they filmed a lot of it with the children dressed up in their knight costumes and left an empty space for the lion." Then, to get the digital lion just right, "Rhythm and Hues Studios used radiometrically calibrated cameras to measure the color and intensity of illumination from every direction in the scene." The measurements, he adds, "are fed into algorithms that were originally developed in the physics community and have been adapted by the computer graphics community as a realistic way to simulate the way light bounces around in the scene.”

Similar methods are used for creating digital doubles—virtual stunt characters that fill in for live actors. For that, Debevec said, "film studios sometimes bring actors here to our institute, where we've built devices to measure how a person or object, or whatever you stick in [the device], reflects light coming from every possible direction.” The resulting data set, he says, can be used to simulate a virtual version of the person. "There are about 40 shots of a digital Alfred Molina playing Dr. Otto Octavius in Spider-Man 2. It looks like him, but it's an animated character. The reflection from the skin looks realistic, with its texture, translucency, and shine, since it's all based on measurements of the real actor."

"We rarely simulate more than two indirect bounces of illumination, whereas in reality light just keeps bouncing around," Debevec continued. "With no bounces, things look way too spartan and the shadows are too sharp. One bounce fills in perhaps three-quarters of the missing light, and with two bounces you're usually past 95%. That's good enough." Another shortcut, he adds, is to focus just on the light rays that will end up at the eye. "We try to figure out the cheats you can make that give you images that look right."
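Debevec's percentages behave like a geometric series: if each additional bounce recovers a fixed share of the indirect light still missing, coverage converges quickly and extra bounces stop paying for themselves. The value q = 0.25 below is an illustrative assumption chosen to roughly match his 75 percent figure (it gives about 94 percent at two bounces, in the neighborhood of his "past 95%"):

```python
# Diminishing returns of extra light bounces, modeled as a geometric
# series. q is the fraction of indirect light still missing after each
# additional bounce; 0.25 is an illustrative value, not measured data.

def coverage(q, bounces):
    """Fraction of indirect light recovered after n simulated bounces."""
    return 1 - q ** bounces

q = 0.25
for n in range(4):
    print(f"{n} bounce(s): {coverage(q, n):.1%}")
```

After two bounces the remaining error is smaller than many other approximations in the pipeline, which is exactly the kind of cheat Debevec describes: stop where the image stops improving visibly, not where the physics stops.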

"There is a long tradition of cheating as much as possible," said Marschner, "because setting up an exact simulation is either not possible or too expensive." “We use physics to get realism,” Trojansky said. "But I am a physics cheater. I use it as a base, but I am interested in the visual effect." 

It's Spud Time

The United Nations wants more people to appreciate the potato's potential to fight world hunger

As 2007 winds down, thoughts naturally turn towards what might lie ahead. Meals rich in high-carb tubers, perhaps? That's what the United Nations would like everyone to contemplate throughout 2008, which it is designating the International Year of the Potato.

Farmers now harvest more than 300 million tons of potatoes (Solanum tuberosum) worldwide. That makes it the fourth biggest food crop, trailing only corn, wheat, and rice.

For 8,000 years, the humble potato has been a staple in the South American Andes, its homeland. Spanish adventurers encountered the New World crop roughly 500 years ago and brought various types back to Europe. Today, potatoes are cultivated not only throughout the Americas, but also from China's uplands to India's subtropical lowlands—even on Ukraine's arid steppes.

A testament to the potato's Western roots, production of this crop in the States and southward leads the world. Fully 40 percent of the 2006 potato harvest came from North America, with Latin American farmers contributing another 16 percent.

However, appreciation for this nutritious starch within developing countries outside of the Americas—especially in Asia—has been growing steadily, with production of the crop in those regions climbing some 5 percent annually. Indeed, 2005 marked the first time in recent history that production of potatoes in the developing world exceeded that in developed nations.

Although most people picture potatoes being sold fresh, more potatoes are in fact processed into fast foods, snacks, and convenience items than are sold fresh in the marketplace. Today, China is the leading producer of spuds, followed by Russia and India. International trade in potatoes—worth $6 billion annually—has also been growing within developing nations.

You might then ask why, with all of this preexisting global interest in potatoes, the UN feels compelled to devote a year of workshops, research contests, and other focused attention to this one particular food. The reason, the UN's Food and Agriculture Organization argues, is that much of the spud's potential to feed the poor remains untapped.

For instance, although Asians eat nearly half of the world's potatoes, per capita consumption even in Asia remains modest—just 25 kilograms per year, or roughly 45 percent of U.S. consumption and just 27 percent of what's typical in Europe.

Even were potatoes to win greater respect for their nutritional attributes and ability to serve as industrial feedstocks, they couldn't necessarily make a big contribution in new regions of the world without significantly more research. The tubers are vulnerable to a host of major diseases—like the one that set off Ireland's 1845 potato famine. Some varieties of potato are especially resistant to particular diseases, but may not grow well in new regions of the world or taste that yummy.

That's where potato scientists come in. They can identify the climate, soil types, day length, and native diseases with which any new potato crop would have to contend. Then they'll cross lines of wild or cultivated spuds to develop ones with traits that will allow them to thrive outside the Americas. The good news, the UN program notes: "The potato has the richest genetic diversity of any cultivated plant." So there's plenty of potential to tailor a new cultivar to meet the needs of farmers in most places on the globe.

But the potato's biggest advantage, according to the International Potato Center, based in Lima, Peru, is that it yields more food, more quickly, on less land, and in harsher climates than any other major crop. Up to 85 percent of the plant is edible, compared to only about 50 percent for cereal grains. Moreover, the Center notes, potatoes "are ideally suited to places where land is limited and labor is abundant—conditions in much of the developing world."

To help get this word out to agricultural agencies in parts of the world not already turned on to spuds, and from them to farmers, the International Potato Center will be sponsoring a March 2008 meeting: Potato Science for the Poor–Challenges for the New Millennium (http://www.cipotato.org/Cuzco_conference/). Those who attend will have the opportunity to explore the possibility of cooperating to fine-tune existing potatoes into higher-yielding varieties.

The International Potato Center's gene bank safeguards the largest biodiversity of potatoes—7,500 different varieties, of which 1,950 are not cultivated. Research on spuds, especially studies aimed at fostering food security and alleviating poverty, has become a focus for the center.

With all of this talk of potatoes, are you hungry yet? The UN program has so far identified 172,000 web pages containing recipes for using potatoes. Stay tuned, it says: "We will gather the best of them" and share them on the Year of the Potato website. 

For the Holidays: Good Things Come in Virtual Packages

Virtual commodities such as avatar accessories and Facebook gifts are a sign of status for a new generation fixated with everything Web related 

 
GIVING: Don't know what to get that special someone for the holidays? Web users spent $2.1 billion in 2006 on virtual goods and services, according to researchers at Finland's Helsinki Institute for Information Technology.

 
VIRTUAL BUZZ: Sulake Corp.'s Habbo virtual world lets participants create and accessorize their own online characters; it even has its own house band for participants to adore.

 
AT THE (ONLINE) MALL: Fans of Linden Labs' Second Life can dress their avatars in as much (or as little) as they like.

Not sure what to get that special someone on your holiday shopping list who has everything? How about a virtual T-shirt featuring the logo of his or her favorite virtual band—or a snazzy new pair of avatar swimming trunks? Too trite? Well then, how about a prewrapped present to put under his or her Facebook Christmas tree?

Many people of a certain age may consider such gifts a waste of their hard-earned and very real money. But not so a growing number of tweens and teens as well as 20- and even some 30-somethings, who spent around $2.1 billion in 2006 on virtual goods and services, according to researchers at Finland's Helsinki Institute for Information Technology (HIIT).

The reason for the very real popularity of virtual shopping? The same as for traditional buying, says HIIT researcher Vili Lehdonvirta: Shoppers see some value in the purchase of an item even if they don't necessarily need it.

"For kids in particular this kind of consumption is a social behavior," he says. "Adults have developed conceptions about what can and cannot be valuable, but kids who don't share this rationalization will do what they subjectively feel is right. Studying this can advance the understanding of consumer behavior in general."

Lehdonvirta is the founder of the Virtual Economy Research Network, a Web site that offers news, research and discourse on virtual purchases, which include domain names as well as clothing and accessories for online avatars and video game characters. "Today, this virtual property is being bought and sold for real money by millions of people at numerous marketplaces around the world," he says.

Indeed, spending on virtual items for social reasons is a more sustainable model than the purchase of computer-generated real estate or avatar apparel in cyber worlds such as Linden Research's Second Life. Facebook's online gifts are a way to establish and maintain friendships, Lehdonvirta says. "Look at it as a form of creativity, establishing membership in a group, identifying social classes," he notes, "and what social group you identify with."

South Korea is the leading market for virtual consumption, and one of the trendiest places to spend money on virtual items is the social networking site Cyworld. Unlike on Facebook or MySpace, a Cyworld participant creates an avatar called a "minime," whose hair, clothing, facial expression, mood and other attributes can be changed as often as the owner wants. Much of the U.S. and European spending that can be tracked—Facebook does not provide sales figures for its virtual swag—is on massively multiplayer games, a primary example being World of Warcraft.

Adults—particularly those in their 30s—that Lehdonvirta interviewed consider social networking sites to be their "virtual front yards," he says. "The kind of car you park in your driveway, your house, your garden—these things communicate information about your status. Some people believe that if you don't spend a decent amount of money to take care of your online image, you will see the social consequences, such as feeling pressure from peers."

Rumors have surfaced in recent years of incomprehensible virtual purchases, such as that of an Australian who allegedly bought an island in the online game Entropia Universe for more than $20,000 and another of an American who supposedly paid $100,000 for an Entropia virtual space station. "These claims can't be independently verified," Lehdonvirta says, adding that they amount to little more than good publicity for Entropia.

Virtual purchases and gift-giving (and maybe even cyber re-gifting) must evolve, however, if they are to keep attracting spenders and become sustainable economic models for sites such as Facebook. Many of the gifts that Facebook offers are currently freebies that the fairly immature business is hoping will shape market behavior and encourage its participants to eventually fork over money for goods.

HIIT researchers next month plan to begin studying virtual economies as part of a two-and-a-half-year project called Advanced Virtual Economy Applications, which is being financed with $964,000 (around 672,000 euros) by Finnish technology and innovation funding agency Tekes, along with Nokia Research Center, Icelandic massively multiplayer game company CCP, Swedish virtual world–maker Playdo AB and Finland's SWelcom (the electronic media division of SanomaWSOY Group). They plan to explore economic activity in large-scale virtual economies, virtual asset sales as a revenue model for online services, and cyber economies on mobile and ubiquitous platforms.

So, before dismissing requests for the latest in virtual gear, consider the time and money you could save by buying online—not to mention the cred you'll get from your kids who probably think you're a techno-dinosaur who just uses your computer to e-mail and upload digital photos.

Evolving Bigger Brains through Cooking

Our intelligence has enabled us to conquer the world. The secret behind our big brains, says biological anthropologist Richard Wrangham, is cooking, which made digestion easier and liberated more calories.

A couple of million years ago or so, our hominid ancestors began exchanging their lowbrow looks for forehead prominence. The trigger for those large, calorie-hungry brains, argues Richard W. Wrangham, the Ruth B. Moore Professor of Biological Anthropology at Harvard University's Peabody Museum of Archaeology and Ethnology, was cooking. He hit on his theory after decades of studying our closest cousin, the chimpanzee. For the Insights story "Cooking Up Bigger Brains," appearing in the January 2008 Scientific American, Rachael Moeller Gorman talked with Wrangham about chimps, food, fire, human evolution and the evidence for his controversial theory. Here is an expanded interview.

You have been the director of the Kibale Chimpanzee Project in western Uganda since 1987. Have chimpanzees always been a big interest of yours?

I have always been interested in nature. I began as a bird-watcher and then wanted to go to wild places. I had a gap year between high school and college in Zambia, and that set off an interest in behavioral ecology—I was an assistant to a biologist there working for the game department. It was an amazing place, miles and miles of bush with all sorts of animals.

Did you study chimpanzees there?

No, I wasn't focused on primates at this point. But then I came to college at the University of Oxford, and on my first day I went to the expeditions club to see if there were any opportunities to work in Africa again. By the time I left college, I'd already had quite a bit of experience of Africa. I'd become really interested in thinking about animals as a way to get at the evolution of human social systems: If there are similarities between humans and animals, then let's find out where they come from. I wrote to Jane Goodall in July of 1970 [to ask to work with her], and in November I was in Gombe.

What was it about Africa that excited you—the adventure? What kept drawing you back to look at these animals?

I think the natural history is tremendously exciting and rich. And I think that, even then, I had a sense time is running out, things are changing and, when it's possible, it's necessary to explore all of these fascinating animals and ecosystems. But the sense of freedom and adventure was palpable, of course, as well.

When you first went to work with Goodall, what kind of research did she have you conduct?

She gave me the opportunity to spend a year following four pairs of chimp siblings. It was a wonderful time because I was free to develop my own thoughts and interests about chimps. It's embarrassing in some ways, but my subsequent career has all been playing out the thoughts that I developed then. It was looking at the ways in which ecological pressures affect chimpanzee society. It's very obvious with chimps, because in different seasons you get different distributions of food, and the chimps respond in very marked ways. That is a window into the larger question of the relationship between ecological pressures and social systems and how they vary between species.

It must be exciting to be among the first to really understand the social behavior of chimps.

Yes, that's right—it's fabulous! It's interesting with any animal, because every species has its strange twist, but it's particularly dramatic with chimps. The reason that it's dramatic, actually, only became apparent later with the genetic data that says how closely related we are to chimps. In the 1970s, when we were discovering that chimpanzees have all these amazing similarities to humans, what Jane discovered was that they really like to eat meat, they use tools, they make tools, they have relationships between mothers and offspring that in many ways recall what's going on in humans. There are all sorts of things—cultural transmission of a wide range of behaviors. It gives a general sense.

Your theory that cooking spurred the evolution of modern humans occurred to you while you were sitting in front of your own fireplace?

Yes, about 10 years ago, right after the start of the academic term, I was thinking about what stimulated human evolution. The fire just started drawing me in to the comparison with chimps, because I tend to think about human evolution through the lens of chimps: What would it take to convert a chimpanzeelike ancestor into a human? And as I thought about how long we have had fire, I realized what a ridiculously large difference cooking would make. It's a very simple thought; anyone who had ever taken an anthropology course should have had it long before this.

Just how would cooking make a difference? What's wrong with raw food that chimps eat?

I know chimpanzee foods fairly intimately, I've tasted the great majority of the things I've seen them eat, and I know what a huge difference there is between a chimpanzee diet and the human diet, because we cook. And that set me off thinking about whether or not humans really could ever survive on a raw diet. And my instant assumption was no, because of my experience with chimpanzee diets, which said to me we couldn't possibly do this—so that raises all these fascinating evolutionary questions. I'd had the experience of seeing a close relative eating all those foods and seeing how unpleasant they are and how difficult it would be for humans to survive on a diet like that. Maybe people assume that the kinds of places in which humans live would have apples and bananas dripping off the trees, but it's not like that.

What are the foods like, then?

The typical fruit is very unpleasant, very fibrous, quite bitter; the net effect is that you would not want to eat more than two or three of them before running for a big glass of water and saying, "That was not a pleasant experiment, I hope I don't get sick." They're not nice to eat. Not a tremendous amount of sugar in them. So there were very few fruits that I've tasted that I can actually imagine getting a stomach for because most of them are unpleasant to eat. Some make your stomach heave.

But maybe if we—or ancient humans—were accustomed to them, we would be able to eat them.

I recognize that I've got a palate softened by ease, and it may well be that if you're hungry in the bush you might be prepared to eat a lot of these nasty tasting things, but I've worked with Pygmies in eastern Congo in a forest where I knew a fair number of fruits were being eaten by chimps, and I would ask them, and they'd say there's no way I would ever eat this stuff. Chimpanzees eat, on average, 60 percent of their food as fruit. Humans couldn't do that. So one of the fascinating things for me as I ventured into this was really learning about what hunters and gatherers eat—and it turns out that there are no records of people having a large amount of their food come from raw food. Everywhere, everyone expects a cooked meal every evening.

What about the way our bodies are set up to digest food—besides not liking the taste, can we digest the foods chimps eat?

I think we can probably digest them—this is guesswork because we don't really know—but the point is they're very full of indigestible fiber. So the average human diet has, even in the more fibrous hunter-gatherer types, 5 to 10 percent, say, indigestible fiber. With our chimp studies, they eat 32 percent indigestible fiber. So that is something that the human body is not designed to handle. And the reason we can say that is that we have small colons and small stomachs which are adapted to food that has high caloric density. And food the chimps eat has low caloric density.

When you looked at the archaeological evidence, what were the clues that indicated to you that fire spurred the development of Homo erectus?

The archaeology of fire is historically a confusing area because people have derived stronger claims than, I think, they should have done. What they did was say that we see quite a lot of evidence of fire back to a certain time, and then we see much less evidence; so let's assume that fire started at that break point. And I think the way they should have read it is we know that fire has been used some time in the past, but we don't see any clear cutoff, so we can't draw any conclusions.

Is it possible to narrow down the time when humans first used fire?

Some people say that fire began 40,000 years ago, some people say 200,000, some people say 300,000, some say 400,000, some say 500,000—it's all over the map. There are quite a few sites back to 1.6 million years ago where the people who excavated the sites say, "Well, I've got some evidence of fire back here." And other people say, "Well, you might have it but it's not good enough to convince me." So to me, the way to look at the archaeology of fire evidence is simply to say the archaeology doesn't tell you anything. All it tells you is it is possible that fire was controlled back to 1.6 million years ago.

And you believe cooking with that fire spurred the development of modern humans.

Here's the way I tend to ask the question: I tend to think of the advent of cooking as having a huge impact on the quality of the diet. In fact, I can't think of any increase in the quality of diet in the history of life that is bigger. And repeatedly we have evidence in biology of increases in dietary quality affecting bodies. The food was softer, easier to eat, with a higher density of calories—so this led to smaller guts, and, since the food was providing more energy, we see more evidence of energy use by the body. There's only one time it could have happened on that basis; that is, with the evolution of Homo erectus somewhere between 1.6 [million] and 1.8 million years ago.

What exactly is it about Homo erectus that fits these criteria so much better than earlier or later human ancestors?

Homo erectus is the species that has the biggest drop in tooth size in human evolution, from the previous species, which in that case was Homo habilis. There wasn't any drop in tooth size as large as that at any later point in human evolution. We don't know exactly about the gut, but the normal argument is that if you reconstruct the ribs, you have reduced flaring of the ribs. Up until this point you have ribs that went out to apparently hold a big belly, which is what chimps and gorillas are like, and then at this point [when Homo erectus arose] the ribs go flat, meaning you've got now a flatter belly and, therefore, smaller guts. And then you have more energy being used; people interpret the locomotor skeleton as meaning that the distances traveled every day are much farther. And the brain has one of its larger rises in size.

Smaller guts and bigger brains resulted from extra calories, then. So it is possible that our ancestors simply found richer foods?

There's this lovely theory by Leslie Aiello [president of the Wenner-Gren Foundation for Anthropological Research] and Peter Wheeler [at Liverpool John Moores University in England] saying that larger brains are made possible in primates by smaller guts. And they previously argued that guts were getting smaller at that time, but they said it was because of meat eating. I'm suggesting that this was instead because of cooking, partly because there's no other time that satisfies the expectations that we would have for changes in the body that would be accompanied by cooking.

There are people who believe that a switch to eating meat alone caused these changes, even though a million years elapsed between the adoption of meat eating and the evolution of Homo erectus?

Yes, one or two people have written articles saying this doesn't make sense! There is some diversity of opinion, and I find it helpful that there are people who say the old story is too simple.

Do most people adhere to the meat theory, or are there other, more popular theories?

There's an amazing lack of theories, actually. I mean, this is human origins, and there's so much willingness to go with a rather well established, yet not very deeply thought-out, idea. One of the things that amazed me was the difficulty of eating raw meat. Raw meat is not that attractive, particularly the kind of meat you cut off an animal that has been living under stressful conditions in the African savanna: tough meat, mostly antelope, plus hippos and rhinos. And I've tried chewing raw meat. It probably wouldn't take them long, though, to realize you could pound the meat. By pounding the meat they would have gotten more energy out of it.

Is cooking meat better than pounding it to increase digestibility?

I've turned up some studies in the literature that have not been interpreted the way I've been interpreting them, which show that digestibility of animal protein increases when it is cooked. And that's because it's denatured—the protein is unfolded. It's normally packed solid, with the hydrophobic groups on the inside, the hydrophilic groups on the outside. Denaturation is the process of it opening out. And once it opens out, the proteolytic enzymes can now go in and start snipping. Heat predictably causes denaturation, so I think one of the major effects of cooking is to denature proteins, opening them up to the point where proteolytic enzymes have easier access.

What additional studies would lend support to your theory?

It would be very interesting to compare the human and Homo erectus genetics data to see when certain characteristics arose, such as, when did humans evolve defenses against Maillard reaction products?

Darwin’s Era, Modern Themes: Science, Faith and Publication

“If I finish the book, I’m a killer,” he said. “I murder God.”

At least that’s what Peter Parnell has Darwin say in his new play, “Trumpery,” which opened this month at the Atlantic Theater Company in New York.

In the play, as in real life, Darwin is moved to publish by Alfred Russel Wallace, a young man whom Parnell’s Darwin dismisses as “a nobody, a collector, a poor specimen hunter,” but who has independently come up with a theory just like the one Darwin has been chewing on for decades.

So in part the play hangs on scientific "priority": who will publish first? As the action begins, Wallace, as in real life, has sent Darwin a paper describing his ideas, in hopes that Darwin will help make them known. (If, like many people, you know who Darwin is but not Wallace, you probably think you know how that comes out. Think again.)

But a larger question, Mr. Parnell said in an interview, is “what it means to be a scientist” when confronting issues of faith. It is an idea as controversial today as it was then.

Darwin’s Britain teemed with religiosity as diverse as evangelical Christian fervor and spiritualism, an idea whose adherents included Wallace and Darwin’s wife, Emma Wedgwood. Darwin knew he would be called heretical for challenging the Biblical idea of God as a one-time-only creator of an immutable natural order.

At first, he finds the idea literally sickening. But, as Mr. Parnell put it, Darwin is “both great enough and grandiose enough” to eventually conclude not just that he could do it, but that he ought to. And we all know how that came out.

But today as then, there are creationists who assert that people must choose between belief in Darwin’s theory and belief in God. Yet Darwin did not kill God. His theory, unchallenged in science, is the foundation on which the edifice of modern biology is built. And it has plenty of adherents among religious believers.

“Trumpery” is not the first foray into science for Mr. Parnell, a screenwriter and dramatist who has worked on television shows like “The West Wing” and who teaches television writing at the Yale School of Drama. That would be “QED,” a play about Richard Feynman, the physicist and theorist of quantum electrodynamics, the modern theory of electromagnetism.

It was while working on that play that Mr. Parnell stumbled on a book, “The Song of the Dodo” by David Quammen, which describes Wallace’s work. The book led Mr. Parnell to more study of Darwin, Wallace and their times. Pretty soon, he had a three-act play with, he realized, a cast of way too many characters dealing with way too many subjects — not just evolution, but topics like colonialism and a Tierra del Fuegan accused of murder.

“I didn’t know for a long time what the play was about,” he said.

But just as “QED” focused sharply on Feynman, Mr. Parnell found this play by focusing on Darwin and telescoping some of the events in his life to bring his quandaries into sharp relief.

For example, much of the play is an argument involving Darwin, his biological allies Joseph Hooker and Thomas Henry Huxley, and their foe, Richard Owen. In fact, their debates took place in letters. But confrontation is useful for a dramatist dealing with science.

“The ideas have to be accurate, they have to be intelligible,” Mr. Parnell said. “But you have to find a dramatic way to tell it — a reason it can be a play, to exist on stage.”

He added, “It has to be grounded in conflict.”

“Trumpery” is not Mr. Parnell’s first exploration of a frightening idea either. That was “And Tango Makes Three,” a children’s book he wrote with his partner, Justin Richardson. The book tells the true story of Silo and Roy, two male penguins at the Central Park Zoo who courted each other and formed a relationship. When a keeper saw them trying to incubate a rock, he gave them an orphaned egg, which they cared for until it hatched as the chick Tango.

While the book received many favorable reviews, some parents and religious groups objected to it as suggesting that a family could be something other than Mom, Dad and kids.

“That idea is considered dangerous,” Mr. Parnell said.

Today, although Darwin’s idea is not so frightening to many, the conflict over evolution still plays on, on stage and in school boards and courtrooms around the country. Perhaps conflict is inevitable when people confront new and frightening ideas. But, as Darwin tells his dying daughter Annie at the end of the play, it is good to challenge conventional wisdom. He adds, though, “if you question everything, you have to expect to be scared.”
