
The Economist: Complete Compilation

Digest of The Economist, 2006 (8-9)

The mismeasure of woman

Men and women think differently. But not that differently

IN THE 1970s there was a fad for giving dolls to baby boys and fire-engines to baby girls. The idea was that differences in behaviour between the sexes were solely the result of upbringing: culture turned women into ironers, knitters and chatterboxes, and men into hammerers, drillers and silent types. Switching toys would put an end to sexual sorting. Today, it is clear why it did not. When boys and girls are born, they are already different, and they favour different toys from the beginning.

That boys and girls—and men and women—are programmed by evolution to behave differently from one another is now widely accepted. Surely, no one today would think of doing what John Money, of Johns Hopkins University, did in 1967: amputating the genitalia of a boy who had suffered a botched circumcision, and advising the parents to bring him up as a girl. The experiment didn't work, and the consequences were tragic. But which of the differences between the sexes are “biological”, in the sense that they have been honed by evolution, and which are “cultural” or “environmental” and might more easily be altered by changed circumstances, is still fiercely debated.

The sensitivity of the question was shown last year by a furore at Harvard University. Larry Summers, then Harvard's president, caused a storm when he suggested that innate ability could be an important reason why there were so few women in the top positions in mathematics, engineering and the physical sciences.

Even as a proposition for discussion, this is unacceptable to some. But biological explanations of human behaviour are making a comeback as the generation of academics that feared them as a covert way of justifying eugenics, or of thwarting Marxist utopianism, is retiring. The success of neo-Darwinism has provided an intellectual underpinning for discussion about why some differences between the sexes might be innate. And new scanning techniques have enabled researchers to examine the brain's interior while it is working, showing that male and female brains do, at one level, operate differently. The results, however, do not always support past clichés about what the differences in question actually are.

Differences in behaviour between the sexes must, in some way, be reflections of systematic differences between the brains of males and females. Such differences certainly exist, but drawing inferences from them is not as easy as it may appear. For a start, men's brains are about 9% larger than those of women. That used to be cited as evidence of men's supposedly greater intelligence. Actually, the difference is largely (and probably completely) explained by the fact that men are bigger than women. In recent years, more detailed examination has refined the picture. Female brains have a higher percentage of grey matter (the manifestation, en bloc, of the central bodies of nerve cells), and thus a lower percentage of white matter (the manifestation of the long, thin filaments that connect nerve cells together), than male brains. That, plus the fact that in some regions of the female brain, nerve cells are packed more densely than in men, means that the number of nerve cells in male and female brains may be similar.

Oddly, though, the main connection between the two hemispheres of the brain, which is known as the corpus callosum and is made of white matter, is proportionately smaller in men than women. This may explain why men use only one side of the brain to process some problems for which women employ both sides.

These differences in structure and wiring do not appear to have any influence on intelligence as measured by IQ tests. It does, however, seem that the sexes carry out these tests in different ways. In one example, where men and women perform equally well in a test that asks them to work out whether nonsense words rhyme, brain scanning shows that women use areas on both the right and the left sides of the brain to accomplish the task. Men, by contrast, use only areas on the left side. There is also a correlation between mathematical reasoning and temporal-lobe activity in men—but none in women. More generally, men seem to rely more on their grey matter for their IQ, whereas women rely more on their white matter.

American exceptionalism

The world's biggest insurance market is too splintered

KANSAS CITY, Missouri, is known more for its historical role as a cattle town than as a financial hub. But it is to this midwestern city, America's 26th largest, that regulators and insurance executives from around the globe head when they want to make sense of the world's largest—and one of its weirdest—insurance markets.

For it is in Kansas City that the National Association of Insurance Commissioners (NAIC) is housed. It oversees a market accounting for one-third of premiums written worldwide. Outside Kansas City, the market becomes a regulatory free-for-all. Each of America's 50 states, plus the District of Columbia, governs its insurance industry in its own way.

In an increasingly global insurance market, America's state-based system is coming under strong pressure to reform. Insurance has changed dramatically since the NAIC was set up in 1871, with growing sophistication in underwriting and risk management. Premiums in America have ballooned to $1.1 trillion and market power is increasingly concentrated in the hands of big players (some of them foreign-owned) that are pushing for an overhaul of the state-based system. “It's an extremely expensive and Byzantine process,” says Bob Hartwig, an economist with the Insurance Information Institute, a research group.

Though the issue is fiercely political, congressional support for simplifying the system is gaining ground. Both houses of Congress are looking at proposals to change the state-based system. Big insurers favour a version that would implement an optional federal charter allowing them to bypass the state-by-state regulatory process if they choose. A similar system already exists for banks.

Proponents of the changes see more efficiency, an ability to roll out products more quickly nationally and, ultimately, better offerings for consumers as a result. Yet some consumer groups favour state-based regulation. They believe it keeps premiums lower than they otherwise would be. Premiums as a percentage of gross output are lower in America than in several other countries. The political headwinds are strong: insurance commissioners are elected officials in some states (California, for instance) and appointed by the governor in others. The industry is also split: most of the country's 4,500 insurers are small, and many of them have close ties with state-based regulators, whose survival they support. But even these forces may eventually be overcome. Elsewhere in the industry in America, there are other calls for reform.

In a backdoor form of protectionism, American reinsurance firms have long benefited from a regulation that requires foreign reinsurers writing cross-border business into America to post more collateral than domestic firms do. “If you operate outside the borders of the US, they don't trust you one inch,” laments Julian James, head of international business at Lloyd's of London, which writes 38% of its business in America.

The collateral requirement was established because of worries about regulatory standards abroad, and the financial strength of global reinsurers. Today regulatory standards have been tightened in many foreign markets. A majority of America's reinsurance cover now comes from firms based abroad, including many that have set up offshore in Bermuda (for tax reasons) primarily to serve America.

Too hot to handle

Dell's battery recall reveals the technology industry's vulnerabilities

THERE is the nail test, in which a team of engineers drives a large metal nail through a battery cell to see if it explodes. In another trial, laboratory technicians bake the batteries in an oven to simulate the effects of a digital device left in a closed car on a sweltering day—to check the reaction of the chemicals inside. On production runs, random batches of batteries are tested for temperature, efficiency, energy density and output.

But the rigorous processes that go into making sophisticated, rechargeable batteries—the heart of billions of electronic gadgets around the world—were not enough. On August 14th Dell, a computer company, said it would replace 4.1m lithium-ion batteries made by Sony, a consumer-electronics firm, in laptop computers sold between 2004 and last month. A handful of customers had reported the batteries overheating, catching fire and even exploding—including one celebrated case at a conference this year in Japan, which was captured on film and passed around the internet. The cost to the two companies is expected to be between $200m and $400m.

In some ways, Dell is a victim of its success. The company was a pioneer in turning the personal computer into a commodity, which meant squeezing suppliers to the last penny, using economies of scale by placing huge orders, and running efficient supply chains with little room for error. It all created a volatile environment in which mistakes can have grave effects.

Since lithium-ion batteries were introduced in 1991, their capacity to overheat and burst into flame has been well known. Indeed, in 2004 America banned them as cargo on passenger planes, as a fire hazard. But the latest problems seem to have arisen because of the manufacturing process, which demands perfection. “If there is even a nano-sized particle of dust, a small metal shard or water condensation that gets into the battery cell, it can overheat and explode,” says Sara Bradford of Frost & Sullivan, a consultancy. As the energy needs of devices have grown rapidly, so have the demands on batteries.

The computing industry's culture is also partly to blame. Firms have long tried to ship products as fast as they possibly can, and they may have set less store by quality. They used to mock the telecoms industry's ethos of “five-nines”—99.999% reliability—because it meant long product cycles. But now they are gradually accepting it as a benchmark. That is partly why Microsoft has taken so long to perfect its new operating system, Windows Vista.

Compared with other product crises, from contaminated Coca-Cola in 1999 to Firestone's faulty tyres in 2000, Dell can be complimented for quickly taking charge of a hot situation. The firm says there have been only six incidents of laptops overheating in America since December 2005—but the internet created a conflagration.

Keeping the faith

Mixing religion and development raises soul-searching questions

WORLD Bank projects are usually free of words like “faith” and “soul.” Most of its missions speak the jargon of development: poverty reduction, aggregate growth and structural adjustments. But a small unit within the bank has been currying favour with religious groups, working to ease their suspicions and use their influence to further the bank's goals. In many developing countries, such groups have the best access to the people the bank is trying to help. The programme has existed for eight years, but this brainchild of the bank's previous president, James Wolfensohn, has spent the past year largely in limbo as his successor, Paul Wolfowitz, decides its future. Now, some religious leaders in the developing world are worried that the progress they have made with the bank may stall.

That progress has not always been easy. The programme, named the Development Dialogue on Values and Ethics, faced controversy from the start. Just as religious groups have struggled to work with the bank, many people on the inside doubted if the bank should be delving into the divine. Critics argued that religion could be divisive and political. Some said religion clashes with the secular goals of modernisation.

Although the bank does not lend directly to religious groups, it works with them to provide health, educational and other benefits, and receives direct input from those on the ground in poor countries. Katherine Marshall, director of the bank's faith unit, argues that such groups are in an ideal position to educate people, move resources and keep an eye on corruption. They are organised distribution systems in otherwise chaotic places. The programme has had success getting evangelical groups to fight malaria in Mozambique, improve microcredit and water distribution in India, and educate people about AIDS in Africa. “We started from very different viewpoints. The World Bank is looking at the survival of a country, we look at the survival of a patient,” says Leonardo Palombi, of the Community of Sant'Egidio, an Italian church group that works in Africa.

Although the work continues, those involved in Mr Wolfensohn's former pet project now fret over its future. Some expect the faith unit to be transferred to an independent organisation also set up by Mr Wolfensohn, the World Faiths Development Dialogue, which will still maintain a link with the bank. Religious groups are hoping their voices will still be heard. “If we are going to make progress, faith institutions need to be involved. We believe religion has the ability to bring stability. It will be important for the bank to follow through,” says Agnes Abuom, of the World Council of Churches for Africa, based in Kenya.

Like religious groups, large institutions such as the bank can resist change. Economists and development experts are sometimes slow to believe in new ideas. One positive by-product of the initiative is that religious groups once wary of the bank's intentions are less suspicious. Ultimately, as long as both economists and evangelists aim to help the poor attain a better life on earth, differences in opinion about the life hereafter do not matter much.

Stand and deliver

For the first time since the epidemic began, money to fight AIDS is in plentiful supply. It is now time to convert words into action

KEVIN DE COCK, the World Health Organisation's AIDS supremo, is not a man to mince his words. He reckons that he and his colleagues in the global AIDS establishment have between five and seven years to make a real dent in the problem. If they fail, the world's attention span will be exhausted, charitable donors and governments will turn to other matters and AIDS will be relegated in the public consciousness to being yet another intractable problem of the poor world about which little or nothing can be done.

For now, though, the money is flowing. About $8.9 billion is expected to be available this year. And, regardless of Dr De Cock's long-term worries, that sum should rise over the next few years. Not surprisingly, a lot of people are eager to spend it.

Many of those people—some 24,000 of them—have been meeting in Toronto at the 16th International AIDS Conference. An AIDS conference is unlike any other scientific meeting. In part, it is a jamboree in which people try to out-do each other in displays of cultural inclusiveness: the music of six continents resonates around the convention centre. In part, it is a lightning conductor that allows AIDS activists to make their discontent known to the world in a series of semi-official protests. It is also what other scientific meetings are, a forum for the presentation of papers with titles such as “Differing lymphocyte cytokine responses in HIV and Leishmania co-infection”. But mostly, it is a giant council of war. And at this one, the generals are trying to impose a complete change of military strategy.

When AIDS was discovered, there was no treatment. Existing anti-viral drugs were tried but at best they delayed the inevitable, and at worst they failed completely. Prevention, then, was not merely better than cure, it was the only thing to talk about. Condoms were distributed. Posters were posted exhorting the advantages of safe sex. Television adverts were run that showed the consequences of carelessness.

Ten years ago, though, a new class of drugs known as protease inhibitors was developed. In combination with some of the older drugs, they produced what is now known as highly active anti-retroviral therapy or HAART. In most cases, HAART can prolong life indefinitely.

That completely changed the picture. Once the AIDS activists had treated themselves, they began to lobby for the poor world to be treated, too. And, with much foot-dragging, that is now happening. About 1.6m people in low- and middle-income countries, 1m of them in sub-Saharan Africa, are now receiving anti-AIDS drugs routinely. The intention, announced at last year's G8 meeting in Scotland, is that the drugs should be available by 2010 to all who would benefit from them.

However, those on drugs remain infected and require treatment indefinitely. Stopping the epidemic requires a renewed emphasis on prevention, and that is the shift the conference's organisers have been trying to bring about.

Man, deconstructed

The DNA that may have driven the evolution of the human brain

ONE of the benefits of knowing the complete genetic sequences of humans and other animals is that it becomes possible to compare these blueprints. You can then work out what separates man from beast—genetically speaking, at least.

The human brain sets man apart. About 2m years ago it began to grow in size, and today it is about three times larger than that of chimpanzees, man's closest relative. Human intelligence and behavioural complexity have far outstripped those of man's simian cousins, so the human brain seems to have got more complex, as well as bigger. Yet no study has pinpointed the genetic changes that cause these differences between man and chimp.

Now a group of scientists believe they have located some interesting stretches of DNA that may have been crucial in the evolution of the human brain. A team led by David Haussler of the Howard Hughes Medical Institute in California compared the human genome with those of other mammals, including other primates. They reported the results in Nature.

The researchers looked at the non-human genomes first, seeking regions that had not changed much throughout evolutionary history. Regions that are untouched by normal random changes typically are important ones, and thus are conserved by evolution. Next the researchers found the equivalent regions in the human genome to see if any were very different between humans and chimps. Such a sudden change is a hallmark of a functional evolutionary shift.
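The two-step screen described above lends itself to a simple sketch. The Python toy below is not the Nature paper's actual pipeline; the window size, thresholds and sequences are invented for illustration. It scans aligned sequences for windows that are conserved among the non-human species, then scores how far the human copy has drifted from the chimp one:

```python
def diffs(a, b):
    """Count mismatched bases between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def accelerated_regions(human, chimp, others, win=100, max_other_diffs=2):
    """Yield (start, human-chimp differences) for windows that are nearly
    identical among the non-human genomes yet changed in humans."""
    n = min(len(human), len(chimp), *(len(o) for o in others))
    for start in range(0, n - win + 1, win):
        window = slice(start, start + win)
        # Step 1: conserved among the non-human species (few differences).
        ref = others[0][window]
        if all(diffs(ref, o[window]) <= max_other_diffs for o in others[1:]):
            # Step 2: how far has the human copy moved away from chimp?
            yield start, diffs(human[window], chimp[window])

# Toy data: three "other mammal" sequences that agree with one another,
# while the human copy has drifted from chimp.
others = ["ACGTACGT", "ACGTACGT", "ACGTACGT"]
chimp, human = "ACGTACGT", "AGGTACCT"
print(list(accelerated_regions(human, chimp, others, win=4, max_other_diffs=0)))
```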

They found 49 regions they dubbed “human accelerated regions” (HARs) that have shown a rapid, recent evolution. Most of these regions are not genes as commonly understood. This is because they code for something other than the proteins that are expressed in human cells and that regulate biological processes. A number of the HARs are portions of DNA that are responsible for turning genes on and off.

Intriguingly, the most rapidly changing region was HAR1, which has accumulated 18 genetic changes when only one would be expected to occur by chance. It codes for a bit of RNA (a molecule that usually acts as a template for translating DNA into protein) that, it is speculated, has some direct function in neuronal development.
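How surprising are 18 changes where one is expected? Under a simple Poisson model of random substitution (an illustrative assumption on our part, not a claim about the paper's statistics), the chance is vanishingly small:

```python
import math

# Chance of seeing 18 or more changes when one is expected (Poisson, mu = 1);
# sum the upper tail directly so rounding does not swallow it.
p = sum(math.exp(-1) / math.factorial(k) for k in range(18, 40))
print(f"{p:.1e}")   # about 6e-17: far too many changes to be random drift
```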

HAR1 is expressed before birth in the developing neocortex—the outer layer of the brain that seems to be involved in higher functions such as language, conscious thought and sensory perception. HAR1 is expressed in cells that are thought to have a vital role in directing migrating nerve cells in the developing brain. This happens at seven to 19 weeks of gestation, a crucial time when many of the nerve cells are establishing their functions.

Without more research, the function of HAR1 remains mere speculation. But an intriguing facet of this work is that, until now, most researchers had focused their hunt for differences on the protein-coding stretches of the genome. That such a discovery has been made in what was regarded as the less interesting parts of the human genome is a presage of where exciting genomic finds may lie in the future.

Keeping it real

How to make digital photography more trustworthy

PHOTOGRAPHY often blurs the distinction between art and reality. Modern technology has made that blurring easier. In the digital darkroom photographers can manipulate images and threaten the integrity of endeavours that rely on them. Several journalists have been fired for such activity in recent months, including one from Reuters for faking pictures in Lebanon. Earlier this year, the investigation into Hwang Woo-suk showed the South Korean scientist had changed images purporting to show cloning. In an effort to rein in such manipulation, camera-makers are making it more obvious when images have been altered.

One way of doing this is to use image-authentication systems to reveal if someone has tampered with a picture. These use computer programs to generate a code from the very data that comprise the image. As the picture is captured, the code is attached to it. When the image is viewed, software determines the code for the image and compares it with the attached code. If the image has been altered, the codes will not match, revealing the doctoring.
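A minimal sketch of such a scheme in Python: the “code” here is a plain SHA-256 digest of the image bytes (the choice of hash is our assumption for illustration; actual cameras use proprietary systems):

```python
import hashlib

def sign_image(image_bytes: bytes) -> str:
    # Generate a code from the very data that comprise the image.
    return hashlib.sha256(image_bytes).hexdigest()

def verify_image(image_bytes: bytes, attached_code: str) -> bool:
    # Recompute the code and compare; any edit changes the digest.
    return sign_image(image_bytes) == attached_code

original = b"...raw sensor data..."
code = sign_image(original)
print(verify_image(original, code))             # True
print(verify_image(original + b"edit", code))   # False: doctoring revealed
```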

Another way favoured by manufacturers is to take a piece of data from the image and assign it a secret code. Once the image file is transferred to a computer, it is given the same code, which will change if it is edited. The codes will match if the image is authentic but will be inconsistent if tampering occurred.
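Because a plain digest can simply be recomputed by a forger, the keyed variant the manufacturers describe is closer in spirit to an HMAC, sketched below. The secret key and its handling are assumptions; vendors do not publish their schemes:

```python
import hashlib
import hmac

SECRET = b"camera-maker-private-key"   # hypothetical key embedded in the device

def tag_image(image_bytes: bytes) -> str:
    # Keyed code: without the secret, a valid tag cannot be recomputed.
    return hmac.new(SECRET, image_bytes, hashlib.sha256).hexdigest()

def is_authentic(image_bytes: bytes, attached_tag: str) -> bool:
    # Constant-time comparison avoids leaking information to an attacker.
    return hmac.compare_digest(tag_image(image_bytes), attached_tag)
```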

The algorithm is the weapon of choice for Hany Farid, a computer scientist at Dartmouth College in New Hampshire. Digital images have natural statistical patterns in the intensity and texture of their pixels. These patterns change when the picture is manipulated. Dr Farid's algorithms detect these changes, and can tell if pixels have been duplicated or removed. They also try to detect if noise—the overexposed pixels within the image that create a grainy effect—was present at the time the photograph was taken or has been added later.
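Dr Farid's published methods are statistical and considerably subtler, but the crudest form of duplicate-pixel detection can be sketched in a few lines: hash fixed-size pixel blocks and flag exact repeats. The block size and the synthetic test image below are arbitrary choices for illustration:

```python
import numpy as np

def duplicated_blocks(img: np.ndarray, block: int = 8):
    """Flag exactly repeated pixel blocks, a crude cue for copy-move edits."""
    seen, hits = {}, []
    h, w = img.shape
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            key = img[y:y + block, x:x + block].tobytes()
            if key in seen:
                hits.append((seen[key], (y, x)))
            else:
                seen[key] = (y, x)
    return hits

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64), dtype=np.uint8)
img[8:16, 8:16] = img[40:48, 40:48]   # simulate a cloned patch
print(duplicated_blocks(img))         # reports the duplicated pair of blocks
```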

However, forgers have become adept at printing and rescanning images, thus creating a new original. In such cases, analysing how three-dimensional elements interact is key. Long shadows at midday are a giveaway. Even the tiny reflections in the centre of a person's pupil tell you about the surrounding light source. So Dr Farid analyses shadows and lighting to see if subjects and surroundings are consistent.

For its part, Adobe, the maker of Photoshop software, has improved its ability to record the changes made to an image by logging how and when each tool or filter was used. Photoshop was the program used by the journalist fired by Reuters; his handiwork left a pattern in the smoke he had added that was spotted by bloggers.

Thus far the internet has proven an effective check on digital forgery. Although it allows potentially fake images to be disseminated widely, it also casts many more critical eyes upon them. Sometimes the best scrutiny is simply more people looking.

Collateral damage

Why the war in Iraq is surprisingly bad news for America's defence firms

WHEN Boeing announced on August 18th that it planned to shut down production of the C-17, a huge military cargo plane, the news sent a shiver through the American defence industry. As it winds down its production line at Long Beach, California, over the next two years, Boeing will soon begin to notify suppliers that their services will no longer be needed. It had to call a halt, because orders from America's Defence Department had dried up and a trickle of export deals could not take their place. The company would not support the cost of running the production line for the C-17 (once one of its biggest-selling aircraft) on the off-chance that the Pentagon might change its mind and place further orders.

The wider worry for the defence industry is that this could be the first of many big programmes to be shut down. A big part of the problem is that America is at war. The need to find an extra $100 billion a year to pay for operations in Iraq means there is pressure to make cuts in the defence budget, which has been provisionally set at $441 billion for the fiscal year beginning in October. American defence budgets involve a complicated dance starting with what the Pentagon wants, what the White House thinks it should get and, finally, what Congress allows it to get away with. Although the armed forces' extra spending on ammunition, fuel, provisions, medicines and accommodation in Iraq does not strictly come out of the same budget as new weapons, the heavy bill for fighting eventually leads to calls to save money on shiny new equipment.

Earlier this month, for example, the Congressional Budget Office expressed “major concerns” about Future Combat Systems, a $165 billion project to upgrade all of the army's vehicles and communications networks. The scheme is the Pentagon's second-biggest development programme and is intended to give the soldiers on the ground access to real-time battlefield information from sources such as satellites and unmanned aircraft. But the programme was initially expected to cost about $82 billion, half the latest estimate, and critics are also worried about how well it will work and whether it will be delivered on time.

Last week the army issued a glowing progress report on the project and insisted that Boeing and Science Applications International Corporation, the lead contractors, are on schedule. This was welcome news to defence contractors worried that the grandiose project might fall victim to pressure for budget cuts. Even so, the prospects for many other big weapons programmes are less rosy.

The problem is not just the cost of the fighting in Iraq, but also its nature. The shift in the style of warfare, towards such “asymmetric” conflicts, means that there is now less demand for big-ticket weapons systems. Things were simpler in the cold war, when the Pentagon spent about $150 billion a year on new weapons. That fell to around $50 billion after the fall of the Berlin Wall.

a way to bribe a generics firm to delay its introduction of a cut-price product. American antitrust officials worry this is to the detriment of the consumer. Another explanation is that the cost and legal uncertainty associated with patent trials are simply too great. Daniel Glazer of Shearman and Sterling, a big law firm, argues that even a firm convinced of the integrity of its patents may well settle “to avoid the all-or-nothing scenario”.

But there is a less charitable explanation. The big firm may know that its patent was mistakenly awarded, perhaps because the purported breakthrough was too minor or obvious. In Barr's ongoing case against Eli Lilly's Evista, the generic firm argues that a prior patent held by the University of Pennsylvania invalidates Lilly's claims. Kathleen Jaeger of America's Generic Pharmaceutical Association adds that branded firms try to extend their lucrative monopolies by filing less rigorous secondary patents designed “to block generics”. David Balto, a former official at America's Federal Trade Commission, says, “Branded pharmaceutical firms have been stretching the limits of what deserves a patent, and the courts are just catching up.”

Ready or not

Europe's financial sector is ill prepared for a coming upheaval

SOME of the most breathless commentary about Europe's financial markets in recent years has centred on the intrigues and dalliances of leading financial exchanges. All of them have flirted with, encouraged and snubbed various potential partners in both Europe and America, although no big deals have yet been completed. Amid the chatter, an important cause of all the matchmaking and matchbreaking has been largely overlooked: a piece of looming legislation that, for all its drab detail, will alter the European Union's financial markets profoundly.

Exchanges are not the only ones to feel the hot breath of the unenticingly labelled Markets in Financial Instruments Directive, known as MiFID, which is due to take effect from November 2007. An important element of the EU's plan for a single market in financial services, the directive embraces both wholesale and retail trading in securities, including shares, bonds and derivatives. As such, it will affect companies from investment banks to asset managers and stockbrokers. Some will benefit more than others.

Charlie McCreevy, the European commissioner in charge of forging a single market, jokes about the ugly moniker: “This is not a fearsome man-eating plant.” But he is evangelical about the directive's purpose. He expects MiFID to “transform” the trading of securities in Europe, reducing the cost of capital, creating growth and increasing Europe's competitiveness in the global economy.

The directive, which EU member states are supposed to weave into their own laws by January 2007, intends to accomplish all this in several ways. First, the rules aim to increase competition across borders, by extending the “single passport”, which allows financial firms to do business across Europe armed only with the approval of their home authorities. To make this possible, investor-protection rules are also to be harmonised, so as to provide a (theoretically) consistent standard in areas such as investment advice, order-handling and the completion of securities trades—“best execution”, in the jargon.

Second, MiFID aims to change the nature of competition in share trading. Although most shares in Europe are still traded on exchanges, there is growing interest in alternatives, such as off-exchange trading between investment banks. MiFID could accelerate this trend. In some countries—notably France, Italy and Spain—existing rules force all share trades through local bourses. The new rules will end those monopolies. No wonder exchanges, facing the threat of greater competition, are weighing up mergers.

A third intention of MiFID is more transparency. In future, investors should be able to subscribe to information services that let them see the whole market in certain shares, not only what is on offer at the local stock exchange. The goal is to let investors find the best prices in the market. This will mean competition for the London Stock Exchange, for example, which earns a healthy sum from selling such information. Investment banks are already banding together to develop alternative reporting services.

Checking the thermostat

Property prices are cooling fast in America, but heating up elsewhere

HOUSES are not just places to live in; they are increasingly important to whole economies, which is why The Economist started publishing global house-price indicators in 2002. This has allowed us to track the biggest global property-price boom in history. The latest gloomy news from America may suggest that the world is on the brink of its biggest ever house-price bust. However, our latest quarterly update suggests that, outside America, prices are perking up.

America's housing market has certainly caught a chill. According to the Office of Federal Housing Enterprise Oversight (OFHEO), the average price of a house rose by only 1.2% in the second quarter, the smallest gain since 1999. The past year has seen the sharpest slowdown in the rate of growth since the series started in 1975. Even so, average prices are still up by 10.1% on a year ago. This is much stronger than the series published by the National Association of Realtors (NAR), which showed a rise of only 0.9% in the year to July.

The OFHEO index is thought to be more reliable because it tracks price changes in successive sales of the same houses, and so unlike the NAR series is not distorted by a shift in the mix of sales to cheaper homes. The snag is that the data take time to appear. Prices for this quarter, which will not be published until December, may well be much weaker. A record level of unsold homes is also likely to weigh prices down. The housing futures contract traded on the Chicago Mercantile Exchange is predicting a fall of 5% next year.
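To see why repeat sales matter, consider a toy version of the calculation. The numbers are invented, and real indices such as OFHEO's use weighted regression rather than a simple average:

```python
# Each pair: (price at first sale, price at second sale, years between sales).
pairs = [
    (200_000, 242_000, 2.0),
    (150_000, 174_000, 3.0),
    (300_000, 318_000, 1.0),
]

# Annualised appreciation of each house, measured against itself, so a shift
# in the mix of sales towards cheaper homes cannot distort the result.
growth = [(p2 / p1) ** (1 / yrs) - 1 for p1, p2, yrs in pairs]
print(f"mean annualised appreciation: {sum(growth) / len(growth):.1%}")
```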

Elsewhere, our global house-price indicators signal a cheerier story. House-price inflation is faster than a year ago in roughly half of the 20 countries we track. Apart from America, only Spain, Hong Kong and South Africa have seen big slowdowns. In ten of the countries, prices are rising at double-digit rates, compared with only seven countries last year.

European housing markets—notably Denmark, Belgium, Ireland, France and Sweden—now dominate the top of the league. Anecdotal evidence suggests that even the German market is starting to wake up after more than a decade of flat or falling prices, but this has yet to show up in the index that we use, which is published with a long lag (there are no figures for 2006). If any readers know of a more timely index, please let us know.

Some economists have suggested that Britain and Australia are “the canaries in the coal mine”, giving early warning of the fate of America's housing market. The annual rate of increase in house prices in both countries slowed from around 20% in 2003 to close to zero last summer. However, the canaries have started to chirp again. In Australia average prices have picked up by 6.4% over the past year, although this is partly due to a 35% surge in Perth on the back of the commodities boom. Likewise British home prices have perked up this year, to be 6.6% higher, on average, than they were a year ago. Thus it is claimed that housing markets in Britain and Australia have had a soft landing.

Mind the gap

Pay discrimination between male and female scientists

SEVEN years ago, a group of female scientists at the Massachusetts Institute of Technology produced a piece of research showing that senior women professors in the institute's school of science had lower salaries and received fewer resources for research than their male counterparts did. Discrimination against female scientists has cropped up elsewhere. One study—conducted in Sweden, of all places—showed that female medical-research scientists had to be twice as good as men to win research grants. These pieces of work, though, were relatively small-scale. Now, a much larger study has found that discrimination plays a role in the pay gap between male and female scientists at British universities.

Sara Connolly, a researcher at the University of East Anglia's school of economics, has been analysing the results of a survey of over 7,000 scientists, and she has just presented her findings at this year's meeting of the British Association for the Advancement of Science in Norwich. She found that the average pay gap between male and female academics working in science, engineering and technology is around £1,500 ($2,850) a year.

That is not, of course, irrefutable proof of discrimination. An alternative hypothesis is that the courses of men's and women's lives mean the gap is caused by something else; women taking “career breaks” to have children, for example, and thus rising more slowly through the hierarchy. Unfortunately for that idea, Dr Connolly found that men are also likely to earn more within any given grade of the hierarchy. Male professors, for example, earn over £4,000 a year more than female ones. To prove the point beyond doubt, Dr Connolly worked out how much of the overall pay differential was explained by differences such as seniority, experience and age, and how much was unexplained, and therefore suggestive of discrimination. Explicable differences amounted to 77% of the overall pay gap between the sexes. That still left a substantial 23% gap in pay, which Dr Connolly attributes to discrimination.
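The split Dr Connolly computes resembles a standard Oaxaca-Blinder wage decomposition (the attribution to that method is our inference, not stated in the article). A minimal numpy sketch, assuming a matrix of characteristics whose first column is a constant:

```python
import numpy as np

def ols(X, y):
    # Least-squares wage equation; columns of X might be a constant,
    # seniority, experience and age (hypothetical regressors).
    return np.linalg.lstsq(X, y, rcond=None)[0]

def pay_gap_decomposition(X_men, y_men, X_women, y_women):
    """Split the raw gap in mean pay into a part explained by average
    differences in characteristics and an unexplained residual."""
    beta = ols(X_men, y_men)   # price the characteristics at male rates
    gap = y_men.mean() - y_women.mean()
    explained = (X_men.mean(axis=0) - X_women.mean(axis=0)) @ beta
    return gap, explained, gap - explained   # residual suggests discrimination
```

With the male wage equation as the reference, the unexplained share of the gap corresponds to the 23% that Dr Connolly attributes to discrimination.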

Besides pay, her study also looked at the “glass-ceiling” effect—namely that at all stages of a woman's career she is less likely than her male colleagues to be promoted. Between postdoctoral and lecturer level, men are more likely to be promoted than women are, by a factor of between 1.04 and 2.45. Such differences are bigger at higher grades, with the hardest move of all being for a woman to settle into a professorial chair.

Of course, it might be that, at each grade, men do more work than women, to make themselves more eligible for promotion. But that explanation, too, seems to be wrong. Unlike the previous studies, Dr Connolly's compared the experience of scientists in universities with that of those in other sorts of laboratory. It turns out that female academic researchers face more barriers to promotion, and have a wider gap between their pay and that of their male counterparts, than do their sisters in industry or research institutes independent of universities. Private enterprise, in other words, delivers more equality than the supposedly egalitarian world of academia does.

Alpha betting

The fund-management industry is splitting in two—and investors are gambling on the expensive bit

IT HAS never been easier to pay less to invest. No fewer than 136 exchange-traded funds (ETFs) were launched in the first half of 2006, more than in the whole of 2005.

For those who believe in efficient markets, this represents a triumph. ETFs are quoted securities that track a particular index, for a fee that is normally just a fraction of a percentage point. They enable investors to assemble a low-cost portfolio covering a wide range of assets from international equities, through government and corporate bonds, to commodities. Morgan Stanley estimates that ETFs control some $487 billion of assets, up 16.7% from a year ago. It predicts they will have $2 trillion of assets by 2011. No longer must investors be at the mercy of error-prone and expensive fund managers.
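Morgan Stanley's projection implies a brisk compound growth rate, which is easy to check (taking 2006 to 2011 as five years):

```python
# Implied annual growth if ETF assets rise from $487bn to $2trn in five years.
cagr = (2_000 / 487) ** (1 / 5) - 1
print(f"{cagr:.1%}")   # roughly 33% a year
```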

But as fast as the assets of ETFs and index-tracking mutual funds are growing, another section of the industry seems to be flourishing even faster. Watson Wyatt, a firm of actuaries, estimates that “alternative asset investment” (ranging from hedge funds through private equity to property) grew by around 20% in 2005, to $1.26 trillion. Investors who take this route pay much higher fees in the hope of better performance. One of the fastest-growing assets, funds of hedge funds, charge some of the highest fees of all.

At first sight, this might seem like a typical market, with low-cost commodity producers at one end and high-charging specialists at the other. Buy a Rolls-Royce rather than a Trabant and you can expect a higher standard of luxury and engineering in return for the much greater price. But fund management is not like any other industry; paying more does not necessarily get you a better service.

An index represents the average performance of all investors, before costs are deducted. If the fee paid to the fund manager increases, the return achieved by the average investor must decline. After fees, hedge-fund returns this year have been feeble. From January 1st through to August 31st, the average hedge fund returned just 4.2%, according to Merrill Lynch, less than the S&P 500 index's 5.8% total return.
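The arithmetic of fees makes the point concrete. Under a standard “2 and 20” fee structure (our assumption; the article does not give the funds' actual fees), a hedge fund must earn well above the index before costs just to match it after costs:

```python
target_net = 0.058          # the S&P 500's total return cited above
mgmt_fee, perf_fee = 0.02, 0.20

# Simplified model: net = (gross - management fee) * (1 - performance fee),
# ignoring hurdles and high-water marks.
gross_needed = target_net / (1 - perf_fee) + mgmt_fee
print(f"gross return needed: {gross_needed:.2%}")   # about 9.25%
```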

So why are people paying up? In part, because investors have learned to distinguish between the market return, dubbed beta, and managers' outperformance, known as alpha. “Why wouldn't you buy beta and alpha separately?” asks Arno Kitts of Henderson Global Investors, a fund-management firm. “Beta is a commodity and alpha is about skill.”

The fund-management splits began with the decline of balanced managers, which took complete charge of an investor's portfolio, running everything from American equities through Japanese bonds to property. Clients became convinced that no one firm could produce good performance in every asset class, nor could they master the art of timing the switch from one asset to another.

Powering up

Improved devices may make better use of sunlight

MOST of the power generated by mankind originates from the sun. It was sunlight that nurtured the early life that became today's oil, gas and coal. It is the solar heating of the Earth's atmosphere and oceans that fuels wave power, wind farms and hydroelectric schemes. But using the sun's energy directly to generate power is rare. Solar cells account for less than 1% of the world's electricity production.

Recent technological improvements, however, may boost this figure. The root of the problem is that most commercial solar cells are made from silicon, and silicon is expensive. Cells can be made from other, cheaper materials, but these are not as efficient as those made from silicon.

The disparity is stark. Commercial silicon cells have efficiencies of 15% to 20%. In the laboratory, some have been made with an efficiency of 30%. The figure for non-traditional cells is far lower. A typical cell based on electrically conductive plastic has an efficiency of just 3% or 4%. What is needed is a way to boost the efficiency of cells made from cheap materials, and three new ways of doing so were unveiled this week in San Francisco, at the annual meeting of the American Chemical Society.

Solar cells work by the action of light on electrons. An electron held in a chemical bond in the cell absorbs a photon (a particle of light) and, thus energised, breaks free. Such electrons can move about and, if they all move in the same direction, create an electric current. But they will not all travel in the same direction without a little persuasion. With silicon, this is achieved using a secondary electrical field across the cell. Non-silicon cells usually have a built-in “electrochemical potential” that encourages the electrons to move away from areas where they are concentrated and towards places where they have more breathing space.

Kwanghee Lee of Pusan National University, in South Korea, and Alan Heeger of the University of California, Santa Barbara, work on solar cells made of electrically conductive plastics. (Indeed, Dr Heeger won a Nobel prize for discovering that some plastics can be made to conduct electricity.) They found that by adding titanium oxide to such a cell and then baking it in an oven, they could increase the efficiency with which it converted solar energy into electricity.

The trick is to put the titanium oxide in as a layer between the part of the cell where the electrons are liberated and the part where they are collected for dispatch into the wider world. This makes the electrically conductive plastic more sensitive to light at wavelengths where sunlight is more intense. Pop the resulting sandwich in the oven for a few minutes at 150°C and the plastic layer becomes crystalline. This improves the efficiency of the process, because the electrons find it easier to move through crystalline structures.

The technique used by Dr Lee and Dr Heeger boosts the efficiency of plastic cells to 5.6%. That is still poor compared with silicon, but it is a big improvement on what was previously possible. Dr Lee concedes that there is still a long way to go, but says that even an efficiency of 7% would bring plastic cells into competition with their silicon cousins, given how cheap they are to manufacture.
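Dr Lee's competitiveness claim is, at bottom, a cost-per-watt calculation. A rough illustration follows; every figure below is an assumption made for the sake of the arithmetic, not a number from the article:

```python
insolation = 1000.0   # watts per square metre, the standard test condition

# Assumed module costs (dollars per square metre) and efficiencies.
module_cost = {"silicon": 250.0, "plastic": 40.0}
efficiency = {"silicon": 0.18, "plastic": 0.07}   # plastic at the 7% target

for tech in module_cost:
    peak_watts = insolation * efficiency[tech]
    print(f"{tech}: ${module_cost[tech] / peak_watts:.2f} per peak watt")
```

On such assumptions the cheaper material wins on cost per watt despite its lower efficiency, which is the substance of Dr Lee's point.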

A second approach, taken by Michael Grätzel of the Swiss Federal Institute of Technology, is to copy nature. Plants absorb solar energy during photosynthesis. They use it to split water into hydrogen ions, electrons and oxygen. The electrons released by this reaction are taken up by carrier molecules and then passed along a chain of such molecules before being used to power the chemical reactions that ultimately make sugar.

Dolling up the dole

A better way to help America's jobless

“MANY of our most fundamental systems—the tax code, health coverage, pension plans, worker training—were created for the world of yesterday, not tomorrow. We will transform these systems.” With these words George Bush laid out an agenda of domestic reform at the Republican convention in 2004. That agenda, starting with last year's attempt to transform America's vast state pension system, has gone nowhere. But Mr Bush's basic argument is right. Much of the machinery of America's domestic economic policy dates from the 1930s and needs repair. Unemployment insurance is a case in point.

Created by Franklin Roosevelt in 1935, America's dole has barely changed since. It provides temporary income support to laid-off workers and is financed by a small tax on wages. The details vary from state to state, but full-time workers who lose their jobs get a cheque worth, on average, just over a third of their previous wage for up to six months. Benefits can be paid for longer if the economy is in recession, but only if Congress agrees. By European standards, America's dole is short-lived, a characteristic that encourages people to get a new job quickly.

As a macroeconomic tool, the dole works well. Unemployment cheques support spending when workers are laid off, helping to smooth the business cycle. But the cash is not aimed at those who need it most. That is because a rising share of the unemployed are not laid off temporarily, but have seen their jobs disappear for good.

Research by Erica Groshen and Simon Potter, of the Federal Reserve Bank of New York, suggests that whereas temporary lay-offs explained much of the jumps in unemployment during the recessions of the 1970s and early 1980s, nowadays structural job losses dominate. People who are unemployed because their job has gone permanently need to find new lines of work. It takes them longer to find a job and, when they do, they are often paid considerably less than before.

Jeffrey Kling, an economist at the Brookings Institution, argues that the unemployment-benefit system ought to distinguish those who are temporarily out of a job but may find similar, or higher-paid work, and those who face permanently lower income. In a paper for the Hamilton Project, a research programme at Brookings that seeks new policies for America's centre-left, Mr Kling suggests that the dole should become less like a handout from the government and more like an insurance policy that individual workers finance themselves.

The idea is to give every worker an account, unsnappily called a “temporary earnings replacement account” or TERA. While in work, people could set aside money in these accounts. Those who lose their jobs could take cash out. The level and duration of withdrawals would be set by the government and would be the same as under today's unemployment system.

Bitter consequences

Green vegetables really do taste horrible

“EAT up your greens” is the exasperated cry of many a parent when faced with a fussy child. But the paradox of vegetables is that they are both good and bad for you. The cultivated plants consumed by everyone except hunter-gatherers have evolved an ambiguous relationship with people, in which they exchange the risk of being eaten by a human for the reproductive security that domestication brings. But the wild plants from which these cultivars are descended are very definite about the matter. They do not want to be consumed and they make that opinion known by deploying all sorts of poisonous chemicals to discourage nibbling herbivores. In many cases, those poisons have persisted into the cultivated varieties, albeit at lower levels.

Animals, of course, have evolved ways of dealing with these poisons. The best of these, from a plant's point of view, is when an animal can taste, and thus reject, a poisonous chemical. This has long been assumed to be the basis of the taste of bitterness, but that theory has only now been put to a clear test. In a paper just published in Current Biology, Mari Sandell and Paul Breslin, of the Monell Chemical Senses Centre, in Philadelphia, have looked at the phenomenon in that bête noire of presidents and parents alike: broccoli.

Bitter tastes are detected by receptor proteins that are, in turn, encoded by a family of genes known collectively as TAS2R. Humans have around 25 TAS2R genes, each sensitive to different groups of chemicals. That variety, in itself, indicates the range of the plant kingdom's weaponry. Dr Sandell and Dr Breslin, though, focused their attentions on just one of these receptor genes, called hTAS2R38. The protein derived from this gene is known, from laboratory experiments, to be sensitive to a substance called phenylthiocarbamide (PTC). This compound contains a molecular group called thiourea. And thiourea-containing substances are known from other studies to inhibit the function of the thyroid gland.

Cruciferous vegetables, such as watercress, turnips and—most pertinently—broccoli, are rich in a group of thiourea-containing compounds called glucosinolates. Dr Sandell and Dr Breslin wondered if there might be a connection. And, since different versions of hTAS2R38 code for proteins that have different levels of reaction to PTC, they wondered if that might be reflected in the fact that some people like broccoli, and others do not.

The two researchers assembled a group of volunteers and checked which versions of the hTAS2R38 gene they had. They then fed the volunteers vegetables and recorded their reactions. All of the vegetables were thought by at least some people to be bitter, but not all of them were cruciferous plants. The non-cruciferous ones were plants which, so far as is known, do not contain glucosinolates.

The results were clear. All volunteers found the non-cruciferous vegetables equally bitter, but their reactions to the cruciferous ones depended on their genes. Those with two copies of the version of hTAS2R38 coding for the protein that binds best to PTC (one copy having been inherited from each parent) thought broccoli and its cousins the most bitter. Those who had two copies of the poorly binding version thought they tasted fine. Those with one of each had an intermediate reaction.

Despite broccoli's bad reputation, the most bitter vegetables, according to this research, are swedes and turnips. That accords well with work which has shown that eating these vegetables suppresses the uptake of iodine into the thyroid gland. Iodine is an essential ingredient of thyroxine, a hormone produced by that gland.

The upshot of all this is that the complaints of children (and, indeed, of many adults) that green vegetables are horrid contain a lot of truth. There is no doubt that such vegetables are good for you. But they are not unequivocally good. As is often observed in other contexts, there is no free lunch.

Running rings round storms

Trees keep records of passing hurricanes

STUDYING the past is a good way to understand the present, and may even illuminate the future. But the past does not give up its secrets easily. Hurricane scientists, for instance, would like to know about long-term changes in the frequency and strengths of the storms they study. That would help to show whether the shifting pattern of hurricanes seen in the past few decades is cyclical, random or part of a trend that might be caused by global warming. Unfortunately, meteorologists have been keeping systematic tabs on the relevant data for only about 60 years. Before that, records are sporadic and anecdotal—and that is not enough to see the bigger picture.

Human records, however, are not the only sort available. Trees are popular with scientists who want to look at what happened a few hundred years ago, because their annual growth rings mean that their wood can be dated accurately. And Dana Miller, of the University of Tennessee, and her team have used that insight to search for hurricanes that humanity has failed to record. Their results, just published in the Proceedings of the National Academy of Sciences, have identified a number of previously unknown storms that hit the south-east coast of North America. The trick they used to do this was to look at the isotopic composition of the oxygen in the wood of local trees.

Water contains two isotopes of oxygen, one of which has two more neutrons than the other, making it heavier. When a hurricane forms, it tends, initially, to rain water molecules containing the heavier isotope. At that point it is still over the sea.
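Isotopic composition is conventionally reported in delta notation, per mil relative to a reference standard. A minimal sketch (the VSMOW reference ratio is a textbook constant; the sample value is invented):

```python
R_VSMOW = 2005.2e-6   # 18O/16O ratio of Vienna Standard Mean Ocean Water

def delta_18O(r_sample: float) -> float:
    # Per-mil deviation of a sample's 18O/16O ratio from the standard.
    return (r_sample / R_VSMOW - 1) * 1000.0

# Hurricane rain is depleted in the heavy isotope, so wood laid down in a
# storm year should read unusually negative.
print(f"{delta_18O(1990.0e-6):.1f} per mil")   # about -7.6
```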
