Friday, July 11, 2008

Laptop battery usage tips for the IBM ThinkPad T60

1. A new battery usually comes in a discharged condition with very low capacity. It is generally recommended to fully charge new battery packs before use. Refer to the user's guide of your electronic device for charging instructions.

2. A new battery pack needs to be cycled (fully discharged and recharged) three to five times to reach its optimum performance.

3. A rechargeable battery self-discharges when left unused for a long period of time. It should therefore be stored fully charged in a cool, dry, and clean place.

4. To maintain the optimum performance of a battery pack, it is highly recommended to cycle it (fully discharge and recharge it) at least once a month.

5. It is normal for a new battery to get warm while charging or in use. However, pay close attention if the battery pack becomes excessively hot; this may indicate a problem with the charging circuit of the electronic device, which should be checked by a qualified technician.

6. New batteries can be difficult to charge. Your electronic device may indicate a fully charged condition after only 10 to 15 minutes the first time a new battery pack is charged. If this happens, remove the battery pack, let it cool down for about 10 to 15 minutes, and then repeat the charging procedure. A new battery may also suddenly refuse to charge; if so, remove the battery from the device and reinsert it.

To get the best from your battery, please observe the precautions listed below:
1. Do not modify or disassemble the battery.
2. Do not incinerate the battery or expose it to excessive heat, which may result in an explosion.
3. Do not expose the battery to water or other liquids.
4. Do not pierce, hit, step on, crush, or otherwise abuse the battery.
5. Do not leave the battery in the device for long periods if the device is not being used.
6. Do not short-circuit the terminals or store the battery pack with metal objects such as necklaces or hairpins.

Activating a new ThinkPad T60 battery
A new notebook's battery ships with only a small charge. To activate a new IBM ThinkPad T60 battery, do not run on external power at first; instead, run on the battery until the notebook shuts down. Then charge the battery on external power until the charging indicator goes out, and run it down again. Repeat this three times to activate the IBM ThinkPad T60 battery.

Top 15 Ways to Extend Your Laptop’s Battery Life

Laptops tend to lose their charm quickly when you’re constantly looking for the nearest power outlet to charge up. How do you keep your battery going for as long as possible? Here are 15 easy ways to do so.

1. Defrag regularly - The faster your hard drive does its work, the less demand you put on the hard drive and your battery. Make your hard drive as efficient as possible by defragging it regularly (but not while it's on battery, of course!). Mac OS X is better built to handle fragmentation, so this may not be very applicable to Apple systems.

2. Dim your screen - Most laptops come with the ability to dim your laptop screen. Some even come with ways to modify CPU and cooling performance. Cut them down to the lowest level you can tolerate to squeeze out some extra battery juice.

3. Cut down on programs running in the background - iTunes, Desktop Search, and the like all add to the CPU load and cut down battery life. Shut down everything that isn't crucial when you're on battery.

4. Cut down external devices - USB devices (including your mouse) and WiFi drain your laptop battery. Remove or shut them down when not in use. It goes without saying that charging other devices (like your iPod) with your laptop when on battery is a surefire way of quickly wiping out the charge on your laptop battery.

5. Add more RAM - This will allow you to process more with the memory your laptop has, rather than relying on virtual memory. Virtual memory results in hard drive use, and is much less power efficient. Note that adding more RAM will consume more energy, so this is most applicable if you do need to run memory intensive programs which actually require heavy usage of virtual memory.

6. Run off a hard drive rather than CD/DVD - As power consuming as hard drives are, CD and DVD drives are worse. Even having a disc in the drive can be power consuming; discs spin, taking power, even when they're not actively being used. Wherever possible, try to run on virtual drives using programs like Alcohol 120% rather than optical ones.

7. Keep the battery contacts clean - Clean your battery's metal contacts every couple of months with a cloth moistened with rubbing alcohol. This keeps the transfer of power from your battery more efficient.

8. Take care of your battery - Exercise the battery. Do not leave a charged battery dormant for long periods of time. Once charged, you should use the battery at least once every two to three weeks. Also, do not let a Li-Ion battery completely discharge. (Full discharge is only for older battery types with memory effects.)

9. Hibernate not standby - Although placing a laptop in standby mode saves some power and you can instantly resume where you left off, it doesn't save anywhere near as much power as the hibernate function does. Hibernating a PC saves its state to disk and then shuts the machine down completely.

Frankfurt Stock Exchange

The Dax plunges to a new low for the year

By Alexander Armbruster

July 11, 2008. The Dax has fallen to a new low for the year. At 1:14 p.m. today the leading index of German business stood at 6,139 points. Just a year ago, on July 13, 2007, it had climbed as high as 8,151 points, surpassing even its previous record, set during the technology boom of March 2000.

Last year investors had rather expected it to keep rising. But instead of a new high, the financial crisis began. The Dax went into a dive and has since sailed down by more than 2,000 points. That works out to a loss of around 24 percent on the year.

The losers are the banks and carmakers

The slump spans all sectors. Bank shares fell most of all. Deutsche Bank's share price has halved and now trades a little above 50 euros. Commerzbank shares dropped almost as sharply. Hypo Real Estate's stock is down 67 percent, putting it at the top of the Dax's list of losers.

Shares of German carmakers have also fallen sharply: Daimler by 44 percent, its competitor BMW by 38 percent. Only Volkswagen gave its shareholders reason to cheer; a share currently costs 168 euros, 36 percent more than 12 months ago. That the stock could defy the market trend, however, was simply because the luxury carmaker Porsche bought into the mass-market producer - and in doing so bid the share price up.

An optimistic outlook

Private investors can nevertheless look ahead with optimism. In the view of many professionals, the Dax will march toward the 7,000-point mark by the end of the year, or even exceed it. The analysts at Deutsche Bank see the index rising to 7,700 points in their base scenario; those at HSH Nordbank even see 8,200 points. Peter Körndl, who helps shape equity strategy at Dresdner Bank, expects an index level of 7,600 points.

His assumptions include, however, that the oil price falls rather than rises, so that inflation rates decline and central banks think less about further interest-rate increases. "If we see lower rates of price increases, valuation will again play a bigger role, and in that respect equities are still very cheap by historical standards."

He is alluding to valuation measures such as the ratio of share prices to expected corporate earnings, which currently stands at about 11 for the Dax, while the long-term average is 15.

What will the oil price do?

The oil price also plays a key role in Commerzbank's forecast. "If the oil price were to fall below 130 dollars and stay there, that would certainly be enough to brighten investor sentiment and drive the Dax higher," says Markus Wallner. The equity strategist sees the Dax rising to 7,200 points. In his view it helps that many companies still have solid balance sheets, which at least greatly reduces the risk of further price losses. "At the moment, however, nervousness is very high, and geopolitical events are also driving prices."

Markus Reinwand, equity strategist at Hessische Landesbank, considers many of these topics "over-discussed." In his view the negative headlines are already priced into many stocks. In a recent analysis he also calculated that, in the past, index declines of 20 percent were usually followed by a bottoming-out, from which moves back upward were possible. By his estimate the Dax will climb to 7,300 points.

Matthias Jörss, equity strategist at Sal. Oppenheim, is less optimistic. He too expects higher prices, but sees the Dax rising only to 6,900 points. The valuations of many stocks may be cheap, but in his view the news will stay bad for now. "Moreover, growth in many emerging markets will slow while inflation rises at the same time, which weighs on equities." He therefore recommends investments in sectors little affected by the business cycle. Within the Dax he considers the shares of RWE, Deutsche Post, Postbank, and Münchener Rück worth buying.



Text: F.A.Z.
Images: AP, Bloomberg; Thomson Financial Datastream, ddp, FAZ.NET

Corporate disposals

Breaking up is hard to do

Jul 10th 2008
From The Economist print edition

GE plans to sell its consumer and industrial division. Corporate disposals are getting more popular


Illustration by David Simonds

DECIDING to sell a business unit or subsidiary can be one of the hardest decisions chief executives have to make. Some cannot bring themselves to wield the axe: big disposals are often triggered only when a new boss takes over or a financial crisis forces a chief executive’s hand. But smart sellers can earn impressive returns.

The number of such sales worldwide has been growing steadily, from 10,074 in 2003 to 12,361 last year, according to data from Thomson Reuters, a research firm (see chart below). Over the same period the total value of such deals soared from $539 billion to almost $1.5 trillion, as the average deal size increased.

This year is shaping up to be a busy one for divestitures too, even though the broader mergers-and-acquisitions market is in the doldrums. Slowing economies are forcing companies to take a hard look at their activities. Corporate activists are also pressing firms to shed underperforming assets. On Thursday July 10th General Electric (GE), which has been trying to find a buyer for its home-appliances division, said it plans to spin off its entire Consumer & Industrial division to shareholders. Other transactions in the pipeline include Time Warner’s planned disposal of its cable-television business, Allianz’s scheme to sell Dresdner Bank, and Royal Bank of Scotland’s proposed sale of its insurance arm. General Motors (GM) is looking for a buyer for its Hummer brand, and this week Merrill Lynch was reported to be planning to sell all or part of its 20% stake in Bloomberg, a financial-news provider.

Most sales take place in rich countries, but emerging-market firms have begun to join in. In December 2007 Orascom Construction Industries (OCI), an Egyptian conglomerate, sold its cement business to France’s Lafarge for €8.8 billion ($12.9 billion) in order to concentrate on its fast-growing construction, natural-gas and fertiliser activities. In February the firm snapped up a rival fertiliser company using some of the money raised from the deal.

Although OCI sold its business outright, many emerging giants prefer spin-offs, which involve distributing shares in a subsidiary to the parent’s owners or selling a minority stake to new investors. Take Bharti Airtel, an Indian telecoms giant, which last year spun off a $1 billion minority stake in its network-infrastructure business, Bharti Infratel, to a group of investment funds. Such deals highlight hidden gems in a portfolio, but allow parent companies to retain control. They also attract investors keen to buy into fast-growing industries: in February Kohlberg Kravis Roberts, a private-equity firm, invested a further $250m in Bharti Infratel.

Like many private-equity firms, which frequently shuffle their portfolios, companies that buy and sell a lot tend to shine too. Carsten Stendevad of Citi, an investment bank, notes that between 2002 and 2007 the returns of firms that used a balanced mix of acquisitions and disposals outperformed acquisition-focused companies by almost 5% a year.

Why? Part of the answer is that investors assume that active portfolio managers such as GE and Procter & Gamble (a consumer-goods giant which recently announced the $3 billion sale of its Folgers coffee business) are very focused on growth. They are also less likely to suffer from what academics call “capital allocation socialism”. In many firms, some businesses have excellent growth prospects whereas others are mature but produce lots of cash. Ideally, money from these “cash cows” should be redeployed to the stars, but the cash cows’ managers may fight to keep some of it for their empires. Several studies have shown that the investment in future stars rises sharply once laggards have been dumped.

This effect may help explain why a recent study of European asset sales by Morgan Stanley, another investment bank, found that firms selling large businesses tended to get a bigger share-price boost in the long run than those selling tiddlers. Investors seem to interpret a big asset sale as a signal that managers are serious about focusing on growth areas. If a company is in trouble, a big sale may be seen as an important part of a turnaround.

Of course, a firm will reap the benefits of a disposal only if it is handled well. Yet dealmakers say managers often put less effort into selling businesses than buying them, perhaps because they just want to focus on what’s left. Successful sellers, however, work hard to get the best possible prices for their assets, even if it takes months. GE is excellent at this. Its chief executive, Jeffrey Immelt, has been jetting around the globe recently trying to drum up interest in its household-appliances arm from the likes of China’s Haier and Sweden’s Electrolux.

As well as touting a division to potential buyers, managers must also assess the impact its departure will have on shared corporate resources, such as finance and legal services.

The biggest difficulty of all, however, is deciding when to sell. Many experts say that firms often let go of assets too late in a business cycle, when prices are depressed. GM is likely to have trouble parking its Hummer division, for example, having decided to look for a buyer just as giant, thirsty vehicles are going out of fashion.

China

Hungry dragon

Jul 11th 2008
From Economist.com

Chinese companies are investing more abroad


AFTER 11 months of tricky negotiations, last week a subsidiary of China's state-owned oil company, CNOOC, announced that it would buy a Norwegian oil-services firm for $2.5 billion. Foreign investment by Chinese companies has grown steadily, reaching $18.7 billion last year. But striking deals is getting harder. Since 2005, when CNOOC was blocked by the American government from buying Unocal, an American oil firm, many of China's big state-owned companies have been wary of bidding for Western firms. And other countries are chary of China's appetite. An estimated $40 billion of potential Chinese acquisitions are awaiting approval by Australian regulators.

Thursday, July 10, 2008

Don't Let the Circumstances Outpace Your Assumptions

A front-page article in yesterday’s Wall Street Journal illustrated how important it is to periodically revisit the assumptions behind an idea.

The article described how several years ago, the head of Sacramento’s regional planning agency started to push developers to concentrate growth in defined areas rather than furthering suburban sprawl. The argument hinged on lowering pollution and fostering economic development.

Of course, with the price of oil shooting up, today the idea looks remarkably prescient.

I wonder, however, how many planners rejected similar ideas because they couldn't imagine people making living decisions based on the cost of commuting, and how many now wish they had started dusting off those rejected plans when signals emerged suggesting that circumstances had changed.

On the other hand, sometimes circumstances change in ways that undermine ideas that once seemed credible. Consider Motorola’s daring, and ultimately doomed, venture to provide satellite-driven mobile telephony.

When Motorola started investing in the multibillion-dollar “Iridium” project, competing standards made using mobile telephony in different countries a major headache. I remember, when I joined McKinsey & Co. in the mid 1990s, how world travelers would carry several mobile phones. Under those conditions, a truly global technology had appeal.

As many markets converged on a standard (Global System for Mobile communications, or GSM) and created roaming agreements, the basic premise behind Iridium crumbled. Motorola continued plowing money into the venture until it was crystal clear that it was a flop.

It’s rare that you come across a universally good or bad idea. Whenever you make a decision about an idea, ask what three developments would make you reject the project you just approved, or approve the project you just rejected. Periodically revisiting that shortlist will help you properly manage your innovation portfolio.

Can readers think of examples of good "before their time" ideas that might have been killed too soon? I wonder sometimes what would have happened if Apple kept at the Apple Newton for example. Maybe the iPod would have come even sooner.

Does “Management” Mean “Command and Control”?

I read recently that IBM was abandoning the term “knowledge management” for “knowledge sharing.” According to an article on the KnowledgeBoard site (thanks to Chris Johannesson from NBC Universal for suggesting that I blog about it), Chris Cooper, knowledge sharing solutions leader at IBM Global Business Services (GBS), deems it a “philosophical repositioning.” Cooper notes, “Management suggests control: control of process and control of environment.” Another GBS knowledge specialist, Luis Suarez, notes in the same article, "Command and control corporations are no longer going to be there. People need to be freed to share what they know."

Hmm…better tell all the world’s managers, schools of management, management consultants, etc. The term “management” is apparently a synonym for “command and control,” and we know that’s bad. “Command and control” is top-down, mean and nasty, and headed for extinction; “sharing” is bottom-up, nice and friendly, and the wave of the future. Maybe the Yale School of Organization and Management, for example, should become the Yale School of Sharing.

OK, I have no problem with giving something a new name when you are adopting a new emphasis. I don’t even have that much of a problem with the term “sharing,” although it is somewhat reminiscent of kindergarten. However, I do have a problem with overly simplistic characterizations of knowledge management, and management more generally.

Let’s talk about the more limited issue of defending knowledge management. As I said, I don’t really care what you call it, but if your organization really cares about creating, distributing (I’m sorry—“sharing”), and applying knowledge, you need to manage it. The last time I checked, “management” of knowledge could include some relatively structured, “here’s the knowledge we really need to do our jobs right” approaches, as well as some more emergent, Enterprise 2.0-oriented ones. If you only do the former, your knowledge workers will probably feel a bit stifled; if you only do the latter, things will probably feel a bit chaotic. If I’m a NASA astronaut, for example, and I’m sitting on the launch pad when something goes wrong, I’d rather have people looking for a solution in structured knowledge bases than mucking around in blogs and wikis.

But the broader issue is whether “management” is an outdated concept, or whether it’s the same as “command and control.” Frankly, I think those are nutty ideas. There may sometimes be a need for less directive approaches to management, as I argued with respect to knowledge workers in my book Thinking for a Living. But the right style of management, like the right approach to knowledge management, varies widely based on a number of factors—including the people being managed, the society in which you’re managing, and the task at hand.

In fact, I think we should ban the term “command and control.” It’s simplistic shorthand for a stereotyped approach to management. The world of management is much more subtle and multi-faceted, and any synonyms for it should reflect that complexity.

July/August 2008

Founding Father

A new book describes the man who created modern venture capital.

By Mark Williams

Although it's a popular story, it is untrue that President George W. Bush once said, "The problem with the French is that they have no word for entrepreneur." Still, a common prejudice in Anglophone nations holds that the French are less entrepreneurial than we are. Creative Capital: Georges Doriot and the Birth of Venture Capital--a biography of the French-born Harvard Business School professor who practically created modern venture capitalism--is a reproach to that assumption.

That said, as BusinessWeek's Spencer Ante makes clear in his new book, Georges Doriot was an unusual Frenchman. He studied the sciences at his Parisian lycée, to which--after gaining his driver's license at 15--he drove through the boulevards of a capital by then hunkered down for World War I. At 18 he passed from that lycée to the charnel house of the Western Front as an officer in an artillery regiment; at war's end he heeded his father's counsel that the shattered state of France made the New World his wisest option.

So Georges Doriot came to the United States at 21 with neither family nor friends, nor much money, but with the intention to enroll at MIT and with a letter from a friend of his father introducing him to A. Lawrence Lowell, president of Harvard. At Lowell's suggestion he studied at Harvard Business School rather than MIT, and in his first job at an investment bank he befriended a young Lewis Strauss, who would later be the chairman of the U.S. Atomic Energy Commission and a dispenser of federal benefices on an enormous scale. Even in Doriot's earliest years in America, then, its future eminent men were familiar to him--though still strangers to most of their countrymen--and this pattern intensified after Harvard Business School hired him in 1925: his former students frequently attained high positions in business or government. During World War II, having become a U.S. citizen, Doriot joined the army, became director of the Military Planning Division, and received brigadier general's rank in the Quartermaster Corps after William Donovan, soon to be head of the OSS (forerunner of the CIA), recommended him to President Roosevelt. His military superior in the war was a man who in the 1920s had attended his lectures on the virtues of the goal-oriented campaign and the collective wisdom of the markets.

In a hurry: Georges Doriot in 1931 on the luxury liner Ile de France, 10 years after his arrival in the United States from France.
Credit: Bettmann/Corbis
MORE INFORMATION:

Creative Capital: Georges Doriot and the Birth of Venture Capital
By Spencer E. Ante
Harvard Business Press, 2008, $35.00

On that latter subject Doriot felt strongly. In speeches and articles, he opposed both the dirigiste political economy of his native France and the tax hikes and anticompetitive laws enacted in the United States under the New Deal. Such regulations, he maintained, arrogated to bureaucrats the function of the markets; their worst feature was that they let government lend money to failing businesses. Ante notes that a former colleague of Doriot's, James F. Morgan, recalled him as "the most schizophrenic Frenchman I've ever met"--devoted to his original land's wine, cuisine, and language even as "the French capacity to make very simple things complicated drove him nuts." However atypical a Frenchman Doriot was, his pro-entrepreneurial philosophy--alongside his vast experience serving on dozens of corporate boards in the interwar years and running much of U.S. military procurement during World War II--made him the natural choice for the role of company president when in 1946 a group of Boston's leading citizens set up the American Research and Development Corporation (ARD) as the first publicly owned venture capital firm.

By the time Doriot called it quits in 1972 by merging ARD with the conglomerate Textron, his firm had invested in 120 companies, most of which had proprietary, innovative technologies in areas including isotope conversion, water desalination, electronics, data processing, scientific instrumentation, and electrical generation. It's an impressive list of investments, containing names to conjure with--if your taste runs to conjuring with Zapata Off-Shore, a company headed by George H. W. Bush that had a novel mobile oil-drilling rig, or Digital Equipment Corporation (DEC), which Doriot funded with an initial $70,000 in 1957 and which returned more than $400 million when ARD liquidated its stake in 1972.

With DEC, a legendary company from the dawn of the computer age, we enter a landscape that more closely resembles our own. Doriot left his mark in other realms--principally as an early advocate of globalization, by founding a European-based counterpart of Harvard Business School called the Institut Européen d'Administration des Affaires, or INSEAD. Yet his chief legacy is his quarter-century at the head of the first organized venture capital firm to raise its funds from institutional investors and the public. Contemporaneously with ARD's watershed investment in DEC, others began walking the trails Doriot had blazed: Arthur Rock (a student of Doriot's in the Harvard class of 1951) backed the departure of the "Traitorous Eight" from Shockley Semiconductor to form Fairchild Semiconductor in 1957, then funded Robert Noyce and Gordon Moore when they left Fairchild to found Intel; Laurance Rockefeller formed Venrock, which has since backed more than 400 companies, including Intel and Apple; Don Valentine formed Sequoia Capital, which would invest in Atari, Apple, Oracle, Cisco, Google, and YouTube.

Creative Capital is not a yellowing evocation of a vanished era of business. Nor does it suggest that there are, or once were, more systematic, less speculative ways of investing in technology startups. But if one is struck by how little Doriot's venture capitalism differed from that of today's Silicon Valley, Ante's book does show how the structure of venture capital has evolved. At the 1960s' end, for instance, when Doriot sought a successor at ARD, he favored one of his former students, Thomas Perkins, who'd made a name for himself as administrative head of the research department at Hewlett-Packard. Perkins found polite reasons to decline Doriot's offer, but his real motive--as he told Ante--was simply that "there was no way to make significant money because of the structure of ARD." Doriot endured bureaucratic regulators who did not understand or care how a venture capital firm differed from other investment companies. ARD suffered because, since it was incorporated as a publicly traded investment company, its employees could not generally receive stock options in its portfolio companies, despite Doriot's ceaseless pleas to the U.S. Securities and Exchange Commission.

The reality that Doriot's company faced from 1959 onward was that a new organizational form--the limited partnership, born in Texas's oil-wildcatting industry--was being adopted by newer VC firms. Ante quotes a former ARD executive who recalled that after he supervised the IPO of one portfolio company, the net worth of that company's CEO "went from 0 to $10 million and I got a $2,000 raise." A VC limited partnership, by contrast, gave its general partners not just management fees but also portions of its capital gains; additionally, it permitted profits to be passed on to its investors without incurring corporate taxes, and it mandated that limited partners stand clear of management. Small wonder that when Perkins helped found Kleiner Perkins Caufield and Byers in 1972, it was as a limited partnership. When Doriot finally accepted the SEC's intransigence, he deemed ARD "not competitive anymore" and sought the merger with Textron.

Similar disagreements continue between government and industry. After the dot-com and telecom crashes, Washington passed the Sarbanes-Oxley Act and new accounting rules for expensing stock options, despite the predictions of many technology executives and VCs that regulation would undermine innovation. John Doerr at Kleiner Perkins, for one, believes that that happened: "Sarbanes-Oxley did have some chilling effects on technology startups in terms of the cost of being able to go public."

What verdict should we award Doriot and ARD? David Hsu, a professor of management at the University of Pennsylvania's Wharton School, says that while ARD suffered from fatal organizational flaws, it made a lasting imprint on the practice of venture capital. Indeed, writes Hsu in a paper he coauthored, by the time Doriot sold the firm to Textron, "venture capital had become a part of the economy, and ARD simply slipped out of existence with its historical mission accomplished."

Mark Williams is a contributing editor to Technology Review.


"Plug and Play" Hospitals

Wednesday, July 09, 2008

"Plug and Play" Hospitals

Medical devices that exchange data could make hospitals safer.

By Kristina Grifantini

The bewildering variety of new medical devices in U.S. hospitals promises higher standards of care. But it also poses new opportunities for error. A growing number of physicians believe that the interoperability of medical devices--their ability to communicate with each other--could make hospitals safer and more efficient.

"Today, there are many proprietary systems available from different vendors, but the problem is, these systems can't talk to one another," says Douglas Rosendale, a surgeon who works on information integration at Veterans Health Administration and Harvard Brigham and Women's Hospital. "If they can't interface, then they can't share information, which could have an impact on patient care." Estimates of the number of preventable deaths caused each year by medical errors in American hospitals range from 98,000 to 195,000.

Julian Goldman, director of the Center for Integration of Medicine and Innovative Technology's Medical Device Interoperability Program, based at Massachusetts General Hospital, has developed two demonstration projects that illustrate the idea of the "plug and play" operating room. The first project is an integrated ventilator. A common problem in hospitals is taking chest x-rays of patients on ventilators, says Goldman. To keep the lungs' movements from blurring the image, doctors must manually turn off the ventilator for a few seconds to take the x-ray. But then they run the risk of inadvertently leaving the ventilator off for too long, says Goldman.

To simulate an x-ray machine, Goldman used a webcam, which he connected to a ventilator and a computer. He synched the camera with the ventilator so that it would capture images only when the ventilator was at the point of full inhalation or exhalation. Goldman says that as a result of his demonstration, standards for ventilators are in the process of being revised so that future versions of the devices will include a pause function and will be subject to network control, moving toward interoperability.
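
The gating logic Goldman describes is easy to sketch. The following Python snippet is a minimal illustration under stated assumptions, not the demo's actual code: the breath cycle is simulated with a cosine, and the period and capture thresholds are invented.

```python
import math

# Minimal sketch of phase-gated image capture. The real demo synchronized
# a webcam with a networked ventilator; here the breath cycle is simulated,
# and the period and thresholds are assumptions.
BREATH_PERIOD_S = 4.0  # assumed: one full breath every 4 seconds

def lung_volume(t):
    """Simulated lung volume, 0.0 (full exhalation) to 1.0 (full inhalation)."""
    return 0.5 * (1 - math.cos(2 * math.pi * t / BREATH_PERIOD_S))

def capture_image(t):
    # Stand-in for triggering the camera (or, in a real OR, the x-ray).
    print(f"capture at t={t:.2f}s, lung volume={lung_volume(t):.3f}")

t, dt = 0.0, 0.01
while t <= BREATH_PERIOD_S:
    v = lung_volume(t)
    # Capture only at the still points of the cycle (full inhalation or
    # full exhalation), so the ventilator never has to be switched off.
    if v < 0.001 or v > 0.999:
        capture_image(t)
    t += dt
```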

Breathe in, breathe out: A ventilator and camera connected to a computer demonstrate how “plug and play” medical devices could help prevent medical errors. Traditionally, a doctor who wants to take an x-ray of a patient on a ventilator has to manually stop the device so that the lungs’ motion won’t blur the image. But that introduces the risk of accidentally leaving the ventilator off for too long. This experimental setup at Massachusetts General Hospital times photographs taken by a webcam so that they correspond to the full-exhalation or full-inhalation states of the ventilator.
Credit: Kristina Grifantini

"That's an example where you actually avoid the risk by simply not having to turn off the ventilator at all," says Peter Szolovits, a professor of computer science at MIT who studies medical data integration. "In other cases where you have a bunch of data simultaneously, you can do a better job of trying to understand what's going on with the patient," he says.

Device interoperability could also reduce the large number of false alarms that nurses must contend with. "If you go into an ICU, it's a madhouse," says Szolovits. "There are alarms going off constantly, because each alarm is separate from the others, so none of them have an integrated view of what's going on with the patient." If the data from medical monitors were integrated, he says, alarms would be more likely to indicate something truly important.

Goldman's second plug-and-play demonstration simulates a self-administering pain medication pump, a device widely used in hospitals despite its occasional adverse effects. Monitoring devices strive to eliminate the risk that patients will accidentally overdose, but they set off many false alarms. Goldman speculated that if a computer received data from two or more monitoring devices, it could much more easily distinguish false emergencies from real ones. In his demonstration, simulated patient data is fed to an oximeter and a respiratory monitor. The program sounds an alarm only when both sensors suggest that the patient is undergoing a crisis.
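
The combination logic amounts to a logical AND across independent monitors. Here is a minimal Python sketch, with assumed thresholds and simulated readings standing in for live device data:

```python
# Sketch of the combined-alarm idea, not the demo's actual code:
# thresholds and readings below are invented for illustration.
SPO2_ALARM = 90   # percent oxygen saturation (assumed threshold)
RESP_ALARM = 6    # breaths per minute (assumed threshold)

def crisis(spo2, resp_rate):
    """Alarm only when BOTH the oximeter and the respiratory monitor
    indicate distress, suppressing single-sensor false alarms."""
    return spo2 < SPO2_ALARM and resp_rate < RESP_ALARM

# Simulated readings: (oximeter %, respiratory rate)
for spo2, rr in [(97, 14), (85, 15), (96, 4), (84, 5)]:
    print(spo2, rr, "ALARM" if crisis(spo2, rr) else "ok")
```

Only the last reading, where both sensors agree, trips the alarm; either sensor misreading alone does not.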

Goldman admits that, while his demos are relatively straightforward, obstacles to device interoperability remain. Monitoring systems are expensive for hospitals to replace, he says: "We've made it too difficult to integrate systems to have smart alarms." Another barrier is old-fashioned competitiveness. A vendor that produces medical equipment tends to make its devices compatible only with each other.

But as Goldman points out, many emergency rooms need such specialized equipment that no one vendor can produce all of it. So selecting a single vendor won't solve the interoperability problem. "We're probably a ways off from true interoperability," Rosendale says. "However, there is clearly momentum growing in this area. As computer technology and device dependence grows, that means interoperability is going to be more and more obvious."

"I think everyone recognizes that there's a lot of data generated for patients, but it's not always used as effectively as it could be," says Daniel Nigrin, chief information officer and senior vice president for information services at Children's Hospital Boston. "Over the course of the last 5 to 10 years, there have been several studies that came out that showed basically that there's room for enormous improvement in reducing errors in medicine. That's why efforts like [Goldman's are] so crucial." Nigrin suggests that hospitals are slowly starting to move toward medical devices that share data with one another and with electronic medical-record systems. "There are instances where you're starting to see some of the devices connected. Whether that's having monitoring systems or ventilator systems attached to electronic medical records, you're starting to see some systems like that implemented in a real-world environment," he says.

Thursday, July 10, 2008

A Patch to Fix the Net

A major flaw in the basic design of the Internet is being repaired by a large group of vendors working in concert.

By Erica Naone

On Tuesday, major vendors released patches to address a flaw in the underpinnings of the Internet, in what researchers say is the largest synchronized security update in the history of the Web. Vendors and security researchers are hoping that their coordinated efforts will get the fix out to most of the systems that need it before attackers are able to identify the flaw and begin to exploit it. Attackers could use the flaw to control Internet traffic, potentially directing users to phishing sites or sites loaded with malicious software.

Discovered six months ago by security researcher Dan Kaminsky, director of penetration testing services at IOActive, the flaw is in the domain name system, a core element of the Web that helps systems connected to the Internet locate each other. Kaminsky likens the domain name system to the telephone company's 411 system. When a user types in a Web address--technologyreview.com--the domain name system matches it to the numerical address of the corresponding Web server--69.147.160.210. It's like giving a name to 411 and receiving a phone number, Kaminsky says.
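
From a program's point of view, that lookup is a single resolver call. A tiny Python illustration (the address returned today will of course differ from the 2008 value quoted above):

```python
import socket

# Ask the domain name system (the "411 operator") to turn a
# human-readable name into a numerical address.
print(socket.gethostbyname("technologyreview.com"))
```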

The flaw that Kaminsky found could allow attackers to take control of the system and direct Internet traffic wherever they want it to go. The worst-case scenario, he says, could look pretty bleak. "You'd have the Internet, but it wouldn't be the Internet you expect," Kaminsky says. A user might type in the address for the Bank of America website, for example, and be redirected to a phishing site created by an attacker.

Details of the flaw are being kept secret for now. After Kaminsky discovered it, he quietly notified the major vendors of hardware and software for domain name servers. In March, he was one of 16 researchers who met at Microsoft's Redmond, WA, campus to plan how to deal with the flaw without releasing information that could help attackers. The researchers began working with vendors to release patches simultaneously. Also, since patches are known for giving away information that can help attackers reverse-engineer malicious software, the researchers chose a fix that kept the exact nature of the problem hidden. "We've done everything in our power up to and including selecting an obscure fix to provide the good guys with as much of an advantage as possible," Kaminsky says. "The advantage won't last forever. We think--we hope--it'll last a month."

Since the flaw is in the design of the domain name system itself, it afflicts products made by a variety of vendors, including Microsoft, Cisco, Sun Microsystems, and Red Hat, according to a report released by the U.S. Department of Homeland Security's Computer Emergency Readiness Team. The flaw also poses more problems for servers than it does for Web surfers, so vendors are focusing on getting patches to Internet service providers and company networks that might be vulnerable. Most home users will be covered by automatic updates to their operating systems.

Rich Mogull, an analyst with Securosis, says, "This is something that absolutely affects everyone who uses the Internet today." While he notes that most home users won't have to take action to address the flaw, he stresses that it's very important for businesses to make sure that they've covered their bases. "It is an absolutely critical issue that can impede the ability of any business to carry out their normal operations," he says.

Although Kaminsky was careful to avoid giving out too much information about the flaw that he discovered, he did say a few things about the nature of the fix. When a domain name server responds to a request for a website's location, it provides a confirmation code that is one of 65,000 numbers, as assurance that the transaction is authentic. "What has been discovered," Kaminsky says, "is that, for undisclosed reasons, 65,000 is just not enough, and we need a source of more randomness." The new system will require the initial request to include two randomly generated identifiers, instead of the one it now contains. Both identifiers will automatically be returned in the server's response. Kaminsky likens this to sending mail. Before the patch, it was possible to send a letter signed on the inside, but without a return address. After the patch, all "mail" sent from domain name system servers must include both a "signature"--the confirmation code--and the "return address"--the source port information.
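
A back-of-the-envelope simulation shows why the extra identifier matters: a forged reply must now match two random values instead of one. This Python sketch is illustrative only; the size of the usable port space is an assumption, and real attacks are more involved than blind guessing.

```python
import random

# A forged reply must match every random identifier in the query.
TXID_SPACE = 2**16   # the roughly 65,000 confirmation codes noted above
PORT_SPACE = 2**14   # assumed: ~16,000 usable ephemeral source ports

def spoof_succeeds(space, tries):
    """One query: the attacker fires `tries` blind guesses at it."""
    target = random.randrange(space)
    return any(random.randrange(space) == target for _ in range(tries))

trials, tries_per_query = 2000, 100
for label, space in [("txid only", TXID_SPACE),
                     ("txid + port", TXID_SPACE * PORT_SPACE)]:
    wins = sum(spoof_succeeds(space, tries_per_query) for _ in range(trials))
    print(f"{label}: {wins / trials:.4%} of queries successfully spoofed")
```

Multiplying the two spaces makes a lucky guess roughly four orders of magnitude less likely under these assumptions.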

Jeff Moss, CEO of Black Hat, a company that organizes conferences on security, stresses the importance, not only of the vulnerability, but also of the approach taken to patching it. "I don't even want to ask Dan [Kaminsky] how much money he could have gotten for this bug had he decided to sell it," Moss says.

Kaminsky says he's glad that vendors were willing to work together to address the flaw. "Something of this scale has not yet happened before," he says. "It is my hope that for any issue of this scale, especially design issues of this scale, this is the sort of thing that we can do in the future." He plans to release full details of the vulnerability next month at the Black Hat security conference in Las Vegas.

Mapping the California Wildfires

NASA is using its new thermal-imaging sensor to help track the fires raging across the state.
Wednesday, July 09, 2008
By Brittany Sauser

More than 1,700 wildfires are burning across the state of California, where dry, windy conditions continue to make it difficult for firefighters to put out the flames. According to the California Department of Forestry and Fire Protection, a total of 667,863 acres have been burned since June 20, destroying 81 homes while threatening more than 13,000. The costs are estimated to be greater than $200 million.

The United States Forest Service and the National Interagency Fire Center continue to tap all their resources to help contain the fires. In particular, they have once again called on NASA to use its unmanned aerial vehicle, Ikhana, equipped with a new thermal-imaging sensor, to track the fires. The 12-channel spectral sensor is more sensitive in the thermal range and can track fires more accurately than current fire-mapping methods, such as line scanners. The data from the new sensor is automatically processed onboard the aircraft and then sent to ground stations, where it is incorporated into a Google Earth map.

The data is displayed in an array of colors to let firefighters know where the fire is actively burning and to identify areas that are cooling. This helps the firefighters determine where to deploy resources.
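
As a rough illustration of that color-keying step, here is a Python sketch; only the legend itself comes from the article, and the temperature breakpoints are invented:

```python
# Illustrative only: the legend (yellow = actively burning, red =
# recently burned, blue = cooling) is the article's; the temperature
# breakpoints below are invented for the example.
def classify(temp_c):
    if temp_c >= 300:
        return "yellow (hot, actively burning)"
    if temp_c >= 100:
        return "red (warm, recently burned)"
    return "blue (cooling)"

for reading in (450, 180, 40):   # hypothetical pixel temperatures, deg C
    print(reading, "->", classify(reading))
```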

Ikhana conducted its first operational mission on July 8; it mapped fires such as the American River Complex, Piute, Clover, Northern Mountain, and BTU Lightning Complex. "The technology is performing flawlessly," says Everett Hinkley, the National Remote Sensing Program's manager at the U.S. Forest Service and a principal investigator on the project to test and develop the sensor. "We are getting updates to the California fire folks within 30 minutes. Never have they had updates this fast. The key ingredient is having satellite communications to be able to push images to a server as they are acquired."

Hinkley has provided Technology Review with exclusive images (see below) taken during yesterday's mission. Use the legend as a guide: the fire's hot, active spots are yellow; warm areas that were recently burned are shades of red; and areas that are cooling are blue.

The American River Complex fire is located in Foresthill, in Placer County.

The Clover fire is located in the South Sierra Wilderness, in Kern County.

The Piute fire, located in Twin Oaks, in the South Sierra Wilderness, has burned 33,152 acres and is 28 percent contained.

Another view of the Piute fire.

Images credit: NASA


Household spending

The bare necessities

Jul 8th 2008
From Economist.com

Where people spend most on food and fuel


THE soaring cost of food and fuel is a concern for the governments of rich and poor countries alike. Many households in Africa and Asia shell out a larger share of total spending on food and fuel and so are disproportionately hit by rising prices. But in some poor countries fuel subsidies help to ease the pain.

Tuesday, July 08, 2008

Labor market | 08.07.2008

Siemens to cut 16,750 jobs worldwide

Rapid changes in global business and gloomy economic prospects: that is how Siemens chief Peter Löscher justified the German electronics giant's harsh measure.

Siemens employs 419,000 people worldwide, around a third of them in Germany. Now 17,000 jobs are to be cut, the company announced on Tuesday (July 8, 2008) - 5,250 of them in Germany alone.

Most of the positions, in administration, will be cut by 2010. But the group's reorganization will also require cuts in production, where Siemens is shedding a total of 4,150 jobs. In addition, Siemens plans to sell its industrial-installation unit SIMS, which employs around 1,200 people in Germany.

The sites hardest hit by the cuts are Erlangen, Munich, Nuremberg, and Berlin. Of Siemens's three sectors, the Industry division shoulders the largest share of the job losses, with 6,350 positions to go. In the battered rail unit alone, 2,500 jobs will be lost.

"Wir müssen effizienter werden"

At the head of Siemens since July 1, 2007: Peter Löscher (Source: AP)

Siemens chief Peter Löscher defended the move: "The speed at which business is changing worldwide has increased considerably. We are adapting Siemens to it. Against the backdrop of a weakening economy, too, we must become more efficient."

According to personnel chief Siegfried Russwurm, the massive job cuts are to be carried out "as socially responsibly as possible". To that end, management wants to open negotiations with employee representatives quickly, Russwurm said on Tuesday in Munich. He did not entirely rule out compulsory redundancies, but said these could only be "the very last resort". The company would "talk through the entire toolkit available to us, such as transfer companies or partial-retirement arrangements".

Outrage on the ground

Employee representatives and local politicians at the group's major sites in Erlangen and Nuremberg reacted with outrage and incomprehension. "This is unacceptable for a company with earnings in the billions and overflowing order books," Erlangen IG Metall chief Wolfgang Niclas told the German press agency dpa on Tuesday. Erlangen's economics councillor Konrad Beugel (CSU) also expressed concern. Nuremberg IG Metall secretary Rudi Lutz said: "There is a great deal of frustration." (leix)

Siemens

Time to fix Siemens

Jul 8th 2008
From Economist.com

Peter Löscher is remodelling the engineering giant


PETER LÖSCHER’S task, when he took over as the chief executive of Siemens a year ago, was two-fold: remodel the engineering giant to boost efficiency and clean up after one of the biggest scandals in corporate history. He is now attempting to complete both jobs by purging the company from top to bottom. On Tuesday July 8th the firm officially announced its latest gambit. What had been mooted in private a couple of weeks beforehand was made public: Siemens would axe 16,750 employees, some 4% of its global workforce of 400,000.

The cuts, mainly of administrative staff, come after a clear out of top management and a restructuring which is supposed to streamline the way the German firm does business. Most of the supervisory board, management board and senior executives are new. The eight divisions that used to make up Siemens have been crunched into three—energy, health care and industry—and extraneous businesses have been off-loaded, all with the aim of saving cash. All these reorganisations and job cuts are intended to slice €1.2 billion ($1.9 billion) from total costs by 2010.

By chance this is close to the amount reckoned to have been spent illegally around the world over the past few years to win business at the expense of competitors--some €1.3 billion. Investigators probing the charges claim that a culture of bribery, endorsed by top managers, existed at Siemens. Shareholders, after discovering that the firm was using graft to win contracts, forced the departure of the previous chief executive and chairman, although both deny any knowledge of corruption. European Union watchdogs imposed a fine of €660m on the firm. An investigation by America’s Securities and Exchange Commission is still going on. Siemens can expect a huge fine when the American regulators finally impose their sanctions.

Mr Löscher is keen that the taint of the bribery scandal is washed away as rapidly as possible. He wants to devote himself fully to preparations for an economic downturn and to make the German giant as competitive as GE, its greatest rival and a company that he once worked for. German unions, appalled at the proposed job cuts, may yet elect to strike. Promises that the lay-offs will be carried out as sensitively as possible, with jobs reassigned and early retirements where possible, are unlikely to dampen union anger.

One of Mr Löscher’s tactics for revitalising Siemens is to change its culture: making it less German. Mr Löscher, an Austrian, is the first non-German boss in the company’s 161-year history. He has insisted that all important meetings are conducted in English. And he has even admitted to making a mistake: Siemens failed to build power stations on time, which resulted in a profit warning earlier this year. In the past, before Mr Löscher’s arrival, the bosses responsible would have been found comfortable jobs elsewhere in the firm. But in this case those responsible quit. Other managers who failed to live up to expectations have also left.

A leaner, less German, less corrupt firm will still hope to exploit the strategy that Mr Löscher inherited from his predecessors: taking advantage of demographic trends that are sweeping the world. Ageing populations are boosting the demand for the type of health-care equipment that Siemens produces; rapid urbanisation in Asia means that its power, water and transport businesses should thrive, even as Western economies feel the pinch.

Siemens’s shares have fallen by 36% this year as doubtful investors voted with their cash. Mr Löscher has a tricky task winning back their confidence by running a collection of businesses that live up to the German reputation for engineering excellence while operating in nearly every country on the planet.

Stem Cells from a Human-Pig Hybrid

Scientists hope to create a cell model to study heart disease.
Wednesday, July 02, 2008
By Emily Singer--from Technology Review

British scientists will use pig eggs and DNA from a human patient with heart disease to generate stem cells. If successful, these will be the first human stem cells made from animal eggs.

A shortage of human eggs--a central ingredient in the cloning process--has stalled human cloning, so scientists are studying whether animal eggs can do the trick. (Two groups in the United Kingdom have already been given permission to move forward with hybrid research.) The concept of human-animal hybrids has proved controversial, but scientists will only generate cells from the research; they won't let the embryos develop.

According to an article in the Guardian,

Although the stem cells will not contain any animal DNA, they will not be suitable for treating humans directly. Instead, the scientists will use the cells to learn how genetic mutations cause heart cells to malfunction and ultimately cause life-threatening cardiomyopathy.

"Ultimately they will help us understand where some of the problems associated with these diseases arise, and they could also provide models for the pharmaceutical industry to test new drugs," [Warwick Medical School scientist Justin] St John says. "We will effectively be creating and studying these diseases in a dish, but it's important to say that we're at the very early stages of this research and it will take a considerable amount of time."

Human-animal hybrid research has received much more attention in the United Kingdom than in the United States, largely because the research there is governed by a central regulatory board, and details of research proposals are made public. No overarching regulation exists in the United States, where scientists are mainly accountable to university ethical review boards.

Monday, July 07, 2008

Wednesday, July 02, 2008

Garbage In, Megawatts Out

Ottawa will build the first gasification facility in North America to make energy from waste.

By Peter Fairley

This week, city councillors in Ottawa, Ontario, unanimously approved a new waste-to-energy facility that will turn 400 metric tons of garbage per day into 21 megawatts of net electricity--enough to power about 19,000 homes. Rather than burning trash to generate heat, as an incinerator does, the facility proposed by Ottawa-based PlascoEnergy Group uses electric-plasma torches to gasify the municipal waste and then burns the resulting gas to generate electricity.
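
For a sense of scale, the quoted figures are easy to sanity-check. A rough Python calculation (the homes-powered conversion is the article's, not an engineering standard):

```python
# Rough consistency check on the figures quoted above.
net_mw = 21          # net electrical output, megawatts
tons_per_day = 400   # metric tons of garbage processed per day
homes = 19_000       # homes powered, per the article

print(f"{net_mw * 1000 / homes:.2f} kW average per home")         # ~1.11 kW
print(f"{net_mw * 24 / tons_per_day:.2f} MWh net per metric ton")  # ~1.26
```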

A few waste-to-energy gasification plants have been built in Europe and Asia, where landfilling is more difficult and energy has historically been more costly. But PlascoEnergy's plant would be the first large facility of its kind in North America. The company's profitability hinges on its ability to use a cooler gasification process to lower costs, as well as on rising energy and tipping fees to ensure strong revenues.

PlascoEnergy's approval marked the latest in a string of positive developments for waste gasification projects in recent weeks. Last month, Hawaii okayed $100 million in bonds to finance a waste-to-energy plant using plasma-torch technology from Westinghouse Plasma, based in Madison, PA, that is already employed in two large Japanese waste processing plants. Meanwhile, Boston-based competitor Ze-gen reported the successful ramp-up of a 10-metric-ton-per-day pilot plant in New Bedford, MA, that uses molten iron to break down waste.

Most gasification plants work by subjecting waste to extreme heat in the absence of oxygen. Under these conditions, the waste breaks down to yield a blend of hydrogen and carbon monoxide called syngas that can be burned in turbines and engines. What has held back the technology in North America is high operating costs. Plasma plants, using powerful electrical currents to produce a superhot plasma that catalyzes waste breakdown, tend to consume most of the energy they generate. As a result, the focus of plasma gasification plants has been to simply destroy hazardous wastes. "There was really no thought of being able to produce net power," says PlascoEnergy CEO Rod Bryden.

PlascoEnergy started looking at gasification for municipal solid waste five years ago, when it determined through simulation that cooler plasma torches could do the job. "The amount of heat required to separate gases from solids was much less than the amount being delivered when the purpose was simply to destroy the material," says Bryden. PlascoEnergy tested the models on its five-metric-ton-per-day pilot plant in Castellgali, Spain (jointly operated with Hera Holdings, Spain's second largest waste handler). In January, the company began large-scale trials in a 100-metric-ton-per-day demonstration plant built in partnership with the city of Ottawa.

Easy viewing: Gasification plants that convert municipal waste into energy and by-products can be built squat and stackless, according to Canadian developer PlascoEnergy. This artist’s rendering shows the 400-metric-ton-per-day facility that PlascoEnergy plans to build in Ottawa, Canada’s capital.
Credit: PlascoEnergy

Here's how it works. First, bulk metals are removed, and the rest of the shredded waste is conveyed to a 700 °C gasification chamber. Most of it volatilizes to a complex blend of gases and rises toward a plasma torch operating at 1200 °C--well below the 3000 to 5000 °C used with hazardous wastes. The plasma reduces the complex blend to a few simple gases, such as steam, carbon monoxide, and hydrogen, plus assorted contaminants such as mercury and sulfur; subsequent cleanup systems remove the steam and mercury and scrub out the soot before the syngas is sent to an internal-combustion engine generator.
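Schematically, those steps form a fixed pipeline. The temperatures below are from the article; the rest is a sketch of the sequence, not PlascoEnergy's actual control logic:

```python
# The processing chain, stage by stage, as described in the article.
PIPELINE = [
    ("remove bulk metals", None),
    ("shred and convey waste to gasification chamber", 700),   # degrees C
    ("refine volatilized gases with plasma torch", 1200),      # degrees C
    ("remove steam and mercury, scrub out soot", None),
    ("burn syngas in internal-combustion engine generator", None),
]

for stage, temp_c in PIPELINE:
    suffix = f" at {temp_c} deg C" if temp_c else ""
    print(f"-> {stage}{suffix}")
```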

The waste that doesn't volatilize forms a solid slag and drops to the bottom of the gasification chamber. The slag is then pushed to another plasma torch, which drives off the remaining carbon before the slag cools and vitrifies. The resulting glass can be blended into asphalt road surfacing or cement.

Under its deal with Ottawa, PlascoEnergy will cover the estimated $125 million that it takes to build the plant, which could be operating within three years, while the city will pay only standard tipping fees--on the order of $60 per metric ton.
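Those two revenue streams--tipping fees and electricity sales--can be roughed out against the capital cost. The tipping fee, plant size, and capital estimate are from the article; the electricity price and round-the-clock operation are assumptions for illustration:

```python
# Back-of-the-envelope revenue picture for the Ottawa plant.
TONS_PER_DAY = 400        # article figure
TIPPING_FEE = 60          # $ per metric ton (article figure)
NET_MW = 21               # article figure
POWER_PRICE = 80          # $ per MWh -- assumed, not from the article
CAPITAL_COST = 125e6      # $ -- article estimate

tipping_revenue = TONS_PER_DAY * TIPPING_FEE * 365      # ~$8.8M/yr
power_revenue = NET_MW * 24 * 365 * POWER_PRICE         # ~$14.7M/yr
annual_revenue = tipping_revenue + power_revenue

print(f"Tipping fees:   ${tipping_revenue / 1e6:.1f}M per year")
print(f"Electricity:    ${power_revenue / 1e6:.1f}M per year")
print(f"Simple payback: {CAPITAL_COST / annual_revenue:.1f} years (ignoring operating costs)")
```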

Ze-gen plans to avoid the challenge of handling complex municipal wastes by focusing first on an easier-to-handle feedstock: construction and demolition wood wastes. The company has filed seven patents on its molten metal gasification technology and waste-to-syngas process, but the equipment itself is standard for the steel industry, which uses molten iron to catalytically drive off impurities from ore. Ze-gen's pilot plant processes wood waste using a standard electrically heated steel-industry crucible full of molten iron.

Ze-gen CEO Bill Davis estimates that a full-size plant just slightly bigger than PlascoEnergy's commercial plant will produce enough syngas to create 30 megawatts of electricity, but he says that the syngas is also of sufficient quality to be used in other applications. As examples, he cites synthetic gasoline, diesel production, and refinery applications.
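On a per-capacity basis the two designs can be compared directly. Plasco's numbers are from the article; Ze-gen's plant is only described as "slightly bigger" than Plasco's 400-metric-ton facility, so the 450 tons per day below is a guess for illustration:

```python
# Net output per unit of daily capacity for the two designs.
plasco_kw_per_tpd = 21_000 / 400   # 21 MW over 400 t/day (article figures)
zegen_kw_per_tpd = 30_000 / 450    # 30 MW over an *assumed* 450 t/day

print(f"PlascoEnergy: {plasco_kw_per_tpd:.0f} kW per t/day")  # ~52
print(f"Ze-gen:       {zegen_kw_per_tpd:.0f} kW per t/day")   # ~67
```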

Thursday, July 03, 2008

Healthier Aging

Mice fed an ingredient in red wine are healthier, although they don't necessarily live longer.

By Anna Davison

Aging mice fed a chemical found in red wine were healthier in their twilight years, scientists have confirmed, although the rodents didn't necessarily live longer.

The anti-aging effects of the compound, resveratrol, mimic those of a calorie-restricted diet, which has been shown to give mice, dogs, and worms longer, healthier lives. Although resveratrol only extended the lives of obese mice in this latest study, it made all the animals healthier. They were spared the worst of some of the declines that come with old age, and they had healthier cardiovascular systems and stronger bones than did untreated animals. Non-obese mice fed resveratrol also had significantly lower total cholesterol. The study was done by the National Institute on Aging, as a follow-up to 2006 findings that resveratrol improves the health and longevity of overweight, aged mice.

Golden years: Aging mice fed a compound called resveratrol had healthier cardiovascular systems, stronger bones, and fewer cataracts.
Credit: Kevin Pearson and Kaitlyn Lewis

The study offers yet more evidence of the possible anti-aging benefits of resveratrol. "Is this too good to be true?" asks Harvard Medical School's David Sinclair, one of the authors of the paper, which appears this week in Cell Metabolism. "I think we'll know in the next few years." Sinclair initially showed the anti-aging effect of resveratrol several years ago. Sirtris Pharmaceuticals, the company that he cofounded to develop anti-aging drugs, including ones based on resveratrol, was recently sold to GlaxoSmithKline for about $720 million.

Sinclair and his colleagues gave one-year-old mice--that's middle-aged, in mouse years--high doses of resveratrol. The compound is found in the skins of grapes--which are left on the fruit when red wine is fermented but removed from white wine before fermentation--and in lower amounts in peanuts and some berries, including cranberries and blueberries.

Resveratrol had a broad range of health benefits for mice, the researchers confirmed. The mice had fewer cataracts, better bone density, healthier cardiovascular systems, and better motor coordination than did untreated animals, and resveratrol also made obese mice more sensitive to insulin.

"Let's hope it will do the same things for humans," says Mark Leid, a professor in the Department of Pharmaceutical Sciences at Oregon State University. He wasn't involved in this work.

Other studies have found that resveratrol extends life span in various organisms, including fish, flies, and yeast, and in mice fed a high-calorie diet. This study found the same effect in obese mice, although they still didn't live as long as mice on a normal diet. Resveratrol had no effect on the life span of animals fed a normal diet, although they had a healthier old age.

It's possible that in this case, the mice didn't begin resveratrol treatment when they were young enough to get the full benefits of the compound, perhaps including a longer life, Sinclair says. Also, unlike humans, mice don't die from cardiovascular disease or suffer serious consequences from brittle bones, so it's possible that resveratrol may be an even greater boon to aging humans than it is to mice, says Rafael de Cabo of the National Institute on Aging, who also worked on the project.

Sinclair's team also monitored gene activity patterns in various tissues in the treated mice and found that they were similar to those in animals on a restricted-calorie diet. Scientists have found that reducing mice's caloric intake by 30 to 50 percent while maintaining adequate nutrition can ward off age-related diseases, improve stress resistance, and slow the declines in function that come with age in many species, including mice, fish, and yeast. Mice treated with resveratrol in this study "have a younger gene-expression profile," de Cabo says.

It's not exactly clear how resveratrol works. There's evidence that the compound activates proteins called sirtuins that play a key role in controlling aging. However, a recent study using lower doses of resveratrol in mice suggests that there may be another mechanism at work, at least when lower doses are given.

The daily dose of resveratrol that Sinclair and his colleagues gave mice was the equivalent of more red wine than most people will drink in a lifetime, so "wine isn't going to do the trick," says Leonard Guarente, a professor at MIT and a pioneer in the study of sirtuins. (Guarente is on the board of Sirtris but didn't work on this study.) "There's going to have to be a supplement," he says.

Resveratrol pills are already on the market, but until more studies are done in humans, de Cabo advises caution. Even though you'll get much less of the compound by eating berries and drinking wine, he says, "I'd rather people buy grapes and red wine than take compounds off the shelf."

Sirtris is conducting clinical trials using resveratrol to treat type 2 diabetes. The preliminary results look promising, and no serious side effects have surfaced, notes Sinclair.

He and other scientists are also studying the anti-aging properties of similar compounds--some of them apparently much stronger than resveratrol. "There's a whole pipeline of better molecules coming along," Sinclair says.


Sunday, July 06, 2008

Inflation

Help, My Wallet Is Shrinking

By Dyrk Scherff

July 6, 2008. Would you have guessed it? The lemon is currently our biggest worry: among consumer goods, its price is rising fastest, up 80 percent in one year. Much-watched oil only comes further down the list: heating oil has become 57 percent more expensive.

Food and energy are what is driving today's record inflation. In Germany it stood at 3.1 percent in May and climbed to 3.3 percent in June; it is now higher than it has been in 15 years. Until recently, 1.5 percent was the norm.

Everyone's bill is different

These are, of course, averages, based on the statisticians' basket of goods for an average consumer. Products on which the average German spends a lot of money count more heavily in the overall inflation rate than others. The biggest items are housing costs, with a share of about 30 percent, followed by food, transport, and leisure at around 10 percent each.


For each individual, however, the calculation can look quite different, because spending is weighted differently depending on income, age, and personal preferences. Transport costs naturally account for a much larger share of a commuter's spending; pensioners spend more on culture, travel, and health; low earners spend less on leisure but devote a larger share to food. And students run up high telecommunications bills and go to the pub more often than others, but frequently have no car to pay for.

Commuters are hit hardest

Depending on how each individual lives, the current price increases therefore have very different effects. Commuters are hit especially hard, as a sample calculation by this newspaper shows. A commuter with monthly spending of 1,600 euros has to absorb a personal inflation rate of 3.6 percent--her costs are 57 euros higher today than a year ago.

The well-traveled pensioner with monthly spending of 2,000 euros accordingly pays 59 euros (3 percent) more than in 2007, and the health-conscious consumer spending 3,000 euros pays 78 euros (2.6 percent) more. The calculations are based on spending profiles that the Statistisches Bundesamt compiled in its income and expenditure survey (EVS), adjusted for the different lifestyles.
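The mechanics behind these figures are straightforward: a personal inflation rate is a weighted average of category price changes, with the weights taken from the person's spending profile. A sketch; the weights and price changes below are illustrative stand-ins for the EVS profiles, not the actual Statistisches Bundesamt data:

```python
# Personal inflation as a weighted average of category price changes.
def personal_inflation(weights: dict, price_changes: dict) -> float:
    """Weighted average of price changes; weights must sum to 1."""
    return sum(weights[c] * price_changes[c] for c in weights)

# Illustrative commuter profile: heavy on fuel and food.
weights = {"housing": 0.30, "food": 0.15, "fuel": 0.15,
           "leisure": 0.10, "other": 0.30}
price_changes = {"housing": 0.02, "food": 0.06, "fuel": 0.12,
                 "leisure": 0.01, "other": 0.007}

rate = personal_inflation(weights, price_changes)
monthly_spending = 1600  # euros, as in the article's commuter example
print(f"Personal inflation: {rate:.1%}")                       # ~3.6%
print(f"Extra cost: {monthly_spending * rate:.0f} EUR/month")  # ~58, near the article's 57
```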

High earners benefit

Commuters are hit hardest when they also earn little: then they spend heavily not only on fuel but also on food, and both have become much more expensive. Fuel prices alone have risen 12 percent within a year, mainly because the emerging economies are consuming a great deal of oil and driving up its price. Food prices have risen 6 percent, owing mostly to higher energy prices rather than expensive crops. Electricity and gas cost 13 percent more. Relief, as it has for years, comes from electronics such as notebooks, televisions, and mobile phones, and from foods such as onions and nuts, all of which have become more than ten percent cheaper.

High earners tend to benefit most from those price declines. Like commuters, though, they suffer particularly from high energy prices, because they generally spend more on their cars than the poor do. On balance, the effects of inflation even out across income levels. "Even though low-income households are somewhat more affected by inflation this year, the differences in inflation between income classes are negligible," notes Christian Dreger of the Deutsches Institut für Wirtschaftsforschung (DIW) in Berlin.

Perceived inflation stands at 12 percent

Germans actually judge price increases to be even larger than the Statistisches Bundesamt's figures show. The perceived inflation rate calculated by statistician Hans Wolfgang Brachinger of the University of Fribourg currently stands at around 12 percent. That is because gasoline and food, of all things, have risen the most--the two product groups everyone needs for daily life and whose prices currently deliver a particular shock at every checkout.

A look at Germany's past shows that things have been worse before. During the oil crises of the 1970s, the official inflation rate once stood at seven percent a year. Purchasing power was also weaker back then. According to calculations by the Institut der deutschen Wirtschaft in Cologne, in 1960 an employee had to work 15 minutes to afford half a liter of beer; today, despite the price having quadrupled, it takes only three minutes. The reason is that wages have risen since then, so people have to work less for the same goods (four times the price for a fifth of the work time implies hourly pay roughly 20 times higher).

No improvement in sight before 2009

The outlook, however, is not so rosy. For 2008, the markets are signaling an inflation rate of 2.6 percent for the euro zone. That is less than now, but still a high level compared with the past. Much will depend on the price of oil, which matters more for inflation than food does. Some analysts expect its rise to end in the second half of the year, as the economy cools. In the long run, though, it will remain high and keep putting pressure on prices. "Inflation is unlikely to fall back below two percent before 2009. And the low inflation of the past ten to 15 years will not return," expects DIW expert Dreger.



Text: F.A.S.
Images: dpa, F.A.Z., Statistisches Bundesamt