Saturday, December 13, 2008

1950s Pinup Model Bettie Page Dies in LA at 85

Bettie Page, pinup favorite whose photos helped energize the 1960s sex revolution, dies at 85

Bettie Page, the 1950s secretary-turned-model whose controversial photographs in skimpy attire or none at all helped set the stage for the 1960s sexual revolution, died Thursday. She was 85.

This undated photo provided Thursday, Dec. 11, 2008, by CMG Worldwide shows Bettie Page.
(CMG Worldwide/AP Photo)

Page was placed on life support last week after suffering a heart attack in Los Angeles and never regained consciousness, said her agent, Mark Roesler. He said he and Page's family agreed to remove life support. Before the heart attack, Page had been hospitalized for three weeks with pneumonia.

"She captured the imagination of a generation of men and women with her free spirit and unabashed sensuality," Roesler said. "She is the embodiment of beauty."

Page, who was also known as Betty, attracted national attention with magazine photographs of her sensuous figure in bikinis and see-through lingerie that were quickly tacked up on walls in military barracks, garages and elsewhere, where they remained for years.

Her photos included a centerfold in the January 1955 issue of then-fledgling Playboy magazine, as well as controversial sadomasochistic poses.

"I think that she was a remarkable lady, an iconic figure in pop culture who influenced sexuality, taste in fashion, someone who had a tremendous impact on our society," Playboy founder Hugh Hefner told The Associated Press on Thursday. "She was a very dear person."

Page mysteriously disappeared from the public eye for decades, during which time she battled mental illness and became a born-again Christian.

After resurfacing in the 1990s, she occasionally granted interviews but refused to allow her picture to be taken.

"I don't want to be photographed in my old age," she told an interviewer in 1998. "I feel the same way with old movie stars. ... It makes me sad. We want to remember them when they were young."

The 21st century indeed had people remembering her just as she was. She became the subject of songs, biographies, Web sites, comic books, movies and documentaries. A new generation of fans bought thousands of copies of her photos, and some feminists hailed her as a pioneer of women's liberation.

The Notorious Bettie Page

Gretchen Mol portrayed her in 2005's "The Notorious Bettie Page" and Paige Richards had the role in 2004's "Bettie Page: Dark Angel." Page herself took part in the 1998 documentary "Betty Page: Pinup Queen."

Hefner said he last saw Page when he held a screening of "The Notorious Bettie Page" at the Playboy Mansion. He said she objected to the fact that the film referred to her as "notorious," but "we explained to her that it referred to the troubled times she had and was a good way to sell a movie."

Page's career began one day in October 1950 when she took a respite from her job as a secretary in a New York office for a walk along the beach at Coney Island. An amateur photographer named Jerry Tibbs admired the 27-year-old's firm, curvy body and asked her to pose.

Looking back on the career that followed, she told Playboy in 1998: "I never thought it was shameful. I felt normal. It's just that it was much better than pounding a typewriter eight hours a day, which gets monotonous."

Nudity didn't bother her, she said, explaining: "God approves of nudity. Adam and Eve in the Garden of Eden, they were naked as jaybirds."

In 1951, Page fell under the influence of a photographer and his sister who specialized in S&M. They cut her hair into the dark bangs that became her signature and posed her in spiked heels and little else. She was photographed with a whip in her hand, and in one session she was spread-eagled between two trees, her feet dangling.

"I thought my arms and legs would come out of their sockets," she said later.

Moralists denounced the photos as perversion, and Sen. Estes Kefauver of Tennessee, Page's home state, launched a congressional investigation.

Page quickly retreated from public view, later saying she was hounded by federal agents who waved her nude photos in her face. She also said she believed that, at age 34, her days as "the girl with the perfect figure" were nearly over.

She moved to Florida in 1957 and married a much younger man, as an early marriage to her high school sweetheart had ended in divorce.

Cultural Icon

Her second marriage also failed, as did a third, and she suffered a nervous breakdown.

In 1959, she was lying on a sea wall in Key West when she saw a church with a white neon cross on top. She walked inside and became a born-again Christian.

After attending Bible school, she wanted to serve as a missionary but was turned down because she had been divorced. Instead, she worked full-time for evangelist Billy Graham's ministry.

A move to Southern California in 1979 brought more troubles.

She was arrested after an altercation with her landlady, and doctors who examined her determined she had acute schizophrenia. She spent 20 months in a state mental hospital in San Bernardino.

A fight with another landlord resulted in her arrest, but she was found not guilty because of insanity. She was placed under state supervision for eight years.

"She had a very turbulent life," Todd Mueller, a family friend and autograph seller, told The Associated Press on Thursday. "She had a temper to her."

Mueller said he first met Page after tracking her down in the 1990s and persuaded her to do an autograph signing event.

He said she was a hit and sold about 3,000 autographs, usually for $200 to $300 each.

"Eleanor Roosevelt, we got $40 to $50. ... Bettie Page outsells them all," he told The AP last week.

Born April 22, 1923, in Nashville, Tenn., Page said she grew up in a family so poor "we were lucky to get an orange in our Christmas stockings."

The family included three boys and three girls, and Page said her father molested all of the girls.

After the Pages moved to Houston, her father decided to return to Tennessee and stole a police car for the trip. He was sent to prison, and for a time Betty lived in an orphanage.

In her teens she acted in high school plays, going on to study drama in New York and win a screen test from 20th Century Fox before her modeling career took off.


Associated Press writers Denise Petski and Raquel Maria Dillon contributed to this report.

Copyright 2008 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.

On the Darwin Year 2009

Pleading for a Darwin from Within

By Frank Schirrmacher

What holds living things together: Charles Darwin explored the fundamental connections among living beings

December 12, 2008. The second day of Christmas fell on a Monday. The birth of Christ had just been celebrated, and now England's leading class had the profane world lying on the breakfast table in the form of the morning edition of the "Times." On page eight, next to the foreign political news, stood an unusually long, extraordinarily laudatory review of a book that had just appeared. The headline read: "Darwin on the Origin of Species," and the first paragraph stated: "We must expect to be introduced to new views on nature and the relations of its inhabitants, because science has received new material for new general theories."

On such dove's feet, in 1859, one of the most consequential revolutions in the history of science came scratching at the living-room doors, in the gardens, on the fields, in the forests of the world. Almost without detour, and despite considerable hostility, it penetrated directly into the interiors of society and the inner worlds of its inhabitants. The world had not, in the classical sense, been waiting for Newton or for Einstein. Darwin's revolution was delivered to the house with the "Times." Already the generation of newspaper readers on that Christmas day, and even more so that of their children, could soon no longer see people, trees, plants, or earthworms other than in the light of those "new views" that the Christmas-day review threatened rather than promised them.

A Monstrous History of Influence

When the world in 2009 commemorates the 150th anniversary of the first publication of the "Origin of Species" and the 200th anniversary of Darwin's birth, much will be said about the reception history of a theory that, even in its origins, even in the "Times" hymn of praise, was also the work of carefully orchestrated marketing. The history of this theory's influence is almost beyond surveying; monstrous is the network of genuine and false legitimations that it founded over the course of a hundred and fifty years.

The theory, at first a descriptive one, became fatal when it became application: it notoriously served as a justifying ideology for National Socialist racial policy, as well as the rationale at the core of modern social Darwinist theories. But even beyond that, it awakened an awareness, entirely foreign to earlier generations, of quickly doing oneself what nature takes half an eternity to accomplish. The theory that showed people the endless stretches of time nature needs to adapt living beings to changed environments did not reckon with human impatience; indeed, its astonishingly fast and nearly complete reception within a few years is itself a symptom of this impatience.

The Discovery of Fundamental Commonalities

Genetic changes can decide over life and death, over the weal and woe of entire species. Since Darwin, every child learns that the engines of evolution are random mutation and directed selection. Paradoxically, his own work has become just such a product of selective breeding in the hands of those born after him. The aim of this breeding, as of any breeding, was to give Darwin's brilliant discovery a goal of its own: the notion that evolution is a process leading to the gradual perfection of species, and thus of society, and thus of man, is one of the most consequential ideas of racist, but also of general cultural and political, theories of evolution, and it has nothing to do with Darwin. The contamination of his work by its appliers (it was Herbert Spencer who first used the theory of evolution for his theory of society) begins literally with the first publication of his "Origin of Species."

Now and then, minor cultural tremors are triggered when influential works are freed from their layers of reception and read again as on the first day. If all the signs are not deceiving, this process has been under way for Darwin since Janet Browne's great biography. The Darwin who can be defended against his appliers is not the theorist of the struggle for life, of racism, and of the domination of nature. On the contrary: he is a thinker who turns to nature and its creatures with almost unexampled care in order to understand varieties, distinctions, and differences. The man who made earthworms the true masters of world history, and who had hornets' nests sent to him in order to understand nature's astonishing symmetries, established a connection between nature and society that first awakens the awareness that their fate is a shared one.

Revisions are great clearing-out and decluttering operations. In the present case they are not needed at all. It is enough to go back to the original texts. It is enough, after reading even just the letters of this prodigious letter writer, to find that Darwin's world view holds a new message in the two hundredth year of his birth: that of sympathy with creation.

Text: F.A.Z.
Images: dpa

Friday, December 12, 2008

What's Next for Computer Interfaces?

Thursday, December 11, 2008

Touch tricks for small and large displays could be the next big thing.

By Kate Greene

Earlier this week, the humble computer mouse celebrated its 40th birthday. While surprisingly little has changed since Doug Engelbart, an engineer at Stanford Research Institute, in Palo Alto, CA, first demonstrated the mouse to a skeptical crowd in San Francisco, we may have already seen a few glimpses of the future of computer interfaces. If so, over the next few years, the future of the computer interface will likely revolve around touch.

Thanks to the popularity of the iPhone, the touch screen has gained recognition as a practical interface for computers. In the coming years, we may see increasingly useful variations on the same theme. A couple of projects, in particular, point the way toward interacting more easily with miniature touch screens, as well as with displays the size of walls.

Tiny touch: A device called nanoTouch has a touch-sensitive back to make it easier to view the front-side display. Here, a credit-card-size gadget shows an image of a person’s finger on the back to help him move a cursor around the screen. 
Credit: Patrick Baudisch

One problem with devices like the iPhone is that users' fingers tend to cover up important information on the screen. Yet making touch screens much larger would make a device too bulky to slip discreetly into a pocket.

A project called nanoTouch, developed at Microsoft Research, tackles the challenges of adding touch sensitivity to ever-shrinking displays. Patrick Baudisch and his colleagues have added touch interaction to the back of devices that range in size from an iPod nano to a watch or a pendant. The researchers' concept is for a gadget to have a front that is entirely a display, a back that is entirely touch sensitive, and a side that features buttons.

To make the back of a gadget touch sensitive, the researchers added a capacitive surface, similar to those used on laptop touch pads. In one demonstration, the team shows that the interface can be used to play a first-person video game on a screen the size of a credit card. In another demo, the device produces a semitransparent image of a finger as if the device were completely see-through.

When a transparent finger or a cursor is shown onscreen, people can still operate the device reliably, says Baudisch, who is a part-time researcher at Microsoft Research and a professor of computer science and human-computer interaction at the Hasso Plattner Institute at the University of Potsdam, in Germany.

Details of the device will be presented at the Computer Human Interaction conference in Boston next April. The researchers tested four sizes of square displays, measuring 2.4 inches, 1.2 inches, 0.6 inches, and 0.3 inches wide. They found that people could complete tasks at roughly the same speed using even the smallest display, and that they made about the same number of errors using all sizes of the device. Furthermore, the back-of-the-screen prototypes performed better than the smallest front-touch device.

Baudisch is encouraged by the results and is in the process of establishing guidelines for building rear-touch interfaces into tiny devices. "Envision the future where you buy a video game that's the size of a quarter . . . and you wear electronic pendants," he says.

Jeff Han, founder of a startup called Perceptive Pixel, based in New York, says that Baudisch's concepts are impressive, but he's more interested in using touch technology on large displays. He has already had some success: he has supplied wall-size touch screens to a number of U.S. government agencies and several news outlets. In fact, his company's touch screens were used by news anchors during the November presidential election to show viewers electoral progress across the country.

Traditionally, large touch screens have been built in the same way as smaller ones, making them very expensive to create. Han's displays take advantage of a physical phenomenon called total internal reflection: light shone into an acrylic panel, which doubles as the display, remains completely contained within the material. When a finger or another object comes in contact with the surface, light scatters out and is detected by cameras positioned just behind the display. Because a thin layer of material covers the acrylic, the amount of scattered light also depends on how much pressure is applied to the display.

In a paper presented in October at the User Interface Software and Technology Symposium, in Monterey, CA, Han's colleague Philip Davidson describes software that takes touch beyond the surface, using pressure to add another dimension to a screen.

Davidson created software that recognizes how hard a person is pressing a surface. If a user presses hard enough on an image of, say, a playing card and slides it along the display to another card, it will slide underneath. Additionally, if a person presses hard on one corner of an object on the screen, the opposite corner pops up, enabling the user to slide things underneath it. This provides a way to prevent displays from getting too cluttered, Davidson says.

However, Davidson also notes that pressure sensitivity should not make the device uncomfortable to use, and he has studied the natural fatigue that a person feels when she presses on a display and drags an object from one side to the other. The new pressure-sensitive features are expected to ship by the middle of next year, Davidson says.

Report Predicts Job Growth In Green Sector

InTech (12/11) reported, "There are certain types of manufacturing jobs that are going away forever. However, green collar jobs present the next frontier for US manufacturing," a new report from Duke University indicates. "Highlighting the direct linkages between low-carbon technologies and US jobs, Duke researchers" suggested that "US manufacturing should grow in a low-carbon economy. Their report, 'Manufacturing Climate Solutions,' provides a detailed look at the manufacturing jobs that already exist and would come about when the US takes action to limit global-warming pollution." According to the report, "there are five carbon-reducing technologies with potential for future green job creation: LED lighting, high-performance windows, auxiliary power units for long-haul trucks, concentrating solar power, and Super Soil Systems (a new method for treating hog wastes)."

Wednesday, December 10, 2008

Advertisers could benefit from analyzing the early popularity of online content.

A Winning Web Formula

By Kate Greene

As online advertising money starts shrinking in the economic downturn, some researchers are looking for ways to make the most of every single dollar. Recent research from HP Labs in Palo Alto, CA, shows that it's possible to predict, with reasonable accuracy, how popular an online video clip or news story will become simply by looking at how well it does within the first few hours of being posted. If content providers can predict how many views a video or article will get over a set period, then they can match the most popular items with specific high-dollar ads. Additionally, content providers can place potentially popular content in eye-catching spots on their site, further increasing the number of people who see it and the accompanying ads.

"There's an obvious byproduct of what we're doing here for advertising," says Bernardo Huberman, a senior fellow at HP who led the work. "This will allow people who advertise to at least start getting a sense of what they want [...] if very early on you can tell if people will like a video or news story."

Huberman and his colleagues looked at historical data gathered from the video site YouTube, and from Digg, a news aggregator that lets readers' votes determine which stories become most prominent. The researchers applied mathematical models to these data sets, determining a "popularity curve" for different items. These curves allowed the researchers to extrapolate the future popularity of an item using only information about its popularity over the first few hours or days.

HP isn't the only organization trying to predict the popularity of content. Researchers at Google, Yahoo, Microsoft, and IBM, to name a few, have all invested resources and money in researching the problem. A few years ago, Duncan Watts, now a researcher at Yahoo, showed that the quality of a song is a very poor indicator of its eventual popularity, and that long-term song popularity--as measured on music-sharing networks--can be determined fairly early on, when a sort of popularity trajectory is determined.

In the case of Digg, Huberman says that within the first few hours it is clear whether a story will become popular or not (depending on how many "diggs"--or votes--it receives from the site's community of readers). Factoring in the time of day that a story is submitted (a noontime story will get, on average, twice as many early diggs as a story submitted at midnight), the researchers found that if a story receives a low number of diggs, it has relatively little hope of being one of the top viewed stories of the day. Conversely, if a story receives hundreds of diggs in the first hours, it's likely to be much more popular.

The popularity of YouTube videos follows a similar pattern, albeit on a longer timescale. By looking at the number of views a video gets on its first day, the researchers could determine the likelihood of reaching a certain level of popularity after a longer period. For instance, if only 15 people view the video during the first day, it's unlikely to become a big hit. However, if more than 100 people see it on the first day, then there's a high likelihood that it will be seen tens of thousands of times more.

In the case of both YouTube and Digg, Huberman notes that predictions become more accurate as data are considered over longer periods. Within seven hours, for instance, it is possible to predict a story's future popularity on Digg nearly perfectly; the same holds for YouTube videos after 20 days.
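The extrapolation idea can be sketched with a toy model. The figures below are invented, and a straight-line fit in log space is only a stand-in for HP's actual models; the premise, following the article, is that an item's eventual popularity scales predictably with its early popularity.

```python
import math

# Invented historical data: (views on day 1, views by day 30) for past items.
history = [(20, 900), (50, 2600), (120, 7000), (300, 20000)]

# Fit a straight line in log-log space: log(final) ≈ slope*log(early) + intercept.
xs = [math.log(early) for early, final in history]
ys = [math.log(final) for early, final in history]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

def predict_final(early_views):
    """Extrapolate an item's eventual popularity from its early view count."""
    return math.exp(intercept + slope * math.log(early_views))

# A new video with 100 first-day views: rough estimate of its day-30 total.
print(round(predict_final(100)))
```

A content provider using a model like this could rank freshly posted items by predicted final popularity and attach the highest-value ads to the projected winners, which is the byproduct Huberman describes.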

Huberman says that other sites, such as online stores, would need their own ways of determining the popularity of their content because each site has unique characteristics. But advertisers, he says, armed with popularity predictions, could quickly determine which products might "go viral" and then tag special ads to those.

"I think popularity prediction is an interesting topic in general," says Claudia Perlich, researcher at IBM's Watson Research Center in Yorktown Heights, NY. "And I certainly see the value [of] predicting popularity for advertising." However, she notes that it's only one part of the advertising puzzle. Increasingly, she says, advertisers are interested in the type of people who are viewing the content, and they find it useful to know the path that a Web surfer has taken before arriving at a site, so that he or she can be better targeted with a specific ad.

Perlich also raises questions about the two systems under study. "I have the slight worry that the results are driven by underlying technology," she says. It could be possible that the researchers are simply measuring the proprietary process in which stories become popular on Digg and some of the video-promotion features of YouTube. This is the problem, she says, with doing an experiment in which the systems are proprietary and it's impossible to know exactly how they work.

Huberman is confident that his methodology can predict popularity independently of specific algorithms used by the sites. He is in the process of analyzing the popularity of people on Twitter, a microblogging service that lets people post short messages to one another and subscribe to updates. Better understanding of these social networks, he says, could lead to entirely new business models. "The only thing that's of value today is people's attention," he says. "An immense amount of money is spent on trying to draw our attention to things."

How a 1960s sociology experiment could hold the key to better Internet routing.

Tuesday, December 09, 2008

The Social Life of Routers

By Erica Naone

Just like an old-fashioned piece of mail, data traveling over the Internet normally follows a predictable path. As the Internet continues to grow, however, experts have begun to worry that current routing protocols will be unable to cope with increased congestion. And so, as researchers search for new solutions, some are taking inspiration from a famous social experiment that called on people to deliver mail using only a network of friends.

For many years, Internet routers have used a standard known as the border gateway protocol (BGP) to map out the path that data takes. BGP requires each router to store a list of network addresses, known as a routing table, which tells it where to forward packets of information (based on a complete picture of that network). But as the number of Internet-connected machines increases, routing tables grow longer and need to receive updates more frequently, potentially slowing some traffic to a crawl. A major sticking point for the BGP protocol is that every time part of the network changes, every router must process an update.
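The lookup a router performs against its table can be sketched as a longest-prefix match: the most specific prefix that contains the destination address wins. The prefixes and next-hop names below are invented for illustration; real BGP tables hold hundreds of thousands of entries and carry path attributes well beyond a single next hop.

```python
import ipaddress

# Toy routing table: prefix -> next hop (all values invented for illustration).
TABLE = {
    "0.0.0.0/0": "upstream",        # default route: matches everything
    "203.0.113.0/24": "peer-A",
    "203.0.113.128/25": "peer-B",   # more specific, so it wins inside its range
}

def next_hop(dst: str) -> str:
    """Forward to the next hop of the longest (most specific) matching prefix."""
    addr = ipaddress.ip_address(dst)
    matches = [
        (net.prefixlen, hop)
        for net, hop in ((ipaddress.ip_network(p), h) for p, h in TABLE.items())
        if addr in net
    ]
    return max(matches)[1]  # highest prefix length = longest match

print(next_hop("203.0.113.200"))  # falls in the /25 -> peer-B
print(next_hop("198.51.100.7"))   # only the default matches -> upstream
```

The scaling problem the article describes follows directly from this structure: every router must hold an entry for every reachable prefix, so the table, and the churn of updates to it, grows with the network.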

This is where the work of sociologist Stanley Milgram could help out. Milgram carried out experiments in the 1960s that helped make famous the idea of "six degrees of separation." Milgram gave volunteers the task of forwarding a letter to a stranger by sending it to friends or acquaintances that might be one step closer to the target. Milgram measured how many hops there were between the sender and the end recipient, and found it to be, on average, 5.2. (The term six degrees of separation was coined later by playwright John Guare.)

In 2000, inspired by Milgram's work, Jon Kleinberg, a professor of computer science at Cornell University, in New York, created a mathematical model for routing information across any kind of network. Kleinberg says that he drew from the fact that Milgram "demonstrated not just that short paths were present in large social networks, but that people--operating without a global view of the network--could efficiently find them."

Now, research from Marián Boguñá at the University of Barcelona and colleagues suggests that the approach could indeed be applied to real-world networks, including the Internet's routing system. In work published recently in Nature Physics, Boguñá and his colleagues argue that the work of Kleinberg and others can be applied to real-world networks and, specifically, could be used to design a protocol that allows routers to keep track of less information about a network, thereby reducing congestion.

The key lies in identifying "hidden" bits of information that could help routers decide where to send a packet, Boguñá says. The people in Milgram's experiment used such information to figure out how to forward their letters. Instead of passing them on to a random friend, they identified criteria, such as a person's profession, that meant that they might be a step closer to the intended recipient. The work of Boguñá and his colleagues focuses on identifying and exploiting hidden information on other kinds of networks. In the case of Internet routing, the physical location of a router or the type of information it last handled could provide useful clues for forwarding information toward a final destination without knowing the complete structure of the network.
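The Milgram-style strategy can be sketched on a toy network. The graph, node names, and 2-D coordinates below are all invented; they stand in for the "hidden metric" (Boguñá's work actually uses a hyperbolic space). Each node knows only its own neighbors plus the coordinates, and forwards greedily to whichever neighbor is closest to the destination, with no global view of the network.

```python
import math

# Invented toy network: hidden coordinates and adjacency lists.
POS = {"a": (0, 0), "b": (2, 1), "c": (4, 0), "d": (5, 3), "t": (6, 1)}
NEIGHBORS = {"a": ["b"], "b": ["a", "c", "d"], "c": ["b", "t"],
             "d": ["b", "t"], "t": ["c", "d"]}

def dist(u, v):
    """Distance in the hidden metric space."""
    (x1, y1), (x2, y2) = POS[u], POS[v]
    return math.hypot(x1 - x2, y1 - y2)

def greedy_route(src, dst):
    """Forward greedily: each hop goes to the neighbor nearest the destination."""
    path = [src]
    while path[-1] != dst:
        here = path[-1]
        nxt = min(NEIGHBORS[here], key=lambda n: dist(n, dst))
        if dist(nxt, dst) >= dist(here, dst):
            return path  # stuck at a local minimum: greedy forwarding failed
        path.append(nxt)
    return path

print(greedy_route("a", "t"))
```

Like Milgram's letter-holders picking the acquaintance who seemed a step closer to the target, each router here needs no routing table at all, only its neighbors and the metric, which is exactly the reduction in per-router state the researchers are after.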

Kleinberg says that the work is "a very elegant approach to exploring the underlying structures that make navigability possible in real networks." He adds that Boguñá's group's "techniques have the potential to inform a new class of routing strategies in which global information is replaced by local strategies that follow hidden metrics."

However, Jon Crowcroft, a professor of communication systems at the University of Cambridge, U.K., warns that, while Boguñá's group has done good work in applying Kleinberg's theoretical models, it's too early to tell if the approach would actually work. "When you look at it in reality," he says, "there are other additional constraints," such as the requirements of particular applications. Nonetheless, Crowcroft believes that this direction is "absolutely worth exploring" and says that he would like to see the researchers try some real-world experiments.

Boguñá himself admits that his work is "very preliminary." The next step, he says, is to identify what "hidden metrics" could be used for Internet routing. But Boguñá expects that this could take several more years to figure out.

Monday, December 08, 2008

As the World Economy Sinks, So Does Global Shipping

Long Beach Vehicle Distribution Center near a cargo ship at the Port of Long Beach
David McNew / Getty

Take a stroll along your proverbial Main Street, and you may see signs of the global financial crisis moving into the real economy. You can also head straight to the nearest major port. There may be no more serious warning of the potential reach and depth — and duration — of the worldwide economic slowdown than the sinking fortunes of the shipping industry.

From reduction of traffic on key trade routes to the cancellation of new ship orders to plummeting cargo rates, transport by sea is a very real gauge of declining demand across the globe for raw materials and consumer products. With some 77 percent of worldwide trade arriving by sea, "shipping is the thermometer of globalization," notes Professor Oliviero Baccelli, a transportation economist at Bocconi University in Milan. "It allows us to take the broadest view of the health of the worldwide economy."

And from the docks of Rotterdam to Seattle to Shanghai, the troubling symptoms abound. The Baltic Dry Index, which measures world shipping charges for raw materials, has plummeted from a high of 11,793 in May to 672, its lowest level since soon after the index was established in 1985. Daily rental rates for the largest Capesize category of carriers have plunged from $234,000 just two months ago to $2,320, a staggering fall of 99 percent. Jeremy Penn, president of the London-based Baltic Exchange, cautions that bulk rates are prone to fluctuation and have been hit particularly hard this time by skittishness in financial markets, as the letters of credit necessary for commodity purchases have grown harder to come by. Still, Penn says the recent drop in bulk cargo fees is unprecedented, citing declining worldwide demand, particularly as the economic slowdown reaches China. "The violence of the drop (in rates)," he says, "is more extreme than anything we've ever seen before."

Despite the fluctuations in spot purchase prices, Baccelli says the industry enjoys a certain stability through tough times thanks to the enduring presence of family-run behemoths and an ever higher concentration of control. "You have families who have hundreds of years of experience, who have lived through these situations and equipped themselves, and are resistant to speculation."

Even seasoned ship owners, however, are facing a storm unlike any they've seen before. Right now, the tangible signs of a lasting retrenchment are popping up in ports and along sea routes across the globe. The CKYH consortium of two of the largest Chinese and two South Korean shipping firms has just announced that six routes will see either a suspension of service or reduction in capacity. Service will be halted on the Mediterranean-Asia-America pendulum service early next year, while CKYH has already reduced its capacity from Asia to the U.S. East Coast by some 18 percent.

Perhaps even more worrisome for the long-term outlook is the rush to cancel orders for new ships. In November, New York-based Genco Shipping and Trading was willing to kiss a $53 million deposit goodbye in order to get out of a half-billion-dollar deal to buy six new vessels. Clarkson Plc, the world's largest shipbroker, announced that while 378 ships were ordered in October 2007, only 37 were ordered in October 2008. Big cancellations could set the stage for a shortage of ships once the world economy recovers. But the market today is telling ship owners but one thing: glut.

Kriton Lendoudis, managing director of Athens-based Evalend Shipping company, says that in Greece there are some 100 applications by ship owners to lay up their vessels. Evalend, which specializes in medium-sized ships, has so far avoided the worst and still expects $200 million in profits for 2008, but Lendoudis concludes: "The next 24 months do not look very optimistic."

The free fall of shipping-charge prices and the mothballing of new vessels are not the only measures of the perfect storm of extra-tight credit and worldwide economic retrenchment that is now hitting land. Like oil industry investors who are said to count passing tankers to get early estimates of worldwide supply and demand, the shipping industry also looks for pieces of pure anecdotal evidence in places like the southwest tip of England. Just upstream on the River Fal from the port of Truro, harbor master Andy Brigden runs a service mooring ships when they are put temporarily out of service. Over the past month, Brigden has had a sudden uptick of inquiries from shipping companies looking for long-term berth space. "It had been quiet here for many years, just a few seasonal moorings," said Brigden, reached by telephone in his harbor office. "In my experience, this is the largest number of requests we've had in such a short time."

Economists will continue to chart the wild swings of stock markets and raw-material prices. Folks on Main Street will be counted in retail sales figures and unemployment rates. But as long as business is booming for harbor master Brigden, there will be little hope of recovery on the horizon.
With reporting by Emmanouil Karatarakis / Athens and Michael Schuman / Hong Kong

Men Losing Jobs At Faster Rate Than Women, Study Finds. & Article Details "12 Skills That Employers Say They Want."

Men Losing Jobs At Faster Rate Than Women, Study Finds.

The Detroit News (12/8, Gavin) reports, "Men are losing jobs at far greater rates than women as the industries they dominate, such as manufacturing, construction, and investment services, are hardest hit by the downturn." Roughly "1.1 million fewer men are working in the United States than there were a year ago, according to the Labor Department. By contrast, 12,000 more women are working." Economists say that the "gender gap is the product of both the nature of the current recession and the long-term shift in the US economy from making goods, traditionally the province of men, to providing services, in which women play much larger roles." The Detroit News notes, "The divide is far starker than in the last recession, when the technology crash battered professional and technical sectors in which women now hold more than 40 percent of jobs."

Article Details "12 Skills That Employers Say They Want."

The Kansas City Star (12/6) reported on "12 skills that employers say they want," according to ACT, "a nonprofit research and information service." The article included a list of the twelve skills, which include thinking "before speaking and [planning] before acting," the ability to "follow through on tasks without being distracted or bored," maintaining "composure and rationality under stress," and having "high aspirations and" the will to "work to achieve those goals." Having "an accurate self-analysis" of one's current abilities is also recommended. The article added that "having those 'soft skills' is a very big deal in today's workplace, where a lot depends on interpersonal relationships," and noted that "many employers are measuring those traits through pre-employment testing. (Fair disclosure: ACT is a purveyor of such assessments.)"

Bush Stands Ground Over Detroit Aid.

The Wall Street Journal (12/8, A16, Hitt) reports, "In the standoff between the lame-duck Republican president and an ascendant Democratic speaker of the House, an odd thing happened: The president won." The Journal adds that "the Detroit rescue package being negotiated by congressional leaders is coming together in defiance of the prevailing political winds, representing a rare concession by House Speaker Nancy Pelosi." President Bush "stood his ground, demanding that any rescue of the Detroit automakers be financed by $25 billion in already-approved loans intended to help the industry meet higher fuel-economy standards." The President "essentially won that argument Friday, when Rep. Pelosi, who had wanted to tap the $700 billion pool created to calm financial markets, backed down."

The Washington Post (12/8, A1, Montgomery) notes in a front-page article that "the Bush administration is calling for a car czar within the Commerce Department who would be empowered to force the automakers to restructure or force them into bankruptcy." Democrats "want to give the companies the money first, permitting them to survive through the end of March, and name an administrator later."

The Politico (12/8, Rogers) reports, "An estimated $15 billion emergency loan package for the auto industry -- now taking shape in Congress -- leaves open the door for the incoming Obama administration to add more money if needed this winter by tapping into the Treasury's financial rescue package." While Bush "has refused to make available the same Treasury funds, the White House has signaled tacit support of the legislative provision as part of the bargaining aimed at winning House and Senate passage this week."

Senator: Deal On GM, Chrysler Rescue Loans Could Be Unveiled Soon. The Detroit Free Press (12/7, Hyde, Snavely) reported, "A compromise pact to provide at least $15 billion in loans to General Motors Corp. and Chrysler LLC could be unveiled within the next 24 hours, Michigan Sen. Carl Levin (D-MI) said Sunday." Levin said on Fox News, "I think they're very close to a deal, I'm very confident there will be a deal, and that will happen within 24 hours." Levin also "said he expected the bill to be introduced in the next couple of days. The Senate is set to come back in session Monday, with the House planning to return Tuesday."

Reporting on the expected deal, AFP (12/8, Sheridan) adds, "Democrats said a deal was imminent after a weekend of negotiations with the White House on a short-term loan package of about 15 billion dollars, but Republicans warned that a tough debate awaited the proposed bill this week." GOP Sen. Richard Shelby (AL) said on Fox News, "This is a bridge loan to nowhere." He "added he hoped for an 'extended debate' on the bill but declined to say whether Republicans would try to block it with a filibuster."

Obama: Detroit's Big Three Lack "Sustainable Business Model." The Financial Times (12/8, Dombey) reports that Obama "said...his administration would seek to restructure the vehicle industry by simulating the effects of Chapter 11 bankruptcy protection while providing government aid to keep it functioning." The President-elect went on to say the companies "did not have 'a sustainable business model right now.'" The Times adds, "The president-elect's comments are particularly significant because the emerging compromise, on which Congress is likely to vote this week, would in effect defer a long-term decision on the industry until the Obama administration and a new Congress are in place." Obama also said yesterday, "We are going to have to figure out ways to put the pressure on -- the way a bankruptcy court would demand accountability, demand serious changes -- but do so in a way that allows them to keep factory doors open."

Mentioning the lack of a sustainable business model, the New York Times (12/8, A1, Herszenhorn, Calmes) adds on its front page, "President-elect Barack Obama, whose transition team has been involved in the talks, made starkly clear in an interview and at a brief news conference on Sunday that any aid to the Big Three auto companies should not come without significant concessions." The Politico (12/8, Martin) also reports on Obama's comments.

Oil Price Threatens Biofuel Firms

Friday, December 05, 2008

Falling oil prices could cause some alternative-fuel startups to fail.

By Kevin Bullis

Credit: Technology Review

The price of oil has dipped to levels that could be far too low for many advanced-biofuel startups to succeed, especially those that attracted investment this summer, while oil was well above $100 a barrel. Tight credit markets will also make it difficult for advanced biofuel companies to move ahead with plans for scaling up technologies and building commercial-scale production plants.

Attempts at developing alternative fuels in the 1980s largely failed after oil prices plummeted, and the recent drop in oil prices has many concerned that something similar could happen today. By Friday, the price of oil had fallen to $40.81 a barrel, down from a high of $145 in July. Those earlier high oil prices led venture capitalists to invest in many companies that would require oil prices to stay high in order to compete, says David Berry, a partner at Flagship Ventures. This summer, he says, "people were very happy to say, 'We're targeting $80 a barrel for oil, and we think we're going to make a ton of money.'"

This September, at the EmTech08 conference, Berry predicted that if oil prices were to fall, many of these companies could fold. His own company, Flagship Ventures, has invested only in biofuel startups whose breakeven requires oil prices of $45 or lower. "When we thought about investments, we said we're not going to make a single investment in something that has its break-even point above $45 a barrel," he said, speaking in September. "In that way, we think we can be pretty insensitive to what the price of oil will be over time. If the price of oil falls to $60 or $50, from our perspective, we're going to sit here and say, this is where we thought things might end up."

Berry now says, however, that most of the companies that Flagship has invested in will still be able to hit the break-even point with oil prices lower than $45. One of these companies, Boston, MA-based Mascoma, could still make a profit with oil at $20 a barrel, says Bruce Jamerson, Mascoma's CEO, but only because current government incentives help them compete with gasoline. These include a $1.01 subsidy for every gallon of advanced biofuels, fuels made from nonfood crops, as well as federal regulations that require oil companies to sell certain amounts of advanced biofuels.
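As a rough illustration of how a per-gallon subsidy shifts the crude price at which a biofuel producer can compete: the sketch below assumes gasoline prices roughly track crude at 42 gallons per barrel plus a fixed refining margin, and that the producer nets the fuel price plus the $1.01 subsidy. The production-cost and margin figures are hypothetical round numbers for illustration, not Mascoma's actual economics.

```python
GALLONS_PER_BARREL = 42  # US barrel

def breakeven_crude_price(production_cost, subsidy=1.01, refining_margin=0.30):
    """Crude price ($/bbl) at which ethanol produced at `production_cost`
    ($/gal) matches the gasoline price, under the simplification that
    gasoline sells for roughly crude/42 plus `refining_margin` ($/gal).
    Both `production_cost` and `refining_margin` are illustrative."""
    return (production_cost - subsidy - refining_margin) * GALLONS_PER_BARREL

# A hypothetical producer making ethanol at $1.80/gal:
print(breakeven_crude_price(1.80))  # ≈ 20.6, i.e. roughly $20/bbl crude
```

Under these assumed numbers the subsidy pulls the break-even crude price down to roughly $20 a barrel, which is consistent in spirit with Jamerson's claim that incentives are what let Mascoma compete at $20 oil.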

In the long term, both Berry and Jamerson think oil prices will be higher. Anticipated production cuts from OPEC would likely keep oil prices above $50, Berry says. "If you look at the price points that OPEC has put pressure on, that has ranged between $50 and $80. And so that gives that range some reasonable set of legs."

"I don't think that the oil prices are going to stay this low for a long time," Jamerson says. "My view is it will probably fluctuate between $75 and $100 a barrel, once we get past the intense part of this downturn."

Even at those prices, some biofuel startups may be hard pressed to compete. This summer, Berry says, "about $200 to $300 million was going into algae companies. And algae has long been shown to have a break-even point between $90 and $120 per barrel of oil."

Jamerson is more concerned about tight credit markets than oil prices. Mascoma is still years away from commercial production, so, as with other advanced biofuel companies, today's prices don't have an immediate impact. But Mascoma is still working out the financing for a large 40-million-gallon ethanol plant in Kinross, MI. This month, it will start production at a smaller $20-million facility near Utica, NY, that will produce 200,000 gallons of ethanol annually. To save cash, the company recently laid off 10 percent of its employees and is slowing down orders for equipment and decreasing travel budgets.

Jamerson, however, remains optimistic. Concerns about financing are "tempered by increasing optimism about support for renewable fuels from the new Obama administration," he says. After the current recession, he says, "Advanced biofuels will be one of the first sectors to come back."

Broad Use of Brain Boosters?

Monday, December 08, 2008

Use of drugs to enhance memory and concentration should be permitted, experts say.

By Emily Singer

Off-label use of stimulants, such as Ritalin, is on the rise among college students. Studies show that 5 percent to 15 percent of students use prescription drugs as study aids, and surveys suggest the practice may be common among academics as well. The trend has sparked debates over how and when these cognitive enhancers should be used. Military personnel routinely use stimulants while on active duty, but should that practice also be permitted among surgeons working long shifts? What about scientists working late nights in the lab? Or students taking exams?

A commentary appearing today online in the journal Nature advocates for broad access to brain-boosting drugs. According to the piece, written by a group of ethicists, psychologists, and cognitive neuroscientists, "cognitive enhancement, unlike enhancement for sports competitions, could lead to substantive improvements in the world." While opponents have argued that the use of performance-enhancing drugs is unfair and could undermine the value of hard work, the authors say that these drugs fall into the same category as more common efforts to increase brain function, such as drinking a cup of coffee, or getting a good night's sleep, and thus should be regulated accordingly.

One of the biggest concerns associated with broad access to these drugs is that people will feel pressured to take them to get ahead, or just to keep up. An informal survey conducted by Nature last year of 1,400 people from 60 countries found that 20 percent of respondents engaged in off-label use of drugs to enhance concentration and memory. Ritalin was the most popular, followed by Adderall. Both are prescribed for ADHD. The survey confirmed the potential for peer pressure; while 85 percent of respondents said that the use of these drugs by children under the age of 16 should be restricted, a third said they would feel pressure to give them to their children if others were using them.

The authors of the commentary also note that if cognitive enhancers are to be used more broadly, more extensive study of the risks and benefits of the drugs is sorely needed. The side effects of long-term stimulant use, especially in children, are not yet known. And the potential for dependence and abuse has not been well documented.

Michael Gazzaniga, director of the Sage Center for the Study of Mind at the University of California, Santa Barbara and one of the authors of the commentary, talked with Technology Review about the potential benefits and drawbacks of these drugs.

Technology Review: The commentary suggests that healthy adults should have access to cognitive-enhancing drugs. Why do you think this is a good idea?

Gazzaniga: Normal ageing finds one's memory processes not what they used to be. If there were drugs that helped and were safe, I would certainly be for them being available to the public.

TR: The commentary argues that cognitive-enhancing drugs "should be viewed in the same general category as education, good health habits, and information technology." Why do you think this is true?

MG: All new technologies are at first resisted, even the typewriter. When changing mental states, people get antsy, especially when it appears to enhance capacity. There is somehow a sense one is cheating the system. Well, so is chemotherapy. When all of these new technologies are used in moderation and the right social context, they are a good.

TR: Do you think it's possible to avoid making people feeling obligated to take these drugs to keep up? Especially given the huge amount of money spent on pharmaceutical advertising and the broad impact it has been shown to have?

MG: Rates of off-label drug use will stabilize. I think they will stay low. One could easily obtain Ritalin now for afternoon lassitude but the vast majority of people don't. The afternoon cup of tea or coffee sustains and seems to do the trick for most of us.

TR: Really? What about in high-pressure situations, like academia?

MG: Remember, these drugs don't make you smarter. They keep you awake so you can study so you can be smarter. While there are always fads of use with such products, usage will settle down to a base rate. That base rate may be higher than some people like, but it will be established no matter what the external drug policy might be.

TR: What about the potential for abuse and dependence? How would you ensure they are used responsibly?

MG: Education is the only tool that works. As we have learned from illicit drug use, it is virtually impossible to keep drugs out of a community. The rate of demand for any given drug, whether illicit or off-label legal, is set by the local social context. One can't ensure drug products will always be used responsibly. It is up to each community to teach about the hazards of inappropriate drug use and, by doing so, control the base rate of use. It is not a perfect world!

TR: One of the major arguments against widespread use of cognitive-enhancing drugs is that it's "the easy way out." Why do you disagree?

MG: Most of these drugs are used in spurts when huge mental demands are called for. They are not for everyday mental routines. Having said that, I think it is a fair concern to make sure people don't become dependent on them as a way of life. Working above one's pay grade in the end has tremendous costs.

TR: What are some of the safety concerns? When giving drugs to healthy people, tolerance for risk is low.

MG: As it should be. Remember, do no harm. I think the concerns are on the mental states if misused. Images might be too vivid, for example. Careful tests and analysis should be run.

TR: Why do you think the idea of using drugs to enhance cognitive function makes people so uncomfortable?

MG: Messing around with the mind is a dangerous and delicate matter. None of this should be taken lightly.

Sunday, December 07, 2008



Gartner Hype Cycle 2008

Every year, Gartner publishes an update of its Hype Cycle for Emerging Technologies (see also: Understanding Gartner's Hype Cycles), sketching the state of play for technological developments such as weblogs, RSS, Web 2.0 and mobile internet. Last month Gartner released the Hype Cycle for Emerging Technologies, 2008, its assessment for this year. It identifies the following key technologies:

  • Green IT. Along with broader societal pressure for environmentally sustainable solutions, IT has the opportunity — and in many cases, a requirement — to improve the "greenness" of its own activities, as well as to contribute to broader company and industry environmental initiatives.
  • Cloud computing. As enterprises seek to consume their IT services in the most cost-effective way, interest is growing in drawing a broad range of services (for example, computational power, storage and business applications) from the "cloud," rather than from on-premises equipment. Many types of technology providers are aligning themselves with this trend, with the result that confusion and hype will continue for at least another year before distinct submarkets and market leaders emerge.
  • Social computing platforms. Following the phenomenal success of consumer-oriented social networking sites, such as MySpace and Facebook, enterprises are examining the role that these sites, or their enterprise-grade equivalents, will play in future collaboration environments. The scope is also expanding to incorporate the notion of social "platforms," or environments for a broad range of developers to build on the basic application.
  • Video telepresence. High-end videoconferencing systems (for example, from HP, Cisco, Teliris and others) that utilize large, high-definition (HD) displays and components to show life-size images of participants in meeting rooms or suites have proven significantly more effective than earlier generations of videoconferencing technology in providing a strong sense of in-room presence between remote participants. High cost is currently the barrier to broader adoption.
  • Microblogging. Pioneered by Twitter (although other services are becoming available), microblogging is a relatively new addition to the world of social networking, in which contributors post a stream of very short (less than 140 characters) messages providing information about their current activity or thoughts, which can then be subscribed to by others. The phenomenon has caught on among certain online communities, and leading-edge enterprises are investigating its role in enhancing other social media and channels.
  • 3-D printers. Thanks to dramatic price reductions and quality improvements during the past two or three years, 3-D printers (which create a physical model from a digital design) are expanding into hobbyist, education and small business markets, and have transformational potential in manufacturing, replacement parts and design industries.

Understanding hype cycles

What is a Hype Cycle?

What are the 5 phases of a Hype Cycle?

What is the Priority Matrix?

What was the most influential Hype Cycle?

What is a Hype Cycle?

A Hype Cycle is a graphic representation of the maturity, adoption and business application of specific technologies. 

Since 1995, Gartner has used Hype Cycles to characterize the over-enthusiasm or "hype" and subsequent disappointment that typically happen with the introduction of new technologies (see Understanding Gartner's Hype Cycles for an introduction to the Hype Cycle concepts). Hype Cycles also show how and when technologies move beyond the hype, offer practical benefits and become widely accepted.

What are the 5 phases of a Hype Cycle?

1. "Technology Trigger"
The first phase of a Hype Cycle is the "technology trigger" or breakthrough, product launch or other event that generates significant press and interest. 

2. "Peak of Inflated Expectations"
In the next phase, a frenzy of publicity typically generates over-enthusiasm and unrealistic expectations. There may be some successful applications of a technology, but there are typically more failures. 

3. "Trough of Disillusionment"
Technologies enter the "trough of disillusionment" because they fail to meet expectations and quickly become unfashionable. Consequently, the press usually abandons the topic and the technology. 

4. "Slope of Enlightenment"
Although the press may have stopped covering the technology, some businesses continue through the "slope of enlightenment" and experiment to understand the benefits and practical application of the technology. 

5. "Plateau of Productivity"
A technology reaches the "plateau of productivity" as the benefits of it become widely demonstrated and accepted. The technology becomes increasingly stable and evolves in second and third generations. The final height of the plateau varies according to whether the technology is broadly applicable or benefits only a niche market. 
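The five phases above trace a characteristic visibility-over-time curve: a sharp early spike, a dip, then a slow rise to a plateau. Gartner places technologies on the curve qualitatively, not by formula, but the shape is often sketched as a "hype" spike superimposed on a slower S-shaped adoption curve. The sketch below is purely illustrative; all parameters are arbitrary shape constants, not Gartner data.

```python
import math

def hype_visibility(t, peak_t=1.0, peak_w=0.5, adopt_t=4.0, adopt_w=1.0):
    """Illustrative visibility curve for a Hype Cycle: a Gaussian spike
    (Peak of Inflated Expectations) plus a logistic adoption curve that
    levels off at the Plateau of Productivity. Units of t are arbitrary."""
    hype = math.exp(-((t - peak_t) ** 2) / (2 * peak_w ** 2))
    adoption = 0.6 / (1 + math.exp(-(t - adopt_t) / adopt_w))
    return hype + adoption

# Trigger, Peak, Trough, Slope, Plateau:
for t in [0.0, 1.0, 2.5, 4.0, 6.0]:
    print(f"t={t}: visibility={hype_visibility(t):.2f}")
```

With these constants the curve peaks near t=1, bottoms out around t=2.5, and settles onto a plateau lower than the peak, matching the note above that the final height of the plateau varies with the technology's applicability.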

What is the Priority Matrix?

The Priority Matrix is a tool for prioritizing emerging technologies by forcing technology planners to look beyond the hype and assess technology opportunities in terms of their relative impact on the enterprise. 

The Priority Matrix supplements the vertical visibility or "hype" axis of the Hype Cycle with a focus on the potential benefit of the technology, rated as transformational, high, moderate or low. 

The pairing of each Hype Cycle with a Priority Matrix will help you better determine the importance and timing of potential investments based on benefit rather than hype.

What was the most influential Hype Cycle?

The E-Business Hype Cycle, published by Alex Drobik in fall 1999; it predicted the bursting of the dot-com bubble in spring 2000.
