Friday, January 30, 2009

China grants more patents in 2008

Updated: 2009-01-30 16:59

BEIJING -- China granted 412,000 patent rights in 2008, up 17.1 percent over the previous year, patent authorities reported on Friday.

These included more than 352,000 grants to domestic applicants and 60,000 to applicants from abroad, according to the State Intellectual Property Office (SIPO).

The office also revealed it received over 828,000 patent applications last year, an increase of 19.4 percent year-on-year.

A spokesman for the SIPO said the office did not compile statistics on the number of applications rejected.

Yin Xintian, another SIPO spokesman, said patent grants in high-tech fields, such as audio-visual, optical and semiconductor technologies, still lagged far behind those of foreign countries.

Each application took one to three years to process, depending on its complexity.

China has granted 2.5 million patent rights since 1997.

Stem Cell Research: The Quest Resumes

Thursday, Jan. 29, 2009

Scientific inspiration can come from anywhere — a person, an event, even an experiment gone awry. But perhaps nothing can drive innovation more powerfully than the passion born of tragedy. Or, in Douglas Melton's case, near tragedy. The co-director of the Harvard Stem Cell Institute (HSCI) is one of the leading figures in the search for cures for presently incurable diseases, and his breakthrough work is challenging many long-held beliefs about the ways biology and human development work.

But it was a very personal experience that brought Melton to stem cells, one that 17 years later he still finds difficult to discuss. When his son Sam was 6 months old, he became ill with what his parents thought was a cold. He woke up with projectile vomiting and before long began taking short, shallow breaths. After several hours, he started to turn gray, and Melton and his wife Gail brought the baby to the emergency room. For the rest of that afternoon, doctors performed test after test, trying to figure out what was wrong. "It was a horrific day," says Melton.

It was not until that evening that a nurse thought to dip a testing strip into Sam's urine and they finally got a diagnosis. The boy's body was flooded with sugar; he had Type 1 diabetes. Then, as now, the disease had no cure, and patients like Sam need to perform for themselves the duties their pancreas cannot — keeping track of how much glucose they consume and relying on an insulin pump to break down the sugars when their levels climb too high. The diagnosis changed not only Sam's life but the lives of his parents and older sister Emma as well. Throughout Sam's childhood, Gail would wake every few hours during the night to check his blood sugar and feed him sugar if his concentration fell too low or give him insulin if it was too high. "I thought, This is no way to live," says Melton. "I decided I was not just going to sit around. I decided I was going to do something."

Trained as a molecular biologist in amphibian development, Melton began the work he pursues today: trying to find a way to make insulin-producing cells by using stem cells. "It was a courageous thing to do because he was at the pinnacle of his career," says Gail. "He brought home textbooks on the pancreas to figure it all out." Nearly two decades later, Melton is convinced that stem cells will be a critical part of new therapies that will treat and maybe cure not only diabetes but also other diseases for which there are no answers today.

Melton's confidence is testament to the extraordinary advances in stem-cell science, some of which have brought the promise of breakthrough therapies for conditions like diabetes, Parkinson's and heart disease closer than ever before. The cells filling petri dishes in freezers and incubators in Melton's lab and others around the world are so vastly different — in provenance, programming and potential — from the stem cells of just two years ago that even the scientists leading this biological revolution marvel at the pace at which they are learning, and in some cases relearning, rules of development. Until recently, the field has revolved around either embryonic stem cells — a remarkably plastic class of cells extracted from an embryo that could turn into any of the body's 200 tissue types — or their more restricted adult cousins, cells taken from mature organs or skin that were limited to becoming only specific types of tissue. On Jan. 23, after nearly a decade of preparation, the Food and Drug Administration approved the first trial of an embryonic-stem-cell therapy for a handful of patients paralyzed by spinal-cord injuries.

But today the field encompasses far more than just embryonic and adult stem cells; it has expanded into the broader field of regenerative medicine, and Melton's lab at Harvard is at the vanguard, bringing the newest type of stem cells, which do not rely on embryos at all, closer to the clinic, where patients will actually benefit. Last summer, Melton stunned the scientific community with yet another twist, finding a way to generate new populations of cells by reprogramming one type of fully mature cell so it simply became another, bypassing stem cells altogether. "If I were in high school, I can't imagine anything more interesting than stem cells," says Melton. "This is so cool. It's so amazing that cells in the body have this potential that we can now unlock by asking question after question."

A Battle Joined
That hidden power in each of us did not become obvious until 1963, when Canadian researchers Ernest McCulloch and James Till first proved the existence of stem cells, in the blood. These cells possess the ability to divide and create progeny — some of which will eventually expire, others that are self-renewing. The pair irradiated mice, destroying their immune cells. They then injected versatile bone-marrow cells into the animals' spleens and were surprised to see a ball of cells grow from each injection site. Each mass turned out to have emerged from a single stem cell, which in turn generated new blood cells.

That discovery led, 35 years later, to James Thomson's isolation of the first human embryonic stem cells, at the University of Wisconsin in 1998. And that milestone in turn inspired researchers to think about directing these cellular blank slates to eventually replace cells that had been damaged or were depleted by disease. The key lay in finding just the right recipe of growth factors and nutrients to induce a stem cell to become a heart cell, a neuron, an insulin-making cell or something else. It would take decades, the researchers all knew, but new therapies were sure to come.

Then, in 2001, everything changed. The use of discarded embryos made embryonic-stem-cell research deeply controversial in the U.S. Citing moral concerns, then President Bush restricted federal funding for the study of human embryonic stem cells. Under the new policy, U.S. government funds could be used only to study the dozens of embryonic cell lines already in existence — many of which proved not to be viable.


The decision sent some leading scientists abroad, to Britain, Singapore and China, where the governments were more receptive to their work. Others who stayed behind but lacked private funding shifted their attention from embryos to the less versatile adult stem cells. Federally backed scientists, like Melton, who continued embryonic work were forced to adopt a byzantine system of labeling and cataloging their cell cultures and equipment so that government money was not used to grow forbidden cells — and government microscopes were not even used to look at them.

Those days may soon be over. Barack Obama campaigned on a promise to lift the research ban and support "responsible oversight" of the stem-cell field. For scientists, that means "we can stop the silliness," says Melton.

As welcome as that change will be, it may be less urgent now — owing primarily to the work of scientists like Melton. While embryonic stem cells remain the gold standard for any treatments that find their way into the clinic, newer techniques using the next-generation stem cells may soon surpass the older ones.

The Fighter
In looks and demeanor, Melton is the quintessential professor, soft-spoken and thoughtful, someone who appears more mentor than maverick. Born and raised on the South Side of Chicago, he developed an early fascination with animal development; that curiosity led to a bachelor's degree in biology at the University of Illinois in 1975, then a second undergraduate degree, in the history and philosophy of science, at Cambridge University on a Marshall Scholarship. Melton remained there for his Ph.D. work, studying under Sir John Gurdon — the first to clone a frog. At Harvard, Melton teaches a frequently oversubscribed undergraduate course on science and ethics, in which he uses his keen sense of logic to provoke. When the class discussed the morality of embryonic-stem-cell research, Melton invited Richard Doerflinger of the U.S. Conference of Catholic Bishops to present arguments against the field. Melton asked Doerflinger if he considered a day-old embryo and a 6-year-old to be moral equivalents; when Doerflinger responded yes, Melton countered by asking why society accepts the freezing of embryos but not the freezing of 6-year-olds.

Clearly, Melton does not shrink from a fight. As Washington's squeeze on stem-cell research tightened in the early part of this decade, he decided to take action, providing life support for what remained of the U.S. stem-cell community. Not convinced that an entire field could make much progress relying on a few dozen cell lines of questionable quality, in 2004 he used funds HSCI receives from the Juvenile Diabetes Research Foundation and the Howard Hughes Medical Institute, as well as from Harvard alumni, and developed a more streamlined method for generating stem-cell lines from embryos. He created more than 70 new ones and has since distributed 3,000 copies to scientists around the country for free.

"Doug drew a line in the sand," says Alan Trounson, president of the California Institute of Regenerative Medicine, the organization charged with dispensing state money for embryonic-stem-cell research. "He turned the tables on an Administration that was incredibly negative toward stem cells and showed [it] we are not going to tolerate being put out of this field by ideological views that we don't think are correct." Melton's motivation was, again, both professional and intensely personal. Two months after Bush announced his ban, Melton's daughter Emma, then 14, also received a diagnosis of Type 1 diabetes.

In part owing to the restrictive U.S. policy, the momentum in stem-cell research seemed to shift overseas. In 2004, South Korean researcher Hwang Woo Suk announced that he had generated the first human embryonic stem cells from healthy people — and in the following year, from afflicted patients themselves — using an abbreviated cloning method. The latter feat would mean that cardiac patients could essentially donate themselves a healthy new heart without fear of rejection.

The news was huge — but it was also a lie. In 2006, Hwang admitted he had falsified his results. (Melton's colleague at HSCI, Kevin Eggan, finally created embryonic stem cells from patients in 2008.) Although Hwang became a pariah, he had the right idea. Melton and others had been trying to do just what the Korean scientist claimed to have done — grow a new population of a patient's own cells. The key to the process is a supply of fresh, good-quality human eggs, which incubate skin cells taken from a patient. Building up such a stockpile, however, proved practically impossible. The egg-extraction process is invasive and carries certain risks; after the state of Massachusetts prevented donors from being compensated for their eggs, out of fear the women would feel coerced, HSCI ended up with only one volunteer after more than two years of recruiting.

Melton faced mounting political pressure too. In 2004, voters in California approved a measure providing $3 billion in state funding to embryonic-stem-cell research. That threatened to draw scientists in the stem-cell community west, and Melton took pains to foster a "band of brothers" mentality. "I tried to create a cocoon here," he says, "and tell people that your job is to focus on the science. Don't worry what the politicians say." By then, Melton's team was one of only a handful in the country working on embryonic stem cells and was making headway in teasing apart the myriad critical steps needed to guide these impressionable cells into becoming insulin-generating cells. Both as a scientist and as a father, Melton remained convinced that the federal restrictions simply could not survive. He continued to insist that "the science is so significant that it will change the policy."

And then, astonishingly, it did. In June 2006 a modest researcher from Japan made a startling announcement at the International Society for Stem Cell Research conference in Toronto. Shinya Yamanaka quietly described a study in which he took skin cells from a mouse and stirred them in with varying genetic cocktails made from a recipe list of 30 genes known to be important in development. When he hit on the right four genes and inserted them into the cells aboard retroviruses, he wiped the cells clean, reprogramming them and returning them to an embryo-like state without ever creating the embryo. Four genes, he told his audience, was all it took to undo a lifetime's worth of delicate genetic tapestry. No need for eggs, no need for embryos. Could it be that easy? Were the debate and controversy over embryonic stem cells now rendered moot? "It was unquestionably unexpected," says Melton of the breakthrough.


A year later, Yamanaka followed up his work by reporting success with the same four factors in turning back the clock on human skin cells. At about the same time, in Wisconsin, Thomson achieved the same feat using a different cocktail of genes. With those studies, what became known as induced pluripotent stem cells (iPS cells) were suddenly a reality. Never mind the frustratingly fickle process needed to create embryonic stem cells; this was something any molecular-biology graduate student could do. "We figured somebody would have success with reprogramming. We just thought that somebody would come along a generation from now," says Dr. David Scadden, Melton's co-director at HSCI. "Yamanaka threw a grenade at all of that, and now all of the doors are open."

Beyond Stem Cells
Melton, for one, isn't wasting any time before running through those doors. The iPS technology is the ultimate manufacturing process for cells; it is now possible for researchers to churn out unlimited quantities of a patient's stem cells, which can then be turned into any of the cells that the body might need to repair or replace.

Before that can happen, however, Melton wants to learn more about how diseases develop. And iPS cells make that possible too. For the very first time, he can watch Type 1 diabetes unfold in a petri dish as a patient's cells develop from their embryonic state into mature pancreatic cells. The same will be true for other diseases as well. "There is a good reason we don't have treatments for diseases like Parkinson's," says Melton. "That's because the only way science can study them is to wait until a patient appears in the office with symptoms. The cause could be long gone by then, and you're just seeing the end stages." No longer. Now the major steps in the disease process will be exposed, with each one a potential target for new drugs to treat what goes wrong. "This is a sea change in our thinking about developmental biology," says Dr. Arnold Kriegstein, director of the Institute for Regeneration Medicine at the University of California, San Francisco. "I consider it a real transformative moment in medicine."

The true power of reprogramming, however, does not stop with the stem cell. This summer, Melton flirted with the rules of biology once again when he generated another batch of history-making cells, switching one type of adult pancreatic cell, which does not produce insulin, to a type that does — without using stem cells at all. Why, he thought, do we need to erase a mature cell's entire genetic memory? If it's possible to reprogram cells back to the embryo, wouldn't it be more efficient in some cases to go back only part of the way and simply give them an extreme makeover? Using mouse cells, Melton did just that, creating the insulin-producing pancreatic cells known as islets. "The idea now is that you can view all cells, not just stem cells, as a potential therapeutic opportunity," says Scadden. "Every cell can be your source."

Realizing that potential — and with it, the prospect of successful treatments for conditions like Parkinson's or diabetes — may still be a few years away. Even iPS cells have yet to prove that they are a safe and suitable substitute for the diseased cells they might eventually replace in a patient. Ensuring their safety would require doing away with dangerous genes that can also cause cancer, as well as the retroviral carriers that Yamanaka originally used. Melton's team has already replaced two of the genes with chemicals, and he anticipates that the remaining ones will be swapped out in a few years. There are also hints that the iPS cells' short-circuited development makes them different in some ways from their embryonic counterparts. In mice, embryonic stem cells can generate a new mouse clone; iPS cells from the animals have so far stopped short of the same feat, aborting in midgestation, suggesting that some development cues may be missing. "It certainly makes me cautious," says Eggan.

Even if iPS cells do not prove as stable and as versatile as embryonic stem cells when they're transplanted into patients, they remain a powerful research tool. And if nothing else, they will have opened our eyes to the remarkable plasticity of biology and made possible new ways of thinking about repairing and replacing damaged tissues so we may consider not only treating but also curing disease. "It's a wonderful time," says Scadden. "Keep your seat belt on, because this ride is going to be wild."

For patients like Sam and Emma Melton, that ride carries with it the possibility of being free of the insulin pumps and injections they endure to keep their blood sugar under control. "I definitely think about how my life would be different if there is a cure," says Sam. His father is keenly aware that the ability of stem cells and reprogramming science to provide that cure is far from guaranteed. But his initial confidence in the power of the technology hasn't waned. "Everything we learned about stem cells tells us this was a really powerful approach," he says. "It would be a great shame if we let it wither and just go away." Melton, for one, is determined not to let that happen.

Science in Steps

A decade of conflicts and breakthroughs

James Thomson, U of Wisconsin, isolates human embryonic stem cells

President Bush restricts federal funding for research on human embryonic stem cells

Douglas Melton of Harvard creates more than 70 embryonic-stem-cell lines using private funding and distributes free copies of the cells to researchers around the world

Shinya Yamanaka, Kyoto University, turns back the clock on mouse skin cells to create the first induced pluripotent stem (iPS) cells, or stem cells made without the use of embryos. He uses only four genes, which are inserted into a skin cell's genome using retrovirus vectors

Yamanaka and Thomson separately create the first human iPS cells

Kevin Eggan at Harvard generates the first patient-specific cells from iPS cells — motor neurons from two elderly women with ALS

Melton bypasses stem cells altogether and transforms a type of mouse pancreatic cell that does not produce insulin into one that does

Konrad Hochedlinger at Harvard creates iPS cells in mice using the common-cold virus rather than retrovirus vectors — an important step in making the technology safer for human use

Melton's team makes human iPS cells by replacing two of the four genes, known to cause cancer, with chemicals. All four must be swapped out before iPS-generated cells can be transplanted into people

Yamanaka creates mouse iPS cells using safer plasmids of DNA instead of retrovirus vectors


Peter Klaus and Reinhardt Jünemann

The two new members of the Logistik Hall of Fame: Professor Peter Klaus (left) and Professor Reinhardt Jünemann (right) (Photo: LOGISTIK inside / Scheutzow)
Logistik Hall of Fame: Logistics Surveyor and Materials-Flow Pope Enter the Hall of Fame

Munich. Professor Peter Klaus of the University of Erlangen-Nuremberg and Reinhardt Jünemann, founder and former director of the Fraunhofer Institute for Material Flow and Logistics in Dortmund, were inducted into the Logistik Hall of Fame on Friday. "With their achievements, these two exceptional scientists have significantly advanced logistics in Germany," said Anita Würmser, jury chairwoman and editor-in-chief of LOGISTIK inside, explaining the decision of the independent expert jury.

Professor Peter Klaus (b. 1944) is known for establishing two key indicators of the logistics sector. According to his calculations, logistics in Germany has a market volume of 205 billion euros and employs 2.7 million people. These figures helped logistics shed its image as a monster that produces nothing but traffic noise and congestion. Logistics became popular and has since been regarded in politics and business as a boom industry and a job engine, and it is now recognized as an industry sector in its own right.

Professor Reinhardt Jünemann (b. 1936) is considered one of the fathers of industrial logistics and founded the Fraunhofer Institute for Material Flow and Logistics (IML) in Dortmund, today one of the world's leading research institutions. Jünemann's pioneering work provided important impulses for automated material-flow systems, simulation technology, and information and communication technology in logistics.

Since 2004, eleven distinguished logisticians have been inducted into the Logistik Hall of Fame: Helmut Baumgarten, Eugene Bradley Clark, Heinz Fiege, Hugo Fiege, Klaus-Michael Kühne, Malcom McLean, Taiichi Ohno, Hans-Christian Pfohl, Hanspeter Stabenau, William H. Tunner and Horst Wildemann.

The Logistik Hall of Fame is an initiative of the trade and business magazine LOGISTIK inside, published by Verlag Heinrich Vogel in Munich. The hall of fame honors individuals whose achievements have made an extraordinary contribution to the development of logistics in the German-speaking world. It is accessible online free of charge at any time and offers a range of information and pictures about logistics milestones and the people behind them. (sv/pi)

Swinging the axe


Jan 29th 2009 | DAVOS
From The Economist print edition

Job-cutting has begun in earnest. But will the axe be wielded wisely?


THE headlines screamed that January 26th was “Black Monday” for jobs, after firms such as Caterpillar, Corus, Home Depot, ING, Pfizer and Sprint Nextel announced cuts of several thousand jobs each, due mostly to the rapidly deteriorating global economy. Alas, the consensus among the corporate bigwigs gathered this week at the World Economic Forum in Davos was that this marked only the beginning of the axe-swinging, and that there are blacker days to come.

This proved to be one of the big points of difference between the company bosses and the politicians brainstorming in the mountains. The politicians are primarily concerned with restoring demand enough to reverse the rising trend in unemployment; for many of the corporate leaders, ensuring the survival of their firms takes precedence over saving jobs. The difficult decision they face is not whether to cut, but how to do so in a way that strengthens their competitive position in the medium term rather than seriously damaging it.

The gloomy mood among bosses in Davos makes the worst-case scenario outlined in a new forecast from the International Labour Organisation (ILO) seem the most plausible of its possible outcomes. This supposes that if every economy in the developed world performs as it did in its worst year for unemployment since 1991, and every other economy performs half as badly as in its worst year, then the global jobless rate will rise to 7.1% this year—some 230m people, up from 179m in 2007.

The ILO’s most optimistic prediction is that global unemployment will rise only to 6.1% (from 6% in 2008). But that assumes that the world economy performs as the IMF forecast in November: global GDP growth of 2.2% in 2009, with a slight recession in the developed economies. The IMF has since become much glummer: this week it forecast growth of just 0.5%.

Already, firms are starting to find that their first round of cuts after the onset of the crisis is not enough. Caterpillar’s latest cut of 5,000 jobs is in addition to 15,000 already announced. Such is the frenzy of cutting that Challenger, Gray & Christmas, a recruitment firm that tracks employment trends in America, sought a crumb of comfort in its finding that over 50% of firms have cut jobs: it proclaimed in its latest report that “nearly half of employers avoid lay-offs.” But it pointed out that things would be even worse without the various innovative schemes adopted by companies to reduce labour costs without shedding jobs. These include salary cuts, reduced hours and “forced vacations”.

As Challenger suggests, this seems in keeping with the suggestion by Barack Obama in his inauguration speech that people should “cut their hours [rather] than see a friend lose a job.” Already, by way of example, White House staff earning $100,000 or more have had their salaries frozen. Companies including Avis, Starbucks and Yahoo! have announced pay freezes for 2009.

Yet these creative job-saving schemes are unlikely to go anywhere near as far as Mr Obama would like. They may appeal as a way to buy some time as companies try to get a clearer picture of where the economy is heading, or to retain talented workers who are likely to be needed in the future, if not now. But they have little appeal once a firm has decided that it needs to scale back its operations. As the boss of a big American retailer put it privately at Davos, “We have to decide who we want on the bus and to motivate them as much as possible.” Clever ways to share the pain can demotivate everyone, especially if they are seen as merely postponing the inevitable job cuts, making everyone fearful.

Painful choices

Equally candidly, many bosses admit that the crisis is giving them a chance to restructure their firms in ways that they should have done before, but found a hard sell when things were going well. As a rule of thumb, a careful cull of the 10% of lowest performers can make a firm leaner by removing fat without damaging muscle. It is going beyond the 10%, as many firms are now starting to do, that poses the real risks to a firm’s competitiveness.

During the relatively modest downturn at the start of this decade, for example, many professional-services firms cut too deeply, especially in their lower ranks, and found they were poorly positioned when strong growth resumed sooner than expected, says Heidi Gardner of Harvard Business School. Firms built on pyramid structures in which senior managers mentored larger numbers of employees below them suddenly found that, in a growing economy, they lacked the mentors needed to manage the army of new recruits. Instead, they had to re-hire ex-staffers at higher salaries and, in some cases, abandon proven policies of hiring senior managers only from within, says Ms Gardner, who worked for McKinsey at the time.

This crisis is revealing how few firms have really thought through their talent strategies, says Mark Spelman of Accenture. Claims that "our workers are our most valuable assets" are too often platitudes, the emptiness of which is now being revealed. But those firms that have thought seriously about their talent needs have the opportunity to get ahead of those that haven't, says Mr Spelman, not just by shedding poor performers but also by hiring scarce talent from outside, in what is now a buyer's market. Other tips from Mr Spelman include avoiding voluntary redundancy programmes, which encourage the most employable people to quit, and not firing the newest recruits on a crude "last in, first out" basis, as this cuts off the supply of future talent. Instead, firms should identify which workers they need to keep, and do what they must to retain them.

Governments can play a useful role or a harmful one, depending upon their attitude to companies, says David Arkless of Manpower, an employment-services firm. If they focus on working with firms to smooth the movement of labour to where the future work will be, for example by providing skills training and financial incentives to workers in transition, then the economic downturn could be less painful than now seems likely. (A quick recovery in lending to small businesses, the main drivers of job creation in most countries, would also help.) But if governments try to prevent firms from making the changes to their workforces that they want, the result is likely to be prolonged gloom.

Although Mr Obama’s support for strengthening the ability of unions to enter workplaces is arguably a worrying sign, the American economy is far more accommodating of flexibility in employment than many European countries. Mr Arkless, for one, says that without a dramatic change of attitudes to job-cutting in Europe, “there is no doubt that American firms will come out of this downturn better than anywhere else in the world, due to their flexible employment model.” This will provide no comfort to anyone facing the prospect of unemployment, but it is a message that politicians would do well to take to heart.

Microsoft Searches for Group Advantage

Friday, January 30, 2009


A new search algorithm looks for connections between people.

By Robert Lemos

As part of its efforts to better compete with Google, Microsoft is plumbing the connections between searchers and their contacts to produce better results.

Microsoft researchers are exploring whether using data from several members of a social group--a technique that the company calls "groupization"--can improve search results. Their initial findings, based on experiments involving around 100 participating Microsoft employees, suggest that tapping into different types of groups could produce significantly better search results.


The team has developed an algorithm that, on average, identifies for each group at least one search result that all its members judge to be better than the results returned by conventional algorithms. The results will be presented at the Web Search and Data Mining Conference in Barcelona in mid-February.

The Microsoft team believes that the approach could help the company overcome an industry-wide plateau in the quality of search results. "Today, search engines are really challenged and are sort of at the cusp of having to know individuals better," says Jaime Teevan, a computer scientist at Microsoft Research and lead author of the paper. "This [research] has the opportunity to enrich that."

The new research is part of Microsoft's efforts to erode Google's massive lead in search. Google currently attracts 63 percent of all searches, according to a 2008 survey by consumer-analysis firm Nielsen, far outpacing both Yahoo's 17 percent share and Microsoft's 10 percent share. Last year, Microsoft attempted to increase its share by acquiring Yahoo, but its initial advances were rejected. Yahoo later wanted to return to the bargaining table, but for the time being, Microsoft is focusing on increasing its audience by enhancing its own search offering.

With an eye on refining search results, Teevan and her colleagues--Meredith Morris and Steve Bush--looked at the way that people with similar interests or attributes search for information. The researchers grouped people using explicit factors, such as their age, gender, participation in certain mailing lists, and job function. In some cases, implicit groups--such as people who appeared to be conducting the same task or appeared to have the same interest--were inferred. The researchers acknowledged that gathering such data in the real world could be tricky. But it could perhaps be collected through registration, by caching previous searches or by tapping into social-networking software.
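The paper's scoring details aren't given in the article, but the basic idea of "groupization"--blending a group's collective preferences into one member's ranking--can be sketched roughly as follows. The function names, the weighting scheme, and the sample scores are illustrative assumptions, not Microsoft's actual algorithm.

```python
# Hypothetical sketch of "groupization": re-rank one searcher's results by
# combining relevance scores from everyone in an explicit group.

def groupize(results, member_scores, weight=0.5):
    """Blend a searcher's own ranking with the group's average score.

    results       -- list of URLs in the engine's original order
    member_scores -- {member: {url: score}} per-member relevance scores
    weight        -- how strongly the group's opinion shifts the ranking
    """
    n = len(results)
    # Personal score: higher original rank position = higher score.
    base = {url: (n - i) / n for i, url in enumerate(results)}

    def group_score(url):
        # Average each member's score for the URL (0 if unseen).
        scores = [s.get(url, 0.0) for s in member_scores.values()]
        return sum(scores) / len(scores) if scores else 0.0

    blended = {url: (1 - weight) * base[url] + weight * group_score(url)
               for url in results}
    return sorted(results, key=lambda u: blended[u], reverse=True)

results = ["a.com", "b.com", "c.com"]
scores = {"alice": {"c.com": 1.0}, "bob": {"c.com": 0.9, "b.com": 0.2}}
print(groupize(results, scores))  # c.com is boosted by the group's interest
```

A result that several group members score highly (here, c.com) climbs above results that only the original ranking favored.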

The Microsoft team found that groups defined by demographics such as age and location have little in common for most searches. However, groups of people with similar interests tend to rank similar search terms highly. The researchers also found that, although people believe they phrase their queries in similar ways, individuals' idiosyncratic choices of search terms actually vary tremendously.

When asked to identify the pros and cons of telecommuting, for example, one searcher searched for "telecommuting," while others queried "working at home cost benefit" and "economic comparison telecommuting versus office." Knowing that these people have a shared interest could mean better results, Teevan says. "I don't talk about things the same way that you talk about things," she says. "And by using those different ways, [Microsoft is] more likely to find a page where someone talks about something in their own way."

Even if tapping into social groups improves search results, Microsoft will have to significantly improve its search service or introduce major new features to win over Google's loyal followers, says Andrew Frank, research vice president with business-intelligence firm Gartner's media group. "I think that the search category has been so successful for Google, and their dominance is so extreme, that it is hard to imagine a specific tactic that could be a silver bullet to change the trajectory of things," he says. "It will take a lot of effort and a lot of different things to change the overall picture of search."

Efforts to compare the quality of search results from Google, Microsoft, and Yahoo have found that about half of people still prefer Google--a smaller number than Google's actual market share. The difference is the attraction of Google's brand, Frank says. "You have to kind of change the game with search," he says. "It is almost impossible to get people to switch on a large scale just on a feature-function comparison."

In 2008, Google kicked off an experiment in which it allowed users to change the look of their search results, mapping them or placing them on a timeline. Early this year, the company added a feature that lets users reorder their search results through a service called SearchWiki. Yahoo has expanded its research-and-development efforts to try to match its rival's efforts.

Microsoft's Teevan believes, however, that there's still plenty of room for improvement. "Search is a really huge activity on the Web, but right now, we only have a single search tool--the search box--and a list of results," she says. "Groups can teach us a lot."

The Army's Remote-Controlled Beetle

Thursday, January 29, 2009

The Army's Remote-Controlled Beetle

The insect's flight path can be wirelessly controlled via a neural implant.

By Emily Singer, technologyreview.com

A giant flower beetle with implanted electrodes and a radio receiver on its back can be wirelessly controlled, according to research presented this week. Scientists at the University of California developed a tiny rig that receives control signals from a nearby computer. Electrical signals delivered via the electrodes command the insect to take off, turn left or right, or hover in midflight. The research, funded by the Defense Advanced Research Projects Agency (DARPA), could one day be used for surveillance purposes or for search-and-rescue missions.

Cyborg beetle: Shown here is a giant flower beetle carrying a microprocessor, radio receiver, and microbattery and implanted with several electrodes. To control the insect’s flight, scientists wirelessly deliver signals to the payload, which sends electrical signals through the electrode to the brain and flight muscles.
Credit: Michel Maharbiz
Video: Watch controlled flights of the beetle.

Beetles and other flying insects are masters of flight control, integrating sensory feedback from the visual system and other senses to navigate and maintain stable flight, all the while using little energy. Rather than trying to re-create these systems from scratch, Michel Maharbiz and his colleagues aim to take advantage of the beetle's natural abilities by melding insect and machine. His group has previously created cyborg beetles, including ones that were implanted with electronic components as pupae. But the current research, presented at the IEEE MEMS conference in Italy, is the first demonstration of a wireless beetle system.

The beetle's payload consists of an off-the-shelf microprocessor, a radio receiver, and a battery attached to a custom-printed circuit board, along with six electrodes implanted into the animals' optic lobes and flight muscles. Flight commands are wirelessly sent to the beetle via a radio-frequency transmitter that's controlled by a nearby laptop. Oscillating electrical pulses delivered to the beetle's optic lobes trigger takeoff, while a single short pulse ceases flight. Signals sent to the left or right basilar flight muscles make the animal turn right or left, respectively.
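The command scheme described above amounts to a small lookup table mapping each flight command to a target electrode and a pulse pattern. The sketch below only illustrates that mapping; the electrode labels, pulse counts, and frequencies are assumptions for illustration, not the published stimulation parameters. Note the crossed wiring: stimulating the left basilar muscle turns the beetle right, and vice versa.

```python
# Illustrative encoding of the flight commands: which electrode receives
# which pulse pattern. Values are placeholders, not real parameters.

COMMANDS = {
    # command: (target electrode, pulse pattern)
    "takeoff": ("optic_lobe", {"pulses": 100, "hz": 100}),  # oscillating train
    "stop":    ("optic_lobe", {"pulses": 1,   "hz": 0}),    # single short pulse
    # Stimulating one side's basilar muscle turns the beetle the other way.
    "left":    ("right_basilar_muscle", {"pulses": 10, "hz": 50}),
    "right":   ("left_basilar_muscle",  {"pulses": 10, "hz": 50}),
}

def encode(command):
    """Return the (electrode, pattern) a transmitter would radio to the rig."""
    if command not in COMMANDS:
        raise ValueError(f"unknown command: {command}")
    return COMMANDS[command]

electrode, pattern = encode("takeoff")
print(electrode, pattern["pulses"])
```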

Most previous research in controlling insect flight has focused on moths. But beetles have certain advantages. The giant flower beetle's size--it ranges in weight from four to ten grams and is four to eight centimeters long--means that it can carry relatively heavy payloads. To be used for search-and-rescue missions, for example, the insect would need to carry a small camera and heat sensor.

In addition, the beetle's flight can be controlled relatively simply. A single signal sent to the wing muscles triggers the action, and the beetle takes care of the rest. "That allows the normal function to control the flapping of the wings," says Jay Keasling, who was not involved in the beetle research but who collaborates with Maharbiz. Minimal signaling conserves the battery, extending the life of the implant. Moths, on the other hand, require a stream of electrical signals in order to keep flying.

The research has been driven in large part by advances in the microelectronics industry, particularly the miniaturization of microprocessors and batteries.


Predicting Breakdowns

Thursday, January 29, 2009

Predicting Breakdowns

A new system that monitors the health of vehicles could save money and lives.

By Brittany Sauser

Most new vehicles are studded with hundreds of sensors that collect raw performance data. While beneficial, such information can only be interpreted by the manufacturer or dealer and is usually only read after the car breaks down. Now researchers at Rochester Institute of Technology (RIT) and Lockheed Martin, a security company in Bethesda, MD, have developed a monitoring system that can better assess the health of a vehicle and can alert drivers to any potential problems.

Performance evaluation: Researchers at Rochester Institute of Technology (RIT) and Lockheed Martin have developed a system that monitors and assesses the health of a vehicle and predicts its future performance. The vehicle, such as the military light-armored vehicle, is embedded with sensors that send information to a computer at a central operational center. Shown here are Dave Keegan (right), an RIT student, and Bob Kosty, a technician, looking at the software that analyzes the data to determine how the equipment is functioning and warn of any potential problems.
Credit: Laura W. Nelson, RIT

The system uses a network of embedded smart sensors that are strategically located near automotive components that are prone to problems. The information is sent wirelessly to a central command center, where it is automatically analyzed by software. The monitoring system is similar to OnStar, an in-vehicle security, communications, and diagnostics system built by GM. But Nabil Nasr, assistant provost and director of the Center for Integrated Manufacturing at RIT, says that the system goes "far beyond" anything commercially available by predicting "future health or failures."

The project is part of a $150 million contract between Lockheed Martin and the U.S. Marine Corps, which is equipping up to 12,000 military vehicles with the new technology. The system can assess the health of military vehicles before they are sent on missions so that commanders can know if a vehicle is up to the task. "It could save money and lives, and extend the lifetime of equipment," says Nasr.

The technology has also been tested in a public-transit bus at the Rochester Genesee Regional (RGR) Transportation Authority for the past 18 months. Eight months ago, a spinoff company called LIBAN was formed to develop the technology for commercial fleet vehicles.

The system uses standard sensors--such as temperature, vibration, and electronic sensors, as well as customized smart sensors--to monitor a vehicle. The sensors are placed near different components on the vehicle, such as the transmission, alternator, and drivetrain. "Most systems on the market today are just reporting fault codes coming out of the engine-control module. We are looking at data from individual components to get better details . . . and to predict future conditions," says David Chauncey, CEO of LIBAN.

The data from the sensors is processed by an onboard computer system that analyzes the information. That data is sent at regular intervals to a control center via a cellular network, satellite, or private data network, depending on the customer. "Every vehicle is an intelligent, potential source of information, and we have the technology to make the data useful; we just need to develop communication protocols and standards so we can build the infrastructure to share information, beyond just the manufacturers and dealers," says Kirk Steudle, director of the Michigan Department of Transportation (MDOT), which just announced a partnership with Michigan International Speedway to create an open testing environment for cross-brand vehicle communication.

The "heart of the system" is the data-monitoring software, says Nasr. The RIT researchers have created sophisticated programs for mining, trending, and analyzing the data from the sensors. "The algorithms are extremely valuable because they help us build a model of predictive and condition-based maintenance, so we can predict failures before they occur, and we can make determinations about service based on the actual conditions of the equipment," says Randy Weaver, advanced-technology-systems program manager at RGR. The fare box on a bus, for example, is a key piece of hardware that the system monitors. "It's the number-one reason we change a bus off," says Weaver. If the box becomes jammed, transit personnel cannot accept fares from magnetic-pass holders (smart cards), so it is imperative that transit authorities know in advance the condition of the box.
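The article does not disclose the RIT algorithms, but the core idea of condition-based prediction can be illustrated with a toy example: fit a trend line to a component's sensor readings and estimate when it will cross a failure threshold. This is only a minimal sketch of the concept, not the actual monitoring software.

```python
# Minimal sketch of predictive maintenance: fit a least-squares line to
# sensor readings and extrapolate when the trend reaches a failure threshold.

def predict_failure_time(readings, threshold):
    """readings: [(time, value)] samples. Returns the estimated time at
    which the fitted trend reaches `threshold`, or None if the readings
    are flat or falling (no failure trend)."""
    n = len(readings)
    mean_t = sum(t for t, _ in readings) / n
    mean_v = sum(v for _, v in readings) / n
    num = sum((t - mean_t) * (v - mean_v) for t, v in readings)
    den = sum((t - mean_t) ** 2 for t, _ in readings)
    if den == 0:
        return None          # all samples at the same instant
    slope = num / den
    if slope <= 0:
        return None          # not trending toward failure
    intercept = mean_v - slope * mean_t
    return (threshold - intercept) / slope

# Vibration amplitude rising steadily over hours 0..3; failure at 10 units.
samples = [(0, 2.0), (1, 4.0), (2, 6.0), (3, 8.0)]
print(predict_failure_time(samples, 10.0))  # -> 4.0
```

Scheduling service before the predicted crossing, rather than after a fault code appears, is the essence of the "predict failures before they occur" approach Weaver describes.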

"The [technology] is critical because our primary service is to keep buses on the road for the community, so if we can prevent a road breakdown during service hours, then the technology pays for itself," says Weaver. RGR plans to integrate the technology into its fleets in the next six months.

Steve Underwood, director for the Connected Vehicle Proving Center at the Center for Automotive Research, in Ann Arbor, MI, says that the technology is really important, especially for reducing costs, and that there is a lot that can be done in addition to measuring the performance of a vehicle. Sensors can be used for incident detection, as well as for identifying traffic patterns and pavement conditions.

While the technology is currently being placed in military light-armored vehicles, LIBAN hopes to also use it in commercial vehicles, such as trucks owned by the U.S. Department of Defense, private-fleet operators, freight haulers, and other public-transit systems.

Digging Deeper in Web Search

Thursday, January 29, 2009

Digging Deeper in Web Search

A personalization search tool reveals links buried deep within page results.

By Kate Greene

One of the hottest frontiers in Web search is finding ways to improve results based on the searchers' preferences. Already, when you log in to Google, the search engine tries to personalize results by mining your search history: for an eighth grader who has performed lots of searches on marine life, a search for "dolphins" might provide more results for the animal than for the football team.


Now Surf Canyon, a startup based in Oakland, CA, is adding its own spin on personalization. Its software, which can be downloaded and installed into Firefox and Internet Explorer Web browsers, enhances individual searches on major search engines by evaluating which links you click on, and then instantly giving you revised search returns--including three sites that relate in some way to the site you clicked on. "We have invented real-time personalization," says Mark Cramer, CEO of the company.

For example, a Google search for "thermoelectric cooler" using Firefox with Surf Canyon installed provides 10 standard results. In my case, the eighth result, from a chip maker, seemed promising. I clicked on it, scanned the page, and then hit the "back" button. When I subsequently looked at the results page, three new suggestions appeared directly under that result. Surf Canyon had elevated these links from deep within the 100 pages of results because its algorithm determined that they related to the information on the chip maker's page, including technical explanations of how thermoelectric coolers work.

Crucially, these new results are cleverly slipped into the search results so that the original results page doesn't look drastically different when a user navigates back. It would be off-putting to users, Cramer says, if they had seen a link in the original results that they wanted to click on but, when they went back to the results, found it missing. Therefore, recommended results only appear automatically below the link that was clicked on. "We don't want to jar the users," Cramer says. "[Surf Canyon is] specifically engineered to be as unobtrusive as possible."

Behind the scenes, an algorithm makes the personalization possible. Among other things, the algorithm analyzes which results are clicked, which are ignored, and how much time a user spends looking at the page. Importantly, says Cramer, the algorithm semantically deconstructs a page to determine what it means and how similar it is to others in the results. The results are cumulative: after a couple of clicks, the algorithm can determine if you're most interested in a Canon camera, an SLR camera, or, specifically, a Canon SLR, Cramer says.
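Surf Canyon's semantic analysis is proprietary, but the click-driven re-ranking it describes can be sketched with a simple stand-in: after a click, promote the unclicked results whose text is most similar to the clicked page. The bag-of-words cosine similarity below is an assumption for illustration, not the company's algorithm.

```python
# Toy click-based re-ranking: surface the results most similar to the
# page the user just clicked, using bag-of-words cosine similarity.
from collections import Counter
from math import sqrt

def cosine(a, b):
    wa, wb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(wa[w] * wb[w] for w in wa)
    norm = sqrt(sum(c * c for c in wa.values())) * sqrt(sum(c * c for c in wb.values()))
    return dot / norm if norm else 0.0

def recommend(clicked_text, other_results, k=3):
    """Return the k results whose text best matches the clicked page."""
    ranked = sorted(other_results,
                    key=lambda r: cosine(clicked_text, r["text"]),
                    reverse=True)
    return [r["url"] for r in ranked[:k]]

clicked = "how thermoelectric coolers work peltier effect"
others = [
    {"url": "sports.example", "text": "football scores and standings"},
    {"url": "te1.example", "text": "peltier effect thermoelectric cooling explained"},
    {"url": "te2.example", "text": "thermoelectric cooler design guide"},
]
print(recommend(clicked, others, k=2))
```

Even this crude similarity measure pushes the two thermoelectric pages above the football page; a cumulative version would keep updating the interest model with every click, as Cramer describes.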

Results revealed: A Google search for “thermoelectric cooler” returns 10 standard results. After clicking on the first result and then clicking back to the results page, Surf Canyon shows three suggested results below the first one.
Credit: Kate Greene

Marti Hearst, a professor at the School of Information at the University of California, Berkeley, says that Surf Canyon succeeds in presenting the reordered links in a clear, useful, and unobtrusive way. It doesn't require people to do any extra work, as does Google's SearchWiki, a feature that lets users personalize their results by voting them up or down.

However, in her test cases, Hearst found that the algorithm's re-ranked results weren't completely useful. "Where personalization works is where queries are ambiguous," she says, but queries have become increasingly longer over the years, and they tend to provide clues that help the engine disambiguate the results on its own. Additionally, in Hearst's tests of Surf Canyon, she found that it only untangled the different meanings of the acronym ACL (which could mean both anterior cruciate ligament and Association for Computational Linguistics) to a certain point: it kept including mixed results even when she felt that her clicking choices had made it clear that she was interested in the linguistics group.

Cramer and his team say that they have gotten more positive results. In a study they performed, some participants saw a second page of search results that were reordered according to Surf Canyon's algorithm, while others saw a second page with standard results. The researchers found that the participants who had access to reordered results clicked on them 30 to 40 percent more frequently.

Monday, January 26, 2009

Top Five Supply Chain Risk Factors

Top Five Supply Chain Risk Factors

Learn to identify and assess all the risks in your supply chain.

1. Country of origin: Knowing the location of your supplier's production facilities is important in recognizing their susceptibility to security threats. Plants in countries that are more vulnerable to threats are at greater risk, which could cause delays in your supply chain or compromise the products you receive. Key questions to have answered are the physical location of each of the supplier's plants and factories and, if possible, the locations of their raw-material suppliers.

2. Shipment and delivery accuracy: Ensuring that a supplier can deliver supplies consistently and on time is key to assessing the risk they pose to your supply chain. Be sure to ask for shipment times (daily, weekly, etc.), mode of transportation (air, land or sea), and rerouting procedures when natural disasters interrupt trade lanes.

3. Physical security: Assessing the physical security is very important, especially in countries where terrorism is on the rise. Inquiring about areas such as the materials used to construct the building, existence of a guard gate, adequate lighting around the perimeter, use of locks on all windows and doors, perimeter fencing, and cargo storage procedures will help you assess the ability of the supplier to keep their location secure regardless of natural or institutional threats.

4. Internal processes: Soliciting information about a supplier's internal processes provides visibility into not only the security, but also the controls put in place during the manufacturing process. Suppliers should be asked to explain processes dealing with how keys are checked out, visitors are monitored while on-site, access to cargo is restricted, use of computers and electronic data is controlled, and employee background checks are conducted.

5. Social and environmental responsibilities: Requesting information on the disposal of chemicals used during the manufacturing process, or on whether the supplier complies with child-labor laws, covers only two of the important questions in this area. It is also instructive to inquire about internal policies such as maternity leave and paid time off, as well as air quality and the work environment. Suppliers' social and environmental responsibilities are becoming a larger factor in assessing risk in a supply chain as product-safety regulations continue to be proposed and passed into law.

Source: Integration Point
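One practical way to act on the five factors above is to roll them into a single supplier score: rate each factor on a fixed scale and take a weighted average. The weights and ratings below are arbitrary assumptions for illustration; a real program would calibrate them to its own supply chain.

```python
# Illustrative weighted supplier-risk score over the five factors above.
# Ratings run 1 (low risk) to 5 (high risk); weights are assumptions.

WEIGHTS = {
    "country_of_origin": 0.25,
    "delivery_accuracy": 0.20,
    "physical_security": 0.25,
    "internal_processes": 0.20,
    "social_environmental": 0.10,
}

def risk_score(ratings):
    """ratings: {factor: 1..5}. Returns the weighted average, 1 to 5."""
    if set(ratings) != set(WEIGHTS):
        raise ValueError("rate all five factors")
    return sum(WEIGHTS[f] * r for f, r in ratings.items())

score = risk_score({
    "country_of_origin": 4,     # plants in a high-threat region
    "delivery_accuracy": 2,
    "physical_security": 3,
    "internal_processes": 2,
    "social_environmental": 1,
})
print(round(score, 2))
```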


Meeting of the Minds: Where Process and Discrete Manufacturing Converge

Meeting of the Minds: Where Process and Discrete Manufacturing Converge

When it comes to continuous improvement, discrete and process manufacturers may have more in common than they think. A close look at the two industries shows they have opportunities to learn from each other.

What could a plant manager from a process manufacturer that makes yarns and fabrics for industrial applications possibly learn from an automaker such as Toyota? Conventional wisdom says companies should benchmark against similar industries to gain knowledge relevant to their operations. But as we've seen with businesses ranging from medical institutions to governmental agencies adopting lean principles, one industry may have best practices that can be tailored to fit a completely different work environment.

The same could be said for process and discrete manufacturers. Generally speaking, process industries are characterized as businesses that make products in bulk quantities, such as chemicals, pharmaceuticals, gasoline, beverages and food products, which often undergo a chemical conversion. Discrete manufacturers, on the other hand, produce or assemble parts or finished products that are recognizable as distinct units, such as automobiles or computers; these units can be identified by serial numbers or labels and are measured as numerical quantities rather than by weight or volume.

Even with these stark differences, process and discrete manufacturers have a history of glomming on to one another's improvement methods. As lean and Six Sigma gained in popularity throughout the 1980s and '90s within discrete operations, process manufacturers began taking note, says Peter Martin, vice president of strategic ventures at automation technology provider Invensys Process Systems. Initially, the process industry's forays into the continuous improvement trend didn't fare so well. When process manufacturers tried to implement statistical analysis methods used primarily in discrete operations, such as Six Sigma, the methods didn't work because they focused too much on defects, Martin says.

"In the process industries we don't tend to do defect-oriented manufacturing," he explains. "[For example], if you charge a little too much pigment, you can just put a little more base in and fix it. The mindset in the process industry is direct, real-time control. The mindset in discrete manufacturing is after-the-fact statistical analysis to get continuous improvement. So the mindsets are very different."

Process manufacturers finally realized value from continuous improvement programs when they stopped applying methods that focused on statistics. "When the process industry started looking at what discrete was doing in continuous improvement, they started saying, 'We should be able to use different techniques, maybe not statistical techniques, to continuously improve our critical performance variables like contribution margin, or energy cost or production value,'" Martin says.

Lean Adaptations

Lean became a reality for several Milliken & Co. plants when the privately held textiles and chemicals manufacturer applied lean manufacturing techniques to optimize lead time and align itself with customer demand variations, says Chris Glover, Milliken Performance System practitioner. Like many other process manufacturers, the company dabbled in lean throughout the 1980s and '90s, but initially didn't fully understand how to gain significant improvements from the methodology, says Glover, who has worked in a variety of leadership roles within Milliken since 1985.

"We went to Japan as an organization and benchmarked many Japanese companies in the early '90s and [lean] was visible inside the Toyota Production System," he explains. "It was just part of their operating system, but when we brought it back we weren't really sure how to incorporate it in our activities, partly because we were process and couldn't figure out why you had to measure every millisecond a person was running a process machine. So we struggled a little bit with that type of diagnosis."

In discrete industries, manufacturers can deploy material requirements planning (MRP) systems to manage raw materials that require long lead times. But process manufacturers deal with materials that usually need to be moved quickly, notes manufacturing systems consultant Jim Ranallo.

"You have inherent variability in the raw materials coming in, so the processes that are built to handle that variability are very different than discrete, which is more assembly of manufactured product that is tightly controlled and very well defined in a build of materials," Ranallo explains.

Lean can be used to replace MRP to manage shorter lead-time materials using a kanban signal, he continues. "Say I'm ordering an ingredient and I have a supplier that responds in two weeks. I can use a two-bin system where I have two pallets of materials in a warehouse. When one of those pallets is consumed, I can place an order to refill that order in the warehouse because I know I'm going to get an order in two weeks."
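Ranallo's two-bin arrangement can be simulated in a few lines. In the sketch below, stock is consumed from an open bin; when it empties, the reserve bin is opened and a replenishment order is placed, arriving after the supplier's lead time. The bin size, demand rate, and horizon are illustrative numbers chosen so the bin covers more than the two-week lead time.

```python
# Minimal simulation of a two-bin kanban: consume daily, switch bins when
# one empties, and reorder a bin's worth with a fixed supplier lead time.

def simulate_two_bin(bin_size, daily_demand, lead_time_days, days):
    """Returns the days on which replenishment orders are placed."""
    open_bin = bin_size
    reserve_bin = bin_size
    pending = {}             # arrival_day -> quantity on order
    orders = []
    for day in range(1, days + 1):
        if day in pending:                 # a replenishment arrives
            reserve_bin += pending.pop(day)
        open_bin -= daily_demand
        if open_bin <= 0:                  # open bin exhausted:
            open_bin += reserve_bin        # switch to the reserve bin...
            reserve_bin = 0
            orders.append(day)             # ...and the empty bin is the
            pending[day + lead_time_days] = bin_size  # signal to reorder
    return orders

# One bin holds 20 units, usage is 1 unit/day, supplier lead time 14 days:
print(simulate_two_bin(20, 1, 14, 45))  # orders placed on days 20 and 40
```

The empty bin itself is the kanban signal; no MRP run is needed, which is exactly the substitution Ranallo describes for shorter lead-time materials.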

Milliken had worked with value stream maps with the hope of realizing significant waste reductions in specific areas. But gains were minimal because the plants didn't make lean part of their enterprisewide culture, Glover says. The results started to manifest after the company spoke with customers who requested wider-scale lean adoptions.

The Lean Embrace

Among Milliken's products are fabrics used in various automotive applications. The company's auto manufacturing customers, steeped in lean from its earliest beginnings, pushed Milliken to understand the continuous improvement process better. The company responded by establishing what it refers to as its Lean Enterprise system in 2006 at several plants in Georgia and South Carolina. Through value stream mapping, the company identified waste points and the factors driving lead times, and implemented the Plan For Every Part (PFEP) system.

Typical PFEPs provide visibility into inventory by charting characteristics of each part, including part numbers and dimensions, and measuring supplier performance metrics such as delivery time. That helped the company determine its replenishment points, safety stock volume and materials runs. "It was especially critical last year with the increase in most of the chemical costs with the oil prices rising," Glover explains. He estimates the company's lean initiatives have helped reduce customer lead times 30% to 60% across multiple plants.
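The article doesn't give Milliken's actual formulas, but the kind of arithmetic a PFEP supports for setting replenishment points is well established: reorder when on-hand inventory falls to the expected demand over the supplier's lead time plus a safety stock. The sketch below uses one common safety-stock convention; the numbers are hypothetical.

```python
# Reorder-point arithmetic of the kind a PFEP supports (illustrative).

def safety_stock(max_daily_demand, max_lead_time, daily_demand, lead_time_days):
    """One simple convention: cover the worst case beyond the average case."""
    return max_daily_demand * max_lead_time - daily_demand * lead_time_days

def reorder_point(daily_demand, lead_time_days, safety):
    """Reorder when on-hand stock falls to lead-time demand plus safety stock."""
    return daily_demand * lead_time_days + safety

# Hypothetical part: average 50 units/day over a 10-day lead time, with
# worst-case demand of 60 units/day and a worst-case 12-day lead time.
ss = safety_stock(max_daily_demand=60, max_lead_time=12,
                  daily_demand=50, lead_time_days=10)
print(ss, reorder_point(50, 10, ss))  # -> 220 720
```

Because the PFEP already records demand rates, lead times, and supplier delivery performance per part, these quantities can be recomputed whenever conditions change, such as the chemical-cost swings Glover mentions.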

The Dow Chemical Co. developed a lean simulation tool relevant to the process industry to help management teams in its plants address inventory problems, according to Dow global supply chain process consultant Martino Fernandes. The initiative started in 2002, using lean simulation technologies and Lego models to win buy-in from management.

"Lean can definitely drive value in the process manufacturing environment," Fernandes told AMR Research in April 2008. "The key is to demonstrate how so that management believes it. Experiential learning through simulation is very successful in helping to reshape traditional paradigms."

As of 2008, teams participating in Dow's lean exercise saw average improvements of 10% to 15% in fill rates, 30% to 40% reductions in cycle times, and 10% to 20% reductions in inventory, yielding a 5% to 15% reduction in storage space requirements, according to an AMR report.

Discrete Conversations

Conversely, some discrete manufacturers are keeping a close eye on trends in the process industry, particularly when it comes to wireless technology, according to analyst firm ARC Advisory Group. Discrete manufacturers are showing interest in the development of wireless process communication standards such as ISA 100 and the Highway Addressable Remote Transducer (HART) protocol, according to a recent ARC study. Potential uses include wireless sensing in automotive applications where robotics are used, both to reduce cable failure in moving equipment and to enable the monitoring of information processing and devices.

In addition, wireless sensors can be used to establish predictive maintenance schedules through the collection of vibration data, says Ralph Rio, ARC research director. In process environments sensors are often used to monitor cooling fans, whereas discrete manufacturers might use sensors to observe the voltage characteristics of a motor.

ARC predicts the worldwide market for wireless devices in discrete manufacturing will grow at a compounded annual growth rate of 16.2% over the next five years. But growth will be limited until standards are developed, notes Chantal Polsonetti, ARC vice president.

"While the business drivers are in place, including wireless' status as the ultimate fieldbus from the perspective of wiring reduction, the lag in technology and standards development suitable to meet discrete industry requirements will contribute to an ongoing fissure in growth prospects for discrete versus process industries over the next five years," Polsonetti notes.

There may also be opportunities for discrete manufacturers to implement control devices used in process industries, says Invensys' Martin. He points out the frequent use of vision systems in the pharmaceutical industry for quality control as one such technology that's making its way to discrete operations. However, the greatest potential for information sharing may exist in mixed process and discrete environments, such as mining and metalworking or foods, says Martin, who explains that process manufacturers utilize mathematical controls to drive continuous improvement while discrete is more focused on logistics.

"There are examples of plants like an oil refinery that's 99% continuous and an automotive plant that's 99% discrete, but there are a lot of industries in between those two extremes that combine continuous and discrete processes right within the same plants," he says. "In those cases you've got a huge mix of logic and mathematical approaches, and I think there's a lot that can be learned by both sides looking at how on the process side you can apply logistics to do better production scheduling, to do better demand scheduling, or on the discrete side using mathematics to do more deterministic control."


Supply Chain Tips for Weathering the Financial Meltdown

Supply Chain Tips for Weathering the Financial Meltdown

In this broadly deflationary environment, inventory is a poison and responsiveness its only antidote.

Supply chain mistakes can be extremely costly in the very best of times, leading to excessive cost, confusion and downright chaos. But in today's global financial crisis, mistakes are magnified. One major misstep could be catastrophic, as we will witness over the coming weeks as the Nortel bankruptcy ripples through layers of the supply chain.

Despite the numerous advances in ERP systems, dynamic demand planning tools and supply chain modeling software over the past decade, most major supply chain decisions are still ultimately made by people -- and individual people and companies react differently to the issues currently presenting themselves in this economy.

As we head into 2009, supply chain mistakes can trash a balance sheet more quickly than ever, or leave incremental sales on the table in an increasingly competitive market environment. With most companies tightening both their belts and their inventory exposure, it will become more important than ever for suppliers to maintain supply flexibility without taking excess material risk. In this broadly deflationary environment, inventory is a poison and responsiveness its only antidote.

Whether you're dealing with the dramatic secular downturn or industry-specific issues affecting the supply and demand balance, the following are five steps you can take to help ensure your business' supply chain is responding appropriately to the current economic conditions.

1. Talk forecasts with your Electronic Manufacturing Services (EMS) provider. Despite contracts and assurances to the contrary, most EMS providers have already lowered their demand signals into the supply chain unilaterally on behalf of their various OEM customers. And although managing excess and obsolete exposure is absolutely critical for both parties in these times, it's equally important that the OEM and EMS provider do not both take down demand numbers independently. When this happens, demand signals get dramatically over-corrected and thus the supply chain drives less material than actual demand would dictate. Talk with your EMS provider and agree upon a specific demand plan, an ongoing schedule for regularly updating that plan together, and the appropriate supply chain signals for the current environment. Keep in mind that ultimately the OEM owns most of the raw material liability, so lying to yourself or your key supplier will not advance the cause.
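The over-correction problem is easy to quantify: when two parties independently trim the same demand signal, the cuts compound. The 10% figures below are illustrative, not from the article:

```python
def combined_cut(oem_cut, ems_cut):
    """Net demand reduction seen downstream when two independent cuts compound."""
    return 1 - (1 - oem_cut) * (1 - ems_cut)

# If the OEM trims its forecast 10% and the EMS provider independently
# trims the signal it passes to suppliers by another 10%...
print(f"{combined_cut(0.10, 0.10):.0%}")  # 19% -- nearly double what either intended
```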

2. Systemically bring in lead time fences in ERP. As demand drops universally, suppliers typically find themselves with increased inventory and excess capacity. Reducing lead time fences in ERP tends to reflect the new, albeit temporary, supply realities in a contracting market. If the EMS provider runs the ERP and requires the OEM to approve component lead times (as is contractually customary), the OEM needs to make sure that lead time approvals reflect the desired change and the current market conditions. This process will require the EMS provider to manage raw material orders more tightly, and will result in a reduction of the OEM's overall material exposure. It will also likely reduce weighted average material cost as commodity materials such as DRAM, Flash, PCBs and resins all show increased price erosion in contracting markets. But remember, it is extremely important to continue to monitor systemic lead times as the markets eventually stabilize and then return to a new state of normalcy.
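In MRP terms, the lead time fence governs how far ahead of the need date the system releases component orders; shrinking it pulls order releases closer to actual demand and cuts the material committed at any moment. A minimal back-scheduling sketch (the dates and lead times are hypothetical):

```python
from datetime import date, timedelta

def order_release_date(need_date, lead_time_days):
    """MRP back-scheduling: release the order lead-time days before the need date."""
    return need_date - timedelta(days=lead_time_days)

need = date(2009, 6, 1)
print(order_release_date(need, 90))  # 2009-03-03: commits material three months out
print(order_release_date(need, 45))  # 2009-04-17: half the exposure window
```

The shorter fence means orders placed today reflect fresher demand data, which is exactly what a contracting market calls for.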

3. Validate the financial viability of your critical suppliers (and their customers). Although this seems a simple exercise, understanding who is and who is not at risk in this financial Armageddon requires a different approach. Profitability, cash flow and sales growth must still be considered, but they now take a back seat to issues such as debt maturities, loan covenants and customer concentration ratio. There are a number of specialized component providers that have Top 5 customer concentration ratios in excess of 90%, suggesting the loss of even a single top customer (such as Nortel) could be a catastrophic event. If you have critical suppliers with a Top 5 customer concentration ratio greater than 70%, you probably need to better understand the risk profile of their key customers.
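The concentration check itself is simple arithmetic: sum the five largest customers' share of a supplier's revenue and compare it with the threshold. A sketch with made-up revenue figures:

```python
def top5_concentration(customer_revenue):
    """Share of total revenue contributed by the five largest customers."""
    sales = sorted(customer_revenue.values(), reverse=True)
    return sum(sales[:5]) / sum(sales)

# Hypothetical supplier: annual revenue by customer, in $M.
supplier = {"A": 40, "B": 25, "C": 15, "D": 8, "E": 5, "F": 4, "G": 3}
ratio = top5_concentration(supplier)
print(f"{ratio:.0%}")  # 93%
print(ratio > 0.70)    # True: time to examine the key customers' own risk profiles
```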

4. Never kick a supplier when he's down. Many a hard-negotiating commodity manager cannot resist the temptation of renegotiating a hard deal with a supplier on the ropes in a troubled market -- the rationale being that when times are particularly tough, companies cannot afford to lose a key account and are therefore willing to sell their soul to keep the business. While this is a common reality, the ramifications are broad, potentially dire, and unfortunately rarely fully understood. Undermining an already weak critical supplier has obvious attendant risks, but less obvious are the longer-term consequences. Suppliers tend to provide the best service to the best customers, and when things get tight (which they always do coming out of the capacity downsizing cycle that inevitably accompanies a downturn), flexibility around capacity-constrained products, services and terms goes to the most profitable accounts. Build relationships with your suppliers that drive long-term success for both parties, and that will be mutually beneficial in the upturn that will inevitably follow.

5. Don't fight the facts. When you roll up the sales forecast next month and it shows orders are down 20%, resist the temptation to ignore the facts. It is not a forecasting anomaly. Your company is not immune to the global downturn. Everyone likes a good upside and prefers not to trim their forecast, so be assured that your customers are probably not sandbagging. Between stock prices, real estate values and commodity prices, the world has lost $50 trillion in the last 12 months -- the single greatest contraction of wealth in world history. Cash is no longer king; it is now a ruthless dictator. Your company's balance sheet (more so than your products or technology) may very well be your single greatest asset. This downturn is real, cash is everything and inventory is a huge potential liability that will likely continue to depreciate at an accelerating pace. Don't worry about the upside for the next quarter. Instead, focus immediately on exceptionally prudent supply chain practices so your company can live to fight another day.

Ron Keith is COO of Riverwood Solutions, an innovator of managed supply chain services.

China is number one

China is number one

Jan 26th 2009

More than a billion people are using the internet

The number of people going online has passed one billion for the first time, according to comScore, an online metrics company. Almost 180m internet users—over one in six of the world's online population—live in China, more than in any other country. Until a few months ago America had the most web users, but with 163m people online, or over half of its total population, it has reached saturation point. More populous countries such as China, Brazil and India have many more potential users and will eventually overtake those western countries with already high penetration rates. ComScore counts only unique users above the age of 15 and excludes access in internet cafes and via mobile devices.