Saturday, March 07, 2009

the Speech

Friday, March 06, 2009

China to have world's fastest train by 2012

(Xinhua)
Updated: 2009-03-06 22:36

BEIJING -- Trains on the Beijing-Shanghai high-speed railroad under construction will run at 350 km per hour, the fastest in the world, said Li Heping, a researcher at the China Academy of Railway Sciences on Friday.

The 1,318-km-long railway line, with a designed speed of 350 kilometers per hour, will surpass the fastest line in the world at present, which has a designed speed of 320 kilometers per hour, he said.

Trains will make the run from the capital to Shanghai in less than five hours, compared with at least 11 hours at present.

The new rail line would be used for passenger service, while the existing one would carry cargo, according to Li.

"The railway traffic strain will be greatly eased by 2012 as the country will increase investment and build faster rails and trains," he said.

The Ministry of Railways has said it planned to invest 1.2 trillion yuan ($175.44 billion) to improve the railway system, aiming to have 13,000 km of passenger lines by 2012.

According to Li, 8,000 km of those lines would allow trains to travel at 200 to 350 km per hour, and the remaining 5,000 km would serve trains running at 200 to 250 km per hour.

The money is part of the 4-trillion-yuan stimulus package announced by the government late last year, the ministry said.

Currently, there are not enough seats for everyone who wants to travel, especially during the Lunar New Year, when millions are on the move for family reunions.

"Research on high-speed trains are going on smoothly," he said. The new trains would be batch-produced after trial operations in 2012.

China was developing trains with a designed speed of 350 km per hour, and test speeds might exceed 380 km per hour, according to Li.

China aimed to put 120,000 km of rail lines in service by 2020, of which 16,000 km would be dedicated solely to passenger service, according to the Ministry of Railways.

To meet the 2020 target will require about 5 trillion yuan, the ministry said.

The Virtualized Plant Floor


Improving operations without production disruption

In today's economic climate, manufacturers face increased pressure to control spending without sacrificing their competitive advantages. They cannot afford inefficiencies in areas such as production processes and product development. To stay competitive, many manufacturers need the ability to rapidly deploy and upgrade specialized engineering applications. If they cannot do this cost effectively, the result can be lost productivity and delayed time to market.

To address these challenges, virtualization technology is benefiting manufacturers in a number of ways. Virtualization was a cornerstone of IT projects in 2008 and will continue to be in 2009, as identified by IDC in its December 2008 report, "Worldwide System Infrastructure Software 2009 Top 10 Predictions." Different types of virtualization technologies enable companies to consolidate server environments, standardize computer terminals and set up multi-version access to hundreds of software applications.

Application virtualization is a technology that is beginning to specifically impact manufacturing operations and engineering. It will be a primary piece of technology in developing the next generation plant floor and design systems.

Innovation on the Plant Floor

It has long been understood that manufacturers are hesitant to upgrade their existing production systems -- even if it's only an application update and not a complete system overhaul. In a manufacturing facility, operational efficiency is measured by the number of hours the plant is in service -- in a standard working environment, 24 hours a day, five to seven days a week.

Updating plant floor software and computer systems, with the attendant risk of disrupting production, may not appear to be a wise move for manufacturers. It may seem safer to continue using the systems already in place. However, technology is constantly evolving, making it difficult to avoid software updates over the long term. To keep older software compatible with newer applications, regular system upgrades need to be performed.

The benefits of keeping software applications updated are sizeable: increased security, improved application compatibility and enhanced performance. However, when the benefits of upgrading plant floor systems are weighed against the possibility of a production disruption, it's not hard to understand why more than 80% of manufacturers continue using their legacy software systems -- systems 20 or more years old.

Facing the Challenge

Fear of plant floor disruptions should no longer be keeping manufacturers from updating their critical production systems. Technology has advanced beyond the days of the green screen, and virtualization, specifically application virtualization, is breaking down the remaining barrier. No longer does a specific application need to be tested at length, deployed in phases and then put into full production.

Instead, application virtualization creates a remote and secure connection to a central server where users access their applications. From there, a user has on-demand access to their applications from any computer terminal on the plant floor. Though the applications are not installed on individual computers, users do not have to change how they interact with the desktop. The application functionality is the same as if it were installed locally on the computer.

Engineers working remotely on laptops will also experience the same usability as those connected directly to the network. When an application is opened from the desktop and the session is later closed, the application code, settings and document profiles are saved on the laptop in a virtual "sandbox." This allows users to access the program even when they are not connected to the network. Once the laptop is reconnected to the network, updates made offline are automatically delivered to the application and saved.

For the plant floor, application virtualization means any version of a manufacturing execution, automation and production system can be deployed to a single piece of equipment without causing disruptions. For example, the operating software on a large format lithographic printer may only be compatible with version 1.0 of the automation system, but an automated specialty folder gluer may benefit from version 2.3 of the same automation system.

In this situation, a manufacturer may be inclined to use only version 1.0 of the automation system and forgo the benefits of version 2.3. This decision by a manufacturer will have a direct impact on the operational efficiency of the plant floor and create system compatibility issues for product engineers. However, if the manufacturer did decide to upgrade the automation system without using application virtualization, the IT department would have to conduct intensive regression and application compatibility testing -- undoubtedly causing delays in delivering the product to market.

Furthermore, the design software used to set up the corrugated box construction points for the specialty folder gluer may receive updates on a regular basis. This will require upgrades to the equipment's production system, which in turn will require retesting of the software to ensure compatibility with other plant floor systems (i.e. automation, execution and production).

Simplifying the Process

Application virtualization drastically simplifies this process and replaces it with a centralized structure where applications are sequenced and virtualized in a controlled environment. With application virtualization, a company can have an infinite number of software systems in production with no risk of interoperability issues.

This is accomplished by the design architecture of the technology. Application virtualization is designed to host each application individually and be accessed remotely. This keeps the hundreds of applications independent of each other while being used at computer terminals across the plant floor.

By keeping software application install packages hosted on the server instead of on local computer terminals, IT departments, plant floor managers and product engineers are able to work together more easily. Instead of product engineers having three computer terminals on their desks, one for each version of AutoCAD released over the past three years, they will have one computer that can simultaneously execute all three versions of the program.

Leading the Pack

A good example of a process manufacturer using application virtualization is BASF and its subsidiary BASF IT Services. Using Microsoft's Application Virtualization, BASF has improved the operational processes at its manufacturing facilities. As a result, the company is now able to take waste material at one plant and use it as a raw material at another plant.

To obtain such operational benefits and manage the processes between facilities, BASF has established an Engineering & Maintenance competence center where BASF Engineering manages the design and building of new chemical plants. This team of engineers is charged with establishing new plant production processes, optimizing old ones and improving the efficiency of asset allocation. The primary tool BASF Engineering uses for this is aspenONE Process Engineering.

BASF Moves to Application Virtualization

AspenONE offers BASF engineers a good balance between easy-to-use models and advanced process simulation tools for evaluating its plant floor systems and supply chain management processes. However, as with all software products, BASF engineers need the newest version of aspenONE to drive greater efficiency.

Before using application virtualization to roll out aspenONE updates, BASF IT Services regularly faced compatibility issues with other software applications on an engineer's desktop. This meant months of regression testing and troubleshooting to guarantee the newest version of aspenONE would work with an engineer's other programs -- causing costly delays for BASF.

To resolve the problem, BASF IT Services implemented application virtualization, putting aspenONE on a centralized server for remote execution. Consolidating the application enabled BASF IT Services to centrally install and locally provision it to engineers' desktops without the risk of application conflicts.

Engineers could now access the program, make changes and save documents through their local desktops, as they had done in the past. The only change noticeable to the engineers is the speed with which application updates are delivered. With application virtualization, BASF IT Services can now deliver updates for aspenONE automatically.

Through application virtualization, BASF IT Services has been able to reduce its average 9 to 18 month software deployment time by up to 90%; remove the need for engineers to develop application workarounds; create a platform for engineers to operate multiple versions of an application from a single desktop; and build a standardized desktop for application updates and rollouts.

Change for Tomorrow

Application virtualization is the pathway for manufacturers looking to replace their legacy production, automation and execution systems without risking the production delays that can hinder order fulfillment and market success. As the competitive landscape in manufacturing continues to tighten, it will be critical for manufacturers to have the right technologies available for their employees. The global economy may be following the law of physics that what goes up must come down, but the reverse is also true. To stay competitive in an eroding marketplace and to be ready for success in a boom requires the right technology. Application virtualization can positively impact plant floor operations to help manufacturers weather the tough times and thrive when economic conditions improve.

Tyler Bryson is the General Manager, U.S. Manufacturing and Resources Sector for Microsoft. The U.S. Manufacturing & Resources group at Microsoft represents more than 900 enterprise accounts across six industry segments: Automotive & Industrial Equipment; Chemical; Consumer Goods; High Tech & Electronics; Oil & Gas; and Utilities. The group is focused on helping manufacturers improve supply chain and operational performance, speed product development and open digital communication channels with customers. http://www.microsoft.com/industry/manufacturing/default.mspx

IT associations see RFID technology gaining ground


Hannover/Berlin. RFID technology has arrived in companies and is becoming ever more important there. RFID (Radio Frequency Identification) has entered a stable, continuous growth phase, and most corporate applications become profitable after only a short time. This was the conclusion drawn by representatives of the German Federal Association for Information Technology, Telecommunications and New Media (Bitkom) and of the Informationsforum RFID at a joint CeBIT press conference in Hannover on Friday.

According to them, RFID radio technology helps optimize business processes and thus increase profitability and cut costs, for large corporations as well as for small and medium-sized enterprises. Given the current economic conditions, such efficiency gains take on particular importance. However, because of unresolved regulatory requirements at the European level, companies still lack the planning certainty they need for their investments.

Professor Michael ten Hompel, chairman of the Informationsforum RFID: "RFID has outgrown its infancy and entered a continuous phase of market penetration. According to our market research, around 50 percent of RFID users were planning at the end of 2008 to expand their applications, and roughly 15 percent of non-users have concrete plans to implement RFID applications within the next two years. Particularly in process and cost optimization, the current economic situation could further increase the pressure to adopt the technology. Reducing CO2 emissions through more efficient logistics processes has also become an important criterion for choosing RFID."

Ten Hompel emphasized that, in practice, investments in RFID applications have proved economically worthwhile for companies: "According to our market survey, as well as current international studies, around half of RFID users reach return on investment within two years."

Bitkom vice president Heinz Paul Bonn stressed the importance of RFID for small and medium-sized businesses in particular: "Mid-sized companies today have great competitive opportunities if they engage with the new technology early and seriously examine the introduction of RFID."

Many companies, however, are still holding off on the decision to introduce the technology, because important political decisions have yet to be made. This applies in particular to the recommendation on data protection and security for RFID that the EU Commission began drafting in 2007: "For almost two years the Commission has been discussing this RFID recommendation and has thereby created considerable uncertainty in the market. Especially in economically difficult times, market participants need legal certainty for their investments rather than additional bureaucratic hurdles. Bitkom and the Informationsforum RFID therefore call on the Commission to finally provide clarity," Bonn said. The uncertainty is acting as a major brake on many investment decisions.

At the same time, Bonn made clear that the associations doubt the need for such an EU recommendation at all. Consumers' privacy and personal data must of course be protected, he said, but this is already comprehensively ensured by the German Federal Data Protection Act and by voluntary commitments from industry. The Informationsforum and Bitkom welcomed the EU Commission's initiative for rapid broadband expansion and for the availability of wireless Internet in rural areas as well, which will increase the availability and market penetration of modern communication services in general and RFID technologies in particular. (ak)

Invensys Process Systems Unveils Immersive Virtual Reality Process Technology.


IndustryWeek (3/6) reports Invensys Process Systems "recently unveiled its Immersive Virtual Reality Process technology, a next-generation human machine interface (HMI) solution that will change the way engineers and operator trainees see and interact with the plant and the processes they control." The solution is able to create "three-dimensional computer-generated representation of either a real or proposed process plant" and "has been designed for a wide range of scenarios, including process design, maintenance engineering and plant safety."


Wireless Devices Market Being Driven By Need For Real Time Data.


TMCnet.com (3/5, Adkoli) reports, "New analysis from Frost & Sullivan indicates that the wireless devices market in factory automation is being driven by the need for real-time data as well as work force mobility." According to a Wireless Devices Market In Factory Automation report, wireless devices markets "in countries such as Germany, France, Italy, Spain, and the United Kingdom, earned revenues of more than US$75.2 million in 2008 and this is estimated to reach US$132.8 million in 2012."

Thursday, March 05, 2009

Employers beware


Feb 24th 2009
From Economist.com

What departing workers take with them


IF YOU are losing your job, you might at least walk away with a competitive advantage. A survey for Symantec, an internet-security firm, suggests that some 60% of American workers who left their employers last year took some data with them. Respondents admitted lifting everything from e-mail lists to customer information, with two-thirds of such workers using the stolen data in their new jobs. The most popular method of theft was taking hard-copy files (61%), while around half put data on an electronic-storage device such as a CD or USB stick. And it seems easy to do: 82% of departing employees said that no checks were carried out on what they had kept. Many also admitted to keeping electronic-storage devices given to them for their jobs, even PDAs and laptops.


Six ways to make Web 2.0 work


Web 2.0 tools present a vast array of opportunities—for companies that know how to use them.

Technologies known collectively as Web 2.0 have spread widely among consumers over the past five years. Social-networking Web sites, such as Facebook and MySpace, now attract more than 100 million visitors a month. As the popularity of Web 2.0 has grown, companies have noted the intense consumer engagement and creativity surrounding these technologies. Many organizations, keen to harness Web 2.0 internally, are experimenting with the tools or deploying them on a trial basis.

Over the past two years, McKinsey has studied more than 50 early adopters to garner insights into successful efforts to use Web 2.0 as a way of unlocking participation. We have surveyed, independently, a range of executives on Web 2.0 adoption. Our work suggests the challenges that lie ahead. To date, as many survey respondents are dissatisfied with their use of Web 2.0 technologies as are satisfied. Many of the dissenters cite impediments such as organizational structure, the inability of managers to understand the new levers of change, and a lack of understanding about how value is created using Web 2.0 tools. We have found that, unless a number of success factors are present, Web 2.0 efforts often fail to launch or to reach expected heights of usage. Executives who are suspicious or uncomfortable with perceived changes or risks often call off these efforts. Others fail because managers simply don’t know how to encourage the type of participation that will produce meaningful results.

Some historical perspective is useful. Web 2.0, the latest wave in corporate technology adoptions, could have a more far-reaching organizational impact than technologies adopted in the 1990s—such as enterprise resource planning (ERP), customer relationship management (CRM), and supply chain management (Exhibit 1). The latest Web tools have a strong bottom-up element and engage a broad base of workers. They also demand a mind-set different from that of earlier IT programs, which were instituted primarily by edicts from senior managers.


Web 2.0 covers a range of technologies. The most widely used are blogs, wikis, podcasts, information tagging, prediction markets, and social networks (Exhibit 2). New technologies constantly appear as the Internet continues to evolve. Of the companies we interviewed for our research, all were using at least one of these tools. What distinguishes them from previous technologies is the high degree of participation they require to be effective. Unlike ERP and CRM, where most users either simply process information in the form of reports or use the technology to execute transactions (such as issuing payments or entering customer orders), Web 2.0 technologies are interactive and require users to generate new information and content or to edit the work of other participants.


Earlier technologies often required expensive and lengthy technical implementations, as well as the realignment of formal business processes. With such memories still fresh, some executives naturally remain wary of Web 2.0. But the new tools are different. While they are inherently disruptive and often challenge an organization and its culture, they are not technically complex to implement. Rather, they are a relatively lightweight overlay to the existing infrastructure and do not necessarily require complex technology integration.

Gains from participation

Clay Shirky, an adjunct professor at New York University, calls the underused human potential at companies an immense “cognitive surplus” and one that could be tapped by participatory tools. Corporate leaders are, of course, eager to find new ways to add value. Over the past 15 years, using a combination of technology investments and process reengineering, they have substantially raised the productivity of transactional processes. Web 2.0 promises further gains, although the capabilities differ from those of the past technologies (Exhibit 3).


Research by our colleagues shows how differences in collaboration are correlated with large differences in corporate performance.1 Our most recent Web 2.0 survey demonstrates that despite early frustrations, a growing number of companies remain committed to capturing the collaborative benefits of Web 2.0.2 Since we first polled global executives two years ago, the adoption of these tools has continued. Spending on them is now a relatively modest $1 billion, but the level of investment is expected to grow by more than 15 percent annually over the next five years, despite the current recession.3

Management imperatives for unlocking participation

To help companies navigate the Web 2.0 landscape, we have identified six critical factors that determine the outcome of efforts to implement these technologies.

1. The transformation to a bottom-up culture needs help from the top. Web 2.0 projects often are seen as grassroots experiments, and leaders sometimes believe the technologies will be adopted without management intervention—a “build it and they will come” philosophy. These business leaders are correct in thinking that participatory technologies are founded upon bottom-up involvement from frontline staffers and that this pattern is fundamentally different from the rollout of ERP systems, for example, where compliance with rules is mandatory. Successful participation, however, requires not only grassroots activity but also a different leadership approach: senior executives often become role models and lead through informal channels.

At Lockheed Martin, for instance, a direct report to the CIO championed the use of blogs and wikis when they were introduced. The executive evangelized the benefits of Web 2.0 technologies to other senior leaders and acted as a role model by establishing his own blog. He set goals for adoption across the organization, as well as for the volume of contributions. The result was widespread acceptance and collaboration across the company’s divisions.

2. The best uses come from users—but they require help to scale. In earlier IT campaigns, identifying and prioritizing the applications that would generate the greatest business value was relatively easy. These applications focused primarily on improving the effectiveness and efficiency of known business processes within functional silos (for example, supply-chain-management software to improve coordination across the network). By contrast, our research shows the applications that drive the most value through participatory technologies often aren’t those that management expects.

Efforts go awry when organizations try to dictate their preferred uses of the technologies—a strategy that fits applications designed specifically to improve the performance of known processes—rather than observing what works and then scaling it up. When management chooses the wrong uses, organizations often don’t regroup by switching to applications that might be successful. One global technology player, for example, introduced a collection of participatory tools that management judged would help the company’s new hires quickly get up to speed in their jobs. The intended use never caught on, but people in the company’s recruiting staff began using the tools to share recruiting tips and pass along information about specific candidates and their qualifications. The company, however, has yet to scale up this successful, albeit unintended, use.

At AT&T, it was frontline staffers who found the best use for a participatory technology—in this case, using Web 2.0 for collaborative project management. Rather than dictating the use, management broadened participation by supporting an awareness campaign to seed further experimentation. Over a 12-month period, the use of the technology rose to 95 percent of employees, from 65 percent.

3. What’s in the workflow is what gets used. Perhaps because of the novelty of Web 2.0 initiatives, they’re often considered separate from mainstream work. Earlier generations of technologies, by contrast, often explicitly replaced the tools employees used to accomplish tasks. Thus, using Web 2.0 and participating in online work communities often becomes just another “to do” on an already crowded list of tasks.

Participatory technologies have the highest chance of success when incorporated into a user’s daily workflow. The importance of this principle is sometimes masked by short-term success when technologies are unveiled with great fanfare; with the excitement of the launch, contributions seem to flourish. As normal daily workloads pile up, however, the energy and attention surrounding the rollout decline, as does participation. One professional-services firm introduced a wiki-based knowledge-management system, to which employees were expected to contribute, in addition to their daily tasks. Immediately following the launch, a group of enthusiasts used the wikis vigorously, but as time passed they gave the effort less personal time—outside their daily workflow—and participation levels fell.

Google is an instructive case to the contrary. It has modified the way work is typically done and has made Web tools relevant to how employees actually do their jobs. The company’s engineers use blogs and wikis as core tools for reporting on the progress of their work. Managers stay abreast of their progress and provide direction by using tools that make it easy to mine data on workflows. Engineers are better able to coordinate work with one another and can request or provide backup help when needed. The easily accessible project data allows senior managers to allocate resources to the most important and time-sensitive projects.

Pixar moved in a similar direction when it upgraded a Web 2.0 tool that didn’t quite mesh with the way animators did their jobs. The company started with basic text-based wikis to share information about films in production and to document meeting notes. That was unsatisfactory, since collaborative problem solving at the studio works best when animators, software engineers, managers, and directors analyze and discuss real clips and frames from a movie.4 Once Pixar built video into the wikis, their quality improved as critiques became more relevant. The efficiency of the project groups increased as well.

4. Appeal to the participants’ egos and needs—not just their wallets. Traditional management incentives aren’t particularly useful for encouraging participation.5 Earlier technology adoptions could be guided readily with techniques such as management by objectives, as well as standardized bonus pay or individual feedback. The failure of employees to use a mandated application would affect their performance metrics and reviews. These methods tend to fall short when applied to unlocking participation. In one failed attempt, a leading Web company set performance evaluation criteria that included the frequency of postings on the company’s newly launched wiki. While individuals were posting enough entries to meet the benchmarks, the contributions were generally of low quality. Similarly, a professional-services firm tried to use steady management pressure to get individuals to post on wikis. Participation increased when managers doled out frequent feedback but never reached self-sustaining levels.

A more effective approach plays to the Web’s ethos and the participants’ desire for recognition: bolstering the reputation of participants in relevant communities, rewarding enthusiasm, or acknowledging the quality and usefulness of contributions. ArcelorMittal, for instance, found that when prizes for contributions were handed out at prominent company meetings, employees submitted many more ideas for business improvements than they did when the awards were given in less-public forums.

5. The right solution comes from the right participants. Targeting users who can create a critical mass for participation as well as add value is another key to success. With an ERP rollout, the process is straightforward: a company simply identifies the number of installations (or “seats”) it needs to buy for functions such as purchasing or finance and accounting. With participatory technologies, it’s far from obvious which individuals will be the best participants. Without the right base, efforts are often ineffective. A pharmaceutical company tried to generate new product ideas by tapping suggestions from visitors to its corporate Web site. It soon discovered that most of them had neither the skills nor the knowledge to make meaningful contributions, so the quality of the ideas was very low.

To select users who will help drive a self-sustaining effort (often enthusiastic early technology adopters who have rich personal networks and will thus share knowledge and exchange ideas), a thoughtful approach is required. When P&G introduced wikis and blogs to foster collaboration among its workgroups, the company targeted technology-savvy and respected opinion leaders within the organization. Some of these people ranked high in the corporate hierarchy, while others were influential scientists or employees to whom other colleagues would turn for advice or other assistance.

When Best Buy experimented with internal information markets, the goal was to ensure that participation helped to create value. In these markets, employees place bets on business outcomes, such as sales forecasts.6 To improve the chances of success, Best Buy cast its net widely, going beyond in-house forecasting experts; it also sought out participants with a more diverse base of operational knowledge who could apply independent judgment to the prediction markets. The resulting forecasts were more accurate than those produced by the company’s experts.

6. Balance the top-down and self-management of risk. A common reason for failed participation is discomfort with it, or even fear. In some cases, the lack of management control over the self-organizing nature and power of dissent is the issue. In others, it’s the potential repercussions of content—through blogs, social networks, and other venues—that is detrimental to the company. Numerous executives we interviewed said that participatory initiatives had been stalled by legal and HR concerns. These risks differ markedly from those of previous technology adoptions, where the chief downside was high costs and poor execution.

Companies often have difficulty maintaining the right balance of freedom and control. Some organizations, trying to accommodate new Web standards, have adopted total laissez-faire policies, eschewing even basic controls that screen out inappropriate postings. In some cases, these organizations have been burned.

Prudent managers should work with the legal, HR, and IT security functions to establish reasonable policies, such as prohibiting anonymous posting. Fears are often overblown, however, and the social norms enforced by users in the participating communities can be very effective at policing user exchanges and thus mitigating risks. The sites of some companies incorporate “flag as inappropriate” buttons, which temporarily remove suspect postings until they can be reviewed, though officials report that these functions are rarely used. Participatory technologies should include auditing functions, similar to those for e-mail, that track all contributions and their authors. Ultimately, however, companies must recognize that successful participation means engaging in authentic conversations with participants.

Next steps

Acceptance of Web 2.0 technologies in business is growing. Encouraging participation calls for new approaches that break with the methods used to deploy IT in the past. Company leaders first need to survey their current practices. Once they feel comfortable with some level of controlled disruption, they can begin testing the new participatory tools. The management imperatives we have outlined should improve the likelihood of success.

Keep the conversation going on Twitter

Do our six recommendations agree with the successes and failures you’ve seen? Is the economic downturn affecting your perception and use of Web 2.0 tools? What organizations get the most out of Web 2.0, and why? Use the #web2.0work hashtag to respond to this article and these questions on Twitter. We’ll be following them and responding via our McKinsey Quarterly account, @McKQuarterly.

About the Authors

Michael Chui is a consultant in McKinsey’s San Francisco office; Andy Miller is an associate principal in the Silicon Valley office, where Roger Roberts is a principal.


The authors would like to acknowledge the contributions of their colleagues James Manyika, Yooki Park, Bryan Pate, and Kausik Rajgopal.

Notes

1. Scott C. Beardsley, Bradford C. Johnson, and James M. Manyika, "Competitive advantage from better interactions," mckinseyquarterly.com, May 2006.

2. "Building the Web 2.0 Enterprise: McKinsey Global Survey Results," mckinseyquarterly.com, July 2008.

3. See G. Oliver Young et al., "Can enterprise Web 2.0 survive the recession?" forrester.com, January 6, 2009.

4. See Hayagreeva Rao, Robert Sutton, and Allen P. Webb, "Innovation lessons from Pixar: An interview with Oscar-winning director Brad Bird," mckinseyquarterly.com, April 2008.

5. Exceptions exist for harnessing information markets and searching crowd expertise, where formal incentives are an essential part of the mechanism for participation.

6. See Renée Dye, "The promise of prediction markets: A roundtable," mckinseyquarterly.com, April 2008; and the video "Betting on prediction markets," mckinseyquarterly.com, November 2007.

The gap between pay for men and women in Europe

A yawning gap

Mar 4th 2009
From Economist.com

The gap between pay for men and women in Europe


DESPITE the best efforts of lawmakers and feminists, there is a long way to go to close the pay gap between the sexes. In the European Union, which set out its principle of equal pay for equal work 50 years ago, gross hourly wages for men were 17.4% higher than for women in 2007, according to a new report. Countries with the biggest differentials either have highly segregated labour markets, such as Estonia, or a high proportion of women working part-time, such as Germany, Britain and the Netherlands. Where the female employment rate is low, there is usually a smaller gap, because of the small proportion of low-skilled or unskilled women in the workforce.


Wednesday, March 04, 2009

IT industry: Lenovo commissions Geodis Logistics for notebook warranty services


Frankfurt am Main / Hannover. When Lenovo customers in Germany or Austria want to have a computer repaired or replaced, Geodis Logistics now handles the logistics. The logistics provider Geodis Logistics Deutschland, part of the French Geodis group, is now responsible for warranty repairs and spare-parts logistics for Lenovo consumer notebooks. As LOGISTIK inside learned on Tuesday on the sidelines of the CeBIT IT trade fair in Hannover, the repairs are carried out at the Geodis site in Heppenheim.

So that Lenovo users can check the repair status of their notebooks, Geodis Logistics has also set up a tracking page for the hardware manufacturer on its homepage (http://www.lenovo.de). Customers can thus follow online when to expect their repaired or replaced device back, which takes considerable load off the customer support hotline.

Geodis Logistics has been working for Lenovo since 2007. Since then, the logistics provider has supported the hardware manufacturer with warranty services for laptop batteries. Customer service for Lenovo's key accounts, such as Bosch, Porsche, Daimler and Landesbank Baden-Württemberg, is another area Geodis Logistics has taken over for Lenovo: the provider repairs PCs and notebooks on site at those companies.

"Through the cooperation with Geodis Logistics, we are outsourcing our customer warranty services completely to an experienced expert," said Markus Wagenmann, Region Service Delivery Manager at Lenovo. "This allows us to guarantee our customers even faster and more reliable repairs. The web tracking tool also provides greater transparency, and that is exactly what our customers expect from us." (pi)

China Manufacturing Index Rises Thanks To Stimulus.


Bloomberg News (3/4, Yanping, Piboontanasawat) reports that China's Purchasing Managers' Index "climbed for a third month, adding to evidence that a 4 trillion yuan ($585 billion) stimulus package is pushing the world's third-biggest economy closer to a recovery." The index "rose to a seasonally adjusted 49 in February from 45.3 in January, the China Federation of Logistics and Purchasing said" Wednesday. "Stocks rose after output and new orders expanded for the first time in five months. Chinese Premier Wen Jiabao may announce extra measures to reverse the nation's economic slide." The New York Times (3/4, Wassener) also covers the story on its website.

Microsoft, Infosys Alliance To Aid Supply Chain Visibility, Collaboration.


Supply & Demand Chain Executive (3/3) reported, "In an effort to help manufacturers manage their increasingly complex global supply chains, Microsoft Corp. and Infosys Technologies said they have forged a new go-to-market alliance around improving supply chain visibility and collaboration. The two companies today are jointly launching a set of solutions, services and a center of excellence that is expected to help manufacturers build next-generation supply chains with improved performance and better visibility across their enterprises and trading partners." In order to address issues such as "poor collaboration, inadequate tools and technology, and challenges in getting the right data into the right format," the companies' market solutions are intended to "provide capabilities for performance management, analytics, collaboration and event-based exception management," and "also include proprietary connectors developed by Infosys and Microsoft to help facilitate the integration of these modules with existing business systems," among other things.

The Real Price of Obama's Cap-and-Trade Plan


A carbon-emissions limit will raise energy prices unevenly.

By Kevin Bullis

President Obama's budget numbers depend heavily on revenues from a proposed cap-and-trade program for reducing carbon dioxide emissions. Under the plan, these revenues will come at the cost of higher energy prices, with some states being affected far more than others.

The cap-and-trade program does not yet exist: it will need to be established in future legislation. But the inclusion of future revenues in the budget, and a promise to pursue necessary legislation, is the strongest commitment yet that the administration will follow through with one of Obama's campaign promises and establish a cap-and-trade system for carbon dioxide emissions.

Under such a system, the government sets an annual cap on carbon dioxide emissions--the budget calls for a cap of 14 percent below 2005 emissions levels by 2020, and 83 percent below 2005 levels by 2050. The government then issues a set number of credits for the total emissions allowed under that cap. Under Obama's plan, those credits won't be given away, as they were in the initial version of a cap-and-trade system employed in Europe. Instead, the credits will be auctioned off, and that money will be the source of government revenue. Polluters will be required to buy enough credits at the initial auction to cover their carbon dioxide emissions, or acquire more by trading with others at a later stage. Alternatively, they can reduce their emissions by investing in more efficient technologies. Either way, these costs will result in higher energy prices.

The budget includes $78.7 billion in projected revenues from the cap-and-trade system in its first year, 2012, and $525.7 billion total by 2019. According to Point Carbon, an energy-market analysis firm based in Oslo, Norway, these numbers are based on the assumption that credits for a ton of carbon dioxide will sell for $13.70 in 2012 and $16.50 by 2020. These estimates are in line with carbon credits issued in Europe, says Veronique Bugnion, a managing director at Point Carbon. The 2012 price for carbon dioxide emissions will increase gasoline prices by 6 percent compared to current prices, she says. Average electricity prices will increase by 6.8 percent--perhaps more. According to calculations by Gilbert Metcalf, an economist at Tufts University, the average electricity price increase would be 9.7 percent by 2012 and 11.7 percent by 2020.
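As a rough back-of-envelope check of how the revenue projection and the assumed credit price relate, the short Python sketch below divides one by the other; the figures are the estimates quoted above, and the result is illustrative only, ignoring which sources are actually covered by the cap.

# Back-of-envelope check of the cap-and-trade arithmetic quoted above.
revenue_2012 = 78.7e9    # projected auction revenue in 2012, dollars
price_2012   = 13.70     # assumed price per ton of CO2 in 2012, dollars

credits_2012 = revenue_2012 / price_2012
print(round(credits_2012 / 1e9, 2), "billion tons of CO2 in auctioned credits")
# roughly 5.7 billion tons, on the order of total annual US CO2 emissions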

What's more, the impact of the cap-and-trade system will vary by state. Electricity prices will rise more in states that rely heavily on coal, such as North Dakota, than in states that rely on sources of electricity that produce little carbon dioxide. According to Bugnion, prices could increase by 19.2 percent in North Dakota by 2012 but only 2.6 percent in Washington State, which relies heavily on hydroelectric power, over the same period.

To offset some of these price increases, the budget includes provisions to use some of the auction revenue for tax relief. From 2012 to 2019, $15 billion a year from the carbon-emissions program will be used to pay for "vital investments in a clean energy future"--funding for clean energy technology. The remaining money from the auction is expected to be just enough to pay for a tax credit that is an extension of the "Making Work Pay" credit--a $400-a-person credit included in the recently passed stimulus bill.

Numerical Methods

Numerical Methods . . . a Sorcerer's Apprentice View

Mankind knows exactly one workable means for the precise rendering of dynamic phenomena. Leibnitz's approach to simulating time-dependent mechanisms by recursive numerical integration is not only singular, it is likely to be the most productive abstraction since the golden rule. But, despite economics' emphatically dynamic context and the prominence of 'dynamic' in economic discourse, the singularly effective techniques of numerical integration are almost never brought to bear on political arithmetic. Indeed, numerical simulation methods are all but absent from the economics curriculum.


Familiarity with an Esoteric Discipline

Students' general lack of instruction in the formal methods for emulating dynamic processes is a terrible waste of the critical experience students develop in the long hours spent with their video games. They instinctively know that mankind has mastered the sciences of kinetics and optical projection because they can see these disciplines at work in the game programs with which they interact. This equips them with important criteria for other disciplines they encounter, e.g.: useful models of exterior reality must, at minimum, be expressible in stable, dynamic simulations embodied in physical devices, such as their video game players, capable of operating in the same objective realm as the phenomena being modeled.

Many students arrive in class so saturated with Leibnitz that they can be taught elementary numerical methods as the mere by-product of some other course of study. Their prior exposure is easily given shape and organization by one of the many online courses supported by the System Dynamics fraternity. A considerable degree of proficiency with numerical simulation techniques arrives with an ability to visualize time as if it were a spatial dimension. Mastery of this conceptual skill might be assessed in one's comfort with the brief synopsis to follow.



Progress from One State to the Next

Dynamic systems control state variables in the dimension of time. The notion of a state variable can be represented by any economic stock - say the amount of capital held by the Widgets Sector. In the illustration below, the level of this stock is S0 tons at time = t.

Simulation techniques rely on a scientific discipline's ability to compute the rates at which a system's stocks are changing based on current levels of those stocks. In economics, these references would be to current levels of all the inputs to all the economic sectors.

Input levels would interact with the shapes of production and utility tradeoffs to determine supply, demand, prices and so forth. Ultimately, these calculations must specify the rate R tons/year at which the Widgets Sector replenishes its capital, and the rate E tons/year at which its capital is expended.

The difference between R and E is the net rate of change of the Widget Sector's capital stock. This rate translates to a slope in the dimensions of tons of capital versus time; and the slope defines a radiant along which the capital stock departs from its current value S0.

Since R and E were computed based on the stocks at time = t, we must not allow them to govern the time-radiant beyond a small time interval d. Thus time's flow must be interrupted at t+d to re-compute all the stocks and refresh the rates of change.

Re-computing stocks simply applies the equation for straight lines in their slope-intercept form. As illustrated at the left:

(R - E) = (S1 - S0) / (t+d - t).

This quickly yields a value for S1, the next state:

S1 = S0 + d*(R - E).
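In Python, this update is a single line of arithmetic. The sketch below uses purely illustrative numbers; they are not taken from any SFEcon model.

# One Euler step for a single stock.
S0 = 100.0             # capital held by the hypothetical Widgets Sector at time t, tons
R  = 12.0              # replenishment rate, tons/year
E  = 9.0               # expenditure rate, tons/year
d  = 0.1               # small time interval, years

S1 = S0 + d * (R - E)  # next state, at time t + d
print(round(S1, 2))    # 100.3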



Concatenation of States to Approximate Time's Continuum

When all such input levels have been computed for time = t+d, we can re-iterate our procedure to bring us up to a time that is advanced by one more d. We simply substitute the value of S1 for S0 in the above cycle of calculations, re-compute whatever R and E this might imply, and compute a new value of S1 that is the system's state at t+d+d.

Embedding a view of economics in the numerical simulation technique reveals that a model's operation is effected by a continuous cycle of the following steps (a minimal code sketch follows the list):

1. Using the current levels of state variables (viz., the amounts of each economic good possessed by each economic actor) to compute the rates at which those variables are changing at the current moment;

2. Updating the state variables by allowing their rates of change to persist for a brief interval, thus simulating the passage of one differential element of time; and

3. Re-computing the rates of change based on the updated state, advancing time one more interval by simulating the brief persistence of these new rates, etc., etc., etc.
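The Python sketch below runs this cycle for a single stock. The rate rule is made up for illustration; in a real model, R and E would come from the economic calculations of supply, demand and prices described above.

# Recursive numerical integration of a single stock, Euler style.
# The rate rule is purely illustrative.

def rates(S):
    R = 12.0           # replenishment, tons/year (assumed constant)
    E = 0.10 * S       # expenditure proportional to the stock (assumed)
    return R, E

S = 100.0              # initial capital stock, tons
d = 0.05               # differential element of time, years

for step in range(600):        # 600 steps of d = 0.05 years, i.e. 30 years
    R, E = rates(S)            # 1. rates of change from the current state
    S = S + d * (R - E)        # 2. let those rates persist for one interval d
                               # 3. repeat with the updated state

print(round(S, 1))     # approaches the steady state 12.0 / 0.10 = 120 tons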

Note that this rendition of economic dynamics only relies on economic principles to compute the system's rates of change from its state variables. The manner in which the state variables are updated by the rates would be the same if our subject were aeronautics, or thermodynamics, or open channel flow, or any other time-dependent phenomenon.

Note also the conflict between numerical integration and the notion of successive statics that now animates so much economic theory. Your visualization of the progress from one state to the next showed a state variable having its value adjusted as time flowed. Successive statics would, by contrast, have displayed that variable's movement as a flat line parallel to the time dimension; and would somehow portray change as occurring in the durationless boundaries between the differential time elements. To the practitioner of ordinary engineering dynamics, economics seems to have gotten comfortable with change as occurring in some (astral?) dimension other than time.

And, as noted in the introduction to Model 0, SFEcon's circuit of economic cause and effect closes within a single differential element of time, and without the least necessity of having arrived at any sort of ultimate, equilibrium, or optimal state. This completes Model Zero's definition as a finite-state machine: its operation depends only upon knowledge of its current (not necessarily ultimate) state, and a (correct) set of rules for computing its next state.

The Idea of Mathematical Delay


Delay mechanisms are the essential devices by which dynamic phenomena are understood. These fundamental dynamic sub-assemblies define specific arrangements of rates-of-change with the state variables that they control. Such structures are best visualized in 'signal path diagrams', the animation of which requires application of numerical methods. While signal path analysis is a rather daunting topic, we do hope to provide some sense of comfort (however false) with SFEcon's use of this discipline.


Exponential Decay and the Signal Path Diagram

The most primitive, naturally-occurring pattern of dynamic adjustment is exponential decay. This pattern is also familiar: consider the matter of a pizza delivered to a room such that the temperature differential between the pizza and the room equals some amount A; obviously A will decay away to zero with the passage of time T. According to Newton's curve of cooling, the value of A will vary with T according to the exponential pattern shown below.

Exponential functions are characterized by a unique property: striking a tangent at any point, and extending the tangent until it crosses the time axis, will always define the same distance 1/V along that axis. The functional relation between A and T is thus A = A0 * exp(-V*T), determined by the parameter V and A's initial value A0.

Comparatively few dynamic systems are simple enough to yield closed-form expressions of their behavior, such as the equation for exponential decay above. More complicated systems can be made to disclose their dynamics if they can be perceived in terms of a 'signal path diagram'. The signal path for exponential decay, shown below, makes use of the three elements by which all signal paths are composed.

The current level of temperature differential is depicted by a square. Dissipation of the potential embodied in A is visualized by flow along a solid line leading away from the square. The rate of decay is depicted by a valve symbol attached to the line of flow away from A. Broken lines indicate flows of information, which are considered instantaneous. Thus the instantaneous rate of decay E is always computed as the current level of A multiplied by the parameter V, which is depicted by a circle. This mechanism indicates that, as time passes, potential A always flows out in proportion to the amount of A that is left; A therefore depletes at the ever decreasing rates characteristic of exponential decay.
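A minimal numerical sketch of this signal path, in Python, appears below; the pizza-cooling numbers (A0 and V) are invented for illustration and are not taken from any SFEcon model.

# Exponential decay of the temperature differential A, integrated numerically.
A = 40.0                          # initial differential A0, degrees
V = 0.5                           # decay parameter, per minute (so 1/V = 2 minutes)
d = 0.01                          # differential element of time, minutes

for step in range(400):           # 400 steps of d = 0.01 minutes, i.e. 4 minutes
    E = V * A                     # instantaneous rate of decay, from the current level of A
    A = A + d * (0.0 - E)         # no inflow; the level departs along its current slope

print(round(A, 2))                # close to the analytic value 40 * exp(-0.5 * 4), about 5.41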



First-order Delays

First-order delays are to dynamics what exponentially-weighted averages are to economics, viz. filtering mechanisms. A delay is constructed by attaching an exogenous driving function R to the signal path depicting exponential decay:

Whatever pattern is evident in the time-series of R will emerge in attenuated and 'spread out' form at 1/V time units later in the time-series of E. E is a delay or lag on R; and the current value of E is an exponentially-weighted average on all past values of R. If R is held constant, A will attain a level such that E = VA = R.
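A similar sketch of the first-order delay follows, again with invented numbers. It shows E smoothing a constant input R and settling at E = V*A = R, as stated above.

# First-order delay: an exogenous driving function R feeds the level A,
# which drains at the rate E = V * A.  All values are illustrative.
A = 0.0
V = 0.25                          # 1/V = 4 time units of delay
R = 10.0                          # input rate, held constant here
d = 0.05

for step in range(800):           # 800 steps of d = 0.05, i.e. 40 time units
    E = V * A                     # the delayed, smoothed output
    A = A + d * (R - E)           # inflow R less outflow E, persisted for one interval d

print(round(E, 2))                # approaches R = 10, so A approaches R / V = 40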

First-order delays have familiar uses in economics and operations analysis. If, for example, R is a series of sales rates for some product, E might disclose appropriate production rates by smoothing the R series. Economists refer to a delay model of consumption and income whereby consumption E is thought to be determined by a lag on income R. This model is considered at some length in SFEcon's treatment of Keynesian economic theory, which we commend as another source of familiarity with the notion of delay.



Higher-order Delays

The emulation of certain phenomena requires that the dynamics of smoothing and delay be arrayed in more complex ways. Higher-ordered delays are constructed by arranging first-order delays in series, such that the effluent of one level cascades into the next. This arrangement is shown below for a third-order delay in which three levels A, B, and C are used to filter and smooth an exogenous input rate R.

Here the delay on R would be given by E = V (A + B + C), and equations for continuously re-computing the levels would follow the pattern shown in the diagram. The functions for updating the levels are in the form required for emulation by numerical methods: levels depart from their prior values by allowing their controlling rates' influence to persist over a differential element of time d.
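A hedged Python sketch of such a cascade follows. Since the diagram itself is not reproduced here, the per-stage drain rate of 3*V is an assumption on our part, chosen so that the steady state satisfies E = V*(A + B + C) = R as stated above and so that the overall delay is 1/V. Cutting the input to zero from that steady state previews the sigmoid depletion discussed in the next section.

# Third-order delay: three first-order stages in cascade.
# The per-stage rate 3*V is an assumed reconstruction of the missing diagram.
V = 0.125                    # overall delay 1/V = 8 time units
d = 0.01                     # differential element of time
R = 10.0
A = B = C = R / (3 * V)      # begin in steady state, where E = V*(A + B + C) = R

for step in range(1, 1601):  # 1,600 steps of d = 0.01, i.e. 16 time units
    fA = 3 * V * A           # effluent of A, cascading into B
    fB = 3 * V * B           # effluent of B, cascading into C
    fC = 3 * V * C           # effluent of C, leaving the system
    A = A + d * (0.0 - fA)   # the input rate R is suddenly made 0
    B = B + d * (fA - fB)
    C = C + d * (fB - fC)
    if step % 200 == 0:      # report every 2 time units
        # the collected levels A + B + C trace the sigmoid pattern
        # described in the next section
        print(round(step * d, 1), round(A + B + C, 2))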



Sigmoid Decay

Higher-ordered delays are used for the emulation of behaviors having more complexity than can be rendered by simple, first-order, or exponential delays. The particulars behind such a statement can be as numerous as there are possible driving functions R. But differences in behavior between first-order and higher-ordered delays are best exposed for the case of an established steady-state, in which R = E, that is set on a course of dynamic adjustment by suddenly making the input rate equal 0.

When this is done to the first-order delay, we observe the exponential pattern of decay introduced at the top of this page.

When this is done to a third-order delay, we observe the sigmoid pattern shown below.

The sigmoid pattern of decay shows very little initial loss in the collected levels A + B + C. This is followed by a brief period in the vicinity of T = 1/V wherein the collection of levels depletes quite rapidly. The pattern is completed in a third phase characterized by levels that are now so low that they have comparatively little force with which to propel their final exhaustion.

A sigmoid pattern of decay might be appropriate for mapping the useful life of a capital asset, such as a family automobile purchased with an intention to be used for, say, 1/V = 8 years. For much of the automobile's useful life it functions as if new. As its age begins to approach the point for which its obsolescence has been designed, the automobile's functionality is only maintained by increasingly frequent and costly repairs. Finally, maintenance expenses are so high that the automobile must be re-sold (i.e. re-valued) at a low enough price for it to be economically useful to its new owner.

A sigmoid pattern's exact shape is controlled by the order of the higher-ordered delays generating it: the higher the order, the sharper the transition between initial and final states. Different orders of delay are therefore likely to be indicated for tracing out the useful economic lives of commodities composing an economic model. SFEcon's generic prototypes use third-order delays to emulate the processes by which economic goods exhaust themselves in producing the next generation of goods. But this is only a formality intended to 'hold place' for a more incisive analysis.

A newer world order

A newer world order
26 February 2009

The year is 2040. China and the United States dominate the world landscape, forming an uneasy and not entirely stable partnership, bound together by ties of mutual benefit. China’s economy is the largest in the world, having grown at a higher rate for the last two decades. With that influence has come a greater role in global security. The United States has stabilized at a lower growth rate and has refocused its global role around its critical priorities in the Middle East and the Americas. Both nations depend on the other as their largest trading partner.

But more important than any individual country are the new multilateral institutions that provide a supranational framework. Around the world, various nation-states regularly find themselves physically and financially incapable of withstanding the acts of terrorism, financial crises, and natural disasters that periodically engulf them. It is this increased level of risk that has given rise to newly empowered organizations.

While the United Nations remains the world’s premier talking shop, institutions affiliated with it play an increasingly important role in the conduct of global affairs. The International Reserve acts as a global stabilization fund and also as a regulator and certifying agency for the world’s financial exchange. The Asia-Pacific Treaty Organization, including the United States, has taken over a security role in Asia, complementing an expanded NATO agenda in Europe, Africa, and the Middle East. The International Criminal Court has become the judicial body of last resort and is also the global community’s investigating arm—the “sharp edge of the sword” in the defense of human rights. The World Health Organization has acquired global quarantine authority supported by local enforcement arms; this change was instituted after the outbreak of a second virulent biological plague in 2020. Meanwhile, as charitable giving outweighs official development aid by an order of magnitude, nongovernmental organizations (NGOs) play an increasingly prominent role in developing countries.

As its economic dominance wanes, the United States has reluctantly become a principal beneficiary of these multilateral institutions. Indeed, it depended on them as it weathered multiple financial crises requiring ever-greater levels of coordinated global intervention to restore economic stability. Domestic economic priorities continue to drive policy. Health care costs now amount to 25 percent of GDP, and marginal tax rates are approaching 50 percent. Immigration has slowed to a trickle, but population growth continues to be robust at 1 percent per annum. US schools have maintained their global preeminence, particularly in biotechnology. Terrorist incidents of both domestic and foreign origin have become more common. Yet US military expenditures have declined from their peak in 2010. The nation’s military bases are now concentrated in Asia and the Middle East in order to defend the nation’s still-critical oil pipelines.

As the world’s largest economy, China has become banker of last resort to the world. It is by far the largest holder of US government debt. The RenEurDol (or Red, for short)—a new international currency made up of the freely floated renminbi, the US dollar, and the Euro—has been established as the world’s reserve currency. No economic crisis anywhere can be resolved without the approval and participation of China. China leads the world in energy, electronics, and material science technologies. It continues to be a manufacturing powerhouse, but its factories now churn out high-tech gear designed by graduates of Chinese universities. Over 50 percent of the Fortune (now renamed Caijing) 500 are based in China.

China’s population, which peaked at about 1.5 billion in the year 2030, is starting to decline even as immigration is picking up. Electoral democracy has come to China, though the Communist Party maintains its leading role in society, holding most of the seats in the National People’s Congress and most local government jobs. The Chinese population’s migration from the countryside to the cities peaked at around 70 percent. China is now effectively made up of ten major economic clusters, each of which shares common infrastructure but possesses heterogeneous cultural tastes. The Chinese have cleaned up their cities, but they now face soaring health care costs—their cancer, obesity, and stroke rates have skyrocketed due to a more sedentary lifestyle as well as the high levels of pollution that were present early in the century.

Elsewhere in the region, India rivals China for influence in South Asia, particularly over access to gas reserves, which are now critical to both countries’ energy requirements, though both also depend on hydro, nuclear, wind, and solar energy sources. India has emerged as a major infrastructure investor in its own right and, while not a major exporter, it has succeeded in building an industrial base largely able to meet its own requirements. India maintains its global reputation as a service-oriented economy, and the Bollywood film industry is now many times the size of Hollywood in both value and productivity. A late wave of urbanization beginning in the 2010s has created a thriving urban economy around a new set of Indian cities.

Japan continues to age rapidly, and its capital surplus is largely consumed domestically as it turns increasingly inward both economically and culturally. The Middle East is now aligned around a group of trading regimes headquartered in the Gulf States, the natural resources boom having definitively ended during the 2020s. Africa is finally emerging. A fully urbanized South Africa has taken up regional leadership in the southern part of the continent, while Egypt, Algeria, and Morocco provide centers of commerce and funding in the north. The only region that stubbornly persists in underdevelopment is South America, where local ruling elites have successfully entrenched themselves and blocked productivity-oriented development priorities.

For its part, Europe remains relatively prosperous but has continued to suffer from immigration-related social tensions and rising costs given its aging population. Russia is now fully economically integrated into Europe and depends on capital flows from the continent to support its own even more rapidly aging population. Eastern Europe has become the new technology hub of the continent, with major scientific and educational institutions that rival those of the United States. Thanks to climate change, the economy is driven by wind and nuclear power, under constant refinement by the region’s R&D labs. In southern Europe, emergency medical teams have taken up permanent residence fighting periodic outbreaks of various tropical diseases brought on by climate change.

While military skirmishes take place from time to time, mostly over natural resources such as oil and water, the world has settled into a sometimes tense stasis. China and the United States defend their critical interests directly but defer to multilateral institutions for most of the fighting and intervention. The established relationship between the two nations is generally supportive, as both have a mutual interest in defending a global market–based economy. No player has yet emerged to challenge China and the United States directly, nor can either exert unilateral power in the way that the United States and the United Kingdom held global sway in the past. A new and special relationship now rules the waves.


Tuesday, March 03, 2009

Finland to buy CO2 emissions from Chinese biogas project

Finland to buy CO2 emissions from Chinese biogas project
By Zhang Qi (chinadaily.com.cn)
Updated: 2009-03-03 19:43

Finland's government has signed an agreement to buy 1.4 million tons of CO2 emissions from a Chinese environmental company for 14.5 million euros ($18.26 million) by 2018 under the clean development mechanism (CDM) project, Finland's Embassy in China said on Tuesday.

Finland will pay Beijing Hebayi Ecological Energy Development Company to purchase the Certified Emission Reductions (CERs) from household biogas reactors and biogas cookers in 210,000 households in Hunan province.

According to the purchase agreement, Finland will buy CERs for both the Kyoto period 2009-2012 as well as for the post-Kyoto period 2013-2018.

The CDM is under the Kyoto Protocol, which allows developed countries with a CO2-reduction commitment to buy carbon credits from developing countries.


Monday, March 02, 2009

Introducing an ERP System at a Manufacturing Company

Introducing an ERP System at a Manufacturing Company

The company's management had decided to introduce a new ERP system to increase the efficiency of its company-wide business processes. To this end, TCW was commissioned to carry out a systematic, methodically supported selection process for the introduction of a new ERP system. Over the course of the project, a targeted evaluation of IT systems was conducted in order to secure the main goals of the ERP deployment: a high degree of transparency about the performance of the various sites and a supply of information tailored to its users. Capturing and evaluating system and user requirements, and comparing them against various ERP solutions, subsequently made it possible to consolidate the criteria critical to success in selecting and implementing an integrated ERP system.

Initial Situation

Strong organic growth, combined with an ambitious acquisition strategy, required not only company-wide optimization of business processes but also the introduction of an integrated information system for better planning and control of operations. The situation at the time was characterized by isolated IT solutions installed at the various sites, each suited only to controlling local workflows. This local focus and the heterogeneity of the systems in use made it difficult to exploit group-wide synergies and to realize key success factors in the IT support of the organization. Controlling of the individual sites' activities, and transparency about the deployment and use of operational resources along the value chain, could not be prepared and documented in a uniform way. The resulting misallocation of resources contributed to a deterioration in the company's earnings. Furthermore, the incompatibility of the existing information systems made the exchange of relevant information and the provision of operational data possible only with difficulty. Because no adaptation or expansion strategies existed for the installed information technologies, and because the boundaries of the IT systems were only inadequately mapped and defined, no sufficient potential for optimizing the existing IT organization could be identified. It became clear that only a holistic, integrated solution could do justice to this starting position.

The introduction of a group-wide ERP system was intended to put in place a central lever for a lasting increase in the company's organizational performance.

Approach and Results

Given this starting position of an insufficiently developed IT strategy, management commissioned TCW to carry out a systematic selection of suitable ERP systems in order to realize the identified potential of an agile and high-performing organization. The objectives of the project came down to the following key aspects. Above all, the new system was expected to create transparency about operations at the different sites by bundling and integrating all operationally relevant data and information in a single information system. By reducing interfaces and standardizing business processes, the responsiveness, flexibility, and adaptability of the organization to changing market and competitive conditions were to be improved. In addition, organizational efficiency along the value chain was to be increased by providing better decision support based on integrated, uniform operational data. Finally, it had to be ensured that the usability and handling of the new ERP system would be tailored to the needs of the employees.

The implementation of the project mandate was structured as a seven-step approach. The following phases were carried out:

  1. Initiation of the preparatory measures
    • Appointment of the project leadership and definition of the project objectives
    • Capture and analysis of the departments' system requirements
  2. Execution of the project start-up phase
    • Preparation of a requirements specification (Lastenheft) with the company-specific requirements for the ERP software
    • Development of an ERP functional specification (Pflichtenheft) together with the ERP vendor
  3. Selection of the suitable system solution
    • Definition of selection criteria taking the company-specific starting position into account
    • Execution of the evaluation and selection process
  4. Controlling of the installed ERP solution's impact on results, and continuous project management during the implementation phase

The preparatory measures consisted of reviewing the preliminary work already done by the company. Once the project objectives had been communicated to the relevant stakeholders and the tasks of the project leadership defined, the departments' requirements for the new system were captured and discussed, still mostly at an abstract level, in interdisciplinary workshops.

On the basis of these early results, a first narrowing down and prioritization of the requirements for the ERP system could already be carried out. TCW then assumed full responsibility for preparing the requirements catalogue with the concretized system requirements. Together with pre-selected ERP vendors, a functional specification was drawn up on the basis of the criteria predefined by the company, from which the degree to which each vendor fulfilled the company-specific functional requirements emerged unambiguously.

In a third step, the field of vendors had to be narrowed down further and the ERP system then selected.

Selection criteria were defined that covered not only the degree of functional and technical fit but also strategic considerations such as internationality, handling, usability, available country versions, centralization versus decentralization, and the future extensibility of the modules in terms of functional scope. After the criteria had been weighted, workshops were held with the remaining vendors in which all criteria were assessed, so that in the end the three best vendors gave presentations to the company. The concluding beauty contest before the customer delivered the result and enabled a targeted choice of the best vendor.
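A weighted evaluation of this kind can be pictured with a small scoring sketch; the vendor names, criteria, weights, and scores below are purely hypothetical and merely stand in for the company-specific criteria actually used in the project.

    # Illustrative weighted-criteria scoring of shortlisted ERP vendors.
    weights = {                      # criterion -> weight (sums to 1.0)
        "functional coverage": 0.35,
        "internationality":    0.15,
        "usability":           0.20,
        "country versions":    0.10,
        "extensibility":       0.20,
    }

    scores = {                       # vendor -> criterion -> score (1..10)
        "Vendor A": {"functional coverage": 8, "internationality": 6,
                     "usability": 7, "country versions": 9, "extensibility": 6},
        "Vendor B": {"functional coverage": 7, "internationality": 9,
                     "usability": 6, "country versions": 7, "extensibility": 8},
        "Vendor C": {"functional coverage": 9, "internationality": 5,
                     "usability": 8, "country versions": 6, "extensibility": 7},
    }

    # Rank vendors by their weighted totals; the top three would proceed
    # to the final presentations ("beauty contest").
    ranking = sorted(
        ((sum(weights[c] * s for c, s in vendor.items()), name)
         for name, vendor in scores.items()),
        reverse=True,
    )

    for total, name in ranking:
        print(f"{name}: weighted score {total:.2f}")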

TCW subsequently accompanied the implementation process by taking over project management and fully documenting the ERP system's functionality, so as to ensure a smooth and rapid transfer of know-how within the company.

The results were reflected in markedly increased transparency about the performance of the individual sites. The new system enabled much more effective and efficient controlling of business activities, which in turn allowed better planning and allocation of operational resources. The significant reduction in information lead times accelerated the processing of customer orders and thus made a substantial contribution to strengthening customer satisfaction. Overall, the efficiency of the company's organization was lastingly improved.