Friday, June 27, 2008

We Need a Definition of "Sustainability"... And Here It Is


“Sustainability” is a meme morphing into a buzzword. As “climate crisis,” “triple bottom line,” “carbon cap and trade,” “cleantech,” and other phrases enter more widespread use, they are amalgamating, becoming code words that signify that the user “gets it,” while offering little specificity about what it is that’s gotten.

This happened with “Web 2.0,” a term Tim O’Reilly established to signify web-based applications, which has now engorged to encompass social networking, user-generated content, communities with feedback features like user ratings, and possibly everything except Microsoft Word.

There’s not much harm in the expansion of “Web 2.0.” It’s a made-up term, and, Humpty Dumpty-style, it can mean what those who use it decide is most useful. But sustainability is a reasonably specific concept, and it would be a loss to sacrifice its precision for fashion.

Here’s what I think “sustainability” should mean.

Economists have historically labeled impacts--good or bad--that do not feed back into economic decision-making as “externalities.” If you bought a car battery and paid nothing to sequester its toxic materials upon disposal, the costs to society of dealing with it—whether counted as health care for people getting heavy metal poisoning or their harder-to-measure suffering—were deemed externalities because neither the customer nor the battery maker paid this cost.

Now, in many states, that externality has been internalized. The customer purchases a $25 certificate licensing her to dispose of the battery, and she is buying with knowledge of the full cost of the product. Likewise, the impact on the atmosphere of shooting carbon through a smokestack was an externality; carbon taxes and cap and trade systems are proposed mechanisms for internalizing the costs.

An activity is sustainable when all costs are internalized, because if the costs are too high, the activities stop. Low gas prices lead to more Hummers; taxing gas in some fashion to pay for environmental remediation makes sense, and is a pro-sustainability approach. This version of sustainability applies not only to the environment: labor practices are unsustainable if they breed unrest (or revolution) or fail to develop the labor force; additives that extend product shelf life are unsustainable if they diminish human life; corporate presence in a town may be unsustainable if the tax breaks that attracted the facility mean that it is not paying enough to keep the community thriving.

In this view, corporate social responsibility, cleantech, and carbon sequestration or trading are all approaches that can improve sustainability. The key principles are:

(1) Measure the system life-cycle costs in all dimensions we care about

(2) Internalize those costs rather than wave them away or throw them over some regulatory transom. The nurturing of human capital, the fairness of resource allocation (corruption is unsustainable), and caring for the health environment of workers or neighbors are equally a part of running a business or a nation in a sustainable manner.

Holding on to an economics-based definition of sustainability helps reconcile broader social interests with the measurement of shareholder value. If we can capture social costs in earnings equations, then we will align social and financial motivations. It would be a loss to let such a useful concept drift into a more emotional definition.

The Brain Behind Google


Almost every company on the Internet these days is dependent on Google. Google now handles more searches than all other search engines combined, and search is the single biggest driver of traffic for most websites.

Yet how Google’s search algorithms work is a mystery to almost everyone, especially since Google changes them more than 450 times per year, more than once per day. For evidence of the power Google and its algorithms hold over business, just look at Answers.com, a public company that lost nearly 30% of its traffic, and nearly a quarter of its market cap, in a single day last year because of a change Google made to its search formulas.

So how does Google work and how do you beat the system?

Google’s big innovation is the idea that the importance of a website is based on how many other websites link to it. And it is not only a matter of the number of links, but the quality of those links as well – the thinking being that the best websites should have many other websites that link to them.

Sound familiar? It should, because this is in fact the way the brain’s computing unit, the neuron, determines the relevance of information: the most useful neurons are also the most richly connected, with the largest number of links to the neurons around them.

Using links as a proxy for relevance works well for Google and for the brain. It is our natural way of behaving: we watch movies because they are box office hits and read books because they are best sellers. It is a strange recursive loop indeed, where neurons get stronger because they are strong and websites are ranked higher because they have a high rank and books are bought because they have sold well and movies are watched because everyone has seen them – we have entered a world where the rich always get richer and the popular girl becomes prom queen.

But, like the brain, Google’s algorithms are continually evolving and becoming more powerful. Google now looks at the relevance of each link and weights it according to whether the linking site’s subject matter matches that of the site it points to. This, again, is precisely how neurons strengthen connections. A link from SportsIllustrated.com is more valuable to a site like Yankees.com than to Amazon.com. Google also looks at the quality of the linking website (yes, its algorithms actually read and process information), such that a link from Harvard Medical School is considered more valuable than one from Sonny’s Backyard Biomedical. This means that a site needs links from other relevant, high-quality sites. No longer is it good enough for a book or movie to be popular; now it has to be critically acclaimed, with a starred review or two thumbs up.
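To make the link-voting idea concrete, here is a minimal sketch of a PageRank-style power iteration. The four-site link graph and the damping factor are illustrative assumptions, not Google’s actual data or its current, much-evolved algorithm, and pages without outlinks are ignored for brevity.

```python
# Simplified PageRank: each page splits its score among the pages it
# links to; a damping factor models a surfer who occasionally jumps to
# a random page. The link graph below is invented for illustration.

links = {
    "harvard.edu": ["yankees.com"],
    "si.com": ["yankees.com", "amazon.com"],
    "yankees.com": ["si.com"],
    "amazon.com": ["si.com", "yankees.com"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)  # a page's "vote", divided
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page:15} {score:.3f}")
```

Note that a single link from a page that is itself highly ranked can outweigh several links from obscure pages, which is the "quality of links" point above.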

According to the gang at Google:

PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at considerably more than the sheer volume of votes, or links a page receives…it also analyzes the page that casts the vote. Votes cast by pages that are themselves “important” weigh more heavily and help to make other pages “important.”


This may not pass muster in a democratic election but it is a description that would make even the most ardent neuroscientist proud.

Just as we have this tangle of competitive interests on the Internet, so do we have it in the mind. In our case, we are constantly weighing information in a mental battle that we are rarely aware of. Google’s algorithms mimic the brain’s need to clear out the clutter and find the good stuff.

So what should you do? Follow the brain.

Stop trying to game Google and focus on building value. Improve the quality of your site, remove the clutter, and focus on attracting relevant sites to link to yours. And if you are looking for a good model, look no further than Google’s own site: uncluttered, with a massive number of inbound links, and a nice big button for sending your information on.

Bill Gates: Entrepreneur, Manager, and Leader


Today marks the last working day for Bill Gates at Microsoft. So much has been written and spoken about him that another column appears redundant. Some people may even feel a tinge of happiness that they no longer have to contend with the ruthless businessman that Gates has been portrayed as. The purpose of this post is to analyze what can be learned by young people from perhaps the most successful entrepreneur of our times.

Focus: Bill Gates has demonstrated over nearly thirty years the importance of clarity of thought and execution. Unlike many of his contemporaries, he did not move away from the domain he understood better than anything else – software. He has pursued dominance in software in general, and operating systems in particular, with a single-mindedness that has few parallels. Venturing into unfamiliar territory may be fashionable but carries a high degree of risk. If ever an absolute example is needed of what Peters and Waterman called “Stick to the Knitting” and what Hamel and Prahalad termed core competence, one need look no further than Bill Gates and Microsoft. Focus also means the ability to pursue one’s goals whatever the obstacles may be. Such a degree of perseverance is hard to come by.

Thinking big: Along with focus, the ability to dream big and pursue that with single-minded determination sets Gates apart from other entrepreneurs. This is particularly true of entrepreneurs from emerging economies like India where an ultra-conservative attitude has stifled growth. Entrepreneurs need to develop confidence in themselves and their team that they can take on the world and come out winners.

Passion: Simply put, if anything is worth doing, it is worth doing well. From a simple thank you note to a complex proposal, it is critical to place the stamp of excellence on whatever one undertakes. Equally important is the need to constantly innovate. Change is the only constant and the more agile and adaptive we are to change, the more successful we can be.

Learning as a life-long process: Though he dropped out of college to pursue his dreams, Bill Gates has probably read and written more than most of us ever will. In the process, he has shown the limits of formal education. Important as formal education is, perhaps it is more important to realize that learning is a life-long process. Knowledge is infinite. Even if we keep assimilating it without a break throughout a lifetime, we will not have scratched the surface. Knowledge should lead to humility and wisdom – not arrogance and one-upmanship.

Giving back to society: The Bill and Melinda Gates Foundation has provided a new dimension to philanthropy by addressing issues that are global in nature – malaria, cancer, AIDS. Feeling good by doing good may appear old-fashioned, but it may yet be the best way forward in combating diseases that kill or maim millions of people every year. With his friend, the legendary investor Warren Buffett, joining hands, a formidable combination has been forged. Bill Gates has shown a remarkable degree of consistency both in his business goals and in his goals in philanthropy – he is a global citizen.

Although some Indian entrepreneurs have indeed espoused similar causes – the Infosys Foundation, the Azim Premji Foundation, and the House of Tata come to mind – a lot more can be done by successful Indian entrepreneurs. In fact, just 5% of the wealth of the 200 richest people could eradicate some of the most pressing problems that we face. Wealth should be measured not merely in terms of building the most flamboyant homes but in pursuing a higher calling. Where is the collective conscience of the rich who have made it big thanks to the society that they are a part of?

As with any successful or great person, there will always be controversies. In an age where the distinction between means and ends is increasingly blurred, taking extreme positions hardly helps. One may not agree with Gates’ means for achieving what he has, but one would find it difficult to ignore his contributions to the IT industry. However, history and posterity will probably recognize him more for what he has decided to do – at a relatively young age – for the rest of his life. Combating hunger, fighting disease and educating the poor are truly lofty goals worth emulating by anyone who cares for humanity and for the quality of life on this planet. On this count, there cannot be many role models better than Bill Gates. The last thirty years have seen the emergence of an entrepreneur par excellence. The next thirty years will probably see the emergence of the greatest individual philanthropist – not necessarily in monetary terms – but in terms of the global issues addressed with dedication.

Since this is a discussion forum, two questions to readers:

• How do you get the next Bill Gates, or better, without inviting the kind of controversy that his success has spawned?

• Why can’t governments spend 1% less on defense and use the money to improve living conditions for the poorest of the poor?

Balancing Global Risk and Return


To get the most from your global supply chain, take a strategic view of managing risk. But don’t neglect the tactics.

By David Bovet

No matter the risks, global supply chains continue to grow longer and more complex as companies push deeper into uncharted territory in search of lower costs. As a result, the questions surrounding what managers should do about risk have never been more pressing. Do you add tons of inventory, quietly ripening toward obsolescence? Build myriad backup plans, most of which you’ll never draw on? Or rethink the whole thing and retreat from the cost-saving sourcing deals that are driving you into far-flung lands in the first place?

While there are no easy answers—and, in most cases, no “right” answers—there are frameworks for thinking about how to manage risk more effectively and strike the right balance with potential return, frameworks that can benefit supply chain managers, strategic planners, and C-level executives alike.

The key is, first, to examine your basic supply-to-market strategy to ensure you are following the course that best supports your business, and, second, to determine the right tactics to support your strategy. This two-step process may sound simple enough, but to execute it correctly you must have a firm grip on your company’s core strategies, be willing to invest in new forms of internal analysis such as portfolio modeling, and be prepared to make decisions that might fly in the face of organizational precedent.

Strategy first

In order to effectively balance risk and opportunity, product manufacturers, distributors, and retailers need to periodically rethink their strategies for going to market. There are two ways to strategically address the burgeoning nature of risk: (1) shorten the supply chain in order to reduce cycle time and disruption risk or (2) optimize the portfolio of supply chain sources and locations in order to gain flexibility through diversification. Many innovative companies have used the first approach effectively, but it does have its limitations—indeed, it prevents a company from taking full advantage of the economic benefits of extending the supply chain globally.

For firms set on driving their supply chains deeper into new markets, the second strategic approach may offer some as-yet-untapped advantages. The idea of optimizing a company’s portfolio of sources, assembly locations, and distribution points derives from financial portfolio theory and offers a valuable framework for assessing risk/return tradeoffs. The goal here is to create a supply chain strategy that best fits the overarching needs of the firm through a process of modeling that clearly shows decision makers the benefits and risks of different sourcing tactics. Through portfolio modeling, firms can mix and match different tactics in pursuit of the arrangement of sources, locations, and so on that maximizes the supply chain’s ability to support a specific company strategy.

Creating a portfolio starts with developing a set of alternative supply chain designs that support the business in different ways. For example, one design might emphasize speed to market, another might focus on manufacturing quality, a third might home in on cost. In parallel, supply chain risks associated with each design are identified, classified, and quantified. The projected returns and risks can then be modeled and plotted on a spectrum. The optimal solution will lie along the “efficient frontier,” representing the set of options with the highest return for a given level of risk. The key here is to combine supply chain elements whose disruption risks are not directly tied to each other. In other words, you are seeking to minimize the likelihood of a domino effect in which a problem at one stop in the supply chain imperils others. There may be several effective strategies to choose from, offering different combinations of risk and reward.
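In miniature, such a model can be as simple as screening candidate designs for dominance. The designs, benefits, and risk scores below are invented for illustration; a real model would quantify both dimensions from actual cost and disruption data.

```python
# Keep only supply chain designs on the "efficient frontier": those that
# no alternative beats on both projected benefit and disruption risk.
# All numbers are invented for illustration.

designs = [
    # (name, projected benefit in $M/yr, disruption-risk score 0-10)
    ("single low-cost Asian source", 24.0, 8.5),
    ("dual source: Asia + Mexico", 19.0, 4.0),
    ("regional sourcing, 3 suppliers", 15.0, 2.5),
    ("domestic sourcing only", 8.0, 1.5),
    ("single source + heavy inventory", 14.0, 5.0),  # dominated
]

def efficient_frontier(options):
    frontier = []
    for name, benefit, risk in options:
        dominated = any(
            b >= benefit and r <= risk and (b, r) != (benefit, risk)
            for _, b, r in options
        )
        if not dominated:
            frontier.append((name, benefit, risk))
    return sorted(frontier, key=lambda d: d[2])  # order by rising risk

for name, benefit, risk in efficient_frontier(designs):
    print(f"risk {risk:4.1f}  benefit ${benefit:5.1f}M  {name}")
```

The "single source + heavy inventory" design drops out because the dual-source design offers more benefit at lower risk; decision makers then choose among the surviving options according to their appetite for risk.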

Through a graphical representation of the portfolio approach, everyone can see the underlying assumptions, which can be changed to explore different scenarios. Managers can update costs, risks, and other data in the model and periodically convene a group of decision makers from other functions to revisit supply chain strategy decisions or test new concepts. This ensures that executives from across the organization understand the alternatives and tradeoffs and commit to successful implementation.

Portfolio modeling offers several advantages. First, it is understood and practiced by CFOs and allows supply chain executives to make the case about risk in terms that senior management understands. Second, it appropriately focuses on the business value that supply chain management can deliver—through satisfied customers, capital efficiency, and low operating costs—for a given level of risk. This framework can also adapt to changes in supply chain strategy in light of shifts, over time, in relative costs and perceived risks.

Then, down to tactics

Once the strategy is in place, many ways exist to reduce risk. Here are practical steps any company can take:

Demand management

1. Improve demand planning with distributors and retailers by being closely connected to customers through shared demand forecasts, vendor-managed inventory, and other joint systems. The goal is to reduce the risk of being blindsided by demand shifts.

Supply management

2. Work with suppliers to create contingency plans. In the wake of 9/11, Continental Teves, a major automotive supplier, activated existing contingency relationships with transport firms such as Emery to supplement air shipments of parts from Europe. After making a same-day assessment of parts flows at risk, Continental Teves was able to rely on prearranged ocean shipping space and increased inventories, thus allowing its customers, including Toyota, to continue operations with little disruption over the following weeks.

3. Diversify sourcing to reduce the risk of catastrophic supply chain failure. Establish backup arrangements by qualifying additional suppliers, without necessarily awarding them significant volume. Geberit, a large Swiss sanitary fixture manufacturer, has adopted a dual sourcing policy. It either retains an existing supplier as a second source, or develops a second source in Asia. Companies can choose to meet 10% to 20% of their needs from a second supplier, which generally will work hard in hopes of displacing the primary supplier. Service-level agreements can call for rapid ramp-up if required.

4. Extend insurance policies to cover overseas suppliers. Contingent business interruption coverage, for example, is typically limited to the U.S. and nearby countries; have it explicitly extended to cover major suppliers located in Asia and other low-cost geographies.

Logistics enhancement

5. To deal with contingencies, employ a major third-party logistics provider with broad resources. One electrical manufacturer recently asked its freight forwarder to provide weekly updates on the best U.S. ports for its inbound product flows from Asia. In essence, the logistics provider becomes a key risk-mitigation agent by continually looking over the horizon on a company’s behalf.

6. Model and optimize inventories on a disaggregated basis, as all components are not created equal. Modeling supply susceptibility to delays leads to finer tuning of safety stocks, which may rise for some parts or finished goods (depending on which point in the supply chain one is looking at) but fall for others. A typical product with a one-week lead time and delivery variability of one day will require 15% more safety stock, for example, if supply variability increases by one day, and 175% more if variability increases by a week (see the sketch following this list).

Supply chain integration

7. Increase product component standardization. The ability to mix and match components from multiple suppliers and plants allows such manufacturers as Dell, IBM, and Herman Miller to make their supply chains more flexible. Reducing product complexity shortens cycle times in normal conditions and speeds response to supply crises as well.

8. Create a centralized product data management system. If the supplier is the only one who knows the actual specifications of products or components, rapid re-sourcing of products can be time-consuming, if not impossible, during an emergency. Centralized product data for immediate consultation or preemptive use helps reduce the risk of disruption. In practice, this means developing a database of product and component designs so that substitute suppliers can be rapidly brought up to speed. Companies that have sole-sourced a key component for years, without maintaining control over drawings or other design characteristics, take heed.

9. Raise visibility along the extended supply chain. When inventory is tracked from order placement to reception at a forward distribution center or customer, it can effectively become part of a company’s safety stock. Achieving real time knowledge of the location of parts and products as they flow from distant origins is not easy, to be sure, but trade management software can help track global goods flows and divert shipments when necessary.

10. Monitor specific warning signs of trouble. Tracking a limited number of supply chain risk indicators, such as average train speed, weeks of orders outstanding, component delivery variability, and exchange rate movements, can provide a crucial warning as a problem approaches the tipping point and becomes a dangerous disruption. It is no longer sufficient to track just service levels, lead times, inventories, and logistics costs.
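As promised under tactic 6, here is a rough sketch of that safety-stock arithmetic, using the textbook formula SS = z * sqrt(L * sd_d^2 + d^2 * sd_L^2). The demand figures are assumptions chosen for illustration; with them, the formula lands in the same ballpark as the percentages quoted above (roughly +17% and +198%), and the exact figures depend on the demand profile assumed.

```python
import math

# Safety stock with uncertain demand and uncertain lead time:
# SS = z * sqrt(L * sd_d**2 + d**2 * sd_L**2)
# where L = mean lead time (days), d = mean daily demand,
# sd_d = demand variability, sd_L = lead-time variability.

def safety_stock(z, lead_time, sd_lead, demand, sd_demand):
    return z * math.sqrt(lead_time * sd_demand**2 + demand**2 * sd_lead**2)

z = 1.65            # roughly a 95% service level
d, sd_d = 100, 100  # units/day; high demand variability is assumed here
L = 7               # one-week lead time

base = safety_stock(z, L, 1, d, sd_d)       # delivery variability: 1 day
plus_day = safety_stock(z, L, 2, d, sd_d)   # variability up by one day
plus_week = safety_stock(z, L, 8, d, sd_d)  # variability up by one week

print(f"+1 day:  {plus_day / base - 1:+.0%}")   # about +17%
print(f"+1 week: {plus_week / base - 1:+.0%}")  # about +198%
```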

Changing the mindset

By using an appropriate mix of the initiatives outlined here, managers should expect to see real improvements in their supply chain performance.

Of course, incorporating risk considerations into what have historically been highly cost-focused analyses requires a shift in thinking and organizational behavior. To improve the risk/reward equation, compartmentalized decision making must be replaced by cross-functional cooperation.

Participants can come from marketing, sales, sourcing, manufacturing, finance, and risk management. Eliciting full commitment from all these functions might require the CEO’s or COO’s involvement. The long-term trend toward just-in-time delivery combined with strong economic incentives to access the best global supply sources virtually guarantees abundant risks throughout the supply chain. Frictionless commerce remains a utopian vision rather than reality. Where managers can excel, however, is in identifying, quantifying, and preparing for the new realities of risk.

David Bovet is a managing director of Mercer Management Consulting.

This article appeared in the August 2005 issue of Supply Chain Strategy.

Ten Web Startups to Watch

July/August 2008


We profile some of the most innovative ideas of the Social Web.

By TR Staff and Freelance Writers

[Photo: Greg Woock (left) and Joe Sipher. Credit: Howard Cao]

Instant Voicing
Send voice messages without calling, and listen to them from a phone--or a laptop.
By Larry Aragon

Company: Pinger

Founding date: 2005

Funding amount: $11 million

Worldwide, people sent 1.9 trillion text messages last year. That's a lot of tedious triple-tapping on mobile phones, and it's not free. Pinger, a startup in San Jose, CA, is giving us a voice version of text messaging that's Web accessible, so picking up messages need not trigger mobile-phone charges.

Pinger lets you send voice messages without calling (and interrupting) the recipient. Instead, you speak a name or phone number into your cell phone and then leave your message, which sits on Pinger's servers. A text notification lets the recipient know that a voice message can be picked up by phone or on Pinger's website. Pinger cofounder Joe Sipher, a former executive at smart-phone maker Palm, describes the service as "noninterruptive voice mail."

Sipher and cofounder Greg Woock conceived of Pinger as a quick, practical way for businesspeople to leave messages. But its biggest fans are turning out to be women between 15 and 25--a possible sign that it could become an important Web 2.0 tool. And since these eager voice messagers provide personal information upon sign-up, Pinger can make money by selling that information to companies that do targeted mobile-phone advertising. AT&T, T-Mobile, and Sprint have launched similar offerings.

Randy Komisar, a partner at Kleiner Perkins Caufield and Byers, which incubated and funded Pinger, says that because Pinger built its service with free or low-cost open-source software, it can quickly add or change features. In Komisar's view, that flexibility could allow Pinger to adjust more nimbly than larger competitors.

Sharing, Privately
With Pownce, think Twitter meets Napster.
By Lissa Harris

Company: Pownce

Founding date: 2007

Funding amount: Undisclosed

You've got mail. You've also got Twitter feeds, Facebook groups, blogrolls, and instant-messaging clients. Why do you need Pownce? Launched in June 2007, Pownce joins the likes of Twitter, Jaiku, Seesmic, and Kadoink in the rapidly expanding world of microblogging. But it's really a file-sharing platform disguised as a microblogging service--and possibly the next big thing to inflict insomnia on entertainment industry lawyers.

Pownce allows users to send and receive large multimedia files, and to precisely control who receives those files and updates--something you can't do with Twitter. The file-sharing capabilities have been critical to Pownce's growth so far, says Leah Culver (see our cover), the 25-year-old who cofounded it with Digg.com cofounder Kevin Rose and Digg's creative director, Daniel Burka. "File sharing is kind of difficult online," she says. "There's not a good way to do it on IM. We did an embed feature, so you can watch videos and photos right in line, and that really took off."

Pownce has been a work in progress. Allowable file sizes were initially too small; recently they got a big upgrade, from 10 to 100 megabytes (250 for the "pro" account, for which users pay $20 a year). Users complained about the lack of a mobile-phone-friendly site; Pownce built one. Last October, the company rolled out a public API (application programming interface) enabling features such as rePownce, which publishes Pownce to your Facebook page. One thing that hasn't changed is the business model. From the start, Pownce has embedded ads in its message feeds. Culver thinks the service could give people a reason to jump on the Web 2.0 bandwagon: "We have people who say, 'This is the first social network I've ever used.'"

Cell-phone Streaming
Qik lets tourists--and reporters--broadcast live from phones.
By David Talbot

Company: Qik

Founding date: 2006

Funding amount: $4 million

Wandering the streets of Manchester, NH, during the presidential primary campaign, a self-described "citizen journalist" named Steve Garfield bumped into Duncan Hunter, a minor Republican candidate. Garfield pointed his phone's camera at Hunter for a quick interview, whereupon Hunter disclosed that he was about to tell CNN he wasn't quitting the race.

What Hunter didn't know is that Garfield's phone was armed with software from the startup Qik, allowing it to capture video and stream the interview--in real time--on Qik's website and thence to other platforms, including Garfield's Twitter network. Thus, Garfield says, he scooped CNN on this bit of election minutiae.

Qik's data center not only converts cell-phone videos to Flash format but allows Web viewers to send text messages back to the person capturing the video. Bhaskar Roy, a former marketing director at Oracle who is a cofounder, says Qik's key is its adaptability--it works with phones and networks of widely differing capacities, and does so in real time. "We have been focusing on speed and live aspects, and the quality," he says. Roy says the company has recruited thousands of trial users in 55 countries.

The company is now working on business models. In one, it would sell ads to Web video consumers who text-message replies; in another, it would take a cut from sales of high-end cell phones that capture the best videos (which would come Qik-equipped). Garfield is happy to pay: "Viewers can type in a window while watching, and affect the coverage. That's, like, totally amazing and groundbreaking!"

Traffic Master
A dashboard gadget brings the Internet to highways, for traffic and local search.
By David Talbot

Company: Dash Navigation

Founding date: 2003

Funding amount: $71 million

This spring, after years of development, Dash Navigation finally released Dash Express, a two-way Internet-connected dashboard traffic gadget that brings a kind of social network to the highways. At its heart, the device is a traffic reporter; the company draws on existing traffic data and turns users' cars into networked sensors that broadcast their speed and location (based on GPS data) to other Dash-equipped cars, warning of tie-ups and suggesting routes. Because Dash cars provide data from all roads, not just highways that may already have sensors, they fill in blank spots. In addition, the gadget is a search tool that taps the Web for any number of purposes, including location-based search for, say, Thai food, cheap gas, movies, or apartment rentals. Because it has an open programming interface, new search applications will keep popping up, says Robert Currie, Dash's president. Dash makes money on sales and subscriptions.
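A toy sketch of the crowdsensing loop at the heart of the device: cars report their speed per road segment, and the service averages recent reports and flags slowdowns. The report format and thresholds are invented for illustration, not Dash's actual protocol.

```python
from collections import defaultdict
from statistics import mean

# Invented (segment, mph) reports from GPS-equipped cars.
reports = [
    ("US-101:exit-24", 12), ("US-101:exit-24", 9), ("US-101:exit-24", 15),
    ("I-280:exit-5", 61), ("I-280:exit-5", 58),
]

speeds = defaultdict(list)
for segment, mph in reports:
    speeds[segment].append(mph)

for segment, samples in speeds.items():
    avg = mean(samples)
    status = "congested: suggest reroute" if avg < 25 else "flowing"
    print(f"{segment:16} avg {avg:4.1f} mph  ({status})")
```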

Crisis Sourcing
Ushahidi's platform allows text messages to feed into the Web.
By David Talbot

Company: Ushahidi

Founding date: 2008

Funding amount: Undisclosed

In the chaos that followed Kenya's disputed presidential election last December, 1,200 people were killed, and several hundred thousand more fled their homes. Skeptical of the accuracy of official reports, a group of Web developers and bloggers with Kenyan ties cobbled together a Web application that could receive citizen incident reports via text message from any mobile phone in Kenya and display them as a Google Maps application.

Cofounded by Erik Hersman, an American son of missionary parents who was raised in Kenya (he is author of the blog Whiteafrican.com and now lives in Florida), the group called the creation Ushahidi--the Swahili word for "testimony." They have formed a nonprofit company and are finalizing funding with a large foundation to turn Ushahidi into a platform that can be deployed easily and rapidly in areas of crisis. Already, a version of Ushahidi is being used to track anti-immigrant violence afflicting South Africa.

"While there have been a lot of projects to do citizen reports, they are all Web-based," says Ethan Zuckerman, a research fellow at Harvard's Berkman Center for Internet and Society and founder of Geekcorps, a technology volunteering agency. "There is no strong content management system designed to take content off of SMS [text messaging]. It's pretty sophisticated."

Now anyone with a mobile phone can become a node on the network. "Whenever a crisis breaks out and you want distributed data gathering and visualization, our goal is to make it a lot easier to do," says Hersman. The technology won't require much expertise; "people can either download Ushahidi or we will host it for you." And it's not just for Africa. He says the technology could help chronicle fast-moving U.S. calamities such as Hurricane Katrina.
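A hypothetical sketch of the core ingest step, turning an incoming SMS into a mappable report. The message format and parsing are invented for illustration; the real platform handles far messier input.

```python
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class Report:
    lat: float
    lon: float
    text: str

def parse_sms(body: str) -> Optional[Report]:
    # Expect messages like "-1.2921,36.8219 roadblock near city center".
    m = re.match(r"\s*(-?\d+\.\d+)\s*,\s*(-?\d+\.\d+)\s+(.+)", body)
    if not m:
        return None  # a real system would queue this for human review
    return Report(float(m.group(1)), float(m.group(2)), m.group(3))

r = parse_sms("-1.2921,36.8219 roadblock near city center")
print(r)  # becomes a point with a description on a Google Maps overlay
```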

Partial Recall
QTech's reQall makes custom reminders for scatterbrains.
By Lissa Harris

Company: QTech

Founding date: 2004

Funding amount: $5 million

Sunil Vemuri, cofounder of QTech, observes that "one of the dark secrets of memory aids is that people forget to use their memory aids." The company hopes to solve this problem with a Web-based tool called reQall, which grew out of Vemuri's doctoral research at MIT. Users enter calendar items, grocery lists, brilliant ideas, and other snippets of information into the system; they can do this as text over the Web or via a toll-free phone number. Then reQall uses a combination of speech recognition software, human transcriptionists, and proprietary algorithms to generate reminders by phone, text message, RSS feed, or e-mail, or through a Web interface (the details are customized to the user). "Our main competition is the Post-it note," Vemuri says.

QTech's advisors include leading figures in digitally assisted memory, including Microsoft's Gordon Bell. Former MIT Media Lab head Walter Bender, Vemuri's PhD advisor, says reQall "helps reduce the instances of forgetting in the first place." But QTech initially forgot to make money from the business. It's now exploring partnerships with cell-phone companies, fee-based "premium" accounts, and advertising models.

Are You ... Influential?
33Across calculates your online social clout for sharper ad targeting--and for you.
By Larry Aragon

Company: 33Across

Founding date: 2007

Funding amount: $1 million

When it comes to social networking, "there are an incredible number of people who want to be known as influential," says Eric Wheeler, CEO of the New York City startup 33Across. Of course, there are also plenty of people--advertisers, namely--who want to know who the influential people are. Wheeler would cater to both. A number of companies try to help target ads based on users' behavior; a visitor to Cars.com, for example, might see Ford ads. In June, 33Across announced its first partnership with a social-networking site--Meebo--to build anonymous profiles of users' actual influence.

The profiles are drawn from the usual sources--self-provided information and Web browsing history--as well as from details on users' networks and their propensity to communicate. The goal: to find gossipy influencers who will be the "viral promoters" of, say, a new product, says Christine Herron of First Round Capital, an investor. "All this data can be used to understand an incredible amount of detail about a person's influence," she says.

Mainly, "it allows advertisers to be much smarter in how they deliver a message," says Wheeler, formerly CEO of ad agency Neo@Ogilvy North America. In exchange for giving 33Across nonprivate user data, social-networking sites get a piece of the resulting ad revenue. Users could benefit, too, since the social-­networking site could share the data with them. Measures of influence might be important to bloggers, among others. 33Across plans a full launch in September.

Semantic Ads
Peer39's algorithms promise better ways of mining language.
By Lissa Harris

Company: Peer39

Founding date: 2006

Funding amount: $11 million

The semantic web is coming. That means that software will comb blogs, social networks, and forums for information about the meaning of a page, reading it ever more intelligently--and, of course, better targeting advertisements.

This last bit is where Peer39, a semantic-advertising company founded by entrepreneur Amiad Solomon, comes in. Peer39's investors are betting that the company's algorithms--built on research at the Technion Institute of Technology and Princeton's Institute for Advanced Study--will improve on existing methods. "These guys find organic expressions of demand on the Web, on blogs, on forums and chats--all kinds of specific areas where people are talking about products," says Jon Medved, an angel investor in the company. Then they instantly deliver custom advertising. "It's a more compelling user experience," Medved says.

Mashups Made Easy
By Lissa Harris

Company: Mashery

Founding date: 2006

Funding amount: Less than $5 million

Websites once stood alone. Now they talk to each other, exchanging bits of data and piggybacking on each other's communities. One key to this change was the development of application programming interfaces (APIs), which allow all sorts of information sharing and hybridization. But startups often have trouble managing their APIs effectively. Mashery, a San Francisco startup, makes it easier--providing security, keeping abreast of shifting industry standards, and introducing potential partners to each other. This spring, Mashery helped Reuters launch its Open Calais project, a public API that gives developers access to semantically tagged news content, says Oren Michels, Mashery's CEO.

Video Packet-Switching
Anagran helps the Internet handle growth in streaming media.
By Larry Aragon

Company: Anagran

Founding date: 2004

Funding amount: $40 million

As a Pentagon researcher in the 1960s, Lawrence Roberts led development of what became the Internet. But breaking information into packets that could take numerous redundant network paths "wasn't designed for streaming media," Roberts says. Network routers treat packets equally and can delay or drop them; this means blips and dead spots in voice and video.

Roberts's company, Anagran, promises a fix. Its technology, which prescreens data before it enters a router, can tell that certain packets belong to streaming media and give them priority. (Or it can lower their priority, if the goal is to limit file sharing.) Anagran's approach is different from that taken by Roberts's previous company, Caspian Networks, which shut down in 2006 after consuming more than $300 million in venture capital. Caspian made a large, expensive router that required costly network redesigns. Anagran's device, by contrast, plugs into existing routers to handle up to four million simultaneous data or media streams.
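Anagran's actual classifiers are proprietary, but a toy sketch conveys the general idea of inferring a flow's type from its packet behavior and prioritizing accordingly. All thresholds below are invented.

```python
from dataclasses import dataclass

@dataclass
class FlowStats:
    avg_packet_bytes: float
    avg_gap_ms: float     # mean gap between packets
    gap_jitter_ms: float  # how much that gap varies

def classify(flow: FlowStats) -> str:
    if flow.avg_gap_ms < 1 and flow.avg_packet_bytes > 1300:
        return "bulk"       # back-to-back full-size packets: file transfer
    if flow.avg_gap_ms < 40 and flow.gap_jitter_ms < 5:
        return "streaming"  # steady, clocked packets: voice or video
    return "other"

PRIORITY = {"streaming": 0, "other": 1, "bulk": 2}  # 0 is served first

flows = [FlowStats(1460, 0.2, 0.1), FlowStats(800, 20, 2), FlowStats(400, 150, 90)]
for f in sorted(flows, key=lambda f: PRIORITY[classify(f)]):
    print(f"{classify(f):9} {f}")
```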

Last year, Anagran started shipping products to government and university customers seeking to ensure that peer-to-peer file sharing doesn't overwhelm their networks. (The technology is better at identifying peer-to-peer traffic than an existing technology called deep packet inspection, which can miss encrypted files, Anagran says.) Warren Packard, who invested in Anagran for Draper Fisher Jurvetson, says the technology will be critical to future Internet growth, "especially when you consider its impact on real-time streams that require high quality of service."

Bill Gates: A Legend Steps Down

Computer | 27.06.2008


Bill Gates is stepping back from Microsoft for good. From July 1 he will concentrate on his "Bill & Melinda Gates Foundation", which is devoted to the fight against disease.

Opinion on Bill Gates is still divided. Some admire him as a great visionary whose software turned the cryptic computing monsters of the past into devices that are now used as a matter of course: for electronic mail, music, pictures, videos, and much more. To others he is a red rag to a bull, not to say an object of hatred. He never had ideas of his own, they say; he merely imitated other people's ideas and marketed them better, if need be with his sheer market power and his lawyers.

And what does Bill Gates himself say? Computers used to be machines that cost millions of dollars and were built only for large companies, he recalled in a 1999 speech to students in Bonn: "They were built to print invoices and maintain huge databases. They had nothing to do with the individual, with people's personal needs. What really changed things was the miracle of the microprocessor, the enormous possibilities and capacity on a single chip." His friend Paul Allen and he saw that chip, he said, and told themselves: "This is something that could make the computer a personal instrument for everyone."

Spinning His Own Legend

[Photo: Bill and Melinda Gates at an AIDS conference in Toronto, 2006 (AP)]

That, too, is a way of spinning one's own legend. The fact is that no revolutionary idea came from Bill Gates himself. Microsoft, the company he founded in 1975, was never the first to do anything. Bill Gates simply avoided the pioneers' mistakes and was the first to capitalize on other people's good ideas. In that discipline, however, he has proved to be a true genius.

With his Windows operating systems, the software that gets a computer up and running in the first place, he achieved a worldwide monopoly; he squeezes competitors out of the market without mercy, and even the American government broke its teeth on his monopoly in the courts.

"Teil von etwas Faszinierendem"

"Das ist alles lange her, inzwischen sind die Computer besser und besser geworden. Wir hatten das Privileg, mit einer Reihe von Freunden eine Firma im Weltmaßstab aufzubauen, und wir hatten das Privileg, ein Teil von etwas ganz Faszinierendem zu sein. Dabei stehen wir erst am Anfang der digitalen Revolution", sagt er.

Admittedly, this is not how one usually pictures a revolutionary. Even today, Bill Gates comes across as the boy next door, nervously fiddling with his glasses. Nor is speaking off the cuff his strong suit. But he no longer really needs it. He lives with his wife Melinda in a house on Lake Washington near Seattle (fully digitized and automated, of course) and acts as a big spender: a generous benefactor who has set up several foundations and gives away billions, among other things for vaccination programs in the Third World. It is this foundation that he now intends to devote himself to entirely.

Rolf Wenkel

Thursday, June 26, 2008

The End of Advertising as We Knew It

Internet | 26.06.2008


Classic advertising is in crisis. Whether on television, in print, or online, campaigns are increasingly irrelevant to purchase decisions. So how do you get products to the people? Is everything going viral now?

According to various estimates, around 1,500 advertising messages rain down on us every day, apparently without much success. "People have learned how advertising works and can deliberately tune it out, not only on television but also out in the world," explains marketing consultant Michael Domsalla.

[Chart: media use in purchase decisions (DW)]

Nevertheless, 90 percent of advertising budgets currently flow into the classic media. Yet the Internet, with the remaining ten percent of the budget, is now twice as relevant to purchase decisions as the second-place medium, television. That is the finding of a new study commissioned by the well-known public relations consultancy Fleishman Hillard.

"Der Begriff Werbung löst sich auf"

Purchase decisions are now made largely on the Internet. Advertising's share in them, however, is completely negligible in many categories, the study says. Whether it is a new computer, a new electricity provider, a DVD, or a vacation trip, Internet users do their research in forums and on review sites. Advertising as a source of information? Mostly irrelevant, the study finds.

[Photo: classic billboard advertising (DW)]

"Werbung gilt nur in Ausnahmefällen als wichtige Information", sagt Marketingberater Domsalla und zieht ein drastisches Fazit: "Der Begriff der klassischen Werbung löst sich auf." Überraschen, Aufsehen erregen oder erschrecken gelinge immer weniger Werbe-Kampagnen. Die Konsumenten reagierten ablehnend bis aggressiv auf Werbung.

Advertising That Doesn't Look Like Advertising

The rescue from this misery is blurry, shaky, and usually over again in seconds. "A device advertising professionals like to use is placing funny commercials on video portals such as YouTube," the computer information service reports.

But so-called viral marketing is far more than a few funny videos. "At its core, the term means that people will talk about products one way or another anyway; you simply try to accelerate and steer those conversations," Domsalla explains.

The product's message is meant to spread like a virus. The effort required is in no way less than that of a classic campaign.

30 Cents per Viewer

Just to avoid drowning in the flood of videos uploaded to YouTube every minute, you first need some 50,000 so-called views merely to appear on the video service's front page. Advertising agencies therefore now offer viral marketing packages: GoViral, for example, guarantees its customers one play for every 30 cents.
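The arithmetic implied by those figures is simple (a back-of-the-envelope sketch using only the numbers quoted above):

```python
# Cost just to seed a campaign to YouTube's front-page threshold,
# using the figures quoted above.
views_needed = 50_000
cost_per_view = 0.30  # euros, GoViral's guaranteed-play rate

print(f"seeding cost: {views_needed * cost_per_view:,.0f} euros")  # 15,000
```

Fifteen thousand euros, in other words, before a single blog post or comment has been written.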

Flanking measures are also required: blog posts, comments, and links. All this just to get a viral marketing campaign off the ground at all.

[Montage: the Web 2.0 advertising offensive (DW)]

Whether and how an advertising campaign of this new kind reaches its target group, however, is completely unclear: once something spreads virally, it is no longer under your control. What's more, the digital clientele is extremely sensitive, above all to hidden advertising.

Internet Gridlock

July/August 2008


Video is clogging the Internet. How we choose to unclog it will have far-reaching implications.

By Larry Hardesty

An obscure blogger films his three-year-old daughter reciting the plot of the first Star Wars movie. He stitches together the best parts--including the sage advice "Don't talk back to Darth Vader; he'll getcha"--and posts them on the video-sharing website YouTube. Seven million people download the file. A baby-faced University of Minnesota graduate student with an improbably deep voice films himself singing a mind-numbingly repetitive social-protest song called "Chocolate Rain": 23 million downloads. A self-described "inspirational comedian" films the six-minute dance routine that closes his presentations, which summarizes the history of popular dance from Elvis to Eminem: 87 million downloads.

Video downloads are sucking up bandwidth at an unprecedented rate. A short magazine article might take six minutes to read online. Watching "The Evolution of Dance" also takes six minutes--but it requires you to download 100 times as much data. "The Evolution of Dance" alone has sent the equivalent of 250,000 DVDs' worth of data across the Internet.

[Video: Star Wars: Episode IV according to a three-year-old]
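The magazine's round numbers are easy to sanity-check. In the sketch below, the ~300 kbit/s clip bitrate and the ~135-kilobyte weight of an article page are assumptions for illustration, not measured values; with them, both the 100x ratio and the 250,000-DVD figure fall out.

```python
# Back-of-the-envelope check of the bandwidth claims above.
clip_seconds = 6 * 60
video_bytes_per_s = 300e3 / 8                   # assume ~300 kbit/s video
clip_bytes = clip_seconds * video_bytes_per_s   # ~13.5 MB per viewing

article_bytes = 135e3                    # assume ~135 KB for an article page
print(f"video/article ratio: {clip_bytes / article_bytes:.0f}x")  # ~100x

downloads = 87e6                         # "The Evolution of Dance" views
dvd_bytes = 4.7e9                        # single-layer DVD capacity
print(f"{downloads * clip_bytes / dvd_bytes:,.0f} DVDs' worth")   # ~250,000
```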

And YouTube is just the tip of the iceberg. Fans of Lost or The Office can watch missed episodes on network websites. Netflix now streams videos to its subscribers over the Internet, and both Amazon and Apple's iTunes music store sell movies and episodes of TV shows online. Peer-to-peer file-sharing networks have graduated from transferring four-minute songs to hour-long Sopranos episodes. And all of these videos are higher quality--and thus more bandwidth intensive--than YouTube's.

Last November, an IT research firm called Nemertes made headlines by reporting that Internet traffic was growing by about 100 percent a year and that in the United States, user demand would exceed network capacity by 2010. Andrew Odlyzko, who runs the Minnesota Internet Traffic Studies program at the University of Minnesota, believes that the growth rate is closer to 50 percent. At that rate, he says, expected improvements in standard network equipment should keep pace with traffic increases.
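Compounding shows how far apart those two estimates really are (illustrative arithmetic only):

```python
# Traffic relative to a 2008 baseline of 1.0 under each growth estimate.
for label, rate in [("Nemertes, 100%/yr", 1.0), ("Odlyzko, 50%/yr", 0.5)]:
    growth = " ".join(f"{(1 + rate) ** year:6.1f}x" for year in range(6))
    print(f"{label:18} {growth}")
# At 100%/yr, traffic is 32x in five years; at 50%/yr it is about 7.6x,
# a pace Odlyzko argues routine equipment improvements can absorb.
```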

But if the real rate of traffic growth is somewhere between Nemertes's and Odlyzko's estimates, or if high-definition video takes off online, then traffic congestion on the Internet could become much more common. And the way that congestion is relieved will have implications for the principles of openness and freedom that have come to characterize the Internet.

Whose Bits Win?
The Internet is a lot like a highway, but not, contrary to popular belief, a superhighway. It's more like a four-lane state highway with traffic lights every five miles or so. A packet of data can blaze down an optical fiber at the speed of light, but every once in a while it reaches an intersection where it has the option of branching off down another fiber. There it encounters a box called an Internet router, which tells it which way to go. If traffic is light, the packet can negotiate the intersection with hardly any loss of speed. But if too many packets reach the intersection at the same time, they have to queue up and wait for the router to usher them through. When the wait gets too long, you've got congestion.

The transmission control protocol, or TCP--one of the Internet's two fundamental protocols--includes an algorithm for handling congestion. Basically, if a given data link gets congested, TCP tells all the computers sending packets over it to halve their transmission rates. The senders then slowly ratchet their rates back up--until things get congested again. But if your computer's transmission rate is constantly being cut in half, you can end up with much less bandwidth than your broadband provider's ads promised you.
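A minimal simulation of that halve-and-ratchet behavior (the classic additive-increase, multiplicative-decrease loop at the core of TCP congestion control) produces the familiar sawtooth. The capacity and step size are invented for illustration.

```python
# One sender's allowed rate against a fixed bottleneck: add a little
# each round trip, halve on congestion (signalled here by exceeding
# the bottleneck capacity, i.e. packet loss).

def aimd(rounds, capacity=100.0, step=1.0):
    rate, history = 1.0, []
    for _ in range(rounds):
        if rate > capacity:
            rate /= 2         # multiplicative decrease on congestion
        else:
            rate += step      # additive increase while the link copes
        history.append(rate)
    return history

print(" ".join(f"{r:.0f}" for r in aimd(220)[-20:]))
# The tail shows the sawtooth: climb toward 100, halve, climb again.
```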

Sometimes that's not a problem. If you're downloading a video to watch later, you might leave your computer for a few hours and not notice 10 minutes of congestion. But if you're using streaming audio to listen to a live World Series game, every little audio pop or skip can be infuriating. If a router could just tell which kind of traffic was which, it could wave the delay-sensitive packets through and temporarily hold back the others, and everybody would be happy.


But the idea that an Internet service provider (ISP) would make value judgments about the packets traveling over its network makes many people uneasy. The Internet, as its name was meant to imply, is not a single network. It's a network of networks, most of which the average user has never heard of. A packet traveling long distances often has to traverse several networks. Once ISPs get in the business of discriminating between packets, what's to prevent them from giving their own customers' packets priority, to the detriment of their competitors'? Suppose an ISP has partnered with--or owns--a Web service, such as a search engine or a social-networking site. Or suppose it offers a separate service--like phone or television--that competes with Internet services. If it can treat some packets better than others, it has the means to an unfair advantage over its own rivals, or its partners', or its subsidiaries'.

The idea that the Internet should be fair--that it shouldn't pick favorites among users, service providers, applications, and types of content--is generally known as net neutrality. And it's a principle that has been much in the news lately, after its apparent violation by Comcast, the second-largest ISP in the United States.

Last summer, it became clear that Comcast was intentionally slowing down peer-to-peer traffic sent over its network by programs using the popular file-sharing protocol BitTorrent. The Federal Communications Commission agreed to investigate, in a set of hearings held at Harvard and Stanford Universities in early 2008.

It wasn't BitTorrent Inc. that had complained to the FCC, but rather a company called Vuze, based in Palo Alto, CA, which uses the BitTorrent protocol--perfectly legally--to distribute high-definition video over the Internet. As a video distributor, Vuze is in competition, however lopsided, with Comcast. By specifically degrading the performance of BitTorrent traffic, Vuze argued, Comcast was giving itself an unfair advantage over a smaller rival.

At the Harvard hearing, Comcast executive vice president David Cohen argued that his company had acted only during periods of severe congestion, and that it had interfered only with traffic being uploaded to its network by computers that weren't simultaneously performing downloads. That was a good indication, Cohen said, that the computers were unattended. By slowing the uploads, he said, Comcast wasn't hurting the absent users, and it was dramatically improving the performance of other applications running over the network.

Whatever Comcast's motivations may have been, its run-in with Vuze graphically illustrates the conflict between congestion management and the principle of net neutrality. "An operator that is just managing the cost of its service by managing congestion may well have to throttle back heavy users," says Bob Briscoe, chief researcher at BT's Networks Research Centre in Ipswich, England. "An operator that wants to pick winners and chooses to say that this certain application is a loser may also throttle back the same applications. And it's very difficult to tell the difference between the two."


To many proponents of net neutrality, the easy way out of this dilemma is for ISPs to increase the capacity of their networks. But they have little business incentive to do so. "Why should I put an enhancement into my platform if somebody else is going to make the money?" says David Clark, a senior research scientist at MIT's Computer Science and Artificial Intelligence Laboratory, who from 1981 to 1989 was the Internet's chief protocol architect. "Vuze is selling HD television with almost no capital expenses whatsoever," Clark says. Should an ISP spend millions--or billions--on hardware upgrades "so that Vuze can get into the business of delivering television over my infrastructure with no capital costs whatsoever, and I don't get any revenues from this?" For ISPs that also offer television service, the situation is worse. If an increase in network capacity helps services like Vuze gain market share, the ISP's massive capital outlay could actually reduce its revenues. "If video is no longer a product [the ISP] can mark up because it's being delivered over packets," Clark says, "he has no business model."

As Clark pointed out at the Harvard FCC hearing, ISPs do have the option of defraying capital expenses by charging heavy users more than they charge light users. But so far, most of them have resisted that approach. "What they have been reluctant to do is charge per byte," says Odlyzko, "or else have caps on usage--only so many gigabytes, beyond which you're hit with a punitive tariff." The industry "is strangely attached to this one-size-fits-all model," says Timothy Wu, a Columbia Law School professor who's generally credited with coining the term "network neutrality." "They've got people used to an all-you-can-eat pricing program," Wu says, "and it's hard to change pricing plans."

Absent a change in pricing structures, however, ISPs that want to both manage congestion and keep regulators happy are in a bind. Can technology help get them out of it?

The Last Bit
To BT's Bob Briscoe, talk of ISPs' unfair congestion-management techniques is misleading, because congestion management on the Internet was never fair. Telling computers to halve their data rates in the face of congestion, as the TCP protocol does, is fair only if all those computers are contributing equally to the congestion. But in today's Internet, some applications gobble up bandwidth more aggressively than others. If my application is using four times as much bandwidth as yours, and we both halve our transmission rates, I'm still using twice as much bandwidth as you were initially. Moreover, if my gluttony is what caused the congestion in the first place, you're being penalized for my greed. "Ideally, we would want to allow everyone the freedom to use exactly what they wanted," Briscoe says. "The problem is that congestion represents the limit on other people's freedom that my freedom causes."

Briscoe has proposed a scheme in which greedy applications can, for the most part, suck up as much bandwidth as they want, while light Internet users will see their download speeds increase--even when the network is congested. The trick is simply to allot every Internet subscriber a monthly quota of high-priority data packets that get a disproportionately large slice of bandwidth during periods of congestion. Once people exhaust their quotas, they can keep using the Internet; they'll just be at the mercy of traffic conditions.

So users will want to conserve high-priority packets. "A browser can tell how big a download is before it starts," Briscoe says, and by default, the browser would be set to use the high-priority packets only for small files. For tech-savvy users who wanted to prioritize some large file on a single occasion, however, "some little control panel might allow them to go in, just like you can go in and change the parameters of your network stack if you really want to."
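Here is a toy sketch of how a client might spend such a quota. This is not Briscoe's actual protocol; the packet size, the small-file threshold, and the quota below are all invented for illustration.

```python
# Sketch: a subscriber's monthly budget of high-priority packets, spent
# by default only on small transfers, with an override for power users.
# All constants are hypothetical.

PACKET_BYTES = 1500                  # a typical MTU-sized packet
SMALL_FILE_THRESHOLD = 1_000_000     # default: prioritize files under ~1 MB

class Subscriber:
    def __init__(self, monthly_quota_packets: int):
        self.quota = monthly_quota_packets

    def classify(self, file_bytes: int, force_priority: bool = False) -> str:
        """Decide whether a transfer spends high-priority quota."""
        packets = -(-file_bytes // PACKET_BYTES)  # ceiling division
        wants_priority = force_priority or file_bytes <= SMALL_FILE_THRESHOLD
        if wants_priority and self.quota >= packets:
            self.quota -= packets
            return "high-priority"
        return "best-effort"  # still unlimited, just at the mercy of traffic

user = Subscriber(monthly_quota_packets=100_000)
print(user.classify(200_000))        # small page -> "high-priority"
print(user.classify(700_000_000))    # big video -> "best-effort" by default
print(user.classify(700_000_000, force_priority=True))  # quota too small anyway
```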

Just granting users the possibility of setting traffic priorities themselves, Briscoe believes, is enough to assuage concerns about network neutrality. "I suspect that 95 percent of customers, if they were given the choice between doing that themselves or the ISP doing it for them, would just say, Oh, sod it, do it for me," Briscoe says. "The important point is they were asked. And they could have done it themselves. And I think those 5 percent that are complaining are the ones that wish they were asked."

In Briscoe's scheme, users could pay more for larger quotas of high-priority packets, but this wouldn't amount to the kind of usage cap or "punitive tariff" that Odlyzko says ISPs are wary of. Every Internet subscriber would still get unlimited downloads. Some would just get better service during periods of congestion.

In order to determine which packets counted against a user's quota, of course, ISPs would need to know when the network is congested. And that turns out to be more complicated than it sounds. If a Comcast subscriber in New York and an EarthLink subscriber in California are exchanging data, their packets are traveling over several different networks: Comcast's, EarthLink's, and others in between. If there's congestion on one of those networks, the sending and receiving computers can tell, because some of their packets are getting lost. But if the congestion is on Comcast's network, EarthLink doesn't know about it, and vice versa. That's a problem if the ISPs are responsible for tracking their customers' packet quotas.

Briscoe is proposing that when the sending and receiving computers recognize congestion on the link between them, they indicate it to their ISPs by flagging their packets--flipping a single bit from 0 to 1. Of course, hackers could try to game the system, reprogramming their computers so that they deny that they've encountered congestion when they really have. But a computer whose congestion claims are consistently at odds with everyone else's will be easy to ferret out. Enforcing honesty is probably not the biggest problem for Briscoe's scheme.

Getting everyone to agree on it is. An Internet packet consists of a payload--a chunk of the Web page, video, or telephone call that's being transmitted--and a header. The header contains the Internet addresses of the sender and receiver, along with other information that tells routers and the receiving computer how to handle the packet. When the architects of the Internet designed the Internet protocol (IP), they gave the packet header a bunch of extra bits, for use by yet unimagined services. All those extra bits have been parceled out--except one. That's the bit Briscoe wants to use.
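Mechanically, flagging a packet this way is just a bitwise OR. Which header bit would actually carry the signal is precisely the standards question Briscoe faces, so the position in this sketch is arbitrary, chosen only to show the operation:

```python
# Sketch: declaring and reading a one-bit congestion signal in a packet
# header. The byte index and bit position are arbitrary stand-ins, not
# the real remaining IP header bit.

CONGESTION_SEEN = 1 << 0  # hypothetical flag bit within a header byte

def mark_congestion(header: bytearray, byte_index: int) -> None:
    """The endpoint sets the flag to declare it observed congestion."""
    header[byte_index] |= CONGESTION_SEEN

def congestion_marked(header: bytes, byte_index: int) -> bool:
    """An ISP on the path reads the flag to charge against the quota."""
    return bool(header[byte_index] & CONGESTION_SEEN)

header = bytearray(20)                # a blank, IPv4-sized header
print(congestion_marked(header, 6))   # False: no congestion declared
mark_congestion(header, 6)
print(congestion_marked(header, 6))   # True: packet now carries the signal
```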

Among network engineers, Briscoe's ideas have attracted a lot of attention and a lot of support. But the last bit is a hard sell, and he knows it. "The difficult [part] in doing it is getting it agreed that it should be done," he says. "Because when you want to change IP, because half of the world is now being built on top of IP, it's like arguing to change--I don't know, the rules of cricket or something."

Someday, the Internet might use an approach much like Briscoe's to manage congestion. But that day is probably years away. A bandwidth crunch may not be.

Strange Bedfellows
Most agree that the recent spike in Internet traffic is due to video downloads and peer-to-peer file transfers, but nobody's sure how much responsibility each one bears. ISPs know the traffic distributions for their own networks, but they're not disclosing them, and a given ISP's distribution may not reflect that of the Internet as a whole. Video downloads don't hog bandwidth in the way that many peer-to-peer programs do, though. And we do know that peer-to-peer traffic is the type that Comcast clamped down on.

Nonetheless, ISPs and peer-to-peer networks are not natural antagonists. A BitTorrent download may use a lot of bandwidth, but it uses it much more efficiently than a traditional download does; that's why it's so fast. In principle, peer-to-peer protocols could help distribute server load across a network, eliminating bottlenecks. The problem, says Mung Chiang, an associate professor of electrical engineering at Princeton University (and a member of last year's TR35), is the mutual ignorance that ISPs and peer-to-peer networks have maintained in the name of net neutrality.

ISPs don't just rely on the TCP protocol to handle congestion. They actively manage their networks, identifying clogged links and routing traffic around them. At the same time, computers running BitTorrent are constantly searching for new peers that can upload data more rapidly and dropping peers whose transmissions have become sluggish. The problem, according to Chiang, is that peer-to-peer networks respond to congestion much faster than ISPs do. If a bunch of computers running peer-to-peer programs are sending traffic over the same link, they may all see their downloads slow down, so they'll go looking for new peers. By the time the ISP decides to route around the congested link, the peer-to-peer traffic may have moved elsewhere: the ISP has effectively sealed off a wide-open pipe. Even worse, its new routing plan might end up sending traffic over links that have since become congested.
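The timescale mismatch is the crux. A toy version of the peer-churn behavior Chiang describes--measure peers, drop the slowest, optimistically try a fresh one--might look like this (the peer names and rates are simulated):

```python
# Sketch: a BitTorrent-style client reacting to congestion in seconds,
# long before an ISP's routing tables would change.
import random

active = {"peerA": 80.0, "peerB": 45.0, "peerC": 12.0}  # measured KB/s
candidates = ["peerD", "peerE", "peerF"]                # untried peers

def rechoke(active: dict, candidates: list) -> None:
    """Drop the slowest active peer and optimistically try a new one."""
    slowest = min(active, key=active.get)
    del active[slowest]
    if candidates:
        newcomer = candidates.pop(0)
        active[newcomer] = random.uniform(5.0, 100.0)  # unknown until measured

for round_ in range(3):  # e.g., re-evaluate every ten seconds or so
    rechoke(active, candidates)
    print(f"round {round_}:", sorted(active, key=active.get, reverse=True))
```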

But, Chiang says, "suppose the network operator tells the content distributor something about its network: the route I'm using, the metric I'm using, the way I'm updating my routes. Or the other way around: the content distributor says something about the way it treats servers or selects peers." Network efficiency improves.

An industry consortium called the P4P Working Group--led by Verizon and the New York peer-to-peer company Pando--is exploring just such a possibility. Verizon and Pando have tested a protocol called P4P, created by Haiyong Xie, a PhD student in computer science at Yale University. With P4P, both ISPs and peer-to-peer networks supply abstract information about their network layouts to a central computer, which blends the information to produce a new, hybridized network map. Peer-to-peer networks can use the map to avoid bottlenecks.
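As a rough illustration of the idea--not the actual P4P protocol, whose exchanged map format is more abstract--a peer-to-peer client holding an ISP-supplied map might rank candidate peers like this:

```python
# Sketch: locality-aware peer ranking from a hypothetical ISP map.
# Each entry: peer -> (on the same network?, path congestion from 0 to 1).
isp_map = {
    "peer1": (True, 0.1),   # local, clear path: best choice
    "peer2": (True, 0.9),   # local but behind a congested link
    "peer3": (False, 0.2),  # remote network, clear path
    "peer4": (False, 0.8),  # remote and congested: worst choice
}

def p4p_score(peer: str) -> float:
    """Lower is better: prefer in-network peers and uncongested paths."""
    same_network, congestion = isp_map[peer]
    return congestion + (0.0 if same_network else 0.5)  # locality penalty

print(sorted(isp_map, key=p4p_score))  # ['peer1', 'peer3', 'peer2', 'peer4']
```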

In the trial, the P4P system let Verizon customers using the Fios fiber-optic-cable service and the Pando peer-to-peer network download files three to seven times as quickly as they could have otherwise, says Laird Popkin, Pando's chief technology officer. To some extent, that was because the protocol was better at finding peers that were part of Verizon's network, as opposed to some remote network.

Scared Straight?
Every technical attempt to defeat congestion eventually runs up against the principle of net neutrality, however. Even though BitTorrent Inc. is a core member of the P4P Working Group, its chief technology officer, Eric Klinker, remains leery of the idea that peer-to-peer networks and ISPs would share information. He worries that a protocol like P4P could allow an ISP to misrepresent its network topology in an attempt to keep traffic local, so it doesn't have to pay access fees to send traffic across other networks.

Even David Clark's proposal that ISPs simply charge their customers according to usage could threaten neutrality. As Mung Chiang points out, an ISP that also sold TV service could tier its charges so that customers who watched a lot of high-definition Internet TV would always end up paying more than they would have for cable subscriptions. So the question that looms over every discussion of congestion and neutrality is, Does the government need to intervene to ensure that everyone plays fair?

For all Klinker's concerns about P4P, BitTorrent seems to have concluded that it doesn't. In February, Klinker had joined representatives of Vuze and several activist groups in a public endorsement of net neutrality legislation proposed by Massachusetts congressman Ed Markey. At the end of March, however, after the Harvard hearings, BitTorrent and Comcast issued a joint press release announcing that they would collaborate to develop methods of peer selection that reduce congestion. Comcast would take a "protocol-agnostic" approach to congestion management--targeting only heavy bandwidth users, not particular applications--and would increase the amount of bandwidth available to its customers for uploads. BitTorrent, meanwhile, agreed that "these technical issues can be worked out through private business discussions without the need for government intervention."

The FCC, says Clark, "will do something, there's no doubt, if industry does not resolve the current impasse." But, he adds, "it's possible that the middle-of-the-road answer here is that vigilance from the regulators will impose a discipline on the market that will cause the market to find the solution."

That would be welcome news to Chiang. "Often, government legislation is done by people who may not know technology that well," he says, "and therefore they tend to ignore some of the feasibility and realities of the technology."

But Timothy Wu believes that network neutrality regulations could be written at a level of generality that imposes no innovation-killing restrictions on the market, while still giving the FCC latitude to punish transgressors. There's ample precedent, he says, for broad proscriptions that federal agencies interpret on a case-by-case basis. "In employment law, we have a general rule that says you shouldn't discriminate, but in reality we have the fact that you aren't allowed to discriminate unless you have a good reason," he says. "Maybe somebody has to speak Arabic to be a spy. But saying you have to be white to serve food is not the same thing."

Ultimately, however, "the Internet's problems have always been best solved collectively, through its long history," Wu says. "It's held together by people being reasonable ... reasonable and part of a giant community. The fact that it works at all is ridiculous."

Larry Hardesty is a Technology Review senior editor.

Why Web 2.0 Is No Bubble: Corporations Are Willing to Pay for It

Everyone seems to want an answer to the question "When will Web 2.0 startups start making money?" The implication is that unless we can answer the question, the "bubble" of Web 2.0 will burst and all of us who believe in this stuff will be revealed as fantasists.

The fact is, it's incredibly hard to make money as a Web 2.0 startup aimed at consumers.

There are hundreds of these companies, and they all clamor to brief us at Forrester. Each has its own twist on blogs, social networks, ratings, user-generated video, or whatever. It's hard to get people to pay attention to a new tool, and the value of the tool depends on lots of participation -- the classic chicken-and-egg problem. Your competitor is always one twist ahead of you. Some of these startups will succeed, but the odds are one in a thousand -- you need just the right idea, at the right time, with the right push or set of potential customers, and you need to take off with such velocity that you leave the competition in the dust.

Once a startup like this does take off, there's that other pesky little problem -- monetizing the success. Google transformed the online world by first generating huge traffic, then finding a business model. But Google's success was based on a fantastically clever advertising mechanism that was automated, attracted new advertisers, and served searchers nearly as well as it served advertisers. Facebook hasn't yet unlocked that advertising gold mine, and it flubbed its most prominent attempt with Beacon. Twitter has no business model yet. Ning has hundreds of thousands of visitors, but still runs Google AdSense ads. And these are the successes. No wonder people are skeptical.

A few of these companies may (and likely will) strike that gold as Google did and take off. But for any given startup, the odds are astronomical.

The amazing thing is that there is a class of startup companies making good money right now from Web 2.0. They're not flashy and they don't grow like mushrooms. But they've got all the business they can handle and they are growing. I am talking about companies that serve corporate social application needs. This isn't the typical Web 2.0 business paradigm, since serving corporate customers means lots of client service, which is people-intensive -- it doesn't lift off miraculously like a pure technology startup. In fact, in many of these companies, the technology itself is positively mundane. But the startups grow because they deliver value for which they can charge a premium and earn customer loyalty. The customers of these companies don't defect when something shiny and new comes along, because they like the service they're getting.

Here are some examples, listed by the objectives they help companies accomplish (for more on these objectives see Chapters 4 through 9 of Groundswell).

Listening. Communispace now has hundreds of private communities that its client companies are using to learn about their customers. It succeeds because it has found the key to running and moderating these communities effectively, and it grows despite charging $150K or more per year per community. The other class of listening companies is the brand-monitoring firms, and the track record here is great. Research giant Nielsen bought BuzzMetrics. Another research giant, TNS, bought Cymfony. J.D. Power & Associates bought Umbria. MotiveQuest, which is still independent, has typical clients happily paying $30K and up to work with it.

Talking. Talking with the Groundswell is tricky, but there are plenty of agencies ready to help you with it. After building dozens of campaigns and sites, Blast Radius was bought by mega-agency Wunderman. Brains on Fire ignited the spectacular success of Fiskateers. The digital divisions of companies like Edelman also compete in this space, as do the big Web service companies like Avenue A/Razorfish (now part of Microsoft).

Energizing. Ratings and reviews are the easiest way to energize customers to sell others, and the companies that provide them are taking off. Bazaarvoice's clients have generated over 10 billion customer reviews. PowerReviews works with over 200 retailers. And ExpoTV has built a business around consumers creating reviews on video.

Supporting. Support forums work -- they please customers and they reduce costs. Lithium has an impressive client list including Dell, AT&T, Comcast, and Sprint. The community space is crowded, but other companies with growing client lists include Jive Software, Awareness, and Mzinga/Prospero.

Embracing. Startups that enable clients to source ideas from their customers have a bright future, because customer-generated innovation is hot right now. Salesforce.com bought Crispy News and turned it into Salesforce Ideas, which powers idea sites for Dell and Starbucks. And Innocentive is growing rapidly, with 50 companies including Procter & Gamble offering prizes of $10,000 or more to innovators who can solve their problems.

While many were distracted by sparkly consumer-facing startups, these companies were building and growing solid businesses. Look how many of them were acquired! This is no bubble, because companies that deliver business value to clients have durable growth potential. Could this be the Web 2.0 business model everyone is looking for?

Wednesday, June 25, 2008

A Record-Breaking Optical Chip

Intel researchers have built a superfast silicon chip for optical networking.

By Kate Greene

The road to a faster Internet, data center, and personal computer is paved with silicon. Or so believe researchers at Intel who have unveiled a test chip--made entirely from silicon--that can encode 200 gigabits of data per second on a beam of light. In contrast, the most advanced chips used in today's fastest optical networks operate at speeds of 100 gigabits per second. And these 100-gigabit chips, which are made from nonsilicon materials, have limitations that Intel's chip doesn't: they can't scale to faster speeds as inexpensively as can those made from silicon.

While silicon is the material of choice in the electronics industry, it has been overlooked in the photonics industry because its optical properties are inferior to those of other semiconductors. Silicon doesn't produce, detect, and manipulate photons as well as materials such as indium phosphide and gallium arsenide. But within the past few years, optical engineers have been giving silicon a second look and cleverly engineering around some of its natural limitations.

The new Intel test chip splits an incoming beam of light into eight channels. Within each channel is a modulator, a device that encodes data onto light. After the beams are encoded with data, they are recombined. In the tests, each modulator ran at a rate of 25 gigabits per second, and each performed nearly identically, says Mario Paniccia, director of the company's silicon-photonics lab. He notes that only one modulator was tested at a time but says that in a future paper his team will publish the results from running multiple channels simultaneously. The multiple channels could produce cross talk, electrical or optical activity that could hinder performance. However, preliminary results, Paniccia says, show that due to the design, cross talk is limited.

In 2004, Intel researchers, led by Paniccia, proved that silicon could be used to build a one-gigabit-per-second modulator; in 2005, the team boosted the speed to 10 gigabits per second. Also in 2005, the researchers built a remarkably good all-silicon laser, and in 2006, they introduced a hybrid laser that combines indium phosphide with silicon, allowing a practical telecom laser to be fabricated on a silicon wafer. Most recently, they have sped up the modulator to 40 gigabits and built a silicon detector.

Light it up: The silver-colored rectangular chip in the middle of the copper-colored holder is Intel’s latest advance in silicon photonics. The chip contains eight modulators that encode data onto light that enters and exits from the side via optical fibers (not pictured). This chip can process 200 gigabits of data per second and is used to test designs that could ultimately process a terabit of data per second.
Credit: Intel

Other companies are now also exploring the capabilities of silicon for photonics. IBM and Sun Microsystems have active research groups, and a startup called Luxtera has already made advances in silicon-based optical interconnects for data centers. At Intel, however, the pieces are coming together to make a single chip that could process a terabit of data in the space of a thumbnail. This chip and its accompanying electronics could replace racks full of expensive hardware that currently occupy rooms at Internet switching stations. And if all goes well, optical devices made of silicon could allow engineers to replace copper wiring in computers with beams of data-encoded light.

"Intel has pioneered a lot of high-speed silicon-photonics devices, and it's certainly one of the premier research groups," says Jack Cunningham, co-principle investigator of Sun Microsystems' proximity interconnect project, which focuses on low-power interchip communication for high-performance computers. Cunningham says that the Intel test chip is another important step in the evolution of silicon photonics. "It's the right direction in the sense that high-bandwidth optical signaling on silicon chips is very important," he says.

Paniccia notes that there is still a lot of work to do before Intel's optical chips find their way to market. Instead of having only 8 modulators, the goal is to have 25 on a chip. In addition, the modulators will run faster--at 40 gigabits per second. And it's still unclear how light will be piped into the modulators in the future. Currently, it enters via an optical fiber on one end of the device, but future versions of the chip may include hybrid lasers fabricated on the chip. Paniccia hopes that in three to five years, Intel's silicon-photonics chips will be ready for market.
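The aggregate-rate arithmetic, for the record:

```python
# Today's test chip versus the stated design goal.
channels_now, rate_now = 8, 25        # modulators x gigabits per second
channels_goal, rate_goal = 25, 40

print(channels_now * rate_now, "Gb/s")    # 200 Gb/s: the demonstrated chip
print(channels_goal * rate_goal, "Gb/s")  # 1000 Gb/s: a terabit per second
```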

Listening on the road

[Images: two second-world-war posters, via American Merchant Marine at War, www.usmm.org. "Loose lips might sink ships" (artist: Essarge; 28 x 21.5 inches; published in New York by Seagram-Distillers Corp. as part of the House of Seagram's contribution to the national victory effort) and "Because somebody talked!" (artist: Wesley, 1944; 28 x 20 inches; Washington, D.C.: U.S. Government Printing Office, Office of War Information, 1944).]
Posted by Economist.com | LONDON

BUSINESS travellers are nosy, it seems—and the Brits are worse than the Americans. Regus, a provider of office spaces, asked 1,000 professionals on both sides of the Atlantic about their working habits when travelling. To no great surprise, it found that many have no qualms about eavesdropping, while the absence of an office often forces them to work in undignified surroundings. The figures suggest there's an awful lot of foolish chattering going on, reminding me of those second-world-war posters about "careless talk".

The survey found that:

• 67% of Brits travelling for work have eavesdropped on someone else's business conversation, versus 59% of American professionals
• 35% of travelling British professionals have caught sight of sensitive company documents, along with 34% of Americans
• 13% of British professionals have been able to use information they overheard in public, versus 19% of American mobile professionals

When they're not busy overhearing things they shouldn't, British travellers end up working in some bizarre environments:

• 16% have worked from toilets and public bathrooms
• 51% have worked from bars or pubs (57% of men, as opposed to 42% of women)
• 46% have worked from shopping centres
• 12% have worked from a gym

For the record, Gulliver eavesdrops readily enough: if someone is nattering into their phone or to a colleague in a public space, then their conversation is open to all listeners. Especially if it's disrupting what would otherwise be a peaceful train ride.

Tuesday, June 24, 2008

Part II: The Business of Social Networks

July/August 2008

Can social-networking sites ever make money?

By Bryant Urstadt

Bad Neighbors
Another problem that targeting may not be able to solve is the one posed by what advertisers call "content adjacency."

Unlike a newspaper or television show, social networking is a medium whose content is deeply unpredictable. In the sports pages of a newspaper, an advertiser knows roughly what kind of material its ads will be running next to. But an enormous, highly visible brand may not want to risk seeing its ad wind up on a page such as that run by the actual Facebook group "I've Had Sex with Someone on Facebook," which at press time had 59,353 members. Or consider the MySpace profile (turned up after about two minutes on the site) of 18-year-old "Nikki AKA Death Angel!," which is adorned with the motto "Don't fuckin fuck with ninjette bitch we'll cut ur fuckin head off an give it to ur momma."

This is not content that commands high rates, although certain buyers mind less. "Right now, the low-hanging fruit is entertainment, because they're agnostic about content adjacency," says Goldstein. Indeed, Nikki's badass profile features an ad for the Warner Bros. film Get Smart. But even entertainment companies are steering clear of the user-generated communities offered by Ning and KickApps. "It's not a controllable universe right now, with the porn sites and such," Ruxin says. "It's a blind buy."

Not everyone is so pessimistic. Andrew Braccia, a partner at Accel, one of Facebook's early investors, thinks advertisers will eventually become more accepting of the "breathing, dynamic" nature of social networking and grow to understand that its unpredictability is part of its allure. And Facebook's Palihapitiya, perhaps naively, doesn't seem to think the adjacency problem will arise much on his site; Facebook, he says, has "a tremendous amount of user content moderation, with a very simple mechanism for flagging inappropriate material."

Fancy, this: In spring 2007, artist David Choe painted the walls at Facebook’s Palo Alto offices.
Credit: Jamie Kripke
Users' ideas of what's appropriate are hardly the same as advertisers', though. Such arguments may not be enough to sway the enormous, image-conscious brands that drive the majority of the advertising market. And Palihapitiya, deliberately or otherwise, may be missing the point: advertisers dislike rude content not merely because it might reflect badly on their brands, but because people reading such stuff are probably not thinking about buying many things that advertisers are selling.

Still, backers of social networking feel strongly that so many eyeballs must have value. Braccia points out that while more than 6 percent of advertising dollars are spent online, 20 percent of media consumption now happens there. "It's a significant opportunity," he says. "We're so young, so in our infancy here."

"These sites are no different from traditional media properties," says Paul Kedrosky, who writes Infectious Greed, a much-read blog on venture capital and the Internet. "We're holding these sites to an absurd standard. The advertising allocations will follow the consumer, and right now they're badly out of whack."

Roger McNamee remains convinced that Facebook is too alluring, too useful, and too established not to be profitable somehow. The answer is out there, even if he doesn't have it. "Someone," says McNamee, "is going to have to get creative. I take it on faith that it will emerge. After all, I'm an investor. I'm hopelessly biased."

Marc Canter has a few ideas. Canter, who cofounded Macromedia, is now CEO of the company that produces the social-networking tool PeopleAggregator, which aims to allow communities, tools, search engines, and the rest of Web 2.0 to interconnect in one giant open mesh. He imagines ads of all kinds making up only about a third of revenue, with profits coming from a "long tail" of sources--from Craigslist-style marketplaces to on-demand music downloads to branded apparel to ad-free premium services.

Chamath Palihapitiya expects Facebook to generate revenue by selling a variety of such services to users. The site has rolled out a "gift" program, in which friends spend real money to "give" friends virtual items, such as an image of a box of tissues with a get-well note. He also suggests that Facebook may at some point see revenue from ads served through applications on its site, a growing and potentially major source of income from which it currently gets nothing.

Perhaps most optimistic of all is venture capitalist Ron Conway, the subject of the book The Godfather of Silicon Valley, who has invested in Google, PayPal, and dozens of Web 2.0 companies. "MySpace projected it would do a billion dollars' worth of revenue this year. They came up short and did $800 million," he says. "Rupert Murdoch only paid $570 million for the whole thing. It's been called the best acquisition of all time. I think Facebook is a couple of years behind MySpace but on the same trajectory. It's a hugely monetizable business. I think it's a slam dunk."

A GLOOMY FORECAST
Can social-networking sites continue to make significant inroads into the U.S. online advertising market? The outlook is uncertain. A shaky economy and setbacks in targeted-advertising initiatives have caused leading online marketing research firm eMarketer to project more modest revenue growth for social-networking sites over the next four years than it had previously predicted.

THE GLOBAL VIEW
Social networking is a global phenomenon, and reaching users outside the United States will become increasingly important as advertising dollars flow to Western Europe, Asia, and beyond.

Things Fall Apart
The ghosts of vanished giants haunt social networking. So many formerly great Internet companies are struggling or dead. Consider CompuServe, AOL, Netscape, Napster--even Yahoo. Lycos, a search engine that was sold to Terra Networks in 2000 for $12.5 billion, was sold to a Korean firm for $95 million four years later.

What CompuServe and many of the others have in common is that they were portals: gateways to the Web. Facebook wants to be something similar: not just a useful and fun social tool but the first page people open on the Web, and the platform they use for all their other communication on the Internet.

As would-be portals, however, social-networking sites are vulnerable to one of the problems that brought down those earlier Internet businesses. The portals were "walled gardens" where inexperienced Internet users congregated for a time but where they became restless at last--leaving for the wider, wilder Web. Facebook and MySpace understand this and are now struggling to achieve an appropriate balance between openness and control.

They're also struggling with faddishness. Danah Boyd, a doctoral candidate at the University of California, Berkeley, studies social networking as a cultural phenomenon. She describes online hot spots as though they were popular pubs. "It's supercool when all of your friends go there," she says. "Then all sorts of other people come in. Even if the pub doesn't start feeling physically crowded, it starts feeling socially crowded when your ex is at the other end of the bar talking to some creep who brought his fellow gang members. How long until you say, 'Enough--I'm outta here'?"

Home Page
Several attendees at EconSM took the same flight home, and anyone paying attention on that red-eye from Los Angeles to New York got a lesson on social networking's place in modern life.

Just before the plane began its descent, a 28-year-old woman named Erin fainted on the way to the bathroom. She was possibly overtired, or maybe weirded out by the inhumane crush of economy class. Even she didn't really know what happened. By the time we were on the runway, she had regained her senses. Her first question to the flight attendant was, "Did anyone get my phone?"

As soon as the attendant handed her her iPhone, she opened it up and went right to her Facebook account. She wasn't looking for ads and she wouldn't have noticed one, unless it annoyed her by getting in the way. She wanted to reach her friends, and that was all.

Bryant Urstadt has written for Rolling Stone and Harper's.