Monday, December 15, 2008

Community and the Incalculable ROI

Return on Investment (ROI) is a funny thing, easy to calculate but not always easy to interpret. In many cases, what you can quantify pales in comparison to what you cannot.  Nonetheless, these inestimable flows are real and valuable.

I call this the incalculable ROI.

The Wall Street Journal recently profiled Executive MBAs (EMBAs). Scott Thomas, a 31-year-old who was halfway through the EMBA at a Cleveland school, dropped out to enroll in Ohio State University’s EMBA program. This doubled his tuition costs to $72,500 for the 18-month degree.

The WSJ concluded that this was a good move for Mr. Thomas because the Ohio State program yields a 170% return on investment, third behind only Texas A&M and the University of Florida.
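The WSJ doesn't show its arithmetic, but the classic ROI formula behind rankings like this is simple: net gain divided by cost. Below is a minimal sketch in Python; the $72,500 tuition comes from the article, while the salary figures and the five-year horizon are hypothetical, invented purely to illustrate the calculation.

    # Back-of-the-envelope EMBA ROI, in the spirit of (but not identical
    # to) the WSJ ranking. Tuition is from the article; the salaries and
    # the five-year horizon are hypothetical, for illustration only.

    def simple_roi(gain: float, cost: float) -> float:
        """Classic ROI: net gain over cost, as a percentage."""
        return (gain - cost) / cost * 100

    tuition = 72_500          # 18-month Ohio State EMBA (per the WSJ)
    salary_before = 100_000   # hypothetical pre-degree salary
    salary_after = 130_000    # hypothetical post-degree salary
    years = 5                 # hypothetical horizon for counting the gain

    gain = (salary_after - salary_before) * years
    print(f"ROI: {simple_roi(gain, tuition):.0f}%")   # ROI: 107%

Notice how much the answer swings with the horizon you choose, which is precisely the trouble with treating any single ROI number as gospel.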

However, it's likely that Mr. Thomas had something different in mind.  One of his reasons for transferring to the Ohio State program was that “the alumni network is unbelievably large, and they’re unbelievably loyal.”  In other words, by moving to a program with a robust community, Mr. Thomas might well connect with an investor in some future start-up. Or his next boss. Or a partner who goes on to help him launch a world-beating product.

The ROI calculated by the WSJ is surely important for near-term cash flow, but it pales beside the incalculable ROI of an expanded, lifelong professional network.

Wednesday, October 29, 2008

Don't Go Toward the Light!

How many hours of sleep do you lose a week because of electric lights and electronic gadgets? Maybe an hour every weeknight? Maybe 3 or 4 hours every weekend?  Maybe more?

Fear not.  The great restorative elixir of our age is coffee.  It has become the worker's little helper, the drug that makes us clear-eyed in the mornings and props us up in the afternoons.  It has evolved from drink to self-medication to lifestyle, philosophy, and economic juggernaut, all with the underlying mission of offsetting our loss to the bright lights and dancing screens of the night.  Caffeine has turned Dunkin' and Starbucks into the great Pavlovian beacons of our age.


I was pondering this while reading the National Geographic article, “The End of Night,” which highlighted another aspect of having too much man-made light on our planet. Author Verlyn Klinkenborg's interesting claim is that we have “engineered night to receive us by filling it with light,” no different from damming a river.

"Now most of humanity lives under intersecting domes of reflected, refracted light. . .Nearly all of nighttime Europe is a nebula of light, as is most of the United States and all of Japan.” In the south Atlantic where squid fishermen use halide lamps to attract their prey, the light cast into space is brighter than Buenos Aires."

The consequence is light pollution. In many places on earth, we have lost the stars. Worse yet, “whenever human light spills into the natural world, some aspect of life—migration, reproduction, feeding—is affected.”

“Migrating birds collide with brightly lit tall buildings," Klinkenborg writes. "Insects cluster around streetlights, providing artificial hunting-grounds for bats. Birds sing at unnatural hours, breed earlier than they should, and put on fat too early for their migratory cycle. Hatchling sea turtles are confused by artificial lighting on the beach, with losses in the hundreds of thousands."

Then there’s the toll light takes on us. Klinkenborg adds, “for the past century or so, we’ve been performing an open-ended experiment on ourselves, extending the day, shortening the night, and short-circuiting the human body’s sensitive response to light. . .At least one new study has suggested a direct correlation between higher rates of breast cancer in women and the nighttime brightness of their neighborhoods.”

Imagine living in a country where 200 million adults are habitually tired. Imagine how grumpy people would be in traffic, how difficult they'd be to work with, and what bad listeners they’d all make. Imagine the foolish things that would go on in such a country where everyone gets robbed of an hour of sleep, compensates with caffeine all morning, and sleepwalks all afternoon.

In 1995, Wolfgang Schivelbusch wrote Disenchanted Night: The Industrialization of Light in the Nineteenth Century. In it, he discusses some of the social implications of light, and makes it clear that, while nobody likes to stub his toe in the dark, the adoption of 7-by-24 artificial light has come at a huge cost to Europeans and Americans.

Here are a few of the things I learned:

1. For thousands of years, the flame remained essentially unchanged as a source of light for human activity. When people wanted more light, they added more flames. In 1688, for example, 24,000 lights—presumably wax candles—were used to illuminate Versailles.

2. Because artificial light was expensive, only royalty used it for extravagant displays like Versailles. “Artificial light was used for work, not for celebrations; it was employed in a rational, economical way. It emancipated the working day from its dependence on natural light, a process that had begun with the introduction of mechanical clocks in the sixteenth century.” Prior to that, Schivelbusch writes, “the medieval community prepared itself for dark like a ship’s crew preparing to face a gathering storm”—retreating indoors, closing the city gates, and bolting doors.

3. As long as the artificial light required was limited to individual craftsmen, candles and oil lamps were adequate. But, once industrial methods of production were adopted, artificial light was needed for larger spaces and longer periods of time. “In the factories, night was turned to day more consistently than anywhere else.”

4. Schivelbusch says that the wick was as revolutionary in the development of artificial lighting as the wheel was to transport. In fact, people grew so accustomed to wicks that, “in the dazzling brightness of the gaslight, the first thing people wanted to know was what had happened to the wick. ‘Do you mean to tell us it will be possible to have a light without a wick?’ an MP asked the gas engineer William Murdoch at a hearing in the House of Commons in 1810.”

5. Once a house was connected to a central gas supply, it lost its autonomy. “To contemporaries it seemed that industries were expanding, sending out tentacles, octopus-like, into every house. Being connected to them as consumers made people uneasy. They clearly felt a loss of personal freedom.” Many turned off their gas at night, like the medieval city closing its doors. By the mid-1820s most big cities in England had gas, and by 1829 it was being used for street lighting; by the late 1840s it had reached many small towns and villages.

6. There was, of course, a genuinely good reason to fear gas; early gasometers were expected to explode at any minute. And, often they did.

7. The most outstanding feature of gaslight was its brightness. Traditional flames paled in comparison. In fact, the gas flame was so bright that people could not look at it directly; hence the need for shades and frosted glass to dissolve and soften the concentrated light. Worse, though, was that gas consumed so much oxygen that gas-lit rooms quickly became stifling. People often felt it at the theater, where headaches were common; at home, gas caused headaches and sweating, and could ruin interior decorations. Household guides at the time recommended against gaslights in any of the common living areas.

8. By the mid-nineteenth century, 1,500 police patrolled Paris by day and 3,500 lanterns lit it by night. This lighting was so effective in reducing crime that lantern-smashing became a common crime. In Les Miserables, you might recall, one of the chapters ("A Boy at War With Street-Lamps") describes Gavroche out having his turn at the lanterns. In many cities, the magnificent signboards that decorated the front of shops were removed because they blocked too much light.

9. With signs coming down, shops transitioned to the lighted shop window. This paralleled the ability, about 1850, to make large sheets of glass. Together, these inventions allowed retail shops to extend their hours past sundown.

10. The electric light bulbs shown at the 1881 Paris Electricity Exposition were marketed as superior to gas in every way, shining evenly and steadily irrespective of the season. The bulbs demonstrated were a little weaker than today’s 25-watt light bulbs. Unlike gaslight, electric light was welcome in every room of the household.

11. Still, the electric light took some getting used to. As one observer noted, “There is something that is lost in electric light: objects (seemingly) appear much more clearly, but in reality it flattens them. Electric light imparts too much brightness and thus things lose body, outline, substance—in short, their essence. In candlelight objects cast much more significant shadows, shadows that have the power actually to create forms.”

And, because the electric light “lit” more of the space, and more brightly, it changed the nature of home decorating. “Muted colors are more compatible with the lively lighting in our homes.”

It’s been only about two centuries that humans have been able to control the dark. The unintended consequences of lighting the world are significant, both for ourselves and for the creatures around us. New home decorations. Confused turtles. Tired people. Sick people.

Imagine a world without electricity and electronics. But before we do, let’s go get some coffee so we don't fall asleep while we're doing it.

Thursday, August 28, 2008

Historical Postcards & the Battle of New Orleans

For every nation there are a handful of events each century that are so stunning and transformative that they become what I call “historical postcards”—big, bright pictures burned into the memories of an entire living generation.

One such postcard for my father’s generation was Pearl Harbor.  He was just a young boy when the bombs fell, and I remember him telling me that he heard the news on the radio and hid under his bed for fear that his New England city would be next. He never forgot that terrible feeling of fear and loss, nor did many members of his generation.

There have been, by my count, five historical postcards in my generation. I do not include the assassination of Robert Kennedy, or the Space Shuttle Columbia disaster; these were big, important events, and I can still see the pictures in my mind.  But an historical postcard, at least by my definition, changes the world as we know it.

Sunday, July 27, 2008

Genealogy, the Idaho Russet and Innovation

In April I was elected Chairman of the New England Historic Genealogical Society, one of my favorite organizations on earth. Not only does the Society have a world class staff, but its Trustees and Councilors are a group of extraordinarily talented individuals who dedicate their time, talent and treasure to collecting, preserving and interpreting--so our mission goes--the stories of families in America.

Founded in 1845, the Society has seen its locus move rapidly in the last decade from its beautiful library on Newbury Street in Boston (still active and vibrant) to a global on-line presence.

With that in mind, these are remarks I made shortly after becoming Chairman. The subject, ostensibly, is the Idaho Russet potato, one of which I pulled from my pocket during the speech. I suggest, for full effect, that you find one in your kitchen and place it on your monitor now.

Tuesday, July 8, 2008

A Plague of Dead Squirrels

Yeah you got yer dead cat and you got yer dead dog
On a moonlight night you got yer dead toad frog
Got yer dead rabbit and yer dead raccoon
The blood and the guts they're gonna make you swoon


--Loudon Wainwright III, Dead Skunk

[NB: No animals were injured in the writing of this article.]

I am sad to report that I am predicting a plague of dead squirrels on the roads of my suburban New England neighborhood. Not tomorrow, but--guessing now--beginning in about 2010 or 2011, and easily stretching for a decade.

I’m predicting the same for your neighborhood as well.

And it won’t just be squirrels—it’ll be chipmunks and skunks, rabbits and raccoons, and a few mystified deer. I’m afraid, even in urban areas, there will be a few more bicyclists thrown from the carbon frames of their Kona King Zings. All beginning about 2010.

It will be, for better or worse, another in a long, unbroken line of unintended consequences surrounding otherwise staggeringly beneficial innovation.

Let me explain.

I just reviewed a book written by Clay McShane and Joel Tarr called The Horse in the City: Living Machines in the Nineteenth Century. It’s a fascinating look at the horse as a “living technology”--and a very persistent living technology--whose numbers continued to grow rapidly despite the steam and mechanization of the Industrial Revolution.

The horse of the nineteenth century is apt to call up a tableau of some bucolic farm, or perhaps of cowboys out on the open range. While these scenes certainly existed--powered by our allegiance to the American frontier myth--the explosion in the use of the horse in nineteenth-century America occurred primarily in urban areas.

By 1900, a city like New York contained an average of one horse for every 26 people, with 130,000 horses in Manhattan alone pulling street cars and food wagons, carrying firefighters and their equipment, removing snow, carting off the dead, and even providing sources of stationary power.

One of the most interesting features of an innovation is what happens (all around it) during rapid adoption. In the case of horses, we know the obvious: more people and goods got around the city faster and with less human energy. But what about the other consequences, the unintended consequences of such rapid adoption?

In the case of the nineteenth-century horse, the authors point to the following items:
Waste. It’s fair to say that Brooklyn agriculture was built on Manhattan manure. (Farmers termed Manhattan a “manure factory.”) It was only when imported guano became a cheaper commodity that this inter-borough trade slowed.

(Those of you interested in learning how bird poop is harvested, or indeed, how it achieved a sustainable competitive advantage over horse poop will, I’m afraid, have to seek sources outside this blog.)

Abuse. Urban reform groups like the ASPCA took up the welfare of the horse, policing against abuse while actively euthanizing old or lame horses, worth more to the rendering plant than alive. It became clear that the horse was viewed by most city-dwellers in utilitarian terms—a unit of production—subject to replacement when the creature became less productive.

Infrastructure. Cities had to create an extensive physical plant to support the horse, including municipal stables and carcass-removal programs. Parkways were created, in part, as venues for afternoon promenades. Meanwhile, the numbers of teamsters, hostlers and stable-keepers tripled from 1870 to 1890—a strange phenomenon in the face of the Industrial Revolution.

Medicine. The burgeoning urban horse population led to the rise of a skilled class of urban veterinarians.

Breeding. The horse became subject to breeding programs designed to increase its size and endurance.

Sprawl and suburbs. Street railroads pulled by horses not only encouraged the sprawl of residential neighborhoods but also enabled an expansion of amusement parks and resort destinations for the working class. Indeed, the size and stench of the attendant infrastructure virtually ensured that well-heeled urbanites would eventually find their way to suburbia, even if they had to create it in the process.

Farming. Hay production soared in the farmlands because of the growth of the urban horse. By 1909, more than half of New England’s farmland was involved in hay production. This led to improvements in hay-pressing technology and the ability to ship hay great distances.

Trade. A vast national and international trade in horses developed.

What McShane and Tarr make clear is that, while a horse is a horse (of course, of course), its innovative urban use led to a set of vast, largely unforeseen, and completely unintended consequences.

All of which got me thinking about some of the unintended consequences of more modern innovations.

Take the iPod, for example, one of the great entertainment gadgets of our times. Doesn’t it seem likely that one of the unintended consequences of the iPod will be a generation of Americans who begin to experience serious hearing loss in their 40s? Will the iPod one day double, with the flip of a switch, as a hearing aid?

Of course, its predecessor, the television, helped to create the couch potato, the TV dinner, and a habitually sleep-deprived society. (And we still adore it, so I suspect the iPod will still be treasured, even as our national hearing deteriorates.)

Some of the great unintended consequences of our time come from our medical innovations. The wonder drug of the twentieth century, penicillin, has led to the evolution of the superbug. Even Viagra (what could be wrong with Viagra?), a sensation with older men since its launch ten years ago, has the dubious distinction in a recent poll (of women—they finally polled women!) of leaving one-third of the women surveyed just plain annoyed at having to have sex at the drop of a pill, and convincing one in ten that Viagra led to their husbands’ infidelity.

Of course, if you need ill-effects from innovation, look no further than the Web, which robs us of our time and concentration, and truly appears to be making us all stoopid.

The TV, the iPod and the Web; the Tinker to Evers to Chance of unintended consequences: Of the great inventions of the last century, one makes us fat, lazy and tired, one destroys our hearing, and one lowers our IQ and our ability to concentrate.

And we’d love to take an aspirin to make it all better, but that has unintended consequences as well. I just can’t remember what they are.

As for social innovation, a stunning article in the July/August Atlantic by Hanna Rosin suggests that one of the great social programs of our generation--demolishing public-housing projects in large cities to free the poor from the destructive effects of concentrated poverty--has led to steadily falling crime rates in large cities for the last 15 years. That’s the great news. The unintended consequence? Almost like a successful franchising scheme, violent crime didn’t disappear; it just relocated to the mid-sized cities. FBI data now pegs the most dangerous spots in America as Florence, South Carolina; Charlotte-Mecklenburg, North Carolina; Kansas City, Missouri; Reading, Pennsylvania; Orlando, Florida; and Memphis, Tennessee.

Which, speaking of the spread of violence, brings me back to my original prediction of lots and lots of dead squirrels.

In the same Atlantic issue, Jonathan Rauch masterfully profiles General Motors’ attempt to build a true electric hybrid by 2010 in his article, “Electro-Shock Therapy.” This is not your neighbor’s Prius, which is a gasoline-powered car with an electrical assist. GM’s “Chevy Volt” will draw its power from any standard electrical socket and go 40 miles on a single charge. After 40 miles a small gasoline engine will ignite, driving a generator that will maintain the battery.

That means the wheels are always driven by the battery. That means the car will drive hundreds of miles on a tank of gas. That means the 75% of Americans who drive less than 40 miles a day will never buy any gas.
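To see why, here is a minimal sketch of that arithmetic in Python, modeling the car as a simple series hybrid. The 40-mile battery range and the 75% figure come from the article; the 50-mpg efficiency of the range-extending engine is my own assumption, for illustration only.

    # Rough model of a series hybrid like the planned Chevy Volt: the
    # battery covers the first BATTERY_RANGE_MILES of a day's driving,
    # and the gasoline generator covers anything beyond that.

    BATTERY_RANGE_MILES = 40   # per the article
    GENERATOR_MPG = 50         # assumed efficiency of the range extender

    def daily_gas_used(miles_driven: float) -> float:
        """Gallons of gasoline burned in one day of driving."""
        miles_on_gas = max(0.0, miles_driven - BATTERY_RANGE_MILES)
        return miles_on_gas / GENERATOR_MPG

    for miles in (25, 40, 60, 300):
        print(f"{miles:4d} miles/day -> {daily_gas_used(miles):.1f} gallons")

A driver at 25 or 40 miles a day burns no gas at all, which is exactly why the 75% of Americans who drive less than 40 miles a day might never visit a pump.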

There are lots and lots of technological hurdles to clear, mostly around the battery, if GM is going to make its 2010 date. (You could have the car today if you didn’t mind pulling the battery around in an air-conditioned U-Haul, for example.) But fear not; even if the date slips a bit, there will be electric cars on the road in the not-too-distant future. Like 2011.

Very eco-friendly. Very cool. Very innovative. Pretty darn fast. Awfully darn heavy. And very, very quiet.

And that is terribly bad news for squirrels. Because one of the unintended consequences of this breakthrough innovation will be, I’m afraid, a national sneak-attack on creatures of every sort caught, however momentarily, dallying in the road.

I sure hope someone is thinking about this. Maybe Michelin is inventing tires that whistle at some special squirrel frequency. Maybe the next generation of road asphalt comes with sensors. Because, in my town alone, I can think of any number of blind corners that are made safe only by the rumble of an internal combustion engine.

When I lived in New York City I used to worry about the squirrels of Central Park, confined to a little island and genetically severed from their brethren. I worried that they would become a race of beer-swilling, sausage-scarfing, spandex-wearing rodents who would one day strap on rollerblades.

Now, I am more inclined to worry about squirrels everywhere. And deer. And you and me, out jogging or riding our bikes.

Q: Why did the squirrel cross the road?

A: Because it couldn’t hear the one-ton, battery-powered rolling mass of silent steel bearing down on it at 50 MPH from around a blind corner.

Wednesday, June 11, 2008

Game 3 Employees

This morning I was listening to Colin Cowherd on ESPN Radio talk about the Los Angeles Lakers’ win in Game 3 of the play-offs against the Celtics.  The Lakers, Cowherd said, won Game 3 but--big deal. The Lakers were supposed to win Game 3, he added. Teams that lose the first two games in a basketball play-off series almost always go home and win Game 3. It’s practically a given. And it doesn’t mean a thing.  “Game 3,” Cowherd said, “is fool’s gold.”

Statistically, the Lakers have about a 15% chance of becoming World Champions. That’s the history of NBA teams after falling behind in the first two games. So, while miracles do sometimes happen in sports, the Lakers are effectively cooked.  Cowherd ended by saying that, after the first two wins by the Celtics, this series was established.

It got me to thinking about some of the “Game 3” events that occur in our lives.

Suppose you are on a diet, trying to lose weight, but find yourself accidentally in the drive-thru of Burger King at lunchtime. And, being starved, you order a Whopper with cheese, onion rings, and a Diet Coke.  The Diet Coke is essentially a “Game 3” event. It’s a virtuous gesture, but it doesn’t matter. You might as well have gone with the chocolate shake, because once you’d ordered the Whopper and onion rings, your lunch was established.

Suppose you have a bad experience with an on-line merchant and lodge a complaint. In return you get a form email and no other response.  Finally, after repeated calls you get through to customer service, which apologizes and offers you a credit against future purchases.  That’s a “Game 3” event. By the time you have a bad experience with the original service, and a second bad experience trying to correct the first, it doesn’t much matter what customer service does.  That merchant is established.

I’ve often thought that one of the toughest things in business is to manage the “Game 3 Employee.” That’s the person capable of delivering superior work--and who does so for stretches at a time--but who habitually falls off (for whatever reason) into unacceptable performance.

You try to correct the problem through conversation, compensation, review, or even warning. And, inevitably, after one of these sessions, good performance returns.  But, in 85% of the cases, this is a Game 3 event. It’s fool’s gold. Because, with time, the Game 3 Employee inevitably falls off the wagon again.  You can root and you can hope, but the percentages are entirely and inevitably against you. Two cycles of this kind of activity and this employee is established.

Of course, you can always try again. And a lot of us do.  But we can also order the Diet Coke with the Whopper.  Some of us do that, too.

Wednesday, June 4, 2008

Camouflage Marketing: Making it OK to Buy What We Want

This is a post about vibrators--and, yes, those kinds of vibrators.

In “Socially Camouflaged Technologies: The Case of the Electromechanical Vibrator,” scholar Rachel Maines argues that there are certain products and technologies which, while sold legally, are expected to be used illegally or in a socially unacceptable manner. The success of these products demands not just marketing, but camouflage marketing.

One such product, the electromechanical vibrator, was marketed in the popular press from the late nineteenth century through the early 1930s in the guise of a modern, professional, medical instrument designed to cure female hysteria, a catch-all diagnosis that might comprise up to 75 percent of a nineteenth-century physician's practice.  Advertising leveraged the prevailing belief that electricity was a healing agent.  Once vibrators began appearing in stag films, however, this kind of camouflage was inadequate. Marketing of vibrators did not resurface until social change made it unnecessary to disguise use of the product.

Home vibrators were marketed as benefits to health and beauty, improving the circulation and soothing the nerves. An ad in a 1921 issue of Hearst’s magazine urges the considerate husband to give his wife “A Gift That Will Keep Her Young and Pretty.”  Advertising of electromechanical vibrators did not appear in magazines selling for less than 5 cents or more than 25 cents per issue, suggesting a target market of readers whose middle-class homes were being added to the electrical grid but who were not so well off that they could visit a spa.  The U.S. Bureau of the Census found 66 firms manufacturing these devices in 1908, and by 1919 the annual market was well over $2 million.

Maines lists other products that have been sold actively using camouflage marketing. Distilling technology sold during Prohibition was “Ideal for distilling water for drinking purposes, automobile batteries and industrial uses.”  Today, burglary tools are marketed in some popular magazines “with the admonition that they are to be used only to break into one’s own home or automobile.”

“Most recently,” Maines tells us, “we have seen the appearance of computer software for breaking copy protection, advertised in terms that explicitly prohibit its use for piracy, although surely no software publisher is so naïve as to believe that all purchasers intend to break copy protection only to make backup copies of legitimately purchased programs and data.”

I took a tour around the Internet, seeking more examples of camouflage marketing.  Planning to be in a brawl tonight? How about some brass knuckles, illegal most everywhere in the world? I found a great site that sells them “For novelty purposes only. Makes a fine paperweight.”  Another site proclaims that BitTorrent is “the global standard for accessing rich media over the Internet,” a better message than “BitTorrent: You too can rip off Hollywood.”

If you’ve ever been on the Seattle Underground tour, you’ll know that the Klondike Gold Rush of 1897 brought hundreds of prospectors into town on their way north to Alaska. Just coincidentally, there appeared an inordinate number of young women, most without visible means of support, who listed their profession as "seamstress."

Unlike traditional marketing, camouflage marketing isn’t about educating the consumer or creating a compelling value statement. The consumer already knows what it is he or she is buying.  Camouflage marketing just makes the transaction possible.

Friday, May 2, 2008

You Can Be Rich or King, But Not Both

Thomas Edison was the greatest American inventor of the nineteenth century, credited with inventing not just stuff, but the very discipline of research.

And, while inventing was satisfying, what Edison really wanted was to build businesses. In that regard, he was—despite his 1,093 patents--a complete and utter failure; Peter Drucker reminds us that Edison “so totally mismanaged the businesses he started that he had to be removed from every one of them.”  It was, Drucker says, the archetype for the now familiar “rags to riches and back to rags” phenomenon.

In the February 2008 Harvard Business Review, Noam Wasserman goes a long way toward explaining folks like Edison and the thousands of other bright innovators who conceive brilliant products, bring them to market, and then mismanage their companies, losing their jobs, destroying value, or both.

“Four out of five entrepreneurs,” Wasserman says, “are forced to step down from the CEO’s post. Most are shocked when investors insist that they relinquish control, and they’re pushed out of office in ways they don’t like and well before they want to abdicate. The change in leadership can be particularly damaging when employees loyal to the founder oppose it. In fact, the manner in which founders tackle their first leadership transition often makes or breaks young enterprises.”

Wasserman explains, however, that when founders are honest about their reasons for starting the business, the chances of “happily ever after” are significantly improved. Here are four take-aways from this excellent article:
1. New ventures are usually labors of love for entrepreneurs, who become emotionally attached and often accept a smaller salary than folks with comparable backgrounds. In addition, many entrepreneurs are overconfident about their prospects and naïve about the problems they will face. This combination of attachment, overconfidence and naïveté may, in fact, be necessary to get new ventures up and running, but these emotions later create problems.

2. Many founders believe that if they’ve successfully led the development of the organization’s first new offering, that represents ample proof of their management prowess. But, the shipping of the first products marks the end of an era. The founder then has to shift gears to build a company capable of marketing and selling large volumes of the product and providing customers with after-sales service. The venture’s finances become infinitely more complex. The organization needs to be structured. The dramatic broadening of the skills that the CEO needs at this stage stretches most founders’ abilities beyond their limits.

3. The faster the founder-CEOs lead their companies to the point where they need outside funds and new management skills, the quicker they lose control. The founder’s emotional strengths—being the heart and soul of the venture—often make it difficult to accept a lesser role, leading to sometimes traumatic leadership transitions within young companies.

4. Most founder-CEOs start out by wanting both wealth and power. But sooner or later, the smart ones grasp that they’ll probably have to make a choice. And the fundamental tension—being rich vs. being king—isn’t biased one way or the other. What matters is why the founder started the business in the first place. If he or she had a clear set of goals and a roadmap, the “new era” represented by the first product release doesn’t have to be traumatic.

In Milton’s Paradise Lost, Satan gives a rousing speech to his followers after being tossed out of heaven, telling them, “Better to reign in hell, than serve in heav’n.”  Admittedly, this speech came after one of the worst career moves in history.

But it doesn’t have to be that bad for founders, so long as they know the goal—rich or king—when they first decide to launch a business.

Of course, there is a third alternative alongside rich or king, and that is gifted. One of Edison’s contemporaries, George Westinghouse, was a prolific inventor, beat Edison in the race for electrical standards, and, by 1904, had founded nine manufacturing companies worth about $120 million and employing approximately 50,000 workers.

So, don’t forget “gifted” along with rich or king. Just be realistic when you choose your path.

Sunday, January 6, 2008

Chocolate Wars

One of the special rites of passage for many children is a visit to Hershey, Pennsylvania, and, in particular, Hersheypark. The sweet smell of chocolate hangs over the entire town. The park itself is clean, affordable, and scaled perfectly to its young clientele. And, not far away is a store and presentation center that tells the history of Hershey chocolate, including playing some of those great, old nostalgic ads (like “the Great American Chocolate Bar”).

It takes some doing to hold onto that warm and fuzzy feeling as you make your way through Joel Glenn Brenner’s The Emperors of Chocolate. A look behind the scenes at the manufacture and sale of chocolate products reveals the same down-and-dirty competitive world that exists in every other industry around the globe.

A reporter for the Washington Post, Brenner was assigned in 1989 to write a feature about Mars, Inc., detailing the company’s response to Hershey’s emergence as the nation’s No. 1 candy maker. When she checked the files on Mars, Brenner found exactly one press interview, in 1966, ever granted by Forrest Mars, Sr. (sometimes called the Howard Hughes of Candy). It took Brenner more than a year of weekly calls and cajoling to convince Mars to cooperate:

Over the next two years, I was given full access to the company’s operations around the world . . . I was given access to the Mars family’s personal archives and the company library, and allowed to interview—for the first time ever—John and Forrest Jr., who run the company today.  I found a world as peculiar as that depicted in the Roald Dahl fantasy Charlie and the Chocolate Factory. The company bears the indelible mark of its patriarch, Forrest Mars, Sr., whose idiosyncratic management philosophy has helped Mars become one of the most productive and profitable privately owned companies in the world. The resulting story in The Washington Post Magazine attracted national attention and won several awards. It also outraged the Mars family, who promptly closed their doors to me and haven’t spoken to a reporter since.

Brenner went on to discover through her research that one of the “M’s” in Mars’ “M&M” candy stood for R. Bruce Murrie, the son of Hershey’s longtime president. “I knew that somewhere in the tangled relationships between Mars and Hershey lay the true story of both companies.” So, she spent two years in Hershey, Pennsylvania, and the resulting book is a detailed study of the careers of Milton Hershey and Forrest Mars, Sr.—the so-called Emperors of Chocolate.

The story of these two giants, and of many lesser but still impressive players like Tootsie Roll and Henry Heide, is fascinating—a world of cutthroat competition, corporate espionage, and commodity purchases that can make or break the economies of small countries. But, as someone who grew up on the Great American Chocolate Bar, the single most startling fact in the book—and there are many candidates—is the repugnance with which most of the world outside the United States views milk chocolate. To a European, milk chocolate is a slightly sour, barnyard-variety chocolate not really fit for human consumption.

In the November 7 issue of Conde Nast Portfolio, Alexandra Wolfe reports on "Chocolate Wars," which introduces into the mix a modern-day giant—Warren Buffett, whose company owns See’s chocolates of San Francisco—and a proposal before the F.D.A. to redefine what is and what is not chocolate.

The competition turns out to be just as bruising as ever, and the stakes just as high, in this $16 billion industry. Here are a handful of take-aways from the Conde Nast article:
1. For as long as chocolate has been made, it’s been smoothed out with the elixir called cocoa butter, the natural fat of the cacao bean, which gives the finished product its silky texture. In the United States, the F.D.A. mandates that a product can’t legally be labeled as chocolate unless cocoa butter is part of the formula. But because of a drought and political violence in Ivory Coast, a major source for cacao beans, the price of cocoa butter has skyrocketed. This has prompted some of the major chocolate makers, Hershey among them, to lobby the F.D.A. by way of a trade-group petition for a change that would let them substitute such cheaper ingredients as vegetable oil and dried milk for cocoa butter and still call their products chocolate.

2. The cocoa-butter controversy began in October 2006 at a Washington board meeting of the Chocolate Manufacturers Association, a trade group dominated by the biggest names in chocolate: Hershey, Nestlé, Mars, and Archer Daniels Midland. The board was considering whether to support the Grocery Manufacturers Association, which includes some chocolate makers, in petitioning the F.D.A. to update U.S. food standards. The grocery manufacturers group, which happens to be chaired by former Hershey C.E.O. Richard Lenny, routinely submits such petitions when changes in food science demand it. But deep within the fine print of the document—in the last section of a 12-page appendix—lurked the clause allowing the cocoa-butter substitution. Most small chocolate makers were absent from the meeting, and there was no vote on the petition.

Premium-chocolate makers and their allies. . .soon banded together and created a website, DontMessWithOurChocolate.com. They denounced the proposed change as a “mockolate conspiracy” and bombarded the F.D.A. with protest letters and emails. As a result, the F.D.A. extended the public comment period and said it wouldn’t be deciding anytime soon.

3. Complicating matters is a dramatic shift in the chocolate market. Since 2001, sales of premium chocolate have climbed 129 percent, while at some large manufacturers, sales of mass-produced chocolate have declined. During the first six months of 2007, Hershey’s earnings dropped to $97 million, compared with $220.4 million in the same period of 2006. Mix in the sharp spike in cocoa-butter prices, and big chocolate makers have found themselves in a jam. Earlier this year, Hershey announced plans to shut down at least three of its 17 plants in the U.S. and lay off thousands of workers.

4. But many of the smaller chocolate makers see the effort to replace cocoa butter as a ploy that would allow the major companies to cut costs without risking their reputation—or sales. And that, the smaller companies argue, would not only mislead consumers but also give mass chocolate makers an unfair advantage.

5. To detractors of the proposed change, what’s at stake is the very future of the nation’s chocolate industry and its $16.3 billion in annual sales. Buffett’s interest is certainly pecuniary. His holding company, Berkshire Hathaway, got into the chocolate business when it bought See’s, an old-line San Francisco Bay Area chocolate maker, for $25 million in 1972. Given that See’s sales, these days more than $300 million annually, depend largely on the company’s reputation for quality, there are no plans to mess with any formulas, Buffett says. “If you’ve got recipes that people like, you don’t change them.”

6. The potential savings are substantial. By substituting other vegetable fats, chocolate makers could shave at least 50 cents a pound off the cost of producing their candy, estimates Fabrizio Parini, vice president of marketing for Ghirardelli Chocolate, which opposes the change.

Furthermore, premium-chocolate makers fret that the change could spark a consumer revolt against all chocolate. “It would be like saying margarine spreads could be called butter,” says Brad Kinstler, See’s chief executive.