Market anomalies and incongruities may point the way to your next breakthrough strategy.
During their heyday in the late 19th and early 20th centuries, transatlantic shipping lines such as the Hamburg America Line and the White Star Line transported tens of millions of passengers between Europe and the United States. By the 1960s, however, their business was being threatened by the rise of a disruptive new enterprise, namely, nonstop transatlantic flights. As it happened, the shipping lines had one potential strategy with which to save their business: vacation cruises. Starting in the 1930s, some of these lines had sailed to the Caribbean during the winter, putting their ships to use when rough seas made the Atlantic crossing unappealing. And in 1964, when a new port was opened in Miami, Fla., the pleasure cruise business began to boom.
But the great cruise lines missed this breakthrough opportunity. They saw their profitability fall while dozens of startups, including Royal Caribbean and Carnival, retrofitted existing ships to offer pleasure cruises and built an entirely new travel and leisure category that continues to grow today.
Managers and entrepreneurs walk past lucrative opportunities all the time, and later kick themselves when someone else exploits the strategy they overlooked. Why does this happen? It’s often because of the natural human tendency known to psychologists as confirmation bias: People tend to notice data that confirms their existing attitudes and beliefs, and ignore or discredit information that challenges them.
Although it is difficult to overcome confirmation bias, it is not impossible. Managers can increase their skill at spotting hidden opportunities by learning to pay attention to the subtle clues all around them. These are often contradictions, incongruities, and anomalies that don’t jibe with most of the prevailing assumptions about what should happen. Here is my own “top 10” field guide to clues for hidden breakthrough opportunities, observed in a wide variety of industries, countries, and markets. If you find yourself noticing one or more of them, a major opportunity for growth could be lurking behind it.
1. This product should already exist (but it doesn’t). As the accessories editor for Mademoiselle magazine in the early 1990s, Kate Brosnahan spotted a gap in the handbag market between functional bags that lacked style and extremely expensive but impractical designer bags from Hermès or Gucci. Brosnahan quit her job, and with her partner Andy Spade, founded Kate Spade LLC, which produced fabric handbags combining functionality and fashion. These attracted the attention of celebrities such as Gwyneth Paltrow and Julia Roberts. Many well-known product innovations — including the airplane, the mobile phone, and the tablet computer — began similarly, as products that people felt should already exist.
2. This customer experience doesn’t have to be time-consuming, arduous, expensive, or annoying (but it is). Consumer irritation is a reliable indicator of a potential opportunity, because people will typically pay to make it go away. Reed Hastings, for example, founded Netflix Inc. after receiving a US$40 late fee for a rented videocassette of Apollo 13 that he had misplaced. Charles Schwab created the largest low-cost brokerage house because he was fed up with paying the commissions of conventional stockbrokers. Scott Cook got the idea for Quicken after watching his wife grow frustrated tracking their finances by hand.
3. This resource could be worth something (but it is still priced low). Sometimes an asset is underpriced because only a few people recognize its potential. When a low-cost airline such as easyJet or Ryanair announces its intention to fly to a new airport, real estate investors often leap to buy vacation property nearby. They rightly expect a jump in real estate values. Similarly, the founders of Infosys Technologies Ltd., India’s pioneering provider of outsourced information technology services, were among the first to recognize that Indian engineers, working for very low salaries, could provide great value to multinational clients. The company earned high profits on the spread between what it charged clients and what it paid local engineers.
4. This discovery must be good for something (but it’s not clear what that is). Researchers sometimes recognize that they have stumbled on a promising resource or technology without knowing the best uses for it right away. The resulting search for a problem to solve can lead to great profitability. One example was the founding of the ArthroCare Corporation, a $355 million producer of medical devices based on a process called coblation, which uses radio frequency energy to dissolve damaged tissue with minimal effect on surrounding parts of the body. Medical scientist Hira Thapliyal, who codiscovered this process, founded a company to offer it for cardiac surgery, but that market turned out to be too small and competitive to support a new venture. Undeterred, he looked for other potential uses, and found one in orthopedics, where there are more than 2 million arthroscopic surgeries per year.
5. This product or service should be everywhere (but it isn’t). Sometimes people chance upon an attractive business model that has failed to gain the widespread adoption it deserves. Two archetypal retail food stories illustrate this. In 1954, restaurant equipment salesman Ray Kroc visited the McDonald brothers’ hamburger stand in southern California, and convinced them to franchise their assembly-line approach to flipping burgers. In 1982, coffee machine manufacturing executive Howard Schultz visited a Seattle coffee bean roaster and retailer called Starbucks. He recognized the potential of a restaurant chain based on European coffee bars, and he joined Starbucks, hoping to convince the company’s leadership to convert its retail stores to this format. When they didn’t, he started his own coffeehouse chain, later buying the Starbucks retail unit as the core of his new business.
6. Customers have adapted our product or service to new uses (but not with our support). Chinese appliance maker Haier Group discovered that customers in one rural province used its clothes washing machines to clean vegetables. Hearing this, a product manager spotted an opportunity. She had company engineers install wider drain pipes and coarser filters that wouldn’t clog with vegetable peels, and then added pictures of local produce and instructions on how to wash vegetables safely. This innovation, along with others including a washing machine designed to make goat’s-milk cheese, helped Haier win share in China’s rural provinces, while avoiding the cutthroat price wars that plagued the country’s appliance industry.
7. Customers shouldn’t want this product (but they do). When Honda Motor Company entered the U.S. motorcycle market in the late 1950s, it expected to sell large motorcycles to leather-clad bikers. Despite a concerted effort, the company managed to sell fewer than 60 of its large bikes each month, far short of its monthly sales goal of 1,000 units. Then a mechanical failure forced the company to recall these models. In desperation, it promoted its smaller 50cc motorbike, the Cub, which Honda executives had assumed would not interest the U.S. market. When the smaller bikes sold well, Honda realized it had discovered an untapped segment looking for two-wheel motorized transportation. (The campaign is still remembered for its catchphrase, “You meet the nicest people on a Honda.”)
8. Customers have discovered a product (but not the one we offered). Joint Juice, a roughly $2 million company that produces an easy-to-digest glucosamine liquid, was founded by Kevin Stone, a prominent San Francisco orthopedic surgeon. He learned about the nutrient from some of his patients, who took it for joint pain instead of the ibuprofen he had prescribed. Many doctors might have ignored this or even scolded their patients for falling prey to fads, but Stone recognized he might be missing something. He looked up the clinical research on glucosamine in Europe, where it was the leading nutritional supplement. (Veterinarians, he discovered, swore by it, and their patients fell for neither fads nor placebos.) Then he built a business around it.
9. This product or service is thriving elsewhere (but no one offers it here). In the early 1990s, a Swedish business student named Carl August Svensen-Ameln tried to store some of his belongings in Sweden while at school in Seattle, but found that all the local self-storage facilities were full. He studied the storage industry, already prevalent in the United States, and discovered a business model characterized by high rents, low turnover, and negligible operating costs. Yet self-storage, at the time, was virtually nonexistent in continental Europe. Svensen-Ameln and a friend from business school set up a partnership with an established U.S. company, Shurgard Storage Centers Inc. The resulting company, European Mini-Storage S.A., was the first of several such companies that Svensen-Ameln started in Europe, to great success.
10. That new product or service shouldn’t make much money (but it does). Established competitors are often surprised when upstart rivals do well. In his 2008 book, The Partnership: The Making of Goldman Sachs (Penguin Press), Charles D. Ellis noted that for decades, Goldman Sachs partners had avoided investment management, which they believed generated lower fees than trading and investment banking. When Donaldson, Lufkin & Jenrette Inc. published its financial performance as part of a 1970 stock offering, Goldman partners were startled to learn that fees and brokerage commissions on frequent trades added up to a highly profitable business. Shortly thereafter, Goldman expanded into managing corporate pension funds, and aggressively built its business.
Incongruities like these can offer a critical clue about where your assumptions no longer match reality. From there, you are more likely to uncover the kinds of opportunities that you might otherwise have missed — and that your competitors still don’t recognize. Start by asking yourself: What are the most unexpected things happening in our business right now? Which competitors are doing better than expected? Which customers are behaving in ways we hadn’t anticipated? Then take yourself through the list of top 10 clues. Leaders who consistently notice and explore anomalies increase the odds of spotting emerging opportunities before their rivals.
Reprint No. 11304
- Donald Sull is a professor of strategic and international management at the London Business School, where he is also the faculty director for executive education. His books include The Upside of Turbulence: Seizing Opportunity in an Uncertain World (Harper Business, 2009).
2:27 AM BST 06 Oct 2011
Steve Jobs, who died on October 5 aged 56, was the visionary co-founder, and later chief executive, of Apple, makers of the Macintosh computer, the iMac, the iPod, iPad, and iPhone, and the man behind the astonishing success of the computer animation firm Pixar, makers of Toy Story and Finding Nemo; in consequence he did more to determine what films we watch, how we listen to music, and how we work and play than any other person on the planet.
Jobs never designed a computer in his life, but it was because of him that Apple products, even when they do largely what other products do, are perceived to be different and infinitely more cool. The Macintosh brought the computer mouse to the mass market; the iPod became famous for its click wheel, and the iPhone for its "user interface" – a sophisticated touchscreen that responds to the flick of a finger.
Jobs emphasised the difference between Macs and the PCs that ran Microsoft software, managing to preserve Apple's image as a plucky, creative insurgent against the bland Microsoft behemoth even as Apple itself became the biggest company on the planet. "I wish Bill Gates well," he once claimed. "I only wish that at some time in his life he had dropped acid or spent time at an ashram."
It was a marketing trick that Jobs worked on consumers too, convincing them that purchasing Apple products somehow conferred membership of an exclusive and visionary club, even when it was transparently obvious that the company's devices were utterly ubiquitous. This corporate reputation for seer-like trailblazing rested squarely on Jobs. "I skate to where the puck is going to be," he explained, using an ice hockey metaphor, "not where it has been."
This inspired almost evangelical devotion among techno-geeks. Jobs was not just the brains behind Apple, he was high-priest of the "Mac" religion. His eagerly anticipated "MacWorld" shows were adulatory affairs akin to revivalist rallies, with Jobs, in black turtleneck, jeans and trainers, preaching the message that salvation lay in Apple's latest gadget.
The Jobs story – humble birth, rise and fall, miraculous comeback – was even likened by Apple fanatics to the life of Christ. For the less blasphemously-inclined it proved that the American Dream is alive and well.
He was born on February 24 1955 to a Syrian Arab father and an American mother, who had travelled to San Francisco to put him up for adoption. Soon afterwards a blue-collar California couple, Paul and Clara Jobs, adopted him and named him Steven Paul.
After completing high school in Cupertino, northern California, Jobs went north to study at Reed College in Portland, Oregon, but dropped out after a term. Returning to California, he took a job at Atari, the video games manufacturer, in order to save money for a "spiritual quest" to India. There he was converted to Zen Buddhism and vegetarianism and dabbled in hallucinogenic drugs.
On his return to America Jobs resumed his work at Atari and was given the task of creating a more compact circuit board for the game Breakout. Having little interest in the intricacies of circuit board design, he persuaded his friend Steve Wozniak to do the job for him, offering to split any bonus fifty-fifty. A delighted Atari paid Jobs $5,000, but Wozniak received only $300, under the impression that the total payout had been $600.
In 1976 Wozniak showed Jobs a computer he had designed for his own use. Jobs was impressed and suggested marketing it. They had no capital, but Jobs had a brilliant idea. By persuading a local store to order 50 of the computers, then asking an electrical store for 30 days' credit on the parts to build them, they set up business without a single investor. They called it Apple Computer (which would lead to protracted legal battles with the company behind the Beatles' record label, Apple Corps) and launched their first product, the Apple 1. A year later the more sophisticated Apple 2 hit the jackpot, and by 1980, when the company went public, the pair were multimillionaires.
The success of Apple launched Jobs into the celebrity circuit. He dated Joan Baez and became a personal friend of California Governor Jerry Brown. But his ruthless streak became apparent aged 23 when his then girlfriend gave birth to his daughter. For two years, though already wealthy, he denied paternity while the baby's mother went on welfare. At one point he even swore an affidavit to the effect that he was "sterile and infertile", so could not be the father.
The strain of running a successful company soon began to tell. Employees complained of Jobs's "Management By Walking Around Frightening Everyone" technique, and even he realised that more seasoned business experience was required. In 1983 he lured John Sculley, president of PepsiCo, to serve as Apple's chief executive, asking: "Do you want to spend the rest of your life selling sugared water to children, or do you want a chance to change the world?" The following year the company launched the Macintosh, the first commercially successful small computer with a mouse-driven "graphical user interface".
But the clash of business cultures proved irreconcilable, and in 1985 Jobs (whom Sculley likened to Leon Trotsky) was forced out by his own board. It was 12 years before he returned.
During those years Jobs started Next Computing and bought what became Pixar from George Lucas, the director of Star Wars. Next was a techie's dream – Tim Berners-Lee wrote the software for the web on a Next computer – but a business failure. Pixar struggled for years until 1995, when it contracted with Disney to produce a number of computer-animated feature films. The first of these, Toy Story, broke box-office records and Pixar's flotation in 1996 made Jobs a billionaire. Over the next 10 years the studio went on to produce a string of hits including A Bug's Life (1998), Toy Story 2 (1999), Monsters, Inc. (2001), Finding Nemo (2003) and The Incredibles (2004).
In 2006 Disney bought the company in a $7.4 billion deal under which Jobs became Disney's largest single shareholder with approximately 7 per cent of the company's stock.
Jobs's triumph at Pixar reminded people of his ability to divine the technological future, and in 1997 he persuaded Apple to buy Next – to acquire its forward-looking operating system Nextstep, and, more importantly, Jobs himself.
In his memoirs, published in 1987, John Sculley dismissed Jobs's vision for Apple to become a high tech consumer products company as a "lunatic plan".
But Apple had rejected this vision, and by 1997 it had become a basket case, losing $736 million in one quarter. Management by committee had blunted its innovative flair, and the corporate atmosphere was more that of a student bar than a thrusting business in a highly competitive market.
Jobs's instinctive feel for the consumer zeitgeist soon turned things around. Within a year the company was once more posting handsome profits.
The iMac computer was launched in 1998, followed in 2001 by the iPod, a digital music player of strikingly minimalist design. Then came iTunes digital music software and the online iTunes Music Store. In 2007, Apple entered the cellular phone business with the iPhone, a clever and expensive product combining cell phone, iPod, and internet device in one streamlined casing. It was followed by the iPad, a tablet device without a physical keyboard. Some wondered whether there was really demand for the iPad in a crowded marketplace being buffeted by the severest economic downturn in decades; once again Jobs showed his ability to confound the sceptics, and it became a bestseller too.
But Jobs was not a universally popular figure. He oozed arrogance, was vicious about business rivals, and in contrast to, say, Bill Gates, refused to have any truck with notions of corporate responsibility. He habitually parked his Mercedes in the disabled parking slot at Apple headquarters and one of his first acts on returning to the company in 1997 was to terminate all of its corporate philanthropy programmes.
Jobs's management style owed less to Zen Buddhism than to George Orwell. No aspect of corporate life was immune from his authority and he was almost pathologically controlling when it came to dealing with the press.
Journalists found that he would try to stifle even anodyne stories if they had not received his blessing. One described getting an interview with Jobs as about as easy as getting an interview with Saddam Hussein, "except Saddam would probably be more helpful and certainly more polite".
He ruled Apple with a combination of foul-mouthed tantrums and charm, withering scorn and carefully judged flattery. People were either geniuses or "bozos", and those in his regular orbit found that they could flip with no warning from one category to the other, in what became known as the "hero-shithead roller coaster". Employees worried about getting trapped with Jobs in a lift, afraid that they might not have a job when the doors opened.
One senior executive admitted that before heading into a meeting with Jobs, she embraced the mindset of a bullfighter entering the ring: "I pretend I'm already dead."
Yet members of Jobs's inner circle, many of whom came with him from Next, found working with him an exhilarating experience. To keep them on board, Jobs eliminated most cash bonuses from executive compensation and started handing out stock options instead. But here as elsewhere Jobs played by his own rules.
In 2001 he was granted stock options on 7.5 million Apple shares, allegedly without the required authorisation from the company's board of directors. Furthermore, the options carried an exercise price of $18.30 when the price allegedly should have been $21.10, thereby incurring a taxable charge of $20 million that Jobs did not report as income.
In 2006 an internal company inquiry found that this grant was "improperly recorded" as having been made at a special board meeting that never took place, but largely exonerated Jobs over the matter, saying that the options had been returned without being exercised and that he was "unaware of the accounting implications". In 2007 the US Securities and Exchange Commission announced that it would not file charges against Apple, but had filed charges against two former executives for their alleged roles in backdating Apple options.
The inquiry did not stall Apple's extraordinary ascent. By 2006 the company had a market value of $108 billion – more than Goldman Sachs. By August 2011, after it reported yet another quarter of record breaking profits, it had become the biggest company in the world, with a market value of $337 billion.
Within days of reaching that corporate milestone, however, Jobs announced his resignation on health grounds. Few were surprised. In 2004 he disclosed that he had been diagnosed with a rare form of pancreatic cancer that had been "cured" by surgery. Questions about his health resurfaced in December 2008 when it was announced that, for the first time in 12 years, he was pulling out of delivering his annual address at Macworld.
Feverish speculation over his well-being was only fuelled by Apple's fanatical devotion to secrecy. When Jobs went on medical leave in January 2009, the company would not say why. Inevitably, suspicions arose that the cancer had returned. In fact in June that year the Wall Street Journal revealed that he had undergone a liver transplant. Even when the news broke Apple remained tight-lipped: "Steve continues to look forward to returning at the end of June, and there's nothing further to say," noted a spokeswoman tersely.
His devoted fans began to scrutinise every public appearance for clues to Jobs's physical fitness. On the stock markets, which considered his presence vital to Apple's own health, the company's shares fell if he looked particularly gaunt. In the first half of 2011 he was seen only a handful of times. Then, on August 24, he announced he was stepping down, to be replaced as CEO by Tim Cook, who had run the company during Jobs's previous absences.
Apple's shares immediately dropped 5 per cent.
Steve Jobs married Laurene Powell in a Buddhist ceremony in 1991. They had three children who survive him along with the daughter by his early girlfriend, whose paternity he eventually acknowledged.
By Marc Kaufman, Published: October 20
An artist's rendering of a distant cold water vapor zone. Scientists believe the water for Earth’s oceans came from a similar region present in our solar system when the planets were being formed.
Water is everywhere on Earth, but nobody has ever been able to determine conclusively how it got here. Scientists know that the early Earth was far too hot to hold water or water vapor, but then, in relatively short geological time, the oceans appeared.
In a discovery that researchers say sheds important new light on that age-old question, a European team reported Thursday that it has found a very cold reservoir of water vapor in space that could explain where the water came from.
The region they discovered is at the outer reaches of a dusty disk surrounding a star 175 light-years away. The star and disk are in the early stages of forming planets, much as Earth was formed some 4.5 billion years ago.
The scientists’ conclusion from the new finding: Life-giving H2O was almost certainly delivered to Earth via comets and asteroids known to originate in these cold but water-filled zones, which were assumed to also be present when our solar system was forming.
“Our observations of this cold vapor indicate that enough water exists in the disk to fill thousands of Earth oceans,” said astronomer Michiel Hogerheijde of Leiden Observatory in the Netherlands.
Hogerheijde is the lead author of a paper describing these findings in the Oct. 21 issue of the journal Science.
“Scientists have long suspected there were these reservoirs of cold water vapor hiding in the outer regions of planet-forming disks, but until now we’ve only found signs of water vapor in hot regions closer to the suns,” Hogerheijde said in an interview. “Since the comets and cold asteroids are formed in the outer reaches, this was a problem for the theory that comets delivered the water to Earth. But now we have the cold reservoir in the region where comets are formed, and so the theory gets considerably stronger.”
The logical extension, he said, is that water has also been delivered to some of the billions of exoplanets known to exist beyond our solar system, meaning there are likely to be many “ocean worlds” throughout the galaxies.
Hogerheijde said the 10 million-year-old star his team examined, TW Hydrae, is the closest planet-forming star yet identified.
Signs of the cold water vapor were detected by the Herschel Space Observatory, a European Space Agency satellite that looks for infrared light in the galaxy using the Heterodyne Instrument for the Far-Infrared, or HIFI. Efforts to find the cold water vapor in the past all failed because the instruments were not powerful enough to pick up the faint spectroscopic signals.
NASA also partially funds Herschel, and American researchers were part of the team.
“It is a testament to the instrument-builders that such weak signals can be detected,” said Paul Goldsmith, NASA project scientist for the Herschel Space Observatory at the agency’s Jet Propulsion Laboratory in Pasadena, Calif. “These are the most sensitive HIFI observations to date.”
According to Hogerheijde, the cold water vapor detected is a small portion of the “ice reservoir” existing in the region. The ice crystals, which cover the widespread dust particles, form in conditions reaching 400 degrees below zero. But ultraviolet light from the star warms the ice enough to briefly release the vapor that was detected, Hogerheijde said.
The announcement comes weeks after a related finding that the water in some comets has the same chemical composition as Earth’s oceans. Previous detections had found water with a different isotopic makeup, suggesting that no more than 10 percent of Earth’s water was delivered by comets.
But data from the comet Hartley 2 found the ratio of heavy hydrogen (deuterium) to ordinary hydrogen to be almost exactly what it is in Earth’s oceans.
“Now, in principle, all the water [in Earth’s oceans] could have come from comets,” said principal investigator Paul Hartogh of Germany’s Max Planck Institute.
Planetary scientists have determined that huge amounts of graphite and silicon dust surround stars as they form. That material over time binds together to form larger bodies such as comets and asteroids and — around many stars — ultimately planets.
Earth formed about 4.5 billion years ago, they have concluded, and was then too close to the sun to hold much water or water vapor. But around 4.1 billion years ago began a period of “heavy bombardment,” when Earth was pummeled by comets and cold meteorites — both of which carry water — from the outer reaches of its disk.
The bombardment ended about 3.8 billion years ago, and at that point much of the Earth’s water was in place. The earliest forms of microbial life detected lived some 3.6 billion years ago, a relatively short geological period after the oceans had filled.
While there is growing evidence that comets and wet asteroids delivered our oceans, some researchers hold that the water came primarily from other sources. For instance, water is believed to have been released by early volcanoes that belched up molten material, including H2O, from deep within the planet. Water could also have leaked out of certain minerals in rocks as the planet cooled.
Hogerheijde said that while this early “outgassing” most likely played a role in making early Earth wet, the evidence is now persuasive that much of the water for our oceans was later delivered from afar.
By David Zinczenko with Matt Goulding
Sep 27, 2011
Eat This, Not That
Why do some people seem naturally thin—able to torch cheeseburgers instantly and never gain a pound? And why do some of us—okay, most of us—sweat and diet and sweat and diet some more, and never lose enough to get the body we want?
Because those “naturally thin” people actually live by a series of laws that keep them from ever gaining weight. And if you know their secrets, you can indulge and enjoy and never gain another pound as long as you live.
As the editor-in-chief of Men’s Health, I’ve spent the past two decades interviewing leading experts, poring over groundbreaking studies, and grilling top athletes, trainers, and celebrities for their health and fitness advice. And I’ve learned that what separates the fit from the fat, the slim from the sloppy, the toned from the torpid, is a set of rules. And what’s amazing is that none of them involves spending hours on a treadmill, eating nothing but grapefruit and tree bark, or having part of the small intestine replaced with fiberfill. Follow these simple rules and weight loss will be automatic.
LAW #1: Lean People Don’t Diet
What? Of course lean people diet! They’re just magically better at denying themselves than the rest of us are, right?
No. In reality, studies show that the number one predictor of future weight gain is being on a diet right now. Part of the reason is that restricting calories reduces strength, bone density, and muscle mass—and muscle is your body’s number-one calorie burner. So by dieting, you’re actually setting yourself up to gain more weight than ever. And a recent study in the journal Psychosomatic Medicine showed that tracking your diet in a food journal can actually boost your stress levels, which in turn increases your level of a hormone called cortisol, and cortisol is linked to—you guessed it—weight gain.
FAT-FIGHTING FIBER: Get 25 grams of fiber a day—the amount in about 3 servings of fruits and vegetables—and you can boost fat burn by up to 30 percent.
LAW #2: Lean People Don’t Go Fat-Free
A European study tracked nearly 90,000 people for several years and discovered that participants who tried to eat “low fat” had the same risk of being overweight as those who ate whatever they wanted.
Fat doesn’t make you fat, period. Indeed, you need fat in your diet to help you process certain nutrients, like vitamins A, D, and E, for example. And many “fat-free” foods are loaded with sugar, and therefore have even more calories than their full-fat cousins. Even the American Heart Association says that fat-free labels lead to higher consumption of unhealthy sweets. Fat keeps you full and satisfied. Fat-free will send you running back to the fridge in an hour, hungry for more.
LAW #3: Lean People Sit Down to Eat
In fact, the more you sit down and enjoy your food, the leaner you’re going to be. Punishing yourself only makes you fat!
Greek researchers recently reported that eating more slowly and savoring your meal can boost levels of two hormones that make you feel fuller. And researchers at Cornell University found that when people sat down at the table with already full plates of food, they consumed up to 35 percent less than they did when eating family-style—that is, by passing serving dishes around the table.
FIX IT WITH FOOD! Check out our list of the 40 Foods with Superpowers—foods that, even in moderation, can strengthen your heart, fortify your bones, and boost your metabolism so you can lose weight more quickly.
LAW #4: Lean People Know What They’re Going to Eat Next
Planning your responses to hunger may help you shed pounds faster, say Dutch researchers. They posed their subjects questions like “If you’re hungry at 4 p.m., then . . . what?” Those who had an answer (“I’ll snack on some almonds”) were more successful at losing weight than those who didn’t have an answer.
One of the best things about the brand-new Eat This, Not That! 2012 is that it helps you find fat-fighting food no matter where you are: movie theater, coffee shop, vending machine. It also includes this list of foods that should never see the inside of your belly: The NEW 20 Worst Foods in America.
LAW #5: Lean People Eat Protein
In a recent European study, people who ate moderately high levels of protein were twice as likely to lose weight and keep it off as those who didn’t eat much protein.
A New England Journal of Medicine study looked at a variety of eating plans and discovered that eating a diet high in protein and low in refined starches (like white bread) was the most effective for weight loss. Protein works on two levels: First, you burn more calories to digest it. Second, because your body has to work harder to digest a Big Mac than, say, a Ho Ho, you stay fuller longer.
STEALTH HEALTH FOODS: Power up your diet by expanding your menu. Here are The 7 Healthiest Foods You're Not Eating.
LAW #6: Lean People Move Around
I don’t mean climbing Kilimanjaro, breaking the tape at the Boston Marathon, or spending 24 hours at 24 Hour Fitness. I mean going for a short bike ride (20 minutes burns 200 calories), taking a leisurely walk (145 calories every 51 minutes), wrestling with your kids (another 100 calories smoked in 22 minutes), or fishing (there’s 150 calories gone in an hour—even more if you actually catch something).
Simply put, fit people stay fit by having fun. Scientists have a name for how you burn calories just enjoying yourself. It’s called NEAT: non-exercise activity thermogenesis. Sounds complicated, like something only policy wonks at a global warming summit are qualified to discuss. But it’s pretty simple: Pick a few activities that you enjoy, from tossing a stick for your dog to bowling with your best friend, and just do them more often. The average person makes 200 decisions every day that affect his or her weight. If you choose the fun option more often than not, you’ll see results.
LAW #7: Lean People Watch Less TV
Instead of calling it the boob tube, maybe we should call it the man-boob tube. About 18 percent of people who watch less than two hours of TV a day have a body mass index (BMI) of 30 or more—the Centers for Disease Control and Prevention’s cutoff line for obesity. But among those who watch more than four hours a day, nearly 30 percent have a BMI that high, according to a study in the Journal of the American College of Cardiology.
Look, I like TV. But all things in moderation: In a study at the University of Vermont, overweight participants who cut their daily TV time in half (from an average of 5 hours to 2.5 hours) burned an extra 119 calories a day. And a recent study of people who successfully lost weight found that 63 percent of them watched less than 10 hours of TV a week. Want more? A study in the journal Annals of Behavioral Medicine reported that lean people have an average of 2.6 television sets in their homes. Overweight people have an average of 3.4. Finally, researchers in Australia recently discovered that every hour in front of the television trims 22 minutes from your life. Yikes!
Breaking any of these seven laws occasionally is fine. Just don't make a habit of it. Likewise, make sure you haven't fallen into any of these 20 Habits That Make You Fat.
The world’s population will reach 7 billion at the end of October. Don’t panic
Oct 22nd 2011 | from the print edition
IN 1950 the whole population of the earth—2.5 billion—could have squeezed, shoulder to shoulder, onto the Isle of Wight, a 381-square-kilometre rock off southern England. By 1968 John Brunner, a British novelist, observed that the earth’s people—by then 3.5 billion—would have required the Isle of Man, 572 square kilometres in the Irish Sea, for their standing room. Brunner forecast that by 2010 the world’s population would have reached 7 billion, and would need a bigger island. Hence the title of his 1968 novel about over-population, “Stand on Zanzibar” (1,554 square kilometres off east Africa).
Brunner’s prediction was only a year out. The United Nations’ population division now says the world will reach 7 billion on October 31st 2011 (America’s Census Bureau puts the date later, in March 2012). The UN will even identify someone born that day as the world’s 7 billionth living person. The 6 billionth, Adnan Nevic, was born on October 12th 1999 in Sarajevo, in Bosnia. He will be just past his 12th birthday when the next billion clicks over.
That makes the world’s population look as if it is rising as fast as ever. It took 250,000 years to reach 1 billion, around 1800; over a century more to reach 2 billion (in 1927); and 32 years more to reach 3 billion. But to rise from 5 billion (in 1987) to 6 billion took only 12 years; and now, another 12 years later, it is at 7 billion (see chart 1). By 2050, the UN thinks, there will be 9.3 billion people, requiring an island the size of Tenerife or Maui to stand on.
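The island comparisons above reduce to simple arithmetic. As a back-of-the-envelope sketch (assuming each standing person takes the same footprint that packs 2.5 billion onto the Isle of Wight, roughly 0.15 square metres; the constant-footprint assumption is mine):

```python
# Back-of-the-envelope check of the "standing room" island comparisons.
ISLE_OF_WIGHT_KM2 = 381
M2_PER_PERSON = ISLE_OF_WIGHT_KM2 * 1e6 / 2.5e9  # ~0.15 m^2 each

def island_needed_km2(population):
    """Area in km^2 needed for `population` people standing shoulder to shoulder."""
    return population * M2_PER_PERSON / 1e6

print(round(island_needed_km2(3.5e9)))  # ~533 km^2: Isle of Man (572) suffices
print(round(island_needed_km2(7.0e9)))  # ~1067 km^2: Zanzibar (1,554) suffices
print(round(island_needed_km2(9.3e9)))  # ~1417 km^2: a Maui-sized island
```

The numbers bear the article out: Brunner's islands each leave some spare room for the population they are matched with.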
Odd though it seems, however, the growth in the world’s population is actually slowing. The peak of population growth was in the late 1960s, when the total was rising by almost 2% a year. Now the rate is half that. The last time it was so low was in 1950, when the death rate was much higher. The result is that the next billion people, according to the UN, will take 14 years to arrive, the first time that a billion milestone has taken longer to reach than the one before. The billion after that will take 18 years.
Once upon a time, the passing of population milestones might have been cause for celebration. Now it gives rise to jeremiads. As Hillary Clinton’s science adviser, Nina Fedoroff, told the BBC in 2009, “There are probably already too many people on the planet.” But the notion of “too many” is more flexible than it seems. The earth could certainly not support 10 billion hunter-gatherers, who used much more land per head than modern farm-fed people do. But it does not have to. The earth might well not be able to support 10 billion people if they had exactly the same impact per person as 7 billion do today. But that does not necessarily spell Malthusian doom, because the impact humans have on the earth and on each other can change.
For most people, the big questions about population are: can the world feed 9 billion mouths by 2050? Are so many people ruining the environment? And will those billions, living cheek-by-jowl, go to war more often? On all three counts, surprising as it seems, reducing population growth any more quickly than it is falling anyway may not make much difference.
Start with the link between population and violence. It seems plausible that the more young men there are, the more likely they will be to fight. This is especially true when groups are competing for scarce resources. Some argue that the genocidal conflict in Darfur, western Sudan, was caused partly by high population growth, which led to unsustainable farming and conflicts over land and water. Land pressure also influenced the Rwandan genocide of 1994, as migrants in search of a livelihood in one of the world’s most densely populated countries moved into already settled areas, with catastrophic results.
But there is a difference between local conflicts and what is happening on a global scale. Although the number of sovereign states has increased almost as dramatically as the world’s population over the past half-century, the number of wars between states fell fairly continuously during the period. The number of civil wars rose, then fell. The number of deaths in battle fell by roughly three-quarters. These patterns do not seem to be influenced either by the relentless upward pressure of population, or by the slackening of that pressure as growth decelerates. The difference seems to have been caused by fewer post-colonial wars, the ending of cold-war alliances (and proxy wars) and, possibly, the increase in international peacekeepers.
More people, more damage?
Human activity has caused profound changes to the climate, biodiversity, oceanic acidity and greenhouse-gas levels in the atmosphere. But it does not automatically follow that the more people there are, the worse the damage. In 2007 Americans and Australians emitted almost 20 tonnes of carbon dioxide each. In contrast, more than 60 countries—including the vast majority of African ones—emitted less than 1 tonne per person.
This implies that population growth in poorer countries (where it is concentrated) has had a smaller impact on the climate in recent years than the rise in the population of the United States (up by over 50% in 1970-2010). Most of the world’s population growth in the next 20 years will occur in countries that make the smallest contribution to greenhouse gases. Global pollution will be more affected by the pattern of economic growth—and especially whether emerging nations become as energy-intensive as America, Australia and China.
Population growth does make a bigger difference to food. All things being equal, it is harder to feed 7 billion people than 6 billion. According to the World Bank, between 2005 and 2055 agricultural productivity will have to increase by two-thirds to keep pace with rising population and changing diets. Moreover, according to the bank, if the population stayed at 2005 levels, farm productivity would have to rise by only a quarter, so more future demand comes from a growing population than from consumption per person.
Increasing farm productivity by a quarter would obviously be easier than boosting it by two-thirds. But even a rise of two-thirds is not as much as it sounds. From 1970-2010 farm productivity rose far more than this, by over three-and-a-half times. The big problem for agriculture is not the number of people, but signs that farm productivity may be levelling out. The growth in agricultural yields seems to be slowing down. There is little new farmland available. Water shortages are chronic and fertilisers are over-used. All these—plus the yield-reductions that may come from climate change, and wastefulness in getting food to markets—mean that the big problems are to do with supply, not demand.
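Why a two-thirds rise "is not as much as it sounds" becomes clearer when the cumulative multiples are annualised. A quick sketch (the article gives only the cumulative figures; the constant-rate compounding assumption is mine):

```python
# Annualised growth rates behind the farm-productivity comparison.
def annual_rate(multiple, years):
    """Constant yearly growth rate that compounds to `multiple` over `years` years."""
    return multiple ** (1 / years) - 1

# Required: up two-thirds between 2005 and 2055 (50 years).
print(round(annual_rate(5 / 3, 50) * 100, 2))  # ~1.03 % a year
# Achieved: more than 3.5x between 1970 and 2010 (40 years).
print(round(annual_rate(3.5, 40) * 100, 2))    # ~3.18 % a year
```

On this rough reckoning, meeting 2055 demand requires roughly a third of the yearly productivity growth that agriculture actually managed over the previous four decades.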
None of this means that population does not matter. But the main impact comes from relative changes—the growth of one part of the population compared with another, for example, or shifts in the average age of the population—rather than the absolute number of people. Of these relative changes, falling fertility is most important. The fertility rate is the average number of children a woman can expect to have over her lifetime. At the moment, almost half the world’s population—3.2 billion—lives in countries with a fertility rate of 2.1 or less. That number, the so-called replacement rate, is usually taken to be the level at which the population eventually stops growing.
The world’s decline in fertility has been staggering (see chart 2). In 1970 the total fertility rate was 4.45 and the typical family in the world had four or five children. It is now 2.45 worldwide, and lower in some surprising places. Bangladesh’s rate is 2.16, having halved in 20 years. Iran’s fertility fell from 7 in 1984 to just 1.9 in 2006. Countries with below-replacement fertility include supposedly teeming Brazil, Tunisia and Thailand. Much of Europe and East Asia have fertility rates far below replacement levels.
The fertility fall is releasing wave upon wave of demographic change. It is the main influence behind the decline of population growth and, perhaps even more important, is shifting the balance of age groups within a population.
When gold turns to silver
A fall in fertility sends a sort of generational bulge surging through a society. The generation in question is the one before the fertility fall really begins to bite, which in Europe and America was the baby-boom generation that is just retiring, and in China and East Asia the generation now reaching adulthood. To begin with, the favoured generation is in its childhood; countries have lots of children and fewer surviving grandparents (who were born at a time when life expectancy was lower). That was the situation in Europe in the 1950s and in East Asia in the 1970s.
But as the select generation enters the labour force, a country starts to benefit from a so-called “demographic dividend”. This happens when there are relatively few children (because of the fall in fertility), relatively few older people (because of higher mortality previously), and lots of economically active adults, including, often, many women, who enter the labour force in large numbers for the first time. It is a period of smaller families, rising income, rising life expectancy and big social change, including divorce, postponed marriage and single-person households. This was the situation in Europe between 1945 and 1975 (“les trente glorieuses”) and in much of East Asia in 1980-2010.
But there is a third stage. At some point, the gilded generation turns silver and retires. Now the dividend becomes a liability. There are disproportionately more old people depending upon a smaller generation behind them. Population growth stops or goes into reverse, parts of a country are abandoned by the young and the social concerns of the aged grow in significance. This situation already exists in Japan. It is arriving fast in Europe and America, and soon after that will reach East Asia.
A demographic dividend tends to boost economic growth because a large number of working-age adults increases the labour force, keeps wages relatively low, boosts savings and increases demand for goods and services. Part of China’s phenomenal growth has come from its unprecedentedly low dependency ratio—just 38 (this is the number of dependents, children and people over 65, per 100 working adults; it implies the working-age group is more than two-and-a-half times as large as the rest of the population put together). One study by Australia’s central bank calculated that a third of East Asia’s GDP growth in 1965-90 came from its favourable demography. About a third of America’s GDP growth in 2000-10 also came from its increasing population.
The world as a whole reaped a demographic dividend in the 40 years to 2010. In 1970 there were 75 dependents for every 100 adults of working age. In 2010 the number of dependents dropped to just 52. Huge improvements were registered not only in China but also in South-East Asia and north Africa, where dependency ratios fell by 40 points. Even “ageing” Europe and America ended the period with fewer dependents than at the beginning.
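The dependency-ratio arithmetic in the last two paragraphs can be checked directly. A minimal sketch (the function names are mine, not the article's):

```python
# Dependency-ratio arithmetic: dependents per 100 working-age adults.
def dependency_ratio(dependents, working_age):
    """Number of dependents (children plus over-65s) per 100 working-age adults."""
    return 100 * dependents / working_age

def working_share(ratio):
    """Fraction of the total population that is working-age, given the ratio."""
    return 100 / (100 + ratio)

# China's ratio of 38 makes the working-age group about 72% of the population,
# i.e. roughly 2.6 working-age adults for every dependent.
print(round(working_share(38) * 100))  # ~72 (%)
print(round(100 / 38, 1))              # ~2.6 workers per dependent

# The world's dividend: 75 dependents per 100 workers in 1970, 52 in 2010.
print(round(working_share(75) * 100))  # ~57 (%)
print(round(working_share(52) * 100))  # ~66 (%)
```

The fall from 75 to 52 dependents per 100 workers thus lifted the world's working-age share from about 57% to about 66% of the total population.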
A demographic dividend does not automatically generate growth. It depends on whether the country can put its growing labour force to productive use. In the 1980s Latin America and East Asia had similar demographic patterns. But while East Asia experienced a long boom, Latin America endured its “lost decade”. One of the biggest questions for Arab countries, which are beginning to reap their own demographic dividends, is whether they will follow East Asia or Latin America.
But even if demography guarantees nothing, it can make growth harder or easier. National demographic inheritances therefore matter. And they differ a lot.
Where China loses
Hania Zlotnik, the head of the UN’s Population Division, divides the world into three categories, according to levels of fertility (see map). About a fifth of the world lives in countries with high fertility—3 or more. Most are Africans. Sub-Saharan Africa, for example, is one of the fastest-growing parts of the world. In 1975 it had half the population of Europe. It overtook Europe in 2004, and by 2050 there will be just under 2 billion people there compared with 720m Europeans. About half of the 2.3 billion increase in the world’s population over the next 40 years will be in Africa.
The rest of the world is more or less equally divided between countries with below-replacement fertility (less than 2.1) and those with intermediate fertility (between 2.1 and 3). The first group consists of Europe, China and the rest of East Asia. The second comprises South and South-East Asia, the Middle East and the Americas (including the United States).
The low-fertility countries face the biggest demographic problems. The elderly share of Japan’s population is already the highest in the world. By 2050 the country will have almost as many dependents as working-age adults, and half the population will be over 52. This will make Japan the oldest society the world has ever known. Europe faces similar trends, less acutely. It has roughly half as many dependent children and retired people as working-age adults now. By 2050 it will have three dependents for every four adults, so will shoulder a large burden of ageing, which even sustained increases in fertility would fail to reverse for decades. This has disturbing policy implications for the provision of pensions and health care, which rely on continuing healthy tax revenues from the working population.
At least these countries are rich enough to make such provision. Not so China. With its fertility artificially suppressed by the one-child policy, it is ageing at an unprecedented rate. In 1980 China’s median age (the point where half the population is older and half younger) was 22 years, a developing-country figure. China will be older than America as early as 2020 and older than Europe by 2030. This will bring an abrupt end to its cheap-labour manufacturing. Its dependency ratio will rise from 38 to 64 by 2050, the sharpest rise in the world. Add in the country’s sexual imbalances—after a decade of sex-selective abortions, China will have 96.5m men in their 20s in 2025 but only 80.3m young women—and demography may become the gravest problem the Communist Party has to face.
Many countries with intermediate fertility—South-East Asia, Latin America, the United States—are better off. Their dependency ratios are not deteriorating so fast and their societies are ageing more slowly. America’s demographic profile is slowly tugging it away from Europe. Though its fertility rate may have fallen recently, it is still slightly higher than Europe’s. In 2010 the two sides of the Atlantic had similar dependency rates. By 2050 America’s could be nearly ten points lower.
But the biggest potential beneficiaries are the two other areas with intermediate fertility—India and the Middle East—and the high-fertility continent of Africa. These places have long been regarded as demographic time-bombs, with youth bulges, poverty and low levels of education and health. But that is because they are moving only slowly out of the early stage of high fertility into the one in which lower fertility begins to make an impact.
At the moment, Africa has larger families and more dependent children than India or Arab countries and is a few years younger (its median age is 20 compared with their 25). But all three areas will see their dependency ratios fall in the next 40 years, the only parts of the world to do so. And they will keep their median ages low—below 38 in 2050. If they can make their public institutions less corrupt, keep their economic policies outward-looking and invest more in education, as East Asia did, then Africa, the Middle East and India could become the fastest-growing parts of the world economy within a decade or so.
Here’s looking at you
Demography, though, is not only about economics. Most emerging countries have benefited from the sort of dividend that changed Europe and America in the 1960s. They are catching up with the West in terms of income, family size and middle-class formation. Most say they want to keep their cultures unsullied by the social trends—divorce, illegitimacy and so on—that also affected the West. But the growing number of never-married women in urban Asia suggests that this will be hard.
If you look at the overall size of the world’s population, then, the picture is one of falling fertility, decelerating growth and a gradual return to the flat population growth of the 18th century. But below the surface societies are being churned up in ways not seen in the much more static pre-industrial world. The earth’s population may never need a larger island than Maui to stand on. But the way it arranges itself will go on shifting for centuries to come.
Sachin Parashar, TNN Oct 5, 2011, 12.39 AM IST
NEW DELHI: At a time when Islamabad-Kabul ties are turning increasingly bitter, India and Afghanistan signed a strategic pact that catapults their relationship to a higher level, powered by a strong convergence over regional security and the shared threat from terror sanctuaries in Pakistan.
A sharp focus on terror was evident as Afghan President Hamid Karzai met Prime Minister Manmohan Singh, with the two sides making no effort to mask their mutual interests despite US concern that a more upfront Indian involvement could send Pakistan's military and political establishment into a tizzy.
On a two-day visit to India, Karzai signed the key document with Singh after a meeting at which the two leaders discussed the recent spurt in violence in Afghanistan and Pakistan's role in fomenting terror in the region, and sought to take into account the proposed exit of international forces in 2014.
"The greatest need today is for the Afghan people to have peace and stability. India will stand by the people of Afghanistan as they prepare to assume the responsibility for their governance and security after the withdrawal of international forces in 2014," Singh said.
This is the first time Afghanistan has signed such a document with any nation. While the agreement is expected to fuel Islamabad's anxiety, India and Afghanistan are clearly looking at a hard-headed definition of their national interests with US keen on thinning down its presence.
It is learnt that Karzai conveyed to Singh that the strategic engagement between the two countries, which includes a big Indian effort to build Afghanistan's security capacities, will help prepare Kabul for withdrawal of international forces.
Although he did not name Pakistan, Karzai spoke of the dangers of using terror as an instrument of policy after his meeting with Singh. "Afghanistan recognises the dangers that this region faces through terrorism and radicalism that is being used as an instrument of policy against our citizens," he said while addressing the media.
The rise in attacks by groups supported by Pakistan, and its determination to ensure an outcome in Afghanistan that suits its interests, seem to have dimmed Karzai's expectations of any understanding with Islamabad.
The pact entails security cooperation between the two countries "to help enhance their respective and mutual efforts in the fight against international terrorism, organized crime, illegal trafficking in narcotics and money laundering".
Singh said he discussed terrorism at length with Karzai and described it as a threat to which no country was immune. Referring to the assassination of former Afghan president and chairman of the 'High Peace Council' Burhanuddin Rabbani, Singh said the incident served as "an occasion for all of us to strengthen our resolve to jointly confront the menace of terrorism that threatens to undermine the security and stability of our region".
Rabbani's killing is seen as an indication that Pakistan-supported groups have no intention to allow any sort of reconciliation with elements of the Taliban. On Tuesday, Afghan authorities in Kabul alleged that Rabbani's assassination was carried out at the behest of Pakistani agencies.
The strategic agreement also comprises joint initiatives on key international issues and support for UN reforms, including a permanent seat for India in the UN Security Council. It entails a strategic dialogue to provide a framework for cooperation in the area of national security. "The dialogue will be led by NSAs and involve regular consultations with the aim of intensifying mutual efforts towards strengthening regional peace and security," the agreement said.
As per the strategic pact, India also agreed to assist, as mutually determined, in the training, equipping and capacity building programmes for Afghan national security forces. Singh confirmed that India will participate in the forthcoming conferences in Istanbul and Bonn to contribute to international and regional initiatives to support "Afghanistan's efforts at nation building".
The strategic partnership will also deepen economic ties with Afghanistan, a country in which India has pledged $2 billion of investment in development projects. Apart from the strategic pact, the two countries also signed two MoUs for the development of hydrocarbons and mineral resources.
Describing India's cooperation with Kabul as an open book, Singh complimented Karzai for his "sagacious" leadership. "I reiterated to the president that India stands by the people of Afghanistan in their journey towards capacity building, reconstruction, development and peace. We will do all that is within our means to help Afghanistan," he said. During his visit to Afghanistan in May, Singh had declared support for the ongoing reconciliation process in the country.