Actor Joy Mukherjee dead
Yesteryear actor Joy Mukherjee passed away after a prolonged illness at Mumbai's Lilavati Hospital on Friday morning. He was 73.
A still of actor Joy Mukherjee from one of his films. Photo: Special Arrangement (The Hindu)
Mr. Mukherjee had starred in many Hindi movies of the 1960s such as Shagird, Love In Tokyo, Ziddi, Phir Wohi Dil Laya Hoon and Ek Musafir Ek Hasina.
“Joy Mukherjee was admitted to Lilavati hospital a week ago after he complained of breathlessness and fever,” Neelam Gupta and R.R. Pathak, spokespersons for Mr. Mukherjee's family, told The Hindu.
On life support
He was put on an artificial support system after his lungs failed to respond. “He was put on respirators since the time he was admitted in the hospital. He was undergoing treatment in the ICU,” Ms. Gupta said.
Mr. Mukherjee lost consciousness nearly two days after he was admitted to the hospital. “His condition deteriorated yesterday [Thursday]. We lost him this morning,” Ms. Gupta said.
He is survived by his wife Neelam and three children, including two sons, Boy and Troy.
Veteran actors Dilip Kumar and Saira Banu had visited the hospital on Thursday. Musician Bappi Lahiri too spent some time in the hospital with Mr. Mukherjee's wife, giving her moral support.
Actor Amitabh Bachchan paid tributes to Mr. Mukherjee on social networking site Twitter on Friday, “Joy Mukherjee passes away…leading man and a star during his time...condolences to members of his family and prayers...”
Mr. Mukherjee hailed from a family which was well-known in the Hindi film industry. His father was the co-founder of Filmalaya studio and was married to veteran actor Ashok Kumar's sister Sati Devi.
Mr. Mukherjee was actor Kajol's paternal uncle.
Sixties star Joy Mukherjee passes away
Avijit Ghosh 09 March 2012, 11:45 PM IST
Few stars could bring such manic energy and uninhibited enthusiasm to the silver screen as Joy Mukherjee, who passed away after a prolonged illness at Mumbai’s Lilavati hospital on Friday. He was 73.
Acting wasn’t exactly the 1960s star’s strength; he relished doing what he was comfortable with, which included singing and crawling on all fours at the same time and dancing with his back sweeping the floor. Such acts earned him many a critic's wrath but they also endeared him to a legion of fans.
Who can forget Joy tossing his hair, twisting his body and rattling his bones in the number, Duniya pagal hai yah phir main deewana (film: Shagird)? And who can forget him carrying the substantial Asha Parekh in his arms and dropping her to the ground while crooning, Le gayi dil gudiya Japan ki (film: Love in Tokyo)?
His most popular films were packed with foot-tapping numbers. Every song in Ek Musafir Ek Hasina, Phir Wohi Dil Laya Hoon, Love in Tokyo and Shagird was a rage. And they continue to be so, if YouTube hits are any indicators. These films were also his most impressive box-office winners.
Joy was the son of the illustrious Sashadhar Mukherjee, who co-founded the Filmalaya Studio. Several of his box-office hits came from his own production stable. He was introduced along with heroine Sadhana, as "sensational new stars", in Love in Simla (1960), a young-at-heart romance that hit the bull's eye. In his first scene, he jumped up and down in a train compartment while singing the popular Iqbal Quraishi composition, Dil tham chale, hum aaj kidhar, koi dekhe, koi dekhe. For most of his career, he was to repeat the same thing in different situations and different places.
Unlike most Bengali males, Joy was tall, strapping and a looker. Often dressed in bold colours with his long hair gelled and carefully brushed back, he looked a stylish city boy who takes attractive girls out for a date and a dinner. He also looked like a hero who could actually play a guitar though he preferred to carry it on his shoulder; just watch Lakhon hain nigah mein (film: Phir Wohi Dil Laya Hoon).
But for all his style and antics, the star failed to carve out a distinct screen persona for himself. Many saw him as a lesser version of the more popular Shammi Kapoor, who had a similar style.
Joy, who is also related to Kajol and Rani Mukerji, was most comfortable in breezy romantic comedies though he worked in family dramas (Bahu Beti) and spy thrillers (Humsaya, Inspector) too. His career reached its high point when Shagird (1967), with Saira Banu in the female lead, roared at the cash counters. Songs such as Dil vil pyaar vyar (Binaca Geet Mala's No 1 song of the year) and Woh hain zara khafa khafa sold records by the thousands.
But stardom is fickle. The actor's career suffered a major setback after the lavishly mounted spy drama, Humsaya (1968), which he both produced and directed, collapsed at the box-office. Joy had played the double role of a Chinese spy and an Indian officer in the film. As another crop of actors -- Dharmendra, Jeetendra and Rajesh Khanna -- gradually made their presence felt, his career went into a tailspin and lurched from one B-grade flop to another.
Nearly a decade later, Joy bounced back directing Chhaila Babu (1977), a slick crime thriller starring Rajesh Khanna and Zeenat Aman. The movie fared reasonably well but, ironically, it did not help his career. Not even acting as a villain in the B-grade dacoit drama, Phoolan Devi (1985), where Rita Bhaduri played the bandit queen, did his reputation or star status any good.
Joy and his wife Neelam have three children: two sons and a daughter. One of his sons, Boy Mukherjee, had also made his Bollywood debut but failed to make headway. After Dev Anand and Shammi Kapoor, Joy is the third Hindi film star to have passed away in recent months. But as long as television channels and radio stations play flashback songs, a Joy Mukherjee number will always be an ode to joy.
Six popular numbers Joy sang on screen
1. Humsaya (1968): Dil ki awaaz bhi sun mere fasane pe na ja
2. Shagird (1967): Bade miyan deewane aise na bano
3. Love in Tokyo (1966): O mere shahe khuba, o meri jaane janana
4. Aao Pyaar Karein (1964): Tum akele to kabhi baagh mein jaya na karo
5. Jee Chahta Hai (1964): Hum chhod chale hain mehfil ko
6. Ek Musafir Ek Hasina (1962): Aap yoon hi agar humse milte rahein dekhiye ek din pyaar ho jayega
Car given a $263,000 'invisibility cloak'
In a promotion for its first production fuel-cell vehicle in Germany, Mercedes-Benz turned a B-Class hatchback invisible -- at least from a distance -- using the same idea behind the invisible car in the James Bond film "Die Another Day." See if you can see it before it sees you.
The invisibility cloak had its tryout this week on the streets of Stuttgart, Germany. To make Q's idea of an invisible car real, Mercedes employed dozens of technicians and some $263,000 worth of flexible LED mats covering one side of the car. Using a camera mounted on the opposite side of the vehicle, the LEDs were programmed to reproduce the image from the camera at the right scale, blending the vehicle into the background from a few feet away. Doing so required power sources, computers and other gear totaling 1,100 lbs. of equipment inside the B-Class.
Mercedes' point was to show how the F-Cell hydrogen fuel cell powered car would be invisible to the environment, producing only water vapor and heat for emissions. For an invisible car, it's getting a lot of stares.
IANS Mar 29, 2012, 07.12PM IST
NEW DELHI: Seeking to reinforce their growing economic heft with diplomatic clout, the BRICS grouping Thursday pitched for a bigger say in global governance institutions, including the UN and the IMF, and told the West that dialogue was the only way to resolve the Iranian nuclear issue and the Syria crisis.
Brazil, Russia, India, China and South Africa, which comprise nearly half the world's population and a growing share of global GDP, signed two pacts to spur trade in their local currencies. They also agreed to set up a working group for a joint development bank to promote mutual investment in infrastructure.
Prime Minister Manmohan Singh of India and Presidents Hu Jintao (China), Dmitry Medvedev (Russia), Dilma Rousseff (Brazil) and Jacob Zuma (South Africa) ended the fourth BRICS summit by renewing the pitch for reforming global governance institutions and closer coordination on global issues.
The five leaders stressed the restructuring of the world order to accommodate emerging economies and developing countries and to promote sustained and balanced global economic growth.
"While some progress has been made in international financial institutions, there is lack of movement on the political side. BRICS should speak with one voice on important issues such as the reform of the UN Security Council," said Manmohan Singh, the summit host.
"We are committed to stepping up exchanges with other countries on global economic governance reforms and increasing representation of developing countries," said Hu.
The BRICS include Russia and China, two veto-wielding members of the UN Security Council, and three aspiring members for a permanent seat - India, Brazil and South Africa.
The BRICS leaders also pitched for greater voting rights for developing countries in the IMF and voiced disappointment with the West over the slow pace of the quota reforms.
In a fresh assertion, BRICS asked the West to implement the 2010 governance and quota reforms before the 2012 IMF/World Bank annual meeting, along with a comprehensive review of the quota formula to better reflect economic weights.
They called for enhancing the voice and representation of emerging market and developing countries by January 2013, followed by the completion of the next general quota review by January 2014.
In a signature step, the BRICS decided to create their first institution in the form of a BRICS-led South-South Development Bank that will mobilise "resources for infrastructure and sustainable development projects in BRICS and other emerging economies and developing countries", the BRICS' Delhi Declaration said.
The leaders directed their finance ministers "to examine the feasibility and viability of such an initiative, set up a joint working group for further study, and report back by the next summit".
The development banks of the five countries signed two pacts, including a master agreement on extending credit facility in local currency and BRICS multilateral letter of credit confirmation facility agreement, which could help scale up bilateral trade from $230 billion to $500 billion.
Challenging the West's hegemony of the Bretton Woods institutions, the BRICS leaders welcomed the candidatures from the developing world for the position of the president of the World Bank and backed "an open and merit-based process" for selection of the heads of the World Bank and IMF.
Contesting the West's narrative, the five countries warned the West against allowing the Iran situation to escalate into a conflict and said dialogue was the only way to resolve the Iranian and Syria issues.
"We agreed that a lasting solution in Syria and Iran can only be found through dialogue," Manmohan Singh said.
"The situation concerning Iran cannot be allowed to escalate into conflict, the disastrous consequences of which will be in no one's interest," said the declaration, in a veiled allusion to the speculated plan by the US and Israel to target Iran's nuclear facilities.
The declaration saw the leaders voicing "deep concern" over Syria as they called for "an immediate end to all violence and violations of human rights in that country", backing a Syrian-led inclusive political process.
China and Russia had earlier voted against the US- and Arab League-backed UN resolution on the grounds that it amounted to regime change, while India had supported the resolution.
When you use a public computer, you should clean up traces of your activity before leaving. You don't want the next user to recover your email conversations or passwords.
By Neil J. Rubenking February 21, 2012
On your own personal computer, you're free to install whatever security software you feel necessary. You'll surely want a firewall to block hack attacks and an antivirus app to keep out malware. You may add a spam filter to protect your Inbox, or a security suite that wraps comprehensive protection in a handy package. Your computer isn't accessible to random passers-by, so you may not be so worried about activity traces like browsing history.
Using a public computer at an Internet café, library, school, or even a friend's house is quite a different situation. First, you have no guarantee that the computer is protected; it might be riddled with viruses or afflicted with a keylogger. Second, unless you're careful, the next user might learn a lot more than you'd like about your online session.
Built-in Safe Browsing
For your convenience, the browser keeps a history of sites you've visited, stores cookies that retain personal settings for sites, and caches files for faster loading of sites you visited before. That's fine at home, but when you're using a public computer you don't want the browser storing all that information.
Fortunately most modern browsers can run in a mode that suppresses information-gathering and protects your privacy. You can right-click the Internet Explorer icon and choose "Start InPrivate Browsing," or right-click on the Firefox icon and choose "Enter private browsing." For either Firefox or IE, pressing Ctrl+Shift+P during a normal browsing session switches to private browsing. In Chrome, the private browsing mode is called "Incognito mode," and pressing Ctrl+Shift+N opens an Incognito mode window.
One more thing: be sure to shut down the browser when you're done. Even private browsing doesn't disable the Back button. You don't want the next user backing into your Facebook session or Web-based email account.
I Forgot! Now What?
Of course, there's every possibility you'll sit down to a public computer, check your bank balance, send a few emails… and only later remember that you should have opted for privacy. Fear not; erasing your activity is simple. In Chrome, Firefox, or Internet Explorer you simply press Ctrl+Shift+Del to call up the dialog for deleting your history. The details vary, but you'll want to make sure you've selected all of the options for deletion. Chrome and Firefox let you specify how far back the cleansing should go. Do other users a favor and have it clear all history, not just the last hour.
Cloak and Dagger
It's conceivable that the computer you're using might be seriously compromised security-wise. For example, a stealthy keylogger application could capture all passwords typed on the system. A hardware keylogger could do the same, with no possibility of detection by security software.
Your best bet is to simply refrain from sensitive transactions on a public computer. If you absolutely must log in to an important secure site on a suspect computer, here's one way to make password theft difficult: bring up a page with lots of text in the browser and copy/paste characters from that page into the password dialog. This "ransom note" style is decidedly tedious, but even a spy program that captures periodic screenshots probably won't snap all parts of your password.
Secure Your Connection
A shady Internet café operator could possibly make some money on the side by siphoning passwords out of data packets passing through the wireless network. The guy at the next table might be intercepting your connection using Firesheep or a similar tool. If you really must engage in sensitive communication, you need to secure the connection.
One way to do that is through a VPN (Virtual Private Network), which routes your surfing through a secure connection. PCMag has rounded up a number of free VPN clients. The problem here is that you probably don't have permission to install them on the public computer. However, VPN protection is definitely worthwhile if you've connected your own laptop to an iffy hotspot.
When government representatives and business executives visit China or Russia, they go "electronically naked." They leave all personal or company phones and laptops behind, using a new, blank loaner phone or laptop if necessary. It's an extreme step, but if you're not carrying any sensitive information there's nothing for a hacker to steal.
As you can see, there's a whole range of precautions you might take to keep an Internet café session from turning into an identity theft nightmare. If you're forced to use public computers for sensitive communication, consider using ransom-note passwords and possibly a VPN. Better yet, don't engage on a public computer in any sensitive communication that you could just as well handle from your home or office.
But even if you're doing nothing more than checking Facebook and emailing your dear auntie, do take the minimal precautions. Invoke the browser's privacy mode, or clear browsing data if you forgot. Doing so just takes a second and can save hours of aggravation.
Molecular computing: DNA is sometimes called the software of life. Now it is being used to build computers that can run inside cells
Mar 3rd 2012 | from the print edition
EVER since the advent of the integrated circuit in the 1960s, computing has been synonymous with chips of solid silicon. But some researchers have been taking an alternative approach: building liquid computers using DNA and its cousin RNA, the naturally occurring nucleic-acid molecules that encode genetic information inside cells. Rather than encoding ones and zeroes into high and low voltages that switch transistors on and off, the idea is to use high and low concentrations of these molecules to propagate signals through a kind of computational soup.
Computing with nucleic acids is much slower than using transistors. Unlike silicon chips, however, DNA-based computers could be made small enough to operate inside cells and control their activity. “If you can programme events at a molecular level in cells, you can cure or kill cells which are sick or in trouble and leave the other ones intact. You cannot do this with electronics,” says Luca Cardelli of Microsoft’s research centre in Cambridge, England, where the software giant is developing tools for designing molecular circuits.
At the heart of such circuits is Watson-Crick base pairing, the chemical Velcro that binds together the two strands of DNA’s double helix. The four chemical “bases” (the letters of the genetic alphabet) that form the rungs of the helix stick together in complementary pairs: A (adenine) with T (thymine), and C (cytosine) with G (guanine). By making single strands of DNA or RNA with specific A, T, C and G sequences, researchers can precisely define and predict which part of a strand will bind to another. These synthesised strands typically consist of fewer than 100 bases (a gene, by contrast, has thousands of bases).
Leonard Adleman, an American computer scientist, first demonstrated the use of nucleic-acid strand interactions for computing in 1994. He solved a version of the travelling-salesman problem—given a network of linked cities, what is the shortest route that visits each city exactly once?—in a test tube using specially sequenced DNA molecules and standard molecular-biology procedures (see box). Solving such a specific task is a far cry from building a general-purpose computer. But it showed that information could indeed be processed using interactions between strands of synthetic DNA.
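Dr Adleman's puzzle is formally the directed Hamiltonian-path problem. A brute-force Python sketch over a small made-up graph (his real instance had seven cities, not this one) shows the search that his DNA strands carried out in massive parallel, each candidate route corresponding to one assembled molecule:

```python
from itertools import permutations

def hamiltonian_paths(cities, edges, start, end):
    """Return every route that begins at `start`, ends at `end`, and
    visits each city exactly once using only the allowed edges."""
    found = []
    for middle in permutations(c for c in cities if c not in (start, end)):
        route = (start,) + middle + (end,)
        if all((a, b) in edges for a, b in zip(route, route[1:])):
            found.append(route)
    return found

# A small invented graph, purely for illustration.
cities = {"A", "B", "C", "D"}
edges = {("A", "B"), ("B", "C"), ("C", "D"), ("A", "C"), ("B", "D")}
print(hamiltonian_paths(cities, edges, "A", "D"))
```

The test tube explores all permutations at once by chemistry; the code must enumerate them one by one, which is why this approach on silicon scales so poorly with the number of cities.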
Dr Adleman’s work prompted other researchers to develop DNA-based logic circuits, the fundamental building blocks of computing, using a variety of approaches. The resulting circuits can perform simple mathematical and logical operations, recognise patterns based on incomplete data and play simple games. Molecular circuits can even detect and respond to a disease signature inside a living cell, opening up the possibility of medical treatments based on man-made molecular software.
Erik Winfree’s group at the California Institute of Technology (Caltech) is one of the best-known in this emerging field. In recent years it has made many nucleic acid-based digital logic circuits in test tubes, linking up logic gates capable of simple operations (such as AND, OR and NOT) using a trick called strand displacement, pioneered by three Caltech researchers, Georg Seelig, David Soloveichik and Dave Zhang.
In a strand-displacement logic circuit, inputs take the form of free-floating single DNA or RNA strands, and logic gates are complexes of two or more such strands, one of which is the potential output signal. “Sticky” tabs on the gates allow passing signals to latch on. If an input signal has a base-pair sequence complementary to the sequence on a gate, it binds to it, displacing the output strand and causing it to detach. The free-floating output strand can then, in turn, trigger another logic gate, causing a signal to travel through the circuit in a cascade. Billions of copies of the input, gate and output molecules are intermixed in a molecular soup. Programming such a system involves choosing specific base sequences to make up the different gates and the signal paths that connect them.
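The displacement mechanism can be caricatured in ordinary code. This toy model (the sequences, gate names and the `Gate` class are invented for illustration, not taken from the Caltech work) treats strands as strings and a gate as a complex that releases its bound output strand only when an input complementary to its exposed toehold arrives:

```python
def complement(strand):
    """Reverse-complement of a DNA sequence written 5'->3'."""
    pairs = {"A": "T", "T": "A", "C": "G", "G": "C"}
    return "".join(pairs[b] for b in reversed(strand))

class Gate:
    """A toy strand-displacement gate: holds an output strand that is
    displaced (released) exactly once, by the matching input strand."""
    def __init__(self, toehold, output):
        self.toehold = toehold
        self.output = output
        self.fired = False

    def react(self, strand):
        if strand is not None and not self.fired and strand == complement(self.toehold):
            self.fired = True
            return self.output  # the freed strand can trigger downstream gates
        return None

# A two-gate cascade: the strand released by gate1 is gate2's input.
gate1 = Gate(toehold="AACG", output="TTGC")
gate2 = Gate(toehold=complement("TTGC"), output="SIGNAL")

released = gate1.react(complement("AACG"))  # input displaces gate1's output
print(gate2.react(released))                # signal cascades to the end
```

In the real chemistry billions of copies of each species react concurrently and concentrations matter; this single-molecule sketch only captures the wiring logic, not the kinetics.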
In a paper published last year in the journal Science, Dr Winfree and his colleague Lulu Qian described the use of strand-displacement cascades to build circuits of increasing complexity, culminating in a circuit made of 74 different DNA strands (pictured) that was capable of calculating the square roots of four-digit binary numbers. Together with their colleague Jehoshua Bruck, they then built a tiny neural network, made up of four interconnected artificial neurons, using a soup of 112 different interacting DNA strands. Each neuron was designed to fire when the sum of its input signals exceeded a certain threshold, and could be configured to assign different weights to different inputs. Such neural networks can recognise simple patterns, even when presented with incomplete data.
To test their neural network’s pattern-recognition powers, Dr Qian made up a game to identify one out of four scientists. Each scientist was represented by a different set of answers to four yes-or-no questions. A human player would add to the test tube some (but not all) of the DNA strands corresponding to one set of answers. The circuit then guessed which scientist was the closest match, showing its answer using different-coloured fluorescent signals. The circuit took eight hours to give its answer, but got it right every time. And this circuit should work in a volume of one cubic micron (a millionth of a metre on each side), says Dr Winfree, which is small enough to fit into many sorts of cell.
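The behaviour of those DNA neurons, firing when weighted inputs cross a threshold, and tolerating blanks, can be sketched in a few lines of Python. The answer patterns below are invented placeholders, not the actual patterns used in Dr Qian's game:

```python
def neuron(inputs, weights, threshold):
    """Fire (return 1) when the weighted sum of the inputs reaches the
    threshold, mirroring the thresholding of the DNA neurons."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# Invented patterns for two 'scientists': weight +1 for a 'yes' answer,
# -1 for a 'no', across four yes-or-no questions.
patterns = {
    "Franklin": [1, -1, 1, -1],
    "Crick":    [-1, 1, 1, -1],
}

# A partial set of answers: 'yes' to Q1 and Q3, Q2 and Q4 left blank (0),
# just as the DNA game accepted incomplete inputs.
answers = [1, 0, 1, 0]
for name, weights in patterns.items():
    print(name, neuron(answers, weights, threshold=2))
```

Only "Franklin" matches the partial answers strongly enough to fire; the blanks contribute nothing either way, which is how incomplete data can still single out the closest pattern.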
Milan Stojanovic at Columbia University is building circuits using a different form of strand displacement based on catalytic DNA strands, also known as deoxyribozymes or DNAzymes. These are synthetic single-stranded DNA sequences that are, among other things, capable of cutting nearby DNA strands in specific places.
Dr Stojanovic makes a DNAzyme into a logic gate by attaching a loop of DNA at one end that prevents the DNAzyme from working. When one or more input strands bind to complementary sequences on the loop, the loop breaks, activating the DNAzyme and switching the gate on. It can then interact with other strands, chopping them to trigger other gates or activate fluorescent tags that display the circuit’s final output. Dr Stojanovic and his colleague Joanna Macdonald have used this approach to build simple DNA-based circuits capable of playing tic-tac-toe (though they take about half an hour to make each move).
Yannick Rondelez, a researcher in molecular programming at the University of Tokyo, is creating circuits in test tubes in a way that more closely resembles the operation of natural cells. He is using enzymes such as polymerases, nucleases and exonucleases that can also copy, cut and destroy nucleic-acid strands. In cells, enzymes are the basis of the natural circuits that switch genes on and off, maintain biological rhythms and produce molecular answers in response to environmental stimuli. Dr Rondelez has used his enzyme-based approach to build a molecular oscillator, which should be a useful addition to the molecular-computing toolbox.
A group at the Swiss Federal Institute of Technology (ETH Zurich) led by Yaakov Benenson, in collaboration with Ron Weiss of the Massachusetts Institute of Technology, is also creating circuits using enzymes. But unlike Dr Rondelez’s circuits, which work in test tubes, these operate inside cells, piggybacking on the existing cellular machinery found within them. Last year Dr Benenson’s team developed one of the most complex cell-based molecular circuits created so far, though it is still much simpler than systems built in test tubes. It is capable of recognising the signature of cervical cancer and destroying the host cell when it is found.
The circuit works by looking out for short strands called microRNAs, which regulate some processes within cells. They do this by interfering with the activity of the messenger RNA strands that transfer genetic information from the cell’s nucleus to its protein-making machinery. Dr Benenson and his team chose five microRNAs associated with cervical cancer and designed a “classifier” circuit able to detect them. Only if all five are found at the right levels does the circuit activate, producing a protein that causes the cell to destroy itself.
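Logically, the classifier is a five-input AND gate over concentration bands. A minimal sketch, with entirely hypothetical marker names and bands (the real signature involves specific cervical-cancer microRNAs and calibrated levels):

```python
def classifier(levels, profile):
    """Five-input AND: fire only when every marker microRNA sits inside
    its expected (low, high) concentration band."""
    return all(lo <= levels.get(m, 0.0) <= hi for m, (lo, hi) in profile.items())

# Hypothetical signature: two markers must be high, three must be low.
profile = {
    "miR-A": (0.8, 1.0), "miR-B": (0.8, 1.0),
    "miR-C": (0.0, 0.2), "miR-D": (0.0, 0.2), "miR-E": (0.0, 0.2),
}

sick    = {"miR-A": 0.9, "miR-B": 0.85, "miR-C": 0.1, "miR-D": 0.05, "miR-E": 0.1}
healthy = {"miR-A": 0.9, "miR-B": 0.85, "miR-C": 0.6, "miR-D": 0.05, "miR-E": 0.1}
print(classifier(sick, profile), classifier(healthy, profile))
```

Requiring all five bands to match at once is what keeps the circuit from killing healthy cells: a single out-of-band marker, as in the second profile above, vetoes the self-destruct output.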
Rather than injecting the necessary components, the researchers tricked the cell into producing them itself by adding instructions for them, in the form of synthetic genes, to the genetic instructions in the cell’s nucleus. “We build the template in the form of synthetic genes and the cell turns them into components,” says Dr Benenson. “So we are hijacking the pathway that already exists.” But this trick is currently possible only for simple circuits.
With molecular circuits becoming steadily more complex, new software tools are being developed to design, model and debug them. Microsoft’s researchers in Cambridge are working with experimentalists at Caltech, the University of Washington and the University of Oxford on a programming language and simulator for strand-displacement circuits, called the DNA Strand Displacement (DSD) tool. Users specify a description of a DNA-based circuit, including how individual DNA strands are joined together, and the software then simulates its behaviour, explains Andrew Phillips, the head of Microsoft’s biological-computation group.
Dr Phillips’s group is also developing tools to model the machinery within cells, including a language called Genetic Engineering of Cells (GEC). Work is under way with synthetic-biology researchers at the University of Cambridge to hook up these different biological modelling environments. “You could have a model of a DNA circuit written in DSD, which interfaces with a model of the cell machinery written in GEC,” he says. It would then be possible to simulate the operation of a DNA circuit that runs inside a cell and outputs drug molecules when certain conditions are met, for example.
Treatments based on molecular computers are still some way off. Today’s most elaborate DNA circuits operate on work benches, not inside cells. But the border between computing and biology is vanishing fast, and the process of hijacking the information-processing potential of DNA to build logic circuits has only just begun.
From the print edition | Technology Quarterly
A series of reports from the annual meeting of the American Association for the Advancement of Science kicks off with new developments in quantum computing
Feb 25th 2012 | vancouver | from the print edition
QUANTUM effects are vital to modern electronics. They can also be a damnable nuisance. Make a transistor too small, for example, and electrons within it can simply vanish from one place and reappear in another because their location is quantumly indeterminate. Currents thus leak away, and signals are degraded.
Other people, though, see opportunity instead. Some of the weird things that go on at the quantum scale afford the possibility of doing computing in a new and faster way, and of sending messages that—in theory at least—cannot be intercepted. Several groups of such enthusiasts hope to build quantum computers capable of solving some of the problems which stump today’s machines, such as finding prime factors of numbers with hundreds of digits or trawling through large databases. They gave a progress report to the annual meeting of the American Association for the Advancement of Science (AAAS) in Vancouver.
At the core of their efforts lie the quantum-mechanical phenomena of superposition and entanglement. An ordinary digital computer manipulates information in the form of bits, which take the value of either 0 or 1. These are represented within the computer as different voltages of electric current, itself the result of the electron’s charge. This charge is a fixed feature of all electrons; each has the same amount of it as any other. But electrons possess other, less rigid properties like spin, which can be either “up”, “down” or a fuzzy, imprecisely defined combination of the two. Such combinations, known as superpositions, can be used to construct a quantum analogue of the traditional bit—the qubit.
Entanglement, meanwhile, is the roping together of particles in order to add more qubits. Each extra qubit in a quantum machine doubles the number of simultaneous operations it can perform. It is this which gives quantum computing its power. Two entangled qubits permit four operations; three permit eight; and so on. A 300-qubit computer could perform more concurrent operations than there are atoms in the visible universe.
A coherent idea
Unfortunately, such a machine is not in the offing. Entanglement and superposition are delicate things. Even the slightest disturbance causes qubits to “decohere”, shedding their magical properties. To build a working quantum computer, qubits will have to become more resilient, and progress so far has been slow. The first quantum computations were done in the lab in 1995. Since then various teams have managed to entangle as many as 14 qubits. The record holders, a group in Innsbruck, use a device called an ion trap in which each qubit exists as a superposition of a rubidium atom at different energies. Raymond Laflamme and his colleagues at the University of Waterloo, in Canada, have managed to entangle 12 qubits by performing a similar trick, entangling certain atoms within a single molecule of an amino acid called histidine, the properties of which make it particularly suited to such experiments.
The problem with these approaches is that they will not be easy to scale up. Ion traps reside inside big vacuum chambers, which cannot easily be shrunk. And a molecule of histidine contains only so many suitable atoms. So the search is on for more practical qubits.
One promising approach is to etch qubits in semiconductors. Charles Marcus, previously of Harvard University and now at the University of Copenhagen, has been using electrons’ spins to do this. Single-electron qubits decohere quickly, so his team decided instead to create a qubit out of two electrons, which they trapped in “quantum dots”, tiny semiconducting crystals (of gallium arsenide, in this case). When two such dots are close together, it is possible to get an electron trapped in one to pop over and join its neighbour in the other. The superposition of the two electrons’ spins produces the qubit.
Dr Marcus’s team have so far managed to stitch four such qubits together. An array of clever tricks has extended their life to about ten microseconds—enough to perform the simple algebraic operations that are the lifeblood of computing. They hope to extend their life further by using silicon or carbon, the atomic nuclei of which interfere less with the entangled electrons than do those of gallium arsenide.
John Martinis and his colleagues at the University of California, Santa Barbara (UCSB), meanwhile, have been trying to forge qubits from superconducting circuits. In a superconductor, electrons do not travel solo. Instead, for complicated quantum-mechanical reasons, they pair up (for the same reasons, the pairs feel no electrical resistance). When they do so, the pairs start behaving like a single particle, superposing proclivities and all. This superparticle can, for instance, in effect be moving in two directions at once. As electrons move, they create a magnetic field. Make a closed loop of superconducting wire, then, and you get a magnetic field which can be facing up and down at the same time. You have yourself a superconducting qubit—or five, the number Dr Martinis has so far managed to entangle.
He has another clever trick up his sleeve. Using a device called a resonator he has been able to transfer information from the circuit to a single photon and trap it in a cavity for a few microseconds. He has, in other words, created a quantum memory. A few microseconds may not sound like much, but it is just about enough to perform some basic operations.
The problem with all these approaches is that the quantum states they rely on are fragile, which allows errors to creep in. One way to ensure that they do not scupper the calculation is to encode the same information in several qubits instead of just one. Drs Marcus, Martinis and Laflamme have therefore had to build redundant qubits into their systems. For every “logical” qubit needed to do a calculation, there is a handful of physical ones, all of which need to be entangled.
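The idea behind redundant encoding can be illustrated with its classical ancestor, the three-bit repetition code. Real quantum error correction is far subtler, but the pay-off is the same: an error in one physical unit rarely corrupts the logical one. A hedged sketch:

```python
import random

def encode(bit):
    # One logical bit becomes three physical bits.
    return [bit, bit, bit]

def noisy(bits, p_flip, rng):
    # Each physical bit flips independently with probability p_flip.
    return [b ^ (rng.random() < p_flip) for b in bits]

def decode(bits):
    # Majority vote: a single flipped bit is outvoted by the other two.
    return int(sum(bits) >= 2)

rng = random.Random(42)
p, trials = 0.05, 100_000
raw_errors = sum(rng.random() < p for _ in range(trials))
coded_errors = sum(decode(noisy(encode(0), p, rng)) != 0 for _ in range(trials))

print(raw_errors / trials)    # close to p = 0.05
print(coded_errors / trials)  # close to 3*p**2 = 0.0075: errors are much rarer
```

The cost is exactly the one the paragraph describes: three physical bits per logical bit here, and a handful of entangled physical qubits per logical qubit in the quantum case.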
Michael Freedman is trying to address this problem by taking a different tack. Together with his colleagues at Microsoft’s Station Q research centre, also at UCSB, he is trying to build what he calls a topological quantum computer. This uses a superconductor on top of a layer of an exotic material called indium antimonide. When a voltage is applied to this sandwich, the whole lot becomes a quantum system capable of existing in superposed states.
Where Dr Freedman’s qubits differ from Dr Martinis’s is in the way they react to interference. Nudge any electron in a superconducting circuit and the whole lot decoheres. Dr Freedman’s design, however, is invulnerable to such local disruptions thanks to the peculiar way in which energy is distributed throughout indium antimonide. The Microsoft team has yet to create a functioning qubit, but hopes to do so soon, and is searching for other materials in which to repeat the same trick.
All of this work is pretty fundamental. Researchers are a long way from creating quantum mainframes, which is how most of them see the future of their fiddly devices, let alone quantum desktops. Dr Martinis thinks that a viable quantum processor is still ten years away. Yet even this is progress of a sort. When he entered the field two decades ago, he thought that building a quantum processor was “insanely difficult”. Now he says it is merely “very, very hard”.
Energy technology: Better ways of storing energy are needed if electricity systems are to become cleaner and more efficient
Mar 3rd 2012 | from the print edition
SUMMER in Texas last year was the hottest on record. Demand for power spiked as air conditioners hummed across the state. The Electric Reliability Council of Texas (ERCOT), the state grid operator, only narrowly avoided having to impose rolling blackouts. To do so, it had to buy all the electricity it could find on the spot market, in some cases paying an eye-watering 30 times the normal price.
On paper at least, ERCOT ought to have had plenty of power. In 2010 it reported 84,400 megawatts (MW) of total generation capacity, well over last summer’s peak demand of 68,294MW. In theory, this is enough to produce some 740 billion kilowatt hours (kWh) of electricity a year—more than double the 319 billion kWh that ERCOT’s customers actually demanded during 2010. In electricity generation, however, aggregates and averages carry little weight. One problem is that wind energy accounted for 9,500MW of ERCOT’s total capacity, and the wind does not blow all the time. It tends to be strongest at night, when demand is low. Moreover, power firms are required by regulators to maintain a safety margin over total estimated demand—of 13.75%, in ERCOT’s case—in order to ensure reliable supply.
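ERCOT's arithmetic is easy to check. A quick sanity check of the figures in the paragraph above (the reserve-margin line simply shows how the 13.75% requirement applies to peak demand):

```python
capacity_mw = 84_400          # ERCOT's reported 2010 generation capacity
peak_mw = 68_294              # last summer's peak demand
actual_demand_kwh = 319e9     # what customers actually drew during 2010

hours_per_year = 8_760
theoretical_kwh = capacity_mw * 1_000 * hours_per_year  # MW -> kW, then x hours
print(theoretical_kwh / 1e9)                # ≈ 739: the "some 740 billion kWh"
print(theoretical_kwh / actual_demand_kwh)  # ≈ 2.3: "more than double"

reserve_margin = 0.1375       # ERCOT's required safety margin
required_mw = peak_mw * (1 + reserve_margin)
print(required_mw)            # ≈ 77,684 MW needed to cover peak plus margin
```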
If only it were easier for ERCOT and other utilities to store excess energy, such as that produced by wind turbines at night, for later use at peak times. Such “time shifting” would compensate for the intermittent nature of wind and solar power, making them more attractive and easier to integrate into the grid. Energy storage also allows “peak shaving”. By tapping stored energy rather than firing up standby generators, utilities can save money by avoiding expensive spot-market purchases.
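A toy illustration of time shifting, with invented numbers: surplus night-time wind is banked (minus round-trip losses) and drawn down at the evening peak, shrinking spot-market purchases.

```python
# Four coarse time blocks: night, morning, midday, evening peak (MW, invented).
demand = [40, 35, 60, 90]
wind   = [70, 50, 30, 20]
round_trip = 0.75            # PSH-like round-trip efficiency

stored = 0.0                 # energy banked in the store
spot_purchases = []          # what still has to be bought on the spot market
for d, w in zip(demand, wind):
    if w >= d:
        stored += (w - d) * round_trip   # bank the surplus, taking losses here
    else:
        shortfall = d - w
        from_store = min(stored, shortfall)
        stored -= from_store
        spot_purchases.append(shortfall - from_store)

print(sum(spot_purchases))   # -> 66.25, versus 100.0 with no storage at all
```

Even with a quarter of the banked energy lost, the store cuts spot purchases by roughly a third in this toy case, which is the "peak shaving" the article describes.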
Surely the answer is to use giant batteries? Although batteries can deliver power for short periods, and can smooth out the bumps as different sources of power are switched on and off, they cannot provide “grid scale” performance, storing and discharging energy at high rates (hundreds of megawatts) and in really large quantities (thousands of megawatt hours). So other technologies are needed—and growing demand, driven chiefly by wider use of intermittent renewable-energy sources, is sparking plenty of new ideas.
It’s got potential
The most widely used form of bulk-energy storage is currently pumped-storage hydropower (PSH), which uses the simple combination of water and gravity to capture off-peak power and release it at times of high demand. Pumped-hydro facilities typically take advantage of natural topography, and are built around two reservoirs at different heights. Off-peak electricity is used to pump water from the lower to the higher reservoir, turning electrical energy into gravitational potential energy. When power is needed, water is released back down to the lower reservoir, spinning a turbine and generating electricity along the way. PSH accounts for more than 99% of bulk storage capacity worldwide: around 127,000MW, according to the Electric Power Research Institute (EPRI), the research arm of America’s power utilities.
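The physics of PSH is plain gravitational potential energy, E = mgh. A back-of-the-envelope sketch with illustrative reservoir numbers (not drawn from any particular plant):

```python
# Gravitational potential energy: E = m * g * h.
g = 9.81                     # m/s^2
head_m = 300                 # height difference between reservoirs (illustrative)
volume_m3 = 1_000_000        # one million cubic metres of water
mass_kg = volume_m3 * 1_000  # fresh water is ~1,000 kg per cubic metre

energy_mwh = mass_kg * g * head_m / 3.6e9   # 1 MWh = 3.6e9 joules
print(energy_mwh)            # ≈ 817.5 MWh of potential energy stored

round_trip = 0.75            # within the 70-75% PSH range cited in the article
print(energy_mwh * round_trip)  # ≈ 613 MWh actually recovered after losses
```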
Yet despite its dominance, traditional PSH has limited capacity for expansion. The kind of sites needed for such systems are few and far between. As a result, several firms are devising new forms of PSH.
One ambitious idea (pictured above) is the Green Power Island concept devised by Gottlieb Paludan, a Danish architecture firm, together with researchers at the Technical University of Denmark. This involves building artificial islands with wind turbines and a deep central reservoir. When the wind blows, the energy is used to pump water out of the reservoir into the sea. When power is needed, seawater is allowed to flow back into the reservoir, driving turbines to produce electricity.
Gravity Power, a start-up based in California, has devised a system that relies on two water-filled shafts, one wider than the other, which are connected at both ends. Water is pumped down through the smaller shaft to raise a piston in the larger shaft. When demand peaks, the piston is allowed to sink back down the main shaft, forcing water through a generator to create electricity. The system’s relatively compact nature means it can be installed close to areas of high demand, and extra modules can be added when more capacity is needed, says Tom Mason, the firm’s boss.
Gravity Power’s subterranean hydropower
Another company looking to harness the potential of gravity is Advanced Rail Energy Storage (ARES), based in Santa Monica, California. Its system uses modified railway cars on a specially built track. Off-peak electricity is used to pull the cars to the top of a hill. When energy is needed, the cars are released, and as they run back down the track their motion drives a generator. Like PSH, the ARES system requires specific topography. But William Peitzke, the firm’s boss, says ARES delivers more power for the same height differential. He also says it is more efficient, with a round-trip efficiency—the ratio of energy out to energy in—of more than 85%, compared with 70-75% for PSH. A demonstration system is being built in California, and should become operational in 2013.
The second-biggest form of bulk-energy storage, though it is dwarfed by PSH, is compressed-air energy storage (CAES). This involves compressing air and storing it in large repositories, such as underground salt caverns. During peak hours the air is released to drive a turbine. There are only two commercial CAES plants in operation: one in Huntorf, Germany, and the other in McIntosh, Alabama. The big drawback of CAES is its inefficiency. According to RWE, a German utility, the Huntorf plant is only 42% efficient, and the one in Alabama is only slightly better. The problem is that air heats up when pressurised and cools down when expanded. In existing CAES systems energy is lost as heat during compression, and the air must then be reheated before expansion. The energy to do this usually comes from natural gas, reducing efficiency and increasing greenhouse-gas emissions.
As with hydro storage, efforts are under way to adapt the basic concept of CAES to make it more efficient and easier to install. RWE is working with GE, an industrial conglomerate, and others to commercialise a compressed-air system that captures the heat produced during compression, stores it, and then reapplies it during the expansion process, eliminating the need for additional sources of heat. Having proven the theoretical feasibility of this concept, the partners must now overcome the technical hurdles, which include developing pumps to compress air to 70 times atmospheric pressure, and ceramic materials to store heat at up to 600°C. The aim is to start building a 90MW demonstration plant in Stassfurt, Germany, in 2013, says Peter Moser, the head of RWE’s research arm.
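The scale of the heat problem falls out of basic thermodynamics. For ideal single-stage adiabatic compression, T2 = T1 * (p2/p1)**((gamma-1)/gamma); a quick check (idealised; real compressors work in stages with intercooling) shows why heat stores rated to about 600°C are in the right ballpark:

```python
# Ideal single-stage adiabatic compression of air.
gamma = 1.4            # heat-capacity ratio of air
T1 = 293.0             # intake air at ~20 C, in kelvin
pressure_ratio = 70    # the 70x atmospheric pressure mentioned above

T2 = T1 * pressure_ratio ** ((gamma - 1) / gamma)
print(T2 - 273)        # ≈ 713 C in one ideal stage: the compression heat that
                       # existing CAES plants throw away, and that ceramic
                       # stores at up to 600 C would keep for the expansion step
```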
Several smaller outfits are also developing more efficient forms of CAES. SustainX, a company spun out of Dartmouth College’s engineering school and supported by America’s Department of Energy (DOE) and GE, among others, has developed what it calls “isothermal CAES”, which removes heat from the compressed air by injecting water vapour. The water absorbs the heat and is then stored and reapplied to the air during the expansion process. And rather than relying on salt caverns, SustainX uses standard steel pipes to store the compressed air, allowing its systems to be installed wherever they are needed. The firm has built a 40-kilowatt demonstration plant and is partnering with AES, a utility, to build a 1-2MW system. General Compression, a Massachusetts-based company also backed by the DOE, has developed an isothermal CAES system focused on providing support to wind farms. With the backing of ConocoPhillips, an energy giant, it is building a 2MW demonstration plant in Texas.
Another way to store energy is in the form of heat. That is the approach taken by Isentropic, a company based in Cambridge, England, with a system it calls pumped heat electricity storage (PHES), which uses argon gas to transfer heat between two vast tanks filled with gravel. Incoming energy drives a heat pump, compressing and heating the argon and creating a temperature differential between the two tanks, with one at 500°C and the other at -160°C. During periods of high demand, the heat pump runs in reverse as a heat engine, expanding and cooling the argon and generating electricity. Isentropic says its system has an efficiency of 72-80%, depending on size.
BrightSource Energy, an energy company based in Oakland, California, has signed a deal with Southern California Edison, a utility, to implement a system that stores energy in molten salt. BrightSource generates electricity using an approach called concentrated solar power, in which computer-controlled mirrors, known as heliostats, focus the sun’s heat to boil water and turn a steam turbine. But this approach works only while the sun is shining. The storage system, called SolarPLUS, uses a heat exchanger to transfer some of the heat captured by the heliostats to the molten salt. It is then run back through the heat exchanger to drive the steam turbine when needed. This allows BrightSource’s plants to deliver energy even after dark, and gives utilities and grid operators more flexibility than solar power usually provides. BrightSource is planning to equip three of its plants with SolarPLUS.
Changing the rules
The potential market is huge: according to Pike Research, a market-research firm, $122 billion will be invested in energy-storage projects between 2011 and 2021. It predicts that the bulk of this spending will go towards new forms of CAES. Green-minded governments and regulators are taking a closer interest in the technology. California has passed a law requiring utilities to consider storage in their plans. Germany’s environment ministry last year proposed a project to assess technology developments and funding needs for energy storage. And the British government’s “low-carbon networks” fund is being used to build some demonstration projects.
Yet large-scale deployment of bulk storage systems will require regulatory as well as technical progress. Storage systems do not fit neatly into regulatory frameworks that distinguish between power providers and grid operators, since they can be used by both. Their ability to take power off the grid, store it, and then release it later creates “potential problems for current tariff, billing and metering approaches,” notes the EPRI in a recent report. Nor is it clear whether power companies will be allowed to pass on the cost of storage facilities to their customers. But given the technology’s potential to make power grids cleaner and more reliable, it seems likely that changes to the rules are in store.
from the print edition | Technology Quarterly
Hemali Chhapia, TNN Feb 28, 2012, 03.25AM IST
MUMBAI: With supply outstripping demand for engineering and management seats, the country may stop new professional colleges from coming up from 2014. This firm stand was taken recently at a meeting of the All India Council for Technical Education (AICTE), the national regulator that grants permission to new professional technical colleges. The decision follows requests from several states that want the council to reject fresh proposals for more colleges.
While many states wanted the AICTE to stop accepting applications immediately, the process of setting up a college, such as buying land and building infrastructure, begins two years before a college trust approaches the AICTE for permission. "So, we have decided that two years from now, we will review the situation and may stop accepting proposals for all new technical colleges," said AICTE chairman S S Mantha.
States such as Andhra Pradesh, Karnataka, Tamil Nadu, Haryana, Chhattisgarh and Maharashtra told the AICTE not to clear proposals for new institutes after waking up to the fact that the number of vacant seats in engineering and management colleges has risen dramatically over the last three years. India is now home to 3,393 engineering colleges with 14.86 lakh seats, and 3,900 management schools with a total student intake of 3.5 lakh. Maharashtra, Andhra Pradesh, Tamil Nadu, Karnataka and Uttar Pradesh account for about 70% of the country's technical institutes. When admissions closed last year, the AICTE estimated that nearly three lakh seats were unfilled.
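For readers unused to Indian numbering, 1 lakh = 100,000. The article's figures, converted and compared (treating the "nearly three lakh" unfilled seats as engineering seats, as the context suggests, is an assumption):

```python
# 1 lakh = 100,000 in Indian numbering.
LAKH = 100_000
engineering_seats = 14.86 * LAKH
management_seats = 3.5 * LAKH
vacant_seats = 3 * LAKH   # "nearly three lakh" unfilled, per the AICTE

print(round(engineering_seats))   # -> 1486000 engineering seats in absolute terms
print(round(vacant_seats / engineering_seats, 2))  # -> 0.2: roughly a fifth vacant
```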
Despite the AICTE's decision to wait, many states have resolved not to allow new colleges to start this year, putting the state governments and the council on a collision course.
This year, the AICTE received a total of 204 applications for new engineering institutes and 86 for MBA colleges. "This year, we saw an interest in colleges again wanting to invest in engineering education. However, applications from the southern states, which have witnessed the expansion, are down to a trickle," added Mantha. Andhra Pradesh, which has the largest number of engineering colleges in India, has sent in merely eight applications for new engineering colleges this year, and a similar number for MBA colleges.
Growth without planning has left the sector geographically skewed. But if the AICTE's optimism is anything to go by, the country will now see professional colleges springing up in regions such as the north-east and central India, which still suffer from low enrolment in professional education.
Closer home, edupreneurs (education entrepreneurs) from Maharashtra are bullish on the growth in this sector. Maharashtra has a rich pool of 348 engineering institutes and 408 MBA colleges. And the fact that 34,000 seats did not have any takers last year did not play spoilsport. The AICTE received 30 applications to start engineering colleges and 15 for MBA institutes from Maharashtra this year (see box).
"We have received the highest number of applications from Maharashtra. But, we have an impressive 307 applicants (almost 50% of the entire pool) for starting polytechnics (colleges that offer diploma in engineering) from across India," added Mantha.
However, the overall slowdown is perceptible: two years ago, the AICTE received 2,176 applications to start new professional degree colleges; this time around, the number stands at a paltry 362. And two years from now, no new colleges may be allowed to come up at all.
By watching evolution in progress, scientists reveal key developments in the evolution of complex life and put evolutionary theories to the test
By Sarah Fecht | January 16, 2012
The transition from single-celled to multicellular organisms was one of the most significant developments in the history of life on Earth. Without it, all living things would still be microscopic and simple; there would be no such thing as a plant or a brain or a human. How exactly multicellularity arose is still a mystery, but a new study, published January 16 in Proceedings of the National Academy of Sciences, found that it may have been quicker and easier than many scientists expected.
"This is a significant paper that addresses one of the most fundamental questions in evolutionary and developmental biology," says Rick Grosberg, an evolutionary biologist at the University of California, Davis, who was not involved with the research.
Since evolution acts on individual cells, it pays off for a cell to be selfish. By hogging resources and hindering neighbors, a cell can increase the odds that more of its own genes get passed into the next generation. This logic is one of the reasons it has been challenging to imagine how multicellularity arose; it requires the subjugation of self-interest in favor of the group’s survival.
"Traditional theories make this out to be a difficult transition because you have to somehow turn off selection on the individual cells and turn it on for the collective," says Carl Simpson, a paleobiologist at the Museum für Naturkunde in Berlin, who also was not involved in the research. "The big result here is that these transitions can be super easy."
In the new paper, researchers at the University of Minnesota used a simple but elegant technique to artificially select for multicellularity in yeast. They dumped unicellular yeast into a tube of liquid food and waited a few minutes for the cells to settle. Then they extracted the lowest fraction of the liquid and allowed whatever cells it contained to form the next generation. Because the cells had to cluster together in order to sink to the bottom and survive, the artificial selection made it more advantageous for yeast to cooperate than to be solitary.
After just 60 generations, all of the surviving yeast populations had formed snowflake-shaped multicellular clusters. "Hence we know that simple conditions are sufficient to select for multicellularity," says biologist Michael Travisano, who led the research.
But at what point do the yeast become something more than a cluster of cells? When do they begin behaving as one organism?
In a true multicellular organism, such as a rabbit, evolution acts on the rabbit and not on each of the billions of cells that build it. So the researchers set out to determine whether artificial selection would act on the snowflake yeast as if they also were multicellular organisms. To test it, one batch of the multicellular yeast was allowed only five minutes to settle in a tube (representing a strong selection pressure), whereas another batch was given 25 minutes (a weaker selection pressure). After 35 generations, the yeast that were exposed to stronger selection evolved to have larger cluster sizes, whereas those in the weak selection group actually shrank in size. This indicated that each cluster of cells was evolving as one organism.
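The selection scheme is easy to caricature in code. A toy model (all parameters invented; it captures only the settling step, not the size-versus-reproduction trade-off discussed below) in which cluster size is heritable and only clusters that settle within the time window found the next generation:

```python
import random

def evolve(settle_minutes, generations=35, pop_size=200, seed=1):
    rng = random.Random(seed)
    # Start from modest variation in cluster size (arbitrary units).
    population = [rng.uniform(5, 15) for _ in range(pop_size)]
    for _ in range(generations):
        # Bigger clusters sink faster: a cluster of size s takes ~50/s minutes
        # to reach the bottom, so only large ones make a short cut-off.
        survivors = [s for s in population if 50 / s <= settle_minutes]
        if not survivors:            # safety net; not triggered with these numbers
            survivors = population
        mean = sum(survivors) / len(survivors)
        # Offspring resemble the surviving pool, with heritable variation.
        population = [max(1.0, 0.8 * mean + 0.2 * rng.choice(survivors)
                          + rng.gauss(0, 0.5)) for _ in range(pop_size)]
    return sum(population) / len(population)

strong = evolve(settle_minutes=5)   # five-minute window: strong selection
weak = evolve(settle_minutes=25)    # 25-minute window: weak selection
print(strong > weak)                # -> True: harsher selection, larger clusters
```

Selection here acts on whole clusters, not individual cells, which is the point of the experiment: the cluster, not the cell, is the unit that lives or dies.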
In addition, time-lapse photography (video below) revealed that, in order to reproduce, the multicellular yeast divide into branches that themselves develop into the multicellular form. The daughter clusters did not create their own offspring until they had reached a size similar to their parents'. The presence of this juvenile stage shows that the snowflake yeast had adopted a multicellular way of life, says William Ratcliff, a postdoctoral researcher in Travisano’s lab.
The researchers also found evidence of rudimentary division of labor, which is an essential characteristic for more complex multicellular life forms. In a human, for example, some cells may differentiate into blood cells, others may differentiate into immune cells, but only select egg or sperm cells help form the next generation.
In the multicellular yeast, the division of labor was more subtle. Although the experiment's artificial selection favored large clusters, a large cluster required more time to grow before it could reproduce. That meant that smaller clusters, which divide in half more quickly, could soon outnumber the larger clusters. But after many generations of selection, the large clusters evolved a solution: nonreproductive cells that served as points where offspring could break away from the parent cluster. By providing more break points, these specialized cells allowed the clusters to break into more pieces, producing a greater number of offspring more quickly.
“The discovery that there are cells specialized to die in order for the structure to reproduce is suggestive of the first steps toward cellular differentiation,” Grosberg says.
Although researchers agree that the yeast clusters could indeed be considered multicellular organisms, they remain relatively simple. "The researchers are not going to evolve sponges with this approach, but it's amazing what they’re able to do so quickly," Simpson says.
The fast evolution was not all that surprising to Grosberg, who has written papers arguing that multicellularity should be relatively easy to evolve; other researchers have estimated that multicellularity has arisen independently on at least 25 different occasions throughout the history of life. Yet nobody really knew how it originated, or what steps were involved in the process. By watching evolution in progress, the new research uncovered experimental evidence for these theories and revealed one possible scenario of how multicellularity may have evolved.
"We had hypotheses about how multicellularity could evolve, but until now, no one has really been able to test them,” Ratcliff says. "Now that we have this experimental system, we can ask lots of really exciting questions."