SCIENCE & TECHNOLOGY

Biological computers
D.P. Singh

Biological computers, with a data storage capacity 300 times greater than that of conventional computers, are on the anvil. These super-intelligent machines are likely to be endowed with a faculty of discrimination, fault tolerance, pattern recognition and an in-built capability to handle and process a wide range of mind-boggling data in one go.

A computer driven by chips stuffed with material derived from a living organism is called a biological computer. An ideal bio-computer is expected to resemble a human brain — the most powerful supercomputer engineered by nature. To overcome the limitations of the current genre of computers, which allow only for sequential processing of data, researchers are looking beyond silicon chips to develop the computers of the future.

It is felt that a better insight into the physiological and anatomical characteristics of the nervous systems of various living species can help researchers devise a computer that mimics the human brain. So researchers are investigating the proteins of microorganisms (such as bacteria) and the nervous systems of animals (such as the leech and the squid) to develop the biological computers of the future.

For the last two decades, American, Japanese and Russian researchers have been actively involved in the development of biochips. Russia has done pioneering work in developing the technology and materials for building biological computers. The Institute of Biological Physics, Russia, has developed a memory device from a photosensitive bacterial protein. Several American organisations are exploring the possibility of using a protein extracted from the bacterium E. coli to develop a chip capable of increasing a computer’s data storage capacity manifold.

Recently, the Indian scientist Prof. K.P.J. Reddy developed a biochip using a bacterial protein in tandem with laser beams. He mixed the bacterial protein with a polymer under zero-gravity conditions to develop a high-purity biochip based on the protein bacteriorhodopsin. On irradiation by green and red laser beams, bacteriorhodopsin switches between its natural state and another stable state. These two states serve as the biological equivalents of the digits 0 and 1 — the keys to writing data on the chip. Thus bacteriorhodopsin becomes an ideal candidate for an optical memory that depends on light to read and write.
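To make the idea concrete, here is a minimal toy sketch of how a two-state protein could behave as an optical memory cell. The mapping of green and red pulses to "write" and "erase", and the class and method names, are illustrative assumptions, not details reported for the actual biochip.

```python
# Toy model of a bacteriorhodopsin-style optical memory cell.
# Illustrative only: the colour-to-transition mapping is an assumption,
# not the measured behaviour of the real protein or chip.

class OpticalMemoryCell:
    """A single protein 'bit' with two stable states: 0 (natural) and 1 (switched)."""

    def __init__(self):
        self.state = 0  # natural state of the protein

    def pulse(self, colour: str) -> None:
        """Expose the cell to a laser pulse of the given colour."""
        if colour == "green":
            self.state = 1   # green light drives the protein to the second stable state
        elif colour == "red":
            self.state = 0   # red light returns it to the natural state
        else:
            raise ValueError(f"unsupported laser colour: {colour}")

    def read(self) -> int:
        """Read the stored bit (in practice done optically, without erasing it)."""
        return self.state


# Writing the bit pattern 1, 0, 1 onto three cells:
cells = [OpticalMemoryCell() for _ in range(3)]
for cell, bit in zip(cells, [1, 0, 1]):
    cell.pulse("green" if bit else "red")
print([cell.read() for cell in cells])  # -> [1, 0, 1]
```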

Japanese bio-scientists are trying to develop biological neural networks that simulate the nervous systems of living organisms. The electro-chemical properties of neural cells have been found useful in building high-performance biochips. These researchers are keenly analysing the nervous systems of leeches, squids and bugs. The work is expected to help develop a computing system endowed with a discriminative faculty.

In 1994, Leonard Adleman, a computer scientist at the University of Southern California, showed that a large number of DNA (deoxyribonucleic acid) molecules could help solve a range of complex problems. Today, many researchers and organisations are actively engaged in turning DNA into a problem-solving machine.
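Adleman's demonstration solved a small instance of the Hamiltonian path problem by encoding candidate routes in DNA strands and letting chemistry explore them all in parallel. The sketch below only mimics that brute-force search in ordinary Python on a made-up graph; the graph and the function name are illustrative, not Adleman's original seven-city instance.

```python
from itertools import permutations

# A small directed graph given as a set of edges (made-up example, not Adleman's instance).
edges = {("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("B", "D")}
nodes = {"A", "B", "C", "D"}

def hamiltonian_paths(start, end):
    """Enumerate every ordering of the remaining nodes, keeping those that follow edges.

    DNA computing does the analogous thing massively in parallel: strands encoding
    all candidate paths are generated at once, and wet-lab filtering steps discard
    those that are not valid paths from start to end.
    """
    middle = nodes - {start, end}
    for ordering in permutations(middle):
        path = (start, *ordering, end)
        if all((a, b) in edges for a, b in zip(path, path[1:])):
            yield path

print(list(hamiltonian_paths("A", "D")))  # -> [('A', 'B', 'C', 'D')]
```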

Recently Motorola Inc, Packard Instruments Co., Argonne National Lab, USA and W.A. Englehardt Institute of Molecular Biology, Russia have jointly launched an ambitious project for the mass production of biochips composed of multiple layers of DNA. This innovative biochip is likely to revolutionise the technology of gene sequencing. It is expected to have numerous uses in the fields of medical diagnostics, faster drug delivery and improved agricultural practices.

But before biological computers become commercially available, a few problems need to be taken care of. It has been seen that when biochips are used in a computer, supplying energy to them becomes a nagging problem. For the optimal utilisation of a biochip in association with enzymes and proteins, ATP (adenosine triphosphate) has been found to be vital. So researchers are facing the problem of how to build ATP molecules.

Biological computers could ultimately help close the gap between processing speed and storage capacity. The biochips used in biological computers will be capable of working in tandem with electronic circuits and performing tasks beyond the capability of the present genre of computers. These computers are expected to affect our lives in a big way over the next decade.

 

Spacecraft that think

There’s nothing worse than a satellite that can’t make decisions. Rather than organising data, it simply spews out everything it collects, swamping scientists with huge amounts of information. It’s like getting a newspaper with no headlines or section pages in which all the stories are strung together end-to-end.

Researchers at the University of Arizona (UA), Arizona State University (ASU) and the Jet Propulsion Laboratory (JPL) are working to solve this problem by developing machine-learning and pattern-recognition software. This smart software can be used on all kinds of spacecraft, including orbiters, landers and rovers.

Scientists are currently developing this kind of software for NASA’s EO-1 satellite. The smart software allows the satellite to organise data so that it sends back the most timely news first, while holding back less timely data for later transmission.
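In spirit, this kind of onboard prioritisation is just a priority queue keyed on how time-critical each observation is. The sketch below is a generic illustration of that idea, not the actual flight software; the product names and urgency scores are assumptions.

```python
import heapq

# Each data product gets an urgency score; the most urgent is transmitted first.
# Purely illustrative - the real onboard data model is not shown here.
downlink_queue = []

def queue_product(name, urgency):
    """Add an observation to the downlink queue (urgency negated to get a max-heap)."""
    heapq.heappush(downlink_queue, (-urgency, name))

queue_product("flood_detection_alert", urgency=9)
queue_product("routine_surface_image", urgency=2)
queue_product("volcanic_hotspot_alert", urgency=8)

while downlink_queue:
    _, name = heapq.heappop(downlink_queue)
    print("transmitting:", name)
# The alerts go down first; the routine image is held back until last.
```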

Although the project, called the Autonomous Sciencecraft Experiment (ASE), is still in the test and development stage, software created by UA hydrologists has already detected flooding on Australia’s Diamantina River.

“We had ordered some images from the satellite to test our software in the lab,” said Felipe Ip, a Ph.D. student in UA’s Hydrology and Water Resources (HWR) Department. “We didn’t know the Diamantina River was flooding, but when we started running the images through our software, it told us, ‘Hey, we’ve got a flood here.’ We were delighted because that’s just what it’s supposed to do.”
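A crude way to see how software can flag a flood in raw imagery is to threshold a water-sensitive index and compare the flooded area against a baseline scene. The snippet below is a deliberately simplified, hypothetical sketch of that idea, not the UA hydrologists' actual classifier; the threshold values are illustrative guesses.

```python
import numpy as np

def water_fraction(scene: np.ndarray, threshold: float = 0.3) -> float:
    """Fraction of pixels classified as water in a normalised water-index image.

    `scene` is assumed to hold a per-pixel water index in [0, 1];
    the threshold is an illustrative guess, not a calibrated value.
    """
    return float((scene > threshold).mean())

def flood_detected(baseline: np.ndarray, current: np.ndarray, rise: float = 0.10) -> bool:
    """Flag a flood when the watered area grows noticeably relative to the baseline."""
    return water_fraction(current) - water_fraction(baseline) > rise

# Synthetic example: the 'current' scene has far more high-index (wet) pixels.
rng = np.random.default_rng(0)
baseline = rng.uniform(0.0, 0.4, size=(100, 100))
current = rng.uniform(0.2, 0.9, size=(100, 100))
print(flood_detected(baseline, current))  # True
```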

 

No incisions

Surgeries performed with specialised medical devices requiring only small incisions, called laparoscopic surgery, have many advantages over traditional open surgery, including less pain, fewer complications and quicker recoveries. Now, scientists at Johns Hopkins have created a new surgical technique that has proved safe in extensive animal studies and may extend the benefits of minimally invasive surgery even further by leaving the abdominal wall intact.

The new procedure, called flexible transgastric peritoneoscopy, or FTP, is performed by inserting a flexible mini-telescope, called an endoscope, and related surgical tools, through the mouth and into the stomach. After puncturing the stomach wall and the thin membrane surrounding the stomach — called the peritoneum, which also lines the inside of the abdominal and pelvic cavities — the doctors can see and repair any of the abdominal organs, such as the intestines, liver, pancreas, gallbladder and uterus.

FTP may dramatically change the way we practise surgery. The technique is less invasive than even laparoscopy, because we do not have to cut through the skin and muscle of the abdomen as in existing surgery.

UNDERSTANDING THE UNIVERSE
WITH PROF YASH PAL

Why does Venus transit occur after 121 years and again after eight years? It seems a bit strange to have such a large difference between these two intervals.

Yes, this does look strange at first sight, but it is well understood after a little thought. Venus comes between the sun and the earth quite often, but a transit occurs only if this alignment is rather accurate. We require that during the conjunction Venus should be crossing the ecliptic - the plane in which the earth revolves around the sun. This would be no problem if the orbital planes of Venus and the earth were the same, because in that case every conjunction would lead to a transit. It turns out that the orbital plane of Venus is tilted with respect to that of the earth by about 3°. The long period between transits results from the fact that Venus crosses the ecliptic at the moment of conjunction only after such a long interval. This is not random; using observations and orbital mechanics it can be accurately calculated. But then why a repeat after just eight years?

The required coincidence of a conjunction with the ecliptic crossing of Venus has some latitude because the solar disc is so much bigger than the image of Venus. Eight years after a transit, Venus would have completed almost exactly 13 full revolutions around the sun, so the sun, the earth and Venus would be almost in the same relative positions. This follows from the fact that 8 × 365.25 ≈ 2922 days, while 13 × 224.7 ≈ 2921 days (224.7 days being the orbital period of Venus). The condition for a transit would again be fulfilled, except that the path of Venus across the solar disc would be shifted slightly. This would not be repeated after further periods of eight years, because by then Venus would pass above or below the solar disc.
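The near-resonance is easy to verify with a couple of lines of arithmetic, using round figures of about 365.25 days for the earth's year and 224.7 days for the orbital period of Venus:

```python
EARTH_YEAR = 365.25   # days, approximate
VENUS_YEAR = 224.70   # days, approximate orbital period of Venus

eight_earth_years = 8 * EARTH_YEAR        # 2922.0 days
thirteen_venus_orbits = 13 * VENUS_YEAR   # 2921.1 days

print(eight_earth_years, thirteen_venus_orbits)
print("mismatch:", eight_earth_years - thirteen_venus_orbits, "days")
# The two periods differ by less than a day, so after eight years the sun,
# the earth and Venus return almost exactly to the same relative positions.
```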

What is the actual reason at the atomic level that decides the transparency and colour of a medium?

Glass is transparent to visible light. It is not so transparent in the infrared, or at other frequencies such as x-rays. Our atmosphere is conveniently transparent in the visible region of the spectrum, where the sun gives out most of its radiation and where our eyes are most sensitive. But the very same atmosphere is opaque at most frequencies in the far infrared, ultraviolet, x-rays and gamma rays. Therefore, as you suspect, transparency, or the lack of it, depends on the detailed structure of the material. Atomic and molecular structure enter centrally in determining the scattering and absorption of radiation. Atoms and molecules have specific energy levels, and these determine which parts of the spectrum are absorbed and which pass through unhindered. Indeed, in most of experimental science (atomic physics, molecular physics, much of chemistry and biology, astronomy and even nuclear physics) we use different radiations to probe structures at the basic level. The experiments are interpreted with the help of known theory and, depending on the results, new theory often emerges.
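One way to see why the same material treats different radiations so differently is to compare photon energies with the spacing of the available energy levels: a photon is absorbed only when its energy matches an allowed transition. The short calculation below, using standard constants and round wavelengths, simply computes the photon energy E = hc/λ for a few parts of the spectrum.

```python
# Photon energy E = h*c/wavelength, expressed in electron-volts.
H = 6.626e-34        # Planck constant, J*s
C = 2.998e8          # speed of light, m/s
EV = 1.602e-19       # joules per electron-volt

for label, wavelength_nm in [("ultraviolet", 250), ("visible (green)", 550), ("infrared", 10000)]:
    energy_ev = H * C / (wavelength_nm * 1e-9) / EV
    print(f"{label:16s} {wavelength_nm:6d} nm  ->  {energy_ev:5.2f} eV")

# A material absorbs strongly where these energies match its electronic or
# vibrational level spacings, and is transparent where no such match exists.
```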

The question you have asked is basic to much of modern-day science. An example from the current concern about depletion of the ozone layer might illuminate the discussion given above. A great deal of ultraviolet would get down to the earth if there were no ozone, because the atmosphere does not have enough of other molecules able to absorb this radiation. Ozone molecules, which are formed in the upper atmosphere, are structures of three oxygen atoms bound relatively loosely to one another. The energy levels of these molecules are such that they can absorb the ultraviolet and break up into an oxygen molecule and an oxygen atom. Ozone molecules are destroyed, but in dying they manage to save us from harmful ultraviolet. Of course, we need continuous replenishment of ozone to compensate for the warriors that perish in trying to save us!

When light enters from one medium to another, its speed changes. Between its wavelength and frequency, which one also changes?

Light, like other forms of electromagnetic radiation, is an oscillating electromagnetic disturbance that propagates in a direction normal to the direction of oscillation. Wavelength is the distance this disturbance travels during one oscillation. Therefore, it is given by velocity of propagation divided by the frequency of oscillation. In going from one medium to another it is the velocity of propagation that changes. This would affect the distance the disturbance travels in one cycle of oscillation, which is the wavelength. From this it would seem that frequency is the basic property and wavelength is a derivative quantity. Frequency is determined by the characteristics of the source.
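A quick worked example: taking a refractive index of about 1.5 for ordinary glass (an illustrative round value), the speed and wavelength drop by that factor while the frequency stays fixed.

```python
C = 2.998e8          # speed of light in vacuum, m/s
N_GLASS = 1.5        # illustrative refractive index for ordinary glass

wavelength_vacuum = 550e-9                  # green light, metres
frequency = C / wavelength_vacuum           # ~5.45e14 Hz, fixed by the source

speed_in_glass = C / N_GLASS                        # propagation slows down
wavelength_in_glass = speed_in_glass / frequency    # = wavelength_vacuum / N_GLASS

print(f"frequency: {frequency:.3e} Hz (unchanged)")
print(f"wavelength: {wavelength_vacuum*1e9:.0f} nm in vacuum -> {wavelength_in_glass*1e9:.0f} nm in glass")
```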

What would be the effect of the Red Giant phase of the sun on Venus and earth and their revolution periods?

You are talking of a time several billion years from now. We would be incinerated, that is if any life is still here. Other perturbations would have occurred by then, and it would be fruitless to talk about the relative orbital periods of the earth and Venus.

Earth rotates from west to east; Venus spins in the opposite direction. Why is it so?

Yes, the sun rises in the west on Venus. In addition, the day on Venus is longer than its year! All this is evidently due to the specific history of the formation of Venus, or to collisions it might have suffered. The point is that we do not know the exact reason for the difference.

 


New products & discoveries

Windswept Titan

This artist’s conception shows Titan’s surface with Saturn appearing dimly in the background through Titan’s thick atmosphere of mostly nitrogen and methane. The Cassini spacecraft flies overhead with its high-gain antenna pointed at the Huygens probe as it nears the surface. (Credit: NASA/JPL - Caltech)

Atop the windswept summit of a Hawaiian volcano, a NASA instrument attached to the Japanese Subaru telescope has measured distant winds raging on a strange world — Titan, the giant moon of Saturn — to help the robotic Huygens probe as it descends through Titan’s murky atmosphere next January.

When combined with previous observations, new research with the Heterodyne Instrument for Planetary Wind And Composition (HIPWAC) joined to the large aperture of the Subaru telescope supports the model that Titan has currents or jet streams at high latitudes racing through its upper atmosphere (stratosphere) at speeds of approximately 756 km/hour (470 miles/hr.).

The new observations reveal that the wind travels in the same direction as Titan’s rotation, and that the stratospheric winds are milder (about 425 km/hr. or 264 miles/hr.) near the equatorial regions, as the jet stream model predicts.

Humans as earth movers

Think of large earth moving projects: highway interchanges, coal mines or Boston’s Big Dig. According to Roger LeBaron Hooke, a University of Maine scientist, such activities have propelled humans into becoming arguably the most potent force in shaping the planet, surpassing rivers, wind and other natural phenomena.

In the early 1990s, a newspaper report on the annual number of housing starts in the United States led Hooke to wonder just how much earth was being displaced by human activity.

He gathered data on residential subdivisions, road construction and mining. His goal was to estimate the amount of soil and rock that humans move from one location to another through activities akin to the forces of nature that he also studied.

In 1994, Hooke published the results in a paper in GSA Today, a journal of the Geological Society of America. He estimated that on a worldwide basis, humans move more of the planet around, about 45 gigatons (billion tons) annually, than do rivers, glaciers, oceans or wind.