More and more modern, practical products are being developed, and among the latest inventions are electronic senses.
Touch
Professor Victor Zue (Photo: NLĐ)
Hiroshi Ishii, a professor at the Massachusetts Institute of Technology (MIT), caused quite a stir when he announced a program that lets one person touch another through… a computer. In other words, through a computer we can still feel the smooth skin of a baby hundreds or even thousands of kilometers away! Professor Ishii's invention, called "inTouch," is based on haptic feedback technology: using compatible devices, two people in distant locations can "touch" each other while communicating.

Also at MIT, in 1993, scientists Kenneth Salisbury and Thomas Massie of the Artificial Intelligence Laboratory invented the PHANToM Haptic Interface, which allows users to "feel" information inside a computer. To use it, one slips a finger into a fingertip cap (much like the thimble used to avoid pricking one's finger while sewing). Every movement of the finger appears on screen inside a virtual space filled with "blocks" of various shapes (squares, triangles, and so on). When the finger enters this space and touches, say, the tip of a triangle, the user immediately feels its sharp contours. The PHANToM Haptic Interface is currently used at the University of Pennsylvania to let surgical students practice.
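How a device of this kind can make a virtual shape feel solid can be illustrated with a simple penalty-force loop: when the tracked fingertip penetrates an object, a restoring force proportional to the penetration depth is sent back to the device's motors. The Python sketch below is only a minimal illustration of that idea; the sphere, the stiffness value, and the function names are assumptions made for the example, not PHANToM's actual software.

```python
import numpy as np

def contact_force(finger_pos, center=np.zeros(3), radius=0.05, stiffness=800.0):
    """Penalty (spring) force pushing the fingertip out of a virtual sphere.

    A common way to render a 'solid' shape haptically: when the tracked
    fingertip penetrates the surface, a restoring force proportional to the
    penetration depth is fed back to the device.
    """
    offset = finger_pos - center
    dist = np.linalg.norm(offset)
    penetration = radius - dist
    if penetration <= 0.0 or dist == 0.0:
        return np.zeros(3)              # outside the shape: no force, free motion
    normal = offset / dist              # direction pointing out of the surface
    return stiffness * penetration * normal  # Hooke-style restoring force

# Example: a fingertip 1 cm inside the sphere along +x feels a strong outward push.
print(contact_force(np.array([0.04, 0.0, 0.0])))
```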
Smell
Medieval physicians knew how to diagnose patients by smelling their breath (some diseases, such as liver disease, give the breath a foul odor). Today, Dr. George Dodd (Scotland) uses a similar method, except that his nose is an electronic one. A few years ago, an electronic olfactory system was developed, consisting of extremely sensitive electrochemical sensors connected to a powerful computer and used to detect traces of explosives or blood alcohol levels. Building on this principle, Dodd created electronic chips and "trained" them to recognize specific odors. His smell-detection device is small enough to attach to a telephone receiver connected directly to the computer: when patients want to check on their health, they simply speak into the receiver and the computer analyzes whether their condition has improved or worsened. Dr. Dodd hopes that in about five years every clinic will have an electronic nose, and that individuals will even be able to carry a nose the size of a credit card to monitor their health regularly.
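The article does not describe how Dodd's chips are "trained," but the general principle of an electronic nose can be sketched simply: each odor leaves a characteristic response pattern across the sensor array, and an unknown sample is matched to the closest stored pattern. The sensor values and odor labels below are purely hypothetical.

```python
import numpy as np

# Hypothetical response vectors from a small array of electrochemical sensors.
# Each odor leaves a characteristic "fingerprint" across the array; training
# amounts to storing (or fitting) such reference patterns.
reference_odors = {
    "healthy breath": np.array([0.10, 0.05, 0.08, 0.02]),
    "acetone-like odor": np.array([0.70, 0.10, 0.05, 0.30]),
    "sulfurous odor": np.array([0.15, 0.65, 0.40, 0.10]),
}

def classify_sample(sensor_reading):
    """Return the stored odor pattern closest to the measured one."""
    return min(reference_odors,
               key=lambda name: np.linalg.norm(sensor_reading - reference_odors[name]))

# A new breath sample measured by the sensor array:
sample = np.array([0.14, 0.60, 0.35, 0.12])
print(classify_sample(sample))   # -> "sulfurous odor"
```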
Taste
Like smell, taste is among the most complex of the senses. Pitted like the surface of the moon, the tongue carries about 8,000 to 10,000 taste buds, each containing roughly 50 to 75 chemical receptors. These receptors have an extremely short lifespan: they die after about 10 days and are replaced by new generations. Starting from this basic biology, Professor Robert Bradley of the University of Michigan (USA) invented an artificial tongue that recognizes flavors almost as well as a real one. The electronic tongue is built on a silicon disk 4 mm in diameter, pierced with many tiny holes, each connected to a computer; a taste neuron is then extracted and implanted onto the disk to complete the device. Bradley's goal is to use the electronic tongue to uncover the mechanisms of flavor recognition and the brain's role in identifying flavors (for example, how the brain tells salt from sugar).
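As a rough illustration of how a computer connected to such a disk might tell salt from sugar, the sketch below compares which channels respond above a threshold against stored activation patterns. The channel names, firing rates, and patterns are invented for the example and are not Bradley's actual data.

```python
# Hypothetical firing patterns: salt and sugar excite overlapping but
# distinguishable subsets of receptor channels on the disk.
SALT_PATTERN  = {"ch1": 1, "ch2": 0, "ch3": 1, "ch4": 0}
SUGAR_PATTERN = {"ch1": 0, "ch2": 1, "ch3": 0, "ch4": 1}

def active_channels(firing_rates, threshold=20.0):
    """Mark a channel as 'responding' if it fires above the threshold (spikes/s)."""
    return {ch: int(rate > threshold) for ch, rate in firing_rates.items()}

def identify(firing_rates):
    """Pick the stored pattern that agrees with the measured one on most channels."""
    pattern = active_channels(firing_rates)
    matches = {
        "salt": sum(pattern[c] == SALT_PATTERN[c] for c in pattern),
        "sugar": sum(pattern[c] == SUGAR_PATTERN[c] for c in pattern),
    }
    return max(matches, key=matches.get)

print(identify({"ch1": 45.0, "ch2": 5.0, "ch3": 38.0, "ch4": 8.0}))  # -> "salt"
```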
Hearing
Professor Victor Zue of the Laboratory for Computer Science at MIT has invented a machine that can recognize human speech. Sit down in front of Jupiter (Zue's machine) and ask about the weather in Paris or New York, and Jupiter will answer accurately for each location; ask about the humidity in Tokyo or the weekend forecast in Rio, and it can answer those too. With a vocabulary of about 1,500 weather-related terms, however, Jupiter can only discuss matters of rain and sunshine. The system comprises four software components: the first recognizes the speech and converts it into a "word hypothesis" by weighing the possible linguistic structures; the second determines the meaning of that hypothesis. Jupiter then scans reports from the United States National Weather Service on the Internet to find the requested information and responds.
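The article only names the stages, so the sketch below shows just the shape of such a pipeline (recognize, interpret, fetch, respond) with stand-in function bodies and canned data; Jupiter's real recognizer, parser, and response generator are of course far more sophisticated.

```python
# The rough shape of a Jupiter-style pipeline: recognize -> interpret ->
# fetch data -> respond. All function bodies and data here are placeholders.

CANNED_FORECASTS = {("paris", "today"): "light rain, 14 C"}   # stand-in data

def recognize(audio):
    """Stage 1: turn audio into a word hypothesis (here, a fixed string)."""
    return "what is the weather in paris today"

def interpret(word_hypothesis):
    """Stage 2: extract the meaning of the hypothesis: which city, which day."""
    words = word_hypothesis.split()
    city = "paris" if "paris" in words else "unknown"
    day = "today"
    return {"city": city, "day": day}

def fetch_forecast(query):
    """Stage 3: look up the requested report (the real system scans NWS reports)."""
    return CANNED_FORECASTS.get((query["city"], query["day"]), "no data")

def respond(query, forecast):
    """Stage 4: phrase the answer for the user."""
    return f"In {query['city'].title()} {query['day']}: {forecast}."

query = interpret(recognize(audio=None))
print(respond(query, fetch_forecast(query)))
```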
Vision
Imagine this scenario: a person walking in an unfamiliar city does not know how to find the restaurant where they are to meet a friend. After typing a few commands into a small computer worn at the waist, they see a city map displayed not on a screen in front of them but in mid-air (!), along with directions for the shortest route. They find the right street but are not sure which restaurant to head for, so they lightly touch their glasses, and the restaurant they are looking for suddenly appears before their eyes. They walk straight to the spot where their friend is waiting… What no one would guess is that the person in this story is blind!

This is the vision that scientist Tom Furness hopes to realize in the near future. As director of the Human Interface Technology Laboratory at the University of Washington, Furness has spent the past 30 years researching a "visual machine." The result is the Virtual Retinal Display (VRD), a tool that can "draw" images directly onto the retina. The VRD consists of a pair of glasses connected to a small computer that looks like a briefcase. To provide images for blind users of the device, numerous images are loaded into the computer ahead of time; to see the route to the restaurant, the user only has to issue a simple command to the machine.
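A minimal sketch of the command-driven lookup described here, assuming the preloaded images are simply keyed by short text commands; the commands, file names, and the show_on_retina stand-in are hypothetical, not part of the actual VRD software.

```python
# Hypothetical command-to-image lookup for a VRD-style display: route maps are
# prepared ahead of time and the wearer recalls one with a short command.
preloaded_images = {
    "map downtown": "downtown_street_map.png",
    "route restaurant": "route_to_restaurant.png",
}

def show_on_retina(image_file):
    """Stand-in for the VRD hardware that scans the image onto the retina."""
    print(f"[VRD] drawing {image_file} onto the retina")

def handle_command(command):
    image = preloaded_images.get(command)
    if image is None:
        print("[VRD] no image prepared for that command")
    else:
        show_on_retina(image)

handle_command("route restaurant")
```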