HISTORY
Robots were first developed by the National Aeronautics and Space
Administration (NASA) for use in space exploration.[2]
These devices, or telemanipulators, were capable of performing manual tasks aboard a
spacecraft or out in space. The slave devices were
controlled electronically from a remote master control station
on Earth or aboard the spacecraft. Telemanipulators were used extensively aboard NASA's
Space Shuttle missions between 1983 and 1997. Research in trajectory and missile
guidance systems eventually led to highly precise targeting mechanisms. Precision
pointing at targets, such as the Earth and stars, was crucial for Spacelab telescope
experiments. Telemanipulators such as the Instrument Pointing System (IPS) were
specifically designed for extreme accuracy (±1.2 arcsec).[3]
Scientists at NASA Ames Research Center were responsible for developing virtual
reality. The idea took root with the contributions of VPL Research and its
DataGlove.[4]
Together, these made
it possible to interact with three-dimensional virtual scenes. However, it took
the combination of robotic engineering and virtual reality to develop a dexterous
telemanipulator for the anastomoses of nerves and vessels in hand surgery.[5]
From these applications, it became apparent to the U.S. Department
of Defense that virtual reality and telepresence might serve a useful function in
treating wartime casualties on the battlefield. Through virtual reality, the surgeon
could be brought to the patient's side, an idea described by the term telepresence.
Data from wounded casualties of the Vietnam War indicated that, of all wounded soldiers,
one third died of head and other massive injuries, and another third died of exsanguinating
hemorrhage yet could have survived had they been treated in time.[2]
Because that latter third of casualties could potentially be saved, the Department
of Defense sought to improve medical presence on the battlefield. Telepresence
would allow a surgeon aboard an aircraft carrier to perform surgery (with the aid
of telemanipulation) on wounded soldiers at a remote battlefield location. With this idea
in mind, the Department of Defense funded much of the research in telemanipulation
for remote mobile surgical units that would allow for telepresence.
Engineers realized that the distance between patient and surgeon
had an upper limit, beyond which the accuracy and dexterity of instrument control would
degrade. Latency is the time that elapses
between a hand motion and the visualization of that
motion on a remote screen. The round-trip lag for an electrical signal relayed through
a geosynchronous satellite 22,300 miles above the Earth is 1.2 seconds. This transmission
delay would prohibit practical surgery. Humans can compensate for delays of less
than 200 msec. Longer delays compromise surgical accuracy. Tissue moves when force
is applied to it, and with a visual delay greater than 200 msec, the movement would
not be noticed quickly enough to avoid cutting in an unintended place. The most ambitious
attempt to provide telesurgical presence over long distances was undertaken using
high-bandwidth fiberoptic ground cable. The latency time of 155 msec allowed Marescaux
and Gagner[6][7]
to perform a robot-assisted laparoscopic cholecystectomy between New York City and
Strasbourg, France.
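As a rough, illustrative check on these latency figures (the round numbers below are assumptions, not values from the cited studies), the minimum signal propagation delay for a single geosynchronous relay can be computed from the speed of light; the cited 1.2-second figure presumably also includes ground routing and processing overhead:

    # Back-of-the-envelope sketch (Python), assuming round numbers:
    # minimum round-trip propagation delay for telesurgery via one
    # geosynchronous satellite.

    ALTITUDE_MILES = 22_300        # geosynchronous altitude quoted above
    SPEED_OF_LIGHT_MPS = 186_282   # miles per second, approximate

    # A command travels surgeon -> satellite -> patient (up and down),
    # and the video image returns patient -> satellite -> surgeon.
    one_way_hop_s = 2 * ALTITUDE_MILES / SPEED_OF_LIGHT_MPS   # ~0.24 s
    round_trip_s = 2 * one_way_hop_s                          # ~0.48 s

    print(f"minimum round-trip propagation delay: {round_trip_s:.2f} s")

Even this best-case propagation delay is well above the 200-msec threshold for human compensation, which is why a ground-cable route was pursued instead.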
Philippe Mouret[8] performed
the first video-laparoscopic cholecystectomy in Lyons, France, in 1987, but it was
not until Perissat[9] presented the innovation to
the Society of American Gastrointestinal Endoscopic Surgeons in 1988 that an exponential
spread of laparoscopic surgical procedures began. Although laparoscopic surgery
provided a great benefit for the patient, it brought tremendous surgical limitations,
such as loss of three-dimensional vision, impaired touch sensation, and the poor dexterity
imposed by the long instruments and the fulcrum effect. The fulcrum
effect is the nonintuitive motion of the instrument tip in the direction opposite
to the surgeon's hand, about a fixed pivot point, usually at the skin entrance site.
New skills had to be learned.
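To make the fulcrum effect concrete, here is a minimal sketch (hypothetical lengths and a small-angle approximation, not taken from the cited sources) of how a rigid instrument pivoting at the trocar reverses and rescales the surgeon's hand motion:

    # Minimal sketch of the fulcrum effect for a rigid laparoscopic
    # instrument pivoting at the skin entrance site. Lengths are hypothetical.

    def tip_displacement(hand_dx_cm: float, outside_len_cm: float,
                         inside_len_cm: float) -> float:
        """Lateral tip motion for a lateral hand motion about a fixed pivot
        (small-angle approximation). The sign flip captures the nonintuitive
        reversal; the magnitude is scaled by the lever-arm ratio."""
        return -hand_dx_cm * (inside_len_cm / outside_len_cm)

    # Example: 30 cm of instrument shaft outside the body, 10 cm inside.
    print(tip_displacement(hand_dx_cm=3.0, outside_len_cm=30.0, inside_len_cm=10.0))
    # -> -1.0: a 3-cm hand motion to one side yields a 1-cm tip motion
    #    to the opposite side.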
Initial attempts to surmount the burdens of endoscopic surgery provided the
impetus for robotic support systems that could enhance surgical skills and instrument
control. The first such system in the medical field was applied to camera guidance
in the surgical field.
In 1994, the U.S. Food and Drug Administration (FDA) approved
the first Automated Endoscopic System for Optimal Positioning (AESOP)[10]
arm to be used in laparoscopic surgery. The device is controlled through voice activation
to provide a flexible view of the surgical field. Around the same time, the TISKA
Endoarm became available; it could guide a camera, held in position by electromagnetic
friction, and could also work as a tissue retractor.[11]
While foot pedals were being replaced by voice-activated systems, other manufacturers
were designing cameras that moved in synchrony with the movements of the surgeon's
head.[12]
Other devices provided finger "joysticks"
that could be used to control the camera field.[13]
To combat dexterity problems, the master-slave telemanipulator
concept was adapted for medical use in the early 1990s. The first such
manipulator was developed at Stanford Research Institute. The goal
was to have computer algorithms translate the surgeon's manual movements at a
master console into movements of slave end-effector instruments at a remote site.
The robotic slave arms mimic
the natural movements of the surgeon's hand. Early designs had only 4 degrees of
freedom, but by 1992, a German prototype was developed with 6 degrees of freedom
(Fig. 66-1).[14]
It was used experimentally but never achieved clinical application.[15]
In 1994, Intuitive Surgical obtained technologic rights and eventually developed
robotic instruments with 6 degrees of freedom.
Robots can be preprogrammed with limits set by the operator and
run autonomously, or their kinematics can be defined online, in real time,
when immediate human intervention and decisions are required. The design
of surgical robots must include sterility barriers and enhanced patient safety features;
it must meet operating room constraints, be compatible with imaging equipment,
and incorporate special ergonomic features.
To overcome the handicaps of endoscopic surgery, engineering technology
has produced three-dimensional video imaging, robotic camera holders, and flexible
robotic effector instruments capable of tactile pressure sensation. Unfortunately,
every instrument has different stress feedback characteristics, and the surgeon's
ability to "feel" the elastic properties of tissue is not yet fully developed.
The robotic fingers can be made smaller than those of the human hand to help reach
confined spaces. The robot can filter the surgeon's hand tremor and scale the movements
of the instruments to the level of high precision and stability required
for microsurgery. Best of all, repetitive robotic motions and tasks
are not prone to fatigue.
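The tremor-filtering and motion-scaling principle can be sketched in a few lines; the simplified example below (illustrative parameters only, not any commercial system's algorithm) scales hand displacements down and suppresses high-frequency tremor with an exponential low-pass filter:

    import math

    # Illustrative master-slave motion pipeline: scale hand motion down for
    # microsurgical precision and low-pass filter it to attenuate tremor.
    # SCALE and ALPHA are hypothetical tuning values.

    SCALE = 0.2   # 5:1 motion scaling: 5 mm of hand travel -> 1 mm of tip travel
    ALPHA = 0.1   # smoothing factor of the exponential low-pass filter (0..1)

    def filtered_positions(hand_positions):
        """Yield scaled, tremor-filtered instrument-tip positions (mm)."""
        smoothed = None
        for x in hand_positions:
            # Exponential moving average: follows slow, deliberate movements
            # but attenuates rapid tremor oscillations.
            smoothed = x if smoothed is None else ALPHA * x + (1 - ALPHA) * smoothed
            yield SCALE * smoothed

    # Example: a slow 10-mm reach with superimposed +/-0.5-mm tremor.
    hand = [0.1 * i + 0.5 * math.sin(2.0 * i) for i in range(100)]
    tip = list(filtered_positions(hand))
    print(f"final tip position: {tip[-1]:.2f} mm")  # ~1.8 mm: 10 mm x 0.2, minus filter lag

The design trade-off is visible in the parameters: a smaller ALPHA suppresses tremor more aggressively but makes the instrument lag further behind deliberate motion.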