Sunday, March 28, 2010
The History & Future of Medical Technology, Chapter 10
This post is the tenth in a series based on my soon-to-be published book, The History & Future of Medical Technology. Each week I’ll present highlights from one of thirteen chapters.
Repair or Replace? Part II

What can be done when the heart becomes sick or damaged? Sometimes a ventricular assist device can be used to reduce the heart's workload until it recovers normal function. Both ventricular assist devices and total artificial hearts can be used to buy time until a suitable donor heart can be found for transplantation.

Heart transplants offer the best hope for long-term survival and a return to normal activities. Or even extreme activities: Kelly Perkins became famous as a mountain climber after receiving a heart transplant. Today, more than two-thirds of heart transplant recipients can expect to survive at least five years. According to some observers, the average heart transplant recipient can expect to live 15 years. The longest-surviving heart transplant recipient was Tony Huesman, who lived 31 years before succumbing to cancer.

The first heart transplant was attempted in 1964, when Dr. James D. Hardy at the University of Mississippi Medical Center implanted the heart of a chimpanzee in the chest of a dying man. The primate heart beat for about 90 minutes before stopping. The highly publicized procedure discouraged further xenografts (transplants between species) but spurred interest in human heart transplants. Christiaan Barnard at Groote Schuur Hospital in Cape Town, South Africa performed the first human heart transplant in late 1967. A 54-year-old grocer, Louis Washkansky, survived 18 days before succumbing to pneumonia. Less well known is the fact that three of Barnard's heart transplant patients lived more than 20 years. Barnard also devised and performed the first heterotopic heart transplant, an operation in which the recipient's sick heart is left in place and the donor heart is connected to it, producing a sort of "double heart." The advantage of the heterotopic operation is that it gives the recipient's own heart a chance to recuperate and, theoretically, makes it easier to replace a failed donor heart.
This chapter also traces the development of kidney dialysis machines, artificial hearts, ventricular assist devices, artificial joints, and brain-computer interface chips. Though results from cochlear implants have been mixed, we are clearly on the right path, and artificial vision, whether using artificial retinas or sensory substitution, is just a matter of time.

But there is more good news. Imagine a material that can be fashioned into almost any shape, is highly biocompatible, and stimulates the body to grow replacement natural tissue. Such a material would be a boon to repairing the human body. A material with those qualities was discovered by biomedical engineer Leslie A. Geddes, a professor at Purdue University who passed away in late 2009. Among the honors Geddes received for his many accomplishments were the IEEE Edison Medal in 1994 and the 2006 National Medal of Technology. I am deeply indebted to Professor Geddes, who encouraged me to call him when I had questions, and whose many articles and patents proved a treasure trove of information.

Like many great discoveries, this one was serendipitous. In 1983, Geddes and undergraduate engineering student Michael Voelz tried oxygenating blood using the small intestine of a dog. It worked, but not very well. However, the experiments gave Geddes another idea. He recalled that blood from an ulcerated small intestine does not clot. He also knew that animals and humans have a good deal of small intestine material. Might the small intestine be a source of material for vascular grafts? Geddes obtained a $50,000 grant from the Showalter Trust and assembled a team to pursue the research. After a false start, they scraped the small intestine, leaving only non-cellular material called small intestine submucosa (SIS). They found that because SIS contains no cells, it does not provoke an immune response, even when used in different species.
Next, the team implanted pig-derived SIS grafts in the carotid arteries of a number of dogs. Later, no evidence of SIS could be found at the implantation sites. Tests suggested the SIS had turned into host tissue. SIS has since been used in a wide range of applications, from bioartificial heart valves to hernia repair to treating wounds. It has also been used as a dura mater substitute for covering the brain and to augment bladder volume; in the latter application, the remodeled SIS becomes innervated and contracts well.

The human body is a complex system, and there are many challenges to repairing, assisting, and replacing body parts. Though our current capabilities are quite primitive, progress has been made in areas that once seemed unlikely. Who would have thought, for example, that material from a specific organ could be transplanted between species and stimulate growth of replacement host tissue in the process?

Next time: The Vision Thing

Note: If you would like to be notified when The History & Future of Medical Technology is published, please go to Telescope Books and enter your email address in the newsletter sign-up field on the left menu bar. This email list is only used to announce book offers from Telescope Books; your email address will not be shared with third parties.
Posted by Ira Brodsky in The History & Future of Medical Technology at 11:33
Friday, March 26, 2010
The History and Future of Wireless, Chapter 14
This post is the last in a series based on my book, The History of Wireless: How Creative Minds Produced Technology for the Masses, published in 2008.
How to be a Technology Innovator

The wireless industry boasts over a century of experience developing successful products and services; today there are more than 4 billion mobile phone users. Are there lessons technology innovators can learn from the history of wireless? I'm glad you asked.

Identify and exploit holes or weaknesses in prevailing solutions

As the great inventor Edwin H. Armstrong said, "It ain't ignorance that causes all the trouble in this world. It's the things people know that ain't so." In the early days of radio, engineers wondered whether there was a way to reduce or even eliminate the static that invariably accompanied broadcasts. One idea that emerged was to use frequency modulation (FM). But a respected Bell Laboratories engineer, John Renshaw Carson, threw cold water on the idea, saying "static, like the poor, will always be with us." When Carson tested FM he used a narrowband implementation to conserve bandwidth. It didn't work. Armstrong tried using a wider channel, and the results were stunning. You could hear a pin drop.

Be persistent

New technologies are rarely immediate hits. Most require a gestation period. Samuel Morse waited six years to build his first demonstration telegraph line. Wireless was mainly used in niche applications (such as maritime communications) for its first 25 years. Television, wireless LANs, cellular radio, and Bluetooth all required significant gestation periods. Some technologies succeed in specific military or industrial applications first. Others only seem to succeed when nearly everyone has given up on them.

Accept reality: the best technology does not always win

What the market wants is the right product, with the right features and packaging, at the right cost. It's all about value. In many cases, a technology that is just good enough wins.
The best standards build on successful proprietary solutions

Much of the industry will tell you that there has to be a common standard before customers will invest in products or services employing a new technology. This gets it exactly backwards, confusing politics with the wisdom of crowds. The best standards usually come from proprietary technologies. For every successful standard that was dreamt up by a committee, there are hundreds of committee standards that end up rotting in warehouses. In fact, many of today's standards aren't even necessary. Look at your PC. It's easier to just support a long list of formats than to hammer out one or two universal standards. Most committee standards are obsolete before the ink dries. With the "long list" approach, you can always just add another format that looks promising to the mix.

Consider the mobile phone as another example. I remember a conference at which a respected industry analyst warned that with four mobile phone standards and two frequency allocations in the U.S., anyone who traveled frequently would need to carry five or six different phones. It didn't occur to him at the time that a single phone could support multiple standards—in a small, inexpensive package. Most successful technology standards start as proprietary or ad hoc solutions that are then transformed into formal standards. Companies innovate. Committees work out compromises.

Timing is (almost) everything

If I had to boil technology innovation down to one thing, I would say it's recognizing opportunities and knowing when and how to act on them. I confess that sounds all-encompassing, but that's not how I meant it. My point is this: most people miss the opportunity, are too early, or overshoot the target.
Posted by Ira Brodsky in The History & Future of Wireless Technology at 10:53
Saturday, March 20, 2010
The History & Future of Medical Technology, Chapter 9
This post is the ninth in a series based on my soon-to-be published book, The History & Future of Medical Technology. Each week I’ll present highlights from one of thirteen chapters.
Repair or Replace? Part I

All tissues and organs eventually begin to fail. Sometimes they can be repaired. Sometimes it's possible to compensate for deteriorating function. But the time comes when the only way to preserve health and extend life is to replace worn-out body parts.

Some body parts, such as heart valves and knee joints, are eminently replaceable with synthetic equivalents. Others are so complex that transplantation seems the better bet. One organ, the brain, is so entwined with our sense of self and humanity that it defies replacement. Consider: if we could build an artificial brain and upload a stored version of the patient's personality and memories, would the recipient be the same person, or even human?

One thing is clear: replacement parts are already saving lives and improving the quality of lives. The benefits of a replacement heart valve are profound, and there are few, if any, ethical concerns. Still, we may want to place some limits on body part replacement. Not everything that is technically possible is desirable.

There are different types of replacement parts. There are direct, permanent replacement parts that take the place of the failed organ or tissue. There are indirect replacement parts that take over part or all of the function of the natural organ or tissue. And there are external devices that handle the function of a failed organ, sometimes while the patient waits for a transplant. Several organs can be transplanted from people who died from unrelated injuries. It may one day be possible to use whole organs harvested from other species (xenografts). We are already able to use acellular material (called small intestine submucosa) harvested from other species.

Blood was one of the first critical body parts to be replaced. (Teeth, discussed in Chapter 12, were another.) The discovery of blood groups, the Rhesus (Rh) factor, anticoagulants, and simple techniques for storage and transfusion enabled today's blood banks.
Let's hope they don't need a bailout.

I wrote previously about the seminal contributions of an unlikely trio: oil tycoon John D. Rockefeller, Nobel Prize-winning biologist Alexis Carrel, and American adventurer Charles Lindbergh. They laid the foundation for organ transplants and the heart-lung machine. John Heysham Gibbon, Jr. deserves the lion's share of credit for bringing the heart-lung machine to fruition. Gibbon came up with the idea independently, pursued it despite others' skepticism, and devoted nearly two decades of his life to heart-lung machine research and development. He started by experimenting on cats, progressed to dogs, teamed up with IBM Corp., and performed the first successful surgery using a heart-lung machine in 1953. The patient, a young woman named Cecelia Bavolek, reluctantly became a celebrity.

In addition to describing in detail how Gibbon developed the heart-lung machine, the 30-page chapter tells the story of the development of cardiovascular catheterization, initially for diagnostic purposes and eventually enabling procedures that competed with coronary artery bypass graft surgery. The story runs from Werner Forssmann's unauthorized catheterization of his own heart, through additional developments by innovators including André Cournand, Mason Sones, and Charles Dotter, to the development of balloon angioplasty by the flamboyant refugee from East Germany, Andreas Gruentzig.

The chapter wraps up with the development and implantation in 1960 of the first artificial heart valve by surgeon Albert Starr and engineer Lowell Edwards. Today, there are both mechanical and bioartificial replacement heart valves, and their use has become routine. The latest development is minimally invasive implantation of artificial aortic valves using a stent-like device that is delivered via a catheter and pushes the natural valve leaflets aside.
Progress in repairing and replacing body parts, particularly in the cardiovascular system, has been nothing short of amazing. How can we best ensure more of the same? Most of the progress has been driven by individuals. I'm not going to deny reality: today most medical research is conducted by large organizations, and often run by committees. But we need to keep our minds open in case we encounter another Werner Forssmann, the sort of person who does not ask permission to make great discoveries.

Next time: Repair or Replace? Part II

Note: If you would like to be notified when The History & Future of Medical Technology is published, please go to Telescope Books and enter your email address in the newsletter sign-up field on the left menu bar. This email list is only used to announce book offers from Telescope Books; your email address will not be shared with third parties.
Posted by Ira Brodsky in The History & Future of Medical Technology at 09:40
Saturday, March 13, 2010
Should We Care About Online Privacy?
I confess that I have done a complete turnaround on this issue.
Ten years ago, I felt that privacy activists were trying to make life difficult for online businesses and were inhibiting the development of powerful "personalization" technology. I saw little difference between visiting a Web site and walking into a brick-and-mortar store. The word "privacy" does not appear in the Bill of Rights.

Now I fear that PCs and other devices with Web access are becoming the Telescreens depicted in George Orwell's classic dystopian novel, 1984. These devices are two-way, and they are increasingly used to gather more information about us than most people realize. And wouldn't you know it: as my concern grows, others seem to be backing off. They don't care that their e-mail is being scanned. They don't mind storing their personal documents in the cloud. They don't mind that their location is being tracked by their mobile phone. See Declan McCullagh's article at CNET News. Now we really have something to worry about.
Below is the Introduction from The History of Wireless: How Creative Minds Produced Technology for the Masses.
Science or Magic?

How did the human race develop palm-sized devices that enable people to converse and exchange text messages worldwide, snap and upload pictures, download music and videos, and determine their precise locations?

I've been interested in the history of technology for a long time, having worked in the high-tech industry for 30 years. I specifically wanted to know more about the evolution of wireless, and I was surprised that I couldn't find a comprehensive history. There are many books on key figures and time periods in the history of wireless, but none that explain how we got to where we are today. So I decided to write such a book. The following pages trace the entire journey, from the discovery of fundamental scientific effects to the development of next-generation wireless standards.

Arthur C. Clarke said that a sufficiently advanced technology is indistinguishable from magic. But most magic is just sleight of hand. Drawing back the curtain reveals the true sources of advanced wireless technology: brilliant science, ingenious products, and innovative business models.

Science rarely progresses in a straight line. Nor is there a single correct way of doing science. The history of wireless technology shows that the clash of opposing philosophies of science can be a catalyst and even a necessary ingredient for progress.

The history of wireless technology cannot be separated from the history of wireless business. Technology harnesses science to create valuable products and services. Business delivers those products and services to customers. Before a technology comes to life, someone has to determine who needs it and what they're going to do with it. It's also business's job to figure out how best to package and distribute technology, how to get it into the hands of as many people as possible in a form they can use. The story of wireless is fascinating and inspiring, and the technology should be celebrated.
Great technology is every bit as creative as great art. While we can often perceive the creativity in a work of art directly, we usually need to know the story behind a technology to fully appreciate the creativity that went into its development. No one has figured out how to bottle and sell creativity, but the history of wireless provides important clues about its sources. There are lessons about persistence; luck and preparedness; synthesizing ideas; challenging common assumptions; and more.

The first task for anyone writing history is deciding where to begin. A history of wireless communications could begin with the first person to commercialize the technology, Guglielmo Marconi. Or it could start with Heinrich Hertz, the first scientist to create and detect radio waves. Why not go back further? After all, Hertz was only verifying James Clerk Maxwell's theory of the electromagnetic field. The dilemma is that there would not have been a Marconi without a Hertz, nor a Hertz without a Maxwell, nor a Maxwell without a Faraday. I chose to start with the debate between Luigi Galvani and Alessandro Volta that led to the invention of the battery. (Galvani actually witnessed wireless communications but did not understand its significance.) Once investigators were armed with a source of continuous current, the discovery of electromagnetism became almost inevitable.

The narrative proceeds to Michael Faraday, the great experimentalist who added more to our knowledge of magnetism and electricity than anyone before or since. Faraday laid the foundation for James Clerk Maxwell, who translated Faraday's observed facts into the symbolic language of equations and assembled them into a comprehensive theory of electromagnetism, an achievement that, ironically, might have been disowned by the strict empiricist Faraday.
A note about terminology: most early scientists were known as "natural philosophers." That term is used here as well, because that's what investigators such as Michael Faraday wanted to be called; Faraday detested the word "physicist." I've also kept the jargon to a minimum; however, some of it is unavoidable. Most concepts are explained in place and reviewed in the Glossary.

Faraday did science in the laboratory; Maxwell did science in his head. Heinrich Hertz proved that Maxwell's fertile imagination produced something concrete: there really are electromagnetic waves that propagate through free space.

Next the journey takes us on an important detour. Wireless communications is technology for conveying human intelligence. There would be no wireless telegraph without Samuel F.B. Morse's wired telegraph, and there would be no wireless telephone without Alexander Graham Bell's wired telephone. The stories behind these two great inventions are essential to the history of wireless.

The idea seems obvious today, but taking wireless out of the laboratory, fashioning it to serve specific applications, and offering it for sale initially faced tremendous resistance. With the telegraph going great guns, Guglielmo Marconi struggled to build the first wireless business. He built it around a technology, spark transmission, that would prove a dead end. (At least temporarily; more than a century later, a technology called ultra wideband is emerging that uses spark-like signals.)

That brings us to several lesser-known names: the people who put wireless on the right technological footing. Reginald Fessenden and Edwin H. Armstrong led the way. Fessenden understood that wireless needed to be based on continuous waves rather than sparks. Armstrong took the vacuum tubes invented by John Ambrose Fleming (the valve) and Lee de Forest (the Audion) and built vastly superior transmitters and receivers. Amateur radio operators, Armstrong among them, contributed numerous refinements.
David Sarnoff thought about becoming an engineer, but he ended up becoming the prototype for today's high-tech business leaders. He was a hands-on executive who understood that success requires the right technology, the right products, and the right marketing. He was also one of the first business leaders to successfully navigate the hazardous waters of intellectual property, government policy and regulation, and unscrupulous competition. During this era the word "radio" gradually replaced the word "wireless."

Wireless underwent a dramatic transformation in the years leading up to World War II. The wireless market, once the exclusive domain of entrepreneurs and small businesses, became a playground for big corporations. New technology was developed by teams. It becomes harder to identify individual inventors, but they are still there.

The aftermath of World War II saw the commercialization of frequency modulation, mobile radio, television, and the mobile telephone. Less well known, it was also the gestation period for the wideband radio technology that later (after being declassified by the U.S. government) enabled unlicensed wireless LANs, the Global Positioning System (GPS), and third-generation (3G) cellular systems.

We finally arrive at the modern era of wireless. It would be difficult if not impossible to recount this part of the story without research at the front lines of development. Fortunately, several leading actors, including Andrew Viterbi, Martin Cooper, and Donald Cox, contributed to my research. It would be hard to exaggerate the impact of the cellular telephone on culture and the global economy. Ironically, its development was largely hidden from view, and its commercialization was significantly delayed. Perhaps that explains why it has grown way beyond the most optimistic forecasts. [There are now over 4 billion mobile phone subscribers.] The historic role of industry standards must also be acknowledged.
A degree of conformity is required so that products from different manufacturers can talk to each other. But it would be a mistake to deny the impetus provided by proprietary technologies and business contrarians. The evolution of wireless continues to be driven by the clash of opposing ideas. By no means has the era of individual discoverers and inventors come to an end. The industry is currently obsessed with planning, and much is already decided about the next generation of wireless technology. Or so the experts think. Even the best planning cannot prevent unexpected twists and turns in the road ahead.

The book concludes by identifying some key lessons. How did the science behind wireless technology evolve? Why did some technologies succeed and others fail? And what can scientists, inventors, and entrepreneurs learn from the history of wireless about creativity? The history of wireless provides a treasure trove of lessons about how to avoid pitfalls, and how to succeed in science and business.
Posted by Ira Brodsky in The History & Future of Wireless Technology at 09:21
Friday, March 12, 2010
The History & Future of Medical Technology, Chapter 8
This post is the eighth in a series based on my soon-to-be published book, The History & Future of Medical Technology. Each week I’ll present highlights from one of thirteen chapters.
Listening to Your Body

Physicians learned long ago that they could uncover clues about what was going on beneath the skin just by feeling and listening. Medicine took a giant step forward in the 18th century when physicians discovered that they could estimate the size of major organs by tapping on the body ("percussion"). By the 19th century, physicians were associating specific heart sounds with problems observed at autopsy; using body temperature to diagnose and monitor the progress of diseases; and taking accurate blood pressure measurements.

Gathering the data was less than half the battle. Making sense of the observations proved to be the bigger challenge. It didn't take long to find that "normal" measurements vary widely from individual to individual. Desperately needed were data from the broader population and reliable standards for statistical analysis. That meant physicians had to buy into the idea of regularly collecting and sharing key data. And there had to be a balance: gather the data, use the data, but never forget its limitations.

One simple data-gathering technique, tapping on the body, evolved far beyond what its pioneers are likely to have imagined. Ultrasound imaging is the high-tech version of percussion. Employing the same principle that bats use to navigate, ultrasound scanners emit high-frequency sound waves to create echoes that let us visualize structures within the body and study their motion. Ultrasound imaging is fast, non-invasive, and does not employ ionizing radiation. And it doubles as a therapeutic technology, with applications ranging from cleaning teeth to breaking up kidney stones.

Today's ultrasound scanners exploit a discovery made by Pierre and Jacques Curie in 1880. The Curie brothers discovered that mechanical stress causes some materials to generate small electric currents. One of their professors, Gabriel Lippmann, predicted the reverse: applying electric currents to the same materials distorts their shapes.
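The pulse-echo principle behind those scanners comes down to simple arithmetic: measure how long an echo takes to return, multiply by the speed of sound, and halve the result because the pulse travels down and back. A minimal sketch follows; the sound speed is a standard textbook average for soft tissue, not a figure from the book.

```python
# Pulse-echo ranging: the principle behind ultrasound imaging (and sonar).
# The speed of sound below is a common textbook average for soft tissue.

SPEED_OF_SOUND_TISSUE_M_S = 1540.0  # meters per second, soft-tissue average

def echo_depth_m(round_trip_time_s: float) -> float:
    """Depth of a reflecting structure, given the echo's round-trip time.

    The pulse travels to the structure and back, so the one-way depth
    is half the total distance traveled.
    """
    return SPEED_OF_SOUND_TISSUE_M_S * round_trip_time_s / 2.0

# An echo returning 100 microseconds after the pulse was emitted
# comes from a structure about 7.7 cm deep.
print(f"{echo_depth_m(100e-6) * 100:.1f} cm")  # prints "7.7 cm"
```

The same timing trick, run millions of times per second across an array of transducer elements, is what builds up a live image.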
What began as a scientific curiosity evolved by World War I into a technology for detecting icebergs and enemy submarines. Though medical researchers experimented with ultrasound in the years following World War II, it wasn't until cardiologist Harvey Feigenbaum showed in the 1960s how the technology could be used to diagnose specific heart problems that it began to gain traction in the clinic. (It's also now famous for providing millions of expectant parents the first images of their unborn children.)

Ultrasound continues to demonstrate unique advantages. It is less expensive than CT or MRI; near ideal for 4D (3D plus motion) studies; and, as a screening tool, lends itself to workflow automation. An emerging application, strain imaging, could provide information about the structural integrity of living tissues that can't be obtained by other means. Strain imaging distinguishes and quantifies tissues in terms of stiffness and softness. In essence, it shows how each spot of tissue moves and could be used to gauge the overall health of specific types of tissue, such as heart muscle. In that case, ultrasound will not only tell us how well our hearts are performing, but how long they might last.

The evolution of medical diagnostic ultrasound traces back 250 years, to when physicians started using percussion. Tap on the body and listen to the sound. A few pioneers working in the 1940s and '50s, persisting amid widespread doubt, thought that ultrasound echoes from inside the body just might turn out to be clinically useful. Decades later, buoyed by advances in electronics, ultrasound emerged as a key diagnostic tool.

Next time: Repair or Replace? Part I

Note: If you would like to be notified when The History & Future of Medical Technology is published, please go to Telescope Books and enter your email address in the newsletter sign-up field on the left menu bar.
This email list is only used to announce book offers from Telescope Books; your email address will not be shared with third parties.
Posted by Ira Brodsky in The History & Future of Medical Technology at 10:24
Sunday, March 7, 2010
The History & Future of Medical Technology, Chapter 7
This post is the seventh in a series based on my soon-to-be published book, The History & Future of Medical Technology. Each week I’ll present highlights from one of thirteen chapters.
The Nuclear Option

Dig into the history of nuclear medicine and you'll discover some surprising facts. For example, did you know that PET scanners look inside the patient's body for matter-antimatter annihilation events? When a positron emitted by a radioactive tracer encounters an electron, the particles literally destroy each other, leaving only a pair of gamma rays flying off in opposite directions. But that's jumping ahead in the story.

It all started when Henri Becquerel discovered radioactivity by accident. Then Ernest Rutherford acquired basic knowledge about the structure of atoms through incredibly simple "table top" experiments, the likes of which we may never see again. And Marie Curie earned Nobel Prizes in two different scientific fields for, among other things, extracting a tenth of a gram of radium chloride from a ton of pitchblende.

Interesting things also happened when nuclear physics met biology. In 1913, George de Hevesy showed that radioactive tracers can be used to track specific molecules absorbed by plants. Hermann Blumgart took that idea a step further in the 1920s, applying radioactive tracers to medical research. By the 1940s, doctors successfully treated thyroid cancer with what they dubbed an "atomic cocktail."

Nuclear imaging took off with the development of single photon emission computed tomography (SPECT) and positron emission tomography (PET). Despite their formidable names, SPECT and PET are simply two different ways of displaying metabolic activity, useful for detecting cancers (which are glucose gluttons) and assessing the effectiveness of various treatments. Which one is more appropriate depends on the application. A CT scan can show that there is a tumor; a SPECT or PET scan can show what the tumor is doing. Combine SPECT or PET with CT, and you can match the metabolic activity map to the patient's anatomy with precision.

Radiation therapy is another important application, crucial to treating many types of cancer.
A common trick of the trade is to aim multiple low-level radiation beams at the target from different directions; only tissue at the point of intersection receives a destructive dose. Systems such as the Gamma Knife and CyberKnife work this way. Proton therapy adds another twist, taking advantage of the Bragg peak, the tendency of protons to unload most of their energy in the last few millimeters of travel. The major drawback is that proton therapy requires large and very expensive particle accelerators. Again, the best approach depends on the specific application.

Progress in treating HIV suggests that keeping diseases at bay for long periods may sometimes be more practical than hunting for cures. Perhaps radiation therapy can be improved to the point that it enables physicians to contain most cancers. A combination of nanotechnology and radioactivity may be the key, making it possible to track down and kill individual cancer cells.

There is one other exciting application for radioactivity in medicine, if we can muster the courage to embrace it. The energy locked in atoms can be used to power implantable medical devices for ten years or more, far longer than chemical batteries allow. Nuclear-powered pacemakers were introduced at one time, but the public recoiled after the Three Mile Island and Chernobyl accidents, and the products were discontinued. All in all, nuclear medicine has experienced its ups and downs, but it has not lost its glow.

Next time: Sensing Health

Note: If you would like to be notified when The History & Future of Medical Technology is published, please go to Telescope Books and enter your email address in the newsletter sign-up field on the left menu bar. This email list is only used to announce book offers from Telescope Books; your email address will not be shared with third parties.
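The back-to-back gamma emission described at the start of this chapter is exactly what makes PET imaging work: the two detector hits define a "line of response" that must pass through the annihilation site, and the intersection of many such lines pinpoints the tracer. A toy geometric sketch follows; the ring radius and source position are made-up illustrative values, not anything from the book.

```python
import math
import random

# Toy sketch of PET coincidence detection (illustrative values only).
# Each annihilation emits two gamma rays in exactly opposite directions,
# so the line joining the two detector hits (the "line of response")
# always passes through the annihilation site.

def detector_hits(source, radius=1.0, rng=random):
    """Points where two back-to-back gammas strike a circular detector ring."""
    theta = rng.uniform(0.0, 2.0 * math.pi)  # random emission angle
    sx, sy = source
    hits = []
    for sign in (1.0, -1.0):  # the two gammas travel in opposite directions
        dx, dy = sign * math.cos(theta), sign * math.sin(theta)
        # Solve |source + t*(dx, dy)| = radius for the positive root t.
        b = sx * dx + sy * dy
        c = sx * sx + sy * sy - radius * radius
        t = -b + math.sqrt(b * b - c)
        hits.append((sx + t * dx, sy + t * dy))
    return hits

def distance_to_line(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((by - ay) * px - (bx - ax) * py + bx * ay - by * ax)
    return num / math.hypot(bx - ax, by - ay)

# Every line of response passes through the source, whatever the angle.
source = (0.2, -0.1)
for _ in range(1000):
    h1, h2 = detector_hits(source)
    assert distance_to_line(source, h1, h2) < 1e-9
```

A real scanner runs this logic in reverse: it records millions of coincident hit pairs and reconstructs where the lines of response concentrate, which is where the tracer (and the metabolic activity) is.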
Posted by Ira Brodsky in The History & Future of Medical Technology at 14:31