Monday, March 31. 2008
One of the greatest figures in the history of technology was an amphibian. Numerous frogs were martyred to the discovery of bioelectricity and the invention of electrocardiography. Frogs also played a supporting role in the development of the first source of continuous current, a turning point in the study of electricity.
In my History of Wireless, I discuss the “animal electricity” experiments of Luigi Galvani and his debate with Alessandro Volta, which drove the latter to construct his “voltaic pile.” Galvani noticed that a frog’s crural nerve could be stimulated wirelessly to create a muscle contraction. This was first demonstrated indoors using a static electricity generator and then outdoors during lightning storms. Later, Galvani discovered he could trigger contractions merely by completing a circuit containing dissimilar metals, though he stubbornly refused to acknowledge that the source of electricity was external to the frog.
Galvani’s mistake drove Volta to perform further experiments demonstrating dissimilar metals can be used to generate electricity. (However, Volta failed to see that a chemical reaction rather than mere contact was the cause.) Armed with Volta’s invention, natural philosophers were empowered to make a series of further discoveries.
Was Luigi Galvani the first person to encounter wireless communication via electromagnetic waves? Close examination of Galvani’s research papers (as explained in the December 1971 issue of IEEE Spectrum by L.A. Geddes and H.E. Hoff) reveals that the muscle contractions in his first “wireless” experiments were due to electrostatic induction. In his book The Ambiguous Frog, Marcello Pera shows that lightning storms also triggered contractions via electrostatic induction. Galvani happened upon a wireless effect, but it was not due to electromagnetic waves. (This is not as odd as it might seem: in the late 19th century Joseph Henry, William Preece, and Oliver Lodge all pursued wireless communication via magnetic induction, which also does not involve electromagnetic waves.)
Galvani lost the debate with Volta but is now rightly considered the discoverer of bioelectricity. His work inspired Carlo Matteucci, who invented the “rheoscopic frog”—a severed sciatic nerve and its innervated gastrocnemius muscle that could be used as a sensitive electricity detector. In 1856, Kolliker and Mueller used the rheoscopic frog to observe the electrical activity associated with the beating heart of another frog.
The rheoscopic frog was crucial to the development of electrocardiography. Though early researchers had galvanometers for detecting and measuring electrical current, the response time of those devices was too slow for observing the heart’s electrical activity. The rheoscopic frog was the best electrical test and measurement instrument available for that purpose—until it was replaced decades later by the capillary electrometer and then the string galvanometer.
Friday, March 28. 2008
In an interview with CBS's Lesley Stahl, Al Gore reportedly likened people who don't believe in man-made global warming to individuals who believe the Earth is flat and that the 1969 moon landing was a hoax.
It's unfortunate that Al Gore, a longtime advocate of scientific research, chooses analogies that can only serve to discourage the free exchange of ideas about a complex and imperfectly understood system--the Earth's environment.
Gore seems convinced that because most scientists believe human activity increases greenhouse gases, anyone who denies the urgent need for action is either irrational or delusional. However, what marks flat-earthers as irrational is their refusal to accept evidence based on direct observation. For example, seafarers noticed that mountains disappeared below the horizon as they sailed away from them. The Earth's shadow on the Moon during a lunar eclipse also provided evidence that the Earth is spherical. Does Gore believe that global warming is just as easily verified?
Geocentrism is a more apt analogy; it can be employed by both global warming activists and their critics. Activists can argue that while neither heliocentrism nor global warming is directly observable, both are supported by a mass of scientific data and calculations. Critics already point out that in an ecosystem defined by titanic forces, global warming activists wrongly place humans at the center.
Gore's comparison of critics of global warming to people who think the 1969 moon landing was a hoax is nothing but a smear. Even Gore seemed to acknowledge that his analogy crossed the line: "That demeans them a little bit, but it's not that far off." (Translation: "Yeah, I smeared them, but they deserved it.") Gore knows, or should know, that theories regarding the effects of human activity on climate change are still largely speculative. Though most scientists believe global warming is a serious problem, we should not forget that there is a science establishment with a long history of presumed infallibility; science orthodoxy has often been an obstacle to progress.
The clash of opposing scientific theories is often a catalyst or even necessary ingredient for progress. Get used to it, Al.
UPDATE: April 4, 2008 - 10:10am Eastern:
UN meteorologists say that global temperatures will actually dip "slightly" this year, wiping out any warming over the past decade. Perhaps they are flat-earthers and conspirators?
Not to worry, the UN assures us that global temperatures have nevertheless risen 0.74°C since the beginning of the 20th century. It would be interesting to know the margin of error; both instrument error and errors due to frequency and location of measurements need to be taken into account.
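For readers curious how such a margin of error might be estimated: independent error sources combine in quadrature (the square root of the sum of squares). The figures below are purely hypothetical, chosen only to illustrate the arithmetic; the article does not report actual uncertainty values.

```python
import math

# Hypothetical, illustrative uncertainty figures (NOT from any dataset):
instrument_error = 0.10   # deg C, assumed thermometer accuracy
sampling_error = 0.15     # deg C, assumed error from station coverage/frequency
adjustment_error = 0.08   # deg C, assumed error from data adjustments

# Independent, random errors add in quadrature, not linearly.
total_error = math.sqrt(instrument_error**2
                        + sampling_error**2
                        + adjustment_error**2)

print(f"Combined margin of error: +/-{total_error:.2f} deg C")
```

With these assumed inputs the combined uncertainty comes to roughly ±0.2 °C, a sizable fraction of the reported 0.74 °C rise, which is why the question of the margin of error matters.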
And what about the elephant in the living room? The dip in temperature is attributed to a natural event, La Niña. Apparently, this one natural event is capable of counteracting all human activity worldwide. But that's precisely what some critics of policies intended to combat global warming have argued for years.
The margin of error, the overriding effect of select natural events, and our limited knowledge of long term temperature cycles should be cause for any scientist with a modicum of healthy skepticism to admit that we don't yet know whether human activity causes harmful global warming. It certainly warrants further study, but it seems premature to curb industrial growth.
Tuesday, March 18. 2008
It’s hard to decide which is more disconcerting: the EU’s decision to pick a technology winner in mobile TV or the passivity of US leaders in the face of the EU’s anti-competitive rulings.
The EU explains that it’s necessary to pick a mobile TV technology winner in order to avoid a prolonged standards battle that might delay market development. Supposedly, competing standards sow confusion among consumers, who wisely decide to hold onto their money until a single standard emerges. And if that is not enough to convince you, there is always the mother of all government-mandated technologies, Europe's GSM mobile phone standard. (GSM is one of the most successful technologies in history.)
History’s judgment is not so clear. There are few instances in which we can say with certainty that a market was delayed because of competing standards. Markets may be delayed for any number of reasons. In fact, many markets were slow to develop even with the supposed impetus of a single standard.
Nor are there many examples of markets springing to life thanks to government-mandated standards. The success of Europe’s mobile phone standard, GSM, is undeniable. But as I explain in my book, The History of Wireless, several factors contributed to GSM’s success.
Wouldn’t it be nice if there were two planets identical to Earth on which we could test competing political, social, and economic theories? No doubt the outcome of any such experiment is a function of many variables. It’s easy for policymakers to construct arguments around one or two variables, but that doesn’t mean they are right.
It’s ironic that one week the EU is fining Microsoft more than one billion dollars for anti-competitive practices, and another week the EU is picking a technology winner precisely to avoid the ill effects of competition.
Average life expectancy has roughly doubled over the last century. However, most of the increase is attributed to improvements in public health and nutrition--improvements that dramatically reduced infant and childhood mortality. Modern medicine has otherwise yielded only small increases in average life expectancy.
Not everyone is impressed with today's medical technology. Critics complain that technology drives the cost of healthcare through the roof; that most healthcare providers place too much emphasis on expensive technology; and that technology sometimes poses new health threats (e.g., exposure to x-rays).
This reminds me of the old debate about whether personal computers (PCs) really increase office worker productivity. Skeptics argued that the time and effort required to master PC applications offset any productivity gains. That may have been true in the days of MS-DOS. However, the development of graphical user interfaces and the Web changed everything. Now, no one argues that the PC's productivity gains are illusory.
What lessons can we draw from the PC productivity debate? The critics were wrong on two counts. Though early PCs created a whole new set of care and feeding demands, the benefits were sometimes overwhelming. PCs enabled people to do things they couldn't do before and, therefore, the old ways of measuring productivity no longer applied. And though the demands may have exceeded the benefits for most users, it was only a temporary state of affairs; the balance shifted as personal computers became easier to use and software was developed to automate more tasks.
Likewise, we should be careful not to apply the wrong measures to medical technology. Medical technology may not dramatically increase average life expectancy, but it can certainly increase the life expectancy of individuals. Additional increases in average life expectancy will be hard to achieve, and technology will be essential to squeezing out further gains.
Technology has increased productivity in virtually every industry. Healthcare is the only industry in which technology is portrayed as mainly adding cost. But technology may be just the scapegoat. Other factors, such as expanding bureaucracy and malpractice insurance, have contributed to spiraling healthcare costs. It would be interesting to see a comparative study of healthcare costs over the past several decades.
Thursday, March 6. 2008
There is a fascinating article in today's New York Times about how Cubans are using digital technology to fight their oppressors: Cyber-Rebels in Cuba Defy State’s Limits.
Cubans are using cameraphones, memory sticks, and the Internet to share information, criticize the government, and let people both at home and abroad know about small but significant acts of resistance. For example, a video has been circulating showing students at a "prestigious computer science university" confronting Ricardo Alarcón, president of the National Assembly.
Digital technology poses a conundrum for the thugs who rule countries such as Cuba, Iran, and China. They know that their countries will fall hopelessly behind if they prohibit digital technology--and that will fuel opposition and unrest. They also know that digital technology empowers people--which also fuels opposition and unrest. So they try to walk the fine line of permitting limited access.
Digital technology is having a profound impact on geopolitical events. It allows people to record and archive content on an unprecedented scale: emails, text messages, podcasts, pictures, and videos. It also enables people to circumvent censorship and spread the truth.
Digital technology is being used not only to write history, but to make history.
Tuesday, March 4. 2008
I recently learned about three episodes in the history of medicine that most people will find surprising. They remind us that some of the medical technology we take for granted evolved from exotic and even shocking early work. Pun intended.
Episode #1: The first open heart surgeries were performed on children using a parent as the heart-lung machine. One physician quipped that this was the only medical procedure with the potential for 200% mortality. Fortunately, it wasn't long before machines capable of aerating the patient's own blood were invented.
Episode #2: Doctors knew for many years that some patients suffer from abnormally slow heart rate (also known as Stokes-Adams disease or "heart block") and that the problem eventually kills them. Then it was discovered that the body controls heart rhythm using electrical signals. Paul Zoll pioneered the use of artificial electricity to treat heart block. Shocks were administered to the patient's chest; the implantable pacemaker had not yet been invented. Obviously, Zoll's method could only be used temporarily--for example, after surgery. Some patients tolerated the shocks quite well, while others could be observed jumping with each electrical pulse.
Episode #3: For a short period, the standard procedure for treating ventricular fibrillation was to cut open the patient's chest and massage the heart directly. Worse, this usually had to be done on an emergency basis wherever the person happened to be at the time.
These methods, as unpleasant as they sound, led to the development of safer and more effective solutions.