Many people are eager to know the candidates' positions on science and technology. The most often cited concerns are funding basic research, promoting science education, developing technologies deemed good for people and the environment, encouraging innovation, protecting intellectual property, and defending science against real or perceived threats.
The ironic thing is that most innovation flows from the bottom up. Outside space exploration, the U.S. government has driven very little innovation. The expectations some people have regarding national science policy are simply not realistic.
I'm not suggesting that there is no role for government in science and technology. But I'm concerned that government's growing presence is suffocating innovation. Funding, regulation, and tax policy can be used to favor certain technologies over others. However, innovation rarely occurs on demand, and (more to the point) it often comes from unexpected quarters. Though both major U.S. parties pay lip service to entrepreneurs and private enterprise, the evidence suggests that neither party has the courage to oppose the growing science bureaucracy.
Basic research - The concern is that industry only funds research with short-term profit potential. There is abundant historical evidence to the contrary. For example, Bell Laboratories did a great deal of pure research. Private companies often sponsor research conducted at universities. But I agree that some areas of pure research may be neglected or deserve greater emphasis. It makes sense for government to fund worthwhile research that otherwise might not get done, but funding should be limited to prevent dependence, waste, and corruption. And it should always be performed at private companies and universities.
Science education - Many people want educators to introduce their children to the wonders of science and opportunities for science careers. But the education establishment confuses the role of the discoverer/inventor with that of the technician. You can teach someone to be a technician, but you can't teach someone to be a discoverer or inventor. The British philosopher John Stuart Mill also cautioned against entrusting education to the government; the temptation to indoctrinate is too great. Having said all of that, perhaps a proper role for government would be to assist rather than lead. For example, I remember writing to the Atomic Energy Commission when I was in elementary school and receiving some fascinating literature.
Safety and the environment - One popular view is that private industry would kill its customers and destroy the earth if the government didn't stop it. I don't think it's nearly that bad, though I agree that some problems should be identified and corrected earlier, and that government can play a watchdog role. For example, a government agency operating under the direction of an independent group representing all concerned parties (consumers, manufacturers, retailers, and so forth) could test products and publish the results online. However, no product is 100% safe, and it is reasonable to accept some risks in order to enjoy specific benefits. Likewise, the federal government could monitor environmental concerns and act as an information clearinghouse. However, if we cherish scientific method then we need to recognize that knowledgeable people often disagree about such complex problems. Local and state governments should be free to impose whatever restrictions they deem appropriate; a better role for the federal government would be to recommend standards rather than mandate one-size-fits-all regulations.
Innovation - Problem-solving can be planned, but innovation can't. Government can ensure the same conditions exist that allowed the personal computer to develop and prosper--by avoiding regulations that favor specific technologies over others, minimizing taxes, and doing away with government lotteries and gambling. Once upon a time, the U.S. offered opportunities to get rich through creativity and hard work. Today, the government has replaced organized crime in marketing and profiting from gambling. Meanwhile, we vilify people who make their fortunes in business. We are encouraging the most mindless gambling--and discouraging value creation in the process.
Intellectual property (IP) - A little-noticed facet of political correctness: we often punish our own companies in deference to international organizations that envy our success and whose member countries practice protectionism. Leaders who don't support free trade are in no position to protect intellectual property. To wit, imposing restrictions on trading partners reduces their incentive to honor our IP.
Anti-science ideas - History shows that the establishment often greets innovative ideas with resistance and even denial. It's easy to become complacent--believing that the serious mistakes of the past could not happen today. It's precisely this kind of over-confidence that leads to such mistakes.
I'm also surprised by how little faith some people have in the ability of others to recognize and reject false or baseless claims in the course of doing science. It's a bad idea to impose science litmus tests on political leaders, and it's an even worse mistake when public officials believe they are the guardians of scientific truth. My research shows indisputably that even wrong ideas (example: the luminiferous ether) can stimulate great discoveries. It's better to let scientists sort out these issues. In picking favorites, government tends to discourage the competing ideas that spur innovation.
The microscope has long been the telescope’s poor cousin. Both tools were invented at almost the same time (around 1600). The telescope had urgent applications (spotting land and ships); was more easily perfected (optically); and was soon revealing the cosmos’ secrets. The microscope enlarged things already in hand but severely distorted them. For two centuries, the microscope was more of a novelty item than a practical instrument.
Now the microscope is poised to illuminate normal and abnormal life processes like never before.
Sure, the microscope has been used for biology research since the days of Robert Hooke and Antonie van Leeuwenhoek. However, it wasn’t very reliable until Joseph Jackson Lister (father of the famous Joseph Lister) developed a formula for minimizing spherical and chromatic aberration—eliminating dependence on trial-and-error techniques. Within about a century the wavelength of visible light became the limiting factor for achieving greater resolution.
Scientists developed ultraviolet and electron microscopes to get around the wavelength limitations of visible light. But humans can’t see UV light, so it was used to produce fluorescence (which could be seen) or images on photographic film. The electron microscope, which also relied on film for many years, had the further disadvantage that special specimen preparation was required.
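The wavelength limit described above can be quantified with the standard Abbe diffraction formula, d = λ / (2·NA). The sketch below is illustrative only; the numerical aperture of 1.4 is an assumption typical of a high-end oil-immersion objective, not a figure from the text.

```python
def abbe_limit_nm(wavelength_nm: float, numerical_aperture: float = 1.4) -> float:
    """Smallest resolvable feature size (nm) via the Abbe diffraction limit.

    NA = 1.4 is an assumed value for a good oil-immersion objective.
    """
    return wavelength_nm / (2 * numerical_aperture)

# Green visible light (~550 nm) vs. near-ultraviolet (~250 nm):
visible = abbe_limit_nm(550)  # roughly 196 nm
uv = abbe_limit_nm(250)       # roughly 89 nm
print(f"visible: {visible:.0f} nm, UV: {uv:.0f} nm")
```

This shows why moving to shorter UV wavelengths roughly doubled the attainable resolution, and why electrons (with far shorter effective wavelengths) went further still.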
There have been two big changes recently. The marriage of the microscope and digital camera (with real-time display) now permits one instrument to span infrared to visible light to ultraviolet. And electron microscope makers such as FEI Company are exploiting vitrification to avoid the water crystallization that has traditionally plagued the freezing of biological specimens.
Combine vitrification and 3D/4D digital image processing and you have the prospect of, for example, observing intricate cellular processes firsthand and even automating genome sequencing. The electron microscope becomes something analogous to a CT scanner—except this scanner can see down to individual atoms. The possibilities downstream are mind-boggling.