Introduction
by Senator James Inhofe
Ronald Reagan once said, “Government exists to protect us from each other. We can’t afford the government it would take to protect us from ourselves.” Unfortunately, every day it seems that the federal bureaucracy is proposing new regulations or new regulatory programs designed to protect us from ourselves. Not only is the government trying to be “Big Brother,” but it is also trying to assume the roles of mother and father, too.
Earlier this year, I shepherded a targeted regulatory relief bill through the process, and it was signed into law in July. This bill, S. 880, the Regulatory Fuels Relief Act, removed propane from the reporting requirements of the Environmental Protection Agency’s Risk Management Plan Program. It also placed a moratorium on posting security-sensitive data on the Internet, where terrorists could access it for targeting purposes. Propane, a nontoxic substance, was incorrectly placed on the list of toxic substances, despite the fact that it was already regulated by OSHA, the states, and the fire codes. When my staff asked why EPA felt its additional regulations were needed, EPA personnel responded that they didn’t like the OSHA standards and felt that they could do a better job.
This kind of bureaucratic second-guessing is leading to an unprecedented level of burdensome regulations. Today, we are not only dealing with out-of-control regulatory agencies, but we are also dealing with bureaucrats who disagree with each other and resolve their disputes by implementing duplicative regulations. It is a simple case of turf building. By requiring thousands of small propane dealers to report to the EPA, the office within EPA that would handle the data would grow in both funding and personnel. In other words, the bureaucrats responsible for requiring the propane reports would have grown in importance.
A coalition of small propane dealers and farmers (who are large customers of propane) became very vocal in its outrage and flooded Congressional offices with letters and postcards. This led to a bipartisan legislative approach with almost no Congressional opposition. Unfortunately, no member of Congress has the time or resources to keep up to date with every single regulation, and there is not always a natural constituency to oppose a regulation. This means that your typical small businessman cannot be expected to know about, much less understand, every single regulation.
Unfortunately, most bureaucrats in Washington have little to no experience in the private sector so they fail to understand simple business concepts or practices. That is why it is important for Congress to be vigilant in federal oversight and why it is so vital to elect individuals to Congress who have worked in the business world and understand what it is like to meet a payroll. The good work of the Institute for Policy Innovation and the Lexington Institute, and other like-minded organizations, goes a long way towards educating members of Congress and the general public about excessive and sometimes “stupid” regulations. It is unfortunate, and we are trying to change it, but there are far too many departments, agencies and regulations seeking to place mandates on businesses for any one member to keep fully up to date. Publications such as this, highlighting the ten worst regulations, are vital if we are ever going to have an impact on the intrusion of the federal government into our daily lives. We have had fifty-plus years of regulatory growth and, as President Reagan said, “the hardest thing to kill is a government program once it has been created.”
SUVs: Another Case of Missing EPA Data
By Steven J. Milloy
“No taxation without representation” was a favorite motto of the colonists in the period leading up to the American Revolution. Today, we are still being taxed unfairly — this time, by the U.S. Environmental Protection Agency (EPA).
An EPA proposal issued in May of 1999 would force manufacturers of sport utility vehicles (SUVs) to meet stricter tailpipe emission standards, increasing the cost of SUVs by about $200. At the same time, EPA is demanding that gasoline manufacturers cut sulfur content to such an extent that gasoline costs may rise as much as five cents per gallon. What most consumers don’t realize is that such regulations are nothing more than de facto taxes — imposed not by our representatives in Congress, but by the bureaucrats at EPA.
Setting aside the question of whether EPA has the legal authority to issue such regulations under the Clean Air Act — a claim put in question by a recent ruling of the U.S. Court of Appeals for the District of Columbia Circuit — the public has the right to know whether EPA’s justification for these added burdens is valid. EPA says that the new measures will prevent 2,400 deaths every year that would otherwise occur due to emissions of fine particles from SUVs. But how do we know EPA is telling the truth? We don’t. And, in fact, we may never know, because EPA officials are preventing public review of the data upon which these regulations are based.
As it turns out, EPA’s claim is based on a single scientific study (the “Pope” study), conducted by private researchers courtesy of a grant from the agency itself. The results were published in a journal of the American Lung Association — a group that not only receives subsidies from the EPA, but vigorously lobbies for stricter air pollution regulations. Although potential bias is motive enough for challenging this lone report, the scientific fallacies of the Pope study provide even better reason for doubting EPA.
To be accurate, the Pope study really isn’t science at all — it’s simply a statistical analysis of questionable data resulting in a weak correlation between airborne levels of fine particulates and premature mortality: hardly ironclad proof that clamping down on SUV emissions will “save” 2,400 lives per year.
Science is based on the “scientific method” — a system for developing and testing scientific theories. The scientific method requires that individual studies be capable of replication — i.e., independent scientists under the same conditions should be able to repeat a study’s results. How important is such replication to the public?
- The A.H. Robins Company was forced into bankruptcy and had to pay $2.5 billion to women who claimed injuries from its contraceptive device known as the “Dalkon Shield.” The lawsuits were largely the result of a 1981 study conducted by the National Institutes of Health which reported that women who used intrauterine devices, including the Dalkon Shield, have a 60 percent increased risk of pelvic infection. In 1991 the data were reanalyzed and it was discovered that the NIH study “showed an almost complete disregard for epidemiologic principles in its design, conduct, analysis and interpretation of results.”
- A June 1996 study by Tulane University reported that combinations of pesticides and PCBs were potent disrupters of hormonal systems. Published with great fanfare, the report propelled Congress into requiring EPA to develop a multi-billion dollar testing program for chemicals. About one year after the data were published and Congress enacted the law, the study was retracted from publication. Independent scientists from around the world could not replicate the report’s claims.
- In 1991 the National Cancer Institute reported that dogs exposed to the lawn herbicide 2,4-D had a doubled risk of cancer. When the facts were reanalyzed by an independent scientist — following an 18-month battle during which the NCI refused to produce the data — no association was found between 2,4-D and cancer in dogs.
EPA itself is no stranger to shielding questionable science from public scrutiny. In 1996, EPA proposed more stringent national air quality standards, estimated to cost taxpayers as much as $100 billion annually and tens of thousands of jobs. EPA claimed the rules would save 15,000 lives nationally per year — a claim based solely on the Pope study.
Aware of the questions surrounding the Pope study, Congress asked EPA to provide the data underlying the report so that independent scientists could examine them. EPA refused, alleging there was no useful purpose in producing the data. When Congress would not relent, the EPA-funded researchers stonewalled by claiming a proprietary right to the data, though it can hardly be argued that Congress was interested in going into the scientific research business. More to the point, the study was paid for with taxpayer dollars and was being used to impose heavy burdens upon these very same taxpayers.
Cavalierly disregarding Congress, the objections of its own science advisers, and a bipartisan group of governors and mayors, EPA ramrodded the new regulations through. Industry sued — and won. Congress legislated.
Senator Richard Shelby of Alabama inserted a provision into last year’s Omnibus Spending Bill requiring that taxpayer-funded scientific data used to support federal regulations be made available through the Freedom of Information Act (FOIA). Signed into law by President Clinton, the Shelby amendment is designed to prevent federal agencies from regulating on the strength of “secret science.” But since “secret science” serves the interests of those who want to expand the regulatory state, the Shelby amendment has had to withstand repeated legislative efforts to rescind it. Meanwhile, EPA still refuses to make the Pope study available to the public.
Following the recent proposal of the SUV/gasoline sulfur rules, the public interest group Citizens for the Integrity of Science cited the new “data access” law in an attempt to obtain the Pope study data. EPA denied the request, stating that it did not possess the data (they belong to the American Cancer Society) and that it did not pay to have the data collected. These responses are true enough, but they raise the question: if EPA itself can’t obtain the Pope study data — that is, if even the agency can’t properly assess the credibility of the study it funded — how can the agency use it as a basis to regulate?
Even though EPA apparently doesn’t care to ensure that the Pope study is a sound basis for regulation, the public does. It’s time we put a stop to the “long train of abuses” carried out by EPA and demand an end to EPA “taxation” without scientific justification.
Mr. Milloy is an adjunct scholar with the Cato Institute and the publisher of the Junk Science Home Page (www.junkscience.com).
The Incredible Shrinking Supercomputer
By Philip Peters
Government regulation of private economic activity is a perpetual balancing act. It often serves important public purposes, but always at a cost. There are two traditional ways in which regulations fail a reasonable cost-benefit test. The first is overreach: the mandating of huge expenditures to protect against infinitesimally small risks. The second is micro-management: dictating how companies must meet government standards instead of allowing them to comply by finding cost-efficient methods of their own choosing.
Information technology poses a new pitfall for regulators: irrelevance. The speed at which information technology is developing carries the risk that regulations adopted today will be outpaced by tomorrow’s technology and marketplace conditions. In cyberspeed, “tomorrow” can mean months, not years.
In recognition of this reality, a strong bias has developed among federal and local authorities against regulating the Internet or the marketplace that delivers it to homes and businesses. This hands-off regulatory philosophy is based on a healthy sense of caution: the government has no reason to tamper with something that is working well, and it is incapable of regulating at a pace that keeps up with the evolution of technology and delivery systems.
This sense of caution, however, has not reached those parts of the federal bureaucracy that control high-technology exports for national security purposes. The result is fast coming into view: millions of dollars in computer export sales may be lost in exchange for zero additional protection against national security risks.
When consumers shop for computers, they compare data storage capacity, random access memory (RAM), microprocessor speed, and other features. Regulators use a different measure of computing power: millions of theoretical operations per second (MTOPS). By keeping the computers available for export below certain MTOPS levels, the U.S. government seeks to prevent weapons proliferation by denying adversaries the tools that would speed the development of nuclear warheads and ballistic missiles.
Such an approach makes sense in theory and would be practicable — if information technology advanced at the pace of railroad technology. But Moore’s law, a rule of thumb that says the performance of chips doubles every eighteen months, conspires against the most well-intentioned regulator.
Consider the following:
- Beginning in 1988, export licenses were required for computers operating over 12.5 MTOPS. Today, desktop PCs available for $2,000 reach 500 MTOPS.
- According to a December 1998 Wall Street Journal report, “The entire U.S. nuclear arsenal was designed on computers running at or below the speed of one of today’s new 450 megahertz PCs. The most modern U.S. ballistic missile, the submarine-launched Trident D-5, was designed on a computer running at less than half that speed.”
- “Supercomputers” were room-sized in decades past; today, equivalent power is found in desktop PCs. Supercomputers by current standards, running at tens of thousands of MTOPS, can be “virtually” created by linking dozens or hundreds of store-bought PCs. These “clusters” require nothing more than commercially available connection hardware and messaging software obtainable on the Internet. For example, an Indian government research center uses a 160-processor cluster, while a 400-processor cluster operates at the U.S. government’s Sandia National Laboratory. While such clusters may not match the speed or enjoy all the capabilities of freestanding supercomputers, their cost is far less.
Advanced weapons development depends on many factors, including superior technological expertise; engineering, software design, and manufacturing capacity; materials; and an ability to conduct a rigorous testing program. Even if these other hurdles are cleared, there is no doubt that an aspiring weapons builder denied access to advanced computers will see his progress significantly slowed. But how realistic is the expectation that U.S. export controls can regulate this access? Last year the Commerce Department reported that hundreds of machines capable of running at 10,000 MTOPS or more have been sold, as have thousands in the 5,000-10,000 MTOPS range, thousands more in the 1,000-5,000 MTOPS range, and tens of thousands in the 400-1,000 MTOPS range. Because about half of U.S.-manufactured computers are exported, there comes a point at which export control efforts begin to look like an attempt to choke export sales in an industry crucial to U.S. economic growth.
Until July 1999, the U.S. export control system operated under the following guidelines: There were no restrictions at all on sales of even the most powerful supercomputers to a group of 28 close U.S. allies called “Tier 1” countries. Licenses were required for sales of computers operating at or above 10,000 MTOPS to a “Tier 2” group of 106 countries. “Tier 3,” a 50-country group deemed to be of high proliferation risk, included high-volume, high-growth computer markets such as China, Russia, India, Pakistan, and Israel. In these countries, U.S. exporters had to meet licensing and prior notification requirements for sales of computers that operate as low as 2,000 MTOPS.
This regulatory scheme was the result of revisions made in 1995. Before the end of 1999, these guidelines were on a collision course with the latest product of Moore’s law: the Intel Pentium III Xeon chip, operating at 1,283 MTOPS. Business-grade e-mail servers linking two of these chips will run at 2,566 MTOPS — “supercomputers” for Tier 3 countries. Other business machines may soon break the Tier 2 threshold of 10,000 MTOPS, and a 2,500 MTOPS laptop is on the horizon. When high-tech executives plead that “yesterday’s supercomputer is today’s laptop,” they are right.
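Since the arithmetic here is simple, a short illustrative sketch may help. It uses only figures cited in this essay (the 1,283 MTOPS Pentium III Xeon and the 2,000 and 10,000 MTOPS tier thresholds) and treats Moore’s law as a steady doubling every eighteen months; the doubling assumption and the function names are illustrative, not an official projection.

```python
from math import log2

# Rough sketch (not an official projection): how quickly do commodity chips
# outrun fixed MTOPS export thresholds if performance doubles every 18 months?

def mtops_after(start_mtops, months, doubling_months=18.0):
    """Projected MTOPS rating after `months`, assuming steady doubling."""
    return start_mtops * 2 ** (months / doubling_months)

def months_to_reach(start_mtops, threshold, doubling_months=18.0):
    """Months until a chip starting at `start_mtops` crosses `threshold`."""
    return doubling_months * log2(threshold / start_mtops)

xeon = 1283.0              # Pentium III Xeon rating cited in the text
tier3, tier2 = 2000.0, 10000.0

print(f"Dual-chip server today: ~{2 * xeon:.0f} MTOPS (past the Tier 3 line of {tier3:.0f})")
print(f"A single chip crosses the Tier 3 line in ~{months_to_reach(xeon, tier3):.0f} months")
print(f"A single chip crosses the Tier 2 line in ~{months_to_reach(xeon, tier2):.0f} months")
print(f"Projected single-chip rating in three years: ~{mtops_after(xeon, 36):.0f} MTOPS")
```

On these assumptions, a dual-chip server is already past the old Tier 3 line, a single chip crosses it in roughly a year, and even the Tier 2 line falls within about four and a half years, which is why a six-month delay on every threshold revision is untenable.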
On July 1, the Clinton Administration announced a last-minute action to raise the MTOPS thresholds. Absent the correction, exporters and U.S. officials alike would have been burdened with unprecedented volumes of sales notifications and license applications, resulting in a high number of lost sales to foreign manufacturers.
As welcome as the revision in the MTOPS thresholds is, what is needed is a new commitment to keep the thresholds constantly updated so as to keep pace with the evolution of computing technology. Needless to say, the requirement that any revision cannot take effect until six months after its announcement must be eliminated.
Congressional opposition as well as the political climate created by recent Chinese espionage surely explain the Administration’s slow response to this problem. For both commercial and security reasons, a greater willingness to keep these regulations up to date is needed. Apart from generating income and jobs, high-tech exports serve an important national security purpose: they support the robust research and development budgets that created and maintain U.S. dominance in information technology. Such superiority provides an advantage that goes far beyond economics, and as was seen in Iraq and Kosovo, it wins wars.
Phil Peters is Vice President at the Lexington Institute, and served in the State Department during the Reagan and Bush administrations.
Safe Drinking Water: Politics Trumps Science
By Bonner R. Cohen, Ph.D.
Late last year, the Environmental Protection Agency (EPA) rendered one of the most troubling decisions in its stormy, 29-year history. Faced with having to choose between science and politics, the agency opted for the latter. In doing so, EPA took a perilous step toward undermining one of the pillars of public health in the United States: the purification of the nation’s drinking water supply.
Under mounting pressure from environmental groups to ignore the recommendation of the agency’s own scientists, EPA Administrator Carol Browner last December scrapped a science-based standard for chloroform in drinking water.
Browner’s decision reveals a great deal about what role, if any, science will play in forming the basis for EPA’s regulatory actions for the duration of the Clinton administration. It also sheds light on how closely Browner’s agency will follow its 1996 draft cancer risk guidelines, which acknowledge that exposure to carcinogens below a certain level, or threshold, often poses little or no threat to human health.
In March 1998, EPA proposed raising the Maximum Contaminant Level Goal (MCLG) for chloroform in drinking water from zero to 300 parts per billion (ppb). The recommendation came after EPA scientists at the agency’s Office of Water had undertaken a painstaking review of toxicological data on human exposure to chloroform going back 20 years, and after they had taken into account the threshold principle contained in the agency’s draft cancer guidelines. EPA’s proposal was hailed by scientists outside the agency (itself a newsworthy event), even drawing praise from the Society of Toxicology, the largest professional association of toxicologists in the world.
That praise, however, was not enough to save the agency’s science-based chloroform proposal from political sabotage. Moreover, in rejecting the recommendations of its own scientists, EPA also turned its back on a key requirement of the 1996 Safe Drinking Water Act, which directs the agency to use “the best peer-reviewed science.” That’s just what EPA scientists had done, only to be overruled by the agency’s politicized top brass.
Acknowledging the Trade-off
Chloroform is created when drinking water is chlorinated to remove microbial pathogens. Together with dibromochloromethane and bromodichloromethane, it belongs to a class of disinfectant byproducts (DBPs) known as trihalomethanes. Since trace amounts of disinfectant byproducts are an inevitable result of the water purification process, water suppliers in the US have come to see them as posing a far lower risk to public health than the pathogens that would otherwise remain in drinking water. Indeed, since chlorination was adopted by water systems across the US beginning in 1908, it has resulted in the virtual elimination of such deadly waterborne diseases as cholera, typhoid, dysentery, and hepatitis A.
A 1994 report published by the International Society of Regulatory Toxicology and Pharmacology stated that “the reduction in mortality due to water-borne infectious diseases, attributed largely to chlorination of potable water supplies, appears to outweigh any theoretical cancer risks (which may be as low as 0) posed by the minute quantities of chlorinated organic chemicals reported in drinking waters disinfected with chlorine.”
This view is supported by the American Academy of Microbiology: “It is important to point out that there is no direct or conclusive evidence that disinfection byproducts affect human health in concentrations found in drinking water…. Concerns over the toxicology of DBPs should not be allowed to compromise successful disinfection of drinking water, at least without data to support such conclusions.”
In proposing a 300 ppb MCLG for chloroform, agency scientists were in effect acknowledging that current levels of chloroform in drinking water are safe. EPA’s insistence on a zero standard for chloroform (unattainable in any event) means that water system operators will have to divert their limited resources away from the real threats to public health posed by microbial pathogens in drinking water and toward combating the fictitious risks posed by disinfectant byproducts.
While the agency’s original chloroform proposal was welcomed by scientists outside EPA, it did not go down well with environmental groups, many of which have been carrying on a longstanding crusade against chlorine and chlorinated compounds. Led by the Natural Resources Defense Council (NRDC), green groups bombarded EPA with negative comments on the proposed MCLG for chloroform. Brushing aside expert scientific opinion, NRDC urged the agency to “reject the unproven and probably incorrect hypothesis that there is a threshold for its carcinogenic effect, a theory that ignores human evidence of chlorination byproducts’ carcinogenicity.”
Historically, the big, Washington-based environmental groups have been allied with EPA (particularly Browner’s EPA) and have been the recipients of generous grants from the agency. Yet for them to acquiesce in the agency’s adoption of a science-based standard which acknowledges that there is little or no risk below a certain threshold is to undermine one of the key tenets of modern environmentalism, which, as the NRDC statement makes clear, denies the existence of such thresholds.
This latest triumph of environmental correctness over science will cast a long, foreboding shadow over the nation’s public health policies for years to come. “If we cannot use the abundant scientific information available to make rational decisions on chloroform,” asks Michigan State University toxicologist Jay Goodman, “then what chemical can we make a respectable decision on?”
Dr. Bonner R. Cohen is a Senior Fellow at the Lexington Institute.
Hypoxia: The Dead Zone Lives
By Dennis Avery
One of the ironies of this fast-moving age is that those who benefit most from innovative technologies often appreciate them the least. Nowhere is such ingratitude more prevalent than in the area of modern, high-yield agriculture, where urban America's ignorance of the forces at work can have devastating consequences.
By Congressional order, the executive branch must devise a plan for mitigating hypoxia in the Gulf of Mexico by May 30, 2000. (Hypoxia is the technical name for a low-oxygen zone in which fish cannot live.) Congress passed the mandate to save the rich fishery of the Gulf from destruction at the hands of over-enthusiastic Midwestern farmers. As the theory goes, farmers are applying too much fertilizer, which is running off into the waters of the Mississippi River system. The runoff has apparently caused a huge and expanding “dead zone” in the Gulf of Mexico, threatening to destroy the Gulf, its fish, and the traditional livelihood of the colorful fishing towns along the Gulf Coast.
The hypoxia problem has been “worked” by a White House Task Force, marine researchers, the Sierra Club Legal Defense Fund, and a whole roster of interagency working groups. Two of their proposed solutions are already on record:
- Cut back the use of fertilizer on farms in the American Midwest by 20 percent, thereby substantially reducing the runoff of chemical fertilizer into the rivers by a corresponding percentage.
- Convert 24 million acres of the Midwest’s current farmland into new wetlands and forests to absorb more of the nitrogen from the farms. This will have the associated benefit of supporting more wildlife.
This heartwarming picture of mutual cooperation among bureaucrats, scientists, and special-interest groups working together to save the Gulf fisheries ignores several key facts:
- The rich marine life of the Gulf depends on the nutrients which come down the Mississippi. The area around the “dead zone” is known by local fishermen as “the Fertile Crescent.” The Louisiana Department of Fisheries has warned for years that reducing the nitrogen in the Mississippi may starve the Gulf fishery.
- Midwest farmers’ use of fertilizer has remained essentially the same since 1982, while their corn yields have risen about 25 percent. Crop growth is obviously taking up more of the nitrogen they apply. The U.S. Geological Survey’s nitrogen readings in the lower Mississippi have not risen.
- The “dead zone” appears to be neither expanding nor human-driven. Rather, it is a natural phenomenon connected to rainfall patterns in the Mississippi Valley. First noted in the 1930s, the area was measured in the 1980s at 3,500 square miles (0.5 percent of the Gulf surface area). In the drought year of 1988, the hypoxic zone essentially disappeared. After the huge Midwest floods of 1993 — today regarded as a 500-year event — the zone doubled in size, spanning about 7,000 square miles through 1997. In 1998, it receded to 4,800 square miles, the likely reason being that the Gulf’s ecosystem has worked through the huge surge of nutrients from the 1993 floods and the zone is returning to its normal size.
- There are similar zones at the mouths of 40 other rivers around the world — wherever nutrient-rich fresh water enters a bay.
- The White House Task Force says it can find no economic or ecological damage from the current nutrient flows.
Imagine the regulatory embarrassment! Here the President is engaged, the legislative skids greased, the plan developed — and no problem can be found. Undaunted, the hypoxia team wants to impose its agenda anyway. If it is implemented, we can expect the following damages:
- Fish catches in the Gulf of Mexico will decline, and the environmental movement will blame the reduction on “pollution.”
- Midwestern farmers will obtain lower corn yields, and production from the 24 million acres converted to new wetlands and forests will disappear entirely. U.S. corn production, heavily focused in the Midwest, will decline by perhaps 20 percent (50 million tons per year).
In a world demanding three times as much farm output in the coming decades, and four to five times as much meat, lost Midwest production will probably be made up in densely populated industrializing economies such as India and Indonesia. Corn grown on marginal land in these two countries is likely to average only about 0.4 tons per acre, instead of the 3.2 tons per acre harvested in the Midwest.
The resulting environmental damage will be far more real than the imaginary hypoxia crises blamed on Midwest farmers. India’s crop production is already edging closer and closer to its tiger preserves, with more farmers getting eaten and more tigers shot as “man-eaters.” Similarly, Indonesia will be forced to accelerate the clearing of tropical forests to grow low-yield crops of chicken feed for its expanding poultry industry. Fighting a hypoxia problem that doesn’t exist in the Mississippi Valley will thus result in losses of 60 to 100 million acres of tropical wildlands: a trivial price to pay to protect the egos (and salaries) of our hardworking bureaucrats in Washington.
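A minimal sketch of the arithmetic behind that acreage figure may be useful. The yields are Avery’s (0.4 tons per acre on marginal tropical land versus 3.2 in the Midwest); the assumption that roughly 25 to 40 million tons of the lost output would be replaced abroad is mine, chosen only to show how a range of 60 to 100 million acres can arise.

```python
# Rough arithmetic behind the acreage claim, using the yields cited in the text.
# The share of lost output assumed to shift overseas (25-40 million tons) is an
# illustrative assumption, not a figure from the essay.

midwest_yield = 3.2      # tons of corn per acre (text figure)
tropical_yield = 0.4     # tons per acre on marginal land in India/Indonesia (text figure)

for shifted_tons in (25e6, 40e6):
    acres_needed = shifted_tons / tropical_yield
    midwest_equivalent = shifted_tons / midwest_yield
    print(f"{shifted_tons/1e6:.0f}M tons replaced abroad -> "
          f"{acres_needed/1e6:.0f}M tropical acres "
          f"(vs. ~{midwest_equivalent/1e6:.0f}M Midwest acres)")
```

The eight-to-one yield gap is what turns a modest shortfall in the Midwest into tens of millions of acres of new cropland elsewhere.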
Dennis Avery is the Director of Global Food Issues at the Hudson Institute.
Biotechnology: EPA vs. Plants
By Henry I. Miller, M.D.
The home gardening season began this year in Northern California with a public service advertising campaign that asks, “Have you over-sprayed your garden?” The run-off of agricultural chemicals — especially fertilizers, herbicides and pesticides — directly into San Francisco Bay poses a perennial problem. Why, then, is the Environmental Protection Agency (EPA) obstructing an innovative and environmentally friendly solution to this problem in California and around the nation?
Building on a succession of anti-biotechnology policies beginning in the mid-1980s, EPA turned its sights several years ago on what was once one of biotechnology’s most promising applications: crop and garden plants genetically modified for enhanced pest- and disease-resistance — that is, the ability to repel insects, viruses, bacteria and fungi. In November 1994 the agency announced that it would begin requiring case-by-case regulatory review as “pesticides” of this entire category of products that had not previously been thought to require regulation at all.
Of course, genetically altered plants are nothing new. Plant breeders, farmers and consumers all possess extensive experience with crops and foods that have been genetically modified for pest resistance. Using techniques that pre-date gene-splicing, scientists in recent decades have transferred genes widely across natural breeding boundaries, markedly increasing agricultural productivity. Most often, plant breeders have sought resistance to commercially important plant pests, such as insects and bacteria in tomatoes, and viruses and fungi in potatoes. These “genetically engineered” plants — which require no governmental evaluations of any kind — are routinely bought by American and European consumers at their local supermarkets.
The last major crop epidemic that occurred in the United States dramatically illustrates the superiority of gene-spliced plants. In 1970 a fungus causing a disease known as “Southern corn leaf blight” destroyed approximately 15 percent of the nation’s corn crop, costing farmers some 20 million metric tons of corn worth about one billion dollars. For several years most of the corn in the U.S. had been grown from hybrid lines containing so-called “Texas cytoplasm male sterility” (the extensive use of male sterility obviates the need to remove plant tassels by hand in order to eliminate pollen production). Unknown to plant breeders, the hybrid strain was not only unable to form pollen, but was also more sensitive to Southern corn leaf blight.
By contrast, gene-spliced plants could have been more easily saved from the blight. Genetic changes introduced with more precise gene-splicing techniques move or alter only small numbers of genes, thus allowing for greater predictability of the final plant. As the National Research Council’s 1989 report, “Field-Testing Genetically Modified Organisms,” concluded:
“[Gene-splicing] methodology makes it possible to introduce pieces of DNA, consisting of either single or multiple genes, that can be defined in function and even in nucleotide sequence. With classical techniques of gene transfer, a variable number of genes can be transferred, the number depending on the mechanism of transfer; but predicting the precise number or the traits that have been transferred is difficult, and we cannot always predict the phenotypic expression that will result. With organisms modified by molecular methods, we are in a better, if not perfect, position to predict the phenotypic expression.”
EPA’s draconian decision to regulate an entire class of negligible-risk crop and garden plants genetically improved with new biotechnology has outraged the scientific community and virtually eliminated commercial R&D in this sector. Wholly outside of scientific norms, EPA’s assault on plant varieties crafted with the new biotechnology is so potentially damaging that it has stimulated unprecedented action by the scientific community. In 1996, eleven major scientific societies representing more than 80,000 biologists and food professionals published a report excoriating EPA’s proposal. The critique observed that, contrary to EPA policy, the safety of a new substance synthesized by a plant depends on the biological actions of the substance, the amount present, and whether the substance is in the portion of the plant that will be eaten — not on the mere fact that it’s intended to protect against a plant pest.
The report warned that if EPA policy were implemented, it would discourage the development of new pest-resistant crops, prolong and increase the use of synthetic chemical pesticides, increase the regulatory burden for developers of pest-resistant crops, expand federal and state bureaucracies, limit the use of biotechnology to larger developers capable of paying inflated regulatory costs, and handicap the United States in competition for international markets. In short, this anti-environment, anti-innovation and anti-consumer scheme has nothing at all to recommend it, except the enhanced care and feeding of federal regulators.
In October 1998, the prestigious Council on Agricultural Science and Technology, an international consortium of 36 scientific and professional societies, confirmed the eleven societies’ 1996 criticisms of the EPA. In its issue paper, “The Proposed EPA Plant Pesticide Rule,” the consortium characterized EPA’s approach as “scientifically indefensible,” concluding that treating gene-spliced plants as pesticides would “undermine public confidence in the food supply.”
Fearing virtual elimination of biotechnology applications for plants in universities, the scientific societies are now trying to cut their losses by negotiating with the EPA. While the agency seems prepared to slightly modify the proposed regulations in order to neutralize some of its scientific critics, the underlying premise of EPA’s approach continues to violate one of the cardinal principles of regulation — the degree of oversight of a product or activity should be commensurate with the risk.
The sole trigger of EPA’s rule was, and remains, the use of gene-splicing techniques to enhance a plant’s pest- or disease-resistance. Traditionally-bred plants are exempted from the rule — no matter how pathogenic, toxic or otherwise dangerous to the environment the organism may be — while most gene-spliced plants will be captured regardless of risk. Moreover, EPA’s assertion during these negotiations that it intends to regulate only plants modified to express substances found to be toxic to other species ignores both basic plant biology and the history of plant breeding. Virtually all plants contain substances hazardous to predators and pests, for without such protection the plants could not survive. And, as mentioned above, decades before current gene-splicing techniques were implemented, plant breeders had been moving genes across so-called natural breeding boundaries to enhance toxicity to various pests and pathogens.
The proposed EPA regulations cannot be fixed with a little tinkering — certainly not by expanding a list of genetic modifications exempt from regulation or by substituting the term “plant-expressed protectants” for “pesticides,” as the agency has suggested. The rule needs to be fundamentally revised, made scientifically defensible and genuinely risk-based, and made to focus on real risks instead of on the mere use of gene-splicing techniques. Otherwise, the result will be a “compromise” rather like Galileo and the clerics of his day deciding to agree that all the planets except the Earth revolve around the Sun.
Henry I. Miller is a Senior Research Fellow at Stanford University’s Hoover Institution and the author of Policy Controversy in Biotechnology: An Insider’s View (R. G. Landes Co., 1997).
The Endangered Species Act: Shoot, Shovel, and Shut Up
By R.J. Smith
President Clinton and Interior Secretary Babbitt recently announced the recovery of the Bald Eagle and its coming removal from the Endangered Species List. Although intended to demonstrate the achievements of their administration and the merits of the Endangered Species Act, the Bald Eagle’s recovery has been achieved without the assistance of either. Instead, the ban on the use of DDT in 1972 eliminated the major cause of the eagle’s reproductive failure, and the subsequent reintroduction and restoration of eagle populations was almost entirely achieved through techniques developed by the private Peregrine Fund and first put into practice by the New York State Department of Environmental Conservation and then adopted by other states.
Just over a quarter of a century ago, on December 28, 1973, President Richard Nixon signed into law the Endangered Species Act (ESA) of 1973. With the stroke of a pen he created what is arguably the most powerful and far-reaching law in the nation’s history, one which seems to trump all other laws in according priority to endangered species over all other national concerns, and which was greeted enthusiastically by Congress and the environmental community as “the nation’s principal tool for protecting species from extinction.” Yet twenty-five years later the Act is mired in controversy, seven years overdue for reauthorization in a Congress that can find no way to fix this tragically flawed and broken law.
The ESA is causing tremendous harm to the very species it was designed to protect. Indeed, in 25 years not one single species has recovered and been delisted because of the Act. The goal of the Act is to list imperiled species, assist them in recovering, and then “delist” them (i.e., remove them from the Endangered Species List). Of some 1,400 species on the Endangered Species List, a mere 27 have been delisted. According to the U.S. Fish and Wildlife Service (FWS), which administers and enforces the Act, seven of the 27 were delisted because they became extinct while on the List. Certainly not much of an achievement. Nine species were delisted because of “data error.” This means that the species should not have been listed in the first place. Increasingly, this category is spotlighting the tragic flaw in the Act. Because of the overriding power of the Act to halt growth, development or projects on public or private lands when they might represent a harm to a species, the environmental community has used the Act as a means of achieving cost-free national land-use control and federal zoning.
Finally, the FWS claims that the other 11 delisted species were recoveries. However, an analysis by the Competitive Enterprise Institute reveals that none of the FWS’s “recoveries” qualify. Eight were actually data errors, which the FWS doesn’t want to admit. The other three species have actually recovered, but for reasons other than the Act.
The Act’s record, then: of the 27 species delisted, seven went extinct, seventeen were data errors, and three recovered for reasons other than the ESA. Thus the Act has not recovered a single species in a quarter of a century.
The fatal flaw in the Act is that it has been used primarily as a means of cost-free national land-use control, rather than as a means of protecting rare species. In nearly every corner of the nation, landowners who happen to have threatened or endangered species on their lands, or who simply have habitat that might be used by endangered species, are routinely prevented from using their lands or property, including such activities as harvesting trees, planting crops, grazing cattle, irrigating fields, clearing brush along fencelines, discing firebreaks around homes and barns, or building a home.
The lesson is clear: the better a steward a landowner is, and the more wildlife habitat he maintains on his land, the more likely he is to be punished by losing the use of his private lands. Landowners cannot afford to risk leaving much of their land in wildlife habitat. To do so is to risk losing all economic use and value of the land while, irrespective of the U.S. Constitution’s Fifth Amendment, receiving no compensation for that loss. In fact, fear of the Act drives landowners to actively remove habitat from their lands, especially habitat that could be used by endangered species.
“The incentives are wrong here. If I have a rare metal on my property, its value goes up. But if a rare bird occupies the land, its value disappears,” said U.S. Fish and Wildlife Service official, Sam Hamilton. “We’ve got to turn it around to make the landowner want to have the bird on his property.”
This is the inevitable result of the ESA’s punitive nature. By threatening landowners who make room for nature with the uncompensated loss of their land or crops, it encourages landowners to get rid of wildlife habitat and sterilize their lands. It creates the “shoot, shovel and shut-up syndrome,” whereby wildlife is viewed as a liability — as a threat.
The most important step Congress can take is to remove the perverse incentives in the Act and stop making stewardship a liability. This means no longer penalizing owners of habitat by preventing them from using their land. The key is to work with the nation’s private landowners instead of against them.
The only way to make the Endangered Species Act work for both people and species is to replace the existing compulsory, regulatory Act with a voluntary, non-regulatory, incentive-based Act, whereby the government would have no power to take or regulate private property in order to protect endangered species and/or their habitat. If the government wanted to protect habitat on private lands, it would have to work out mutually compatible, voluntary, contractual arrangements with the landowners. This would be very similar to how the Department of Agriculture “protects” highly erodible lands on the nation’s farms, by paying farmers to place some of their land in the Conservation Reserve Program (CRP) for a set term of years. Agriculture’s CRP has landowners clamoring to join.
The truly significant aspect of a voluntary, non-regulatory law would be the elimination of the perverse incentives in the current Act. Landowners would no longer be afraid of doing good, of sharing their lands with wildlife. Landowners would be willing to voluntarily maintain wildlife habitat and help endangered species. Therefore, the costs associated with a non-regulatory law would be far less than those of maintaining a draconian regulatory law and then paying compensation for takings or for the loss of the use or value of private lands.
Thus, paradoxical as it may seem, a non-regulatory law would be the only endangered species protection law that would not be a budget buster; the only law that would not require a vast new source of funding. There is a model for such a law, HR 2364, the “Endangered Species Recovery and Conservation Incentive Act of 1995,” which was introduced with little fanfare in the 104th Congress and received little attention because it was considered too novel.
The 106th Congress has a unique opportunity to put aside the divisiveness and rancor that have characterized the ESA debate. All sides know the Act is a failure that harms people and their property as well as wildlife and its habitat. Let the people who care about property rights and liberty and the people who care about endangered species and biodiversity come together and replace the Endangered Species Act with one which will be good for people and species.
R.J. Smith is Senior Environmental Scholar at the Competitive Enterprise Institute in Washington, D.C.
PCBs: EPA Occupies the Hudson Valley
By Bonner R. Cohen, Ph.D.
What do the appearance of large schools of healthy striped bass in the Hudson River and a scientist willing to revise her earlier findings have in common? Each puts a nail in the coffin of yet another environmental scare.
The rise and fall of PCBs (polychlorinated biphenyls) as a source of national anxiety is unique in that the scientist who gave rise to the scare is now the one pulling the plug on the whole enterprise.
In 1975, Renate Kimbrough, M.D., then with the Centers for Disease Control and Prevention (CDC), reported that laboratory rats fed huge doses of PCBs developed liver cancer. Her widely publicized findings prompted Congress in 1976 to ban the manufacture and use of PCBs. In the spirit of the time, Congress concluded that if PCBs caused cancer in rats they could do the same in humans and acted accordingly. What further unnerved people was that PCBs don't degrade easily; they persist, albeit in minute and ever-decreasing quantities, in the air, water, soil, and in humans and animals.
PCBs were widely used in electrical equipment from 1929 to 1977. They replaced highly-combustible mineral oil insulating fluids, which for decades had been responsible for an untold number of deadly fires across the country.
Now, twenty-four years after her initial findings provoked Congressional action, Kimbrough has reached quite a different conclusion. The March 1999 issue of the Journal of Occupational and Environmental Medicine contains a peer-reviewed study Kimbrough conducted with Martha Doemland, Ph.D., an epidemiologist with the Institute for Evaluating Health Risks. They found no association between actual exposure to PCBs and death from cancer or any other disease. The Kimbrough-Doemland study focused on the 7,075 men and women who worked between 1946 and 1977 in two upstate New York General Electric Co. factories that used PCBs in the manufacture of electrical capacitors. It compared the number and causes of death for the 1,195 members of the study population who died with national and regional averages.
The average follow-up time for the 7,075 workers was 31 years, providing a latency period to determine whether there was any increase in cancer mortality. Some of the workers in the study had PCB levels in their blood as high as several thousand parts per billion (ppb). By contrast, the average PCB levels found in the blood of people who have been tested in the United States range from 4 to 8 ppb, according to the Agency for Toxic Substances and Disease Registry (ATSDR).
"This is a significant study and should be factored into any public discussion of PCBs and human health," commented Arthur C. Upton, M.D., former director of the National Cancer Institute and a professor at the Robert Wood Johnson Medical School.
That discussion has been going on for years, and it has been fueled by the Environmental Protection Agency's (EPA) plans to rid the Hudson River of PCBs. Unfortunately, EPA's cleanser of choice is none other than Superfund, the nation's notoriously troubled "hazardous" waste cleanup program.
Saying there are "hot spots" of PCBs in the river, and that these chemicals, once buried in sludge and now dispersing into the water, "probably cause cancer in people," EPA Administrator Carol Browner asserts that PCBs pose "a serious threat to public health." To the horror of local residents, EPA is proposing to make parts of the Upper Hudson River Valley into a giant Superfund site and wants General Electric to dredge the river until all traces of the PCBs are gone. The cost of the PCB cleanup is estimated between $50 million and $100 million.
What EPA fails to appreciate is that dredging the PCBs will only stir them up, thereby defeating the purpose of the whole exercise. If left alone, the PCBs will, over time, dissipate.
In fact, recent tests by New York State biologists show the latter is already happening. The biologists have found that levels of PCBs detected in fish caught south of Poughkeepsie have dropped enough to meet federal safety standards. As anglers eagerly pull in striped bass, some of them weighing up to 40 pounds, Administrator Browner's argument that the dissipating PCBs pose a "serious threat to public health" begins to ring hollow. And with the cancer scare debunked by the Kimbrough-Doemland study, it's no longer clear what "problem" EPA wants to solve.
Moreover, by imposing Superfund on this picturesque region of upstate New York, EPA is knowingly subjecting local communities to the same miseries that have followed the misbegotten statute in other parts of the country. Cleanup at a typical litigation-ridden Superfund site takes 12 to 15 years. But since EPA's "remedy" for the Hudson River — dredging — will only stir up the PCBs, the "cleanup" could go on indefinitely. Like any other river, the Hudson flows, meaning that whatever substances are set free during dredging, including PCBs, will be pushed downstream by the current. This opens the door to an ever-expanding Superfund site, with EPA assuming the status of a quasi army of occupation in the Hudson Valley. Meanwhile, property values and tax revenues will plummet as economic activity in the proximity of the never-ending Superfund site slows down.
Like all scientific research, the Kimbrough-Doemland study, which was funded by a grant from General Electric, will have to stand the acid test of peer review. Unfortunately, the same does not hold true for EPA. If EPA is determined to ignore science and common sense and impose its will on the hapless residents of the Upper Hudson River Valley, there is little anyone can do about it — until, that is, Congress finally holds the agency accountable for such foolish and environmentally harmful decisions.
Dr. Bonner R. Cohen is a Senior Fellow at the Lexington Institute.
Factory Farming: Destroying Parkland to Save Rivers
By Dennis Avery
The search for scapegoats is as old as humanity itself. It provides a quick and easy answer to a problem which, upon closer inspection, may turn out to be a lot more complex. In the realm of environmental regulatory policy, targeting a scapegoat, rather than seeking to understand the variables involved, leads to poor environmental decisions. And if we’re not careful, one of those poor decisions may end up polluting our nation’s rivers and streams.
When a tiny, nasty marine critter (Pfiesteria piscicida) caused a public panic by attacking a few fish in three small rivers of Maryland’s Eastern Shore in the fall of 1997, the Chesapeake Bay Foundation and others were quick to blame confinement poultry farms. Never mind that there were no poultry at all in one of the three watersheds. Or that the local poultry farms were storing their manure and using it to fertilize growing crops (like organic farmers). Or that the region was in a mild drought, so there was no rainfall to wash poultry waste into the water. There were dead fish on TV all over the country, and this was a prime opportunity to condemn “factory farming.”
JoAnn Burkholder, a researcher at North Carolina State University, was quoted as saying she was “almost certain” that the Pfiesteria attacks were due to poultry and hog manure. (Historically, peer-reviewed journals haven’t given much credit to a researcher who “almost proves” a hypothesis.) Vice President Gore announced it was a case for the federal government, declaring that the nation obviously needed new EPA regulations on “confined animal feeding operations” (CAFOs). Little family farms were a boon to the nation, of course, but these big CAFOs were a vicious threat to our water quality. Jumping on the bandwagon of blame, a March 1998 EPA report, “Strategy for Addressing Environmental and Health Impacts from Animal Feeding Operations,” concluded that U.S. agriculture is the leading cause of impaired rivers, streams and lakes, contributing up to 60 percent of the pollution in surveyed rivers and streams.
While all this may sound frightening, Richard Halpern, a former policy planning official for Rockingham County, Virginia, cautions that “the operative word is surveyed.” Rockingham is the nation’s second largest poultry-producing county, and Halpern specialized in water quality issues affecting the poultry industry. His analysis of EPA’s efforts at reforming water quality standards that apply to livestock operations reveals that there is considerably less in the agency’s pronouncement than meets the eye. Truth be known, only 17 percent of the nation’s river miles have been surveyed, and of that 17 percent, just 37 percent — 6.3 percent of the nation’s total river miles — are known to be impaired. Agriculture is estimated to be responsible for 60 percent of that impairment, with animal feeding operations of all kinds — including dairy, beef feedlots, poultry and swine — estimated by EPA to “adversely impact 16 percent of those waters.”
“In other words,” Halpern calculates, “livestock affects 16 percent of 60 percent of 37 percent of 17 percent of the nation’s rivers and streams. In the end, that’s less than 1 percent total.”
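For readers who want to check Halpern’s arithmetic, a minimal sketch simply multiplies the figures as quoted above (the variable names are mine):

```python
# Halpern's chained percentages, worked out step by step (figures as quoted above).
surveyed = 0.17            # share of U.S. river miles actually surveyed
impaired = 0.37            # share of surveyed miles found impaired
from_agriculture = 0.60    # share of that impairment attributed to agriculture
from_livestock = 0.16      # share of agricultural impairment tied to animal feeding

share = surveyed * impaired * from_agriculture * from_livestock
print(f"Livestock-linked impairment: {share:.4f} of all river miles (~{share*100:.1f}%)")
# -> roughly 0.6 percent, i.e. "less than 1 percent total"
```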
Even that conclusion is doubtful. The big hog farms were already under a zero-discharge mandate. The Clean Water Act had made it illegal for hog farms to discharge any wastes into the streams. (They spread the wastes on corn and forage crops.) Any hog farm polluting the rivers of any state could immediately be shut down. When North Carolina State and Auburn universities audited North Carolina’s largest hog producers, they found 95 percent of the farms complying with regulations — and the scofflaws were little outdoor farmers! The Black River watershed in North Carolina, for instance, drains the most intensive hog farming in America. But the Black River is still rated “outstanding” in water quality. In fact, state data show the river’s nutrient content has not increased even though its hog population has gone from 2 million to 9 million hogs in the past 15 years! Zero discharge management works.
Indeed, big hog farms are protecting our waters more effectively than are our cities. Modern sewage treatment takes only half the nitrogen and phosphate out of urban wastes. For every pound of nitrogen that U.S. confinement hog farms spread as fertilizer on crops, a city dumps two pounds (legally) into a river. Under a proposal put forward by EPA in August, the Clean Water Act would be extended to cover some 18,000 large-scale hog and dairy operations. The amendment would require these farmers to obtain pollution permits from state environmental agencies. Washington will now be competing with state governors for credit for shutting down the most efficient and environmentally constructive livestock and poultry farms in history.
If the new regulations force America to raise its chickens on free range, we’ll also have to convert another 5,000 square miles of forest or parkland to chicken pasture. As the world goes from 1 billion breeding hogs to 3 billion in the 21st century, raising them outdoors will cost one million square miles of wildlife habitat, actually increasing by vast amounts the levels of manure washed into our rivers and streams. Instead of once again blaming productive citizens for creating environmental problems that aren’t their fault, isn’t it time EPA rewards our hardworking farmers for a job well done?
Dennis Avery is the Director of Global Food Issues at the Hudson Institute.
How Common Chemicals Became “Toxic Pollutants”
By Hugh Wise, Ph.D.
In implementing the idealistic goals of environmental legislation, science and technology tell us what can be done; economics tells us what should be done; politics tells us what will be done.
The term “toxic” has become such a familiar part of the lexicon of environmental rhetoric that much of the public is now conditioned to interpret the term literally, particularly when it is used as a modifier for “pollutant.” Having been taught to fear even low-level exposure to industrial chemicals (“chemophobia”) — despite negligible risk for all but the most sensitive individuals — many people today are unaware that “toxic pollutant” is actually a regulatory term that was introduced by the 1972 Clean Water Act (CWA).
The labeling of industrial chemicals as “toxic” seems to have grown out of heightened public concern about cancer during the 1960s. Rachel Carson’s 1962 bestseller, Silent Spring, popularized the idea that exposure to manmade chemicals could cause cancer. Though Silent Spring could hardly be called a scientific treatise — it refers to pesticides as “chemical death rain” and is filled with allusions to “witchcraft,” “devils,” and “evil spells” — the book had an undeniable effect on public consciousness and, ultimately, on environmental policy.
Indeed, green groups, hostile to the chemical industry and eager to raise money on the public's fear of cancer, were so successful in trumpeting Silent Spring's message that the book's overblown rhetoric soon became standard fare in much of the media's reporting on environmental issues.
The mass media’s repeated depiction of industrial chemicals as “cancer-causing” agents predictably led people to believe that these chemicals are inherently toxic, an implication without scientific merit. At the same time, a basic principle of toxicology went unnoticed: the dose makes the poison.
President Nixon’s subsequent 1971 declaration of war on cancer was used to justify a massive infusion of funds into the public health bureaucracies, such as the fledgling National Cancer Institute, and accelerated high-dose animal testing, primarily of industrial chemicals. Substances exhibiting even the slightest carcinogenic activity during such tests were characterized as “suspected carcinogens.” Few so-called “natural” chemicals, to which people are more commonly exposed, were similarly evaluated, or if they were, the results were not widely reported. Consequently, the public was left ignorant of the fact that many of the chemicals in foods they eat every day also show carcinogenic activity in high-dose animal testing. Thus was the false notion perpetuated that cancer is associated exclusively with exposure to industrial chemicals.
Growing public concern about environmental degradation and its possible links to cancer enabled Senator Edmund Muskie and his congressional allies to override President Nixon’s veto of the Federal Water Pollution Control Act of 1972 (PL 92-500). In section 101(a), the statute, also known as the 1972 Clean Water Act (CWA), declared it a national goal to “eliminate the discharge of pollutants into navigable waters by 1985.” Yet section 502(6) defined a “pollutant” so broadly as to include almost anything (even sand and rocks) that EPA might decide to regulate. Hence, this goal was widely interpreted as calling for the eventual zero discharge of wastewater containing anything regulated as a “pollutant.”
Section 101(a) also gave legal impetus to the term “toxic pollutant” by stating in part that “…it is the national policy that the discharge of toxic pollutants in toxic amounts be prohibited.” Pursuant to this policy, EPA was required to publish a list of chemicals which were to be designated as “toxic pollutants.” As EPA Administrator William Ruckelshaus later pointed out, however, the 1972 CWA and other environmental statutes were based on mistaken legislative assumptions: that EPA knew which chemicals — and what amounts of these chemicals — are “toxic”; how to measure these substances at trace levels; and how to regulate these chemicals to acceptable levels at reasonable costs.
With regulatory attention initially focused on so-called “conventional pollutants” and burdened by many other responsibilities imposed by the 1972 CWA, EPA moved slowly in publishing a list of “toxic pollutants.” This delay prompted a 1975 lawsuit by the Environmental Defense Fund (EDF), the Natural Resources Defense Council (NRDC), et al., charging EPA with failure to include certain chemicals on the list of “toxic pollutants” for which standards were to be proposed. In 1976, EPA and its plaintiff allies settled the lawsuit, after which EPA subjected itself to a consent decree with several lists of chemicals appended.
Using a provisional list of candidate chemicals derived from earlier studies of industrial chemicals detected in water samples or evaluated in dose-exposure testing on fish and aquatic organisms, lawyers for EPA, NRDC and EDF negotiated the designation of “65 compounds and classes of compounds” as “toxic pollutants.” The 65 were listed in Appendix A of the consent decree and later referenced, as Table 1 of Committee Print Numbered 95-30 of the Committee on Public Works and Transportation of the House of Representatives published in the Congressional Record, in section 307(a) of the 1977 Amendments to the CWA. That is how some common chemicals became “toxic pollutants” — not because they are inherently toxic (the dose makes the poison) — but merely to satisfy a requirement of the 1976 consent decree.
Since it was, and still is, impractical to analyze water samples for the nebulous classes of compounds contained in the consent decree, EPA specified a list of 129 chemical analytes (later reduced to 126) which became widely known in the regulated community as “priority pollutants.” For its part, EPA began to refer to these chemicals either as “toxics” or, more ominously, “toxic pollutants of concern.” EPA even mixed terms in water quality criteria by dubbing these chemicals “priority toxic pollutants.”
The “toxic” label crept into other environmental legislation as well. When the Toxic Substances Control Act (TSCA) was implemented, the list of “toxic pollutants” from the CWA was expanded to include a class of chemicals identified as “toxic substances.” Any chemical catalogued as a “toxic pollutant” under the CWA can also be defined as a “hazardous substance” under the Resource Conservation and Recovery Act (RCRA). Similarly, a list of chemicals defined as “hazardous air pollutants” (“air toxics”) under the Clean Air Act (CAA) included many of the CWA’s “toxic pollutants.” The Superfund Amendments and Reauthorization Act (SARA) also requires annual reporting of various “toxic chemicals,” which are recorded in a so-called Toxic Release Inventory (TRI).
All this is toxic semantics at its deceptive best. Indeed, TRI chemicals are not only listed, but also reported on the basis of amounts that are produced or used. Even though quantity alone is no measure of environmental risk, the relative toxicity of wastes has thus come to be ranked by TRI chemical content.
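The distortion is easy to demonstrate. The sketch below is purely illustrative, written in Python with hypothetical chemical names, release quantities, and hazard weights rather than actual TRI data; it shows how a ranking by pounds released can be the reverse of a ranking in which each pound is weighted by relative toxicity.

# Illustrative sketch only: ranking chemicals by reported pounds released,
# as a TRI-style tally invites, versus weighting each pound by a relative
# toxicity factor. All names and numbers below are hypothetical.

releases = {              # pounds released per year (hypothetical)
    "Chemical A": 500_000,
    "Chemical B": 10_000,
    "Chemical C": 50_000,
}
relative_toxicity = {     # unitless hazard weights (hypothetical)
    "Chemical A": 0.001,
    "Chemical B": 5.0,
    "Chemical C": 0.1,
}

by_quantity = sorted(releases, key=releases.get, reverse=True)
by_weighted = sorted(releases,
                     key=lambda c: releases[c] * relative_toxicity[c],
                     reverse=True)

print("Ranked by pounds released:        ", by_quantity)
print("Ranked by toxicity-weighted score:", by_weighted)

With these hypothetical numbers, the chemical released in the largest quantity ends up last once hazard per pound is taken into account, which is precisely the information a tonnage tally cannot convey.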
As impressive as the various lists of “toxic” or “hazardous” chemicals found in many of our environmental statutes may appear, they tell us nothing about risks these substances pose in the real world. If you ask the wrong questions, you will get the wrong answers. Yet thanks to the loose language inserted into the Clean Water Act nearly three decades ago and the arbitrary designation of industrial chemicals as “toxic pollutants” in a long-forgotten consent decree, our vocabulary and many of our environmental regulations have been taken on a semantic ride that has been harmful to both.
Dr. Hugh Wise is an environmental scientist with the U.S. Environmental Protection Agency’s Office of Water. The views expressed in this article are his own and not those of EPA.
EPA: Science without Biology
By David L. Lewis, Ph.D.
On March 29, 1999, NASA officials gathered in Washington, D.C. to discuss an upcoming mission to Mars. They met at the headquarters of the U.S. Environmental Protection Agency (EPA).
Why the close encounter between the Red Planet and the Red Tape Empire? As the space agency prepares to send an unmanned craft to bring back soil and rock samples from Mars, government scientists want to avoid contaminating the planet with microbes from Earth and vice versa. In a memo to fellow employees, Henry L. Longest II, deputy assistant administrator for EPA’s Office of Research and Development, explained the agency’s position: “As human exploration of Mars and other planets moves from the realm of fantasy to real possibilities, a host of environmental questions arise.” (“Job Expansion,” Editorial, The Washington Times, 30 Mar. 1999: A6.)
For those of us who study microorganisms for a living, it would be enlightening if Mr. Longest would tell us when EPA itself will move “from the realm of fantasy to real possibilities.” Is Mr. Longest, for instance, aware that almost all of the chemicals regulated by EPA, once introduced into the environment, are transformed by Earthling microorganisms into other chemicals with totally different properties? Yet in all of its thirty years of existence, the agency has never developed a reliable means of predicting how long industrial pollutants will persist in the environment or what chemicals they will be transformed into by the organisms that inhabit the earth’s soil and water. Instead, a handful of old-timers in EPA’s D.C. program offices ignore elementary scientific considerations as they regulate the entire U.S. chemical industry according to political and bureaucratic agendas.
In his memo on protecting extraterrestrial environments, Longest pondered: “Looking at lessons from our own planet, what steps should we take to protect other planetary bodies in the solar system?” I am not sure what lessons Mr. Longest was referring to, but the sad truth is that agency leaders need an education in basic biology.
While I admittedly know nothing at all about Martian microorganisms, it is common knowledge that microbes down here quickly detoxify some industrial wastes once they enter the environment. In other cases, microbes change innocuous wastes into potentially hazardous agents. Because EPA has expended most of its resources devising and defending regulations aimed at pleasing environmental activists — instead of developing and applying the principles of sound science — regulators do not know what happens when pollutants and microorganisms meet in the real world. EPA increasingly relies on complex mathematical models that incorporate chemical and physical data to predict how pollutants will behave once they enter the environment, yet all of these models assume that microorganisms in soil and water simply do not exist. As a result, EPA regulates industry as if it operates on a lifeless planet. Perhaps this is why NASA has seen fit to apply EPA’s approaches to protecting the Martian environment.
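To make the point concrete, here is a minimal back-of-the-envelope sketch in Python, not any model EPA actually uses; the rate constants are hypothetical and chosen only to show how leaving a microbial degradation term out of a simple first-order fate calculation inflates a chemical's predicted persistence.

# Illustrative sketch (not an EPA model): first-order fate of a pollutant
# with and without a microbial degradation term. All rate constants are
# hypothetical.
import math

def remaining_fraction(t_days, k_abiotic, k_biotic=0.0):
    """Fraction of the pollutant remaining after t_days, assuming simple
    first-order loss: C(t) = C0 * exp(-(k_a + k_b) * t)."""
    return math.exp(-(k_abiotic + k_biotic) * t_days)

def half_life(k_abiotic, k_biotic=0.0):
    """Half-life in days for the combined first-order rate."""
    return math.log(2) / (k_abiotic + k_biotic)

# Hypothetical rates: slow abiotic loss (hydrolysis, photolysis) versus
# the same chemical when soil microbes also degrade it.
k_abiotic = 0.005   # per day (hypothetical)
k_biotic = 0.060    # per day (hypothetical)

print(f"Half-life, microbes ignored:  {half_life(k_abiotic):6.1f} days")
print(f"Half-life, microbes included: {half_life(k_abiotic, k_biotic):6.1f} days")
print(f"Fraction left after 90 days, microbes ignored:  "
      f"{remaining_fraction(90, k_abiotic):.2f}")
print(f"Fraction left after 90 days, microbes included: "
      f"{remaining_fraction(90, k_abiotic, k_biotic):.2f}")

With the hypothetical rates above, the predicted half-life drops from roughly 139 days to about 11 days once biodegradation is included; a model that ignores the microbes is not merely imprecise, it is answering a different question.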
Given EPA’s refusal to develop environmental regulations that account for microorganisms, how can the agency ascertain in any scientifically reliable manner how industrial wastes should be handled under its Hazardous Waste Disposal Rule? Or how chemical wastes should be treated under its Remediation Feasibility Implementation Study? Or whether chemicals should be permitted for manufacture under its Pre-Manufacturing Notification (PMN) process? The truth is, it cannot.
One also wonders how EPA can regulate almost everything about life on Earth and yet know so little about it. Mr. Longest, who is untrained in microbiology but speaks for EPA about microorganisms in the heavens, hints at the answer. The agency has never been administered by a scientist, while the Office of Research and Development that Mr. Longest oversees has relatively few researchers with primary training and expertise in the biological sciences.12 In truth, EPA’s microbiological research is mostly carried out by engineers and others with no formal training in the subject and little idea as to how microorganisms actually interact with chemical substances.
For example, have EPA regulators ever wondered about the significance of the “L” on containers of amino acid supplements sold in health food stores? Or the “D” on bottles of glucose administered in hospitals? These important little letters indicate that these nutrients exist as levo and dextro molecules. Referred to as “chiral,” such substances can occur in two mirror-image forms of the same chemical structure. One form of a chiral drug can have desirable effects, such as alleviating pain or improving breathing, while its corresponding mirror-image form can cause birth defects, cancer, or other unwanted side-effects. The pharmaceutical industry has applied such knowledge for many years. Fifty of the top one hundred most widely sold drugs, including barbiturates, ibuprofen, and Ritalin, are marketed as single chiral forms to avoid these adverse effects.4
As it turns out, many substances classified as pollutants — phenoxy acid herbicides, organo-phosphate (OP) insecticides, PCBs, phthalate plasticizers, freon substitutes, and o,p’-DDT and its derivatives — are chiral (Table 1).5-10 Likewise, approximately one-fourth of all pesticides are chiral.11 Adverse effects of these various chemicals, such as toxicity, mutagenicity, carcinogenicity, and endocrine disrupter activity, are usually associated with only one of the mirror-image forms of the molecule.
Yet the data upon which EPA bases its regulations have never accounted for the fact that many of the major pollutants it regulates are chiral, with each individual form of the chemical having completely different effects on living organisms. EPA reports simply fail to indicate which forms are present in the environment. Because they do not differentiate between innocuous substances and their biologically active counterparts, risk studies on adverse health effects or environmental damage are unreliable indicators for many chemicals within environmentally important classes.
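A rough sketch makes the problem concrete. The Python snippet below is purely illustrative, with hypothetical concentrations, fractions, and potencies rather than data from any EPA report; it shows how a measurement that lumps both mirror-image forms together can overstate the biologically relevant exposure when only one form is active.

# Illustrative sketch only: why reporting total concentration of a chiral
# pollutant can mislead when only one mirror-image form is biologically
# active. All values below are hypothetical.

def active_exposure(total_conc_ug_per_l, active_fraction, relative_potency=1.0):
    """Biologically meaningful exposure if only the active enantiomer
    matters: total concentration x fraction that is the active form
    x its potency relative to the lumped-together assumption."""
    return total_conc_ug_per_l * active_fraction * relative_potency

total = 10.0  # ug/L measured in a water sample (hypothetical)

# A report that ignores chirality treats the whole 10 ug/L as active.
naive = active_exposure(total, active_fraction=1.0)

# If microbes preferentially degrade the active enantiomer, the residue
# may be mostly the inactive form (hypothetical 20 percent active).
weathered = active_exposure(total, active_fraction=0.20)

print(f"Exposure assumed when chirality is ignored: {naive:.1f} ug/L")
print(f"Exposure if only 20% is the active form:    {weathered:.1f} ug/L")

The same lumping can just as easily understate the hazard if the residue happens to be enriched in the active form; either way, a number that ignores chirality is not the number a risk assessment needs.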
Vice President Al Gore has of late been promoting the idea that NASA should do more to view the earth from a distance as a means of improving our understanding of the impact of mankind’s activities. Instead of looking down on us from above, might not Mr. Gore begin to take an honest look at how life here on earth actually functions? In particular, he might improve his own understanding of how poor science at EPA results in regulations that ultimately harm — rather than help — the environment. EPA must begin to incorporate biology into its Hazardous Waste Disposal Rule, its Remediation Feasibility Implementation Study, its Pre-Manufacturing Notification, and the host of other regulations it oversees. Otherwise, the agency will continue to divert resources away from real environmental problems and spend them regulating chemicals that pose little or no risk to the environment or public health.
Dr. David Lewis is a scientist with the U.S. Environmental Protection Agency. The views expressed in this article are his own and not those of EPA.