UPDATED: NAS Public Comment on Strengthening Transparency in Regulatory Science

National Association of Scholars

The National Association of Scholars has just submitted an updated public comment to Administrator Scott Pruitt in support of the Environmental Protection Agency's proposed rule Strengthening Transparency in Regulatory Science. The proposed rule would require that “When promulgating significant regulatory actions, the Agency shall ensure that dose-response data and models underlying pivotal regulatory science are publicly available in a manner sufficient for independent validation.” The NAS has long been concerned about politicized distortions of dose-response science, and our just-released report The Irreproducibility Crisis: Causes, Consequences, and the Road to Reform urges that the Federal Government require that only properly reproducible science be used as a basis for regulatory action. We are delighted that Administrator Pruitt’s proposed action addresses both of these priorities. Below is our full updated public comment, which includes our suggestions for how Administrator Pruitt might build upon this rule to improve reproducibility throughout the EPA, and to model larger reproducibility reforms throughout the Federal Government.


June 18, 2018

The Honorable E. Scott Pruitt

Administrator

Environmental Protection Agency

1200 Pennsylvania Avenue, NW

Washington, DC 20460

Re: Proposed rulemaking “Strengthening Transparency in Regulatory Science,” Docket ID No. EPA-HQ-OA-2018-0259

Dear Administrator Pruitt,

I support the Environmental Protection Agency’s (EPA) proposed rulemaking “Strengthening Transparency in Regulatory Science.” Both the current state of science about dose-response and larger concerns about reproducibility in scientific research support this measure. Indeed, the larger concerns about reproducibility suggest that this measure should be applied more generally within the Environmental Protection Agency and across the Federal Government as a whole.

I write as President of the National Association of Scholars (NAS). NAS is a network of scholars and citizens united by our commitment to academic freedom, disinterested scholarship, and excellence in higher education. As part of our mission, we support the highest standards of truth-seeking in the sciences, and seek to have government policy support and rely upon science that eschews political advocacy and subjects its own procedures to the strictest scrutiny.

The NAS is pleased that the EPA has chosen to prioritize the application of reproducibility reforms in the area of dose-response regulation. The NAS has long been concerned about politicized distortions of dose-response science.[1] A notable example is the status of the linear no-threshold (LNT) dose-response model for the biological effects of nuclear radiation. The prominence of the model stems from the June 29, 1956 Science paper, “Genetic Effects of Atomic Radiation,” authored by the National Academy of Sciences’ Committee on the Biological Effects of Atomic Radiation. This paper is now widely questioned and has been seriously critiqued in many peer-reviewed publications, including two detailed 2015 papers. These criticisms are being taken seriously around the world, as summarized in a December 2, 2015 Wall Street Journal commentary. This is a consequential matter that bears on a great deal of national public policy, as the LNT model has served as the basis for risk assessment and risk management of radiation and chemical carcinogens for decades. A reassessment of that model could profoundly alter many regulations from the Environmental Protection Agency, the Nuclear Regulatory Commission, and other government agencies.[2]
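
By way of illustration (our own schematic, not language from the papers cited above): writing D for dose and R(D) for excess risk, the LNT model and a threshold alternative differ precisely in the low-dose region, with alpha a potency constant and D_0 a hypothetical threshold dose.

```latex
% Schematic contrast between the two dose-response models (illustration only).
% LNT: excess risk is proportional to dose all the way down to zero dose.
% Threshold: no excess risk below a hypothetical threshold dose D_0.
\[
  R_{\mathrm{LNT}}(D) = \alpha D,
  \qquad
  R_{\mathrm{threshold}}(D) = \alpha \,\max(0,\; D - D_0).
\]
```

Under LNT even the smallest exposure carries proportional risk, which is why regulatory limits derived from it are so stringent; under a threshold model, exposures below D_0 would be treated as harmless. Which form the data actually support is exactly the kind of question that requires publicly available dose-response data to settle.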

The NAS is also pleased that the EPA has chosen to address increasing concerns about the irreproducibility crisis of modern science. The NAS has recently written a long report on how the improper use of statistics, arbitrary research techniques, lack of accountability, political groupthink, and a scientific culture biased toward producing positive results have together produced a reproducibility crisis that afflicts a wide range of scientific and social-scientific disciplines, from epidemiology to social psychology. Many supposedly scientific results cannot be reproduced in subsequent investigations. We have recommended extensive changes to scientific procedures and to the way government judges the science it uses to make policy, including measures such as this proposed rule, which would require that government base policy only on scientific research whose data and procedures are available for other scientists to reproduce.[3]

In response to the EPA’s solicitation for comment on its proposed rulemaking “Strengthening Transparency in Regulatory Science,” we respectfully provide the following suggestions on ways to implement the principles of scientific reproducibility into the administrative practice of the EPA, and into the administrative practice of the Federal Government as a whole.

1) We recommend that the EPA draft a Transparent Science Guidance Document (TSGD) to govern all aspects of EPA policy. The TSGD should:

  1. Define best available science in all grant awards (including categorical, discretionary, formula, and research grants), professional assessments (including environmental data, information quality, research data, and all other professional assessments), and administrative processes (including action development, communications with the public, enforcement activities, guidance documents, individual party adjudications, non-binding regulatory determinations, non-regulatory actions, peer review, permit proceedings, policy statements, professional development, promulgations, proposed rules, regulatory actions, regulatory decisions, regulatory proposals, risk assessments, rulemakings, site-specific permitting actions, work products, and all other agency actions with precedent-setting influence on future actions) to include only publicly accessible research.

i. Publicly accessible research should be defined as research whose registered report (including protocols), research data, associated protocols, computer codes, data analysis scripts, recorded factual materials, and statistical analyses are archived on an online digital repository in a manner sufficient for continuing independent inspection, replication, reproduction, and/or verification. This online digital repository shall have archival and accessibility capacities at least equal to those possessed by Figshare or the Open Science Framework in the summer of 2018.[4]

  2. Require the EPA to use only best available science, defined as including only publicly accessible research, in:

i. all future grant awards, including

  1. categorical grants,
  2. discretionary grants,
  3. formula grants, and
  4. research grants;

ii. all future professional assessments, including

  1. environmental data,
  2. information quality, and
  3. research data;

iii. all future administrative processes, including

  1. action development,
  2. communications with the public,
  3. enforcement activities,
  4. guidance documents,
  5. individual party adjudications,
  6. non-binding regulatory determinations,
  7. non-regulatory actions,
  8. peer review,
  9. permit proceedings,
  10. policy statements,
  11. professional development,
  12. promulgations,
  13. proposed rules,
  14. regulatory actions,
  15. regulatory decisions,
  16. regulatory proposals,
  17. risk assessments,
  18. rulemakings,
  19. site-specific permitting actions,
  20. work products, and
  21. all other agency actions with precedent-setting influence on future actions;

iv. all professional interactions with other government agencies, such as the U.S. Global Change Research Program (USGCRP).

  3. Require the EPA in future to add only best available science, defined as including only publicly accessible research, to:

i. the EPA’s Science Inventory (SI);[5]

ii. the corpus of Influential Scientific Information (ISI);[6]

iii. the corpus of Highly Influential Scientific Assessments (HISA);[7]

iv. the corpus of Peer Review Agendas (PRA).[8]

  4. Require all new EPA significant regulatory actions (SRA) and Highly Influential Scientific Assessments (HISA) to include substantially reproduced research (SRR).

i. Substantially reproduced research (SRR) should be defined as research whose main conclusions have been substantiated in at least two independent studies that qualify as best available science.

ii. Significant regulatory action (SRA) should be defined as the Office of Management and Budget (OMB) defines it pursuant to Executive Order 12866: any regulatory action that is likely to result in a rule that may:

  1. Have an annual effect on the economy of $100 million or more or adversely affect in a material way the economy, a sector of the economy, productivity, competition, jobs, the environment, public health or safety, or State, local, or tribal governments or communities;
  2. Create a serious inconsistency or otherwise interfere with an action taken or planned by another agency;
  3. Materially alter the budgetary impact of entitlements, grants, user fees, or loan programs or the rights and obligations of recipients thereof; or
  4. Raise novel legal or policy issues arising out of legal mandates, the President’s priorities, or the principles set forth in this Executive order.[9]

iii. Highly Influential Scientific Assessments (HISA) should be defined as any scientific assessment whose dissemination, as determined by the EPA or the OMB, could have a potential impact of more than $500 million in any year, or that is novel, controversial, or precedent-setting, or has significant interagency interest.

  5. Require all new EPA significant regulatory actions (SRA) and Highly Influential Scientific Assessments (HISA) to include a professional literature assessment (PLA).

i. A professional literature assessment (PLA) should be defined to include the following (an illustrative statistical sketch of the instruments named here follows these recommendations):

  1. Where available, at least one meta-analysis, which must also be best available science, of the scientific research supporting the significant regulatory action or Highly Influential Scientific Assessment.
    1. A meta-analysis shall be defined as the combining of evidence from independent studies using appropriate statistical methods.[10]
  2. Where available, at least one publication bias study, published by authors other than the authors of the meta-analysis, which must also be best available science, on the effect of publication bias on the scientific research supporting the significant regulatory action or Highly Influential Scientific Assessment.
    1. A publication bias study shall be defined as a peer-reviewed study of how selective (non-)publication, non-dissemination, delayed publication, misrepresentation, or misinterpretation of specific scientific conclusions or whole studies, based on the nature and direction of their results, affects the professional literature on a specific scientific question or an entire discipline.[11]
  3. In the absence of either a meta-analysis or a publication bias study, an explicit assessment and professional avowal of the unbiased validity of the substantially reproduced research.
  6. Update and broaden the application of Good Laboratory Practice Standards (GLPS).

i. Update all Good Laboratory Practice Standards (e.g., 40 CFR 160[12]) to incorporate the requirement to produce publicly accessible research.

ii. Require all new EPA significant regulatory actions (SRA) and Highly Influential Scientific Assessments (HISA) that bear on regulatory fields where Good Laboratory Practice Standards apply to include only research that meets those standards.

iii. Apply Good Laboratory Practice Standards (GLPS) impartially and universally to academic research, government research, and industry research.

  7. Provide a set procedure for the EPA Administrator to waive the TSGD requirements above on a case-by-case basis.

i. This set procedure should include strict criteria for granting such case-by-case waivers, focused on the EPA Administrator’s judgment that a waiver is needed to prevent immediate dangers to the health or life of American citizens.

ii. This set procedure should be accompanied by an equally well-defined procedure for private individuals and organizations to challenge the EPA Administrator’s waiver, sufficiently detailed to allow for effective judicial oversight.

2) We recommend that the EPA update all relevant EPA regulations, including risk assessments, grant guidelines, and guidance documents, to incorporate the TSGD’s definitions and regulatory changes.

3) We recommend that the EPA draft the TSGD so that it may be used as a model for all Federal agencies that use scientific or social-scientific research, and as a model for Federal legislation to introduce reproducible-science requirements throughout the Federal Government.

4) We recommend that the EPA call on Congress to enact a Transparent Science Reform Act (TSRA) that codifies the principles and policies embodied by the TSGD.

5) We recommend that the EPA formulate a quantitative measure of how often scientific research has been reproduced, and apply that quantitative measure to all research articles in its Health and Environmental Research Online (HERO) database.[13] This quantitative measure should then be incorporated into the EPA’s assessments of scientific weight of evidence. (A schematic sketch of one such measure follows these recommendations.)

6) We recommend that the EPA integrate requirements for pre-registered research in its best available science as widely as is practicable.[14]

7) We recommend that the EPA prioritize its funding toward upgrading existing research data so that it may meet the standard of best available science; e.g., by anonymizing research data so as to preserve privacy, confidentiality, etc.

8) We recommend that the EPA prioritize its funding toward making it possible, and then required, for all EPA notices, proposed rules, regulations, etc., to include easily accessible links to all scientific materials used to justify these EPA actions. These links should include all relevant scientific research that meets the best available science standard, including research that does not support the proposed EPA action.

9) We recommend that the EPA prioritize its funding toward providing a “reproducibility architecture” of hardware and software to facilitate the production of best available science by all American scientists whose research informs the EPA’s decision-making.

10) We recommend that the EPA, as it determines how best to implement the methodologies and technologies of a “reproducibility architecture” to facilitate the adoption of a best available science standard, consult with representatives of the Center for Open Science, the Meta-Research Innovation Center at Stanford (METRICS), and the Laura and John Arnold Foundation Research Integrity Initiative.[15]

11) We recommend that the EPA consult with the American Statistical Association about how to institute standard procedures that will ensure that all scientific research used or funded by the EPA is conducted according to the highest standards of statistical practice.[16]

12) We recommend that each EPA granting program establish a funding category, with funding priority over all other categories, for meta-analysis and research into publication bias.

13) We recommend that the EPA institute a process by which to rescind existing regulations based upon research that does not meet the best available science standard. This process should include:

i. the establishment of a permanent investigatory commission to examine existing regulations and determine which are based on science that does not meet the best available science standard; and

ii. the establishment of a process to rescind regulations based on science that does not meet the best available science standard, which provides a reasonable amount of time for researchers to bring their science up to that standard before the regulations are rescinded.
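
As an illustration of the two instruments named in the professional literature assessment above (item 5 of recommendation 1), the sketch below shows, in schematic form, what a fixed-effect inverse-variance meta-analysis and an Egger-style small-study regression (a standard screen for publication bias) compute. It is our own illustration, not an EPA method, and the study effects and standard errors are hypothetical.

```python
import math

# Hypothetical (effect estimate, standard error) pairs; illustrative only.
studies = [
    (0.42, 0.10),
    (0.35, 0.15),
    (0.50, 0.12),
    (0.20, 0.25),
    (0.61, 0.30),
]

# Fixed-effect meta-analysis: weight each study by the inverse of its variance,
# then report the weighted mean effect and its pooled standard error.
weights = [1.0 / se ** 2 for _, se in studies]
pooled = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
print(f"Pooled effect: {pooled:.3f} (SE {pooled_se:.3f})")

# Egger-style test: regress standardized effect (effect/SE) on precision (1/SE).
# An intercept far from zero signals funnel-plot asymmetry, a common symptom
# of publication bias (small "negative" studies going unpublished).
xs = [1.0 / se for _, se in studies]    # precision
ys = [eff / se for eff, se in studies]  # standardized effect
n = len(studies)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
    (x - mx) ** 2 for x in xs
)
intercept = my - slope * mx
print(f"Egger intercept: {intercept:.3f} (far from zero suggests bias)")
```

A production version would add a random-effects model and formal significance tests, but the inverse-variance weighting above is the core of both instruments.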

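Recommendation 5 above calls for a quantitative measure of how often research has been reproduced. One candidate, loosely modeled on the R-factor cited in note 13, is the fraction of independent follow-up studies that support a claim. The sketch below is our own schematic of how such a score might be attached to HERO-style records; the record names and counts are hypothetical.

```python
from typing import Optional

def reproduction_score(supporting: int, refuting: int) -> Optional[float]:
    """Fraction of independent follow-up studies that supported a claim.

    Returns None when no independent study has tested the claim, so that
    "never tested" is not confused with "never reproduced".
    """
    tested = supporting + refuting
    return supporting / tested if tested else None

# Hypothetical HERO-style records: claim -> (supporting, refuting) counts.
hero_records = {
    "chemical X, endpoint A": (8, 2),
    "chemical Y, endpoint B": (1, 3),
    "chemical Z, endpoint C": (0, 0),  # never independently tested
}

for claim, (s, r) in hero_records.items():
    score = reproduction_score(s, r)
    label = "untested" if score is None else f"{score:.2f}"
    print(f"{claim}: reproduction score {label}")
```

Weight-of-evidence assessments could then discount claims whose scores are low, and flag claims that no independent study has yet tested.
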
NAS believes that these reforms will strengthen the Environmental Protection Agency’s longstanding commitment to using only the most reliable science to inform its decision-making. We also believe these reforms will strengthen American science, by prompting researchers to incorporate and make routine in their practices the highest standards of reproducibility.

                                                                                                Sincerely yours,

                                                                                                Peter Wood

                                                                                                President

                                                                                                National Association of Scholars


[1] Peter Wood, “Concerns about the National Academy of Sciences and Scientific Dissent,” December 15, 2015, https://www.nas.org/articles/nas_letter; Edward J. Calabrese, “Societal Threats from Ideologically Driven Science,” December 13, 2017, https://www.nas.org/articles/societal_threats_from_ideologically_driven_science.

[2] See https://www.nas.org/images/documents/LNT.pdf, which reproduces documents including Genetics Panel of the Biological Effects of Atomic Radiation (BEAR) I Committee of the National Academy of Sciences, “Genetic Effects of Atomic Radiation,” Science 123 (29 June 1956), pp. 1157-64; Edward J. Calabrese, “An abuse of risk assessment: how regulatory agencies improperly adopted LNT for cancer risk assessment,” Archives of Toxicology 89, 4 (2015), pp. 647-48; Edward J. Calabrese, “On the origins of the linear no-threshold (LNT) dogma by means of untruths, artful dodges and blind faith,” Environmental Research 142 (2015), pp. 432-42; and Holman W. Jenkins, Jr., “A Nuclear Paradigm Shift?” The Wall Street Journal, December 2, 2015, p. A13.

[3] David Randall and Christopher Welser, The Irreproducibility Crisis in Modern Science: Causes, Consequences, and the Road to Reform (National Association of Scholars: New York, 2018), https://www.nas.org/projects/irreproducibility_report. The report contains a lengthy bibliography on the irreproducibility crisis; notable works in the literature include John P. A. Ioannidis, “Why Most Published Research Findings Are False,” PLoS Med 2, 8 (2005), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1182327/; Joseph P. Simmons, et al., “False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant,” Psychological Science 22, 11 (2011), pp. 1359-66, http://journals.sagepub.com/doi/pdf/10.1177/0956797611417632; C. Glenn Begley and Lee M. Ellis, “Drug development: Raise standards for preclinical cancer research,” Nature 483 (2012), pp. 531-33, http://www.nature.com/nature/journal/v483/n7391/full/483531a.html?foxtrotcallback=true; Open Science Collaboration [Brian Nosek, et al.], “Estimating the reproducibility of psychological science,” Science 349 (2015), http://science.sciencemag.org/content/349/6251/aac4716.

[4] Figshare, https://figshare.com/; Open Science Framework, https://osf.io/.

[5] United States Environmental Protection Agency, Science Inventory, https://cfpub.epa.gov/si/.

[6] United States Environmental Protection Agency, Science Inventory, https://cfpub.epa.gov/si/si_public_pr_agenda.cfm#ISI.

[7] United States Environmental Protection Agency, Science Inventory, https://cfpub.epa.gov/si/si_public_pr_agenda.cfm#HISA.

[8] United States Environmental Protection Agency, Science Inventory, https://cfpub.epa.gov/si/si_public_pr_agenda.cfm.

[9] Executive Order 12866, “Regulatory Planning and Review,” September 30, 1993, https://www.archives.gov/files/federal-register/executive-orders/pdf/12866.pdf.

[10] Office of the Federal Register, Food and Drug Administration, Meta-Analyses of Randomized Controlled Clinical Trials (RCTs) for the Evaluation of Risk To Support Regulatory Decisions; Notice of Public Meeting; Request for Comments, October 24, 2013, https://www.federalregister.gov/documents/2013/10/24/2013-24939/meta-analyses-of-randomized-controlled-clinical-trials-rcts-for-the-evaluation-of-risk-to-support.

[11] For questions of defining publication bias, see Fujian Song, Lee Hooper, and Yoon K. Loke, “Publication bias: what is it? How do we measure it? How do we avoid it?” Open Access Journal of Clinical Trials 5 (2013), 71–81, https://core.ac.uk/download/pdf/19086012.pdf; and Dirk Bassler, et al., “Bias in dissemination of clinical research findings: structured OPEN framework of what, who and why, based on literature review and expert consensus,” BMJ Open 6, 1 (2016), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4735132/.

[12] Good Laboratory Practice Standards, 40 CFR Part 160.

[13] The R-factor metric, propounded by the private company Verum Analytics, provides a possible model for this quantitative measure. Peter Grabitz, et al., “Science with no fiction: measuring the veracity of scientific reports by citation analysis,” bioRxiv preprint, August 9, 2017, https://www.biorxiv.org/content/biorxiv/early/2017/08/09/172940.full.pdf; Verum Analytics, http://verumanalytics.io/.

[14] Pre-registered research should be defined as research which includes a Registered Report. A Registered Report shall be defined as follows: “Registered Reports are a form of empirical journal article in which methods and proposed analyses are pre-registered and peer-reviewed prior to research being conducted. High quality protocols are then provisionally accepted for publication before data collection commences. … Authors of Registered Reports initially submit a Stage 1 manuscript that includes an Introduction, Methods, and the results of any pilot experiments that motivate the research proposal. Following assessment of the protocol by editors and reviewers, the manuscript can then be offered in-principle acceptance (IPA), which means that the journal virtually guarantees publication if the authors conduct the experiment in accordance with their approved protocol. With IPA in hand, the researchers then implement the experiment. Following data collection, they resubmit a Stage 2 manuscript that includes the Introduction and Methods from the original submission plus the Results and Discussion. The Results section includes the outcome of the pre-registered analyses together with any additional unregistered analyses in a separate section titled “Exploratory Analyses”. … The final complete article is published after this process is complete.” Center for Open Science, “Registered Reports: Peer review before results are known to align scientific values and practices,” https://cos.io/rr/.

[15] Center for Open Science, https://cos.io/; Meta-Research Innovation Center at Stanford | METRICS, https://metrics.stanford.edu/; Laura and John Arnold Foundation Research Integrity Initiative, http://www.arnoldfoundation.org/initiative/research-integrity/.

[16] E.g., American Statistical Association, “ASA Statement on Statistical Significance and P-Values,” The American Statistician 70, 2 (2016), pp. 131-33, https://amstat.tandfonline.com/doi/full/10.1080/00031305.2016.1154108?scroll=top&needAccess=true#aHR0cDovL2Ftc3RhdC50YW5k.
