Scientific Integrity at Federal Departments and Agencies: Promise, Progress, and Problems

In our continuing Perspectives on Scientific Integrity in 2023 blog series, Scott Findlay reflects on the promise of federal scientific integrity policies, their progress, the problems in their implementation, and potential paths forward.

Scott Findlay is a retired Professor in the Department of Biology at the University of Ottawa, and former Associate Director (Graduate Studies) at the Institute of Environment. In 2013, he co-founded Evidence for Democracy, a national non-partisan organization that advocates for evidence-informed decision-making by governments. He has been a Researcher in Residence at the Office of the Chief Science Advisor since January 2018.

The promise

In 2013, Katie Gibbs and I co-founded Evidence for Democracy. During the early days, we lobbied strongly for the creation of an independent federal chief science advisor. We saw the creation of such a position as critical to establishing a vigorous culture of scientific integrity within the federal government.

And we, with many others, were successful. In September 2017, the Office of the Chief Science Advisor (OCSA) was established.

Earlier in the same year, two Memoranda of Agreement with Respect to Scientific Integrity, applying to Research (RE) and Applied Science and Patent Examination (SP) employees, were signed between the Treasury Board of Canada and the Professional Institute of the Public Service of Canada (PIPSC). Under these memoranda (which have since been renewed twice), all departments and agencies with more than 10 RE or SP employees were required to develop their own scientific integrity policies (SIPs) and procedures. Furthermore, the memoranda require the development of a “common policy that can be used as a model by departments when developing their own scientific integrity policies”.

The memoranda were viewed as an important lever for government action on scientific integrity. To assist departments and agencies in developing the “common policy”, the Governance Committee for Implementation of Government-wide Scientific Integrity Policy, consisting of the Chief Science Advisor as chair, the Treasury Board Secretariat and the President of PIPSC, struck a working group to develop a model SIP.

I joined OCSA in January 2018, and because I (mistakenly) believed I knew something about scientific integrity, the Chief Science Advisor assigned me the task of leading the model SIP drafting team. The final model policy was published on July 30, 2018.

Many of the comments we received from implicated departments and agencies on the draft model SIP did not directly concern the model policy per se, but rather issues related to implementation. Over the last five years, OCSA has developed a set of guidance documents to support the implementation of the non-discretionary articles of the model SIP, including scientific integrity breach investigations, science communication, and peer review. OCSA, in collaboration with the Canada School of the Public Service, has also developed a four-hour course on evidence-informed decision-making that will launch in fall 2023.

Also in fall 2023, the Governance Committee will begin a consultation on amendments to the original model policy. We believe these amendments are needed to address (a) scientific integrity issues that have come to the fore in the last five years, including the appropriate collection, use, communication or archiving of Indigenous Knowledge, research security, open science and the appropriate use of generative AI; and (b) operational issues that have come to light through implementation, specifically the approval process for technical publications and communications.

Progress

Since January 2019, implicated departments and agencies have been surveyed annually to determine their progress in approving and implementing the non-discretionary provisions of their SIPs. As of February 2023, 24 departments and agencies have approved SIPs, 22 of which are in effect.

Compliance with the non-discretionary provisions of departmental policies is highly variable. For example, 18 of the 21 departments and agencies that produce technical communications have mandatory peer review requirements in place. Eighteen of the 24 departments and agencies have implemented a process for bringing forward allegations of scientific integrity breach. Twelve of the 15 departments and agencies that conduct research on human subjects have a process in place for ensuring that research proposals involving human subjects are reviewed by a Research Ethics Board. On these fronts, then, there has been substantial progress – at least in principle.
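To put those counts on a common footing, here is a minimal sketch, in Python, of the compliance rates implied by the February 2023 figures above. The labels are my own shorthand, not official survey categories.

```python
# Illustrative only: compliance rates implied by the counts cited above.
# The dictionary keys are informal labels, not the survey's own wording.
survey_counts = {
    "mandatory peer review of technical communications": (18, 21),
    "process for breach allegations": (18, 24),
    "Research Ethics Board review of human-subjects research": (12, 15),
}

for provision, (compliant, applicable) in survey_counts.items():
    rate = 100 * compliant / applicable
    print(f"{provision}: {compliant}/{applicable} departments ({rate:.0f}%)")
```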

On other fronts, progress has been slower. Eighteen departments have yet to implement measures to support training and/or professional development devoted to the roles of science and research in supporting evidence-informed decision-making. Most significantly, 20 departments or agencies have yet to develop – let alone implement – a monitoring and performance evaluation plan that would provide information on the extent to which their policy has achieved its objectives.

The SIP process has also resonated outside Canada. In January 2023, the United States’ Office of Science and Technology Policy (OSTP) released its Framework for Federal Scientific Integrity Policy and Practice, which includes a Model Scientific Integrity Policy for Federal Agencies that is hauntingly similar – indeed, in sections virtually identical – to the model SIP. OCSA was consulted extensively on the Framework, and it is gratifying that OSTP appears to have seen much to like both in our approach to SIP development (that is, developing a model policy in collaboration with affected departments) and in the policy per se.

Most of the information on progress comes from the annual compliance survey. However, even though departmental responses must be accompanied by supporting evidence, having (for example) a scientific integrity breach investigation policy approved is a far cry from having a comprehensive and systematic breach investigation process implemented. And even if implemented, such a process will have limited effect if, for example, employees don’t use the process because of a fear of retribution.

It is for precisely this reason that, in writing the first draft of the model SIP, I included a set of non-discretionary – and quite prescriptive – provisions concerning policy performance monitoring and evaluation. To assist departments in implementing these provisions, OCSA has expended enormous energy over the last two years developing performance evaluation guidance, including a model performance monitoring plan and performance indicators – even designing tools to acquire the data those indicators require. Yet as of January 2023, only one department has implemented these tools – a disappointing result, to say the least.

Problems

In my view, excellent progress (at least in principle) has been made on SIP provisions that are comparatively easy to implement. Unsurprisingly, implementation has lagged for those requiring more resources.

But it is not, in my view, just an issue of resource requirements.

First, culture change – of any persuasion – is hard. It is particularly hard in large institutions like governments, which, by definition, have large cultural inertia. Many of the people directly or indirectly affected – at least in principle – by SIPs have been federal employees for decades, establishing patterns of behaviour that are difficult to change, even if there are apparently compelling reasons for doing so.

Second, Deputy Heads of departments and agencies have many responsibilities, and time, energy and resources are, in general, zero-sum games. The Treasury Board Secretariat’s Policy on Results requires that each department establish a Departmental Results Framework comprising the department’s Core Responsibilities, Departmental Results and Departmental Results Indicators. No department’s framework includes results directly pertaining to scientific integrity because for no department is scientific integrity a core responsibility. It is not surprising, then, that ensuring effective implementation of comparatively resource-intensive provisions of departmental SIPs is not a high priority for Deputy Heads.

Third, there is the issue of leverage. OCSA cannot tell the federal government what to do, on scientific integrity or any other issue, because its role is purely advisory. Ministerial responsibility means that, ultimately, it is ministers who decide what their departments will – or won’t – do. Notwithstanding the contractual obligations explicit in collective agreements, culture change is largely discretionary: it will not happen unless there is strong, proactive and ongoing pressure from the top.

The above problems are not insurmountable. Some potential solutions include:

  • Establish the full implementation of SIPs as a ministerial priority by including it explicitly within every ministerial mandate letter. This should not be limited to the 25 departments and agencies implicated in the current collective agreements.
  • Enshrine OCSA in legislation that includes a statutory requirement for (among other things) facilitating, assessing and publicly reporting on progress in scientific integrity within the federal government.
  • Develop a set of standardized performance indicators of progress on scientific integrity and include them in all departmental results indicators. (N.B. OCSA has already developed a candidate set of indicators as part of its commitment to supporting departments in complying with the performance evaluation provisions of their SIPs; a hypothetical sketch of what such an indicator might look like follows this list.)
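To make the third suggestion concrete, here is a purely hypothetical sketch of the shape a standardized indicator might take. The field names and the example are my own assumptions for illustration; they are not drawn from OCSA’s actual candidate set.

```python
# Hypothetical sketch of a standardized SIP performance indicator record,
# as it might appear among departmental results indicators. All fields
# and the example below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SIPIndicator:
    name: str         # what is being measured
    provision: str    # the SIP provision the indicator tracks
    data_source: str  # where the supporting evidence would come from
    target: float     # desired value, expressed as a proportion

example = SIPIndicator(
    name="Share of technical publications that underwent peer review",
    provision="Peer review of technical communications",
    data_source="Departmental publication records",
    target=1.0,
)
```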

The future

Scientific integrity is, in my view, a critical principle for ensuring not only the progress of science itself, but perhaps even more importantly, public trust in science – something which, especially in the context of the SARS-CoV-2 pandemic, has taken rather a beating.

Thus far, the focus of SIPs, both in Canada and abroad, has been on public institutions. But especially in some rapidly evolving areas of science, including synthetic biology, quantum computing, and generative artificial intelligence, private sector involvement is growing. Moreover, big science – especially REALLY big science – will almost certainly involve public-private partnerships. There is, therefore, a pressing need to develop model SIPs for the private sector. While many elements of public sector policies are transferable (with relatively little fuss) to the private sector, others will require careful analysis and (likely) redesign, especially those focusing on open science, public communication of scientific results, and research security.
