Note: This post was prepared jointly with my colleague, Professor John Sinclair at the University of Manitoba.
Rather than being treated merely as hoops for proponents to jump through to gain project approval, impact assessments are now more commonly recognized as processes that should be centered on learning. To achieve this, the potential for learning by all participants must be recognized throughout the assessment process, from the earliest pre-planning phases through to the monitoring of effects and outcomes. In fact, we contend that the current crisis in federal EA in Canada is caused in part by the lack of integration of learning into EA. The potential benefits of learning therefore need to be recognized throughout all of the typical stages of a strategic, regional or project assessment. Four aspects of assessment processes are particularly important to fostering a learning orientation: public participation, knowledge development, monitoring of effects, and regime evolution.
What the Expert Panel Said About Learning Oriented Assessment
The Expert Panel recognized the importance of learning in impact assessment by linking learning to many of the key components of IA throughout its report. The panel supported, for example, the recommendation of the Multi-Interest Advisory Committee (MIAC) regarding the purpose of IA:
The two core purposes of federal EA law and associated processes are: to strengthen progress towards sustainability, including through positive contributions to lasting socio-economic and biophysical wellbeing, while avoiding and mitigating adverse environmental effects; and to enhance the capability, credibility and learning outcomes of EA-related deliberations and decision making.[1]
Echoing some of the input it received, the Expert Panel linked mutual learning to effective and efficient participation and identified participation as a “learning process.” It also established the importance of learning for quality assurance and underscored the importance of “interactive learning processes” as a part of follow-up and monitoring. While the Panel did not make separate recommendations regarding learning, it did provide direction in the following four areas to ensure learning is captured.
1. Public Participation
The Expert Panel recognized the need to “foster a culture of learning so that assessments become more effective and efficient over time”. The Panel noted further that “mutual learning and inclusive dialogue” are essential ingredients of this culture of learning. It also clearly underscored the importance of participant funding to the generation of knowledge, building of capacity and effective and efficient IA processes.
2. Knowledge Development
The Panel recognized that impact assessment must place a heavy reliance on knowledge/evidence inputs of various kinds throughout almost all stages of the process. Such inputs are critical to learning and understanding the veracity of the outcome decisions of any impact assessment process. The Panel recognized that these inputs will come from a variety of sources including traditional Indigenous and non-Indigenous sources, and western science.
3. Monitoring of Effects
The Expert Panel recognized that the monitoring phase “also helps ensure that the IA process is an iterative learning process. Without tracking and assessing the effectiveness of mitigation measures or the accuracy of impact predictions, it is impossible to learn from past successes and mistakes in order to improve future project design, predictions and decision-making.”[2] The panel recognized that monitoring programs, when done well, offer a critical opportunity for mutual learning beyond the assessment process, and significantly enhance the efficiency and effectiveness of the assessment process over time.
4. Regime Evolution
The Expert Panel recognized the need for administrative bodies to monitor the application of IA processes for successes and limitations (including the strengths and deficiencies of impact predictions, Indigenous and public engagement, trade-off avoidance, compliance and effects monitoring, and the effectiveness of multi-jurisdictional activities) in order to ensure learning and to modify IA processes as needed. The Panel noted that any IA Agency
“…would require strong quality assurance programs, as well as audit functions covering both cost control and process. The role of the quality assurance program would be to assess the quality of IAs conducted by the Commission and to ensure that continuous learning and improvement takes place within the organization. Cross-cutting issues would be studied, such as the accuracy of predictions of certain impacts, the effectiveness of mitigation measures and the implementation and effectiveness of follow-up programs. Program analyses would be publicly available.”[3]
What Bill C-69 Says About Learning
Learning is not specifically mentioned in Bill C-69. In previous posts we have reviewed two components of IA that we consider essential to a learning orientation: meaningful participation and follow-up. In relation to meaningful participation, the aspects of the proposed Bill that could have the most significant implications for learning are the engagement of the public in the early planning phase and the provisions for participant funding.
The early planning phase, and the development during this phase of a public participation plan involving potentially interested individuals, organizations and other jurisdictions, could foster a better mutual understanding of the undertaking to be assessed, of knowledge needs and of other participants’ concerns, among other benefits.
Apart from requiring a follow-up program in the first place, the most significant follow-up provisions related to learning are those requiring participant funding and the public reporting of monitoring results.
Knowledge is mentioned in Bill C-69 in relation to both traditional knowledge and scientific information. This is critical as both need to be integrated into impact assessments. The purpose section of the Bill mentions the needed integration of community knowledge. Paragraphs 22(1)(g) & (m) require that traditional and community knowledge be taken into account during an impact assessment of a designated project. Section 23 requires that every federal authority with expert information or knowledge with respect to a designated project make that information available to the Agency, Review Panel or other jurisdiction involved in an assessment.
Section 155 of the Bill includes among the Agency’s duties the promotion and monitoring of the quality of impact assessments conducted under the IAA. Section 167 requires that a comprehensive review of the provisions and operations of the Act be initiated 10 years after the Act comes into force by a committee of the Senate, of the House of Commons or of both houses. Review processes by House Committees are normally public.
Law Reform Recommendations
While the Bill as proposed provides a very basic foundation for learning oriented assessment, a number of its elements have been in place for some time already without producing significant progress toward a learning approach to federal assessments. We see the need for a number of reforms. In particular, there is a need to recognize learning as important to, and as a critical outcome of, assessments. An additional purpose related to learning should be added to section 6:
6(1) The purposes of this Act are …
(o) to encourage mutual learning and enhance learning outcomes of deliberations and decision making performed under this Act.
For the learning potential of assessment to be realized, all involved need to think of assessment as a continuous learning process.
Meaningful participation in assessment processes is essential to creating a strong foundation for mutual learning. We will not repeat here the detailed recommendations for the reform of public participation that we made in an earlier post (https://bit.ly/2px6Y3i), but they are essential to making assessments more learning oriented. In particular, it is critical to define “meaningful participation” broadly, to recognize that timelines can impede meaningful participation and learning, and to develop regulatory provisions for participation in assessment before the new process is implemented. In terms of learning, such regulatory provisions need to establish roles and responsibilities for participatory programs, especially in the early planning phase; establish opportunities for deliberative multi-stakeholder collaboration using the full range of methods in the participation toolbox (including opportunities such as scenario building and visioning); and increase attention to alternative dispute resolution throughout any assessment.
In relation to knowledge development and sharing, the Bill recognizes two sources of knowledge (Indigenous and community) and one source of information (expert). Knowledge and information inputs need to come from diverse sources before decisions are made. In particular, we feel that section 22 should explicitly recognize additional sources of knowledge, namely non-government organizations, academics and “communities”, each defined broadly and inclusively.
Follow-up monitoring and the free and open exchange of monitoring data are essential to continuous learning, as outlined above. We have commented on the needed follow-up reforms to the Bill in a previous post (https://bit.ly/2DQ7Kwy). Reforms related to community engagement in follow-up programs, to making follow-up data publicly available on an ongoing basis, and to the ability to track compliance and monitoring are particularly important for enhancing learning outcomes. It is essential that provisions for follow-up make clear that a goal of such programs is to harness the potential of learning across assessments, something we have not been able to achieve despite years of federal EA experience. Funding provisions for follow-up need to be clarified to ensure that active public participation will be adequately supported when requested.
Ensuring and improving the quality of IA on an ongoing basis is critical to learning. While section 155 requires the Agency to monitor the quality of impact assessments, there is no associated provision for the development of a quality assurance program that would help institutionalize the improvement of IA quality over time (as was the case under the 2003 version of CEAA). At a minimum, policy will be needed to spell this out and to ensure that learning occurs through the implementation of a quality assurance program that includes feedback and improvement mechanisms so that mistakes are not repeated. This ongoing consideration of process will also allow new regulations, guidance and policy to emerge in a way that is linked to experience with process implementation. Provision should also be made to compel federal authorities, in the absence of specific legislative direction, to comply with any improvements identified by the Agency.
The proposed comprehensive legislative review of the Act and its operation will be critical to the ongoing review and modification of the regime. We recommend that such a comprehensive review be initiated seven rather than the proposed ten years after coming into force.
Meinhard Doelle,
Professor, Schulich School of Law
[1] Submission of the Multi-Interest Advisory Committee to the Expert Panel on Environmental Assessment Processes (December 2016).
[2] Building Common Ground, at 66.
[3] Building Common Ground, at 53.