Archivos de la etiqueta: AI

12Nov/24

Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence. October 30, 2023.

Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence. JOSEPH R. BIDEN JR. THE WHITE HOUSE,
October 30, 2023.

   By the authority vested in me as President by the Constitution and the laws of the United States of America, it is hereby ordered as follows:

     Section 1.  Purpose.  Artificial intelligence (AI) holds extraordinary potential for both promise and peril.  Responsible AI use has the potential to help solve urgent challenges while making our world more prosperous, productive, innovative, and secure.  At the same time, irresponsible use could exacerbate societal harms such as fraud, discrimination, bias, and disinformation; displace and disempower workers; stifle competition; and pose risks to national security.  Harnessing AI for good and realizing its myriad benefits requires mitigating its substantial risks.  This endeavor demands a society-wide effort that includes government, the private sector, academia, and civil society.

     My Administration places the highest urgency on governing the development and use of AI safely and responsibly, and is therefore advancing a coordinated, Federal Government-wide approach to doing so.  The rapid speed at which AI capabilities are advancing compels the United States to lead in this moment for the sake of our security, economy, and society.

     In the end, AI reflects the principles of the people who build it, the people who use it, and the data upon which it is built.  I firmly believe that the power of our ideals; the foundations of our society; and the creativity, diversity, and decency of our people are the reasons that America thrived in past eras of rapid change.  They are the reasons we will succeed again in this moment.  We are more than capable of harnessing AI for justice, security, and opportunity for all.

     Sec. 2.  Policy and Principles.  It is the policy of my Administration to advance and govern the development and use of AI in accordance with eight guiding principles and priorities.  When undertaking the actions set forth in this order, executive departments and agencies (agencies) shall, as appropriate and consistent with applicable law, adhere to these principles, while, as feasible, taking into account the views of other agencies, industry, members of academia, civil society, labor unions, international allies and partners, and other relevant organizations:

     (a)  Artificial Intelligence must be safe and secure.  Meeting this goal requires robust, reliable, repeatable, and standardized evaluations of AI systems, as well as policies, institutions, and, as appropriate, other mechanisms to test, understand, and mitigate risks from these systems before they are put to use.  It also requires addressing AI systems’ most pressing security risks — including with respect to biotechnology, cybersecurity, critical infrastructure, and other national security dangers — while navigating AI’s opacity and complexity.  Testing and evaluations, including post-deployment performance monitoring, will help ensure that AI systems function as intended, are resilient against misuse or dangerous modifications, are ethically developed and operated in a secure manner, and are compliant with applicable Federal laws and policies.  Finally, my Administration will help develop effective labeling and content provenance mechanisms, so that Americans are able to determine when content is generated using AI and when it is not.  These actions will provide a vital foundation for an approach that addresses AI’s risks without unduly reducing its benefits. 

     (b)  Promoting responsible innovation, competition, and collaboration will allow the United States to lead in AI and unlock the technology’s potential to solve some of society’s most difficult challenges.  This effort requires investments in AI-related education, training, development, research, and capacity, while simultaneously tackling novel intellectual property (IP) questions and other problems to protect inventors and creators.  Across the Federal Government, my Administration will support programs to provide Americans the skills they need for the age of AI and attract the world’s AI talent to our shores — not just to study, but to stay — so that the companies and technologies of the future are made in America.  The Federal Government will promote a fair, open, and competitive ecosystem and marketplace for AI and related technologies so that small developers and entrepreneurs can continue to drive innovation.  Doing so requires stopping unlawful collusion and addressing risks from dominant firms’ use of key assets such as semiconductors, computing power, cloud storage, and data to disadvantage competitors, and it requires supporting a marketplace that harnesses the benefits of AI to provide new opportunities for small businesses, workers, and entrepreneurs. 

     (c)  The responsible development and use of AI require a commitment to supporting American workers.  As AI creates new jobs and industries, all workers need a seat at the table, including through collective bargaining, to ensure that they benefit from these opportunities.  My Administration will seek to adapt job training and education to support a diverse workforce and help provide access to opportunities that AI creates.  In the workplace itself, AI should not be deployed in ways that undermine rights, worsen job quality, encourage undue worker surveillance, lessen market competition, introduce new health and safety risks, or cause harmful labor-force disruptions.  The critical next steps in AI development should be built on the views of workers, labor unions, educators, and employers to support responsible uses of AI that improve workers’ lives, positively augment human work, and help all people safely enjoy the gains and opportunities from technological innovation.

     (d)  Artificial Intelligence policies must be consistent with my Administration’s dedication to advancing equity and civil rights.  My Administration cannot — and will not — tolerate the use of AI to disadvantage those who are already too often denied equal opportunity and justice.  From hiring to housing to healthcare, we have seen what happens when AI use deepens discrimination and bias, rather than improving quality of life.  Artificial Intelligence systems deployed irresponsibly have reproduced and intensified existing inequities, caused new types of harmful discrimination, and exacerbated online and physical harms.  My Administration will build on the important steps that have already been taken — such as issuing the Blueprint for an AI Bill of Rights, the AI Risk Management Framework, and Executive Order 14091 of February 16, 2023 (Further Advancing Racial Equity and Support for Underserved Communities Through the Federal Government) — in seeking to ensure that AI complies with all Federal laws and to promote robust technical evaluations, careful oversight, engagement with affected communities, and rigorous regulation.  It is necessary to hold those developing and deploying AI accountable to standards that protect against unlawful discrimination and abuse, including in the justice system and the Federal Government.  Only then can Americans trust AI to advance civil rights, civil liberties, equity, and justice for all.

     (e)  The interests of Americans who increasingly use, interact with, or purchase AI and AI-enabled products in their daily lives must be protected.  Use of new technologies, such as AI, does not excuse organizations from their legal obligations, and hard-won consumer protections are more important than ever in moments of technological change.  The Federal Government will enforce existing consumer protection laws and principles and enact appropriate safeguards against fraud, unintended bias, discrimination, infringements on privacy, and other harms from AI.  Such protections are especially important in critical fields like healthcare, financial services, education, housing, law, and transportation, where mistakes by or misuse of AI could harm patients, cost consumers or small businesses, or jeopardize safety or rights.  At the same time, my Administration will promote responsible uses of AI that protect consumers, raise the quality of goods and services, lower their prices, or expand selection and availability.

     (f)  Americans’ privacy and civil liberties must be protected as AI continues advancing.  Artificial Intelligence is making it easier to extract, re-identify, link, infer, and act on sensitive information about people’s identities, locations, habits, and desires.  Artificial Intelligence’s capabilities in these areas can increase the risk that personal data could be exploited and exposed.  To combat this risk, the Federal Government will ensure that the collection, use, and retention of data is lawful, is secure, and mitigates privacy and confidentiality risks.  Agencies shall use available policy and technical tools, including privacy-enhancing technologies (PETs) where appropriate, to protect privacy and to combat the broader legal and societal risks — including the chilling of First Amendment rights — that result from the improper collection and use of people’s data.

     (g)  It is important to manage the risks from the Federal Government’s own use of AI and increase its internal capacity to regulate, govern, and support responsible use of AI to deliver better results for Americans.  These efforts start with people, our Nation’s greatest asset.  My Administration will take steps to attract, retain, and develop public service-oriented AI professionals, including from underserved communities, across disciplines — including technology, policy, managerial, procurement, regulatory, ethical, governance, and legal fields — and ease AI professionals’ path into the Federal Government to help harness and govern AI.  The Federal Government will work to ensure that all members of its workforce receive adequate training to understand the benefits, risks, and limitations of AI for their job functions, and to modernize Federal Government information technology infrastructure, remove bureaucratic obstacles, and ensure that safe and rights-respecting AI is adopted, deployed, and used. 

     (h)  The Federal Government should lead the way to global societal, economic, and technological progress, as the United States has in previous eras of disruptive innovation and change.  This leadership is not measured solely by the technological advancements our country makes.  Effective leadership also means pioneering those systems and safeguards needed to deploy technology responsibly — and building and promoting those safeguards with the rest of the world.  My Administration will engage with international allies and partners in developing a framework to manage AI’s risks, unlock AI’s potential for good, and promote common approaches to shared challenges.  The Federal Government will seek to promote responsible AI safety and security principles and actions with other nations, including our competitors, while leading key global conversations and collaborations to ensure that AI benefits the whole world, rather than exacerbating inequities, threatening human rights, and causing other harms. 

     Sec. 3.  Definitions.  For purposes of this order:

     (a)  The term “agency” means each agency described in 44 U.S.C. 3502(1), except for the independent regulatory agencies described in 44 U.S.C. 3502(5).

     (b)  The term “artificial intelligence” or “AI” has the meaning set forth in 15 U.S.C. 9401(3):  a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments.  Artificial intelligence systems use machine- and human-based inputs to perceive real and virtual environments; abstract such perceptions into models through analysis in an automated manner; and use model inference to formulate options for information or action.

     (c)  The term “AI model” means a component of an information system that implements AI technology and uses computational, statistical, or machine-learning techniques to produce outputs from a given set of inputs.

     (d)  The term “AI red-teaming” means a structured testing effort to find flaws and vulnerabilities in an AI system, often in a controlled environment and in collaboration with developers of AI.  Artificial Intelligence red-teaming is most often performed by dedicated “red teams” that adopt adversarial methods to identify flaws and vulnerabilities, such as harmful or discriminatory outputs from an AI system, unforeseen or undesirable system behaviors, limitations, or potential risks associated with the misuse of the system.

     (e)  The term “AI system” means any data system, software, hardware, application, tool, or utility that operates in whole or in part using AI.

     (f)  The term “commercially available information” means any information or data about an individual or group of individuals, including an individual’s or group of individuals’ device or location, that is made available or obtainable and sold, leased, or licensed to the general public or to governmental or non-governmental entities. 

     (g)  The term “crime forecasting” means the use of analytical techniques to attempt to predict future crimes or crime-related information.  It can include machine-generated predictions that use algorithms to analyze large volumes of data, as well as other forecasts that are generated without machines and based on statistics, such as historical crime statistics.

     (h)  The term “critical and emerging technologies” means those technologies listed in the February 2022 Critical and Emerging Technologies List Update issued by the National Science and Technology Council (NSTC), as amended by subsequent updates to the list issued by the NSTC. 

     (i)  The term “critical infrastructure” has the meaning set forth in section 1016(e) of the USA PATRIOT Act of 2001, 42 U.S.C. 5195c(e).

     (j)  The term “differential-privacy guarantee” means protections that allow information about a group to be shared while provably limiting the improper access, use, or disclosure of personal information about particular entities.  

     (k)  The term “dual-use foundation model” means an AI model that is trained on broad data; generally uses self-supervision; contains at least tens of billions of parameters; is applicable across a wide range of contexts; and that exhibits, or could be easily modified to exhibit, high levels of performance at tasks that pose a serious risk to security, national economic security, national public health or safety, or any combination of those matters, such as by:

          (i)    substantially lowering the barrier of entry for non-experts to design, synthesize, acquire, or use chemical, biological, radiological, or nuclear (CBRN) weapons;

          (ii)   enabling powerful offensive cyber operations through automated vulnerability discovery and exploitation against a wide range of potential targets of cyber attacks; or

          (iii)  permitting the evasion of human control or oversight through means of deception or obfuscation.

Models meet this definition even if they are provided to end users with technical safeguards that attempt to prevent users from taking advantage of the relevant unsafe capabilities. 

     (l)  The term “Federal law enforcement agency” has the meaning set forth in section 21(a) of Executive Order 14074 of May 25, 2022 (Advancing Effective, Accountable Policing and Criminal Justice Practices To Enhance Public Trust and Public Safety).

     (m)  The term “floating-point operation” means any mathematical operation or assignment involving floating-point numbers, which are a subset of the real numbers typically represented on computers by an integer of fixed precision scaled by an integer exponent of a fixed base.

     (n)  The term “foreign person” has the meaning set forth in section 5(c) of Executive Order 13984 of January 19, 2021 (Taking Additional Steps To Address the National Emergency With Respect to Significant Malicious Cyber-Enabled Activities).

     (o)  The terms “foreign reseller” and “foreign reseller of United States Infrastructure as a Service Products” mean a foreign person who has established an Infrastructure as a Service Account to provide Infrastructure as a Service Products subsequently, in whole or in part, to a third party.

     (p)  The term “generative AI” means the class of AI models that emulate the structure and characteristics of input data in order to generate derived synthetic content.  This can include images, videos, audio, text, and other digital content.

     (q)  The terms “Infrastructure as a Service Product,” “United States Infrastructure as a Service Product,” “United States Infrastructure as a Service Provider,” and “Infrastructure as a Service Account” each have the respective meanings given to those terms in section 5 of Executive Order 13984.

     (r)  The term “integer operation” means any mathematical operation or assignment involving only integers, or whole numbers expressed without a decimal point.

     (s)  The term “Intelligence Community” has the meaning given to that term in section 3.5(h) of Executive Order 12333 of December 4, 1981 (United States Intelligence Activities), as amended. 

     (t)  The term “machine learning” means a set of techniques that can be used to train AI algorithms to improve performance at a task based on data.

     (u)  The term “model weight” means a numerical parameter within an AI model that helps determine the model’s outputs in response to inputs.

     (v)  The term “national security system” has the meaning set forth in 44 U.S.C. 3552(b)(6).

     (w)  The term “omics” means biomolecules, including nucleic acids, proteins, and metabolites, that make up a cell or cellular system.

     (x)  The term “Open RAN” means the Open Radio Access Network approach to telecommunications-network standardization adopted by the O-RAN Alliance, Third Generation Partnership Project, or any similar set of published open standards for multi-vendor network equipment interoperability.

     (y)  The term “personally identifiable information” has the meaning set forth in Office of Management and Budget (OMB) Circular No. A-130.

     (z)  The term “privacy-enhancing technology” means any software or hardware solution, technical process, technique, or other technological means of mitigating privacy risks arising from data processing, including by enhancing predictability, manageability, disassociability, storage, security, and confidentiality.  These technological means may include secure multiparty computation, homomorphic encryption, zero-knowledge proofs, federated learning, secure enclaves, differential privacy, and synthetic-data-generation tools.  This is also sometimes referred to as “privacy-preserving technology.”

     (aa)  The term “privacy impact assessment” has the meaning set forth in OMB Circular No. A-130.

     (bb)  The term “Sector Risk Management Agency” has the meaning set forth in 6 U.S.C. 650(23).

     (cc)  The term “self-healing network” means a telecommunications network that automatically diagnoses and addresses network issues to permit self-restoration.

     (dd)  The term “synthetic biology” means a field of science that involves redesigning organisms, or the biomolecules of organisms, at the genetic level to give them new characteristics.  Synthetic nucleic acids are a type of biomolecule redesigned through synthetic-biology methods.

     (ee)  The term “synthetic content” means information, such as images, videos, audio clips, and text, that has been significantly modified or generated by algorithms, including by AI.

     (ff)  The term “testbed” means a facility or mechanism equipped for conducting rigorous, transparent, and replicable testing of tools and technologies, including AI and PETs, to help evaluate the functionality, usability, and performance of those tools or technologies.

     (gg)  The term “watermarking” means the act of embedding information, which is typically difficult to remove, into outputs created by AI — including into outputs such as photos, videos, audio clips, or text — for the purposes of verifying the authenticity of the output or the identity or characteristics of its provenance, modifications, or conveyance.
     Sec. 4.  Ensuring the Safety and Security of AI Technology.

     4.1.  Developing Guidelines, Standards, and Best Practices for AI Safety and Security.  (a)  Within 270 days of the date of this order, to help ensure the development of safe, secure, and trustworthy AI systems, the Secretary of Commerce, acting through the Director of the National Institute of Standards and Technology (NIST), in coordination with the Secretary of Energy, the Secretary of Homeland Security, and the heads of other relevant agencies as the Secretary of Commerce may deem appropriate, shall:

          (i)   Establish guidelines and best practices, with the aim of promoting consensus industry standards, for developing and deploying safe, secure, and trustworthy AI systems, including:

               (A)  developing a companion resource to the AI Risk Management Framework, NIST AI 100-1, for generative AI;

               (B)  developing a companion resource to the Secure Software Development Framework to incorporate secure development practices for generative AI and for dual-use foundation models; and

               (C)  launching an initiative to create guidance and benchmarks for evaluating and auditing AI capabilities, with a focus on capabilities through which AI could cause harm, such as in the areas of cybersecurity and biosecurity.

          (ii)  Establish appropriate guidelines (except for AI used as a component of a national security system), including appropriate procedures and processes, to enable developers of AI, especially of dual-use foundation models, to conduct AI red-teaming tests to enable deployment of safe, secure, and trustworthy systems.  These efforts shall include:

               (A)  coordinating or developing guidelines related to assessing and managing the safety, security, and trustworthiness of dual-use foundation models; and

               (B)  in coordination with the Secretary of Energy and the Director of the National Science Foundation (NSF), developing and helping to ensure the availability of testing environments, such as testbeds, to support the development of safe, secure, and trustworthy AI technologies, as well as to support the design, development, and deployment of associated PETs, consistent with section 9(b) of this order. 

     (b)  Within 270 days of the date of this order, to understand and mitigate AI security risks, the Secretary of Energy, in coordination with the heads of other Sector Risk Management Agencies (SRMAs) as the Secretary of Energy may deem appropriate, shall develop and, to the extent permitted by law and available appropriations, implement a plan for developing the Department of Energy’s AI model evaluation tools and AI testbeds.  The Secretary shall undertake this work using existing solutions where possible, and shall develop these tools and AI testbeds to be capable of assessing near-term extrapolations of AI systems’ capabilities.  At a minimum, the Secretary shall develop tools to evaluate AI capabilities to generate outputs that may represent nuclear, nonproliferation, biological, chemical, critical infrastructure, and energy-security threats or hazards.  The Secretary shall do this work solely for the purposes of guarding against these threats, and shall also develop model guardrails that reduce such risks.  The Secretary shall, as appropriate, consult with private AI laboratories, academia, civil society, and third-party evaluators, and shall use existing solutions.

     4.2.  Ensuring Safe and Reliable AI.  (a)  Within 90 days of the date of this order, to ensure and verify the continuous availability of safe, reliable, and effective AI in accordance with the Defense Production Act, as amended, 50 U.S.C. 4501 et seq., including for the national defense and the protection of critical infrastructure, the Secretary of Commerce shall require:

          (i)   Companies developing or demonstrating an intent to develop potential dual-use foundation models to provide the Federal Government, on an ongoing basis, with information, reports, or records regarding the following:

               (A)  any ongoing or planned activities related to training, developing, or producing dual-use foundation models, including the physical and cybersecurity protections taken to assure the integrity of that training process against sophisticated threats;

               (B)  the ownership and possession of the model weights of any dual-use foundation models, and the physical and cybersecurity measures taken to protect those model weights; and

               (C)  the results of any developed dual-use foundation model’s performance in relevant AI red-team testing based on guidance developed by NIST pursuant to subsection 4.1(a)(ii) of this section, and a description of any associated measures the company has taken to meet safety objectives, such as mitigations to improve performance on these red-team tests and strengthen overall model security.  Prior to the development of guidance on red-team testing standards by NIST pursuant to subsection 4.1(a)(ii) of this section, this description shall include the results of any red-team testing that the company has conducted relating to lowering the barrier to entry for the development, acquisition, and use of biological weapons by non-state actors; the discovery of software vulnerabilities and development of associated exploits; the use of software or tools to influence real or virtual events; the possibility for self-replication or propagation; and associated measures to meet safety objectives; and

          (ii)  Companies, individuals, or other organizations or entities that acquire, develop, or possess a potential large-scale computing cluster to report any such acquisition, development, or possession, including the existence and location of these clusters and the amount of total computing power available in each cluster.

     (b)  The Secretary of Commerce, in consultation with the Secretary of State, the Secretary of Defense, the Secretary of Energy, and the Director of National Intelligence, shall define, and thereafter update as needed on a regular basis, the set of technical conditions for models and computing clusters that would be subject to the reporting requirements of subsection 4.2(a) of this section.  Until such technical conditions are defined, the Secretary shall require compliance with these reporting requirements for:

          (i)   any model that was trained using a quantity of computing power greater than 1026 integer or floating-point operations, or using primarily biological sequence data and using a quantity of computing power greater than 1023 integer or floating-point operations; and

          (ii)  any computing cluster that has a set of machines physically co-located in a single datacenter, transitively connected by data center networking of over 100 Gbit/s, and having a theoretical maximum computing capacity of 1020 integer or floating-point operations per second for training AI.

     (c)  Because I find that additional steps must be taken to deal with the national emergency related to significant malicious cyber-enabled activities declared in Executive Order 13694 of April 1, 2015 (Blocking the Property of Certain Persons Engaging in Significant Malicious Cyber-Enabled Activities), as amended by Executive Order 13757 of December 28, 2016 (Taking Additional Steps to Address the National Emergency With Respect to Significant Malicious Cyber-Enabled Activities), and further amended by Executive Order 13984, to address the use of United States Infrastructure as a Service (IaaS) Products by foreign malicious cyber actors, including to impose additional record-keeping obligations with respect to foreign transactions and to assist in the investigation of transactions involving foreign malicious cyber actors, I hereby direct the Secretary of Commerce, within 90 days of the date of this order, to:

          (i)    Propose regulations that require United States IaaS Providers to submit a report to the Secretary of Commerce when a foreign person transacts with that United States IaaS Provider to train a large AI model with potential capabilities that could be used in malicious cyber-enabled activity (a “training run”).  Such reports shall include, at a minimum, the identity of the foreign person and the existence of any training run of an AI model meeting the criteria set forth in this section, or other criteria defined by the Secretary in regulations, as well as any additional information identified by the Secretary.

          (ii)   Include a requirement in the regulations proposed pursuant to subsection 4.2(c)(i) of this section that United States IaaS Providers prohibit any foreign reseller of their United States IaaS Product from providing those products unless such foreign reseller submits to the United States IaaS Provider a report, which the United States IaaS Provider must provide to the Secretary of Commerce, detailing each instance in which a foreign person transacts with the foreign reseller to use the United States IaaS Product to conduct a training run described in subsection 4.2(c)(i) of this section.  Such reports shall include, at a minimum, the information specified in subsection 4.2(c)(i) of this section as well as any additional information identified by the Secretary.

          (iii)  Determine the set of technical conditions for a large AI model to have potential capabilities that could be used in malicious cyber-enabled activity, and revise that determination as necessary and appropriate.  Until the Secretary makes such a determination, a model shall be considered to have potential capabilities that could be used in malicious cyber-enabled activity if it requires a quantity of computing power greater than 1026 integer or floating-point operations and is trained on a computing cluster that has a set of machines physically co-located in a single datacenter, transitively connected by data center networking of over 100 Gbit/s, and having a theoretical maximum compute capacity of 1020 integer or floating-point operations per second for training AI.   

     (d)  Within 180 days of the date of this order, pursuant to the finding set forth in subsection 4.2(c) of this section, the Secretary of Commerce shall propose regulations that require United States IaaS Providers to ensure that foreign resellers of United States IaaS Products verify the identity of any foreign person that obtains an IaaS account (account) from the foreign reseller.  These regulations shall, at a minimum:

          (i)    Set forth the minimum standards that a United States IaaS Provider must require of foreign resellers of its United States IaaS Products to verify the identity of a foreign person who opens an account or maintains an existing account with a foreign reseller, including:

               (A)  the types of documentation and procedures that foreign resellers of United States IaaS Products must require to verify the identity of any foreign person acting as a lessee or sub-lessee of these products or services;

               (B)  records that foreign resellers of United States IaaS Products must securely maintain regarding a foreign person that obtains an account, including information establishing:

                    (1)  the identity of such foreign person, including name and address;

                    (2)  the means and source of payment (including any associated financial institution and other identifiers such as credit card number, account number, customer identifier, transaction identifiers, or virtual currency wallet or wallet address identifier);

                    (3)  the electronic mail address and telephonic contact information used to verify a foreign person’s identity; and

                    (4)  the Internet Protocol addresses used for access or administration and the date and time of each such access or administrative action related to ongoing verification of such foreign person’s ownership of such an account; and

               (C)  methods that foreign resellers of United States IaaS Products must implement to limit all third-party access to the information described in this subsection, except insofar as such access is otherwise consistent with this order and allowed under applicable law;

          (ii)   Take into consideration the types of accounts maintained by foreign resellers of United States IaaS Products, methods of opening an account, and types of identifying information available to accomplish the objectives of identifying foreign malicious cyber actors using any such products and avoiding the imposition of an undue burden on such resellers; and

          (iii)  Provide that the Secretary of Commerce, in accordance with such standards and procedures as the Secretary may delineate and in consultation with the Secretary of Defense, the Attorney General, the Secretary of Homeland Security, and the Director of National Intelligence, may exempt a United States IaaS Provider with respect to any specific foreign reseller of their United States IaaS Products, or with respect to any specific type of account or lessee, from the requirements of any regulation issued pursuant to this subsection.  Such standards and procedures may include a finding by the Secretary that such foreign reseller, account, or lessee complies with security best practices to otherwise deter abuse of United States IaaS Products.

     (e)  The Secretary of Commerce is hereby authorized to take such actions, including the promulgation of rules and regulations, and to employ all powers granted to the President by the International Emergency Economic Powers Act, 50 U.S.C. 1701 et seq., as may be necessary to carry out the purposes of subsections 4.2(c) and (d) of this section.  Such actions may include a requirement that United States IaaS Providers require foreign resellers of United States IaaS Products to provide United States IaaS Providers verifications relative to those subsections.

     4.3.  Managing AI in Critical Infrastructure and in Cybersecurity.  (a)  To ensure the protection of critical
infrastructure, the following actions shall be taken:

          (i)    Within 90 days of the date of this order, and at least annually thereafter, the head of each agency with relevant regulatory authority over critical infrastructure and the heads of relevant SRMAs, in coordination with the Director of the Cybersecurity and Infrastructure Security Agency within the Department of Homeland Security for consideration of cross-sector risks, shall evaluate and provide to the Secretary of Homeland Security an assessment of potential risks related to the use of AI in critical infrastructure sectors involved, including ways in which deploying AI may make critical infrastructure systems more vulnerable to critical failures, physical attacks, and cyber attacks, and shall consider ways to mitigate these vulnerabilities.  Independent regulatory agencies are encouraged, as they deem appropriate, to contribute to sector-specific risk assessments.

          (ii)   Within 150 days of the date of this order, the Secretary of the Treasury shall issue a public report on best practices for financial institutions to manage AI-specific cybersecurity risks.

          (iii)  Within 180 days of the date of this order, the Secretary of Homeland Security, in coordination with the Secretary of Commerce and with SRMAs and other regulators as determined by the Secretary of Homeland Security, shall incorporate as appropriate the AI Risk Management Framework, NIST AI 100-1, as well as other appropriate security guidance, into relevant safety and security guidelines for use by critical infrastructure owners and operators.

          (iv)   Within 240 days of the completion of the guidelines described in subsection 4.3(a)(iii) of this section, the Assistant to the President for National Security Affairs and the Director of OMB, in consultation with the Secretary of Homeland Security, shall coordinate work by the heads of agencies with authority over critical infrastructure to develop and take steps for the Federal Government to mandate such guidelines, or appropriate portions thereof, through regulatory or other appropriate action.  Independent regulatory agencies are encouraged, as they deem appropriate, to consider whether to mandate guidance through regulatory action in their areas of authority and responsibility.

          (v)    The Secretary of Homeland Security shall establish an Artificial Intelligence Safety and Security Board as an advisory committee pursuant to section 871 of the Homeland Security Act of 2002 (Public Law 107-296).  The Advisory Committee shall include AI experts from the private sector, academia, and government, as appropriate, and provide to the Secretary of Homeland Security and the Federal Government’s critical infrastructure community advice, information, or recommendations for improving security, resilience, and incident response related to AI usage in critical infrastructure.

     (b)  To capitalize on AI’s potential to improve United States cyber defenses:

          (i)    The Secretary of Defense shall carry out the actions described in subsections 4.3(b)(ii) and (iii) of this section for national security systems, and the Secretary of Homeland Security shall carry out these actions for non-national security systems.  Each shall do so in consultation with the heads of other relevant agencies as the Secretary of Defense and the Secretary of Homeland Security may deem appropriate. 

          (ii)   As set forth in subsection 4.3(b)(i) of this section, within 180 days of the date of this order, the Secretary of Defense and the Secretary of Homeland Security shall, consistent with applicable law, each develop plans for, conduct, and complete an operational pilot project to identify, develop, test, evaluate, and deploy AI capabilities, such as large-language models, to aid in the discovery and remediation of vulnerabilities in critical United States Government software, systems, and networks.

          (iii)  As set forth in subsection 4.3(b)(i) of this section, within 270 days of the date of this order, the Secretary of Defense and the Secretary of Homeland Security shall each provide a report to the Assistant to the President for National Security Affairs on the results of actions taken pursuant to the plans and operational pilot projects required by subsection 4.3(b)(ii) of this section, including a description of any vulnerabilities found and fixed through the development and deployment of AI capabilities and any lessons learned on how to identify, develop, test, evaluate, and deploy AI capabilities effectively for cyber defense.

     4.4.  Reducing Risks at the Intersection of AI and CBRN Threats.  (a)  To better understand and mitigate the risk of AI being misused to assist in the development or use of CBRN threats — with a particular focus on biological weapons — the following actions shall be taken: 

          (i)   Within 180 days of the date of this order, the Secretary of Homeland Security, in consultation with the Secretary of Energy and the Director of the Office of Science and Technology Policy (OSTP), shall evaluate the potential for AI to be misused to enable the development or production of CBRN threats, while also considering the benefits and application of AI to counter these threats, including, as appropriate, the results of work conducted under section 8(b) of this order.  The Secretary of Homeland Security shall:

               (A)  consult with experts in AI and CBRN issues from the Department of Energy, private AI laboratories, academia, and third-party model evaluators, as appropriate, to evaluate AI model capabilities to present CBRN threats — for the sole purpose of guarding against those threats — as well as options for minimizing the risks of AI model misuse to generate or exacerbate those threats; and

               (B)  submit a report to the President that describes the progress of these efforts, including an assessment of the types of AI models that may present CBRN risks to the United States, and that makes recommendations for regulating or overseeing the training, deployment, publication, or use of these models, including requirements for safety evaluations and guardrails for mitigating potential threats to national security.

          (ii)  Within 120 days of the date of this order, the Secretary of Defense, in consultation with the Assistant to the President for National Security Affairs and the Director of OSTP, shall enter into a contract with the National Academies of Sciences, Engineering, and Medicine to conduct — and submit to the Secretary of Defense, the Assistant to the President for National Security Affairs, the Director of the Office of Pandemic Preparedness and Response Policy, the Director of OSTP, and the Chair of the Chief Data Officer Council — a study that:

               (A)  assesses the ways in which AI can increase biosecurity risks, including risks from generative AI models trained on biological data, and makes recommendations on how to mitigate these risks;

               (B)  considers the national security implications of the use of data and datasets, especially those associated with pathogens and omics studies, that the United States Government hosts, generates, funds the creation of, or otherwise owns, for the training of generative AI models, and makes recommendations on how to mitigate the risks related to the use of these data and datasets;

               (C)  assesses the ways in which AI applied to biology can be used to reduce biosecurity risks, including recommendations on opportunities to coordinate data and high-performance computing resources; and

               (D)  considers additional concerns and opportunities at the intersection of AI and synthetic biology that the Secretary of Defense deems appropriate.

     (b)  To reduce the risk of misuse of synthetic nucleic acids, which could be substantially increased by AI’s capabilities in this area, and improve biosecurity measures for the nucleic acid synthesis industry, the following actions shall be taken:

          (i)    Within 180 days of the date of this order, the Director of OSTP, in consultation with the Secretary of State, the Secretary of Defense, the Attorney General, the Secretary of Commerce, the Secretary of Health and Human Services (HHS), the Secretary of Energy, the Secretary of Homeland Security, the Director of National Intelligence, and the heads of other relevant agencies as the Director of OSTP may deem appropriate, shall establish a framework, incorporating, as appropriate, existing United States Government guidance, to encourage providers of synthetic nucleic acid sequences to implement comprehensive, scalable, and verifiable synthetic nucleic acid procurement screening mechanisms, including standards and recommended incentives.  As part of this framework, the Director of OSTP shall:

               (A)  establish criteria and mechanisms for ongoing identification of biological sequences that could be used in a manner that would pose a risk to the national security of the United States; and

               (B)  determine standardized methodologies and tools for conducting and verifying the performance of sequence synthesis procurement screening, including customer screening approaches to support due diligence with respect to managing security risks posed by purchasers of biological sequences identified in subsection 4.4(b)(i)(A) of this section, and processes for the reporting of concerning activity to enforcement entities.

          (ii)   Within 180 days of the date of this order, the Secretary of Commerce, acting through the Director of NIST, in coordination with the Director of OSTP, and in consultation with the Secretary of State, the Secretary of HHS, and the heads of other relevant agencies as the Secretary of Commerce may deem appropriate, shall initiate an effort to engage with industry and relevant stakeholders, informed by the framework developed under subsection 4.4(b)(i) of this section, to develop and refine for possible use by synthetic nucleic acid sequence providers:

               (A)  specifications for effective nucleic acid synthesis procurement screening;

               (B)  best practices, including security and access controls, for managing sequence-of-concern databases to support such screening;

               (C)  technical implementation guides for effective screening; and

               (D)  conformity-assessment best practices and mechanisms.

          (iii)  Within 180 days of the establishment of the framework pursuant to subsection 4.4(b)(i) of this section, all agencies that fund life-sciences research shall, as appropriate and consistent with applicable law, establish that, as a requirement of funding, synthetic nucleic acid procurement is conducted through providers or manufacturers that adhere to the framework, such as through an attestation from the provider or manufacturer.  The Assistant to the President for National Security Affairs and the Director of OSTP shall coordinate the process of reviewing such funding requirements to facilitate consistency in implementation of the framework across funding agencies.

          (iv)   In order to facilitate effective implementation of the measures described in subsections 4.4(b)(i)-(iii) of this section, the Secretary of Homeland Security, in consultation with the heads of other relevant agencies as the Secretary of Homeland Security may deem appropriate, shall:

               (A)  within 180 days of the establishment of the framework pursuant to subsection 4.4(b)(i) of this section, develop a framework to conduct structured evaluation and stress testing of nucleic acid synthesis procurement screening, including the systems developed in accordance with subsections 4.4(b)(i)-(ii) of this section and implemented by providers of synthetic nucleic acid sequences; and

               (B)  following development of the framework pursuant to subsection 4.4(b)(iv)(A) of this section, submit an annual report to the Assistant to the President for National Security Affairs, the Director of the Office of Pandemic Preparedness and Response Policy, and the Director of OSTP on any results of the activities conducted pursuant to subsection 4.4(b)(iv)(A) of this section, including recommendations, if any, on how to strengthen nucleic acid synthesis procurement screening, including customer screening systems.

     4.5.  Reducing the Risks Posed by Synthetic Content.

 To foster capabilities for identifying and labeling synthetic content produced by AI systems, and to establish the authenticity and provenance of digital content, both synthetic and not synthetic, produced by the Federal Government or on its behalf:

     (a)  Within 240 days of the date of this order, the Secretary of Commerce, in consultation with the heads of other relevant agencies as the Secretary of Commerce may deem appropriate, shall submit a report to the Director of OMB and the Assistant to the President for National Security Affairs identifying the existing standards, tools, methods, and practices, as well as the potential development of further science-backed standards and techniques, for:

          (i)    authenticating content and tracking its provenance;

          (ii)   labeling synthetic content, such as using watermarking;

          (iii)  detecting synthetic content;

          (iv)   preventing generative AI from producing child sexual abuse material or producing non-consensual intimate imagery of real individuals (to include intimate digital depictions of the body or body parts of an identifiable individual);

          (v)    testing software used for the above purposes; and

          (vi)   auditing and maintaining synthetic content.

     (b)  Within 180 days of submitting the report required under subsection 4.5(a) of this section, and updated periodically thereafter, the Secretary of Commerce, in coordination with the Director of OMB, shall develop guidance regarding the existing tools and practices for digital content authentication and synthetic content detection measures.  The guidance shall include measures for the purposes listed in subsection 4.5(a) of this section.

     (c)  Within 180 days of the development of the guidance required under subsection 4.5(b) of this section, and updated periodically thereafter, the Director of OMB, in consultation with the Secretary of State; the Secretary of Defense; the Attorney General; the Secretary of Commerce, acting through the Director of NIST; the Secretary of Homeland Security; the Director of National Intelligence; and the heads of other agencies that the Director of OMB deems appropriate, shall — for the purpose of strengthening public confidence in the integrity of official United States Government digital content — issue guidance to agencies for labeling and authenticating such content that they produce or publish.

     (d)  The Federal Acquisition Regulatory Council shall, as appropriate and consistent with applicable law, consider amending the Federal Acquisition Regulation to take into account the guidance established under subsection 4.5 of this section.

     4.6.  Soliciting Input on Dual-Use Foundation Models with Widely Available Model Weights.  When the weights for a dual-use foundation model are widely available — such as when they are publicly posted on the Internet — there can be substantial benefits to innovation, but also substantial security risks, such as the removal of safeguards within the model.  To address the risks and potential benefits of dual-use foundation models with widely available weights, within 270 days of the date of this order, the Secretary of Commerce, acting through the Assistant Secretary of Commerce for Communications and Information, and in consultation with the Secretary of State, shall:

     (a)  solicit input from the private sector, academia, civil society, and other stakeholders through a public consultation process on potential risks, benefits, other implications, and appropriate policy and regulatory approaches related to dual-use foundation models for which the model weights are widely available, including:

          (i)    risks associated with actors fine-tuning dual-use foundation models for which the model weights are widely available or removing those models’ safeguards;

          (ii)   benefits to AI innovation and research, including research into AI safety and risk management, of dual-use foundation models for which the model weights are widely available; and

          (iii)  potential voluntary, regulatory, and international mechanisms to manage the risks and maximize the benefits of dual-use foundation models for which the model weights are widely available; and

     (b)  based on input from the process described in subsection 4.6(a) of this section, and in consultation with the heads of other relevant agencies as the Secretary of Commerce deems appropriate, submit a report to the President on the potential benefits, risks, and implications of dual-use foundation models for which the model weights are widely available, as well as policy and regulatory recommendations pertaining to those models.

     4.7.  Promoting Safe Release and Preventing the Malicious Use of Federal Data for AI Training.To improve public data access and manage security risks, and consistent with the objectives of the Open, Public, Electronic, and Necessary Government Data Act (title II of Public Law 115-435) to expand public access to Federal data assets in a machine-readable format while also taking into account security considerations, including the risk that information in an individual data asset in isolation does not pose a security risk but, when combined with other available information, may pose such a risk:

     (a)  within 270 days of the date of this order, the Chief Data Officer Council, in consultation with the Secretary of Defense, the Secretary of Commerce, the Secretary of Energy, the Secretary of Homeland Security, and the Director of National Intelligence, shall develop initial guidelines for performing security reviews, including reviews to identify and manage the potential security risks of releasing Federal data that could aid in the development of CBRN weapons as well as the development of autonomous offensive cyber capabilities, while also providing public access to Federal Government data in line with the goals stated in the Open, Public, Electronic, and Necessary Government Data Act (title II of Public Law 115-435); and

     (b)  within 180 days of the development of the initial guidelines required by subsection 4.7(a) of this section, agencies shall conduct a security review of all data assets in the comprehensive data inventory required under 44 U.S.C. 3511(a)(1) and (2)(B) and shall take steps, as appropriate and consistent with applicable law, to address the highest-priority potential security risks that releasing that data could raise with respect to CBRN weapons, such as the ways in which that data could be used to train AI systems.

     4.8.  Directing the Development of a National Security Memorandum.  To develop a coordinated executive branch approach to managing AI’s security risks, the Assistant to the President for National Security Affairs and the Assistant to the President and Deputy Chief of Staff for Policy shall oversee an interagency process with the purpose of, within 270 days of the date of this order, developing and submitting a proposed National Security Memorandum on AI to the President.  The memorandum shall address the governance of AI used as a component of a national security system or for military and intelligence purposes.  The memorandum shall take into account current efforts to govern the development and use of AI for national security systems.  The memorandum shall outline actions for the Department of Defense, the Department of State, other relevant agencies, and the Intelligence Community to address the national security risks and potential benefits posed by AI.  In particular, the memorandum shall:

     (a)  provide guidance to the Department of Defense, other relevant agencies, and the Intelligence Community on the continued adoption of AI capabilities to advance the United States national security mission, including through directing specific AI assurance and risk-management practices for national security uses of AI that may affect the rights or safety of United States persons and, in appropriate contexts, non-United States persons; and

     (b)  direct continued actions, as appropriate and consistent with applicable law, to address the potential use of AI systems by adversaries and other foreign actors in ways that threaten the capabilities or objectives of the Department of Defense or the Intelligence Community, or that otherwise pose risks to the security of the United States or its allies and partners.  

     Sec. 5. Promoting Innovation and Competition.

     5.1.  Attracting AI Talent to the United States.  (a)  Within 90 days of the date of this order, to attract and retain talent in AI and other critical and emerging technologies in the United States economy, the Secretary of State and the Secretary of Homeland Security shall take appropriate steps to:

          (i)   streamline processing times of visa petitions and applications, including by ensuring timely availability of visa appointments, for noncitizens who seek to travel to the United States to work on, study, or conduct research in AI or other critical and emerging technologies; and 

          (ii)  facilitate continued availability of visa appointments in sufficient volume for applicants with expertise in AI or other critical and emerging technologies.

     (b)  Within 120 days of the date of this order, the Secretary of State shall:

          (i)    consider initiating a rulemaking to establish new criteria to designate countries and skills on the Department of State’s Exchange Visitor Skills List as it relates to the 2-year foreign residence requirement for certain J-1 nonimmigrants, including those skills that are critical to the United States;

          (ii)   consider publishing updates to the 2009 Revised Exchange Visitor Skills List (74 FR 20108); and

          (iii)  consider implementing a domestic visa renewal program under 22 C.F.R. 41.111(b) to facilitate the ability of qualified applicants, including highly skilled talent in AI and critical and emerging technologies, to continue their work in the United States without unnecessary interruption.

     (c)  Within 180 days of the date of this order, the Secretary of State shall:

          (i)   consider initiating a rulemaking to expand the categories of nonimmigrants who qualify for the domestic visa renewal program covered under 22 C.F.R. 41.111(b) to include academic J-1 research scholars and F-1 students in science, technology, engineering, and mathematics (STEM); and

          (ii)  establish, to the extent permitted by law and available appropriations, a program to identify and attract top talent in AI and other critical and emerging technologies at universities, research institutions, and the private sector overseas, and to establish and increase connections with that talent to educate them on opportunities and resources for research and employment in the United States, including overseas educational components to inform top STEM talent of nonimmigrant and immigrant visa options and potential expedited adjudication of their visa petitions and applications.

     (d)  Within 180 days of the date of this order, the Secretary of Homeland Security shall:

          (i)   review and initiate any policy changes the Secretary determines necessary and appropriate to clarify and modernize immigration pathways for experts in AI and other critical and emerging technologies, including O-1A and EB-1 noncitizens of extraordinary ability; EB-2 advanced-degree holders and noncitizens of exceptional ability; and startup founders in AI and other critical and emerging technologies using the International Entrepreneur Rule; and

          (ii)  continue its rulemaking process to modernize the H-1B program and enhance its integrity and usage, including by experts in AI and other critical and emerging technologies, and consider initiating a rulemaking to enhance the process for noncitizens, including experts in AI and other critical and emerging technologies and their spouses, dependents, and children, to adjust their status to lawful permanent resident.

     (e)  Within 45 days of the date of this order, for purposes of considering updates to the “Schedule A” list of occupations, 20 C.F.R. 656.5, the Secretary of Labor shall publish a request for information (RFI) to solicit public input, including from industry and worker-advocate communities, identifying AI and other STEM-related occupations, as well as additional occupations across the economy, for which there is an insufficient number of ready, willing, able, and qualified United States workers.

     (f)  The Secretary of State and the Secretary of Homeland Security shall, consistent with applicable law and implementing regulations, use their discretionary authorities to support and attract foreign nationals with special skills in AI and other critical and emerging technologies seeking to work, study, or conduct research in the United States.

     (g)  Within 120 days of the date of this order, the Secretary of Homeland Security, in consultation with the Secretary of State, the Secretary of Commerce, and the Director of OSTP, shall develop and publish informational resources to better attract and retain experts in AI and other critical and emerging technologies, including:

          (i)   a clear and comprehensive guide for experts in AI and other critical and emerging technologies to understand their options for working in the United States, to be published in multiple relevant languages on AI.gov; and

          (ii)  a public report with relevant data on applications, petitions, approvals, and other key indicators of how experts in AI and other critical and emerging technologies have utilized the immigration system through the end of Fiscal Year 2023.

     5.2.  Promoting Innovation.  (a)  To develop and strengthen public-private partnerships for advancing innovation, commercialization, and risk-mitigation methods for AI, and to help promote safe, responsible, fair, privacy-protecting, and trustworthy AI systems, the Director of NSF shall take the following steps:

          (i)    Within 90 days of the date of this order, in coordination with the heads of agencies that the Director of NSF deems appropriate, launch a pilot program implementing the National AI Research Resource (NAIRR), consistent with past recommendations of the NAIRR Task Force.  The program shall pursue the infrastructure, governance mechanisms, and user interfaces to pilot an initial integration of distributed computational, data, model, and training resources to be made available to the research community in support of AI-related research and development.  The Director of NSF shall identify Federal and private sector computational, data, software, and training resources appropriate for inclusion in the NAIRR pilot program.  To assist with such work, within 45 days of the date of this order, the heads of agencies whom the Director of NSF identifies for coordination pursuant to this subsection shall each submit to the Director of NSF a report identifying the agency resources that could be developed and integrated into such a pilot program.  These reports shall include a description of such resources, including their current status and availability; their format, structure, or technical specifications; associated agency expertise that will be provided; and the benefits and risks associated with their inclusion in the NAIRR pilot program.  The heads of independent regulatory agencies are encouraged to take similar steps, as they deem appropriate.

          (ii)   Within 150 days of the date of this order, fund and launch at least one NSF Regional Innovation Engine that prioritizes AI-related work, such as AI-related research, societal, or workforce needs.

          (iii)  Within 540 days of the date of this order, establish at least four new National AI Research Institutes, in addition to the 25 currently funded as of the date of this order. 

     (b)  Within 120 days of the date of this order, to support activities involving high-performance and data-intensive computing, the Secretary of Energy, in coordination with the Director of NSF, shall, in a manner consistent with applicable law and available appropriations, establish a pilot program to enhance existing successful training programs for scientists, with the goal of training 500 new researchers by 2025 capable of meeting the rising demand for AI talent.

     (c)  To promote innovation and clarify issues related to AI and inventorship of patentable subject matter, the Under Secretary of Commerce for Intellectual Property and Director of the United States Patent and Trademark Office (USPTO Director) shall:

          (i)    within 120 days of the date of this order, publish guidance to USPTO patent examiners and applicants addressing inventorship and the use of AI, including generative AI, in the inventive process, including illustrative examples in which AI systems play different roles in inventive processes and how, in each example, inventorship issues ought to be analyzed;

          (ii)   subsequently, within 270 days of the date of this order, issue additional guidance to USPTO patent examiners and applicants to address other considerations at the intersection of AI and IP, which could include, as the USPTO Director deems necessary, updated guidance on patent eligibility to address innovation in AI and critical and emerging technologies; and

          (iii)  within 270 days of the date of this order or 180 days after the United States Copyright Office of the Library of Congress publishes its forthcoming AI study that will address copyright issues raised by AI, whichever comes later, consult with the Director of the United States Copyright Office and issue recommendations to the President on potential executive actions relating to copyright and AI.  The recommendations shall address any copyright and related issues discussed in the United States Copyright Office’s study, including the scope of protection for works produced using AI and the treatment of copyrighted works in AI training.

     (d)  Within 180 days of the date of this order, to assist developers of AI in combatting AI-related IP risks, the Secretary of Homeland Security, acting through the Director of the National Intellectual Property Rights Coordination Center, and in consultation with the Attorney General, shall develop a training, analysis, and evaluation program to mitigate AI-related IP risks.  Such a program shall:

          (i)    include appropriate personnel dedicated to collecting and analyzing reports of AI-related IP theft, investigating such incidents with implications for national security, and, where appropriate and consistent with applicable law, pursuing related enforcement actions;

          (ii)   implement a policy of sharing information and coordinating on such work, as appropriate and consistent with applicable law, with the Federal Bureau of Investigation; United States Customs and Border Protection; other agencies; State and local agencies; and appropriate international organizations, including through work-sharing agreements;

          (iii)  develop guidance and other appropriate resources to assist private sector actors with mitigating the risks of AI-related IP theft;

          (iv)   share information and best practices with AI developers and law enforcement personnel to identify incidents, inform stakeholders of current legal requirements, and evaluate AI systems for IP law violations, as well as develop mitigation strategies and resources; and

          (v)    assist the Intellectual Property Enforcement Coordinator in updating the Intellectual Property Enforcement Coordinator Joint Strategic Plan on Intellectual Property Enforcement to address AI-related issues.

     (e)  To advance responsible AI innovation by a wide range of healthcare technology developers that promotes the welfare of patients and workers in the healthcare sector, the Secretary of HHS shall identify and, as appropriate and consistent with applicable law and the activities directed in section 8 of this order, prioritize grantmaking and other awards, as well as undertake related efforts, to support responsible AI development and use, including:

          (i)    collaborating with appropriate private sector actors through HHS programs that may support the advancement of AI-enabled tools that develop personalized immune-response profiles for patients, consistent with section 4 of this order;

          (ii)   prioritizing the allocation of 2024 Leading Edge Acceleration Project cooperative agreement awards to initiatives that explore ways to improve healthcare-data quality to support the responsible development of AI tools for clinical care, real-world-evidence programs, population health, public health, and related research; and

          (iii)  accelerating grants awarded through the National Institutes of Health Artificial Intelligence/Machine Learning Consortium to Advance Health Equity and Researcher Diversity (AIM-AHEAD) program and showcasing current AIM-AHEAD activities in underserved communities.

     (f)  To advance the development of AI systems that improve the quality of veterans’ healthcare, and in order to support small businesses’ innovative capacity, the Secretary of Veterans Affairs shall:

          (i)   within 365 days of the date of this order, host two 3-month nationwide AI Tech Sprint competitions; and

          (ii)  as part of the AI Tech Sprint competitions and in collaboration with appropriate partners, provide participants access to technical assistance, mentorship opportunities, individualized expert feedback on products under development, potential contract opportunities, and other programming and resources.

     (g)  Within 180 days of the date of this order, to support the goal of strengthening our Nation’s resilience against climate change impacts and building an equitable clean energy economy for the future, the Secretary of Energy, in consultation with the Chair of the Federal Energy Regulatory Commission, the Director of OSTP, the Chair of the Council on Environmental Quality, the Assistant to the President and National Climate Advisor, and the heads of other relevant agencies as the Secretary of Energy may deem appropriate, shall:

          (i)    issue a public report describing the potential for AI to improve planning, permitting, investment, and operations for electric grid infrastructure and to enable the provision of clean, affordable, reliable, resilient, and secure electric power to all Americans;

          (ii)   develop tools that facilitate building foundation models useful for basic and applied science, including models that streamline permitting and environmental reviews while improving environmental and social outcomes;

          (iii)  collaborate, as appropriate, with private sector organizations and members of academia to support development of AI tools to mitigate climate change risks;

          (iv)   take steps to expand partnerships with industry, academia, other agencies, and international allies and partners to utilize the Department of Energy’s computing capabilities and AI testbeds to build foundation models that support new applications in science and energy, and for national security, including partnerships that increase community preparedness for climate-related risks, enable clean-energy deployment (including addressing delays in permitting reviews), and enhance grid reliability and resilience; and

          (v)    establish an office to coordinate development of AI and other critical and emerging technologies across Department of Energy programs and the 17 National Laboratories.

     (h)  Within 180 days of the date of this order, to understand AI’s implications for scientific research, the President’s Council of Advisors on Science and Technology shall submit to the President and make publicly available a report on the potential role of AI, especially given recent developments in AI, in research aimed at tackling major societal and global challenges.  The report shall include a discussion of issues that may hinder the effective use of AI in research and practices needed to ensure that AI is used responsibly for research.

     5.3.  Promoting Competition.  (a)  The head of each agency developing policies and regulations related to AI shall use their authorities, as appropriate and consistent with applicable law, to promote competition in AI and related technologies, as well as in other markets.  Such actions include addressing risks arising from concentrated control of key inputs, taking steps to stop unlawful collusion and prevent dominant firms from disadvantaging competitors, and working to provide new opportunities for small businesses and entrepreneurs.  In particular, the Federal Trade Commission is encouraged to consider, as it deems appropriate, whether to exercise the Commission’s existing authorities, including its rulemaking authority under the Federal Trade Commission Act, 15 U.S.C. 41 et seq., to ensure fair competition in the AI marketplace and to ensure that consumers and workers are protected from harms that may be enabled by the use of AI.

     (b)  To promote competition and innovation in the semiconductor industry, recognizing that semiconductors power AI technologies and that their availability is critical to AI competition, the Secretary of Commerce shall, in implementing division A of Public Law 117-167, known as the Creating Helpful Incentives to Produce Semiconductors (CHIPS) Act of 2022, promote competition by:

          (i)    implementing a flexible membership structure for the National Semiconductor Technology Center that attracts all parts of the semiconductor and microelectronics ecosystem, including startups and small firms;

          (ii)   implementing mentorship programs to increase interest and participation in the semiconductor industry, including from workers in underserved communities;

          (iii)  increasing, where appropriate and to the extent permitted by law, the availability of resources to startups and small businesses, including:

               (A)  funding for physical assets, such as specialty equipment or facilities, to which startups and small businesses may not otherwise have access;

               (B)  datasets — potentially including test and performance data — collected, aggregated, or shared by CHIPS research and development programs;

               (C)  workforce development programs;

               (D)  design and process technology, as well as IP, as appropriate; and

               (E)  other resources, including technical and intellectual property assistance, that could accelerate commercialization of new technologies by startups and small businesses, as appropriate; and

          (iv)   considering the inclusion, to the maximum extent possible, and as consistent with applicable law, of competition-increasing measures in notices of funding availability for commercial research-and-development facilities focused on semiconductors, including measures that increase access to facility capacity for startups or small firms developing semiconductors used to power AI technologies.

     (c)  To support small businesses innovating and commercializing AI, as well as in responsibly adopting and deploying AI, the Administrator of the Small Business Administration shall:

          (i)    prioritize the allocation of Regional Innovation Cluster program funding for clusters that support planning activities related to the establishment of one or more Small Business AI Innovation and Commercialization Institutes that provide support, technical assistance, and other resources to small businesses seeking to innovate, commercialize, scale, or otherwise advance the development of AI;

          (ii)   prioritize the allocation of up to $2 million in Growth Accelerator Fund Competition bonus prize funds for accelerators that support the incorporation or expansion of AI-related curricula, training, and technical assistance, or other AI-related resources within their programming; and

          (iii)  assess the extent to which the eligibility criteria of existing programs, including the State Trade Expansion Program, Technical and Business Assistance funding, and capital-access programs — such as the 7(a) loan program, 504 loan program, and Small Business Investment Company (SBIC) program — support appropriate expenses by small businesses related to the adoption of AI and, if feasible and appropriate, revise eligibility criteria to improve support for these expenses. 

     (d)  The Administrator of the Small Business Administration, in coordination with resource partners, shall conduct outreach regarding, and raise awareness of, opportunities for small businesses to use capital-access programs described in subsection 5.3(c) of this section for eligible AI-related purposes, and for eligible investment funds with AI-related expertise — particularly those seeking to serve or with experience serving underserved communities — to apply for an SBIC license.

     Sec. 6.  Supporting Workers.(a)  To advance the Government’s understanding of AI’s implications for workers, the following actions shall be taken within 180 days of the date of this order:

          (i)   The Chairman of the Council of Economic Advisers shall prepare and submit a report to the President on the labor-market effects of AI.

          (ii)  To evaluate necessary steps for the Federal Government to address AI-related workforce disruptions, the Secretary of Labor shall submit to the President a report analyzing the abilities of agencies to support workers displaced by the adoption of AI and other technological advancements.  The report shall, at a minimum:

               (A)  assess how current or formerly operational Federal programs designed to assist workers facing job disruptions — including unemployment insurance and programs authorized by the Workforce Innovation and Opportunity Act (Public Law 113-128) — could be used to respond to possible future AI-related disruptions; and

               (B)  identify options, including potential legislative measures, to strengthen or develop additional Federal support for workers displaced by AI and, in consultation with the Secretary of Commerce and the Secretary of Education, strengthen and expand education and training opportunities that provide individuals pathways to occupations related to AI.

     (b)  To help ensure that AI deployed in the workplace advances employees’ well-being:

          (i)    The Secretary of Labor shall, within 180 days of the date of this order and in consultation with other agencies and with outside entities, including labor unions and workers, as the Secretary of Labor deems appropriate, develop and publish principles and best practices for employers that could be used to mitigate AI’s potential harms to employees’ well-being and maximize its potential benefits.  The principles and best practices shall include specific steps for employers to take with regard to AI, and shall cover, at a minimum:

               (A)  job-displacement risks and career opportunities related to AI, including effects on job skills and evaluation of applicants and workers;

               (B)  labor standards and job quality, including issues related to the equity, protected-activity, compensation, health, and safety implications of AI in the workplace; and

               (C)  implications for workers of employers’ AI-related collection and use of data about them, including transparency, engagement, management, and activity protected under worker-protection laws.

          (ii)   After principles and best practices are developed pursuant to subsection (b)(i) of this section, the heads of agencies shall consider, in consultation with the Secretary of Labor, encouraging the adoption of these guidelines in their programs to the extent appropriate for each program and consistent with applicable law.

          (iii)  To support employees whose work is monitored or augmented by AI in being compensated appropriately for all of their work time, the Secretary of Labor shall issue guidance to make clear that employers that deploy AI to monitor or augment employees’ work must continue to comply with protections that ensure that workers are compensated for their hours worked, as defined under the Fair Labor Standards Act of 1938, 29 U.S.C. 201 et seq., and other legal requirements.

     (c)  To foster a diverse AI-ready workforce, the Director of NSF shall prioritize available resources to support AI-related education and AI-related workforce development through existing programs.  The Director shall additionally consult with agencies, as appropriate, to identify further opportunities for agencies to allocate resources for those purposes.  The actions by the Director shall use appropriate fellowship programs and awards for these purposes.

     Sec. 7.  Advancing Equity and Civil Rights.

     7.1.  Strengthening AI and Civil Rights in the Criminal Justice System.  (a)  To address unlawful discrimination and other harms that may be exacerbated by AI, the Attorney General shall:

          (i)    consistent with Executive Order 12250 of November 2, 1980 (Leadership and Coordination of Nondiscrimination Laws), Executive Order 14091, and 28 C.F.R. 0.50-51, coordinate with and support agencies in their implementation and enforcement of existing Federal laws to address civil rights and civil liberties violations and discrimination related to AI; 

          (ii)   direct the Assistant Attorney General in charge of the Civil Rights Division to convene, within 90 days of the date of this order, a meeting of the heads of Federal civil rights offices — for which meeting the heads of civil rights offices within independent regulatory agencies will be encouraged to join — to discuss comprehensive use of their respective authorities and offices to:  prevent and address discrimination in the use of automated systems, including algorithmic discrimination; increase coordination between the Department of Justice’s Civil Rights Division and Federal civil rights offices concerning issues related to AI and algorithmic discrimination; improve external stakeholder engagement to promote public awareness of potential discriminatory uses and effects of AI; and develop, as appropriate, additional training, technical assistance, guidance, or other resources; and  

          (iii)  consider providing, as appropriate and consistent with applicable law, guidance, technical assistance, and training to State, local, Tribal, and territorial investigators and prosecutors on best practices for investigating and prosecuting civil rights violations and discrimination related to automated systems, including AI.

     (b)  To promote the equitable treatment of individuals and adhere to the Federal Government’s fundamental obligation to ensure fair and impartial justice for all, with respect to the use of AI in the criminal justice system, the Attorney General shall, in consultation with the Secretary of Homeland Security and the Director of OSTP:

          (i)    within 365 days of the date of this order, submit to the President a report that addresses the use of AI in the criminal justice system, including any use in:

               (A)  sentencing;

               (B)  parole, supervised release, and probation;

               (C)  bail, pretrial release, and pretrial detention;

               (D)  risk assessments, including pretrial, earned time, and early release or transfer to home-confinement determinations;

               (E)  police surveillance;

               (F)  crime forecasting and predictive policing, including the ingestion of historical crime data into AI systems to predict high-density “hot spots”;

               (G)  prison-management tools; and

               (H)  forensic analysis;  

          (ii)   within the report set forth in subsection 7.1(b)(i) of this section:

               (A)  identify areas where AI can enhance law enforcement efficiency and accuracy, consistent with protections for privacy, civil rights, and civil liberties; and

               (B)  recommend best practices for law enforcement agencies, including safeguards and appropriate use limits for AI, to address the concerns set forth in section 13(e)(i) of Executive Order 14074 as well as the best practices and the guidelines set forth in section 13(e)(iii) of Executive Order 14074; and  

          (iii)  supplement the report set forth in subsection 7.1(b)(i) of this section as appropriate with recommendations to the President, including with respect to requests for necessary legislation.  

     (c)  To advance the presence of relevant technical experts and expertise (such as machine-learning engineers, software and infrastructure engineering, data privacy experts, data scientists, and user experience researchers) among law enforcement professionals:

          (i)    The interagency working group created pursuant to section 3 of Executive Order 14074 shall, within 180 days of the date of this order, identify and share best practices for recruiting and hiring law enforcement professionals who have the technical skills mentioned in subsection 7.1(c) of this section, and for training law enforcement professionals about responsible application of AI.

          (ii)   Within 270 days of the date of this order, the Attorney General shall, in consultation with the Secretary of Homeland Security, consider those best practices and the guidance developed under section 3(d) of Executive Order 14074 and, if necessary, develop additional general recommendations for State, local, Tribal, and territorial law enforcement agencies and criminal justice agencies seeking to recruit, hire, train, promote, and retain highly qualified and service-oriented officers and staff with relevant technical knowledge.  In considering this guidance, the Attorney General shall consult with State, local, Tribal, and territorial law enforcement agencies, as appropriate.

          (iii)  Within 365 days of the date of this order, the Attorney General shall review the work conducted pursuant to section 2(b) of Executive Order 14074 and, if appropriate, reassess the existing capacity to investigate law enforcement deprivation of rights under color of law resulting from the use of AI, including through improving and increasing training of Federal law enforcement officers, their supervisors, and Federal prosecutors on how to investigate and prosecute cases related to AI involving the deprivation of rights under color of law pursuant to 18 U.S.C. 242. 

     7.2.  Protecting Civil Rights Related to Government Benefits and Programs.  (a)  To advance equity and civil rights, consistent with the directives of Executive Order 14091, and in addition to complying with the guidance on Federal Government use of AI issued pursuant to section 10.1(b) of this order, agencies shall use their respective civil rights and civil liberties offices and authorities — as appropriate and consistent with applicable law — to prevent and address unlawful discrimination and other harms that result from uses of AI in Federal Government programs and benefits administration.  This directive does not apply to agencies’ civil or criminal enforcement authorities.  Agencies shall consider opportunities to ensure that their respective civil rights and civil liberties offices are appropriately consulted on agency decisions regarding the design, development, acquisition, and use of AI in Federal Government programs and benefits administration.  To further these objectives, agencies shall also consider opportunities to increase coordination, communication, and engagement about AI as appropriate with community-based organizations; civil-rights and civil-liberties organizations; academic institutions; industry; State, local, Tribal, and territorial governments; and other stakeholders.  

     (b)  To promote equitable administration of public benefits:

          (i)   The Secretary of HHS shall, within 180 days of the date of this order and in consultation with relevant agencies, publish a plan, informed by the guidance issued pursuant to section 10.1(b) of this order, addressing the use of automated or algorithmic systems in the implementation by States and localities of public benefits and services administered by the Secretary, such as to promote:  assessment of access to benefits by qualified recipients; notice to recipients about the presence of such systems; regular evaluation to detect unjust denials; processes to retain appropriate levels of discretion of expert agency staff; processes to appeal denials to human reviewers; and analysis of whether algorithmic systems in use by benefit programs achieve equitable and just outcomes.

          (ii)  The Secretary of Agriculture shall, within 180 days of the date of this order and as informed by the guidance issued pursuant to section 10.1(b) of this order, issue guidance to State, local, Tribal, and territorial public-benefits administrators on the use of automated or algorithmic systems in implementing benefits or in providing customer support for benefit programs administered by the Secretary, to ensure that programs using those systems:

               (A)  maximize program access for eligible recipients;

               (B)  employ automated or algorithmic systems in a manner consistent with any requirements for using merit systems personnel in public-benefits programs;

               (C)  identify instances in which reliance on automated or algorithmic systems would require notification by the State, local, Tribal, or territorial government to the Secretary;

               (D)  identify instances when applicants and participants can appeal benefit determinations to a human reviewer for reconsideration and can receive other customer support from a human being;

               (E)  enable auditing and, if necessary, remediation of the logic used to arrive at an individual decision or determination to facilitate the evaluation of appeals; and

               (F)  enable the analysis of whether algorithmic systems in use by benefit programs achieve equitable outcomes.

     7.3.  Strengthening AI and Civil Rights in the Broader Economy.  (a)  Within 365 days of the date of this order, to prevent unlawful discrimination from AI used for hiring, the Secretary of Labor shall publish guidance for Federal contractors regarding nondiscrimination in hiring involving AI and other technology-based hiring systems.

     (b)  To address discrimination and biases against protected groups in housing markets and consumer financial markets, the Director of the Federal Housing Finance Agency and the Director of the Consumer Financial Protection Bureau are encouraged to consider using their authorities, as they deem appropriate, to require their respective regulated entities, where possible, to use appropriate methodologies including AI tools to ensure compliance with Federal law and:

          (i)   evaluate their underwriting models for bias or disparities affecting protected groups; and

          (ii)  evaluate automated collateral-valuation and appraisal processes in ways that minimize bias.

     (c)  Within 180 days of the date of this order, to combat unlawful discrimination enabled by automated or algorithmic tools used to make decisions about access to housing and in other real estate-related transactions, the Secretary of Housing and Urban Development shall, and the Director of the Consumer Financial Protection Bureau is encouraged to, issue additional guidance:

          (i)   addressing the use of tenant screening systems in ways that may violate the Fair Housing Act (Public Law 90-284), the Fair Credit Reporting Act (Public Law 91-508), or other relevant Federal laws, including how the use of data, such as criminal records, eviction records, and credit information, can lead to discriminatory outcomes in violation of Federal law; and

          (ii)  addressing how the Fair Housing Act, the Consumer Financial Protection Act of 2010 (title X of Public Law 111-203), or the Equal Credit Opportunity Act (Public Law 93-495) apply to the advertising of housing, credit, and other real estate-related transactions through digital platforms, including those that use algorithms to facilitate advertising delivery, as well as on best practices to avoid violations of Federal law.

     (d)  To help ensure that people with disabilities benefit from AI’s promise while being protected from its risks, including unequal treatment from the use of biometric data like gaze direction, eye tracking, gait analysis, and hand motions, the Architectural and Transportation Barriers Compliance Board is encouraged, as it deems appropriate, to solicit public participation and conduct community engagement; to issue technical assistance and recommendations on the risks and benefits of AI in using biometric data as an input; and to provide people with disabilities access to information and communication technology and transportation services.

     Sec. 8.  Protecting Consumers, Patients, Passengers, and Students.  (a)  Independent regulatory agencies are encouraged, as they deem appropriate, to consider using their full range of authorities to protect American consumers from fraud, discrimination, and threats to privacy and to address other risks that may arise from the use of AI, including risks to financial stability, and to consider rulemaking, as well as emphasizing or clarifying where existing regulations and guidance apply to AI, including clarifying the responsibility of regulated entities to conduct due diligence on and monitor any third-party AI services they use, and emphasizing or clarifying requirements and expectations related to the transparency of AI models and regulated entities’ ability to explain their use of AI models.

     (b)  To help ensure the safe, responsible deployment and use of AI in the healthcare, public-health, and human-services sectors:

          (i)    Within 90 days of the date of this order, the Secretary of HHS shall, in consultation with the Secretary of Defense and the Secretary of Veterans Affairs, establish an HHS AI Task Force that shall, within 365 days of its creation, develop a strategic plan that includes policies and frameworks — possibly including regulatory action, as appropriate — on responsible deployment and use of AI and AI-enabled technologies in the health and human services sector (including research and discovery, drug and device safety, healthcare delivery and financing, and public health), and identify appropriate guidance and
resources to promote that deployment, including in the following areas:

               (A)  development, maintenance, and use of predictive and generative AI-enabled technologies in healthcare delivery and financing — including quality measurement, performance improvement, program integrity, benefits administration, and patient experience — taking into account considerations such as appropriate human oversight of the application of AI-generated output;

               (B)  long-term safety and real-world performance monitoring of AI-enabled technologies in the health and human services sector, including clinically relevant or significant modifications and performance across population groups, with a means to communicate product updates to regulators, developers, and users; 

               (C)  incorporation of equity principles in AI-enabled technologies used in the health and human services sector, using disaggregated data on affected populations and representative population data sets when developing new models, monitoring algorithmic performance against discrimination and bias in existing models, and helping to identify and mitigate discrimination and bias in current systems; 

               (D)  incorporation of safety, privacy, and security standards into the software-development lifecycle for protection of personally identifiable information, including measures to address AI-enhanced cybersecurity threats in the health and human services sector;

               (E)  development, maintenance, and availability of documentation to help users determine appropriate and safe uses of AI in local settings in the health and human services sector;

               (F)  work to be done with State, local, Tribal, and territorial health and human services agencies to advance positive use cases and best practices for use of AI in local settings; and

               (G)  identification of uses of AI to promote workplace efficiency and satisfaction in the health and human services sector, including reducing administrative burdens.

          (ii)   Within 180 days of the date of this order, the Secretary of HHS shall direct HHS components, as the Secretary of HHS deems appropriate, to develop a strategy, in consultation with relevant agencies, to determine whether AI-enabled technologies in the health and human services sector maintain appropriate levels of quality, including, as appropriate, in the areas described in subsection (b)(i) of this section.  This work shall include the development of AI assurance policy — to evaluate important aspects of the performance of AI-enabled healthcare tools — and infrastructure needs for enabling pre-market assessment and post-market oversight of AI-enabled healthcare-technology algorithmic system performance against real-world data.

          (iii)  Within 180 days of the date of this order, the Secretary of HHS shall, in consultation with relevant agencies as the Secretary of HHS deems appropriate, consider appropriate actions to advance the prompt understanding of, and compliance with, Federal nondiscrimination laws by health and human services providers that receive Federal financial assistance, as well as how those laws relate to AI.  Such actions may include:

               (A)  convening and providing technical assistance to health and human services providers and payers about their obligations under Federal nondiscrimination and privacy laws as they relate to AI and the potential consequences of noncompliance; and

               (B)  issuing guidance, or taking other action as appropriate, in response to any complaints or other reports of noncompliance with Federal nondiscrimination and privacy laws as they relate to AI.

          (iv)   Within 365 days of the date of this order, the Secretary of HHS shall, in consultation with the Secretary of Defense and the Secretary of Veterans Affairs, establish an AI safety program that, in partnership with voluntary federally listed Patient Safety Organizations:

               (A)  establishes a common framework for approaches to identifying and capturing clinical errors resulting from AI deployed in healthcare settings as well as specifications for a central tracking repository for associated incidents that cause harm, including through bias or discrimination, to patients, caregivers, or other parties; 

               (B)  analyzes captured data and generated evidence to develop, wherever appropriate, recommendations, best practices, or other informal guidelines aimed at avoiding these harms; and

               (C)  disseminates those recommendations, best practices, or other informal guidance to appropriate stakeholders, including healthcare providers.

          (v)    Within 365 days of the date of this order, the Secretary of HHS shall develop a strategy for regulating the use of AI or AI-enabled tools in drug-development processes.  The strategy shall, at a minimum:

               (A)  define the objectives, goals, and high-level principles required for appropriate regulation throughout each phase of drug development;

               (B)  identify areas where future rulemaking, guidance, or additional statutory authority may be necessary to implement such a regulatory system;

               (C)  identify the existing budget, resources, personnel, and potential for new public/private partnerships necessary for such a regulatory system; and

               (D)  consider risks identified by the actions undertaken to implement section 4 of this order.

     (c)  To promote the safe and responsible development and use of AI in the transportation sector, in consultation with relevant agencies:

          (i)    Within 30 days of the date of this order, the Secretary of Transportation shall direct the Nontraditional and Emerging Transportation Technology (NETT) Council to assess the need for information, technical assistance, and guidance regarding the use of AI in transportation.  The Secretary of Transportation shall further direct the NETT Council, as part of any such efforts, to:

               (A)  support existing and future initiatives to pilot transportation-related applications of AI, as they align with policy priorities articulated in the Department of Transportation’s (DOT) Innovation Principles, including, as appropriate, through technical assistance and connecting stakeholders;

               (B)  evaluate the outcomes of such pilot programs in order to assess when DOT, or other Federal or State agencies, have sufficient information to take regulatory actions, as appropriate, and recommend appropriate actions when that information is available; and

               (C)  establish a new DOT Cross-Modal Executive Working Group, which will consist of members from different divisions of DOT and coordinate applicable work among these divisions, to solicit and use relevant input from appropriate stakeholders.

          (ii)   Within 90 days of the date of this order, the Secretary of Transportation shall direct appropriate Federal Advisory Committees of the DOT to provide advice on the safe and responsible use of AI in transportation.  The committees shall include the Advanced Aviation Advisory Committee, the Transforming Transportation Advisory Committee, and the Intelligent Transportation Systems Program Advisory Committee.

          (iii)  Within 180 days of the date of this order, the Secretary of Transportation shall direct the Advanced Research Projects Agency-Infrastructure (ARPA-I) to explore the transportation-related opportunities and challenges of AI — including regarding software-defined AI enhancements impacting autonomous mobility ecosystems.  The Secretary of Transportation shall further encourage ARPA-I to prioritize the allocation of grants to those opportunities, as appropriate.  The work tasked to ARPA-I shall include soliciting input on these topics through a public consultation process, such as an RFI.

     (d)  To help ensure the responsible development and deployment of AI in the education sector, the Secretary of Education shall, within 365 days of the date of this order, develop resources, policies, and guidance regarding AI.  These resources shall address safe, responsible, and nondiscriminatory uses of AI in education, including the impact AI systems have on vulnerable and underserved communities, and shall be developed in consultation with stakeholders as appropriate.  They shall also include the development of an “AI toolkit” for education leaders implementing recommendations from the Department of Education’s AI and the Future of Teaching and Learning report, including appropriate human review of AI decisions, designing AI systems to enhance trust and safety and align with privacy-related laws and regulations in the educational context, and developing education-specific guardrails.

     (e)  The Federal Communications Commission is encouraged to consider actions related to how AI will affect communications networks and consumers, including by:

          (i)    examining the potential for AI to improve spectrum management, increase the efficiency of non-Federal spectrum usage, and expand opportunities for the sharing of non-Federal spectrum;

          (ii)   coordinating with the National Telecommunications and Information Administration to create opportunities for sharing spectrum between Federal and non-Federal spectrum operations;

          (iii)  providing support for efforts to improve network security, resiliency, and interoperability using next-generation technologies that incorporate AI, including self-healing networks, 6G, and Open RAN; and

          (iv)   encouraging, including through rulemaking, efforts to combat unwanted robocalls and robotexts that are facilitated or exacerbated by AI and to deploy AI technologies that better serve consumers by blocking unwanted robocalls and robotexts.

     Sec. 9.  Protecting Privacy.  (a)  To mitigate privacy risks potentially exacerbated by AI — including by AI’s facilitation of the collection or use of information about individuals, or the making of inferences about individuals — the Director of OMB shall:

          (i)    evaluate and take steps to identify commercially available information (CAI) procured by agencies, particularly CAI that contains personally identifiable information and including CAI procured from data brokers and CAI procured and processed indirectly through vendors, in appropriate agency inventory and reporting processes (other than when it is used for the purposes of national security);

          (ii)   evaluate, in consultation with the Federal Privacy Council and the Interagency Council on Statistical Policy, agency standards and procedures associated with the collection, processing, maintenance, use, sharing, dissemination, and disposition of CAI that contains personally identifiable information (other than when it is used for the purposes of national security) to inform potential guidance to agencies on ways to mitigate privacy and confidentiality risks from agencies’ activities related to CAI;

          (iii)  within 180 days of the date of this order, in consultation with the Attorney General, the Assistant to the President for Economic Policy, and the Director of OSTP, issue an RFI to inform potential revisions to guidance to agencies on implementing the privacy provisions of the E-Government Act of 2002 (Public Law 107-347).  The RFI shall seek feedback regarding how privacy impact assessments may be more effective at mitigating privacy risks, including those that are further exacerbated by AI; and

          (iv)   take such steps as are necessary and appropriate, consistent with applicable law, to support and advance the near-term actions and long-term strategy identified through the RFI process, including issuing new or updated guidance or RFIs or consulting other agencies or the Federal Privacy Council.

     (b)  Within 365 days of the date of this order, to better enable agencies to use PETs to safeguard Americans’ privacy from the potential threats exacerbated by AI, the Secretary of Commerce, acting through the Director of NIST, shall create guidelines for agencies to evaluate the efficacy of differential-privacy-guarantee protections, including for AI.  The guidelines shall, at a minimum, describe the significant factors that bear on differential-privacy safeguards and common risks to realizing differential privacy in practice.

     (c)  To advance research, development, and implementation related to PETs:

          (i)    Within 120 days of the date of this order, the Director of NSF, in collaboration with the Secretary of Energy, shall fund the creation of a Research Coordination Network (RCN) dedicated to advancing privacy research and, in particular, the development, deployment, and scaling of PETs.  The RCN shall serve to enable privacy researchers to share information, coordinate and collaborate in research, and develop standards for the privacy-research community.  

          (ii)   Within 240 days of the date of this order, the Director of NSF shall engage with agencies to identify ongoing work and potential opportunities to incorporate PETs into their operations.  The Director of NSF shall, where feasible and appropriate, prioritize research — including efforts to translate research discoveries into practical applications — that encourage the adoption of leading-edge PETs solutions for agencies’ use, including through research engagement through the RCN described in subsection (c)(i) of this section.

          (iii)  The Director of NSF shall use the results of the United States-United Kingdom PETs Prize Challenge to inform the approaches taken, and opportunities identified, for PETs research and adoption.

     Sec. 10.  Advancing Federal Government Use of AI.

     10.1.  Providing Guidance for AI Management.  (a)  To coordinate the use of AI across the Federal Government, within 60 days of the date of this order and on an ongoing basis as necessary, the Director of OMB shall convene and chair an interagency council to coordinate the development and use of AI in agencies’ programs and operations, other than the use of AI in national security systems.  The Director of OSTP shall serve as Vice Chair for the interagency council.  The interagency council’s membership shall include, at minimum, the heads of the agencies identified in 31 U.S.C. 901(b), the Director of National Intelligence, and other agencies as identified by the Chair.  Until agencies designate their permanent Chief AI Officers consistent with the guidance described in subsection 10.1(b) of this section, they shall be represented on the interagency council by an appropriate official at the Assistant Secretary level or equivalent, as determined by the head of each agency.  

     (b)  To provide guidance on Federal Government use of AI, within 150 days of the date of this order and updated periodically thereafter, the Director of OMB, in coordination with the Director of OSTP, and in consultation with the interagency council established in subsection 10.1(a) of this section, shall issue guidance to agencies to strengthen the effective and appropriate use of AI, advance AI innovation, and manage risks from AI in the Federal Government.  The Director of OMB’s guidance shall specify, to the extent appropriate and consistent with applicable law:

          (i)     the requirement to designate at each agency within 60 days of the issuance of the guidance a Chief Artificial Intelligence Officer who shall hold primary responsibility in their agency, in coordination with other responsible officials, for coordinating their agency’s use of AI, promoting AI innovation in their agency, managing risks from their agency’s use of AI, and carrying out the responsibilities described in section 8(c) of Executive Order 13960 of December 3, 2020 (Promoting the Use of Trustworthy Artificial Intelligence in the Federal Government), and section 4(b) of Executive Order 14091;

          (ii)    the Chief Artificial Intelligence Officers’ roles, responsibilities, seniority, position, and reporting structures;

          (iii)   for the agencies identified in 31 U.S.C. 901(b), the creation of internal Artificial Intelligence Governance Boards, or other appropriate mechanisms, at each agency within 60 days of the issuance of the guidance to coordinate and govern AI issues through relevant senior leaders from across the agency;

          (iv)    required minimum risk-management practices for Government uses of AI that impact people’s rights or safety, including, where appropriate, the following practices derived from OSTP’s Blueprint for an AI Bill of Rights and the NIST AI Risk Management Framework:  conducting public consultation; assessing data quality; assessing and mitigating disparate impacts and algorithmic discrimination; providing notice of the use of AI; continuously monitoring and evaluating deployed AI; and granting human consideration and remedies for adverse decisions made using AI;

          (v)     specific Federal Government uses of AI that are presumed by default to impact rights or safety;

          (vi)    recommendations to agencies to reduce barriers to the responsible use of AI, including barriers related to information technology infrastructure, data, workforce, budgetary restrictions, and cybersecurity processes; 

          (vii)   requirements that agencies identified in 31 U.S.C. 901(b) develop AI strategies and pursue high-impact AI use cases;

          (viii)  in consultation with the Secretary of Commerce, the Secretary of Homeland Security, and the heads of other appropriate agencies as determined by the Director of OMB, recommendations to agencies regarding:

               (A)  external testing for AI, including AI red-teaming for generative AI, to be developed in coordination with the Cybersecurity and Infrastructure Security Agency;

               (B)  testing and safeguards against discriminatory, misleading, inflammatory, unsafe, or deceptive outputs, as well as against producing child sexual abuse material and against producing non-consensual intimate imagery of real individuals (including intimate digital depictions of the body or body parts of an identifiable individual), for generative AI;

               (C)  reasonable steps to watermark or otherwise label output from generative AI;

               (D)  application of the mandatory minimum risk-management practices defined under subsection 10.1(b)(iv) of this section to procured AI;

               (E)  independent evaluation of vendors’ claims concerning both the effectiveness and risk mitigation of their AI offerings;

               (F)  documentation and oversight of procured AI;

               (G)  maximizing the value to agencies when relying on contractors to use and enrich Federal Government data for the purposes of AI development and operation;

               (H)  provision of incentives for the continuous improvement of procured AI; and

               (I)  training on AI in accordance with the principles set out in this order and in other references related to AI listed herein; and

          (ix)    requirements for public reporting on compliance with this guidance.

     (c)  To track agencies’ AI progress, within 60 days of the issuance of the guidance established in subsection 10.1(b) of this section and updated periodically thereafter, the Director of OMB shall develop a method for agencies to track and assess their ability to adopt AI into their programs and operations, manage its risks, and comply with Federal policy on AI.  This method should draw on existing related efforts as appropriate and should address, as appropriate and consistent with applicable law, the practices, processes, and capabilities necessary for responsible AI adoption, training, and governance across, at a minimum, the areas of information technology infrastructure, data, workforce, leadership, and risk management.  

     (d)  To assist agencies in implementing the guidance to be established in subsection 10.1(b) of this section:

          (i)   within 90 days of the issuance of the guidance, the Secretary of Commerce, acting through the Director of NIST, and in coordination with the Director of OMB and the Director of OSTP, shall develop guidelines, tools, and practices to support implementation of the minimum risk-management practices described in subsection 10.1(b)(iv) of this section; and

          (ii)  within 180 days of the issuance of the guidance, the Director of OMB shall develop an initial means to ensure that agency contracts for the acquisition of AI systems and services align with the guidance described in subsection 10.1(b) of this section and advance the other aims identified in section 7224(d)(1) of the Advancing American AI Act (Public Law 117-263, div. G, title LXXII, subtitle B). 

     (e)  To improve transparency for agencies’ use of AI, the Director of OMB shall, on an annual basis, issue instructions to agencies for the collection, reporting, and publication of agency AI use cases, pursuant to section 7225(a) of the Advancing American AI Act.  Through these instructions, the Director shall, as appropriate, expand agencies’ reporting on how they are managing risks from their AI use cases and update or replace the guidance originally established in section 5 of Executive Order 13960.

     (f)  To advance the responsible and secure use of generative AI in the Federal Government:

          (i)    As generative AI products become widely available and common in online platforms, agencies are discouraged from imposing broad general bans or blocks on agency use of generative AI.  Agencies should instead limit access, as necessary, to specific generative AI services based on specific risk assessments; establish guidelines and limitations on the appropriate use of generative AI; and, with appropriate safeguards in place, provide their personnel and programs with access to secure and reliable generative AI capabilities, at least for the purposes of experimentation and routine tasks that carry a low risk of impacting Americans’ rights.  To protect Federal Government information, agencies are also encouraged to employ risk-management practices, such as training their staff on proper use, protection, dissemination, and disposition of Federal information; negotiating appropriate terms of service with vendors; implementing measures designed to ensure compliance with record-keeping, cybersecurity, confidentiality, privacy, and data protection requirements; and deploying other measures to prevent misuse of Federal Government information in generative AI. 

          (ii)   Within 90 days of the date of this order, the Administrator of General Services, in coordination with the Director of OMB, and in consultation with the Federal Secure Cloud Advisory Committee and other relevant agencies as the Administrator of General Services may deem appropriate, shall develop and issue a framework for prioritizing critical and emerging technologies offerings in the Federal Risk and Authorization Management Program authorization process, starting with generative AI offerings that have the primary purpose of providing large language model-based chat interfaces, code-generation and debugging tools, and associated application programming interfaces, as well as prompt-based image generators.  This framework shall apply for no less than 2 years from the date of its issuance.  Agency Chief Information Officers, Chief Information Security Officers, and authorizing officials are also encouraged to prioritize generative AI and other critical and emerging technologies in granting authorities for agency operation of information technology systems and any other applicable release or oversight processes, using continuous authorizations and approvals wherever feasible.

          (iii)  Within 180 days of the date of this order, the Director of the Office of Personnel Management (OPM), in coordination with the Director of OMB, shall develop guidance on the use of generative AI for work by the Federal workforce.

     (g)  Within 30 days of the date of this order, to increase agency investment in AI, the Technology Modernization Board shall consider, as it deems appropriate and consistent with applicable law, prioritizing funding for AI projects for the Technology Modernization Fund for a period of at least 1 year.  Agencies are encouraged to submit to the Technology Modernization Fund project funding proposals that include AI — and particularly generative AI — in service of mission delivery.

     (h)  Within 180 days of the date of this order, to facilitate agencies’ access to commercial AI capabilities, the Administrator of General Services, in coordination with the Director of OMB, and in collaboration with the Secretary of Defense, the Secretary of Homeland Security, the Director of National Intelligence, the Administrator of the National Aeronautics and Space Administration, and the head of any other agency identified by the Administrator of General Services, shall take steps consistent with applicable law to facilitate access to Federal Government-wide acquisition solutions for specified types of AI services and products, such as through the creation of a resource guide or other tools to assist the acquisition workforce.  Specified types of AI capabilities shall include generative AI and specialized computing infrastructure.

     (i)  The initial means, instructions, and guidance issued pursuant to subsections 10.1(a)-(h) of this section shall not apply to AI when it is used as a component of a national security system, which shall be addressed by the proposed National Security Memorandum described in subsection 4.8 of this order. 

     10.2.  Increasing AI Talent in Government.  (a)  Within 45 days of the date of this order, to plan a national surge in AI talent in the Federal Government, the Director of OSTP and the Director of OMB, in consultation with the Assistant to the President for National Security Affairs, the Assistant to the President for Economic Policy, the Assistant to the President and Domestic Policy Advisor, and the Assistant to the President and Director of the Gender Policy Council, shall identify priority mission areas for increased Federal Government AI talent, the types of talent that are highest priority to recruit and develop to ensure adequate implementation of this order and use of relevant enforcement and regulatory authorities to address AI risks, and accelerated hiring pathways.

     (b)  Within 45 days of the date of this order, to coordinate rapid advances in the capacity of the Federal AI workforce, the Assistant to the President and Deputy Chief of Staff for Policy, in coordination with the Director of OSTP and the Director of OMB, and in consultation with the National Cyber Director, shall convene an AI and Technology Talent Task Force, which shall include the Director of OPM, the Director of the General Services Administration’s Technology Transformation Services, a representative from the Chief Human Capital Officers Council, the Assistant to the President for Presidential Personnel, members of appropriate agency technology talent programs, a representative of the Chief Data Officer Council, and a representative of the interagency council convened under subsection 10.1(a) of this section.  The Task Force’s purpose shall be to accelerate and track the hiring of AI and AI-enabling talent across the Federal Government, including through the following actions:

          (i)    within 180 days of the date of this order, tracking and reporting progress to the President on increasing AI capacity across the Federal Government, including submitting to the President a report and recommendations for further increasing capacity; 

          (ii)   identifying and circulating best practices for agencies to attract, hire, retain, train, and empower AI talent, including diversity, inclusion, and accessibility best practices, as well as to plan and budget adequately for AI workforce needs;

          (iii)  coordinating, in consultation with the Director of OPM, the use of fellowship programs and agency technology-talent programs and human-capital teams to build hiring capabilities, execute hires, and place AI talent to fill staffing gaps; and

          (iv)   convening a cross-agency forum for ongoing collaboration between AI professionals to share best practices and improve retention.

     (c)  Within 45 days of the date of this order, to advance existing Federal technology talent programs, the United States Digital Service, Presidential Innovation Fellowship, United States Digital Corps, OPM, and technology talent programs at agencies, with support from the AI and Technology Talent Task Force described in subsection 10.2(b) of this section, as appropriate and permitted by law, shall develop and begin to implement plans to support the rapid recruitment of individuals as part of a Federal Government-wide AI talent surge to accelerate the placement of key AI and AI-enabling talent in high-priority areas and to advance agencies’ data and technology strategies.

     (d)  To meet the critical hiring need for qualified personnel to execute the initiatives in this order, and to improve Federal hiring practices for AI talent, the Director of OPM, in consultation with the Director of OMB, shall:

          (i)     within 60 days of the date of this order, conduct an evidence-based review on the need for hiring and workplace flexibility, including Federal Government-wide direct-hire authority for AI and related data-science and technical roles, and, where the Director of OPM finds such authority is appropriate, grant it; this review shall include the following job series at all General Schedule (GS) levels:  IT Specialist (2210), Computer Scientist (1550), Computer Engineer (0854), and Program Analyst (0343) focused on AI, and any subsequently developed job series derived from these job series;

          (ii)    within 60 days of the date of this order, consider authorizing the use of excepted service appointments under 5 C.F.R. 213.3102(i)(3) to address the need for hiring additional staff to implement directives of this order;

          (iii)   within 90 days of the date of this order, coordinate a pooled-hiring action informed by subject-matter experts and using skills-based assessments to support the recruitment of AI talent across agencies;

          (iv)    within 120 days of the date of this order, as appropriate and permitted by law, issue guidance for agency application of existing pay flexibilities or incentive pay programs for AI, AI-enabling, and other key technical positions to facilitate appropriate use of current pay incentives;

          (v)     within 180 days of the date of this order, establish guidance and policy on skills-based, Federal Government-wide hiring of AI, data, and technology talent in order to increase access to those with nontraditional academic backgrounds to Federal AI, data, and technology roles; 

          (vi)    within 180 days of the date of this order, establish an interagency working group, staffed with both human-resources professionals and recruiting technical experts, to facilitate Federal Government-wide hiring of people with AI and other technical skills;

          (vii)   within 180 days of the date of this order, review existing Executive Core Qualifications (ECQs) for Senior Executive Service (SES) positions informed by data and AI literacy competencies and, within 365 days of the date of this order, implement new ECQs as appropriate in the SES assessment process;

          (viii)  within 180 days of the date of this order, complete a review of competencies for civil engineers (GS-0810 series) and, if applicable, other related occupations, and make recommendations for ensuring that adequate AI expertise and credentials in these occupations in the Federal Government reflect the increased use of AI in critical infrastructure; and

          (ix)    work with the Security, Suitability, and Credentialing Performance Accountability Council to assess mechanisms to streamline and accelerate personnel-vetting requirements, as appropriate, to support AI and fields related to other critical and emerging technologies.  

     (e)  To expand the use of special authorities for AI hiring and retention, agencies shall use all appropriate hiring authorities, including Schedule A(r) excepted service hiring and direct-hire authority, as applicable and appropriate, to hire AI talent and AI-enabling talent rapidly.  In addition to participating in OPM-led pooled hiring actions, agencies shall collaborate, where appropriate, on agency-led pooled hiring under the Competitive Service Act of 2015 (Public Law 114-137) and other shared hiring.  Agencies shall also, where applicable, use existing incentives, pay-setting authorities, and other compensation flexibilities, similar to those used for cyber and information technology positions, for AI and data-science professionals, as well as plain-language job titles, to help recruit and retain these highly skilled professionals.  Agencies shall ensure that AI and other related talent needs (such as technology governance and privacy) are reflected in strategic workforce planning and budget formulation. 

     (f)  To facilitate the hiring of data scientists, the Chief Data Officer Council shall develop a position-description library for data scientists (job series 1560) and a hiring guide to support agencies in hiring data scientists.

     (g)  To help train the Federal workforce on AI issues, the head of each agency shall implement — or increase the availability and use of — AI training and familiarization programs for employees, managers, and leadership in technology as well as relevant policy, managerial, procurement, regulatory, ethical, governance, and legal fields.  Such training programs should, for example, empower Federal employees, managers, and leaders to develop and maintain an operating knowledge of emerging AI technologies to assess opportunities to use these technologies to enhance the delivery of services to the public, and to mitigate risks associated with these technologies.  Agencies that provide professional-development opportunities, grants, or funds for their staff should take appropriate steps to ensure that employees who do not serve in traditional technical roles, such as policy, managerial, procurement, or legal fields, are nonetheless eligible to receive funding for programs and courses that focus on AI, machine learning, data science, or other related subject areas.  

     (h)  Within 180 days of the date of this order, to address gaps in AI talent for national defense, the Secretary of Defense shall submit a report to the President through the Assistant to the President for
National Security Affairs that includes:

          (i)    recommendations to address challenges in the Department of Defense’s ability to hire certain noncitizens, including at the Science and Technology Reinvention Laboratories;

          (ii)   recommendations to clarify and streamline processes for accessing classified information for certain noncitizens through Limited Access Authorization at Department of Defense laboratories;

          (iii)  recommendations for the appropriate use of enlistment authority under 10 U.S.C. 504(b)(2) for experts in AI and other critical and emerging technologies; and

          (iv)   recommendations for the Department of Defense and the Department of Homeland Security to work together to enhance the use of appropriate authorities for the retention of certain noncitizens of vital importance to national security by the Department of Defense and the Department of Homeland Security.  

     Sec. 11.  Strengthening American Leadership Abroad.  (a)  To strengthen United States leadership of global efforts to unlock AI’s potential and meet its challenges, the Secretary of State, in coordination with the Assistant to the President for National Security Affairs, the Assistant to the President for Economic Policy, the Director of OSTP, and the heads of other relevant agencies as appropriate, shall:

          (i)   lead efforts outside of military and intelligence areas to expand engagements with international allies and partners in relevant bilateral, multilateral, and multi-stakeholder fora to advance those allies’ and partners’ understanding of existing and planned AI-related guidance and policies of the United States, as well as to enhance international collaboration; and

          (ii)  lead efforts to establish a strong international framework for managing the risks and harnessing the benefits of AI, including by encouraging international allies and partners to support voluntary commitments similar to those that United States companies have made in pursuit of these objectives and coordinating the activities directed by subsections (b), (c), (d), and (e) of this section, and to develop common regulatory and other accountability principles for foreign nations, including to manage the risk that AI systems pose.

     (b)  To advance responsible global technical standards for AI development and use outside of military and intelligence areas, the Secretary of Commerce, in coordination with the Secretary of State and the heads of other relevant agencies as appropriate, shall lead preparations for a coordinated effort with key international allies and partners and with standards development organizations, to drive the development and implementation of AI-related consensus standards, cooperation and coordination, and information sharing.  In particular, the Secretary of Commerce shall:

          (i)    within 270 days of the date of this order, establish a plan for global engagement on promoting and developing AI standards, with lines of effort that may include:

               (A)  AI nomenclature and terminology;

               (B)  best practices regarding data capture, processing, protection, privacy, confidentiality, handling, and analysis;

               (C)  trustworthiness, verification, and assurance of AI systems; and

               (D)  AI risk management;

          (ii)   within 180 days of the date the plan is established, submit a report to the President on priority actions taken pursuant to the plan; and

          (iii)  ensure that such efforts are guided by principles set out in the NIST AI Risk Management Framework and United States Government National Standards Strategy for Critical and Emerging Technology.

     (c)  Within 365 days of the date of this order, to promote safe, responsible, and rights-affirming development and deployment of AI abroad:

          (i)   The Secretary of State and the Administrator of the United States Agency for International Development, in coordination with the Secretary of Commerce, acting through the director of NIST, shall publish an AI in Global Development Playbook that incorporates the AI Risk Management Framework’s principles, guidelines, and best practices into the social, technical, economic, governance, human rights, and security conditions of contexts beyond United States borders.  As part of this work, the Secretary of State and the Administrator of the United States Agency for International Development shall draw on lessons learned from programmatic uses of AI in global development.

          (ii)  The Secretary of State and the Administrator of the United States Agency for International Development, in collaboration with the Secretary of Energy and the Director of NSF, shall develop a Global AI Research Agenda to guide the objectives and implementation of AI-related research in contexts beyond United States borders.  The Agenda shall:

               (A)  include principles, guidelines, priorities, and best practices aimed at ensuring the safe, responsible, beneficial, and sustainable global development and adoption of AI; and

               (B)  address AI’s labor-market implications across international contexts, including by recommending risk mitigations.  

     (d)  To address cross-border and global AI risks to critical infrastructure, the Secretary of Homeland Security, in coordination with the Secretary of State, and in consultation with the heads of other relevant agencies as the Secretary of Homeland Security deems appropriate, shall lead efforts with international allies and partners to enhance cooperation to prevent, respond to, and recover from potential critical infrastructure disruptions resulting from incorporation of AI into critical infrastructure systems or malicious use of AI. 

          (i)   Within 270 days of the date of this order, the Secretary of Homeland Security, in coordination with the Secretary of State, shall develop a plan for multilateral engagements to encourage the adoption of the AI safety and security guidelines for use by critical infrastructure owners and operators developed in section 4.3(a) of this order.

          (ii)  Within 180 days of establishing the plan described in subsection (d)(i) of this section, the Secretary of Homeland Security shall submit a report to the President on priority actions to mitigate cross-border risks to critical United States infrastructure.

     Sec. 12.  Implementation.  (a)  There is established, within the Executive Office of the President, the White House Artificial Intelligence Council (White House AI Council).  The function of the White House AI Council is to coordinate the activities of agencies across the Federal Government to ensure the effective formulation, development, communication, industry engagement related to, and timely implementation of AI-related policies, including policies set forth in this order.

     (b)  The Assistant to the President and Deputy Chief of Staff for Policy shall serve as Chair of the White House AI Council.

     (c)  In addition to the Chair, the White House AI Council shall consist of the following members, or their designees:

          (i)       the Secretary of State;

          (ii)      the Secretary of the Treasury;

          (iii)     the Secretary of Defense;

          (iv)      the Attorney General;

          (v)       the Secretary of Agriculture;

          (vi)      the Secretary of Commerce;

          (vii)     the Secretary of Labor;

          (viii)    the Secretary of HHS;

          (ix)      the Secretary of Housing and Urban Development;

          (x)       the Secretary of Transportation;

          (xi)      the Secretary of Energy;

          (xii)     the Secretary of Education;

          (xiii)    the Secretary of Veterans Affairs;

          (xiv)     the Secretary of Homeland Security;

          (xv)      the Administrator of the Small Business Administration;

          (xvi)     the Administrator of the United States Agency for International Development;

          (xvii)    the Director of National Intelligence;

          (xviii)   the Director of NSF;

          (xix)     the Director of OMB;

          (xx)      the Director of OSTP;

          (xxi)     the Assistant to the President for National Security Affairs;

          (xxii)    the Assistant to the President for Economic Policy;

          (xxiii)   the Assistant to the President and Domestic Policy Advisor;

          (xxiv)    the Assistant to the President and Chief of Staff to the Vice President;

          (xxv)     the Assistant to the President and Director of the Gender Policy Council;

          (xxvi)    the Chairman of the Council of Economic Advisers;

          (xxvii)   the National Cyber Director;

          (xxviii)  the Chairman of the Joint Chiefs of Staff; and

          (xxix)    the heads of such other agencies, independent regulatory agencies, and executive offices as the Chair may from time to time designate or invite to participate.

     (d)  The Chair may create and coordinate subgroups consisting of White House AI Council members or their designees, as appropriate.

     Sec. 13.  General Provisions.  (a)  Nothing in this order shall be construed to impair or otherwise affect:

          (i)   the authority granted by law to an executive department or agency, or the head thereof; or

          (ii)  the functions of the Director of the Office of Management and Budget relating to budgetary, administrative, or legislative proposals.

     (b)  This order shall be implemented consistent with applicable law and subject to the availability of appropriations.

     (c)  This order is not intended to, and does not, create any right or benefit, substantive or procedural, enforceable at law or in equity by any party against the United States, its departments, agencies, or entities, its officers, employees, or agents, or any other person.

                             JOSEPH R. BIDEN JR.

THE WHITE HOUSE,
  October 30, 2023.

12Nov/24

BILL C-27 An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts, first reading June 16, 2022

First Session, Forty-fourth Parliament, 70-71 Elizabeth II, 2021-2022. HOUSE OF COMMONS OF CANADA. BILL C-27 An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts, first reading June 16, 2022

First Session, Forty-fourth Parliament,

70-71 Elizabeth II, 2021-2022

HOUSE OF COMMONS OF CANADA

BILL C-27

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

FIRST READING, June 16, 2022

MINISTER OF INNOVATION, SCIENCE AND INDUSTRY

91102

SUMMARY

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.

Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.

Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate risks of harm and biased output related to high-impact artificial intelligence systems. That Act provides for public reporting and authorizes the Minister to order the production of records related to artificial intelligence systems. That Act also establishes prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system and to the making available for use of an artificial intelligence system if its use causes serious harm to individuals.

Available on the House of Commons website at the following address:

www.ourcommons.ca

TABLE OF PROVISIONS

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

  

1st Session, 44th Parliament,

70-71 Elizabeth II, 2021-2022

HOUSE OF COMMONS OF CANADA

BILL C-27

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Preamble

Whereas there is a need to modernize Canada’s legislative framework so that it is suited to the digital age;

Whereas the protection of the privacy interests of individuals with respect to their personal information is essential to individual autonomy and dignity and to the full enjoyment of fundamental rights and freedoms in Canada;

Whereas Parliament recognizes the importance of the privacy and data protection principles contained in various international instruments;

Whereas trust in the digital and data-driven economy is key to ensuring its growth and fostering a more inclusive and prosperous Canada;

Whereas Canada is a trading nation and trade and commerce rely on the analysis, circulation and exchange of personal information and data across borders and geographical boundaries;

Whereas the design, development and deployment of artificial intelligence systems across provincial and international borders should be consistent with national and international standards to protect individuals from potential harm;

Whereas organizations of all sizes operate in the digital and data-driven economy and an agile regulatory framework is necessary to facilitate compliance with rules by, and promote innovation within, those organizations;

Whereas individuals expect a regulatory framework that ensures transparency and accountability with respect to how organizations handle their personal information and that is backed by meaningful enforcement;

Whereas the modernization of national standards for privacy protection to align them with international standards ensures a level playing field for organizations across Canada and assists them in maintaining their competitive position;

Whereas a modern regulatory framework governing the protection of personal information should promote the responsible collection, use and disclosure of such information by organizations for purposes that are in the public interest;

Whereas Parliament recognizes that artificial intelligence systems and other emerging technologies should uphold Canadian norms and values in line with the principles of international human rights law;

And whereas this Act aims to support the Government of Canada’s efforts to foster an environment in which Canadians can seize the benefits of the digital and data-driven economy and to establish a regulatory framework that supports and protects Canadian norms and values, including the right to privacy;

Now, therefore, Her Majesty, by and with the advice and consent of the Senate and House of Commons of Canada, enacts as follows:

Short Title

Short title

1 This Act may be cited as the Digital Charter Implementation Act, 2022.

PART 1 

Consumer Privacy Protection Act

Enactment of Act

Enactment

2 The Consumer Privacy Protection Act, whose text is as follows and whose schedule is set out in the schedule to this Act, is enacted:

An Act to support and promote electronic commerce by protecting personal information that is collected, used or disclosed in the course of commercial activities

Short Title

Short title

1 This Act may be cited as the Consumer Privacy Protection Act.

Interpretation

Definitions

2 (1) The following definitions apply in this Act.

alternative format, with respect to personal information, means a format that allows an individual with a sensory disability to read or listen to the personal information. (support de substitution)

anonymize means to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means. (anonymiser)

automated decision system means any technology that assists or replaces the judgment of human decision-makers through the use of a rules-based system, regression analysis, predictive analytics, machine learning, deep learning, a neural network or other technique. (système décisionnel automatisé)

breach of security safeguards means the loss of, unauthorized access to or unauthorized disclosure of personal information resulting from a breach of an organization’s security safeguards that are referred to in section 57 or from a failure to establish those safeguards. (atteinte aux mesures de sécurité)

business transaction includes

(a) the purchase, sale or other acquisition or disposition of an organization or a part of an organization, or any of its assets;

(b) the merger or amalgamation of two or more organizations;

(c) the making of a loan or provision of other financing to an organization or a part of an organization;

(d) the creating of a charge on, or the taking of a security interest in or a security on, any assets or securities of an organization;

(e) the lease or licensing of any of an organization’s assets; and

(f) any other prescribed arrangement between two or more organizations to conduct a business activity. (transaction commerciale)

commercial activity means any particular transaction, act or conduct or any regular course of conduct that is of a commercial character, including the selling, bartering or leasing of donor, membership or other fundraising lists. (activité commerciale)

Commissioner means the Privacy Commissioner appointed under section 53 of the Privacy Act. (commissaire)

de-identify means to modify personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified remains. (dépersonnaliser)

dispose means to permanently and irreversibly delete personal information or to anonymize it. (retrait)

federal work, undertaking or business means any work, undertaking or business that is within the legislative authority of Parliament. It includes

(a) a work, undertaking or business that is operated or carried on for or in connection with navigation and shipping, whether inland or maritime, including the operation of ships and transportation by ship anywhere in Canada;

(b) a railway, canal, telegraph or other work or undertaking that connects a province with another province, or that extends beyond the limits of a province;

(c) a line of ships that connects a province with another province, or that extends beyond the limits of a province;

(d) a ferry between a province and another province or between a province and a country other than Canada;

(e) aerodromes, aircraft or a line of air transportation;

(f) a radio broadcasting station;

(g) a bank or an authorized foreign bank as defined in section 2 of the Bank Act;

(h) a work that, although wholly situated within a province, is before or after its execution declared by Parliament to be for the general advantage of Canada or for the advantage of two or more provinces;

(i) a work, undertaking or business outside the exclusive legislative authority of the legislatures of the provinces; and

(j) a work, undertaking or business to which federal laws, within the meaning of section 2 of the Oceans Act, apply under section 20 of that Act and any regulations made under paragraph 26(1)‍(k) of that Act. (entreprises fédérales)

Minister means the member of the Queen’s Privy Council for Canada designated under section 3 or, if no member is designated, the Minister of Industry. (ministre)

organization includes an association, a partnership, a person or a trade union. (organisation)

personal information means information about an identifiable individual. (renseignement personnel)

prescribed means prescribed by regulation. (Version anglaise seulement)

record means any documentary material, regardless of medium or form. (document)

service provider means an organization, including a parent corporation, subsidiary, affiliate, contractor or subcontractor, that provides services for or on behalf of another organization to assist the organization in fulfilling its purposes. (fournisseur de services)

Tribunal means the Personal Information and Data Protection Tribunal established under section 4 of the Personal Information and Data Protection Tribunal Act. (Tribunal)

Interpretation — minors

(2) For the purposes of this Act, the personal information of minors is considered to be sensitive information.

Interpretation — de-identified information

(3) For the purposes of this Act, other than sections 20 and 21, subsections 22(1) and 39(1), sections 55 and 56, subsection 63(1) and sections 71, 72, 74, 75 and 116, personal information that has been de-identified is considered to be personal information.

Order designating Minister

3 The Governor in Council may, by order, designate any member of the Queen’s Privy Council for Canada to be the Minister for the purposes of this Act.

Authorized representatives

4 The rights and recourses provided under this Act may be exercised

(a) on behalf of a minor by a parent, guardian or tutor, unless the minor wishes to personally exercise those rights and recourses and is capable of doing so;

(b) on behalf of an individual, other than a minor, under a legal incapacity by a person authorized by law to administer the affairs or property of that individual; and

(c) on behalf of a deceased individual by a person authorized by law to administer the estate or succession of that individual, but only for the purpose of that administration.

Purpose and Application

Purpose

5 The purpose of this Act is to establish — in an era in which data is constantly flowing across borders and geographical boundaries and significant economic activity relies on the analysis, circulation and exchange of personal information — rules to govern the protection of personal information in a manner that recognizes the right of privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances.

Application

6 (1) This Act applies to every organization in respect of personal information that

(a) the organization collects, uses or discloses in the course of commercial activities; or

(b) is about an employee of, or an applicant for employment with, the organization and that the organization collects, uses or discloses in connection with the operation of a federal work, undertaking or business.

For greater certainty

(2) For greater certainty, this Act applies in respect of personal information

(a) that is collected, used or disclosed interprovincially or internationally by an organization; or

(b) that is collected, used or disclosed by an organization within a province, to the extent that the organization is not exempt from the application of this Act under an order made under paragraph 122(2)‍(b).

Application

(3) This Act also applies to an organization set out in column 1 of the schedule in respect of personal information set out in column 2.

Limit

(4) This Act does not apply to

(a) any government institution to which the Privacy Act applies;

(b) any individual in respect of personal information that the individual collects, uses or discloses solely for personal or domestic purposes;

(c) any organization in respect of personal information that the organization collects, uses or discloses solely for journalistic, artistic or literary purposes;

(d) any organization in respect of an individual’s personal information that the organization collects, uses or discloses solely for the purpose of communicating or facilitating communication with the individual in relation to their employment, business or profession; or

(e) any organization that is, under an order made under paragraph 122(2)‍(b), exempt from the application of this Act in respect of the collection, use or disclosure of personal information that occurs within a province in respect of which the order was made.

For greater certainty

(5) For greater certainty, this Act does not apply in respect of personal information that has been anonymized.

Other Acts

(6) Every provision of this Act applies despite any provision, enacted after December 31, 2000, of any other Act of Parliament, unless the other Act expressly declares that that provision operates despite the provision of this Act.

PART 1 

Obligations of Organizations

Accountability of Organizations

Accountability — personal information under organization’s control

7 (1) An organization is accountable for personal information that is under its control.

Personal information under control of organization

(2) Personal information is under the control of the organization that decides to collect it and that determines the purposes for its collection, use or disclosure, regardless of whether the information is collected, used or disclosed by the organization itself or by a service provider on behalf of the organization.

Designated individual

8 (1) An organization must designate one or more individuals to be responsible for matters related to its obligations under this Act. It must provide the designated individual’s business contact information to any person who requests it.

Effect of designation of individual

(2) The designation of an individual under subsection (1) does not relieve the organization of its obligations under this Act.

Privacy management program

9 (1) Every organization must implement and maintain a privacy management program that includes the policies, practices and procedures the organization has put in place to fulfill its obligations under this Act, including policies, practices and procedures respecting

(a) the protection of personal information;

(b) how requests for information and complaints are received and dealt with;

(c) the training and information provided to the organization’s staff respecting its policies, practices and procedures; and

(d) the development of materials to explain the organization’s policies and procedures.

Volume and sensitivity

(2) In developing its privacy management program, the organization must take into account the volume and sensitivity of the personal information under its control.

Access — privacy management program

10 (1) An organization must, on request of the Commissioner, provide the Commissioner with access to the policies, practices and procedures that are included in its privacy management program.

Guidance and corrective measures

(2) The Commissioner may, after reviewing the policies, practices and procedures, provide guidance on, or recommend that corrective measures be taken by the organization in relation to, its privacy management program.

Same protection

11 (1) If an organization transfers personal information to a service provider, the organization must ensure, by contract or otherwise, that the service provider provides a level of protection of the personal information equivalent to that which the organization is required to provide under this Act.

Service provider obligations

(2) The obligations under this Part, other than those set out in sections 57 and 61, do not apply to a service provider in respect of personal information that is transferred to it. However, the service provider is subject to all of the obligations under this Part if it collects, uses or discloses that information for any purpose other than the purposes for which the information was transferred.

Appropriate Purposes

Appropriate purposes

12 (1) An organization may collect, use or disclose personal information only in a manner and for purposes that a reasonable person would consider appropriate in the circumstances, whether or not consent is required under this Act.

Factors to consider

(2) The following factors must be taken into account in determining whether the manner and purposes referred to in subsection (1) are appropriate:

(a) the sensitivity of the personal information;

(b) whether the purposes represent legitimate business needs of the organization;

(c) the effectiveness of the collection, use or disclosure in meeting the organization’s legitimate business needs;

(d) whether there are less intrusive means of achieving those purposes at a comparable cost and with comparable benefits; and

(e) whether the individual’s loss of privacy is proportionate to the benefits in light of the measures, technical or otherwise, implemented by the organization to mitigate the impacts of the loss of privacy on the individual.

Purposes

(3) An organization must determine at or before the time of the collection of any personal information each of the purposes for which the information is to be collected, used or disclosed and record those purposes.

New purpose

(4) If the organization determines that the personal information it has collected is to be used or disclosed for a new purpose, the organization must record that new purpose before using or disclosing that information for the new purpose.

Limiting Collection, Use and Disclosure

Limiting collection

13 The organization may collect only the personal information that is necessary for the purposes determined and recorded under subsection 12(3).

New purpose

14 (1) An organization must not use or disclose personal information for a purpose other than a purpose determined and recorded under subsection 12(3), unless the organization obtains the individual’s valid consent before any use or disclosure for that other purpose.

Use or disclosure — other purposes

(2) Despite subsection (1), an organization may

(a) use personal information for a purpose other than a purpose determined and recorded under subsection 12(3) in any of the circumstances set out in sections 18, 20 and 21, subsections 22(1) and (3) and sections 23, 24, 26, 30, 41 and 51; or

(b) disclose personal information for a purpose other than a purpose determined and recorded under subsection 12(3) in any of the circumstances set out in subsections 22(1) and (3), sections 23 to 28, 31 to 37 and 39, subsection 40(3) and sections 42 and 43 to 51.

Consent

Consent required

15 (1) Unless this Act provides otherwise, an organization must obtain an individual’s valid consent for the collection, use or disclosure of the individual’s personal information.

Timing of consent

(2) The individual’s consent must be obtained at or before the time of the collection of the personal information or, if the information is to be used or disclosed for a purpose other than a purpose determined and recorded under subsection 12(3), before any use or disclosure of the information for that other purpose.

Information for consent to be valid

(3) The individual’s consent is valid only if, at or before the time that the organization seeks the individual’s consent, it provides the individual with the following information:

(a) the purposes for the collection, use or disclosure of the personal information determined by the organization and recorded under subsection 12(3) or (4);

(b) the manner in which the personal information is to be collected, used or disclosed;

(c) any reasonably foreseeable consequences of the collection, use or disclosure of the personal information;

(d) the specific type of personal information that is to be collected, used or disclosed; and

(e) the names of any third parties or types of third parties to which the organization may disclose the personal information.

Plain language

(4) The organization must provide the information referred to in subsection (3) in plain language that an individual to whom the organization’s activities are directed would reasonably be expected to understand.

Form of consent

(5) Consent must be expressly obtained unless, subject to subsection (6), it is appropriate to rely on an individual’s implied consent, taking into account the reasonable expectations of the individual and the sensitivity of the personal information that is to be collected, used or disclosed.

Business activities

(6) It is not appropriate to rely on an individual’s implied consent if their personal information is collected or used for an activity described in subsection 18(2) or (3).

Consent — provision of product or service

(7) The organization must not, as a condition of the provision of a product or service, require an individual to consent to the collection, use or disclosure of their personal information beyond what is necessary to provide the product or service.

Consent obtained by deception

16 An organization must not obtain or attempt to obtain an individual’s consent by providing false or misleading information or using deceptive or misleadingpractices. Any consent obtained under those circumstances is invalid.

Withdrawal of consent

17 (1) On giving reasonable notice to an organization, an individual may, at any time, subject to this Act, to federal or provincial law or to the reasonable terms of a contract, withdraw their consent in whole or in part.

Collection, use or disclosure to cease

(2) On receiving the notice from the individual, the organization must inform the individual of the consequences of the withdrawal of their consent and, as soon as feasible after that, cease the collection, use or disclosure of the individual’s personal information in respect of which the consent was withdrawn.

Exceptions to Requirement for Consent

Business Operations

Business activities

18 (1) An organization may collect or use an individual’s personal information without their knowledge or consent if the collection or use is made for the purpose of a business activity described in subsection (2) and

(a) a reasonable person would expect the collection or use for such an activity; and

(b) the personal information is not collected or used for the purpose of influencing the individual’s behaviour or decisions.

List of activities

(2) Subject to the regulations, the following activities are business activities for the purpose of subsection (1):

(a) an activity that is necessary to provide a product or service that the individual has requested from the organization;

(b) an activity that is necessary for the organization’s information, system or network security;

(c) an activity that is necessary for the safety of a product or service that the organization provides; and

(d) any other prescribed activity.

Legitimate interest

(3) An organization may collect or use an individual’s personal information without their knowledge or consent if the collection or use is made for the purpose of an activity in which the organization has a legitimate interest that outweighs any potential adverse effect on the individual resulting from that collection or use and

(a) a reasonable person would expect the collection or use for such an activity; and

(b) the personal information is not collected or used for the purpose of influencing the individual’s behaviour or decisions.

Conditions precedent

(4) Prior to collecting or using personal information under subsection (3), the organization must

(a) identify any potential adverse effect on the individual that is likely to result from the collection or use;

(b) identify and take reasonable measures to reduce the likelihood that the effects will occur or to mitigate or eliminate them; and

(c) comply with any prescribed requirements.

Record of assessment

(5) The organization must record its assessment of how it meets the conditions set out in subsection (4) and must, on request, provide a copy of the assessment to the Commissioner.

Transfer to service provider

19 An organization may transfer an individual’s personal information to a service provider without their knowledge or consent.

De-identification of personal information

20 An organization may use an individual’s personal information without their knowledge or consent to de-identify the information.

Research, analysis and development

21 An organization may use an individual’s personal information without their knowledge or consent for the organization’s internal research, analysis and development purposes, if the information is de-identified before it is used.

Prospective business transaction

22 (1) Organizations that are parties to a prospective business transaction may use and disclose an individual’s personal information without their knowledge or consent if

(a) the information is de-identified before it is used or disclosed and remains so until the transaction is completed;

(b) the organizations have entered into an agreement that requires the organization that receives the information

(i) to use and disclose that information solely for purposes related to the transaction,

(ii) to protect the information by security safeguards proportionate to the sensitivity of the information, and

(iii) if the transaction does not proceed, to return the information to the organization that disclosed it, or dispose of it, within a reasonable time;

(c) the organizations comply with the terms of that agreement; and

(d) the information is necessary

(i) to determine whether to proceed with the transaction, and

(ii) if the determination is made to proceed with the transaction, to complete it.

Exception — paragraph (1)‍(a)

(2) The requirement referred to in paragraph (1)‍(a) does not apply if it would undermine the objectives for carrying out the transaction and the organization has taken into account the risk of harm to the individual that could result from using or disclosing the information.

Completed business transaction

(3) If the business transaction is completed, the organizations that are parties to the transaction may use and disclose the personal information referred to in subsection (1) without the individual’s knowledge or consent if

(a) the organizations have entered into an agreement that requires each of them

(i) to use and disclose the information under its control solely for the purposes for which the information was collected or permitted to be used or disclosed before the transaction was completed,

(ii) to protect that information by security safeguards proportionate to the sensitivity of the information, and

(iii) to give effect to any withdrawal of consent made under subsection 17(1);

(b) the organizations comply with the terms of that agreement;

(c) the information is necessary for carrying on the business or activity that was the object of the transaction; and

(d) one of the parties notifies the individual, within a reasonable time after the transaction is completed, that the transaction has been completed and that their information has been disclosed under subsection (1).

Exception

(4) Subsections (1) and (3) do not apply to a business transaction of which the primary purpose or result is the purchase, sale or other acquisition or disposition, or lease, of personal information.

Information produced in employment, business or profession

23 An organization may collect, use or disclose an individual’s personal information without their knowledge or consent if it was produced by the individual in the course of their employment, business or profession and the collection, use or disclosure is consistent with the purposes for which the information was produced.

Employment relationship — federal work, undertaking or business

24 An organization that operates a federal work, undertaking or business may collect, use or disclose an individual’s personal information without their consent if

(a) the collection, use or disclosure is necessary to establish, manage or terminate an employment relationship between the organization and the individual in connection with the operation of a federal work, undertaking or business; and

(b) the organization has informed the individual that the personal information will be or may be collected, used or disclosed for those purposes.

Disclosure to lawyer or notary

25 An organization may disclose an individual’s personal information without their knowledge or consent to a lawyer or, in Quebec, a lawyer or notary, who is representing the organization.

Witness statement

26 An organization may collect, use or disclose an individual’s personal information without their knowledge or consent if the information is contained in a witness statement and the collection, use or disclosure is necessary to assess, process or settle an insurance claim.

Prevention, detection or suppression of fraud

27 (1) An organization may disclose an individual’s personal information to another organization without the individual’s knowledge or consent if the disclosure is reasonable for the purposes of detecting or suppressing fraud or of preventing fraud that is likely to be committed and it is reasonable to expect that the disclosure with the individual’s knowledge or consent would compromise the ability to prevent, detect or suppress the fraud.

Collection

(2) An organization may collect or use an individual’s personal information without their knowledge or consent if the information was disclosed to it under subsection (1).

Debt collection

28 An organization may disclose an individual’s personal information without their knowledge or consent for the purpose of collecting a debt owed by the individual to the organization.

Public Interest

Individual’s interest

29 (1) An organization may collect an individual’s personal information without their knowledge or consent if the collection is clearly in the interests of the individual and consent cannot be obtained in a timely way.

Use

(2) An organization may use an individual’s personal information without their knowledge or consent if the information was collected under subsection (1).

Emergency — use

30 An organization may use an individual’s personal information without their knowledge or consent for the purpose of acting in respect of an emergency that threatens the life, health or security of any individual.

Emergency — disclosure

31 An organization may disclose an individual’s personal information without their knowledge or consent to a person who needs the information because of an emergency that threatens the life, health or security of any individual. If the individual whom the information is about is alive, the organization must inform that individual in writing without delay of the disclosure.

Identification of individual

32 An organization may disclose an individual’s personal information without their knowledge or consent if the disclosure is necessary to identify the individual who is injured, ill or deceased and is made to a government institution, a part of a government institution or the individual’s next of kin or authorized representative. If the individual is alive, the organization must inform them in writing without delay of the disclosure.

Communication with next of kin or authorized representative

33 An organization may disclose an individual’s personal information without their knowledge or consent to a government institution or part of a government institution that has made a request for the information, identified its lawful authority to obtain the information and indicated that the disclosure is requested for the purpose of communicating with the next of kin or authorized representative of an injured, ill or deceased individual.

Financial abuse

34 An organization may on its own initiative disclose an individual’s personal information without their knowledge or consent to a government institution, a part of a government institution or the individual’s next of kin or authorized representative if

(a) the organization has reasonable grounds to believe that the individual has been, is or may be the victim of financial abuse;

(b) the disclosure is made solely for purposes related to preventing or investigating the abuse; and

(c) it is reasonable to expect that disclosure with the knowledge or consent of the individual would compromise the ability to prevent or investigate the abuse.

Statistics, study or research

35 An organization may disclose an individual’s personal information without their knowledge or consent if

(a) the disclosure is made for statistical purposes or for study or research purposes and those purposes cannot be achieved without disclosing the information;

(b) it is impracticable to obtain consent; and

(c) the organization informs the Commissioner of the disclosure before the information is disclosed.

Records of historic or archival importance

36 An organization may disclose an individual’s personal information without their knowledge or consent to an institution whose functions include the conservation of records of historic or archival importance, if the disclosure is made for the purpose of such conservation.

Disclosure after period of time

37 An organization may disclose an individual’s personal information without their knowledge or consent after the earlier of

(a) 100 years after the record containing the information was created, and

(b) 20 years after the death of the individual.

Journalistic, artistic or literary purposes

38 An organization may collect an individual’s personal information without their knowledge or consent if the collection is solely for journalistic, artistic or literary purposes.

Socially beneficial purposes

39 (1) An organization may disclose an individual’s personal information without their knowledge or consent if

(a) the personal information is de-identified before the disclosure is made;

(b) the disclosure is made to

(i) a government institution or part of a government institution in Canada,

(ii) a health care institution, post-secondary educational institution or public library in Canada,

(iii) any organization that is mandated, under a federal or provincial law or by contract with a government institution or part of a government institution in Canada, to carry out a socially beneficial purpose, or

(iv) any other prescribed entity; and

(c) the disclosure is made for a socially beneficial purpose.

Definition of socially beneficial purpose

(2) For the purpose of this section, socially beneficial purpose means a purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose.

Investigations

Breach of agreement or contravention

40 (1) An organization may collect an individual’s personal information without their knowledge or consent if it is reasonable to expect that the collection with their knowledge or consent would compromise the availability or the accuracy of the information and the collection is reasonable for purposes related to investigating a breach of an agreement or a contravention of federal or provincial law.

Use

(2) An organization may use an individual’s personal information without their knowledge or consent if the information was collected under subsection (1).

Disclosure

(3) An organization may disclose an individual’s personal information without their knowledge or consent if the disclosure is made to another organization and is reasonable for the purposes of investigating a breach of an agreement or a contravention of federal or provincial law that has been, is being or is about to be committed and it is reasonable to expect that disclosure with the knowledge or consent of the individual would compromise the investigation.

Use for investigations

41 An organization may use an individual’s personal information without their knowledge or consent if, in the course of its activities, the organization becomes aware of information that it has reasonable grounds to believe could be useful in the investigation of a contravention of federal or provincial law or law of a foreign jurisdiction that has been, is being or is about to be committed and the information is used for the purpose of investigating that contravention.

Breach of security safeguards

42 An organization may disclose an individual’s personal information without their knowledge or consent if

(a) the disclosure is made to the other organization, government institution or part of a government institution that was notified of a breach under subsection 59(1); and

(b) the disclosure is made solely for the purposes of reducing the risk of harm to the individual that could result from the breach or mitigating that harm.

Disclosures to Government Institutions

Administering law — request of government institution

43 An organization may disclose an individual’s personal information without their knowledge or consent to a government institution or part of a government institution that has made a request for the information, identified its lawful authority to obtain the information and indicated that the disclosure is requested for the purpose of administering federal or provincial law.

Law enforcement — request of government institution

44 An organization may disclose an individual’s personal information without their knowledge or consent to a government institution or part of a government institution that has made a request for the information, identified its lawful authority to obtain the information and indicated that the disclosure is requested for the purpose of enforcing federal or provincial law or law of a foreign jurisdiction, carrying out an investigation relating to the enforcement of any such law or gathering intelligence for the purpose of enforcing any such law.

Contravention of law — initiative of organization

45 An organization may on its own initiative disclose an individual’s personal information without their knowledge or consent to a government institution or a part of a government institution if the organization has reasonable grounds to believe that the information relates to a contravention of federal or provincial law or law of a foreign jurisdiction that has been, is being or is about to be committed.

Proceeds of Crime (Money Laundering) and Terrorist Financing Act

46 An organization may disclose an individual’s personal information without their knowledge or consent to the government institution referred to in section 7 of the Proceeds of Crime (Money Laundering) and Terrorist Financing Act as required by that section.

National security, defence or international affairs — request by government institution

47 (1) An organization may disclose an individual’s personal information without their knowledge or consent to a government institution or part of a government institution that has made a request for the information, identified its lawful authority to obtain the information and indicated that it suspects that the information relates to national security, the defence of Canada or the conduct of international affairs.

Collection

(2) An organization may collect an individual’s personal information without their knowledge or consent for the purpose of making a disclosure under subsection (1).

Use

(3) An organization may use an individual’s personal information without their knowledge or consent if it was collected under subsection (2).

National security, defence or international affairs — initiative of organization

48 (1) An organization may on its own initiative disclose an individual’s personal information without their knowledge or consent to a government institution or a part of a government institution if the organization suspects that the information relates to national security, the defence of Canada or the conduct of international affairs.

Collection

(2) An organization may collect an individual’s personal information without their knowledge or consent for the purpose of making a disclosure under subsection (1).

Use

(3) An organization may use an individual’s personal information without their knowledge or consent if it was collected under subsection (2).

Required by Law

Required by law — collection

49 (1) An organization may collect an individual’s personal information without their knowledge or consent for the purpose of making a disclosure that is required by law.

Use

(2) An organization may use an individual’s personal information without their knowledge or consent if it was collected under subsection (1).

Disclosure

(3) An organization may disclose an individual’s personal information without their knowledge or consent if the disclosure is required by law.

Subpoena, warrant or order

50 An organization may disclose an individual’s personal information without their knowledge or consent if the disclosure is required to comply with a subpoena or warrant issued or an order made by a court, person or body with jurisdiction to compel the production of information, or to comply with rules of procedure relating to the production of records.

Publicly Available Information

Information specified by regulations

51 An organization may collect, use or disclose an individual’s personal information without their knowledge or consent if the personal information is publicly available and is specified by the regulations.

Non-application of Certain Exceptions — Electronic Addresses and Computer Systems

Definitions

52 (1) The following definitions apply in this section.

access means to program, execute programs on, communicate with, store data in, retrieve data from or otherwise make use of any resources, including data or programs of a computer system or a computer network. (utiliser)

computer program has the same meaning as in subsection 342.‍1(2) of the Criminal Code. (programme d’ordinateur)

computer system has the same meaning as in subsection 342.‍1(2) of the Criminal Code. (ordinateur)

electronic address means an address used in connection with

(a) an electronic mail account;

(b) an instant messaging account; or

(c) any similar account. (adresse électronique)

Collection and use of electronic addresses

(2) An organization is not authorized under any of sections 18, 23 and 26, subsection 29(1) and sections 30, 38, 41 and 51 to

(a) collect an individual’s electronic address without their knowledge or consent, if the address is collected by the use of a computer program that is designed or marketed primarily for use in generating or searching for, and collecting, electronic addresses; or

(b) use an individual’s electronic address without their knowledge or consent, if the address is collected by the use of a computer program described in paragraph (a).

Accessing computer system to collect personal information, etc.

(3) An organization is not authorized under any of sections 18, 23 and 26, subsection 29(1), sections 30 and 38, subsection 40(1) and sections 41 and 51 to

(a) collect an individual’s personal information without their knowledge or consent, through any means of telecommunication, if the information is collected by accessing a computer system or causing a computer system to be accessed in contravention of an Act of Parliament; or

(b) use an individual’s personal information without their knowledge or consent, if the information is collected in a manner described in paragraph (a).

Express consent

(4) Despite subsection 15(5), an organization is not to rely on an individual’s implied consent in respect of any collection of personal information described in paragraph (2)‍(a) or (3)‍(a) or any use of personal information described in paragraph (2)‍(b) or (3)‍(b).

Retention and Disposal of Personal Information

Period for retention and disposal

53 (1) An organization must not retain personal information for a period longer than necessary to

(a) fulfill the purposes for which the information was collected, used or disclosed; or

(b) comply with the requirements of this Act, of federal or provincial law or of the reasonable terms of a contract.

The organization must dispose of the information as soon as feasible after that period.

Sensitivity of personal information

(2) For the purposes of paragraph (1)‍(a), when determining the retention period, the organization must take into account the sensitivity of the information.

Personal information used for decision-making

54 An organization that uses personal information to make a decision about an individual must retain the information for a sufficient period of time to permit the individual to make a request for access under section 63.

Disposal at individual’s request

55 (1) If an organization receives a written request from an individual to dispose of their personal information that is under the organization’s control, the organization must, as soon as feasible, dispose of the information, if

(a) the information was collected, used or disclosed in contravention of this Act;

(b) the individual has withdrawn their consent, in whole or in part, to the collection, use or disclosure of the information; or

(c) the information is no longer necessary for the continued provision of a product or service requested by the individual.

Exception

(2) An organization may refuse a request to dispose of personal information in the circumstances described in paragraph (1)‍(b) or (c) if

(a) disposing of the information would result in the disposal of personal information about another individual and the information is not severable;

(b) there are other requirements of this Act, of federal or provincial law or of the reasonable terms of a contract that prevent it from disposing of the information;

(c) the information is necessary for the establishment of a legal defence or in the exercise of other legal remedies by the organization;

(d) the information is not in relation to a minor and the disposal of the information would have an undue adverse impact on the accuracy or integrity of information that is necessary to the ongoing provision of a product or service to the individual in question;

(e) the request is vexatious or made in bad faith; or

(f) the information is not in relation to a minor and it is scheduled to be disposed of in accordance with the organization’s information retention policy, and the organization informs the individual of the remaining period of time for which the information will be retained.

Reasons for refusal

(3) An organization that refuses to dispose of an individual’s personal information must inform them in writing of the refusal, setting out the reasons and any recourse that they may have under section 73 or subsection 82(1).

Disposal of transferred personal information

(4) If an organization disposes of personal information at an individual’s request, it must, as soon as feasible, inform any service provider to which it has transferred the information of the request and ensure that the service provider has disposed of the information.

Accuracy of Personal Information

Accuracy of information

56 (1) An organization must take reasonable steps to ensure that personal information under its control is as accurate, up-to-date and complete as is necessary to fulfill the purposes for which the information is collected, used or disclosed.

Extent of accuracy

(2) In determining the extent to which personal information must be accurate, complete and up-to-date, the organization must take into account the individual’s interests, including

(a) whether the information may be used to make a decision about the individual;

(b) whether the information is used on an ongoing basis; and

(c) whether the information is disclosed to third parties.

Routine updating

(3) An organization is not to routinely update personal information unless it is necessary to fulfill the purposes for which the information is collected, used or disclosed.

Security Safeguards

Security safeguards

57 (1) An organization must protect personal information through physical, organizational and technological security safeguards. The level of protection provided by those safeguards must be proportionate to the sensitivity of the information.

Factors to consider

(2) In addition to the sensitivity of the information, the organization must, in establishing its security safeguards, take into account the quantity, distribution, format and method of storage of the information.

Scope of security safeguards

(3) The security safeguards must protect personal information against, among other things, loss, theft and unauthorized access, disclosure, copying, use and modification and must include reasonable measures to authenticate the identity of the individual to whom the personal information relates.

Report to Commissioner

58 (1) An organization must report to the Commissioner any breach of security safeguards involving personal information under its control if it is reasonable in the circumstances to believe that the breach creates a real risk of significant harm to an individual.

Report requirements

(2) The report must contain the prescribed information and must be made in the prescribed form and manner as soon as feasible after the organization determines that the breach has occurred.

Notification to individual

(3) Unless otherwise prohibited by law, an organization must notify an individual of any breach of security safeguards involving the individual’s personal information under the organization’s control if it is reasonable in the circumstances to believe that the breach creates a real risk of significant harm to the individual.

Contents of notification

(4) The notification must contain sufficient information to allow the individual to understand the significance to them of the breach and to take steps, if any are possible, to reduce the risk of harm that could result from it or to mitigate that harm. It must also contain any other prescribed information.

Form and manner

(5) The notification must be conspicuous and must be given directly to the individual in the prescribed form and manner, except in prescribed circumstances, in which case it must be given indirectly in the prescribed form and manner.

Time to give notification

(6) The notification must be given as soon as feasible after the organization determines that the breach has occurred.

Definition of significant harm

(7) For the purpose of this section, significant harm includes bodily harm, humiliation, damage to reputation orrelationships, loss of employment, business or professional opportunities, financial loss, identity theft, negative effects on the credit record and damage to or loss of property.

Real risk of significant harm — factors

(8) The factors that are relevant to determining whether a breach of security safeguards creates a real risk of significant harm to the individual include

(a) the sensitivity of the personal information involved in the breach;

(b) the probability that the personal information has been, is being or will be misused; and

(c) any other prescribed factor.

Notification to organizations

59 (1) An organization that notifies an individual of a breach of security safeguards under subsection 58(3) must notify any other organization, a government institution or a part of a government institution of the breach if the notifying organization believes that the other organization or the government institution or part concerned may be able to reduce the risk of harm that could result from it or mitigate that harm, or if any of the prescribed conditions are satisfied.

Time to give notification

(2) The notification must be given as soon as feasible after the organization determines that the breach has occurred.

Records

60 (1) An organization must, in accordance with any prescribed requirements, keep and maintain a record of every breach of security safeguards involving personal information under its control.

Provision to Commissioner

(2) An organization must, on request, provide the Commissioner with access to, or a copy of, the record.

Service providers

61 If a service provider determines that any breach of security safeguards has occurred that involves personal information, it must as soon as feasible notify the organization that controls the personal information.

Openness and Transparency

Policies and practices

62 (1) An organization must make readily available, in plain language, information that explains the organization’s policies and practices put in place to fulfill its obligations under this Act.

Additional information

(2) In fulfilling its obligation under subsection (1), an organization must make the following information available:

(a) a description of the type of personal information under the organization’s control;

(b) a general account of how the organization uses the personal information and of how it applies the exceptions to the requirement to obtain an individual’s consent under this Act, including a description of any activities referred to in subsection 18(3) in which it has a legitimate interest;

(c) a general account of the organization’s use of any automated decision system to make predictions, recommendations or decisions about individuals that could have a significant impact on them;

(d) whether or not the organization carries out any international or interprovincial transfer or disclosure of personal information that may have reasonably foreseeable privacy implications;

(e) the retention periods applicable to sensitive personal information;

(f) how an individual may make a request for disposal under section 55 or access under section 63; and

(g) the business contact information of the individual to whom complaints or requests for information may be made.

Access to and Amendment of Personal Information

Information and access

63 (1) On request by an individual, an organization must inform them of whether it has any personal information about them, how it uses the information and whether it has disclosed the information. It must also give the individual access to the information.

Names or types of third parties

(2) If the organization has disclosed the information, the organization must also provide to the individual the names of the third parties or types of third parties to which the disclosure was made, including in cases where the disclosure was made without the consent of the individual.

Automated decision system

(3) If the organization has used an automated decision system to make a prediction, recommendation or decision about the individual that could have a significant impact on them, the organization must, on request by the individual, provide them with an explanation of the prediction, recommendation or decision.

Explanation

(4) The explanation must indicate the type of personal information that was used to make the prediction, recommendation or decision, the source of the information and the reasons or principal factors that led to the prediction, recommendation or decision.

Request in writing

64 (1) A request under section 63 must be made in writing.

Assistance

(2) An organization must assist any individual who informs the organization that they need assistance in preparing a request to the organization.

Information to be provided

65 An organization may require the individual to provide it with sufficient information to allow the organization to fulfill its obligations under section 63.

Plain language

66 (1) The information referred to in section 63 must be provided to the individual in plain language.

Sensory disability

(2) For the purpose of section 63, an organization must give access to personal information in an alternative format to an individual with a sensory disability who requests that it be transmitted in that format if

(a) a version of the information already exists in that format; or

(b) its conversion into that format is reasonable and necessary in order for the individual to be able to exercise rights under this Act.

Sensitive medical information

(3) An organization may choose to give an individual access to sensitive medical information through a medical practitioner.

Time limit

67 (1) An organization must respond to a request made under section 63 with due diligence and in any case no later than 30 days after the day on which the request was received.

Extension of time limit

(2) An organization may extend the time limit

(a) for a maximum of 30 days if

(i) meeting the time limit would unreasonably interfere with the activities of the organization, or

(ii) the time required to undertake any consultations necessary to respond to the request would make the time limit impracticable to meet; or

(b) for the period that is necessary in order to be able to convert the personal information into an alternative format.

In either case, the organization must, no later than 30 days after the day on which the request was received, send a notice of extension to the individual, advising them of the new time limit, the reasons for extending the time limit and their right to make a complaint to the Commissioner in respect of the extension.

Reasons

(3) An organization that responds within the time limit and refuses a request must inform the individual in writing of the refusal, setting out the reasons and any recourse that they may have under section 73 or subsection 82(1).

Deemed refusal

(4) If the organization fails to respond within the time limit, the organization is deemed to have refused the request.

Costs for responding

68 An organization must not respond to the individual’s request made under section 63 at a cost unless

(a) the organization has informed the individual of the approximate cost;

(b) the cost to the individual is minimal; and

(c) the individual has advised the organization that the request is not being withdrawn.

Retention of information

69 An organization that has personal information that is the subject of a request made under section 63 must retain the information for as long as is necessary to allow the individual to exhaust any recourse that they may have under this Act.

When access prohibited

70 (1) Despite section 63, an organization must not give an individual access to personal information under that section if doing so would likely reveal personal information about another individual. However, if the information about the other individual is severable from the information about the requester, the organization must sever the information about the other individual before giving the requester access.

Limit

(2) Subsection (1) does not apply if the other individual consents to the access or the requester needs the information because an individual’s life, health or security is threatened.

Information related to certain exceptions to consent

(3) An organization must comply with subsection (4) if an individual requests that the organization

(a) inform the individual about

(i) any disclosure to a government institution or a part of a government institution under section 44, 45 or 46, subsection 47(1) or 48(1) or section 50, or

(ii) the existence of any information that the organization has relating to a disclosure referred to in subparagraph (i), to a subpoena, warrant or order referred to in section 50 or to a request made by a government institution or a part of a government institution under section 44 or subsection 47(1); or

(b) give the individual access to the information referred to in subparagraph (a)‍(ii).

Notification and response

(4) An organization to which subsection (3) applies

(a) must, in writing and without delay, notify the institution or part concerned of the request made by the individual; and

(b) must not respond to the request before the earlier of

(i) the day on which it is notified under subsection (5), and

(ii) 30 days after the day on which the institution or part is notified.

Objection

(5) Within 30 days after the day on which it is notified under subsection (4), the institution or part must notify the organization of whether the institution or part objects to the organization complying with the request. The institution or part may object only if the institution or part is of the opinion that compliance with the request could reasonably be expected to be injurious to

(a) national security, the defence of Canada or the conduct of international affairs;

(b) the detection, prevention or deterrence of money laundering or the financing of terrorist activities; or

(c) the enforcement of federal or provincial law or law of a foreign jurisdiction, an investigation relating to the enforcement of any such law or the gathering of intelligence for the purpose of enforcing any such law.

Prohibition

(6) Despite section 63, if an organization is notified under subsection (5) that the institution or part objects to the organization complying with the request, the organization

(a) must refuse the request to the extent that it relates to paragraph (3)‍(a) or to information referred to in subparagraph (3)‍(a)‍(ii);

(b) must notify the Commissioner, in writing and without delay, of the refusal;

(c) must not give the individual access to any information that the organization has relating to a disclosure to a government institution or a part of a government institution under section 44, 45 or 46, subsection 47(1) or 48(1) or section 50 or to a request made by a government institution or part of a government institution under section 44 or subsection 47(1);

(d) must not provide to the individual the name of the government institution or part to which the disclosure was made or its type; and

(e) must not disclose to the individual the fact that the organization notified an institution or part under paragraph (4)‍(a), that the institution or part objects or that the Commissioner was notified under paragraph (b).

When access may be refused

(7) Despite section 63, an organization is not required to give access to personal information if

(a) the information is protected by solicitor-client privilege or the professional secrecy of advocates and notaries or by litigation privilege;

(b) to do so would reveal confidential commercial information;

(c) to do so could reasonably be expected to threaten the life or security of another individual;

(d) the information was collected under subsection 40(1);

(e) the information was generated in the course of a formal dispute resolution process; or

(f) the information was created for the purpose of making a disclosure under the Public Servants Disclosure Protection Act or in the course of an investigation into a disclosure under that Act.

However, in the circumstances described in paragraph (b) or (c), if giving access to the information would reveal confidential commercial information or could reasonably be expected to threaten the life or security of another individual, as the case may be, and that information is severable from any other information for which access is requested, the organization must give the individual access after severing.

Limit

(8) Subsection (7) does not apply if the individual needs the information because an individual’s life, health or security is threatened.

Notice

(9) If an organization decides not to give access to personal information in the circumstances set out in paragraph (7)‍(d), the organization must, in writing, notify the Commissioner, and must provide any information that the Commissioner may specify.

Amendment of personal information

71 (1) If an individual has been given access to their personal information and demonstrates that the information is not accurate, up-to-date or complete, the organization must amend the information as required.

Third party

(2) The organization must, if it is appropriate to do so, transmit the amended information to any third party that has access to the information.

Record of determination

(3) If the organization and the individual do not agree on the amendments that are to be made to the information, the organization must record the disagreement and, if it is appropriate to do so, inform third parties that have access to the information of the fact that there is a disagreement.

Mobility of Personal Information

Disclosure under data mobility framework

72 Subject to the regulations, on the request of an individual, an organization must as soon as feasible disclose the personal information that it has collected from the individual to an organization designated by the individual, if both organizations are subject to a data mobility framework.

Challenging Compliance

Complaints and requests for information

73 (1) An individual may make a complaint, or a request for information, to an organization with respect to its compliance with this Part. The organization must respond to any complaint or request that it receives.

Process for making complaint or request

(2) An organization must make readily available information about the process for making a complaint or request.

Investigation of complaints

(3) An organization must investigate any complaint that it receives and make any necessary changes to its policies, practices and procedures as a result of the investigation.

De-identification of Personal Information

Proportionality of technical and administrative measures

74 An organization that de-identifies personal information must ensure that any technical and administrative measures applied to the information are proportionate to the purpose for which the information is de-identified and the sensitivity of the personal information.

Prohibition

75 An organization must not use information that has been de-identified, alone or in combination with other information, to identify an individual except

(a) to conduct testing of the effectiveness of security safeguards that it has put in place;

(b) to comply with any requirements under this Act or under federal or provincial law;

(c) to conduct testing of the fairness and accuracy of models, processes and systems that were developed using information that has been de-identified;

(d) to conduct testing of the effectiveness of its de-identification processes;

(e) for a purpose or situation authorized by the Commissioner under section 116; and

(f) in any other prescribed circumstance.

PART 2 

Commissioner’s Powers, Duties and Functions and General Provisions

Codes of Practice and Certification Programs

Definition of entity

76 (1) For the purpose of this section and sections 77 to 81, entity includes any organization, regardless of whether it is an organization to which this Act applies, or a government institution.

Code of practice

(2) An entity may, in accordance with the regulations, apply to the Commissioner for approval of a code of practice that provides for substantially the same or greater protection of personal information as some or all of the protection provided under this Act.

Approval by Commissioner

(3) The Commissioner may approve the code of practice if the Commissioner determines that the code meets the criteria set out in the regulations.

Certification program

77 (1) An entity may, in accordance with the regulations, apply to the Commissioner for approval of a certification program that includes

(a) a code of practice that provides for substantially the same or greater protection of personal information as some or all of the protection provided under this Act;

(b) guidelines for interpreting and implementing the code of practice;

(c) a mechanism by which an entity that operates the program may certify that an organization is in compliance with the code of practice;

(d) a mechanism for the independent verification of an organization’s compliance with the code of practice;

(e) disciplinary measures for non-compliance with the code of practice by an organization, including the revocation of an organization’s certification; and

(f) anything else that is provided in the regulations.

Approval by Commissioner

(2) The Commissioner may approve the certification program if the Commissioner determines that the program meets the criteria set out in the regulations.

Response by Commissioner

78 The Commissioner must respond in writing to an application under subsection 76(2) or 77(1) in the time specified in the regulations.

Approval made public

79 The Commissioner must make public a decision to approve a code of practice or certification program.

For greater certainty

80 For greater certainty, compliance with the requirements of a code of practice or a certification program does not relieve an organization of its obligations under this Act.

Powers of Commissioner

81 The Commissioner may

(a) request that an entity that operates an approved certification program provide the Commissioner with information that relates to the program;

(b) cooperate with an entity that operates an approved certification program for the purpose of the exercise of the Commissioner’s powers and the performance of the Commissioner’s duties and functions under this Act;

(c) in accordance with the regulations, recommend to an entity that operates an approved certification program that an organization’s certification be withdrawn, in the circumstances and according to the criteria set out in the regulations, if the Commissioner is of the opinion that the organization is not in compliance with the requirements of the program;

(d) disclose information to the Commissioner of Competition, under an agreement or arrangement entered into under section 118, that relates to an entity that operates an approved certification program or an organization that is certified under an approved certification program;

(e) in accordance with the regulations, revoke an approval of a certification program in the circumstances and according to the criteria set out in the regulations; or

(f) consult with federal government institutions respecting codes of practice or certification programs.

Recourses

Filing of Complaints

Contravention

82 (1) An individual may file with the Commissioner a written complaint against an organization for contravening Part 1.

Commissioner may initiate complaint

(2) If the Commissioner is satisfied that there are reasonable grounds to investigate a matter under this Act, the Commissioner may initiate a complaint in respect of the matter.

Time limit

(3) A complaint that results from the refusal to grant a request made under section 63 must be filed within six months, or any longer period that the Commissioner allows, after the refusal or after the expiry of the time limit for responding to the request, as the case may be.

Notice

(4) The Commissioner must give notice of a complaint to the organization against which the complaint was made, unless the Commissioner decides under section 84 not to carry out an investigation.

Investigation of Complaints and Dispute Resolution

Investigation of complaint by Commissioner

83 (1) The Commissioner must carry out an investigation in respect of a complaint, unless the Commissioner is of the opinion that

(a) the complainant should first exhaust grievance or review procedures otherwise reasonably available;

(b) the complaint could more appropriately be dealt with, initially or completely, by means of a procedure provided for under any federal law, other than this Act, or provincial law;

(c) the complaint was not filed within a reasonable period after the day on which the subject matter of the complaint arose;

(d) the complaint raises an issue in respect of which a certification program that was approved by the Commissioner under subsection 77(2) applies and the organization is certified under that program;

(e) there is insufficient evidence to pursue the investigation;

(f) the complaint is trivial, frivolous or vexatious or is made in bad faith;

(g) the organization has provided a fair and reasonable response to the complaint;

(h) the matter is already the object of an ongoing investigation or inquiry under this Act;

(i) the matter has already been the subject of a report or decision by the Commissioner;

(j) the matter is being or has already been addressed under a procedure referred to in paragraph (a) or (b);

(k) the matter is the object of a compliance agreement entered into under subsection 87(1); or

(l) an investigation or any further investigation is unnecessary having regard to all the circumstances of the complaint.

Notification

(2) The Commissioner must notify the complainant and the organization of the Commissioner’s decision not to investigate the complaint or any act referred to in the complaint and give reasons for the decision. However, if the decision is made for any of the reasons set out in section 84, the Commissioner must not notify the organization.

Compelling reasons

(3) The Commissioner may reconsider a decision not to investigate under subsection (1) if the Commissioner is satisfied that the complainant has established that there are compelling reasons to investigate.

Exception

84 Despite subsection 83(1), the Commissioner is not required to carry out an investigation in respect of an act referred to in a complaint if the Commissioner is of the opinion that the act, if proved, would constitute a contravention of any of sections 6 to 9 of An Act to promote the efficiency and adaptability of the Canadian economy by regulating certain activities that discourage reliance on electronic means of carrying out commercial activities, and to amend the Canadian Radio-television and Telecommunications Commission Act, the Competition Act, the Personal Information Protection and Electronic Documents Act and the Telecommunications Act or section 52.‍01 of the Competition Act or would constitute conduct that is reviewable under section 74.‍011 of that Act.

Discontinuance

85 The Commissioner may discontinue the investigation of a complaint if the Commissioner has formed an opinion referred to in subsection 83(1) or section 84. The Commissioner must notify the complainant and the organization of the discontinuance and give reasons for the decision.

Dispute resolution mechanisms

86 The Commissioner may attempt to resolve a complaint by means of a dispute resolution mechanism such as mediation and conciliation, unless an inquiry is being conducted in respect of the complaint.

Compliance Agreements

Entering into compliance agreement

87 (1) If, in the course of an investigation, the Commissioner believes on reasonable grounds that an organization has committed, is about to commit or is likely to commit an act or omission that could constitute a contravention of Part 1, the Commissioner may enter into a compliance agreement with that organization, aimed at ensuring compliance with this Act.

Terms

(2) A compliance agreement may contain any terms that the Commissioner considers necessary to ensure compliance with this Act.

Effect of compliance agreement

(3) The Commissioner must not commence an inquiry under section 89 in respect of any matter covered under the agreement.

For greater certainty

(4) For greater certainty, a compliance agreement does not preclude the prosecution of an offence under this Act.

Notification

Notification and reasons

88 The Commissioner must notify the complainant and the organization and give reasons for the decision if an investigation has concluded and the Commissioner has decided not to conduct an inquiry.

Inquiry

Inquiry — complaint

89 (1) After investigating a complaint, the Commissioner may conduct an inquiry in respect of the complaint if the matter is not

(a) the subject of dispute resolution under section 86;

(b) discontinued; or

(c) resolved.

Notice

(2) The Commissioner must give notice of the inquiry to the complainant and the organization.

Inquiry — compliance agreement

90 (1) If the Commissioner believes on reasonable grounds that an organization is not complying with the terms of a compliance agreement entered into under subsection 87(1), the Commissioner may conduct an inquiry in respect of the non-compliance.

Notice

(2) The Commissioner must give notice of the inquiry to the organization.

Nature of inquiries

91 (1) Subject to subsection (2), the Commissioner is not bound by any legal or technical rules of evidence in conducting an inquiry and must deal with the matter as informally and expeditiously as the circumstances and considerations of fairness and natural justice permit.

Restriction

(2) The Commissioner must not receive or accept as evidence anything that would be inadmissible in a court by reason of any privilege under the law of evidence.

Opportunity to be heard

(3) In conducting the inquiry, the Commissioner must give the organization and the complainant an opportunity to be heard and to be assisted or represented by counsel or by any person.

Inquiry in private

(4) The Commissioner may hold all or any part of the inquiry in private.

Rules

92 The Commissioner must make rules respecting the conduct of an inquiry, including the procedure and rules of evidence to be followed, and must make those rules publicly available.

Decision

93 (1) The Commissioner must complete an inquiry by rendering a decision that sets out

(a) the Commissioner’s findings on whether the organization has contravened this Act or has not complied with the terms of a compliance agreement;

(b) any order made under subsection (2);

(c) any decision made under subsection 94(1); and

(d) the Commissioner’s reasons for the findings, order or decision.

Compliance order

(2) The Commissioner may, to the extent that is reasonably necessary to ensure compliance with this Act, order the organization to

(a) take measures to comply with this Act;

(b) stop doing something that is in contravention of this Act;

(c) comply with the terms of a compliance agreement that has been entered into by the organization; or

(d) make public any measures taken or proposed to be taken to correct the policies, practices or procedures that the organization has put in place to fulfill its obligations under this Act.

Communication of decision

(3) The decision must be sent to the complainant and the organization without delay.

Extension of time

(4) An inquiry conducted under section 89 must be completed within one year after the day on which the complaint is filed or is initiated by the Commissioner. However, the Commissioner may extend the time limit, for a period not exceeding one year, by notifying the complainant and the organization of the anticipated date on which the decision is to be made.

Administrative Monetary Penalties

Recommendation

94 (1) If, on completing an inquiry under section 89 or 90, the Commissioner finds that an organization has contravened one or more of the following provisions, the Commissioner must decide whether to recommend that a penalty be imposed on the organization by the Tribunal:

(a) subsection 9(1);

(b) subsection 11(1);

(c) subsections 12(3) and (4);

(d) section 13;

(e) subsection 14(1);

(f) subsections 15(1) and (7);

(g) section 16;

(h) subsection 17(2);

(i) section 53;

(j) subsections 55(1) and (4);

(k) subsection 57(1);

(l) subsections 58(1) and (3);

(m) section 61; and

(n) subsection 62(1).

Factors to consider

(2) In making the decision, the Commissioner must take into account the following factors:

(a) the nature and scope of the contravention;

(b) any evidence that the organization exercised due diligence to avoid the contravention;

(c) whether the organization made reasonable efforts to mitigate or reverse the contravention’s effects;

(d) the organization’s history of compliance with this Act;

(e) any prescribed factor; and

(f) any other relevant factor.

Limitation

(3) The Commissioner must not recommend that a penalty be imposed on an organization if the Commissioner is of the opinion that, at the time of the contravention of the provision in question, the organization was in compliance with the requirements of a certification program that was in relation to that provision and was approved by the Commissioner under subsection 77(2).

Notice to Tribunal

(4) If the Commissioner decides to recommend that a penalty be imposed on an organization, the Commissioner must file with the Tribunal a copy of the decision rendered under subsection 93(1) that sets out the decision to recommend.

Imposition of penalty

95 (1) The Tribunal may, by order, impose a penalty on an organization if

(a) the Commissioner files a copy of a decision in relation to the organization in accordance with subsection 94(4) or the Tribunal, on appeal, substitutes its own decision to recommend that a penalty be imposed on the organization for the Commissioner’s decision not to recommend;

(b) the organization and the Commissioner are given the opportunity to make representations; and

(c) the Tribunal determines that imposing the penalty is appropriate.

Findings

(2) In determining whether it is appropriate to impose a penalty on an organization, the Tribunal must rely on the findings set out in the decision that is rendered by the Commissioner under subsection 93(1) in relation to the organization or on the Tribunal’s own findings if, on appeal, it substitutes its own findings for those of the Commissioner.

Limitations

(3) The Tribunal must not impose a penalty on an organization in relation to a contravention if a prosecution for the act or omission that constitutes the contravention has been instituted against the organization or if the organization establishes that it exercised due diligence to prevent the contravention.

Maximum penalty

(4) The maximum penalty for all the contraventions in a recommendation taken together is the higher of $10,000,000 and 3% of the organization’s gross global revenue in its financial year before the one in which the penalty is imposed.

Factors to consider

(5) In determining whether it is appropriate to impose a penalty on an organization and in determining the amount of a penalty, the Tribunal must take the following factors into account:

(a) the factors set out in subsection 94(2);

(b) the organization’s ability to pay the penalty and the likely effect of paying it on the organization’s ability to carry on its business; and

(c) any financial benefit that the organization obtained from the contravention.

Purpose of penalty

(6) The purpose of a penalty is to promote compliance with this Act and not to punish.

Recovery as debt due to Her Majesty

96 A penalty imposed under section 95 constitutes a debt due to Her Majesty and the debt is payable and may be recovered by the Minister as of the day on which it is imposed.

Audits

Ensure compliance

97 The Commissioner may, on reasonable notice and at any reasonable time, audit the personal information management practices of an organization if the Commissioner has reasonable grounds to believe that the organization has contravened, is contravening or is likely to contravene Part 1.

Report of findings and recommendations

98 (1) After an audit, the Commissioner must provide the audited organization with a report that contains the findings of the audit and any recommendations that the Commissioner considers appropriate.

Reports may be included in annual reports

(2) The report may be included in a report made under section 121.

Commissioner’s Powers — Investigations, Inquiries and Audits

Powers of Commissioner

99 (1) In carrying out an investigation of a complaint, conducting an inquiry or carrying out an audit, the Commissioner may

(a) summon and enforce the appearance of persons before the Commissioner and compel them to give oral or written evidence on oath and to produce any records and things that the Commissioner considers necessary to carry out the investigation, conduct the inquiry or carry out the audit, in the same manner and to the same extent as a superior court of record;

(b) administer oaths;

(c) receive and accept any evidence and other information, whether on oath, by affidavit or otherwise, that the Commissioner sees fit, whether or not it is or would be admissible in a court of law;

(d) make any interim order that the Commissioner considers appropriate;

(e) order an organization that has information that is relevant to the investigation, inquiry or audit to retain the information for as long as is necessary to allow the Commissioner to carry out the investigation, conduct the inquiry or carry out the audit;

(f) at any reasonable time, enter any premises, other than a dwelling-house, occupied by an organization on satisfying any security requirements of the organization relating to the premises;

(g) converse in private with any person in any premises entered under paragraph (f) and otherwise make any inquiries in those premises that the Commissioner sees fit; and

(h) examine or obtain copies of or extracts from records found in any premises entered under paragraph (f) that contain any matter relevant to the investigation, inquiry or audit.

Return of records

(2) The Commissioner or the Commissioner’s delegate must return to a person or an organization any record or thing that they produced under this section within 10 days after the day on which they make a request to the Commissioner or the delegate, but nothing precludes the Commissioner or the delegate from again requiring that the record or thing be produced.

Delegation

100 (1) The Commissioner may delegate any of the powers, duties or functions set out in sections 83 to 97 and subsection 99(1).

Certificate of delegation

(2) Any person to whom powers set out in subsection 99(1) are delegated must be given a certificate of the delegation and the delegate must produce the certificate, on request, to the person in charge of any premises to be entered under paragraph (f) of that subsection.

Appeals

Right of appeal

101 (1) A complainant or organization that is affected by any of the following findings, orders or decisions may appeal it to the Tribunal:

(a) a finding that is set out in a decision rendered under subsection 93(1);

(b) an order made under subsection 93(2); or

(c) a decision made under subsection 94(1) not to recommend that a penalty be imposed on the organization.

Time limit — appeal

(2) The time limit for making an appeal is 30 days after the day on which the Commissioner renders the decision under subsection 93(1) that sets out the finding, order or decision.

Appeal with leave

102 (1) A complainant or organization that is affected by an interim order made under paragraph 99(1)‍(d) may, with leave of the Tribunal, appeal the order to the Tribunal.

Time limit — leave to appeal

(2) The time limit for making an application for leave to appeal is 30 days after the day on which the order is made.

Disposition of appeals

103 (1) The Tribunal may dispose of an appeal by dismissing it or by allowing it and, in allowing the appeal, the Tribunal may substitute its own finding, order or decision for the one under appeal.

Standard of review

(2) The standard of review for an appeal is correctness for questions of law and palpable and overriding error for questions of fact or questions of mixed law and fact.

Enforcement of Orders

Compliance orders

104 (1) If an order made by the Commissioner under subsection 93(2) is not appealed to the Tribunal or an appeal of the order is dismissed by the Tribunal, the order may, for the purposes of its enforcement, be made an order of the Federal Court and is enforceable in the same manner as an order of that Court.

Interim orders

(2) If an application for leave to appeal to the Tribunal is not made in relation to an order made by the Commissioner under paragraph 99(1)‍(d), a leave application in relation to the order is dismissed by the Tribunal or a leave application in relation to the order is granted by the Tribunal but the appeal is dismissed, then the order may, for the purposes of its enforcement, be made an order of the Federal Court and is enforceable in the same manner as an order of that Court.

Tribunal orders

105 If the Tribunal, on appeal, substitutes its own order for an order of the Commissioner made under subsection 93(2) or paragraph 99(1)‍(d), the Tribunal’s order may, for the purposes of its enforcement, be made an order of the Federal Court and is enforceable in the same manner as an order of that Court.

Filing with Court

106 An order referred to in section 104 or 105 is made an order of the Federal Court by filing a certified copy of it with the Registrar of that Court.

Private Right of Action

Damages — contravention of Act

107 (1) An individual who is affected by an act or omission by an organization that constitutes a contravention of this Act has a cause of action against the organization for damages for loss or injury that the individual has suffered as a result of the contravention if

(a) the Commissioner has made a finding under paragraph 93(1)‍(a) that the organization has contravened this Act and

(i) the finding is not appealed and the time limit for making an appeal under subsection 101(2) has expired, or

(ii) the Tribunal has dismissed an appeal of the finding under subsection 103(1); or

(b) the Tribunal has made a finding under subsection 103(1) that the organization has contravened this Act.

Damages — offence

(2) If an organization has been convicted of an offence under section 128, an individual affected by the act or omission that gave rise to the offence has a cause of action against the organization for damages for loss or injury that the individual has suffered as a result of the act or omission.

Limitation period or prescription

(3) An action must not be brought later than two years after the day on which the individual becomes aware of

(a) in the case of an action under subsection (1), the Commissioner’s finding or, if there is an appeal, the Tribunal’s decision; and

(b) in the case of an action under subsection (2), the conviction.

Court of competent jurisdiction

(4) An action referred to in subsection (1) or (2) may be brought in the Federal Court or a superior court of a province.

Certificate Under Canada Evidence Act

Certificate under Canada Evidence Act

108 (1) If a certificate under section 38.‍13 of the Canada Evidence Act prohibiting the disclosure of personal information of a specific individual is issued before a complaint is filed by that individual under this Act in respect of a request for access to that information, the provisions of this Act respecting that individual’s right of access to their personal information do not apply to the information that is subject to the certificate.

Certificate following filing of complaint

(2) Despite any other provision of this Act, if a certificate under section 38.‍13 of the Canada Evidence Act prohibiting the disclosure of personal information of a specific individual is issued after the filing of a complaint under this Act in relation to a request for access to that information,

(a) all proceedings under this Act in respect of that information, including an investigation, inquiry, audit, appeal or judicial review, are discontinued;

(b) the Commissioner must not disclose the information and must take all necessary precautions to prevent its disclosure; and

(c) the Commissioner must, within 10 days after the day on which the certificate is published in the Canada Gazette, return the information to the organization that provided the information.

Information not to be disclosed

(3) The Commissioner and every person acting on behalf or under the direction of the Commissioner, in exercising their powers and performing their duties and functions under this Act, must not disclose information subject to a certificate issued under section 38.‍13 of the Canada Evidence Act and must take every reasonable precaution to avoid the disclosure of that information.

Power to delegate

(4) The Commissioner must not delegate the investigation or inquiry in respect of any complaint relating to information subject to a certificate issued under section 38.‍13 of the Canada Evidence Act except to one of a maximum of four officers or employees of the Commissioner specifically designated by the Commissioner for the purpose of conducting that investigation or inquiry, as the case may be.

Powers, Duties and Functions of Commissioner

Factors to consider

109 In exercising any powers and performing the any duties and functions under this Act, the Commissioner must take into account

(a) the purpose of this Act;

(b) the size and revenue of organizations;

(c) the volume and sensitivity of the personal information under their control; and

(d) matters of general public interest.

Promoting purposes of Act

110 (1) The Commissioner must, in the form and manner that the Commissioner considers appropriate,

(a) develop and conduct information programs to foster public understanding of this Act and recognition of its purposes;

(b) develop guidance materials and tools for organizations in relation to their compliance with this Act — including any guidance materials and tools that are requested by the Minister — in consultation with stakeholders, including any relevant federal government institutions;

(c) undertake and publish research that is related to the protection of personal information, including any research that is requested by the Minister;

(d) undertake and publish any research related to the operation or implementation of this Act that is requested by the Minister;

(e) on request by an organization, provide guidance on — and, if the Commissioner considers it appropriate, recommend corrective measures in relation to — its privacy management program; and

(f) promote, by any other means that the Commissioner considers appropriate, the purposes of this Act.

For greater certainty

(2) For greater certainty, for the purpose of paragraph (1)‍(e), the Commissioner may prioritize the requests of organizations that the Commissioner considers to be in greatest need of guidance and is not required to act on a request that the Commissioner considers unreasonable.

Prohibition — use for initiating complaint or audit

111 The Commissioner must not use the information the Commissioner receives under section 10 or paragraph 110(1)‍(e) as grounds to initiate a complaint under subsection 82(2) or to carry out an audit under section 97 unless the Commissioner considers that the organization has wilfully disregarded the corrective measures that were recommended in relation to its privacy management program.

Information — powers, duties or functions

112 The Commissioner must make readily available information on the manner in which the Commissioner exercises the Commissioner’s powers or performs the Commissioner’s duties or functions under this Act.

Confidentiality

113 (1) Subject to subsections (3) to (8), section 79, paragraph 81(c), subsections 82(4) and 83(2), section 88, subsections 89(2) and 90(2), section 93, subsections 94(4), 98(1), 118(2), 119(3) and 120(1) and section 121, the Commissioner or any person acting on behalf or under the direction of the Commissioner must not disclose any information that comes to their knowledge as a result of the exercise of any of the Commissioner’s powers or the performance of any of the Commissioner’s duties or functions under this Act other than those referred to in subsection 58(1) or 60(2).

Confidentiality — reports and records

(2) Subject to subsections (3) to (8), section 79, paragraph 81(c), subsections 82(4) and 83(2), section 88, subsections 89(2) and 90(2), section 93, subsections 94(4), 98(1), 118(2), 119(3) and 120(1) and section 121, the Commissioner or any person acting on behalf or under the direction of the Commissioner must not disclose any information contained in a report made under subsection 58(1) or in a record obtained under subsection 60(2).

Public interest

(3) The Commissioner may, if the Commissioner considers that it is in the public interest to do so, make public any information that comes to the Commissioner’s knowledge in the exercise of any of the Commissioner’s powers or the performance of any of the Commissioner’s duties or functions under this Act.

Disclosure of necessary information

(4) The Commissioner may disclose, or may authorize any person acting on behalf or under the direction of the Commissioner to disclose, information that in the Commissioner’s opinion is necessary to

(a) carry out an investigation, conduct an inquiry or carry out an audit under this Act; or

(b) establish the grounds for findings and recommendations contained in any decision or report made under this Act.

Disclosure in the course of proceedings

(5) The Commissioner may disclose, or may authorize any person acting on behalf or under the direction of the Commissioner to disclose, information in the course of

(a) a prosecution for an offence under section 128;

(b) a prosecution for an offence under section 132 of the Criminal Code (perjury) in respect of a statement made under this Act;

(c) a proceeding or an appeal before the Tribunal under this Act; or

(d) a judicial review in relation to the exercise of any of the Commissioner’s powers or the performance of any of the Commissioner’s duties or functions under this Act or in relation to a decision of the Tribunal.

Disclosure of offence authorized

(6) The Commissioner may disclose to the Attorney General of Canada or of a province, as the case may be, information relating to the commission of an offence under any federal or provincial law on the part of an officer or employee of an organization if, in the Commissioner’s opinion, there is evidence of an offence.

Disclosure of breach of security safeguards

(7) The Commissioner may disclose, or may authorize any person acting on behalf or under the direction of the Commissioner to disclose, to a government institution or a part of a government institution, any information contained in a report made under subsection 58(1) or in a record obtained under subsection 60(2) if the Commissioner has reasonable grounds to believe that the information could be useful in the investigation of a contravention of any federal or provincial law that has been, is being or is about to be committed.

Disclosure

(8) The Commissioner may disclose information, or may authorize any person acting on behalf or under the direction of the Commissioner to disclose information, in the course of proceedings in which the Commissioner has intervened under paragraph 50(c) of An Act to promote the efficiency and adaptability of the Canadian economy by regulating certain activities that discourage reliance on electronic means of carrying out commercial activities, and to amend the Canadian Radio-television and Telecommunications Commission Act, the Competition Act, the Personal Information Protection and Electronic Documents Act and the Telecommunications Act or in accordance with subsection 58(3) or 60(1) of that Act.

Not competent witness

114 The Commissioner or any person acting on behalf or under the direction of the Commissioner is not a competent witness in respect of any matter that comes to their knowledge as a result of the exercise of any of the Commissioner’s powers or the performance of any of the Commissioner’s duties or functions under this Act in any proceeding other than

(a) a prosecution for an offence under section 128;

(b) a prosecution for an offence under section 132 of the Criminal Code (perjury) in respect of a statement made under this Act; or

(c) a proceeding or an appeal before the Tribunal under this Act.

Protection of Commissioner

115 (1) No criminal or civil proceedings lie against the Commissioner, or against any person acting on behalf or under the direction of the Commissioner, for anything done, reported, decided or said in good faith as a result of the exercise or purported exercise of any power of the Commissioner or the performance or purported performance of any duty or function of the Commissioner under this Act.

Defamation

(2) No action lies in defamation with respect to

(a) anything said, any information supplied or any record or thing produced in good faith in the course of an investigation or audit carried out or an inquiry conducted by or on behalf of the Commissioner under this Act; and

(b) any report or decision made in good faith by the Commissioner under this Act and any fair and accurate account of the report or decision made in good faith for the purpose of news reporting.

De-identified information

116 For the purpose of paragraph 75(e), the Commissioner may, on request by an organization, authorize a purpose or situation in which the organization may use information that has been de-identified, alone or in combination with other information, to identify an individual if, in the Commissioner’s opinion, it is clearly in the interests of the individual.

Agreements or arrangements — Minister

117 The Commissioner may enter into an agreement or arrangement with the Minister relating to the administration of this Act.

Agreements or arrangements — CRTC and Commissioner of Competition

118 (1) The Commissioner may enter into agreements or arrangements with the Canadian Radio-television and Telecommunications Commission or the Commissioner of Competition in order to

(a) undertake and publish research on issues of mutual interest; and

(b) develop procedures for disclosing information referred to in subsection (2).

Disclosure of information

(2) The Commissioner may, in accordance with any procedure established under paragraph (1)‍(b), disclose information, other than information the Commissioner has received under section 10 or paragraph 110(1)‍(e), to the Canadian Radio-television and Telecommunications Commission or the Commissioner of Competition if the information is relevant to their powers, duties or functions.

Purpose and confidentiality

(3) The procedures referred to in paragraph (1)‍(b) must

(a) restrict the use of the information to the purpose for which it was originally disclosed; and

(b) stipulate that the information be treated in a confidential manner and not be further disclosed without the express consent of the Commissioner.

Consultations with provinces

119 (1) If the Commissioner considers it appropriate to do so, or on the request of an interested person, the Commissioner may, in order to ensure that personal information is protected in as consistent a manner as possible, consult with any person who, under provincial legislation, has powers, duties and functions similar to those of the Commissioner with respect to the protection of personal information.

Agreements or arrangements with provinces

(2) The Commissioner may enter into agreements or arrangements with any person referred to in subsection (1) in order to

(a) coordinate the activities of their offices and the office of the Commissioner, including to provide for mechanisms for the handling of any complaint in which they are mutually interested;

(b) undertake and publish research or develop and publish guidelines or other documents related to the protection of personal information;

(c) develop model contracts or other documents related to the protection of personal information that is collected, used or disclosed interprovincially or internationally; and

(d) develop procedures for disclosing information referred to in subsection (3).

Disclosure of information to provinces

(3) The Commissioner may, in accordance with any procedure established under paragraph (2)‍(d), disclose information, other than information the Commissioner has received under section 10 or paragraph 110(1)‍(e), to any person referred to in subsection (1), if the information

(a) could be relevant to an ongoing or potential investigation of a complaint, inquiry or audit under this Act or provincial legislation that has objectives that are similar to this Act; or

(b) could assist the Commissioner or that person in the exercise of their powers or the performance of their duties or functions with respect to the protection of personal information.

Purpose and confidentiality

(4) The procedures referred to in paragraph (2)‍(d) must

(a) restrict the use of the information to the purpose for which it was originally disclosed; and

(b) stipulate that the information be treated in a confidential manner and not be further disclosed without the express consent of the Commissioner.

Disclosure of information to foreign state

120 (1) Subject to subsection (3), the Commissioner may, in accordance with any procedure established under paragraph (4)‍(b), disclose information referred to in subsection (2), other than information the Commissioner has received under section 10 or paragraph 110(1)‍(e), that has come to the Commissioner’s knowledge as a result of the exercise of any of the Commissioner’s powers or the performance of any of the Commissioner’s duties and functions under this Act to any person or body who, under the legislation of a foreign state, has

(a) powers, duties and functions similar to those of the Commissioner with respect to the protection of personal information; or

(b) responsibilities that relate to conduct that is substantially similar to conduct that would be in contravention of this Act.

Information that can be disclosed

(2) The information that the Commissioner is authorized to disclose under subsection (1) is information that the Commissioner believes

(a) would be relevant to an ongoing or potential investigation or proceeding in respect of a contravention of the laws of a foreign state that address conduct that is substantially similar to conduct that would be in contravention of this Act; or

(b) is necessary to disclose in order to obtain from the person or body information that may be useful to an ongoing or potential investigation, inquiry or audit under this Act.

Written arrangements

(3) The Commissioner may only disclose information to the person or body referred to in subsection (1) if the Commissioner has entered into a written arrangement with that person or body that

(a) limits the information to be disclosed to that which is necessary for the purpose set out in paragraph (2)‍(a) or (b);

(b) restricts the use of the information to the purpose for which it was originally disclosed; and

(c) stipulates that the information be treated in a confidential manner and not be further disclosed without the express consent of the Commissioner.

Arrangements

(4) The Commissioner may enter into arrangements with one or more persons or bodies referred to in subsection (1) in order to

(a) provide for cooperation with respect to the enforcement of laws protecting personal information, including the disclosure of information referred to in subsection (2) and the provision of mechanisms for the handling of any complaint in which they are mutually interested;

(b) establish procedures for disclosing information referred to in subsection (2);

(c) develop recommendations, resolutions, rules, standards or other documents with respect to the protection of personal information;

(d) undertake and publish research related to the protection of personal information;

(e) share knowledge and expertise by different means, including through staff exchanges; or

(f) identify issues of mutual interest and determine priorities pertaining to the protection of personal information.

Annual report

121 (1) The Commissioner must, within three months after the end of each financial year, cause to be tabled in each House of Parliament a report concerning the application of this Act, the extent to which the provinces have enacted legislation that is substantially similar to this Act and the application of any such legislation.

Consultation

(2) Before preparing the report, the Commissioner must consult with those persons in the provinces who, in the Commissioner’s opinion, are in a position to assist the Commissioner in making a report respecting personal information that is collected, used or disclosed interprovincially or internationally.

General

Regulations

122 (1) The Governor in Council may make regulations for carrying out the purposes and provisions of this Act, including regulations

(a) respecting the scope of any of the activities set out in paragraphs 18(2)‍(a) to (c), including specifying activities that are excluded from the application of this Act;

(b) specifying what is a government institution or part of a government institution for the purposes of any provision of this Act;

(c) specifying information for the purpose of section 51;

(d) specifying information to be kept and maintained under subsection 60(1); and

(e) prescribing anything that by this Act is to be prescribed.

Orders

(2) The Governor in Council may, by order,

(a) provide that this Act is binding on any agent of Her Majesty in right of Canada to which the Privacy Act does not apply;

(b) if satisfied that legislation of a province that is substantially similar to this Act applies to an organization, a class of organizations, an activity or a class of activities, exempt the organization, activity or class from the application of this Act in respect of the collection, use or disclosure of personal information that occurs within that province; and

(c) amend the schedule by adding or deleting, in column 1, a reference to an organization or by adding or deleting, in column 2, the description of personal information in relation to an organization in column 1.

Regulations — substantially similar provincial legislation

(3) The Governor in Council may make regulations establishing

(a) criteria that are to be applied in making a determination under paragraph (2)‍(b) that provincial legislation is substantially similar to this Act, or in reconsidering that determination; and

(b) the process for making or reconsidering that determination.

Data mobility frameworks

123 The Governor in Council may make regulations respecting the disclosure of personal information under section 72, including regulations

(a) respecting data mobility frameworks and prescribing

(i) safeguards that must be put in place by organizations to enable the secure disclosure of personal information under section 72 and the collection of that information, and

(ii) parameters for the technical means for ensuring interoperability in respect of the disclosure and collection of that information;

(b) specifying organizations that are subject to a data mobility framework; and

(c) providing for exceptions to the requirement to disclose personal information under that section, including exceptions related to the protection of proprietary or confidential commercial information.

Distinguishing — classes

124 Regulations made under subsection 122(1) or section 123 may distinguish among different classes of activities, government institutions or parts of government institutions, information, organizations or entities.

Regulations — codes of conduct and certification programs

125 The Minister may make regulations

(a) respecting the making of an application under subsection 76(2);

(b) setting out criteria for the purpose of subsection 76(3);

(c) respecting the reconsideration of a determination made under subsection 76(3);

(d) respecting the making of an application under subsection 77(1);

(e) providing for anything else that must be included in a certification program for the purpose of paragraph 77(1)‍(f);

(f) setting out criteria for the purpose of subsection 77(2);

(g) respecting the reconsideration of a determination made under subsection 77(2);

(h) specifying, for the purpose of section 78, the time for responding to an application;

(i) respecting the criteria for and the manner and the circumstances in which a recommendation may be made under paragraph 81(c);

(j) respecting the criteria for and the manner and the circumstances in which an approval may be revoked under paragraph 81(e); and

(k) respecting record-keeping and reporting obligations of an entity that operates an approved certification program, including obligations to provide reports to the Commissioner in respect of an approved certification program.

Whistleblowing

126 (1) Any person who has reasonable grounds to believe that a person has contravened or intends to contravene Part 1 may notify the Commissioner of the particulars of the matter and may request that their identity be kept confidential with respect to the notification.

Confidentiality

(2) The Commissioner must keep confidential the identity of a person who has notified the Commissioner under subsection (1) and to whom an assurance of confidentiality has been provided by the Commissioner.

Prohibition

127 (1) An employer must not dismiss, suspend, demote, discipline, harass or otherwise disadvantage an employee, or deny an employee a benefit of employment, by reason that

(a) the employee, acting in good faith and on the basis of reasonable belief, has disclosed to the Commissioner that the employer or any other person has contravened or intends to contravene Part 1;

(b) the employee, acting in good faith and on the basis of reasonable belief, has refused or stated an intention of refusing to do anything that is a contravention of Part 1;

(c) the employee, acting in good faith and on the basis of reasonable belief, has done or stated an intention of doing anything that is required to be done in order that Part 1 not be contravened; or

(d) the employer believes that the employee will do anything referred to in paragraph (a), (b) or (c).

Saving

(2) Nothing in this section impairs any right of an employee, either at law or under an employment contract or collective agreement.

Definitions of employee and employer

(3) In this section, employee includes an independent contractor and employer has a corresponding meaning.

Offence and punishment

128 Every organization that knowingly contravenes section 58, subsection 60(1), section 69 or 75 or subsection 127(1) or an order under subsection 93(2) or that obstructs the Commissioner or the Commissioner’s delegate in the investigation of a complaint, in conducting an inquiry or in carrying out an audit is

(a) guilty of an indictable offence and liable to a fine not exceeding the higher of $25,000,000 and 5% of the organization’s gross global revenue in its financial year before the one in which the organization is sentenced; or

(b) guilty of an offence punishable on summary conviction and liable to a fine not exceeding the higher of $20,000,000 and 4% of the organization’s gross global revenue in its financial year before the one in which the organization is sentenced.

Review by parliamentary committee

129 (1) Five years after the day on which this section comes into force, and every five years after that, a comprehensive review of the provisions and operation of this Act is to be commenced by a committee of the Senate, of the House of Commons or of both Houses of Parliament that may be designated or established by the Senate, the House of Commons or both Houses of Parliament, as the case may be, for that purpose.

Report

(2) Within one year, or any further time that is authorized by the Senate, the House of Commons or both Houses of Parliament, as the case may be, after the day on which the review is commenced, the committee must submit a report on that review to the Senate, the House of Commons or both Houses of Parliament, as the case may be, together with a statement of any changes recommended by the committee.

PART 3 

Coming into Force

Order in council

130 (1) Subject to subsections (2) and (3), this Act comes into force on the day on which section 3 of the Digital Charter Implementation Act, 2022 comes into force.

Order in council

(2) Sections 72 and 123 come into force on a day to be fixed by order of the Governor in Council.

Order in council

(3) Sections 76 to 81, paragraph 83(1)‍(d), subsection 94(3) and section 125 come into force on a day to be fixed by order of the Governor in Council.

Consequential and Related Amendments

2000, c. 5

Personal Information Protection and Electronic Documents Act

3 The long title of the Personal Information Protection and Electronic Documents Act is replaced by the following:

An Act to Insertion startprovideInsertion end for the use of electronic means to communicate or record information or transactions

2000, c. 17, par. 97(1)‍(b) and (d); 2001, c. 41, ss. 81, 82 and 103; 2002, c. 8, par. 183(1)‍(r); 2004, c. 15, s. 98; 2005, c. 46, s. 57; 2006, c. 9, s. 223; 2010, c. 23, ss. 82 to 84, 86(2) and 87; 2015, c. 32, ss. 2 to 7, 8(F), 9 to 17, 18(1) and (2)‍(E), 19, 20(1) and (2)‍(E), 21 to 24 and 26(2) and (3), c. 36, s. 164 and 165; 2019, c. 18, s. 61

4 Sections 1 to 30 of the Act are replaced by the following:

Short title

1 This Act may be cited as the Electronic Documents Act.

5 Section 31 of the Act is amended by adding the following after subsection (2):

Designation of Minister

Start of inserted block

(3) The Governor in Council may, by order, designate a member of the Queen’s Privy Council for Canada as the Minister responsible for this Act.

End of inserted block

6 Parts 3 to 5 of the Act are repealed.

7 Schedule 1 to the Act is repealed.

2015, c. 36, s. 166

8 Schedule 4 to the Act is repealed.

R.‍S.‍, c. A-1

Access to Information Act

2015, c. 32, s. 25

9 (1) Schedule II to the Access to Information Act is amended by striking out the reference to

Personal Information Protection and Electronic Documents Act

Loi sur la protection des renseignements personnels et les documents électroniques

and the corresponding reference to “subsection 20(1.‍1)”.

(2) Schedule II to the Act is amended by adding, in alphabetical order, a reference to

Start of inserted block

Start of inserted block

Consumer Privacy Protection Act

Loi sur la protection de la vie privée des consommateurs

End of inserted block

End of inserted block

and a corresponding reference to “subsection 113(2)”.

R.‍S.‍, c. A-2

Aeronautics Act

2011, c. 9, s. 2(1)

10 Subsection 4.‍83(1) of the Aeronautics Act is replaced by the following:

Foreign states requiring information

4.‍83 (1) Despite Insertion startPart 1Insertion end of the Insertion startConsumer PrivacyInsertion end Protection Act, to the extent that that Insertion startPartInsertion end relates to obligations relating to the disclosure of information, anoperator of an aircraft departing from Canada that is due to land in a foreign state or fly over the United States and land outside Canada or of a Canadian aircraft departing from any place outside Canada that is due to land in a foreign state or fly over the United States may, in accordance with the regulations, provide to a competent authority in that foreign state any information that is in the operator’s control relating to persons on board or expected to be on board the aircraft and that is required by the laws of the foreign state.

R.‍S.‍, c. C-5

Canada Evidence Act

2001, c. 41, s. 44

11 Item 14 of the schedule to the Canada Evidence Act is replaced by the following:

14 The Privacy Commissioner, for the purposes of the Insertion startConsumer PrivacyInsertion end Protection Act

2001, c. 41, s. 44

12 Item 17 of the schedule to the Act is replaced by the following:

17 The Personal Information and Insertion startDataInsertion end Protection Insertion startTribunalInsertion end, for the purposes of the Insertion startConsumer PrivacyInsertion end Protection Act

R.‍S.‍, c. C-22

Canadian Radio-television and Telecommunications Commission Act

13 The Canadian Radio-television and Telecommunications Commission Act is amended by adding the following after section 12:

Agreements or arrangements — Privacy Commissioner

Start of inserted block

12.‍1 (1) The Commission may enter into an agreement or arrangement with the Privacy Commissioner in order to

(a) undertake and publish research on issues of mutual interest; and

(b) develop procedures for disclosing information referred to in subsection (2).

End of inserted block

Disclosure of information

Start of inserted block

(2) The Commission may, in accordance with any procedure established under paragraph (1)‍(b), disclose information to the Privacy Commissioner if the information is relevant to the Commissioner’s powers, duties or functions under the Consumer Privacy Protection Act.

End of inserted block

Purpose and confidentiality

Start of inserted block

(3) The procedures referred to in paragraph (1)‍(b) shall

(a) restrict the use of the information to the purpose for which it was originally disclosed; and

(b) stipulate that the information be treated in a confidential manner and not be further disclosed without the express consent of the Commission.

End of inserted block

R.‍S.‍, c. C-34; R.‍S.‍, c. 19 (2nd Supp.‍), s. 19

Competition Act

14 The Competition Act is amended by adding the following after section 29.‍2:

Agreements or arrangements — Privacy Commissioner

Start of inserted block

29.‍3 (1) Despite subsection 29(1), the Commissioner may enter into an agreement or arrangement with the Privacy Commissioner in order to

(a) undertake and publish research on issues of mutual interest; and

(b) develop procedures for disclosing information referred to in subsection (2).

End of inserted block

Disclosure of information

Start of inserted block

(2) The Commissioner may, in accordance with any procedure established under paragraph (1)‍(b), disclose information to the Privacy Commissioner if the information is relevant to the Privacy Commissioner’s powers, duties or functions under the Consumer Privacy Protection Act.

End of inserted block

Purpose and confidentiality

Start of inserted block

(3) The procedures referred to in paragraph (1)‍(b) shall

(a) restrict the use of the information to the purpose for which it was originally disclosed; and

(b) stipulate that the information be treated in a confidential manner and not be further disclosed without the express consent of the Commissioner.

End of inserted block

R.‍S.‍, c. C-44; 1994, c. 24, s. 1(F)

Canada Business Corporations Act

2018, c. 27, s. 183

15 Subsection 21.‍1(5) of the Canada Business Corporations Act is replaced by the following:

Disposal of personal information

(5) Within one year after the sixth anniversary of the day on which an individual ceases to be an individual with significant control over the corporation, the corporation shall — subject to any other Act of Parliament and to any Act of the legislature of a province that provides for a longer retention period — dispose of any of that individual’s personal information, as defined in Insertion startsubsectionInsertion end 2Insertion start(1)Insertion end of the Insertion startConsumer PrivacyInsertion end Protection Act, that is recorded in the register.

1993, c. 38

Telecommunications Act

2010, c. 23, s. 88(1)

16 (1) Subsection 39(2) of the Telecommunications Act is replaced by the following:

Information not to be disclosed

(2) Subject to subsections (4), (5) and (5.‍1) Insertion starttoInsertion end (6), if a person designates information as confidential and the designation is not withdrawn by that person, no person described in subsection (3) shall knowingly disclose the information, or knowingly allow it to be disclosed, to any other person in any manner that is calculated or likely to make it available for the use of any person who may benefit from the information or use the information to the detriment of any person to whose business or affairs the information relates.

(2) Section 39 of the Act is amended by adding the following after subsection (5.‍1):

Disclosure to Privacy Commissioner

Start of inserted block

(5.‍2) The Commission may disclose designated information obtained by it in the exercise of its powers or the performance of its duties or functions under this Act to the Privacy Commissioner in accordance with section 12.‍1 of the Canadian Radio-television and Telecommunications Commission Act.

End of inserted block

2005, c. 46

Public Servants Disclosure Protection Act

17 Paragraph 15(a) of the Public Servants Disclosure Protection Act is replaced by the following:

(a) Insertion startPart 1Insertion end of the Insertion startConsumer PrivacyInsertion end Protection Act, to the extent that that Insertion startPartInsertion end relates to obligations relating to the disclosure of information; and

18 Subsection 16(1.‍1) of the Act is replaced by the following:

Limitation

(1.‍1) Subsection (1) does not apply in respect of information the disclosure of which is subject to any restriction created by or under any Act of Parliament, including the Insertion startConsumer PrivacyInsertion end Protection Act.

19 Section 50 of the Act is replaced by the following:

Personal information

50 Despite Insertion startPart 1Insertion end of the Insertion startConsumer PrivacyInsertion end Protection Act, to the extent that that Insertion startPartInsertion end relates to obligations relating to the disclosure of information, and despite any other Act of Parliament that restricts the disclosure of information, a report by a chief executive in response to recommendations made by the Commissioner to the chief executive under this Act may include personal information within the meaning of subsection 2(1) of that Act, or section 3 of the Privacy Act, depending on which of those Acts applies to the portion of the public sector for which the chief executive is responsible.

2010, c. 23

Chapter 23 of the Statutes of Canada, 2010

20 Section 2 of An Act to promote the efficiency and adaptability of the Canadian economy by regulating certain activities that discourage reliance on electronic means of carrying out commercial activities, and to amend the Canadian Radio-television and Telecommunications Commission Act, the Competition Act, the Personal Information Protection and Electronic Documents Act and the Telecommunications Act is replaced by the following:

Precedence of this Act

2 In the event of a conflict between a provision of this Act and a provision of the Insertion startConsumer PrivacyInsertion end Protection Act, the provision of this Act operates despite the provision of that Insertion startActInsertion end, to the extent of the conflict.

21 Paragraph 20(3)‍(c) of the Act is replaced by the following:

(c) the person’s history with respect to

(Insertion startiInsertion end) any previous violation of this Act,

(Insertion startiiInsertion end) any previous conduct that is reviewable under section 74.‍011 of the Competition Act,

(Insertion startiiiInsertion end) any previous contravention of section 5 of the Personal Information Protection and Electronic Documents Act, Insertion startas it read immediately before the day on which section 4 of the Digital Charter Implementation Act, 2022 comes into forceInsertion end, that relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act, and

Start of inserted block

(iv) any previous contravention of Part 1 of the Consumer Privacy Protection Act that relates to a collection or use described in subsection 52(2) or (3) of that Act;

End of inserted block

22 (1) Subsection 47(1) of the Act is replaced by the following:

Application

47 (1) A person who alleges that they are affected by an act or omission that constitutes a contravention of any of sections 6 to 9 of this Act or a contravention of Insertion startPartInsertion end 1 of the Insertion startConsumer PrivacyInsertion end Protection Act that relates to a collection or use described in subsection Insertion start52Insertion end(2) or (3) of that Act — or that constitutes conduct that is reviewable under section 74.‍011 of the Competition Act — may apply to a court of competent jurisdiction for an order under section 51 against one or more persons Insertion startwhomInsertion end they allege have committed the act or omission or Insertion startwhomInsertion end they allege are liable for the contravention or reviewable conduct by reason of section 52 or 53.

(2) Subsection 47(4) of the Act is replaced by the following:

Notice

(4) The applicant must, without delay, serve a copy of the application on every person against whom an order is sought, on the Commission if the application identifies a contravention of this Act, on the Commissioner of Competition if the application identifies conduct that is reviewable under section 74.‍011 of the Competition Act and on the Privacy Commissioner if the application identifies a contravention of the Insertion startConsumer PrivacyInsertion end Protection Act.

23 Paragraph 50(c) of the Act is replaced by the following:

(c) the Privacy Commissioner, if the application identifies a contravention of the Insertion startConsumer PrivacyInsertion end Protection Act.

24 (1) Subparagraph 51(1)‍(b)‍(vi) of the Act is replaced by the following:

(vi) in the case of a contravention of Insertion startPartInsertion end 1 of the Insertion startConsumer PrivacyInsertion end Protection Act that relates to a collection or use described in subsection Insertion start52Insertion end(2) or (3) of that Act, $1,000,000 for each day on which a contravention occurred, and

(2) Subsection 51(2) of the Act is replaced by the following:

Purpose of order

(2) The purpose of an order under paragraph (1)‍(b) is to promote compliance with this Act, the Insertion startConsumer PrivacyInsertion end Protection Act or the Competition Act, as the case may be, and not to punish.

(3) Paragraph 51(3)‍(c) of the Act is replaced by the following:

(c) the person’s history, or each person’s history, as the case may be, with respect to

(Insertion startiInsertion end) any previous contravention of this Act,

(Insertion startiiInsertion end) Insertion startany previousInsertion end contravention of section 5 of the Personal Information Protection and Electronic Documents Act, Insertion startas it read immediately before the day on which section 4 of the Digital Charter Implementation Act, 2022 comes into forceInsertion end, that relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act,

Start of inserted block

(iii) any previous contravention of Part 1 of the Consumer Privacy Protection Act that relates to a collection or use described in subsection 52(2) or (3) of that Act, and

End of inserted block

(Insertion startivInsertion end) any previous conduct that is reviewable under section 74.‍011 of the Competition Act;

25 Sections 52 to 54 of the Act are replaced by the following:

Directors and officers of corporations

52 An officer, director Insertion startorInsertion end agent or mandatary of a corporation that commits a contravention of any of sections 6 to 9 of this Act or of Insertion startPartInsertion end 1 of the Insertion startConsumer PrivacyInsertion end Protection Act that relates to a collection or use described in subsection Insertion start52Insertion end(2) or (3) of that Act, or that engages in conduct that is reviewable under section 74.‍011 of the Competition Act, is liable for the contravention or reviewable conduct, as the case may be, if they directed, authorized, assented to, acquiesced in or participated in the commission of that contravention, or engaged in that conduct, whether or not the corporation is proceeded against.

Vicarious liability

53 A person is liable for a contravention of any of sections 6 to 9 of this Act or of Insertion startPartInsertion end 1 of the Insertion startConsumer PrivacyInsertion end Protection Act that relates to a collection or use described in subsection Insertion start52Insertion end(2) or (3) of that Act, or for conduct that is reviewable under section 74.‍011 of the Competition Act, that is committed or engaged in, as the case may be, by their employee acting within the scope of their employment or their agent or mandatary acting within the scope of their authority, whether or not the employee Insertion startorInsertion end agent or mandatary is identified or proceeded against.

Defence

54 (1) A person must not be found to have committed a contravention of any of sections 6 to 9 of this Act or of Insertion startPartInsertion end 1 of the Insertion startConsumer PrivacyInsertion end Protection Act that relates to a collection or use described in subsection Insertion start52Insertion end(2) or (3) of that Act, or to have engaged in conduct that is reviewable under section 74.‍011 of the Competition Act, if they establish that they exercised due diligence to prevent the contravention or conduct, as the case may be.

Common law principles

(2) Every rule and principle of the common law that makes any circumstance a justification or excuse in relation to a charge for an offence applies in respect of a contravention or conduct Insertion startreferred to in subsection (1)Insertion end, to theextent that it is not inconsistent with this Act or the Insertion startConsumer PrivacyInsertion end Protection Act or the Competition Act, as the case may be.

26 (1) The portion of section 56 of the Act before paragraph (a) is replaced by the following:

Disclosure by an organization

56 Any organization to which the Insertion startConsumer PrivacyInsertion end Protection Act applies may on its own initiative disclose to the Commission, the Commissioner of Competition or the Privacy Commissioner any information in its possession that it believes relates to

(2) Subparagraph 56(a)‍(iii) of the Act is replaced by the following:

(iii) Insertion startPartInsertion end 1 of the Insertion startConsumer PrivacyInsertion end Protection Act, which contravention relates to a collection or use described in subsection Insertion start52Insertion end(2) or (3) of that Act, or

27 Section 57 of the Act is replaced by the following:

Consultation

57 The Commission, the Commissioner of Competition and the Privacy Commissioner must consult with each other to the extent that they consider appropriate to ensure the effective regulation, under this Act, the Competition Act, the Insertion startConsumer PrivacyInsertion end Protection Act and the Telecommunications Act, of commercial conduct that discourages the use of electronic means to carry out commercial activities, and to coordinate their activities under those Acts as they relate to the regulation of that type of conduct.

28 (1) Paragraph 58(1)‍(a) of the Act is replaced by the following:

(a) to the Privacy Commissioner, if the Commission believes that the information relates to the exercise of the Privacy Commissioner’s powers Insertion startor the performance of the Privacy Commissioner’sInsertion end duties or Insertion startfunctionsInsertion end under the Insertion startConsumer PrivacyInsertion end Protection Act in respect of a collection or use described in subsection Insertion start52Insertion end(2) or (3) of that Act; and

(2) Paragraph 58(2)‍(a) of the Act is replaced by the following:

(a) to the Privacy Commissioner, if the Commissioner of Competition believes that the information relates to the exercise of the Privacy Commissioner’s powers Insertion startor the performance of the Privacy Commissioner’sInsertion end duties or Insertion startfunctionsInsertion end under the Insertion startConsumer PrivacyInsertion end Protection Act in respect of a collection or use described in subsection Insertion start52Insertion end(2) or (3) of that Act; and

(3) The portion of subsection 58(3) of the Act before paragraph (a) is replaced by the following:

Disclosure by Privacy Commissioner

(3) The Privacy Commissioner may disclose information obtained by Insertion startthe Privacy CommissionerInsertion end in the exercise of the Insertion startPrivacy Commissioner’sInsertion end powers Insertion startor the performance of the Privacy Commissioner’s dutiesInsertion end or functions under the Insertion startConsumer PrivacyInsertion end Protection Act if the information relates to a collection or use described in subsection Insertion start52Insertion end(2) or (3) of that Act or to an act alleged in a complaint in respect of which the Privacy Commissioner decides, under Insertion startsection 84Insertion end of that Act, to not conduct an investigation or to discontinue an investigation,

29 Subsection 59(3) of the Act is replaced by the following:

Use of information by Privacy Commissioner

(3) The Privacy Commissioner may use the information that is disclosed to Insertion startthe Privacy CommissionerInsertion end under paragraph 58(1)‍(a) or (2)‍(a) only for the purpose of exercising the Insertion startPrivacy Commissioner’sInsertion end powers Insertion startor performing the Privacy Commissioner’s dutiesInsertion end or functions under the Insertion startConsumer PrivacyInsertion end Protection Act in respect of a collection or use described in subsection Insertion start52Insertion end(2) or (3) of that Act.

30 (1) Subparagraph 60(1)‍(a)‍(ii) of the Act is replaced by the following:

(ii) conduct that contravenes Insertion startPartInsertion end 1 of the Insertion startConsumer PrivacyInsertion end Protection Act and that relates to a collection or use described in subsection Insertion start52Insertion end(2) or (3) of that Act,

(2) Subparagraph 60(1)‍(b)‍(iii) of the Act is replaced by the following:

(iii) the exercise by the Privacy Commissioner of the Insertion startPrivacy Commissioner’sInsertion end powers Insertion startor the performance of the Privacy Commissioner’s dutiesInsertion end or functions under the Insertion startConsumer PrivacyInsertion end Protection Act in respect of a collection or use described in subsection Insertion start52Insertion end(2) or (3) of that Act, or

31 Section 61 of the Act is replaced by the following:

Reports to Minister of Industry

61 The Commission, the Commissioner of Competition and the Privacy Commissioner must provide the Minister of Industry with any reports that Insertion startthe MinisterInsertion end requests for the purpose of coordinating the implementation of sections 6 to 9 of this Act, sections 52.‍01 and 74.‍011 of the Competition Act and section Insertion start52Insertion end of the Insertion startConsumer PrivacyInsertion end Protection Act.

2018, c. 10

Transportation Modernization Act

32 Section 62 of the Transportation Modernization Act is amended by replacing the subsection 17.‍91(4) that it enacts with the following:

Consumer Privacy Protection Act and provincial legislation

(4) A company that collects, uses or communicates information under this section, section 17.‍31 or 17.‍94, subsection 28(1.‍1) or 36(2) or regulations made under section 17.‍95 may do so

(a) despite Insertion startPart 1Insertion end of the Insertion startConsumer PrivacyInsertion end Protection Act, to the extent that that Insertion startPartInsertion end relates to obligations relating to the collection, use, disclosure, retention Insertion startand disposalInsertion end of information; and

(b) despite any provision of provincial legislation that is substantially similar to Insertion startthatInsertion end Act and that limits the collection, use, communication or preservation of information.

Terminology

Replacement of “Personal Information Protection and Electronic Documents Act”

33 Every reference to the “Personal Information Protection and Electronic Documents Act” is replaced by a reference to the “Electronic Documents Act” in the following provisions:

(a) the definition secure electronic signature in section 31.‍8 of the Canada Evidence Act;

(b) subsection 95(2) of the Canadian Forces Superannuation Act;

(c) subsections 252.‍6(2) and (3) of the Canada Business Corporations Act;

(d) subsection 74(2) of the Public Service Superannuation Act;

(e) subsection 44(2) of the Royal Canadian Mounted Police Superannuation Act;

(f) subparagraph 205.‍124(1)‍(u)‍(ii) of the Canada–Newfoundland and Labrador Atlantic Accord Implementation Act;

(g) subparagraph 210.‍126(1)‍(u)‍(ii) of the Canada-Nova Scotia Offshore Petroleum Resources Accord Implementation Act;

(h) subsections 539.‍1(2) and (3) of the Trust and Loan Companies Act;

(i) subsections 1001(2) and (3) of the Bank Act;

(j) subsections 1043(2) and (3) of the Insurance Companies Act;

(k) subsections 487.‍1(2) and (3) of the Cooperative Credit Associations Act;

(l) subsections 361.‍6(2) and (3) of the Canada Cooperatives Act; and

(m) subsections 269(2) and (3) of the Canada Not-for-profit Corporations Act.

Transitional Provisions

Definitions

34 (1) The following definitions apply in this section.

former Act means the Personal Information Protection and Electronic Documents Act, as it read immediately before the day on which section 82 of the Consumer Privacy Protection Act, enacted by section 2, comes into force. (ancienne loi)

new Act means the Consumer Privacy Protection Act. (nouvelle loi)

Pending complaints

(2) If a complaint was filed or initiated under section 11 of the former Act before the day on which section 82 of the new Act comes into force and it has not been dealt with or disposed of on that day, the complaint is to be dealt with and disposed of in accordance with the former Act. However, if the Privacy Commissioner has reasonable grounds to believe that the contravention that is alleged in the complaint is continuing after that day, the complaint is to be dealt with and disposed of in accordance with the new Act.

Contraventions before coming into force

(3) If a complaint is filed or initiated on or after the day on which section 82 of the new Act comes into force in respect of a contravention that is alleged to have occurred before that day, the complaint is to be dealt with and disposed of in accordance with the former Act. However, if the Privacy Commissioner has reasonable grounds to believe that the contravention that is alleged in the complaint is continuing after that day, the complaint is to be dealt with and disposed of in accordance with the new Act.

Coordinating Amendments

Bill C-11

35 If Bill C-11, introduced in the 1st session of the 44th Parliament and entitled the Online Streaming Act, receives royal assent, then on the first day on which both section 22 of that Act and section 13 of this Act are in force,

(a) subsection 25.‍3(2) of the Broadcasting Act is replaced by the following:

Information not to be disclosed

(2)  Subject to subsections (4) to (5.‍1) and (7), if a person designates information as confidential and the designation is not withdrawn by that person, no person described in subsection (3) shall knowingly disclose the information, or knowingly allow it to be disclosed, to any other person in any manner that is intended or likely to make it available for the use of any person who may benefit from the information or use it to the detriment of any person to whose business or affairs the information relates.

(b) section 25.‍3 of the Broadcasting Act, as enacted by that section 22, is amended by adding the following after subsection (5):

Disclosure to Privacy Commissioner

(5.‍1) The Commission may disclose designated information obtained by it in the exercise of its powers or the performance of its duties or functions under this Act to the Privacy Commissioner in accordance with section 12.‍1 of the Canadian Radio-television and Telecommunications Commission Act.

2018, c. 10

36 (1) In this section, other Act means the Transportation Modernization Act.

(2) If section 62 of the other Act comes into force before section 2 of this Act, then

(a) section 32 of this Act is repealed; and

(b) on the coming into force of section 2 of this Act, subsection 17.‍91(4) of the Railway Safety Act is replaced by the following:

Consumer Privacy Protection Act and provincial legislation

(4) A company that collects, uses or communicates information under this section, section 17.‍31 or 17.‍94, subsection 28(1.‍1) or 36(2) or regulations made under section 17.‍95 may do so

(a) despite Part 1 of the Consumer Privacy Protection Act, to the extent that that Part relates to obligations relating to the collection, use, disclosure, retention and disposal of information; and

(b) despite any provision of provincial legislation that is substantially similar to that Act and that limits the collection, use, communication or preservation of information.

(3) If section 62 of the other Act comes into force on the same day as section 32 of this Act, then that section 32 is deemed to have come into force before that section 62.

PART 2 

Personal Information and Data Protection Tribunal Act

Enactment of Act

Enactment

37 The Personal Information and Data Protection Tribunal Act is enacted as follows:

An Act to establish the Personal Information and Data Protection Tribunal

Short title

1 This Act may be cited as the Personal Information and Data Protection Tribunal Act.

Definition of Minister

2 In this Act, Minister means the member of the Queen’s Privy Council for Canada designated under section 3 or, if no member is designated, the Minister of Industry.

Order designating Minister

3 The Governor in Council may, by order, designate any member of the Queen’s Privy Council for Canada to be the Minister for the purposes of this Act.

Establishment

4 A tribunal to be called the Personal Information and Data Protection Tribunal (“the Tribunal”) is established.

Jurisdiction

5 The Tribunal has jurisdiction in respect of all appeals that may be made under section 101 or 102 of the Consumer Privacy Protection Act and in respect of the imposition of penalties under section 95 of that Act.

Members

6 (1) The Tribunal consists of three to six members to be appointed by the Governor in Council on the recommendation of the Minister.

Full- or part-time members

(2) Members may be appointed as full-time or part-time members.

Full-time occupation

(3) Full-time members must devote the whole of their time to the performance of their duties and functions under this Act.

Experience

(4) At least three of the members must have experience in the field of information and privacy law.

Chairperson and Vice-Chairperson

7 The Governor in Council must designate one member as Chairperson of the Tribunal and may designate one member as Vice-Chairperson. The Chairperson must be a full-time member.

Duties of Chairperson

8 (1) The Chairperson has supervision over, and direction of the work of the Tribunal, including

(a) the distribution of work among members and the assignment of members to hear matters brought before the Tribunal and, if the Chairperson considers it appropriate for matters to be heard by panels, the assignment of members to panels and to preside over panels; and

(b) the conduct of the work of the Tribunal and the management of its internal affairs.

Acting Chairperson

(2) In the event of the absence or incapacity of the Chairperson or if the office of Chairperson is vacant, the Vice-Chairperson acts as Chairperson.

Acting Chairperson

9 In the event of the absence or incapacity of the Chairperson and the Vice-Chairperson or if both of those offices are vacant, a member of the Tribunal designated by the Minister acts as Chairperson. The designated member is not however authorized to act as Chairperson for a period of more than 90 days without the approval of the Governor in Council.

Term of office

10 (1) A member is to be appointed to hold office during good behaviour for a term not exceeding five years and may be removed for cause by the Governor in Council.

Reappointment

(2) A member is eligible to be reappointed for one or more terms not exceeding three years each.

Disposition after expiry of appointment

(3) A member whose appointment expires may, at the request of the Chairperson and for a period of not more than six months, make or take part in a decision on a matter that they heard as a member. For that purpose, the former member is deemed to be a part-time member.

Remuneration

11 (1) Members are to receive the remuneration that is fixed by the Governor in Council.

Expenses

(2) Each member is entitled to be paid reasonable travel and living expenses incurred while absent in the course of their duties from, in the case of a full-time member, their ordinary place of work and, in the case of a part-time member, their ordinary place of residence.

Status

(3) Members are deemed to be employees for the purposes of the Government Employees Compensation Act and to be employed in the federal public administration for the purposes of any regulations made under section 9 of the Aeronautics Act.

Public Service Superannuation Act

(4) Full-time members are also deemed to be persons employed in the public service for the purposes of the Public Service Superannuation Act.

Inconsistent interests

12 If a member who is assigned to hear or is hearing any matter before the Tribunal, either alone or as a member of a panel, holds any pecuniary or other interest that could be inconsistent with the proper performance of their duties and functions in relation to the matter, the member must disclose the interest to the Chairperson without delay.

Principal office

13 The principal office of the Tribunal must be in a place in Canada that is designated by the Governor in Council or, if no place is designated, in the National Capital Region described in the schedule to the National Capital Act.

Sittings

14 The Tribunal is to sit at those times and places in Canada and in the manner that the Chairperson considers necessary for the proper performance of its duties and functions.

Nature of hearings

15 (1) Subject to subsection (2), the Tribunal is not bound by any legal or technical rules of evidence in conducting a hearing in relation to any matter that comes before it and it must deal with all matters as informally and expeditiously as the circumstances and considerations of fairness and natural justice permit.

Restriction

(2) The Tribunal must not receive or accept as evidence anything that would be inadmissible in a court by reason of any privilege under the law of evidence.

Appearance

(3) A party to a proceeding before the Tribunal may appear in person or be represented by another person, including legal counsel.

Private hearings

(4) Hearings must be held in public. However, the Tribunal may hold all or any part of a hearing in private if it is of the opinion that

(a) a public hearing would not be in the public interest; or

(b) confidential information may be disclosed and the desirability of ensuring that the information is not publicly disclosed outweighs the desirability of adhering to the principle that hearings be open to the public.

Standard of proof

(5) In any proceeding before the Tribunal, a party that has the burden of proof discharges it by proof on the balance of probabilities.

Decision of panel

(6) A decision of the majority of the members of a panel referred to in paragraph 8(1)‍(a) is a decision of the Tribunal.

Powers

16 (1) The Tribunal has, with respect to the appearance, swearing and examination of witnesses, the production and inspection of documents, the enforcement of its decisions and other matters necessary or proper for the due exercise of its jurisdiction, all the powers, rights and privileges that are vested in a superior court of record.

Enforcement of decisions

(2) Any decision of the Tribunal may, for the purposes of its enforcement, be made an order of the Federal Court or of any superior court and is enforceable in the same manner as an order of the court.

Procedure

(3) To make a decision of the Tribunal an order of a court, the usual practice and procedure of the court in such matters may be followed or a certified copy of the decision may be filed with the registrar of the court, at which time the decision becomes an order of the court.

Reasons

17 The Tribunal must provide a decision, with reasons, in writing to all parties to a proceeding.

Public availability — decisions

18 (1) The Tribunal must make its decisions, and the reasons for them, publicly available in accordance with its rules.

Complainants

(2) If the Tribunal makes a decision in relation to a complaint filed under the Consumer Privacy Protection Act, the Tribunal must not make the complainant’s name or any personal information that could be used to identify the complainant publicly available without the complainant’s consent.

Rules

19 (1) The Tribunal may, with the approval of the Governor in Council, make rules that are not inconsistent with this Act or the Consumer Privacy Protection Act to govern the management of its affairs and the practice and procedure in connection with matters brought before it, including rules respecting when decisions are to be made public and the factors to be taken into consideration in deciding whether to name an organization affected by a decision in the decision.

Public availability — rules

(2) The Tribunal must make its rules publicly available.

Cost

20 (1) The Tribunal may, in accordance with its rules, award costs.

Certificate

(2) Costs under subsection (1) that have not been paid may be certified by the Tribunal.

Registration of certificate

(3) On production to the Federal Court, a certificate must be registered. When it is registered, a certificate has the same force and effect as if it were a judgment obtained in the Federal Court for a debt of the amount specified in it and all reasonable costs and charges attendant on its registration, recoverable in that Court or in any other court of competent jurisdiction.

Decisions final

21 A decision of the Tribunal is final and binding and, except for judicial review under the Federal Courts Act, is not subject to appeal or to review by any court.

2014, c. 20, s. 376

Related Amendment to the Administrative Tribunals Support Service of Canada Act

38 The schedule to the Administrative Tribunals Support Service of Canada Act is amended by adding the following in alphabetical order:

Start of inserted block

Start of inserted block

Personal Information and Data Protection Tribunal

Tribunal de la protection des renseignements personnels et des données

End of inserted block

End of inserted block

PART 3 

Artificial Intelligence and Data Act

Enactment of Act

39 The Artificial Intelligence and Data Act is enacted as follows:

An Act respecting artificial intelligence systems and data used in artificial intelligence systems

Short Title

Short title

1 This Act may be cited as the Artificial Intelligence and Data Act.

Definitions and Application

Definitions

2 The following definitions apply in this Act.

artificial intelligence system means a technological system that, autonomously or partly autonomously, processes data related to human activities through the use of a genetic algorithm, a neural network, machine learning or another technique in order to generate content or make decisions, recommendations or predictions. (système d’intelligence artificielle)

person includes a trust, a joint venture, a partnership, an unincorporated association and any other legal entity. (personne)

personal information has the meaning assigned by subsections 2(1) and (3) of the Consumer Privacy Protection Act. (renseignement personnel)

Non-application

3 (1) This Act does not apply with respect to a government institution as defined in section 3 of the Privacy Act.

Product, service or activity

(2) This Act does not apply with respect to a product, service or activity that is under the direction or control of

(a) the Minister of National Defence;

(b) the Director of the Canadian Security Intelligence Service;

(c) the Chief of the Communications Security Establishment; or

(d) any other person who is responsible for a federal or provincial department or agency and who is prescribed by regulation.

Regulations

(3) The Governor in Council may make regulations prescribing persons for the purpose of paragraph (2)‍(d).

Purposes of Act

Purposes

4 The purposes of this Act are

(a) to regulate international and interprovincial trade and commerce in artificial intelligence systems by establishing common requirements, applicable across Canada, for the design, development and use of those systems; and

(b) to prohibit certain conduct in relation to artificial intelligence systems that may result in serious harm to individuals or harm to their interests.

PART 1 

Regulation of Artificial Intelligence Systems in the Private Sector

Interpretation

Definitions

5 (1) The following definitions apply in this Part.

biased output means content that is generated, or a decision, recommendation or prediction that is made, by an artificial intelligence system and that adversely differentiates, directly or indirectly and without justification, in relation to an individual on one or more of the prohibited grounds of discrimination set out in section 3 of the Canadian Human Rights Act, or on a combination of such prohibited grounds. It does not include content, or a decision, recommendation or prediction, the purpose and effect of which are to prevent disadvantages that are likely to be suffered by, or to eliminate or reduce disadvantages that are suffered by, any group of individuals when those disadvantages would be based on or related to the prohibited grounds. (résultat biaisé)

confidential business information, in respect of a person to whose business or affairs the information relates, means business information

(a) that is not publicly available;

(b) in respect of which the person has taken measures that are reasonable in the circumstances to ensure that it remains not publicly available; and

(c) that has actual or potential economic value to the person or their competitors because it is not publicly available and its disclosure would result in a material financial loss to the person or a material financial gain to their competitors. (renseignements commerciaux confidentiels)

harm means

(a) physical or psychological harm to an individual;

(b) damage to an individual’s property; or

(c) economic loss to an individual. (préjudice)

high-impact system means an artificial intelligence system that meets the criteria for a high-impact system that are established in regulations. (système à incidence élevée)

Minister means the member of the Queen’s Privy Council for Canada designated under section 31 or, if no member is so designated, the Minister of Industry. (ministre)

regulated activity means any of the following activities carried out in the course of international or interprovincial trade and commerce:

(a) processing or making available for use any data relating to human activities for the purpose of designing, developing or using an artificial intelligence system;

(b) designing, developing or making available for use an artificial intelligence system or managing its operations. (activité réglementée)

Person responsible

(2) For the purposes of this Part, a person is responsible for an artificial intelligence system, including a high-impact system, if, in the course of international or interprovincial trade and commerce, they design, develop or make available for use the artificial intelligence system or manage its operation.

Requirements

Anonymized data

6 A person who carries out any regulated activity and who processes or makes available for use anonymized data in the course of that activity must, in accordance with the regulations, establish measures with respect to

(a) the manner in which data is anonymized; and

(b) the use or management of anonymized data.

Assessment — high-impact system

7 A person who is responsible for an artificial intelligence system must, in accordance with the regulations, assess whether it is a high-impact system.

Measures related to risks

8 A person who is responsible for a high-impact system must, in accordance with the regulations, establish measures to identify, assess and mitigate the risks of harm or biased output that could result from the use of the system.

Monitoring of mitigation measures

9 A person who is responsible for a high-impact system must, in accordance with the regulations, establish measures to monitor compliance with the mitigation measures they are required to establish under section 8 and the effectiveness of those mitigation measures.

Keeping general records

10 (1) A person who carries out any regulated activity must, in accordance with the regulations, keep records describing in general terms, as the case may be,

(a) the measures they establish under sections 6, 8 and 9; and

(b) the reasons supporting their assessment under section 7.

Additional records

(2) The person must, in accordance with the regulations, keep any other records in respect of the requirements under sections 6 to 9 that apply to them.

Publication of description — making system available for use

11 (1) A person who makes available for use a high-impact system must, in the time and manner that may be prescribed by regulation, publish on a publicly available website a plain-language description of the system that includes an explanation of

(a) how the system is intended to be used;

(b) the types of content that it is intended to generate and the decisions, recommendations or predictions that it is intended to make;

(c) the mitigation measures established under section 8 in respect of it; and

(d) any other information that may be prescribed by regulation.

Publication of description — managing operation of system

(2) A person who manages the operation of a high-impact system must, in the time and manner that may be prescribed by regulation, publish on a publicly available website a plain-language description of the system that includes an explanation of

(a) how the system is used;

(b) the types of content that it generates and the decisions, recommendations or predictions that it makes;

(c) the mitigation measures established under section 8 in respect of it; and

(d) any other information that may be prescribed by regulation.

Notification of material harm

12 A person who is responsible for a high-impact system must, in accordance with the regulations and as soon as feasible, notify the Minister if the use of the system results or is likely to result in material harm.

Ministerial Orders

Provision of subsection 10(1) records

13 The Minister may, by order, require that a person referred to in subsection 10(1) provide the Minister with any of the records referred to in that subsection.

Provision of subsection 10(2) records

14 If the Minister has reasonable grounds to believe that the use of a high-impact system could result in harm or biased output, the Minister may, by order, require that a person referred to in subsection 10(2) provide the Minister, in the form specified in the order, with any of the records referred to in that subsection that relate to that system.

Audit

15 (1) If the Minister has reasonable grounds to believe that a person has contravened any of sections 6 to 12 or an order made under section 13 or 14, the Minister may, by order, require that the person

(a) conduct an audit with respect to the possible contravention; or

(b) engage the services of an independent auditor to conduct the audit.

Qualifications

(2) The audit must be conducted by a person who meets the qualifications that are prescribed by regulation.

Assistance

(3) If the audit is conducted by an independent auditor, the person who is audited must give all assistance that is reasonably required to enable the auditor to conduct the audit, including by providing any records or other information specified by the auditor.

Report

(4) The person who is audited must provide the Minister with the audit report.

Cost

(5) In all cases, the cost of the audit is payable by the person who is audited.

Implementation of measures

16 The Minister may, by order, require that a person who has been audited implement any measure specified in the order to address anything referred to in the audit report.

Cessation

17 (1) The Minister may, by order, require that any person who is responsible for a high-impact system cease using it or making it available for use if the Minister has reasonable grounds to believe that the use of the system gives rise to a serious risk of imminent harm.

Statutory Instruments Act

(2) The order is exempt from the application of sections 3 and 9 of the Statutory Instruments Act.

Publication

18 (1) The Minister may, by order, require that a person referred to in any of sections 6 to 12, 15 and 16 publish, on a publicly available website, any information related to any of those sections. However, the Minister is not permitted to require that the person disclose confidential business information.

Regulations

(2) The person must publish the information under subsection (1) in accordance with any regulations.

Compliance

19 A person who is the subject of an order made by the Minister under this Part must comply with the order.

Filing — Federal Court

20 The Minister may file a certified copy of an order made under any of sections 13 to 18 in the Federal Court and, on the certified copy being filed, the order becomes and may be enforced as an order of the Federal Court.

Statutory Instruments Act

21 An order made under any of sections 13 to 16 and 18 is not a statutory instrument as defined in subsection 2(1) of the Statutory Instruments Act.

Information

Confidential nature maintained

22 For greater certainty, confidential business information that is obtained by the Minister under this Part does not lose its confidential nature by the mere fact that it is so obtained or that it has been disclosed by the Minister under section 25 or 26.

Obligation of Minister

23 Subject to sections 24 to 26, the Minister must take measures to maintain the confidentiality of any confidential business information that the Minister obtains under this Part.

Disclosure of confidential business information — subpoena, warrant, etc.

24 The Minister may disclose confidential business information for the purpose of complying with a subpoena or warrant issued or order made by a court, person or body with jurisdiction to compel the production of information or for the purpose of complying with rules of court relating to the production of information.

Disclosure of information — analyst

25 (1) The Minister may disclose any information that is obtained under this Part to an analyst designated under section 34.

Conditions — confidentiality

(2) The Minister may impose any condition on the analyst in order to protect the confidentiality of information that the Minister discloses.

Duty and restriction

(3) An analyst must maintain the confidentiality of information disclosed to them under subsection (1) and may use the information only for the administration and enforcement of this Part.

Disclosure of information — others

26 (1) The Minister may disclose any information obtained under this Part to any of the following recipients, if the Minister has reasonable grounds to believe that a person who carries out any regulated activity has contravened, or is likely to contravene, another Act of Parliament or a provincial legislature that is administered or enforced by the intended recipient of the information and if the information is relevant to the intended recipient’s powers, duties or functions under that Act:

(a) the Privacy Commissioner;

(b) the Canadian Human Rights Commission;

(c) the Commissioner of Competition;

(d) the Canadian Radio-television and Telecommunications Commission;

(e) any person appointed by the government of a province, or any provincial entity, with powers, duties and functions that are similar to those of the Privacy Commissioner or the Canadian Human Rights Commission;

(f) any other person or entity prescribed by regulation.

Restriction

(2) The Minister may disclose personal information or confidential business information under subsection (1) only if

(a) the Minister is satisfied that the disclosure is necessary for the purposes of enabling the recipient to administer or enforce the Act in question; and

(b) the recipient agrees in writing to maintain the confidentiality of the information except as necessary for any of those purposes.

Restriction — use

(3) The recipient may use the disclosed information only for the purpose of the administration and enforcement of the Act in question.

Publication of information — contravention

27 (1) If the Minister considers that it is in the public interest to do so, the Minister may, for the purpose of encouraging compliance with this Part, publish information about any contravention of this Part on a publicly available website.

Restriction

(2) However, the Minister is not permitted to publish confidential business information under subsection (1).

Publication of information — harm

28 (1) Without the consent of the person to whom the information relates and without notifying that person, the Minister may publish, on a publicly available website, information that relates to an artificial intelligence system and that is obtained under this Part if the Minister has reasonable grounds to believe that

(a) the use of the system gives rise to a serious risk of imminent harm; and

(b) the publication of the information is essential to prevent the harm.

Restriction

(2) However, the Minister is not permitted to publish personal information or confidential business information under subsection (1).

Administrative Monetary Penalties

Administrative monetary penalties

29 (1) A person who is found under the regulations to have committed a violation is liable to the administrative monetary penalty established by the regulations.

Purpose of penalty

(2) The purpose of an administrative monetary penalty is to promote compliance with this Part and not to punish.

Violation or offence

(3) If an act or omission may be proceeded with as a violation or as an offence, proceeding with it in one manner precludes proceeding with it in the other.

Regulations

(4) The Governor in Council may make regulations respecting an administrative monetary penalties scheme, including regulations

(a) designating the provisions of this Part or of the regulations the contravention of which constitutes a violation, including those provisions the contravention of which, if continued on more than one day, constitutes a separate violation in respect of each day during which the violation is continued;

(b) classifying each violation as a minor violation, a serious violation or a very serious violation;

(c) respecting the proceedings in respect of a violation, including in relation to

(i) commencing the proceedings,

(ii) maintaining the confidentiality of confidential business information in the proceedings,

(iii) the defences that may be available in respect of a violation, and

(iv) the circumstances in which the proceedings may be brought to an end;

(d) respecting the administrative monetary penalties that may be imposed for a violation, including in relation to

(i) the amount, or range of amounts, of the administrative monetary penalties that may be imposed on persons or classes of persons,

(ii) the factors to be taken into account in imposing an administrative monetary penalty,

(iii) the payment of administrative monetary penalties that have been imposed, and

(iv) the recovery, as a debt, of unpaid administrative monetary penalties;

(e) respecting reviews or appeals of findings that a violation has been committed and of the imposition of administrative monetary penalties;

(f) respecting compliance agreements; and

(g) respecting the persons or classes of persons who may exercise any power, or perform any duty or function, in relation to the scheme, including the designation of such persons or classes of persons by the Minister.

Offences

Contravention — sections 6 to 12

30 (1) Every person who contravenes any of sections 6 to 12 is guilty of an offence.

Obstruction or providing false or misleading information

(2) Every person who carries out a regulated activity is guilty of an offence if the person obstructs — or provides false or misleading information to — the Minister, anyone acting on behalf of the Minister or an independent auditor in the exercise of their powers or performance of their duties or functions under this Part.

Punishment

(3) A person who commits an offence under subsection (1) or (2)

(a) is liable, on conviction on indictment,

(i) to a fine of not more than the greater of $10,000,000 and 3% of the person’s gross global revenues in its financial year before the one in which the person is sentenced, in the case of a person who is not an individual, and

(ii) to a fine at the discretion of the court, in the case of an individual; or

(b) is liable, on summary conviction,

(i) to a fine of not more than the greater of $5,000,000 and 2% of the person’s gross global revenues in its financial year before the one in which the person is sentenced, in the case of a person who is not an individual, and

(ii) to a fine of not more than $50,000, in the case of an individual.

Defence of due diligence

(4) A person is not to be found guilty of an offence under subsection (1) or (2) if they establish that they exercised due diligence to prevent the commission of the offence.

Employee, agent or mandatary

(5) It is sufficient proof of an offence under subsection (1) or (2) to establish that it was committed by an employee, agent or mandatary of the accused, whether or not the employee, agent or mandatary is identified or has been prosecuted for the offence, unless the accused establishes that the offence was committed without the knowledge or consent of the accused.

Administration

Designation

31 The Governor in Council may, by order, designate any member of the Queen’s Privy Council for Canada to be the Minister for the purposes of this Part.

General powers of Minister

32 The Minister may

(a) promote public awareness of this Act and provide education with respect to it;

(b) make recommendations and cause to be prepared reports on the establishment of measures to facilitate compliance with this Part; and

(c) establish guidelines with respect to compliance with this Part.

Artificial Intelligence and Data Commissioner

33 (1) The Minister may designate a senior official of the department over which the Minister presides to be called the Artificial Intelligence and Data Commissioner, whose role is to assist the Minister in the administration and enforcement of this Part.

Delegation

(2) The Minister may delegate to the Commissioner any power, duty or function conferred on the Minister under this Part, except the power to make regulations under section 37.

Analysts

34 The Minister may designate any individual or class of individuals as analysts for the administration and enforcement of this Part.

Advisory committee

35 (1) The Minister may establish a committee to provide the Minister with advice on any matters related to this Part.

Advice available to public

(2) The Minister may cause the advice that the committee provides to the Minister to be published on a publicly available website.

Remuneration and expenses

(3) Each committee member is to be paid the remuneration fixed by the Governor in Council and is entitled to the reasonable travel and living expenses that they incur while performing their duties away from their ordinary place of residence.

Regulations — Governor in Council

36 The Governor in Council may make regulations for the purposes of this Part, including regulations

(a) respecting what constitutes or does not constitute justification for the purpose of the definition biased output in subsection 5(1);

(b) establishing criteria for the purpose of the definition high-impact system in subsection 5(1);

(c) respecting the establishment of measures for the purposes of sections 6, 8 and 9;

(d) respecting the assessment for the purposes of section 7;

(e) respecting what constitutes or does not constitute material harm for the purpose of section 12;

(f) prescribing qualifications for the purposes of subsection 15(2); and

(g) prescribing persons and entities for the purpose of paragraph 26(1)‍(f).

Regulations — Minister

37 The Minister may make regulations

(a) respecting the records required to be kept under section 10;

(b) prescribing, for the purposes of subsections 11(1) and (2), the time and the manner in which descriptions are to be published and the information to be included in the descriptions;

(c) respecting the notice required to be provided under section 12; and

(d) respecting the publication of information under section 18.

PART 2 

General Offences Related to Artificial Intelligence Systems

Possession or use of personal information

38 Every person commits an offence if, for the purpose of designing, developing, using or making available for use an artificial intelligence system, the person possesses — within the meaning of subsection 4(3) of the Criminal Code — or uses personal information, knowing or believing that the information is obtained or derived, directly or indirectly, as a result of

(a) the commission in Canada of an offence under an Act of Parliament or a provincial legislature; or

(b) an act or omission anywhere that, if it had occurred in Canada, would have constituted such an offence.

Making system available for use

39 Every person commits an offence if the person

(a) without lawful excuse and knowing that or being reckless as to whether the use of an artificial intelligence system is likely to cause serious physical or psychological harm to an individual or substantial damage to an individual’s property, makes the artificial intelligence system available for use and the use of the system causes such harm or damage; or

(b) with intent to defraud the public and to cause substantial economic loss to an individual, makes an artificial intelligence system available for use and its use causes that loss.

Punishment

40 Every person who commits an offence under section 38 or 39

(a) is liable, on conviction on indictment,

(i) to a fine of not more than the greater of $25,000,000 and 5% of the person’s gross global revenues in its financial year before the one in which the person is sentenced, in the case of a person who is not an individual, and

(ii) to a fine in the discretion of the court or to a term of imprisonment of up to five years less a day, or to both, in the case of an individual; or

(b) is liable, on summary conviction,

(i) to a fine of not more than the greater of $20,000,000 and 4% of the person’s gross global revenues in its financial year before the one in which the person is sentenced, in the case of a person who is not an individual, and

(ii) to a fine of not more than $100,000 or to a term of imprisonment of up to two years less a day, or to both, in the case of an individual.

PART 3 

Coming into Force

Order in council

41 The provisions of this Act come into force on a day or days to be fixed by order of the Governor in Council.

PART 4 

Coming into Force

Order in council

40 This Act, other than sections 2, 35, 36 and 39, comes into force on a day to be fixed by order of the Governor in Council.

SCHEDULE 

(Section 2)

SCHEDULE 

(Subsection 6(3) and paragraph 122(2)‍(c))

Organizations

Column 1

Column 2

Item

Organization

Personal Information

1

World Anti-Doping Agency

Agence mondiale antidopage

Personal information that the organization collects, uses or discloses in the course of its interprovincial or international activities

EXPLANATORY NOTES

Personal Information Protection and Electronic Documents Act

Clause 3: Existing text of the long title:

An Act to support and promote electronic commerce by protecting personal information that is collected, used or disclosed in certain circumstances, by providing for the use of electronic means to communicate or record information or transactions and by amending the Canada Evidence Act, the Statutory Instruments Act and the Statute Revision Act

Clause 4: Existing text of sections 1 to 30:

1 This Act may be cited as the Personal Information Protection and Electronic Documents Act.

PART 1 

Protection of Personal Information in the Private Sector

Interpretation

2 (1) The definitions in this subsection apply in this Part.

alternative format, with respect to personal information, means a format that allows a person with a sensory disability to read or listen to the personal information. (support de substitution)

breach of security safeguards means the loss of, unauthorized access to or unauthorized disclosure of personal information resulting from a breach of an organization’s security safeguards that are referred to in clause 4.‍7 of Schedule 1 or from a failure to establish those safeguards. (atteinte aux mesures de sécurité)

business contact information means any information that is used for the purpose of communicating or facilitating communication with an individual in relation to their employment, business or profession such as the individual’s name, position name or title, work address, work telephone number, work fax number or work electronic address. (coordonnées d’affaires)

business transaction includes

(a) the purchase, sale or other acquisition or disposition of an organization or a part of an organization, or any of its assets;

(b) the merger or amalgamation of two or more organizations;

(c) the making of a loan or provision of other financing to an organization or a part of an organization;

(d) the creating of a charge on, or the taking of a security interest in or a security on, any assets or securities of an organization;

(e) the lease or licensing of any of an organization’s assets; and

(f) any other prescribed arrangement between two or more organizations to conduct a business activity. (transaction commerciale)

commercial activity means any particular transaction, act or conduct or any regular course of conduct that is of a commercial character, including the selling, bartering or leasing of donor, membership or other fundraising lists. (activité commerciale)

Commissioner means the Privacy Commissioner appointed under section 53 of the Privacy Act. (commissaire)

Court means the Federal Court. (Cour)

federal work, undertaking or business means any work, undertaking or business that is within the legislative authority of Parliament. It includes

(a) a work, undertaking or business that is operated or carried on for or in connection with navigation and shipping, whether inland or maritime, including the operation of ships and transportation by ship anywhere in Canada;

(b) a railway, canal, telegraph or other work or undertaking that connects a province with another province, or that extends beyond the limits of a province;

(c) a line of ships that connects a province with another province, or that extends beyond the limits of a province;

(d) a ferry between a province and another province or between a province and a country other than Canada;

(e) aerodromes, aircraft or a line of air transportation;

(f) a radio broadcasting station;

(g) a bank or an authorized foreign bank as defined in section 2 of the Bank Act;

(h) a work that, although wholly situated within a province, is before or after its execution declared by Parliament to be for the general advantage of Canada or for the advantage of two or more provinces;

(i) a work, undertaking or business outside the exclusive legislative authority of the legislatures of the provinces; and

(j) a work, undertaking or business to which federal laws, within the meaning of section 2 of the Oceans Act, apply under section 20 of that Act and any regulations made under paragraph 26(1)‍(k) of that Act. (entreprises fédérales)

organization includes an association, a partnership, a person and a trade union. (organisation)

personal health information, with respect to an individual, whether living or deceased, means

(a) information concerning the physical or mental health of the individual;

(b) information concerning any health service provided to the individual;

(c) information concerning the donation by the individual of any body part or any bodily substance of the individual or information derived from the testing or examination of a body part or bodily substance of the individual;

(d) information that is collected in the course of providing health services to the individual; or

(e) information that is collected incidentally to the provision of health services to the individual. (renseignement personnel sur la santé)

personal information means information about an identifiable individual. (renseignement personnel)

prescribed means prescribed by regulation. (Version anglaise seulement)

record includes any correspondence, memorandum, book, plan, map, drawing, diagram, pictorial or graphic work, photograph, film, microform, sound recording, videotape, machine-readable record and any other documentary material, regardless of physical form or characteristics, and any copy of any of those things. (document)

(2) In this Part, a reference to clause 4.‍3 or 4.‍9 of Schedule 1 does not include a reference to the note that accompanies that clause.

Purpose

3 The purpose of this Part is to establish, in an era in which technology increasingly facilitates the circulation and exchange of information, rules to govern the collection, use and disclosure of personal information in a manner that recognizes the right of privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances.

Application

4 (1) This Part applies to every organization in respect of personal information that

(a) the organization collects, uses or discloses in the course of commercial activities; or

(b) is about an employee of, or an applicant for employment with, the organization and that the organization collects, uses or discloses in connection with the operation of a federal work, undertaking or business.

(1.‍1) This Part applies to an organization set out in column 1 of Schedule 4 in respect of personal information set out in column 2.

(2) This Part does not apply to

(a) any government institution to which the Privacy Act applies;

(b) any individual in respect of personal information that the individual collects, uses or discloses for personal or domestic purposes and does not collect, use or disclose for any other purpose; or

(c) any organization in respect of personal information that the organization collects, uses or discloses for journalistic, artistic or literary purposes and does not collect, use or disclose for any other purpose.

*(3) Every provision of this Part applies despite any provision, enacted after this subsection comes into force, of any other Act of Parliament, unless the other Act expressly declares that that provision operates despite the provision of this Part.

* [Note: Subsection 4(3) in force January 1, 2001, see SI/2000-29.‍]

4.‍01 This Part does not apply to an organization in respect of the business contact information of an individual that the organization collects, uses or discloses solely for the purpose of communicating or facilitating communication with the individual in relation to their employment, business or profession.

4.‍1 (1) Where a certificate under section 38.‍13 of the Canada Evidence Act prohibiting the disclosure of personal information of a specific individual is issued before a complaint is filed by that individual under this Part in respect of a request for access to that information, the provisions of this Part respecting that individual’s right of access to his or her personal information do not apply to the information that is subject to the certificate.

(2) Notwithstanding any other provision of this Part, where a certificate under section 38.‍13 of the Canada Evidence Act prohibiting the disclosure of personal information of a specific individual is issued after the filing of a complaint under this Part in relation to a request for access to that information:

(a) all proceedings under this Part in respect of that information, including an investigation, audit, appeal or judicial review, are discontinued;

(b) the Commissioner shall not disclose the information and shall take all necessary precautions to prevent its disclosure; and

(c) the Commissioner shall, within 10 days after the certificate is published in the Canada Gazette, return the information to the organization that provided the information.

(3) The Commissioner and every person acting on behalf or under the direction of the Commissioner, in carrying out their functions under this Part, shall not disclose information subject to a certificate issued under section 38.‍13 of the Canada Evidence Act, and shall take every reasonable precaution to avoid the disclosure of that information.

(4) The Commissioner may not delegate the investigation of any complaint relating to information subject to a certificate issued under section 38.‍13 of the Canada Evidence Act except to one of a maximum of four officers or employees of the Commissioner specifically designated by the Commissioner for the purpose of conducting that investigation.

DIVISION 1 

Protection of Personal Information

5 (1) Subject to sections 6 to 9, every organization shall comply with the obligations set out in Schedule 1.

(2) The word should, when used in Schedule 1, indicates a recommendation and does not impose an obligation.

(3) An organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances.

6 The designation of an individual under clause 4.‍1 of Schedule 1 does not relieve the organization of the obligation to comply with the obligations set out in that Schedule.

6.‍1 For the purposes of clause 4.‍3 of Schedule 1, the consent of an individual is only valid if it is reasonable to expect that an individual to whom the organization’s activities are directed would understand the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting.

7 (1) For the purpose of clause 4.‍3 of Schedule 1, and despite the note that accompanies that clause, an organization may collect personal information without the knowledge or consent of the individual only if

(a) the collection is clearly in the interests of the individual and consent cannot be obtained in a timely way;

(b) it is reasonable to expect that the collection with the knowledge or consent of the individual would compromise the availability or the accuracy of the information and the collection is reasonable for purposes related to investigating a breach of an agreement or a contravention of the laws of Canada or a province;

(b.‍1) it is contained in a witness statement and the collection is necessary to assess, process or settle an insurance claim;

(b.‍2) it was produced by the individual in the course of their employment, business or profession and the collection is consistent with the purposes for which the information was produced;

(c) the collection is solely for journalistic, artistic or literary purposes;

(d) the information is publicly available and is specified by the regulations; or

(e) the collection is made for the purpose of making a disclosure

(i) under subparagraph (3)‍(c.‍1)‍(i) or (d)‍(ii), or

(ii) that is required by law.

(2) For the purpose of clause 4.‍3 of Schedule 1, and despite the note that accompanies that clause, an organization may, without the knowledge or consent of the individual, use personal information only if

(a) in the course of its activities, the organization becomes aware of information that it has reasonable grounds to believe could be useful in the investigation of a contravention of the laws of Canada, a province or a foreign jurisdiction that has been, is being or is about to be committed, and the information is used for the purpose of investigating that contravention;

(b) it is used for the purpose of acting in respect of an emergency that threatens the life, health or security of an individual;

(b.‍1) the information is contained in a witness statement and the use is necessary to assess, process or settle an insurance claim;

(b.‍2) the information was produced by the individual in the course of their employment, business or profession and the use is consistent with the purposes for which the information was produced;

(c) it is used for statistical, or scholarly study or research, purposes that cannot be achieved without using the information, the information is used in a manner that will ensure its confidentiality, it is impracticable to obtain consent and the organization informs the Commissioner of the use before the information is used;

(c.‍1) it is publicly available and is specified by the regulations; or

(d) it was collected under paragraph (1)‍(a), (b) or (e).

(3) For the purpose of clause 4.‍3 of Schedule 1, and despite the note that accompanies that clause, an organization may disclose personal information without the knowledge or consent of the individual only if the disclosure is

(a) made to, in the Province of Quebec, an advocate or notary or, in any other province, a barrister or solicitor who is representing the organization;

(b) for the purpose of collecting a debt owed by the individual to the organization;

(c) required to comply with a subpoena or warrant issued or an order made by a court, person or body with jurisdiction to compel the production of information, or to comply with rules of court relating to the production of records;

(c.‍1) made to a government institution or part of a government institution that has made a request for the information, identified its lawful authority to obtain the information and indicated that

(i) it suspects that the information relates to national security, the defence of Canada or the conduct of international affairs,

(ii) the disclosure is requested for the purpose of enforcing any law of Canada, a province or a foreign jurisdiction, carrying out an investigation relating to the enforcement of any such law or gathering intelligence for the purpose of enforcing any such law,

(iii) the disclosure is requested for the purpose of administering any law of Canada or a province, or

(iv) the disclosure is requested for the purpose of communicating with the next of kin or authorized representative of an injured, ill or deceased individual;

(c.‍2) made to the government institution mentioned in section 7 of the Proceeds of Crime (Money Laundering) and Terrorist Financing Act as required by that section;

(d) made on the initiative of the organization to a government institution or a part of a government institution and the organization

(i) has reasonable grounds to believe that the information relates to a contravention of the laws of Canada, a province or a foreign jurisdiction that has been, is being or is about to be committed, or

(ii) suspects that the information relates to national security, the defence of Canada or the conduct of international affairs;

(d.‍1) made to another organization and is reasonable for the purposes of investigating a breach of an agreement or a contravention of the laws of Canada or a province that has been, is being or is about to be committed and it is reasonable to expect that disclosure with the knowledge or consent of the individual would compromise the investigation;

(d.‍2) made to another organization and is reasonable for the purposes of detecting or suppressing fraud or of preventing fraud that is likely to be committed and it is reasonable to expect that the disclosure with the knowledge or consent of the individual would compromise the ability to prevent, detect or suppress the fraud;

(d.‍3) made on the initiative of the organization to a government institution, a part of a government institution or the individual’s next of kin or authorized representative and

(i) the organization has reasonable grounds to believe that the individual has been, is or may be the victim of financial abuse,

(ii) the disclosure is made solely for purposes related to preventing or investigating the abuse, and

(iii) it is reasonable to expect that disclosure with the knowledge or consent of the individual would compromise the ability to prevent or investigate the abuse;

(d.‍4) necessary to identify the individual who is injured, ill or deceased, made to a government institution, a part of a government institution or the individual’s next of kin or authorized representative and, if the individual is alive, the organization informs that individual in writing without delay of the disclosure;

(e) made to a person who needs the information because of an emergency that threatens the life, health or security of an individual and, if the individual whom the information is about is alive, the organization informs that individual in writing without delay of the disclosure;

(e.‍1) of information that is contained in a witness statement and the disclosure is necessary to assess, process or settle an insurance claim;

(e.‍2) of information that was produced by the individual in the course of their employment, business or profession and the disclosure is consistent with the purposes for which the information was produced;

(f) for statistical, or scholarly study or research, purposes that cannot be achieved without disclosing the information, it is impracticable to obtain consent and the organization informs the Commissioner of the disclosure before the information is disclosed;

(g) made to an institution whose functions include the conservation of records of historic or archival importance, and the disclosure is made for the purpose of such conservation;

(h) made after the earlier of

(i) one hundred years after the record containing the information was created, and

(ii) twenty years after the death of the individual whom the information is about;

(h.‍1) of information that is publicly available and is specified by the regulations; or

(h.‍2) [Repealed, 2015, c. 32, s. 6]

(i) required by law.

(4) Despite clause 4.‍5 of Schedule 1, an organization may use personal information for purposes other than those for which it was collected in any of the circumstances set out in subsection (2).

(5) Despite clause 4.‍5 of Schedule 1, an organization may disclose personal information for purposes other than those for which it was collected in any of the circumstances set out in paragraphs (3)‍(a) to (h.‍1).

7.‍1 (1) The following definitions apply in this section.

access means to program, to execute programs on, to communicate with, to store data in, to retrieve data from, or to otherwise make use of any resources, including data or programs on a computer system or a computer network. (utiliser)

computer program has the same meaning as in subsection 342.‍1(2) of the Criminal Code. (programme d’ordinateur)

computer system has the same meaning as in subsection 342.‍1(2) of the Criminal Code. (ordinateur)

electronic address means an address used in connection with

(a) an electronic mail account;

(b) an instant messaging account; or

(c) any similar account. (adresse électronique)

(2) Paragraphs 7(1)‍(a) and (b.‍1) to (d) and (2)‍(a) to (c.‍1) and the exception set out in clause 4.‍3 of Schedule 1 do not apply in respect of

(a) the collection of an individual’s electronic address, if the address is collected by the use of a computer program that is designed or marketed primarily for use in generating or searching for, and collecting, electronic addresses; or

(b) the use of an individual’s electronic address, if the address is collected by the use of a computer program described in paragraph (a).

(3) Paragraphs 7(1)‍(a) to (d) and (2)‍(a) to (c.‍1) and the exception set out in clause 4.‍3 of Schedule 1 do not apply in respect of

(a) the collection of personal information, through any means of telecommunication, if the collection is made by accessing a computer system or causing a computer system to be accessed in contravention of an Act of Parliament; or

(b) the use of personal information that is collected in a manner described in paragraph (a).

7.‍2 (1) In addition to the circumstances set out in subsections 7(2) and (3), for the purpose of clause 4.‍3 of Schedule 1, and despite the note that accompanies that clause, organizations that are parties to a prospective business transaction may use and disclose personal information without the knowledge or consent of the individual if

(a) the organizations have entered into an agreement that requires the organization that receives the personal information

(i) to use and disclose that information solely for purposes related to the transaction,

(ii) to protect that information by security safeguards appropriate to the sensitivity of the information, and

(iii) if the transaction does not proceed, to return that information to the organization that disclosed it, or destroy it, within a reasonable time; and

(b) the personal information is necessary

(i) to determine whether to proceed with the transaction, and

(ii) if the determination is made to proceed with the transaction, to complete it.

(2) In addition to the circumstances set out in subsections 7(2) and (3), for the purpose of clause 4.‍3 of Schedule 1, and despite the note that accompanies that clause, if the business transaction is completed, organizations that are parties to the transaction may use and disclose personal information, which was disclosed under subsection (1), without the knowledge or consent of the individual if

(a) the organizations have entered into an agreement that requires each of them

(i) to use and disclose the personal information under its control solely for the purposes for which the personal information was collected, permitted to be used or disclosed before the transaction was completed,

(ii) to protect that information by security safeguards appropriate to the sensitivity of the information, and

(iii) to give effect to any withdrawal of consent made under clause 4.‍3.‍8 of Schedule 1;

(b) the personal information is necessary for carrying on the business or activity that was the object of the transaction; and

(c) one of the parties notifies the individual, within a reasonable time after the transaction is completed, that the transaction has been completed and that their personal information has been disclosed under subsection (1).

(3) An organization shall comply with the terms of any agreement into which it enters under paragraph (1)‍(a) or (2)‍(a).

(4) Subsections (1) and (2) do not apply to a business transaction of which the primary purpose or result is the purchase, sale or other acquisition or disposition, or lease, of personal information.

7.‍3 In addition to the circumstances set out in section 7, for the purpose of clause 4.‍3 of Schedule 1, and despite the note that accompanies that clause, a federal work, undertaking or business may collect, use and disclose personal information without the consent of the individual if

(a) the collection, use or disclosure is necessary to establish, manage or terminate an employment relationship between the federal work, undertaking or business and the individual; and

(b) the federal work, undertaking or business has informed the individual that the personal information will be or may be collected, used or disclosed for those purposes.

7.‍4 (1) Despite clause 4.‍5 of Schedule 1, an organization may use personal information for purposes other than those for which it was collected in any of the circumstances set out in subsection 7.‍2(1) or (2) or section 7.‍3.

(2) Despite clause 4.‍5 of Schedule 1, an organization may disclose personal information for purposes other than those for which it was collected in any of the circumstances set out in subsection 7.‍2(1) or (2) or section 7.‍3.

8 (1) A request under clause 4.‍9 of Schedule 1 must be made in writing.

(2) An organization shall assist any individual who informs the organization that they need assistance in preparing a request to the organization.

(3) An organization shall respond to a request with due diligence and in any case not later than thirty days after receipt of the request.

(4) An organization may extend the time limit

(a) for a maximum of thirty days if

(i) meeting the time limit would unreasonably interfere with the activities of the organization, or

(ii) the time required to undertake any consultations necessary to respond to the request would make the time limit impracticable to meet; or

(b) for the period that is necessary in order to be able to convert the personal information into an alternative format.

In either case, the organization shall, no later than thirty days after the date of the request, send a notice of extension to the individual, advising them of the new time limit, the reasons for extending the time limit and of their right to make a complaint to the Commissioner in respect of the extension.

(5) If the organization fails to respond within the time limit, the organization is deemed to have refused the request.

(6) An organization may respond to an individual’s request at a cost to the individual only if

(a) the organization has informed the individual of the approximate cost; and

(b) the individual has advised the organization that the request is not being withdrawn.

(7) An organization that responds within the time limit and refuses a request shall inform the individual in writing of the refusal, setting out the reasons and any recourse that they may have under this Part.

(8) Despite clause 4.‍5 of Schedule 1, an organization that has personal information that is the subject of a request shall retain the information for as long as is necessary to allow the individual to exhaust any recourse under this Part that they may have.

9 (1) Despite clause 4.‍9 of Schedule 1, an organization shall not give an individual access to personal information if doing so would likely reveal personal information about a third party. However, if the information about the third party is severable from the record containing the information about the individual, the organization shall sever the information about the third party before giving the individual access.

(2) Subsection (1) does not apply if the third party consents to the access or the individual needs the information because an individual’s life, health or security is threatened.

(2.‍1) An organization shall comply with subsection (2.‍2) if an individual requests that the organization

(a) inform the individual about

(i) any disclosure of information to a government institution or a part of a government institution under paragraph 7(3)‍(c), subparagraph 7(3)‍(c.‍1)‍(i) or (ii) or paragraph 7(3)‍(c.‍2) or (d), or

(ii) the existence of any information that the organization has relating to a disclosure referred to in subparagraph (i), to a subpoena, warrant or order referred to in paragraph 7(3)‍(c) or to a request made by a government institution or a part of a government institution under subparagraph 7(3)‍(c.‍1)‍(i) or (ii); or

(b) give the individual access to the information referred to in subparagraph (a)‍(ii).

(2.‍2) An organization to which subsection (2.‍1) applies

(a) shall, in writing and without delay, notify the institution or part concerned of the request made by the individual; and

(b) shall not respond to the request before the earlier of

(i) the day on which it is notified under subsection (2.‍3), and

(ii) thirty days after the day on which the institution or part was notified.

(2.‍3) Within thirty days after the day on which it is notified under subsection (2.‍2), the institution or part shall notify the organization whether or not the institution or part objects to the organization complying with the request. The institution or part may object only if the institution or part is of the opinion that compliance with the request could reasonably be expected to be injurious to

(a) national security, the defence of Canada or the conduct of international affairs;

(a.‍1) the detection, prevention or deterrence of money laundering or the financing of terrorist activities; or

(b) the enforcement of any law of Canada, a province or a foreign jurisdiction, an investigation relating to the enforcement of any such law or the gathering of intelligence for the purpose of enforcing any such law.

(2.‍4) Despite clause 4.‍9 of Schedule 1, if an organization is notified under subsection (2.‍3) that the institution or part objects to the organization complying with the request, the organization

(a) shall refuse the request to the extent that it relates to paragraph (2.‍1)‍(a) or to information referred to in subparagraph (2.‍1)‍(a)‍(ii);

(b) shall notify the Commissioner, in writing and without delay, of the refusal; and

(c) shall not disclose to the individual

(i) any information that the organization has relating to a disclosure to a government institution or a part of a government institution under paragraph 7(3)‍(c), subparagraph 7(3)‍(c.‍1)‍(i) or (ii) or paragraph 7(3)‍(c.‍2) or (d) or to a request made by a government institution under either of those subparagraphs,

(ii) that the organization notified an institution or part under paragraph (2.‍2)‍(a) or the Commissioner under paragraph (b), or

(iii) that the institution or part objects.

(3) Despite the note that accompanies clause 4.‍9 of Schedule 1, an organization is not required to give access to personal information only if

(a) the information is protected by solicitor-client privilege or the professional secrecy of advocates and notaries or by litigation privilege;

(b) to do so would reveal confidential commercial information;

(c) to do so could reasonably be expected to threaten the life or security of another individual;

(c.‍1) the information was collected under paragraph 7(1)‍(b);

(d) the information was generated in the course of a formal dispute resolution process; or

(e) the information was created for the purpose of making a disclosure under the Public Servants Disclosure Protection Act or in the course of an investigation into a disclosure under that Act.

However, in the circumstances described in paragraph (b) or (c), if giving access to the information would reveal confidential commercial information or could reasonably be expected to threaten the life or security of another individual, as the case may be, and that information is severable from the record containing any other information for which access is requested, the organization shall give the individual access after severing.

(4) Subsection (3) does not apply if the individual needs the information because an individual’s life, health or security is threatened.

(5) If an organization decides not to give access to personal information in the circumstances set out in paragraph (3)‍(c.‍1), the organization shall, in writing, so notify the Commissioner, and shall include in the notification any information that the Commissioner may specify.

10 An organization shall give access to personal information in an alternative format to an individual with a sensory disability who has a right of access to personal information under this Part and who requests that it be transmitted in the alternative format if

(a) a version of the information already exists in that format; or

(b) its conversion into that format is reasonable and necessary in order for the individual to be able to exercise rights under this Part.

DIVISION 1.‍1 

Breaches of Security Safeguards

10.‍1 (1) An organization shall report to the Commissioner any breach of security safeguards involving personal information under its control if it is reasonable in the circumstances to believe that the breach creates a real risk of significant harm to an individual.

(2) The report shall contain the prescribed information and shall be made in the prescribed form and manner as soon as feasible after the organization determines that the breach has occurred.

(3) Unless otherwise prohibited by law, an organization shall notify an individual of any breach of security safeguards involving the individual’s personal information under the organization’s control if it is reasonable in the circumstances to believe that the breach creates a real risk of significant harm to the individual.

(4) The notification shall contain sufficient information to allow the individual to understand the significance to them of the breach and to take steps, if any are possible, to reduce the risk of harm that could result from it or to mitigate that harm. It shall also contain any other prescribed information.

(5) The notification shall be conspicuous and shall be given directly to the individual in the prescribed form and manner, except in prescribed circumstances, in which case it shall be given indirectly in the prescribed form and manner.

(6) The notification shall be given as soon as feasible after the organization determines that the breach has occurred.

(7) For the purpose of this section, significant harm includes bodily harm, humiliation, damage to reputation or relationships, loss of employment, business or professional opportunities, financial loss, identity theft, negative effects on the credit record and damage to or loss of property.

(8) The factors that are relevant to determining whether a breach of security safeguards creates a real risk of significant harm to the individual include

(a) the sensitivity of the personal information involved in the breach;

(b) the probability that the personal information has been, is being or will be misused; and

(c) any other prescribed factor.

10.‍2 (1) An organization that notifies an individual of a breach of security safeguards under subsection 10.‍1(3) shall notify any other organization, a government institution or a part of a government institution of the breach if the notifying organization believes that the other organization or the government institution or part concerned may be able to reduce the risk of harm that could result from it or mitigate that harm, or if any of the prescribed conditions are satisfied.

(2) The notification shall be given as soon as feasible after the organization determines that the breach has occurred.

(3) In addition to the circumstances set out in subsection 7(3), for the purpose of clause 4.‍3 of Schedule 1, and despite the note that accompanies that clause, an organization may disclose personal information without the knowledge or consent of the individual if

(a) the disclosure is made to the other organization, the government institution or the part of a government institution that was notified of the breach under subsection (1); and

(b) the disclosure is made solely for the purposes of reducing the risk of harm to the individual that could result from the breach or mitigating that harm.

(4) Despite clause 4.‍5 of Schedule 1, an organization may disclose personal information for purposes other than those for which it was collected in the circumstance set out in subsection (3).

10.‍3 (1) An organization shall, in accordance with any prescribed requirements, keep and maintain a record of every breach of security safeguards involving personal information under its control.

(2) An organization shall, on request, provide the Commissioner with access to, or a copy of, a record.

DIVISION 2 

Remedies

Filing of Complaints

11 (1) An individual may file with the Commissioner a written complaint against an organization for contravening a provision of Division 1 or 1.‍1 or for not following a recommendation set out in Schedule 1.

(2) If the Commissioner is satisfied that there are reasonable grounds to investigate a matter under this Part, the Commissioner may initiate a complaint in respect of the matter.

(3) A complaint that results from the refusal to grant a request under section 8 must be filed within six months, or any longer period that the Commissioner allows, after the refusal or after the expiry of the time limit for responding to the request, as the case may be.

(4) The Commissioner shall give notice of a complaint to the organization against which the complaint was made.

Investigations of Complaints

12 (1) The Commissioner shall conduct an investigation in respect of a complaint, unless the Commissioner is of the opinion that

(a) the complainant ought first to exhaust grievance or review procedures otherwise reasonably available;

(b) the complaint could more appropriately be dealt with, initially or completely, by means of a procedure provided for under the laws of Canada, other than this Part, or the laws of a province; or

(c) the complaint was not filed within a reasonable period after the day on which the subject matter of the complaint arose.

(2) Despite subsection (1), the Commissioner is not required to conduct an investigation in respect of an act alleged in a complaint if the Commissioner is of the opinion that the act, if proved, would constitute a contravention of any of sections 6 to 9 of An Act to promote the efficiency and adaptability of the Canadian economy by regulating certain activities that discourage reliance on electronic means of carrying out commercial activities, and to amend the Canadian Radio-television and Telecommunications Commission Act, the Competition Act, the Personal Information Protection and Electronic Documents Act and the Telecommunications Act or section 52.‍01 of the Competition Act or would constitute conduct that is reviewable under section 74.‍011 of that Act.

(3) The Commissioner shall notify the complainant and the organization that the Commissioner will not investigate the complaint or any act alleged in the complaint and give reasons.

(4) The Commissioner may reconsider a decision not to investigate under subsection (1), if the Commissioner is satisfied that the complainant has established that there are compelling reasons to investigate.

12.‍1 (1) In the conduct of an investigation of a complaint, the Commissioner may

(a) summon and enforce the appearance of persons before the Commissioner and compel them to give oral or written evidence on oath and to produce any records and things that the Commissioner considers necessary to investigate the complaint, in the same manner and to the same extent as a superior court of record;

(b) administer oaths;

(c) receive and accept any evidence and other information, whether on oath, by affidavit or otherwise, that the Commissioner sees fit, whether or not it is or would be admissible in a court of law;

(d) at any reasonable time, enter any premises, other than a dwelling-house, occupied by an organization on satisfying any security requirements of the organization relating to the premises;

(e) converse in private with any person in any premises entered under paragraph (d) and otherwise carry out in those premises any inquiries that the Commissioner sees fit; and

(f) examine or obtain copies of or extracts from records found in any premises entered under paragraph (d) that contain any matter relevant to the investigation.

(2) The Commissioner may attempt to resolve complaints by means of dispute resolution mechanisms such as mediation and conciliation.

(3) The Commissioner may delegate any of the powers set out in subsection (1) or (2).

(4) The Commissioner or the delegate shall return to a person or an organization any record or thing that they produced under this section within 10 days after they make a request to the Commissioner or the delegate, but nothing precludes the Commissioner or the delegate from again requiring that the record or thing be produced.

(5) Any person to whom powers set out in subsection (1) are delegated shall be given a certificate of the delegation and the delegate shall produce the certificate, on request, to the person in charge of any premises to be entered under paragraph (1)‍(d).

Discontinuance of Investigation

12.‍2 (1) The Commissioner may discontinue the investigation of a complaint if the Commissioner is of the opinion that

(a) there is insufficient evidence to pursue the investigation;

(b) the complaint is trivial, frivolous or vexatious or is made in bad faith;

(c) the organization has provided a fair and reasonable response to the complaint;

(c.‍1) the matter is the object of a compliance agreement entered into under subsection 17.‍1(1);

(d) the matter is already the object of an ongoing investigation under this Part;

(e) the matter has already been the subject of a report by the Commissioner;

(f) any of the circumstances mentioned in paragraph 12(1)‍(a), (b) or (c) apply; or

(g) the matter is being or has already been addressed under a procedure referred to in paragraph 12(1)‍(a) or (b).

(2) The Commissioner may discontinue an investigation in respect of an act alleged in a complaint if the Commissioner is of the opinion that the act, if proved, would constitute a contravention of any of sections 6 to 9 of An Act to promote the efficiency and adaptability of the Canadian economy by regulating certain activities that discourage reliance on electronic means of carrying out commercial activities, and to amend the Canadian Radio-television and Telecommunications Commission Act, the Competition Act, the Personal Information Protection and Electronic Documents Act and the Telecommunications Act or section 52.‍01 of the Competition Act or would constitute conduct that is reviewable under section 74.‍011 of that Act.

(3) The Commissioner shall notify the complainant and the organization that the investigation has been discontinued and give reasons.

Commissioner’s Report

13 (1) The Commissioner shall, within one year after the day on which a complaint is filed or is initiated by the Commissioner, prepare a report that contains

(a) the Commissioner’s findings and recommendations;

(b) any settlement that was reached by the parties;

(c) if appropriate, a request that the organization give the Commissioner, within a specified time, notice of any action taken or proposed to be taken to implement the recommendations contained in the report or reasons why no such action has been or is proposed to be taken; and

(d) the recourse, if any, that is available under section 14.

(2) [Repealed, 2010, c. 23, s. 84]

(3) The report shall be sent to the complainant and the organization without delay.

Hearing by Court

14 (1) A complainant may, after receiving the Commissioner’s report or being notified under subsection 12.‍2(3) that the investigation of the complaint has been discontinued, apply to the Court for a hearing in respect of any matter in respect of which the complaint was made, or that is referred to in the Commissioner’s report, and that is referred to in clause 4.‍1.‍3, 4.‍2, 4.‍3.‍3, 4.‍4, 4.‍6, 4.‍7 or 4.‍8 of Schedule 1, in clause 4.‍3, 4.‍5 or 4.‍9 of that Schedule as modified or clarified by Division 1 or 1.‍1, in subsection 5(3) or 8(6) or (7), in section 10 or in Division 1.‍1.

(2) A complainant shall make an application within one year after the report or notification is sent or within any longer period that the Court may, either before or after the expiry of that year, allow.

(3) For greater certainty, subsections (1) and (2) apply in the same manner to complaints referred to in subsection 11(2) as to complaints referred to in subsection 11(1).

15 The Commissioner may, in respect of a complaint that the Commissioner did not initiate,

(a) apply to the Court, within the time limited by section 14, for a hearing in respect of any matter described in that section, if the Commissioner has the consent of the complainant;

(b) appear before the Court on behalf of any complainant who has applied for a hearing under section 14; or

(c) with leave of the Court, appear as a party to any hearing applied for under section 14.

16 The Court may, in addition to any other remedies it may give,

(a) order an organization to correct its practices in order to comply with Divisions 1 and 1.‍1;

(b) order an organization to publish a notice of any action taken or proposed to be taken to correct its practices, whether or not ordered to correct them under paragraph (a); and

(c) award damages to the complainant, including damages for any humiliation that the complainant has suffered.

17 (1) An application made under section 14 or 15 shall be heard and determined without delay and in a summary way unless the Court considers it inappropriate to do so.

(2) In any proceedings arising from an application made under section 14 or 15, the Court shall take every reasonable precaution, including, when appropriate, receiving representations ex parte and conducting hearings in camera, to avoid the disclosure by the Court or any person of any information or other material that the organization would be authorized to refuse to disclose if it were requested under clause 4.‍9 of Schedule 1.

Compliance Agreements

17.‍1 (1) If the Commissioner believes on reasonable grounds that an organization has committed, is about to commit or is likely to commit an act or omission that could constitute a contravention of a provision of Division 1 or 1.‍1 or a failure to follow a recommendation set out in Schedule 1, the Commissioner may enter into a compliance agreement, aimed at ensuring compliance with this Part, with that organization.

(2) A compliance agreement may contain any terms that the Commissioner considers necessary to ensure compliance with this Part.

(3) When a compliance agreement is entered into, the Commissioner, in respect of any matter covered under the agreement,

(a) shall not apply to the Court for a hearing under subsection 14(1) or paragraph 15(a); and

(b) shall apply to the court for the suspension of any pending applications that were made by the Commissioner under those provisions.

(4) For greater certainty, a compliance agreement does not preclude

(a) an individual from applying for a hearing under section 14; or

(b) the prosecution of an offence under the Act.

17.‍2 (1) If the Commissioner is of the opinion that a compliance agreement has been complied with, the Commissioner shall provide written notice to that effect to the organization and withdraw any applications that were made under subsection 14(1) or paragraph 15(a) in respect of any matter covered under the agreement.

(2) If the Commissioner is of the opinion that an organization is not complying with the terms of a compliance agreement, the Commissioner shall notify the organization and may apply to the Court for

(a) an order requiring the organization to comply with the terms of the agreement, in addition to any other remedies it may give; or

(b) a hearing under subsection 14(1) or paragraph 15(a) or to reinstate proceedings that have been suspended as a result of an application made under paragraph 17.‍1(3)‍(b).

(3) Despite subsection 14(2), the application shall be made within one year after notification is sent or within any longer period that the Court may, either before or after the expiry of that year, allow.

DIVISION 3 

Audits

18 (1) The Commissioner may, on reasonable notice and at any reasonable time, audit the personal information management practices of an organization if the Commissioner has reasonable grounds to believe that the organization has contravened a provision of Division 1 or 1.‍1 or is not following a recommendation set out in Schedule 1, and for that purpose may

(a) summon and enforce the appearance of persons before the Commissioner and compel them to give oral or written evidence on oath and to produce any records and things that the Commissioner considers necessary for the audit, in the same manner and to the same extent as a superior court of record;

(b) administer oaths;

(c) receive and accept any evidence and other information, whether on oath, by affidavit or otherwise, that the Commissioner sees fit, whether or not it is or would be admissible in a court of law;

(d) at any reasonable time, enter any premises, other than a dwelling-house, occupied by the organization on satisfying any security requirements of the organization relating to the premises;

(e) converse in private with any person in any premises entered under paragraph (d) and otherwise carry out in those premises any inquiries that the Commissioner sees fit; and

(f) examine or obtain copies of or extracts from records found in any premises entered under paragraph (d) that contain any matter relevant to the audit.

(2) The Commissioner may delegate any of the powers set out in subsection (1).

(3) The Commissioner or the delegate shall return to a person or an organization any record or thing they produced under this section within ten days after they make a request to the Commissioner or the delegate, but nothing precludes the Commissioner or the delegate from again requiring that the record or thing be produced.

(4) Any person to whom powers set out in subsection (1) are delegated shall be given a certificate of the delegation and the delegate shall produce the certificate, on request, to the person in charge of any premises to be entered under paragraph (1)‍(d).

19 (1) After an audit, the Commissioner shall provide the audited organization with a report that contains the findings of the audit and any recommendations that the Commissioner considers appropriate.

(2) The report may be included in a report made under section 25.

DIVISION 4 

General

20 (1) Subject to subsections (2) to (7), 12(3), 12.‍2(3), 13(3), 19(1), 23(3) and 23.‍1(1) and section 25, the Commissioner or any person acting on behalf or under the direction of the Commissioner shall not disclose any information that comes to their knowledge as a result of the performance or exercise of any of the Commissioner’s duties or powers under this Part other than those referred to in subsection 10.‍1(1) or 10.‍3(2).

(1.‍1) Subject to subsections (2) to (7), 12(3), 12.‍2(3), 13(3), 19(1), 23(3) and 23.‍1(1) and section 25, the Commissioner or any person acting on behalf or under the direction of the Commissioner shall not disclose any information contained in a report made under subsection 10.‍1(1) or in a record obtained under subsection 10.‍3(2).

(2) The Commissioner may, if the Commissioner considers that it is in the public interest to do so, make public any information that comes to his or her knowledge in the performance or exercise of any of his or her duties or powers under this Part.

(3) The Commissioner may disclose, or may authorize any person acting on behalf or under the direction of the Commissioner to disclose, information that in the Commissioner’s opinion is necessary to

(a) conduct an investigation or audit under this Part; or

(b) establish the grounds for findings and recommendations contained in any report under this Part.

(4) The Commissioner may disclose, or may authorize any person acting on behalf or under the direction of the Commissioner to disclose, information in the course of

(a) a prosecution for an offence under section 28;

(b) a prosecution for an offence under section 132 of the Criminal Code (perjury) in respect of a statement made under this Part;

(c) a hearing before the Court under this Part;

(d) an appeal from a decision of the Court; or

(e) a judicial review in relation to the performance or exercise of any of the Commissioner’s duties or powers under this Part.

(5) The Commissioner may disclose to the Attorney General of Canada or of a province, as the case may be, information relating to the commission of an offence against any law of Canada or a province on the part of an officer or employee of an organization if, in the Commissioner’s opinion, there is evidence of an offence.

(6) The Commissioner may disclose, or may authorize any person acting on behalf or under the direction of the Commissioner to disclose to a government institution or a part of a government institution, any information contained in a report made under subsection 10.‍1(1) or in a record obtained under subsection 10.‍3(2) if the Commissioner has reasonable grounds to believe that the information could be useful in the investigation of a contravention of the laws of Canada or a province that has been, is being or is about to be committed.

(7) The Commissioner may disclose information, or may authorize any person acting on behalf or under the direction of the Commissioner to disclose information, in the course of proceedings in which the Commissioner has intervened under paragraph 50(c) of An Act to promote the efficiency and adaptability of the Canadian economy by regulating certain activities that discourage reliance on electronic means of carrying out commercial activities, and to amend the Canadian Radio-television and Telecommunications Commission Act, the Competition Act, the Personal Information Protection and Electronic Documents Act and the Telecommunications Act or in accordance with subsection 58(3) or 60(1) of that Act.

21 The Commissioner or person acting on behalf or under the direction of the Commissioner is not a competent witness in respect of any matter that comes to their knowledge as a result of the performance or exercise of any of the Commissioner’s duties or powers under this Part in any proceeding other than

(a) a prosecution for an offence under section 28;

(b) a prosecution for an offence under section 132 of the Criminal Code (perjury) in respect of a statement made under this Part;

(c) a hearing before the Court under this Part; or

(d) an appeal from a decision of the Court.

22 (1) No criminal or civil proceedings lie against the Commissioner, or against any person acting on behalf or under the direction of the Commissioner, for anything done, reported or said in good faith as a result of the performance or exercise or purported performance or exercise of any duty or power of the Commissioner under this Part.

(2) No action lies in defamation with respect to

(a) anything said, any information supplied or any record or thing produced in good faith in the course of an investigation or audit carried out by or on behalf of the Commissioner under this Part; and

(b) any report made in good faith by the Commissioner under this Part and any fair and accurate account of the report made in good faith for the purpose of news reporting.

23 (1) If the Commissioner considers it appropriate to do so, or on the request of an interested person, the Commissioner may, in order to ensure that personal information is protected in as consistent a manner as possible, consult with any person who, under provincial legislation, has functions and duties similar to those of the Commissioner with respect to the protection of such information.

(2) The Commissioner may enter into agreements or arrangements with any person referred to in subsection (1) in order to

(a) coordinate the activities of their offices and the office of the Commissioner, including to provide for mechanisms for the handling of any complaint in which they are mutually interested;

(b) undertake and publish research or develop and publish guidelines or other instruments related to the protection of personal information;

(c) develop model contracts or other instruments for the protection of personal information that is collected, used or disclosed interprovincially or internationally; and

(d) develop procedures for sharing information referred to in subsection (3).

(3) The Commissioner may, in accordance with any procedure established under paragraph (2)‍(d), share information with any person referred to in subsection (1), if the information

(a) could be relevant to an ongoing or potential investigation of a complaint or audit under this Part or provincial legislation that has objectives that are similar to this Part; or

(b) could assist the Commissioner or that person in the exercise of their functions and duties with respect to the protection of personal information.

(4) The procedures referred to in paragraph (2)‍(d) shall

(a) restrict the use of the information to the purpose for which it was originally shared; and

(b) stipulate that the information be treated in a confidential manner and not be further disclosed without the express consent of the Commissioner.

23.‍1 (1) Subject to subsection (3), the Commissioner may, in accordance with any procedure established under paragraph (4)‍(b), disclose information referred to in subsection (2) that has come to the Commissioner’s knowledge as a result of the performance or exercise of any of the Commissioner’s duties or powers under this Part to any person or body who, under the legislation of a foreign state, has

(a) functions and duties similar to those of the Commissioner with respect to the protection of personal information; or

(b) responsibilities that relate to conduct that is substantially similar to conduct that would be in contravention of this Part.

(2) The information that the Commissioner is authorized to disclose under subsection (1) is information that the Commissioner believes

(a) would be relevant to an ongoing or potential investigation or proceeding in respect of a contravention of the laws of a foreign state that address conduct that is substantially similar to conduct that would be in contravention of this Part; or

(b) is necessary to disclose in order to obtain from the person or body information that may be useful to an ongoing or potential investigation or audit under this Part.

(3) The Commissioner may only disclose information to the person or body referred to in subsection (1) if the Commissioner has entered into a written arrangement with that person or body that

(a) limits the information to be disclosed to that which is necessary for the purpose set out in paragraph (2)‍(a) or (b);

(b) restricts the use of the information to the purpose for which it was originally shared; and

(c) stipulates that the information be treated in a confidential manner and not be further disclosed without the express consent of the Commissioner.

(4) The Commissioner may enter into arrangements with one or more persons or bodies referred to in subsection (1) in order to

(a) provide for cooperation with respect to the enforcement of laws protecting personal information, including the sharing of information referred to in subsection (2) and the provision of mechanisms for the handling of any complaint in which they are mutually interested;

(b) establish procedures for sharing information referred to in subsection (2);

(c) develop recommendations, resolutions, rules, standards or other instruments with respect to the protection of personal information;

(d) undertake and publish research related to the protection of personal information;

(e) share knowledge and expertise by different means, including through staff exchanges; or

(f) identify issues of mutual interest and determine priorities pertaining to the protection of personal information.

24 The Commissioner shall

(a) develop and conduct information programs to foster public understanding, and recognition of the purposes, of this Part;

(b) undertake and publish research that is related to the protection of personal information, including any such research that is requested by the Minister of Industry;

(c) encourage organizations to develop detailed policies and practices, including organizational codes of practice, to comply with Divisions 1 and 1.‍1; and

(d) promote, by any means that the Commissioner considers appropriate, the purposes of this Part.

25 (1) The Commissioner shall, within three months after the end of each financial year, submit to Parliament a report concerning the application of this Part, the extent to which the provinces have enacted legislation that is substantially similar to this Part and the application of any such legislation.

(2) Before preparing the report, the Commissioner shall consult with those persons in the provinces who, in the Commissioner’s opinion, are in a position to assist the Commissioner in making a report respecting personal information that is collected, used or disclosed interprovincially or internationally.

26 (1) The Governor in Council may make regulations for carrying out the purposes and provisions of this Part, including regulations

(a) specifying, by name or by class, what is a government institution or part of a government institution for the purposes of any provision of this Part;

(a.‍01) [Repealed, 2015, c. 32, s. 21]

(a.‍1) specifying information or classes of information for the purpose of paragraph 7(1)‍(d), (2)‍(c.‍1) or (3)‍(h.‍1);

(b) specifying information to be kept and maintained under subsection 10.‍3(1); and

(c) prescribing anything that by this Part is to be prescribed.

(2) The Governor in Council may, by order,

(a) provide that this Part is binding on any agent of Her Majesty in right of Canada to which the Privacy Act does not apply;

(b) if satisfied that legislation of a province that is substantially similar to this Part applies to an organization, a class of organizations, an activity or a class of activities, exempt the organization, activity or class from the application of this Part in respect of the collection, use or disclosure of personal information that occurs within that province; and

(c) amend Schedule 4.

27 (1) Any person who has reasonable grounds to believe that a person has contravened or intends to contravene a provision of Division 1 or 1.‍1 may notify the Commissioner of the particulars of the matter and may request that their identity be kept confidential with respect to the notification.

(2) The Commissioner shall keep confidential the identity of a person who has notified the Commissioner under subsection (1) and to whom an assurance of confidentiality has been provided by the Commissioner.

27.‍1 (1) No employer shall dismiss, suspend, demote, discipline, harass or otherwise disadvantage an employee, or deny an employee a benefit of employment, by reason that

(a) the employee, acting in good faith and on the basis of reasonable belief, has disclosed to the Commissioner that the employer or any other person has contravened or intends to contravene a provision of Division 1 or 1.‍1;

(b) the employee, acting in good faith and on the basis of reasonable belief, has refused or stated an intention of refusing to do anything that is a contravention of a provision of Division 1 or 1.‍1;

(c) the employee, acting in good faith and on the basis of reasonable belief, has done or stated an intention of doing anything that is required to be done in order that a provision of Division 1 or 1.‍1 not be contravened; or

(d) the employer believes that the employee will do anything referred to in paragraph (a), (b) or (c).

(2) Nothing in this section impairs any right of an employee either at law or under an employment contract or collective agreement.

(3) In this section, employee includes an independent contractor and employer has a corresponding meaning.

28 Every organization that knowingly contravenes subsection 8(8), section 10.‍1 or subsection 10.‍3(1) or 27.‍1(1) or that obstructs the Commissioner or the Commissioner’s delegate in the investigation of a complaint or in conducting an audit is guilty of

(a) an offence punishable on summary conviction and liable to a fine not exceeding $10,000; or

(b) an indictable offence and liable to a fine not exceeding $100,000.

*29 (1) The administration of this Part shall, every five years after this Part comes into force, be reviewed by the committee of the House of Commons, or of both Houses of Parliament, that may be designated or established by Parliament for that purpose.

* [Note: Part 1 in force January 1, 2001, see SI/2000-29.‍]

(2) The committee shall undertake a review of the provisions and operation of this Part and shall, within a year after the review is undertaken or within any further period that the House of Commons may authorize, submit a report to Parliament that includes a statement of any changes to this Part or its administration that the committee recommends.

DIVISION 5 

Transitional Provisions

30 (1) This Part does not apply to any organization in respect of personal information that it collects, uses or discloses within a province whose legislature has the power to regulate the collection, use or disclosure of the information, unless the organization does it in connection with the operation of a federal work, undertaking or business or the organization discloses the information outside the province for consideration.

(1.‍1) This Part does not apply to any organization in respect of personal health information that it collects, uses or discloses.

*(2) Subsection (1) ceases to have effect three years after the day on which this section comes into force.

* [Note: Section 30 in force January 1, 2001, see SI/2000-29.‍]

*(2.‍1) Subsection (1.‍1) ceases to have effect one year after the day on which this section comes into force.

* [Note: Section 30 in force January 1, 2001, see SI/2000-29.‍]

Clause 5: New.

Clause 6: Spent consequential amendments.

Aeronautics Act

Clause 10: Existing text of subsection 4.‍83(1):

4.‍83 (1) Despite section 5 of the Personal Information Protection and Electronic Documents Act, to the extent that that section relates to obligations set out in Schedule 1 to that Act relating to the disclosure of information, and despite subsection 7(3) of that Act, an operator of an aircraft departing from Canada that is due to land in a foreign state or fly over the United States and land outside Canada or of a Canadian aircraft departing from any place outside Canada that is due to land in a foreign state or fly over the United States may, in accordance with the regulations, provide to a competent authority in that foreign state any information that is in the operator’s control relating to persons on board or expected to be on board the aircraft and that is required by the laws of the foreign state.

Canadian Radio-television and Telecommunications Commission Act

Clause 13: New.

Competition Act

Clause 14: New.

Canada Business Corporations Act

Clause 15: Existing text of subsection 21.‍1(5):

(5) Within one year after the sixth anniversary of the day on which an individual ceases to be an individual with significant control over the corporation, the corporation shall — subject to any other Act of Parliament and to any Act of the legislature of a province that provides for a longer retention period — dispose of any of that individual’s personal information, as defined in subsection 2(1) of the Personal Information Protection and Electronic Documents Act, that is recorded in the register.

Telecommunications Act

Clause 16: (1) Existing text of subsection 39(2):

(2) Subject to subsections (4), (5), (5.‍1) and (6), where a person designates information as confidential and the designation is not withdrawn by that person, no person described in subsection (3) shall knowingly disclose the information, or knowingly allow it to be disclosed, to any other person in any manner that is calculated or likely to make it available for the use of any person who may benefit from the information or use the information to the detriment of any person to whose business or affairs the information relates.

(2) New.

Public Servants Disclosure Protection Act

Clause 17: Relevant portion of section 15:

15 Sections 12 to 14 apply despite

(a) section 5 of the Personal Information Protection and Electronic Documents Act, to the extent that that section relates to obligations set out in Schedule 1 to that Act relating to the disclosure of information; and

Clause 18: Existing text of subsection 16(1.‍1):

(1.‍1) Subsection (1) does not apply in respect of information the disclosure of which is subject to any restriction created by or under any Act of Parliament, including the Personal Information Protection and Electronic Documents Act.

Clause 19: Existing text of section 50:

50 Despite section 5 of the Personal Information Protection and Electronic Documents Act, to the extent that that section relates to obligations set out in Schedule 1 to that Act relating to the disclosure of information, and despite any other Act of Parliament that restricts the disclosure of information, a report by a chief executive in response to recommendations made by the Commissioner to the chief executive under this Act may include personal information within the meaning of subsection 2(1) of that Act, or section 3 of the Privacy Act, depending on which of those Acts applies to the portion of the public sector for which the chief executive is responsible.

An Act to promote the efficiency and adaptability of the Canadian economy by regulating certain activities that discourage reliance on electronic means of carrying out commercial activities, and to amend the Canadian Radio-television and Telecommunications Commission Act, the Competition Act, the Personal Information Protection and Electronic Documents Act and the Telecommunications Act

Clause 20: Existing text of section 2:

2 In the event of a conflict between a provision of this Act and a provision of Part 1 of the Personal Information Protection and Electronic Documents Act, the provision of this Act operates despite the provision of that Part, to the extent of the conflict.

Clause 21: Relevant portion of subsection 20(3):

(3) The following factors must be taken into account when determining the amount of a penalty:

.‍.‍. 

(c) the person’s history with respect to any previous violation under this Act, any previous conduct that is reviewable under section 74.‍011 of the Competition Act and any previous contravention of section 5 of the Personal Information Protection and Electronic Documents Act that relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act;

Clause 22: Existing text of subsection 47(1):

47 (1) A person who alleges that they are affected by an act or omission that constitutes a contravention of any of sections 6 to 9 of this Act or of section 5 of the Personal Information Protection and Electronic Documents Act that relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act — or that constitutes conduct that is reviewable under section 74.‍011 of the Competition Act — may apply to a court of competent jurisdiction for an order under section 51 against one or more persons who they allege have committed the act or omission or who they allege are liable for the contravention or reviewable conduct by reason of section 52 or 53.

(2) Existing text of subsection 47(4):

(4) The applicant must, without delay, serve a copy of the application on every person against whom an order is sought, on the Commission if the application identifies a contravention of this Act, on the Commissioner of Competition if the application identifies conduct that is reviewable under section 74.‍011 of the Competition Act and on the Privacy Commissioner if the application identifies a contravention of the Personal Information Protection and Electronic Documents Act.

Clause 23: Relevant portion of section 50:

50 The following may intervene in any proceedings in connection with an application under subsection 47(1) for an order under paragraph 51(1)‍(b) and in any related proceedings:

.‍.‍. 

(c) the Privacy Commissioner, if the application identifies a contravention of the Personal Information Protection and Electronic Documents Act.

Clause 24: Relevant portion of subsection 51(1):

51 (1) If, after hearing the application, the court is satisfied that one or more persons have contravened any of the provisions referred to in the application or engaged in conduct referred to in it that is reviewable under section 74.‍011 of the Competition Act, the court may order the person or persons, as the case may be, to pay the applicant

.‍.‍. 

(b) a maximum of

.‍.‍. 

(vi) in the case of a contravention of section 5 of the Personal Information Protection and Electronic Documents Act that relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act, $1,000,000 for each day on which a contravention occurred, and

(2) Text of subsection 51(2):

(2) The purpose of an order under paragraph (1)‍(b) is to promote compliance with this Act, the Personal Information Protection and Electronic Documents Act or the Competition Act, as the case may be, and not to punish.

(2) Relevant portion of subsection 51(3):

(3) The court must consider the following factors when it determines the amount payable under paragraph (1)‍(b) for each contravention or each occurrence of the reviewable conduct:

.‍.‍. 

(c) the person’s history, or each person’s history, as the case may be, with respect to any previous contravention of this Act and of section 5 of the Personal Information Protection and Electronic Documents Act that relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act and with respect to any previous conduct that is reviewable under section 74.‍011 of the Competition Act;

Clause 25: Existing text of sections 52 to 54:

52 An officer, director, agent or mandatary of a corporation that commits a contravention of any of sections 6 to 9 of this Act or of section 5 of the Personal Information Protection and Electronic Documents Act that relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act, or that engages in conduct that is reviewable under section 74.‍011 of the Competition Act, is liable for the contravention or reviewable conduct, as the case may be, if they directed, authorized, assented to, acquiesced in or participated in the commission of that contravention, or engaged in that conduct, whether or not the corporation is proceeded against.

53 A person is liable for a contravention of any of sections 6 to 9 of this Act or of section 5 of the Personal Information Protection and Electronic Documents Act that relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act, or for conduct that is reviewable under section 74.‍011 of the Competition Act, that is committed or engaged in, as the case may be, by their employee acting within the scope of their employment or their agent or mandatary acting within the scope of their authority, whether or not the employee, agent or mandatary is identified or proceeded against.

54 (1) A person must not be found to have committed a contravention of any of sections 6 to 9 of this Act or of section 5 of the Personal Information Protection and Electronic Documents Act that relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act, or to have engaged in conduct that is reviewable under section 74.‍011 of the Competition Act, if they establish that they exercised due diligence to prevent the contravention or conduct, as the case may be.

(2) Every rule and principle of the common law that makes any circumstance a justification or excuse in relation to a charge for an offence applies in respect of a contravention of any of sections 6 to 9 of this Act or of section 5 of the Personal Information Protection and Electronic Documents Act that relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act, or in respect of conduct that is reviewable under section 74.‍011 of the Competition Act, to the extent that it is not inconsistent with this Act or the Personal Information Protection and Electronic Documents Act or the Competition Act, as the case may be.

Clause 26: (1) and (2) Relevant portion of section 56:

56 Despite subsection 7(3) of the Personal Information Protection and Electronic Documents Act, any organization to which Part 1 of that Act applies may on its own initiative disclose to the Commission, the Commissioner of Competition or the Privacy Commissioner any information in its possession that it believes relates to

(a) a contravention of

.‍.‍. 

(iii) section 5 of the Personal Information Protection and Electronic Documents Act, which contravention relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act, or

Clause 27: Existing text of section 57:

57 The Commission, the Commissioner of Competition and the Privacy Commissioner must consult with each other to the extent that they consider appropriate to ensure the effective regulation, under this Act, the Competition Act, the Personal Information Protection and Electronic Documents Act and the Telecommunications Act, of commercial conduct that discourages the use of electronic means to carry out commercial activities, and to coordinate their activities under those Acts as they relate to the regulation of that type of conduct.

Clause 28: (1) Relevant portion of subsection 58(1):

58 (1) The Commission may disclose information obtained by it in the performance or exercise of its duties or powers related to any of sections 6 to 9 of this Act and, in respect of conduct carried out by electronic means, to section 41 of the Telecommunications Act,

(a) to the Privacy Commissioner, if the Commission believes that the information relates to the performance or exercise of the Privacy Commissioner’s duties or powers under Part 1 of the Personal Information Protection and Electronic Documents Act in respect of a collection or use described in subsection 7.‍1(2) or (3) of that Act; and

(2) Relevant portion of subsection 58(2):

(2) Despite section 29 of the Competition Act, the Commissioner of Competition may disclose information obtained by him or her in the performance or exercise of his or her duties or powers related to section 52.‍01 or 74.‍011 of that Act or, in respect of conduct carried out by electronic means, to section 52, 52.‍1, 53, 55, 55.‍1, 74.‍01, 74.‍02, 74.‍04, 74.‍05 or 74.‍06 of that Act,

(a) to the Privacy Commissioner, if the Commissioner of Competition believes that the information relates to the performance or exercise of the Privacy Commissioner’s duties or powers under Part 1 of the Personal Information Protection and Electronic Documents Act in respect of a collection or use described in subsection 7.‍1(2) or (3) of that Act; and

(3) Relevant portion of subsection 58(3):

(3) The Privacy Commissioner may disclose information obtained by him or her in the performance or exercise of his or her duties or powers under Part 1 of the Personal Information Protection and Electronic Documents Act if the information relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act or to an act alleged in a complaint in respect of which the Privacy Commissioner decides, under subsection 12(2) or 12.‍2(2) of that Act, to not conduct an investigation or to discontinue an investigation,

Clause 29: Existing text of subsection 59(3):

(3) The Privacy Commissioner may use the information that is disclosed to him or her under paragraph 58(1)‍(a) or (2)‍(a) only for the purpose of performing or exercising his or her duties or powers under Part 1 of the Personal Information Protection and Electronic Documents Act in respect of a collection or use described in subsection 7.‍1(2) or (3) of that Act.

Clause 30: (1) and (2) Relevant portion of subsection 60(1):

60 (1) Information may be disclosed under an agreement or arrangement in writing between the Government of Canada, the Commission, the Commissioner of Competition or the Privacy Commissioner and the government of a foreign state, an international organization of states or an international organization established by the governments of states, or any institution of any such government or organization, if the person responsible for disclosing the information believes that

(a) the information may be relevant to an investigation or proceeding in respect of a contravention of the laws of a foreign state that address conduct that is substantially similar to

.‍.‍. 

(ii) conduct that contravenes section 5 of the Personal Information Protection and Electronic Documents Act and that relates to a collection or use described in subsection 7.‍1(2) or (3) of that Act,

(b) the disclosure is necessary in order to obtain from that foreign state, organization or institution information that may be relevant for any of the following purposes and no more information will be disclosed than is required for that purpose:

.‍.‍. 

(iii) the performance or exercise by the Privacy Commissioner of his or her duties or powers under Part 1 of the Personal Information Protection and Electronic Documents Act in respect of a collection or use described in subsection 7.‍1(2) or (3) of that Act, or

Clause 31: Existing text of section 61:

61 The Commission, the Commissioner of Competition and the Privacy Commissioner must provide the Minister of Industry with any reports that he or she requests for the purpose of coordinating the implementation of sections 6 to 9 of this Act, sections 52.‍01 and 74.‍011 of the Competition Act and section 7.‍1 of the Personal Information Protection and Electronic Documents Act.

Transportation Modernization Act

Clause 32: Existing text of subsection 17.‍91(4):

(4) A company that collects, uses or communicates information under this section, section 17.‍31 or 17.‍94, subsection 28(1.‍1) or 36(2) or regulations made under section 17.‍95 may do so

(a) despite section 5 of the Personal Information Protection and Electronic Documents Act, to the extent that that section relates to obligations set out in Schedule 1 to that Act relating to the collection, use, disclosure and retention of information, and despite section 7 of that Act; and

(b) dspite any provision of provincial legislation that is substantially similar to Part 1 of the Act referred to in paragraph (a) and that limits the collection, use, communication or preservation of information.