Alberta’s Approach to AI + Privacy Protection

With the federal government taking a backseat on AI regulation, Alberta’s Office of the Information and Privacy Commissioner (OIPC) is stepping up. In a July 2025 report, the OIPC called for provincial AI legislation that complements existing privacy laws to safeguard Albertans’ data rights. The report criticizes gaps in Alberta’s Protection of Privacy Act (POPA), especially regarding transparency, opt-out rights, and complaint mechanisms for AI-driven decisions. Additionally, in September 2025, the OIPC released detailed guidance on AI scribe tools used in healthcare, emphasizing the need for Privacy Impact Assessments and cautioning against common implementation pitfalls. These documents underscore Alberta’s push for responsible, localized AI governance in the absence of federal action.

The federal government is taking a hands-off approach to AI regulation for the foreseeable future, and others are moving to fill the vacuum. Alberta's Office of the Information and Privacy Commissioner (OIPC) has recently made a couple of interesting moves at the intersection of AI and personal information.

Responsible AI Governance in Alberta
The OIPC issued a report in July 2025 in which it advocates for made-in-Alberta legislation to regulate AI in the province. This is envisioned as a regulatory scheme that works alongside existing provincial privacy laws to protect the rights of individual Albertans concerning the collection, use, or disclosure of their personal or health information to train AI, and to use AI where this involves processing personal or health information.

This report makes recommendations in key areas:

  • Public sector privacy legislation

  • Private sector privacy legislation

  • Health privacy legislation

  • The proposed (federal) Artificial Intelligence and Data Act (AIDA) in the Alberta context

The OIPC notes that Alberta's Protection of Privacy Act (POPA) and its regulations, as recently enacted, did not incorporate the OIPC's recommendations related to AI. The only AI-related provisions in POPA are:

  • the obligation of public bodies to notify individuals about the use of automated decision-making or content generation at the time of collection of personal information;

  • the use of automated decision-making by public bodies generally; and

  • accuracy and retention obligations for personal information used in automated decision-making.

However, POPA did not address these issues:

  • There is currently no requirement to document decision factors (other than the retention requirement regarding personal information and the decision itself);

  • There is no right to opt out of, object to, or request human review of automated decisions; and

  • POPA does not establish a channel for complaints specific to the process or outcomes of automated decision-making.

The OIPC's position is that these regulatory gaps should be filled by provincial legislators.

Guidance on AI Scribe Tools
The use of AI scribes is now common in business meetings across many industries. In September 2025, the OIPC also released its guidance on the use of AI scribes by health care professionals.

AI scribe tools are already widely used by “custodians” under the Health Information Act (HIA). Every custodian considering this kind of technology must address its privacy and security risks. The HIA requires the submission of privacy impact assessments (PIAs) describing the effect of any proposed administrative practice or system, or change to one, that involves the collection, use, or disclosure of individually identifying health information. Accordingly, custodians must submit a PIA for any implementation and use of AI scribe tools.

Although focused on the health services sector, the OIPC's document also provides excellent guidance on the use of AI scribe tools generally, including a review of common pitfalls any customer should avoid when engaging a software vendor selling AI scribe tools.

We regularly assist clients in the review of AI tools, the development of AI governance documentation, the assessment of privacy risks, and the review of PIAs. Contact Richard Stobbe or any member of our Privacy + Data Management Group for assistance.