Canada unveils its new privacy legislation – with even bigger fines than the GDPR

Posted on Nov 19, 2020 by Glyn Moody

As this blog has frequently noted, the EU’s General Data Protection Regulation (GDPR) plays a crucial role in the privacy world. It not only creates protections for privacy online in the EU, it also provides a role model for other countries looking to implement privacy laws of their own. It shows that this complex area can be addressed by thoughtful law-making, and that strong protection for personal data is not incompatible with its central role in modern online activities like e-commerce. For example, both Brazil and Kenya have passed data protection laws closely modelled on the GDPR. Some of the better recent modifications to the California Consumer Privacy Act were also inspired by ideas in the EU legislation.

Now Canada plans to join the club of countries building on the GDPR’s ideas, with a major update to its existing law in this area that will be known as the Consumer Privacy Protection Act (CPPA) if and when it is passed. Canada’s current Personal Information Protection and Electronic Documents Act (PIPEDA) dates back to 2000, when the data protection landscape was very different from what it is today. There is a long background document on the proposed law from the Canadian government, with detailed discussions of the issues involved. However, a more approachable starting point is a blog post on the bill from Professor Michael Geist, one of Canada’s leading experts on digital law. He describes the CPPA as “Canada’s biggest privacy overhaul in decades”. He notes that the current text is just a starting point. There is likely to be significant lobbying to change parts of the bill, and some of the new rules require accompanying regulations, which could take years more to finalize after additional consultations. In terms of the bill’s current text, one of the most striking elements is the enforcement regime. The Privacy Commissioner of Canada will be given a new power to order compliance with the law, and to recommend stiff new penalties for failing to do so:

the order making power comes with the ability to recommend penalties that in some cases are the highest in the G7. The potential penalty for contravening the law “is the higher of $10,000,000 and 3% of the organization’s gross global revenue in its financial year before the one in which the penalty is imposed.” Moreover, there are even tougher penalties for certain violations: failing to comply with some of the security breach disclosure rules or data retention requirements, identifying someone using de-identified data (except in limited circumstances), or sanctioning a whistleblower. In those circumstances, the penalties can reach $25,000,000 or 5% of the organization’s gross global revenue.

What’s interesting here is that a fine of 5% of global revenue is even higher than the GDPR’s 4% figure. It’s a clear indication of how the EU has set a benchmark that Canada now evidently plans to surpass. It emphasizes once more the importance of the GDPR in providing a framework that others can draw on, and which legitimizes their proposals.
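To make that comparison concrete, here is a rough sketch of the arithmetic behind these “higher of a fixed floor or a percentage of revenue” caps. It assumes a hypothetical organization with $2 billion in gross global revenue (a figure invented purely for illustration), and the GDPR line uses that law’s own ceiling of €20 million or 4% of worldwide annual turnover, whichever is higher; none of this is a prediction of how regulators would actually set a fine.

```python
# Rough illustration only: how a penalty cap defined as "the higher of a fixed
# floor and a percentage of gross global revenue" works out in practice.
# The revenue figure is hypothetical, invented for this example.

def penalty_cap(gross_global_revenue: float, floor: float, rate: float) -> float:
    """Return the higher of a fixed floor and a percentage of revenue."""
    return max(floor, rate * gross_global_revenue)

revenue = 2_000_000_000  # hypothetical organization with $2 billion gross global revenue

standard = penalty_cap(revenue, floor=10_000_000, rate=0.03)  # CPPA standard tier
severe = penalty_cap(revenue, floor=25_000_000, rate=0.05)    # CPPA most serious violations
gdpr = penalty_cap(revenue, floor=20_000_000, rate=0.04)      # GDPR ceiling (EUR 20M or 4%)

print(f"CPPA standard tier cap: ${standard:,.0f}")  # $60,000,000
print(f"CPPA severe tier cap:   ${severe:,.0f}")    # $100,000,000
print(f"GDPR-style cap:         {gdpr:,.0f}")       # 80,000,000 (euros under the GDPR)
```

For a large organization, the percentage rather than the fixed floor is what determines the ceiling, which is why the one-point difference between 4% and 5% matters.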

The bill includes a new privacy right for data portability, which allows individuals to ask organizations to transfer their personal information elsewhere. There is also an important access right with respect to algorithms. The draft text says:

If the organization has used an automated decision system to make a prediction, recommendation or decision about the individual, the organization must, on request by the individual, provide them with an explanation of the prediction, recommendation or decision and of how the personal information that was used to make the prediction, recommendation or decision was obtained.

This goes beyond the corresponding part of the GDPR, Article 22, by requiring an explanation of how an algorithm arrived at its prediction, recommendation or decision. Given the increasing importance of algorithms, and the corresponding concern over their transparency – or lack of it – the Canadian approach could well form the basis of other privacy laws in the future.

Another interesting element concerns de-identification of personal information, with strong penalties for those who violate the new standards. The proposed law says that an organization must not use de-identified information “alone or in combination with other information to identify an individual”, except for the purpose of testing the effectiveness of the security safeguards the organization has put in place to protect the information. As Geist rightly points out, de-identification has been a particularly hot issue in Canada in part because of the public battle over Sidewalk Labs’ plans in Toronto, discussed several times on this blog, and now shut down.

A novel idea suggested in the draft text is that organizations may create a “code of practice”, which “provides for substantially the same or greater protection of personal information as some or all of the protection provided under this Act.” On the key matter of consent, the bill establishes basic requirements for what must be included for consent to be valid, and it prohibits making consent a condition of supplying a product or service beyond what is strictly necessary to provide it.

There are some other elements in the CPPA that Geist discusses, and he also promises more posts with analysis in the weeks to come – he’s already written one looking at what he calls “ten pressure points”. Even though the current text is likely to change in various ways, it is clear that Canada’s proposed privacy law will be one of the most important, alongside the GDPR, and a useful further example of how to draft legislation offering strong privacy protection in the digital world.

Featured image by Saffron Blaze.