A little over a decade ago, the path to a functional record retention policy was simple: “just keep it all forever” and then eventually move those documents into offsite storage.

Rapidly changing privacy laws and regulations, both in the United States and globally, have made that practice obsolete.

According to the IAPP, in just three short years, the United States went from two regulations in two states (California's CCPA and New Jersey's S 2834) to 29 different laws and regulations across 18 states.

Since it'd be nearly impossible to publish something that stays current in real time, we're instead going to look at several larger trends so far in 2022 and what they might mean for the second half of the year.

US Federal Privacy Law? Unlikely to Happen Anytime Soon

Since the passage of the CCPA, the question on most privacy professionals' minds has been whether the US government will step in to create an overarching privacy law that covers all states and territories.

The latest reports indicate that despite several laws being presented to Congress, they are unlikely to move out of committee anytime soon.

As previously mentioned, many other states either have legislation addressing privacy, have laws in various stages of drafting or committee, or otherwise rely on consumer protection laws. Under the Supremacy Clause of the Constitution, any federal law would supersede those created by the states.

There are still many questions that need answering before the government can act, including:

  • Should privacy be a federal right?
  • What does this federal privacy law cover that the states can’t or don’t?
  • Which governing body should be enforcing these laws?

With the myriad other hot-button issues facing Congress, federal privacy laws seem unlikely to even crack the top 20 in priority.

Cookies are Getting Staler

Since their creation in 1994, cookies have been a primary method of collecting information on website visitors as they travel around the internet. Organizations use that information to serve ads, relevant content, and more.
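To make the mechanism concrete, here is a minimal sketch of how a cookie carries a visitor identifier between requests, using only Python's standard library. The cookie name `visitor_id` and its value are illustrative, not taken from any particular site.

```python
# Sketch of the cookie round trip: a server issues a cookie, and the
# browser echoes it back on later requests so the server can recognize
# the same visitor. Uses Python's stdlib http.cookies module.
from http.cookies import SimpleCookie

# Server side: issue a cookie on the visitor's first request.
response_cookie = SimpleCookie()
response_cookie["visitor_id"] = "abc123"
response_cookie["visitor_id"]["path"] = "/"
print(response_cookie.output(header="Set-Cookie:"))
# Set-Cookie: visitor_id=abc123; Path=/

# Client side: the browser sends the cookie back with each request,
# and the server parses it to re-identify the visitor.
request_cookie = SimpleCookie()
request_cookie.load("visitor_id=abc123")
print(request_cookie["visitor_id"].value)  # abc123
```

That persistent round trip is exactly what lets a third party recognize the same visitor across many sites, which is the behavior regulators and consumers have pushed back on.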

Fast forward to today, and the feeling around cookies has changed. Unless a cookie is edible, consumers don't like them. In fact, Quartz reports that 72% of Americans are concerned about how they're being tracked online.

As soon as the EU's General Data Protection Regulation (GDPR) became effective in May 2018, any brand that had a digital footprint or wanted to do business with European citizens had to ask for and receive consent from those visitors before sending them emails or using cookies to track them online.

In 2020, Google announced that it was phasing out third-party cookies. While other browsers like Firefox and Safari have been blocking them for years, WIRED notes that Google's Chrome browser has an astounding 63% market share.

Instead of cookies, Google pitched its anonymized "Federated Learning of Cohorts" (FLoC). These "flocks" had their own issues, which the public was all too happy to point out to Google.

As CookieBot reported, “placing users in flocks will likely reveal personal details that can be related to your browser authentication profile either directly or by inference, thereby requiring consent” – which is the privacy equivalent of cutting back on sugar by replacing cookies with candy bars.

Because of the reaction to FLoC, Google pushed its timeline back to 2023 and, in 2022, replaced FLoC with the Topics API, which has drawn a similarly negative reaction from industry and privacy professionals.

Facebook is in the News about Privacy…Again.

There are a few things you can always count on: death, taxes, and if there's a conversation about privacy, Facebook (now Meta) is going to come up.

In April, Vice's Motherboard obtained and reported on a leaked internal document discussing Facebook's program for Privacy by Design (or lack thereof).

In the document, Facebook admits that its current approach to storing personal data makes it nearly impossible to comply with GDPR.

One particular standout quote from the document written by Meta’s privacy engineers was that they “do not have an adequate level of control and explainability over how our systems use data, and thus [they] can’t confidently make controlled policy changes or external commitments such as ‘we will not use X data for Y purpose.’ And yet, this is exactly what regulators expect [Facebook] to do”.

What happens next is anyone's guess, because if Facebook's main platform is found in violation of GDPR, it could be fined up to 4% of its global annual turnover – which, by 2021's numbers, would be a fine to the tune of $4 billion.

The Rise in Information Requests and PEC

With data breaches in the news seemingly every other day, the general public is becoming more aware that companies want their personal information for their own ends. However, this doesn't mean the larger public has an in-depth understanding of the implications, or of how that data is used. According to Pew Research, "A majority of Americans said last year that they were concerned about how companies or the government were using their personal data, but few said they understood what was being done with their information."

There is an inherent Catch-22 here: organizations that want to strengthen privacy and understand how data is being used must handle that sensitive information in order to learn how and where it needs to be protected.

In other words, protecting sensitive data means looking at that data. The solution, which has so far been largely an academic exercise, is privacy-enhancing computation, or PEC, which Gartner has cited multiple years in a row as a growing strategic technology trend. Gartner describes PEC and similar technologies as consisting of three parts:

  1. Secure – providing a trusted environment in which sensitive data can be processed or analyzed.
  2. Decentralized – processing and analyzing data in a decentralized manner.
  3. Encrypted – encrypting data and algorithms before processing or performing analytic functions.

This allows PII or other sensitive data to be processed and analyzed without exposing the underlying information.
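As a small illustration of that idea (not Gartner's definition, and only one narrow PEC technique), here is a sketch of pseudonymizing email addresses with a keyed hash before analysis, so an analyst can count unique users without ever seeing the raw addresses. The key, function name, and sample data are all hypothetical.

```python
# Sketch: pseudonymize PII with HMAC-SHA256 so downstream analytics
# operate on stable, irreversible tokens instead of raw email addresses.
import hashlib
import hmac

SECRET_KEY = b"rotate-me-regularly"  # held only by the data controller

def pseudonymize(email: str) -> str:
    """Return a stable, irreversible token for an email address."""
    normalized = email.strip().lower().encode()
    return hmac.new(SECRET_KEY, normalized, hashlib.sha256).hexdigest()

raw_emails = ["ann@example.com", "ANN@example.com", "bob@example.com"]
tokens = {pseudonymize(e) for e in raw_emails}

# The analyst sees only tokens; two spellings of the same address
# collapse to one token, so unique-user counts remain accurate.
print(len(tokens))  # 2
```

Because the hash is keyed, someone holding only the tokens cannot reverse them or rebuild them from a list of guessed addresses, which is the "encrypted before processing" leg of the definition above in miniature.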

Increasing Investment in Privacy Technology

For CISOs and other privacy professionals, the growth in the Internet of Things (IoT) has a lot in common with changes in information governance. Much like there are more ways than ever to create a record, there are now also more ways than ever to have a device connected to the internet that may not be secure.

For instance, Kevin Barnard, Chief Innovation Officer at ServiceNow, reported that back in 2017, a group of hackers infiltrated a casino's secure networks through the most unlikely doorway ever: a fish tank. Through that entry point, the hackers were able to access areas they never would have been able to reach directly.

Barnard noted, "regulators haven't woken up to the fact that there's no longer a material difference between an espresso machine and a router, or a lightbulb and a server." Dealing with this exponential increase in connected technology will require a similar investment in tools to manage it. Gartner predicts that through the rest of this year, "privacy-driven spending on compliance tooling will rise to $8 billion worldwide," with many of those tools relying on AI to manage the sheer volume of areas to protect.


There's no such thing as a minor privacy infraction. Dr. Ann Cavoukian, Executive Director at the Global Privacy & Security by Design Centre and the former Information and Privacy Commissioner for the province of Ontario, commented at an ARMA event that most privacy laws are being created in a reactionary way.

It's a constant balancing act between access and security, and more often than not, it's the squeaky wheel that gets fixed first.

For guidance on how to develop a privacy program that is proactive (rather than reactive), check out our whitepaper, Data Privacy for the Information Professional.

Read Our Data Privacy Whitepaper Now