The National Institute of Standards and Technology (NIST) published draft guidance this week on evaluating differential privacy – a privacy-enhancing technology (PET) used in data analytics – to fulfill one of its tasks under President Biden’s recent artificial intelligence executive order (EO).

The EO called on NIST to advance research into PETs – such as differential privacy – and mandated the creation of guidelines within one year “to evaluate the efficacy of differential-privacy-guarantee protections, including for AI,” according to NIST.

“You can use differential privacy to publish analyses of data and trends without being able to identify any individuals within the dataset,” Naomi Lefkovitz, the manager of NIST’s Privacy Engineering Program and one of the publication’s editors, explained in a press release.
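For readers who want the precise statement behind that claim, the standard mathematical definition of the guarantee is shown below; it comes from the differential privacy literature and is not quoted from the NIST draft.

```latex
% Standard \epsilon-differential privacy guarantee (from the research
% literature); shown for reference, not quoted from the NIST draft.
A randomized mechanism $M$ satisfies $\epsilon$-differential privacy if,
for every pair of datasets $D$ and $D'$ that differ in a single
individual's record, and for every set of possible outputs $S$,
\[
  \Pr[M(D) \in S] \le e^{\epsilon} \, \Pr[M(D') \in S].
\]
```

Intuitively, a small epsilon means a published result looks nearly the same whether or not any one person's data is included, which is what makes re-identifying individuals difficult.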

For example, let’s say a researcher wants to analyze health data collected by fitness trackers to improve medical diagnostics. The business that sells the fitness trackers may not want to share this data because it contains personally identifiable information (PII), and releasing it could put consumers’ privacy at risk.

However, with differential privacy, this business can slightly alter the data – otherwise known as adding noise – allowing it to be released publicly while protecting the privacy of the individuals within the dataset.
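To make that concrete, here is a minimal sketch of one common noise-adding approach, the Laplace mechanism. The fitness-tracker records, the count query, and the choice of epsilon are illustrative assumptions, not anything prescribed by the NIST draft.

```python
import numpy as np

def laplace_count(data, predicate, epsilon):
    """Release a noisy count satisfying epsilon-differential privacy.

    A counting query changes by at most 1 when one person's record is
    added or removed (sensitivity = 1), so Laplace noise with scale
    1/epsilon is enough to mask any individual's contribution.
    """
    true_count = sum(1 for record in data if predicate(record))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical fitness-tracker records: (user_id, resting_heart_rate)
records = [(1, 72), (2, 88), (3, 95), (4, 61), (5, 90)]

# How many users have a resting heart rate above 85?
# Smaller epsilon -> more noise -> stronger privacy, lower accuracy.
noisy = laplace_count(records, lambda r: r[1] > 85, epsilon=0.5)
print(f"Noisy count: {noisy:.1f}")
```

Because the noise grows as epsilon shrinks, evaluating a product’s privacy-versus-accuracy tradeoff at the levels a vendor claims is exactly the kind of assessment the NIST guidance is meant to support.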

“If it’s sensitive data, you don’t want it revealed,” Lefkovitz said. “But differential privacy technology is still maturing, and there are risks you should be aware of. We want this publication to help organizations evaluate differential privacy products and get a better sense of whether their creators’ claims are accurate.”

NIST said that understanding differential privacy and other PETs is crucial, especially as AI – which relies on large datasets to train machine learning models – continues to rapidly evolve.

The agency said its new guidance, titled Draft NIST Special Publication (SP) 800-226, Guidelines for Evaluating Differential Privacy Guarantees, is mainly geared toward Federal agencies, but added that anyone can use it.

The publication is an initial draft, and NIST is requesting public comments on it until Jan. 25, 2024. The agency will use the comments to inform a final version of the guidance to be published later in 2024.

“We show the math that’s involved, but we are trying to focus on making the document accessible,” Lefkovitz concluded. “We don’t want you to have to be a math expert to use differential privacy effectively.”

Grace Dille is MeriTalk’s Assistant Managing Editor covering the intersection of government and technology.