The director of the Cybersecurity and Infrastructure Security Agency (CISA) is calling out technology manufacturers for failing to create tech products that put the safety of customers first, and is calling for a new secure-by-design, secure-by-default cybersecurity model.
During remarks this morning at Carnegie Mellon University, CISA Director Jen Easterly explained how in today’s society we’ve normalized blaming the user for unsafe technology, instead of blaming the technology manufacturer.
She compared blaming the tech user to blaming bad drivers for car accidents in the first half of the 20th century. Today, we know that car manufacturers design vehicles with a number of standard safety features built-in – seatbelts, airbags, anti-lock brakes, etc. – and we would not accept paying extra to have these basic safety features installed.
Unfortunately, Easterly noted, the same cannot be said for the technology we use every day.
“We find ourselves blaming the user for unsafe technology. In place of building in effective security from the start, technology manufacturers are using us, the users, as their crash test dummies – and we’re feeling the effects of those crashes every day with real-world consequences,” she said. “This situation is not sustainable. We need a new model.”
This new cybersecurity model, Easterly said, needs to start with tech products that bake in the safety of customers from the start. Additionally, she said the responsibility for defending the technology ecosystem should be placed on major technology manufacturers, which are better suited to manage cyber risks – not on small businesses.
To help “crystallize” this model, the CISA director said her agency is working on developing a set of core principles for technology manufacturers to build product safety into their processes.
“I want to highlight three of these principles. First, the burden of safety should never fall solely upon the customer. Technology manufacturers must take ownership of the security outcomes of those customers,” Easterly said.
“Second, technology manufacturers should embrace radical transparency to disclose and ultimately help us better understand the scope of consumer safety challenges, as well as a commitment to accountability for the products that they bring to market,” she said.
“And third, the leaders of technology manufacturers should explicitly focus on building safe products, publishing a roadmap that lays out the company’s plan for how products will be developed and updated to be both secure by design and secure by default,” she added.
In practice, Easterly said this could look like transitioning to memory-safe languages, maintaining a transparent vulnerability disclosure policy, and adopting secure coding practices.
Easterly also called on Big Tech companies to address “memory safety vulnerabilities,” which she said cause two-thirds of known software vulnerabilities. She said these vulnerabilities can be eliminated by switching to memory-safe programming languages such as Rust, Python, and Java.
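To illustrate the bug class in question, here is a minimal sketch in Rust, one of the languages Easterly named. An out-of-bounds read that would be undefined behavior in a language like C – a classic memory-safety vulnerability – is instead either rejected safely or turned into a controlled panic (the example data and indices are illustrative, not from the speech):

```rust
fn main() {
    let data = vec![10, 20, 30];

    // In C, reading past the end of a buffer is undefined behavior
    // and a common root cause of memory-safety exploits. Rust's
    // checked access returns Option::None instead of leaking memory.
    match data.get(10) {
        Some(v) => println!("value: {}", v),
        None => println!("index out of bounds, safely rejected"),
    }

    // Even plain indexing is bounds-checked at runtime:
    // `data[10]` would panic immediately rather than silently
    // corrupt memory or read adjacent data.
}
```

The key point is that the language runtime enforces the check by default, so the safety property does not depend on every programmer remembering to validate every access.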
The government can also play a role in shifting liability onto technology manufacturers, she said.
“Government can work to advance legislation to prevent technology manufacturers from disclaiming liability by contract, establishing higher standards of care for software in specific critical infrastructure entities, and driving the development of a safe harbor framework to shield from liability companies that securely develop and maintain their software products and services,” the CISA director said.
Most breaches occur due to bad cyber hygiene, Easterly explained, because companies don’t patch vulnerabilities or enforce multi-factor authentication. But she said the issue is much bigger than simply pointing the blame at the chief information security officer.
“At the end of the day, why are we not saying, ‘Why is that technology built in a way that we have to patch it so many times? And why did those vulnerabilities cause such a damaging breach?’” Easterly asked.
“Frankly, we have a multibillion-dollar cybersecurity industry because technology companies were not incentivized to create safe technology,” she added. “They were incentivized to create features and to lower their costs and to speed to market, but not to create safe technology. That has to stop.”