Government data breaches expose sensitive information and undermine public confidence in agencies’ ability to keep that information safe, said Lester Godsey, CISO of Maricopa County, Ariz., during a GovLoop panel this week.

“The most significant impact, in many cases, is that every time there is a data breach there is a continual erosion of the trust of our constituents, the business community, the people we serve as government agencies,” he said.

The problem presents a moving target, but efforts to improve data awareness and inventory, integrate encryption into broader cybersecurity strategies, and employ strong permissions and access controls can all make a difference, the speakers said.


“Technology has improved, the tool sets available to local governments have improved…but the scope and scale of the problem is growing,” Godsey said.

THE ATTACK SURFACE EXPANDS

The public sector’s push for modernization has been a mixed blessing, speakers said. The COVID-19 pandemic prompted agencies to adopt the cloud, but the rush to hybrid environments has also complicated data management and monitoring by multiplying the places where data can be stored.

“With the adoption of new technologies, getting our arms around — from an asset management perspective — where that data is stored, what you have, and how sensitive that data is, is a challenge,” Godsey said. “It’s kind of like we’ve now taken our internal file storage and basically used the entire internet as a potential repository. So you can imagine the size and complexity of this problem.”

This hampers cybersecurity efforts because governments cannot accurately assess their risks until they have a clear inventory of what they are trying to protect.
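As an illustration, the inventory step the speakers described can start as a structured record of where each data set lives and how sensitive it is. The sketch below is a minimal, hypothetical example; the asset names, storage locations, and sensitivity tiers are illustrative, not drawn from Maricopa County’s actual classification scheme.

```python
from dataclasses import dataclass

# Hypothetical sensitivity tiers; a real agency would map these to its
# own data classification policy.
TIERS = ("public", "internal", "confidential", "restricted")

@dataclass
class DataAsset:
    name: str          # e.g., "payroll"
    location: str      # e.g., "on-prem SQL", "vendor cloud bucket"
    sensitivity: str   # one of TIERS

    def __post_init__(self):
        if self.sensitivity not in TIERS:
            raise ValueError(f"unknown sensitivity: {self.sensitivity}")

def riskiest(assets):
    """Return assets in the two highest tiers -- the ones an agency
    would prioritize for encryption and access controls."""
    return [a for a in assets if TIERS.index(a.sensitivity) >= 2]

inventory = [
    DataAsset("press_releases", "public website", "public"),
    DataAsset("payroll", "vendor cloud", "confidential"),
    DataAsset("voter_rolls", "on-prem SQL", "restricted"),
]
print([a.name for a in riskiest(inventory)])  # ['payroll', 'voter_rolls']
```

An inventory like this gives risk assessment something concrete to work from: the agency knows what it holds, where, and which items warrant the strongest protections.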

Governments selecting cloud providers are also integrating more players into their operations, which means a cyber incident affecting one of those companies — or one of that company’s own providers — could have an impact on the public-sector client, Godsey said. While governments have always worked with vendors, the move to the cloud makes it particularly important to consider third-party and fourth-party risks.

He recommended that agencies consider their organization’s assets to include not just their data but also their cloud providers and the companies those providers depend on.

Agencies also need to look internally and ensure staff stay trained in best practices for handling sensitive and non-sensitive data, said panelist Matthew Lamb, a principal solutions architect for Prisma Cloud focused on state, local and educational entities at cybersecurity firm Palo Alto Networks.

WORKING WITH ENCRYPTION

As governments increasingly adopt IT tools and techniques to protect their data, they must also be attuned to the trade-offs involved and adjust their strategies accordingly.

Encrypting data — both in transit and at rest — means that any hacker who manages to intercept it will have to find a way to break the encryption before they can use it, noted Carmen Taglienti, principal cloud and AI architect at IT solutions provider Insight.

This is a useful defensive measure, but one that existing agency security approaches may not have considered.

Godsey said agencies’ cyber strategies often assume they can monitor data entering and leaving the organization, which helps alert them to inappropriate data exfiltration indicating a breach. But that visibility is harder to achieve when agencies encrypt data or offload data storage to third-party clouds, forcing them to update their data security designs.

“The other thing that is problematic is that a lot of the tools we use from a cybersecurity perspective assume you have insight into, and can see, the data that comes into your environment and leaves the organization, whether on-premises or in the cloud,” Godsey said. “Our inspection and ability to use tools to see if data is inappropriately leaving our environment may be inhibited by that encryption…and then, with that data stored in third-party cloud repositories, we may not have the same ability to apply the same security tools and controls.”

The process of encrypting data to protect it and then decrypting it for use takes time — one reason why entities don’t just encrypt everything, Taglienti said. Agencies must decide which level of encryption best suits their needs.
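One way to act on that trade-off is to encrypt selectively, based on the classification work described earlier. The sketch below shows the decision logic only: the `encrypt()` stub stands in for a real cipher such as AES-GCM from a vetted library, and the field names and sensitivity tags are hypothetical.

```python
# Classification-driven encryption: only fields tagged at or above a
# chosen sensitivity threshold get encrypted, trading CPU time for
# protection where it matters most.

SENSITIVITY = {"name": 1, "address": 2, "ssn": 3}  # illustrative tags

def encrypt(value: str) -> str:
    # Placeholder only -- a real system would call a vetted crypto
    # library here, never roll its own cipher.
    return f"<encrypted:{len(value)}B>"

def protect_record(record: dict, threshold: int = 2) -> dict:
    """Encrypt only fields whose sensitivity meets the threshold;
    leave lower-sensitivity fields in the clear to save processing."""
    return {
        field: encrypt(value) if SENSITIVITY.get(field, 0) >= threshold else value
        for field, value in record.items()
    }

record = {"name": "Jane Doe", "address": "1 Main St", "ssn": "000-00-0000"}
protected = protect_record(record)
print(protected["name"])  # left in the clear: 'Jane Doe'
```

Raising or lowering `threshold` is one concrete form of the “which level of encryption suits our needs” decision Taglienti described.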

They will also need to make sure they can keep their encryption keys safe, and Taglienti recommended rotating those keys regularly to limit what any hacker who obtains a key can do with it.
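A minimal sketch of that rotation discipline is an age check that flags keys past their rotation interval. The key IDs and the 90-day interval below are illustrative assumptions, not a recommendation from the panel.

```python
from datetime import datetime, timedelta, timezone

# Flag any encryption key older than the rotation interval so a stolen
# key has a bounded useful lifetime. 90 days is an example policy.
ROTATION_INTERVAL = timedelta(days=90)

def keys_due_for_rotation(keys: dict, now: datetime) -> list:
    """keys maps key_id -> creation timestamp; return the stale ones."""
    return [kid for kid, created in keys.items()
            if now - created > ROTATION_INTERVAL]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
keys = {
    "db-key-1":  datetime(2024, 1, 15, tzinfo=timezone.utc),  # stale
    "api-key-7": datetime(2024, 5, 1, tzinfo=timezone.utc),   # fresh
}
print(keys_due_for_rotation(keys, now))  # ['db-key-1']
```

In production this check would typically be handled by a key management service, but the principle is the same: old keys get retired on a schedule, so a compromised key’s window of usefulness is bounded.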

ACCESS AND AUDITS

To further reduce the risk of a breach, agencies can carefully control who gets permission to access data in the first place. This means first ensuring that any person – or device – seeking access is authenticated and that only users with legitimate needs are approved.

Entities can grant permissions based on user roles, or they can grant them based on attributes of the user, the resource and the context. The latter, attribute-based approach is more “refined” and increasingly popular, Taglienti said.
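The contrast between the two approaches can be sketched in a few lines. In the hypothetical example below, a role-based check grants access by job role alone, while an attribute-based check also evaluates properties of the user and the record; all names, departments and attributes are illustrative.

```python
# Role-based access control (RBAC): permission follows the job role.
ROLE_PERMISSIONS = {"clerk": {"read"}, "admin": {"read", "write"}}

def rbac_allows(role: str, action: str) -> bool:
    return action in ROLE_PERMISSIONS.get(role, set())

# Attribute-based access control (ABAC): finer-grained -- here a clerk
# may read only records from their own department, and only while on
# the agency network.
def abac_allows(user: dict, resource: dict, action: str) -> bool:
    return (
        rbac_allows(user["role"], action)
        and user["department"] == resource["department"]
        and user["on_agency_network"]
    )

user = {"role": "clerk", "department": "elections", "on_agency_network": True}
print(rbac_allows("clerk", "read"))                             # True
print(abac_allows(user, {"department": "elections"}, "read"))   # True
print(abac_allows(user, {"department": "finance"}, "read"))     # False
```

The role-based check would have approved all three requests; the attribute-based check denies the last one because the record belongs to another department, which is the kind of refinement Taglienti pointed to.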

Putting all of this advice into practice can be tricky, and Godsey said agencies should start by evaluating their current security postures to see where they could improve. Lamb also urged agencies to conduct regular audits to check how well they continue to follow their strategies and policies.

Maricopa County, for example, conducts a self-assessment at least twice a year and every few years engages a third party to conduct an assessment. The latter is useful on a practical and “organizational” level, bringing an independent, outside perspective to the situation, Godsey said.

Regular audits also help agencies ensure they keep their approaches up-to-date as technologies evolve.

Trying to bolster defenses can seem daunting, and organizations can only hope to reduce the likelihood and damage of breaches, not prevent them altogether, Godsey said. Still, he urged agencies to recognize that any improvements they are able to make, even small ones, put them on a better footing.

“Attack what you can,” he said.