As online presences grow, social media continues to explode, and more and more people interact online, the universe of external datasets is growing exponentially. Gone are the days of consulting an organization’s almanac; today, a few clicks can place historical weather records for a specific location at your fingertips. And that’s just the tip of the iceberg, as billions of data points are collected with every post, click, or even scroll. Companies have used this data to increase sales and improve customer experience, but the intelligence community has struggled to connect the dots and cut through the noise to provide its employees with sophisticated tools to analyze this data.

This external intelligence data is not (yet) subject to common standards, and finding the right tools to use it both effectively and efficiently is currently an uphill battle, as many agencies have made only small, isolated investments in using this data to better understand the needs of their customers. The benefits of using data to educate and inform decisions are endless, but the findings are only accessible to a select few who can overcome the challenges to get there.

The Intelligence Community (IC) has stepped up its data practices. Across its 18 different intelligence agencies, data managers have worked to improve data lifecycle management, and analysts are embedded within each agency and at different levels. Agencies are striving for more accurate data while maintaining strong security and overall governance.

However, this is not enough. About 80-90% of the data used by these agencies is still unstructured, making the job of government analysts increasingly difficult. Agencies invest in isolated data management solutions and end up with a rudimentary external data system that doesn’t track data consumption, frequency of use, or accessibility. Determining what data matters to shape future security products, policies, and regulations is currently siloed. So how do federal agencies improve their third-party data integration game to reap the benefits? By focusing on four specific areas: interoperability, accessibility, technology, and skilled labor.

Government agencies are not unique in their struggle with siloed teams and departments, as many organizations struggle to navigate this common division caused by growth. Over the years, agencies have released strategic roadmaps focused on how intelligence professionals and analysts incorporate big data into their work. One example is the 2016 Data Center Optimization Initiative, which aimed to improve the efficiency of the remaining data centers. Every year since, agencies have faced an uphill battle trying to successfully consolidate resources when time and money aren’t always on their side.

Focusing on data interoperability across silos and providing a centralized data management system can quickly put all agencies on the same page about what data exists, in what format, and in which dataset the information lives. Additionally, providing access to the same data and monitoring its usage by agency, department, role, and use case can provide another layer of critical information that the IC can analyze to drive product and policy.
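To make that idea concrete, here is a minimal sketch, in Python, of what a shared catalog entry that records a dataset’s format and tracks usage by agency, department, role, and use case might look like. The names here (CatalogEntry, UsageEvent, the example dataset and storage location) are hypothetical illustrations of the concept, not references to any actual government system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from collections import Counter

@dataclass
class UsageEvent:
    """A single access to a dataset, recorded for later analysis."""
    agency: str
    department: str
    role: str
    use_case: str
    accessed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class CatalogEntry:
    """One dataset in a shared, cross-agency catalog (illustrative only)."""
    name: str
    data_format: str                 # e.g. "CSV", "JSON", "Parquet"
    location: str                    # where the data lives
    usage_log: list = field(default_factory=list)

    def record_access(self, agency: str, department: str, role: str, use_case: str) -> None:
        """Log who consumed the dataset and why."""
        self.usage_log.append(UsageEvent(agency, department, role, use_case))

    def usage_by(self, attribute: str) -> Counter:
        """Count accesses grouped by 'agency', 'department', 'role', or 'use_case'."""
        return Counter(getattr(event, attribute) for event in self.usage_log)

# Example: two agencies consuming the same external weather dataset.
entry = CatalogEntry(name="historical-weather", data_format="CSV",
                     location="s3://shared-catalog/weather/")  # hypothetical location
entry.record_access("Agency A", "Analysis", "analyst", "threat assessment")
entry.record_access("Agency B", "Operations", "data scientist", "logistics planning")
print(entry.usage_by("agency"))  # e.g. Counter({'Agency A': 1, 'Agency B': 1})
```

A shared record like this is what lets every agency see the same answer to “what exists, in what format, and who is using it,” rather than each agency maintaining its own partial inventory.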

Removing silos also reduces costs. Implementing an external data pipeline comes at a price, and paying that same cost over and over again in each agency is a waste of funds that could be better spent on other imperative projects. Accessibility, technology, and skilled labor can be handled individually by each agency, but interoperability is the critical change needed to avoid redundancy.

External accessibility is also a new avenue the IC could explore. These agencies are in a unique position: they are data consumers, but they also have the opportunity to become data providers. Making data collection transparent to individuals and allowing businesses to access it opens up a nearly endless wealth of data points that can be used to make more informed decisions.

Matt Conner, Chief Information Security Officer for the Office of the Director of National Intelligence, said, “The cybersecurity apparatus is still focused on a traditional definition of systems – you know, full stack, storage, compute, processing – all in one appliance. I think we don’t talk enough about data: data integrity, data security.”

Not only do these agencies need to update their tech stack, but they also need to make sure everything is digitally accessible, especially if they’re going to deliver data to the masses.

Of course, none of these incremental changes are even possible without the highly trained experts needed to do the job. Speed is a necessity to land this talent, yet agencies often move slowly through their paperwork to ensure that regulations are followed to the letter.

To combat this, federal agencies should redirect their focus to their existing employees by offering retraining incentives and training. This is especially necessary if an agency wishes to build its data pipeline from scratch, as labor costs are high initially and then decline – but do not disappear, and are often underestimated – over the remaining life of the data platform.

Interoperability, accessibility, technology, and skilled labor are four critical areas for improving the intelligence community’s data pipeline. By taking each initiative step-by-step and genuinely focusing on fine-tuning the details, the IC can become a major player in the consumption and delivery of third-party data.

Will Freiberg is CEO of Crux.