Early adopters merge and analyze their massive geospatial and time series data in real time to support new application areas.
Real-time decisions informed by smart sensors and IoT data are increasingly critical across many industries and applications. However, managing and analyzing this data requires new approaches, because more and more of it is streaming data that carries both time and location information.
RTInsights recently sat down with Chad Meley, CMO at Kinetica, to talk about the explosion of this type of data, the challenges companies face in analyzing it, the benefits that come when it is leveraged correctly, and how Kinetica can help. Here is a summary of our conversation:
RTInsights: Where does the wealth of real-time geospatial data come from?
Meley: The fastest growing type of data this decade is real-time geospatial data. Specifically, we are talking about all data with an x and y coordinate that changes over time. The best way to think of it is as an evolution of traditional IoT data. The first IoT data came from sensors that took a reading like temperature or vibration, from a thing like a wheel bearing or a cylinder, over time. This way, analysts could detect anomalies and make the necessary adjustments. Nowadays, these sensors can also broadcast their position. When location is tagged on data, it creates more value and use cases.
Real-time geospatial data is proliferating because prices have dropped dramatically on the technology that generates it. The cost of location-aware chips for cellular connectivity has declined steadily since 2017. The expansion of 5G networks is contributing to the collection of far greater volumes of geospatial data. Bluetooth beacons with integrated energy harvesting, such as Apple’s AirTag, are falling in price.
Then couple that with the growth of satellite imagery and closed-circuit cameras. Both capture real-time video that is inherently geospatial, since objects in the imagery can be analyzed and plotted on a 2D or 3D map. Satellite launch costs have fallen sharply over the past decade on a per-kilogram basis, which means many more data-collection satellites will launch in the next few years. Additionally, it is estimated that one billion surveillance cameras are in use worldwide, both in fixed locations, such as a store or city street, and in motion, such as drone footage.
RTInsights: Why is real-time location-based analytics so popular?
Meley: When I talk to people in the industry, you can sense the excitement and interest in real-time location intelligence. I think it’s because people see how disruptive and valuable this space is. It’s also pretty cool as a developer to work with real-time spatial data, because you can build apps that feel fresh and go beyond the typical dashboards and batch recommendations we were used to with the first generation of big data.
For example, yesterday I gave a presentation to Merv Adrian at Gartner. Merv covers databases, Spark, NoSQL, and other adjacent technologies at Gartner, and he appears on most lists of top influencers for data science and big data. During the presentation, I shared a recent example of how one of our EV customers solved a previously unsolvable problem using real-time vehicle location data under a variety of routing constraints. He stopped me and said he gets vendor briefings all the time, and the use cases are basically the same ones the industry has been talking about for years. In this case, however, the customer was using the data in a new way and driving the new economy forward.
Over the past five years, companies have been focused on re-platforming for the cloud, which was badly needed, but now there’s an appetite to accelerate innovation through data analytics. The explosion of real-time location data, and breakthroughs in technologies that let organizations expand the collection, analysis, and operationalization of this new form of data, make this an area ripe for innovation.
RTInsights: What are some of the challenges companies face when trying to perform or support such analysis?
Meley: I see many parallels to the challenges companies faced in managing “big data” at the turn of the previous decade. The era of big data marked the shift from managing and analyzing structured data from transaction systems to semi-structured data from web logs. The way data was collected, transformed, and analyzed had to be redesigned. There was a period in the beginning where companies either used their old technologies and methods, which ultimately failed, or did nothing, incurring opportunity costs. Emerging leaders were able to identify appropriate new technologies and establish new best practices to harness and extract value from this new form of data.
The same is true as we move from web logs that capture web interactions to the next generation of IoT data that captures observations from sensors and cameras. Old technologies and methods must once again be reconsidered. For example, computing a join between two sets of spatial data by calculating overlapping polygons will cripple a traditional data warehouse or data lake. Add a time-series function on top of that, and chances are the query will never come back. Additionally, most high-value use cases involve real-time decision making, and data warehouses and data lakes were simply not designed to solve complex problems at real-time latencies.
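To make that cost concrete, here is a minimal, hypothetical sketch in plain Python (no database engine; all names and sample data are invented for illustration) of what a spatial join implies: every timestamped point must be tested against every polygon, a pairwise computation that engines without spatial indexing cannot avoid.

```python
from datetime import datetime

def point_in_polygon(x, y, polygon):
    """Ray-casting test: count how many polygon edges a horizontal
    ray from (x, y) crosses; an odd count means the point is inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def spatial_join(points, zones):
    """Naive spatial join: every (timestamped) point against every zone.
    Cost is O(points x zones x vertices) -- the blowup that cripples
    engines lacking spatial indexes."""
    matches = []
    for ts, x, y, sensor_id in points:
        for zone_name, polygon in zones:
            if point_in_polygon(x, y, polygon):
                matches.append((ts, sensor_id, zone_name))
    return matches

# Hypothetical sample data: two geofence zones and a few sensor pings.
zones = [
    ("warehouse", [(0, 0), (4, 0), (4, 4), (0, 4)]),
    ("dock",      [(5, 0), (8, 0), (8, 2), (5, 2)]),
]
points = [
    (datetime(2023, 1, 1, 9, 0), 1.0, 1.0, "truck-7"),
    (datetime(2023, 1, 1, 9, 5), 6.0, 1.0, "truck-7"),
    (datetime(2023, 1, 1, 9, 9), 9.0, 9.0, "truck-7"),  # outside both zones
]

for ts, sensor, zone in spatial_join(points, zones):
    print(ts.isoformat(), sensor, zone)
```

Real engines avoid the all-pairs scan with spatial indexes and vectorized geometry kernels; the point of the sketch is only to show why naively layering polygon overlap tests and time-series logic onto a row-at-a-time query plan does not scale.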
RTInsights: How does Kinetica help?
Meley: Kinetica is designed to merge huge geospatial and time-series datasets and run complex spatio-temporal analytics in real time. Many of our customers refer to their Kinetica implementation as the “speed layer.” It fills a critical gap between traditional data warehouses and data lakes, which are batch-oriented and optimized for transaction and log data, and streaming tools like Kafka, Confluent, and Kinesis, which are real-time but unable to run advanced analytics. Kinetica is available as a service, and everything from connecting to Kafka queues to calling powerful spatial and time-series functions is done through simple SQL, resulting in incredible time to value.
Kinetica’s origins were in tracking terrorists in 2009 for the NSA by merging data from satellites and the full corpus of cellphones, email, social media and other sources. The original Kinetica engineers had to develop a completely different approach to database architecture to achieve the mission goal using matrix calculations. Now that real-time spatial data is proliferating in commercial sectors, Kinetica is a proven, hardened database that accelerates the adoption of real-time location intelligence.
RTInsights: What are Kinetica’s real-time location-based client use cases?
Meley: One of the largest package delivery organizations in the world uses Kinetica to track and trace shipments across its network and continuously re-optimize routes for on-time delivery while simultaneously reducing fuel costs.
Several government agencies use Kinetica to scan objects in our airspace to detect risks and keep us safe. A major automaker uses Kinetica in its connected car program to recommend optimal routes based on EV charging station stops.
A leading health care provider uses Kinetica to monitor customers’ vital signs, such as heart rate and stress levels, and make personalized wellness recommendations.
Several green energy start-ups use Kinetica to monitor and arbitrage electricity through smart meters. And a major insurance company now uses Kinetica to combine real-time weather events with policyholder property locations, taking action to mitigate potential losses and settling claims faster than the competition.
Real-time location data exists in many industries and enables all kinds of new high-impact results.
To learn more about real-time location intelligence, visit Kinetica.com.