In our business engagements we often ask managers and IT personnel what would improve if their organizations increased investment in technologies that support delivery of true real-time data, including data streams. Most indicate that actionable information in dashboards would improve, and many go on to say that operational decisions and management would get better as well.
The Pursuit of Real Time – Options for Faster Data
Performance management, frequently one of the main drivers behind dashboard development, is another area where many organizations expect to see improvement from faster data and analytics. So is communication of business strategy, which organizations implementing performance management convey through dashboards, KPIs, and other metrics.
At RStor we see that when organizations invest in technologies and services for faster data and analytics, the leading objective is to improve information for managers tasked with increasing operational efficiency and effectiveness. Many of the companies we speak with say process execution would improve with this investment, which shows that organizations see the value of not only faster but smarter business processes. A significant contingent say situational awareness and alerting are outcomes their organizations want from the investment; these are often key goals behind deployment of streaming data management and real-time analytics.
Use cases for streaming technologies draw broad interest, and streaming data can give organizations new insights into how to solve problems. Many say their organizations would focus investment in faster data and analytics on improving predictive maintenance, while others indicate that IT systems and network management would be a priority. With many organizations now able to tap IoT sensor data from machines and other equipment, they need analytics that can explore this data sooner for trends and patterns that could indicate situations such as an imminent failure.
Sensor data is revolutionizing how organizations analyze maintenance problems. Using time-series analysis that combines real-time data exploration with historical data and other information records, organizations can observe changes in sensor data over time and gain perspectives they could not get from historical data alone. A large segment of organizations want to use faster data to improve exactly this kind of time-series analysis.
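To make the idea concrete, here is a minimal sketch of that kind of time-series analysis: live sensor readings are compared in a rolling window against a historical baseline. The window size, threshold, and readings are illustrative assumptions, not a description of any particular product.

```python
from collections import deque
from statistics import mean

# Hypothetical sketch: flag sensor readings whose rolling average
# drifts away from a historical baseline. Window size and the
# z-style threshold are illustrative assumptions.

def drift_alerts(readings, baseline_mean, baseline_std, window=5, z=3.0):
    """Yield (index, value) for windows whose mean drifts beyond
    z baseline standard deviations from the historical mean."""
    win = deque(maxlen=window)
    for i, value in enumerate(readings):
        win.append(value)
        if len(win) == window and abs(mean(win) - baseline_mean) > z * baseline_std:
            yield i, value

# Example: vibration readings trending upward toward failure.
history_mean, history_std = 10.0, 0.5
stream = [10.1, 9.9, 10.0, 10.2, 11.9, 12.4, 13.0, 13.8, 14.5]
alerts = list(drift_alerts(stream, history_mean, history_std))
```

A real deployment would compute the baseline from the full historical record and run the check continuously over the stream; the structure of the comparison is the same.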
This modernized analysis can result in smarter maintenance. Rather than use traditional fixed-schedule maintenance, which can either overlook serious problems or apply maintenance when it is not needed, organizations can monitor conditions to see when maintenance is actually needed. They can develop predictive models based on all relevant data rather than assumptions based on a smaller selection of historical data and other records. Similar efficiency could be brought to areas such as risk management and fraud detection.
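The shift from fixed schedules to condition-based maintenance can be sketched as a simple scoring rule over monitored conditions. The feature names, weights, and threshold below are invented for illustration, standing in for a model an organization would train on its own historical and real-time data.

```python
# Hypothetical condition-based maintenance check. Weights, healthy
# operating ranges, and the alert threshold are assumptions, not a
# trained model.

def maintenance_score(temp_c, vibration_mm_s, hours_since_service):
    # Score each condition against an assumed healthy operating range.
    return (
        0.4 * max(0.0, (temp_c - 70.0) / 30.0)          # overheating
        + 0.4 * max(0.0, (vibration_mm_s - 2.0) / 3.0)  # excess vibration
        + 0.2 * min(1.0, hours_since_service / 5000.0)  # wear over time
    )

def needs_maintenance(temp_c, vibration_mm_s, hours_since_service, threshold=0.5):
    return maintenance_score(temp_c, vibration_mm_s, hours_since_service) >= threshold

# A machine running hot and vibrating is flagged even though it was
# serviced recently; a fixed schedule would have waited.
flagged = needs_maintenance(temp_c=95.0, vibration_mm_s=4.5, hours_since_service=800)
healthy = needs_maintenance(temp_c=68.0, vibration_mm_s=1.5, hours_since_service=800)
```

The point of the sketch is the decision logic: maintenance is triggered by observed conditions rather than elapsed time, which avoids both missed problems and unnecessary service.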
These and other use cases require integrated analysis of real-time, streaming data and historical data. Organizations should evaluate solutions such as data virtualization that can provide combined views of streaming and historical data. Some data virtualization solutions, for example, can read data as it is streaming from edge devices through pipelines for comparison with historical data rather than having to wait for this data to be loaded into a target database.
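The "combined view" that data virtualization provides can be illustrated with a small sketch: each event is enriched as it arrives with historical context, rather than waiting for it to be loaded into a target database. The device names, schema, and baseline figures are made up for the example.

```python
# Minimal sketch of a combined streaming/historical view. The
# historical averages stand in for values precomputed from a
# warehouse; device IDs and readings are invented.

historical_avg = {
    "pump-1": 10.2,
    "pump-2": 55.7,
}

def combined_view(stream):
    """Yield each live event alongside its historical baseline and delta."""
    for event in stream:
        baseline = historical_avg.get(event["device"])
        delta = None if baseline is None else round(event["value"] - baseline, 2)
        yield {**event, "baseline": baseline, "delta": delta}

live = [
    {"device": "pump-1", "value": 14.8},
    {"device": "pump-2", "value": 55.1},
]
enriched = list(combined_view(live))
```

A virtualization layer does this at scale across pipelines and sources, but the essential operation is the same join of in-flight data with stored context.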
Faster data drives analytics innovations for business benefits. Predictive maintenance based on IoT sensor data is a growing use case for analytics in operations, manufacturing, IT, and logistics. However, perhaps an even bigger trend is the use of streaming data to improve customer engagement and personalization, which is an area that most of the companies we speak with identify as an objective for improvement. Along with personalization, companies often want to see improvement in pattern detection in customer data. If organizations can analyze near-real-time or real-time data, they can respond to customers’ interests and concerns in the timeliest manner possible, which is a competitive advantage. Organizations can also gain insights into patterns that they would not find when analyzing only historical data.
AI Brings Faster Insights and Automated Decisions
AI techniques, including machine learning and natural language processing, are rapidly becoming part of all kinds of analytics, applications, and data management systems. AI is playing a growing role in enabling organizations to move faster to gain value from data. Until recently, only expert data scientists and developers could use AI techniques. Now many types of users, including consumers, may be using AI embedded in applications and services without knowing it.
AI can help organizations churn through volumes of data to help understand why something is happening, what could happen next, and what to do about it. In some cases, AI techniques are adding the intelligence to drive fast, automated decisions; in others, AI algorithms are surfacing data insights to augment information humans are using to make decisions.
RStor engagements reveal that the most prevalent use of AI is to automate discovery of actionable insights. Many organizations intend to set up algorithms and models that do not require regular human intervention, but with the end purpose of supplying personnel with insights that can improve daily decisions. A large portion want to use AI to enable faster analytics on large data volumes, which shows that organizations see AI as a solution for scaling up discovery and analytics on growing big data sources.
To augment human decisions, AI-derived insights can be delivered to decision makers in the form of recommendations, and a significant segment of companies say their organizations want to do exactly that. Some decision makers, however, do not necessarily want recommendations; they just want faster and more comprehensive data search and exploration. We also hear companies say that they want AI to help their users find, select, and use data for analysis.
Because of emerging requirements such as these, organizations should make a solution’s use of AI a key point in evaluating data integration products. They should examine how AI is applied to locate, access, and view new data faster. Some solutions apply AI programs that learn from changes in the data and its formats and adjust integration and preparation steps accordingly; this can reduce the need for manual reconfiguration of target databases or logical views, which slows down access and analysis. AI programs can also learn from users’ search, access, and viewing patterns to recommend related data sets.
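One simple way such a recommendation feature could work is co-occurrence learning: count which data sets users access together, then suggest the most frequent companions. The session contents and data set names below are invented for illustration; production tools would use richer signals.

```python
from collections import defaultdict
from itertools import combinations

# Illustrative sketch of usage-pattern-based data set recommendation.
# Sessions (sets of data sets accessed together) are invented.

sessions = [
    {"sales_2023", "customers", "regions"},
    {"sales_2023", "customers"},
    {"sales_2023", "regions"},
    {"inventory", "suppliers"},
]

# Count how often each pair of data sets is used in the same session.
co_occurs = defaultdict(int)
for session in sessions:
    for a, b in combinations(sorted(session), 2):
        co_occurs[(a, b)] += 1

def recommend(dataset, top_n=2):
    """Rank the data sets most often used alongside the given one."""
    scores = defaultdict(int)
    for (a, b), count in co_occurs.items():
        if a == dataset:
            scores[b] += count
        elif b == dataset:
            scores[a] += count
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

Here `recommend("sales_2023")` surfaces the customer and region data sets that analysts habitually pair with the sales data, without anyone curating that relationship by hand.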
Organizations see a role for AI in data governance, cataloging, and preparation. Along with helping users locate relevant data and data relationships, many see AI helping their organizations better govern, integrate, and manage the data. Organizations most often seek improvements in the following areas:
- Automating data classification for governance and security. Many companies anticipate that AI can help reduce the manual effort and inconsistency that plague data classification and make it hard to locate data for governance and security. Some organizations see AI addressing the general problem of taxonomy development.
- Improving and automating data preparation and enrichment. A significant percentage of organizations currently use or plan to use AI to streamline how data is collected, cleansed, transformed, and enriched for users.
- Developing and updating the data catalog or metadata repository. As mentioned earlier, collecting and consolidating knowledge about the data, its location, and its origins is frequently manual and incomplete; AI can automate that collection and keep metadata current.
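To ground the first item on the list, here is a toy sketch of automated data classification for governance: columns are tagged as sensitive based on name patterns and sample values. Real systems use trained models; the heuristics, patterns, and column names here are all assumptions.

```python
import re

# Toy heuristic classifier for governance tagging. The name hints and
# the email pattern are illustrative assumptions, not a real policy.

PII_NAME_HINTS = re.compile(r"(email|ssn|phone|birth|name)", re.IGNORECASE)
EMAIL_VALUE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def classify_column(name, sample_values):
    """Return a governance tag for a column from simple heuristics."""
    if PII_NAME_HINTS.search(name):
        return "pii"
    if any(EMAIL_VALUE.match(str(v)) for v in sample_values):
        return "pii"
    return "general"

tags = {
    "customer_email": classify_column("customer_email", ["a@b.com"]),
    "order_total": classify_column("order_total", [19.99, 5.00]),
    "contact": classify_column("contact", ["jane@corp.example"]),
}
```

Note that the `contact` column is caught by its values even though its name gives nothing away; that is the kind of consistency manual classification struggles to achieve at scale.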
When working with large data sets in the cloud, it is important to have that data close to users. That means working with a multi-cloud provider that can rapidly replicate large data sets across vast networks in near real time. When exploring your options, consider companies that can provide, and guarantee, rapid data movement and replication with low latency.
RStor provides the transport services and near-real-time, geo-dispersed data replication necessary for delivering rapid insights. For more information, contact RStor, where data experts are available to help.
Next week I will complete this five-part series with ten recommendations for achieving faster insights from faster data.