
New Techniques to Streamline the Analysis of Large Waveform Databases

Many electronic devices and systems perform essential functions that must execute flawlessly over long periods of time. For example, electronic power grids, telecom systems, and implanted medical devices cannot afford errors that occur even once out of millions of events.

For obvious reasons, the ability to capture and isolate extremely rare anomalies is the key challenge in ensuring this level of reliability. Monitoring voltage is not effective at identifying subtle device or system issues, since voltage is typically regulated so tightly that minute variations are difficult to detect. In contrast, current waveforms contain much richer information about device or system operation. However, since current waveforms can fluctuate quickly over wide dynamic ranges, they must be sampled at a high rate to capture their full bandwidth. This can generate huge data files: capturing data at 10 megasamples per second (MSa/s) over a 24-hour period creates a data file larger than 1 terabyte. Sifting through such a massive database to locate anomalous events is obviously a daunting task.
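To put that number in perspective, a quick back-of-the-envelope calculation shows how fast the file grows. This is only a sketch; the sample width is an assumption, since the article does not state it:

```python
# Back-of-the-envelope data volume for continuous high-rate capture.
# Assumption (not stated above): 2 bytes (16 bits) per sample.
sample_rate = 10e6            # samples per second (10 MSa/s)
duration = 24 * 3600          # one day, in seconds
bytes_per_sample = 2          # assumed 16-bit samples

total_bytes = sample_rate * duration * bytes_per_sample
print(f"{total_bytes / 1e12:.2f} TB per day")    # ~1.73 TB
```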

Until recently, no solutions were able to meet the hardware requirements just described. Data loggers can capture large amounts of data, but their relatively low bandwidth means they can easily miss high-frequency signal components. Conventional oscilloscopes are good at capturing high-bandwidth signals, but they have limited data storage capability. Even high-performance oscilloscopes with large memory depths cannot capture data at a high sampling rate over time periods of hours or days, and oscilloscope current probes do not have enough dynamic range to capture both low-level and high-level currents. Finally, neither of these hardware solutions supports any efficient means of analyzing the data it collects and identifying abnormalities quickly. This becomes a big data analysis problem.

One way to handle these big data challenges is machine learning. The first technique we explored was deep learning neural networks (DLNNs), which have been very successful in image and voice recognition. Unfortunately, DLNNs performed only marginally well when applied to waveform database analysis, while also requiring significant computing power. To analyze large waveform databases, Keysight researchers had to develop new machine learning techniques optimized for that purpose. The resulting solution, developed over a period of five years, incorporates clustering, unsupervised machine learning, and proprietary database compression techniques. It can analyze terabyte-sized waveform databases orders of magnitude faster than conventional techniques while running on a PC-based benchtop instrument.

Figure 1: System architecture of the long-duration waveform analytics software.

Figure 1 shows the system architecture of the long-duration waveform analytics software. It consists of three components, which we will discuss in turn.

The acquisition subsystem pre-sorts incoming data in real time during acquisition. Real-time tagging is the most important module in the acquisition subsystem: similar waveform segments are grouped together and registered as members of a tag. It is important to note that the pre-sorting does not have to be perfect; it just needs to retain enough information to enable post-processing analysis.
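Keysight's tagging method itself is proprietary and unpublished, but the general idea of threshold-based pre-sorting can be sketched as follows. This is a minimal illustration only; the normalized-RMS distance measure and the 5% threshold are assumptions, not the actual algorithm:

```python
import numpy as np

def assign_tag(segment, tags, threshold=0.05):
    """Assign an incoming waveform segment to the first tag whose
    exemplar is similar enough, or open a new tag.  A minimal sketch:
    the distance measure and threshold are assumptions."""
    for tag in tags:
        # Normalized RMS distance between the segment and the exemplar.
        dist = np.sqrt(np.mean((segment - tag["exemplar"]) ** 2))
        scale = np.sqrt(np.mean(tag["exemplar"] ** 2)) + 1e-12
        if dist / scale < threshold:
            tag["count"] += 1          # segment registered as a member
            return tag
    new_tag = {"exemplar": segment.copy(), "count": 1}
    tags.append(new_tag)
    return new_tag
```

Note that a coarse grouping like this is enough: as the text says, imperfect pre-sorting is acceptable as long as the tags preserve enough information for later analysis.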

The database subsystem consists of the tag database and the lossless database. The tag database is a concise summary of the pre-sorted waveform segments and provides a quick overview of the long-duration recording. The lossless database is a full archive of the complete long-duration waveform record; it allows any waveform in the huge database to be queried quickly, by time or by waveform similarity. The tag database is between one one-hundredth and one five-hundredth the size of the lossless database. This configuration permits great flexibility with regard to data management and analytics.
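The article does not publish the database schema, but conceptually each tag entry needs little more than an exemplar waveform, a population count, and pointers back into the lossless archive. A hypothetical record, with all field names assumed, might look like this:

```python
from dataclasses import dataclass, field
from typing import List

import numpy as np

@dataclass
class TagRecord:
    """Hypothetical tag database entry; all field names are assumptions."""
    tag_id: int
    exemplar: np.ndarray                 # representative waveform segment
    count: int = 0                       # population of this tag
    first_seen: float = 0.0              # timestamp of first occurrence (s)
    offsets: List[int] = field(default_factory=list)  # positions in the
                                         # lossless archive, for on-demand
                                         # retrieval of raw segments
```

Because each record summarizes many raw segments, a few thousand entries can stand in for millions of segments, which is consistent with the 100:1 to 500:1 size ratio quoted above.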

The analysis subsystem has two modes of operation: quick clustering and detail clustering. Quick clustering provides a fast overview of the entire database, with a typical computing time of less than one second. However, since quick clustering uses the pre-sorted tag information, its accuracy is limited by the tagging similarity threshold. Detail clustering offers more precise analysis because it uses the lossless database. Conventional analytics software often needed to re-scan the lossless database, which could take many hours; with this solution, users get fast, interactive analytics without re-scanning the lossless database.
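To make the speed difference concrete, here is a rough sketch of the two modes, building on the hypothetical tag structures above (tags are assumed to carry an exemplar and a list of archive offsets, and fetch_segment is a hypothetical archive API, not a documented one):

```python
import numpy as np

def quick_cluster(tags, threshold=0.1):
    """Greedy single-pass grouping of tag exemplars.  Because it touches
    only the small tag database (thousands of records, not millions of
    raw segments), it completes in well under a second.  Illustrative
    only -- not Keysight's proprietary algorithm."""
    clusters = []
    for tag in tags:
        for cluster in clusters:
            rep = cluster["representative"]
            if np.linalg.norm(tag["exemplar"] - rep) < threshold * np.linalg.norm(rep):
                cluster["members"].append(tag)
                break
        else:
            clusters.append({"representative": tag["exemplar"], "members": [tag]})
    return clusters

def detail_cluster(cluster, lossless_db):
    """Refine one cluster on raw data, fetching only the segments that
    belong to it (fetch_segment is a hypothetical archive API) instead
    of re-scanning the entire lossless database."""
    return [lossless_db.fetch_segment(offset)
            for tag in cluster["members"]
            for offset in tag["offsets"]]
```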

It is worth pointing out that this technology is new not only to the test and measurement industry but also to the AI/machine-learning community. Keysight presented a paper on this new solution at the IEEE Big Data 2019 conference (*1). At the conference, researchers stated they had never seen anything like the performance and capabilities of this solution. The technology is integrated into Keysight’s CX3300A Dynamic Current Waveform Analyzer as an available option, combining high-integrity voltage and current measurement with long-duration waveform analytics.

The following example shows commercial power line voltage monitored over a period of four days at a sample rate of 1 MSa/s. Different waveform types are grouped by cluster, with their populations displayed in the clustering panel. You can select one or more clusters and jump to their occurrences in the main playback window using the arrow keys. Although the database contained over 18 million waveform segments, data tagging allowed anomalies to be identified in a matter of seconds. For example, the screen capture below shows that significant over-voltage occurred 2 days and 21 hours into the datalog. While interesting, this case is rather simple, so let us look at a more challenging example.

Figure 2: Over-voltage anomalies detected on commercial power line voltage.

IoT devices need to operate for long hours, and any unexpected current spike can cause an internal IR drop and trigger a system malfunction. To verify device integrity, we measured a Bluetooth device’s supply current for 17 hours at a sampling rate of 10 MSa/s, generating a one-terabyte database file. Although the normal peak current is around 25 mA, we found very rare current spikes as large as 50 mA. These occurred only 17 times out of the more than 7 million waveform segments recorded. Further analysis showed that, in this device, there are two types of asynchronous events; the 50 mA spikes are observed when those two events occur within a narrow timing window, which happens only about once in every 400,000 occurrences. This type of detailed analysis can only be achieved using the CX3300A’s dynamic current measurement capabilities along with its data logging and long-duration waveform analytics options.
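Those figures are internally consistent, as a quick check using only the numbers quoted above shows:

```python
# Sanity check of the quoted rates, using only the figures above.
spikes = 17
segments = 7_000_000                     # "over 7 million" segments
print(f"one spike per {segments / spikes:,.0f} segments")   # ~1 per 411,765
# ...which matches the quoted coincidence rate of roughly once per 400,000.
```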

Figure 3: Large spike waveforms, occurring 17 times out of over 7 million waveform segments, identified within 5 minutes on an IoT device.

As the complexity of modern devices and systems continues to increase, the software tools used to evaluate them need to improve to keep pace. In cases where devices are used in mission-critical systems, it is important to understand the behavior of waveforms over long time periods, and the software used to capture the data also needs to help analyze it. This article has shown that, by utilizing new machine learning techniques developed by Keysight, it is possible to analyze large waveform databases efficiently and pinpoint anomalies in those databases quickly.

About the Authors:

Alan Wadsworth

Alan is currently the business development manager for Keysight Technologies’ precision and power products. He is the author of Keysight’s parametric measurement handbook and has over 30 years’ experience in design and test. Alan holds bachelor’s and master’s degrees in electrical engineering from the Massachusetts Institute of Technology and an MBA from Santa Clara University.

Masaharu Goto

Masaharu Goto is a Principal Research Engineer at Keysight Technologies. He co-developed the ROOT/CINT scientific data analysis framework (https://root.cern.ch) for CERN’s LHC (Large Hadron Collider) experiment, one of the world’s first big data projects, and provided its C++ interpreter (https://root.cern.ch/cint) for seamlessly connecting interactive big data exploration with high-performance computing. At Keysight, he spearheaded the research and development of various test and measurement systems for big measurement data environments. These systems enable massive parametric measurements for the most advanced semiconductor research and high-volume production. His current research combines big data analytics with real-time data processing for various test and measurement applications. His research interests include measurement science, big data, data mining, and the broader frontiers of computer science.
