By 2025, organizations will churn out 175 zettabytes of data, IDC predicts. However, bottlenecks and compute problems continue to plague IT professionals as they struggle to support their growing edge workloads, according to a survey of more than 300 storage professionals by NGD Systems.
In the study, entitled The State of Storage and Edge Computing and conducted by Dimensional Research, barely one in 10 respondents gave themselves an “A” grade for their compute and storage capabilities.
The study, designed to understand the adoption and challenges of edge-related workloads, discovered that while enterprises are rapidly deploying technologies for real-time analytics, machine learning and IoT, they are still utilizing legacy storage solutions that are not designed for such data-intensive workloads. In fact, 54 percent of respondents said their processing of edge applications is a bottleneck, and they desire faster and more intelligent storage solutions.
Edge platforms are driving the need for more localized compute capability, and 60 percent of storage professionals reported they are using NVMe SSDs to speed up the processing of large data sets; this alone, however, has not met their needs.
As AI and other data-intensive deployments increase, data needs to be moved over increasingly longer distances, which causes network bottlenecks and delays analytic results.
These issues have prompted professionals to look to new computational models with a lower power footprint and to storage devices with smaller form factors, which is why 89 percent of respondents said they expect real value from computational storage.
“We were not surprised to find that while more than half of respondents are actively using edge computing, more than 70 percent are using legacy GPUs, which will not reduce the network bandwidth, power and footprint necessary to analyze mass data-sets in real time,” said Nader Salessi, CEO and founder of NGD Systems.
“Computational Storage provides an innovative solution to today’s architecture, in which compute moves closer to where data is generated, rather than the data being moved up to compute.
“This is why computational storage is ideal for any organization deploying edge computing as its new model; it makes it possible to process data right where it’s created and needed, speeding up the time to analyze petabytes of data.”
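The data-movement argument in the quote can be illustrated with a toy model: in the traditional pattern every byte crosses the bus to the host before filtering, while a computational storage device filters in place and ships only the matches. This is a minimal sketch for illustration only; the class and method names are hypothetical, not any vendor's API.

```python
# Toy comparison: "move data to compute" vs. "move compute to data".
# All names here are hypothetical, for illustration only.

class Drive:
    """A conventional drive: the host must read every record over the bus."""
    def __init__(self, records):
        self.records = records
        self.bytes_transferred = 0  # bytes that cross the host interface

    def read_all(self):
        for rec in self.records:
            self.bytes_transferred += len(rec)
        return list(self.records)


class ComputationalDrive(Drive):
    """A drive with on-board compute: filtering runs in situ,
    so only matching records cross the bus."""
    def query(self, predicate):
        matches = [rec for rec in self.records if predicate(rec)]
        for rec in matches:
            self.bytes_transferred += len(rec)
        return matches


# Simulated sensor log: 1 alert per 100 routine readings.
records = [f"sensor,{i},{'ALERT' if i % 100 == 0 else 'ok'}" for i in range(1000)]

plain = Drive(records)
hits_host = [r for r in plain.read_all() if "ALERT" in r]  # host-side filter

smart = ComputationalDrive(records)
hits_drive = smart.query(lambda r: "ALERT" in r)  # in-situ filter

assert hits_host == hits_drive  # same answer, far less data moved
print(plain.bytes_transferred, smart.bytes_transferred)
```

The point of the sketch is that both approaches return identical results, but the in-situ filter moves only the matching records across the interface, which is the bandwidth saving the quote describes.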
The survey revealed that with the rapid growth of edge computing, compute cost and speed are still major challenges, as more organizations look for faster and more intelligent storage solutions:
– 55 percent of respondents are using edge computing
– 71 percent of respondents are using edge computing for real-time analytics
– 61 percent of respondents say the cost of traditional storage solutions continues to plague their applications
– 57 percent said faster access to storage would improve their compute abilities
The study also found that respondents are using the industry-standard NVMe interface to speed up data processing:
– 86 percent expect storage’s future to rely on NVMe SSDs
– 60 percent are using NVMe SSDs in their work environments
– 63 percent said NVMe SSDs helped with superior storage speed
– 67 percent reported budget and cost as issues preventing the use of NVMe SSDs
Additional findings include:
– 46 percent of respondents say they do not have the infrastructure in place to do proper compute at the data-generation site
– 70 percent of respondents said they are using GPUs to help improve workload performance
– 73 percent of respondents want lower power consumption
– 81 percent said they desire a smaller footprint