The acceleration of digital transformation, together with the steady adoption of consumer "smart" technology, is generating waves of data that threaten to overwhelm the systems and companies that increasingly rely on data to create value.
This new reality is pushing data processing out of its traditional epicenter in the data center to the public cloud and out to the edge. Inevitably, this too will prove inefficient, lacking the responsiveness that consumers demand and that real-time data processing for IoT and the growing use of AI for smart automation require. The end of Moore’s Law means CPU-centric infrastructure has reached its limits, no matter where it physically resides.
As a result, organizations are looking for more efficient, modern computing architectures that can support multi-tenancy and deliver applications at data center scale with all the necessary levels of performance and security.
To achieve those levels of performance and security, delivery and security services must take advantage of technologies capable of offloading, accelerating, and isolating security, storage, and networking processing:
- Offloading refers to moving infrastructure services from the host to a dedicated hardware resource.
- Accelerating means that this dedicated hardware resource is capable of boosting application performance by taking advantage of in-hardware programmable accelerators.
- Isolating provides a trusted environment for running infrastructure applications on a dedicated hardware resource, one that runs separately from the host system’s CPU and operating system or hypervisor.
One of the difficulties in adopting this approach has always been enabling organizations to leverage dedicated hardware without employing a team of experts in system-level engineering and hardware programming. The appliance models that solved this problem in the past are not suited to a cloud- and edge-dominated future.
The modern answer is to provide a platform and framework that enables infrastructure providers and enterprises alike to rapidly develop services and systems capable of tapping into the benefits of a dedicated hardware resource. Such an answer is seen in the NVIDIA BlueField-3 DPU and the NVIDIA Morpheus AI framework for cybersecurity.
NVIDIA BlueField-3 DPU is a 3rd-generation data center infrastructure-on-a-chip that enables organizations to build software-defined, hardware-accelerated IT infrastructure from cloud to core data center to edge. The new DPU platform offloads, accelerates, and isolates software-defined networking, storage, security, and management functions from the application workloads in ways that profoundly improve data center performance, efficiency, scalability, and security.
The NVIDIA Morpheus AI framework offers pre-trained AI models that provide developers powerful tools to simplify their workflows and help detect and mitigate security threats. When coupled with NVIDIA BlueField-2 DPUs, security, virtualization, monitoring, and policy enforcement can be available on every server. We see the potential in both to accelerate access to embedded analytics and security across the cloud and the edge.
F5 believes that the advancement in processor, platform, and chipset technologies demonstrated by NVIDIA has made it feasible for specialized compute to significantly optimize resource use for specific workload types. We see intriguing benefits for the creation of special-purpose application-facing intelligent "industrial systems" and thus boundless possibilities for establishing attractive IoT solutions where real-time processing is needed locally. We are especially excited about the real-time pre-processing of telemetry data along with the many other capabilities we will be able to enable across our product portfolio and our edge-as-a-service platform, Volterra.
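To make the idea of real-time telemetry pre-processing at the edge concrete, here is a minimal sketch of reducing a raw metric stream to rolling-window summaries before forwarding it upstream. All names and thresholds are illustrative assumptions, not part of any F5 or NVIDIA product API.

```python
from collections import deque
from statistics import mean

def preprocess(samples, window=5, threshold=90.0):
    """Reduce a raw telemetry stream to rolling averages, flagging any
    window whose average exceeds a threshold for immediate attention.
    Only the compact summaries (not every raw sample) would be sent
    upstream, cutting bandwidth while preserving the signal."""
    summaries = []
    buf = deque(maxlen=window)  # sliding window over the stream
    for sample in samples:
        buf.append(sample)
        if len(buf) == window:
            avg = mean(buf)
            summaries.append({"avg": round(avg, 2), "alert": avg > threshold})
    return summaries

# Hypothetical CPU-utilization readings arriving at an edge node.
readings = [70, 72, 71, 95, 97, 98, 99, 96]
print(preprocess(readings))
```

In a real deployment, this kind of filtering and aggregation is the sort of work a DPU could perform inline, before the data ever reaches the host CPU.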
As with all hardware platforms, business will benefit from DPUs when a complementary software ecosystem evolves. Enabling that ecosystem requires providing developers a uniform and transparent way to take advantage of the acceleration and offload functions provided by the DPU. One of our many areas of innovation is focused on how to enable that ecosystem and provide business the ability to optimize the security and delivery of their applications using the hardware provided by their vendors. We are finding that technologies like containers and Web Assembly (WASM) provide an opportunity to deliver this capability for DPUs and enable the necessary ecosystem without introducing additional complexity or requiring hardware domain expertise from the end user.
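The "uniform and transparent" abstraction described above can be sketched as a simple dispatch layer: the application asks for a service through one interface, and the platform routes the call to a DPU when one is present or falls back to host software when it is not. Every class and function name here is a hypothetical illustration, not drawn from any real DPU SDK.

```python
import zlib

class SoftwareBackend:
    """Fallback path that runs the service on the host CPU."""
    name = "host-software"

    def compress(self, data: bytes) -> bytes:
        return zlib.compress(data)

class DPUBackend:
    """Placeholder for a hardware-accelerated path; a real implementation
    would hand the buffer to the DPU instead of the host CPU."""
    name = "dpu-offload"

    def compress(self, data: bytes) -> bytes:
        return zlib.compress(data)  # stand-in for the offloaded operation

def select_backend(dpu_available: bool):
    """The application code never changes; the platform picks the backend."""
    return DPUBackend() if dpu_available else SoftwareBackend()

backend = select_backend(dpu_available=False)
payload = b"telemetry " * 100
print(backend.name, len(backend.compress(payload)))
```

The point of the pattern is that the caller needs no hardware domain expertise: the same `compress` call works whether the work lands on a DPU or on the host, which is exactly the property containers and WASM-style portability aim to deliver.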
Together with NVIDIA, we have a long history of enabling offload and acceleration of particular functions using NVIDIA ConnectX SmartNICs. F5 and NVIDIA plan to work together with customers on BlueField-2 and BlueField-3 capabilities to deliver solutions that enable businesses to benefit from the offload and acceleration of application security and delivery services at scale. With our unique history and expertise in building software capable of harnessing the benefits of hardware, F5 is excited to be working with NVIDIA to unlock the potential of the emerging edge for our customers.