As the initial point of connectivity for many AI applications and devices, service providers are ideally positioned to offer edge AI services and monetize their investment, provided they start building, optimizing, and securing their edge AI infrastructure today. AI applications are everywhere and growing at an exponential rate, and edge AI technologies are critical for applications that need the responsiveness and accuracy businesses require to stay competitive and relevant.
The opportunity for service providers
According to a recent report from STL Partners, businesses are expected to increase their edge AI spending by 285% by 2030, reaching over $157 billion. All of these AI applications and projects will need to access edge AI infrastructure.
This surge in edge AI application development and expansion will generate over $2.5 billion in revenue for service providers that offer edge AI services to their customers. Service providers that build their edge AI infrastructure early will gain the most value from this opportunity.
The need for edge AI environments
The connected car is an ideal use case for edge AI, where localized, responsive information is necessary for cars to operate safely and follow optimized driving patterns. The number of AI-connected cars is increasing rapidly, and auto manufacturers are just starting to take advantage of edge AI technologies. According to the STL Partners report, connected cars will run 50% of their AI workloads through edge AI.
The evolution of the modern smart city is another example where edge AI will play an important role. Smart cities utilize real-time traffic monitoring, video surveillance, and resource management to optimize their infrastructure and security. Reduced latency and localized information will enhance their ability to operate efficiently and securely. According to the STL Partners report, 54% of the typical smart city’s AI workload will be processed at the edge by 2030, as workloads move away from on-device and on-premises processing.
Developing an edge AI infrastructure
Today, many AI applications rely on the cloud, but the cloud is impractical for a growing number of them. The cloud cannot support real-time, mission-critical AI applications and carries significant backhaul costs. Many AI applications, such as self-driving cars and video surveillance, require near-real-time data to work effectively. Exposing proprietary data to the cloud is also a concern. Leaking personal information or proprietary intellectual property is not an option, and security must be top of mind when designing an edge AI architecture.
Instead of the cloud, AI application owners are looking for edge AI resources that improve the reliability, latency, and security of their AI communications. Service providers will be one of the first places they look to access edge AI capabilities.
How F5 can help
To gain the confidence and trust of customers using their edge AI resources, service providers need a robust, secure, and optimized edge AI architecture. With the F5 Application Delivery and Security Platform (ADSP), which secures and optimizes AI resources, F5 is ready to be your AI partner as you design and build out your edge AI infrastructure.
To learn more, read about F5’s AI solutions.