There’s a lot of hype around the use of AI in everything from marketing to recruiting to operations to security. Our annual research indicates that most organizations are eagerly eyeing AI across business, operations, and security.
None of that is a surprise. There is survey upon survey indicating a healthy adoption of AI to support a wide variety of business and IT functions.
What’s not always talked about is how AI is being incorporated into development.
The core premise of AI’s value to the business and security is its ability to recognize patterns and relationships that produce actionable insights. Many don’t consider the “engine” behind that capability, and so never really dig into the details of which AI technologies are being used to magically uncover those valuable insights.
Machine learning is a specific branch of AI that focuses on data analysis and modeling. It’s applicable to security because, given enough time and data, it can identify patterns that indicate anomalous behavior in real time. Similarly, it can find obscure relationships in business data that represent opportunities to market products and services.
But machine learning also excels at modeling; that is, executing hundreds of “what if” scenarios to produce a better understanding of the complex relationships between multiple variables. In development (engineering), those variables could be the size of data, memory allocated, speed of I/O, network bandwidth, and virtual machine parameters. Machine learning is quite flexible: once you identify the variables, you can model various combinations of them to discover an “optimal” set.
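To make the “what if” idea concrete, here is a minimal sketch of exhaustively evaluating combinations of such variables against a cost model. Everything here is illustrative: the parameter names, ranges, and the `estimated_cost` formula are assumptions, not a real benchmark or the authors’ actual model.

```python
from itertools import product

# Hypothetical cost model: the parameters and the formula are
# illustrative assumptions only, standing in for a real measurement.
def estimated_cost(batch_size, memory_mb, worker_count):
    # Toy trade-off: larger batches and more workers reduce latency,
    # while memory and workers add infrastructure cost.
    latency = 1000.0 / (batch_size * worker_count)
    infra = 0.002 * memory_mb + 0.05 * worker_count
    return latency + infra

# Enumerate the "what if" scenarios across all variable combinations.
search_space = {
    "batch_size": [64, 128, 256, 512],
    "memory_mb": [256, 512, 1024],
    "worker_count": [1, 2, 4, 8],
}

best = min(
    (dict(zip(search_space, combo)) for combo in product(*search_space.values())),
    key=lambda params: estimated_cost(**params),
)
print(best)  # the combination with the lowest modeled cost
```

Exhaustive enumeration like this only works for tiny search spaces; the appeal of machine-learning-driven optimizers is that they find good combinations without trying every one.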
For example, F5 Distinguished Engineer Laurent Querel and F5 Sr. Architect Sebastien Soudan recently teamed up to publish a piece describing how they designed a model to “build an efficient way to get data from PubSub to BigQuery.”
They also explain why machine learning is a better choice for software optimization today, and did it so well I’m just going to quote them:
“Today, software optimization is an iterative and mostly manual process where profilers are used to identify the performance bottlenecks in software code. Profilers measure the software performance and generate reports that developers can review and further optimize the code. The drawback of this manual approach is that the optimization depends on [a] developer's experience and hence is very subjective. It is slow, non-exhaustive, error prone and susceptible to human bias. The distributed nature of cloud native applications further complicates the manual optimization process.
An under-utilized and more global approach is another type of performance engineering that relies on performance experiments and black-box optimization algorithms. More specifically, we aim to optimize the operational cost of a complex system with many parameters.”
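Black-box optimization of the kind the quote describes typically runs as a suggest-and-observe loop: propose a configuration, run a performance experiment to measure its cost, and use the result to pick the next candidate. A minimal sketch follows, using plain random search rather than the actual Vertex AI Vizier API; `run_experiment` and its cost surface are hypothetical stand-ins for deploying a configuration and measuring real operational cost.

```python
import random

# Stand-in for a real performance experiment: deploy a configuration,
# measure its operational cost. The cost surface here is assumed.
def run_experiment(config):
    return (config["cpu"] - 2.5) ** 2 + (config["parallelism"] - 6) ** 2

def random_config(rng):
    # Hypothetical tunables and ranges, for illustration only.
    return {"cpu": rng.uniform(1, 8), "parallelism": rng.randint(1, 16)}

def optimize(trials=50, seed=42):
    rng = random.Random(seed)
    best_config, best_cost = None, float("inf")
    for _ in range(trials):
        config = random_config(rng)    # "suggest" a candidate configuration
        cost = run_experiment(config)  # "observe" its measured cost
        if cost < best_cost:
            best_config, best_cost = config, cost
    return best_config, best_cost

config, cost = optimize()
```

A real optimizer such as Vizier replaces the random suggestions with a model that learns from past trials, so far fewer experiments are needed to converge on a good configuration.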
The driving factors behind the use of AI—specifically machine learning—in development are much the same as those driving its adoption across IT operations: manual processes are slow, error prone, and susceptible to human bias.
When we talk about the modernization of IT and the steady march toward a fully digital business, that includes development/engineering.
I encourage a read of “Optimize your applications using Google Vertex AI Vizier” even if it’s just to get a feel for the process of designing an appropriate model and what they learned from their experience.