On my shelf I have many books spanning everything from networking to AI to security to the basics. While most of my college textbooks are long gone, two remain: the dragon book and Introduction to Algorithms.
The former I keep for sentimentality. I’ve written exactly two compilers in my life, and I hope never to write another one. The latter I keep because it is timeless. Algorithms, you see, aren’t tied to any operating system or language. They’re logical rules—patterns—that are followed to solve common problems.
This is why I sometimes say I “Dijkstra” my errands when out driving. Dijkstra’s algorithm is a set of rules for finding the shortest path, and it’s as applicable to running multiple errands as it is to routing packets through a network.
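The errand analogy maps directly onto the classic algorithm: stops are graph nodes, driving times are edge weights. Here is a minimal sketch in Python (the errand map and times are made up for illustration):

```python
import heapq

def dijkstra(graph, start):
    """Shortest driving time from `start` to every other stop.

    graph: dict mapping stop -> list of (neighbor, minutes) edges.
    Returns dict mapping stop -> (total_minutes, previous_stop).
    """
    dist = {start: (0, None)}
    queue = [(0, start)]
    while queue:
        minutes, stop = heapq.heappop(queue)
        if minutes > dist[stop][0]:
            continue  # stale entry; a shorter route was already found
        for neighbor, cost in graph[stop]:
            candidate = minutes + cost
            if neighbor not in dist or candidate < dist[neighbor][0]:
                dist[neighbor] = (candidate, stop)
                heapq.heappush(queue, (candidate, neighbor))
    return dist

# Hypothetical errand map: driving minutes between stops.
errands = {
    "home":     [("bank", 7), ("grocery", 12)],
    "bank":     [("home", 7), ("pharmacy", 4)],
    "grocery":  [("home", 12), ("pharmacy", 6)],
    "pharmacy": [("bank", 4), ("grocery", 6)],
}

routes = dijkstra(errands, "home")
# routes["pharmacy"] -> (11, "bank"): home -> bank -> pharmacy beats
# the 18-minute detour through the grocery store.
```

The same rules work whether the weights are minutes behind the wheel or milliseconds of network latency; that independence from any particular domain is exactly what makes an algorithm timeless.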
With that in mind, let’s consider the evolving space of prompt engineering. A simple definition is “the practice of designing inputs for generative AI tools that will produce optimal outputs.” (McKinsey)
Over the past few months we’ve seen numerous prompt engineering “techniques” surface, each of which was devised to solve a specific type of problem: how to produce optimal outputs from generative AI.
Forbes has been doing an excellent job bringing these techniques to the fore.
There are many more out there, but they all share the same characteristics. Each describes a set of rules or patterns for interacting with generative AI to produce desired results. From an engineering perspective, this is not all that different from algorithms describing how to search a binary tree, reverse a linked list, or find the shortest path through a graph to a destination.
They are, in design and purpose, natural language algorithms.
Now, I’m not going to encourage engineers to become prompt engineers. But as many engineers today are finding out, using natural language algorithms to design more effective generative AI solutions works. If you read through this blog on mitigating AI hallucinations, you’ll see that, within the context of the solution, multiple natural language algorithms, including chain of thought and reflective AI, are used to guide the responses of GPT such that an optimal answer is generated.
The reason this is important to recognize is that as prompt engineering techniques emerge and, ultimately, receive recognizable names, they become the building blocks for solutions that leverage generative AI. Today’s prompt engineering techniques are tomorrow’s natural language algorithms.
And we would do well not to dismiss them as less valuable than traditional algorithms, nor to treat them as applicable only to the chat interfaces used by family and friends.
We may rely on an API to integrate generative AI into solutions, but the data we’re exchanging is natural language, and that means we can leverage those prompt engineering techniques—those natural language algorithms—within those solutions we’re building to produce better, clearer, and more correct answers from generative AI.
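Because the payload is natural language, a technique like reflective AI drops straight into solution code as an ordinary function. A sketch of the two-pass pattern, where `ask_model` stands in for whatever API client your solution actually uses (the function and its wording are hypothetical):

```python
def reflect(ask_model, question):
    """Two-pass 'reflection' pattern: get a draft answer, then ask the
    model to critique and revise its own draft before we accept it.

    ask_model: any callable that sends a natural-language prompt to a
    generative AI API and returns the text of its response.
    """
    draft = ask_model(question)
    critique_prompt = (
        f"Question: {question}\n"
        f"Draft answer: {draft}\n"
        "Review the draft for factual errors or unsupported claims, "
        "then provide a corrected final answer."
    )
    return ask_model(critique_prompt)

# Usage with any client that exposes a text-in, text-out call:
#   final = reflect(lambda p: client.generate(p), "Why is the sky blue?")
```

Nothing here depends on a particular vendor or SDK; the technique lives entirely in the natural language exchanged over the wire, which is the point.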
This also means that technology leaders should not just allow but encourage engineers to spend time engaging with generative AI to uncover those patterns and algorithms that will lead to more optimal solutions.
You never know, one of your engineers might just wind up having an algorithm named after them in the future.