An offensive security testing and assessment process in which a group of people (the "red team") simulates an adversarial attack to penetrate or corrupt a target, which may be an artificial intelligence system or model, a policy, or a set of assumptions. Red teaming is used to identify vulnerabilities, demonstrate the effect of a potential attack, and test the strength of defenses. See Blue Teaming and Purple Teaming.
