How you can spend $108,000 instead of building an AI governance program (but not really)
Don't just jump to the simple (but expensive) choice when trying to manage AI risk.
ChatGPT Enterprise costs (at least) $108,000 annually.
And a company considering a StackAware AI risk assessment and governance program decided to just buy the software instead of working with us.
AI risk solved, right?
Unfortunately not.
AI is more than just ChatGPT
Don’t get me wrong, ChatGPT Enterprise seems pretty cool. It:
Prevents all organizational user inputs from being used for model training
Gives administrators granular control over data retention and deletion
Doesn’t make you choose between limited retention and custom GPTs
But there are a lot of things it doesn’t do for you, like:
Build a customized AI policy and procedure for your organization
Evaluate vendor AI processing, training, and security procedures
Help communicate to customers about how you use AI
Map AI-related controls to compliance frameworks
Negotiate with partners about IP ownership
Manage risk NOT related to OpenAI
Inventory your AI systems and tools
Penetration test AI applications
Give you vendor-neutral advice
And my point of contact at this particular company understood all of the above and made the case internally (you know who you are - thanks!).
But to no avail.
You can try to address AI risks piecemeal, but it will cost you
Some business and security teams seem to view AI as magic: something to both harness and fear.
In the end, though, it’s just a technology.
The same principles of risk management apply here, just as they do everywhere else in life. You only have four options:
Mitigate
Transfer
Avoid
Accept
In the race to deploy AI systems while not exposing sensitive data or suffering reputation damage, many companies seem to be blindly defaulting to option #1, regardless of cost.
Others are just going all in on #3, blocking every AI tool they learn about. They don’t seem to worry about:
Slamming the brakes on innovation
Losing competitive advantage
Scaring off AI talent
The thing is, there is no way to know which of these options (or which combination) will have the highest return on investment without doing a comprehensive assessment.
And if you don’t even know about a risk, you are defaulting to #4 by accident.
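The logic above can be sketched as a minimal risk register: each risk gets a treatment option, and return on investment is only comparable once risks are actually identified and estimated. All names, dollar figures, and the ROI formula below are illustrative assumptions for the sketch, not StackAware's methodology or real assessment data.

```python
from dataclasses import dataclass
from enum import Enum

class Treatment(Enum):
    MITIGATE = "mitigate"
    TRANSFER = "transfer"
    AVOID = "avoid"
    ACCEPT = "accept"

@dataclass
class Risk:
    name: str
    annual_loss_expectancy: float  # estimated yearly cost if untreated (assumed figure)
    treatment: Treatment
    treatment_cost: float = 0.0    # yearly cost of the chosen treatment

    def roi(self) -> float:
        """Loss avoided per dollar spent on treatment.
        Accepted risks spend nothing, so ROI is reported as 0 here."""
        if self.treatment is Treatment.ACCEPT or self.treatment_cost == 0:
            return 0.0
        return self.annual_loss_expectancy / self.treatment_cost

# Illustrative entries only -- real figures come from an assessment.
register = [
    Risk("Sensitive data in prompts", 250_000, Treatment.MITIGATE, 108_000),
    Risk("Vendor trains on your data", 80_000, Treatment.TRANSFER, 20_000),
    Risk("Unvetted AI browser plugin", 40_000, Treatment.AVOID, 5_000),
]

# A risk you never identified is accepted by default -- option #4 by accident.
unknown = Risk("Shadow AI tool in marketing", 60_000, Treatment.ACCEPT)
register.append(unknown)

best = max(register, key=Risk.roi)
print(f"Highest ROI treatment: {best.name} ({best.roi():.1f}x)")
```

The point the sketch makes: without the assessment that produces the loss and cost estimates, you cannot rank the options at all, and anything missing from the register is silently accepted.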
Tools are the final piece of the puzzle, not the first one
AI security tools certainly have their place. StackAware has plenty of partners who sell them.
But deploying one of these is the last step you should take in your AI governance, risk, and compliance journey.
Once you have:
established your business goals for AI
mapped the landscape of threats
built a governance framework
then it’s time to start considering what software you should buy (if any).
But don’t put the cart before the horse.
Ready to start managing AI risk holistically?
P.S. We cost less than $108,000 a year.
Related LinkedIn posts: