What Belongs in AI COGS? The Financial Framework SaaS Companies Are Scrambling to Build
Are AI inference costs already eating into your gross margin – and you can't even see them on your P&L?
In episode #370, Ben Murray breaks down exactly what belongs in AI COGS for SaaS companies offering an AI-first or AI-infused product line. Inference bills are stacking up fast, infrastructure-layer spend is the surprise line item nobody priced in, and most finance teams haven't built the GL account structure to capture any of it cleanly. If you don't get the framework in place now, you'll be reporting AI gross margin you can't actually defend by next quarter – and your board will notice.
- The 5 cost categories every AI COGS framework needs – inference, model hosting/GPU infrastructure, the AI infrastructure layer, monitoring and observability, and AI-specific support
- Why AI inference costs deserve their own GL account – and shouldn't be buried inside your cloud hosting bill where they disappear
- The surprise cost line one industry report flagged as the #1 unexpected AI expense β hiding in data platform usage, networking, and egress
- How to structure your COGS cost centers so you can deliver clean margins by AI product line, not just lumped together at the company level
- Why token tracking by customer cohort (heavy / medium / light users) is now table stakes for any AI product sold as a subscription
- The deployed-engineer question: should AI support tickets sit with tech support or a specialized team – and how that decision rewires your margin model
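To make the cohort-tracking idea above concrete, here is a minimal sketch of bucketing customers into heavy / medium / light token cohorts and rolling up gross margin per cohort. The thresholds, the blended cost per 1K tokens, and the function names are all illustrative assumptions, not figures from the episode:

```python
# Hypothetical sketch: token tracking by customer cohort.
# Assumes you can export per-customer monthly token counts and apply a
# blended inference cost per 1K tokens. All numbers are illustrative.

INFERENCE_COST_PER_1K_TOKENS = 0.002  # assumed blended $/1K tokens


def cohort_for(tokens: int) -> str:
    """Bucket a customer into heavy / medium / light by monthly token usage."""
    if tokens >= 5_000_000:
        return "heavy"
    if tokens >= 500_000:
        return "medium"
    return "light"


def cohort_margins(customers):
    """Aggregate revenue, inference cost, and gross margin % per cohort.

    customers: iterable of dicts with 'mrr' (monthly revenue) and 'tokens'.
    """
    buckets = {}
    for c in customers:
        cohort = cohort_for(c["tokens"])
        b = buckets.setdefault(cohort, {"revenue": 0.0, "inference_cost": 0.0})
        b["revenue"] += c["mrr"]
        b["inference_cost"] += c["tokens"] / 1000 * INFERENCE_COST_PER_1K_TOKENS
    for b in buckets.values():
        b["gross_margin_pct"] = round(
            100 * (b["revenue"] - b["inference_cost"]) / b["revenue"], 1
        )
    return buckets
```

Even a toy roll-up like this makes the pricing risk visible: a heavy cohort paying flat subscription rates can sit many margin points below the light cohort, which is exactly the signal the episode argues should reach the board slide.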
Tune in to get the AI COGS framework in place before your gross margin lands on a board slide you can't defend.
Resources Mentioned
- Ben's new AI course: https://www.thesaasacademy.com/ai-finance-metrics-saas
- Ben's blog post: What Should Be Included in AI COGS: https://www.thesaascfo.com/what-should-be-included-in-ai-cogs/
- SaaS Metrics Foundation course: https://www.thesaasacademy.com/the-saas-metrics-foundation