Thursday, 30 October 2025

Show HN: LangSpend – Track LLM costs by feature and customer (OpenAI/Anthropic) https://bit.ly/3X2MMFw

We're two developers who got hit twice by LLM cost problems and built LangSpend to fix them.

First: we couldn't figure out which features in our SaaS were expensive to run or which customers were costing us the most, which made it impossible to price properly or to spot runaway costs.

Second: we burned 80% of our $1,000 AWS credits on Claude 4 (AWS Bedrock) in just two months while prototyping our idea, with zero visibility into which experiments were eating the budget.

So we built LangSpend, a simple SDK that wraps your LLM calls and tracks costs per customer and per feature.

How it works:
- Wrap your LLM calls and tag them with customer/feature metadata (a rough sketch of the pattern is at the end of this post).
- The dashboard shows you who's costing what in real time.
- Node.js and Python SDKs are currently supported.

Still early days, but it's solving our problem. Try it out and let us know if it helps you too.

- https://bit.ly/47jJNid
- Docs: https://bit.ly/3LcnulX
- Discord: https://bit.ly/3X15QnI

https://bit.ly/3X2cCtl

October 30, 2025 at 11:40PM
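For the curious, here is a minimal Python sketch of the tagging pattern described above: wrap an LLM call, tag it with customer/feature metadata, and attribute the call's token cost to those tags. This is not the LangSpend SDK itself; the helper name, the in-memory accumulator, and the per-token prices are assumptions for illustration only (see the docs link above for the real API).

```python
# Illustrative sketch only -- not the LangSpend SDK. All names and prices are assumptions.
from collections import defaultdict
from openai import OpenAI

client = OpenAI()
spend = defaultdict(float)  # (customer_id, feature) -> accumulated cost in USD

# Assumed example prices per 1M tokens; real prices depend on the model and provider.
PRICE_IN_PER_M, PRICE_OUT_PER_M = 0.15, 0.60

def tracked_chat(messages, *, customer_id: str, feature: str, model: str = "gpt-4o-mini"):
    """Call the model, then attribute this call's cost to (customer, feature)."""
    response = client.chat.completions.create(model=model, messages=messages)
    usage = response.usage
    cost = (usage.prompt_tokens * PRICE_IN_PER_M
            + usage.completion_tokens * PRICE_OUT_PER_M) / 1_000_000
    spend[(customer_id, feature)] += cost
    return response.choices[0].message.content

# Example: attribute one call to customer "acme" and the "ticket-summary" feature.
reply = tracked_chat(
    [{"role": "user", "content": "Summarize this support ticket: the printer is on fire."}],
    customer_id="acme",
    feature="ticket-summary",
)
print(dict(spend))
```

A hosted SDK would presumably replace the in-memory dict with a call to a dashboard backend, but the per-call tagging shape stays the same.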
