Saturday, 8 March 2025

Show HN: Can I run this LLM? (locally) https://bit.ly/4bF1zNd

One of the most frequent questions when running LLMs locally is: I have xx RAM and yy GPU; can I run the zz model? I have vibe-coded a simple application to help with exactly that. https://bit.ly/3QRJkun March 9, 2025 at 12:08AM
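The basic check behind a question like this can be sketched with a common rule of thumb: memory needed is roughly parameter count times bytes per parameter (set by the quantization), plus overhead for the KV cache and activations. This is my own illustrative heuristic, not the app's actual logic; the `can_run` function, the quantization table, and the 20% overhead factor are all assumptions.

```python
# Rough heuristic (not the linked app's actual logic): estimate whether a
# model fits in the available RAM/VRAM from its parameter count and
# quantization level.

# Approximate bytes per parameter for common quantization formats.
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "q4": 0.5}

def can_run(params_billion: float, quant: str, mem_gb: float,
            overhead: float = 1.2) -> bool:
    """Return True if the model likely fits in mem_gb of memory.

    overhead=1.2 adds ~20% headroom for KV cache and activations
    (an assumed figure; real usage varies with context length).
    """
    needed_gb = params_billion * BYTES_PER_PARAM[quant] * overhead
    return needed_gb <= mem_gb

print(can_run(7, "q4", 8))      # 7B model, 4-bit, 8 GB -> True
print(can_run(70, "fp16", 24))  # 70B model, fp16, 24 GB -> False
```

By this estimate, a 4-bit 7B model needs about 4.2 GB and fits comfortably in 8 GB, while a 70B model at fp16 needs well over 100 GB.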
