Text generation
Making summaries locally - Part #2A Texts: Introduction, Oobabooga, Ollama, OpenWebUI
Making Text Summaries Offline Using a Local LLM
While you likely know that AI models can summarize text, you might not be aware that you can run them entirely on your local computer. The easiest way is to choose an LLM that fits within your system's memory.
Note that standard LLMs are limited to text-based summaries, whereas multimodal models like MiniCPM-V can also interpret video content.
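As a minimal sketch of what this looks like in practice, assuming Ollama is installed, running on its default local port (11434), and that a model small enough to fit in your system's memory has already been pulled (the model name "llama3" below is just an assumption), a summary can be requested through Ollama's local HTTP API:

```python
# Minimal sketch: summarize a local text file through Ollama's HTTP API.
# Assumes Ollama is running on localhost:11434 and that the "llama3" model
# has been pulled beforehand (swap in any model that fits in your memory).
import requests

with open("article.txt", encoding="utf-8") as f:
    text = f.read()

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": f"Summarize the following text in a few sentences:\n\n{text}",
        "stream": False,  # return the complete answer as a single JSON object
    },
    timeout=600,
)
response.raise_for_status()
print(response.json()["response"])
```

The same request can be sent from any language or tool that can make an HTTP POST; the web UIs mentioned in the title (Oobabooga, OpenWebUI) simply wrap this kind of call behind a browser interface.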
Read more: Making text summaries offline using a local LLM part A
Making summaries locally - Part #1 Videos

Following my tests of different online chatbots for content summarization, I thought it would be interesting to explain what is possible using only local resources (a local LLM or multimodal model).
You may be uncomfortable submitting personal data or confidential files to third-party servers, prefer not to upload content to a server under a different jurisdiction, or simply find it more convenient because summarization is an integral part of a larger, locally managed workflow.
H2OGPT: Another tool to ask questions about your own documents
H2OGPT, a project related to H2O.ai, is a web UI (web user interface) somewhat like the oobabooga web UI.
Over the course of a single Saturday, this project moved in my personal ranking from "barely usable with poor results" to "pretty decent".
Read more: H2OGPT: Another tool to ask questions about your own documents but on GPU!
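H2OGPT is normally driven from its browser UI, but like several of these web UIs it can also sit behind an API. Purely as an illustrative sketch, assuming a local server that exposes an OpenAI-compatible chat endpoint (the URL, port, and model name below are assumptions to be replaced with your own setup), asking a question about one of your own documents could look like this:

```python
# Illustrative sketch only: question answering over a local document via a
# local OpenAI-compatible endpoint. The base_url and model name are
# placeholders; check the API settings of the web UI you are running.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:5000/v1", api_key="not-needed")

with open("my_document.txt", encoding="utf-8") as f:
    document = f.read()

reply = client.chat.completions.create(
    model="local-model",  # placeholder; the local server chooses the actual model
    messages=[
        {"role": "system", "content": "Answer questions using only the provided document."},
        {"role": "user", "content": f"Document:\n{document}\n\nQuestion: What is the main conclusion?"},
    ],
)
print(reply.choices[0].message.content)
```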
