Knowledge, however, is only valuable if it is available at the decisive moment. Given the shortage of skilled workers, the growing complexity of the energy transition and the demand for data sovereignty, finding information efficiently is becoming a critical success factor. This is where intelligent, AI-based knowledge management systems come into their own.
The challenge: a needle in a haystack, and data sovereignty
Engineers and technicians often spend a significant share of their working time searching through PDF documents, network histories or manuals. Conventional search functions frequently fail because of sheer volume or imprecise keywording. At the same time, the goal is to keep knowledge in-house: trust is essential for sensitive information such as network plans or confidential maintenance strategies, and many companies are understandably reluctant to upload their core knowledge to public cloud services.

The good news: sovereign knowledge management is possible in-house. If the entire stack (Ollama, database and UI) is installed locally, the data never leaves the internal network, and no third-party provider trains its models on it. The system can even be operated completely "air-gapped" - i.e. without any internet connection - which represents the highest security standard for KRITIS (critical infrastructure) companies.
The solution: Retrieval Augmented Generation (RAG)
A modern AI bot in knowledge management does not work like a simple search engine, but like a highly qualified assistant. Using Retrieval Augmented Generation (RAG), the AI accesses your internal documents, understands the context and formulates a precise answer - including a reference to the source.
- Precision: The AI only answers based on the data you provide ("grounding").
- Source fidelity: Each answer refers directly to the relevant chapter in the manual.
- Language handling: Technical terms are interpreted correctly in context and translated where necessary.
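The "grounding" described above essentially comes down to prompt assembly: retrieved passages are placed in front of the user's question, each tagged with its source, and the model is instructed to answer only from that context. A minimal sketch (function and field names such as `build_grounded_prompt` are our own illustration, not the API of any specific library):

```python
def build_grounded_prompt(question: str, passages: list[dict]) -> str:
    """Assemble a RAG prompt: retrieved passages with source labels,
    followed by a grounding instruction and the user's question."""
    context = "\n\n".join(f"[{p['source']}]\n{p['text']}" for p in passages)
    return (
        "Answer ONLY from the context below and cite the source in brackets.\n"
        "If the context does not contain the answer, say so.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

# Toy example with a single retrieved passage:
passages = [
    {"source": "Manual A, ch. 4", "text": "Transformer T1 is serviced annually."},
]
prompt = build_grounded_prompt("How often is T1 serviced?", passages)
```

Because the passages carry explicit source labels, the model can quote them back in its answer, which is what enables the source fidelity mentioned above.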
The technological foundation: a comparison of open source models
Specialized large language models (LLMs) are used to run such a system reliably. In practice, the following open-source models have proven particularly efficient:
- Llama 3.3 / Llama 4 (Meta): The current gold standard among the open models. It is extremely eloquent and is ideally suited as a general assistant for complex logic tasks.
- Mistral / Mixtral (Mistral AI): These models from Europe are characterized by high efficiency and excellent support for the German language - ideal for regulatory texts.
- Command R (Cohere): This model has been explicitly optimized for RAG scenarios. It specializes in processing huge volumes of external documents and generating fact-based answers without overlooking important details.
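All of these models can be served locally through Ollama's REST API, so switching between them is a matter of changing one model tag. A minimal sketch using only the standard library, assuming an Ollama server on its default port and that a model such as `llama3.3` has already been pulled (the model tag here is an example):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint

def make_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_model(model: str, prompt: str) -> str:
    """Send a prompt to the locally running model; nothing leaves the host."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(make_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example call (requires a running Ollama server with the model pulled):
# print(ask_local_model("llama3.3", "Summarize the maintenance rules for T1."))
```

Since the endpoint is on localhost, this request pattern works unchanged in an air-gapped deployment.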
The software infrastructure: the "stack" for in-house operation
To ensure that these models run securely and are easy to use in your company, a modular software stack is set up. The advantage: all components are open source and can be operated on your own servers.
- Ollama (the engine): This software acts as the runtime environment. It loads the language models and ensures that they are executed efficiently on the available hardware (GPUs).
- ChromaDB or Qdrant (the memory): These are so-called vector databases. Your documents (PDF, Word, Excel) are not simply stored here; they are converted into mathematical vectors. This way, the AI "understands" the meaning of your texts instead of merely matching keywords.
- Open WebUI (the interface): This is the face of the bot. It provides a user interface that is as intuitive as ChatGPT, but offers full control over user permissions, chat history and document access.
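The vector search at the heart of this stack can be illustrated without any database at all: documents and queries are compared as vectors, and the nearest neighbours are retrieved. The three-dimensional toy vectors below stand in for real embeddings, which an embedding model would produce with hundreds of dimensions:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_k(query_vec: list[float], docs: list[dict], k: int = 2) -> list[dict]:
    """Rank stored document vectors by similarity to the query vector."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return ranked[:k]

# Toy corpus: each document is reduced to a tiny illustrative vector.
docs = [
    {"id": "maintenance.pdf", "vec": [0.9, 0.1, 0.0]},
    {"id": "grid_plan.docx",  "vec": [0.1, 0.8, 0.3]},
    {"id": "hr_policy.pdf",   "vec": [0.0, 0.1, 0.9]},
]
hits = top_k([1.0, 0.2, 0.0], docs, k=1)
# a query vector close to the maintenance document retrieves it first
```

ChromaDB and Qdrant perform exactly this kind of nearest-neighbour ranking, just with approximate indexes that stay fast at millions of vectors.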
Three steps to the AI-supported knowledge archive
- Piloting: Select a specific area, e.g. the documentation of a substation.
- Infrastructure check: Provide dedicated AI servers to guarantee computing power and data security.
- Human-in-the-loop: Subject matter experts validate the bot's answers to continuously increase quality.
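The human-in-the-loop step only pays off if expert verdicts are recorded systematically, so quality can actually be measured over time. A minimal sketch of such a review log (the class and field names are our own illustration, not part of any of the tools above):

```python
from dataclasses import dataclass, field

@dataclass
class Review:
    """One expert verdict on a bot answer."""
    question: str
    answer: str
    approved: bool
    correction: str = ""  # filled in when the expert rejects the answer

@dataclass
class ReviewLog:
    """Collects expert verdicts to track answer quality over time."""
    reviews: list = field(default_factory=list)

    def add(self, review: Review) -> None:
        self.reviews.append(review)

    def approval_rate(self) -> float:
        """Share of answers the experts approved (0.0 if nothing reviewed yet)."""
        if not self.reviews:
            return 0.0
        return sum(r.approved for r in self.reviews) / len(self.reviews)

log = ReviewLog()
log.add(Review("Service interval of T1?", "Annually.", approved=True))
log.add(Review("Breaker type in field 3?", "Type X.", approved=False,
               correction="Type Y"))
```

The recorded corrections double as new, validated training material for the document base, closing the loop between experts and the bot.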
Conclusion: secure knowledge, increase efficiency
AI bots in knowledge management can be an answer to the shortage of skilled workers and the increasing complexity of the energy transition. They preserve experiential knowledge and make it accessible at the touch of a button - securely, precisely and entirely in-house.