commit    8a05b4f8d4bec7d1ae32c836c3cda8265689270b (patch)
author    Malte Neuss <malteneuss@users.noreply.github.com>  2024-05-20 14:54:08 +0200
committer Malte Neuss <malteneuss@users.noreply.github.com>  2024-05-23 23:48:55 +0200
tree      8cc7fcd664dd8c0a6ba99e223dc40a70a2a4c483  /nixos/doc/manual/release-notes/rl-2405.section.md
parent    1df1f8d3be8c916aa50cc5dd0c2d828a3472a70b (diff)
nixos/nextjs-ollama-llm-ui: init module
NixOS already has good support for the Ollama backend service. Now we can benefit from a convenient web frontend for it as well.
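As a rough sketch of how the backend and the new frontend module fit together (option names below reflect the module as introduced by this commit; defaults and the exact option set may differ in your NixOS release, so treat this as an assumption to verify against the options search):

```nix
{ config, ... }:
{
  # Backend: the Ollama service NixOS already supports.
  services.ollama.enable = true;

  # Frontend: the web UI module added by this commit.
  services.nextjs-ollama-llm-ui = {
    enable = true;
    port = 3000;                          # assumed port for illustration
    ollamaUrl = "http://127.0.0.1:11434"; # Ollama's default listen address
  };
}
```

After a rebuild, the chat UI would be reachable on the configured port and talk to the local Ollama instance.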
Diffstat (limited to 'nixos/doc/manual/release-notes/rl-2405.section.md')
 nixos/doc/manual/release-notes/rl-2405.section.md | 2 ++
 1 file changed, 2 insertions(+), 0 deletions(-)
diff --git a/nixos/doc/manual/release-notes/rl-2405.section.md b/nixos/doc/manual/release-notes/rl-2405.section.md
index 484cc4a3b6725..a1447620e8bb8 100644
--- a/nixos/doc/manual/release-notes/rl-2405.section.md
+++ b/nixos/doc/manual/release-notes/rl-2405.section.md
@@ -137,6 +137,8 @@ The pre-existing [services.ankisyncd](#opt-services.ankisyncd.enable) has been m
 
 - [ollama](https://ollama.ai), server for running large language models locally.
 
+- [nextjs-ollama-llm-ui](https://github.com/jakobhoeg/nextjs-ollama-llm-ui), light-weight frontend server to chat with Ollama models through a web app.
+
 - [ownCloud Infinite Scale Stack](https://owncloud.com/infinite-scale-4-0/), a modern and scalable rewrite of ownCloud.
 
 - [PhotonVision](https://photonvision.org/), a free, fast, and easy-to-use computer vision solution for the FIRST® Robotics Competition.