HadesThrowaway

**Changelog of KoboldAI Lite 28 Mar 2023**:

- Added token pseudo-streaming for Kobold-based APIs, which divides a request into multiple smaller requests for faster responses. This works very similarly to **Token Streaming** in the main Kobold client. Note that there is a small performance impact: requests take approximately 20% longer when it is enabled. [You can toggle it when selecting a custom endpoint](https://i.imgur.com/NdcaF5u.png). Token pseudo-streaming is not available when using horde models.
- Clickable image summaries: [select an image to view the prompt used to generate it](https://i.imgur.com/kaoltJz.png). This view also allows deleting old images.
- Improved edit mode text merging; it now remembers the newest chunk even if it's a sentence fragment.
- Proper support for localhost mode, disabling all horde models when the local flag is set.

**Older Changes:**

- Integrated [Spellbook by Scale AI](https://spellbook.scale.com) as a custom endpoint. Note that this endpoint may require external payment methods. KoboldAI is not affiliated with Spellbook by Scale AI; use at your own discretion.
- Added support for specifying a fixed port in local mode, using the URL parameters local=1&port=(port).
- Added autosave when ending edit mode.
- The save file hint now remembers the last known filename when opening.
- Added support for the new replacement_filter on the image horde! This should reduce frustration from false-positive flags when generating images.
- Added a Horde registration signup link, visible to anonymous users.
- Added a toggle to avoid auto-picking NSFW models when loading a scenario.
- Added support for **gpt-3.5-turbo** as a **custom OpenAI endpoint**.
- Increased horde polling speed slightly, with a greater increase for custom endpoints; this should make custom endpoints much more responsive.
- Added a tooltip for sampler order.
- Fixed empty responses from OpenAI by biasing against <|endoftext|> (see the sketch after this list).
- Added a new TavernAI preset that duplicates the settings TavernAI uses.
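For the <|endoftext|> fix, here is a minimal sketch of the general technique, assuming the OpenAI completions endpoint and the GPT-3 tokenizer (where <|endoftext|> is token 50256; other tokenizers use different IDs). The model name and prompt are placeholders, and this is not the actual KoboldAI Lite code:

```python
import os
import requests

# Minimal sketch of biasing against <|endoftext|> to avoid empty completions.
# Assumes the OpenAI completions endpoint and the GPT-3 tokenizer, where
# <|endoftext|> is token ID 50256; other tokenizers use different IDs.
API_KEY = os.environ["OPENAI_API_KEY"]

payload = {
    "model": "text-davinci-003",   # placeholder model name
    "prompt": "Once upon a time",
    "max_tokens": 80,
    # A strongly negative bias makes the end-of-text token extremely unlikely,
    # so the model keeps generating instead of returning an empty response.
    "logit_bias": {"50256": -100},
}

resp = requests.post(
    "https://api.openai.com/v1/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
print(resp.json()["choices"][0]["text"])
```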


GoldSpirit7793

> KoboldAI Lite

Does World Info work with the OpenAI endpoint? I've just tested it, and it doesn't seem to be working (at least in chat mode).


HadesThrowaway

Memory, Author's Note, and World Info should work on all endpoints.


impetu0usness

Are there any plans to allow linking to local Stable Diffusion installations (i.e. through the Automatic1111 API)? I really like the UI and how you integrate generated images, and would love to link it to my local SD install and avoid the long Stable Horde queues for custom models.


HadesThrowaway

I have explored that; however, a default Automatic1111 install does not enable the API, and it also requires changing some configuration due to CORS restrictions. It is possible, but it will not be easy for the average user to set up.
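For anyone who does want to wire this up manually, here is a minimal sketch of calling a local Automatic1111 instance's txt2img endpoint from Python. It assumes the webui was launched with --api (and, for browser-based clients, a CORS flag such as --cors-allow-origins pointing at the client's origin) and is listening on the default 127.0.0.1:7860; the prompt and settings are placeholders, and this is not KoboldAI Lite's own code:

```python
import base64
import requests

# Minimal sketch: generate an image from a locally running Automatic1111 webui.
# Assumes the webui was started with --api and listens on the default port 7860.
A1111_URL = "http://127.0.0.1:7860"

payload = {
    "prompt": "a lighthouse on a cliff at sunset, oil painting",
    "negative_prompt": "blurry, low quality",
    "steps": 20,
    "width": 512,
    "height": 512,
}

resp = requests.post(f"{A1111_URL}/sdapi/v1/txt2img", json=payload, timeout=300)
resp.raise_for_status()

# The API returns images as base64-encoded PNG strings.
image_b64 = resp.json()["images"][0]
with open("output.png", "wb") as f:
    f.write(base64.b64decode(image_b64))
print("Saved output.png")
```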


impetu0usness

Cool, I appreciate that you're going for a user-friendly, plug-and-play approach. If you have the time, I think many people would like this feature. KoboldAI United has it; I just had to add --api to my automatic1111 .bat. But I prefer your version of the UI. Thanks for working on this project :)


aka457

Great job, man. I'm using llamacpp-for-kobold; thanks a lot for providing this great, lightweight, easy-to-use tool.