Show HN: I made an app to use local AI as daily driver
Hi Hackers,

Excited to share a macOS app I've been working on: https://recurse.chat/ for chatting with local AI.

While it's amazing that you can run AI models locally quite easily these days (through llama.cpp / llamafile / ollama / the llm CLI, etc.), I missed a feature-complete chat interface. Tools like LMStudio are super powerful, but there's a learning curve to them. I'd like to hit a middle ground of simplicity and customizability for advanced users.

Here's what sets RecurseChat apart from similar apps:

- UX designed so you can use local AI as a daily driver. Zero-config setup, multi-modal chat, chatting with multiple models in the same session, and linking your own gguf file (see the sketch at the end of this post).
- Import ChatGPT history. This is probably my favorite feature. Import your hundreds of messages, search them, and even continue previous chats using local AI, offline.
- Full text search. Search across hundreds of messages and see results instantly (see the search sketch at the end of this post).
- Private and capable of working completely offline.

Thanks to the amazing work of @ggerganov on llama.cpp, which made this possible.

If there's anything you wish existed in an ideal local AI app, I'd love to hear about it.

https://recurse.chat/

February 28, 2024 at 01:40AM
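For anyone curious what "link your own gguf file" involves under the hood, here is a minimal sketch of chatting with a local gguf model through the llama-cpp-python bindings for llama.cpp. The model path and prompt are placeholders, and this only illustrates the general idea rather than RecurseChat's actual implementation:

```python
# Minimal sketch: chat with a local GGUF model via llama-cpp-python.
# The model path below is a placeholder; any chat-tuned gguf works.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,  # context window size
)

# OpenAI-style chat completion, running entirely on-device.
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Why run an LLM locally?"}],
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
```

Once the gguf file is on disk, nothing here needs network access, which is what makes the fully offline workflow possible.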
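And here is a rough sketch of how instant full-text search over imported chat history can work, using SQLite's FTS5 extension through Python's built-in sqlite3 module. The conversations.json loader is hypothetical and only loosely approximates ChatGPT's export format; again, this is not RecurseChat's actual code:

```python
# Sketch: index imported chat messages and search them with SQLite FTS5.
import json
import sqlite3

conn = sqlite3.connect("chats.db")
conn.execute("CREATE VIRTUAL TABLE IF NOT EXISTS messages USING fts5(role, content)")

# Hypothetical loader: assumes a flat list of {"role": ..., "content": ...} dicts;
# ChatGPT's real export nests messages inside per-conversation "mapping" objects.
with open("conversations.json") as f:
    for msg in json.load(f):
        conn.execute(
            "INSERT INTO messages(role, content) VALUES (?, ?)",
            (msg["role"], msg["content"]),
        )
conn.commit()

# FTS5 MATCH queries stay fast well past a few hundred messages.
for role, content in conn.execute(
    "SELECT role, content FROM messages WHERE messages MATCH ? ORDER BY rank",
    ("local AI",),
):
    print(f"{role}: {content[:80]}")
```

The appeal of this approach is that the index lives in a single local database file, so search works offline and stays private.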