Hii Hiiiii! 👋
Are you stuck between AI and AI?? Me too! But we have to go with the flow, or we won't be able to make a lasting impact!
This blog is a...
Share your thoughts and doubts here.
Also, don't forget to Star the awesome LLMWare repo
Nice article! I've written my own code for this, running models locally. I used the nomic-embed-text-v1.5 model, found here: huggingface.co/nomic-ai/nomic-embe...
I wrote a Python script where my folder was indexed (converted to a text embedding vector database by the model), then GPT4o (or a locally running model) could use tool calling to pass in a specific query and get back the relevant parts. For large folders it was a bit slow sometimes, but it worked great!
Basically, the point was to let an AI chat model be able to summarize gigantic files or entire folders on my computer for me.
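The workflow described above can be sketched roughly like this. Note this is a minimal toy illustration, not the commenter's actual script: the `embed` function here is a placeholder bag-of-words vector standing in for the real nomic-embed-text-v1.5 model (which you'd load via e.g. sentence-transformers), and all function names are hypothetical.

```python
import os
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Placeholder embedding: a word-count vector. In the real setup this
    # would call the nomic-embed-text-v1.5 model instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def index_folder(folder: str) -> list[tuple[str, Counter]]:
    # Walk the folder and embed every .txt file -- this list acts as the
    # "vector database" the chat model searches against.
    db = []
    for root, _, files in os.walk(folder):
        for name in files:
            if name.endswith(".txt"):
                path = os.path.join(root, name)
                with open(path, encoding="utf-8", errors="ignore") as f:
                    db.append((path, embed(f.read())))
    return db

def search(db: list[tuple[str, Counter]], query: str, k: int = 3) -> list[str]:
    # The function a chat model would invoke via tool calling: rank the
    # indexed files by similarity to the query and return the top k paths.
    q = embed(query)
    ranked = sorted(db, key=lambda item: cosine(q, item[1]), reverse=True)
    return [path for path, _ in ranked[:k]]
```

With a real embedding model swapped in, the same shape scales to summarizing large files or whole folders: the model calls `search`, reads the top matches, and summarizes them.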
I think I'm going to open source my project soon, since I used all open-source models (GPT4o is optional) to create it.
That's so great... the model has 553,239 downloads. You're really amazing! I suggest you join the llmware Discord; you'll get a lot of great stuff there. The power of llmware comes mainly from its SSMs (small specialized models); you can read the documentation or the Intro to llmware for more details! Also, making your project open source is great thinking if you want to maximize its reach!
Oh, just to be clear — I did not create that model, I only used it! 😅
I've tried a few things like LLMWare, but I usually prefer just to make my own thing, so I know how everything works. Of course, I use libraries for lots of my AI things, but mostly just the Hugging Face transformers library and a couple others.
LLMWare is on Hugging Face too, though 😉
I'll check it out if I come across it :D
Great!
Damn, great one bro!
Haha.. Thanks
Very informative. Thank you!
You're welcome! Thanks for reading!
✨✨💯💯
😉
Great article and consistency too 😉
Thanks for the read Vamshi
Nice information
I'm glad that you liked it!
Nice article!
Thanks for reading!