Probably better to ask on !localllama. Ollama should be able to give you a decent LLM, and RAG (Retrieval Augmented Generation) will let it reference your dataset.
The only issue is that you asked for a smart model, which usually means a larger one, and the RAG components consume additional memory on top of that, which may be more than a typical laptop can handle. Smaller models fit more easily, but they have a higher tendency to hallucinate, i.e. confidently produce incorrect answers.
Short answer: yes, you can do it. It's just a matter of how much RAM you have available and how long you're willing to wait for an answer. A rough sketch of what the RAG setup looks like is below.
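To make the RAG idea concrete, here's a minimal sketch in Python. To be clear about assumptions: this presumes the `ollama` Python client and `numpy` are installed, Ollama is running locally, and you've pulled `llama3` and `nomic-embed-text` (any chat/embedding model pair works); the chunk texts are placeholders for your own material. A real setup would use a framework like LangChain or LlamaIndex with a proper vector store, but the principle is the same.

```python
# Minimal local RAG sketch (assumes: ollama running, models pulled,
# `pip install ollama numpy`). Chunk texts below are placeholders.
import ollama
import numpy as np

# Your literature, pre-split into chunks (real setups chunk files automatically).
chunks = [
    "Hamlet delays his revenge because ...",
    "The green light in Gatsby symbolizes ...",
    "Orwell uses Newspeak to show ...",
]

def embed(text: str) -> np.ndarray:
    """Turn text into a vector using a local embedding model."""
    result = ollama.embeddings(model="nomic-embed-text", prompt=text)
    return np.array(result["embedding"])

# Index every chunk once; this index is the extra memory RAG adds on top of the LLM.
index = [(chunk, embed(chunk)) for chunk in chunks]

def ask(question: str) -> str:
    q = embed(question)
    # Rank chunks by cosine similarity to the question.
    scored = sorted(
        index,
        key=lambda p: float(np.dot(q, p[1]) / (np.linalg.norm(q) * np.linalg.norm(p[1]))),
        reverse=True,
    )
    context = "\n\n".join(chunk for chunk, _ in scored[:2])
    # Hand the retrieved chunks to the LLM so it answers from YOUR material.
    response = ollama.chat(
        model="llama3",
        messages=[{
            "role": "user",
            "content": f"Answer using only this material:\n{context}\n\nQuestion: {question}",
        }],
    )
    return response["message"]["content"]

print(ask("What does the green light symbolize?"))
```

The retrieval step is also why RAM matters twice here: the LLM weights take most of it, and the embedding index grows with the size of your document collection.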
Self hosting an LLM for research
I am a teacher and I have a LOT of different literature material that I wish to study and play around with....