@Mtt for running LLMs, I’ve been switching back and forth between LM Studio and Anaconda’s AI Navigator. They’re pretty similar apps – each provides an easy UI for discovering LLMs, downloading and loading them, and then chatting with the model. Both also let you set up an API endpoint to programmatically access the LLM (typically through an OpenAI-compatible API).
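To give a feel for that OpenAI-compatible endpoint, here’s a minimal stdlib-only sketch. The `BASE_URL` (LM Studio’s default local server is `http://localhost:1234/v1`; AI Navigator’s port may differ) and the `MODEL` id are assumptions – swap in whatever your local server reports.

```python
# Minimal sketch of calling a locally hosted LLM through an
# OpenAI-compatible chat-completions endpoint, using only the
# standard library. BASE_URL and MODEL are assumptions -- adjust
# them to match your local server.
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"   # LM Studio's default; yours may differ
MODEL = "local-model"                   # placeholder model id

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build a POST request in the OpenAI chat-completions wire format."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def chat(prompt: str) -> str:
    """Send the request and return the assistant's reply text."""
    with urllib.request.urlopen(build_chat_request(prompt)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

With a model loaded and the local server running, `chat("Hello!")` returns the model’s reply; because the wire format follows the OpenAI spec, the official `openai` client library pointed at the same `base_url` works too.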
LM Studio ties into HuggingFace for its choice of models, while AI Navigator pulls from a smaller list of models curated directly by Anaconda.
Both work really well – I have an M4 Pro-based Mac Mini with 48 GB of RAM, which can handle a pretty decent range of LLMs. If you’re running on a Mac, try to find MLX-based models, as they’re optimized for Apple Silicon. The majority of models on HuggingFace, however, are GGUF-based.