Mtt

Is anyone running an LLM or text-to-image model locally on a Mac? If so, let me know the details of your setup.

7robots

@Mtt For running LLMs, I’ve been switching back and forth between LM Studio and Anaconda’s AI Navigator. They are similar apps – each provides an easy UI for discovering LLMs, downloading and loading them, and then chatting with the model. Both also let you set up an API endpoint to access the LLM programmatically (typically through an OpenAI-compatible specification).
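For anyone curious what "programmatic access through an OpenAI specification" looks like in practice, here's a minimal Python sketch that builds a chat request against a locally hosted server. It assumes LM Studio's default local address of `http://localhost:1234/v1` and a hypothetical model name – swap in whatever model you actually have loaded:

```python
import json
import urllib.request

def build_chat_request(model, prompt, base_url="http://localhost:1234/v1"):
    """Build an OpenAI-style chat completion request for a local server.

    base_url is LM Studio's default local endpoint; AI Navigator exposes
    a similar OpenAI-compatible endpoint at its own address.
    """
    payload = {
        "model": model,  # name of the model you loaded locally
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# "llama-3.2-3b-instruct" is just a placeholder model name here.
req = build_chat_request("llama-3.2-3b-instruct", "Say hello in one word.")
print(req.full_url)

# To actually send it (requires the local server to be running):
# with urllib.request.urlopen(req) as resp:
#     reply = json.loads(resp.read())["choices"][0]["message"]["content"]
```

Because the endpoint follows the OpenAI wire format, the official `openai` Python client also works if you point its `base_url` at the local server.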

LM Studio ties into Hugging Face for its choice of models, while AI Navigator seems to pull from a list of models curated directly by Anaconda. That list is smaller than what Hugging Face offers.

Both work really well – I have an M4 Pro-based Mac Mini with 48 GB of RAM, which allows a pretty decent range of LLMs to run. If you’re running on a Mac, try to find MLX-based models, as they are optimized for Apple Silicon. However, the majority of models on Hugging Face are GGUF-based.

Mtt

@7robots Good info. I have a MacBook Pro with an M4 Pro and 24 GB of RAM. Not looking to go crazy, just experimenting with some basic stuff. I’ve used Whisper locally so far, but that’s it.

Have you done anything with image generation or editing (more interested in editing)? I’ve considered trying MagicQuill, but I just started looking into stuff.

7robots

@Mtt I haven’t really dabbled in “local” AI image generation or editing, so I don’t have any insider intel to share. I hope you post about what you learn as you progress!
