This app makes using Ollama local AI on MacOS devices so easy



I’ve turned to locally installed AI for research because I don’t want third parties using my information to either build a profile or train their large language models (LLMs). 

My local AI of choice is the open-source Ollama. I recently wrote a piece on how to make this local AI easier to use with the help of a browser extension, which is my approach on Linux. But on MacOS, I turn to an easy-to-use, free app called Msty. 

Also: How to turn Ollama from a terminal tool into a browser-based AI with this free extension

Msty allows you to use locally installed and online AI models. However, I default to the locally installed option. And, unlike the other options for Ollama, there’s no container to deploy, no terminal to use, and no need to open another browser tab. 

Msty’s features include split chats (so you can run more than one query at a time), response regeneration, chat cloning, support for multiple models, real-time data summoning (which only works with certain models), Knowledge Stacks (collections of files, folders, Obsidian vaults, notes, and more that your local model can draw on), a prompt library, and more.

Msty is one of the best tools for interacting with Ollama. Here’s how to use it.

Installing Msty

What you’ll need: The only things you’ll need for this are a MacOS device and Ollama installed and running. If you haven’t installed Ollama, do that first (here’s how). You’ll also need to pull down one of the local models (which is demonstrated in the article above).
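If you’re comfortable in the terminal, those prerequisites take only a few commands. This is just a sketch, assuming you use Homebrew and want the Llama 3.2 model referenced later in this guide; the installer from ollama.com works just as well:

    brew install ollama     # install the Ollama runtime
    ollama serve            # start the local server so apps like Msty can reach it
    ollama pull llama3.2    # in a second terminal window, download a local model

Once the pull finishes, Ollama is listening on its default port and ready for Msty to connect.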

Head to the Msty website, click the Download Msty dropdown, select Mac, and then select either Apple Silicon or Intel.



When the download is complete, double-click the file and, when prompted, drag the Msty icon to the Applications folder.



Using Msty

1. Open Msty

Next, open Launchpad and locate the launcher for Msty. Click the launcher to open the app.

2. Connect your local Ollama model

When you first run Msty, click Setup Local AI and the app will download the necessary components. Once the download completes, Msty takes care of the configuration and pulls down a default local model of its own, separate from Ollama. 

Also: I tried Sanctum’s local AI app, and it’s exactly what I needed to keep my data private

To connect Msty to Ollama, click Local AI Models in the sidebar and then click the download button associated with Llama 3.2. Once the download is complete, you can select the model from the models dropdown. You can also add online models, but for those you’ll need to retrieve an API key from your account with that particular provider. Msty should now be connected to the local Ollama LLM. 

The Msty local model downloader.

I prefer to use the Ollama local model.

Screenshot by Jack Wallen/ZDNET

At this point, you can type your first query and wait for the response. 
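If you want to confirm outside of Msty that the local model is responding, you can hit Ollama’s HTTP API directly. This is a sketch assuming a stock install, which listens on port 11434 by default:

    curl http://localhost:11434/api/generate -d '{
      "model": "llama3.2",
      "prompt": "Why is the sky blue?",
      "stream": false
    }'

A JSON response from llama3.2 confirms that the back end Msty is talking to is up and running.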

3. Change the model instructions

One of the cool features of Msty is that it allows you to change the model instructions. 

For example, you might want the local LLM to act as an AI-assisted doctor, a writing assistant, an accountant, an alien anthropologist, or an artistic advisor. 

To change the model instructions, click Edit Model Instructions in the center of the app and then click the tiny chat button to the left of the broom icon. 

Also: The best AI for coding in 2025 (and what not to use)

From the popup menu, you can select the instructions you want to apply. Click “Apply to this chat” before running your first query.
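A model instruction is essentially a system prompt applied to the chat. If you’d rather bake an instruction into a model permanently instead of setting it per chat, Ollama itself supports that through a Modelfile. A minimal sketch, assuming llama3.2 is already pulled (the name doctor-llama is just an example):

    cat <<'EOF' > Modelfile
    FROM llama3.2
    SYSTEM "You are an AI-assisted doctor. Answer carefully and note when a human physician should be consulted."
    EOF
    ollama create doctor-llama -f Modelfile

You could then chat with doctor-llama like any other local Ollama model.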

The Msty model instructions pop-up.

You can choose from several model instructions to hone your queries.

Screenshot by Jack Wallen/ZDNET

There are many other things Msty can do, but this guide will get you up and running quickly. I would suggest starting with the basics and, as you get used to the app, venturing into more complicated processes.




