Running large language models locally
Model Shop
Ollama and Open WebUI let you join the AI revolution without relying on the cloud.
Large language models (LLMs) such as the ones used by OpenAI's [1] ChatGPT [2] are too resource-intensive to run locally on your own computer. That's why they're deployed as online services that you pay for. However, since ChatGPT's release, smaller LLMs have advanced significantly. Many of these smaller LLMs are open source or have a liberal license (see the "Licenses" box). You can run them on your own computer without sending your input to a cloud server and without paying a fee to an online service.
Because these LLMs are computationally intensive and need a lot of RAM, running them on your CPU can be slow. For optimal performance, you need a GPU, with its many parallel compute cores and dedicated RAM. An NVIDIA or AMD GPU with 8GB of RAM or more is recommended.
In addition to the hardware and the models, you also need software that enables you to run the models. One popular package is Ollama [3], named for Meta AI's large language model Llama [4]. Ollama is a command-line application that runs on Linux, macOS, and Windows, and you can also run it as a server that other software connects to.
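To give a feel for the workflow, here is a minimal sketch of a first session, assuming Ollama's standard Linux install script and a small model pulled from the Ollama library (the model name shown, llama3.2, is just an example; substitute whatever model you choose):

  # Install Ollama using the official install script
  curl -fsSL https://ollama.com/install.sh | sh

  # Download a model and chat with it interactively at the command line
  ollama run llama3.2

  # Alternatively, start Ollama as a server (listening on port 11434 by default)
  # so that other software, such as Open WebUI, can connect to it
  ollama serve

The interactive session drops you into a simple prompt where you type your questions; the server mode is what you use when a front end or another application talks to Ollama over its local API.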
[...]