Ollama, software for running large language models (LLMs) locally, has been updated to version 0.1.29, adding support for AMD GPUs through ROCm, covering consumer Radeon cards, workstation models, and Instinct server accelerators.
In fact, Ollama has quietly supported ROCm for some time, though with a few lingering bugs. This release marks the project's first official AMD support, running on Linux, Windows, and Docker.
Previously, running LLMs typically relied on NVIDIA cards because of CUDA's dominance, but support for AMD chips has been gradually increasing.
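For readers who want to try it, below is a minimal sketch of querying a locally running Ollama server from Python over its HTTP API. The Docker invocation in the comment follows Ollama's published instructions for the ROCm image; the model name and prompt are illustrative only and assume the model has already been pulled.

```python
import json
import urllib.request

# Assumes an Ollama server is already running locally. On an AMD GPU it can
# be started with the ROCm-enabled Docker image, per Ollama's documentation:
#   docker run -d --device /dev/kfd --device /dev/dri \
#     -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama:rocm
#
# Ollama listens on port 11434 by default and exposes a JSON HTTP API.

OLLAMA_URL = "http://localhost:11434/api/generate"

def generate(model: str, prompt: str) -> str:
    """Send a single non-streaming generation request to Ollama."""
    payload = json.dumps({
        "model": model,    # a model pulled beforehand, e.g. `ollama pull llama2`
        "prompt": prompt,
        "stream": False,   # return one JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("llama2", "Why is the sky blue?"))
```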
Source: Ollama
TLDR: Ollama now officially supports AMD GPUs through ROCm, expanding its compatibility beyond NVIDIA hardware.