
Advanced LLM AI System by Ollama with AMD Card Support

Ollama, software for running large language models (LLMs), has been updated to version 0.1.29 and now supports AMD graphics cards through the ROCm stack, covering consumer Radeon models, workstation cards, and server-class accelerators in the Instinct line.

In practice, Ollama had quietly supported ROCm for some time, though a few bugs remained. This release marks the project's first official support, and it runs on Linux, Windows, and Docker.
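For the Docker route mentioned above, a minimal sketch of starting Ollama with AMD GPU acceleration might look like the following. The `:rocm` image tag and the `--device` flags follow Ollama's published Docker instructions, but treat the exact names and paths as assumptions to verify against your own ROCm setup.

```shell
# Start the Ollama server container with AMD GPU access.
# /dev/kfd is the AMD kernel driver interface and /dev/dri exposes the GPUs;
# the named volume persists downloaded models across container restarts.
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm

# Then pull and chat with a model inside the running container:
docker exec -it ollama ollama run llama2
```

On bare-metal Linux, the standard `ollama` binary detects a supported ROCm installation automatically, so no extra flags should be needed there.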

Previously, running LLMs typically relied on NVIDIA cards due to the dominance of the CUDA library, but support for AMD chips has been gradually increasing.

Source: Ollama

TLDR: Ollama software now officially supports AMD cards through ROCm, expanding its compatibility beyond NVIDIA GPUs.
