Install Ollama on Linux

Deploy and run the model images you are interested in.

# Linux

## Install

To install Ollama, run the following command:

```shell
curl -fsSL https://ollama.com/install.sh | sh
```

## Manual install

<Note>
  If you are upgrading from a prior version, you should remove the old libraries
  with `sudo rm -rf /usr/lib/ollama` first.

</Note>

Download and extract the package:

```shell
curl -fsSL https://ollama.com/download/ollama-linux-amd64.tgz \
    | sudo tar zx -C /usr
```
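As a quick sanity check (an optional step, not in the upstream instructions), confirm the tarball placed the binary and libraries where the rest of this guide expects them:

```shell
# The package extracts bin/ollama and lib/ollama into /usr
ls -l /usr/bin/ollama
ls /usr/lib/ollama
```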

Start Ollama:

```shell
ollama serve
```

In another terminal, verify that Ollama is running:

```shell
ollama -v
```
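`ollama serve` also exposes an HTTP API, by default on `127.0.0.1:11434`, so as an optional extra check (not part of the upstream steps) you can query the version endpoint directly:

```shell
# Should return JSON such as {"version":"0.5.7"}
curl http://localhost:11434/api/version
```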

### AMD GPU install

If you have an AMD GPU, also download and extract the additional ROCm package:

```shell
curl -fsSL https://ollama.com/download/ollama-linux-amd64-rocm.tgz \
    | sudo tar zx -C /usr
```

### ARM64 install

Download and extract the ARM64-specific package:

```shell
curl -fsSL https://ollama.com/download/ollama-linux-arm64.tgz \
    | sudo tar zx -C /usr
```

### Adding Ollama as a startup service (recommended)

Create a user and group for Ollama:

```shell
sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)
```
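Optionally, confirm the account and group were created (an extra check, not in the upstream docs). Note that your own new group membership only takes effect in a new login session:

```shell
# Service account created by useradd above
id ollama
# Your groups; 'ollama' appears after you log in again
id -nG $(whoami)
```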

Create a service file in `/etc/systemd/system/ollama.service`:

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"

[Install]
WantedBy=multi-user.target
```

Then reload systemd and enable the service:

```shell
sudo systemctl daemon-reload
sudo systemctl enable ollama
```
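Note that `enable` only registers the service to start at boot; it is started explicitly in the Start Ollama step below. To enable and start it in one step, systemd's standard `--now` flag does both:

```shell
sudo systemctl enable --now ollama
```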

### Install CUDA drivers (optional)

[Download and install](https://developer.nvidia.com/cuda-downloads) CUDA.

Verify that the drivers are installed by running the following command, which should print details about your GPU:

```shell
nvidia-smi
```

### Install AMD ROCm drivers (optional)

[Download and Install](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/tutorial/quick-start.html) ROCm v6.

### Start Ollama

Start Ollama and verify it is running:

```shell
sudo systemctl start ollama
sudo systemctl status ollama
```

<Note>
  While AMD has contributed the `amdgpu` driver upstream to the official Linux
  kernel source, that version is older and may not support all ROCm features. We
  recommend installing the latest driver from
  [https://www.amd.com/en/support/linux-drivers](https://www.amd.com/en/support/linux-drivers)
  for the best support of your Radeon GPU.
</Note>

## Customizing

To customize the installation of Ollama, you can edit the systemd service file or the environment variables by running:

```shell
sudo systemctl edit ollama
```

Alternatively, create an override file manually in `/etc/systemd/system/ollama.service.d/override.conf`:

```ini
[Service]
Environment="OLLAMA_DEBUG=1"
```
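Two commonly overridden variables are `OLLAMA_HOST` (the bind address) and `OLLAMA_MODELS` (where models are stored); both are documented Ollama settings, though the values below are illustrative:

```ini
[Service]
# Listen on all interfaces rather than only 127.0.0.1
Environment="OLLAMA_HOST=0.0.0.0"
# Hypothetical path; the ollama user must have write access to it
Environment="OLLAMA_MODELS=/data/ollama/models"
```

After changing an override, apply it with `sudo systemctl daemon-reload` followed by `sudo systemctl restart ollama`.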

## Updating

Update Ollama by running the install script again:

```shell
curl -fsSL https://ollama.com/install.sh | sh
```

Or by re-downloading Ollama:

```shell
curl -fsSL https://ollama.com/download/ollama-linux-amd64.tgz \
    | sudo tar zx -C /usr
```
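With either method, restart the service afterwards so it picks up the new binary (assuming the systemd setup from above):

```shell
sudo systemctl restart ollama
```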

## Installing specific versions

Use the `OLLAMA_VERSION` environment variable with the install script to install a specific version of Ollama, including pre-releases. You can find the version numbers on the [releases page](https://github.com/ollama/ollama/releases).

For example:

```shell
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.7 sh
```

## Viewing logs

To view logs of Ollama running as a startup service, run:

```shell
journalctl -e -u ollama
```
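To follow the log live, for example while reproducing a problem, use journalctl's standard `-f` flag:

```shell
# Stream new log lines as they arrive (Ctrl-C to stop)
journalctl -u ollama -f
```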

## Uninstall

Remove the ollama service:

```shell
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
```

Remove the ollama libraries from your lib directory (either `/usr/local/lib`, `/usr/lib`, or `/lib`):

```shell
# The character translation `tr 'bin' 'lib'` turns .../bin/ollama into .../lib/ollama
sudo rm -r $(which ollama | tr 'bin' 'lib')
```

Remove the ollama binary from your bin directory (either `/usr/local/bin`, `/usr/bin`, or `/bin`):

```shell
sudo rm $(which ollama)
```

Remove the downloaded models and Ollama service user and group:

```shell
sudo userdel ollama
sudo groupdel ollama
sudo rm -r /usr/share/ollama
```
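If you also ran `ollama serve` directly as your own user at some point, models and keys may additionally live under `~/.ollama`; as a final, optional cleanup step (not in the upstream instructions):

```shell
# Per-user state; only present if ollama ever ran as your own user
rm -rf ~/.ollama
```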