Run LLMs Locally on Android: Llama 3, Gemma & More

Published: November 7, 2024
on channel: Himas World

Run LLMs locally on an Android device using Ollama. Ollama is a simple tool for running open-source models like Llama 3, Gemma, TinyLlama & more.

Downloads
Termux - https://github.com/termux/termux-app
Ollama Snippet - https://gitlab.com/-/snippets/3682973

Command List
termux-setup-storage
termux-change-repo
pkg upgrade
pkg install git cmake golang

Setup Ollama
git clone --depth 1 https://github.com/ollama/ollama.git
cd ollama
go generate ./...
go build .
./ollama serve &
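Since `./ollama serve &` starts the server in the background, an immediate pull or run can fail before the port is open. One way to avoid that is to poll the default endpoint first; the `wait_for_ollama` helper below is an illustrative sketch (Ollama's server listens on port 11434 by default), and `tinyllama` is just an example of a model small enough for most phones:

```shell
# Poll Ollama's default HTTP endpoint until the background server answers,
# giving up after the number of seconds passed as $1.
wait_for_ollama() {
  timeout="$1"
  i=0
  while [ "$i" -lt "$timeout" ]; do
    # curl -sf exits nonzero while the connection is still refused.
    if curl -sf http://127.0.0.1:11434/ >/dev/null 2>&1; then
      return 0
    fi
    sleep 1
    i=$((i + 1))
  done
  return 1
}

# Usage after `./ollama serve &`:
#   wait_for_ollama 30 && ./ollama run tinyllama "Hello"
```

Once the server responds, `./ollama run <model>` will download the model on first use, so expect the initial run to take a while on mobile bandwidth.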

Timestamps
00:30 Introduction
01:02 Installation
02:07 Credentials Install and end