Tag: LLM
November 13, 2024 5:14 pm
483 views
Adding own Documents to your Local AI Using RAG
Introduction
This is part 4 of a series on running AI locally on an openSUSE system. Previous parts can be found here:
Running AI locally
Generating images with LocalAI using a GPU
Introduction to AI training with openSUSE
Since we have LocalAI running, have generated images and text, and have even trained our own LoRAs, another big topic is […]
Tags: AI, Container, image generation, kohya, Linux, LLM, localai, LoRA, low ranking adapter, openSUSE, Training
Categories: Containers, Free Tools, Products, SUSE AI
October 23, 2024 2:50 pm
645 views
Introduction to AI training with openSUSE
Introduction
In my last posts I explained how to run AI models on an openSUSE system using LocalAI. Now I'd like to introduce you to AI training with a small guide on creating a Low-Rank Adaptation, also known as a LoRA, and using it with LocalAI on your system. This way, you can leverage […]
Tags: AI, Container, image generation, kohya, Linux, LLM, localai, LoRA, low ranking adapter, openSUSE, Training
October 4, 2024 10:41 am
1,342 views
Generating images with LocalAI using a GPU
Introduction
In my last blog post on using code-generating LLMs on openSUSE, I explained how to run models on your local machine without utilizing a GPU. In this post I want to show how to set up your system to make use of an available GPU for running […]
Tags: AI, Artificial Intelligence, GPU, image generation, LLM, localai, neural net, NVIDIA, stablediffusion, SUSE