Installing Text Generation WebUI: A Step-by-Step Guide


Text-Generation-WebUI is a Gradio-based user interface for running Large Language Models (LLMs). It supports several backends and model formats, including Transformers, GPTQ, llama.cpp (GGML/GGUF), and Llama models, and it provides a user-friendly way to interact with these models and generate text.

The Text Generation Web UI ships with several features: a dropdown menu for switching between models, a notebook mode that resembles OpenAI’s playground, a chat mode for conversation, and more. It aims to become the go-to web UI for text generation.

Installation:

There are two ways to install Text-Generation-WebUI: the one-click installers, or a manual installation with Conda.


One-Click Installers:

The one-click installers are available for Windows, Linux, macOS, and WSL. Download the zip file for your operating system from the GitHub repository, extract it, and double-click on “start”. The web UI and all its dependencies will be installed in the same folder. There is no need to run the installers as admin. The archives can be downloaded from the official GitHub repository: https://github.com/oobabooga/text-generation-webui.
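On Linux, the same process can be done from a terminal. The archive and script names below are assumptions based on the repository’s release naming, so adjust them to match the file you actually downloaded:

  # Extract the downloaded one-click installer archive (file name is an example)
  unzip oobabooga_linux.zip

  cd oobabooga_linux

  # The start script installs the web UI and its dependencies into this folder
  ./start_linux.sh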



Manual Installation using Conda:

If you have some experience with the command line, you can install Text-Generation-WebUI manually using Conda. Here are the steps:

a. Install Conda: You can download Conda from its official website. On Linux or WSL, you can automatically install it using these two commands:

  curl -sL "https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh" > "Miniconda3.sh"

  bash Miniconda3.sh
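After the installer finishes, you may need to restart your shell (or source your shell profile) for the conda command to become available. You can then verify the installation:

  conda --version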

b. Create a new Conda environment:

  conda create -n textgen python=3.10.9

  conda activate textgen
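You can confirm that the new environment is active and uses the expected Python version:

  python --version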


c. Install PyTorch: The command to install PyTorch depends on your system and GPU. You can find the up-to-date commands on the PyTorch website.
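For example, on Linux with an NVIDIA GPU, the command typically looks like the one below. The CUDA version shown is only an assumption; copy the exact command generated for your system on the PyTorch website:

  # Example only: CUDA 12.1 build of PyTorch installed via pip
  pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121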


d. Install the web UI:

  git clone https://github.com/oobabooga/text-generation-webui

  cd text-generation-webui

  pip install -r requirements.txt
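The web UI also needs at least one model in its models folder before it can generate text. The repository includes a download-model.py helper script for fetching models from Hugging Face; the model name below is only an example:

  # Download an example model from Hugging Face into the models folder
  python download-model.py facebook/opt-1.3b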


Starting the Web UI:

After installation, you can start the web UI with the following commands:

conda activate textgen

cd text-generation-webui

python server.py

Then, browse to http://localhost:7860/?__theme=dark to access the web UI.
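server.py also accepts command-line flags. For example, the --listen flag makes the UI reachable from other machines on your network, and --model loads a specific model at startup; the model folder name below is just a placeholder:

python server.py --listen --model your-model-folder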


Conclusion:

Installing Text-Generation-WebUI is a straightforward process, whether you choose to use the one-click installers or install it manually using Conda. Once installed, you can start generating text using large language models right from your browser.


More information can be found on the project’s GitHub page: https://github.com/oobabooga/text-generation-webui.

