docs: update installation guide #664

Merged
merged 6 commits into from
Nov 22, 2023
64 changes: 64 additions & 0 deletions docs/docs/install/from-source.md
@@ -0,0 +1,64 @@
---
title: From Source
---

# Install Jan from Source

## Installation

### Prerequisites
Before proceeding with the installation of Jan from source, ensure that the following software versions are installed on your system:
- Node.js version 20.0.0 or higher
- Yarn version 1.22.0 or higher
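The prerequisite check can be scripted. The sketch below is not part of the Jan repo; it assumes GNU `sort` (for `sort -V`) and defines a hypothetical `version_ge` helper:

```shell
# version_ge A B: succeeds when dot-separated version A >= version B
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Example: check a Node.js version string against the 20.0.0 minimum
if version_ge "20.1.0" "20.0.0"; then echo "Node.js version OK"; fi
```

In practice you would feed it real output, e.g. `version_ge "$(node --version | tr -d v)" "20.0.0"`.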

### Instructions
> **_Note:_** These instructions have been tested on macOS only.

1. Clone the Jan repository from GitHub and check out the desired branch
```bash
git clone https://github.com/janhq/jan
cd jan
git checkout DESIRED_BRANCH
```
2. Install the required dependencies using Yarn
```bash
yarn install

# Build core module
yarn build:core

# Package base plugins
yarn build:plugins

# Package the uikit
yarn build:uikit
```
3. Run Jan in development mode
```bash
yarn dev
```
This will start the development server and open the desktop app. During this step, you may encounter notifications about installing base plugins. Simply click `OK` and `Next` to continue.

#### Production build
Build the app for production (macOS M1/M2); the result is placed in the `dist` folder:

```bash
# Steps 1 and 2 from the previous section
git clone https://github.com/janhq/jan
cd jan
yarn install

# Build core module
yarn build:core

# Package base plugins
yarn build:plugins

# Package the uikit
yarn build:uikit

# Build the app
yarn build
```

This completes the installation process for Jan from source. The production-ready app for macOS can be found in the dist folder.
77 changes: 11 additions & 66 deletions docs/docs/install/linux.md
@@ -2,76 +2,21 @@
title: Linux
---

# Jan on Linux

## Installation

1. To download the latest version of Jan on Linux, visit [Jan's homepage](https://jan.ai/).
2. For Debian/Ubuntu-based distributions, the recommended installation method is the `.deb` package (64-bit). It can be installed either through the graphical software center, if available, or via the command line:
```bash
sudo apt install ./jan-linux-amd64-<version>.deb
# For arm64: sudo apt install ./jan-linux-arm64-<version>.deb
```

:::tip

For faster inference, enable your NVIDIA GPU. Make sure the CUDA toolkit is installed; it is available from your Linux distro's package manager (e.g. `apt install nvidia-cuda-toolkit`) or from the [CUDA Toolkit](https://developer.nvidia.com/cuda-downloads) downloads page.

:::

Verify that your NVIDIA GPU setup is working with:

```bash
nvidia-smi
```

:::tip

For AMD GPUs, install ROCm from your Linux distro's package manager or follow the [ROCm Quick Start (Linux)](https://rocm.docs.amd.com/en/latest/deploy/linux/quick_start.html) guide.

:::


## Uninstall Jan

To uninstall Jan on Linux, use your package manager's uninstall or remove option. For Debian/Ubuntu-based distributions, if you installed Jan via the `.deb` package, run:

```bash
sudo apt-get remove jan
# where jan is the name of the Jan package
```

If you wish to completely remove all user data associated with Jan after uninstallation, delete the user data folders located at `$HOME/.config/Jan` and `~/.jan`. This returns your system to its state prior to the installation of Jan, and can also be used to reset all settings if you are experiencing any issues with Jan.
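The user-data cleanup can be done from the terminal; a minimal sketch using the folders named above (note that `rm -rf` is irreversible, so back up anything you want to keep first):

```shell
# Remove Jan's user data folders to fully reset the installation
rm -rf "$HOME/.config/Jan" "$HOME/.jan"
```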
110 changes: 35 additions & 75 deletions docs/docs/install/mac.md
@@ -1,84 +1,44 @@
---
title: Mac
---

# Jan on macOS

## Installation
1. To download the latest version of Jan on macOS, visit [Jan's homepage](https://jan.ai/).
2. On the homepage, choose the release that matches your system architecture:
   - Intel Mac: `jan-mac-x64-<version>.dmg`
   - Apple Silicon Mac: `jan-mac-arm64-<version>.dmg`

## Uninstall Jan
As Jan is in development mode, you might end up with a broken build.

To reset your installation:
1. Delete Jan from your `/Applications` folder


2. Delete Application data
```bash
# Newer versions
rm -rf /Users/$(whoami)/Library/Application\ Support/jan

# Versions 0.2.0 and older
rm -rf /Users/$(whoami)/Library/Application\ Support/jan-electron
```
3. Clear Application cache
```bash
rm -rf /Users/$(whoami)/Library/Caches/jan*
```
4. Use the following commands to remove any dangling backend processes:
```bash
ps aux | grep nitro
```
Look for processes like "nitro" and "nitro_arm_64," and kill them one by one with:
```bash
kill -9 <PID>
```
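The find-and-kill steps can also be combined into one pipeline. This is a sketch assuming standard `ps`, `grep`, and `awk`; the `grep -v grep` drops the grep process itself from the listing:

```shell
# Print the PIDs of any lingering nitro processes (empty output means none)
ps aux | grep -i nitro | grep -v grep | awk '{print $2}'
```

Pass the resulting PIDs to `kill -9` as shown above.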

## FAQs

### Does Jan run on Apple Silicon machines?
Yes, Jan provides macOS Arm64 builds that run on Macs with Apple Silicon chips. You can install Jan on your Apple Silicon Mac by downloading the `jan-mac-arm64-<version>.dmg` file from [Jan's homepage](https://jan.ai/).
### Which package should I download for my Mac?
Jan supports both Intel and Apple Silicon Macs. To find out which package is appropriate for your Mac, follow Apple's official guide: [Get system information about your Mac - Apple Support](https://support.apple.com/guide/mac-help/syspr35536/mac).
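You can also check from the terminal: `uname -m` reports the machine architecture. The helper below is illustrative (not part of Jan) and maps the result to the package names above:

```shell
# Map a Mac architecture string (as printed by `uname -m`) to the matching Jan package
jan_package_for() {
  case "$1" in
    arm64)  echo "jan-mac-arm64-<version>.dmg" ;;  # Apple Silicon
    x86_64) echo "jan-mac-x64-<version>.dmg" ;;    # Intel
  esac
}

jan_package_for "$(uname -m)"
```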
43 changes: 43 additions & 0 deletions docs/docs/install/overview.md
@@ -0,0 +1,43 @@
---
title: Overview
---

Getting open-source AI models up and running on your own computer with Jan is quick and easy. Jan is lightweight and runs on a wide range of hardware and platform versions. Requirements specific to your platform are outlined below.

## Cross platform
A free, open-source alternative to OpenAI that runs on Linux, macOS, and Windows. Please refer to the specific guide for your platform:
- [Linux](/install/linux)
- [MacOS (Mac Intel Chip and Mac Apple Silicon Chip)](/install/mac)
- [Windows](/install/windows)

## Requirements for Jan

### Hardware
Jan is a lightweight platform designed for seamless download, storage, and execution of open-source Large Language Models (LLMs). With a small download size of less than 200 MB and a disk footprint of under 300 MB, Jan is optimized for efficiency and should run smoothly on modern hardware.

To ensure optimal performance while using Jan and handling LLM models, it is recommended to meet the following system requirements:

#### Disk space
- Minimum requirement
- At least 5 GB of free disk space is required to accommodate the download, storage, and management of open-source LLM models.
- Recommended
- For an optimal experience and to run most available open-source LLM models on Jan, it is recommended to have 10 GB of free disk space.
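You can check available disk space from the terminal before downloading models; `df` is standard on Linux and macOS:

```shell
# Show free space (human-readable) on the filesystem holding your home directory
df -h "$HOME"
```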

#### Random Access Memory (RAM) and Graphics Processing Unit Video Random Access Memory (GPU VRAM)
The amount of RAM on your system plays a crucial role in determining the size and complexity of LLM models you can effectively run. Jan can be utilized on traditional computers where RAM is a key resource. For enhanced performance, Jan also supports GPU acceleration, utilizing the VRAM of your graphics card.

#### RAM and VRAM Sizes in Relation to LLM Models
The RAM and GPU VRAM requirements depend on the size and complexity of the LLM models you intend to run. Some general guidelines to help you determine how much RAM or VRAM you need:
- 8 GB of RAM: suitable for running smaller models, such as 3B models or quantized 7B models
- 16 GB of RAM (recommended): the "minimum usable" threshold for most models, particularly 7B models (e.g. Mistral 7B)
- Beyond 16 GB of RAM: required for larger and more sophisticated models, such as 70B models
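As a rough rule of thumb, a model's memory footprint can be estimated from its parameter count and quantization width. The sketch below is not an official Jan formula — the 1.2 overhead factor (for KV cache and activations) is an assumption — and uses `awk` for the arithmetic:

```shell
# estimate_model_memory_gb <params-in-billions> <bits-per-weight>
# Weights take roughly params * bits / 8 gigabytes; add ~20% overhead.
estimate_model_memory_gb() {
  awk -v p="$1" -v b="$2" 'BEGIN { printf "%.1f\n", p * b / 8 * 1.2 }'
}

estimate_model_memory_gb 7 4   # prints 4.2 (GB) for a 4-bit-quantized 7B model
```

This matches the guidelines above: a 4-bit 7B model fits comfortably in 8 GB, while a 70B model needs well beyond 16 GB.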

### Architecture
Jan is designed to run on multiple architectures, ensuring versatility and widespread usability. The supported architectures include:
#### CPU
- x86: Jan is well-suited for systems with x86 architecture, which is commonly found in traditional desktops and laptops. It ensures smooth performance on a variety of devices using x86 processors.
- ARM: Jan is optimized to run efficiently on ARM-based systems, extending compatibility to a broad range of devices using ARM processors.
#### GPU
- NVIDIA: Jan leverages the computational capabilities of NVIDIA GPUs through llama.cpp, accelerating resource-intensive LLM tasks. Users can expect faster processing and improved responsiveness when running on NVIDIA GPUs.
- AMD: Users with AMD GPUs can seamlessly integrate Jan's GPU acceleration, offering a comprehensive solution for diverse hardware configurations and preferences.
- ARM64 Mac: Jan seamlessly supports ARM64 architecture on Mac systems, leveraging Metal for efficient GPU operations. This ensures a smooth and efficient experience for users with Apple Silicon Chips, utilizing the power of Metal for optimal performance on ARM64 Mac devices.