Documentation | Blog | Demo Video
Arguflow is a truly all-in-one service for hosting AI-powered semantic search and LLM retrieval-augmented generation (RAG) on your data.
- Find an issue in the issues tab that you would like to work on.
- Fork the repository and clone it to your local machine.
- Create a new branch with a descriptive name: `git checkout -b your-branch-name`.
- Solve the issue by adding or removing code on your forked branch.
- Test your changes locally to ensure that they do not break anything.
- Commit your changes with a descriptive commit message: `git commit -m "Add descriptive commit message here"`.
- Push your changes to your forked repository: `git push origin your-branch-name`.
- Open a pull request to the main repository and describe your changes in the PR description.
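Assuming a GitHub fork, the steps above boil down to the command sequence below; it is simulated here against a local bare repository so it can be run anywhere, and the repository, branch, and file names are illustrative placeholders:

```shell
set -e
# fork.git stands in for your GitHub fork; "fix-chat-scroll" is an example
# descriptive branch name; CHANGES.md is a placeholder edit.
git init -q --bare fork.git
git clone -q fork.git work
git -C work config user.email "you@example.com"
git -C work config user.name "Your Name"
git -C work checkout -q -b fix-chat-scroll
echo "example change" > work/CHANGES.md
git -C work add CHANGES.md
git -C work commit -q -m "Add CHANGES.md with an example change"
git -C work push -q origin fix-chat-scroll
```

After the push, GitHub will offer to open a pull request from `fix-chat-scroll` against the main repository.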
We have a full self-hosting guide available on our documentation page here.
```
sudo apt install -y \
    curl \
    gcc \
    g++ \
    make \
    pkg-config \
    python3 \
    python3-pip \
    libpq-dev \
    libssl-dev \
    openssl \
    libreoffice
```
You can use the following, but we recommend using NVM and then running `yarn --cwd ./server-nodejs install`.
```
RUN curl -fsSL https://deb.nodesource.com/setup_18.x | bash - && \
    apt-get install -y nodejs && \
    npm install -g yarn && \
    yarn --cwd ./server-nodejs install
```
```
yarn --cwd ./server/server-nodejs
```
```
cp .env.chat ./chat/.env
cp .env.search ./search/.env
cp .env.server ./server/.env
```
Here is a guide for acquiring an OpenAI API key.
- Open the `./server/.env` file.
- Replace the value for `OPENAI_API_KEY` with your own OpenAI API key.
```
cat .env.chat .env.search .env.server .env.docker-compose > .env
```
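As a minimal illustration of that merge, using stand-in files and variable names (the real templates define many more settings):

```shell
set -e
# Stand-in env fragments; the real files ship with the repository.
printf 'CHAT_EXAMPLE_VAR=1\n' > .env.chat
printf 'SEARCH_EXAMPLE_VAR=2\n' > .env.search
printf 'OPENAI_API_KEY=sk-placeholder\n' > .env.server
printf 'COMPOSE_EXAMPLE_VAR=4\n' > .env.docker-compose
# Concatenate in the documented order into the single .env docker-compose reads.
cat .env.chat .env.search .env.server .env.docker-compose > .env
```

Because the files are simply concatenated, a variable defined in a later file wins if docker-compose sees the same name twice, so keep the keys in each template distinct.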
```
./convenience.sh -l
```
We know this is not ideal. Currently, we recommend managing these processes through tmux or VS Code terminal tabs.
```
cd server
cargo watch -x run
```

```
cd search
yarn
yarn dev
```

```
cd chat
yarn
yarn dev
```
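Each of those dev servers blocks its terminal, so one option is a launcher script that gives each service its own tmux window. The snippet below only writes that launcher; the `dev.sh` filename and the `arguflow` session name are our own placeholders, and it assumes you run it from the repository root:

```shell
# Write a launcher that starts each dev server in its own tmux window.
cat > dev.sh <<'EOF'
#!/bin/sh
tmux new-session -d -s arguflow -n server 'cd server && cargo watch -x run'
tmux new-window -t arguflow -n search 'cd search && yarn && yarn dev'
tmux new-window -t arguflow -n chat 'cd chat && yarn && yarn dev'
tmux attach -t arguflow
EOF
chmod +x dev.sh
```

Running `./dev.sh` then attaches to a session with three windows, one per service, and `Ctrl-b n` cycles between their logs.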