Node wrapper to run alpaca.cpp and get results from a request URL/API

lcsouzamenezes/alpaca_cpp_node


Run

  • npm install
  • create a .env file
  • set COMMAND_PATH= in the .env file to the path of your executable/build of https://github.com/antimatter15/alpaca.cpp (e.g. path-to-alpaca.cpp/Release/chat.exe)
  • set MODEL_PATH= in the .env file to the path of your model file (e.g. path-to-model/ggml-alpaca-7b-q4.bin)
  • node index.js
  • open localhost:3000 in your browser; you will see a message indicating whether the model is loaded
  • once the model is loaded, open localhost:3000/chat?prompt=your_prompt in your browser to see the model's response
  • the response is sent with Content-Type: text/event-stream


Languages

  • JavaScript 100.0%