Release/0.4.13 to main #2916

Merged 27 commits on May 16, 2024
Changes from 1 of 27 commits
012cc80
Merge pull request #2815 from janhq/main
Van-QA Apr 25, 2024
0bad1a4
fix: remove scroll animation chat screen (#2819)
namchuai Apr 25, 2024
ce2d8e5
chore: remove nutjs (#2860)
namchuai May 2, 2024
c6182ab
Customize scroll-bar style (#2857)
QuentinMacheda May 3, 2024
2016eae
Remove hidden overflow property of tailwind Update buttons position…
QuentinMacheda May 3, 2024
c21bc08
Fix eslint issue in EditChatInput (#2864)
QuentinMacheda May 3, 2024
092a572
Feat: Remote API Parameters Correction (#2802)
hahuyhoang411 May 4, 2024
4c88d03
feat: add remote model command-r (#2868)
henryh0x1 May 6, 2024
86fda1c
feat: add model gpt-4 turbo (#2836)
henryh0x1 May 6, 2024
a6ccd67
fix: validate max_token from context_length value (#2870)
urmauur May 6, 2024
1e3e5a8
feat/implement-inference-martian-extension (#2869)
henryh0x1 May 6, 2024
9effb6a
fix: validate context length (#2871)
urmauur May 6, 2024
d226640
Add OpenRouter (#2826)
Inchoker May 6, 2024
2008aae
Feat: Correct context length for models (#2867)
hahuyhoang411 May 6, 2024
0406b51
fix: stop auto scroll if user manually scrolling up (#2874)
namchuai May 6, 2024
efbc96d
feat: inference anthropic extension (#2885)
henryh0x1 May 11, 2024
6af4a2d
feat: add deeplink support (#2883)
namchuai May 13, 2024
1e0d4f3
Feat: Adjust model hub v0.4.13 (#2879)
hahuyhoang411 May 13, 2024
08d15e5
fix: deeplink when app not open on linux (#2893)
namchuai May 13, 2024
eb7e963
add: gpt4o (#2899)
hahuyhoang411 May 14, 2024
aa1f01f
Revert "chore: remove nutjs" and replace nutjs version (#2900)
Van-QA May 15, 2024
1130979
fix: cohere stream param does not work (#2907)
louis-jan May 15, 2024
33697be
Change mac arm64 build use github runner (#2910)
hiento09 May 16, 2024
06be308
Revert "Change mac arm64 build use github runner (#2910)" (#2911)
hiento09 May 16, 2024
0436224
Revert "Revert "Change mac arm64 build use github runner (#2910)" (#2…
hiento09 May 16, 2024
2182599
Chore: Add phi3 (#2914)
hahuyhoang411 May 16, 2024
537ef20
chore: replace nitro by cortex-cpp (#2912)
louis-jan May 16, 2024
Add OpenRouter (#2826)
* Add OpenRouter

* fix cohere setting description

* fix: update to auto router

* fix: auto router

* add: config parameters

* fix: correct max tokens

---------

Co-authored-by: Jack Tri Le <Jack>
Co-authored-by: Hoang Ha <64120343 [email protected]>
Inchoker and hahuyhoang411 committed May 6, 2024
commit d2266405cc56e1c584b022d416f2e4b644114e08
@@ -12,7 +12,7 @@
   {
     "key": "cohere-api-key",
     "title": "API Key",
-    "description": "The Cohere API uses API keys for authentication. Visit your [API Keys](https://platform.openai.com/account/api-keys) page to retrieve the API key you'll use in your requests.",
+    "description": "The Cohere API uses API keys for authentication. Visit your [API Keys](https://dashboard.cohere.com/api-keys) page to retrieve the API key you'll use in your requests.",
     "controllerType": "input",
     "controllerProps": {
       "placeholder": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
79 changes: 79 additions & 0 deletions extensions/inference-openrouter-extension/README.md
@@ -0,0 +1,79 @@
# Open Router Engine Extension

Created using Jan extension example

# Create a Jan Extension using Typescript

Use this template to bootstrap the creation of a TypeScript Jan extension. 🚀

## Create Your Own Extension

To create your own extension, you can use this repository as a template. Just follow the instructions below:

1. Click the Use this template button at the top of the repository
2. Select Create a new repository
3. Select an owner and name for your new repository
4. Click Create repository
5. Clone your new repository

## Initial Setup

After you've cloned the repository to your local machine or codespace, you'll need to perform some initial setup steps before you can develop your extension.

> [!NOTE]
>
> You'll need to have a reasonably modern version of
> [Node.js](https://nodejs.org) handy. If you are using a version manager like
> [`nodenv`](https://github.com/nodenv/nodenv) or
> [`nvm`](https://github.com/nvm-sh/nvm), you can run `nodenv install` in the
> root of your repository to install the version specified in
> [`package.json`](./package.json). Otherwise, 20.x or later should work!

1. :hammer_and_wrench: Install the dependencies

   ```bash
   npm install
   ```

1. :building_construction: Package the TypeScript for distribution

   ```bash
   npm run bundle
   ```

1. :white_check_mark: Check your artifact

   There will be a tgz file in your extension directory now

## Update the Extension Metadata

The [`package.json`](package.json) file defines metadata about your extension, such as
extension name, main entry, description and version.

When you copy this repository, update `package.json` with the name and description of your extension.

## Update the Extension Code

The [`src/`](./src/) directory is the heart of your extension! This contains the
source code that will be run when your extension functions are invoked. You can replace the
contents of this directory with your own code.

There are a few things to keep in mind when writing your extension code:

- Most Jan Extension functions are processed asynchronously.
  In `index.ts`, you will see that the extension function returns a `Promise<any>`.

  ```typescript
  import { events, MessageEvent, MessageRequest } from '@janhq/core'

  function onStart(): Promise<any> {
    return events.on(MessageEvent.OnMessageSent, (data: MessageRequest) =>
      this.inference(data)
    )
  }
  ```

For more information about the Jan Extension Core module, see the
[documentation](https://github.com/janhq/jan/blob/main/core/README.md).

So, what are you waiting for? Go ahead and start customizing your extension!
43 changes: 43 additions & 0 deletions extensions/inference-openrouter-extension/package.json
@@ -0,0 +1,43 @@
{
  "name": "@janhq/inference-openrouter-extension",
  "productName": "OpenRouter Inference Engine",
  "version": "1.0.0",
  "description": "This extension enables Open Router chat completion API calls",
  "main": "dist/index.js",
  "module": "dist/module.js",
  "engine": "openrouter",
  "author": "Jan <[email protected]>",
  "license": "AGPL-3.0",
  "scripts": {
    "build": "tsc -b . && webpack --config webpack.config.js",
    "build:publish": "rimraf *.tgz --glob && yarn build && npm pack && cpx *.tgz ../../pre-install",
    "sync:core": "cd ../.. && yarn build:core && cd extensions && rm yarn.lock && cd inference-openrouter-extension && yarn && yarn build:publish"
  },
  "exports": {
    ".": "./dist/index.js",
    "./main": "./dist/module.js"
  },
  "devDependencies": {
    "cpx": "^1.5.0",
    "rimraf": "^3.0.2",
    "webpack": "^5.88.2",
    "webpack-cli": "^5.1.4",
    "ts-loader": "^9.5.0"
  },
  "dependencies": {
    "@janhq/core": "file:../../core",
    "fetch-retry": "^5.0.6",
    "ulidx": "^2.3.0"
  },
  "engines": {
    "node": ">=18.0.0"
  },
  "files": [
    "dist/*",
    "package.json",
    "README.md"
  ],
  "bundleDependencies": [
    "fetch-retry"
  ]
}
28 changes: 28 additions & 0 deletions extensions/inference-openrouter-extension/resources/models.json
@@ -0,0 +1,28 @@
[
  {
    "sources": [
      {
        "url": "https://openrouter.ai"
      }
    ],
    "id": "open-router-auto",
    "object": "model",
    "name": "OpenRouter",
    "version": "1.0",
    "description": "OpenRouter scouts for the lowest prices and best latencies/throughputs across dozens of providers, and lets you choose how to prioritize them.",
    "format": "api",
    "settings": {},
    "parameters": {
      "max_tokens": 1024,
      "temperature": 0.7,
      "top_p": 0.95,
      "frequency_penalty": 0,
      "presence_penalty": 0
    },
    "metadata": {
      "author": "OpenRouter",
      "tags": ["General", "Big Context Length"]
    },
    "engine": "openrouter"
  }
]
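
For orientation, here is a rough sketch of the kind of request these defaults translate to once they reach the chat-completions endpoint configured in the settings below. The endpoint URL, the `openrouter/auto` model id, and the parameter defaults come from this PR; the prompt, the environment-variable handling, and the standalone `fetch` call are illustrative assumptions, since inside Jan the `RemoteOAIEngine` base class builds the request.

```typescript
// Hypothetical standalone sketch; not part of the extension itself.
const OPENROUTER_API_KEY = process.env.OPENROUTER_API_KEY ?? '' // assumption: key supplied via env var

async function chatCompletion(prompt: string): Promise<string> {
  const response = await fetch('https://openrouter.ai/api/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${OPENROUTER_API_KEY}`,
    },
    body: JSON.stringify({
      model: 'openrouter/auto', // forced by transformPayload in src/index.ts
      messages: [{ role: 'user', content: prompt }],
      // Defaults taken from resources/models.json above
      max_tokens: 1024,
      temperature: 0.7,
      top_p: 0.95,
      frequency_penalty: 0,
      presence_penalty: 0,
    }),
  })
  const data = await response.json()
  return data.choices?.[0]?.message?.content ?? ''
}
```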
23 changes: 23 additions & 0 deletions extensions/inference-openrouter-extension/resources/settings.json
@@ -0,0 +1,23 @@
[
  {
    "key": "chat-completions-endpoint",
    "title": "Chat Completions Endpoint",
    "description": "The endpoint to use for chat completions. See the [OpenRouter API documentation](https://openrouter.ai/docs) for more information.",
    "controllerType": "input",
    "controllerProps": {
      "placeholder": "https://openrouter.ai/api/v1/chat/completions",
      "value": "https://openrouter.ai/api/v1/chat/completions"
    }
  },
  {
    "key": "openrouter-api-key",
    "title": "API Key",
    "description": "The OpenRouter API uses API keys for authentication. Visit your [API Keys](https://openrouter.ai/keys) page to retrieve the API key you'll use in your requests.",
    "controllerType": "input",
    "controllerProps": {
      "placeholder": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
      "value": "",
      "type": "password"
    }
  }
]
76 changes: 76 additions & 0 deletions extensions/inference-openrouter-extension/src/index.ts
@@ -0,0 +1,76 @@
/**
 * @file This file exports a class that implements the InferenceExtension interface from the @janhq/core package.
 * The class provides methods for initializing and stopping a model, and for making inference requests.
 * It also subscribes to events emitted by the @janhq/core package and handles new message requests.
 * @version 1.0.0
 * @module inference-openrouter-extension/src/index
 */

import { RemoteOAIEngine } from '@janhq/core'
import { PayloadType } from '@janhq/core'
import { ChatCompletionRole } from '@janhq/core'

declare const SETTINGS: Array<any>
declare const MODELS: Array<any>

enum Settings {
  apiKey = 'openrouter-api-key',
  chatCompletionsEndPoint = 'chat-completions-endpoint',
}

enum RoleType {
  user = 'USER',
  chatbot = 'CHATBOT',
  system = 'SYSTEM',
}

/**
 * A class that implements the InferenceExtension interface from the @janhq/core package.
 * The class provides methods for initializing and stopping a model, and for making inference requests.
 * It also subscribes to events emitted by the @janhq/core package and handles new message requests.
 */
export default class JanInferenceOpenRouterExtension extends RemoteOAIEngine {
  inferenceUrl: string = ''
  provider: string = 'openrouter'

  override async onLoad(): Promise<void> {
    super.onLoad()

    // Register settings and models bundled at build time
    this.registerSettings(SETTINGS)
    this.registerModels(MODELS)

    this.apiKey = await this.getSetting<string>(Settings.apiKey, '')
    this.inferenceUrl = await this.getSetting<string>(
      Settings.chatCompletionsEndPoint,
      ''
    )
    // Fall back to the default endpoint declared in settings.json
    if (this.inferenceUrl.length === 0) {
      SETTINGS.forEach((setting) => {
        if (setting.key === Settings.chatCompletionsEndPoint) {
          this.inferenceUrl = setting.controllerProps.value as string
        }
      })
    }
  }

  onSettingUpdate<T>(key: string, value: T): void {
    if (key === Settings.apiKey) {
      this.apiKey = value as string
    } else if (key === Settings.chatCompletionsEndPoint) {
      if (typeof value !== 'string') return

      if (value.trim().length === 0) {
        SETTINGS.forEach((setting) => {
          if (setting.key === Settings.chatCompletionsEndPoint) {
            this.inferenceUrl = setting.controllerProps.value as string
          }
        })
      } else {
        this.inferenceUrl = value
      }
    }
  }

  // Force every request through OpenRouter's auto router
  transformPayload = (payload: PayloadType) => ({
    ...payload,
    model: 'openrouter/auto',
  })
}
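
The `transformPayload` override is what implements the auto-router behavior mentioned in the commit message: whatever payload the base engine assembles, the `model` field is rewritten to `openrouter/auto` before the request is sent. A minimal sketch of that effect, with `PayloadType` reduced to a plain object purely for illustration:

```typescript
// Illustrative only: the real PayloadType comes from @janhq/core.
type SketchPayload = { model: string; messages: Array<{ role: string; content: string }> }

const transformPayload = (payload: SketchPayload): SketchPayload => ({
  ...payload,
  model: 'openrouter/auto',
})

// A request built for the Jan model id ends up targeting OpenRouter's auto router:
const original: SketchPayload = {
  model: 'open-router-auto',
  messages: [{ role: 'user', content: 'Hello' }],
}
console.log(transformPayload(original).model) // "openrouter/auto"
```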
14 changes: 14 additions & 0 deletions extensions/inference-openrouter-extension/tsconfig.json
@@ -0,0 +1,14 @@
{
  "compilerOptions": {
    "target": "es2016",
    "module": "ES6",
    "moduleResolution": "node",
    "outDir": "./dist",
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true,
    "strict": false,
    "skipLibCheck": true,
    "rootDir": "./src"
  },
  "include": ["./src"]
}
37 changes: 37 additions & 0 deletions extensions/inference-openrouter-extension/webpack.config.js
@@ -0,0 +1,37 @@
const webpack = require('webpack')
const packageJson = require('./package.json')
const settingJson = require('./resources/settings.json')
const modelsJson = require('./resources/models.json')

module.exports = {
  experiments: { outputModule: true },
  entry: './src/index.ts', // Adjust the entry point to match your project's main file
  mode: 'production',
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        use: 'ts-loader',
        exclude: /node_modules/,
      },
    ],
  },
  plugins: [
    new webpack.DefinePlugin({
      MODELS: JSON.stringify(modelsJson),
      SETTINGS: JSON.stringify(settingJson),
      ENGINE: JSON.stringify(packageJson.engine),
    }),
  ],
  output: {
    filename: 'index.js', // Adjust the output file name as needed
    library: { type: 'module' }, // Specify ESM output format
  },
  resolve: {
    extensions: ['.ts', '.js'],
  },
  optimization: {
    minimize: false,
  },
  // Add loaders and other configuration as needed for your project
}
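
The `DefinePlugin` block is what makes the `declare const SETTINGS` / `declare const MODELS` lines in `src/index.ts` work: the JSON resources are inlined as compile-time constants in the bundle. A minimal sketch of the pattern, where the constant name and setting key come from this PR and the helper function is a hypothetical illustration:

```typescript
// In source, the constant is only declared; webpack's DefinePlugin
// substitutes the stringified settings.json at build time.
declare const SETTINGS: Array<{ key: string; controllerProps?: { value?: unknown } }>

// Hypothetical helper showing how a bundled constant can be read at runtime,
// mirroring the fallback logic in onLoad above.
function defaultEndpoint(): string | undefined {
  const setting = SETTINGS.find((s) => s.key === 'chat-completions-endpoint')
  return setting?.controllerProps?.value as string | undefined
}
```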