Ospp/new llm embedding #19727
base: main
Conversation
As part of our document LLM support, we are introducing the `LLM_EXTRACT_TEXT` function. This function extracts text from PDF files and writes the extracted text to a specified text file; the extractor type can be specified by the third argument.
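Since the third argument selects the extractor, the dispatch presumably amounts to a small registry keyed by extractor name. A minimal Go sketch of that idea (the `textExtractor` type, the extractor names, and the registry are illustrative assumptions, not the PR's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// textExtractor turns raw file bytes into plain text.
type textExtractor func(data []byte) (string, error)

// extractors maps the third argument of LLM_EXTRACT_TEXT to an
// implementation. "pdf" is the case this PR implements (via the PDF
// library added to go.mod); "plain" is a trivial stand-in here.
var extractors = map[string]textExtractor{
	"plain": func(data []byte) (string, error) { return string(data), nil },
	// "pdf": would wrap the PDF-processing dependency added by this PR.
}

// extractText looks up the requested extractor and applies it.
func extractText(data []byte, extractor string) (string, error) {
	fn, ok := extractors[strings.ToLower(extractor)]
	if !ok {
		return "", fmt.Errorf("unknown extractor %q", extractor)
	}
	return fn(data)
}

func main() {
	out, err := extractText([]byte("hello"), "plain")
	if err != nil {
		panic(err)
	}
	fmt.Println(out) // prints "hello"
}
```

A registry like this keeps adding a new extractor down to one map entry, which matches how the third argument is described above.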
User description
What type of PR is this?
Which issue(s) this PR fixes:
issue #18664
What this PR does / why we need it:
As part of our document LLM support, we are introducing the `LLM_EMBEDDING` function. This function can embed a string, or embed the content of a specified txt file, using LLM platforms like Ollama and LLM models like llama3. Three global variables are introduced for users to customize their own LLM platform, proxy, and model:
- `llm_embedding_platform`: default is `ollama`
- `llm_server_proxy`: default is `http://localhost:11434/api/embed`
- `llm_model`: default is `llama3`
Usage: `llm_embedding(<input txt datalink>);` or `llm_embedding(<input string>);`
Return value: a vector of 4096 32-bit floating-point numbers.
Note: run `ollama run llama3` in the shell before using the embedding function.
Example SQL:
Example return:
PR Type
Enhancement, Tests
Description
New functions: `LLM_CHUNK`, `LLM_EXTRACT_TEXT`, and `LLM_EMBEDDING`.
Changes walkthrough 📝
6 files
- `pkg/sql/plan/function/func_llm.go`: Implement LLM functions for chunking, text extraction, and embedding
- `pkg/sql/plan/function/list_builtIn.go`: Register new LLM functions and define overloads
- `pkg/sql/plan/function/ollama_service.go`: Implement Ollama service interaction for embeddings
- `pkg/sql/plan/function/embedding_service.go`: Define EmbeddingService interface and implement Ollama service
- `pkg/frontend/variables.go`: Add system variables for LLM embedding configuration
- `pkg/sql/plan/function/function_id.go`: Add function IDs for new LLM functions
7 files
- `pkg/sql/plan/function/func_llm_test.go`: Add unit tests for LLM chunking and extraction functions
- `test/distributed/cases/function/func_llm_chunk.result`: Add expected results for LLM chunking function tests
- `test/distributed/cases/function/func_llm_chunk.sql`: Add SQL test cases for LLM chunking function
- `test/distributed/cases/function/func_llm_extract_file.result`: Add expected results for LLM text extraction function tests
- `test/distributed/cases/function/func_llm_extract_file.sql`: Add SQL test cases for LLM text extraction function
- `test/distributed/resources/llm_test/chunk/4.txt`: Add text file for LLM chunking test resources
- `test/distributed/cases/function/func_llm_embedding.sql`: Add SQL test case for LLM embedding function
2 files
- `go.sum`: Update dependencies for PDF processing
- `go.mod`: Add PDF processing library to module dependencies
1 file
- `test/distributed/cases/function/func_llm_embedding.result`: ...
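The split between `embedding_service.go` (interface) and `ollama_service.go` (implementation) suggests a platform-selection seam keyed by `llm_embedding_platform`. A hedged Go sketch of that shape; the `EmbeddingService` name comes from the walkthrough above, but the method name, the factory, and the stand-in backend are assumptions, not the PR's code:

```go
package main

import "fmt"

// EmbeddingService abstracts an embedding backend so platforms beyond
// Ollama can be added later. (The PR defines such an interface in
// pkg/sql/plan/function/embedding_service.go; the method shown here
// is illustrative.)
type EmbeddingService interface {
	GetEmbedding(input, model string) ([]float32, error)
}

// fakeService is a stand-in backend used only for this sketch; the PR
// wires an Ollama-backed service (ollama_service.go) instead.
type fakeService struct{}

func (fakeService) GetEmbedding(input, model string) ([]float32, error) {
	return []float32{float32(len(input))}, nil
}

// newEmbeddingService selects a backend from the llm_embedding_platform
// variable; "ollama" is the only platform the PR supports.
func newEmbeddingService(platform string) (EmbeddingService, error) {
	switch platform {
	case "ollama":
		return fakeService{}, nil // real code would return the Ollama service
	default:
		return nil, fmt.Errorf("unsupported llm_embedding_platform: %q", platform)
	}
}

func main() {
	svc, err := newEmbeddingService("ollama")
	if err != nil {
		panic(err)
	}
	vec, _ := svc.GetEmbedding("hello", "llama3")
	fmt.Println(len(vec)) // prints 1 for the stand-in backend
}
```

Keeping the interface in its own file means a future platform (set via `llm_embedding_platform`) only needs a new implementation plus one case in the factory.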