doc:ollama document #1512

Merged 2 commits on May 11, 2024
2 changes: 1 addition & 1 deletion dbgpt/storage/vector_store/chroma_store.py
@@ -19,6 +19,7 @@

CHROMA_COLLECTION_NAME = "langchain"


@register_resource(
_("Chroma Vector Store"),
"chroma_vector_store",
@@ -152,7 +153,6 @@ def vector_name_exists(self) -> bool:
files = list(filter(lambda f: f != "chroma.sqlite3", files))
return len(files) > 0


def load_document(self, chunks: List[Chunk]) -> List[str]:
"""Load document to vector store."""
logger.info("ChromaStore load document")
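The `vector_name_exists` check shown in this hunk treats the persist directory as holding data only when it contains files other than Chroma's own bookkeeping database. A standalone sketch of that logic (the `persist_dir` argument and the demonstration below are illustrative, not part of the PR):

```python
import os
import tempfile


def vector_name_exists(persist_dir: str) -> bool:
    # Mirror the check in ChromaStore.vector_name_exists: ignore
    # Chroma's bookkeeping file when deciding whether data was stored.
    if not os.path.isdir(persist_dir):
        return False
    files = os.listdir(persist_dir)
    files = list(filter(lambda f: f != "chroma.sqlite3", files))
    return len(files) > 0


# Quick demonstration against a temporary directory.
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "chroma.sqlite3"), "w").close()
    print(vector_name_exists(d))  # only the bookkeeping DB: False
    open(os.path.join(d, "index.bin"), "w").close()
    print(vector_name_exists(d))  # real data present: True
```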
2 changes: 1 addition & 1 deletion dbgpt/storage/vector_store/pgvector_store.py
@@ -66,7 +66,7 @@ def __init__(self, vector_store_config: PGVectorConfig) -> None:
embedding_function=self.embeddings,
collection_name=self.collection_name,
connection_string=self.connection_string,
) # mypy: ignore
) # mypy: ignore

def similar_search(
self, text: str, topk: int, filters: Optional[MetadataFilters] = None
41 changes: 41 additions & 0 deletions docs/docs/installation/advanced_usage/ollama.md
@@ -0,0 +1,41 @@
# Ollama
Ollama is a lightweight model-serving platform that lets you pull and run local models in a few seconds, which makes it a convenient backend for DB-GPT.

### Install Ollama
If your system is Linux, run:
```bash
curl -fsSL https://ollama.com/install.sh | sh
```
For other environments, refer to the [official Ollama website](https://ollama.com/).
### Pull models
1. Pull LLM
```bash
ollama pull qwen:0.5b
```
2. Pull the embedding model.
```bash
ollama pull nomic-embed-text
```

3. Install the `ollama` Python package.
```bash
pip install ollama
```
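Once the models are pulled, Ollama serves a simple JSON API on port 11434 by default. The sketch below builds request bodies for Ollama's `/api/chat` and `/api/embeddings` routes, the same server and models the DB-GPT proxy settings point at; the helper names are illustrative, not part of DB-GPT:

```python
import json


def build_chat_payload(model: str, prompt: str) -> dict:
    """JSON body for POST http://127.0.0.1:11434/api/chat."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def build_embedding_payload(model: str, text: str) -> dict:
    """JSON body for POST http://127.0.0.1:11434/api/embeddings."""
    return {"model": model, "prompt": text}


print(json.dumps(build_chat_payload("qwen:0.5b", "Hello"), indent=2))
print(json.dumps(build_embedding_payload("nomic-embed-text", "Hello"), indent=2))
```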

### Use the Ollama proxy model in the DB-GPT `.env` file

```bash
LLM_MODEL=ollama_proxyllm
PROXY_SERVER_URL=http://127.0.0.1:11434
PROXYLLM_BACKEND="qwen:0.5b"
PROXY_API_KEY=not_used
EMBEDDING_MODEL=proxy_ollama
proxy_ollama_proxy_server_url=http://127.0.0.1:11434
proxy_ollama_proxy_backend="nomic-embed-text:latest"
```

### Run the DB-GPT server
```bash
python dbgpt/app/dbgpt_server.py
```
4 changes: 4 additions & 0 deletions docs/sidebars.js
@@ -237,6 +237,10 @@ const sidebars = {
type: 'doc',
id: 'installation/advanced_usage/More_proxyllms',
},
{
type: 'doc',
id: 'installation/advanced_usage/ollama',
},
{
type: 'doc',
id: 'installation/advanced_usage/vLLM_inference',