# Yura LLM Client for Katya server
Part of a project whose goal is to replace the native Ollama protocol. This protocol supports streaming, works over HTTPS, and allows a web client to attach directly to the backend.
## Install
```bash
pip install -e .
```
## Build
```bash
make build
```
## Command line usage
```bash
yura ws://[host]:[port]/[path]/
```
## Python
```python
import asyncio

from yura.client import AsyncClient

async def communicate():
    client = AsyncClient("ws://[host]:[port]/[path]/")
    async for response in client.chat("Your prompt"):
        print(response)

asyncio.run(communicate())
```
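Because `client.chat` streams the reply in chunks, a common pattern is to accumulate the pieces into one string instead of printing them as they arrive. A minimal sketch of that pattern, using a stand-in async generator (`fake_chat` is hypothetical and only simulates the streaming shape of `AsyncClient.chat`):

```python
import asyncio

async def fake_chat(prompt):
    # Stand-in for AsyncClient.chat, which yields response chunks as they arrive.
    for chunk in ["Hello", ", ", "world"]:
        yield chunk

async def collect(prompt):
    # Accumulate streamed chunks into the full response text.
    parts = []
    async for chunk in fake_chat(prompt):
        parts.append(chunk)
    return "".join(parts)

print(asyncio.run(collect("Your prompt")))
```

With a real client, replacing `fake_chat(prompt)` with `client.chat(prompt)` gives the same accumulation behavior over the websocket stream.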