Update README.md
README.md CHANGED
@@ -38,7 +38,36 @@ Post-training methods also support the safety and alignment of LLMs. This import

## How to use

If you have a CUDA GPU (>=12GB VRAM), the best way to use Relay is with the [relaylm.py]() inference script. Just run:

```bash
curl https://danlou.co/f/relaylm.py | python -
```

This script will select the best model for your available VRAM, then download it, load it, and start an interactive chat session.
It does not have any dependencies besides `transformers >= 4.45.1`.
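Since `transformers` is the only dependency, you can sanity-check your environment before running the script. The helper below is a hypothetical sketch (not part of relaylm.py) that compares dotted version strings against the stated minimum:

```python
# Hypothetical helper: check an installed version string against the
# minimum relaylm.py expects (transformers >= 4.45.1).
def meets_minimum(installed: str, required: str = "4.45.1") -> bool:
    # Compare versions as tuples of integers, e.g. "4.46.0" -> (4, 46, 0)
    parse = lambda s: tuple(int(part) for part in s.split("."))
    return parse(installed) >= parse(required)

print(meets_minimum("4.46.0"))  # True: newer than 4.45.1
print(meets_minimum("4.44.2"))  # False: too old
```

This simple tuple comparison assumes plain `X.Y.Z` version strings; for pre-release or post-release tags you would want a full version parser instead.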

Alternatively, if you do not have a CUDA GPU (e.g., on a Mac), you can use the [GGUF versions]() through LM Studio.

With [relaylm.py](), you can also use the model declaratively, outside of an interactive chat session:

```python
from relaylm import suggest_relay_model, RelayLM

def favorite_holiday(relay: RelayLM, country: str) -> str:
    relay.init_context()
    relay.join(role='model', channel=country.lower())
    relay.cast(role='model', desc=f"I'm from {country}.")
    relay.message(role='input', content="What's your favorite holiday?")
    relay.respond(role='model')
    response = relay.get_last()
    return response['content']

model_info = suggest_relay_model()
relay = RelayLM(**model_info)

print(favorite_holiday(relay, "Portugal"))
print(favorite_holiday(relay, "China"))
```

## Safety testing

TODO

## Citation
If you use Relay in your research, please cite it as follows:

```bibtex
@misc{relay2024,
  author = {Loureiro, Daniel},
  title = {Relay: LLMs as IRCs},
  year = {2024},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/danlou/relay}},
}
```