Hacker News
rkrisztian | 8 months ago | on: ETH Zurich and EPFL to release a LLM developed on ...
I'm disappointed. 8B is too small for GPUs with 16 GB of VRAM (still common in affordable PCs), which can easily run most 13B to 16B models, depending on the quantization.
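The quantization point can be made concrete with back-of-the-envelope arithmetic: weight memory is roughly parameter count times bits per weight. This sketch ignores KV cache and activation overhead, which add a few more GB in practice; the function name is my own, not from any library.

```python
def weight_vram_gb(params_billion: float, bits_per_weight: int) -> float:
    """Rough VRAM needed for model weights alone, in GB.

    Ignores KV cache and runtime overhead (assumption: those add
    a few extra GB on top of this figure).
    """
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9


# A 13B model quantized to 4 bits per weight:
print(weight_vram_gb(13, 4))  # 6.5 GB of weights
# The same model at 8-bit quantization:
print(weight_vram_gb(13, 8))  # 13.0 GB, close to the 16 GB ceiling
```

By this estimate a 13B model at 4-bit or even 8-bit quantization fits in 16 GB, which is why an 8B release feels conservative for that hardware class.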