=Paper=
{{Paper
|id=Vol-4164/paper11
|storemode=property
|title=Deploying LLMs on CPU-only Environments with llama.cpp Library Set: MedLocalGPT Project Case
|pdfUrl=https://ceur-ws.org/Vol-4164/paper11.pdf
|volume=Vol-4164
|authors=Kyrylo S. Malakhov
}}
==Deploying LLMs on CPU-only Environments with llama.cpp Library Set: MedLocalGPT Project Case==