We have hosted the application LocalAI so that you can run it on our online workstations, either with Wine or directly.
Quick description of LocalAI:
Self-hosted, community-driven, local OpenAI-compatible API: a free, open-source, drop-in replacement for OpenAI that runs LLMs on consumer-grade hardware, with no GPU required. It runs ggml, GPTQ, ONNX, and TF-compatible models: llama, gpt4all, rwkv, whisper, vicuna, koala, gpt4all-j, cerebras, falcon, dolly, starcoder, and many others. LocalAI is a drop-in replacement REST API that's compatible with the OpenAI API specification for local inferencing, letting you run LLMs (and more) locally or on-prem, supporting multiple model families compatible with the ggml format (see the Go example after the feature list).

Features:
- Local, OpenAI drop-in alternative REST API
- NO GPU required
- Supports multiple models
- Once loaded for the first time, it keeps models in memory for faster inference
- Doesn't shell out; it uses C++ bindings for faster inference and better performance
- You own your data
Programming Language: Go.
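Because LocalAI implements the OpenAI chat-completions endpoint, any OpenAI-style client can talk to it over plain HTTP. Below is a minimal Go sketch (Go being the project's own language) that posts a chat request to a running instance. The base URL http://localhost:8080 reflects LocalAI's default port, and the model name "ggml-gpt4all-j" is an assumption for illustration; adjust both to match your deployment.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// Request/response shapes follow the OpenAI chat-completions spec,
// which LocalAI implements.
type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
}

type chatResponse struct {
	Choices []struct {
		Message chatMessage `json:"message"`
	} `json:"choices"`
}

func main() {
	// Assumptions: LocalAI is listening on its default port 8080 and a
	// model named "ggml-gpt4all-j" has been configured; change as needed.
	body, _ := json.Marshal(chatRequest{
		Model: "ggml-gpt4all-j",
		Messages: []chatMessage{
			{Role: "user", Content: "Say hello in one sentence."},
		},
	})

	resp, err := http.Post("http://localhost:8080/v1/chat/completions",
		"application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out chatResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	if len(out.Choices) > 0 {
		fmt.Println(out.Choices[0].Message.Content)
	}
}
```

Since the request and response shapes follow the OpenAI specification, the same code works against any OpenAI-compatible server simply by changing the URL.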