We have hosted the application react-llm so that you can run it on our online workstations, either with Wine or directly.


Quick description of react-llm:

Easy-to-use headless React Hooks to run LLMs in the browser with WebGPU. As simple as useLLM() (see the sketch after the feature list below).

Features:
  • Supports Vicuna 7B
  • Use custom system prompts and "user: "/"assistant: " role names
  • Completion options like max tokens and stop sequences
  • No data leaves the browser. Accelerated via WebGPU
  • Headless hooks designed so you can bring your own UI
  • Persistent storage for conversations in browser storage. Hooks for loading and saving conversations
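
The following is a minimal sketch of what wiring useLLM() into a component might look like. The import path (@react-llm/headless) and the exact option and field names (systemPrompt, userRoleName, send, maxTokens, stopSequences, and so on) are assumptions drawn from the feature list above, not a verified API reference; check the project's documentation for the real signatures.

```tsx
// Minimal sketch of a chat component built on useLLM().
// NOTE: the import path and every option/field name below are assumptions
// based on the feature list above, not the verified react-llm API.
import React, { useState } from "react";
import useLLM from "@react-llm/headless"; // assumed package name

export function Chat() {
  // Assumed hook shape: current conversation, a send() action, and a busy flag.
  const { conversation, send, isGenerating } = useLLM({
    // Custom system prompt and role names (per the feature list).
    systemPrompt: "You are a concise assistant running entirely in the browser.",
    userRoleName: "user",
    assistantRoleName: "assistant",
  });

  const [input, setInput] = useState("");

  return (
    <div>
      <ul>
        {conversation?.messages.map((m: { role: string; text: string }, i: number) => (
          <li key={i}>
            <b>{m.role}:</b> {m.text}
          </li>
        ))}
      </ul>
      <input value={input} onChange={(e) => setInput(e.target.value)} />
      <button
        disabled={isGenerating}
        onClick={() => {
          // Completion options such as max tokens and stop sequences.
          send(input, { maxTokens: 200, stopSequences: ["user:"] });
          setInput("");
        }}
      >
        Send
      </button>
    </div>
  );
}
```

Because the hooks are headless, the markup above is entirely your own; the library is only expected to supply the conversation state and the actions for sending prompts, which is the "bring your own UI" design mentioned in the feature list.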


Programming Language: TypeScript.
Categories:
Large Language Models (LLM)

