We host the application litellm so that it can be run on our online workstations, either with Wine or directly.


Quick description of litellm:

Call all LLM APIs using the OpenAI format (Anthropic, Huggingface, Cohere, Azure OpenAI, etc.). liteLLM supports streaming the model response back: pass stream=True to get a streaming iterator in the response. Streaming is supported for OpenAI, Azure, Anthropic, and Huggingface models.
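A minimal sketch of the streaming call shape, assuming litellm is installed and a provider API key (e.g. OPENAI_API_KEY) is set in the environment; the model string and prompt here are placeholders:

```python
def stream_reply(prompt: str):
    """Yield text chunks from a streamed litellm completion (sketch)."""
    # Imported lazily so this sketch can be loaded without litellm installed.
    from litellm import completion

    response = completion(
        model="gpt-3.5-turbo",  # placeholder; any supported provider/model string
        messages=[{"role": "user", "content": prompt}],
        stream=True,  # request a streaming iterator instead of a full response
    )
    for chunk in response:
        # Streamed chunks follow the OpenAI streaming format:
        # incremental text arrives in choices[0].delta.content.
        delta = getattr(chunk.choices[0].delta, "content", None)
        if delta:
            yield delta
```

Usage would be something like `for piece in stream_reply("Hello"): print(piece, end="")`, printing the reply as it arrives.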

Features:
  • Translating inputs to the provider's completion and embedding endpoints
  • Consistent output: text responses will always be available
  • Exception mapping: common exceptions across providers are mapped to the OpenAI exception types
  • LiteLLM Client - debugging & 1-click add new LLMs
  • Streaming the model response back
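Because provider errors are mapped to the OpenAI exception types, one handler can cover every backend. A hedged sketch (the model string is a placeholder, and the exception class name assumes litellm's documented OpenAI-style mapping):

```python
import time


def complete_with_retry(prompt: str, retries: int = 3):
    """Call litellm.completion, retrying with backoff on rate limits (sketch)."""
    # Imported lazily so this sketch can be loaded without litellm installed.
    import litellm

    for attempt in range(retries):
        try:
            return litellm.completion(
                model="gpt-3.5-turbo",  # placeholder; any supported provider works
                messages=[{"role": "user", "content": prompt}],
            )
        except litellm.exceptions.RateLimitError:
            # Same exception type regardless of which provider raised the limit.
            time.sleep(2 ** attempt)
    raise RuntimeError("exhausted retries")
```

The point of the mapping is that the `except` clause above does not need provider-specific branches.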


Programming Language: Python.
Categories:
Large Language Models (LLM)

©2024. Winfy. All Rights Reserved.

By OD Group OU – Registry code: 1609791 – VAT number: EE102345621.