promptfoo online with Winfy

We host the application promptfoo so that it can be run on our online workstations, either through Wine or directly.


Quick description of promptfoo:

Ensure high-quality LLM outputs with automatic evals. Use a representative sample of user inputs to reduce subjectivity when tuning prompts. Use built-in metrics, LLM-graded evals, or define your own custom metrics. Compare prompts and model outputs side-by-side, or integrate the library into your existing test/CI workflow. Use OpenAI, Anthropic, and open-source models like Llama and Vicuna, or integrate custom API providers for any LLM API.
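
For example, the library can be driven directly from Node.js. The snippet below is a minimal sketch based on promptfoo's documented promptfoo.evaluate() entry point; the provider id, prompts, and assertion types are illustrative and may need adjusting to your setup and promptfoo version.

  // eval.ts – minimal sketch: compare two prompt variants on one provider.
  // Assumes the promptfoo npm package and an OPENAI_API_KEY in the environment.
  import promptfoo from 'promptfoo';

  async function main() {
    const summary = await promptfoo.evaluate(
      {
        // Prompt variants to compare side-by-side.
        prompts: [
          'Summarize this support ticket in one sentence: {{ticket}}',
          'You are a support agent. Briefly summarize: {{ticket}}',
        ],
        // Providers/models to run each prompt against.
        providers: ['openai:gpt-4o-mini'],
        // Representative user inputs, each with assertions (metrics).
        tests: [
          {
            vars: { ticket: 'My invoice was charged twice this month.' },
            assert: [
              { type: 'icontains', value: 'invoice' },
              { type: 'llm-rubric', value: 'Mentions the double charge and stays concise' },
            ],
          },
        ],
      },
      { maxConcurrency: 2 },
    );
    // The summary object contains per-test results; its exact shape varies by version.
    console.log(summary);
  }

  main();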

Features:
  • Create a list of test cases
  • Set up evaluation metrics
  • Select the best prompt & model
  • Reduce subjectivity when tuning prompts by using a representative sample of user inputs
  • Score outputs with built-in metrics, LLM-graded evals, or your own custom metrics
  • Compare prompts and model outputs side-by-side, or integrate the library into an existing test/CI workflow (see the sketch below)
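
Because an evaluation is just an asynchronous call, it can also gate an existing test suite or CI job. The sketch below assumes Vitest as the test runner; the stats field on the returned summary is an assumption based on promptfoo's documentation, so verify the result shape for your installed version.

  // promptfoo.eval.test.ts – sketch: fail CI when any prompt assertion fails (Vitest assumed).
  import { describe, it, expect } from 'vitest';
  import promptfoo from 'promptfoo';

  describe('prompt quality gate', () => {
    it('passes all assertions on the sample inputs', async () => {
      const summary = await promptfoo.evaluate({
        prompts: ['Classify the sentiment (positive/negative/neutral) of: {{text}}'],
        providers: ['openai:gpt-4o-mini'],
        tests: [
          {
            vars: { text: 'I love this product' },
            assert: [{ type: 'icontains', value: 'positive' }],
          },
          {
            vars: { text: 'The app crashes every time I open it' },
            assert: [{ type: 'icontains', value: 'negative' }],
          },
        ],
      });

      // Assumed summary shape: stats.failures counts test cases that failed their assertions.
      expect(summary.stats.failures).toBe(0);
    }, 120_000); // generous timeout for live model calls
  });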


Programming Language: TypeScript.
Categories:
Large Language Models (LLM)

©2024. Winfy. All Rights Reserved.

By OD Group OU – Registry code: 1609791 – VAT number: EE102345621.