# svelte-evals

**Repository Path**: mirrors_sveltejs/svelte-evals

## Basic Information

- **Project Name**: svelte-evals
- **Description**: Evals for LLMs to learn/benchmark their Svelte skills
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2025-06-29
- **Last Updated**: 2025-09-02

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# svelte-evals

Evals for LLMs to learn/benchmark their Svelte skills.

This repository, once fully set up, will serve two purposes:

1. Create a set of evals - basically tests for LLMs: each eval pairs a question with a reference answer, so you can see how well a model answers it. Model providers can incorporate the set into their training, and they (and everyone else) can use it to benchmark models. A sketch of what an entry could look like appears below.
2. Create a set of commonly asked questions with concise answers. These will appear on svelte.dev to help out people searching the docs for common tasks.

1 and 2 have strong overlap; 2 will likely be a subset of 1.
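
The repository has not yet settled on an eval format, but to make purpose 1 concrete, here is a minimal sketch of what a question/reference-answer entry and a naive grader might look like. Everything in it (the `EvalEntry` fields, the `grade` helper, the Svelte 5 example) is hypothetical and only illustrates the idea of pairing a question with a reference answer.

```ts
// A minimal, hypothetical sketch of one eval entry. All field names here
// are assumptions for illustration; the repository has not yet defined
// its actual format.

interface EvalEntry {
  /** Question posed to the model, typically a common Svelte task. */
  prompt: string;
  /** Concise reference answer, which could double as the docs FAQ answer (purpose 2). */
  reference: string;
  /** Keywords a correct answer is expected to mention (naive grading aid). */
  mustMention: string[];
}

const example: EvalEntry = {
  prompt: "How do I declare reactive state in a Svelte 5 component?",
  reference: "Use the $state rune, e.g. `let count = $state(0);`.",
  mustMention: ["$state"],
};

// Naive grader: passes if the model's answer contains every required keyword.
// A real harness would likely use exact match, a rubric, or an LLM judge.
function grade(entry: EvalEntry, modelAnswer: string): boolean {
  return entry.mustMention.every((kw) => modelAnswer.includes(kw));
}

console.log(grade(example, "Declare it with the $state rune: let n = $state(0);")); // true
```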