# Transformer Lab
transformer lab logo

100% Open Source Toolkit for Large Language Models: Train, Tune, Chat on your own Machine
Download · Explore the docs »

View Demo · Report Bugs · Suggest Features · Join Discord · Follow on Twitter

Note: Transformer Lab is actively being developed. Please join our Discord or follow us on Twitter for updates. Questions, feedback and contributions are highly valued!

## Download Now

[![Download Icon]][Download URL]

## About The Project

![Product Screen Shot](assets/transformerlab-demo-jan2025.gif)

Transformer Lab is an app that allows anyone to experiment with Large Language Models.

## Backed by Mozilla

Transformer Lab is proud to be supported by Mozilla through the Mozilla Builders Program.

Mozilla Builders Logo

## Features

Transformer Lab allows you to:

- 💕 **One-click Download Hundreds of Popular Models**:
  - DeepSeek, Qwen, Gemma, Phi4, Llama, Mistral, Mixtral, Stable Diffusion, Flux, Command-R, and dozens more
- ⬇ **Download any LLM, VLM, or Diffusion model from Huggingface**
- 🎶 **Finetune / Train Across Different Hardware**
  - Finetune using MLX on Apple Silicon
  - Finetune using Huggingface on GPU
  - Finetune Diffusion LoRAs on GPU
- ⚖️ **RLHF and Preference Optimization**
  - DPO
  - ORPO
  - SIMPO
  - Reward Modeling
- 💻 **Work with Models Across Operating Systems**:
  - Windows App
  - MacOS App
  - Linux
- 💬 **Chat with Models**
  - Chat
  - Completions
  - Visualize Model Architecture
  - Inspect activations & attention for each generated token
  - Preset (Templated) Prompts
  - Chat History
  - Tweak generation parameters
  - Batched Inference
  - Tool Use / Function Calling (in alpha)
- 🚂 **Use Different Inference Engines**
  - MLX on Apple Silicon
  - FastChat
  - vLLM
  - Llama CPP
  - SGLang
- 🖼️ **Support for Image Diffusion Models**
  - Run and experiment with image generation models (e.g., Stable Diffusion, Flux)
- 🧑‍🎓 **Evaluate models**
- 📖 **RAG (Retrieval Augmented Generation)**
  - Drag and Drop File UI
  - Works on Apple MLX, FastChat, and other engines
- 📓 **Build Datasets for Training**
  - Pull from hundreds of common datasets available on HuggingFace
  - Provide your own dataset using drag and drop
- 🔢 **Calculate Embeddings**
- 💁 **Full REST API**
- 🌩 **Run in the Cloud**
  - You can run the user interface on your desktop/laptop while the engine runs on a remote or cloud machine
  - Or you can run everything locally on a single machine
- 🔀 **Convert Models Across Platforms**
  - Convert from/to Huggingface, MLX, GGUF
- 🔌 **Plugin Support**
  - Easily install from a gallery of existing plugins
  - Write your own plugins to extend functionality
- 🧑‍💻 **Embedded Monaco Code Editor**
  - Edit plugins and view what's happening behind the scenes
- 📝 **Prompt Editing**
  - Easily edit System Messages or Prompt Templates
- 📜 **Inference Logs**
  - While doing inference or RAG, view a log of the raw queries sent to the model

And you can do all of the above through a simple, cross-platform GUI.

## Getting Started

Click here to download Transformer Lab.

Read this page to learn how to install and use it.

### Built With

- [![Electron][Electron]][Electron-url]
- [![React][React.js]][React-url]
- [![HuggingFace][HuggingFace]][HuggingFace-url]

## Developers

### Building from Scratch

To build the app yourself, pull this repo and follow the steps below. (Please note that the current build does not work on Node v23; use Node v22 instead.)

```bash
npm install
```

```bash
npm start
```

## Packaging for Production

To package apps for the local platform:

```bash
npm run package
```

## License

Distributed under the AGPL V3 License. See `LICENSE.txt` for more information.
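Since the build works on Node v22 but not v23, a small shell guard can fail fast before `npm install` runs on the wrong release. This is a minimal sketch, not part of the project's tooling: the `major_of` and `check_node` helper names are hypothetical, and the `nvm` suggestion in the comment assumes you have nvm installed.

```shell
# The build works on Node v22 but fails on v23; extract the major
# version from a `node --version` style string so a script can check it.
major_of() { printf '%s\n' "${1#v}" | cut -d. -f1; }

check_node() {
  # $1 is the output of `node --version`, e.g. "v22.14.0"
  if [ "$(major_of "$1")" -eq 22 ]; then
    echo "ok"
  else
    echo "unsupported"  # e.g. v23.x; switch with `nvm use 22` (assumes nvm)
  fi
}

check_node "v22.14.0"   # prints "ok"
```

Running `check_node "$(node --version)"` at the top of a build script surfaces an unsupported Node release immediately, instead of partway through `npm install`.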
## Reference

If you found Transformer Lab useful in your research or applications, please cite it using the following BibTeX:

```bibtex
@software{transformerlab,
  author = {Asaria, Ali},
  title  = {Transformer Lab: Experiment with Large Language Models},
  month  = dec,
  year   = 2023,
  url    = {https://github.com/transformerlab/transformerlab-app}
}
```

## Contact

- [@aliasaria](https://twitter.com/aliasaria) - Ali Asaria
- [@dadmobile](https://github.com/dadmobile) - Tony Salomone

[product-screenshot]: https://transformerlab.ai/assets/images/screenshot01-53ecb8c52338db3c9246cf2ebbbdc40d.png
[React.js]: https://img.shields.io/badge/React-20232A?style=for-the-badge&logo=react&logoColor=61DAFB
[React-url]: https://reactjs.org/
[Electron]: https://img.shields.io/badge/Electron-20232A?style=for-the-badge&logo=electron&logoColor=61DAFB
[Electron-url]: https://www.electronjs.org/
[HuggingFace]: https://img.shields.io/badge/🤗_HuggingFace-20232A?style=for-the-badge
[HuggingFace-url]: https://huggingface.co/
[Download Icon]: https://img.shields.io/badge/Download-EF2D5E?style=for-the-badge&logoColor=white&logo=DocuSign
[Download URL]: https://transformerlab.ai/docs/download