# screenshot-to-code

**Repository Path**: whilewon/screenshot-to-code

## Basic Information

- **Project Name**: screenshot-to-code
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 2
- **Forks**: 0
- **Created**: 2024-05-07
- **Last Updated**: 2024-10-03

## Categories & Tags

**Categories**: Uncategorized

**Tags**: AI

## README

# screenshot-to-code

A simple tool to convert screenshots, mockups and Figma designs into clean, functional code using AI.

https://github.com/abi/screenshot-to-code/assets/23818/6cebadae-2fe3-4986-ac6a-8fb9db030045

Supported stacks:

- HTML + Tailwind
- React + Tailwind
- Vue + Tailwind
- Bootstrap
- Ionic + Tailwind
- SVG

Supported AI models:

- GPT-4 Turbo (Apr 2024) - Best model
- GPT-4 Vision (Nov 2023) - Good model that's better than GPT-4 Turbo on some inputs
- Claude 3 Sonnet - Faster, and on par with or better than GPT-4 Vision for many inputs
- DALL-E 3 for image generation

See the [Examples](#-examples) section below for more demos.

We also just added experimental support for taking a video/screen recording of a website in action and turning that into a functional prototype.

![google in app quick 3](https://github.com/abi/screenshot-to-code/assets/23818/8758ffa4-9483-4b9b-bb66-abd6d1594c33)

[Learn more about video here](https://github.com/abi/screenshot-to-code/wiki/Screen-Recording-to-Code).

[Follow me on Twitter for updates](https://twitter.com/_abi_).

## 🚀 Try It Out with No Install

[Try it live on the hosted version (paid)](https://screenshottocode.com).

## 🛠 Getting Started

The app has a React/Vite frontend and a FastAPI backend. You will need an OpenAI API key with access to the GPT-4 Vision API, or an Anthropic key if you want to use Claude Sonnet or the experimental video support.
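For reference, the API-related environment variables mentioned throughout this README live in `backend/.env`. A sketch with placeholder values (only one of the two keys is required; `OPENAI_BASE_URL` is optional, and the key formats shown are illustrative):

```bash
# backend/.env — placeholder values, replace with your own keys
OPENAI_API_KEY=sk-your-key                # for the GPT-4 models and DALL-E 3
ANTHROPIC_API_KEY=sk-ant-your-key         # for Claude 3 Sonnet / experimental video support
OPENAI_BASE_URL=https://xxx.xxxxx.xxx/v1  # optional proxy; keep "v1" in the path
```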
Run the backend (I use Poetry for package management - `pip install poetry` if you don't have it):

```bash
cd backend
echo "OPENAI_API_KEY=sk-your-key" > .env
poetry install
poetry shell
poetry run uvicorn main:app --reload --port 7001
```

If you want to use Anthropic, add `ANTHROPIC_API_KEY` to `backend/.env` with your API key from Anthropic.

Run the frontend:

```bash
cd frontend
yarn
yarn dev
```

Open http://localhost:5173 to use the app.

If you prefer to run the backend on a different port, update `VITE_WS_BACKEND_URL` in `frontend/.env.local`.

For debugging purposes, if you don't want to waste GPT-4 Vision credits, you can run the backend in mock mode (which streams a pre-recorded response):

```bash
MOCK=true poetry run uvicorn main:app --reload --port 7001
```

## Docker

If you have Docker installed on your system, run this in the root directory:

```bash
echo "OPENAI_API_KEY=sk-your-key" > .env
docker-compose up -d --build
```

The app will be up and running at http://localhost:5173. Note that you can't develop the application with this setup, as file changes won't trigger a rebuild.

## 🙋‍♂️ FAQs

- **I'm running into an error when setting up the backend. How can I fix it?** [Try this](https://github.com/abi/screenshot-to-code/issues/3#issuecomment-1814777959). If that still doesn't work, open an issue.
- **How do I get an OpenAI API key?** See https://github.com/abi/screenshot-to-code/blob/main/Troubleshooting.md
- **How can I configure an OpenAI proxy?** If you're not able to access the OpenAI API directly (due to e.g. country restrictions), you can try a VPN, or you can configure the OpenAI base URL to use a proxy: set `OPENAI_BASE_URL` in `backend/.env` or directly in the UI in the settings dialog.
  Make sure the URL has "v1" in the path, so it looks like this: `https://xxx.xxxxx.xxx/v1`
- **How can I update the backend host that my front-end connects to?** Configure `VITE_HTTP_BACKEND_URL` and `VITE_WS_BACKEND_URL` in `frontend/.env.local`. For example, set `VITE_HTTP_BACKEND_URL=http://124.10.20.1:7001`
- **Seeing UTF-8 errors when running the backend?** On Windows, open the .env file with Notepad++, then go to Encoding and select UTF-8.
- **How can I provide feedback?** For feedback, feature requests and bug reports, open an issue or ping me on [Twitter](https://twitter.com/_abi_).

## 📚 Examples

**NYTimes**

| Original | Replica |
| -------- | ------- |
| Screenshot 2023-11-20 at 12 54 03 PM | Screenshot 2023-11-20 at 12 59 56 PM |

**Instagram page (with not Taylor Swift pics)**

https://github.com/abi/screenshot-to-code/assets/23818/503eb86a-356e-4dfc-926a-dabdb1ac7ba1

**Hacker News** but it gets the colors wrong at first, so we nudge it

https://github.com/abi/screenshot-to-code/assets/23818/3fec0f77-44e8-4fb3-a769-ac7410315e5d

## 🌍 Hosted Version 🆕

[Try it here (paid)](https://screenshottocode.com). Or see [Getting Started](#-getting-started) for local install instructions to use with your own API keys.

[!["Buy Me A Coffee"](https://www.buymeacoffee.com/assets/img/custom_images/orange_img.png)](https://www.buymeacoffee.com/abiraja)
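Putting the FAQ answers above together: if your backend runs somewhere other than `localhost:7001`, a `frontend/.env.local` might look like the sketch below (the `ws://` scheme for the WebSocket URL is my assumption; the IP is the example value from the FAQ, substitute your own host):

```bash
# frontend/.env.local — point the frontend at a non-default backend
VITE_HTTP_BACKEND_URL=http://124.10.20.1:7001
VITE_WS_BACKEND_URL=ws://124.10.20.1:7001
```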