# CoHelperGLM

**Repository Path**: coldcurlyfu/CoHelperGLM

## Basic Information

- **Project Name**: CoHelperGLM
- **Description**: A code-assistance tool built on the CodeGeeX2-6B model; the project is based on ChatGLM2-6B
- **Primary Language**: Python
- **License**: Apache-2.0
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 2
- **Forks**: 3
- **Created**: 2023-08-11
- **Last Updated**: 2024-04-24

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# CoHelperGLM

> A code-assistance tool built on the CodeGeeX2-6B model; the project is based on [ChatGLM2-6B](https://github.com/THUDM/ChatGLM-6B)

## 1. Environment Setup

### 1.1 GPU Driver

- Make sure the installed GPU driver supports CUDA 11.7 or later:

```shell
nvidia-smi
```

As in the screenshot below, the reported CUDA Version should be 11.7 or higher.

![nvidia-smi](resources/nvidia-smi.png)

### 1.2 Virtual Environment

- Create the virtual environment:

```shell
conda create -n CoHelperGLM python=3.8
```

- Install the dependencies:

```shell
conda activate CoHelperGLM
pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple
```

## 2. Run

```shell
conda activate CoHelperGLM
streamlit run web_demo2.py
```

## 3. Demo

![](/resources/web_demo.png)
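
The CUDA version requirement above can also be verified programmatically. The helper below is a minimal sketch (not part of this repository): it parses the `CUDA Version: X.Y` field that `nvidia-smi` prints in its header and compares it against the 11.7 minimum. The function name and the sample string are illustrative assumptions.

```python
import re
import subprocess


def cuda_version_ok(smi_output: str, minimum: tuple = (11, 7)) -> bool:
    """Return True if the 'CUDA Version: X.Y' field in nvidia-smi
    header output meets the minimum (compared as an integer tuple,
    so e.g. 11.10 > 11.7 is handled correctly)."""
    m = re.search(r"CUDA Version:\s*(\d+)\.(\d+)", smi_output)
    return bool(m) and (int(m.group(1)), int(m.group(2))) >= minimum


# Illustrative sample of an nvidia-smi header line:
sample = "| NVIDIA-SMI 535.54  Driver Version: 535.54  CUDA Version: 12.2 |"
print(cuda_version_ok(sample))  # prints True
```

On a machine with the driver installed, the real output can be fed in with `subprocess.run(["nvidia-smi"], capture_output=True, text=True).stdout`.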