# hyperbolic_llm

**Repository Path**: wangcl_deep/hyperbolic_llm

## Basic Information

- **Project Name**: hyperbolic_llm
- **License**: Apache-2.0
- **Default Branch**: main

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2024-07-09
- **Last Updated**: 2024-07-09

## README

Code for the Hyperbolic Pre-Trained Language Model (TASLP).

## Pre-Training

The code is based on [NVIDIA's Deep Learning Examples](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/LanguageModeling/BERT); refer to that repository for guidelines on data preparation. A script for pre-training hyperbolic BERT is provided in `scripts/run_bert.sh`.

## Fine-Tuning

The fine-tuning scripts are provided in `scripts/`.

## Pre-Trained Model

A pre-trained hyperbolic BERT checkpoint is available [here](https://cloud.tsinghua.edu.cn/f/eea7dfbf5df8437c83ed/?dl=1); download it and extract it to `results/`.
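The download-and-extract step might look like the following sketch. The URL is the one given in the README, but the archive file name and `.tar.gz` format are assumptions, since the README does not state them:

```shell
# Create the directory layout the README says the checkpoint should live in.
mkdir -p results

# Download the pre-trained checkpoint (URL from the README) and extract it.
# NOTE: "hyperbolic_bert.tar.gz" and the gzip'd-tar format are assumptions;
# adjust the tar flags if the actual archive uses a different format.
curl -fL -o hyperbolic_bert.tar.gz \
  'https://cloud.tsinghua.edu.cn/f/eea7dfbf5df8437c83ed/?dl=1' \
  && tar -xzf hyperbolic_bert.tar.gz -C results/ \
  || echo "download failed; fetch the archive manually and extract it into results/"
```

If the link serves a plain zip instead, `unzip hyperbolic_bert.zip -d results/` would be the equivalent last step.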