zhangminli/mindspore-lite-ci

This repository has not declared an open-source license file (LICENSE); before use, check the project description and the upstream dependencies of its code.
llama2_infer_parallel.sh 2.30 KB
zhangminli committed on 2023-11-06 14:22 +08:00: add llama2 inference test case
#!/bin/bash
readonly start_device_id=0
export HCCL_CONNECT_TIMEOUT=1200
export HCCL_EXEC_TIMEOUT=1200
# export ASCEND_GLOBAL_LOG_LEVEL=1
# export ASCEND_GLOBAL_EVENT_ENABLE=0
# export ASCEND_SLOG_PRINT_TO_STDOUT=1
CURRENT_DIR=$(cd "$(dirname "$0")"; pwd)
ROOT_DIR=$(cd "$(dirname "$0")/.."; pwd)
source /usr/local/Ascend/latest/aarch64-linux/bin/setenv.bash
# ASCEND_HOME=/usr/local/Ascend/CANN-7.0
# export LD_LIBRARY_PATH=$ASCEND_HOME/compiler/lib64:$ASCEND_HOME/compiler/lib64/plugin/opskernel:$ASCEND_HOME/compiler/lib64/plugin/nnengine:$LD_LIBRARY_PATH
# export PATH=$ASCEND_HOME/compiler/ccec_compiler/bin:$ASCEND_HOME/compiler/bin:$ASCEND_HOME/compiler/tikcc_compiler/bin:$PATH
# export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$ASCEND_HOME/runtime/lib64:$ASCEND_HOME/compiler/lib64/plugin/opskernel:$ASCEND_HOME/compiler/lib64/plugin/nnengine
# export PYTHONPATH=$ASCEND_HOME/compiler/python/site-packages:$PYTHONPATH
# export ASCEND_OPP_PATH=$ASCEND_HOME/opp
# export TOOLCHAIN_HOME=$ASCEND_HOME/toolkit
echo "$PATH"
echo "$LD_LIBRARY_PATH"
rm -rf "$CURRENT_DIR"/tmp_weight_*
rm -rf "$CURRENT_DIR"/kernel_meta_*
rm -rf "$CURRENT_DIR"/rank_*
rm -rf "$CURRENT_DIR"/*.log
mkdir -p "$CURRENT_DIR/logs"
if [ $# -lt 2 ]; then
  echo "Usage: $0 <arch> <context>"
  exit 1
fi
arch=$1
context=$2
MODEL_DIR="/home/workspace/mindspore_dataset/mslite/models/hiai"
FULL_CONFIG_PATH="./predict-config/prompt_config_2p_${arch}_${context}.ini"
INC_CONFIG_PATH="./predict-config/decoder_config_2p_${arch}_${context}.ini"
for i in {0..1}; do
  rank_id=$i
  LOG_FILE=$CURRENT_DIR/logs/llama2_${rank_id}.log
  FULL_MODEL_PATH="$MODEL_DIR/llama_7b_seq1024_bs1_full_${i}.mindir"
  INC_MODEL_PATH="$MODEL_DIR/llama_7b_seq1024_bs1_inc_${i}.mindir"
  device_id=$((i+start_device_id))
  echo "python infer_parallel.py --device_id ${device_id} --rank_id ${rank_id} --full_model_path $FULL_MODEL_PATH \
--inc_model_path $INC_MODEL_PATH --full_config_path $FULL_CONFIG_PATH --inc_config_path $INC_CONFIG_PATH"
  python infer_parallel.py \
    --device_id ${device_id} \
    --rank_id ${rank_id} \
    --full_model_path "$FULL_MODEL_PATH" \
    --inc_model_path "$INC_MODEL_PATH" \
    --full_config_path "$FULL_CONFIG_PATH" \
    --inc_config_path "$INC_CONFIG_PATH" > "$LOG_FILE" 2>&1 &
done
wait
# Check every rank's log, not just the last one assigned to LOG_FILE.
status=0
for rank_id in 0 1; do
  LOG_FILE=$CURRENT_DIR/logs/llama2_${rank_id}.log
  cat "$LOG_FILE"
  if ! grep -q "The avg time cost" "$LOG_FILE"; then
    status=1
  fi
done
if [ $status -eq 0 ]; then
  echo "infer_parallel run success"
  exit 0
else
  echo "infer_parallel run failed"
  exit 1
fi
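The launch-and-verify pattern the script uses (spawn one background job per rank, `wait` for all of them, then grep each per-rank log for a success marker) can be sketched standalone. The temp directory, stub command, and marker string below are illustrative stand-ins, not part of the real test case:

```shell
# Sketch of the per-rank launch/verify pattern. The real script runs
# infer_parallel.py per rank; here a stub just writes the marker line.
logdir=$(mktemp -d)
for rank_id in 0 1; do
  # Background job per rank, output redirected to that rank's log.
  sh -c "echo 'The avg time cost: 10 ms' > $logdir/llama2_${rank_id}.log" &
done
wait  # block until every background rank finishes
status=0
for f in "$logdir"/llama2_*.log; do
  # Any log missing the marker fails the whole run.
  grep -q "The avg time cost" "$f" || status=1
done
echo "status=$status"  # prints: status=0
```

Checking every log (rather than only the last value of `LOG_FILE`) matters because a single failed rank would otherwise go unnoticed.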
https://gitee.com/zhang-minli/mindspore-lite-ci.git