diff --git a/.gitignore b/.gitignore
index 6fb9043a6e8c77e6d6b59091a05073b23f33c79c..95d9875b53ecb6035a96e3afb3a0b9dff4492720 100644
--- a/.gitignore
+++ b/.gitignore
@@ -4,7 +4,7 @@
 # Temp File
 *tmp*
-
+*.zip
 # Python Tmp File
 *.pyc
diff --git "a/AICore Profiling\345\267\245\345\205\267\344\275\277\347\224\250\346\214\207\345\257\274\344\271\246.md" "b/AICore Profiling\345\267\245\345\205\267\344\275\277\347\224\250\346\214\207\345\257\274\344\271\246.md"
new file mode 100644
index 0000000000000000000000000000000000000000..b191438616d46e9637e1c809886202c54d5472a9
--- /dev/null
+++ "b/AICore Profiling\345\267\245\345\205\267\344\275\277\347\224\250\346\214\207\345\257\274\344\271\246.md"
@@ -0,0 +1,69 @@
+## 1 Background
+(1) Performance analysis is a major part of model debugging, so a usable profiling tool is essential.
+(2) A profiling tool already existed before the 630 release, but its usability was off-putting. This revision is a big improvement: both data collection and data parsing are now much easier to use.
+
+## 2 Data collection (training job)
+Other methods are not covered here; only enabling profiling through environment variables is described:
+```
+export PROFILING_MODE=true
+export PROFILING_OPTIONS='{"output": "./cann_profiling", "training_trace": "on", "task_trace": "on", "aicpu": "on", "fp_point": "", "bp_point": "", "aic_metrics": "PipeUtilization"}'
+```
+Here, output specifies the directory for the collected data (create it in advance), aicpu is the AI CPU operator switch, and aic_metrics is the AI Core operator metrics switch. For the other parameters, see [CANN V100R020C10 开发辅助工具指南 (训练) 01](https://support.huawei.com/enterprise/zh/doc/EDOC1100164832/6f4033fd).
+
+After setting the environment variables, run the training; profiling data is then produced under ./cann_profiling:
+![](https://gitee.com/zwx5317131/ascend-pytorch-crowdintelligence-doc/raw/master/figures/aicore_profiling_fig1.png)
+
+
+## 3 Data parsing
+### 3.1 Parsing tool
+Parsing requires the msprof tool under the Ascend package installation directory; the default paths are:
+```
+x86:
+/usr/local/Ascend/ascend-toolkit/latest/x86_64-linux/toolkit/tools/profiler/profiler_tool/analysis/msprof/msprof.py
+
+arm:
+/usr/local/Ascend/ascend-toolkit/latest/aarch64-linux/toolkit/tools/profiler/profiler_tool/analysis/msprof/msprof.py
+```
+The commands below use an x86_64 environment as the example.
+
+### 3.2 Parsing the profiling data
+```
+python3.7 /usr/local/Ascend/ascend-toolkit/latest/x86_64-linux/toolkit/tools/profiler/profiler_tool/analysis/msprof/msprof.py import -dir ./cann_profiling/
+```
+Here, ./cann_profiling is the directory where the data was collected above.
+Execution:
+![](https://gitee.com/zwx5317131/ascend-pytorch-crowdintelligence-doc/raw/master/figures/aicore_profiling_fig2.png)
+
+Afterwards, sqlite and other data directories are generated under cann_profiling:
+![](https://gitee.com/zwx5317131/ascend-pytorch-crowdintelligence-doc/raw/master/figures/aicore_profiling_fig3.png)
+
+### 3.3 Exporting timeline data
+To export timeline data, run:
+```
+python3.7 /usr/local/Ascend/ascend-toolkit/latest/x86_64-linux/toolkit/tools/profiler/profiler_tool/analysis/msprof/msprof.py export timeline -dir ./cann_profiling/
+```
+To restrict the export to a specific training step, set the -iteration-id parameter.
+A timeline folder is generated:
+![](https://gitee.com/zwx5317131/ascend-pytorch-crowdintelligence-doc/raw/master/figures/aicore_profiling_fig4.png)
+
+The json files inside can be viewed with chrome://tracing:
+![](https://gitee.com/zwx5317131/ascend-pytorch-crowdintelligence-doc/raw/master/figures/aicore_profiling_fig5.png)
+
+### 3.4 Exporting summary data
+To export summary data, run:
+```
+python3.7 /usr/local/Ascend/ascend-toolkit/latest/x86_64-linux/toolkit/tools/profiler/profiler_tool/analysis/msprof/msprof.py export summary -dir ./cann_profiling/
+```
+
+To restrict the export to a specific training step, set the -iteration-id parameter.
+A summary folder is generated; its csv files are the summary data, showing operator names, execution order, and per-operator time.
+![](https://gitee.com/zwx5317131/ascend-pytorch-crowdintelligence-doc/raw/master/figures/aicore_profiling_fig6.png)
+
+csv file:
+![](https://gitee.com/zwx5317131/ascend-pytorch-crowdintelligence-doc/raw/master/figures/aicore_profiling_fig7.png)
+
+![](https://gitee.com/zwx5317131/ascend-pytorch-crowdintelligence-doc/raw/master/figures/aicore_profiling_fig8.png)
+
+## 4 Outlook
+(1) The tool can also profile system performance data such as PCIe, DVPP, and HBM.
+(2) Operator task_info (input shape/dtype/format and output shape/dtype/format) is currently not generated in the PyTorch scenario, although it is in the TensorFlow scenario, because GE does not report this information in the op_based case. A requirement has been filed for this.
diff --git
"a/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNeXt50_Onnx\346\250\241\345\236\213\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274.md" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNeXt50_Onnx\346\250\241\345\236\213\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274.md" similarity index 100% rename from "onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNeXt50_Onnx\346\250\241\345\236\213\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274.md" rename to "Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNeXt50_Onnx\346\250\241\345\236\213\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274.md" diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/LICENSE" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/LICENSE" new file mode 100644 index 0000000000000000000000000000000000000000..eeac88fb9dc15a1427b41173cf5f136327230c49 --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/LICENSE" @@ -0,0 +1,201 @@ + Apache License + 
Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. 
For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. 
Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of 
the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. 
Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. 
+ + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "[]" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. + + Copyright [yyyy] [name of copyright owner] + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. 
\ No newline at end of file
diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/README.md" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/README.md"
new file mode 100644
index 0000000000000000000000000000000000000000..a37c7038bb37a6b948a1aa8ea7f508c8f5e1960d
--- /dev/null
+++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/README.md"
@@ -0,0 +1,345 @@
+# ResNeXt50 Onnx model end-to-end inference guide
+- [1 Model overview](#1-model-overview)
+	- [1.1 Paper](#11-paper)
+	- [1.2 Code](#12-code)
+- [2 Environment](#2-environment)
+	- [2.1 Deep learning frameworks](#21-deep-learning-frameworks)
+	- [2.2 Python third-party libraries](#22-python-third-party-libraries)
+- [3 Model conversion](#3-model-conversion)
+	- [3.1 pth to onnx model](#31-pth-to-onnx-model)
+	- [3.2 onnx to om model](#32-onnx-to-om-model)
+- [4 Dataset preprocessing](#4-dataset-preprocessing)
+	- [4.1 Obtaining the dataset](#41-obtaining-the-dataset)
+	- [4.2 Dataset preprocessing](#42-dataset-preprocessing)
+	- [4.3 Generating the dataset info file](#43-generating-the-dataset-info-file)
+- [5 Offline inference](#5-offline-inference)
+	- [5.1 benchmark tool overview](#51-benchmark-tool-overview)
+	- [5.2 Offline inference](#52-offline-inference)
+- [6 Accuracy comparison](#6-accuracy-comparison)
+	- [6.1 Offline inference accuracy](#61-offline-inference-accuracy)
+	- [6.2 Open-source accuracy](#62-open-source-accuracy)
+	- [6.3 Accuracy comparison](#63-accuracy-comparison)
+- [7 Performance comparison](#7-performance-comparison)
+	- [7.1 npu performance data](#71-npu-performance-data)
+	- [7.2 T4 performance data](#72-t4-performance-data)
+	- [7.3 Performance comparison](#73-performance-comparison)
+
+
+
+## 1 Model overview
+
+- **[Paper](#11-paper)**
+
+- **[Code](#12-code)**
+
+### 1.1 Paper
+[ResNeXt50 paper](https://arxiv.org/abs/1611.05431)
+
+### 1.2 Code
+[ResNeXt50 code](https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py)
+branch: master
+commit id: b68adcf9a9280aef02fc08daed170d74d0892361
+$\color{red}{Note: strikethrough marks template guidance about what README.md must contain; the struck-through text below must be deleted from the final README.md}$
+~~Prefer the open-source repository designated by this task, and fill in the branch and commit id. Find the commit id under the repository's commits on GitHub; it identifies the model code revision used for inference. Usually pick the last commit of a stable release, or the repository's latest commit.~~
+
+
+## 2 Environment
+
+- **[Deep learning frameworks](#21-deep-learning-frameworks)**
+
+- **[Python third-party libraries](#22-python-third-party-libraries)**
+
+### 2.1 Deep learning frameworks
+```
+python3.7.5
+CANN 5.0.1
+
+pytorch >= 1.5.0
+torchvision >= 0.6.0
+onnx >= 1.7.0
+```
+~~The 310 inference server currently runs the commercial CANN 5.0.1. Unless a specific version is required, write these three libraries exactly as above; run scripts with the python3.7 command and install libraries with pip3.7. Use torch 1.5.0; if exporting onnx from the open-source model code requires torch newer than 1.5.0, use 1.8.0 and note that here.~~
+
+### 2.2 Python third-party libraries
+
+```
+numpy == 1.20.3
+Pillow == 8.2.0
+opencv-python == 4.5.2.54
+```
+~~requirements.txt must list the exact version of every dependency needed for this model's offline inference, i.e. the versions used on the 310 inference server.~~
+
+**Note:**
+> x86: pytorch, torchvision and onnx can be installed from the official whl packages; everything else via pip3.7 install <package>
+>
+> Arm: pytorch, torchvision and onnx must be built from source; everything else via pip3.7 install <package>
+
+## 3 Model conversion
+
+- **[pth to onnx model](#31-pth-to-onnx-model)**
+
+- **[onnx to om model](#32-onnx-to-om-model)**
+
+### 3.1 pth to onnx model
+
+1. Download the pth weights
+[ResNeXt50 pretrained pth weights](https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth)
+File md5sum: 1d6611049e6ef03f1d6afa11f6f9023e
+```
+wget https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth
+```
+~~Prefer the weights provided by training. If they are available online, give the URL; otherwise state where to obtain them. If training provides no weights, use the open-source repository's weights. Give the weight file name and its md5sum as computed by the md5sum command.~~
+2. The resnext50 model code lives in torchvision. Install torchvision (from source on Arm; see the torchvision website, and search the web if installation fails):
+```
+git clone https://github.com/pytorch/vision
+cd vision
+git reset b68adcf9a9280aef02fc08daed170d74d0892361 --hard
+python3.7 setup.py install
+cd ..
+```
+~~If the model's open-source code needs modification, apply a patch before installing: patch -p1 < ../{patch_name}.diff~~
+3. Write the pth2onnx script resnext50_pth2onnx.py
+~~If the open-source repository has no install script, add it to the search path via sys.path.append(r"./vision") and then import the repository's functions and classes.~~
+ **Note:**
+> ATC currently supports onnx opset version 11
+
+4. Run the pth2onnx script to generate the onnx model file
+```
+python3.7 resnext50_pth2onnx.py resnext50_32x4d-7cdf4587.pth resnext50.onnx
+```
+
+ **Model conversion notes:**
+~~If a CANN operator defect causes conversion to fail or requires a workaround, document the debugging process, the cause, and the measures here.~~
+> Converting this model to onnx required no changes to the open-source repository, so no special notes are needed.
+
+### 3.2 onnx to om model
+
+1. Set environment variables
+```
+source env.sh
+```
+2. Use atc to convert the onnx model to an om model; for usage see CANN 5.0.1 开发辅助工具指南 (推理) 01
+```
+atc --framework=5 --model=./resnext50.onnx --input_format=NCHW --input_shape="image:16,3,224,224" --output=resnext50_bs16 --log=debug --soc_version=Ascend310
+```
+
+## 4 Dataset preprocessing
+
+- **[Obtaining the dataset](#41-obtaining-the-dataset)**
+
+- **[Dataset preprocessing](#42-dataset-preprocessing)**
+
+- **[Generating the dataset info file](#43-generating-the-dataset-info-file)**
+
+### 4.1 Obtaining the dataset
+This model is tested on the 50,000-image validation set from the [ImageNet website](http://www.image-net.org). Images and labels are stored at /root/datasets/imagenet/val and /root/datasets/imagenet/val_label.txt respectively.
+
+### 4.2 Dataset preprocessing
+1. Preprocessing script: imagenet_torch_preprocess.py
+
+2. Run the preprocessing script to generate the preprocessed bin files
+```
+python3.7 imagenet_torch_preprocess.py resnet /root/datasets/imagenet/val ./prep_dataset
+```
+### 4.3 Generating the dataset info file
+1. Script: gen_dataset_info.py
+
+2. Run it to generate the dataset info file
+```
+python3.7 gen_dataset_info.py bin ./prep_dataset ./resnext50_prep_bin.info 224 224
+```
+The first argument is the model input type, the second the path of the generated bin files, the third the output info file, and the last two the width and height.
+## 5 Offline inference
+
+- **[benchmark tool overview](#51-benchmark-tool-overview)**
+
+- **[Offline inference](#52-offline-inference)**
+
+### 5.1 benchmark tool overview
+
+benchmark is Huawei's in-house model inference tool. It supports offline inference for many models, quickly measures model performance on Ascend 310, and supports both real-data and pure-inference modes; combined with post-processing scripts it covers the end-to-end flow for many models. For the tool and its usage, see CANN 5.0.1 推理benchmark工具用户指南 01
+### 5.2 Offline inference
+1. Set environment variables
+```
+source env.sh
+```
+2. Run offline inference
+```
+./benchmark.x86_64 -model_type=vision -device_id=0 -batch_size=16 -om_path=resnext50_bs16.om -input_text_path=./resnext50_prep_bin.info -input_width=224 -input_height=224 -output_binary=False -useDvpp=False
+```
+The results are saved by default under result/dumpOutput_device{0}. The model has a single output named class, with shape bs * 1000 and dtype FP32, holding the predictions for the 1000 classes; each input produces one corresponding _x.bin output file.
+
+## 6 Accuracy comparison
+
+- **[Offline inference accuracy](#61-offline-inference-accuracy)**
+- **[Open-source accuracy](#62-open-source-accuracy)**
+- **[Accuracy comparison](#63-accuracy-comparison)**
+
+### 6.1 Offline inference accuracy
+
+Post-processing computes TopN accuracy.
+
+Call the imagenet_acc_eval.py script to compare the inference results with the labels; it reports Top1 through Top5 accuracy, saved in result.json.
+```
+python3.7 imagenet_acc_eval.py result/dumpOutput_device0/ /root/datasets/imagenet/val_label.txt ./ result.json
+```
+The first argument is the benchmark output directory, the second the dataset label file, the third the directory for the generated file, and the fourth its name.
+Output:
+```
+{"title": "Overall statistical evaluation", "value": [{"key": "Number of images", "value": "50000"}, {"key": "Number of classes", "value": "1000"}, {"key": "Top1 accuracy", "value": "77.62%"}, {"key": "Top2 accuracy", "value": "87.42%"}, {"key": "Top3 accuracy", "value": "90.79%"}, {"key": "Top4 accuracy", "value": "92.56%"}, {"key": "Top5 accuracy", "value": "93.69%"}]}
+```
+Testing the bs1 and bs16 om models shows no accuracy difference between batch 1 and batch 16; both match the numbers above.
+~~Because batch size can affect accuracy, if the model supports multiple batch sizes, test accuracy for bs1 and bs16 only.~~
+
+### 6.2 Open-source accuracy
+[torchvision published accuracy](https://pytorch.org/vision/stable/models.html)
+```
+Model               Acc@1   Acc@5
+ResNeXt-50-32x4d    77.618  93.698
+```
+### 6.3 Accuracy comparison
+The om offline-inference TopN accuracy is within 1% of the accuracy published by the model's GitHub repository, so accuracy passes.
+ **Accuracy debugging:**
+~~If a CANN operator defect causes accuracy to fail or requires a workaround, document the debugging process, the cause, and the measures here.~~
+> No accuracy issues were encountered, so no accuracy debugging was needed.
+
+## 7 Performance comparison
+
+- **[npu performance data](#71-npu-performance-data)**
+- **[T4 performance data](#72-t4-performance-data)**
+- **[Performance comparison](#73-performance-comparison)**
+
+~~Measure performance for bs1, 16, 4, 8 and 32, and compute single-card throughput. On npu, measure bs1 and bs16 on the full dataset; to avoid occupying the device for long, bs4, 8 and 32 may be measured with pure inference.~~
+
+### 7.1 npu performance data
+The benchmark tool also reports performance when inferring over the full dataset, but that is slow, and the device must not be shared during the run; use npu-smi info to check that the device is idle. The tool's pure-inference mode is faster, but because random inputs cannot mimic the real data distribution, its numbers can be inaccurate for some models. Use pure inference only for quick, rough numbers during debugging and optimization, and as a first check on full-dataset numbers that may be skewed by other tasks using the device. The model's official performance is the bs1 and bs16 full-dataset numbers; record the pure-inference batch 4, 8 and 32 numbers in README.md as below.
+1. Full-dataset performance from the benchmark tool
+
+batch1 performance: after full-dataset inference, benchmark writes result/perf_vision_batchsize_1_device_0.txt:
+```
+[e2e] throughputRate: 243.034, latency: 205733
+[data read] throughputRate: 258.963, moduleLatency: 3.86155
+[preprocess] throughputRate: 258.404, moduleLatency: 3.86991
+[infer] throughputRate: 244.435, Interface throughputRate: 382.328, moduleLatency: 3.35758
+[post] throughputRate: 244.435, moduleLatency: 4.09107
+```
+Interface throughputRate is 382.328; 382.328 x 4 = 1529.312 fps is the batch1 310 single-card throughput.
+batch16 performance: after full-dataset inference, benchmark writes result/perf_vision_batchsize_16_device_1.txt:
+```
+[e2e] throughputRate: 173.173, latency: 288729
+[data read] throughputRate: 174.62, moduleLatency: 5.72673
+[preprocess] throughputRate: 174.357, moduleLatency: 5.73535
+[infer] throughputRate: 173.844, Interface throughputRate: 519.634, moduleLatency: 3.36724
+[post] throughputRate: 10.865, moduleLatency: 92.0383
+```
+Interface throughputRate is 519.634; 519.634 x 4 = 2078.536 fps is the batch16 310 single-card throughput.
+batch4 performance:
+```
+[e2e] throughputRate: 232.98, latency: 214611
+[data read] throughputRate: 235.537, moduleLatency: 4.24562
+[preprocess] throughputRate: 235.147, moduleLatency: 4.25266
+[infer] throughputRate: 234.437, Interface throughputRate: 492.99, moduleLatency: 3.48397
+[post] throughputRate: 58.6087, moduleLatency: 17.0623
+```
+batch4 310 single-card throughput: 492.99 x 4 = 1971.96 fps
+batch8 performance:
+```
+[e2e] throughputRate: 211.307, latency: 236622
+[data read] throughputRate: 212.246, moduleLatency: 4.71152
+[preprocess] throughputRate: 211.931, moduleLatency: 4.71851
+[infer] throughputRate: 211.927, Interface throughputRate: 496.378, moduleLatency: 3.45797
+[post] throughputRate: 26.4906, moduleLatency: 37.7493
+```
+batch8 310 single-card throughput: 496.378 x 4 = 1985.512 fps
+batch32 performance:
+```
+[e2e] throughputRate: 122.942, latency: 406696
+[data read] throughputRate: 123.244, moduleLatency: 8.11402
+[preprocess] throughputRate: 123.143, moduleLatency: 8.12064
+[infer] throughputRate: 123.207, Interface throughputRate: 377.787, moduleLatency: 4.10655
+[post] throughputRate: 3.8514, moduleLatency: 259.646
+```
+batch32 310 single-card throughput: 377.787 x 4 = 1511.148 fps
+
+### 7.2 T4 performance data
+Measure gpu performance on a server with a T4 card, making sure the card is running no other task during the test. TensorRT version: 7.2.3.4, cuda version: 11.0, cudnn version: 8.2
+~~The T4 server currently has the cuda, cudnn and TensorRT versions above.~~
+batch1 performance:
+```
+trtexec --onnx=resnext50.onnx --fp16 --shapes=image:1x3x224x224 --threads
+```
+Whereas the 310 numbers above come from 4 devices running in parallel, for the gpu T4 the mean value is the latency (TensorRT's latency is the inference time for a whole batch of data), i.e. batch size divided by throughput. --fp16 sets the operator precision; currently only --fp16 is measured. Note that --shapes gives the onnx input node name and shape: if the onnx input node's batch is -1, the same onnx file can be used to measure different batch sizes; otherwise, measuring other batch sizes with a fixed-batch onnx is inaccurate.
+```
+[03/24/2021-03:54:47] [I] GPU Compute
+[03/24/2021-03:54:47] [I] min: 1.26575 ms
+[03/24/2021-03:54:47] [I] max: 4.41528 ms
+[03/24/2021-03:54:47] [I] mean: 1.31054 ms
+[03/24/2021-03:54:47] [I] median: 1.30151 ms
+[03/24/2021-03:54:47] [I] percentile: 1.40723 ms at 99%
+[03/24/2021-03:54:47] [I] total compute time: 2.9972 s
+```
+batch1 t4 single-card throughput: 1000/(1.31054/1) = 763.044 fps
+
+batch16 performance:
+```
+trtexec --onnx=resnext50.onnx --fp16 --shapes=image:16x3x224x224 --threads
+```
+```
+[03/24/2021-03:57:22] [I] GPU Compute
+[03/24/2021-03:57:22] [I] min: 12.5645 ms
+[03/24/2021-03:57:22] [I] max: 14.8437 ms
+[03/24/2021-03:57:22] [I] mean: 12.9561 ms
+[03/24/2021-03:57:22] [I] median: 12.8541 ms
+[03/24/2021-03:57:22] [I] percentile: 14.8377 ms at 99%
+[03/24/2021-03:57:22] [I] total compute time: 3.03173 s
+```
+batch16 t4 single-card throughput: 1000/(12.9561/16) = 1234.940 fps
+
+batch4 performance:
+```
+[05/27/2021-03:16:26] [I] GPU Compute
+[05/27/2021-03:16:26] [I] min: 3.77515 ms
+[05/27/2021-03:16:26] [I] max: 4.07959 ms
+[05/27/2021-03:16:26] [I] mean: 3.92862 ms
+[05/27/2021-03:16:26] [I] median: 3.9552 ms
+[05/27/2021-03:16:26] [I] percentile: 4.07324 ms at 99%
+[05/27/2021-03:16:26] [I] total compute time: 3.0054 s
+```
+batch4 t4 single-card throughput: 1000/(3.92862/4) = 1018.169 fps
+
+batch8 performance:
+```
+[05/27/2021-03:14:52] [I] GPU Compute
+[05/27/2021-03:14:52] [I] min: 6.52148 ms
+[05/27/2021-03:14:52] [I] max: 7.22937 ms
+[05/27/2021-03:14:52] [I] mean: 6.80709 ms
+[05/27/2021-03:14:52] [I] median: 6.78735 ms
+[05/27/2021-03:14:52] [I] percentile: 7.08972 ms at 99%
+[05/27/2021-03:14:52] [I] total compute time: 3.01554 s
+```
+batch8 t4 single-card throughput: 1000/(6.80709/8) = 1175.245 fps
+
+batch32 performance:
+```
+[05/27/2021-03:13:11] [I] GPU Compute
+[05/27/2021-03:13:11] [I] min: 23.126 ms
+[05/27/2021-03:13:11] [I] max: 26.0043 ms
+[05/27/2021-03:13:11] [I] mean: 24.2826 ms
+[05/27/2021-03:13:11] [I] median: 24.2343 ms
+[05/27/2021-03:13:11] [I] percentile: 25.6355 ms at 99%
+[05/27/2021-03:13:11] [I] total compute time: 3.05961 s
+```
+batch32 t4 single-card throughput: 1000/(24.2826/32) = 1317.816 fps
+
+### 7.3 Performance comparison
+batch1: 382.328 x 4 > 1000 x 1/(1.31054/1)
+batch16: 519.634 x 4 > 1000 x 1/(12.9561/16)
+The 310 single-card throughput (single-device throughput times 4) exceeds the T4 single-card throughput, so 310 performance is higher than T4 performance and the performance target is met.
+For both batch1 and batch16, 310 performance is more than 1.2x T4 performance, so this model is placed under the Benchmark/cv/classification directory.
+~~Comparing bs1 and bs16: below 1x goes under Research, 1-1.2x under Official, above 1.2x under Benchmark; in practice, submissions are currently all placed under Research.~~
+ **Performance optimization:**
+~~If a CANN operator defect causes performance to fail or requires a workaround, document the debugging process, the cause, and the measures here.~~
+> No performance issues were encountered, so no performance optimization was needed.
+
+~~If accuracy or performance fails on the commercial CANN release but passes on the latest community CANN release, state the reason and the community CANN version here, and measure with the latest version. If an unavoidable operator defect causes a performance miss, state the cause and the solution. If the onnx cannot be run offline because it contains custom operators, note that performance was measured as online inference on T4. If the model does not support batch 16, note that as well.~~
+
+
diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/env.sh" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/env.sh"
new file mode 100644
index 0000000000000000000000000000000000000000..49be8f16a045f6e4e61666ecc8a5da811c15fd80
--- /dev/null
+++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/env.sh"
@@ -0,0 +1,8 @@
+#!/bin/bash
+
+export install_path=/usr/local/Ascend/ascend-toolkit/latest
+export PATH=/usr/local/python3.7.5/bin:${install_path}/atc/ccec_compiler/bin:${install_path}/atc/bin:$PATH
+export PYTHONPATH=${install_path}/atc/python/site-packages:$PYTHONPATH
+export LD_LIBRARY_PATH=${install_path}/atc/lib64:${install_path}/acllib/lib64:$LD_LIBRARY_PATH
+export ASCEND_OPP_PATH=${install_path}/opp
+export ASCEND_AICPU_PATH=/usr/local/Ascend/ascend-toolkit/latest
diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/gen_dataset_info.py" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/gen_dataset_info.py"
new file mode 100644
index 0000000000000000000000000000000000000000..80c2b0fc300d7037330a166b23c562015cd17148
--- /dev/null
+++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/gen_dataset_info.py"
@@ -0,0 +1,60 @@
+# Copyright 2021 Huawei Technologies Co., Ltd
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
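As a hedged illustration of consuming the outputs of the benchmark command above: a minimal sketch of reading one output file as raw FP32 logits of shape (batch, 1000) and taking the top-5 class ids per image. The function name `top5_from_bin` and its defaults are illustrative assumptions, not this project's actual post-processing script, and the raw-FP32 layout assumption only holds when benchmark writes binary output.

```python
import numpy as np

def top5_from_bin(bin_path, batch_size=16, num_classes=1000):
    """Read one benchmark output file as raw FP32 logits and return
    the top-5 class ids per image, shape (batch_size, 5)."""
    logits = np.fromfile(bin_path, dtype=np.float32)
    logits = logits.reshape(batch_size, num_classes)
    # argsort on negated logits gives descending order; keep the first 5 columns
    return np.argsort(-logits, axis=1)[:, :5]
```

The real accuracy flow in this guide uses imagenet_acc_eval.py instead; this sketch only shows the bs * 1000 FP32 layout described for the model output.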
+ +import os +import sys +import cv2 +from glob import glob + + +def get_bin_info(file_path, info_name, width, height): + bin_images = glob(os.path.join(file_path, '*.bin')) + with open(info_name, 'w') as file: + for index, img in enumerate(bin_images): + content = ' '.join([str(index), img, width, height]) + file.write(content) + file.write('\n') + + +def get_jpg_info(file_path, info_name): + extensions = ['jpg', 'jpeg', 'JPG', 'JPEG'] + image_names = [] + for extension in extensions: + image_names.append(glob(os.path.join(file_path, '*.' + extension))) + with open(info_name, 'w') as file: + for image_name in image_names: + if len(image_name) == 0: + continue + else: + for index, img in enumerate(image_name): + img_cv = cv2.imread(img) + shape = img_cv.shape + width, height = shape[1], shape[0] + content = ' '.join([str(index), img, str(width), str(height)]) + file.write(content) + file.write('\n') + + +if __name__ == '__main__': + file_type = sys.argv[1] + file_path = sys.argv[2] + info_name = sys.argv[3] + if file_type == 'bin': + width = sys.argv[4] + height = sys.argv[5] + assert len(sys.argv) == 6, 'The number of input parameters must be equal to 5' + get_bin_info(file_path, info_name, width, height) + elif file_type == 'jpg': + assert len(sys.argv) == 4, 'The number of input parameters must be equal to 3' + get_jpg_info(file_path, info_name) diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/imagenet_acc_eval.py" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/imagenet_acc_eval.py" new file mode 100644 index 0000000000000000000000000000000000000000..0e1db27e816a0cf3ec4fb21ee23e691315f3f959 --- /dev/null +++ 
"b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/imagenet_acc_eval.py" @@ -0,0 +1,183 @@ +# Copyright 2021 Huawei Technologies Co., Ltd +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import os +import sys +import json +import numpy as np +import time + +np.set_printoptions(threshold=sys.maxsize) + +LABEL_FILE = "HiAI_label.json" + + +def gen_file_name(img_name): + full_name = img_name.split('/')[-1] + index = full_name.rfind('.') + return full_name[:index] + + +def cre_groundtruth_dict(gtfile_path): + """ + :param filename: file contains the imagename and label number + :return: dictionary key imagename, value is label number + """ + img_gt_dict = {} + for gtfile in os.listdir(gtfile_path): + if (gtfile != LABEL_FILE): + with open(os.path.join(gtfile_path, gtfile), 'r') as f: + gt = json.load(f) + ret = gt["image"]["annotations"][0]["category_id"] + img_gt_dict[gen_file_name(gtfile)] = ret + return img_gt_dict + + +def cre_groundtruth_dict_fromtxt(gtfile_path): + """ + :param filename: file contains the imagename and label number + :return: dictionary key imagename, value is label number + """ + img_gt_dict = {} + with open(gtfile_path, 'r')as f: + for line in f.readlines(): + temp = line.strip().split(" ") + imgName = temp[0].split(".")[0] + imgLab = temp[1] + img_gt_dict[imgName] = imgLab + return img_gt_dict 
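The `cre_groundtruth_dict_fromtxt` helper above expects `val_label.txt` lines of the form `<image_name> <label>`, keyed by the image name without its extension. A small self-contained sketch of that parsing (the sample lines are hypothetical):

```python
# Parse "<image_name>.<ext> <label>" lines into {name_without_ext: label},
# mirroring cre_groundtruth_dict_fromtxt; the sample data is hypothetical.
def parse_labels(lines):
    img_gt = {}
    for line in lines:
        name, label = line.strip().split(" ")[:2]
        img_gt[name.split(".")[0]] = label
    return img_gt

labels = parse_labels(["ILSVRC2012_val_00000001.JPEG 65",
                       "ILSVRC2012_val_00000002.JPEG 970"])
print(labels)  # -> {'ILSVRC2012_val_00000001': '65', 'ILSVRC2012_val_00000002': '970'}
```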
+
+
+def load_statistical_predict_result(filepath):
+    """
+    function:
+        extract the prediction results from a result file
+    input:
+        result file: filepath
+    output:
+        n_label: number of labels
+        data_vec: the predicted probabilities over the 1000 classes
+    :return: probabilities, number of labels, in_type, color
+    """
+    with open(filepath, 'r') as f:
+        data = f.readline()
+        temp = data.strip().split(" ")
+        n_label = len(temp)
+        if data == '':
+            n_label = 0
+        data_vec = np.zeros((n_label), dtype=np.float32)
+        in_type = ''
+        color = ''
+        if n_label == 0:
+            in_type = f.readline()
+            color = f.readline()
+        else:
+            for ind, prob in enumerate(temp):
+                data_vec[ind] = np.float32(prob)
+    return data_vec, n_label, in_type, color
+
+
+def create_visualization_statistical_result(prediction_file_path,
+                                            result_store_path, json_file_name,
+                                            img_gt_dict, topn=5):
+    """
+    :param prediction_file_path:
+    :param result_store_path:
+    :param json_file_name:
+    :param img_gt_dict:
+    :param topn:
+    :return:
+    """
+    writer = open(os.path.join(result_store_path, json_file_name), 'w')
+    table_dict = {}
+    table_dict["title"] = "Overall statistical evaluation"
+    table_dict["value"] = []
+
+    count = 0
+    resCnt = 0
+    n_labels = 0
+    count_hit = np.zeros(topn)
+    for tfile_name in os.listdir(prediction_file_path):
+        count += 1
+        temp = tfile_name.split('.')[0]
+        index = temp.rfind('_')
+        img_name = temp[:index]
+        filepath = os.path.join(prediction_file_path, tfile_name)
+        ret = load_statistical_predict_result(filepath)
+        prediction = ret[0]
+        n_labels = ret[1]
+        sort_index = np.argsort(-prediction)
+        gt = img_gt_dict[img_name]
+        if (n_labels == 1000):
+            realLabel = int(gt)
+        elif (n_labels == 1001):
+            realLabel = int(gt) + 1
+        else:
+            realLabel = int(gt)
+
+        resCnt = min(len(sort_index), topn)
+        for i in range(resCnt):
+            if (str(realLabel) == str(sort_index[i])):
+                count_hit[i] += 1
+                break
+
+    if 'value' not in table_dict.keys():
+        print("the item value does not exist!")
+    else:
+        table_dict["value"].extend(
+            [{"key": "Number of images", "value": str(count)},
+             {"key": "Number of classes", "value": str(n_labels)}])
+        if count == 0:
+            accuracy = 0
+        else:
+            accuracy = np.cumsum(count_hit) / count
+        for i in range(resCnt):
+            table_dict["value"].append({"key": "Top" + str(i + 1) + " accuracy",
+                                        "value": str(
+                                            round(accuracy[i] * 100, 2)) + '%'})
+        json.dump(table_dict, writer)
+    writer.close()
+
+
+if __name__ == '__main__':
+    start = time.time()
+    try:
+        # txt file path
+        folder_davinci_target = sys.argv[1]
+        # annotation files path, "val_label.txt"
+        annotation_file_path = sys.argv[2]
+        # the path where the result json is stored
+        result_json_path = sys.argv[3]
+        # result json file name
+        json_file_name = sys.argv[4]
+    except IndexError:
+        print("Usage: python3 imagenet_acc_eval.py <prediction_dir> <val_label.txt> <result_dir> <result_json_name>")
+        exit(1)
+
+    if not (os.path.exists(folder_davinci_target)):
+        print("target file folder does not exist.")
+
+    if not (os.path.exists(annotation_file_path)):
+        print("Ground truth file does not exist.")
+
+    if not (os.path.exists(result_json_path)):
+        print("Result folder does not exist.")
+
+    img_label_dict = cre_groundtruth_dict_fromtxt(annotation_file_path)
+    create_visualization_statistical_result(folder_davinci_target,
+                                            result_json_path, json_file_name,
+                                            img_label_dict, topn=5)
+
+    elapsed = (time.time() - start)
+    print("Time used:", elapsed)
diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/imagenet_torch_preprocess.py" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/imagenet_torch_preprocess.py" new file mode 100644 index 0000000000000000000000000000000000000000..65b50a502a600bc3a8ee420cbdbe3c6f13adc02a --- /dev/null
+++ 
"b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/imagenet_torch_preprocess.py" @@ -0,0 +1,116 @@ +# Copyright 2021 Huawei Technologies Co., Ltd +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import os +import sys +from PIL import Image +import numpy as np +import multiprocessing + + +model_config = { + 'resnet': { + 'resize': 256, + 'centercrop': 224, + 'mean': [0.485, 0.456, 0.406], + 'std': [0.229, 0.224, 0.225], + }, + 'inceptionv3': { + 'resize': 342, + 'centercrop': 299, + 'mean': [0.485, 0.456, 0.406], + 'std': [0.229, 0.224, 0.225], + }, + 'inceptionv4': { + 'resize': 342, + 'centercrop': 299, + 'mean': [0.5, 0.5, 0.5], + 'std': [0.5, 0.5, 0.5], + }, +} + + +def center_crop(img, output_size): + if isinstance(output_size, int): + output_size = (int(output_size), int(output_size)) + image_width, image_height = img.size + crop_height, crop_width = output_size + crop_top = int(round((image_height - crop_height) / 2.)) + crop_left = int(round((image_width - crop_width) / 2.)) + return img.crop((crop_left, crop_top, crop_left + crop_width, crop_top + crop_height)) + + +def resize(img, size, interpolation=Image.BILINEAR): + if isinstance(size, int): + w, h = img.size + if (w <= h and w == size) or (h <= w and h == size): + return img + if w < h: + ow = size + oh = int(size * h / w) + return img.resize((ow, oh), 
interpolation)
+        else:
+            oh = size
+            ow = int(size * w / h)
+            return img.resize((ow, oh), interpolation)
+    else:
+        return img.resize(size[::-1], interpolation)
+
+
+def gen_input_bin(mode_type, file_batches, batch):
+    i = 0
+    for file in file_batches[batch]:
+        i = i + 1
+        print("batch", batch, file, "===", i)
+
+        # RGBA to RGB
+        image = Image.open(os.path.join(src_path, file)).convert('RGB')
+        image = resize(image, model_config[mode_type]['resize'])  # Resize
+        image = center_crop(image, model_config[mode_type]['centercrop'])  # CenterCrop
+        img = np.array(image, dtype=np.float32)
+        img = img.transpose(2, 0, 1)  # ToTensor: HWC -> CHW
+        img = img / 255.  # ToTensor: div 255
+        img -= np.array(model_config[mode_type]['mean'], dtype=np.float32)[:, None, None]  # Normalize: mean
+        img /= np.array(model_config[mode_type]['std'], dtype=np.float32)[:, None, None]  # Normalize: std
+        img.tofile(os.path.join(save_path, file.split('.')[0] + ".bin"))
+
+
+def preprocess(mode_type, src_path, save_path):
+    files = os.listdir(src_path)
+    # Batch in groups of 500 over the actual file count (was hard-coded to 50000).
+    file_batches = [files[i:i + 500] for i in range(0, len(files), 500) if files[i:i + 500] != []]
+    thread_pool = multiprocessing.Pool(len(file_batches))
+    for batch in range(len(file_batches)):
+        thread_pool.apply_async(gen_input_bin, args=(mode_type, file_batches, batch))
+    thread_pool.close()
+    thread_pool.join()
+    print("Exceptions in worker processes are not reported; please verify that the bin files were generated.")
+
+
+if __name__ == '__main__':
+    if len(sys.argv) < 4:
+        raise Exception("usage: python3 xxx.py [model_type] [src_path] [save_path]")
+    mode_type = sys.argv[1]
+    src_path = sys.argv[2]
+    save_path = sys.argv[3]
+    src_path = os.path.realpath(src_path)
+    save_path = os.path.realpath(save_path)
+    if mode_type not in model_config:
+        model_type_help = "model type: "
+        for key in model_config.keys():
+            model_type_help += key
+            model_type_help += ' '
+        raise Exception(model_type_help)
+    if not os.path.isdir(save_path):
+        os.makedirs(os.path.realpath(save_path))
+    preprocess(mode_type, src_path, save_path)
diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/requirements.txt" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/requirements.txt" new file mode 100644 index 0000000000000000000000000000000000000000..8f0de9561f6cdbfe3facbf31905aee0e9bf7fa9e --- /dev/null
+++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/requirements.txt"
@@ -0,0 +1,6 @@
+torch == 1.5.0
+torchvision == 0.6.0
+onnx == 1.7.0
+numpy == 1.20.3
+Pillow == 8.2.0
+opencv-python == 4.5.2.54
\ No newline at end of file
diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/resnext50_pth2onnx.py" 
"b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/resnext50_pth2onnx.py" new file mode 100644 index 0000000000000000000000000000000000000000..8e180e9245b3f8964b98196e080e3e6e8835077b --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/resnext50_pth2onnx.py" @@ -0,0 +1,35 @@ +# Copyright 2021 Huawei Technologies Co., Ltd +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+
+import sys
+import torch
+import torch.onnx
+import torchvision.models as models
+
+def pth2onnx(input_file, output_file):
+    model = models.resnext50_32x4d(pretrained=False)
+    checkpoint = torch.load(input_file, map_location=None)
+    model.load_state_dict(checkpoint)
+
+    model.eval()
+    input_names = ["image"]
+    output_names = ["class"]
+    dynamic_axes = {'image': {0: '-1'}, 'class': {0: '-1'}}
+    dummy_input = torch.randn(1, 3, 224, 224)
+    torch.onnx.export(model, dummy_input, output_file, input_names = input_names, dynamic_axes = dynamic_axes, output_names = output_names, verbose=True, opset_version=11)
+
+if __name__ == "__main__":
+    input_file = sys.argv[1]
+    output_file = sys.argv[2]
+    pth2onnx(input_file, output_file)
diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/README.md" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/README.md" new file mode 100644 index 0000000000000000000000000000000000000000..c25279493efdab395677a1e991caa2467ca60a11 --- /dev/null
+++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/README.md"
@@ -0,0 +1,33 @@
+Environment preparation:
+
+1. Dataset path
+Common datasets are placed under /root/datasets/ or /opt/npu/
+The dataset for this model is placed under /root/datasets/
+
+2. Enter the working directory
+cd ResNext50
+
+3. Install the required dependencies. The test environment may already have some of these libraries at other versions, so installing via this command is not recommended for manual testing
+pip3.7 install -r requirements.txt
+
+4. Obtain, modify and install the open-source model code
+git clone https://github.com/pytorch/vision
+cd vision
+If the model code was modified and a {model_name}.diff was delivered:
+patch -p1 < ../{model_name}.diff
+If the model code needs to be installed, install it (if there is no setup script and scripts such as pth2onnx need to reference classes or functions from the model code, add a search path via sys.path.append(r"./vision"))
+python3.7 setup.py install
+cd ..
+
+5. Obtain the weight file
+wget https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth
+
+6. Obtain the benchmark tool
+Place benchmark.x86_64 and benchmark.aarch64 in the current directory
+
+7. Run on the 310; make sure the device is idle during execution
+bash test/pth2om.sh
+bash test/eval_acc_perf.sh --datasets_path=/root/datasets
+
+8. On the T4 machine, place the onnx file and perf_t4.sh in the same directory
+Then run bash perf_t4.sh; make sure the GPU is idle during execution
diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/eval_acc_perf.sh" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/eval_acc_perf.sh" new file mode 100644 index 0000000000000000000000000000000000000000..46fcfc68792b801f73d1f214c9fb97a0a0020731 --- /dev/null
+++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/eval_acc_perf.sh"
@@ -0,0 +1,69 @@
+#!/bin/bash
+
+datasets_path="/root/datasets/"
+
+for para in $*
+do
+    if [[ $para == --datasets_path* ]]; then
+        datasets_path=`echo ${para#*=}`
+    fi
+done
+
+arch=`uname -m`
+rm -rf ./prep_dataset
+python3.7 imagenet_torch_preprocess.py resnet ${datasets_path}/imagenet/val ./prep_dataset
+if [ $? != 0 ]; then
+    echo "fail!"
+    exit -1
+fi
+python3.7 gen_dataset_info.py bin ./prep_dataset ./resnext50_prep_bin.info 224 224
+if [ $? != 0 ]; then
+    echo "fail!"
+    exit -1
+fi
+source env.sh
+rm -rf result/dumpOutput_device0
+./benchmark.${arch} -model_type=vision -device_id=0 -batch_size=1 -om_path=resnext50_bs1.om -input_text_path=./resnext50_prep_bin.info -input_width=224 -input_height=224 -output_binary=False -useDvpp=False
+if [ $? != 0 ]; then
+    echo "fail!"
+ exit -1 +fi +rm -rf result/dumpOutput_device1 +./benchmark.${arch} -model_type=vision -device_id=1 -batch_size=16 -om_path=resnext50_bs16.om -input_text_path=./resnext50_prep_bin.info -input_width=224 -input_height=224 -output_binary=False -useDvpp=False +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +python3.7 imagenet_acc_eval.py result/dumpOutput_device0/ ${datasets_path}/imagenet/val_label.txt ./ result_bs1.json +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +python3.7 imagenet_acc_eval.py result/dumpOutput_device1/ ${datasets_path}/imagenet/val_label.txt ./ result_bs16.json +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +echo "====accuracy data====" +python3.7 test/parse.py result_bs1.json +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +python3.7 test/parse.py result_bs16.json +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +echo "====performance data====" +python3.7 test/parse.py result/perf_vision_batchsize_1_device_0.txt +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +python3.7 test/parse.py result/perf_vision_batchsize_16_device_1.txt +if [ $? != 0 ]; then + echo "fail!" 
+ exit -1 +fi +echo "success" \ No newline at end of file diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/parse.py" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/parse.py" new file mode 100644 index 0000000000000000000000000000000000000000..b9c74f41d7848e1250356f14472b237a18bb3489 --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/parse.py" @@ -0,0 +1,32 @@ +# Copyright 2021 Huawei Technologies Co., Ltd +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
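As context for `parse.py` below: the accuracy script `imagenet_acc_eval.py` writes a JSON table of the form `{"title": ..., "value": [{"key": ..., "value": ...}, ...]}`, and `parse.py` picks out the entries whose key contains `Top`. A minimal sketch of that extraction (the accuracy numbers are hypothetical):

```python
import json

# Hypothetical result JSON in the shape written by imagenet_acc_eval.py.
result = {"title": "Overall statistical evaluation",
          "value": [{"key": "Number of images", "value": "50000"},
                    {"key": "Top1 accuracy", "value": "77.62%"},
                    {"key": "Top5 accuracy", "value": "93.7%"}]}
content = json.dumps(result)

# The same extraction parse.py performs: keep entries whose key contains 'Top'.
tops = [i.get('value') for i in json.loads(content).get('value') if 'Top' in i.get('key')]
print(tops)  # -> ['77.62%', '93.7%']
```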
+
+import sys
+import json
+import re
+
+if __name__ == '__main__':
+    if sys.argv[1].endswith('.json'):
+        result_json = sys.argv[1]
+        with open(result_json, 'r') as f:
+            content = f.read()
+        tops = [i.get('value') for i in json.loads(content).get('value') if 'Top' in i.get('key')]
+        print('om {} top1:{} top5:{}'.format(result_json.split('_')[1].split('.')[0], tops[0], tops[4]))
+    elif sys.argv[1].endswith('.txt'):
+        result_txt = sys.argv[1]
+        with open(result_txt, 'r') as f:
+            content = f.read()
+        txt_data_list = [i.strip() for i in re.findall(r':(.*?),', content.replace('\n', ',') + ',')]
+        fps = float(txt_data_list[7].replace('samples/s', '')) * 4
+        print('310 bs{} fps:{}'.format(result_txt.split('_')[3], fps))
\ No newline at end of file
diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/perf_t4.sh" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/perf_t4.sh" new file mode 100644 index 0000000000000000000000000000000000000000..aee41c4db7a3e56e608c84c89f5cac1d532458d8 --- /dev/null
+++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/perf_t4.sh"
@@ -0,0 +1,22 @@
+#!/bin/bash
+
+# Run on the T4:
+trtexec --onnx=resnext50.onnx --fp16 --shapes=image:1x3x224x224 --threads > resnext50_bs1.log
+perf_str=`grep "GPU.* mean.*ms$" resnext50_bs1.log`
+if [ -n "$perf_str" ]; then
+    perf_num=`echo $perf_str | awk -F' ' '{print $16}'`
+else
+    perf_str=`grep "mean.*ms$" resnext50_bs1.log`
+    perf_num=`echo $perf_str | awk -F' ' '{print $4}'`
+fi
+awk 'BEGIN{printf "t4 bs1 fps:%.3f\n", 
1000*1/('$perf_num'/1)}' + +trtexec --onnx=resnext50.onnx --fp16 --shapes=image:16x3x224x224 --threads > resnext50_bs16.log +perf_str=`grep "GPU.* mean.*ms$" resnext50_bs16.log` +if [ -n "$perf_str" ]; then + perf_num=`echo $perf_str | awk -F' ' '{print $16}'` +else + perf_str=`grep "mean.*ms$" resnext50_bs16.log` + perf_num=`echo $perf_str | awk -F' ' '{print $4}'` +fi +awk 'BEGIN{printf "t4 bs16 fps:%.3f\n", 1000*1/('$perf_num'/16)}' \ No newline at end of file diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/pth2om.sh" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/pth2om.sh" new file mode 100644 index 0000000000000000000000000000000000000000..eaf285c653a1a928765ca6c650d557b89e364607 --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/ResNext50/test/pth2om.sh" @@ -0,0 +1,13 @@ +#!/bin/bash + +rm -rf resnext50.onnx +python3.7 resnext50_pth2onnx.py resnext50_32x4d-7cdf4587.pth resnext50.onnx +source env.sh +rm -rf resnext50_bs1.om resnext50_bs16.om +atc --framework=5 --model=./resnext50.onnx --input_format=NCHW --input_shape="image:1,3,224,224" --output=resnext50_bs1 --log=debug --soc_version=Ascend310 +atc --framework=5 --model=./resnext50.onnx --input_format=NCHW --input_shape="image:16,3,224,224" --output=resnext50_bs16 --log=debug --soc_version=Ascend310 +if [ -f "resnext50_bs1.om" ] && [ -f "resnext50_bs16.om" ]; then + echo "success" +else + echo "fail!" 
+fi \ No newline at end of file diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/ssd_detection.diff" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/ssd_detection.diff" new file mode 100644 index 0000000000000000000000000000000000000000..6c8e0123a9b73a5cd07a1d4f55d3235b655c0486 --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/ssd_detection.diff" @@ -0,0 +1,140 @@ +diff --git a/mmdet/core/anchor/anchor_generator.py b/mmdet/core/anchor/anchor_generator.py +index 3c2fd5a0..f6d11fa7 100644 +--- a/mmdet/core/anchor/anchor_generator.py ++++ b/mmdet/core/anchor/anchor_generator.py +@@ -197,6 +197,8 @@ class AnchorGenerator: + tuple[torch.Tensor]: The mesh grids of x and y. 
+ """ + # use shape instead of len to keep tracing while exporting to onnx ++ x = x.to(dtype=torch.int32) ++ y = y.to(dtype=torch.int32) + xx = x.repeat(y.shape[0]) + yy = y.view(-1, 1).repeat(1, x.shape[0]).view(-1) + if row_major: +diff --git a/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py b/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py +index 98d30906..48bcdae3 100644 +--- a/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py ++++ b/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py +@@ -207,10 +207,22 @@ def delta2bbox(rois, + deltas.size(-1) // 4) + stds = deltas.new_tensor(stds).view(1, -1).repeat(1, deltas.size(-1) // 4) + denorm_deltas = deltas * stds + means +- dx = denorm_deltas[..., 0::4] ++ '''dx = denorm_deltas[..., 0::4] + dy = denorm_deltas[..., 1::4] + dw = denorm_deltas[..., 2::4] +- dh = denorm_deltas[..., 3::4] ++ dh = denorm_deltas[..., 3::4]''' ++ if denorm_deltas.shape[2] > 4: ++ #please self fix when shape[2] > 4 ++ denorm_deltas = denorm_deltas.view(-1, 80, 4) ++ dx = denorm_deltas[:, :, 0:1:].view(-1, 80) ++ dy = denorm_deltas[:, :, 1:2:].view(-1, 80) ++ dw = denorm_deltas[:, :, 2:3:].view(-1, 80) ++ dh = denorm_deltas[:, :, 3:4:].view(-1, 80) ++ else: ++ dx = denorm_deltas[..., 0:1:] ++ dy = denorm_deltas[..., 1:2:] ++ dw = denorm_deltas[..., 2:3:] ++ dh = denorm_deltas[..., 3:4:] + + x1, y1 = rois[..., 0], rois[..., 1] + x2, y2 = rois[..., 2], rois[..., 3] +diff --git a/mmdet/models/dense_heads/anchor_head.py b/mmdet/models/dense_heads/anchor_head.py +index e7c975f5..e2d057e9 100644 +--- a/mmdet/models/dense_heads/anchor_head.py ++++ b/mmdet/models/dense_heads/anchor_head.py +@@ -9,6 +9,55 @@ from ..builder import HEADS, build_loss + from .base_dense_head import BaseDenseHead + from .dense_test_mixins import BBoxTestMixin + ++class BatchNMSOp(torch.autograd.Function): ++ @staticmethod ++ def forward(ctx, bboxes, scores, score_threshold, iou_threshold, max_size_per_class, max_total_size): ++ """ ++ boxes (torch.Tensor): boxes in shape (batch, N, 
C, 4). ++ scores (torch.Tensor): scores in shape (batch, N, C). ++ return: ++ nmsed_boxes: (1, N, 4) ++ nmsed_scores: (1, N) ++ nmsed_classes: (1, N) ++ nmsed_num: (1,) ++ """ ++ ++ # Phony implementation for onnx export ++ nmsed_boxes = bboxes[:, :max_total_size, 0, :] ++ nmsed_scores = scores[:, :max_total_size, 0] ++ nmsed_classes = torch.arange(max_total_size, dtype=torch.long) ++ nmsed_num = torch.Tensor([max_total_size]) ++ ++ return nmsed_boxes, nmsed_scores, nmsed_classes, nmsed_num ++ ++ @staticmethod ++ def symbolic(g, bboxes, scores, score_thr, iou_thr, max_size_p_class, max_t_size): ++ nmsed_boxes, nmsed_scores, nmsed_classes, nmsed_num = g.op('BatchMultiClassNMS', ++ bboxes, scores, score_threshold_f=score_thr, iou_threshold_f=iou_thr, ++ max_size_per_class_i=max_size_p_class, max_total_size_i=max_t_size, outputs=4) ++ return nmsed_boxes, nmsed_scores, nmsed_classes, nmsed_num ++ ++def batch_nms_op(bboxes, scores, score_threshold, iou_threshold, max_size_per_class, max_total_size): ++ """ ++ boxes (torch.Tensor): boxes in shape (N, 4). ++ scores (torch.Tensor): scores in shape (N, ). 
++ """ ++ ++ if bboxes.dtype == torch.float32: ++ bboxes = bboxes.reshape(bboxes.size(0), bboxes.shape[1].numpy(), -1, 4).half() ++ scores = scores.reshape(scores.size(0), scores.shape[1].numpy(), -1).half() ++ else: ++ bboxes = bboxes.reshape(bboxes.size(0), bboxes.shape[1].numpy(), -1, 4) ++ scores = scores.reshape(scores.size(0), scores.shape[1].numpy(), -1) ++ ++ nmsed_boxes, nmsed_scores, nmsed_classes, nmsed_num = BatchNMSOp.apply(bboxes, scores, ++ score_threshold, iou_threshold, max_size_per_class, max_total_size) ++ nmsed_boxes = nmsed_boxes.float() ++ nmsed_scores = nmsed_scores.float() ++ nmsed_classes = nmsed_classes.long() ++ dets = torch.cat((nmsed_boxes.reshape((bboxes.size(0), max_total_size, 4)), nmsed_scores.reshape((bboxes.size(0), max_total_size, 1))), -1) ++ labels = nmsed_classes.reshape((bboxes.size(0), max_total_size)) ++ return dets, labels + + @HEADS.register_module() + class AnchorHead(BaseDenseHead, BBoxTestMixin): +@@ -653,7 +702,10 @@ class AnchorHead(BaseDenseHead, BBoxTestMixin): + anchors = anchors.expand_as(bbox_pred) + # Always keep topk op for dynamic input in onnx + from mmdet.core.export import get_k_for_topk +- nms_pre = get_k_for_topk(nms_pre_tensor, bbox_pred.shape[1]) ++ #nms_pre = get_k_for_topk(nms_pre_tensor, bbox_pred.shape[1]) ++ nms_pre = bbox_pred.shape[1] ++ if nms_pre_tensor > 0 and bbox_pred.shape[1] > nms_pre_tensor: ++ nms_pre = nms_pre_tensor + if nms_pre > 0: + # Get maximum scores for foreground classes. 
+ if self.use_sigmoid_cls: +@@ -662,11 +714,14 @@ class AnchorHead(BaseDenseHead, BBoxTestMixin): + # remind that we set FG labels to [0, num_class-1] + # since mmdet v2.0 + # BG cat_id: num_class +- max_scores, _ = scores[..., :-1].max(-1) ++ scores_tmp = scores.permute(2, 1, 0) ++ max_scores, _ = scores_tmp[:-1, ...].max(0) ++ max_scores = max_scores.permute(1, 0) + + _, topk_inds = max_scores.topk(nms_pre) + batch_inds = torch.arange(batch_size).view( +- -1, 1).expand_as(topk_inds) ++ -1, 1).to(dtype=torch.int32).expand_as(topk_inds) ++ batch_inds = batch_inds.to(dtype=torch.int64) + anchors = anchors[batch_inds, topk_inds, :] + bbox_pred = bbox_pred[batch_inds, topk_inds, :] + scores = scores[batch_inds, topk_inds, :] +@@ -694,6 +749,8 @@ class AnchorHead(BaseDenseHead, BBoxTestMixin): + iou_threshold = cfg.nms.get('iou_threshold', 0.5) + score_threshold = cfg.score_thr + nms_pre = cfg.get('deploy_nms_pre', -1) ++ dets, labels = batch_nms_op(batch_mlvl_bboxes, batch_mlvl_scores, score_threshold, iou_threshold, cfg.max_per_img, cfg.max_per_img) ++ return dets, labels + return add_dummy_nms_for_onnx(batch_mlvl_bboxes, batch_mlvl_scores, + max_output_boxes_per_class, + iou_threshold, score_threshold, diff --git "a/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/\345\237\272\344\272\216detectron2\350\256\255\347\273\203\347\232\204npu\346\235\203\351\207\215\347\232\204maskrcnn_Onnx\346\250\241\345\236\213\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274.md" 
"b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/\345\237\272\344\272\216detectron2\350\256\255\347\273\203\347\232\204npu\346\235\203\351\207\215\347\232\204maskrcnn_Onnx\346\250\241\345\236\213\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274.md" similarity index 100% rename from "onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/\345\237\272\344\272\216detectron2\350\256\255\347\273\203\347\232\204npu\346\235\203\351\207\215\347\232\204maskrcnn_Onnx\346\250\241\345\236\213\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274.md" rename to "Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/\345\237\272\344\272\216detectron2\350\256\255\347\273\203\347\232\204npu\346\235\203\351\207\215\347\232\204maskrcnn_Onnx\346\250\241\345\236\213\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274.md" diff --git "a/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/\345\237\272\344\272\216\345\274\200\346\272\220mmdetection\351\242\204\350\256\255\347\273\203\347\232\204maskrcnn_Onnx\346\250\241\345\236\213\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274.md" 
"b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/\345\237\272\344\272\216\345\274\200\346\272\220mmdetection\351\242\204\350\256\255\347\273\203\347\232\204maskrcnn_Onnx\346\250\241\345\236\213\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274.md" similarity index 100% rename from "onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/\345\237\272\344\272\216\345\274\200\346\272\220mmdetection\351\242\204\350\256\255\347\273\203\347\232\204maskrcnn_Onnx\346\250\241\345\236\213\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274.md" rename to "Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/\345\237\272\344\272\216\345\274\200\346\272\220mmdetection\351\242\204\350\256\255\347\273\203\347\232\204maskrcnn_Onnx\346\250\241\345\236\213\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274.md" diff --git "a/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/.keep" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/nlp/.keep" similarity index 100% rename from "onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/.keep" rename to "Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/nlp/.keep" diff --git 
"a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/LICENSE" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/LICENSE" new file mode 100644 index 0000000000000000000000000000000000000000..56ee3c8c4cc2b4b32e0975d17258f9ba515fdbcc --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/LICENSE" @@ -0,0 +1,201 @@ + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. 
+ + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." 
+ + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. 
You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. 
+ + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. 
In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "[]" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. 
+ + Copyright [yyyy] [name of copyright owner] + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. \ No newline at end of file diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/README.md" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/README.md" new file mode 100644 index 0000000000000000000000000000000000000000..c5f37d0c9ef5f29c0ff618e9df896b417da223e9 --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/README.md" @@ -0,0 +1,53 @@ +$\color{red}{Note: strikethrough text is guidance for writing this README; all strikethrough notes must be deleted from the final README.md}$ + +# ReID-strong-baseline PyTorch Offline Inference Guide + +## 1 Environment Setup + +1. Install the required dependencies. The test environment may already have different versions of some of these libraries installed, so installing with this command is not recommended when testing manually: +``` +pip3.7 install -r requirements.txt +``` +~~Run scripts with the python3.7 command and install libraries with pip3.7. Use torch 1.5.0; if exporting the open-source model to onnx requires a torch version above 1.5.0, use 1.8.0 and note it here. onnx 1.9.0 may be used. requirements.txt must list the exact versions of every library this model's offline inference depends on, i.e. the versions used on the 310 inference server; common libraries include numpy, Pillow, and opencv-python. The atc tool currently supports onnx opset_version 11.~~ + + +2. Obtain, modify, and install the open-source model code: +``` +git clone https://github.com/michuanhaohao/reid-strong-baseline -b master +cd reid-strong-baseline +git reset 3da7e6f03164a92e696cb6da059b1cd771b0346d --hard +cd .. +``` +~~Prefer the open-source repository provided for this task and run offline inference against a fixed branch and commit id. The branch is usually master or a stable release; find the commit id among the github commits, typically the last commit of a stable release or the latest commit of the repository.~~ +~~If the open-source model code needs modification, apply the changes as a patch before installing: patch -p1 < ../{patch_name}.diff~~ +~~If the repository has no install script, add it to the search path with sys.path.append(r"./reid-strong-baseline"); the pth2onnx script can then reference functions and classes from the model code.~~ + +3. Obtain the weight file + +[market_resnet50_model_120_rank1_945.pth](https://drive.google.com/open?id=1hn0sXLZ5yJcxtmuY-ItQfYD7hBtHwt7A) +~~Prefer the weight file produced by training. If that file is available online, give the URL; otherwise state where to obtain it. If training provides no weights, use the weight file from the open-source repository. The weight file name must be given.~~ + +4. Dataset +[Download Market1501](http://www.liangzheng.org/Project/project_reid.html) and rename the directory to market1501 + +5. [Download the benchmark tool](https://support.huawei.com/enterprise/zh/ascend-computing/cann-pid-251168373/software/) +Place benchmark.x86_64 or benchmark.aarch64 in the current directory + +## 2 Offline Inference + +Run on the 310; while running, use npu-smi info to check device status and make sure the device is idle: +``` +bash test/pth2om.sh +bash test/eval_acc_perf.sh --datasets_path=/root/datasets +``` + **Evaluation results:** +| Model | Official pth accuracy | 310 offline inference accuracy | GPU throughput | 310 throughput | +| :------: | :------: | :------: | :------: | :------: | +| ReID-strong-baseline bs1 | [rank1:94.5% mAP:85.9%](https://github.com/michuanhaohao/reid-strong-baseline) | rank1:94.5% mAP:85.9% | 992.9994fps | 1446.188fps | +| ReID-strong-baseline bs16 | [rank1:94.5% mAP:85.9%](https://github.com/michuanhaohao/reid-strong-baseline) | rank1:94.5% mAP:85.9% | 2211.7074fps | 2588.56fps | + +Notes: +The onnx exported with TEST.NECK_FEAT "('before')" TEST.FEAT_NORM "('no')" supports offline inference. +The om converted from the onnx exported without TEST.NECK_FEAT "('before')" TEST.FEAT_NORM "('no')" matches the official accuracy. + + diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/ReID_postprocess.py"
"b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/ReID_postprocess.py" new file mode 100644 index 0000000000000000000000000000000000000000..0af4cb769d1e1e998e62e37aee621d542d94115a --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/ReID_postprocess.py" @@ -0,0 +1,73 @@ +# Copyright 2021 Huawei Technologies Co., Ltd +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ +import sys +sys.path.append('./reid-strong-baseline') +import os +import argparse +import glob +import re +import numpy as np +import torch +from utils.reid_metric import R1_mAP + +def get_pred_label(label_dir, pre_dir): + img_paths = glob.glob(os.path.join(label_dir, '*.jpg')) + pattern = re.compile(r'([-\d]+)_c(\d)') + + outputs = [] + for img_path in img_paths: + pid, camid = map(int, pattern.search(img_path).groups()) + if pid == -1: continue # junk images are just ignored + camid -= 1 # index starts from 0 + + filename = img_path.split("/")[-1] + if filename[-8:] == ".jpg.jpg": + bin_file = filename[:-8] + "_1.bin" + else: + bin_file = filename[:-4] + "_1.bin" + output = np.fromfile(os.path.join(pre_dir, bin_file), dtype="float32") + output = torch.from_numpy(output) + output = output.unsqueeze(0) + + pid = torch.from_numpy(np.array([pid,])) + camid = torch.from_numpy(np.array([camid,])) + outputs.append((output, pid, camid)) + + return outputs + +def evaluate(query_dir, gallery_dir, pred_dir): + + query = get_pred_label(query_dir, pred_dir) + gallery = get_pred_label(gallery_dir, pred_dir) + outputs = query + gallery + + num_query = 3368 + metric = R1_mAP(num_query, max_rank=50, feat_norm="yes") # named to avoid shadowing the eval builtin + metric.reset() + for output in outputs: + metric.update(output) + cmc, mAP = metric.compute() + print('Validation Results') + print("mAP: {:.1%}".format(mAP)) + for r in [1, 5, 10]: + print("CMC curve, Rank-{:<3}:{:.1%}".format(r, cmc[r - 1])) + +if __name__ == '__main__': + parser = argparse.ArgumentParser() + parser.add_argument("--query_dir", default="./data/market1501/query") + parser.add_argument("--gallery_dir", default="./data/market1501/bounding_box_test") + parser.add_argument("--pred_dir", default="./result/dumpOutput_device0/") + args = parser.parse_args() + evaluate(args.query_dir, args.gallery_dir, args.pred_dir) diff --git
"a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/ReID_preprocess.py" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/ReID_preprocess.py" new file mode 100644 index 0000000000000000000000000000000000000000..c296fb121ebb4f91f918d885d51b9288c451d70e --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/ReID_preprocess.py" @@ -0,0 +1,56 @@ +# Copyright 2021 Huawei Technologies Co., Ltd +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ +import os +import sys +import numpy as np +from PIL import Image +from torchvision import transforms +import multiprocessing + +preprocess = transforms.Compose([ + transforms.Resize([256, 128]), + transforms.ToTensor(), + transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]) +]) + +def gen_input_bin(file_batches, batch): + i = 0 + for file in file_batches[batch]: + if ".db" in file: + continue + i = i + 1 + print("batch", batch, file, "===", i) + + input_image = Image.open(os.path.join(src_path, file)).convert('RGB') + input_tensor = preprocess(input_image) + img = np.array(input_tensor).astype(np.float32) + img.tofile(os.path.join(save_path, file.split('.')[0] + ".bin")) + +def ReID_preprocess(src_path, save_path): + files = os.listdir(src_path) + file_batches = [files[i:i + 500] for i in range(0, len(files), 500) if files[i:i + 500] != []] # batch over all files, not a hardcoded 50000 + thread_pool = multiprocessing.Pool(len(file_batches)) + for batch in range(len(file_batches)): + thread_pool.apply_async(gen_input_bin, args=(file_batches, batch)) + thread_pool.close() + thread_pool.join() + print("Exceptions inside worker processes are not reported; please verify that the bin files were generated.") + +if __name__ == '__main__': + src_path = sys.argv[1] + save_path = sys.argv[2] + if not os.path.isdir(save_path): + os.makedirs(os.path.realpath(save_path)) + ReID_preprocess(src_path, save_path) diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/ReID_pth2onnx.py" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/ReID_pth2onnx.py" new file mode 100644 index 0000000000000000000000000000000000000000..659451d4ade18d31b45aecf33889d3b548e4c5ba --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/ReID_pth2onnx.py" @@ -0,0 +1,63 @@ +# Copyright 2021 Huawei Technologies Co., Ltd +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License.
+ +import os +import sys +import argparse +import torch +import torch.onnx +sys.path.append('./reid-strong-baseline') +from config import cfg +from modeling import build_model + +from collections import OrderedDict +def proc_nodes_module(checkpoint): + new_state_dict = OrderedDict() + for k, v in checkpoint.items(): + if "classifier" in k: + continue + new_state_dict[k] = v + return new_state_dict + +def main(): + parser = argparse.ArgumentParser(description="ReID Baseline Inference") + parser.add_argument( + "--config_file", default="", help="path to config file", type=str + ) + parser.add_argument("opts", help="Modify config options using the command-line", default=None, + nargs=argparse.REMAINDER) + args = parser.parse_args() + + cfg.merge_from_file(args.config_file) + cfg.merge_from_list(args.opts) + cfg.freeze() + + num_classes = 751 + model = build_model(cfg, num_classes) + checkpoint = torch.load(cfg.TEST.WEIGHT, map_location='cpu') + #checkpoint = proc_nodes_module(checkpoint) + model.load_state_dict(checkpoint) + model.eval() + + input_names = ["image"] + output_names = ["class"] + dynamic_axes = {'image': {0: '-1'}, 'class': {0: '-1'}} + dummy_input = torch.randn(1, 3, 256, 128) + export_onnx_file = "ReID.onnx" + + torch.onnx.export(model, dummy_input, export_onnx_file, input_names=input_names, dynamic_axes=dynamic_axes, + output_names=output_names, opset_version=11, verbose=True) + +if __name__ == '__main__': + main() diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/env.sh" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/env.sh" new file mode 100644 index 0000000000000000000000000000000000000000..49be8f16a045f6e4e61666ecc8a5da811c15fd80 
--- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/env.sh" @@ -0,0 +1,8 @@ +#! /bin/bash + +export install_path=/usr/local/Ascend/ascend-toolkit/latest +export PATH=/usr/local/python3.7.5/bin:${install_path}/atc/ccec_compiler/bin:${install_path}/atc/bin:$PATH +export PYTHONPATH=${install_path}/atc/python/site-packages:$PYTHONPATH +export LD_LIBRARY_PATH=${install_path}/atc/lib64:${install_path}/acllib/lib64:$LD_LIBRARY_PATH +export ASCEND_OPP_PATH=${install_path}/opp +export ASCEND_AICPU_PATH=/usr/local/Ascend/ascend-toolkit/latest diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/gen_dataset_info.py" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/gen_dataset_info.py" new file mode 100644 index 0000000000000000000000000000000000000000..f80f45a34c450d57f0ea49d93167892d93a30e88 --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/gen_dataset_info.py" @@ -0,0 +1,60 @@ +# Copyright 2021 Huawei Technologies Co., Ltd +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import os +import sys +import cv2 +from glob import glob + + +def get_bin_info(file_path, info_name, width, height): + bin_images = glob(os.path.join(file_path, '*.bin')) + with open(info_name, 'w') as file: + for index, img in enumerate(bin_images): + content = ' '.join([str(index), img, width, height]) + file.write(content) + file.write('\n') + + +def get_jpg_info(file_path, info_name): + extensions = ['jpg', 'jpeg', 'JPG', 'JPEG'] + image_names = [] + for extension in extensions: + image_names.append(glob(os.path.join(file_path, '*.' + extension))) + with open(info_name, 'w') as file: + for image_name in image_names: + if len(image_name) == 0: + continue + else: + for index, img in enumerate(image_name): + img_cv = cv2.imread(img) + shape = img_cv.shape + width, height = shape[1], shape[0] + content = ' '.join([str(index), img, str(width), str(height)]) + file.write(content) + file.write('\n') + + +if __name__ == '__main__': + file_type = sys.argv[1] + file_path = sys.argv[2] + info_name = sys.argv[3] + if file_type == 'bin': + assert len(sys.argv) == 6, 'The number of input parameters must be equal to 5' + width = sys.argv[4] + height = sys.argv[5] + get_bin_info(file_path, info_name, width, height) + elif file_type == 'jpg': + assert len(sys.argv) == 4, 'The number of input parameters must be equal to 3' + get_jpg_info(file_path, info_name) diff --git
"a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/modelzoo_level.txt" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/modelzoo_level.txt" new file mode 100644 index 0000000000000000000000000000000000000000..9e95396651cc4382fe60ee1ee053674f527a448c --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/modelzoo_level.txt" @@ -0,0 +1,4 @@ +FuncStatus:OK +PrecisionStatus:OK +AutoTune:OK +PerfStatus:POK \ No newline at end of file diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/requirements.txt" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/requirements.txt" new file mode 100644 index 0000000000000000000000000000000000000000..4cda32191fe9ee542596389f75f4377bd38521d8 --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/requirements.txt" @@ -0,0 +1,8 @@ +torch == 1.5.0 +torchvision == 0.6.0 +onnx == 1.7.0 +numpy == 1.20.3 +Pillow == 8.2.0 +opencv-python == 4.5.2.54 +yacs == 0.1.8 +pytorch-ignite == 0.4.5 \ No newline at end of file diff --git 
"a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/test/eval_acc_perf.sh" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/test/eval_acc_perf.sh" new file mode 100644 index 0000000000000000000000000000000000000000..1bfa7cf5aa72efe9118b025030f52d9c17b89248 --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/test/eval_acc_perf.sh" @@ -0,0 +1,72 @@ +#!/bin/bash + +datasets_path="/root/datasets/" + +for para in $* +do + if [[ $para == --datasets_path* ]]; then + datasets_path=`echo ${para#*=}` + fi +done + +arch=`uname -m` +rm -rf ./prep_dataset_query +rm -rf ./prep_dataset_gallery +python3.7 ReID_preprocess.py ${datasets_path}/market1501/query ./prep_dataset_query +python3.7 ReID_preprocess.py ${datasets_path}/market1501/bounding_box_test ./prep_dataset_gallery +mv prep_dataset_gallery/* prep_dataset_query/ +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +python3.7 gen_dataset_info.py bin ./prep_dataset_query ./prep_bin.info 128 256 +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +source env.sh +rm -rf result/dumpOutput_device0 +./benchmark.${arch} -model_type=vision -device_id=0 -batch_size=1 -om_path=./ReID_bs1.om -input_text_path=./prep_bin.info -input_width=128 -input_height=256 -output_binary=True -useDvpp=False +if [ $? != 0 ]; then + echo "fail!" 
+ exit -1 +fi +rm -rf result/dumpOutput_device1 +./benchmark.${arch} -model_type=vision -device_id=1 -batch_size=16 -om_path=./ReID_bs16.om -input_text_path=./prep_bin.info -input_width=128 -input_height=256 -output_binary=True -useDvpp=False +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +python3.7 ReID_postprocess.py --query_dir=${datasets_path}/market1501/query --gallery_dir=${datasets_path}/market1501/bounding_box_test --pred_dir=./result/dumpOutput_device0 > result_bs1.json +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +python3.7 ReID_postprocess.py --query_dir=${datasets_path}/market1501/query --gallery_dir=${datasets_path}/market1501/bounding_box_test --pred_dir=./result/dumpOutput_device1 > result_bs16.json +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +echo "====accuracy data====" +python3.7 test/parse.py result_bs1.json +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +python3.7 test/parse.py result_bs16.json +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +echo "====performance data====" +python3.7 test/parse.py result/perf_vision_batchsize_1_device_0.txt +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi +python3.7 test/parse.py result/perf_vision_batchsize_16_device_1.txt +if [ $? != 0 ]; then + echo "fail!" 
+ exit -1 +fi +echo "success" \ No newline at end of file diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/test/parse.py" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/test/parse.py" new file mode 100644 index 0000000000000000000000000000000000000000..6d5a1293288dce8bfc70f79ecf6551acafed6b81 --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/test/parse.py" @@ -0,0 +1,33 @@ +# Copyright 2021 Huawei Technologies Co., Ltd +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+
+import sys
+import json
+import re
+
+if __name__ == '__main__':
+    if sys.argv[1].endswith('.json'):
+        result_json = sys.argv[1]
+        with open(result_json, 'r') as f:
+            content = f.read()
+        # json精度结果:直接打印文件内容
+        print(content)
+    elif sys.argv[1].endswith('.txt'):
+        result_txt = sys.argv[1]
+        with open(result_txt, 'r') as f:
+            content = f.read()
+        txt_data_list = [i.strip() for i in re.findall(r':(.*?),', content.replace('\n', ',') + ',')]
+        # 310单卡有4颗device,单卡吞吐率=单device吞吐率*4(字段下标对应benchmark结果文件的字段顺序)
+        fps = float(txt_data_list[7].replace('samples/s', '')) * 4
+        print('310 bs{} fps:{}'.format(result_txt.split('_')[3], fps))
\ No newline at end of file
diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/test/perf_g.sh" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/test/perf_g.sh"
new file mode 100644
index 0000000000000000000000000000000000000000..a1c5a64463d3165909d67bffe256b4f69732e50e
--- /dev/null
+++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/test/perf_g.sh"
@@ -0,0 +1,19 @@
+trtexec --onnx=ReID.onnx
--fp16 --shapes=image:16x3x256x128 > ReID_bs16.log +perf_str=`grep "GPU.* mean.*ms$" ReID_bs16.log` +if [ -n "$perf_str" ]; then + perf_num=`echo $perf_str | awk -F' ' '{print $16}'` +else + perf_str=`grep "mean.*ms$" ReID_bs16.log` + perf_num=`echo $perf_str | awk -F' ' '{print $4}'` +fi +awk 'BEGIN{printf "t4 bs16 fps:%.3f\n", 1000*1/('$perf_num'/16)}' \ No newline at end of file diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/test/pth2om.sh" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/test/pth2om.sh" new file mode 100644 index 0000000000000000000000000000000000000000..393275a3c0ecde085fcc643e29e828d28d81dd67 --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/official/cv/ReID/ReID-strong-baseline/test/pth2om.sh" @@ -0,0 +1,19 @@ +#!/bin/bash + +rm -rf ReID.onnx +python3.7 ReID_pth2onnx.py --config_file='reid-strong-baseline/configs/softmax_triplet_with_center.yml' MODEL.PRETRAIN_CHOICE "('self')" TEST.WEIGHT "('market_resnet50_model_120_rank1_945.pth')" TEST.NECK_FEAT "('before')" TEST.FEAT_NORM "('no')" +if [ $? != 0 ]; then + echo "fail!" + exit -1 +fi + +rm -rf ReID_bs1.om ReID_bs16.om +source env.sh +atc --framework=5 --model=ReID.onnx --output=ReID_bs1 --input_format=NCHW --input_shape="image:1,3,256,128" --log=debug --soc_version=Ascend310 +atc --framework=5 --model=ReID.onnx --output=ReID_bs16 --input_format=NCHW --input_shape="image:16,3,256,128" --log=debug --soc_version=Ascend310 --auto_tune_mode="RL,GA" + +if [ -f "ReID_bs1.om" ] && [ -f "ReID_bs16.om" ]; then + echo "success" +else + echo "fail!" 
+fi \ No newline at end of file diff --git "a/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/.keep" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/research/.keep" similarity index 100% rename from "onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/.keep" rename to "Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/ONNX\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274/research/.keep" diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-FAQ.md" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-FAQ.md" new file mode 100644 index 0000000000000000000000000000000000000000..09a53c3361b790daa6908814da0a7c180a9b32de --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-FAQ.md" @@ -0,0 +1,286 @@ +# Ascend PyTorch模型推理常见问题FAQ +- [1 介绍](#1-介绍) +- [2 常见问题FAQ](#2-常见问题FAQ) + - [2.1 NPU模型打通常见问题FAQ](#21-NPU模型打通常见问题FAQ) + - [2.2 NPU模型精度调试常见问题FAQ](#22-NPU模型精度调试常见问题FAQ) + - [2.3 NPU模型性能优化常见问题FAQ](#23-NPU模型性能优化常见问题FAQ) + +# [1 介绍](#1-介绍) + + 本文目标读者为Ascend模型离线推理开发者,用于指导开发者在昇腾版本的CANN包下,实现模型推理精度性能达标。这里仅列举模型离线推理中遇到的常见问题与解决方法,持续更新。 + + +# [2 常见问题FAQ](#2-常见问题FAQ) + +## [2.1 NPU模型打通常见问题FAQ](#21-NPU模型打通常见问题FAQ) + +### FAQ1、需要提供哪些交付件,如何交付? +交付请参考《推理指导》6.2 交付标准与规范 +交付件样例:https://gitee.com/ascend/modelzoo/tree/master/built-in/ACL_PyTorch/Benchmark/cv/classification/ResNext50 + +### FAQ2、装有Ascend 310卡的服务器环境如何使用? +提供的装有Ascend 310卡的服务器已经安装好ascend的包,服务器home/common/resnext50的样例是可以运行的 + +### FAQ3、推理与训练的关系? 
+模型推理与训练是相互独立的工作,推理比训练简单一些。推理是使用Ascend 910训练得到的权重(或模型开源代码仓提供的权重)在310上执行推理,一般可以在训练等待结果的时间内同步进行。
+
+### FAQ4、推理工作量?
+开始之前需要先熟悉相关工作,然后进行模型推理;如果精度或性能不达标,还需要花费不少时间定位。模型推理到验收之间还有检视、整改、测试资料文档等工作,并非三天就能完成,从开始到验收完成整个周期规划为1个月~1.5个月。
+
+### FAQ5、推理过程中哪些工作在310服务器上做,哪些在t4服务器上做,哪些在cpu上做?
+前后处理与转onnx在cpu上做即可;转om模型与benchmark推理的命令需在装有Ascend 310卡的服务器上执行,因为这两个命令依赖Ascend CANN包提供的编译工具与npu算子库;gpu性能数据需要在装有t4卡的服务器上测。
+
+### FAQ6、预训练权重文件选择的问题?
+如果已经有Ascend 910训练提供的权重文件,优先使用它做离线推理,精度与910训练出的精度对齐。
+如果开源代码仓提供了多个权重文件,使用常用的基础配置对应的权重文件即可;模型支持多任务时,只需针对一个任务做推理。
+如果开源代码仓没有提供pth权重文件,则需要该模型的训练同学提供pth权重文件,或者使用开源代码仓训练脚本简单训练一个pth权重文件,然后对比om精度与该pth权重文件的精度。
+
+### FAQ7、精度与性能需要测试哪些batch?
+如果模型支持多batch,需要测试batch 1,4,8,16,32的精度与性能并写在README.md里;模型测试脚本与提交代码的描述只需提供bs1和bs16的精度性能数据。
+
+### FAQ8、onnx不能推理,t4性能如何测?
+如果导出的onnx因包含自定义算子等原因不能推理,则在t4上运行开源评测脚本,测试pth模型的在线推理性能。
+
+### FAQ9、om性能如何测?
+测试时需要确保device上只运行这一个测试任务,可使用npu-smi info查看device是否空闲。
+由于随机数可能无法模拟真实数据分布,Ascend benchmark工具纯推理功能测得的部分模型性能数据可能不准,因此模型测试脚本与提交代码描述中的性能数据,以Ascend benchmark在数据集上推理时得到的性能数据为准。
+
+### FAQ10、导出onnx脚本的dynamic_axes与onnx的输入shape(-1,3,224,224)中的-1是什么意思?
+用可视化工具netron查看如下方式导出的onnx模型,其输入shape是(-1,3,224,224)。-1代表onnx模型是动态batch的,用tensorRT在t4上测onnx性能时可以指定任意batch的输入(batch,3,224,224)。dynamic_axes是动态batch参数,'image': {0: '-1'}表示输入image的第0维即batch维是动态的:
+```
+    input_names = ["image"]
+    output_names = ["class"]
+    dynamic_axes = {'image': {0: '-1'}, 'class': {0: '-1'}}
+    dummy_input = torch.randn(1, 3, 224, 224)
+    torch.onnx.export(model, dummy_input, output_file, input_names = input_names, dynamic_axes = dynamic_axes, output_names = output_names, opset_version=11, verbose=True)
+```
+无论onnx模型的batch是多少,onnx转om时只要通过--input_shape指定batch为正整数,就能得到对应batch的om模型。om虽然支持动态batch,但我们目前不使用动态batch的om模型:
+```
+atc --framework=5 --model=./resnext50.onnx --input_format=NCHW --input_shape="image:16,3,224,224" --output=resnext50_bs16 --log=debug --soc_version=Ascend310
+```
+注意,像shufflenetv1等一些模型其实不支持动态batch,转换为固定batch的om时,除了--input_shape指定相同的batch,还需要用相同batch的onnx模型来转换,否则会报错。
+
+### FAQ11、atc命令失败时如何查看日志?
+```
+export ASCEND_SLOG_PRINT_TO_STDOUT=1
+export ASCEND_GLOBAL_LOG_LEVEL=0 #debug 0 --> info 1 --> warning 2 --> error 3
+然后执行atc ... > atc.log
+```
+
+### FAQ12、模型代码包含不能导出onnx的算子时如何解决-等价替换为自定义算子?
+pytorch的adaptive_avg_pool2d算子目前onnx还不支持,因此导出onnx时会报错。首先可以尝试用avg_pool2d替换adaptive_avg_pool2d,但当input最后两维不是output的整数倍时,二者并不完全等价。由于npu有adaptive_avg_pool2d算子的实现,此时的解决方案是将adaptive_avg_pool2d改为自定义算子再导出onnx。自定义算子不需要具体实现代码,只要其返回的输出shape与原算子输出的shape保持一致即可,相当于onnx里只包含这个算子的声明(数据类型与属性需要与npu版算子对应);因此导出的onnx不能使用onnxruntime进行推理,并且需要将pytorch的_check_onnx_proto(proto)改为pass,以跳过导出onnx时的检查。onnx转om时,如果atc工具的onnx插件支持该算子,atc会根据这个声明找到该算子的npu实现。
+查看npu的adaptive_avg_pool2d声明:
+```
+REG_OP(AdaptiveAvgPool2d)
+    .INPUT(x, TensorType({DT_FLOAT, DT_FLOAT16}))
+    .OUTPUT(y, TensorType({DT_FLOAT, DT_FLOAT16}))
+    .REQUIRED_ATTR(output_size, ListInt)
+    .OP_END_FACTORY_REG(AdaptiveAvgPool2d)
+```
+修改模型代码,将adaptive_avg_pool2d改为自定义算子,然后导出onnx,其中output_size_i代表int64类型的算子属性:
+```
+class AdaptiveAvgPoolOp(torch.autograd.Function):
+
+    @staticmethod
+    def forward(ctx, x, output_size):
+        out = torch.randn(x.shape[0], x.shape[1], output_size[0], output_size[1]).to(x.dtype)
+        return out
+
+    @staticmethod
+    def symbolic(g, x, output_size):
+        out = g.op('AdaptiveAvgPool2d', x, output_size_i = output_size)
+        return out
+
+def adaptive_avg_pool_op(x, output_size):
+    out = AdaptiveAvgPoolOp.apply(x, output_size)
+    return out
+
+x = F.adaptive_avg_pool2d(input, output_size=bin_size)替换为x = adaptive_avg_pool_op(input, (bin_size, bin_size))
+```
+
+### FAQ13、运行atc或benchmark命令时报错找不到atc命令或找不到ascend动态库
+
+* 现象描述
+
+  ```
+  Command 'atc' not found, but can be installed with:
+  or
+  ./benchmark.x86_64: error while loading shared libraries: libascendcl.so: cannot open shared object file: No such file or directory
+  ```
+
+* 原因分析
+
+  当环境变量未设置或者无效时,会出现上述错误。
+
+* 处理方法
+
+  设置环境变量:
+  ```
+  export install_path=/usr/local/Ascend/ascend-toolkit/latest
+  export PATH=/usr/local/python3.7.5/bin:${install_path}/atc/ccec_compiler/bin:${install_path}/atc/bin:$PATH
+  export PYTHONPATH=${install_path}/atc/python/site-packages:$PYTHONPATH
+  export
LD_LIBRARY_PATH=${install_path}/atc/lib64:${install_path}/acllib/lib64:$LD_LIBRARY_PATH
+  export ASCEND_OPP_PATH=${install_path}/opp
+  export ASCEND_AICPU_PATH=/usr/local/Ascend/ascend-toolkit/latest
+  ```
+  若是普通用户登录装有Ascend310卡的服务器,需要使用sudo执行命令,并且
+  ```
+  修改/etc/sudoers将Defaults env_reset改成Defaults !env_reset
+  修改/etc/bash.bashrc添加alias sudo='sudo env PATH=$PATH LD_LIBRARY_PATH=$LD_LIBRARY_PATH'
+  ```
+
+### FAQ14、推理性能不达标,profiling显示TransData算子耗时,参考如下方案优化
+(1)修改five_2_four.py优化方法
+  在环境变量env.sh中export的install_path=/usr/local/Ascend/ascend-toolkit/latest路径下查找five_2_four.py文件,路径一般为
+```
+/usr/local/Ascend/ascend-toolkit/latest/x86_64-linux/opp/op_impl/built-in/ai_core/tbe/impl/five_2_four.py
+```
+
+修改five_2_four.py文件,将TransData算子的output shape加入five_2_four函数中对应输出格式分支的dst_shape列表里,示例如下:
+```
+from impl import trans_data_negative_target_ntc
+
+@util.check_input_type(dict, dict, str, str, str)
+def five_2_four(src, dst, src_format, dst_format, kernel_name='five_2_four'):
+    ...
+    elif dst_format.lower() == "nhwc" and dst_shape in [[10000, 63, 63, 1], [10000, 127, 127, 1], [16, 19, 19, 486],
+                                                        [16, 10, 10, 486], [16, 38, 38, 324], [16, 5, 5, 486],
+                                                        [16, 3, 3, 324], [8, 19, 19, 486], [8, 10, 10, 486],
+                                                        [8, 38, 38, 324], [8, 5, 5, 486], [8, 3, 3, 324],
+                                                        [100, 28, 28, 91]]:
+        trans_data_negative_target_tc.trans_data_negative_target_tc(src, dst, src_format, dst_format, kernel_name)
+    elif dst_format.lower() == "nchw" and dst_shape in [[2560, 512, 4, 26], [2560, 512, 1, 26], [2560, 256, 8, 25],
+                                                        [16, 240, 7, 7], [16, 120, 14, 14], [1,19,1024,2048], [4,19,1024,2048]]:
+        print("=================================")
+        print("ntc dst shape:", dst_shape)
+        print("=================================")
+        trans_data_negative_target_ntc.trans_data_negative_target_ntc(src, dst, src_format, dst_format, kernel_name)
+    ...
+```
+- 不同的batch_size,添加的shape不一样。shape大小为[*,19,1024,2048],以某模型为例,只测试batch1和batch4,因此添加的shape为[1,19,1024,2048]和[4,19,1024,2048]
+
+修改完成后,重新转换生成om文件,atc转换过程会打印添加的日志,如下:
+```
+ATC start working now, please wait for a moment.
+=================================
+ntc dst shape: [1, 19, 1024, 2048]
+=================================
+=================================
+ntc dst shape: [1, 19, 1024, 2048]
+=================================
+ATC run success, welcome to the next use.
+W11001: High-priority service of op[PartitionedCall_AvgPool_45_2] is invalid, low-priority service is used. It can work normally but may affect performance.
+W11001: High-priority service of op[PartitionedCall_AvgPool_52_6] is invalid, low-priority service is used. It can work normally but may affect performance.
+```
+(2)output_node输出节点类型更改为float16
+atc转换时指定输出节点类型为float16:
+```
+atc --framework=5 --model=./ICNet.onnx --output=ICNet_bs1 --out_nodes="Resize_317:0" --output_type=FP16 --input_format=NCHW --input_shape="actual_input_1: 1,3,1024,2048" --log=debug --soc_version=Ascend310
+```
+
+### FAQ15、onnx转om模型报错atc命令ERROR问题解决
+* 现象描述
+  ```
+  ATC run failed,please check the detail log. try 'atc --help'
+  E19999: Inter Error!
+  Unknown error occurred,please check the log.
+  ```
+  1. 设置环境变量
+  ```
+  export install_path=/usr/local/Ascend/ascend-toolkit/latest
+  export PATH=/usr/local/python3.7.5/bin:${install_path}/atc/ccec_compiler/bin:${install_path}/atc/bin:$PATH
+  export PYTHONPATH=${install_path}/atc/python/site-packages:$PYTHONPATH
+  export LD_LIBRARY_PATH=${install_path}/atc/lib64:${install_path}/acllib/lib64:$LD_LIBRARY_PATH
+  export ASCEND_OPP_PATH=${install_path}/opp
+  ```
+  2. 更新到最新的CANN run包
+
+  3.
打印host日志
+  ```
+  export ASCEND_SLOG_PRINT_TO_STDOUT=1
+  [WARNING] TBE(3112,atc.bin):2021-05-25-15:20:33.329.360 [image_ops.cc:2146][OP_PROTO] ResizeNearestInferShape:2146 OpName:[Resize_140] "Get constValue failed of [sizes]"
+  [ERROR] TBE(3112,atc.bin):2021-05-25-15:20:33.329.371 [image_ops.cc:2084][OP_PROTO] CalculateSizeOut:2084 OpName:[Resize_140] "length of scale_out after erase must be equal to 2"
+  [ERROR] TBE(3112,atc.bin):2021-05-25-15:20:33.329.376 [image_ops.cc:2155][OP_PROTO] ResizeNearestInferShape:2155 OpName:[Resize_140] "calculate size out failed."
+  [ERROR] GE(3112,atc.bin):2021-05-25-15:20:33.329.391 [op_desc.cc:1345]3112 CallInferFunc: ErrorNo: -1(failed) [COMP][PRE_OPT]Resize_140 call infer func. ret: 4294967295
+  [ERROR] GE(3112,atc.bin):2021-05-25-15:20:33.329.397 [shape_refiner.cc:766]3112 InferShapeAndType: ErrorNo: -1(failed) [COMP][PRE_OPT]Resize_140 call infer function failed.
+  ```
+  从日志得出的结论为:onnx模型中Resize算子的sizes取不到常量值(Get constValue failed),需要先对onnx模型进行优化转换,这里采用onnx-simplifier工具。
+  安装onnx-simplifier:
+  pip3 install onnx-simplifier
+  简化onnx模型:
+  python3 -m onnxsim ./hrnet_w18.onnx ./hrnet_w18_1.onnx --input-shape "16,3,224,224"
+  转换完成再执行如下命令:
+  ```
+  atc --framework=5 --model=./hrnet_w18_1.onnx --input_format=NCHW --input_shape="image:16,3,224,224" --output=hrnet_bs16 --log=debug --soc_version=Ascend310
+  ```
+  onnx转om成功。
+
+### FAQ16、离线推理后处理脚本适配
+  对于一些图像分类的模型,后处理脚本是通用的;而有些模型(比如分割类)没有现成的后处理脚本,需要读者自行适配。
+(1)源码中包含在线推理脚本(如evaluate.py)或测试类脚本(如test.py)
+基于这两个脚本适配,一般脚本中都包含类似的model语句
+```
+outputs = model(image)
+```
+benchmark离线推理得到的./result/dumpOutput_device0/数据就可以理解为在线推理中model(image)这一步的输出,适配过程就是从./result/dumpOutput_device0/中按照对应的名字将数据读取出来,适配代码参考如下:
+```
+outputs = self.file2tensor(annotation_file)
+
+# 生成的是bin文件
+def file2tensor(self, annotation_file):
+
+    filepath = annotation_file + '_1.bin'
+    size = os.path.getsize(filepath)
+    res = []
+    L = int(size/4)  #由于需要的是float32类型,所以按照4字节读取;根据实际情况按字节读取
+    binfile = open(filepath, 'rb')
+    for i in range(L):
+        data =
binfile.read(4)
+        num = struct.unpack('f', data)
+        res.append(num[0])
+    binfile.close()
+
+    dim_res = np.array(res).reshape(1,19,1024,2048)  #转换为对应的shape,可通过在线推理打印outputs的shape获取到
+    tensor_res = torch.tensor(dim_res, dtype=torch.float32)
+    print(filepath, tensor_res.dtype, tensor_res.shape)
+
+    return tensor_res
+```
+(2)如上的文件都没有,可以参考训练过程的validation步骤进行适配,适配方法同上。
+
+
+### FAQ17、执行数据集预处理报错
+```
+python3.7 imagenet_torch_preprocess.py /opt/npu/imagenet/val ./pre_dataset
+```
+报错如下
+```
+PIL.UnidentifiedImageError: cannot identify image file '/opt/npu/imagenet/val/xxxx.jpeg'
+```
+出现这个报错代表图片文件损坏。
+解决方法:更换未损坏的val数据集。
+
+
+## [2.2 NPU模型精度调试常见问题FAQ](#22-NPU模型精度调试常见问题FAQ)
+
+  1.确认前后处理与模型参数是否与开源代码仓推理时使用的完全一致
+  2.使用开源代码仓提供的测评pth的脚本测试pth在线推理精度是否达标,可以添加算子输出结果的调试打印
+  3.如果导出的onnx可以推理,确定onnx精度是否达标
+  4.如果是om算子导致精度下降,则模型转换时指定该算子为om的输出节点,然后与pth在线推理时该算子的输出对比(开启verbose导出onnx时会打印算子对应的py文件代码行),查看是否一致
+  5.如果是某算子导致精度下降,尝试修改模型用其它方法替换掉该算子,再看精度是否达标;如果遇到实在规避不了的算子问题,则需要在modelzoo提issue
+参考《推理指导》的4.5 maskrcnn端到端推理指导案例
+
+
+## [2.3 NPU模型性能优化常见问题FAQ](#23-NPU模型性能优化常见问题FAQ)
+
+  1.优化onnx模型:去掉影响性能的冗余pad,尝试Ascend atc的相关优化选项,尝试用最近邻替换双线性的resize并重新训练,或降低图片分辨率等,使性能达标。
+  2.对于算子导致的性能问题,需要使用profiling分析定位引起性能下降的原因,具体到引起性能下降的算子。优先修改模型代码,用性能好的npu算子替换性能差的npu算子使性能达标,然后在modelzoo上提issue,等修复版本发布后再重测性能,继续优化。
+  3.需要交付profiling性能数据。对经过上述方法性能可以达标的模型,在交付文档中写明问题原因与达标需要执行的操作;对经过上述方法性能仍不达标的模型,在交付的README.md文档中写明问题原因与简要的定位过程。
+
diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-XxxxNet\347\275\221\347\273\234\346\250\241\345\236\213[\344\272\244\344\273\230\345\206\205\345\256\271]\346\265\213\350\257\225\346\212\245\345\221\212.docx"
"b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-XxxxNet\347\275\221\347\273\234\346\250\241\345\236\213[\344\272\244\344\273\230\345\206\205\345\256\271]\346\265\213\350\257\225\346\212\245\345\221\212.docx" new file mode 100644 index 0000000000000000000000000000000000000000..870d22278c0a820f5a2ae736c4696015ce467a11 Binary files /dev/null and "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-XxxxNet\347\275\221\347\273\234\346\250\241\345\236\213[\344\272\244\344\273\230\345\206\205\345\256\271]\346\265\213\350\257\225\346\212\245\345\221\212.docx" differ diff --git "a/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/.keep" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-Xxx\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274\344\271\246.docx" similarity index 100% rename from "onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/.keep" rename to "Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-Xxx\346\250\241\345\236\213\346\216\250\347\220\206\346\214\207\345\257\274\344\271\246.docx" diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-Xxx\346\250\241\345\236\213\346\265\213\350\257\225\346\212\245\345\221\212.docx" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-Xxx\346\250\241\345\236\213\346\265\213\350\257\225\346\212\245\345\221\212.docx" new file mode 100644 
index 0000000000000000000000000000000000000000..fd5fb68b511e1abc0ed06f9b2da249026fa813cb Binary files /dev/null and "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-Xxx\346\250\241\345\236\213\346\265\213\350\257\225\346\212\245\345\221\212.docx" differ diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-issue\346\250\241\346\235\277.md" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-issue\346\250\241\346\235\277.md" new file mode 100644 index 0000000000000000000000000000000000000000..d47164e36c4d1d3c7ab1cf5234ea6fb06bacc3f0 --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-issue\346\250\241\346\235\277.md" @@ -0,0 +1,21 @@ +标题: +[众智-PyTorch离线推理] [问题求助] - xx算子耗时长 + + +一、问题现象(附截图): +xx模型迁移到Ascend310上,pytorch->onnx->om,模型性能不达标,原因为算子性能差,profiling数据截图如下: + + + + +二、软件版本: +-- Pytorch 版本 (源码或二进制): +-- Python 版本 (e.g., Python 3.7.5): +-- 操作系统版本 (e.g., Ubuntu 18.04): +-- CANN 版本 (e.g., CANNN 5.0.1): + + +提供附件: +1. profiling原始数据 +2. 
onnx模型与bs16的om模型 +3.PyToch离线推理xxx模型性能不达标测试报告.docx \ No newline at end of file diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-models_result.xlsx" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-models_result.xlsx" new file mode 100644 index 0000000000000000000000000000000000000000..1f95acd26098ea6f384121566b7ab6b8b8ef27b3 Binary files /dev/null and "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-models_result.xlsx" differ diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-\347\216\257\345\242\203\351\203\250\347\275\262\344\270\216\344\275\277\347\224\250\350\257\264\346\230\216.md" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-\347\216\257\345\242\203\351\203\250\347\275\262\344\270\216\344\275\277\347\224\250\350\257\264\346\230\216.md" new file mode 100644 index 0000000000000000000000000000000000000000..5b38f1e95710f16a487ebb1da2bd36bcae99ec68 --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-\347\216\257\345\242\203\351\203\250\347\275\262\344\270\216\344\275\277\347\224\250\350\257\264\346\230\216.md" @@ -0,0 +1,115 @@ +# 310服务器部署与说明 +- CANN包安装 + - 1.获取CANN包 + https://www.hiascend.com/software/cann/community + - 2.卸载老CANN包 + bash A300-3000-3010-npu-firmware_1.77.22.6.220.run --uninstall + rm -rf /usr/local/Ascend + reboot + - 3.安装CANN包 + bash A300-3010-npu-driver_21.0.1_ubuntu18.04-x86_64.run --full + bash 
A300-3000-3010-npu-firmware_1.77.22.6.220.run --full + reboot + bash Ascend-cann-toolkit_5.0.1_linux-x86_64.run --install + - 4.获取benchmark工具 + unzip Ascend-cann-benchmark_5.0.1-Linux-x86_64.zip + + >**说明:** + > + >若报无HwHiAiUser用户则执行useradd HwHiAiUser + >安装固件若报Not a physical-machine, firmware upgrade does not support.则不必安装固件 + >若报错ls: cannot access '.../5.0.1/x86_64-linux/toolkit/python/site-packages/bin': No such file or directory,则```export PATH=/usr/local/python3.7.5/bin:$PATH;export LD_LIBRARY_PATH=/usr/local/python3.7.5/lib:$LD_LIBRARY_PATH``` + >可以指定安装路径bash Ascend-cann-toolkit_5.0.1_linux-x86_64.run --install-path=/home/test --install,然后环境变量的CANN包路径也相应设置 为/home/test/Ascend/ascend-toolkit/latest + +- 添加普通用户 + useradd -m your_name + passwd your_name + usermod -s /bin/bash your_name + + 修改/etc/sudoers添加your_name ALL=(ALL:ALL) ALL + your_name用户便可使用sudo命令 + + 普通用户通过sudo执行atc报错找不到动态库,需要修改: + 修改/etc/sudoers将Defaults env_reset改成Defaults !env_reset + 修改/etc/bash.bashrc添加```alias sudo='sudo env PATH=$PATH LD_LIBRARY_PATH=$LD_LIBRARY_PATH'``` + +- 请使用conda安装python的库,将python的库安装在自己的conda环境里,修改这些库的代码不会影响其他用户 + 查看已有环境: + conda env list + 创建自己的conda环境: + conda create -n your_env_name python=3.7.5 + 进入环境: + conda activate your_env_name + 查看环境安装的python的库: + conda list + 只在该环境中安装py软件包: + https://anaconda.org/ 网址搜索包的安装命令 + conda install -c pytorch pytorch + conda install -c pytorch torchvision + conda install -c conda-forge onnx=1.9.0 + 查看安装路径: + python3.7 + import torchvision + print(torchvision.__file__) + 退出环境: + conda deactivate + 删除环境: + conda remove -n your_env_name --all + +- 使用说明: + - 请联系华为方申请登录服务器的普通用户名与密码 + - /home/下每个用户创建一个自己的目录,原则上只允许用户在自己的这个目录下开发,不要修改其它目录的东西 + - 不要随意更新CANN包与驱动包,修改系统文件,系统密码等 + - /opt/npu是共享的数据集盘目录,该目录仅用来存放共享的数据集,不可向该目录盘写其它数据 + - 每个模型使用到的通用数据集都放在/root/datasets/目录,除/root/datasets与/opt/npu外,不应在其它目录存放通用的数据集 + - 环境中默认安装的商用版CANN包放在/root/commerce_packages/目录下 + - 如果需要安装最新社区版CANN包可以安装在/root/cann_community/目录下 + +# t4服务器部署与说明 +- 安装cuda,cudnn,tensorrt + - 
安装cuda + https://developer.nvidia.cn/cuda-toolkit-archive: + wget http://developer.download.nvidia.com/compute/cuda/11.0.2/local_installers/cuda_11.0.2_450.51.05_linux.run + sh cuda_11.0.2_450.51.05_linux.run + 安装过程不要选driver + 修改/etc/bash.bashrc添加: + ``` + export CUDA_HOME=/usr/local/cuda + export PATH=$PATH:$CUDA_HOME/bin + export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$CUDA_HOME/lib64 + ``` + - 安装cudnn + https://developer.nvidia.cn/rdp/cudnn-download: + wget https://developer.nvidia.cn/compute/machine-learning/cudnn/secure/8.2.0.53/11.3_04222021/cudnn-11.3-linux-x64-v8.2.0.53.tgz + cp cuda/include/cudnn.h /usr/local/cuda/include/ + cp cuda/lib64/libcudnn* /usr/local/cuda/lib64/ + - 安装tensorRT + https://developer.nvidia.cn/nvidia-tensorrt-download: + wget https://developer.nvidia.cn/compute/machine-learning/tensorrt/secure/7.2.3/tars/TensorRT-7.2.3.4.Ubuntu-18.04.x86_64-gnu.cuda-11.0.cudnn8.1.tar.gz + tar zxvf TensorRT-7.2.3.4.Ubuntu-18.04.x86_64-gnu.cuda-11.0.cudnn8.1.tar.gz -C /usr/local/ + 修改/etc/bash.bashrc添加: + ``` + export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/TensorRT-7.2.3.4/lib + export PATH=$PATH:/usr/local/TensorRT-7.2.3.4/targets/x86_64-linux-gnu/bin + ``` + +- 添加普通用户 + useradd -m your_name + passwd your_name + usermod -s /bin/bash your_name + + t4上在线推理需要使用sudo安装库: + 修改/etc/sudoers添加your_name ALL=(ALL:ALL) ALL + your_name用户便可使用sudo命令 + 配置t4代理以访问外网: + export http_proxy=http://192.168.88.254:8080 + export https_proxy=http://192.168.88.254:8080 + +- 使用说明: + - /home/下每个普通用户创建一个自己的目录,原则上只允许用户在自己的这个目录下开发,不要修改其它目录的东西 + - 测试时请确保t4卡没有运行其它测试任务,使用nvidia-smi查看卡是否处于空闲态 + - t4上使用trtexec一条命令即可测试onnx模型性能,一般模型性能测试在半小时到半天之间可完成,测试完成后请及时退出登录 + - 不要更新CUDA驱动包与tensorRT,修改系统文件,系统密码等 + - 如果onnx不支持离线推理,需要在t4上在线推理测试性能,建议共用一台t4给普通用户sudo权限测试在线推理,而其它t4用来测试onnx离线推理性能不需要给普通用户sudo权限 + + diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-\350\277\233\345\261\225.xlsx" 
"b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-\350\277\233\345\261\225.xlsx" new file mode 100644 index 0000000000000000000000000000000000000000..fdda2d26fc8f6411dc83e8d23ad7a7c55ae4fbe7 Binary files /dev/null and "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-\350\277\233\345\261\225.xlsx" differ diff --git "a/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-\351\252\214\346\224\266\346\214\207\345\215\227.md" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-\351\252\214\346\224\266\346\214\207\345\215\227.md" new file mode 100644 index 0000000000000000000000000000000000000000..70c0cfcd29822866bf24f70ba4ef12660033079f --- /dev/null +++ "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/PyTorch\347\246\273\347\272\277\346\216\250\347\220\206-\351\252\214\346\224\266\346\214\207\345\215\227.md" @@ -0,0 +1,285 @@ +# Ascend PyTorch 模型推理众智验收指南 + +1. 先上gitee管理平台,将验收目标调整至验收状态 +2. 检查PR内容,文件夹路径和文件结构 + - PR模板和文件路径结构都在下面附件里有详细说明,请仔细check + - 参见付件pr检视,请仔细check +3. 
按照验收脚本在交付文件夹下进行验收 + 验收机器:192.168.88.45 + 参考[ResNext50测试说明](https://gitee.com/ascend/modelzoo/blob/master/built-in/ACL_PyTorch/Benchmark/cv/classification/ResNext50/test/README.md) + 准备环境: + ``` + 1.拉取modelzoo上提交的模型pr,然后将模型文件夹ResNext50拷贝到验收机器的/home/verify_models,并进入到/home/verify_models/ResNext50 + 2.根据requirements.txt安装必要的依赖 + 3.git clone ResNext50模型结构代码所在的开源代码仓torchvision + 4.如果通过补丁修改了开源模型代码则将补丁打入,如果开源模型代码需要安装则安装 + 5.获取训练的权重文件 + 6.获取数据集存放路径 + 7.获取benchmark工具 + ``` + + + ```shell + #准备环境 + 交付的代码文件夹下获取模型结构的开源代码,安装必要的依赖,获取训练提供的权重文件,获取数据集路径,获取benchmark工具 + + # pth是否能正确转换为om + bash test/pth2om.sh + + # 精度数据是否达标(需要显示官网pth精度与om模型的精度) + # npu性能数据(确保device空闲时测试,如果模型支持多batch,测试bs1与bs16,否则只测试bs1,性能数据以单卡吞吐率为标准),不指定数据集目录时默认/root/datasets + bash test/eval_acc_perf.sh --datasets_path=/root/datasets + + # 在t4环境测试性能数据(确保gpu空闲时测试,如果模型支持多batch,测试bs1与bs16,否则只测试bs1,如果导出的onnx模型因含自定义算子等不能离线推理,则在t4上测试pytorch模型的在线推理性能,性能数据以单卡吞吐率为标准) + bash test/perf_t4.sh + ``` + + - 验收过程中遇到问题,如是一些路径或者打字错误的问题,先修复继续执行 + - 每次验收都需要对验收脚本中的所有未验收脚本进行验收,不要因某一项验收失败而阻塞后续验收工作 +4. 
验收反馈 + - 验收后,使用验收报告模板,在评论区反馈验收结果 + ```shell + # 第X次验收测试 + # 验收结果 OK / Failed + # 验收环境: A + K / CANN 5.0.1 + # 关联issue: + + # pth是否能正确转换为om + bash test/pth2om.sh + # 验收结果: OK / Failed + # 备注: 成功生成om,无运行报错,报错日志xx 等 + + # 精度数据是否达标(需要显示官网pth精度与om模型的精度) + # npu性能数据(确保device空闲时测试,如果模型支持多batch,测试bs1与bs16,否则只测试bs1,性能数据以单卡吞吐率为标准) + bash test/eval_acc_perf.sh --datasets_path=/root/datasets + # 验收结果: 是 / 否 + # 备注: 目标pth精度top1:77.62% top5:93.70%;bs1,bs16验收om精度top1:77.62% top5:93.69%;精度下降不超过1%;无运行报错,报错日志xx 等 + # 备注: 验收310测试性能bs1:1497.252FPS bs16:2096.376FPS;无运行报错,报错日志xx 等 + + # 在t4环境测试性能数据(确保gpu空闲时测试,如果模型支持多batch,测试bs1与bs16,否则只测试bs1,如果导出的onnx模型因含自定义算子等不能离线推理,则在t4上测试pytorch模型的在线推理性能,性能数据以单卡吞吐率为标准),该步是验证eval_acc_perf.sh显示的t4性能数据是否正确,该脚本中填写的性能数据与t4实测性能数据要接近 + bash test/perf_t4.sh + # 验收结果: OK / Failed + # 备注: 验收t4测试性能bs1:763.044FPS bs16:1234.940FPS,与eval_acc_perf.sh脚本显示的t4性能数据一致;无运行报错,报错日志xx 等 + + # 310性能是否超过t4: 是 / 否 + bs1:310=(1497.252/763.044)1.96倍t4 + bs16:310=(2096.376/1234.940)1.70倍t4 + ``` + - 示例链接 https://gitee.com/ascend/modelzoo/pulls/836#note_4814643 +5. 验收完成后,需要进行以下几步 + - 在pr评论区按照上文模板反馈验收结果 + - 上gitee管理平台,将验收目标调整至完成状态 + - 上团队空间-测试管理-PyTorch模型众智验收跟踪表 更新模型验收数据 + - 完成验收测试报告文档,归档obs + - 整理验收必要的交付件,归档obs,将/home/verify_models/{模型名}目录归档,归档时需要删除该目录下的占用磁盘空间的无用文件夹预处理后的数据集prep_dataset,result/dumpOutput_device0与result/dumpOutput_device1 +6. 
验收归档与统计
+  1./home/verify_models/modelzoo目录用来拉取modelzoo代码pr
+  2./home/verify_models目录下需要保存以上测试后通过的模型
+  3./home/verify_models/models_result.xlsx里填写模型的测试数据,bs4,8,32的性能数据从README.md中获取,如果蓝区版本精度性能不达标,而黄区测试达标在备注里写明黄区版本,如果黄区测试也不能达标则写明黄区测试精度或性能不达标
+  4./home/verify_models仅用来存放测试通过的模型,models_result.xlsx以及modelzoo的代码,不要在该目录存放其它无用的文件
+
+
+
+
+- 关联issue模板 (负责人请关联相应的学生,若无法关联,请关联验收者)
+  ```
+  【Pytorch模型推理众智测试验收】【第x次回归测试】 xxx模型 验收不通过
+
+  贴上验收报告
+
+  ```
+  - 在pr提交的内容栏里编辑issue的链接即可关联对应的issue,问题解决后issue将自动关闭
+  - 示例链接 https://gitee.com/ascend/modelzoo/issues/I3FI5L?from=project-issue
+- [性能不达标issue模板](https://gitee.com/pengyeqing/ascend-pytorch-crowdintelligence-doc/blob/master/%E4%BC%97%E6%99%BA%E6%8E%A8%E7%90%86issue)
+
+### 附: pr检视
+
+- pr检视:
+1.标题格式:[华为大学昇腾学院][高校贡献][Pytorch离线推理][Cascade_RCNN]-初次提交
+2.包含bs1与bs16权重精度与om精度,包含bs1与bs16的t4与310性能数据,性能数据用fps表示
+3.备注:如果蓝区版本测精度或性能不达标,最新CANN版本测可以达标,这里需要写出原因与最新CANN包版本,用最新版本测。如果是无法规避的算子缺陷导致性能不达标,这里需要添加性能不达标的原因与解决方案。如果onnx因包含自定义算子不支持推理,需要说明性能是在t4上测的在线推理,如果模型不支持batch 16,也需要说明一下
+4.自验报告:CANN包版本与精度性能等数据是否正确
+
+- 代码规范:
+参考[ResNext50](https://gitee.com/ascend/modelzoo/tree/master/built-in/ACL_PyTorch/Benchmark/cv/classification/ResNext50)
+1.pipeline要通过,缺陷扫描与规范扫描的问题要尽可能修改
+2.python脚本文件头需要加License声明
+3.pr不要包括开源模型的代码与权重文件
+注意:
+4.python脚本不能包含从网上下载权重的代码,比如函数预训练为true时一般会下载权重
+5.python脚本避免依赖非必要的第三方库
+6.requirements.txt包含服务器上安装的本模型所有必要依赖的开源库的具体版本
+
+- 模型README.md检视:
+模板参见[README.md](https://gitee.com/ascend/modelzoo/tree/master/built-in/ACL_PyTorch/Benchmark/cv/classification/ResNext50/README.md)
+1.1.2 代码地址->需要给出使用的模型开源代码地址与其branch,commitid
+2.2 环境说明->需要给出服务器上安装的本模型所有必要依赖的开源库的具体版本
+3.3.1 pth转onnx模型->优先使用训练提供的权重文件,如果训练的权重文件网上能获取则需给出网址,否则需要写明从哪获取权重文件。如果训练没有提供权重则使用开源代码仓的权重文件。需要给出权重文件名与其md5sum值
+4.3.1 pth转onnx模型->如果需要对模型的开源代码做修改,以打patch的形式修改
+5.3.1 模型转换要点:->对于CANN包算子有问题导致模型转换失败或需要规避才能转换成功,则需要在模型转换要点里写明定位主要过程,原因与措施
+6.6.1 离线推理TopN精度统计->精度测试需要测试bs1与bs16的精度
+7.6.1 精度调试:->对于CANN包算子有问题导致精度不达标或需要规避才能达标,则需要在精度调试里写明定位主要过程,原因与措施
+8.7
性能对比->性能数据需要测bs1,16,4,8,32的性能数据,且需要计算出单卡吞吐率
+9.7 性能优化:->对于CANN包算子有问题导致性能不达标或需要规避才能达标,则需要在性能优化里写明定位主要过程,原因与措施
+
+- test/README.md检视:
+该文件是验收测试说明,主要是准备环境,pip3.7 install -r requirements.txt可能会重新安装某版本pytorch,验收时根据需要决定是否执行
+参见模板[test/README.md](https://gitee.com/ascend/modelzoo/tree/master/built-in/ACL_PyTorch/Benchmark/cv/classification/ResNext50/test/README.md)
+
+- 如果使用补丁文件修改了模型代码则将补丁打入模型代码,如果需要引用模型代码仓的类或函数通过sys.path.append(r"./pytorch-nested-unet")添加搜索路径
+  预处理脚本不需要import该脚本没有使用的库
+参见https://gitee.com/ascend/modelzoo/pulls/2309
+参见https://gitee.com/ascend/modelzoo/pulls/2585
+- 模型不支持动态onnx,性能不达标等特殊情况需要在pr备注与性能优化里说明
+参见https://gitee.com/ascend/modelzoo/pulls/2122
+参见https://gitee.com/ascend/modelzoo/pulls/2328
+
+### 附: 模型推理指导中的交付标准与规范
+- 交付标准
+  - 精度:
+    om模型推理的精度与Ascend 910训练出的权重精度或PyTorch预训练模型github代码仓README.md或官网文档公布的精度对比,精度下降不超过1%则认为精度达标
+  - 性能:
+    Ascend benchmark工具在数据集上推理测的NPU 310单颗device吞吐率乘以4颗即单卡吞吐率大于TensorRT工具测的GPU T4单卡吞吐率则认为性能达标
+    如若交付要求中对性能有要求(易模型),310的性能必须高于t4的性能
+    如若交付要求中没有对性能有要求(中难模型),310上推理需尽可能进行性能优化
+    若无法达到,则需要向华为方提交性能已达瓶颈的认证申请,华为方将定期组织专家组对申请模型进行评审,通过评审的模型允许以不高于t4的性能进行交付
+  - 脚本:
+    代码符合pep8规范;
+    脚本命名格式需统一,文件名含模型名时模型名用小写,模型名含多个字符串时用-连接;
+    xxx_pth2onnx.py中不能使用从网络下载权重pth文件的代码,xxx_pth2onnx.py应有输入输出参数,输入是本地权重pth文件,输出是生成的onnx模型文件名;
+    xxx_pth_preprocess.py与xxx_pth_postprocess.py尽可能只引用numpy,Pillow,torch,pycocotools等基础库,例如不要因mmdetection框架的数据处理与精度评测部分封装了这些基础库的操作,为调用这些简单接口,前后处理脚本就依赖mmdetection;
+    不同模型的脚本与代码部分处理流程有相似性,尽量整合成通用的脚本与代码。
+  - 推理过程:
+    需要提供端到端推理过程中执行的命令等
+  - 关键问题总结:
+    需要提供端到端推理遇到的关键问题的简要调试过程,至少包含模型转换要点,精度调试,性能优化
+
+  说明:
+  ```
+  1.如果已经有了ascend 910训练提供的权重文件,那么优先使用910训练提供的权重文件做离线推理,精度与910训练出的精度对齐;如果开源代码仓提供了多个权重文件,使用常用的基础的那个配置的权重文件即可;如果开源代码仓没有提供pth权重文件,则需要该模型的训练同学提供pth权重文件,或者使用开源代码仓训练脚本简单训练一个pth权重文件,然后对比om精度与该pth权重文件的精度
+
+  2.由于随机数可能不能模拟数据分布,Ascend benchmark工具纯推理功能测的有些模型性能数据可能不太准,所以模型测试脚本与提交代码的描述中的性能数据以Ascend benchmark在数据集上推理时得到性能数据为准
+
+  3.如果模型支持多batch,需要测试batch1,4,8,16,32的精度与性能,写在README.md里,模型测试脚本与提交代码的描述只需提供bs1和bs16的精度性能数据
+
+  
4.如果导出的onnx因包含自定义算子等而不能推理,则在t4上运行开源评测脚本测试pth模型在线推理性能 + + 5.对于性能不达标的模型,需要进行如下工作: + 1)优化修改onnx模型去掉影响性能的冗余pad,用Ascend atc的相关优化选项尝试一下,尝试使用最近邻替换双线性的resize重新训练,降低图片分辨率等使性能达标。 + 2)对于算子导致的性能问题,需要使用profiling分析定位引起性能下降的原因,具体到引起性能下降的算子。优先修改模型代码以使其选择性能好的npu算子替换性能差的npu算子使性能达标,然后在modelzoo上提issue,等修复版本发布后再重测性能,继续优化。 + 3)需要交付profiling性能数据,对经过上述方法性能可以达标的模型,在交付文档中写明问题原因与达标需要执行的操作;对经过上述方法性能仍不达标的模型,在交付的README.md文档中写明问题原因与简要的定位过程。 + + 6.git clone开源模型代码仓到工作目录,如果模型代码仓没有安装命令,pth2onnx.py脚本需要引用模型代码仓的函数或类时,通过sys.path.append(r"./代码仓目录")添加搜索路径,如果需要修改开源代码仓代码,将修改用git diff做成一个patch文件,交付件不要交付开源代码仓里的代码,只需要交付这个patch文件。参见本文3.5 maskrcnn端到端推理指导-开源detectron2加载npu权重的推理指导 + + 7.数据集统一放在/root/datasets/目录 + ``` + +- 交付件 + - 交付件参考:[ResNeXt50_Onnx模型端到端推理指导.md](https://gitee.com/ascend/modelzoo/tree/master/built-in/ACL_PyTorch/Benchmark/cv/classification/ResNext50) + - 最终交付件: + 包含以上交付标准的代码,README.md,以及验收脚本 + 权重文件、profiling性能数据等非代码交付件一并打压缩包邮件发送 + - 最终交付形式: + gitee网址:https://gitee.com/ascend/modelzoo/tree/master/contrib/ACL_PyTorch/Research + commit信息格式:【高校贡献-${学校学院名称}】【Pytorch离线推理-${模型名称}】${PR内容摘要} + 模型命名风格为大驼峰,模型名含多个字符串时使用横杠或下划线连接,当上下文用横杠时模型名用下划线连接,否则用横杠连接 + 对于batch1与batch16,npu性能均高于T4性能1.2倍的模型,放在Benchmark目录下,1-1.2倍对应Official目录,低于1倍放在Research目录,目前都放在contrib/ACL_PyTorch/Research下即可 + +- gitee仓PR贡献流程 + - fork [modelzoo](https://gitee.com/ascend/modelzoo) 到个人仓 + - 提交代码到个人仓 + - 签署cla [link](https://clasign.osinfra.cn/sign/Z2l0ZWUlMkZhc2NlbmQ=) + - 选择 Sign Individual CLA + - 若已提交PR,但忘记签署,可在签署CLA后再评论内评论 ```/check-cla``` 重新校验 + - 依据文件夹名称及目录规范整理代码,完成自验,使用PR内容模板进行PR,审查人员请指定 王姜奔(wangjiangben_hw) + - PR后,华为方会进行代码检视,并对PR进行验证,请关注PR的评论并及时修改 + - 最终验收完成后合入主干 +- gitee仓验收使用脚本(请自验)、PR内容模板 + - 验收使用脚本(请自验) + >![](public_sys-resources/icon-note.gif) + **说明:** + > **提交前请确保自验通过!确保直接执行以下脚本就可运行!** + + ```shell + #准备环境 + 交付的代码文件夹下获取模型结构的开源代码,安装必要的依赖,获取训练提供的权重文件,获取数据集路径,获取benchmark工具 + + # pth是否能正确转换为om + bash test/pth2om.sh + + # 精度数据是否达标(需要显示官网pth精度与om模型的精度) + # 
npu性能数据(确保device空闲时测试,如果模型支持多batch,测试bs1与bs16,否则只测试bs1,性能数据以单卡吞吐率为标准),不指定数据集目录时默认/root/datasets + bash test/eval_acc_perf.sh --datasets_path=/root/datasets + + # 在t4环境测试性能数据(确保gpu空闲时测试,如果模型支持多batch,测试bs1与bs16,否则只测试bs1,如果导出的onnx模型因含自定义算子等不能离线推理,则在t4上测试pytorch模型的在线推理性能,性能数据以单卡吞吐率为标准) + bash test/perf_t4.sh + ``` + - PR内容模板 + - PR示例链接 https://gitee.com/ascend/modelzoo/pulls/887 + - PR名称 + - [学校学院名称][高校贡献][Pytorch离线推理][模型名称]-PR内容摘要 + - 举例说明:[华为大学昇腾学院][高校贡献][Pytorch离线推理][ResNeXt50]-初次提交 + + ``` + + + **What type of PR is this?** + > /kind task + + **What does this PR do / why do we need it**: + # 简述你这次的PR的详情 + + | 模型 | 官网精度 | 310精度 | t4性能 | 310性能 | + | :------: | :------: | :------: | :------: | :------: | + | ResNeXt50 bs1 | top1:77.62% top5:93.70% | top1:77.62% top5:93.69% | 763.044fps | 1497.252fps | + | ResNeXt50 bs16 | top1:77.62% top5:93.70% | top1:77.62% top5:93.69% | 1234.940fps | 2096.376fps | + # 如果是无法规避的算子缺陷导致性能不达标,这里需要添加性能不达标的原因与解决方案 + + 自验报告 + # 第X次验收测试 + # 验收结果 OK / Failed + # 验收环境: A + K / CANN 5.0.1 + # 关联issue: + + # pth是否能正确转换为om + bash test/pth2om.sh + # 验收结果: OK / Failed + # 备注: 成功生成om,无运行报错,报错日志xx 等 + + # 精度数据是否达标(需要显示官网pth精度与om模型的精度) + # npu性能数据(确保device空闲时测试,如果模型支持多batch,测试bs1与bs16,否则只测试bs1,性能数据以单卡吞吐率为标准) + bash test/eval_acc_perf.sh --datasets_path=/root/datasets + # 验收结果: 是 / 否 + # 备注: 目标pth精度top1:77.62% top5:93.70%;bs1,bs16验收om精度top1:77.62% top5:93.69%;精度下降不超过1%;无运行报错,报错日志xx 等 + # 备注: 验收310测试性能bs1:1497.252FPS bs16:2096.376FPS;无运行报错,报错日志xx 等 + + # 在t4环境测试性能数据(确保gpu空闲时测试,如果模型支持多batch,测试bs1与bs16,否则只测试bs1,如果导出的onnx模型因含自定义算子等不能离线推理,则在t4上测试pytorch模型的在线推理性能,性能数据以单卡吞吐率为标准),该步是验证eval_acc_perf.sh显示的t4性能数据是否正确,该脚本中填写的性能数据与t4实测性能数据要接近 + bash test/perf_t4.sh + # 验收结果: OK / Failed + # 备注: 验收t4测试性能bs1:763.044FPS bs16:1234.940FPS,与eval_acc_perf.sh脚本显示的t4性能数据一致;无运行报错,报错日志xx 等 + + # 310性能是否超过t4: 是 / 否 + bs1:310=(1497.252/763.044)1.96倍t4 + bs16:310=(2096.376/1234.940)1.70倍t4 + + - 示例链接 https://gitee.com/ascend/modelzoo/pulls/836#note_4750681 + + **Which issue(s) 
this PR fixes**: + # 用于后期issue关联的pr + + Fixes # + + **Special notes for your reviewers**: + # 在reviewer检视时你想要和他说的 + + ``` + + diff --git "a/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/.keep" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/\346\241\210\344\276\213/\345\212\237\350\203\275\346\211\223\351\200\232/.keep" similarity index 100% rename from "onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/classification/.keep" rename to "Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/\346\241\210\344\276\213/\345\212\237\350\203\275\346\211\223\351\200\232/.keep" diff --git "a/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/.keep" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/\346\241\210\344\276\213/\346\200\247\350\203\275\344\274\230\345\214\226/.keep" similarity index 100% rename from "onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/cv/segmentation/.keep" rename to "Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/\346\241\210\344\276\213/\346\200\247\350\203\275\344\274\230\345\214\226/.keep" diff --git "a/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/nlp/.keep" "b/Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/\346\241\210\344\276\213/\347\262\276\345\272\246\350\260\203\350\257\225/.keep" similarity index 100% rename from "onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/benchmark/nlp/.keep" rename to 
"Ascend-PyTorch\347\246\273\347\272\277\346\216\250\347\220\206\346\214\207\345\257\274/\346\241\210\344\276\213/\347\262\276\345\272\246\350\260\203\350\257\225/.keep" diff --git "a/AscendPyTorch\346\250\241\345\236\213\344\274\227\346\231\272\346\226\207\346\241\243-\347\246\273\347\272\277\346\216\250\347\220\206.md" "b/AscendPyTorch\346\250\241\345\236\213\344\274\227\346\231\272\346\226\207\346\241\243-\347\246\273\347\272\277\346\216\250\347\220\206.md" index c5dee66f2a56c57985c0d0cff1b4448091e8097f..a42fec42b7e71bf40ae238d0942196b817e4ca0e 100644 --- "a/AscendPyTorch\346\250\241\345\236\213\344\274\227\346\231\272\346\226\207\346\241\243-\347\246\273\347\272\277\346\216\250\347\220\206.md" +++ "b/AscendPyTorch\346\250\241\345\236\213\344\274\227\346\231\272\346\226\207\346\241\243-\347\246\273\347\272\277\346\216\250\347\220\206.md" @@ -1,84 +1,82 @@ -# Ascend PyTorch模型端到端推理指导 -- [1 资源与端到端推理流程](#1-资源与端到端推理流程) - - [1.1 Ascend文档与软件包网址](#11-Ascend文档与软件包网址) - - [1.2 端到端推理流程与交付标准](#12-端到端推理流程与交付标准) -- [2 环境搭建](#2-环境搭建) - - [2.1 Ascend的run包安装](#21-Ascend的run包安装) - - [2.2 深度学习框架与第三方库](#22-深度学习框架与第三方库) -- [3 端到端推理示例](#3-端到端推理示例) - - [3.1 华为云昇腾modelzone里Pytorch模型端到端推理网址](#31-华为云昇腾modelzone里Pytorch模型端到端推理网址) - - [3.2 端到端推理实例](#32-端到端推理实例) -- [4 问题总结](#4-问题总结) - - [4.1 Inception-V4端到端推理要点](#41-Inception-V4端到端推理要点) - - [4.2 UNet端到端推理要点](#42-UNet端到端推理要点) - - [4.3 SSD端到端推理要点](#43-SSD端到端推理要点) - - [4.4 Inception-V3端到端推理要点](#44-Inception-V3端到端推理要点) - - [4.5 maskrcnn端到端推理指导](#45-maskrcnn端到端推理指导) -- [5 深度学习指导](#5-深度学习指导) - - [5.1 书籍推荐](#51-书籍推荐) - - [5.2 实践](#52-实践) - - [5.3 参加社区开发](#53-参加社区开发) -- [6 附录](#6-附录) - - [6.1 机器申请及使用指南](#61-机器申请及使用指南) - - [6.2 交付标准与规范](#62-交付标准与规范) +# Ascend PyTorch 模型众智文档-离线推理 +- [1 概述](#1-概述) + - [1.1 目标读者](#11-目标读者) + - [1.2 原理与方案](#12-原理与方案) + - [1.3 环境搭建](#13-环境搭建) +- [2 推理指导](#2-推理指导) + - [2.1 推理流程](#21-推理流程) + - [2.1.1 pytorch模型导出onnx](#211-pytorch模型导出onnx) + - [2.1.2 onnx模型转om模型](#212-onnx模型转om模型) + - [2.1.3 数据验证集预处理](#213-数据验证集预处理) + - 
[2.1.4 离线推理](#214-离线推理) + - [2.1.5 精度统计](#215-精度统计) + - [2.1.6 性能对比](#216-性能对比) + - [2.2 模型转换指导](#22-模型转换指导) + - [2.3 精度调试指导](#23-精度调试指导) + - [2.4 性能优化指导](#24-性能优化指导) +- [3 推理案例](#3-推理案例) + - [3.1 推理案例](#31-推理案例) + - [3.2 Inception-V4端到端推理要点](#32-Inception-V4端到端推理要点) + - [3.3 UNet端到端推理要点](#33-UNet端到端推理要点) + - [3.4 SSD端到端推理要点](#34-SSD端到端推理要点) + - [3.5 maskrcnn端到端推理指导](#35-maskrcnn端到端推理指导) +- [4 附录](#4-附录) + - [4.1 机器申请及使用指南](#41-机器申请及使用指南) + - [4.2 交付标准与规范](#42-交付标准与规范) + - [4.3 深度学习指导](#43-深度学习指导) -## 1 资源与端到端推理流程 +## 1 概述 -- **[Ascend文档与软件包网址](#11-Ascend文档与软件包网址)** +- **[目标读者](#11-目标读者)** -- **[端到端推理流程与交付标准)](#12-端到端推理流程与交付标准)** +- **[原理与方案](#12-原理与方案)** -### 1.1 Ascend文档与软件包网址 +- **[环境搭建](#13-环境搭建)** -本文目标读者为Ascend模型端到端推理开发者,用于指导开发者在昇腾版本的 PyTorch 下,实现模型端到端推理精度性能达标。 +### 1.1 目标读者 -**Ascend相关文档与软件发布在华为云[support地址](https://support.huawei.com/enterprise/zh/category/ascend-computing-pid-1557196528909)CANN和A300-3010** +本文目标读者为Ascend模型端到端推理开发者,用于指导开发者在onnx框架下,实现模型端到端推理精度性能达标。 ->![](public_sys-resources/icon-note.gif) +>![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/public_sys-resources/icon-note.gif) >**说明:** > >开发者除编程语言知识外,应对如下基础知识有一定的了解和熟悉: >1. 深度学习方法(cv, nlp等); >2. PyTorch写法及其运行原理; ->3. Ascend-PyTorch适配细节; +>3. onnx模型细节; >2. 
CANN运作流程; -### 1.2 端到端推理流程与交付标准 +### 1.2 原理与方案 - 精度性能要求 -om模型推理的精度与PyTorch预训练模型github代码仓或官网公布的精度对比,精度下降不超过1%则认为精度达标 -npu单颗芯片吞吐率乘以4颗大于gpu T4吞吐率则认为性能达标 + Ascend PyTorch模型离线推理目标是pytorch模型在npu Ascend 310卡上离线推理的精度与gpu T4卡上推理精度一致,推理性能超过T4。 + +- Ascend PyTorch模型离线推理流程 -- Ascend PyTorch模型端到端推理流程 + 首先在github上找到PyTorch实现的引用多包含预训练的模型代码仓,参考代码仓预处理模型加载的代码加载pth并转换为onnx模型,再由onnx模型转换为om模型,参考代码仓预训练模型数据预处理代码对用来评价模型精度的数据集进行预处理,使用昇腾benchmark工具执行om模型的离线推理,最后参考代码仓数据后处理代码进行后处理,统计出om模型的推理精度。使用benchmark工具测试om推理性能,对性能不达标的om模型,使用profiling工具分析,通过模型调优,算子开发与算子融合等方法实现达标 -昇腾NPU推理芯片只支持昇腾om格式的离线模型,由于昇腾atc工具不支持pth模型转换为om模型,因此先将PyTorch预训练的pth模型文件转换为onnx模型文件,再用atc工具将onnx转化为om模型文件。首先在github上找到PyTorch实现的引用多包含预训练的模型代码仓,参考代码仓预处理模型加载的代码加载pth并转换为onnx模型,参考代码仓预训练模型数据预处理代码对用来评价模型精度的数据集进行预处理,使用昇腾benchmark工具执行om模型的离线推理,最后参考代码仓数据后处理代码进行后处理,统计出om模型的推理精度 -使用benchmark工具纯推理功能测试om推理性能,对性能不达标的om模型,使用profiling工具分析,通过模型调优,算子开发与算子融合等方法实现达标 + Ascend PyTorch模型离线推理流程: +![](figures/pyotrch_offline_infer.png) -## 2 环境搭建 -- **[Ascend的run包安装](#21-Ascend的run包安装)** +### 1.3 环境搭建 -- **[深度学习框架与第三方库](#22-深度学习框架与第三方库)** +**Ascend相关文档与软件发布在华为云[support地址](https://support.huawei.com/enterprise/zh/category/ascend-computing-pid-1557196528909)CANN和A300-3010** ->![](public_sys-resources/icon-note.gif) +>![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/public_sys-resources/icon-note.gif) **说明:** -> > **若使用搭建完成的环境,可跳过此步骤。一般情况下,华为默认会提供搭建完成的环境。** -### 2.1 Ascend的run包安装 - -1. 安装前准备。 +1. Ascend的run包安装。 - 在Ascend NPU系列设备上部署开发环境,请参考[《CANN V100R020C10 软件安装指南》](https://support.huawei.com/enterprise/zh/doc/EDOC1100164870/59fb2d06)的“获取软件包”和”安装开发环境“章节,完成安装前准备。 -2. 
完成驱动和固件的安装。
-   - 请参考[《CANN V100R020C10 软件安装指南》](https://support.huawei.com/enterprise/zh/doc/EDOC1100164870/59fb2d06)的”安装昇腾芯片驱动和固件“ -\>“安装开发套件和框架插件包”章节,完成安装。
+   - 请参考[《CANN V100R020C10 软件安装指南》](https://support.huawei.com/enterprise/zh/doc/EDOC1100164870/59fb2d06)的”安装昇腾芯片驱动和固件“ -\>“安装开发套件和框架插件包”章节,完成安装。
-   >![](public_sys-resources/icon-note.gif) **说明:**
+   >![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/public_sys-resources/icon-note.gif) **说明:**
    >
    >安装驱动和固件需要root用户,若使用默认路径安装:
    >
@@ -92,25 +90,22 @@ npu单颗芯片吞吐率乘以4颗大于gpu T4吞吐率则认为性能达标
    >
    >解压Ascend-cann-benchmark_\{version\}-Linux-x86_64.zip,获取benchmark工具与脚本
    >
-   >若报无HwHiAiUser用户则执行useradd HwHiAiUser,安装固件若报Not a physical-machine, firmware upgrade does not support.则不必安装固件,若报错ls: cannot access '/usr/local/Ascend/ascend-toolkit/5.0.1/x86_64-linux/toolkit/python/site-packages/bin': No such file or directory则export PATH=/usr/local/python3.7.5/bin:¥PATH;export LD_LIBRARY_PATH=/usr/local/python3.7.5/lib:¥LD_LIBRARY_PATH。安装后需要重启。
+   >若报无HwHiAiUser用户则执行useradd HwHiAiUser,安装固件若报Not a physical-machine, firmware upgrade does not support.则不必安装固件,若报错ls: cannot access '/usr/local/Ascend/ascend-toolkit/5.0.1/x86_64-linux/toolkit/python/site-packages/bin': No such file or directory则export PATH=/usr/local/python3.7.5/bin:$PATH;export LD_LIBRARY_PATH=/usr/local/python3.7.5/lib:$LD_LIBRARY_PATH。安装驱动需要重启。
-### 2.2 深度学习框架与第三方库
+2. 深度学习框架与第三方库
-1. 安装深度学习框架
 ```
-pytorch == 1.6.0
+pytorch >= 1.5.0
 torchvision == 0.7.0
 onnx == 1.7.0
 onnxruntime == 1.5.2
 onnxoptimizer == 0.1.1
-```
-2. 
安装第三方库 -``` + numpy == 1.18.5 Pillow == 7.2.0 opencv-python == 4.2.0.34 ``` ->![](public_sys-resources/icon-note.gif) +>![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/public_sys-resources/icon-note.gif) **说明:** > > X86架构:pytorch和torchvision可以通过官方下载whl包安装,其他可以通过pip install 包名 安装 @@ -119,49 +114,48 @@ opencv-python == 4.2.0.34 > > 以上为多数网络需要安装的软件与推荐的版本,根据实际情况安装。如果python脚本运行过程中import 模块失败,安装相应模块即可,如果报错是缺少动态库,网上搜索报错信息找到相应安装包,执行yum install 包名安装即可 -## 3 端到端推理示例 +## 2 推理指导 -- **[华为云昇腾modelzone里Pytorch模型端到端推理网址](#31-华为云昇腾modelzone里Pytorch模型端到端推理网址)** +- **[推理流程](#21-推理流程)** -- **[端到端推理实例](#32-端到端推理实例)** +- **[模型转换指导](#22-模型转换指导)** -### 3.1 华为云昇腾modelzone里Pytorch模型端到端推理网址 +- **[精度调试指导](#23-精度调试指导)** -当前已完成端到端推理模型放在[ModelZoo](https://ascend.huawei.com/zh/#/software/modelzoo),包含模型端到端推理说明,代码与操作完整流程,下面的实例仅给出用于说明问题的代码片段,该页面过滤条件框中搜索atc可以看到这些模型 - -一些典型模型的链接如下 -1. [ResNeXt-50](https://ascend.huawei.com/zh/#/software/modelzoo/detail/1/2ca8ac26aeac461c85e7b04f17aa201a) -2. [Inception-V3](https://ascend.huawei.com/zh/#/software/modelzoo/detail/1/132f32e409b44aac8951f58ca073b780) -3. [Inception-V4](https://ascend.huawei.com/zh/#/software/modelzoo/detail/1/75eb32c2a2d94c4db743983504f83a06) -4. [EfficientNet-b0](https://ascend.huawei.com/zh/#/software/modelzoo/detail/1/75026a6edf604ec0bc5d16d220328646) -5. [YoloV3](https://ascend.huawei.com/zh/#/software/modelzoo/detail/1/36ea401e0d844f549da2693c6289ad89) -... 
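上文提到"如果python脚本运行过程中import 模块失败,安装相应模块即可",排查环境缺失依赖时可以用如下示意脚本(脚本与其中的模块列表均为本文假设的演示,实际请按模型requirements.txt填写):

```python
# 示意脚本:检查依赖模块是否已安装,缺失的打印出来提示安装
import importlib.util

def missing_modules(names):
    """返回names中当前环境缺失的顶层模块名列表"""
    return [n for n in names if importlib.util.find_spec(n) is None]

if __name__ == "__main__":
    # 模块名仅为示例:numpy/PIL/cv2/torch/onnx对应requirements中的常见依赖
    need = ["numpy", "PIL", "cv2", "torch", "onnx"]
    for name in missing_modules(need):
        print(f"missing: {name}, 请pip install对应的包")
```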
- 
-### 3.2 端到端推理实例
+- **[性能优化指导](#24-性能优化指导)**
+### 2.1 推理流程
 以EfficientNet-b0为例介绍端到端推理涉及到的所有流程
-
-1.pth模型转换为om模型
-PyTorch训练得到的pth模型文件不能直接转换为om模型文件,因此先将pth文件转化为onnx模型文件,再由onnx转化为离线om模型文件
-
-1)基于PyTorch框架的模型代码与pth文件可以从开源[github网址](https://github.com/lukemelas/EfficientNet-PyTorch)获取,有些模型使用resize使用双线性模式训练的性能不达标,需要修改为resize使用最近邻模式重新训练,通过以下步骤得到onnx模型文件:
- - [下载pth文件](https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b0-355c32eb.pth)
- - 参考github网址说明安装efficientnet_pytorch
+#### 2.1.1 pytorch模型导出onnx
+ - 获取pytorch模型代码与权重文件
+   基于PyTorch框架的EfficientNet模型代码与pth权重文件可以从开源[github网址](https://github.com/lukemelas/EfficientNet-PyTorch)获取
+>![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/public_sys-resources/icon-note.gif)
+**说明:**
+> 1.如果Ascend 910训练提供了pytorch模型代码与权重文件,那么优先使用910训练的代码与权重做离线推理,然后om模型精度对齐训练权重的精度
+> 2.否则在github上找到pytorch实现的尽可能是模型作者的或引用量最多的与提供pth权重文件的开源模型代码仓
+> 3.如果开源代码仓提供了多个pth权重文件,使用常用的基础的那个配置的权重文件即可,并且模型支持多任务时只需要针对一个基础的任务做推理
+> 4.如果开源代码仓没有提供pth权重文件,需要暂时使用开源代码仓训练脚本简单训练一个权重,然后om模型精度对齐pth权重在线推理的精度
+
+  参考github网址说明安装efficientnet_pytorch
 ```
 git clone https://github.com/lukemelas/EfficientNet-PyTorch
 cd EfficientNet-PyTorch
-pip install -e .
+pip3.7 install -e . 
``` ->![](public_sys-resources/icon-note.gif) +>![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/public_sys-resources/icon-note.gif) **说明:** -> -> 有些模型代码库没有提供安装脚本,可以在python脚本中通过添加如下代码引用EfficientNet-PyTorch的EfficientNetlei类: -> -> sys.path.append(r"./EfficientNet-PyTorch") -> ->from efficientnet_pytorch import EfficientNet - - 参考EfficientNet-PyTorch模型预训练加载与导出onnx的代码,写脚本将pth转换为onnx -```python +> 有些模型代码仓没有提供安装脚本,可以在python脚本中通过添加如下代码引用EfficientNet-PyTorch的EfficientNet类: +> sys.path.append(r"./EfficientNet-PyTorch") +> from efficientnet_pytorch import EfficientNet + + [下载pth权重文件](https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b0-355c32eb.pth) +``` +wget https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b0-355c32eb.pth +``` + +- 导出onnx模型 + - 一般模型代码仓提供导出onnx的脚本,如果没有提供则需要调用onnx的torch.onnx.export接口导出onnx。参考EfficientNet-PyTorch模型预训练加载与导出onnx的代码,写脚本导出onnx。 +``` def pth2onnx(input_file, output_file): model = EfficientNet.from_pretrained('efficientnet-b0', weights_path=input_file) model.eval() @@ -171,53 +165,17 @@ def pth2onnx(input_file, output_file): dummy_input = torch.randn(1, 3, 224, 224) torch.onnx.export(model, dummy_input, output_file, input_names = input_names, dynamic_axes = dynamic_axes, output_names = output_names, opset_version=11, verbose=True) ``` - - 通过Ascend 910训练的PyTorch EfficientNet模型checkpoint会保存为pth.tar格式,通过如下脚本可以转换为ONNX模型 - -```python -import torch -import torch.onnx -from NPU.efficientnet_pytorch import EfficientNet -from collections import OrderedDict - -def proc_nodes_module(checkpoint,AttrName): - new_state_dict = OrderedDict() - for k,v in checkpoint[AttrName].items(): - if(k[0:7] == "module."): - name = k[7:] - else: - name = k[0:] - new_state_dict[name]=v - return new_state_dict +>![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/public_sys-resources/icon-note.gif) +**说明:** +> 
目前atc工具支持的onnx算子版本opset_version为11 +> 如上导出的onnx模型使用netron查看其输入节点image的shape是(-1,3,224,224),-1代表onnx模型是动态batch的,当用tensorrt在t4上测onnx的性能时可以指定任意batch的输入(batch,3,224,224),dynamic_axes是动态batch参数,'image': {0: '-1'}表示输入image的第一维是-1即batch维为-1表示动态。 +> 当然像少数模型如shufflenetv1即使设置dynamic_axes实际上导出的onnx也是固定batch的,转换为om时指定的batch size要和onnx的固定batch一样才不会报错 +> 导出onnx出现如下错误Exporting the operator eye to ONNX opset version 11 is not supported,可以参考[issue](https://github.com/pytorch/pytorch/pull/41357)进行修改 -def convert(): - checkpoint = torch.load(input_file, map_location='cpu') - checkpoint['state_dict'] = proc_nodes_module(checkpoint, 'state_dict') - model = EfficientNet.from_name('efficientnet-b0') - model.load_state_dict(checkpoint['state_dict']) - model.eval() - model.set_swish(memory_efficient=False) - input_names = ["actual_input_1"] - output_names = ["output1"] - dummy_input = torch.randn(1, 3, 224, 224) - torch.onnx.export(model, dummy_input, output_file, input_names = input_names, output_names = output_names, opset_version=11, verbose=True) + 目前atc不支持efficientnet-b0模型中自定义pad算子,使用开源提供的模型可视化工具Netron可以看到这部分算子名称与连接关系,使用脚本删除自定义pad并使用卷积中的pad属性实现同样的功能: ``` ->![](public_sys-resources/icon-note.gif) - **说明:** -> ->注意目前ATC支持的onnx算子版本为11 -> -> 导出onnx出现如下错误Exporting the operator eye to ONNX opset version 11 is not supported,可以参考[issue](https://github.com/pytorch/pytorch/pull/41357)进行修改 - - 2)使用atc将onnx模型转为om模型,当模型中有算子atc不支持时则会转换失败报相应算子不支持,一种方法是开发om模型对应的算子,另一种方法是修改onnx模型,替换相关算子,有些模型靠近输出节点的一部分性能太低,因此从onnx中去除这部分与原输出节点,然后在后处理中实现相关功能,过程如下: - - 算子适配 - -目前atc不支持efficientnet-b0模型中自定义pad算子,使用开源提供的模型可视化工具Netron可以看到这部分算子名称与连接关系,使用脚本删除自定义pad并使用卷积中的pad实现同样功能。 - -```python import onnx - model = onnx.load("./efficientnet-b0.onnx") - model.graph.node[15].input[0] = 'image' model.graph.node[34].input[0] = '388' model.graph.node[66].input[0] = '429' @@ -239,7 +197,6 @@ delete_id_range = [[0, 14], [19, 33], [51, 65], [83, 97], [116, 130], [148, 162] [213, 227], [246, 260], [279, 293], [311, 325], [344, 358], 
[377, 391], [409, 423], [442, 456], [475, 489], [508, 522]] modify_ids = [15, 34, 66, 98, 131, 163, 196, 228, 261, 294, 326, 359, 392, 424, 457, 490, 523] - def indelrang(id): for start, end in delete_id_range: if id >= int(start) and id <= int(end): @@ -259,25 +216,23 @@ for i in range(max_idx): if kh % 2 != 0: pad = (kh - 1) // 2 else: - pad = kh // 2 + pad = kh // 2 model.graph.node[i - rm_cnt].attribute[3].ints[0] = pad model.graph.node[i - rm_cnt].attribute[3].ints[1] = pad model.graph.node[i - rm_cnt].attribute[3].ints[2] = pad model.graph.node[i - rm_cnt].attribute[3].ints[3] = pad print("adapt pad for", model.graph.node[i - rm_cnt].name) - onnx.checker.check_model(model) onnx.save(model, "./efficientnet-b0_adaptpad.onnx") ``` -但是由于强行更改了pad没有重新训练会导致精度下降2%,使用如下脚本可以测试onnx模型的精度: -```python +但是由于强行更改了pad没有重新训练会导致精度下降2%,使用如下脚本可以测试onnx模型推理的精度: +``` import os, sys from PIL import Image import numpy as np import torch import onnx import onnxruntime - class ONNXModel(): def __init__(self, onnx_path): self.onnx_session = onnxruntime.InferenceSession(onnx_path) @@ -302,16 +257,12 @@ class ONNXModel(): input_feed = self.get_input_feed(self.input_name, image_numpy) segmap = self.onnx_session.run(self.output_name, input_feed=input_feed) return segmap - def to_numpy(tensor): return tensor.detach().cpu().numpy() if tensor.requires_grad else tensor.cpu().numpy() - if __name__ == '__main__': - model_file = sys.argv[1]#'/home/adapt/efficientnet/efficientnet-b0.onnx' prep_file_path = sys.argv[2]#'/home/adapt/efficientnet/prep_dataset' output_path = sys.argv[3]#'/home/adapt/efficientnet/infer_onnx_res' - if not os.path.exists(output_path): os.makedirs(output_path) net = ONNXModel(model_file) @@ -327,46 +278,115 @@ if __name__ == '__main__': ``` 实际的解决方法是使用onnxsim去除自定义pad并优化网络: ``` -pip3 install onnx-simplifier +pip3.7 install onnx-simplifier python3.7 -m onnxsim --input-shape="1,3,224,224" efficientnet-b0.onnx efficientnet-b0_sim.onnx ``` - - 转为om模型 - 
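除了用onnxruntime在数据集上评测onnx精度,调试精度时也常用余弦相似度逐一比较两个模型(如pth在线推理与onnx推理)的对应输出是否一致,下面是一个纯python示意(函数名与0.99阈值均为本文假设,仅作演示):

```python
# 示意脚本:用余弦相似度比较两个模型输出向量是否基本一致
import math

def cosine_similarity(a, b):
    """计算两个等长一维向量的余弦相似度"""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

if __name__ == "__main__":
    out_pth = [0.12, 0.88, -0.30]   # 假设的pth模型某输出
    out_onnx = [0.12, 0.88, -0.30]  # 假设的onnx模型对应输出
    sim = cosine_similarity(out_pth, out_onnx)
    # 经验上相似度大于0.99可初步认为两个输出一致(阈值为假设)
    print(sim > 0.99)
```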
-设置环境变量,使用atc工具将onnx模型转换为om模型,命令参考: + - 这里也总结一些其它模型遇到的问题。导出onnx是需要可以在cpu上执行的脚本,因此可能需要将权重map到cpu。如果gpu训练时使用了DataParallel,map到cpu上时模型结构需要去掉DataParallel,同时删除权重节点名前缀module.: ``` +def proc_nodes_module(checkpoint): + new_state_dict = OrderedDict() + for k, v in checkpoint.items(): + if "module." in k: + name = k.replace("module.", "") + else: + name = k + new_state_dict[name] = v + return new_state_dict +net = DnCNN(channels=1, num_of_layers=opt.num_of_layers) +model = net #model = nn.DataParallel(net, device_ids=device_ids).cuda() +checkpoint = torch.load(os.path.join('./', 'net.pth'), map_location='cpu') +checkpoint = proc_nodes_module(checkpoint) +model.load_state_dict(checkpoint) +``` +而对于910训练出的权重文件,删除前缀module.可能需要修改如下: +``` +def proc_nodes_module(checkpoint, AttrName): +... + for k, v in checkpoint[AttrName].items(): +... +checkpoint['state_dict'] = proc_nodes_module(checkpoint, 'state_dict') +... +model.load_state_dict(checkpoint['state_dict']) +``` + - 可视化工具netron可以查看模型图,[获取可视化工具netron](https://github.com/lutzroeder/netron/releases/download/v4.9.5/Netron-Setup-4.9.5.exe),om模型文件与atc工具dump的.pbtxt中间图模型文件也可以用华为修改后的netron查看,请联系华为方。使用netron可以方便的查看模型结构,权重与算子属性,比如输入节点名与其shape,输出的所有节点名与其shape + + - 有些pytorch算子onnx还不支持,根据开源社区提供的方法等价替换这些算子,如果不能完全等价替换而且npu已经支持该算子,则需要修改模型代码将该算子封装为自定义算子,然后导出包含自定义算子的onnx + + 例如,pytorch代码的adaptive_avg_pool2d目前onnx还不支持,所以导出onnx时报错,解决方案是尝试使用avg_pool2d替换adaptive_avg_pool2d,但当input的最后两维不是output的整数倍时,adaptive_avg_pool2d不能完全等价替换为avg_pool2d,而npu有adaptive_avg_pool2d算子的实现,所以解决方案变为将adaptive_avg_pool2d改为自定义算子导出onnx,自定义算子不需要具体实现代码(因此导出的onnx不能使用onnxruntime进行推理,还需要将pytorch的_check_onnx_proto(proto)改为pass去除导出onnx时进行检查),只要自定义算子返回的输出shape与原算子输出的shape保持一致即可,相当于onnx只包含这个算子的声明(数据类型与属性需要与npu版算子对应),在onnx转为om时,atc工具的onnx插件如果支持该算子,atc工具会根据这个声明找到该算子npu的实现。 + + 在CANN包安装目录的opp下搜索AdaptiveAvgPool2d,查看npu的adaptive_avg_pool2d声明: +``` +REG_OP(AdaptiveAvgPool2d) + .INPUT(x, TensorType({DT_FLOAT, DT_FLOAT16})) + .OUTPUT(y, TensorType({DT_FLOAT, DT_FLOAT16})) + 
.REQUIRED_ATTR(output_size, ListInt) + .OP_END_FACTORY_REG(AdaptiveAvgPool2d) +``` +修改模型代码,将adaptive_avg_pool2d改为自定义算子,然后导出onnx,其中output_size_i代表int64类型的算子属性: +``` +class AdaptiveAvgPoolOp(torch.autograd.Function): + @staticmethod + def forward(ctx, x, output_size): + out = torch.randn(x.shape[0], x.shape[1], output_size[0], output_size[1]).to(x.dtype) + return out + @staticmethod + def symbolic(g, x, output_size): + out = g.op('AdaptiveAvgPool2d', x, output_size_i = output_size) + return out +def adaptive_avg_pool_op(x, output_size): + out = AdaptiveAvgPoolOp.apply(x, output_size) + return out +x = F.adaptive_avg_pool2d(input, output_size=bin_size)替换为x = adaptive_avg_pool_op(input, (bin_size, bin_size)) +``` + + - 目标检测类网络nms与roi模块会将包含的许多动态shape小算子引入onnx,但是atc工具暂不支持动态shape的算子,解决方案是使用大颗粒npu的nms与roi自定义算子替换pytorch模型的nms与roi函数(这些小算子可以在转onnx时的verbose打印中找到其对应的pytorch模型代码,从而找到引入这些算子的函数)。参见本文3.5 maskrcnn端到端推理指导-mmdetection框架的maskrcnn + + - 开源detectron2目前仅支持pytorch1.8导出onnx,但是基于detectron2框架ascend 910训练的模型代码依赖华为npu版的pytorch1.5.0,无论cpu,gpu还是npu训练出的权重都是数值,只要保存权重的网络节点结构相同,就可以使用开源的detectron2加载npu训练的权重基于pytorch1.8导出onnx。参见本文3.5 maskrcnn端到端推理指导-detectron2框架npu训练的maskrcnn + + +#### 2.1.2 onnx模型转换为om模型 + - 使用Ascend atc工具将onnx转换为om +``` +CANN安装目录 export install_path=/usr/local/Ascend/ascend-toolkit/latest export PATH=/usr/local/python3.7.5/bin:${install_path}/atc/ccec_compiler/bin:${install_path}/atc/bin:$PATH export PYTHONPATH=${install_path}/atc/python/site-packages:$PYTHONPATH export LD_LIBRARY_PATH=${install_path}/atc/lib64:${install_path}/acllib/lib64:$LD_LIBRARY_PATH export ASCEND_OPP_PATH=${install_path}/opp -# export DUMP_GE_GRAPH=2 - -atc --framework=5 --model=efficientnet-b0_sim.onnx --output=efficientnet-b0_bs1 --input_format=NCHW --input_shape="image:1,3,224,224" --log=debug --soc_version=Ascend310 -``` ->![](public_sys-resources/icon-note.gif) - **说明:** -> ->为使性能达标有些模型需要开启autotune或repeat autotune -> ->开启autotune方法:添加--auto_tune_mode="RL,GA" -> ->开启repeat 
autotune方法:添加--auto_tune_mode="RL,GA"同时export REPEAT_TUNE=True
->
->如果使用aipp进行图片预处理需要添加--insert_op_conf=aipp_efficientnet-b0_pth.config
->
->算子精度通过参数--precision_mode选择,默认值force_fp16
->
-> atc工具的使用可以参考[CANN V100R020C10 开发辅助工具指南 (推理) 01](https://support.huawei.com/enterprise/zh/doc/EDOC1100164868?idPath=23710424%7C251366513%7C22892968%7C251168373)
-
-2.数据集预处理
-
-图片分类网络使用[ImageNet官网](http://www.image-net.org)的5万张验证集进行评价,图片与标签分别是dataset/ImageNet/val_union与dataset/ImageNet/val_label.txt。数据集均值方差参考EfficientNet-PyTorch的数据预处理配置代码。预处理有两种方式:不使用aipp的二进制输入需要先用脚本仿照github官网训练预处理方法处理数据,以获得最佳精度;使用aipp的jpg输入可以直接读取原图,需要使用昇腾开发的DVPP模块和AIPP模块,因为解码、缩放等处理和官网训练预处理有一定区别,最终精度可能会下降0.7%左右。
-
-1)不使用aipp进行预处理
-参考EfficientNet-PyTorch中的预处理代码,通过缩放、中心裁剪,totensor、均值方差归一化,输出为二进制文件:
-```python
-def preprocess(src_path, save_path):
+export ASCEND_AICPU_PATH=/usr/local/Ascend/ascend-toolkit/latest
+将atc日志打印到屏幕
+#export ASCEND_SLOG_PRINT_TO_STDOUT=1
+设置日志级别
+#export ASCEND_GLOBAL_LOG_LEVEL=0 #debug 0 --> info 1 --> warning 2 --> error 3
+开启ge dump图
+#export DUMP_GE_GRAPH=2
+参考命令
+atc --framework=5 --model=efficientnet-b0_sim.onnx --output=efficientnet-b0_bs16 --input_format=NCHW --input_shape="image:16,3,224,224" --log=debug --soc_version=Ascend310
+```
+>![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/public_sys-resources/icon-note.gif)
+**说明:**
+> 1.--input_shape是模型输入节点的shape,可使用netron查看onnx输入节点名与shape,batch维值为16,即会生成batch size为16的om模型。无论onnx模型的batch是多少,只要通过--input_shape指定batch为正整数,就得到对应batch size的om模型,om模型虽然支持动态batch,但是我们不使用动态batch的om模型
+> 2.--out_nodes选项可以指定模型的输出节点,形如--out_nodes="节点1名称:0;节点2名称:0;节点3名称:0"就指定了这三个节点每个节点的第1个输出作为模型的第一,第二,第三个输出
+> 3.算子精度通过参数--precision_mode选择,默认值force_fp16
+> 4.开启autotune方法:添加--auto_tune_mode="RL,GA"
+> 5.开启repeat autotune方法:添加--auto_tune_mode="RL,GA"同时export REPEAT_TUNE=True
+> 6.配置环境变量ASCEND_SLOG_PRINT_TO_STDOUT和ASCEND_GLOBAL_LOG_LEVEL,然后执行命令atc ... 
> atc.log可以输出日志到文件 +> 7.配置环境变量DUMP_GE_GRAPH后执行atc命令时会dump中间过程生成的模型图,使用华为修改的netron可以可视化这些.pbtxt模型文件,如需要请联系华为方,当atc转换失败时可以查看ge生成的中间过程图的模型结构与算子属性,分析出哪个算子引起的问题 +> 8.如果使用aipp进行图片预处理需要添加--insert_op_conf=aipp_efficientnet-b0_pth.config +> 9.atc工具的使用可以参考[CANN 5.0.1 开发辅助工具指南 (推理) 01](https://support.huawei.com/enterprise/zh/doc/EDOC1100191944?idPath=23710424%7C251366513%7C22892968%7C251168373) +> 10.若模型包含atc不支持的算子,算子问题可以规避的先通过修改模型进行规避,并在modelzoo上提issue或联系华为方 + +#### 2.1.3 数据验证集预处理 +- 参考模型代码仓在验证集上评测精度的推理脚本里的数据预处理代码进行预处理脚本的编写 + + - 请确保预处理脚本的数据集预处理方法与代码仓评测精度的脚本采用的预处理方法保持一致,通常包括减均值除方差,缩放加pad、中心裁剪,除以255,nhwc转换为nchw,rgb转换为bgr等 + + ImageNet官网的5万张验证集图片与标签分别是datasets/ImageNet/val_union与datasets/ImageNet/val_label.txt。预处理有两种方式:不使用aipp的二进制输入即需要编写预处理脚本处理数据集,以获得最佳精度;使用aipp的jpg输入可以直接读取原图即硬件进行预处理,需要使用昇腾开发的DVPP模块和AIPP模块,因为解码、缩放等处理和官网训练预处理有一定区别,最终精度可能会下降0.7%左右。因此本文推荐使用不使用aipp进行预处理,这里给出的使用aipp进行预处理的方法用作学习 + + 1.不使用aipp进行预处理,参考EfficientNet-PyTorch中的预处理代码,通过缩放、中心裁剪,totensor、均值方差归一化,输出为二进制文件: + ``` + def preprocess(src_path, save_path): preprocess = transforms.Compose([ transforms.Resize(256, Image.BICUBIC), @@ -384,21 +404,20 @@ def preprocess(src_path, save_path): input_tensor = preprocess(input_image) img = np.array(input_tensor).astype(np.float32) img.tofile(os.path.join(save_path, file.split('.')[0] + ".bin")) -``` -``` -python3 imagenet_torch_preprocess.py dataset/ImageNet/val_union ./prep_dataset -python3 get_info.py bin ./prep_dataset ./efficientnet_prep_bin.info 224 224 -``` -查看efficientnet_prep_bin.info: + ``` + ``` + python3.7 imagenet_torch_preprocess.py datasets/ImageNet/val_union ./prep_dataset + python3.7 get_info.py bin ./prep_dataset ./efficientnet_prep_bin.info 224 224 + ``` +预处理后的数据集信息文件efficientnet_prep_bin.info: ``` 0 ./prep_dataset/ILSVRC2012_val_00005654.bin 224 224 1 ./prep_dataset/ILSVRC2012_val_00033427.bin 224 224 2 ./prep_dataset/ILSVRC2012_val_00004213.bin 224 224 ... 
``` -2)使用aipp进行预处理 -通过DVPP实现解码、缩放功能,输出YUV数据,再通过AIPP进行色域转换及裁剪,最终直接输入网络中进行推理,方便快捷,benchmark工具已集成DVPP功能,只需添加命令行参数-useDvpp=true即开启DVPP。 -AIPP功能的开启需要在atc工具转换的过程中通过选项--insert_op_conf=aipp_efficientnet-b0_pth.config添加配置文件,即可与DVPP处理后的数据无缝对接。aipp_efficientnet-b0_pth.config: +第一列为样本序号,第二列为预处理后的样本路径,第三四列为预处理后样本的宽高 +2.使用aipp进行预处理,通过DVPP实现解码、缩放功能,输出YUV数据,再通过AIPP进行色域转换及裁剪,最终直接输入网络中进行推理,方便快捷,benchmark工具已集成DVPP功能,只需添加命令行参数-useDvpp=true即开启DVPP。 AIPP功能的开启需要在atc工具转换的过程中通过选项--insert_op_conf=aipp_efficientnet-b0_pth.config添加配置文件,即可与DVPP处理后的数据无缝对接。aipp_efficientnet-b0_pth.config: ``` aipp_op{ aipp_mode:static @@ -439,117 +458,226 @@ aipp_op{ } ``` ``` -python3 get_info.py jpg dataset/ImageNet/val_union ImageNet.info +python3.7 get_info.py jpg datasets/ImageNet/val_union ImageNet.info ``` 查看ImageNet.info: ``` -0 dataset/ImageNet/val_union/ILSVRC2012_val_00005654.jpeg 500 334 -1 dataset/ImageNet/val_union/ILSVRC2012_val_00033427.jpeg 500 334 -2 dataset/ImageNet/val_union/ILSVRC2012_val_00004213.jpeg 116 87 +0 datasets/ImageNet/val_union/ILSVRC2012_val_00005654.jpeg 500 334 +1 datasets/ImageNet/val_union/ILSVRC2012_val_00033427.jpeg 500 334 +2 datasets/ImageNet/val_union/ILSVRC2012_val_00004213.jpeg 116 87 ... 
``` ->![](public_sys-resources/icon-note.gif) +>![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/public_sys-resources/icon-note.gif) **说明:** > > 这里只给出示例代码,前后处理,配置与评价等脚本来源: +> 1.[gitee benchmark 脚本](https://gitee.com/ascend/cann-benchmark/tree/master/infer/src/scripts) +> 2.[modelzoo案例](https://www.hiascend.com/zh/software/modelzoo) > -> 1.[gitee benchmark 脚本](https://gitee.com/ascend/cann-benchmark/tree/master/infer/src/scripts) -> -> 2.前面已经给出的modelzone示例网址链接 -> -> 3.前面已经给出了Ascend文档与软件网址链接 -> -> 4.参考EfficientNet-PyTorch的数据预处理代码 -> -> aipp配置参考前面提到的《CANN V100R020C10 开发辅助工具指南 (推理) 01》 - -3.离线推理 -benchmark工具为华为自研的模型推理工具,支持多种模型的离线推理,能够迅速统计出模型在Ascend310上的性能,支持真实数据和纯推理两种模式,配合后处理脚本,可以实现诸多模型的端到端过程,获取工具及使用方法可以参考[CANN V100R020C10 推理benchmark工具用户指南 01](https://support.huawei.com/enterprise/zh/doc/EDOC1100164874?idPath=23710424%7C251366513%7C22892968%7C251168373) + - 此外对于一些其它模型,pytorch模型支持动态hw的输入,但是onnx模型输入shape的hw维是固定的,因此图片预处理的等比例缩放加pad不会与代码仓的完全一致,但是处理的合理的话对精度影响仅在0.5%之内。如果代码仓导出onnx的脚本或评测精度的脚本推荐了hw,就使用该hw。否则为了对于胖矮型与瘦高型图片不敏感,让预处理后的h与w相同,在评测脚本model推理数据集前添加打印出输入shape的hw维,根据结果为hw选择一个合适的值,或者根据代码仓推理预处理限定的最长边最短边与pad为hw选择一个合适的值,然后将图片按最长边比例将图片等比例缩放到该值,最短边两边补齐pad达到该值。 +以mmdetection框架的maskrcnn预处理为例,由于pytorch模型支持动态hw的输入,参考代码仓转onnx的脚本需要将onnx模型输入h维固定为800,w维固定为1216,通过等比例缩放加pad固定模型预处理后的输入样本的hw维为800,1216: +``` +def resize(img, size): + org_h = img.shape[0] + org_w = img.shape[1] + scale_ratio = min(size[0] / org_w, size[1] / org_h) + new_w = int(np.floor(org_w * scale_ratio)) + new_h = int(np.floor(org_h * scale_ratio)) + resized_img = mmcv.imresize(img, (new_w, new_h), backend='cv2') + return resized_img +def gen_input_bin(file_batches, batch): + for file in file_batches[batch]: + image = mmcv.imread(os.path.join(flags.image_src_path, file), backend='cv2') + image = resize(image, (flags.model_input_width, flags.model_input_height)) + mean = np.array([123.675, 116.28, 103.53], dtype=np.float32) + std = np.array([58.395, 57.12, 57.375], dtype=np.float32) + 
image = mmcv.imnormalize(image, mean, std) + rh = image.shape[0] + rw = image.shape[1] + pad_left = (flags.model_input_width - rw) // 2 + pad_top = (flags.model_input_height - rh) // 2 + pad_right = flags.model_input_width - pad_left - rw + pad_bottom = flags.model_input_height - pad_top - rh + image = mmcv.impad(image, padding=(pad_left, pad_top, pad_right, pad_bottom), pad_val=0) + image = image.transpose(2, 0, 1) + image.tofile(os.path.join(flags.bin_file_path, file.split('.')[0] + ".bin")) +``` +相应的后处理: +``` +def postprocess_bboxes(bboxes, image_size, net_input_width, net_input_height): + org_w = image_size[0] + org_h = image_size[1] + scale = min(net_input_width / org_w, net_input_height / org_h) + pad_w = net_input_width - org_w * scale + pad_h = net_input_height - org_h * scale + pad_left = pad_w // 2 + pad_top = pad_h // 2 + bboxes[:, 0] = (bboxes[:, 0] - pad_left) / scale + bboxes[:, 1] = (bboxes[:, 1] - pad_top) / scale + bboxes[:, 2] = (bboxes[:, 2] - pad_left) / scale + bboxes[:, 3] = (bboxes[:, 3] - pad_top) / scale + return bboxes +``` +基于detectron2框架的预处理参见3.5 maskrcnn端到端推理指导 + +#### 2.1.4 离线推理 +benchmark工具为华为自研的模型推理工具,支持多种模型的离线推理,能够迅速统计出模型在Ascend310上的性能,支持真实数据和纯推理两种模式,配合后处理脚本,可以实现诸多模型的端到端过程,获取工具及使用方法可以参考[CANN 5.0.1 推理benchmark工具用户指南 01](https://support.huawei.com/enterprise/zh/doc/EDOC1100191895?idPath=23710424%7C251366513%7C22892968%7C251168373) - 二进制输入 ``` -./benchmark -model_type=vision -device_id=0 -batch_size=1 -om_path=efficientnet-b0_bs1.om -input_text_path=./efficientnet_prep_bin.info -input_width=224 -input_height=224 -output_binary=False -useDvpp=False +./benchmark.x86_64 -model_type=vision -device_id=0 -batch_size=1 -om_path=efficientnet-b0_bs1.om -input_text_path=./efficientnet_prep_bin.info -input_width=224 -input_height=224 -output_binary=False -useDvpp=False ``` +>![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/public_sys-resources/icon-note.gif) + **说明:** +> +> 
-model_type为benchmark支持的模型类型,目前支持的有vision,nmt,widedeep,nlp,yolocaffe,bert,deepfm
+> -device_id是指运行在ascend 310的哪个device上,每张ascend 310卡有4个device
+> -batch_size是指om模型的batch大小,该值应与om模型的batch大小相同,否则报输入大小不一致的错误
+> -om_path是om模型文件路径
+> -input_text_path为包含数据集每个样本的路径与其相关信息的数据集信息文件路径
+> -input_height为输入高度
+> -input_width为输入宽度
+> -output_binary指定以预处理后的数据集为输入时,benchmark工具推理om模型的输出数据保存为二进制还是txt;注意对于输出是int64类型的节点,指定输出为txt时会将float类型的小数转换为0而出错
+> -useDvpp为是否使用aipp进行数据集预处理
+
+	输出结果默认保存在当前目录result/dumpOutput_device{device_id},性能数据默认保存在result/perf_{vision}_batchsize_{16}_device_{0}.txt。模型只有一个名为class的输出,shape为bs * 1000,数据类型为FP32,对应1000个分类的预测结果,每个输入的输出对应一个{input}_1.bin文件。此外,如果模型有三个输出,则三个输出分别对应{input}_1.bin,{input}_2.bin,{input}_3.bin。
+
 - jpg输入
```
-./benchmark -model_type=vision -device_id=0 -batch_size=1 -om_path=efficientnet-b0_bs1.om -input_text_path=ImageNet.info -input_width=256 -input_height=256 -output_binary=False -useDvpp=true
+./benchmark.x86_64 -model_type=vision -device_id=0 -batch_size=1 -om_path=efficientnet-b0_bs1.om -input_text_path=ImageNet.info -input_width=256 -input_height=256 -output_binary=False -useDvpp=true
```
ImageNet.info为图片信息,注意这里的“input_height”和“input_width”与AIPP节点输入一致,值为256是因为AIPP中做了裁剪,参数-useDvpp=true。
-输出结果默认保存在当前目录result/dumpOutput_device0,模型只有一个名为class的输出,shape为bs * 1000,数据类型为FP32,对应1000个分类的预测结果,每个输入对应的输出对应一个_x.bin文件。
->![](public_sys-resources/icon-note.gif)
- **说明:**
->
-> 若benchmark执行失败可以通过查看系统输出日志初步定位原因,参考[CANN V100R020C10 日志参考 (推理) 01](https://support.huawei.com/enterprise/zh/doc/EDOC1100164869?idPath=23710424%7C251366513%7C22892968%7C251168373)
->
->benchmark二次开发,参考[CANN V100R020C10 应用软件开发指南 (C&C++) 01](https://support.huawei.com/enterprise/zh/doc/EDOC1100164875?idPath=23710424%7C251366513%7C22892968%7C251168373)
-4.后处理与精度统计
-
-调用vision_metric_ImageNet.py脚本与label比对,可以获得Accuracy Top5数据,结果保存在result.json中。
+#### 2.1.5 精度统计
+- 参考模型代码仓评测精度的脚本,编写后处理统计精度的脚本,然后评测om模型的精度:
```
-python3 vision_metric_ImageNet.py result/dumpOutput_device0/ 
dataset/ImageNet/val_label.txt ./ result.json
+python3.7 imagenet_acc_eval.py result/dumpOutput_device0/ datasets/imagenet/val_label.txt ./ result.json
```
-查看输出结果:
+将om输出与数据集标签对比,统计出精度:
```
-{"title": "Overall statistical evaluation", "value": [{"key": "Number of images", "value": "50000"}, {"key": "Number of classes", "value": "1000"}, {"key": "Top1 accuracy", "value": "76.76%"}, {"key": "Top2 accuracy", "value": "86.55%"}, {"key": "Top3 accuracy", "value": "90.06%"}, {"key": "Top4 accuracy", "value": "92.02%"}, {"key": "Top5 accuracy", "value": "93.2%"}]
+{"key": "Top1 accuracy", "value": "76.76%"} {"key": "Top5 accuracy", "value": "93.2%"}
```
-将得到的om离线模型推理TopN结果与github该模型代码仓上公布的结果对比,精度下降在1%范围内可认为精度达标
+对bs1与bs16的om模型进行精度评测,与pth权重文件的精度相比,下降不超过1%,故精度达标
+
+#### 2.1.6 性能对比
+- npu性能数据
 
-5.性能测试
+
+ benchmark工具在整个数据集上推理时也会统计性能数据,但是推理整个数据集较慢,如果这么测性能,那么整个推理期间需要确保独占device。为快速获取性能数据,也可以使用benchmark纯推理功能测得性能数据,但是由于随机数不能模拟数据分布,纯推理功能测的有些模型性能数据可能不太准。这里给出两种方式:benchmark纯推理功能测性能仅为快速获取大概的性能数据以便调试优化使用,模型的性能以使用benchmark工具在整个数据集上推理得到的bs1与bs16性能数据为准。
+
+ 1.benchmark工具在整个数据集上推理获得性能数据
+
+ batch1的性能,benchmark工具在整个数据集上推理后生成result/perf_vision_batchsize_1_device_0.txt:
+```
+[e2e] throughputRate: 243.034, latency: 205733
+[data read] throughputRate: 258.963, moduleLatency: 3.86155
+[preprocess] throughputRate: 258.404, moduleLatency: 3.86991
+[infer] throughputRate: 244.435, Interface throughputRate: 382.328, moduleLatency: 3.35758
+[post] throughputRate: 244.435, moduleLatency: 4.09107
+```
+Interface throughputRate: 382.328即单device吞吐率,bs1 310单卡吞吐率:382.328x4=1529.312fps/card
+
+ batch16的性能,benchmark工具在整个数据集上推理后生成result/perf_vision_batchsize_16_device_1.txt:
+```
+[e2e] throughputRate: 173.173, latency: 288729
+[data read] throughputRate: 174.62, moduleLatency: 5.72673
+[preprocess] throughputRate: 174.357, moduleLatency: 5.73535
+[infer] throughputRate: 173.844, Interface throughputRate: 519.634, moduleLatency: 3.36724
+[post] throughputRate: 10.865, moduleLatency: 
92.0383 +``` +bs16 310单卡吞吐率:519.634x4=2078.536fps/card +2.benchmark纯推理功能测得性能数据 - - npu + batch1性能: 测试npu性能要确保device空闲,使用npu-smi info命令可查看device是否在运行其它推理任务 ``` -./benchmark -round=50 -om_path=efficientnet-b0_bs1.om -device_id=0 -batch_size=1 +./benchmark.x86_64 -round=20 -om_path=efficientnet-b0_bs1.om -device_id=0 -batch_size=1 ``` -执行50次纯推理取均值,统计吞吐率与其倒数时延(benchmark的时延是单个数据的推理时间),npu性能是一个device执行的结果 +执行20次纯推理取均值,统计吞吐率与其倒数时延(benchmark的时延是单个数据的推理时间),npu性能是一个device执行的结果 ``` - [INFO] Dataset number: 49 finished cost 2.808ms - [INFO] PureInfer result saved in ./result/PureInfer_perf_of_efficientnet-b0_bs1_in_device_0.txt - -----------------PureInfer Performance Summary------------------ - [INFO] ave_throughputRate: 354.724samples/s, ave_latency: 2.82228ms +[INFO] Dataset number: 19 finished cost 2.635ms +[INFO] PureInfer result saved in ./result/PureInfer_perf_of_efficientnet-b0_bs1_in_device_0.txt +-----------------PureInfer Performance Summary------------------ +[INFO] ave_throughputRate: 374.313samples/s, ave_latency: 2.67914ms ``` - - gpu -安装开源TensorRT +bs1 310单卡吞吐率:374.313x4=1497.252fps/card + + batch16性能: ``` -cd /usr/local/TensorRT-7.2.1.6/targets/x86_64-linux-gnu/bin/ -./trtexec --onnx=efficientnet-b0_sim.onnx --fp16 --shapes=image:1x3x224x224 --threads +./benchmark.x86_64 -round=20 -om_path=efficientnet-b0_bs16.om -device_id=0 -batch_size=16 ``` -gpu T4是4个device并行执行的结果,mean是时延(tensorrt的时延是batch个数据的推理时间),即吞吐率的倒数乘以batch ``` -[01/21/2021-19:51:51] [I] GPU Compute -[01/21/2021-19:51:51] [I] min: 0.888855 ms -[01/21/2021-19:51:51] [I] max: 3.28113 ms -[01/21/2021-19:51:51] [I] mean: 0.93663 ms -[01/21/2021-19:51:51] [I] median: 0.929688 ms -[01/21/2021-19:51:51] [I] percentile: 1.5061 ms at 99% -[01/21/2021-19:51:51] [I] total compute time: 2.86515 s +[INFO] Dataset number: 19 finished cost 30.514ms +[INFO] PureInfer result saved in ./result/PureInfer_perf_of_efficientnet-b0_bs16_in_device_0.txt +-----------------PureInfer Performance Summary------------------ 
+[INFO] ave_throughputRate: 524.094samples/s, ave_latency: 1.9101ms ``` - - npu的吞吐率乘4与gpu的吞吐率比较,npu性能高于gpu性能则认为性能达标 +bs16 310单卡吞吐率:524.094x4=2096.376fps/card +- gpu性能数据 -6.profiling性能分析 + 在装有T4卡的服务器上使用TensorRT测试gpu性能 - - CANN C10 版本profiling使用方法 + batch1性能: ``` -以root用户运行ada:kill -9 $(pidof ada) && /usr/local/Ascend/driver/tools/ada -... -新建/home/HwHiAiUser/test/run文件: -#! /bin/bash -export install_path=/usr/local/Ascend/ascend-toolkit/latest -export PATH=/usr/local/python3.7.5/bin:${install_path}/atc/ccec_compiler/bin:${install_path}/atc/bin:$PATH -export PYTHONPATH=${install_path}/atc/python/site-packages:$PYTHONPATH -export LD_LIBRARY_PATH=${install_path}/atc/lib64:${install_path}/acllib/lib64:$LD_LIBRARY_PATH -export ASCEND_OPP_PATH=${install_path}/opp -./benchmark -round=50 -om_path=/home/HwHiAiUser/test/efficientnet-b0_bs1.om -device_id=0 -batch_size=1 -... -chmod 777 /home/HwHiAiUser/test/run -cd /usr/local/Ascend/ascend-toolkit/latest/toolkit/tools/profiler/profiler_tool/analysis/command/ -python3.7.5 hiprof.pyc --ip_address=本机ip --result_dir=/root/out --profiling_options=task_trace --app_dir=/home/HwHiAiUser/test/ --app="run" +trtexec --onnx=efficientnet-b0-sim.onnx --fp16 --shapes=image:1x3x224x224 --threads +``` +gpu T4是4个device并行执行的结果,mean是时延(tensorrt的时延是batch个数据的推理时间),即吞吐率的倒数乘以batch +``` +[03/24/2021-03:54:47] [I] GPU Compute +[03/24/2021-03:54:47] [I] min: 1.26575 ms +[03/24/2021-03:54:47] [I] max: 4.41528 ms +[03/24/2021-03:54:47] [I] mean: 1.31054 ms +[03/24/2021-03:54:47] [I] median: 1.30151 ms +[03/24/2021-03:54:47] [I] percentile: 1.40723 ms at 99% +[03/24/2021-03:54:47] [I] total compute time: 2.9972 s ``` +batch1 t4单卡吞吐率:1000/(1.31054/1)=763.044fps - - CANN C20 版本profiling使用方法 + batch16性能: ``` -新建/home/HwHiAiUser/test/run文件: +trtexec --onnx=efficientnet-b0_sim.onnx --fp16 --shapes=image:16x3x224x224 --threads +``` +``` +[03/24/2021-03:57:22] [I] GPU Compute +[03/24/2021-03:57:22] [I] min: 12.5645 ms +[03/24/2021-03:57:22] [I] max: 14.8437 ms 
+[03/24/2021-03:57:22] [I] mean: 12.9561 ms
+[03/24/2021-03:57:22] [I] median: 12.8541 ms
+[03/24/2021-03:57:22] [I] percentile: 14.8377 ms at 99%
+[03/24/2021-03:57:22] [I] total compute time: 3.03173 s
+```
+batch16 t4单卡吞吐率:1000/(12.9561/16)=1234.940fps
+
+- 性能对比
+bs1: 310/t4=1529.312/763.044=2.00倍
+bs16: 310/t4=2078.536/1234.940=1.68倍
+性能达标
+
+### 2.2 模型转换指导
+参见2.1.1 导出onnx模型
+参见3.5 maskrcnn端到端推理指导
+### 2.3 精度调试指导
+
+精度调试:根据推理流程逐步排除引起精度下降的点,由大到小排查,用替换法对比输入输出,排查某个修改过的算子、自定义算子或关键算子的影响,并对影响精度的算子进行改进,可能是算子代码实现问题,也可能是算子推理设置的参数没有与训练时用的保持一致。一般github代码仓提供了测评单个样本与整个验证数据集的命令,以此为基准调试精度。
+
+1.前后处理与模型参数是否与开源代码仓的推理使用的完全一致
+2.使用开源代码仓提供的测评pth的脚本测试pth在线推理精度是否达标,可以添加算子输出结果的调试打印
+3.如果导出的onnx可以推理,确定onnx精度是否达标
+4.如果是om算子导致精度下降,则模型转换时指定算子为om的输出节点,然后与pth在线推理时该算子(开启verbose导出onnx时会打印算子对应的py文件代码行)的输出对比,查看是否一致
+5.如果某算子导致精度下降问题,尝试是否可以修改模型使用其它方法替换掉该算子,然后看精度是否达标,如果遇到实在规避不了的算子问题则需要在modelzoo提issue
+
+pytorch模型在线推理支持模型输入的hw维是变化的,而onnx模型仅支持输入固定的hw维,即每个预处理后输入样本的hw都是一样的。为了验证预处理等比例缩放加pad固定样本的hw维对精度的影响,可以修改代码仓精度评测脚本,加载预处理后的样本,然后替换掉model的输入再评测精度,查看精度是否下降。同样,为了验证om模型输出的结果,可以修改代码仓精度评测脚本,加载om模型输出的结果,然后替换掉model的输出再评测精度,查看精度是否下降。
+
+精度对比工具:
+https://gitee.com/ascend/tools/tree/master/msquickcmp
+
+精度调试可以参考3.5 maskrcnn端到端推理指导
+
+### 2.4 性能优化指导
+- 性能分析工具profiling
+ - CANN C20及以后的版本profiling的使用方法:
+```
+新建/home/HwHiAiUser/test/run文件,内容如下:
#! /bin/bash
export install_path=/usr/local/Ascend/ascend-toolkit/latest
export PATH=/usr/local/python3.7.5/bin:${install_path}/atc/ccec_compiler/bin:${install_path}/atc/bin:$PATH
@@ -557,36 +685,62 @@ export PYTHONPATH=${install_path}/atc/python/site-packages:$PYTHONPATH
export LD_LIBRARY_PATH=${install_path}/atc/lib64:${install_path}/acllib/lib64:$LD_LIBRARY_PATH
export ASCEND_OPP_PATH=${install_path}/opp
./benchmark -round=50 -om_path=/home/HwHiAiUser/test/efficientnet-b0_bs1.om -device_id=0 -batch_size=1
-...
+然后执行如下命令: chmod 777 /home/HwHiAiUser/test/run cd /usr/local/Ascend/ascend-toolkit/latest/x86_64-linux/toolkit/tools/profiler/bin ./msprof --output=/home/HwHiAiUser/test --application=/home/HwHiAiUser/test/run --sys-hardware-mem=on --sys-cpu-profiling=on --sys-profiling=on --sys-pid-profiling=on --sys-io-profiling=on --dvpp-profiling=on cd /usr/local/Ascend/ascend-toolkit/latest/x86_64-linux/toolkit/tools/profiler/profiler_tool/analysis/msprof/ -python3.7 msprof.pyc import -dir /home/HwHiAiUser/test/生成的profiling目录 -python3.7 msprof.pyc export summary -dir /home/HwHiAiUser/test/生成的profiling目录 +python3.7 msprof.py import -dir /home/HwHiAiUser/test/生成的profiling目录 +python3.7 msprof.py export summary -dir /home/HwHiAiUser/test/生成的profiling目录 +python3.7 msprof.py export timeline -dir /home/HwHiAiUser/test/生成的profiling目录 --iteration-id 1 +在chrome的地址栏输入chrome://tracing/加载打点数据查看打点图 ``` ->![](public_sys-resources/icon-note.gif) - **说明:** -> ->查看aicore算子运行时间整体统计,对影响性能可以融合的算子进行融合,参考[CANN V100R020C10 图融合和UB融合规则参考 (推理) 01](https://support.huawei.com/enterprise/zh/doc/EDOC1100164873?idPath=23710424%7C251366513%7C22892968%7C251168373)等 -> ->C20版本的CANN包对profiling的使用有较大变更,如果导出csv文件失败可能需要使用C20版本的benhmark -> -> profiling使用与分析参考《CANN V100R020C10 开发辅助工具指南 (推理) 01》 +其中op_statistic_0_1.csv文件统计了模型中每类算子总体耗时与百分比,op_summary_0_1.csv中包含了模型每个算子的耗时 +profiling工具使用详情请参考[CANN 5.0.1 开发辅助工具指南 (推理) 01](https://support.huawei.com/enterprise/zh/doc/EDOC1100191944?idPath=23710424%7C251366513%7C22892968%7C251168373) + +- 实例 +Inception-V3性能不达标,使用profiling工具分析,可以从输出的csv文件看到算子统计结果 +``` +Model Name OP Type Core Type Count Total Time(us) Min Time(us) Avg Time(us) Max Time(us) Ratio(%) +inception_v3_bs16 TransData AI Core 22 399586.005 20.883 18163 105754.996 46.091391 +inception_v3_bs16 PadV3D AI Core 9 377928.343 14787.287 41992.038 102073.381 43.593226 +inception_v3_bs16 Conv2D AI Core 94 54804.676 201.195 583.028 3338.536 6.321602 +inception_v3_bs16 Pooling AI Core 13 27901.298 411.091 2146.253 4397.026 
3.218355 +inception_v3_bs16 Mul AI Core 3 1964.518 572.027 654.839 714.319 0.226603 +inception_v3_bs16 ConcatD AI Core 3 1628.841 253.224 542.947 1111.872 0.187883 +inception_v3_bs16 GatherV2D AI Core 3 1284.729 335.778 428.243 594.11 0.148191 +inception_v3_bs16 Cast AI Core 2 1237.338 20.258 618.669 1217.08 0.142724 +inception_v3_bs16 AvgPool AI Core 1 460.62 460.62 460.62 460.62 0.053132 +inception_v3_bs16 MatMulV2 AI Core 1 126.037 126.037 126.037 126.037 0.014538 +inception_v3_bs16 Flatten AI Core 1 20.415 20.415 20.415 20.415 0.002355 +``` +profiling也会统计每个算子耗时,结合使用netron查看onnx模型结构图,可以看出pad和pad前后的transdata耗时很长,经过分析pad的功能可以由其后的averagepool中的pad属性完成,可以节约大量时间,于是进行padV3D和pooling算子的graph融合。从op_summary_0_1.csv中看出单个TransData算子aicore的耗时已经很短了,本模型TransData算子没有优化空间。 + + +## 3 实例总结 +- **[推理案例](#31-推理案例)** -## 4 实例总结 -- **[Inception-V4端到端推理要点](#41-Inception-V4端到端推理要点)** +- **[Inception-V4端到端推理要点](#32-Inception-V4端到端推理要点)** -- **[UNet端到端推理要点](#42-UNet端到端推理要点)** +- **[UNet端到端推理要点](#33-UNet端到端推理要点)** -- **[SSD端到端推理要点](#43-SSD端到端推理要点)** +- **[SSD端到端推理要点](#34-SSD端到端推理要点)** -- **[Inception-V3端到端推理要点](#44-Inception-V3端到端推理要点)** +- **[maskrcnn端到端推理指导](#35-maskrcnn端到端推理指导)** -- **[maskrcnn端到端推理指导](#45-maskrcnn端到端推理指导)** +### 3.1 推理案例 -这里对经典类型的端到端推理网络遇到的问题做了一些总结,读者可以按照上面的流程尝试一下,从中体会代码是如何写出来的以及问题是如何解决的 -### 4.1 Inception-V4端到端推理要点 +当前已完成端到端推理模型放在[ModelZoo](https://ascend.huawei.com/zh/#/software/modelzoo),包含模型端到端推理说明,代码与操作完整流程,下面的实例仅给出用于说明问题的代码片段,该页面过滤条件框中搜索atc可以看到这些模型 + +一些典型模型的链接如下 +1. [ResNeXt-50](https://ascend.huawei.com/zh/#/software/modelzoo/detail/1/2ca8ac26aeac461c85e7b04f17aa201a) +2. [Inception-V3](https://ascend.huawei.com/zh/#/software/modelzoo/detail/1/132f32e409b44aac8951f58ca073b780) +3. [Inception-V4](https://ascend.huawei.com/zh/#/software/modelzoo/detail/1/75eb32c2a2d94c4db743983504f83a06) +4. [EfficientNet-b0](https://ascend.huawei.com/zh/#/software/modelzoo/detail/1/75026a6edf604ec0bc5d16d220328646) +5. 
[YoloV3](https://ascend.huawei.com/zh/#/software/modelzoo/detail/1/36ea401e0d844f549da2693c6289ad89) +... + +### 3.2 Inception-V4端到端推理要点 1.github开源代码网址 google搜索或github上找到引用最多且尽量包含预训练的[PyTorch Inception-V4 模型代码仓](https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/inceptionv4.py),预训练模型[pth文件的下载地址](http://data.lip6.fr/cadene/pretrainedmodels/inceptionv4-8e4777a0.pth)可以在该代码仓里找到 @@ -686,7 +840,7 @@ aipp_op{ ... } ``` -### 4.2 UNet端到端推理要点 +### 3.3 UNet端到端推理要点 1.github上Pytorch-UNet没有安装脚本,在pth2onnx脚本中引用代码仓定义的UNet: ``` git clone https://github.com/milesial/Pytorch-UNet @@ -924,7 +1078,7 @@ res.save(os.path.join('.', file.replace('_1.bin', '.png'))) python3 Pytorch_UNet/predict.py -i train/fff9b3a5373f_16.jpg -o output.jpg --model=unet_carvana_scale1_epoch5.pth --scale=1 ``` -### 4.3 SSD端到端推理要点 +### 3.4 SSD端到端推理要点 1.github上pytorch-ssd没有安装脚本,在pth2onnx脚本中引用代码仓定义的create_vgg_ssd,并转换为onnx: ```python git clone https://github.com/qfgaohao/pytorch-ssd @@ -1072,118 +1226,185 @@ python3 ssd_pth_postprocess.py /root/dataset/VOCdevkit/VOC2007/ ./voc-model-labe ``` 第一个参数为voc2007数据集目录,第二个为包含背景的21个类别,在代码仓里可以找到下载地址,第三个为benchmark推理结果,第四个为评价推理结果 -### 4.4 Inception-V3端到端推理要点 -1.Inception-V3性能不达标,使用profiling工具分析,可以从输出的csv文件看到算子统计结果 -``` -Model Name OP Type Core Type Count Total Time(us) Min Time(us) Avg Time(us) Max Time(us) Ratio(%) -inception_v3_bs16 TransData AI Core 22 399586.005 20.883 18163 105754.996 46.091391 -inception_v3_bs16 PadV3D AI Core 9 377928.343 14787.287 41992.038 102073.381 43.593226 -inception_v3_bs16 Conv2D AI Core 94 54804.676 201.195 583.028 3338.536 6.321602 -inception_v3_bs16 Pooling AI Core 13 27901.298 411.091 2146.253 4397.026 3.218355 -inception_v3_bs16 Mul AI Core 3 1964.518 572.027 654.839 714.319 0.226603 -inception_v3_bs16 ConcatD AI Core 3 1628.841 253.224 542.947 1111.872 0.187883 -inception_v3_bs16 GatherV2D AI Core 3 1284.729 335.778 428.243 594.11 0.148191 -inception_v3_bs16 Cast AI Core 2 1237.338 20.258 618.669 
1217.08 0.142724 -inception_v3_bs16 AvgPool AI Core 1 460.62 460.62 460.62 460.62 0.053132 -inception_v3_bs16 MatMulV2 AI Core 1 126.037 126.037 126.037 126.037 0.014538 -inception_v3_bs16 Flatten AI Core 1 20.415 20.415 20.415 20.415 0.002355 -``` - -2.算子融合 -profiling也会统计每个算子耗时,结合使用netron查看onnx模型结构图,可以看出pad和pad前后的transdata耗时很长,经过分析pad的功能可以由其后的averagepool中的pad属性完成,可以节约大量时间,于是进行padV3D和pooling算子的graph融合 - -参考前面提到的《CANN V100R020C10 图融合和UB融合规则参考 (推理) 01》 - -### 4.5 maskrcnn端到端推理指导 +### 3.5 maskrcnn端到端推理指导 [基于开源mmdetection预训练的maskrcnn_Onnx模型端到端推理指导.md](https://gitee.com/pengyeqing/ascend-pytorch-crowdintelligence-doc/blob/master/onnx%E7%AB%AF%E5%88%B0%E7%AB%AF%E6%8E%A8%E7%90%86%E6%8C%87%E5%AF%BC/benchmark/cv/segmentation/%E5%9F%BA%E4%BA%8E%E5%BC%80%E6%BA%90mmdetection%E9%A2%84%E8%AE%AD%E7%BB%83%E7%9A%84maskrcnn_Onnx%E6%A8%A1%E5%9E%8B%E7%AB%AF%E5%88%B0%E7%AB%AF%E6%8E%A8%E7%90%86%E6%8C%87%E5%AF%BC.md) [基于detectron2训练的npu权重的maskrcnn_Onnx模型端到端推理指导.md](https://gitee.com/pengyeqing/ascend-pytorch-crowdintelligence-doc/blob/master/onnx%E7%AB%AF%E5%88%B0%E7%AB%AF%E6%8E%A8%E7%90%86%E6%8C%87%E5%AF%BC/benchmark/cv/segmentation/%E5%9F%BA%E4%BA%8Edetectron2%E8%AE%AD%E7%BB%83%E7%9A%84npu%E6%9D%83%E9%87%8D%E7%9A%84maskrcnn_Onnx%E6%A8%A1%E5%9E%8B%E7%AB%AF%E5%88%B0%E7%AB%AF%E6%8E%A8%E7%90%86%E6%8C%87%E5%AF%BC.md) -## 5 深度学习指导 -### 5.1 书籍推荐 -``` -1.现代设计理论与方法-约束条件下的最优化问题与梯度下降 -2.自动控制原理-神经网络前向计算与反向传播 -3.深入理解TensorFlow-深度学习框架 -4.极客网王天一人工智能基础课-理解理论基本概念 -5.动手学深度学习-实践教材图像自然语言模型的基本概念 -6.python深度学习-Keras实战理解深度学习到的是什么 -7.深度学习-理论 -数据的处理,网络模型的结构,理解深度学习学的是什么 -``` -### 5.2 实践 -``` -重构经典的深度学习教材 -类似kaggle类型的学生开发者竞赛 -类似嵌入式开发板搭建简单易得的实验条件 -使用mindspore进行网络开发 -``` -### 5.3 参加社区开发 - - - [https://ascend.huawei.com](http://ascend.huawei.com "https://ascend.huawei.com") - - [https://gitee.com/mindspore/mindspore](https://gitee.com/mindspore/mindspore) - - [https://gitee.com/openeuler/A-Tune](https://gitee.com/openeuler/A-Tune) -## 6 附录 +## 4 附录 -- **[机器申请及使用指南](#61-机器申请及使用指南)** -- **[交付标准与规范](#62-交付标准与规范)** +- 
**[机器申请及使用指南](#41-机器申请及使用指南)** +- **[交付标准与规范](#42-交付标准与规范)** +- **[深度学习指导](#43-深度学习指导)** -### 6.1 机器申请及使用指南 ->![](public_sys-resources/icon-note.gif) +### 4.1 机器申请及使用指南 +>![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/public_sys-resources/icon-note.gif) **说明:** > **机器周均使用率过低且项目无故无进展时,华为方将有权回收算力资源,由此造成交付延期由使用者自己承担。** > **请勿随意更改密码,更改密码带来的风险由更改者承担。** > **请勿随意更新驱动等系统相关软件,有需要请及时联系华为方支持人员。** - 机器申请 - - GPU - - 由于GPU资源紧张,请提前做好资源申请,每个模型按3个工作日作为调测时间,原则上每次调测不少于2个模型,每个模型不可重复申请调试。若无法按期归还,请提前和华为方支撑者做好沟通 - - NPU - - 每个模型调测人员至少分配一张NPU用于模型调测,请向华为方申请动态调配的NPU资源 -- 磁盘使用 - - / 下是系统目录 - - /home 是可使用的数据盘目录 - - 开发主要使用磁盘 - - /opt/npu(GPU是/opt/gpu) 是共享的数据集盘目录 - - 预存数据集,仅用于copy - - **使用如下命令解压数据集到本机使用```tar xf imagenet.tar.gz -C /home/data```** - - 请勿在共享盘上运行脚本和写数据 - - 重启后请执行 ```bash /root/mount_datadisk.sh``` 挂载数据盘,**请勿在/etc/fstab设置为自动挂载** + - GPU T4 + - 装有t4的服务器,请联系华为方 + - NPU 310 + - 装有310的服务器,请联系华为方 +- t4服务器使用 + - /home/下每个普通用户创建一个自己的目录,原则上只允许用户在自己的这个目录下开发,不要修改其它目录的东西 + - 测试时请确保t4卡没有运行其它测试任务,使用nvidia-smi查看卡是否处于空闲态 + - t4上使用trtexec一条命令即可测试onnx模型性能,一般模型性能测试在半小时到半天之间可完成,测试完成后请及时退出登录 + - 不要更新CUDA驱动包与tensorRT,修改系统文件,系统密码等 + +- 310服务器使用 + - 请联系华为方申请登录服务器的普通用户名与密码 + - /home/下每个用户创建一个自己的目录,原则上只允许用户在自己的这个目录下开发,不要修改其它目录的东西 + - 不要随意更新CANN包与驱动包,修改系统文件,系统密码等 + - /opt/npu是共享的数据集盘目录,该目录仅用来存放共享的数据集,不可向该目录盘写其它数据 + - 每个模型使用到的数据集都放在/root/datasets/目录,除/root/datasets与/opt/npu外,不要在其它目录存放数据集 + - CANN包放在/home/common/packages/目录下 + - 请使用conda安装python的库,将python的库安装在自己的conda环境里,修改这些库的代码不会影响其他用户 + ``` + 查看已有环境: + conda env list + 创建自己的conda环境: + conda create -n your_env_name python=3.7.5 + 进入环境: + conda activate your_env_name + 查看环境安装的python的库: + conda list + 只在该环境中安装py软件包: + https://anaconda.org/ 网址搜索包的安装命令 + conda install -c pytorch pytorch + conda install -c pytorch torchvision + conda install -c conda-forge onnx=1.9.0 + 查看安装路径: + python3.7 + import torchvision + print(torchvision.__file__) + 退出环境: + conda deactivate + 删除环境: + conda remove -n your_env_name --all + ``` + - 
root添加普通用户方法: + ``` + useradd -m your_name + passwd your_name + usermod -s /bin/bash your_name + 修改/etc/sudoers添加your_name ALL=(ALL:ALL) ALL + your_name用户便可使用sudo命令 + ``` + - 普通用户执行atc找不到动态库: + ``` + 修改/etc/sudoers将Defaults env_reset改成Defaults !env_reset + 修改/etc/bash.bashrc添加alias sudo='sudo env PATH=$PATH LD_LIBRARY_PATH=$LD_LIBRARY_PATH' + ``` -### 6.2 交付标准与规范 +### 4.2 交付标准与规范 - 交付标准 - 精度: - om模型推理的精度与PyTorch预训练模型github代码仓README.md或官网文档公布的精度对比,精度下降不超过1%则认为精度达标 + om模型推理的精度与Ascend 910训练出的权重精度或PyTorch预训练模型github代码仓README.md或官网文档公布的精度对比,精度下降不超过1%则认为精度达标 - 性能: - ascend benchmark工具纯推理测的npu单颗device吞吐率乘以4颗大于TensorRT工具测的gpu T4吞吐率则认为性能达标 + Ascend benchmark工具在数据集上推理测的NPU 310单颗device吞吐率乘以4颗即单卡吞吐率大于TensorRT工具测的GPU T4单卡吞吐率则认为性能达标 + 如若交付要求中对性能有要求(易模型),310的性能必须高于t4的性能 + 如若交付要求中没有对性能有要求(中难模型),310上推理需尽可能进行性能优化 + 若无法达到,则需要向华为方提交性能已达瓶颈的认证申请,华为方将定期组织专家组对申请模型进行评审,通过评审的模型允许以不高于t4的性能进行交付 - 脚本: 代码符合pep8规范; + 脚本命名格式需统一,文件名含模型名时模型名用小写,模型名含多个字符串时用-连接; xxx_pth2onnx.py中不能使用从网络下载权重pth文件的代码,xxx_pth2onnx.py应有输入输出参数,输入是本地权重pth文件,输出是生成的onnx模型文件名; xxx_pth_preprocess.py与xxx_pth_postprocess.py尽可能只引用numpy,Pillow,torch,pycocotools等基础库,如不要因mmdetection框架的数据处理与精度评测部分封装了这些基础库的操作,为调用这些简单接口,前后处理脚本就依赖mmdetection; 不同模型的脚本与代码部分处理流程有相似性,尽量整合成通用的脚本与代码。 - - 推理步骤: - 需要提供端到端推理的操作过程 + - 推理过程: + 需要提供端到端推理过程中执行的命令等 - 关键问题总结: - 需要提供端到端推理遇到的关键问题的简要调试过程,模型转换要点,精度调试,性能优化 + 需要提供端到端推理遇到的关键问题的简要调试过程,至少包含模型转换要点,精度调试,性能优化 说明: ``` - 对于性能不达标的模型,优化是学生能做的尽量做,比如用ascend atc的相关优化选项尝试一下,尝试使用最近邻替换双线性的resize重新训练,降低图片分辨率等,然后profiling分析定位引起性能下降的原因,具体到引起性能下降的算子,并在交付文档中写明问题原因与简要的定位过程,涉及到atc算子代码的修改由华为方做。 - 算子导致的性能问题可以在modelzoo提issue,等修复版本发布后再重测性能,继续优化。 - 工作量为简单模型2-3个工作日,复杂模型5-10个工作日,个别难度大的模型15-20个工作日。 + 1.如果已经有了ascend 910训练提供的权重文件,那么优先使用910训练提供的权重文件做离线推理,精度与910训练出的精度对齐;如果开源代码仓提供了多个权重文件,使用常用的基础的那个配置的权重文件即可;如果开源代码仓没有提供pth权重文件,则需要该模型的训练同学提供pth权重文件,或者使用开源代码仓训练脚本简单训练一个pth权重文件,然后对比om精度与该pth权重文件的精度 + + 2.由于随机数可能不能模拟数据分布,Ascend benchmark工具纯推理功能测的有些模型性能数据可能不太准,所以模型测试脚本与提交代码的描述中的性能数据以Ascend benchmark在数据集上推理时得到性能数据为准 + + 
3.如果模型支持多batch,需要测试batch1,4,8,16,32的精度与性能,写在README.md里,模型测试脚本与提交代码的描述只需提供bs1和bs16的精度性能数据 + + 4.如果导出的onnx因包含自定义算子等而不能推理,则在t4上运行开源评测脚本测试pth模型在线推理性能 + + 5.对于性能不达标的模型,需要进行如下工作: + 1)优化修改onnx模型去掉影响性能的冗余pad,用Ascend atc的相关优化选项尝试一下,尝试使用最近邻替换双线性的resize重新训练,降低图片分辨率等使性能达标。 + 2)对于算子导致的性能问题,需要使用profiling分析定位引起性能下降的原因,具体到引起性能下降的算子。优先修改模型代码以使其选择性能好的npu算子替换性能差的npu算子使性能达标,然后在modelzoo上提issue,等修复版本发布后再重测性能,继续优化。 + 3)需要交付profiling性能数据,对经过上述方法性能可以达标的模型,在交付文档中写明问题原因与达标需要执行的操作;对经过上述方法性能仍不达标的模型,在交付的README.md文档中写明问题原因与简要的定位过程。 + + 6.git clone开源模型代码仓到工作目录,如果模型代码仓没有安装命令,pth2onnx.py脚本需要引用模型代码仓的函数或类时,通过sys.path.append(r"./代码仓目录")添加搜索路径,如果需要修改开源代码仓代码,将修改用git diff做成一个patch文件,交付件不要交付开源代码仓里的代码,只需要交付这个patch文件。参见本文3.5 maskrcnn端到端推理指导-开源detectron2加载npu权重的推理指导 + + 7.数据集统一放在/root/datasets/目录 ``` - 交付件 - 交付件参考:[ResNeXt50_Onnx模型端到端推理指导.md](https://gitee.com/ascend/modelzoo/tree/master/built-in/ACL_PyTorch/Benchmark/cv/classification/ResNext50) - 最终交付件: - 包含以上交付标准的模型名称_Onnx端到端推理指导.md + 包含以上交付标准的代码,README.md,以及验收脚本 + 权重文件、profiling性能数据等非代码交付件一并打压缩包邮件发送 - 最终交付形式: - gitee网址:https://gitee.com/ascend/modelzoo/tree/master/contrib/ACL_PyTorch - commit信息格式:【高校贡献-学校学院名称】【Onnx-模型名称】模型名称 Onnx端到端推理 - 对于batch1与batch16,npu性能均高于T4性能1.2倍的模型,放在benchmark目录下,1-1.2倍对应official目录,低于1倍放在research目录 + gitee网址:https://gitee.com/ascend/modelzoo/tree/master/contrib/ACL_PyTorch/Research + commit信息格式:[学校学院名称][高校贡献][Pytorch离线推理][模型名称]-PR内容摘要 + 模型命名风格为大驼峰,模型名含多个字符串时使用横杠或下划线连接,当上下文用横杠时模型名用下划线连接,否则用横杠连接 + 对于batch1与batch16,npu性能均高于T4性能1.2倍的模型,放在Benchmark目录下,1-1.2倍对应Official目录,低于1倍放在Research目录,目前都放在contrib/ACL_PyTorch/Research下即可 + +- 交付件参考 + 代码交付件checklist参考: + ``` + Nested_UNet #模型名称命名的文件夹 + ├── env.sh #环境变量 + ├── gen_dataset_info.py #生成数据集info文件 + ├── LICENSE #选用Apache LICENCE + ├── nested_unet.diff #以补丁形式修改开源模型代码 + ├── nested_unet_pth2onnx.py #模型转换脚本,避免脚本下载权重加载模型时有pretrained参数的话设置为False,通常通过sys.path.append(r"./pytorch-nested-unet") from archs import NestedUNet的方式引用开源模型代码,如果开源模型代码仓提供了pth2onnx脚本,则不需要提交该文件 + ├── nested_unet_postprocess.py 
#模型后处理脚本,尽量不import开源模型代码,但是如果脚本直接复制了大量开源模型代码的函数而不是自己写的,为求精简,也可以import开源模型代码的类或函数 + ├── nested_unet_preprocess.py #模型前处理脚本,不要import该脚本没有用到的库,类或函数,数据集路径要通过输入参数传递给脚本而不是直接写在脚本里 + ├── README.md #模型离线推理说明README + ├── requirements.txt #模型离线推理用到的所有依赖库与版本,不要写入实际没有依赖的库,pytorch版本优先1.5.0,特殊情况选用1.8.0,其它库具体版本是离线推理时使用的具体版本 + └── test + ├── eval_acc_perf.sh + ├── parse.py + ├── perf_g.sh + ├── pth2om.sh + └── README.md + ``` + + 模型目录结构与pth2onnx基本操作: + ``` + cd Nested_UNet + git clone https://github.com/4uiiurz1/pytorch-nested-unet + cd pytorch-nested-unet + patch -p1 < ../nested_unet.diff #以补丁形式修改模型代码,其中补丁是通过git diff > ./nested_unet.diff生成的 + cd .. + python3.7 nested_unet_pth2onnx.py nested_unet.pth nested_unet.onnx #pth2onnx脚本sys.path.append(r"./pytorch-nested-unet") from archs import NestedUNet加载NestedUNet模型结构 + 注意目录结构,模型转换与前后处理脚本和开源模型代码尽量处于同一层,执行脚本命令时要依据目录结构确定路径 + ``` + + 性能不达标的模型非代码交付件: + ``` + icnet性能不达标初步优化 + ├── icnet_bs16.om + ├── icnet.onnx + ├── icnet模型性能不达标测试报告.docx + ├── 优化前的profiling + └── 优化后的profiling + [xxx模型性能不达标测试报告.docx](https://gitee.com/pengyeqing/ascend-pytorch-crowdintelligence-doc/blob/master/docs/PyTorch%E7%A6%BB%E7%BA%BF%E6%8E%A8%E7%90%86-xxx%E6%A8%A1%E5%9E%8B%E6%80%A7%E8%83%BD%E4%B8%8D%E8%BE%BE%E6%A0%87%E6%B5%8B%E8%AF%95%E6%8A%A5%E5%91%8A.docx) + ``` + + + 如果使用补丁文件修改了模型代码则将补丁打入模型代码,如果需要引用模型代码仓的类或函数通过sys.path.append(r"./pytorch-nested-unet")添加搜索路径 + 参见https://gitee.com/ascend/modelzoo/pulls/2309 + 参见https://gitee.com/ascend/modelzoo/pulls/2585 + + 模型不支持动态onnx,性能不达标等特殊情况需要在pr备注与README性能优化里说明 + 参见https://gitee.com/ascend/modelzoo/pulls/2122 + 参见https://gitee.com/ascend/modelzoo/pulls/2328 - gitee仓PR贡献流程 - fork [modelzoo](https://gitee.com/ascend/modelzoo) 到个人仓 @@ -1196,31 +1417,31 @@ profiling也会统计每个算子耗时,结合使用netron查看onnx模型结 - 最终验收完成后合入主干 - gitee仓验收使用脚本(请自验)、PR内容模板 - 验收使用脚本(请自验) - >![](public_sys-resources/icon-note.gif) + >![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/public_sys-resources/icon-note.gif) **说明:** > 
**提交前请确保自验通过!确保直接执行以下脚本就可运行!** - ```shell script + ```shell + #准备环境 + 交付的代码文件夹下获取模型结构的开源代码,安装必要的依赖,获取训练提供的权重文件,获取数据集路径,获取benchmark工具 # pth是否能正确转换为om - bash scripts/pth2om.sh - - # 精度数据是否达标(需要显示官网精度与om模型的精度) - bash scripts/eval_acc.sh + bash test/pth2om.sh - # npu性能数据(如果模型支持多batch,测试bs1与bs16,否则只测试bs1,性能数据以单卡吞吐率为标准) - bash scripts/perform_310.sh - - # t4性能数据(如果模型支持多batch,测试bs1与bs16,否则只测试bs1,如果导出的onnx模型因含自定义算子等不能离线推理,则在t4上测试pytorch模型的在线推理性能,性能数据以单卡吞吐率为标准) - bash scripts/perform_t4.sh + # 精度数据是否达标(需要显示官网pth精度与om模型的精度) + # npu性能数据(确保device空闲时测试,如果模型支持多batch,测试bs1与bs16,否则只测试bs1,性能数据以单卡吞吐率为标准),不指定数据集目录时默认/root/datasets + bash test/eval_acc_perf.sh --datasets_path=/root/datasets + # 在t4环境测试性能数据(确保gpu空闲时测试,如果模型支持多batch,测试bs1与bs16,否则只测试bs1,如果导出的onnx模型因含自定义算子等不能离线推理,则在t4上测试pytorch模型的在线推理性能,性能数据以单卡吞吐率为标准) + bash perf_t4.sh ``` - PR内容模板 + - PR示例链接 https://gitee.com/ascend/modelzoo/pulls/887 - PR名称 - - 【高校贡献-${学校学院名称}】【Pytorch离线推理-${模型名称}】${PR内容摘要} - - 举例说明:【高校贡献-华为大学昇腾学院】【Pytorch离线推理-ResNeXt50】初次提交。模型命名风格为大驼峰,含多个字符串时使用下划线连接 - ``` - + - [学校学院名称][高校贡献][Pytorch离线推理][模型名称]-PR内容摘要 + - 举例说明:[华为大学昇腾学院][高校贡献][Pytorch离线推理][ResNeXt50]-初次提交 + + ``` - **What type of PR is this?** - > /kind task + > task **What does this PR do / why do we need it**: - # 简述你这次的PR的详情 + # 首次提交Pytorch AbNet | 名称 | 精度 | 性能 | | :------: | :------: | :------: | @@ -698,9 +831,48 @@ npu-smi info | GPU-8p | 100.0 | 2200 | | NPU-1p | - | 2200 | | NPU-8p | 100.0 | 14000 | - - + # 自验报告 + ```shell + # 1p train perf + # 是否正确输出了性能log文件 + bash test/train_performance_1p.sh --data_path xxx + # 验收结果: OK / Failed + # 备注: 目标性能301FPS;验收测试性能336FPS;无输出日志,运行报错,报错日志xx 等 + + # 8p train perf + # 是否正确输出了性能log文件 + bash test/train_performance_1p.sh --data_path xxx + # 验收结果: OK / Failed + # 备注: 目标性能301FPS;验收测试性能336FPS;无输出日志,运行报错,报错日志xx 等 + + # 8p train full + # 是否正确输出了性能精度log文件,是否正确保存了模型文件 + bash test/train_full_8p.sh --data_path xxx + # 验收结果: OK / Failed + # 备注: 目标精度77.632;验收精度76.78;无输出日志,运行报错,报错日志xx 等 + + # 8p eval + # 是否正确输出了性能精度log文件 
+ bash test/train_eval_8p.sh --data_path xxx + # 验收结果: OK / Failed + # 备注: 功能正确,无输出日志,运行报错,报错日志xx 等 + + # online inference demo + # 是否正确输出预测结果,请确保输入固定tensor多次运行的输出结果一致 + python3.7.5 demo.py + # 验收结果: OK / Failed + # 备注: 功能正确,无输出日志,运行报错,报错日志xx 等 + + # To ONNX + # 是否正确输出onnx + python3.7.5 pthtar2onnx.py + # 验收结果: OK / Failed + # 备注: 功能正确,无输出日志,运行报错,报错日志xx 等 + + ``` + - 示例链接 https://gitee.com/ascend/modelzoo/pulls/836#note_4750681 + **Which issue(s) this PR fixes**: # 用于后期issue关联的pr + + **What type of PR is this?** + > /kind task + + **What does this PR do / why do we need it**: + # 简述你这次的PR的详情 + + | 名称 | 精度 | 性能 | + | :------: | :------: | :------: | + | GPU-1p | - | 2200 | + | GPU-8p | 100.0 | 2200 | + | NPU-1p | - | 2200 | + | NPU-8p | 100.0 | 14000 | + + + + **Which issue(s) this PR fixes**: + # 用于后期issue关联的pr(相关的issue可填在此处) + + Fixes # + + **Special notes for your reviewers**: + # 在reviewer检视时你想要和他说的 + + ``` + + + + + + diff --git "a/AscendPytorch\346\250\241\345\236\213\344\274\227\346\231\272FAQ.md" "b/AscendPytorch\346\250\241\345\236\213\344\274\227\346\231\272FAQ.md" index 11ea406cc30054bb31a46ae741d6979be92a9153..214456d5d0e47c20af3bec17f274efd7844007fc 100644 --- "a/AscendPytorch\346\250\241\345\236\213\344\274\227\346\231\272FAQ.md" +++ "b/AscendPytorch\346\250\241\345\236\213\344\274\227\346\231\272FAQ.md" @@ -17,7 +17,7 @@ * 现象描述 -![](figures/model_faq_1103.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq_1103.png) * 原因分析 @@ -32,7 +32,7 @@ * 现象描述 -![](figures/model_faq_6_1111.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq_6_1111.png) * 原因分析 @@ -46,7 +46,7 @@ * 现象描述 -![](figures/model_faq2_1103.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq2_1103.png) * 原因分析 @@ -54,13 +54,13 @@ * 处理方法 - 
遇到这种情况处理方式有两种:一是查看host日志,可以查看具体的报错日志信息。日志默认位置是/var/log/npu/slog/host-0/下,可能会有多个日志文件,日志文件以host-0为前缀,根据时间标识查找对应的日志文件。打开日志文件,搜索“ERROR”,查询具体的报错信息。二是关闭多线程下发(export TASK_QUEUE_ENABLE=0),然后再次运行代码,一般可根据终端报错信息知道错误原因。 + 遇到这种情况处理方式有两种:一是查看host日志,可以查看具体的报错日志信息。日志默认位置是/root/ascend/log/plog/下,可能会有多个日志文件,日志文件以host-0为前缀,根据时间标识查找对应的日志文件。打开日志文件,搜索“ERROR”,查询具体的报错信息。二是关闭多线程下发(export TASK_QUEUE_ENABLE=0),同时开启错误日志级别(export ASCEND_GLOBAL_LOG_LEVEL=3)然后再次运行代码,一般可根据终端报错信息知道错误原因。 ### FAQ4、在模型调测遇到报错,“RuntimeError: malloc:/..../pytorch/c10/npu/NPUCachingAllocator.cpp:293 NPU error, error code is 500000.” * 现象描述 -![](figures/model_faq3_1109.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq3_1109.png) * 原因分析 @@ -74,7 +74,7 @@ * 现象描述 -![](figures/model_faq5_1109.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq5_1109.png) * 原因分析 @@ -87,9 +87,9 @@ ### FAQ6、在模型调测中,遇到某个算子报错的情况,如下分别为MaxPoolGradWithArgmaxV1算子和max算子报错。 -![](figures/model_faq4_1109.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq4_1109.png) -![](figures/model_faq4_2_1109.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq4_2_1109.png) * 原因分析 @@ -111,7 +111,7 @@ * 现象描述 -![](figures/model_faq6_1118.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq6_1118.png) * 原因分析 @@ -126,7 +126,7 @@ * 现象描述 -![](figures/model_faq7_1118.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq7_1118.png) * 原因分析 @@ -136,7 +136,7 @@ 查看host日志,确定报错算子和位置。日志默认路径为/var/log/npu/slog/host-0,查找对应时间的log文件,搜索ERROR字段,查找错误信息。如对上述的错误,查询日志中的ERROR字段为: -![](figures/model_faq7_1_1118.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq7_1_1118.png) 
从日志信息ERROR部分可以发现,报错算子为topKD,报错原因是 The number of attrs in op desc and op store does not match. 定位到是topk算子错误,具体原因是算子参数不匹配。

 在模型代码中查找topk算子调用位置,确定该算子是否可由其他算子代替,若可由其他算子代替,暂时使用替代方案,并将算子报错信息报告华为工程师。若无替代算子,请将算子报错信息通知华为工程师解决。

@@ -145,14 +145,14 @@

* 现象描述

-![](figures/model_faq8_1118.png)
+![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq8_1118.png)

* 原因分析

  根据报错信息,是npu设备初始化错误。查看host日志内容,根据前述步骤查找日志。日志报错如下:

-![](figures/model_faq8_1_1118.png)
+![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq8_1_1118.png)

* 处理方法

@@ -163,7 +163,7 @@

* 现象描述

-![](figures/model_faq10_1118.png)
+![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq10_1118.png)

* 原因分析

@@ -171,15 +171,15 @@

* 处理方法

-   更新te等组件版本,具体是需要更新te-*.whl和topi-*.whl安装包。在安装的toolkit或者nnae的fwkacllib子目录下(对于默认安装路径,在/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/lib64目录下,更新安装包即可。在该目录下有安装包topi-0.4.0-py3-none-any.whl和te-0.4.0-py3-none-any.whl,分别运行 pip install --upgrade te-0.4.0-py3-none-any.whl, pip install --upgrade te-0.4.0-py3-none-any.whl.
+   更新te等组件版本,具体是需要更新te-*.whl和topi-*.whl安装包。在安装的toolkit或者nnae的fwkacllib子目录下(对于默认安装路径,在/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/lib64目录下),更新安装包即可。在该目录下有安装包topi-0.4.0-py3-none-any.whl和te-0.4.0-py3-none-any.whl,分别运行 pip install --upgrade topi-0.4.0-py3-none-any.whl, pip install --upgrade te-0.4.0-py3-none-any.whl.
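上述te/topi组件升级的处理步骤可以整理成如下命令示意(仅为示意脚本,假设toolkit安装在默认路径,`LIB64`请按实际环境调整):

```shell
# 示意脚本:假设toolkit安装在默认路径,实际路径请按环境调整
LIB64=/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/lib64
if [ -d "$LIB64" ]; then
    # 依次升级topi和te安装包
    pip install --upgrade "$LIB64"/topi-0.4.0-py3-none-any.whl
    pip install --upgrade "$LIB64"/te-0.4.0-py3-none-any.whl
else
    # 路径不存在时提示,避免误操作
    echo "未找到toolkit目录: $LIB64"
fi
```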
-![](figures/model_faq10_1_1118.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq10_1_1118.png) ### FAQ11、在调用torch时,遇到ModuleNotFoundError: No module named 'torch._C'报错。 * 现象描述 -![](figures/model_faq11_1123.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq11_1123.png) * 原因分析 @@ -189,11 +189,11 @@ 切换到其他目录执行脚本即可。 -### FAQ11、cuda流同步操作报错。 +### FAQ12、cuda流同步操作报错。 * 现象描述 -![](figures/model_faq11_1123.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq11_1123.png) * 原因分析 @@ -207,9 +207,9 @@ ``` -### FAQ12、aicpu_kernels/libpt_kernels.so找不到 +### FAQ13、aicpu_kernels/libpt_kernels.so找不到 -![](figures/model_faq12_0126.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq12_0126.png) * 原因分析 @@ -220,9 +220,9 @@ ```export ASCEND_AICPU_PATH=/usr/local/Ascend/ascend-toolkit/latest``` -### FAQ13、npu-smi info 查看显存时发现有残留 +### FAQ14、npu-smi info 查看显存时发现有残留 -![](figures/model_faq13_0126.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq13_0126.png) * 原因分析 @@ -233,9 +233,9 @@ ```pkill -9 python``` -### FAQ14、match op inputs failed +### FAQ15、match op inputs failed -![](figures/model_faq14_0126.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq14_0126.png) * 原因分析 @@ -246,7 +246,7 @@ -### FAQ15、Op type SigmoidCrossEntropyWithLogitsV2 of ops kernel AIcoreEngine is unsupported, +### FAQ16、Op type SigmoidCrossEntropyWithLogitsV2 of ops kernel AIcoreEngine is unsupported, ``` [ERROR] GE(24836,python3.7):2021-01-27-18:27:51.562.111 [../../../../../../graphengine/ge/engine_manager/dnnengine_manager.cc:266]25155 GetDNNEngineName: ErrorNo: 1343242282(assign engine failed) GetDNNEngineName:Op type SigmoidCrossEntropyWithLogitsV2 of ops kernel AIcoreEngine 
is unsupported, reason:Op SigmoidCrossEntropyWithLogitsV2 not supported reason: The type of this op is not found in op store, check whether the op store has this type of op. Op store name is tbe-custom. @@ -261,7 +261,7 @@ The dtype, format or shape of input in op desc is not supported in op store, che 检查对应python代码中输入的数据类型,多半是由于输入int64类型导致的错误 -### FAQ16、Hook失败 +### FAQ17、Hook失败 ```shell Traceback (most recent call last): @@ -315,7 +315,7 @@ StopIteration ``` -### FAQ17、模型报错运行时MemCopySync:drvMemcpy failed. +### FAQ18、模型报错运行时MemCopySync:drvMemcpy failed. ```shell 脚本: @@ -364,11 +364,11 @@ StopIteration 通过在代码中加上stream同步操作,可以确定报错算子是在stack,那么打印stack所有参数的shape、dtype、npu_format,然后构造单算子用例就能很好的复现和解决问题。所以这里的问题是,减法计算输入参数数据类型不同导致a-b和b-a结果的数据类型也不一致,最终在stack算子中报错。可以将stack入参数据类型转换为一致即可临时规避问题。对于该报错需要根据实际的错误来定位。 -### FAQ18、加载权重时发生load state_dict error. +### FAQ19、加载权重时发生load state_dict error. * 现象描述 -![](figures/model_faq18_01.png) -![](figures/model_faq18_02.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq18_01.png) +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq18_02.png) * 原因分析 @@ -389,13 +389,203 @@ StopIteration model.load_state_dict(state_dict) ``` +### FAQ20、加载数据集时发生cannot identify image. 
+
+* 现象描述
+![](https://gitee.com/zhangjie11ee/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq19_0527.png)
+
+* 原因分析
+
+  模型训练时无法找到对应的数据集,出现错误。
+
+* 处理方法
+
+  找不到数据集,检查数据集路径和数据集是否有效。
+
+### FAQ21、timm框架版本问题导致错误
+
+* 现象描述
+![](https://gitee.com/zhangjie11ee/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq20_0607.png)
+  模型出现copy_npu的错误,使用的是timm框架结构。
+
+* 原因分析
+
+  使用的timm框架版本有问题,需要定位分析。
+
+* 处理方法
+
+  进行timm框架版本替换,环境上安装的timm是4.9版本,需要替换为4.6版本。
+
+### FAQ22、GPU场景下安装DCNv2模块
+
+* 现象描述
+![](https://gitee.com/zhangjie11ee/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq21_0607.png)
+  导入DCNv2失败,出现未定义符号错误。
+
+* 原因分析
+
+  模型自带的该模块无法直接导入,需要自己编译安装DCNv2模块。
+
+* 处理方法
+
+  下载https://github.com/CharlesShang/DCNv2的源码,进行编译安装,
+  可以参考https://www.gitmemory.com/issue/xingyizhou/CenterNet/7/486653333进行安装。
+
+### FAQ23、模型训练时报libtorch_npu.so: undefined symbol: aclopSetCompileFlag错误。
+
+* 现象描述
+
+![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq20_0528.PNG)
+
+* 原因分析
+
+  环境中的pytorch版本与toolkit版本不匹配,或存在多个toolkit版本,环境变量未正确指定。
+
+* 处理方法
+
+  1)重新安装版本匹配的torch或者toolkit。
+  2)重新设置环境变量,指定正确的toolkit路径。
+
+
+### FAQ24、模型训练时报fill算子错误: RuntimeError: Run:/usr1/workspace/PyTorch_Apex_Daily_c20tr5/CODE/aten/src/ATen/native/npu/utils/OpParamMaker.h:280 NPU error,NPU error code is:500002
+
+* 现象描述
+
+![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq21_0529.PNG)
+
+* 原因分析
+
+  脚本中fill算子输入的类型是int64,查看/usr/local/Ascend/ascend-toolkit/latest/opp/op_impl/built-in/ai_core/tbe/config/ascend910/aic-ascend910-ops-info.json,其中Fills算子支持的输入类型是float16、float32、int32。
+
+* 处理方法
+
+  1)将fill算子输入的类型改成int32。
+
+
+### FAQ25、cpu下运行scatter算子报错:RuntimeError: index 4558486308284583594 is out of bounds for dimension 1 with size 4233.
+
+* 现象描述
+
+![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq22_0604.PNG)
+
+* 原因分析
+
+  scatter算子中的index参数仅支持long类型:
+  index (LongTensor) – the indices of elements to scatter, can be either empty or the same size of src. When empty, the operation returns identity
+
+* 处理方法
+
+  修改代码中b的类型为long。
+
+
+### FAQ26、NPU训练时,第一个batch训练速度特别慢,第二个开始速度正常,和gpu差不多。
+
+* 现象描述
+
+无
+
+* 原因分析
+
+  NPU是在线编译算子的,在执行第一步的时候会去编译算子,所以第一步会特别慢。
+
+* 处理方法
+
+  正常现象。
+
+
+### FAQ27、pip安装 matplotlib pillow numpy scipy xtcocotools torchvision 等包,在x86环境安装顺利但是在arm环境失败
+
+* 现象描述
+
+  arm环境上安装很多包的时候会报错,报错的内容、现象各不相同,找不到包或者缺少底层依赖。
+
+* 原因分析
+
+  1. pip是根据你当前的pip版本下载包的,不同版本对应的包在pip源中是相互隔离的,所以即使有些包已经出了arm版本,但你的pip不够新,下载也会出错,或者找不到。
+  2. 有些包在你选定的源里的确没有,尝试使用其他的pip源。
+  3. arm环境上的python生态并不完善,有些包即使是最新的pip版本依然下载不到,那只能编译源码安装了。
+
+* 处理方法
+
+  1)尝试更新pip,注意:若你使用的是pip3.7,则下面命令对应改成pip3.7
+  ```bash
+  pip install --upgrade pip
+  ```
+
+  2)临时切换pip源下载,使用 -i 参数指定临时源,常用的源还有清华源、阿里源、豆瓣源
+  ```bash
+  pip install torchvision==0.2.2 -i https://repo.huaweicloud.com/repository/pypi/simple/
+  ```
+  其他源:
+  ```bash
+  清华大学:https://pypi.tuna.tsinghua.edu.cn/simple/
+  阿里云:http://mirrors.aliyun.com/pypi/simple/
+  豆瓣源:http://pypi.douban.com/simple/
+  ```
+
+  3)对于前两种办法都无法安装的包,那只能使用源码安装了,目前像 kaldi,xtcocotools 这类包需要使用gitee或者github上共享的源码,根据其readme来编译安装。
+
+### FAQ28、模型训练时出现argmax算子计算问题。
+
+* 现象描述
+
+![](https://gitee.com/zhangjie11ee/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq27_0618.png)
+
+* 原因分析
+
+  1.因为传入输入的NCHW格式会发生变化,需要提前固定,否则会出现形状问题。
+
+* 处理方法
+
+  1)需要对输入进行处理,参考如下:
+  output.data = output.data.npu_format_cast(0)
+  predict = torch.argmax(output, 1).to(torch.int32) + 1
+
+### FAQ29、模型推理时加载pth出现问题。
+
+* 现象描述
+
+![](https://gitee.com/zhangjie11ee/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq28_0618.png)
+
+* 原因分析
+
+  1.npu上保存的pth直接加载时会出现失败的现象。
+
+* 处理方法
+
+  1)需要添加:pretrained_net = torch.load(cfg["test"]["ckpt_path"], map_location='cpu')
+
+### FAQ30、多个环境都遇到了安装升级5.0.1的toolkit包,安装时报错的问题。
+
+* 现象描述
+
+![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq1_2901.png)
+
+* 原因分析
+
+  通过分析发现是环境中/usr/bin/pip3不存在,或者没有软链到/usr/bin/pip3.7
+
+* 处理方法
+
+  如果环境中/usr/bin/pip3已经存在,但没有软链到/usr/bin/pip3.7,那就删除/usr/bin/pip3,然后做软链;如果/usr/bin/pip3不存在,直接做软链接。软链接命令如下:ln -s /usr/bin/pip3.7 /usr/bin/pip3
+
 
 ## [2.2 NPU模型分布式运行常见问题FAQ](#22-NPU模型分布式运行常见问题FAQ)
 
 ### FAQ1、在模型分布式训练时,遇到报错 host not found.
 
 * 现象描述
 
-![](figures/model_faq11_1120.png)
+![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq11_1120.png)
 
 * 原因分析
 
@@ -405,4 +595,31 @@
 
 在运行脚本中设置正确的IP地址,对于单机情况,设置为本机的IP即可;对于多机情况,每个服务器上脚本中的IP需要设置为master节点的IP。
 
+### FAQ2、在模型运行时,遇到eval模式下loss值特别大,过万。
+
+* 现象描述
+![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq2_0201.png)
+
+* 原因分析
+
+  通过打印输入、查看数据集、降低loss_scale等方式均没有效果,通过重装torch和apex解决,该问题应该是包的版本不匹配引起的。
+
+* 处理方法
+
+  重装环境中的torch和apex,问题得到解决。
+
+### FAQ3、在模型运行时,模型训练的精度和loss值多卡之间不同步。
+
+* 现象描述
+![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq2_0301.png)
+
+* 原因分析
+
+  只添加了train_sampler,没有添加set_epoch,导致不同步问题。
+
+* 处理方法
+
+  在train epoch循环过程中,添加set_epoch,问题得到解决。
\ No newline at end of file
diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/Dockerfile" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/Dockerfile"
new file mode 100644
index 0000000000000000000000000000000000000000..30a31af55804dd79571d2a36e6107a844cb7e549
--- /dev/null
+++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/Dockerfile"
@@ -0,0 +1,5 @@
+ARG FROM_IMAGE_NAME
+FROM $FROM_IMAGE_NAME
+
+COPY requirements.txt .
+RUN pip3.7 install -r requirements.txt \ No newline at end of file diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/LICENSE" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/LICENSE" new file mode 100644 index 0000000000000000000000000000000000000000..dfcc682b4b265c524b676eea5c382472c09f42c4 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/LICENSE" @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2017, +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +* Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +* Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +* Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. \ No newline at end of file diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/README.md" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/README.md" new file mode 100644 index 0000000000000000000000000000000000000000..eba4b0294514836fc7ad5b98310fc140dc1f58c2 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/README.md" @@ -0,0 +1,42 @@ +# ResNet101_ID1595_for_PyTorch + +This implements training of ResNet101_ID1595_for_PyTorch on the ImageNet dataset, mainly modified from [pytorch/examples](https://github.com/pytorch/examples/tree/master/imagenet). 
+
+[Paper](https://arxiv.org/pdf/1512.03385.pdf) Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun. "Deep Residual Learning for Image Recognition"
+
+## Requirements
+
+- Install PyTorch ([pytorch.org](http://pytorch.org))
+- `pip install -r requirements.txt`
+- Download the ImageNet dataset from http://www.image-net.org/
+  - Then, move validation images to labeled subfolders, using [the following shell script](https://raw.githubusercontent.com/soumith/imagenetloader.torch/master/valprep.sh)
+
+## Training
+
+To train a model, run `main.py` with the desired model architecture and the path to the ImageNet dataset:
+
+```bash
+# 1p training
+bash ./test/train_full_1p.sh --data_path=xxx         # training accuracy
+bash ./test/train_performance_1p.sh --data_path=xxx  # training performance
+
+# 8p training
+bash ./test/train_full_8p.sh --data_path=xxx         # training accuracy
+bash ./test/train_performance_8p.sh --data_path=xxx  # training performance
+
+# eval (default 8p, should support 1p)
+bash ./test/train_eval_8p.sh --data_path=xxx
+```
+
+## Training log
+
+```
+test/output/devie_id/train_${device_id}.log                 # training detail log
+test/output/devie_id/ResNet101_${device_id}_bs_8p_perf.log  # 8p training performance result
+test/output/devie_id/ResNet101_${device_id}_bs_8p_acc.log   # 8p training accuracy result
+```
+
+## ResNet101_ID1595_for_PyTorch training result
+
+| Acc@1 | FPS | Npu_nums | Epochs | AMP_Type |
+| :------: | :------: | :------: | :------: | :------: |
+| - | 698 | 1 | 110 | O2 |
+| 77.36 | 3687 | 8 | 110 | O2 |
\ No newline at end of file
diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/demo.py" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/demo.py"
new file mode 100644
index 0000000000000000000000000000000000000000..2bb222c4834f69d4ea812b7cd1bd5e4ca7671e5b
--- /dev/null
+++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/demo.py"
@@ -0,0 +1,73
@@
+# -*- coding: utf-8 -*-
+# Copyright 2021 Huawei Technologies Co., Ltd
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# ============================================================================
+
+import torch
+import numpy as np
+
+
+def build_model():
+    import torchvision
+    model = torchvision.models.resnet101(pretrained=True)
+    model.eval()
+    return model
+
+
+def get_raw_data():
+    from PIL import Image
+    from urllib.request import urlretrieve
+    IMAGE_URL = 'https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png'
+    urlretrieve(IMAGE_URL, 'tmp.jpg')
+    img = Image.open("tmp.jpg")
+    img = img.convert('RGB')
+    return img
+
+
+def pre_process(raw_data):
+    from torchvision import transforms
+    normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
+                                     std=[0.229, 0.224, 0.225])
+    transforms_list = transforms.Compose([
+        transforms.Resize(256),
+        transforms.CenterCrop(224),
+        transforms.ToTensor(),
+        normalize
+    ])
+    input_data = transforms_list(raw_data)
+    return input_data.unsqueeze(0)
+
+
+def post_process(output_tensor):
+    return torch.argmax(output_tensor, 1)
+
+
+if __name__ == '__main__':
+    # 1. get raw data
+    raw_data = get_raw_data()
+
+    # 2. build model
+    model = build_model()
+
+    # 3. pre process data
+    input_tensor = pre_process(raw_data)
+
+    # 4. run forward
+    output_tensor = model(input_tensor)
+
+    # 5. post process
+    result = post_process(output_tensor)
+
+    # 6.
print result + print(result) \ No newline at end of file diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/docker_start.sh" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/docker_start.sh" new file mode 100644 index 0000000000000000000000000000000000000000..944bca3cdac8e3f2d47ceb0e2b6eb181a405de11 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/docker_start.sh" @@ -0,0 +1,25 @@ +#!/bin/bash + +docker_image=$1 +data_dir=$2 +model_dir=$3 + +docker run -it --ipc=host \ + --device=/dev/davinci0 \ + --device=/dev/davinci1 \ + --device=/dev/davinci2 \ + --device=/dev/davinci3 \ + --device=/dev/davinci4 \ + --device=/dev/davinci5 \ + --device=/dev/davinci6 \ + --device=/dev/davinci7 \ + --device=/dev/davinci_manager \ + --device=/dev/devmm_svm --device=/dev/hisi_hdc \ + -v /usr/local/Ascend/driver:/usr/local/Ascend/driver \ + -v /usr/local/Ascend/add-ons/:/usr/local/Ascend/add-ons/ \ + -v ${model_dir}:${model_dir} \ + -v ${data_dir}:${data_dir} \ + -v /var/log/npu/conf/slog/slog.conf:/var/log/npu/conf/slog/slog.conf \ + -v /var/log/npu/slog/:/var/log/npu/slog -v /var/log/npu/profiling/:/var/log/npu/profiling \ + -v /var/log/npu/dump/:/var/log/npu/dump -v /var/log/npu/:/usr/slog ${docker_image} \ + /bin/bash \ No newline at end of file diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/fusion_result.json" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/fusion_result.json" new file mode 100644 index 0000000000000000000000000000000000000000..46392cfd2562f14bb4f605949501dc7b4676187e --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/fusion_result.json" @@ -0,0 +1,885 @@ +{ + "graphId": "50", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": 
"1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "50" +}{ + "graphId": "56", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "56" +}{ + "graphId": "60", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "60" +}{ + "graphId": "61", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "61" +}{ + "graphId": "66", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "66" +}{ + "graphId": "67", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "67" +}{ + "graphId": "71", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + 
}, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "71" +}{ + "graphId": "75", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "75" +}{ + "graphId": "78", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "78" +}{ + "graphId": "81", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "81" +}{ + "graphId": "82", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "82" +}{ + "graphId": "83", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "83" +}{ + "graphId": "87", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + 
"sessionId": "87" +}{ + "graphId": "91", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "91" +}{ + "graphId": "94", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "94" +}{ + "graphId": "97", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "97" +}{ + "graphId": "98", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "98" +}{ + "graphId": "99", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "99" +}{ + "graphId": "103", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "103" +}{ + "graphId": "107", + "graph_fusion": { + 
"ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "107" +}{ + "graphId": "110", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "110" +}{ + "graphId": "113", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "113" +}{ + "graphId": "114", + "graph_fusion": { + "ConvToFullyConnectionFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ConvWeightCompressFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "GroupConv2DFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "114" +}{ + "graphId": "197", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "197" +}{ + "graphId": "198", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "198" +}{ + "graphId": "202", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "202" +}{ + "graphId": "203", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "203" +}{ + 
"graphId": "204", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "204" +}{ + "graphId": "205", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "205" +}{ + "graphId": "206", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "206" +}{ + "graphId": "207", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "207" +}{ + "graphId": "208", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "208" +}{ + "graphId": "209", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "209" +}{ + "graphId": "213", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "213" +}{ + "graphId": "214", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "214" +}{ + "graphId": "218", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "218" +}{ + "graphId": "219", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "219" +}{ + "graphId": "223", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": 
"1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "223" +}{ + "graphId": "224", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "224" +}{ + "graphId": "225", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "225" +}{ + "graphId": "226", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "226" +}{ + "graphId": "227", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "227" +}{ + "graphId": "228", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "228" +}{ + "graphId": "229", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "229" +}{ + "graphId": "230", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "230" +}{ + "graphId": "234", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "234" +}{ + "graphId": "235", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "235" +}{ + "graphId": "239", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "239" +}{ + 
"graphId": "240", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "240" +}{ + "graphId": "244", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "244" +}{ + "graphId": "245", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "245" +}{ + "graphId": "246", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "246" +}{ + "graphId": "247", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "247" +}{ + "graphId": "248", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "248" +}{ + "graphId": "249", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "249" +}{ + "graphId": "250", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "250" +}{ + "graphId": "251", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "251" +}{ + "graphId": "255", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "255" +}{ + "graphId": "256", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + 
"sessionId": "256" +}{ + "graphId": "260", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "260" +}{ + "graphId": "261", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "261" +}{ + "graphId": "265", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "265" +}{ + "graphId": "266", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "266" +}{ + "graphId": "267", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "267" +}{ + "graphId": "268", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "268" +}{ + "graphId": "269", + "graph_fusion": { + "Conv2DbpInputDilationFusionPass": { + "effect_times": "0", + "match_times": "1" + }, + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "269" +}{ + "graphId": "270", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "270" +}{ + "graphId": "276", + "graph_fusion": { + "ZDwGroupFusionPass": { + "effect_times": "0", + "match_times": "1" + } + }, + "sessionId": "276" +} \ No newline at end of file diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/main.py" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/main.py" new file mode 100644 index 
0000000000000000000000000000000000000000..211833e3eb26015b6d6da425fde410101fa14f53 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/main.py" @@ -0,0 +1,626 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# Copyright 2021 Huawei Technologies Co., Ltd +# +# Licensed under the BSD 3-Clause License (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://opensource.org/licenses/BSD-3-Clause +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# ============================================================================ + +import argparse +import os +import random +import shutil +import time +import warnings + +import torch + +from apex import amp + +import torch.nn as nn +import torch.nn.parallel +import torch.backends.cudnn as cudnn +import torch.distributed as dist +import torch.optim +import torch.multiprocessing as mp +import torch.utils.data +import torch.utils.data.distributed +import torchvision.transforms as transforms +import torchvision.datasets as datasets +import torchvision.models as models + +model_names = sorted(name for name in models.__dict__ + if name.islower() and not name.startswith("__") + and callable(models.__dict__[name])) + +parser = argparse.ArgumentParser(description='PyTorch ImageNet Training') +parser.add_argument('data', metavar='DIR', + help='path to dataset') +parser.add_argument('-a', '--arch', metavar='ARCH', default='resnet34', + choices=model_names, + help='model architecture: ' + + ' | '.join(model_names) + + ' (default: resnet34)') +parser.add_argument('-j', '--workers', 
default=4, type=int, metavar='N', + help='number of data loading workers (default: 4)') +parser.add_argument('--epochs', default=90, type=int, metavar='N', + help='number of total epochs to run') +parser.add_argument('--start-epoch', default=0, type=int, metavar='N', + help='manual epoch number (useful on restarts)') +parser.add_argument('-b', '--batch-size', default=256, type=int, + metavar='N', + help='mini-batch size (default: 256), this is the total ' + 'batch size of all GPUs on the current node when ' + 'using Data Parallel or Distributed Data Parallel') +parser.add_argument('--lr', '--learning-rate', default=0.1, type=float, + metavar='LR', help='initial learning rate', dest='lr') +parser.add_argument('--momentum', default=0.9, type=float, metavar='M', + help='momentum') +parser.add_argument('--wd', '--weight-decay', default=1e-4, type=float, + metavar='W', help='weight decay (default: 1e-4)', + dest='weight_decay') +parser.add_argument('-p', '--print-freq', default=10, type=int, + metavar='N', help='print frequency (default: 10)') +parser.add_argument('--resume', default='', type=str, metavar='PATH', + help='path to latest checkpoint (default: none)') +parser.add_argument('-e', '--evaluate', dest='evaluate', action='store_true', + help='evaluate model on validation set') +parser.add_argument('--pretrained', dest='pretrained', action='store_true', + help='use pre-trained model') +parser.add_argument('--world-size', default=-1, type=int, + help='number of nodes for distributed training') +parser.add_argument('--rank', default=-1, type=int, + help='node rank for distributed training') +parser.add_argument('--dist-url', default='tcp://224.66.41.62:23456', type=str, + help='url used to set up distributed training') +parser.add_argument('--dist-backend', default='nccl', type=str, + help='distributed backend') +parser.add_argument('--seed', default=None, type=int, + help='seed for initializing training. 
') +parser.add_argument('--gpu', default=None, type=int, + help='GPU id to use.') +parser.add_argument('--multiprocessing-distributed', action='store_true', + help='Use multi-processing distributed training to launch ' + 'N processes per node, which has N GPUs. This is the ' + 'fastest way to use PyTorch for either single node or ' + 'multi node data parallel training') +## for ascend 910 +parser.add_argument('--device', default='npu', type=str, help='npu or gpu') +parser.add_argument('--addr', default='10.136.181.115', + type=str, help='master addr') +parser.add_argument('--device_list', default='0,1,2,3,4,5,6,7', + type=str, help='device id list') +parser.add_argument('--amp', default=False, action='store_true', + help='use amp to train the model') +parser.add_argument('--loss-scale', default=1024., type=float, + help='static loss scale used in amp (default: 1024)') +parser.add_argument('--opt-level', default='O2', type=str, + help='opt level used in amp (default: O2)') +parser.add_argument('--FusedSGD', default=False, action='store_true', + help='use FusedSGD') +best_acc1 = 0 + + +def device_id_to_process_device_map(device_list): + devices = device_list.split(",") + devices = [int(x) for x in devices] + devices.sort() + + process_device_map = dict() + for process_id, device_id in enumerate(devices): + process_device_map[process_id] = device_id + + return process_device_map + + +def main(): + args = parser.parse_args() + print(args.device_list) + + os.environ['MASTER_ADDR'] = args.addr + os.environ['MASTER_PORT'] = '29688' + + if args.seed is not None: + random.seed(args.seed) + torch.manual_seed(args.seed) + cudnn.deterministic = True + warnings.warn('You have chosen to seed training. ' + 'This will turn on the CUDNN deterministic setting, ' + 'which can slow down your training considerably! ' + 'You may see unexpected behavior when restarting ' + 'from checkpoints.') + + if args.gpu is not None: + warnings.warn('You have chosen a specific GPU. 
This will completely ' + 'disable data parallelism.') + + if args.dist_url == "env://" and args.world_size == -1: + args.world_size = int(os.environ["WORLD_SIZE"]) + + args.distributed = args.world_size > 1 or args.multiprocessing_distributed + + args.process_device_map = device_id_to_process_device_map(args.device_list) + + if args.device == 'npu': + ngpus_per_node = len(args.process_device_map) + else: + if args.distributed: + ngpus_per_node = torch.cuda.device_count() + else: + ngpus_per_node = 1 + print('ngpus_per_node:', ngpus_per_node) + if args.multiprocessing_distributed: + # Since we have ngpus_per_node processes per node, the total world_size + # needs to be adjusted accordingly + args.world_size = ngpus_per_node * args.world_size + # Use torch.multiprocessing.spawn to launch distributed processes: the + # main_worker process function + mp.spawn(main_worker, nprocs=ngpus_per_node, + args=(ngpus_per_node, args)) + else: + # Simply call main_worker function + main_worker(args.gpu, ngpus_per_node, args) + + +def main_worker(gpu, ngpus_per_node, args): + global best_acc1 + args.gpu = args.process_device_map[gpu] + + if args.gpu is not None: + print("Use GPU: {} for training".format(args.gpu)) + + if args.distributed: + if args.dist_url == "env://" and args.rank == -1: + args.rank = int(os.environ["RANK"]) + if args.multiprocessing_distributed: + # For multiprocessing distributed training, rank needs to be the + # global rank among all the processes + args.rank = args.rank * ngpus_per_node + gpu + + if args.device == 'npu': + dist.init_process_group(backend=args.dist_backend, + world_size=args.world_size, + rank=args.rank) + else: + dist.init_process_group(backend=args.dist_backend, + init_method=args.dist_url, + world_size=args.world_size, + rank=args.rank) + # create model + if args.pretrained: + print("=> using pre-trained model '{}'".format(args.arch)) + #model = models.__dict__[args.arch](pretrained=True) + model = resnet101.resnet101() + print("Load my 
train models...") + pretrained_dict = \ + torch.load("/home/ResNet101/model_best.pth.tar", map_location="cpu")["state_dict"] + model.load_state_dict(pretrained_dict, strict=False) + else: + print("=> creating model '{}'".format(args.arch)) + model = models.__dict__[args.arch]() + + if args.distributed: + # For multiprocessing distributed, DistributedDataParallel constructor + # should always set the single device scope, otherwise, + # DistributedDataParallel will use all available devices. + if args.gpu is not None: + if args.device == 'npu': + loc = 'npu:{}'.format(args.gpu) + torch.npu.set_device(loc) + model = model.to(loc) + else: + torch.cuda.set_device(args.gpu) + model.cuda(args.gpu) + + # When using a single GPU per process and per + # DistributedDataParallel, we need to divide the batch size + # ourselves based on the total number of GPUs we have + args.batch_size = int(args.batch_size / args.world_size) + args.workers = int((args.workers + ngpus_per_node - 1) / ngpus_per_node) + else: + if args.device == 'npu': + loc = 'npu:{}'.format(args.gpu) + model = model.to(loc) + else: + model.cuda() + # DistributedDataParallel will divide and allocate batch_size to all + # available GPUs if device_ids are not set + print("[gpu id:", args.gpu, "]", + "===========test args.gpu is not None else==============") + elif args.gpu is not None: + print("[gpu id:", args.gpu, "]", + "==============test elif args.gpu is not None:================") + if args.device == 'npu': + loc = 'npu:{}'.format(args.gpu) + torch.npu.set_device(args.gpu) + model = model.to(loc) + else: + torch.cuda.set_device(args.gpu) + model = model.cuda(args.gpu) + + else: + # DataParallel will divide and allocate batch_size to all available GPUs + print("[gpu id:", args.gpu, "]", "==============test 1===============") + if args.arch.startswith('alexnet') or args.arch.startswith('vgg'): + print("[gpu id:", args.gpu, "]", "============test 2=============") + else: + print("[gpu id:", args.gpu, "]", 
"===========test 3==============") + if args.device == 'npu': + loc = 'npu:{}'.format(args.gpu) + else: + print("before : model = torch.nn.DataParallel(model).cuda()") + + # define loss function (criterion) and optimizer + if args.FusedSGD: + from apex.optimizers import NpuFusedSGD + optimizer = NpuFusedSGD(model.parameters(), args.lr, + momentum=args.momentum, + weight_decay=args.weight_decay) + else: + optimizer = torch.optim.SGD(model.parameters(), args.lr, + momentum=args.momentum, + weight_decay=args.weight_decay) + + if args.amp: + model, optimizer = amp.initialize( + model, optimizer, opt_level=args.opt_level, loss_scale=args.loss_scale) + + if args.distributed: + # For multiprocessing distributed, DistributedDataParallel constructor + # should always set the single device scope, otherwise, + # DistributedDataParallel will use all available devices. + if args.gpu is not None: + # When using a single GPU per process and per + # DistributedDataParallel, we need to divide the batch size + # ourselves based on the total number of GPUs we have + # args.batch_size = int(args.batch_size / ngpus_per_node) + # args.workers = int((args.workers + ngpus_per_node - 1) / ngpus_per_node) + if args.pretrained: + model = torch.nn.parallel.DistributedDataParallel(model, device_ids=[args.gpu], broadcast_buffers=False, + find_unused_parameters=True) + else: + model = torch.nn.parallel.DistributedDataParallel(model, device_ids=[args.gpu], broadcast_buffers=False) + else: + print("[gpu id:", args.gpu, "]", + "============================test args.gpu is not None else==========================") + model = torch.nn.parallel.DistributedDataParallel(model) + elif args.gpu is not None: + print("[gpu id:", args.gpu, "]", + "============================test elif args.gpu is not None:==========================") + else: + # DataParallel will divide and allocate batch_size to all available GPUs + print("[gpu id:", args.gpu, "]", "============================test 
1==========================") + if args.arch.startswith('alexnet') or args.arch.startswith('vgg'): + print("[gpu id:", args.gpu, "]", "============================test 2==========================") + model.features = torch.nn.DataParallel(model.features) + model.cuda() + else: + print("[gpu id:", args.gpu, "]", "============================test 3==========================") + if args.device == 'npu': + loc = 'npu:{}'.format(args.gpu) + model = torch.nn.DataParallel(model).to(loc) + else: + model = torch.nn.DataParallel(model).cuda() + + if args.device == 'npu': + loc = 'npu:{}'.format(args.gpu) + criterion = nn.CrossEntropyLoss().to(loc) + else: + criterion = nn.CrossEntropyLoss().cuda(args.gpu) + + # optionally resume from a checkpoint + if args.resume: + if os.path.isfile(args.resume): + print("=> loading checkpoint '{}'".format(args.resume)) + if args.gpu is None: + checkpoint = torch.load(args.resume) + else: + # Map model to be loaded to specified single gpu. + if args.device == 'npu': + loc = 'npu:{}'.format(args.gpu) + else: + loc = 'cuda:{}'.format(args.gpu) + checkpoint = torch.load(args.resume, map_location=loc) + args.start_epoch = checkpoint['epoch'] + best_acc1 = checkpoint['best_acc1'] + if args.gpu is not None: + # best_acc1 may be from a checkpoint from a different GPU + best_acc1 = best_acc1.to(args.gpu) + model.load_state_dict(checkpoint['state_dict']) + optimizer.load_state_dict(checkpoint['optimizer']) + if args.amp: + amp.load_state_dict(checkpoint['amp']) + print("=> loaded checkpoint '{}' (epoch {})" + .format(args.resume, checkpoint['epoch'])) + else: + print("=> no checkpoint found at '{}'".format(args.resume)) + + cudnn.benchmark = True + + # Data loading code + traindir = os.path.join(args.data, 'train') + valdir = os.path.join(args.data, 'val') + normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406], + std=[0.229, 0.224, 0.225]) + + train_dataset = datasets.ImageFolder( + traindir, + transforms.Compose([ + 
transforms.RandomResizedCrop(224), + transforms.RandomHorizontalFlip(), + transforms.ToTensor(), + normalize, + ])) + + if args.distributed: + train_sampler = torch.utils.data.distributed.DistributedSampler( + train_dataset) + else: + train_sampler = None + + train_loader = torch.utils.data.DataLoader( + train_dataset, batch_size=args.batch_size, shuffle=( + train_sampler is None), + num_workers=args.workers, pin_memory=True, sampler=train_sampler, drop_last=True) + + val_loader = torch.utils.data.DataLoader( + datasets.ImageFolder(valdir, transforms.Compose([ + transforms.Resize(256), + transforms.CenterCrop(224), + transforms.ToTensor(), + normalize, + ])), + batch_size=args.batch_size, shuffle=True, + num_workers=args.workers, pin_memory=False, drop_last=True) + + if args.evaluate: + validate(val_loader, model, criterion, args, ngpus_per_node) + return + + start_time = time.time() + for epoch in range(args.start_epoch, args.epochs): + if args.distributed: + train_sampler.set_epoch(epoch) + + adjust_learning_rate(optimizer, epoch, args) + + # train for one epoch + train(train_loader, model, criterion, optimizer, epoch, args, ngpus_per_node) + + # evaluate on validation set + acc1 = validate(val_loader, model, criterion, args, ngpus_per_node) + + # remember best acc@1 and save checkpoint + is_best = acc1 > best_acc1 + best_acc1 = max(acc1, best_acc1) + if args.device == 'npu' and args.gpu == 0 and epoch == 89: + print("Complete 90 epoch training, take time:{}h".format(round((time.time() - start_time) / 3600.0, 2))) + + if not args.multiprocessing_distributed or (args.multiprocessing_distributed + and args.rank % ngpus_per_node == 0): + if args.amp: + save_checkpoint({ + 'epoch': epoch + 1, + 'arch': args.arch, + 'state_dict': model.state_dict(), + 'best_acc1': best_acc1, + 'optimizer': optimizer.state_dict(), + 'amp': amp.state_dict(), + }, is_best) + else: + save_checkpoint({ + 'epoch': epoch + 1, + 'arch': args.arch, + 'state_dict': model.state_dict(), + 
'best_acc1': best_acc1, + 'optimizer': optimizer.state_dict(), + }, is_best) + + +def train(train_loader, model, criterion, optimizer, epoch, args, ngpus_per_node): + batch_time = AverageMeter('Time', ':6.3f') + data_time = AverageMeter('Data', ':6.3f') + losses = AverageMeter('Loss', ':.4e', start_count_index=0) + top1 = AverageMeter('Acc@1', ':6.2f', start_count_index=0) + top5 = AverageMeter('Acc@5', ':6.2f', start_count_index=0) + progress = ProgressMeter( + len(train_loader), + [batch_time, data_time, losses, top1, top5], + prefix="Epoch: [{}]".format(epoch)) + + # switch to train mode + model.train() + + end = time.time() + for i, (images, target) in enumerate(train_loader): + # measure data loading time + data_time.update(time.time() - end) + + if args.device == 'npu': + loc = 'npu:{}'.format(args.gpu) + images = images.to(loc, non_blocking=True).to(torch.float) + target = target.to(torch.int32).to(loc, non_blocking=True) + else: + images = images.cuda(args.gpu, non_blocking=True) + target = target.cuda(args.gpu, non_blocking=True) + + # compute output + output = model(images) + loss = criterion(output, target) + + # measure accuracy and record loss + acc1, acc5 = accuracy(output, target, topk=(1, 5)) + losses.update(loss.item(), images.size(0)) + top1.update(acc1[0], images.size(0)) + top5.update(acc5[0], images.size(0)) + + # compute gradient and do SGD step + optimizer.zero_grad() + if args.amp: + with amp.scale_loss(loss, optimizer) as scaled_loss: + scaled_loss.backward() + else: + loss.backward() + optimizer.step() + + # measure elapsed time + cost_time = time.time() - end + batch_time.update(cost_time) + end = time.time() + + if i % args.print_freq == 0: + if not args.multiprocessing_distributed or (args.multiprocessing_distributed + and args.rank % ngpus_per_node == 0): + progress.display(i) + + if not args.multiprocessing_distributed or (args.multiprocessing_distributed + and args.rank % ngpus_per_node == 0): + print("[npu id:", args.gpu, "]", 
"batch_size:", args.batch_size, + 'Time: {:.3f}'.format(batch_time.avg), '* FPS@all {:.3f}'.format( + args.batch_size / batch_time.avg)) + + +def validate(val_loader, model, criterion, args, ngpus_per_node): + batch_time = AverageMeter('Time', ':6.3f', start_count_index=2) + losses = AverageMeter('Loss', ':.4e', start_count_index=0) + top1 = AverageMeter('Acc@1', ':6.2f', start_count_index=0) + top5 = AverageMeter('Acc@5', ':6.2f', start_count_index=0) + progress = ProgressMeter( + len(val_loader), + [batch_time, losses, top1, top5], + prefix='Test: ') + + # switch to evaluate mode + model.eval() + + with torch.no_grad(): + end = time.time() + for i, (images, target) in enumerate(val_loader): + if args.gpu is not None: + if args.device == 'npu': + loc = 'npu:{}'.format(args.gpu) + images = images.to(loc).to(torch.float) + else: + images = images.cuda(args.gpu, non_blocking=True) + if args.device == 'npu': + loc = 'npu:{}'.format(args.gpu) + target = target.to(torch.int32).to(loc, non_blocking=True) + else: + target = target.cuda(args.gpu, non_blocking=True) + + # compute output + output = model(images) + loss = criterion(output, target) + + # measure accuracy and record loss + acc1, acc5 = accuracy(output, target, topk=(1, 5)) + losses.update(loss.item(), images.size(0)) + top1.update(acc1[0], images.size(0)) + top5.update(acc5[0], images.size(0)) + + # measure elapsed time + cost_time = time.time() - end + batch_time.update(cost_time) + end = time.time() + + if i % args.print_freq == 0: + if not args.multiprocessing_distributed or (args.multiprocessing_distributed + and args.rank % ngpus_per_node == 0): + progress.display(i) + + if i % args.print_freq == 0: + if not args.multiprocessing_distributed or (args.multiprocessing_distributed + and args.rank % ngpus_per_node == 0): + print("[gpu id:", args.gpu, "]", '[AVG-ACC] * Acc@1 {top1.avg:.3f} Acc@5 {top5.avg:.3f}' + .format(top1=top1, top5=top5)) + + return top1.avg + + +def save_checkpoint(state, is_best, 
filename='checkpoint.pth.tar'): + torch.save(state, filename) + if is_best: + shutil.copyfile(filename, 'model_best.pth.tar') + + +class AverageMeter(object): + """Computes and stores the average and current value""" + + def __init__(self, name, fmt=':f', start_count_index=2): + self.name = name + self.fmt = fmt + self.reset() + self.start_count_index = start_count_index + + def reset(self): + self.val = 0 + self.avg = 0 + self.sum = 0 + self.count = 0 + + def update(self, val, n=1): + if self.count == 0: + self.N = n + + self.val = val + self.count += n + if self.count > (self.start_count_index * self.N): + self.sum += val * n + self.avg = self.sum / (self.count - self.start_count_index * self.N) + + def __str__(self): + fmtstr = '{name} {val' + self.fmt + '} ({avg' + self.fmt + '})' + return fmtstr.format(**self.__dict__) + +class ProgressMeter(object): + def __init__(self, num_batches, meters, prefix=""): + self.batch_fmtstr = self._get_batch_fmtstr(num_batches) + self.meters = meters + self.prefix = prefix + + def display(self, batch): + entries = [self.prefix + self.batch_fmtstr.format(batch)] + entries += [str(meter) for meter in self.meters] + print('\t'.join(entries)) + + def _get_batch_fmtstr(self, num_batches): + num_digits = len(str(num_batches // 1)) + fmt = '{:' + str(num_digits) + 'd}' + return '[' + fmt + '/' + fmt.format(num_batches) + ']' + + +def adjust_learning_rate(optimizer, epoch, args): + """Sets the learning rate to the initial LR decayed by 10 every 30 epochs""" + lr = args.lr * (0.1 ** (epoch // 30)) + for param_group in optimizer.param_groups: + param_group['lr'] = lr + + +def accuracy(output, target, topk=(1,)): + """Computes the accuracy over the k top predictions for the specified values of k""" + with torch.no_grad(): + maxk = max(topk) + batch_size = target.size(0) + + _, pred = output.topk(maxk, 1, True, True) + pred = pred.t() + correct = pred.eq(target.view(1, -1).expand_as(pred)) + + res = [] + for k in topk: + correct_k = 
correct[:k].reshape(-1).float().sum(0, keepdim=True) + res.append(correct_k.mul_(100.0 / batch_size)) + return res + + +if __name__ == '__main__': + main() diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/modelzoo_level.txt" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/modelzoo_level.txt" new file mode 100644 index 0000000000000000000000000000000000000000..d9cb363132a9aa5f027e117463592d7e8dbca150 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/modelzoo_level.txt" @@ -0,0 +1,3 @@ +FuncStatus:OK +PerfStatus:NOK +PrecisionStatus:NOK \ No newline at end of file diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/pthtar2onnx.py" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/pthtar2onnx.py" new file mode 100644 index 0000000000000000000000000000000000000000..50281ab0c5832ccd0522538448af8b47f3f12de9 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/pthtar2onnx.py" @@ -0,0 +1,37 @@ +# -*- coding: utf-8 -*- +# Copyright 2021 Huawei Technologies Co., Ltd +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# ============================================================================ + +import torch +import torchvision +import torch.onnx + +def convert(): + checkpoint = torch.load("./checkpoint.pth.tar", map_location='cpu') + model = torchvision.models.resnet101(pretrained=True) + model.load_state_dict(checkpoint['state_dict'], False) + model.eval() + print(model) + + input_names = ["actual_input_1"] + output_names = ["output1"] + dummy_input = torch.randn(16, 3, 224, 224) + torch.onnx.export(model, dummy_input, "resnet101_npu_16.onnx", + input_names=input_names, output_names=output_names, + opset_version=11) + + +if __name__ == "__main__": + convert() diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/requirements.txt" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/requirements.txt" new file mode 100644 index 0000000000000000000000000000000000000000..1ca16753b069e278dd0d4d5782e06b9c3b0ca654 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/requirements.txt" @@ -0,0 +1,5 @@ +torch==1.5.0 +apex +torchvision +onnx +opencv-python \ No newline at end of file diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/resnet.py" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/resnet.py" new file mode 100644 index 0000000000000000000000000000000000000000..4817aec6be2730861d9273a712b9b4f6e31f8e29 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/resnet.py" @@ -0,0 +1,295 @@ +# -*- coding: utf-8 -*- +# Copyright 2021 Huawei Technologies Co., Ltd +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# ============================================================================ + +import torch.nn as nn +import torch.utils.model_zoo as model_zoo + +__all__ = ['ResNet', 'resnet18', 'resnet34', 'resnet50', 'resnet101', + 'resnet152'] + + +model_urls = { + 'resnet18': 'https://download.pytorch.org/models/resnet18-5c106cde.pth', + 'resnet34': 'https://download.pytorch.org/models/resnet34-333f7ec4.pth', + 'resnet50': 'https://download.pytorch.org/models/resnet50-19c8e357.pth', + 'resnet101': 'https://download.pytorch.org/models/resnet101-5d3b4d8f.pth', + 'resnet152': 'https://download.pytorch.org/models/resnet152-b121ed2d.pth', +} + + +def conv3x3(in_planes, out_planes, stride=1): + """3x3 convolution with padding""" + return nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride, + padding=1, bias=False) + + +def conv1x1(in_planes, out_planes, stride=1): + """1x1 convolution""" + return nn.Conv2d(in_planes, out_planes, kernel_size=1, stride=stride, bias=False) + + +class BasicBlock(nn.Module): + expansion = 1 + + def __init__(self, inplanes, planes, stride=1, downsample=None): + super(BasicBlock, self).__init__() + self.conv1 = conv3x3(inplanes, planes, stride) + self.bn1 = nn.BatchNorm2d(planes) + self.relu = nn.ReLU(inplace=True) + self.conv2 = conv3x3(planes, planes) + self.bn2 = nn.BatchNorm2d(planes) + self.downsample = downsample + self.stride = stride + + def forward(self, x): + identity = x + + out = self.conv1(x) + out = self.bn1(out) + out = self.relu(out) + + out = self.conv2(out) + out = self.bn2(out) + + if self.downsample is not None: + 
identity = self.downsample(x) + + out += identity + out = self.relu(out) + + return out + + +class Bottleneck(nn.Module): + expansion = 4 + + def __init__(self, inplanes, planes, stride=1, downsample=None): + super(Bottleneck, self).__init__() + self.conv1 = conv1x1(inplanes, planes) + self.bn1 = nn.BatchNorm2d(planes) + self.conv2 = conv3x3(planes, planes, stride) + self.bn2 = nn.BatchNorm2d(planes) + self.conv3 = conv1x1(planes, planes * self.expansion) + self.bn3 = nn.BatchNorm2d(planes * self.expansion) + self.relu = nn.ReLU(inplace=True) + self.downsample = downsample + self.stride = stride + + def forward(self, x): + identity = x + + out = self.conv1(x) + out = self.bn1(out) + out = self.relu(out) + + out = self.conv2(out) + out = self.bn2(out) + out = self.relu(out) + + out = self.conv3(out) + out = self.bn3(out) + + if self.downsample is not None: + identity = self.downsample(x) + + out += identity + out = self.relu(out) + + return out + + +class ResNet(nn.Module): + + def __init__(self, block, layers, num_classes=1000, zero_init_residual=False): + super(ResNet, self).__init__() + self.inplanes = 64 + self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3, + bias=False) + self.bn1 = nn.BatchNorm2d(64) + self.relu = nn.ReLU(inplace=True) + self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1) + self.layer1 = self._make_layer(block, 64, layers[0]) + self.layer2 = self._make_layer(block, 128, layers[1], stride=2) + self.layer3 = self._make_layer(block, 256, layers[2], stride=2) + self.layer4 = self._make_layer(block, 512, layers[3], stride=2) + self.avgpool = nn.AdaptiveAvgPool2d((1, 1)) + self.fc = nn.Linear(512 * block.expansion, num_classes) + + for m in self.modules(): + if isinstance(m, nn.Conv2d): + nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu') + elif isinstance(m, nn.BatchNorm2d): + nn.init.constant_(m.weight, 1) + nn.init.constant_(m.bias, 0) + + # Zero-initialize the last BN in each residual branch, + # so 
that the residual branch starts with zeros, and each residual block behaves like an identity. + # This improves the model by 0.2~0.3% according to https://arxiv.org/abs/1706.02677 + if zero_init_residual: + for m in self.modules(): + if isinstance(m, Bottleneck): + nn.init.constant_(m.bn3.weight, 0) + elif isinstance(m, BasicBlock): + nn.init.constant_(m.bn2.weight, 0) + + def _make_layer(self, block, planes, blocks, stride=1): + downsample = None + if stride != 1 or self.inplanes != planes * block.expansion: + downsample = nn.Sequential( + conv1x1(self.inplanes, planes * block.expansion, stride), + nn.BatchNorm2d(planes * block.expansion), + ) + + layers = [] + layers.append(block(self.inplanes, planes, stride, downsample)) + self.inplanes = planes * block.expansion + for _ in range(1, blocks): + layers.append(block(self.inplanes, planes)) + + return nn.Sequential(*layers) + + def forward(self, x): + # See note [TorchScript super()] + x = self.conv1(x) + x = self.bn1(x) + x = self.relu(x) + x = self.maxpool(x) + + x = self.layer1(x) + x = self.layer2(x) + x = self.layer3(x) + x = self.layer4(x) + + x = self.avgpool(x) + x = x.view(x.size(0), -1) + x = self.fc(x) + + return x + + +def resnet18(pretrained=False, progress=True, **kwargs): + r"""ResNet-18 model from + `"Deep Residual Learning for Image Recognition" `_ + Args: + pretrained (bool): If True, returns a model pre-trained on ImageNet + progress (bool): If True, displays a progress bar of the download to stderr + """ + return _resnet('resnet18', BasicBlock, [2, 2, 2, 2], pretrained, progress, + **kwargs) + + +def resnet34(pretrained=False, progress=True, **kwargs): + r"""ResNet-34 model from + `"Deep Residual Learning for Image Recognition" `_ + Args: + pretrained (bool): If True, returns a model pre-trained on ImageNet + progress (bool): If True, displays a progress bar of the download to stderr + """ + return _resnet('resnet34', BasicBlock, [3, 4, 6, 3], pretrained, progress, + **kwargs) + + +def 
resnet50(pretrained=False, progress=True, **kwargs): + r"""ResNet-50 model from + `"Deep Residual Learning for Image Recognition" `_ + Args: + pretrained (bool): If True, returns a model pre-trained on ImageNet + progress (bool): If True, displays a progress bar of the download to stderr + """ + return _resnet('resnet50', Bottleneck, [3, 4, 6, 3], pretrained, progress, + **kwargs) + + +def resnet101(pretrained=False, progress=True, **kwargs): + r"""ResNet-101 model from + `"Deep Residual Learning for Image Recognition" `_ + Args: + pretrained (bool): If True, returns a model pre-trained on ImageNet + progress (bool): If True, displays a progress bar of the download to stderr + """ + model = ResNet(Bottleneck, [3, 4, 23, 3], **kwargs) + if pretrained: + model.load_state_dict(model_zoo.load_url(model_urls['resnet101'], progress=progress)) + return model + + +def resnet152(pretrained=False, progress=True, **kwargs): + r"""ResNet-152 model from + `"Deep Residual Learning for Image Recognition" `_ + Args: + pretrained (bool): If True, returns a model pre-trained on ImageNet + progress (bool): If True, displays a progress bar of the download to stderr + """ + return _resnet('resnet152', Bottleneck, [3, 8, 36, 3], pretrained, progress, + **kwargs) + + +def resnext50_32x4d(pretrained=False, progress=True, **kwargs): + r"""ResNeXt-50 32x4d model from + `"Aggregated Residual Transformations for Deep Neural Networks" `_ + Args: + pretrained (bool): If True, returns a model pre-trained on ImageNet + progress (bool): If True, displays a progress bar of the download to stderr + """ + kwargs['groups'] = 32 + kwargs['width_per_group'] = 4 + return _resnet('resnext50_32x4d', Bottleneck, [3, 4, 6, 3], + pretrained, progress, **kwargs) + + +def resnext101_32x8d(pretrained=False, progress=True, **kwargs): + r"""ResNeXt-101 32x8d model from + `"Aggregated Residual Transformations for Deep Neural Networks" `_ + Args: + pretrained (bool): If True, returns a model pre-trained on ImageNet + progress (bool): If True, displays
a progress bar of the download to stderr + """ + kwargs['groups'] = 32 + kwargs['width_per_group'] = 8 + return _resnet('resnext101_32x8d', Bottleneck, [3, 4, 23, 3], + pretrained, progress, **kwargs) + + +def wide_resnet50_2(pretrained=False, progress=True, **kwargs): + r"""Wide ResNet-50-2 model from + `"Wide Residual Networks" `_ + The model is the same as ResNet except for the bottleneck number of channels, + which is twice as large in every block. The number of channels in outer 1x1 + convolutions is the same, e.g. the last block in ResNet-50 has 2048-512-2048 + channels, and in Wide ResNet-50-2 has 2048-1024-2048. + Args: + pretrained (bool): If True, returns a model pre-trained on ImageNet + progress (bool): If True, displays a progress bar of the download to stderr + """ + kwargs['width_per_group'] = 64 * 2 + return _resnet('wide_resnet50_2', Bottleneck, [3, 4, 6, 3], + pretrained, progress, **kwargs) + + +def wide_resnet101_2(pretrained=False, progress=True, **kwargs): + r"""Wide ResNet-101-2 model from + `"Wide Residual Networks" `_ + The model is the same as ResNet except for the bottleneck number of channels, + which is twice as large in every block. The number of channels in outer 1x1 + convolutions is the same, e.g. the last block in ResNet-50 has 2048-512-2048 + channels, and in Wide ResNet-50-2 has 2048-1024-2048. 
+ Args: + pretrained (bool): If True, returns a model pre-trained on ImageNet + progress (bool): If True, displays a progress bar of the download to stderr + """ + kwargs['width_per_group'] = 64 * 2 + return _resnet('wide_resnet101_2', Bottleneck, [3, 4, 23, 3], + pretrained, progress, **kwargs) \ No newline at end of file diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/env.sh" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/env.sh" new file mode 100644 index 0000000000000000000000000000000000000000..6567ba8318b7776486ff78a0bdc44bcc58a7cf09 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/env.sh" @@ -0,0 +1,65 @@ +#!/bin/bash +export install_path=/usr/local/Ascend + +if [ -d ${install_path}/toolkit ]; then + export LD_LIBRARY_PATH=/usr/include/hdf5/lib/:/usr/local/:/usr/local/lib/:/usr/lib/:${install_path}/fwkacllib/lib64/:${install_path}/driver/lib64/common/:${install_path}/driver/lib64/driver/:${install_path}/add-ons:${path_lib}:${LD_LIBRARY_PATH} + export PATH=${install_path}/fwkacllib/ccec_compiler/bin:${install_path}/fwkacllib/bin:$PATH + export PYTHONPATH=${install_path}/fwkacllib/python/site-packages:${install_path}/tfplugin/python/site-packages:${install_path}/toolkit/python/site-packages:$PYTHONPATH + export PYTHONPATH=/usr/local/python3.7.5/lib/python3.7/site-packages:$PYTHONPATH + export ASCEND_OPP_PATH=${install_path}/opp +else + if [ -d ${install_path}/nnae/latest ];then + export LD_LIBRARY_PATH=/usr/local/:/usr/local/python3.7.5/lib/:/usr/local/openblas/lib:/usr/local/lib/:/usr/lib64/:/usr/lib/:${install_path}/nnae/latest/fwkacllib/lib64/:${install_path}/driver/lib64/common/:${install_path}/driver/lib64/driver/:${install_path}/add-ons/:/usr/lib/aarch64-linux-gnu:$LD_LIBRARY_PATH + export
PATH=$PATH:${install_path}/nnae/latest/fwkacllib/ccec_compiler/bin/:${install_path}/nnae/latest/toolkit/tools/ide_daemon/bin/ + export ASCEND_OPP_PATH=${install_path}/nnae/latest/opp/ + export OPTION_EXEC_EXTERN_PLUGIN_PATH=${install_path}/nnae/latest/fwkacllib/lib64/plugin/opskernel/libfe.so:${install_path}/nnae/latest/fwkacllib/lib64/plugin/opskernel/libaicpu_engine.so:${install_path}/nnae/latest/fwkacllib/lib64/plugin/opskernel/libge_local_engine.so + export PYTHONPATH=${install_path}/nnae/latest/fwkacllib/python/site-packages/:${install_path}/nnae/latest/fwkacllib/python/site-packages/auto_tune.egg/auto_tune:${install_path}/nnae/latest/fwkacllib/python/site-packages/schedule_search.egg:$PYTHONPATH + export ASCEND_AICPU_PATH=${install_path}/nnae/latest + else + export LD_LIBRARY_PATH=/usr/local/:/usr/local/lib/:/usr/lib64/:/usr/lib/:/usr/local/python3.7.5/lib/:/usr/local/openblas/lib:${install_path}/ascend-toolkit/latest/fwkacllib/lib64/:${install_path}/driver/lib64/common/:${install_path}/driver/lib64/driver/:${install_path}/add-ons/:/usr/lib/aarch64-linux-gnu:$LD_LIBRARY_PATH + export PATH=$PATH:${install_path}/ascend-toolkit/latest/fwkacllib/ccec_compiler/bin/:${install_path}/ascend-toolkit/latest/toolkit/tools/ide_daemon/bin/ + export ASCEND_OPP_PATH=${install_path}/ascend-toolkit/latest/opp/ + export OPTION_EXEC_EXTERN_PLUGIN_PATH=${install_path}/ascend-toolkit/latest/fwkacllib/lib64/plugin/opskernel/libfe.so:${install_path}/ascend-toolkit/latest/fwkacllib/lib64/plugin/opskernel/libaicpu_engine.so:${install_path}/ascend-toolkit/latest/fwkacllib/lib64/plugin/opskernel/libge_local_engine.so + export PYTHONPATH=${install_path}/ascend-toolkit/latest/fwkacllib/python/site-packages/:${install_path}/ascend-toolkit/latest/fwkacllib/python/site-packages/auto_tune.egg/auto_tune:${install_path}/ascend-toolkit/latest/fwkacllib/python/site-packages/schedule_search.egg:$PYTHONPATH + export ASCEND_AICPU_PATH=${install_path}/ascend-toolkit/latest + fi +fi + + 
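The nested branches above select one of three CANN install layouts in a fixed probe order. A minimal standalone sketch (assuming the same probe order and the default `/usr/local/Ascend` root used by this script) that only reports which layout would be chosen, without exporting anything:

```shell
#!/bin/bash
# Sketch only: mirrors the probe order of the env.sh branches above
# (toolkit -> nnae/latest -> ascend-toolkit fallback).
# Safe to run on hosts without Ascend installed; it only tests directories.
install_path=${install_path:-/usr/local/Ascend}

if [ -d "${install_path}/toolkit" ]; then
    layout="toolkit"
elif [ -d "${install_path}/nnae/latest" ]; then
    layout="nnae"
else
    layout="ascend-toolkit"
fi

echo "CANN layout selected: ${layout}"
```

On a host with no Ascend package at all, this falls through to `ascend-toolkit`, which matches the script's final `else` branch.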
+#Output Host logs to the serial port, 0-off/1-on +export ASCEND_SLOG_PRINT_TO_STDOUT=0 +#Set the default log level, 0-debug/1-info/2-warning/3-error +export ASCEND_GLOBAL_LOG_LEVEL=3 +#Set the Event log enable flag, 0-off/1-on +export ASCEND_GLOBAL_EVENT_ENABLE=0 +#Whether to enable taskque, 0-off/1-on +export TASK_QUEUE_ENABLE=1 +#Whether to enable PTCopy, 0-off/1-on +export PTCOPY_ENABLE=1 +#Whether to enable the combined flag, 0-off/1-on +export COMBINED_ENABLE=1 +#HCCL whitelist switch, 1-off/0-on +export HCCL_WHITELIST_DISABLE=1 +export HCCL_IF_IP=$(hostname -I |awk '{print $1}') + + +path_lib=$(python3.7 -c """ +import sys +import re +result='' +for index in range(len(sys.path)): + match_sit = re.search('-packages', sys.path[index]) + if match_sit is not None: + match_lib = re.search('lib', sys.path[index]) + + if match_lib is not None: + end=match_lib.span()[1] + result += sys.path[index][0:end] + ':' + + result+=sys.path[index] + '/torch/lib:' +print(result)""" +) + +echo ${path_lib} + +export LD_LIBRARY_PATH=/usr/local/python3.7.5/lib/:${path_lib}:$LD_LIBRARY_PATH diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/output/0/ResNet101_ID1595_for_PyTorch_bs256_1p_perf.log" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/output/0/ResNet101_ID1595_for_PyTorch_bs256_1p_perf.log" new file mode 100644 index 0000000000000000000000000000000000000000..79f9e8881921a963a3aa24c975773ee8c67d009f --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/output/0/ResNet101_ID1595_for_PyTorch_bs256_1p_perf.log" @@ -0,0 +1,9 @@ +Network = ResNet101_ID1595_for_PyTorch +RankSize = 1 +BatchSize = 256 +DeviceType = aarch64 +CaseName = ResNet101_ID1595_for_PyTorch_bs256_1p_perf +ActualFPS = 1048.610 +TrainingTime = 244.13 +ActualLoss = 3.7035e+00 +E2ETrainingTime = 2793 diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/output/0/train_0.log"
"b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/output/0/train_0.log" new file mode 100644 index 0000000000000000000000000000000000000000..7a189f304ce766996c474d13f4a90f11c10543a1 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/output/0/train_0.log" @@ -0,0 +1,5039 @@ +0,1,2,3,4,5,6,7 +ngpus_per_node: 8 +Use GPU: 2 for training +=> creating model 'resnet101' +[gpu id: 2 ] ==============test elif args.gpu is not None:================ +./main.py:136: UserWarning: You have chosen to seed training. This will turn on the CUDNN deterministic setting, which can slow down your training considerably! You may see unexpected behavior when restarting from checkpoints. + warnings.warn('You have chosen to seed training. ' +./main.py:143: UserWarning: You have chosen a specific GPU. This will completely disable data parallelism. + warnings.warn('You have chosen a specific GPU. This will completely ' +Selected optimization level O2: FP16 training with FP32 batchnorm and FP32 master weights. + +Defaults for this optimization level are: +enabled : True +opt_level : O2 +cast_model_type : torch.float16 +patch_torch_functions : False +keep_batchnorm_fp32 : True +master_weights : True +loss_scale : dynamic +combine_grad : None +check_combined_tensors : None +Processing user overrides (additional kwargs that are not None)... 
+After processing overrides, optimization options are: +enabled : True +opt_level : O2 +cast_model_type : torch.float16 +patch_torch_functions : False +keep_batchnorm_fp32 : True +master_weights : True +loss_scale : 1024.0 +combine_grad : None +check_combined_tensors : None +Use npu fused optimizer +[gpu id: 2 ] ============================test elif args.gpu is not None:========================== +Epoch: [0][ 0/5004] Time 196.695 ( 0.000) Data 8.062 ( 0.000) Loss 7.1284e+00 (7.1284e+00) Acc@1 0.00 ( 0.00) Acc@5 0.00 ( 0.00) +Epoch: [0][ 1/5004] Time 15.131 ( 0.000) Data 3.294 ( 0.000) Loss 8.6593e+00 (7.8939e+00) Acc@1 0.39 ( 0.20) Acc@5 1.17 ( 0.59) +Epoch: [0][ 2/5004] Time 0.237 ( 0.237) Data 0.013 ( 0.013) Loss 9.5272e+00 (8.4383e+00) Acc@1 0.00 ( 0.13) Acc@5 0.00 ( 0.39) +Epoch: [0][ 3/5004] Time 0.241 ( 0.239) Data 0.022 ( 0.017) Loss 8.3551e+00 (8.4175e+00) Acc@1 0.00 ( 0.10) Acc@5 0.00 ( 0.29) +Epoch: [0][ 4/5004] Time 0.239 ( 0.239) Data 0.024 ( 0.019) Loss 8.6900e+00 (8.4720e+00) Acc@1 0.00 ( 0.08) Acc@5 0.78 ( 0.39) +Epoch: [0][ 5/5004] Time 0.238 ( 0.239) Data 0.025 ( 0.021) Loss 8.8442e+00 (8.5340e+00) Acc@1 0.00 ( 0.07) Acc@5 0.39 ( 0.39) +Epoch: [0][ 6/5004] Time 0.235 ( 0.238) Data 0.025 ( 0.022) Loss 7.5124e+00 (8.3881e+00) Acc@1 0.39 ( 0.11) Acc@5 0.39 ( 0.39) +Epoch: [0][ 7/5004] Time 0.240 ( 0.238) Data 0.028 ( 0.023) Loss 8.7513e+00 (8.4335e+00) Acc@1 0.00 ( 0.10) Acc@5 0.78 ( 0.44) +Epoch: [0][ 8/5004] Time 0.240 ( 0.239) Data 0.026 ( 0.023) Loss 8.4683e+00 (8.4374e+00) Acc@1 0.39 ( 0.13) Acc@5 0.39 ( 0.43) +Epoch: [0][ 9/5004] Time 0.238 ( 0.238) Data 0.025 ( 0.023) Loss 8.1676e+00 (8.4104e+00) Acc@1 0.00 ( 0.12) Acc@5 0.00 ( 0.39) +Epoch: [0][ 10/5004] Time 0.237 ( 0.238) Data 0.025 ( 0.024) Loss 8.2323e+00 (8.3942e+00) Acc@1 0.00 ( 0.11) Acc@5 0.00 ( 0.36) +Epoch: [0][ 11/5004] Time 0.239 ( 0.238) Data 0.028 ( 0.024) Loss 7.7600e+00 (8.3413e+00) Acc@1 0.00 ( 0.10) Acc@5 0.78 ( 0.39) +Epoch: [0][ 12/5004] Time 0.239 ( 0.239) Data 0.027 ( 
0.024) Loss 7.6141e+00 (8.2854e+00) Acc@1 0.00 ( 0.09) Acc@5 0.00 ( 0.36) +Epoch: [0][ 13/5004] Time 0.244 ( 0.239) Data 0.025 ( 0.024) Loss 8.0500e+00 (8.2686e+00) Acc@1 0.00 ( 0.08) Acc@5 0.00 ( 0.33) +Epoch: [0][ 14/5004] Time 0.241 ( 0.239) Data 0.024 ( 0.024) Loss 7.8809e+00 (8.2427e+00) Acc@1 0.00 ( 0.08) Acc@5 0.39 ( 0.34) +Epoch: [0][ 15/5004] Time 0.239 ( 0.239) Data 0.022 ( 0.024) Loss 7.6010e+00 (8.2026e+00) Acc@1 0.00 ( 0.07) Acc@5 0.00 ( 0.32) +Epoch: [0][ 16/5004] Time 0.238 ( 0.239) Data 0.024 ( 0.024) Loss 7.5687e+00 (8.1653e+00) Acc@1 0.78 ( 0.11) Acc@5 1.17 ( 0.37) +Epoch: [0][ 17/5004] Time 0.241 ( 0.239) Data 0.025 ( 0.024) Loss 7.7687e+00 (8.1433e+00) Acc@1 0.00 ( 0.11) Acc@5 0.00 ( 0.35) +Epoch: [0][ 18/5004] Time 0.239 ( 0.239) Data 0.024 ( 0.024) Loss 9.0109e+00 (8.1890e+00) Acc@1 0.00 ( 0.10) Acc@5 0.00 ( 0.33) +Epoch: [0][ 19/5004] Time 0.239 ( 0.239) Data 0.025 ( 0.024) Loss 7.4484e+00 (8.1519e+00) Acc@1 0.00 ( 0.10) Acc@5 0.78 ( 0.35) +Epoch: [0][ 20/5004] Time 0.238 ( 0.239) Data 0.025 ( 0.024) Loss 7.3605e+00 (8.1142e+00) Acc@1 0.00 ( 0.09) Acc@5 0.39 ( 0.35) +Epoch: [0][ 21/5004] Time 0.240 ( 0.239) Data 0.025 ( 0.024) Loss 7.1803e+00 (8.0718e+00) Acc@1 0.00 ( 0.09) Acc@5 0.00 ( 0.34) +Epoch: [0][ 22/5004] Time 0.237 ( 0.239) Data 0.024 ( 0.024) Loss 7.7433e+00 (8.0575e+00) Acc@1 0.00 ( 0.08) Acc@5 0.39 ( 0.34) +Epoch: [0][ 23/5004] Time 0.239 ( 0.239) Data 0.025 ( 0.024) Loss 8.3131e+00 (8.0682e+00) Acc@1 0.39 ( 0.10) Acc@5 0.78 ( 0.36) +Epoch: [0][ 24/5004] Time 0.240 ( 0.239) Data 0.024 ( 0.024) Loss 7.5259e+00 (8.0465e+00) Acc@1 0.00 ( 0.09) Acc@5 1.17 ( 0.39) +Epoch: [0][ 25/5004] Time 0.243 ( 0.239) Data 0.025 ( 0.024) Loss 7.1664e+00 (8.0126e+00) Acc@1 0.00 ( 0.09) Acc@5 0.39 ( 0.39) +Epoch: [0][ 26/5004] Time 0.236 ( 0.239) Data 0.023 ( 0.024) Loss 7.1321e+00 (7.9800e+00) Acc@1 0.39 ( 0.10) Acc@5 1.56 ( 0.43) +Epoch: [0][ 27/5004] Time 0.239 ( 0.239) Data 0.025 ( 0.024) Loss 7.1734e+00 (7.9512e+00) Acc@1 0.00 ( 0.10) Acc@5 
0.00 ( 0.42) +Epoch: [0][ 28/5004] Time 0.240 ( 0.239) Data 0.025 ( 0.024) Loss 7.0932e+00 (7.9216e+00) Acc@1 0.00 ( 0.09) Acc@5 0.39 ( 0.42) +Epoch: [0][ 29/5004] Time 0.245 ( 0.239) Data 0.026 ( 0.024) Loss 7.2980e+00 (7.9008e+00) Acc@1 0.78 ( 0.12) Acc@5 0.78 ( 0.43) +Epoch: [0][ 30/5004] Time 0.238 ( 0.239) Data 0.024 ( 0.024) Loss 7.0408e+00 (7.8731e+00) Acc@1 0.39 ( 0.13) Acc@5 1.17 ( 0.45) +Epoch: [0][ 31/5004] Time 0.237 ( 0.239) Data 0.025 ( 0.024) Loss 7.0054e+00 (7.8460e+00) Acc@1 0.00 ( 0.12) Acc@5 0.39 ( 0.45) +Epoch: [0][ 32/5004] Time 0.239 ( 0.239) Data 0.026 ( 0.025) Loss 6.9824e+00 (7.8198e+00) Acc@1 0.00 ( 0.12) Acc@5 0.00 ( 0.44) +Epoch: [0][ 33/5004] Time 0.241 ( 0.239) Data 0.026 ( 0.025) Loss 7.0612e+00 (7.7975e+00) Acc@1 0.00 ( 0.11) Acc@5 0.00 ( 0.43) +Epoch: [0][ 34/5004] Time 0.243 ( 0.239) Data 0.028 ( 0.025) Loss 6.9541e+00 (7.7734e+00) Acc@1 0.39 ( 0.12) Acc@5 1.95 ( 0.47) +Epoch: [0][ 35/5004] Time 0.240 ( 0.239) Data 0.025 ( 0.025) Loss 6.9516e+00 (7.7506e+00) Acc@1 0.39 ( 0.13) Acc@5 1.56 ( 0.50) +Epoch: [0][ 36/5004] Time 0.239 ( 0.239) Data 0.024 ( 0.025) Loss 6.9389e+00 (7.7286e+00) Acc@1 0.00 ( 0.13) Acc@5 0.00 ( 0.49) +Epoch: [0][ 37/5004] Time 0.239 ( 0.239) Data 0.025 ( 0.025) Loss 6.9487e+00 (7.7081e+00) Acc@1 0.00 ( 0.12) Acc@5 1.56 ( 0.51) +Epoch: [0][ 38/5004] Time 0.240 ( 0.239) Data 0.024 ( 0.025) Loss 6.9494e+00 (7.6887e+00) Acc@1 0.00 ( 0.12) Acc@5 0.39 ( 0.51) +Epoch: [0][ 39/5004] Time 0.239 ( 0.239) Data 0.024 ( 0.025) Loss 6.9794e+00 (7.6709e+00) Acc@1 0.00 ( 0.12) Acc@5 0.78 ( 0.52) +Epoch: [0][ 40/5004] Time 0.242 ( 0.239) Data 0.026 ( 0.025) Loss 6.9615e+00 (7.6536e+00) Acc@1 0.39 ( 0.12) Acc@5 1.17 ( 0.53) +Epoch: [0][ 41/5004] Time 0.240 ( 0.239) Data 0.024 ( 0.025) Loss 6.9375e+00 (7.6366e+00) Acc@1 0.39 ( 0.13) Acc@5 0.39 ( 0.53) +Epoch: [0][ 42/5004] Time 0.240 ( 0.239) Data 0.024 ( 0.025) Loss 7.0985e+00 (7.6241e+00) Acc@1 0.39 ( 0.14) Acc@5 0.39 ( 0.53) +Epoch: [0][ 43/5004] Time 0.241 ( 0.240) Data 
0.025 ( 0.025) Loss 6.9116e+00 (7.6079e+00) Acc@1 0.00 ( 0.13) Acc@5 0.78 ( 0.53) +Epoch: [0][ 44/5004] Time 0.241 ( 0.240) Data 0.024 ( 0.025) Loss 6.9217e+00 (7.5926e+00) Acc@1 0.00 ( 0.13) Acc@5 0.00 ( 0.52) +Epoch: [0][ 45/5004] Time 0.235 ( 0.239) Data 0.022 ( 0.025) Loss 6.9310e+00 (7.5782e+00) Acc@1 0.39 ( 0.14) Acc@5 0.78 ( 0.53) +Epoch: [0][ 46/5004] Time 0.240 ( 0.239) Data 0.025 ( 0.025) Loss 6.9175e+00 (7.5642e+00) Acc@1 0.00 ( 0.13) Acc@5 0.39 ( 0.52) +Epoch: [0][ 47/5004] Time 0.239 ( 0.239) Data 0.025 ( 0.025) Loss 6.9402e+00 (7.5512e+00) Acc@1 0.00 ( 0.13) Acc@5 0.78 ( 0.53) +Epoch: [0][ 48/5004] Time 0.244 ( 0.240) Data 0.024 ( 0.025) Loss 7.0500e+00 (7.5409e+00) Acc@1 0.00 ( 0.13) Acc@5 0.39 ( 0.53) +Epoch: [0][ 49/5004] Time 0.237 ( 0.240) Data 0.022 ( 0.025) Loss 6.9396e+00 (7.5289e+00) Acc@1 0.00 ( 0.12) Acc@5 0.39 ( 0.52) +Epoch: [0][ 50/5004] Time 0.238 ( 0.239) Data 0.024 ( 0.025) Loss 6.9335e+00 (7.5172e+00) Acc@1 0.00 ( 0.12) Acc@5 0.00 ( 0.51) +Epoch: [0][ 51/5004] Time 0.238 ( 0.239) Data 0.025 ( 0.025) Loss 6.9301e+00 (7.5060e+00) Acc@1 0.39 ( 0.13) Acc@5 1.17 ( 0.53) +Epoch: [0][ 52/5004] Time 0.239 ( 0.239) Data 0.027 ( 0.025) Loss 6.9106e+00 (7.4947e+00) Acc@1 0.39 ( 0.13) Acc@5 0.39 ( 0.52) +Epoch: [0][ 53/5004] Time 0.247 ( 0.240) Data 0.026 ( 0.025) Loss 6.9183e+00 (7.4840e+00) Acc@1 0.39 ( 0.14) Acc@5 0.78 ( 0.53) +Epoch: [0][ 54/5004] Time 0.240 ( 0.240) Data 0.020 ( 0.025) Loss 6.9287e+00 (7.4739e+00) Acc@1 0.00 ( 0.13) Acc@5 0.00 ( 0.52) +Epoch: [0][ 55/5004] Time 0.238 ( 0.240) Data 0.020 ( 0.024) Loss 6.9055e+00 (7.4638e+00) Acc@1 0.00 ( 0.13) Acc@5 0.39 ( 0.52) +Epoch: [0][ 56/5004] Time 0.248 ( 0.240) Data 0.024 ( 0.024) Loss 6.8933e+00 (7.4538e+00) Acc@1 0.39 ( 0.14) Acc@5 1.56 ( 0.53) +Epoch: [0][ 57/5004] Time 0.241 ( 0.240) Data 0.023 ( 0.024) Loss 6.9284e+00 (7.4447e+00) Acc@1 0.39 ( 0.14) Acc@5 0.39 ( 0.53) +Epoch: [0][ 58/5004] Time 0.246 ( 0.240) Data 0.024 ( 0.024) Loss 6.8942e+00 (7.4354e+00) Acc@1 0.00 ( 0.14) 
Acc@5 0.39 ( 0.53) +Epoch: [0][ 59/5004] Time 0.240 ( 0.240) Data 0.021 ( 0.024) Loss 6.9559e+00 (7.4274e+00) Acc@1 0.00 ( 0.14) Acc@5 0.78 ( 0.53) +Epoch: [0][ 60/5004] Time 0.241 ( 0.240) Data 0.023 ( 0.024) Loss 6.9157e+00 (7.4190e+00) Acc@1 0.00 ( 0.13) Acc@5 0.78 ( 0.54) +Epoch: [0][ 61/5004] Time 0.248 ( 0.240) Data 0.023 ( 0.024) Loss 6.9127e+00 (7.4109e+00) Acc@1 0.00 ( 0.13) Acc@5 0.39 ( 0.54) +Epoch: [0][ 62/5004] Time 0.235 ( 0.240) Data 0.018 ( 0.024) Loss 6.9173e+00 (7.4030e+00) Acc@1 0.00 ( 0.13) Acc@5 0.39 ( 0.53) +Epoch: [0][ 63/5004] Time 0.245 ( 0.240) Data 0.024 ( 0.024) Loss 6.9168e+00 (7.3954e+00) Acc@1 0.00 ( 0.13) Acc@5 0.00 ( 0.52) +Epoch: [0][ 64/5004] Time 0.244 ( 0.240) Data 0.025 ( 0.024) Loss 6.9022e+00 (7.3878e+00) Acc@1 0.00 ( 0.13) Acc@5 0.00 ( 0.52) +Epoch: [0][ 65/5004] Time 0.245 ( 0.240) Data 0.024 ( 0.024) Loss 6.9269e+00 (7.3808e+00) Acc@1 0.00 ( 0.12) Acc@5 0.78 ( 0.52) +Epoch: [0][ 66/5004] Time 0.245 ( 0.240) Data 0.023 ( 0.024) Loss 6.9046e+00 (7.3737e+00) Acc@1 0.00 ( 0.12) Acc@5 1.56 ( 0.54) +Epoch: [0][ 67/5004] Time 0.239 ( 0.240) Data 0.023 ( 0.024) Loss 6.8840e+00 (7.3665e+00) Acc@1 0.39 ( 0.13) Acc@5 0.39 ( 0.53) +Epoch: [0][ 68/5004] Time 0.233 ( 0.240) Data 0.024 ( 0.024) Loss 6.9088e+00 (7.3599e+00) Acc@1 0.00 ( 0.12) Acc@5 0.78 ( 0.54) +Epoch: [0][ 69/5004] Time 0.239 ( 0.240) Data 0.028 ( 0.024) Loss 6.9110e+00 (7.3535e+00) Acc@1 0.00 ( 0.12) Acc@5 0.39 ( 0.54) +Epoch: [0][ 70/5004] Time 0.238 ( 0.240) Data 0.028 ( 0.024) Loss 6.9031e+00 (7.3471e+00) Acc@1 0.00 ( 0.12) Acc@5 1.17 ( 0.54) +Epoch: [0][ 71/5004] Time 0.239 ( 0.240) Data 0.029 ( 0.024) Loss 6.9139e+00 (7.3411e+00) Acc@1 0.39 ( 0.12) Acc@5 0.39 ( 0.54) +Epoch: [0][ 72/5004] Time 0.240 ( 0.240) Data 0.028 ( 0.024) Loss 6.8853e+00 (7.3349e+00) Acc@1 0.39 ( 0.13) Acc@5 0.39 ( 0.54) +Epoch: [0][ 73/5004] Time 0.239 ( 0.240) Data 0.026 ( 0.024) Loss 6.8804e+00 (7.3287e+00) Acc@1 0.39 ( 0.13) Acc@5 1.56 ( 0.55) +Epoch: [0][ 74/5004] Time 0.241 ( 0.240) 
Data 0.026 ( 0.024) Loss 6.9051e+00 (7.3231e+00) Acc@1 0.00 ( 0.13) Acc@5 0.39 ( 0.55) +Epoch: [0][ 75/5004] Time 0.236 ( 0.240) Data 0.024 ( 0.024) Loss 6.9058e+00 (7.3176e+00) Acc@1 0.00 ( 0.13) Acc@5 1.17 ( 0.56) +Epoch: [0][ 76/5004] Time 0.240 ( 0.240) Data 0.027 ( 0.024) Loss 6.8905e+00 (7.3121e+00) Acc@1 0.39 ( 0.13) Acc@5 1.17 ( 0.57) +Epoch: [0][ 77/5004] Time 0.239 ( 0.240) Data 0.026 ( 0.024) Loss 6.9131e+00 (7.3069e+00) Acc@1 0.00 ( 0.13) Acc@5 0.39 ( 0.57) +Epoch: [0][ 78/5004] Time 0.242 ( 0.240) Data 0.027 ( 0.024) Loss 6.9044e+00 (7.3018e+00) Acc@1 0.00 ( 0.13) Acc@5 0.39 ( 0.56) +Epoch: [0][ 79/5004] Time 0.237 ( 0.240) Data 0.025 ( 0.025) Loss 6.9107e+00 (7.2970e+00) Acc@1 0.39 ( 0.13) Acc@5 0.78 ( 0.57) +Epoch: [0][ 80/5004] Time 0.239 ( 0.240) Data 0.027 ( 0.025) Loss 6.9145e+00 (7.2922e+00) Acc@1 0.39 ( 0.14) Acc@5 0.39 ( 0.56) +Epoch: [0][ 81/5004] Time 0.239 ( 0.240) Data 0.026 ( 0.025) Loss 6.9048e+00 (7.2875e+00) Acc@1 0.00 ( 0.13) Acc@5 0.78 ( 0.57) +Epoch: [0][ 82/5004] Time 0.241 ( 0.240) Data 0.027 ( 0.025) Loss 6.9143e+00 (7.2830e+00) Acc@1 0.00 ( 0.13) Acc@5 0.00 ( 0.56) +Epoch: [0][ 83/5004] Time 0.238 ( 0.240) Data 0.025 ( 0.025) Loss 6.9050e+00 (7.2785e+00) Acc@1 0.39 ( 0.13) Acc@5 0.78 ( 0.56) +Epoch: [0][ 84/5004] Time 0.237 ( 0.240) Data 0.026 ( 0.025) Loss 6.9378e+00 (7.2745e+00) Acc@1 0.00 ( 0.13) Acc@5 0.39 ( 0.56) +Epoch: [0][ 85/5004] Time 0.241 ( 0.240) Data 0.027 ( 0.025) Loss 6.9118e+00 (7.2703e+00) Acc@1 0.00 ( 0.13) Acc@5 0.00 ( 0.55) +Epoch: [0][ 86/5004] Time 0.238 ( 0.240) Data 0.025 ( 0.025) Loss 6.9211e+00 (7.2663e+00) Acc@1 0.00 ( 0.13) Acc@5 0.39 ( 0.55) +Epoch: [0][ 87/5004] Time 0.239 ( 0.240) Data 0.026 ( 0.025) Loss 6.9301e+00 (7.2625e+00) Acc@1 0.00 ( 0.13) Acc@5 0.78 ( 0.55) +Epoch: [0][ 88/5004] Time 0.242 ( 0.240) Data 0.026 ( 0.025) Loss 6.9074e+00 (7.2585e+00) Acc@1 0.00 ( 0.13) Acc@5 0.78 ( 0.56) +Epoch: [0][ 89/5004] Time 0.243 ( 0.240) Data 0.025 ( 0.025) Loss 6.9421e+00 (7.2550e+00) Acc@1 0.00 ( 
0.13) Acc@5 0.00 ( 0.55) +Epoch: [0][ 90/5004] Time 0.238 ( 0.240) Data 0.023 ( 0.025) Loss 6.8935e+00 (7.2510e+00) Acc@1 0.00 ( 0.12) Acc@5 0.39 ( 0.55) +Epoch: [0][ 91/5004] Time 0.243 ( 0.240) Data 0.025 ( 0.025) Loss 6.9175e+00 (7.2474e+00) Acc@1 0.00 ( 0.12) Acc@5 1.56 ( 0.56) +Epoch: [0][ 92/5004] Time 0.242 ( 0.240) Data 0.024 ( 0.025) Loss 6.9175e+00 (7.2438e+00) Acc@1 0.00 ( 0.12) Acc@5 0.00 ( 0.55) +Epoch: [0][ 93/5004] Time 0.244 ( 0.240) Data 0.024 ( 0.025) Loss 6.8978e+00 (7.2401e+00) Acc@1 0.00 ( 0.12) Acc@5 0.78 ( 0.56) +Epoch: [0][ 94/5004] Time 0.241 ( 0.240) Data 0.024 ( 0.025) Loss 6.9228e+00 (7.2368e+00) Acc@1 0.39 ( 0.12) Acc@5 0.78 ( 0.56) +Epoch: [0][ 95/5004] Time 0.239 ( 0.240) Data 0.024 ( 0.025) Loss 6.9092e+00 (7.2334e+00) Acc@1 0.00 ( 0.12) Acc@5 0.78 ( 0.56) +Epoch: [0][ 96/5004] Time 0.240 ( 0.240) Data 0.025 ( 0.025) Loss 6.8961e+00 (7.2299e+00) Acc@1 0.00 ( 0.12) Acc@5 0.78 ( 0.56) +Epoch: [0][ 97/5004] Time 0.242 ( 0.240) Data 0.026 ( 0.025) Loss 6.9005e+00 (7.2265e+00) Acc@1 0.00 ( 0.12) Acc@5 0.39 ( 0.56) +Epoch: [0][ 98/5004] Time 0.241 ( 0.240) Data 0.025 ( 0.025) Loss 6.9057e+00 (7.2233e+00) Acc@1 0.00 ( 0.12) Acc@5 0.78 ( 0.56) +Epoch: [0][ 99/5004] Time 0.244 ( 0.240) Data 0.025 ( 0.025) Loss 6.9490e+00 (7.2206e+00) Acc@1 0.00 ( 0.12) Acc@5 1.17 ( 0.57) +Epoch: [0][ 100/5004] Time 0.242 ( 0.240) Data 0.025 ( 0.025) Loss 6.9313e+00 (7.2177e+00) Acc@1 0.39 ( 0.12) Acc@5 0.78 ( 0.57) +Epoch: [0][ 101/5004] Time 0.239 ( 0.240) Data 0.025 ( 0.025) Loss 6.9129e+00 (7.2147e+00) Acc@1 0.39 ( 0.12) Acc@5 0.78 ( 0.57) +Epoch: [0][ 102/5004] Time 0.239 ( 0.240) Data 0.025 ( 0.025) Loss 6.8933e+00 (7.2116e+00) Acc@1 0.39 ( 0.13) Acc@5 1.95 ( 0.59) +Epoch: [0][ 103/5004] Time 0.241 ( 0.240) Data 0.025 ( 0.025) Loss 6.9098e+00 (7.2087e+00) Acc@1 0.39 ( 0.13) Acc@5 0.39 ( 0.59) +Epoch: [0][ 104/5004] Time 0.244 ( 0.240) Data 0.026 ( 0.025) Loss 6.9323e+00 (7.2060e+00) Acc@1 0.00 ( 0.13) Acc@5 0.00 ( 0.58) +Epoch: [0][ 105/5004] Time 0.237 
( 0.240) Data 0.023 ( 0.025) Loss 6.9283e+00 (7.2034e+00) Acc@1 0.00 ( 0.13) Acc@5 1.17 ( 0.59) +Epoch: [0][ 106/5004] Time 0.243 ( 0.240) Data 0.025 ( 0.025) Loss 6.9174e+00 (7.2008e+00) Acc@1 0.00 ( 0.12) Acc@5 0.00 ( 0.58) +Epoch: [0][ 107/5004] Time 0.239 ( 0.240) Data 0.025 ( 0.025) Loss 6.8973e+00 (7.1979e+00) Acc@1 0.00 ( 0.12) Acc@5 0.39 ( 0.58) +Epoch: [0][ 108/5004] Time 0.241 ( 0.240) Data 0.025 ( 0.025) Loss 6.9061e+00 (7.1953e+00) Acc@1 0.00 ( 0.12) Acc@5 0.39 ( 0.58) +Epoch: [0][ 109/5004] Time 0.245 ( 0.240) Data 0.025 ( 0.025) Loss 6.9083e+00 (7.1927e+00) Acc@1 0.00 ( 0.12) Acc@5 0.39 ( 0.58) +Epoch: [0][ 110/5004] Time 0.243 ( 0.240) Data 0.027 ( 0.025) Loss 6.8965e+00 (7.1900e+00) Acc@1 0.00 ( 0.12) Acc@5 0.39 ( 0.57) +Epoch: [0][ 111/5004] Time 0.241 ( 0.240) Data 0.024 ( 0.025) Loss 6.8907e+00 (7.1873e+00) Acc@1 0.00 ( 0.12) Acc@5 0.78 ( 0.58) +Epoch: [0][ 112/5004] Time 0.244 ( 0.240) Data 0.023 ( 0.025) Loss 6.9153e+00 (7.1849e+00) Acc@1 0.39 ( 0.12) Acc@5 0.78 ( 0.58) +Epoch: [0][ 113/5004] Time 0.234 ( 0.240) Data 0.020 ( 0.025) Loss 6.9036e+00 (7.1824e+00) Acc@1 0.00 ( 0.12) Acc@5 0.78 ( 0.58) +Epoch: [0][ 114/5004] Time 0.239 ( 0.240) Data 0.025 ( 0.025) Loss 6.9938e+00 (7.1808e+00) Acc@1 0.00 ( 0.12) Acc@5 0.39 ( 0.58) +Epoch: [0][ 115/5004] Time 0.239 ( 0.240) Data 0.026 ( 0.025) Loss 6.8924e+00 (7.1783e+00) Acc@1 0.39 ( 0.12) Acc@5 0.78 ( 0.58) +Epoch: [0][ 116/5004] Time 0.237 ( 0.240) Data 0.026 ( 0.025) Loss 6.9054e+00 (7.1760e+00) Acc@1 0.00 ( 0.12) Acc@5 1.56 ( 0.59) +Epoch: [0][ 117/5004] Time 0.240 ( 0.240) Data 0.029 ( 0.025) Loss 6.8820e+00 (7.1735e+00) Acc@1 0.39 ( 0.12) Acc@5 0.78 ( 0.59) +Epoch: [0][ 118/5004] Time 0.238 ( 0.240) Data 0.028 ( 0.025) Loss 6.9104e+00 (7.1713e+00) Acc@1 0.39 ( 0.12) Acc@5 1.17 ( 0.59) +Epoch: [0][ 119/5004] Time 0.242 ( 0.240) Data 0.028 ( 0.025) Loss 6.9092e+00 (7.1691e+00) Acc@1 0.00 ( 0.12) Acc@5 0.78 ( 0.60) +Epoch: [0][ 120/5004] Time 0.238 ( 0.240) Data 0.028 ( 0.025) Loss 6.9232e+00 
(7.1671e+00) Acc@1 0.00 ( 0.12) Acc@5 0.00 ( 0.59) +Epoch: [0][ 121/5004] Time 0.238 ( 0.240) Data 0.028 ( 0.025) Loss 6.9078e+00 (7.1649e+00) Acc@1 0.00 ( 0.12) Acc@5 0.78 ( 0.59) +Epoch: [0][ 122/5004] Time 0.242 ( 0.240) Data 0.028 ( 0.025) Loss 6.8891e+00 (7.1627e+00) Acc@1 0.78 ( 0.13) Acc@5 1.56 ( 0.60) +Epoch: [0][ 123/5004] Time 0.239 ( 0.240) Data 0.027 ( 0.025) Loss 6.9134e+00 (7.1607e+00) Acc@1 0.39 ( 0.13) Acc@5 0.78 ( 0.60) +Epoch: [0][ 124/5004] Time 0.240 ( 0.240) Data 0.029 ( 0.025) Loss 6.9144e+00 (7.1587e+00) Acc@1 0.00 ( 0.13) Acc@5 0.78 ( 0.60) +Epoch: [0][ 125/5004] Time 0.238 ( 0.240) Data 0.027 ( 0.025) Loss 6.9167e+00 (7.1568e+00) Acc@1 0.00 ( 0.13) Acc@5 1.56 ( 0.61) +Epoch: [0][ 126/5004] Time 0.240 ( 0.240) Data 0.028 ( 0.025) Loss 6.8809e+00 (7.1546e+00) Acc@1 1.17 ( 0.14) Acc@5 1.17 ( 0.62) +Epoch: [0][ 127/5004] Time 0.238 ( 0.240) Data 0.027 ( 0.025) Loss 6.9052e+00 (7.1527e+00) Acc@1 0.00 ( 0.13) Acc@5 0.78 ( 0.62) +Epoch: [0][ 128/5004] Time 0.239 ( 0.240) Data 0.027 ( 0.025) Loss 6.8908e+00 (7.1506e+00) Acc@1 0.78 ( 0.14) Acc@5 1.17 ( 0.62) +Epoch: [0][ 129/5004] Time 0.238 ( 0.240) Data 0.027 ( 0.025) Loss 6.8974e+00 (7.1487e+00) Acc@1 0.00 ( 0.14) Acc@5 0.39 ( 0.62) +Epoch: [0][ 130/5004] Time 0.241 ( 0.240) Data 0.029 ( 0.025) Loss 6.8855e+00 (7.1467e+00) Acc@1 0.39 ( 0.14) Acc@5 0.39 ( 0.62) +Epoch: [0][ 131/5004] Time 0.245 ( 0.240) Data 0.027 ( 0.025) Loss 6.9503e+00 (7.1452e+00) Acc@1 0.00 ( 0.14) Acc@5 0.39 ( 0.62) +Epoch: [0][ 132/5004] Time 0.236 ( 0.240) Data 0.023 ( 0.025) Loss 6.9034e+00 (7.1434e+00) Acc@1 0.00 ( 0.14) Acc@5 0.78 ( 0.62) +Epoch: [0][ 133/5004] Time 0.240 ( 0.240) Data 0.026 ( 0.025) Loss 6.9069e+00 (7.1416e+00) Acc@1 0.39 ( 0.14) Acc@5 0.39 ( 0.62) +Epoch: [0][ 134/5004] Time 0.241 ( 0.240) Data 0.026 ( 0.025) Loss 6.9067e+00 (7.1399e+00) Acc@1 0.39 ( 0.14) Acc@5 0.39 ( 0.61) +Epoch: [0][ 135/5004] Time 0.242 ( 0.240) Data 0.024 ( 0.025) Loss 6.9081e+00 (7.1382e+00) Acc@1 0.00 ( 0.14) Acc@5 1.56 ( 
0.62) +Epoch: [0][ 136/5004] Time 0.245 ( 0.240) Data 0.025 ( 0.025) Loss 6.9073e+00 (7.1365e+00) Acc@1 0.39 ( 0.14) Acc@5 0.78 ( 0.62) +Epoch: [0][ 137/5004] Time 0.245 ( 0.240) Data 0.021 ( 0.025) Loss 6.9141e+00 (7.1349e+00) Acc@1 0.00 ( 0.14) Acc@5 1.17 ( 0.63) +Epoch: [0][ 138/5004] Time 0.246 ( 0.240) Data 0.020 ( 0.025) Loss 6.9162e+00 (7.1333e+00) Acc@1 0.00 ( 0.14) Acc@5 0.00 ( 0.62) +Epoch: [0][ 139/5004] Time 0.245 ( 0.240) Data 0.021 ( 0.025) Loss 6.8952e+00 (7.1316e+00) Acc@1 0.00 ( 0.14) Acc@5 0.78 ( 0.62) +Epoch: [0][ 140/5004] Time 0.241 ( 0.240) Data 0.022 ( 0.025) Loss 6.8825e+00 (7.1298e+00) Acc@1 0.39 ( 0.14) Acc@5 1.17 ( 0.63) +Epoch: [0][ 141/5004] Time 0.243 ( 0.240) Data 0.022 ( 0.025) Loss 6.9196e+00 (7.1284e+00) Acc@1 0.00 ( 0.14) Acc@5 0.00 ( 0.62) +Epoch: [0][ 142/5004] Time 0.243 ( 0.240) Data 0.023 ( 0.025) Loss 6.8937e+00 (7.1267e+00) Acc@1 0.39 ( 0.14) Acc@5 1.56 ( 0.63) +Epoch: [0][ 143/5004] Time 0.247 ( 0.240) Data 0.022 ( 0.025) Loss 7.0251e+00 (7.1260e+00) Acc@1 0.00 ( 0.14) Acc@5 1.17 ( 0.63) +Epoch: [0][ 144/5004] Time 0.245 ( 0.240) Data 0.020 ( 0.025) Loss 6.9050e+00 (7.1245e+00) Acc@1 0.00 ( 0.14) Acc@5 1.17 ( 0.64) +Epoch: [0][ 145/5004] Time 0.242 ( 0.240) Data 0.021 ( 0.025) Loss 6.9091e+00 (7.1230e+00) Acc@1 0.39 ( 0.14) Acc@5 0.78 ( 0.64) +Epoch: [0][ 146/5004] Time 0.248 ( 0.240) Data 0.023 ( 0.025) Loss 6.9004e+00 (7.1215e+00) Acc@1 0.00 ( 0.14) Acc@5 0.00 ( 0.63) +Epoch: [0][ 147/5004] Time 0.241 ( 0.240) Data 0.019 ( 0.025) Loss 6.9079e+00 (7.1200e+00) Acc@1 0.00 ( 0.14) Acc@5 0.00 ( 0.63) +Epoch: [0][ 148/5004] Time 0.245 ( 0.240) Data 0.023 ( 0.025) Loss 6.9192e+00 (7.1187e+00) Acc@1 0.00 ( 0.14) Acc@5 0.00 ( 0.62) +Epoch: [0][ 149/5004] Time 0.243 ( 0.240) Data 0.022 ( 0.025) Loss 6.9033e+00 (7.1173e+00) Acc@1 0.00 ( 0.14) Acc@5 0.39 ( 0.62) +Epoch: [0][ 150/5004] Time 0.246 ( 0.240) Data 0.022 ( 0.025) Loss 6.9119e+00 (7.1159e+00) Acc@1 0.78 ( 0.14) Acc@5 2.34 ( 0.63) +Epoch: [0][ 151/5004] Time 0.246 ( 0.241) 
Data 0.021 ( 0.025) Loss 6.9053e+00 (7.1145e+00) Acc@1 0.39 ( 0.14) Acc@5 1.56 ( 0.64)
+Epoch: [0][ 152/5004] Time 0.253 ( 0.241) Data 0.021 ( 0.025) Loss 6.9230e+00 (7.1133e+00) Acc@1 0.00 ( 0.14) Acc@5 1.17 ( 0.64)
+Epoch: [0][ 153/5004] Time 0.245 ( 0.241) Data 0.017 ( 0.025) Loss 6.9133e+00 (7.1120e+00) Acc@1 0.00 ( 0.14) Acc@5 0.00 ( 0.64)
+... (iterations 154-441 omitted: per-step Time holds steady at ~0.241 s, Data at ~0.024 s, and the running Loss average falls slowly from 7.11 to 6.98)
+Epoch: [0][ 442/5004] Time 0.240 ( 0.241) Data 0.027 ( 0.025) Loss 6.8758e+00 (6.9759e+00) Acc@1 0.78 ( 0.16) Acc@5 1.17 ( 0.72)
+Epoch: [0][ 443/5004] Time 0.235 ( 
0.241) Data 0.026 ( 0.025) Loss 6.9059e+00 (6.9757e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.71) +Epoch: [0][ 444/5004] Time 0.244 ( 0.241) Data 0.028 ( 0.025) Loss 6.9051e+00 (6.9756e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.72) +Epoch: [0][ 445/5004] Time 0.238 ( 0.241) Data 0.026 ( 0.025) Loss 6.8971e+00 (6.9754e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.71) +Epoch: [0][ 446/5004] Time 0.236 ( 0.241) Data 0.026 ( 0.025) Loss 6.8935e+00 (6.9752e+00) Acc@1 0.78 ( 0.16) Acc@5 1.56 ( 0.72) +Epoch: [0][ 447/5004] Time 0.236 ( 0.241) Data 0.028 ( 0.025) Loss 6.9058e+00 (6.9751e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.72) +Epoch: [0][ 448/5004] Time 0.238 ( 0.241) Data 0.028 ( 0.025) Loss 6.8900e+00 (6.9749e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.72) +Epoch: [0][ 449/5004] Time 0.239 ( 0.240) Data 0.028 ( 0.025) Loss 6.9089e+00 (6.9747e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.72) +Epoch: [0][ 450/5004] Time 0.238 ( 0.240) Data 0.027 ( 0.025) Loss 6.8848e+00 (6.9745e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.72) +Epoch: [0][ 451/5004] Time 0.240 ( 0.240) Data 0.027 ( 0.025) Loss 6.8798e+00 (6.9743e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.72) +Epoch: [0][ 452/5004] Time 0.236 ( 0.240) Data 0.026 ( 0.025) Loss 6.9024e+00 (6.9742e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.72) +Epoch: [0][ 453/5004] Time 0.241 ( 0.240) Data 0.028 ( 0.025) Loss 6.9064e+00 (6.9740e+00) Acc@1 0.00 ( 0.16) Acc@5 1.56 ( 0.72) +Epoch: [0][ 454/5004] Time 0.236 ( 0.240) Data 0.024 ( 0.025) Loss 6.8978e+00 (6.9739e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.72) +Epoch: [0][ 455/5004] Time 0.238 ( 0.240) Data 0.025 ( 0.025) Loss 6.8765e+00 (6.9736e+00) Acc@1 0.39 ( 0.16) Acc@5 0.39 ( 0.72) +Epoch: [0][ 456/5004] Time 0.239 ( 0.240) Data 0.025 ( 0.025) Loss 6.8964e+00 (6.9735e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.72) +Epoch: [0][ 457/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.025) Loss 6.9170e+00 (6.9733e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.72) +Epoch: [0][ 458/5004] Time 0.237 ( 0.240) Data 0.020 ( 0.025) Loss 6.8768e+00 
(6.9731e+00) Acc@1 0.00 ( 0.16) Acc@5 1.56 ( 0.72) +Epoch: [0][ 459/5004] Time 0.249 ( 0.240) Data 0.024 ( 0.025) Loss 6.8816e+00 (6.9729e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.72) +Epoch: [0][ 460/5004] Time 0.235 ( 0.240) Data 0.017 ( 0.025) Loss 6.8797e+00 (6.9727e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.72) +Epoch: [0][ 461/5004] Time 0.238 ( 0.240) Data 0.023 ( 0.025) Loss 6.8945e+00 (6.9726e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.72) +Epoch: [0][ 462/5004] Time 0.236 ( 0.240) Data 0.024 ( 0.025) Loss 6.8931e+00 (6.9724e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.72) +Epoch: [0][ 463/5004] Time 0.235 ( 0.240) Data 0.025 ( 0.025) Loss 6.8881e+00 (6.9722e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.72) +Epoch: [0][ 464/5004] Time 0.238 ( 0.240) Data 0.028 ( 0.025) Loss 6.8872e+00 (6.9720e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.72) +Epoch: [0][ 465/5004] Time 0.242 ( 0.240) Data 0.027 ( 0.025) Loss 6.8815e+00 (6.9718e+00) Acc@1 0.39 ( 0.16) Acc@5 1.56 ( 0.72) +Epoch: [0][ 466/5004] Time 0.237 ( 0.240) Data 0.026 ( 0.025) Loss 6.8844e+00 (6.9716e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.72) +Epoch: [0][ 467/5004] Time 0.239 ( 0.240) Data 0.027 ( 0.025) Loss 6.8788e+00 (6.9714e+00) Acc@1 0.00 ( 0.16) Acc@5 1.56 ( 0.72) +Epoch: [0][ 468/5004] Time 0.240 ( 0.240) Data 0.026 ( 0.025) Loss 6.8992e+00 (6.9713e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.72) +Epoch: [0][ 469/5004] Time 0.234 ( 0.240) Data 0.023 ( 0.025) Loss 6.8835e+00 (6.9711e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.72) +Epoch: [0][ 470/5004] Time 0.240 ( 0.240) Data 0.027 ( 0.025) Loss 6.9141e+00 (6.9710e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.72) +Epoch: [0][ 471/5004] Time 0.236 ( 0.240) Data 0.025 ( 0.025) Loss 6.8760e+00 (6.9708e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.72) +Epoch: [0][ 472/5004] Time 0.239 ( 0.240) Data 0.027 ( 0.025) Loss 6.9079e+00 (6.9707e+00) Acc@1 0.39 ( 0.16) Acc@5 0.39 ( 0.72) +Epoch: [0][ 473/5004] Time 0.241 ( 0.240) Data 0.026 ( 0.025) Loss 6.8847e+00 (6.9705e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 
0.72) +Epoch: [0][ 474/5004] Time 0.239 ( 0.240) Data 0.026 ( 0.025) Loss 6.8989e+00 (6.9703e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 475/5004] Time 0.238 ( 0.240) Data 0.025 ( 0.025) Loss 6.9004e+00 (6.9702e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 476/5004] Time 0.239 ( 0.240) Data 0.026 ( 0.025) Loss 6.8949e+00 (6.9700e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 477/5004] Time 0.241 ( 0.240) Data 0.025 ( 0.025) Loss 6.8881e+00 (6.9698e+00) Acc@1 0.39 ( 0.16) Acc@5 2.34 ( 0.73) +Epoch: [0][ 478/5004] Time 0.230 ( 0.240) Data 0.021 ( 0.025) Loss 6.9007e+00 (6.9697e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.73) +Epoch: [0][ 479/5004] Time 0.240 ( 0.240) Data 0.028 ( 0.025) Loss 6.8887e+00 (6.9695e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 480/5004] Time 0.245 ( 0.240) Data 0.027 ( 0.025) Loss 6.8712e+00 (6.9693e+00) Acc@1 0.78 ( 0.16) Acc@5 1.56 ( 0.73) +Epoch: [0][ 481/5004] Time 0.236 ( 0.240) Data 0.024 ( 0.025) Loss 6.8910e+00 (6.9692e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 482/5004] Time 0.237 ( 0.240) Data 0.026 ( 0.025) Loss 6.8832e+00 (6.9690e+00) Acc@1 0.39 ( 0.16) Acc@5 1.95 ( 0.73) +Epoch: [0][ 483/5004] Time 0.238 ( 0.240) Data 0.027 ( 0.025) Loss 6.8820e+00 (6.9688e+00) Acc@1 0.39 ( 0.16) Acc@5 1.56 ( 0.73) +Epoch: [0][ 484/5004] Time 0.238 ( 0.240) Data 0.027 ( 0.025) Loss 6.8932e+00 (6.9687e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.73) +Epoch: [0][ 485/5004] Time 0.238 ( 0.240) Data 0.026 ( 0.025) Loss 6.8861e+00 (6.9685e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 486/5004] Time 0.206 ( 0.240) Data 0.026 ( 0.025) Loss 6.8915e+00 (6.9683e+00) Acc@1 0.00 ( 0.16) Acc@5 1.56 ( 0.74) +Epoch: [0][ 487/5004] Time 0.234 ( 0.240) Data 0.057 ( 0.025) Loss 6.8807e+00 (6.9681e+00) Acc@1 0.78 ( 0.16) Acc@5 1.95 ( 0.74) +Epoch: [0][ 488/5004] Time 0.237 ( 0.240) Data 0.058 ( 0.025) Loss 6.8816e+00 (6.9680e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 489/5004] Time 0.236 ( 0.240) 
Data 0.058 ( 0.025) Loss 6.8919e+00 (6.9678e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.74) +Epoch: [0][ 490/5004] Time 0.242 ( 0.240) Data 0.057 ( 0.025) Loss 6.8833e+00 (6.9676e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.74) +Epoch: [0][ 491/5004] Time 0.234 ( 0.240) Data 0.052 ( 0.025) Loss 6.8723e+00 (6.9674e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 492/5004] Time 0.238 ( 0.240) Data 0.058 ( 0.025) Loss 6.8955e+00 (6.9673e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.74) +Epoch: [0][ 493/5004] Time 0.233 ( 0.240) Data 0.056 ( 0.025) Loss 6.8735e+00 (6.9671e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 494/5004] Time 0.239 ( 0.240) Data 0.058 ( 0.025) Loss 6.8850e+00 (6.9669e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 495/5004] Time 0.237 ( 0.240) Data 0.058 ( 0.026) Loss 6.8765e+00 (6.9668e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 496/5004] Time 0.237 ( 0.240) Data 0.057 ( 0.026) Loss 6.8449e+00 (6.9665e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.74) +Epoch: [0][ 497/5004] Time 0.238 ( 0.240) Data 0.057 ( 0.026) Loss 6.8828e+00 (6.9663e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 498/5004] Time 0.233 ( 0.240) Data 0.056 ( 0.026) Loss 6.8763e+00 (6.9662e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.74) +Epoch: [0][ 499/5004] Time 0.236 ( 0.240) Data 0.060 ( 0.026) Loss 6.8728e+00 (6.9660e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.74) +Epoch: [0][ 500/5004] Time 0.238 ( 0.240) Data 0.061 ( 0.026) Loss 6.8993e+00 (6.9658e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.74) +Epoch: [0][ 501/5004] Time 0.240 ( 0.240) Data 0.060 ( 0.026) Loss 6.8828e+00 (6.9657e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.74) +Epoch: [0][ 502/5004] Time 0.236 ( 0.240) Data 0.057 ( 0.026) Loss 6.8667e+00 (6.9655e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.74) +Epoch: [0][ 503/5004] Time 0.240 ( 0.240) Data 0.058 ( 0.026) Loss 6.8970e+00 (6.9654e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.73) +Epoch: [0][ 504/5004] Time 0.236 ( 0.240) Data 0.057 ( 0.026) Loss 6.8821e+00 (6.9652e+00) 
Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 505/5004] Time 0.238 ( 0.240) Data 0.058 ( 0.026) Loss 6.8662e+00 (6.9650e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 506/5004] Time 0.240 ( 0.240) Data 0.057 ( 0.026) Loss 6.9289e+00 (6.9649e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.73) +Epoch: [0][ 507/5004] Time 0.239 ( 0.240) Data 0.055 ( 0.026) Loss 6.8855e+00 (6.9648e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.73) +Epoch: [0][ 508/5004] Time 0.234 ( 0.240) Data 0.054 ( 0.026) Loss 6.8896e+00 (6.9646e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.73) +Epoch: [0][ 509/5004] Time 0.237 ( 0.240) Data 0.058 ( 0.026) Loss 6.8722e+00 (6.9644e+00) Acc@1 0.39 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 510/5004] Time 0.238 ( 0.240) Data 0.058 ( 0.026) Loss 6.8849e+00 (6.9643e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 511/5004] Time 0.244 ( 0.240) Data 0.058 ( 0.027) Loss 6.8962e+00 (6.9641e+00) Acc@1 0.39 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 512/5004] Time 0.235 ( 0.240) Data 0.054 ( 0.027) Loss 6.8895e+00 (6.9640e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.73) +Epoch: [0][ 513/5004] Time 0.239 ( 0.240) Data 0.058 ( 0.027) Loss 6.8656e+00 (6.9638e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.73) +Epoch: [0][ 514/5004] Time 0.234 ( 0.240) Data 0.057 ( 0.027) Loss 6.8842e+00 (6.9637e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 515/5004] Time 0.244 ( 0.240) Data 0.061 ( 0.027) Loss 6.8817e+00 (6.9635e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.73) +Epoch: [0][ 516/5004] Time 0.237 ( 0.240) Data 0.057 ( 0.027) Loss 6.9088e+00 (6.9634e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.73) +Epoch: [0][ 517/5004] Time 0.238 ( 0.240) Data 0.058 ( 0.027) Loss 6.8629e+00 (6.9632e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 518/5004] Time 0.236 ( 0.240) Data 0.057 ( 0.027) Loss 6.8683e+00 (6.9630e+00) Acc@1 0.39 ( 0.16) Acc@5 1.56 ( 0.73) +Epoch: [0][ 519/5004] Time 0.237 ( 0.240) Data 0.057 ( 0.027) Loss 6.8759e+00 (6.9628e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.73) +Epoch: 
[0][ 520/5004] Time 0.238 ( 0.240) Data 0.057 ( 0.027) Loss 6.8833e+00 (6.9627e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 521/5004] Time 0.239 ( 0.240) Data 0.057 ( 0.027) Loss 6.8898e+00 (6.9626e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 522/5004] Time 0.233 ( 0.240) Data 0.055 ( 0.027) Loss 6.8765e+00 (6.9624e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 523/5004] Time 0.238 ( 0.240) Data 0.060 ( 0.027) Loss 6.8789e+00 (6.9622e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.73) +Epoch: [0][ 524/5004] Time 0.239 ( 0.240) Data 0.058 ( 0.027) Loss 6.8913e+00 (6.9621e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.73) +Epoch: [0][ 525/5004] Time 0.239 ( 0.240) Data 0.056 ( 0.027) Loss 6.8554e+00 (6.9619e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 526/5004] Time 0.238 ( 0.240) Data 0.059 ( 0.027) Loss 6.8808e+00 (6.9617e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 527/5004] Time 0.238 ( 0.240) Data 0.057 ( 0.027) Loss 6.8839e+00 (6.9616e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 528/5004] Time 0.240 ( 0.240) Data 0.056 ( 0.028) Loss 6.8565e+00 (6.9614e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 529/5004] Time 0.234 ( 0.240) Data 0.055 ( 0.028) Loss 6.8894e+00 (6.9613e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 530/5004] Time 0.241 ( 0.240) Data 0.058 ( 0.028) Loss 6.8599e+00 (6.9611e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 531/5004] Time 0.237 ( 0.240) Data 0.055 ( 0.028) Loss 6.8950e+00 (6.9609e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 532/5004] Time 0.241 ( 0.240) Data 0.057 ( 0.028) Loss 6.8481e+00 (6.9607e+00) Acc@1 0.39 ( 0.16) Acc@5 1.56 ( 0.73) +Epoch: [0][ 533/5004] Time 0.239 ( 0.240) Data 0.057 ( 0.028) Loss 6.8349e+00 (6.9605e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 534/5004] Time 0.241 ( 0.240) Data 0.056 ( 0.028) Loss 6.8978e+00 (6.9604e+00) Acc@1 0.00 ( 0.16) Acc@5 1.56 ( 0.73) +Epoch: [0][ 535/5004] Time 0.231 ( 0.240) Data 0.053 ( 
0.028) Loss 6.9046e+00 (6.9603e+00) Acc@1 0.39 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 536/5004] Time 0.244 ( 0.240) Data 0.060 ( 0.028) Loss 6.8670e+00 (6.9601e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.73) +Epoch: [0][ 537/5004] Time 0.237 ( 0.240) Data 0.056 ( 0.028) Loss 6.8644e+00 (6.9599e+00) Acc@1 0.39 ( 0.16) Acc@5 1.56 ( 0.73) +Epoch: [0][ 538/5004] Time 0.237 ( 0.240) Data 0.057 ( 0.028) Loss 6.8619e+00 (6.9597e+00) Acc@1 0.00 ( 0.16) Acc@5 1.56 ( 0.73) +Epoch: [0][ 539/5004] Time 0.240 ( 0.240) Data 0.058 ( 0.028) Loss 6.8928e+00 (6.9596e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 540/5004] Time 0.238 ( 0.240) Data 0.057 ( 0.028) Loss 6.8695e+00 (6.9594e+00) Acc@1 0.78 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 541/5004] Time 0.239 ( 0.240) Data 0.058 ( 0.028) Loss 6.8840e+00 (6.9593e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.73) +Epoch: [0][ 542/5004] Time 0.239 ( 0.240) Data 0.058 ( 0.028) Loss 6.9025e+00 (6.9592e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 543/5004] Time 0.236 ( 0.240) Data 0.056 ( 0.028) Loss 6.8495e+00 (6.9590e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 544/5004] Time 0.243 ( 0.240) Data 0.058 ( 0.028) Loss 6.8654e+00 (6.9588e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.73) +Epoch: [0][ 545/5004] Time 0.233 ( 0.240) Data 0.054 ( 0.028) Loss 6.8781e+00 (6.9587e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 546/5004] Time 0.242 ( 0.240) Data 0.061 ( 0.029) Loss 6.8466e+00 (6.9585e+00) Acc@1 0.39 ( 0.16) Acc@5 1.56 ( 0.73) +Epoch: [0][ 547/5004] Time 0.234 ( 0.240) Data 0.057 ( 0.029) Loss 6.8708e+00 (6.9583e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.73) +Epoch: [0][ 548/5004] Time 0.237 ( 0.240) Data 0.059 ( 0.029) Loss 6.9052e+00 (6.9582e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.73) +Epoch: [0][ 549/5004] Time 0.237 ( 0.240) Data 0.060 ( 0.029) Loss 6.8516e+00 (6.9580e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.73) +Epoch: [0][ 550/5004] Time 0.238 ( 0.240) Data 0.060 ( 0.029) Loss 6.8219e+00 (6.9578e+00) Acc@1 0.00 ( 
0.16) Acc@5 0.78 ( 0.73) +Epoch: [0][ 551/5004] Time 0.240 ( 0.240) Data 0.060 ( 0.029) Loss 6.8561e+00 (6.9576e+00) Acc@1 0.78 ( 0.16) Acc@5 1.56 ( 0.73) +Epoch: [0][ 552/5004] Time 0.238 ( 0.240) Data 0.060 ( 0.029) Loss 6.8770e+00 (6.9574e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.73) +Epoch: [0][ 553/5004] Time 0.234 ( 0.240) Data 0.060 ( 0.029) Loss 6.8457e+00 (6.9572e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.74) +Epoch: [0][ 554/5004] Time 0.240 ( 0.240) Data 0.064 ( 0.029) Loss 6.8610e+00 (6.9571e+00) Acc@1 0.78 ( 0.16) Acc@5 1.17 ( 0.74) +Epoch: [0][ 555/5004] Time 0.240 ( 0.240) Data 0.063 ( 0.029) Loss 6.8360e+00 (6.9569e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 556/5004] Time 0.237 ( 0.240) Data 0.060 ( 0.029) Loss 6.8726e+00 (6.9567e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 557/5004] Time 0.239 ( 0.240) Data 0.061 ( 0.029) Loss 6.8903e+00 (6.9566e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.74) +Epoch: [0][ 558/5004] Time 0.237 ( 0.240) Data 0.059 ( 0.029) Loss 6.8750e+00 (6.9564e+00) Acc@1 1.17 ( 0.16) Acc@5 1.95 ( 0.74) +Epoch: [0][ 559/5004] Time 0.238 ( 0.240) Data 0.059 ( 0.029) Loss 6.8594e+00 (6.9563e+00) Acc@1 0.39 ( 0.16) Acc@5 1.95 ( 0.74) +Epoch: [0][ 560/5004] Time 0.237 ( 0.240) Data 0.060 ( 0.029) Loss 6.8458e+00 (6.9561e+00) Acc@1 0.78 ( 0.16) Acc@5 1.56 ( 0.74) +Epoch: [0][ 561/5004] Time 0.242 ( 0.240) Data 0.060 ( 0.029) Loss 6.8906e+00 (6.9560e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.74) +Epoch: [0][ 562/5004] Time 0.234 ( 0.240) Data 0.057 ( 0.029) Loss 6.8777e+00 (6.9558e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.74) +Epoch: [0][ 563/5004] Time 0.233 ( 0.240) Data 0.059 ( 0.029) Loss 6.8588e+00 (6.9556e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.74) +Epoch: [0][ 564/5004] Time 0.244 ( 0.240) Data 0.063 ( 0.030) Loss 6.8642e+00 (6.9555e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 565/5004] Time 0.235 ( 0.240) Data 0.057 ( 0.030) Loss 6.8462e+00 (6.9553e+00) Acc@1 0.39 ( 0.16) Acc@5 0.39 ( 0.74) +Epoch: [0][ 566/5004] 
Time 0.237 ( 0.240) Data 0.059 ( 0.030) Loss 6.8409e+00 (6.9551e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 567/5004] Time 0.278 ( 0.240) Data 0.059 ( 0.030) Loss 6.8841e+00 (6.9550e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.74) +Epoch: [0][ 568/5004] Time 0.240 ( 0.240) Data 0.024 ( 0.030) Loss 6.8533e+00 (6.9548e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 569/5004] Time 0.242 ( 0.240) Data 0.026 ( 0.030) Loss 6.8413e+00 (6.9546e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.74) +Epoch: [0][ 570/5004] Time 0.245 ( 0.240) Data 0.025 ( 0.030) Loss 6.8459e+00 (6.9544e+00) Acc@1 0.39 ( 0.16) Acc@5 1.56 ( 0.74) +Epoch: [0][ 571/5004] Time 0.243 ( 0.240) Data 0.025 ( 0.030) Loss 6.8355e+00 (6.9542e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.75) +Epoch: [0][ 572/5004] Time 0.242 ( 0.240) Data 0.024 ( 0.030) Loss 6.8458e+00 (6.9540e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.74) +Epoch: [0][ 573/5004] Time 0.240 ( 0.240) Data 0.024 ( 0.030) Loss 6.8769e+00 (6.9539e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 574/5004] Time 0.241 ( 0.240) Data 0.025 ( 0.030) Loss 6.8525e+00 (6.9537e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.74) +Epoch: [0][ 575/5004] Time 0.241 ( 0.240) Data 0.025 ( 0.030) Loss 6.8585e+00 (6.9535e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.74) +Epoch: [0][ 576/5004] Time 0.240 ( 0.240) Data 0.025 ( 0.030) Loss 6.8511e+00 (6.9533e+00) Acc@1 0.78 ( 0.16) Acc@5 1.56 ( 0.75) +Epoch: [0][ 577/5004] Time 0.244 ( 0.240) Data 0.026 ( 0.030) Loss 6.8796e+00 (6.9532e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.74) +Epoch: [0][ 578/5004] Time 0.244 ( 0.240) Data 0.024 ( 0.030) Loss 6.8624e+00 (6.9531e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.74) +Epoch: [0][ 579/5004] Time 0.249 ( 0.240) Data 0.024 ( 0.030) Loss 6.8602e+00 (6.9529e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.74) +Epoch: [0][ 580/5004] Time 0.237 ( 0.240) Data 0.018 ( 0.030) Loss 6.8486e+00 (6.9527e+00) Acc@1 0.39 ( 0.16) Acc@5 1.17 ( 0.75) +Epoch: [0][ 581/5004] Time 0.240 ( 0.240) Data 0.024 ( 0.030) Loss 
6.8382e+00 (6.9525e+00) Acc@1 0.78 ( 0.16) Acc@5 1.17 ( 0.75) +Epoch: [0][ 582/5004] Time 0.245 ( 0.240) Data 0.025 ( 0.030) Loss 6.8569e+00 (6.9524e+00) Acc@1 0.39 ( 0.16) Acc@5 0.78 ( 0.75) +Epoch: [0][ 583/5004] Time 0.242 ( 0.240) Data 0.025 ( 0.030) Loss 6.8210e+00 (6.9521e+00) Acc@1 0.00 ( 0.16) Acc@5 1.17 ( 0.75) +Epoch: [0][ 584/5004] Time 0.249 ( 0.240) Data 0.024 ( 0.030) Loss 6.8327e+00 (6.9519e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.75) +Epoch: [0][ 585/5004] Time 0.248 ( 0.240) Data 0.022 ( 0.029) Loss 6.8207e+00 (6.9517e+00) Acc@1 0.39 ( 0.16) Acc@5 1.56 ( 0.75) +Epoch: [0][ 586/5004] Time 0.238 ( 0.240) Data 0.018 ( 0.029) Loss 6.8411e+00 (6.9515e+00) Acc@1 0.39 ( 0.17) Acc@5 1.17 ( 0.75) +Epoch: [0][ 587/5004] Time 0.249 ( 0.240) Data 0.024 ( 0.029) Loss 6.8434e+00 (6.9513e+00) Acc@1 0.00 ( 0.16) Acc@5 0.00 ( 0.75) +Epoch: [0][ 588/5004] Time 0.244 ( 0.240) Data 0.022 ( 0.029) Loss 6.8462e+00 (6.9512e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.75) +Epoch: [0][ 589/5004] Time 0.241 ( 0.240) Data 0.024 ( 0.029) Loss 6.8057e+00 (6.9509e+00) Acc@1 0.39 ( 0.16) Acc@5 1.95 ( 0.75) +Epoch: [0][ 590/5004] Time 0.242 ( 0.240) Data 0.025 ( 0.029) Loss 6.8846e+00 (6.9508e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.75) +Epoch: [0][ 591/5004] Time 0.241 ( 0.240) Data 0.026 ( 0.029) Loss 6.8258e+00 (6.9506e+00) Acc@1 0.00 ( 0.16) Acc@5 0.78 ( 0.75) +Epoch: [0][ 592/5004] Time 0.242 ( 0.240) Data 0.026 ( 0.029) Loss 6.8551e+00 (6.9504e+00) Acc@1 0.00 ( 0.16) Acc@5 0.39 ( 0.75) +Epoch: [0][ 593/5004] Time 0.240 ( 0.240) Data 0.025 ( 0.029) Loss 6.8733e+00 (6.9503e+00) Acc@1 0.39 ( 0.16) Acc@5 0.39 ( 0.75) +Epoch: [0][ 594/5004] Time 0.246 ( 0.240) Data 0.026 ( 0.029) Loss 6.8327e+00 (6.9501e+00) Acc@1 0.78 ( 0.17) Acc@5 1.95 ( 0.75) +Epoch: [0][ 595/5004] Time 0.243 ( 0.240) Data 0.027 ( 0.029) Loss 6.8706e+00 (6.9500e+00) Acc@1 0.00 ( 0.17) Acc@5 0.78 ( 0.75) +Epoch: [0][ 596/5004] Time 0.245 ( 0.240) Data 0.025 ( 0.029) Loss 6.8228e+00 (6.9497e+00) Acc@1 0.00 ( 0.16) Acc@5 
1.56 ( 0.75) +Epoch: [0][ 597/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.8477e+00 (6.9496e+00) Acc@1 0.39 ( 0.17) Acc@5 0.78 ( 0.75) +Epoch: [0][ 598/5004] Time 0.241 ( 0.240) Data 0.026 ( 0.029) Loss 6.8566e+00 (6.9494e+00) Acc@1 0.39 ( 0.17) Acc@5 0.78 ( 0.75) +Epoch: [0][ 599/5004] Time 0.241 ( 0.240) Data 0.026 ( 0.029) Loss 6.8388e+00 (6.9492e+00) Acc@1 0.00 ( 0.17) Acc@5 0.78 ( 0.75) +Epoch: [0][ 600/5004] Time 0.240 ( 0.240) Data 0.026 ( 0.029) Loss 6.9730e+00 (6.9493e+00) Acc@1 0.00 ( 0.17) Acc@5 0.39 ( 0.75) +Epoch: [0][ 601/5004] Time 0.248 ( 0.240) Data 0.027 ( 0.029) Loss 6.8482e+00 (6.9491e+00) Acc@1 0.39 ( 0.17) Acc@5 0.78 ( 0.75) +Epoch: [0][ 602/5004] Time 0.238 ( 0.240) Data 0.023 ( 0.029) Loss 6.8209e+00 (6.9489e+00) Acc@1 0.39 ( 0.17) Acc@5 1.17 ( 0.75) +Epoch: [0][ 603/5004] Time 0.236 ( 0.240) Data 0.029 ( 0.029) Loss 6.8197e+00 (6.9487e+00) Acc@1 0.39 ( 0.17) Acc@5 1.95 ( 0.75) +Epoch: [0][ 604/5004] Time 0.239 ( 0.240) Data 0.032 ( 0.029) Loss 6.8438e+00 (6.9485e+00) Acc@1 0.00 ( 0.17) Acc@5 0.39 ( 0.75) +Epoch: [0][ 605/5004] Time 0.238 ( 0.240) Data 0.034 ( 0.029) Loss 6.8413e+00 (6.9483e+00) Acc@1 0.78 ( 0.17) Acc@5 1.17 ( 0.75) +Epoch: [0][ 606/5004] Time 0.239 ( 0.240) Data 0.034 ( 0.029) Loss 6.8738e+00 (6.9482e+00) Acc@1 0.00 ( 0.17) Acc@5 1.56 ( 0.75) +Epoch: [0][ 607/5004] Time 0.238 ( 0.240) Data 0.032 ( 0.029) Loss 6.8130e+00 (6.9480e+00) Acc@1 0.78 ( 0.17) Acc@5 1.17 ( 0.75) +Epoch: [0][ 608/5004] Time 0.243 ( 0.240) Data 0.036 ( 0.029) Loss 6.8095e+00 (6.9478e+00) Acc@1 0.00 ( 0.17) Acc@5 1.17 ( 0.75) +Epoch: [0][ 609/5004] Time 0.239 ( 0.240) Data 0.030 ( 0.029) Loss 6.8157e+00 (6.9475e+00) Acc@1 0.78 ( 0.17) Acc@5 1.56 ( 0.76) +Epoch: [0][ 610/5004] Time 0.233 ( 0.240) Data 0.030 ( 0.029) Loss 6.8178e+00 (6.9473e+00) Acc@1 0.39 ( 0.17) Acc@5 0.39 ( 0.76) +Epoch: [0][ 611/5004] Time 0.239 ( 0.240) Data 0.033 ( 0.029) Loss 6.8308e+00 (6.9471e+00) Acc@1 0.39 ( 0.17) Acc@5 1.56 ( 0.76) +Epoch: [0][ 612/5004] Time 0.240 ( 
0.240) Data 0.033 ( 0.029) Loss 6.8123e+00 (6.9469e+00) Acc@1 0.39 ( 0.17) Acc@5 0.78 ( 0.76) +Epoch: [0][ 613/5004] Time 0.236 ( 0.240) Data 0.030 ( 0.029) Loss 6.8292e+00 (6.9467e+00) Acc@1 0.00 ( 0.17) Acc@5 0.78 ( 0.76) +Epoch: [0][ 614/5004] Time 0.237 ( 0.240) Data 0.033 ( 0.029) Loss 6.8363e+00 (6.9466e+00) Acc@1 0.00 ( 0.17) Acc@5 0.78 ( 0.76) +Epoch: [0][ 615/5004] Time 0.240 ( 0.240) Data 0.033 ( 0.029) Loss 6.8402e+00 (6.9464e+00) Acc@1 0.00 ( 0.17) Acc@5 1.95 ( 0.76) +Epoch: [0][ 616/5004] Time 0.237 ( 0.240) Data 0.031 ( 0.029) Loss 6.8062e+00 (6.9462e+00) Acc@1 0.00 ( 0.17) Acc@5 0.78 ( 0.76) +Epoch: [0][ 617/5004] Time 0.239 ( 0.240) Data 0.032 ( 0.029) Loss 6.7752e+00 (6.9459e+00) Acc@1 0.39 ( 0.17) Acc@5 1.95 ( 0.76) +Epoch: [0][ 618/5004] Time 0.239 ( 0.240) Data 0.031 ( 0.029) Loss 6.8784e+00 (6.9458e+00) Acc@1 0.00 ( 0.17) Acc@5 1.17 ( 0.76) +Epoch: [0][ 619/5004] Time 0.242 ( 0.240) Data 0.034 ( 0.029) Loss 6.8100e+00 (6.9455e+00) Acc@1 0.00 ( 0.17) Acc@5 0.78 ( 0.76) +Epoch: [0][ 620/5004] Time 0.244 ( 0.240) Data 0.030 ( 0.029) Loss 6.8751e+00 (6.9454e+00) Acc@1 0.39 ( 0.17) Acc@5 1.17 ( 0.76) +Epoch: [0][ 621/5004] Time 0.234 ( 0.240) Data 0.025 ( 0.029) Loss 6.8228e+00 (6.9452e+00) Acc@1 0.00 ( 0.17) Acc@5 0.39 ( 0.76) +Epoch: [0][ 622/5004] Time 0.240 ( 0.240) Data 0.029 ( 0.029) Loss 6.7739e+00 (6.9450e+00) Acc@1 0.78 ( 0.17) Acc@5 2.73 ( 0.76) +Epoch: [0][ 623/5004] Time 0.233 ( 0.240) Data 0.029 ( 0.029) Loss 6.8646e+00 (6.9448e+00) Acc@1 0.00 ( 0.17) Acc@5 0.39 ( 0.76) +Epoch: [0][ 624/5004] Time 0.240 ( 0.240) Data 0.033 ( 0.029) Loss 6.8263e+00 (6.9446e+00) Acc@1 0.00 ( 0.17) Acc@5 1.56 ( 0.77) +Epoch: [0][ 625/5004] Time 0.240 ( 0.240) Data 0.031 ( 0.029) Loss 6.8552e+00 (6.9445e+00) Acc@1 0.00 ( 0.17) Acc@5 0.39 ( 0.77) +Epoch: [0][ 626/5004] Time 0.233 ( 0.240) Data 0.028 ( 0.029) Loss 6.8643e+00 (6.9444e+00) Acc@1 0.00 ( 0.17) Acc@5 0.78 ( 0.77) +Epoch: [0][ 627/5004] Time 0.236 ( 0.240) Data 0.032 ( 0.029) Loss 6.8539e+00 
(6.9442e+00) Acc@1 0.39 ( 0.17) Acc@5 1.56 ( 0.77) +Epoch: [0][ 628/5004] Time 0.238 ( 0.240) Data 0.033 ( 0.029) Loss 6.8271e+00 (6.9440e+00) Acc@1 0.78 ( 0.17) Acc@5 1.17 ( 0.77) +Epoch: [0][ 629/5004] Time 0.240 ( 0.240) Data 0.033 ( 0.029) Loss 6.8123e+00 (6.9438e+00) Acc@1 0.00 ( 0.17) Acc@5 1.56 ( 0.77) +Epoch: [0][ 630/5004] Time 0.236 ( 0.240) Data 0.030 ( 0.029) Loss 6.7688e+00 (6.9436e+00) Acc@1 0.39 ( 0.17) Acc@5 1.56 ( 0.77) +Epoch: [0][ 631/5004] Time 0.245 ( 0.240) Data 0.032 ( 0.029) Loss 6.8407e+00 (6.9434e+00) Acc@1 0.00 ( 0.17) Acc@5 0.78 ( 0.77) +Epoch: [0][ 632/5004] Time 0.235 ( 0.240) Data 0.025 ( 0.029) Loss 6.7822e+00 (6.9431e+00) Acc@1 0.39 ( 0.17) Acc@5 1.95 ( 0.77) +Epoch: [0][ 633/5004] Time 0.241 ( 0.240) Data 0.031 ( 0.029) Loss 6.8404e+00 (6.9430e+00) Acc@1 0.39 ( 0.17) Acc@5 0.78 ( 0.77) +Epoch: [0][ 634/5004] Time 0.238 ( 0.240) Data 0.028 ( 0.029) Loss 6.8420e+00 (6.9428e+00) Acc@1 0.00 ( 0.17) Acc@5 0.00 ( 0.77) +Epoch: [0][ 635/5004] Time 0.233 ( 0.240) Data 0.028 ( 0.029) Loss 6.8305e+00 (6.9426e+00) Acc@1 0.00 ( 0.17) Acc@5 1.95 ( 0.77) +Epoch: [0][ 636/5004] Time 0.239 ( 0.240) Data 0.033 ( 0.029) Loss 6.8317e+00 (6.9425e+00) Acc@1 0.39 ( 0.17) Acc@5 1.56 ( 0.77) +Epoch: [0][ 637/5004] Time 0.237 ( 0.240) Data 0.032 ( 0.029) Loss 6.8067e+00 (6.9423e+00) Acc@1 0.00 ( 0.17) Acc@5 0.39 ( 0.77) +Epoch: [0][ 638/5004] Time 0.239 ( 0.240) Data 0.032 ( 0.029) Loss 6.8522e+00 (6.9421e+00) Acc@1 0.39 ( 0.17) Acc@5 0.78 ( 0.77) +Epoch: [0][ 639/5004] Time 0.239 ( 0.240) Data 0.032 ( 0.029) Loss 6.8420e+00 (6.9420e+00) Acc@1 0.39 ( 0.17) Acc@5 0.78 ( 0.77) +Epoch: [0][ 640/5004] Time 0.239 ( 0.240) Data 0.032 ( 0.029) Loss 6.8416e+00 (6.9418e+00) Acc@1 0.00 ( 0.17) Acc@5 0.78 ( 0.77) +Epoch: [0][ 641/5004] Time 0.241 ( 0.240) Data 0.031 ( 0.029) Loss 6.7960e+00 (6.9416e+00) Acc@1 0.00 ( 0.17) Acc@5 0.39 ( 0.77) +Epoch: [0][ 642/5004] Time 0.244 ( 0.240) Data 0.028 ( 0.029) Loss 6.7582e+00 (6.9413e+00) Acc@1 0.39 ( 0.17) Acc@5 1.56 ( 
0.77) +Epoch: [0][ 643/5004] Time 0.245 ( 0.240) Data 0.026 ( 0.029) Loss 6.8175e+00 (6.9411e+00) Acc@1 0.39 ( 0.17) Acc@5 1.56 ( 0.77) +Epoch: [0][ 644/5004] Time 0.245 ( 0.240) Data 0.026 ( 0.029) Loss 6.7842e+00 (6.9409e+00) Acc@1 0.78 ( 0.17) Acc@5 1.95 ( 0.78) +Epoch: [0][ 645/5004] Time 0.237 ( 0.240) Data 0.024 ( 0.029) Loss 6.8666e+00 (6.9407e+00) Acc@1 0.39 ( 0.17) Acc@5 1.17 ( 0.78) +Epoch: [0][ 646/5004] Time 0.243 ( 0.240) Data 0.027 ( 0.029) Loss 6.8177e+00 (6.9405e+00) Acc@1 0.39 ( 0.17) Acc@5 0.39 ( 0.78) +Epoch: [0][ 647/5004] Time 0.246 ( 0.240) Data 0.027 ( 0.029) Loss 6.8347e+00 (6.9404e+00) Acc@1 0.00 ( 0.17) Acc@5 0.78 ( 0.78) +Epoch: [0][ 648/5004] Time 0.250 ( 0.240) Data 0.025 ( 0.029) Loss 6.7411e+00 (6.9401e+00) Acc@1 0.78 ( 0.17) Acc@5 2.34 ( 0.78) +Epoch: [0][ 649/5004] Time 0.244 ( 0.240) Data 0.025 ( 0.029) Loss 6.7987e+00 (6.9399e+00) Acc@1 0.00 ( 0.17) Acc@5 1.17 ( 0.78) +Epoch: [0][ 650/5004] Time 0.242 ( 0.240) Data 0.026 ( 0.029) Loss 6.8064e+00 (6.9397e+00) Acc@1 0.39 ( 0.17) Acc@5 2.34 ( 0.78) +Epoch: [0][ 651/5004] Time 0.245 ( 0.240) Data 0.026 ( 0.029) Loss 6.8511e+00 (6.9395e+00) Acc@1 0.39 ( 0.17) Acc@5 0.78 ( 0.78) +Epoch: [0][ 652/5004] Time 0.242 ( 0.240) Data 0.025 ( 0.029) Loss 6.7996e+00 (6.9393e+00) Acc@1 0.00 ( 0.17) Acc@5 1.17 ( 0.78) +Epoch: [0][ 653/5004] Time 0.244 ( 0.240) Data 0.027 ( 0.029) Loss 6.8324e+00 (6.9391e+00) Acc@1 0.39 ( 0.17) Acc@5 0.78 ( 0.78) +Epoch: [0][ 654/5004] Time 0.242 ( 0.240) Data 0.026 ( 0.029) Loss 6.8182e+00 (6.9390e+00) Acc@1 0.00 ( 0.17) Acc@5 0.78 ( 0.78) +Epoch: [0][ 655/5004] Time 0.245 ( 0.240) Data 0.026 ( 0.029) Loss 6.7676e+00 (6.9387e+00) Acc@1 0.00 ( 0.17) Acc@5 0.39 ( 0.78) +Epoch: [0][ 656/5004] Time 0.237 ( 0.240) Data 0.023 ( 0.029) Loss 6.8215e+00 (6.9385e+00) Acc@1 0.39 ( 0.17) Acc@5 0.39 ( 0.78) +Epoch: [0][ 657/5004] Time 0.241 ( 0.240) Data 0.027 ( 0.029) Loss 6.8605e+00 (6.9384e+00) Acc@1 0.39 ( 0.17) Acc@5 0.78 ( 0.78) +Epoch: [0][ 658/5004] Time 0.243 ( 0.240) 
Data 0.027 ( 0.029) Loss 6.8124e+00 (6.9382e+00) Acc@1 0.00 ( 0.17) Acc@5 1.17 ( 0.78) +Epoch: [0][ 659/5004] Time 0.243 ( 0.240) Data 0.027 ( 0.029) Loss 6.8259e+00 (6.9380e+00) Acc@1 0.39 ( 0.17) Acc@5 1.56 ( 0.78) +Epoch: [0][ 660/5004] Time 0.242 ( 0.240) Data 0.027 ( 0.029) Loss 6.7842e+00 (6.9378e+00) Acc@1 0.39 ( 0.17) Acc@5 0.78 ( 0.78) +Epoch: [0][ 661/5004] Time 0.241 ( 0.240) Data 0.027 ( 0.029) Loss 6.8012e+00 (6.9376e+00) Acc@1 0.00 ( 0.17) Acc@5 1.17 ( 0.78) +Epoch: [0][ 662/5004] Time 0.245 ( 0.240) Data 0.027 ( 0.029) Loss 6.7588e+00 (6.9373e+00) Acc@1 0.39 ( 0.17) Acc@5 1.95 ( 0.79) +Epoch: [0][ 663/5004] Time 0.242 ( 0.240) Data 0.026 ( 0.029) Loss 6.7698e+00 (6.9371e+00) Acc@1 0.00 ( 0.17) Acc@5 1.17 ( 0.79) +Epoch: [0][ 664/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.8143e+00 (6.9369e+00) Acc@1 0.00 ( 0.17) Acc@5 0.78 ( 0.79) +Epoch: [0][ 665/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.8021e+00 (6.9367e+00) Acc@1 0.39 ( 0.17) Acc@5 1.17 ( 0.79) +Epoch: [0][ 666/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.7903e+00 (6.9365e+00) Acc@1 0.00 ( 0.17) Acc@5 0.00 ( 0.79) +Epoch: [0][ 667/5004] Time 0.241 ( 0.240) Data 0.026 ( 0.029) Loss 6.7716e+00 (6.9362e+00) Acc@1 0.39 ( 0.17) Acc@5 1.17 ( 0.79) +Epoch: [0][ 668/5004] Time 0.240 ( 0.240) Data 0.026 ( 0.029) Loss 6.7858e+00 (6.9360e+00) Acc@1 0.39 ( 0.17) Acc@5 0.39 ( 0.79) +Epoch: [0][ 669/5004] Time 0.245 ( 0.240) Data 0.027 ( 0.029) Loss 6.7707e+00 (6.9357e+00) Acc@1 0.39 ( 0.17) Acc@5 1.17 ( 0.79) +Epoch: [0][ 670/5004] Time 0.239 ( 0.240) Data 0.025 ( 0.029) Loss 6.8189e+00 (6.9356e+00) Acc@1 0.00 ( 0.17) Acc@5 1.17 ( 0.79) +Epoch: [0][ 671/5004] Time 0.244 ( 0.240) Data 0.026 ( 0.029) Loss 6.7474e+00 (6.9353e+00) Acc@1 0.39 ( 0.17) Acc@5 0.78 ( 0.79) +Epoch: [0][ 672/5004] Time 0.244 ( 0.240) Data 0.027 ( 0.029) Loss 6.7969e+00 (6.9351e+00) Acc@1 0.78 ( 0.18) Acc@5 1.56 ( 0.79) +Epoch: [0][ 673/5004] Time 0.244 ( 0.240) Data 0.025 ( 0.029) Loss 6.8350e+00 (6.9349e+00) 
Acc@1 0.00 ( 0.18) Acc@5 1.17 ( 0.79) +Epoch: [0][ 674/5004] Time 0.244 ( 0.240) Data 0.024 ( 0.029) Loss 6.7845e+00 (6.9347e+00) Acc@1 0.78 ( 0.18) Acc@5 1.17 ( 0.79) +Epoch: [0][ 675/5004] Time 0.246 ( 0.240) Data 0.023 ( 0.029) Loss 6.7967e+00 (6.9345e+00) Acc@1 0.00 ( 0.18) Acc@5 1.56 ( 0.79) +Epoch: [0][ 676/5004] Time 0.241 ( 0.240) Data 0.022 ( 0.029) Loss 6.7227e+00 (6.9342e+00) Acc@1 1.17 ( 0.18) Acc@5 3.12 ( 0.79) +Epoch: [0][ 677/5004] Time 0.242 ( 0.240) Data 0.024 ( 0.029) Loss 6.7978e+00 (6.9340e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.79) +Epoch: [0][ 678/5004] Time 0.243 ( 0.240) Data 0.024 ( 0.029) Loss 6.7940e+00 (6.9338e+00) Acc@1 0.78 ( 0.18) Acc@5 1.56 ( 0.80) +Epoch: [0][ 679/5004] Time 0.245 ( 0.240) Data 0.023 ( 0.029) Loss 6.8097e+00 (6.9336e+00) Acc@1 0.39 ( 0.18) Acc@5 0.39 ( 0.79) +Epoch: [0][ 680/5004] Time 0.240 ( 0.240) Data 0.021 ( 0.029) Loss 6.7249e+00 (6.9333e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.80) +Epoch: [0][ 681/5004] Time 0.245 ( 0.240) Data 0.024 ( 0.029) Loss 6.8328e+00 (6.9332e+00) Acc@1 0.39 ( 0.18) Acc@5 1.56 ( 0.80) +Epoch: [0][ 682/5004] Time 0.246 ( 0.240) Data 0.022 ( 0.029) Loss 6.8061e+00 (6.9330e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.80) +Epoch: [0][ 683/5004] Time 0.250 ( 0.240) Data 0.022 ( 0.029) Loss 6.7944e+00 (6.9328e+00) Acc@1 0.00 ( 0.18) Acc@5 0.00 ( 0.80) +Epoch: [0][ 684/5004] Time 0.236 ( 0.240) Data 0.018 ( 0.029) Loss 6.7538e+00 (6.9325e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.80) +Epoch: [0][ 685/5004] Time 0.242 ( 0.240) Data 0.023 ( 0.029) Loss 6.7928e+00 (6.9323e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.80) +Epoch: [0][ 686/5004] Time 0.247 ( 0.240) Data 0.024 ( 0.029) Loss 6.8062e+00 (6.9321e+00) Acc@1 0.00 ( 0.18) Acc@5 0.00 ( 0.79) +Epoch: [0][ 687/5004] Time 0.239 ( 0.240) Data 0.022 ( 0.029) Loss 6.7960e+00 (6.9319e+00) Acc@1 0.00 ( 0.18) Acc@5 1.56 ( 0.80) +Epoch: [0][ 688/5004] Time 0.246 ( 0.240) Data 0.024 ( 0.029) Loss 6.7772e+00 (6.9317e+00) Acc@1 0.00 ( 0.18) Acc@5 0.00 ( 0.79) +Epoch: 
[0][ 689/5004] Time 0.240 ( 0.240) Data 0.019 ( 0.029) Loss 6.8267e+00 (6.9315e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.80) +Epoch: [0][ 690/5004] Time 0.242 ( 0.240) Data 0.022 ( 0.029) Loss 6.7962e+00 (6.9314e+00) Acc@1 0.00 ( 0.18) Acc@5 1.56 ( 0.80) +Epoch: [0][ 691/5004] Time 0.246 ( 0.240) Data 0.023 ( 0.029) Loss 6.7935e+00 (6.9312e+00) Acc@1 0.00 ( 0.18) Acc@5 1.17 ( 0.80) +Epoch: [0][ 692/5004] Time 0.240 ( 0.240) Data 0.022 ( 0.029) Loss 6.7547e+00 (6.9309e+00) Acc@1 0.39 ( 0.18) Acc@5 0.39 ( 0.80) +Epoch: [0][ 693/5004] Time 0.242 ( 0.240) Data 0.024 ( 0.029) Loss 6.7938e+00 (6.9307e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.80) +Epoch: [0][ 694/5004] Time 0.241 ( 0.240) Data 0.023 ( 0.029) Loss 6.7997e+00 (6.9305e+00) Acc@1 0.39 ( 0.18) Acc@5 2.34 ( 0.80) +Epoch: [0][ 695/5004] Time 0.244 ( 0.240) Data 0.024 ( 0.029) Loss 6.7578e+00 (6.9303e+00) Acc@1 0.00 ( 0.18) Acc@5 3.12 ( 0.80) +Epoch: [0][ 696/5004] Time 0.241 ( 0.240) Data 0.023 ( 0.029) Loss 6.8340e+00 (6.9301e+00) Acc@1 0.00 ( 0.18) Acc@5 1.17 ( 0.80) +Epoch: [0][ 697/5004] Time 0.244 ( 0.240) Data 0.024 ( 0.029) Loss 6.7596e+00 (6.9299e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.80) +Epoch: [0][ 698/5004] Time 0.242 ( 0.240) Data 0.024 ( 0.029) Loss 6.7900e+00 (6.9297e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.80) +Epoch: [0][ 699/5004] Time 0.241 ( 0.240) Data 0.023 ( 0.029) Loss 6.7942e+00 (6.9295e+00) Acc@1 0.00 ( 0.18) Acc@5 1.95 ( 0.80) +Epoch: [0][ 700/5004] Time 0.241 ( 0.240) Data 0.024 ( 0.029) Loss 6.8207e+00 (6.9293e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.80) +Epoch: [0][ 701/5004] Time 0.243 ( 0.240) Data 0.024 ( 0.029) Loss 6.7811e+00 (6.9291e+00) Acc@1 0.00 ( 0.18) Acc@5 1.56 ( 0.81) +Epoch: [0][ 702/5004] Time 0.242 ( 0.240) Data 0.023 ( 0.029) Loss 6.7903e+00 (6.9289e+00) Acc@1 0.00 ( 0.18) Acc@5 0.00 ( 0.80) +Epoch: [0][ 703/5004] Time 0.243 ( 0.240) Data 0.023 ( 0.029) Loss 6.8257e+00 (6.9288e+00) Acc@1 0.39 ( 0.18) Acc@5 0.39 ( 0.80) +Epoch: [0][ 704/5004] Time 0.247 ( 0.240) Data 0.023 ( 
0.029) Loss 6.7848e+00 (6.9286e+00) Acc@1 0.39 ( 0.18) Acc@5 0.78 ( 0.80) +Epoch: [0][ 705/5004] Time 0.247 ( 0.240) Data 0.023 ( 0.029) Loss 6.8274e+00 (6.9284e+00) Acc@1 0.00 ( 0.18) Acc@5 0.39 ( 0.80) +Epoch: [0][ 706/5004] Time 0.239 ( 0.240) Data 0.019 ( 0.029) Loss 6.8091e+00 (6.9283e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.80) +Epoch: [0][ 707/5004] Time 0.244 ( 0.240) Data 0.023 ( 0.029) Loss 6.7929e+00 (6.9281e+00) Acc@1 0.00 ( 0.18) Acc@5 1.56 ( 0.80) +Epoch: [0][ 708/5004] Time 0.249 ( 0.240) Data 0.023 ( 0.029) Loss 6.7519e+00 (6.9278e+00) Acc@1 0.39 ( 0.18) Acc@5 1.95 ( 0.81) +Epoch: [0][ 709/5004] Time 0.243 ( 0.240) Data 0.020 ( 0.029) Loss 6.7715e+00 (6.9276e+00) Acc@1 0.00 ( 0.18) Acc@5 1.17 ( 0.81) +Epoch: [0][ 710/5004] Time 0.244 ( 0.240) Data 0.023 ( 0.029) Loss 6.7552e+00 (6.9274e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.81) +Epoch: [0][ 711/5004] Time 0.242 ( 0.240) Data 0.024 ( 0.029) Loss 6.7935e+00 (6.9272e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.81) +Epoch: [0][ 712/5004] Time 0.242 ( 0.240) Data 0.024 ( 0.029) Loss 6.7709e+00 (6.9269e+00) Acc@1 0.39 ( 0.18) Acc@5 0.78 ( 0.81) +Epoch: [0][ 713/5004] Time 0.240 ( 0.240) Data 0.023 ( 0.029) Loss 6.7507e+00 (6.9267e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.81) +Epoch: [0][ 714/5004] Time 0.240 ( 0.240) Data 0.023 ( 0.029) Loss 6.8318e+00 (6.9266e+00) Acc@1 0.00 ( 0.18) Acc@5 0.39 ( 0.81) +Epoch: [0][ 715/5004] Time 0.242 ( 0.240) Data 0.023 ( 0.029) Loss 6.8196e+00 (6.9264e+00) Acc@1 0.00 ( 0.18) Acc@5 1.17 ( 0.81) +Epoch: [0][ 716/5004] Time 0.242 ( 0.240) Data 0.022 ( 0.029) Loss 6.7874e+00 (6.9262e+00) Acc@1 0.39 ( 0.18) Acc@5 0.78 ( 0.81) +Epoch: [0][ 717/5004] Time 0.254 ( 0.240) Data 0.022 ( 0.029) Loss 6.7798e+00 (6.9260e+00) Acc@1 0.00 ( 0.18) Acc@5 1.56 ( 0.81) +Epoch: [0][ 718/5004] Time 0.235 ( 0.240) Data 0.016 ( 0.029) Loss 6.7509e+00 (6.9258e+00) Acc@1 0.39 ( 0.18) Acc@5 1.95 ( 0.81) +Epoch: [0][ 719/5004] Time 0.240 ( 0.240) Data 0.024 ( 0.029) Loss 6.7111e+00 (6.9255e+00) Acc@1 0.00 ( 
0.18) Acc@5 1.17 ( 0.81) +Epoch: [0][ 720/5004] Time 0.241 ( 0.240) Data 0.027 ( 0.029) Loss 6.7982e+00 (6.9253e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.81) +Epoch: [0][ 721/5004] Time 0.242 ( 0.240) Data 0.027 ( 0.029) Loss 6.7758e+00 (6.9251e+00) Acc@1 0.39 ( 0.18) Acc@5 1.56 ( 0.81) +Epoch: [0][ 722/5004] Time 0.241 ( 0.240) Data 0.027 ( 0.029) Loss 6.7513e+00 (6.9249e+00) Acc@1 0.00 ( 0.18) Acc@5 0.00 ( 0.81) +Epoch: [0][ 723/5004] Time 0.242 ( 0.240) Data 0.027 ( 0.029) Loss 6.8487e+00 (6.9248e+00) Acc@1 0.39 ( 0.18) Acc@5 0.78 ( 0.81) +Epoch: [0][ 724/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.7830e+00 (6.9246e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.81) +Epoch: [0][ 725/5004] Time 0.249 ( 0.240) Data 0.024 ( 0.029) Loss 6.7463e+00 (6.9243e+00) Acc@1 0.78 ( 0.18) Acc@5 3.12 ( 0.81) +Epoch: [0][ 726/5004] Time 0.237 ( 0.240) Data 0.019 ( 0.029) Loss 6.7603e+00 (6.9241e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.81) +Epoch: [0][ 727/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.7135e+00 (6.9238e+00) Acc@1 0.00 ( 0.18) Acc@5 1.56 ( 0.81) +Epoch: [0][ 728/5004] Time 0.242 ( 0.240) Data 0.025 ( 0.029) Loss 6.7802e+00 (6.9236e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.82) +Epoch: [0][ 729/5004] Time 0.242 ( 0.240) Data 0.026 ( 0.029) Loss 6.7733e+00 (6.9234e+00) Acc@1 0.00 ( 0.18) Acc@5 1.56 ( 0.82) +Epoch: [0][ 730/5004] Time 0.241 ( 0.240) Data 0.027 ( 0.029) Loss 6.7914e+00 (6.9232e+00) Acc@1 0.00 ( 0.18) Acc@5 1.56 ( 0.82) +Epoch: [0][ 731/5004] Time 0.244 ( 0.240) Data 0.027 ( 0.029) Loss 6.7794e+00 (6.9230e+00) Acc@1 0.00 ( 0.18) Acc@5 0.39 ( 0.82) +Epoch: [0][ 732/5004] Time 0.245 ( 0.240) Data 0.026 ( 0.029) Loss 6.8379e+00 (6.9229e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.82) +Epoch: [0][ 733/5004] Time 0.243 ( 0.240) Data 0.023 ( 0.029) Loss 6.7470e+00 (6.9227e+00) Acc@1 0.00 ( 0.18) Acc@5 0.00 ( 0.82) +Epoch: [0][ 734/5004] Time 0.238 ( 0.240) Data 0.022 ( 0.029) Loss 6.7484e+00 (6.9224e+00) Acc@1 0.78 ( 0.18) Acc@5 1.95 ( 0.82) +Epoch: [0][ 735/5004] 
Time 0.242 ( 0.240) Data 0.026 ( 0.029) Loss 6.8033e+00 (6.9223e+00) Acc@1 0.00 ( 0.18) Acc@5 1.56 ( 0.82) +Epoch: [0][ 736/5004] Time 0.242 ( 0.240) Data 0.026 ( 0.029) Loss 6.7671e+00 (6.9221e+00) Acc@1 0.00 ( 0.18) Acc@5 1.17 ( 0.82) +Epoch: [0][ 737/5004] Time 0.242 ( 0.240) Data 0.027 ( 0.029) Loss 6.7491e+00 (6.9218e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.82) +Epoch: [0][ 738/5004] Time 0.245 ( 0.240) Data 0.027 ( 0.029) Loss 6.7796e+00 (6.9216e+00) Acc@1 0.00 ( 0.18) Acc@5 1.17 ( 0.82) +Epoch: [0][ 739/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.7394e+00 (6.9214e+00) Acc@1 0.39 ( 0.18) Acc@5 0.78 ( 0.82) +Epoch: [0][ 740/5004] Time 0.246 ( 0.240) Data 0.026 ( 0.029) Loss 6.7562e+00 (6.9212e+00) Acc@1 0.39 ( 0.18) Acc@5 0.78 ( 0.82) +Epoch: [0][ 741/5004] Time 0.241 ( 0.240) Data 0.027 ( 0.029) Loss 6.7607e+00 (6.9209e+00) Acc@1 0.39 ( 0.18) Acc@5 1.95 ( 0.82) +Epoch: [0][ 742/5004] Time 0.244 ( 0.240) Data 0.026 ( 0.029) Loss 6.8022e+00 (6.9208e+00) Acc@1 0.39 ( 0.18) Acc@5 0.78 ( 0.82) +Epoch: [0][ 743/5004] Time 0.242 ( 0.240) Data 0.027 ( 0.029) Loss 6.7848e+00 (6.9206e+00) Acc@1 0.39 ( 0.18) Acc@5 2.73 ( 0.82) +Epoch: [0][ 744/5004] Time 0.243 ( 0.240) Data 0.027 ( 0.029) Loss 6.7940e+00 (6.9204e+00) Acc@1 0.39 ( 0.18) Acc@5 0.78 ( 0.82) +Epoch: [0][ 745/5004] Time 0.248 ( 0.240) Data 0.026 ( 0.029) Loss 6.7476e+00 (6.9202e+00) Acc@1 0.00 ( 0.18) Acc@5 2.34 ( 0.83) +Epoch: [0][ 746/5004] Time 0.235 ( 0.240) Data 0.022 ( 0.029) Loss 6.7598e+00 (6.9200e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.83) +Epoch: [0][ 747/5004] Time 0.242 ( 0.240) Data 0.027 ( 0.029) Loss 6.7342e+00 (6.9197e+00) Acc@1 0.78 ( 0.18) Acc@5 1.56 ( 0.83) +Epoch: [0][ 748/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.7313e+00 (6.9195e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.83) +Epoch: [0][ 749/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.7544e+00 (6.9193e+00) Acc@1 0.00 ( 0.18) Acc@5 1.95 ( 0.83) +Epoch: [0][ 750/5004] Time 0.240 ( 0.240) Data 0.025 ( 0.029) Loss 
6.7663e+00 (6.9191e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.83) +Epoch: [0][ 751/5004] Time 0.247 ( 0.240) Data 0.027 ( 0.029) Loss 6.7498e+00 (6.9188e+00) Acc@1 0.39 ( 0.18) Acc@5 1.17 ( 0.83) +Epoch: [0][ 752/5004] Time 0.249 ( 0.240) Data 0.025 ( 0.029) Loss 6.7318e+00 (6.9186e+00) Acc@1 0.00 ( 0.18) Acc@5 2.73 ( 0.83) +Epoch: [0][ 753/5004] Time 0.240 ( 0.240) Data 0.024 ( 0.029) Loss 6.7261e+00 (6.9183e+00) Acc@1 0.39 ( 0.18) Acc@5 1.56 ( 0.83) +Epoch: [0][ 754/5004] Time 0.244 ( 0.240) Data 0.026 ( 0.029) Loss 6.7644e+00 (6.9181e+00) Acc@1 0.78 ( 0.18) Acc@5 1.56 ( 0.83) +Epoch: [0][ 755/5004] Time 0.250 ( 0.240) Data 0.026 ( 0.029) Loss 6.7217e+00 (6.9179e+00) Acc@1 0.39 ( 0.18) Acc@5 0.78 ( 0.83) +Epoch: [0][ 756/5004] Time 0.246 ( 0.240) Data 0.025 ( 0.029) Loss 6.7608e+00 (6.9177e+00) Acc@1 0.00 ( 0.18) Acc@5 1.95 ( 0.83) +Epoch: [0][ 757/5004] Time 0.240 ( 0.240) Data 0.024 ( 0.029) Loss 6.7668e+00 (6.9175e+00) Acc@1 0.00 ( 0.18) Acc@5 0.39 ( 0.83) +Epoch: [0][ 758/5004] Time 0.242 ( 0.240) Data 0.026 ( 0.029) Loss 6.6999e+00 (6.9172e+00) Acc@1 0.78 ( 0.18) Acc@5 1.95 ( 0.84) +Epoch: [0][ 759/5004] Time 0.246 ( 0.240) Data 0.026 ( 0.029) Loss 6.7377e+00 (6.9169e+00) Acc@1 0.39 ( 0.18) Acc@5 0.78 ( 0.84) +Epoch: [0][ 760/5004] Time 0.241 ( 0.240) Data 0.025 ( 0.029) Loss 6.7207e+00 (6.9167e+00) Acc@1 0.39 ( 0.18) Acc@5 1.95 ( 0.84) +Epoch: [0][ 761/5004] Time 0.242 ( 0.241) Data 0.027 ( 0.029) Loss 6.7721e+00 (6.9165e+00) Acc@1 0.78 ( 0.18) Acc@5 1.56 ( 0.84) +Epoch: [0][ 762/5004] Time 0.241 ( 0.241) Data 0.027 ( 0.029) Loss 6.7394e+00 (6.9163e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.84) +Epoch: [0][ 763/5004] Time 0.241 ( 0.241) Data 0.027 ( 0.029) Loss 6.7596e+00 (6.9160e+00) Acc@1 0.00 ( 0.18) Acc@5 1.17 ( 0.84) +Epoch: [0][ 764/5004] Time 0.240 ( 0.241) Data 0.027 ( 0.029) Loss 6.7983e+00 (6.9159e+00) Acc@1 0.00 ( 0.18) Acc@5 1.56 ( 0.84) +Epoch: [0][ 765/5004] Time 0.243 ( 0.241) Data 0.027 ( 0.029) Loss 6.7662e+00 (6.9157e+00) Acc@1 0.39 ( 0.18) Acc@5 
1.17 ( 0.84) +Epoch: [0][ 766/5004] Time 0.248 ( 0.241) Data 0.026 ( 0.029) Loss 6.7960e+00 (6.9155e+00) Acc@1 1.17 ( 0.18) Acc@5 1.95 ( 0.84) +Epoch: [0][ 767/5004] Time 0.241 ( 0.241) Data 0.024 ( 0.029) Loss 6.7648e+00 (6.9153e+00) Acc@1 0.00 ( 0.18) Acc@5 1.56 ( 0.84) +Epoch: [0][ 768/5004] Time 0.240 ( 0.241) Data 0.026 ( 0.029) Loss 6.6788e+00 (6.9150e+00) Acc@1 0.39 ( 0.18) Acc@5 2.34 ( 0.84) +Epoch: [0][ 769/5004] Time 0.242 ( 0.241) Data 0.026 ( 0.029) Loss 6.7563e+00 (6.9148e+00) Acc@1 0.00 ( 0.18) Acc@5 0.78 ( 0.84) +Epoch: [0][ 770/5004] Time 0.242 ( 0.241) Data 0.026 ( 0.029) Loss 6.7550e+00 (6.9146e+00) Acc@1 0.39 ( 0.18) Acc@5 0.78 ( 0.84) +Epoch: [0][ 771/5004] Time 0.246 ( 0.241) Data 0.027 ( 0.029) Loss 6.7234e+00 (6.9144e+00) Acc@1 0.78 ( 0.18) Acc@5 2.73 ( 0.85) +Epoch: [0][ 772/5004] Time 0.237 ( 0.241) Data 0.023 ( 0.029) Loss 6.7232e+00 (6.9141e+00) Acc@1 0.78 ( 0.18) Acc@5 1.56 ( 0.85) +Epoch: [0][ 773/5004] Time 0.241 ( 0.241) Data 0.027 ( 0.029) Loss 6.7600e+00 (6.9139e+00) Acc@1 0.00 ( 0.18) Acc@5 1.17 ( 0.85) +Epoch: [0][ 774/5004] Time 0.241 ( 0.241) Data 0.027 ( 0.029) Loss 6.7357e+00 (6.9137e+00) Acc@1 0.78 ( 0.19) Acc@5 1.95 ( 0.85) +Epoch: [0][ 775/5004] Time 0.242 ( 0.241) Data 0.027 ( 0.029) Loss 6.7328e+00 (6.9135e+00) Acc@1 0.00 ( 0.19) Acc@5 0.78 ( 0.85) +Epoch: [0][ 776/5004] Time 0.240 ( 0.241) Data 0.026 ( 0.029) Loss 6.7314e+00 (6.9132e+00) Acc@1 0.39 ( 0.19) Acc@5 0.78 ( 0.85) +Epoch: [0][ 777/5004] Time 0.244 ( 0.241) Data 0.027 ( 0.029) Loss 6.6561e+00 (6.9129e+00) Acc@1 0.78 ( 0.19) Acc@5 2.34 ( 0.85) +Epoch: [0][ 778/5004] Time 0.244 ( 0.241) Data 0.025 ( 0.029) Loss 6.6965e+00 (6.9126e+00) Acc@1 0.00 ( 0.19) Acc@5 1.17 ( 0.85) +Epoch: [0][ 779/5004] Time 0.252 ( 0.241) Data 0.024 ( 0.029) Loss 6.7689e+00 (6.9124e+00) Acc@1 1.56 ( 0.19) Acc@5 3.52 ( 0.85) +Epoch: [0][ 780/5004] Time 0.238 ( 0.241) Data 0.017 ( 0.029) Loss 6.7023e+00 (6.9122e+00) Acc@1 1.17 ( 0.19) Acc@5 3.12 ( 0.86) +Epoch: [0][ 781/5004] Time 0.249 ( 
0.241) Data 0.023 ( 0.029) Loss 6.7325e+00 (6.9119e+00) Acc@1 0.78 ( 0.19) Acc@5 1.56 ( 0.86) +Epoch: [0][ 782/5004] Time 0.243 ( 0.241) Data 0.021 ( 0.029) Loss 6.7458e+00 (6.9117e+00) Acc@1 0.78 ( 0.19) Acc@5 2.73 ( 0.86) +Epoch: [0][ 783/5004] Time 0.242 ( 0.241) Data 0.024 ( 0.029) Loss 6.8207e+00 (6.9116e+00) Acc@1 0.39 ( 0.19) Acc@5 0.78 ( 0.86) +Epoch: [0][ 784/5004] Time 0.243 ( 0.241) Data 0.024 ( 0.029) Loss 6.6930e+00 (6.9113e+00) Acc@1 0.78 ( 0.19) Acc@5 1.95 ( 0.86) +Epoch: [0][ 785/5004] Time 0.247 ( 0.241) Data 0.024 ( 0.029) Loss 6.7238e+00 (6.9111e+00) Acc@1 0.39 ( 0.19) Acc@5 0.39 ( 0.86) +Epoch: [0][ 786/5004] Time 0.238 ( 0.241) Data 0.019 ( 0.029) Loss 6.7600e+00 (6.9109e+00) Acc@1 0.39 ( 0.19) Acc@5 1.17 ( 0.86) +Epoch: [0][ 787/5004] Time 0.247 ( 0.241) Data 0.024 ( 0.029) Loss 6.7355e+00 (6.9107e+00) Acc@1 0.39 ( 0.19) Acc@5 0.78 ( 0.86) +Epoch: [0][ 788/5004] Time 0.245 ( 0.241) Data 0.022 ( 0.029) Loss 6.7659e+00 (6.9105e+00) Acc@1 0.00 ( 0.19) Acc@5 0.78 ( 0.86) +Epoch: [0][ 789/5004] Time 0.247 ( 0.241) Data 0.022 ( 0.029) Loss 6.7434e+00 (6.9103e+00) Acc@1 0.78 ( 0.19) Acc@5 1.56 ( 0.86) +Epoch: [0][ 790/5004] Time 0.243 ( 0.241) Data 0.021 ( 0.029) Loss 6.7409e+00 (6.9101e+00) Acc@1 0.78 ( 0.19) Acc@5 0.78 ( 0.86) +Epoch: [0][ 791/5004] Time 0.245 ( 0.241) Data 0.022 ( 0.029) Loss 6.7395e+00 (6.9099e+00) Acc@1 0.00 ( 0.19) Acc@5 1.56 ( 0.86) +Epoch: [0][ 792/5004] Time 0.242 ( 0.241) Data 0.021 ( 0.029) Loss 6.7572e+00 (6.9097e+00) Acc@1 0.78 ( 0.19) Acc@5 1.95 ( 0.86) +Epoch: [0][ 793/5004] Time 0.246 ( 0.241) Data 0.025 ( 0.029) Loss 6.7169e+00 (6.9094e+00) Acc@1 0.00 ( 0.19) Acc@5 0.78 ( 0.86) +Epoch: [0][ 794/5004] Time 0.243 ( 0.241) Data 0.023 ( 0.029) Loss 6.7659e+00 (6.9092e+00) Acc@1 0.00 ( 0.19) Acc@5 0.00 ( 0.86) +Epoch: [0][ 795/5004] Time 0.244 ( 0.241) Data 0.024 ( 0.029) Loss 6.7652e+00 (6.9091e+00) Acc@1 0.78 ( 0.19) Acc@5 1.17 ( 0.86) +Epoch: [0][ 796/5004] Time 0.241 ( 0.241) Data 0.024 ( 0.029) Loss 6.7069e+00 
(6.9088e+00) Acc@1 0.39 ( 0.19) Acc@5 0.39 ( 0.86) +Epoch: [0][ 797/5004] Time 0.245 ( 0.241) Data 0.024 ( 0.029) Loss 6.7473e+00 (6.9086e+00) Acc@1 0.78 ( 0.20) Acc@5 2.34 ( 0.87) +Epoch: [0][ 798/5004] Time 0.238 ( 0.241) Data 0.023 ( 0.028) Loss 6.6947e+00 (6.9083e+00) Acc@1 0.39 ( 0.20) Acc@5 1.17 ( 0.87) +Epoch: [0][ 799/5004] Time 0.238 ( 0.241) Data 0.025 ( 0.028) Loss 6.7111e+00 (6.9081e+00) Acc@1 0.00 ( 0.20) Acc@5 0.78 ( 0.87) +Epoch: [0][ 800/5004] Time 0.239 ( 0.241) Data 0.027 ( 0.028) Loss 6.7774e+00 (6.9079e+00) Acc@1 0.39 ( 0.20) Acc@5 1.95 ( 0.87) +Epoch: [0][ 801/5004] Time 0.240 ( 0.241) Data 0.026 ( 0.028) Loss 6.7340e+00 (6.9077e+00) Acc@1 0.00 ( 0.20) Acc@5 1.17 ( 0.87) +Epoch: [0][ 802/5004] Time 0.235 ( 0.241) Data 0.025 ( 0.028) Loss 6.7137e+00 (6.9075e+00) Acc@1 0.78 ( 0.20) Acc@5 1.56 ( 0.87) +Epoch: [0][ 803/5004] Time 0.241 ( 0.241) Data 0.027 ( 0.028) Loss 6.7419e+00 (6.9073e+00) Acc@1 0.39 ( 0.20) Acc@5 1.95 ( 0.87) +Epoch: [0][ 804/5004] Time 0.239 ( 0.241) Data 0.026 ( 0.028) Loss 6.7497e+00 (6.9071e+00) Acc@1 0.78 ( 0.20) Acc@5 2.34 ( 0.87) +Epoch: [0][ 805/5004] Time 0.239 ( 0.241) Data 0.026 ( 0.028) Loss 6.7197e+00 (6.9068e+00) Acc@1 0.00 ( 0.20) Acc@5 0.39 ( 0.87) +Epoch: [0][ 806/5004] Time 0.239 ( 0.241) Data 0.027 ( 0.028) Loss 6.7421e+00 (6.9066e+00) Acc@1 0.78 ( 0.20) Acc@5 1.56 ( 0.87) +Epoch: [0][ 807/5004] Time 0.239 ( 0.241) Data 0.026 ( 0.028) Loss 6.7418e+00 (6.9064e+00) Acc@1 0.00 ( 0.20) Acc@5 0.78 ( 0.87) +Epoch: [0][ 808/5004] Time 0.243 ( 0.241) Data 0.026 ( 0.028) Loss 6.7555e+00 (6.9062e+00) Acc@1 0.39 ( 0.20) Acc@5 1.17 ( 0.87) +Epoch: [0][ 809/5004] Time 0.246 ( 0.241) Data 0.023 ( 0.028) Loss 6.6898e+00 (6.9060e+00) Acc@1 0.39 ( 0.20) Acc@5 1.56 ( 0.87) +Epoch: [0][ 810/5004] Time 0.244 ( 0.241) Data 0.022 ( 0.028) Loss 6.7155e+00 (6.9057e+00) Acc@1 0.39 ( 0.20) Acc@5 1.17 ( 0.87) +Epoch: [0][ 811/5004] Time 0.244 ( 0.241) Data 0.021 ( 0.028) Loss 6.7102e+00 (6.9055e+00) Acc@1 0.78 ( 0.20) Acc@5 4.30 ( 
0.88) +Epoch: [0][ 812/5004] Time 0.245 ( 0.241) Data 0.026 ( 0.028) Loss 6.6651e+00 (6.9052e+00) Acc@1 0.78 ( 0.20) Acc@5 1.95 ( 0.88) +Epoch: [0][ 813/5004] Time 0.238 ( 0.241) Data 0.023 ( 0.028) Loss 6.6914e+00 (6.9049e+00) Acc@1 1.56 ( 0.20) Acc@5 3.12 ( 0.88) +Epoch: [0][ 814/5004] Time 0.240 ( 0.241) Data 0.023 ( 0.028) Loss 6.8272e+00 (6.9048e+00) Acc@1 0.78 ( 0.20) Acc@5 1.56 ( 0.88) +Epoch: [0][ 815/5004] Time 0.241 ( 0.241) Data 0.024 ( 0.028) Loss 6.6330e+00 (6.9045e+00) Acc@1 0.00 ( 0.20) Acc@5 0.39 ( 0.88) +Epoch: [0][ 816/5004] Time 0.241 ( 0.241) Data 0.025 ( 0.028) Loss 6.7128e+00 (6.9043e+00) Acc@1 0.00 ( 0.20) Acc@5 0.39 ( 0.88) +Epoch: [0][ 817/5004] Time 0.238 ( 0.241) Data 0.024 ( 0.028) Loss 6.7701e+00 (6.9041e+00) Acc@1 0.39 ( 0.20) Acc@5 0.78 ( 0.88) +Epoch: [0][ 818/5004] Time 0.240 ( 0.241) Data 0.025 ( 0.028) Loss 6.6802e+00 (6.9038e+00) Acc@1 0.00 ( 0.20) Acc@5 1.17 ( 0.88) +Epoch: [0][ 819/5004] Time 0.242 ( 0.241) Data 0.025 ( 0.028) Loss 6.7129e+00 (6.9036e+00) Acc@1 0.39 ( 0.20) Acc@5 1.56 ( 0.88) +Epoch: [0][ 820/5004] Time 0.243 ( 0.241) Data 0.024 ( 0.028) Loss 6.6999e+00 (6.9034e+00) Acc@1 0.00 ( 0.20) Acc@5 1.17 ( 0.88) +Epoch: [0][ 821/5004] Time 0.246 ( 0.241) Data 0.023 ( 0.028) Loss 6.6856e+00 (6.9031e+00) Acc@1 0.00 ( 0.20) Acc@5 0.78 ( 0.88) +Epoch: [0][ 822/5004] Time 0.235 ( 0.241) Data 0.020 ( 0.028) Loss 6.7094e+00 (6.9029e+00) Acc@1 0.39 ( 0.20) Acc@5 1.56 ( 0.88) +Epoch: [0][ 823/5004] Time 0.250 ( 0.241) Data 0.025 ( 0.028) Loss 6.6900e+00 (6.9026e+00) Acc@1 0.00 ( 0.20) Acc@5 1.56 ( 0.88) +Epoch: [0][ 824/5004] Time 0.245 ( 0.241) Data 0.020 ( 0.028) Loss 6.6879e+00 (6.9023e+00) Acc@1 0.00 ( 0.20) Acc@5 1.95 ( 0.89) +Epoch: [0][ 825/5004] Time 0.243 ( 0.241) Data 0.021 ( 0.028) Loss 6.6856e+00 (6.9021e+00) Acc@1 0.00 ( 0.20) Acc@5 1.95 ( 0.89) +Epoch: [0][ 826/5004] Time 0.247 ( 0.241) Data 0.022 ( 0.028) Loss 6.6618e+00 (6.9018e+00) Acc@1 1.17 ( 0.20) Acc@5 4.30 ( 0.89) +Epoch: [0][ 827/5004] Time 0.243 ( 0.241) 
Data 0.018 ( 0.028) Loss 6.7441e+00 (6.9016e+00) Acc@1 0.00 ( 0.20) Acc@5 1.95 ( 0.89) +Epoch: [0][ 828/5004] Time 0.250 ( 0.241) Data 0.020 ( 0.028) Loss 6.6826e+00 (6.9013e+00) Acc@1 0.00 ( 0.20) Acc@5 2.34 ( 0.89) +Epoch: [0][ 829/5004] Time 0.241 ( 0.241) Data 0.016 ( 0.028) Loss 6.7311e+00 (6.9011e+00) Acc@1 0.39 ( 0.20) Acc@5 2.34 ( 0.90) +Epoch: [0][ 830/5004] Time 0.243 ( 0.241) Data 0.020 ( 0.028) Loss 6.7218e+00 (6.9009e+00) Acc@1 0.39 ( 0.20) Acc@5 2.34 ( 0.90) +Epoch: [0][ 831/5004] Time 0.244 ( 0.241) Data 0.022 ( 0.028) Loss 6.7127e+00 (6.9007e+00) Acc@1 0.00 ( 0.20) Acc@5 1.17 ( 0.90) +Epoch: [0][ 832/5004] Time 0.245 ( 0.241) Data 0.022 ( 0.028) Loss 6.7441e+00 (6.9005e+00) Acc@1 0.39 ( 0.20) Acc@5 1.56 ( 0.90) +Epoch: [0][ 833/5004] Time 0.247 ( 0.241) Data 0.022 ( 0.028) Loss 6.6563e+00 (6.9002e+00) Acc@1 0.39 ( 0.20) Acc@5 1.95 ( 0.90) +Epoch: [0][ 834/5004] Time 0.244 ( 0.241) Data 0.020 ( 0.028) Loss 6.6703e+00 (6.8999e+00) Acc@1 0.78 ( 0.20) Acc@5 2.34 ( 0.90) +Epoch: [0][ 835/5004] Time 0.245 ( 0.241) Data 0.022 ( 0.028) Loss 6.7149e+00 (6.8997e+00) Acc@1 0.39 ( 0.20) Acc@5 1.95 ( 0.90) +Epoch: [0][ 836/5004] Time 0.245 ( 0.241) Data 0.022 ( 0.028) Loss 6.6870e+00 (6.8995e+00) Acc@1 0.00 ( 0.20) Acc@5 0.78 ( 0.90) +Epoch: [0][ 837/5004] Time 0.248 ( 0.241) Data 0.021 ( 0.028) Loss 6.6788e+00 (6.8992e+00) Acc@1 0.78 ( 0.20) Acc@5 1.56 ( 0.90) +Epoch: [0][ 838/5004] Time 0.243 ( 0.241) Data 0.021 ( 0.028) Loss 6.7434e+00 (6.8990e+00) Acc@1 0.00 ( 0.20) Acc@5 1.56 ( 0.90) +Epoch: [0][ 839/5004] Time 0.247 ( 0.241) Data 0.021 ( 0.028) Loss 6.6562e+00 (6.8987e+00) Acc@1 0.39 ( 0.20) Acc@5 1.17 ( 0.90) +Epoch: [0][ 840/5004] Time 0.251 ( 0.241) Data 0.019 ( 0.028) Loss 6.6764e+00 (6.8985e+00) Acc@1 0.78 ( 0.20) Acc@5 2.34 ( 0.91) +Epoch: [0][ 841/5004] Time 0.244 ( 0.241) Data 0.019 ( 0.028) Loss 6.6457e+00 (6.8982e+00) Acc@1 1.56 ( 0.21) Acc@5 2.73 ( 0.91) +Epoch: [0][ 842/5004] Time 0.243 ( 0.241) Data 0.021 ( 0.028) Loss 6.6862e+00 (6.8979e+00) 
Acc@1 0.00 ( 0.21) Acc@5 1.56 ( 0.91) +Epoch: [0][ 843/5004] Time 0.238 ( 0.241) Data 0.021 ( 0.028) Loss 6.6416e+00 (6.8976e+00) Acc@1 0.00 ( 0.21) Acc@5 2.73 ( 0.91) +Epoch: [0][ 844/5004] Time 0.239 ( 0.241) Data 0.024 ( 0.028) Loss 6.7771e+00 (6.8975e+00) Acc@1 0.39 ( 0.21) Acc@5 1.17 ( 0.91) +Epoch: [0][ 845/5004] Time 0.242 ( 0.241) Data 0.025 ( 0.028) Loss 6.7240e+00 (6.8972e+00) Acc@1 0.39 ( 0.21) Acc@5 1.95 ( 0.91) +Epoch: [0][ 846/5004] Time 0.239 ( 0.241) Data 0.024 ( 0.028) Loss 6.6886e+00 (6.8970e+00) Acc@1 0.39 ( 0.21) Acc@5 1.56 ( 0.91) +Epoch: [0][ 847/5004] Time 0.237 ( 0.241) Data 0.025 ( 0.028) Loss 6.6923e+00 (6.8968e+00) Acc@1 0.39 ( 0.21) Acc@5 1.56 ( 0.91) +Epoch: [0][ 848/5004] Time 0.241 ( 0.241) Data 0.028 ( 0.028) Loss 6.6383e+00 (6.8965e+00) Acc@1 0.39 ( 0.21) Acc@5 1.56 ( 0.92) +Epoch: [0][ 849/5004] Time 0.241 ( 0.241) Data 0.027 ( 0.028) Loss 6.6398e+00 (6.8962e+00) Acc@1 0.39 ( 0.21) Acc@5 1.95 ( 0.92) +Epoch: [0][ 850/5004] Time 0.244 ( 0.241) Data 0.027 ( 0.028) Loss 6.6543e+00 (6.8959e+00) Acc@1 0.00 ( 0.21) Acc@5 1.17 ( 0.92) +Epoch: [0][ 851/5004] Time 0.239 ( 0.241) Data 0.025 ( 0.028) Loss 6.6729e+00 (6.8956e+00) Acc@1 0.00 ( 0.21) Acc@5 0.39 ( 0.92) +Epoch: [0][ 852/5004] Time 0.242 ( 0.241) Data 0.031 ( 0.028) Loss 6.6319e+00 (6.8953e+00) Acc@1 0.00 ( 0.21) Acc@5 1.95 ( 0.92) +Epoch: [0][ 853/5004] Time 0.243 ( 0.241) Data 0.028 ( 0.028) Loss 6.7144e+00 (6.8951e+00) Acc@1 0.39 ( 0.21) Acc@5 1.95 ( 0.92) +Epoch: [0][ 854/5004] Time 0.238 ( 0.241) Data 0.028 ( 0.028) Loss 6.6691e+00 (6.8948e+00) Acc@1 0.39 ( 0.21) Acc@5 1.56 ( 0.92) +Epoch: [0][ 855/5004] Time 0.244 ( 0.241) Data 0.028 ( 0.028) Loss 6.7167e+00 (6.8946e+00) Acc@1 0.00 ( 0.21) Acc@5 1.17 ( 0.92) +Epoch: [0][ 856/5004] Time 0.236 ( 0.241) Data 0.024 ( 0.028) Loss 6.6680e+00 (6.8944e+00) Acc@1 0.00 ( 0.21) Acc@5 0.78 ( 0.92) +Epoch: [0][ 857/5004] Time 0.241 ( 0.241) Data 0.028 ( 0.028) Loss 6.7065e+00 (6.8941e+00) Acc@1 0.00 ( 0.21) Acc@5 1.95 ( 0.92) +Epoch: 
[0][ 858/5004] Time 0.239 ( 0.241) Data 0.028 ( 0.028) Loss 6.7136e+00 (6.8939e+00) Acc@1 0.39 ( 0.21) Acc@5 2.34 ( 0.92) +Epoch: [0][ 859/5004] Time 0.241 ( 0.241) Data 0.027 ( 0.028) Loss 6.7470e+00 (6.8938e+00) Acc@1 0.39 ( 0.21) Acc@5 0.39 ( 0.92) +Epoch: [0][ 860/5004] Time 0.238 ( 0.241) Data 0.026 ( 0.028) Loss 6.6364e+00 (6.8935e+00) Acc@1 0.39 ( 0.21) Acc@5 2.34 ( 0.92) +Epoch: [0][ 861/5004] Time 0.240 ( 0.241) Data 0.028 ( 0.028) Loss 6.6422e+00 (6.8932e+00) Acc@1 0.78 ( 0.21) Acc@5 1.56 ( 0.92) +Epoch: [0][ 862/5004] Time 0.239 ( 0.241) Data 0.028 ( 0.028) Loss 6.7149e+00 (6.8930e+00) Acc@1 0.00 ( 0.21) Acc@5 0.39 ( 0.92) +Epoch: [0][ 863/5004] Time 0.240 ( 0.241) Data 0.028 ( 0.028) Loss 6.6896e+00 (6.8927e+00) Acc@1 0.00 ( 0.21) Acc@5 1.56 ( 0.92) +Epoch: [0][ 864/5004] Time 0.239 ( 0.241) Data 0.028 ( 0.028) Loss 6.7000e+00 (6.8925e+00) Acc@1 0.39 ( 0.21) Acc@5 1.95 ( 0.93) +Epoch: [0][ 865/5004] Time 0.242 ( 0.241) Data 0.028 ( 0.028) Loss 6.6969e+00 (6.8923e+00) Acc@1 0.78 ( 0.21) Acc@5 3.12 ( 0.93) +Epoch: [0][ 866/5004] Time 0.236 ( 0.241) Data 0.024 ( 0.028) Loss 6.6869e+00 (6.8920e+00) Acc@1 0.78 ( 0.21) Acc@5 1.17 ( 0.93) +Epoch: [0][ 867/5004] Time 0.238 ( 0.241) Data 0.027 ( 0.028) Loss 6.7326e+00 (6.8918e+00) Acc@1 0.00 ( 0.21) Acc@5 1.56 ( 0.93) +Epoch: [0][ 868/5004] Time 0.239 ( 0.241) Data 0.027 ( 0.028) Loss 6.6978e+00 (6.8916e+00) Acc@1 0.00 ( 0.21) Acc@5 2.34 ( 0.93) +Epoch: [0][ 869/5004] Time 0.240 ( 0.241) Data 0.027 ( 0.028) Loss 6.6793e+00 (6.8914e+00) Acc@1 0.78 ( 0.21) Acc@5 0.78 ( 0.93) +Epoch: [0][ 870/5004] Time 0.239 ( 0.241) Data 0.027 ( 0.028) Loss 6.5643e+00 (6.8910e+00) Acc@1 1.17 ( 0.21) Acc@5 3.52 ( 0.93) +Epoch: [0][ 871/5004] Time 0.240 ( 0.241) Data 0.028 ( 0.028) Loss 6.6531e+00 (6.8907e+00) Acc@1 0.00 ( 0.21) Acc@5 1.17 ( 0.93) +Epoch: [0][ 872/5004] Time 0.239 ( 0.241) Data 0.027 ( 0.028) Loss 6.6791e+00 (6.8905e+00) Acc@1 0.39 ( 0.21) Acc@5 2.73 ( 0.94) +Epoch: [0][ 873/5004] Time 0.243 ( 0.241) Data 0.028 ( 
0.028) Loss 6.6698e+00 (6.8902e+00) Acc@1 1.56 ( 0.21) Acc@5 3.12 ( 0.94) +Epoch: [0][ 874/5004] Time 0.237 ( 0.241) Data 0.025 ( 0.028) Loss 6.6887e+00 (6.8900e+00) Acc@1 0.39 ( 0.21) Acc@5 1.56 ( 0.94) +Epoch: [0][ 875/5004] Time 0.244 ( 0.241) Data 0.028 ( 0.028) Loss 6.6896e+00 (6.8898e+00) Acc@1 0.39 ( 0.21) Acc@5 3.52 ( 0.94) +Epoch: [0][ 876/5004] Time 0.236 ( 0.241) Data 0.025 ( 0.028) Loss 6.6358e+00 (6.8895e+00) Acc@1 0.78 ( 0.21) Acc@5 1.56 ( 0.94) +Epoch: [0][ 877/5004] Time 0.243 ( 0.241) Data 0.028 ( 0.028) Loss 6.7604e+00 (6.8893e+00) Acc@1 0.00 ( 0.21) Acc@5 0.39 ( 0.94) +Epoch: [0][ 878/5004] Time 0.243 ( 0.241) Data 0.025 ( 0.028) Loss 6.6488e+00 (6.8891e+00) Acc@1 0.39 ( 0.21) Acc@5 2.34 ( 0.94) +Epoch: [0][ 879/5004] Time 0.239 ( 0.241) Data 0.025 ( 0.028) Loss 6.6979e+00 (6.8889e+00) Acc@1 0.00 ( 0.21) Acc@5 0.39 ( 0.94) +Epoch: [0][ 880/5004] Time 0.239 ( 0.241) Data 0.025 ( 0.028) Loss 6.6550e+00 (6.8886e+00) Acc@1 0.78 ( 0.21) Acc@5 1.95 ( 0.94) +Epoch: [0][ 881/5004] Time 0.243 ( 0.241) Data 0.026 ( 0.028) Loss 6.7443e+00 (6.8884e+00) Acc@1 0.00 ( 0.21) Acc@5 1.17 ( 0.94) +Epoch: [0][ 882/5004] Time 0.237 ( 0.241) Data 0.024 ( 0.028) Loss 6.6944e+00 (6.8882e+00) Acc@1 0.00 ( 0.21) Acc@5 2.73 ( 0.95) +Epoch: [0][ 883/5004] Time 0.239 ( 0.241) Data 0.026 ( 0.028) Loss 6.7200e+00 (6.8880e+00) Acc@1 0.00 ( 0.21) Acc@5 1.17 ( 0.95) +Epoch: [0][ 884/5004] Time 0.240 ( 0.241) Data 0.026 ( 0.028) Loss 6.7008e+00 (6.8878e+00) Acc@1 0.00 ( 0.21) Acc@5 0.78 ( 0.95) +Epoch: [0][ 885/5004] Time 0.243 ( 0.241) Data 0.026 ( 0.028) Loss 6.5638e+00 (6.8874e+00) Acc@1 0.78 ( 0.21) Acc@5 2.34 ( 0.95) +Epoch: [0][ 886/5004] Time 0.243 ( 0.241) Data 0.026 ( 0.028) Loss 6.6463e+00 (6.8872e+00) Acc@1 0.39 ( 0.21) Acc@5 3.52 ( 0.95) +Epoch: [0][ 887/5004] Time 0.238 ( 0.241) Data 0.024 ( 0.028) Loss 6.6758e+00 (6.8869e+00) Acc@1 0.00 ( 0.21) Acc@5 2.34 ( 0.95) +Epoch: [0][ 888/5004] Time 0.240 ( 0.241) Data 0.026 ( 0.028) Loss 6.6639e+00 (6.8867e+00) Acc@1 0.39 ( 
0.21) Acc@5 1.95 ( 0.95) +Epoch: [0][ 889/5004] Time 0.240 ( 0.241) Data 0.025 ( 0.028) Loss 6.6646e+00 (6.8864e+00) Acc@1 0.78 ( 0.21) Acc@5 1.95 ( 0.95) +Epoch: [0][ 890/5004] Time 0.240 ( 0.241) Data 0.026 ( 0.028) Loss 6.6675e+00 (6.8862e+00) Acc@1 0.78 ( 0.21) Acc@5 1.56 ( 0.96) +Epoch: [0][ 891/5004] Time 0.241 ( 0.241) Data 0.026 ( 0.028) Loss 6.6438e+00 (6.8859e+00) Acc@1 0.39 ( 0.21) Acc@5 2.34 ( 0.96) +Epoch: [0][ 892/5004] Time 0.238 ( 0.241) Data 0.025 ( 0.028) Loss 6.6732e+00 (6.8857e+00) Acc@1 0.39 ( 0.21) Acc@5 1.95 ( 0.96) +Epoch: [0][ 893/5004] Time 0.239 ( 0.241) Data 0.026 ( 0.028) Loss 6.6758e+00 (6.8854e+00) Acc@1 0.78 ( 0.21) Acc@5 1.56 ( 0.96) +Epoch: [0][ 894/5004] Time 0.243 ( 0.241) Data 0.026 ( 0.028) Loss 6.6820e+00 (6.8852e+00) Acc@1 0.39 ( 0.21) Acc@5 2.34 ( 0.96) +Epoch: [0][ 895/5004] Time 0.241 ( 0.241) Data 0.026 ( 0.028) Loss 6.7055e+00 (6.8850e+00) Acc@1 0.39 ( 0.21) Acc@5 3.52 ( 0.96) +Epoch: [0][ 896/5004] Time 0.233 ( 0.241) Data 0.024 ( 0.028) Loss 6.7264e+00 (6.8848e+00) Acc@1 0.39 ( 0.21) Acc@5 0.78 ( 0.96) +Epoch: [0][ 897/5004] Time 0.240 ( 0.241) Data 0.029 ( 0.028) Loss 6.6462e+00 (6.8846e+00) Acc@1 0.39 ( 0.21) Acc@5 2.73 ( 0.96) +Epoch: [0][ 898/5004] Time 0.236 ( 0.241) Data 0.028 ( 0.028) Loss 6.6903e+00 (6.8843e+00) Acc@1 0.00 ( 0.21) Acc@5 2.34 ( 0.97) +Epoch: [0][ 899/5004] Time 0.239 ( 0.241) Data 0.030 ( 0.028) Loss 6.6924e+00 (6.8841e+00) Acc@1 0.39 ( 0.21) Acc@5 2.34 ( 0.97) +Epoch: [0][ 900/5004] Time 0.240 ( 0.241) Data 0.030 ( 0.028) Loss 6.6891e+00 (6.8839e+00) Acc@1 0.78 ( 0.22) Acc@5 2.73 ( 0.97) +Epoch: [0][ 901/5004] Time 0.241 ( 0.241) Data 0.030 ( 0.028) Loss 6.6558e+00 (6.8837e+00) Acc@1 0.39 ( 0.22) Acc@5 0.78 ( 0.97) +Epoch: [0][ 902/5004] Time 0.240 ( 0.241) Data 0.027 ( 0.028) Loss 6.6783e+00 (6.8834e+00) Acc@1 0.39 ( 0.22) Acc@5 1.17 ( 0.97) +Epoch: [0][ 903/5004] Time 0.238 ( 0.241) Data 0.030 ( 0.028) Loss 6.6434e+00 (6.8832e+00) Acc@1 1.56 ( 0.22) Acc@5 2.73 ( 0.97) +Epoch: [0][ 904/5004] 
Time 0.237 ( 0.241) Data 0.029 ( 0.028) Loss 6.6480e+00 (6.8829e+00) Acc@1 0.78 ( 0.22) Acc@5 1.95 ( 0.97) +Epoch: [0][ 905/5004] Time 0.240 ( 0.241) Data 0.030 ( 0.028) Loss 6.6923e+00 (6.8827e+00) Acc@1 0.00 ( 0.22) Acc@5 2.34 ( 0.97) +Epoch: [0][ 906/5004] Time 0.238 ( 0.241) Data 0.030 ( 0.028) Loss 6.5919e+00 (6.8824e+00) Acc@1 0.78 ( 0.22) Acc@5 1.56 ( 0.98) +Epoch: [0][ 907/5004] Time 0.241 ( 0.241) Data 0.030 ( 0.028) Loss 6.6339e+00 (6.8821e+00) Acc@1 0.00 ( 0.22) Acc@5 1.17 ( 0.98) +Epoch: [0][ 908/5004] Time 0.242 ( 0.241) Data 0.027 ( 0.028) Loss 6.6126e+00 (6.8818e+00) Acc@1 0.00 ( 0.22) Acc@5 0.39 ( 0.97) +Epoch: [0][ 909/5004] Time 0.234 ( 0.241) Data 0.025 ( 0.028) Loss 6.6787e+00 (6.8816e+00) Acc@1 1.17 ( 0.22) Acc@5 2.34 ( 0.98) +Epoch: [0][ 910/5004] Time 0.237 ( 0.241) Data 0.029 ( 0.028) Loss 6.6766e+00 (6.8814e+00) Acc@1 0.78 ( 0.22) Acc@5 1.17 ( 0.98) +Epoch: [0][ 911/5004] Time 0.239 ( 0.241) Data 0.030 ( 0.028) Loss 6.6787e+00 (6.8811e+00) Acc@1 0.39 ( 0.22) Acc@5 1.56 ( 0.98) +Epoch: [0][ 912/5004] Time 0.238 ( 0.241) Data 0.029 ( 0.028) Loss 6.7137e+00 (6.8810e+00) Acc@1 0.00 ( 0.22) Acc@5 1.56 ( 0.98) +Epoch: [0][ 913/5004] Time 0.239 ( 0.241) Data 0.030 ( 0.028) Loss 6.6569e+00 (6.8807e+00) Acc@1 0.39 ( 0.22) Acc@5 1.95 ( 0.98) +Epoch: [0][ 914/5004] Time 0.242 ( 0.241) Data 0.030 ( 0.028) Loss 6.7052e+00 (6.8805e+00) Acc@1 0.00 ( 0.22) Acc@5 2.34 ( 0.98) +Epoch: [0][ 915/5004] Time 0.239 ( 0.241) Data 0.026 ( 0.028) Loss 6.6411e+00 (6.8803e+00) Acc@1 0.78 ( 0.22) Acc@5 2.34 ( 0.98) +Epoch: [0][ 916/5004] Time 0.238 ( 0.241) Data 0.025 ( 0.028) Loss 6.6620e+00 (6.8800e+00) Acc@1 0.00 ( 0.22) Acc@5 1.56 ( 0.98) +Epoch: [0][ 917/5004] Time 0.244 ( 0.241) Data 0.026 ( 0.028) Loss 6.6799e+00 (6.8798e+00) Acc@1 0.78 ( 0.22) Acc@5 3.12 ( 0.98) +Epoch: [0][ 918/5004] Time 0.241 ( 0.241) Data 0.024 ( 0.028) Loss 6.6059e+00 (6.8795e+00) Acc@1 0.39 ( 0.22) Acc@5 2.73 ( 0.99) +Epoch: [0][ 919/5004] Time 0.242 ( 0.241) Data 0.025 ( 0.028) Loss 
6.6981e+00 (6.8793e+00) Acc@1 1.17 ( 0.22) Acc@5 1.56 ( 0.99) +Epoch: [0][ 920/5004] Time 0.243 ( 0.241) Data 0.023 ( 0.028) Loss 6.6304e+00 (6.8790e+00) Acc@1 0.39 ( 0.22) Acc@5 2.34 ( 0.99) +Epoch: [0][ 921/5004] Time 0.236 ( 0.241) Data 0.020 ( 0.028) Loss 6.6259e+00 (6.8788e+00) Acc@1 1.17 ( 0.22) Acc@5 1.56 ( 0.99) +Epoch: [0][ 922/5004] Time 0.241 ( 0.241) Data 0.025 ( 0.028) Loss 6.6601e+00 (6.8785e+00) Acc@1 0.39 ( 0.22) Acc@5 1.17 ( 0.99) +Epoch: [0][ 923/5004] Time 0.242 ( 0.241) Data 0.024 ( 0.028) Loss 6.6727e+00 (6.8783e+00) Acc@1 0.00 ( 0.22) Acc@5 0.00 ( 0.99) +Epoch: [0][ 924/5004] Time 0.240 ( 0.241) Data 0.024 ( 0.028) Loss 6.7490e+00 (6.8782e+00) Acc@1 0.00 ( 0.22) Acc@5 1.56 ( 0.99) +Epoch: [0][ 925/5004] Time 0.238 ( 0.241) Data 0.025 ( 0.028) Loss 6.6613e+00 (6.8779e+00) Acc@1 0.39 ( 0.22) Acc@5 2.73 ( 0.99) +Epoch: [0][ 926/5004] Time 0.243 ( 0.241) Data 0.026 ( 0.028) Loss 6.7060e+00 (6.8777e+00) Acc@1 0.39 ( 0.22) Acc@5 1.56 ( 0.99) +Epoch: [0][ 927/5004] Time 0.236 ( 0.241) Data 0.022 ( 0.028) Loss 6.6796e+00 (6.8775e+00) Acc@1 0.39 ( 0.22) Acc@5 1.56 ( 0.99) +Epoch: [0][ 928/5004] Time 0.242 ( 0.241) Data 0.026 ( 0.028) Loss 6.6344e+00 (6.8773e+00) Acc@1 0.00 ( 0.22) Acc@5 1.56 ( 0.99) +Epoch: [0][ 929/5004] Time 0.240 ( 0.241) Data 0.025 ( 0.028) Loss 6.6731e+00 (6.8770e+00) Acc@1 0.00 ( 0.22) Acc@5 1.17 ( 0.99) +Epoch: [0][ 930/5004] Time 0.248 ( 0.241) Data 0.025 ( 0.028) Loss 6.6377e+00 (6.8768e+00) Acc@1 0.39 ( 0.22) Acc@5 3.12 ( 1.00) +Epoch: [0][ 931/5004] Time 0.238 ( 0.241) Data 0.019 ( 0.028) Loss 6.6461e+00 (6.8765e+00) Acc@1 0.00 ( 0.22) Acc@5 1.56 ( 1.00) +Epoch: [0][ 932/5004] Time 0.239 ( 0.241) Data 0.025 ( 0.028) Loss 6.6613e+00 (6.8763e+00) Acc@1 0.00 ( 0.22) Acc@5 1.56 ( 1.00) +Epoch: [0][ 933/5004] Time 0.242 ( 0.241) Data 0.025 ( 0.028) Loss 6.6293e+00 (6.8760e+00) Acc@1 0.78 ( 0.22) Acc@5 2.34 ( 1.00) +Epoch: [0][ 934/5004] Time 0.243 ( 0.241) Data 0.024 ( 0.028) Loss 6.5619e+00 (6.8757e+00) Acc@1 0.78 ( 0.22) Acc@5 
2.34 ( 1.00)
+Epoch: [0][ 935/5004] Time 0.240 ( 0.241) Data 0.022 ( 0.028) Loss 6.6555e+00 (6.8755e+00) Acc@1 0.00 ( 0.22) Acc@5 1.17 ( 1.00)
+Epoch: [0][ 936/5004] Time 0.240 ( 0.241) Data 0.023 ( 0.028) Loss 6.7021e+00 (6.8753e+00) Acc@1 0.00 ( 0.22) Acc@5 1.17 ( 1.00)
+Epoch: [0][ 937/5004] Time 0.240 ( 0.241) Data 0.026 ( 0.028) Loss 6.6675e+00 (6.8751e+00) Acc@1 0.78 ( 0.22) Acc@5 1.17 ( 1.00)
+Epoch: [0][ 938/5004] Time 0.244 ( 0.241) Data 0.026 ( 0.028) Loss 6.6884e+00 (6.8749e+00) Acc@1 1.17 ( 0.22) Acc@5 1.95 ( 1.00)
+Epoch: [0][ 939/5004] Time 0.235 ( 0.241) Data 0.022 ( 0.028) Loss 6.6423e+00 (6.8746e+00) Acc@1 0.78 ( 0.23) Acc@5 3.91 ( 1.00)
+Epoch: [0][ 940/5004] Time 0.239 ( 0.241) Data 0.026 ( 0.028) Loss 6.6171e+00 (6.8744e+00) Acc@1 0.78 ( 0.23) Acc@5 1.95 ( 1.00)
+[... iterations 941-1220 elided, same per-iteration format; running averages move from Loss 6.8744, Acc@1 0.23, Acc@5 1.00 to Loss 6.8026, Acc@1 0.31, Acc@5 1.37 ...]
+Epoch: [0][1221/5004] Time 0.235 ( 0.240) Data 0.025 ( 0.028) Loss 6.3933e+00 (6.8023e+00) Acc@1 1.17 ( 0.31) Acc@5 5.08 ( 1.38)
+Epoch: [0][1222/5004] Time 0.240 ( 0.240) Data 0.029 ( 0.028) Loss 6.4377e+00 (6.8020e+00) Acc@1 1.95 ( 0.32) Acc@5 3.12 ( 1.38)
+Epoch: [0][1223/5004] Time 0.238 ( 0.240) Data 0.028 ( 0.028) Loss 6.5765e+00 (6.8018e+00) Acc@1 0.00 ( 0.32) Acc@5 1.95 ( 1.38)
+Epoch: [0][1224/5004] Time 0.240 ( 0.240) Data 0.028 ( 0.028) Loss 6.4773e+00 (6.8015e+00) Acc@1 0.00 ( 0.32) Acc@5 3.91 ( 1.38)
+Epoch: [0][1225/5004] Time 0.236 ( 0.240) Data 0.027 ( 0.028) Loss 6.4497e+00 (6.8013e+00) Acc@1 0.39 ( 0.32) Acc@5 1.95 ( 1.38)
+Epoch: [0][1226/5004] Time 0.240 ( 0.240) Data 0.028 ( 0.028) Loss 6.4928e+00 (6.8010e+00) Acc@1 0.78 ( 
0.32) Acc@5 3.91 ( 1.38) +Epoch: [0][1227/5004] Time 0.237 ( 0.240) Data 0.027 ( 0.028) Loss 6.3957e+00 (6.8007e+00) Acc@1 1.17 ( 0.32) Acc@5 3.91 ( 1.38) +Epoch: [0][1228/5004] Time 0.240 ( 0.240) Data 0.028 ( 0.028) Loss 6.4891e+00 (6.8004e+00) Acc@1 1.56 ( 0.32) Acc@5 3.91 ( 1.39) +Epoch: [0][1229/5004] Time 0.236 ( 0.240) Data 0.027 ( 0.028) Loss 6.4502e+00 (6.8001e+00) Acc@1 0.78 ( 0.32) Acc@5 3.52 ( 1.39) +Epoch: [0][1230/5004] Time 0.241 ( 0.240) Data 0.029 ( 0.028) Loss 6.5640e+00 (6.7999e+00) Acc@1 1.17 ( 0.32) Acc@5 3.91 ( 1.39) +Epoch: [0][1231/5004] Time 0.237 ( 0.240) Data 0.027 ( 0.028) Loss 6.3804e+00 (6.7996e+00) Acc@1 0.39 ( 0.32) Acc@5 3.52 ( 1.39) +Epoch: [0][1232/5004] Time 0.237 ( 0.240) Data 0.028 ( 0.028) Loss 6.5117e+00 (6.7994e+00) Acc@1 0.00 ( 0.32) Acc@5 1.95 ( 1.39) +Epoch: [0][1233/5004] Time 0.240 ( 0.240) Data 0.029 ( 0.028) Loss 6.4300e+00 (6.7991e+00) Acc@1 1.17 ( 0.32) Acc@5 2.34 ( 1.39) +Epoch: [0][1234/5004] Time 0.240 ( 0.240) Data 0.028 ( 0.028) Loss 6.4365e+00 (6.7988e+00) Acc@1 1.95 ( 0.32) Acc@5 5.08 ( 1.40) +Epoch: [0][1235/5004] Time 0.238 ( 0.240) Data 0.027 ( 0.028) Loss 6.5048e+00 (6.7985e+00) Acc@1 0.39 ( 0.32) Acc@5 3.12 ( 1.40) +Epoch: [0][1236/5004] Time 0.237 ( 0.240) Data 0.028 ( 0.028) Loss 6.5424e+00 (6.7983e+00) Acc@1 0.00 ( 0.32) Acc@5 0.39 ( 1.40) +Epoch: [0][1237/5004] Time 0.242 ( 0.240) Data 0.029 ( 0.028) Loss 6.4618e+00 (6.7981e+00) Acc@1 1.95 ( 0.32) Acc@5 5.86 ( 1.40) +Epoch: [0][1238/5004] Time 0.235 ( 0.240) Data 0.026 ( 0.028) Loss 6.3403e+00 (6.7977e+00) Acc@1 1.17 ( 0.32) Acc@5 5.08 ( 1.40) +Epoch: [0][1239/5004] Time 0.240 ( 0.240) Data 0.030 ( 0.028) Loss 6.4037e+00 (6.7974e+00) Acc@1 0.39 ( 0.32) Acc@5 2.73 ( 1.40) +Epoch: [0][1240/5004] Time 0.236 ( 0.240) Data 0.028 ( 0.028) Loss 6.4161e+00 (6.7971e+00) Acc@1 0.78 ( 0.32) Acc@5 2.34 ( 1.40) +Epoch: [0][1241/5004] Time 0.241 ( 0.240) Data 0.030 ( 0.028) Loss 6.3947e+00 (6.7967e+00) Acc@1 0.39 ( 0.32) Acc@5 4.30 ( 1.41) +Epoch: [0][1242/5004] 
Time 0.237 ( 0.240) Data 0.028 ( 0.028) Loss 6.4388e+00 (6.7965e+00) Acc@1 1.17 ( 0.32) Acc@5 2.73 ( 1.41) +Epoch: [0][1243/5004] Time 0.239 ( 0.240) Data 0.029 ( 0.028) Loss 6.4944e+00 (6.7962e+00) Acc@1 0.78 ( 0.32) Acc@5 1.56 ( 1.41) +Epoch: [0][1244/5004] Time 0.226 ( 0.240) Data 0.029 ( 0.028) Loss 6.4316e+00 (6.7959e+00) Acc@1 1.17 ( 0.32) Acc@5 4.30 ( 1.41) +Epoch: [0][1245/5004] Time 0.229 ( 0.240) Data 0.044 ( 0.028) Loss 6.4433e+00 (6.7956e+00) Acc@1 0.78 ( 0.32) Acc@5 2.34 ( 1.41) +Epoch: [0][1246/5004] Time 0.242 ( 0.240) Data 0.053 ( 0.028) Loss 6.4282e+00 (6.7953e+00) Acc@1 0.39 ( 0.32) Acc@5 1.95 ( 1.41) +Epoch: [0][1247/5004] Time 0.238 ( 0.240) Data 0.050 ( 0.028) Loss 6.5178e+00 (6.7951e+00) Acc@1 0.00 ( 0.32) Acc@5 2.73 ( 1.41) +Epoch: [0][1248/5004] Time 0.235 ( 0.240) Data 0.050 ( 0.028) Loss 6.5146e+00 (6.7949e+00) Acc@1 0.00 ( 0.32) Acc@5 3.52 ( 1.41) +Epoch: [0][1249/5004] Time 0.241 ( 0.240) Data 0.053 ( 0.028) Loss 6.4876e+00 (6.7946e+00) Acc@1 1.17 ( 0.32) Acc@5 5.08 ( 1.42) +Epoch: [0][1250/5004] Time 0.242 ( 0.240) Data 0.050 ( 0.028) Loss 6.4250e+00 (6.7943e+00) Acc@1 1.56 ( 0.33) Acc@5 4.30 ( 1.42) +Epoch: [0][1251/5004] Time 0.236 ( 0.240) Data 0.046 ( 0.028) Loss 6.4470e+00 (6.7941e+00) Acc@1 1.17 ( 0.33) Acc@5 4.30 ( 1.42) +Epoch: [0][1252/5004] Time 0.237 ( 0.240) Data 0.047 ( 0.028) Loss 6.4201e+00 (6.7938e+00) Acc@1 0.39 ( 0.33) Acc@5 2.34 ( 1.42) +Epoch: [0][1253/5004] Time 0.230 ( 0.240) Data 0.049 ( 0.028) Loss 6.4905e+00 (6.7935e+00) Acc@1 0.78 ( 0.33) Acc@5 2.73 ( 1.42) +Epoch: [0][1254/5004] Time 0.243 ( 0.240) Data 0.057 ( 0.028) Loss 6.3618e+00 (6.7932e+00) Acc@1 0.00 ( 0.33) Acc@5 2.73 ( 1.42) +Epoch: [0][1255/5004] Time 0.237 ( 0.240) Data 0.053 ( 0.028) Loss 6.4406e+00 (6.7929e+00) Acc@1 0.39 ( 0.33) Acc@5 2.34 ( 1.43) +Epoch: [0][1256/5004] Time 0.242 ( 0.240) Data 0.054 ( 0.028) Loss 6.4198e+00 (6.7926e+00) Acc@1 0.78 ( 0.33) Acc@5 3.12 ( 1.43) +Epoch: [0][1257/5004] Time 0.237 ( 0.240) Data 0.051 ( 0.028) Loss 
6.4886e+00 (6.7924e+00) Acc@1 1.17 ( 0.33) Acc@5 3.52 ( 1.43) +Epoch: [0][1258/5004] Time 0.238 ( 0.240) Data 0.051 ( 0.028) Loss 6.3112e+00 (6.7920e+00) Acc@1 1.17 ( 0.33) Acc@5 4.30 ( 1.43) +Epoch: [0][1259/5004] Time 0.230 ( 0.240) Data 0.051 ( 0.028) Loss 6.5404e+00 (6.7918e+00) Acc@1 0.39 ( 0.33) Acc@5 1.95 ( 1.43) +Epoch: [0][1260/5004] Time 0.248 ( 0.240) Data 0.058 ( 0.028) Loss 6.5018e+00 (6.7916e+00) Acc@1 0.78 ( 0.33) Acc@5 3.52 ( 1.43) +Epoch: [0][1261/5004] Time 0.233 ( 0.240) Data 0.049 ( 0.028) Loss 6.4281e+00 (6.7913e+00) Acc@1 0.39 ( 0.33) Acc@5 3.52 ( 1.43) +Epoch: [0][1262/5004] Time 0.238 ( 0.240) Data 0.054 ( 0.028) Loss 6.4048e+00 (6.7910e+00) Acc@1 0.78 ( 0.33) Acc@5 3.12 ( 1.44) +Epoch: [0][1263/5004] Time 0.238 ( 0.240) Data 0.053 ( 0.028) Loss 6.5059e+00 (6.7907e+00) Acc@1 0.00 ( 0.33) Acc@5 3.12 ( 1.44) +Epoch: [0][1264/5004] Time 0.240 ( 0.240) Data 0.054 ( 0.028) Loss 6.3292e+00 (6.7904e+00) Acc@1 0.78 ( 0.33) Acc@5 4.30 ( 1.44) +Epoch: [0][1265/5004] Time 0.235 ( 0.240) Data 0.051 ( 0.028) Loss 6.3653e+00 (6.7900e+00) Acc@1 0.78 ( 0.33) Acc@5 3.91 ( 1.44) +Epoch: [0][1266/5004] Time 0.235 ( 0.240) Data 0.054 ( 0.028) Loss 6.3779e+00 (6.7897e+00) Acc@1 1.56 ( 0.33) Acc@5 3.91 ( 1.44) +Epoch: [0][1267/5004] Time 0.242 ( 0.240) Data 0.057 ( 0.028) Loss 6.5100e+00 (6.7895e+00) Acc@1 0.78 ( 0.33) Acc@5 1.95 ( 1.44) +Epoch: [0][1268/5004] Time 0.240 ( 0.240) Data 0.053 ( 0.028) Loss 6.4875e+00 (6.7893e+00) Acc@1 0.39 ( 0.33) Acc@5 2.73 ( 1.44) +Epoch: [0][1269/5004] Time 0.236 ( 0.240) Data 0.051 ( 0.028) Loss 6.4479e+00 (6.7890e+00) Acc@1 0.39 ( 0.33) Acc@5 1.56 ( 1.45) +Epoch: [0][1270/5004] Time 0.243 ( 0.240) Data 0.053 ( 0.028) Loss 6.3697e+00 (6.7887e+00) Acc@1 1.95 ( 0.33) Acc@5 5.08 ( 1.45) +Epoch: [0][1271/5004] Time 0.239 ( 0.240) Data 0.047 ( 0.028) Loss 6.4246e+00 (6.7884e+00) Acc@1 0.78 ( 0.33) Acc@5 4.30 ( 1.45) +Epoch: [0][1272/5004] Time 0.229 ( 0.240) Data 0.047 ( 0.028) Loss 6.5503e+00 (6.7882e+00) Acc@1 0.00 ( 0.33) Acc@5 
3.52 ( 1.45) +Epoch: [0][1273/5004] Time 0.244 ( 0.240) Data 0.057 ( 0.028) Loss 6.3958e+00 (6.7879e+00) Acc@1 0.39 ( 0.33) Acc@5 2.73 ( 1.45) +Epoch: [0][1274/5004] Time 0.241 ( 0.240) Data 0.051 ( 0.028) Loss 6.4417e+00 (6.7876e+00) Acc@1 0.78 ( 0.33) Acc@5 4.69 ( 1.46) +Epoch: [0][1275/5004] Time 0.241 ( 0.240) Data 0.053 ( 0.028) Loss 6.4559e+00 (6.7873e+00) Acc@1 0.78 ( 0.33) Acc@5 3.52 ( 1.46) +Epoch: [0][1276/5004] Time 0.234 ( 0.240) Data 0.049 ( 0.028) Loss 6.5690e+00 (6.7872e+00) Acc@1 1.56 ( 0.33) Acc@5 2.73 ( 1.46) +Epoch: [0][1277/5004] Time 0.243 ( 0.240) Data 0.058 ( 0.028) Loss 6.5855e+00 (6.7870e+00) Acc@1 0.78 ( 0.33) Acc@5 2.73 ( 1.46) +Epoch: [0][1278/5004] Time 0.238 ( 0.240) Data 0.053 ( 0.028) Loss 6.5444e+00 (6.7868e+00) Acc@1 0.39 ( 0.33) Acc@5 1.56 ( 1.46) +Epoch: [0][1279/5004] Time 0.244 ( 0.240) Data 0.056 ( 0.029) Loss 6.3557e+00 (6.7865e+00) Acc@1 0.78 ( 0.34) Acc@5 4.30 ( 1.46) +Epoch: [0][1280/5004] Time 0.241 ( 0.240) Data 0.051 ( 0.029) Loss 6.3755e+00 (6.7862e+00) Acc@1 0.78 ( 0.34) Acc@5 7.03 ( 1.47) +Epoch: [0][1281/5004] Time 0.237 ( 0.240) Data 0.048 ( 0.029) Loss 6.3811e+00 (6.7858e+00) Acc@1 1.17 ( 0.34) Acc@5 5.08 ( 1.47) +Epoch: [0][1282/5004] Time 0.237 ( 0.240) Data 0.050 ( 0.029) Loss 6.4712e+00 (6.7856e+00) Acc@1 0.00 ( 0.34) Acc@5 1.95 ( 1.47) +Epoch: [0][1283/5004] Time 0.238 ( 0.240) Data 0.051 ( 0.029) Loss 6.5466e+00 (6.7854e+00) Acc@1 0.00 ( 0.34) Acc@5 1.95 ( 1.47) +Epoch: [0][1284/5004] Time 0.239 ( 0.240) Data 0.053 ( 0.029) Loss 6.4431e+00 (6.7852e+00) Acc@1 0.00 ( 0.34) Acc@5 3.91 ( 1.47) +Epoch: [0][1285/5004] Time 0.236 ( 0.240) Data 0.052 ( 0.029) Loss 6.4809e+00 (6.7849e+00) Acc@1 0.39 ( 0.34) Acc@5 3.12 ( 1.47) +Epoch: [0][1286/5004] Time 0.241 ( 0.240) Data 0.055 ( 0.029) Loss 6.5216e+00 (6.7847e+00) Acc@1 0.00 ( 0.34) Acc@5 3.91 ( 1.47) +Epoch: [0][1287/5004] Time 0.239 ( 0.240) Data 0.055 ( 0.029) Loss 6.4891e+00 (6.7845e+00) Acc@1 0.00 ( 0.33) Acc@5 0.78 ( 1.47) +Epoch: [0][1288/5004] Time 0.241 ( 
0.240) Data 0.054 ( 0.029) Loss 6.3798e+00 (6.7842e+00) Acc@1 2.34 ( 0.34) Acc@5 4.69 ( 1.48) +Epoch: [0][1289/5004] Time 0.239 ( 0.240) Data 0.053 ( 0.029) Loss 6.5055e+00 (6.7839e+00) Acc@1 1.17 ( 0.34) Acc@5 3.91 ( 1.48) +Epoch: [0][1290/5004] Time 0.236 ( 0.240) Data 0.052 ( 0.029) Loss 6.4041e+00 (6.7837e+00) Acc@1 0.78 ( 0.34) Acc@5 2.73 ( 1.48) +Epoch: [0][1291/5004] Time 0.238 ( 0.240) Data 0.054 ( 0.029) Loss 6.4691e+00 (6.7834e+00) Acc@1 1.17 ( 0.34) Acc@5 3.52 ( 1.48) +Epoch: [0][1292/5004] Time 0.239 ( 0.240) Data 0.053 ( 0.029) Loss 6.5379e+00 (6.7832e+00) Acc@1 0.39 ( 0.34) Acc@5 3.12 ( 1.48) +Epoch: [0][1293/5004] Time 0.236 ( 0.240) Data 0.054 ( 0.029) Loss 6.4660e+00 (6.7830e+00) Acc@1 0.00 ( 0.34) Acc@5 2.34 ( 1.48) +Epoch: [0][1294/5004] Time 0.249 ( 0.240) Data 0.055 ( 0.029) Loss 6.4280e+00 (6.7827e+00) Acc@1 1.56 ( 0.34) Acc@5 3.91 ( 1.48) +Epoch: [0][1295/5004] Time 0.229 ( 0.240) Data 0.044 ( 0.029) Loss 6.4608e+00 (6.7825e+00) Acc@1 0.78 ( 0.34) Acc@5 3.52 ( 1.49) +Epoch: [0][1296/5004] Time 0.237 ( 0.240) Data 0.052 ( 0.029) Loss 6.4110e+00 (6.7822e+00) Acc@1 0.39 ( 0.34) Acc@5 5.86 ( 1.49) +Epoch: [0][1297/5004] Time 0.243 ( 0.240) Data 0.053 ( 0.029) Loss 6.5028e+00 (6.7820e+00) Acc@1 1.56 ( 0.34) Acc@5 5.08 ( 1.49) +Epoch: [0][1298/5004] Time 0.237 ( 0.240) Data 0.052 ( 0.029) Loss 6.4340e+00 (6.7817e+00) Acc@1 0.00 ( 0.34) Acc@5 2.34 ( 1.49) +Epoch: [0][1299/5004] Time 0.239 ( 0.240) Data 0.053 ( 0.029) Loss 6.4096e+00 (6.7814e+00) Acc@1 0.39 ( 0.34) Acc@5 2.73 ( 1.49) +Epoch: [0][1300/5004] Time 0.237 ( 0.240) Data 0.052 ( 0.029) Loss 6.3859e+00 (6.7811e+00) Acc@1 1.17 ( 0.34) Acc@5 3.52 ( 1.50) +Epoch: [0][1301/5004] Time 0.238 ( 0.240) Data 0.053 ( 0.029) Loss 6.3549e+00 (6.7808e+00) Acc@1 1.17 ( 0.34) Acc@5 3.91 ( 1.50) +Epoch: [0][1302/5004] Time 0.235 ( 0.240) Data 0.054 ( 0.029) Loss 6.3428e+00 (6.7804e+00) Acc@1 1.95 ( 0.34) Acc@5 5.86 ( 1.50) +Epoch: [0][1303/5004] Time 0.240 ( 0.240) Data 0.057 ( 0.029) Loss 6.3967e+00 
(6.7801e+00) Acc@1 1.17 ( 0.34) Acc@5 4.30 ( 1.50) +Epoch: [0][1304/5004] Time 0.237 ( 0.240) Data 0.054 ( 0.029) Loss 6.4403e+00 (6.7799e+00) Acc@1 0.39 ( 0.34) Acc@5 1.17 ( 1.50) +Epoch: [0][1305/5004] Time 0.241 ( 0.240) Data 0.053 ( 0.029) Loss 6.4673e+00 (6.7796e+00) Acc@1 0.00 ( 0.34) Acc@5 2.73 ( 1.50) +Epoch: [0][1306/5004] Time 0.237 ( 0.240) Data 0.051 ( 0.029) Loss 6.2918e+00 (6.7793e+00) Acc@1 0.78 ( 0.34) Acc@5 4.69 ( 1.51) +Epoch: [0][1307/5004] Time 0.233 ( 0.240) Data 0.052 ( 0.029) Loss 6.5052e+00 (6.7791e+00) Acc@1 0.00 ( 0.34) Acc@5 2.34 ( 1.51) +Epoch: [0][1308/5004] Time 0.266 ( 0.240) Data 0.057 ( 0.029) Loss 6.3467e+00 (6.7787e+00) Acc@1 0.00 ( 0.34) Acc@5 1.95 ( 1.51) +Epoch: [0][1309/5004] Time 0.244 ( 0.240) Data 0.029 ( 0.029) Loss 6.4065e+00 (6.7784e+00) Acc@1 0.78 ( 0.34) Acc@5 6.64 ( 1.51) +Epoch: [0][1310/5004] Time 0.241 ( 0.240) Data 0.024 ( 0.029) Loss 6.4357e+00 (6.7782e+00) Acc@1 1.56 ( 0.34) Acc@5 4.69 ( 1.51) +Epoch: [0][1311/5004] Time 0.231 ( 0.240) Data 0.023 ( 0.029) Loss 6.3781e+00 (6.7779e+00) Acc@1 0.78 ( 0.34) Acc@5 3.52 ( 1.51) +Epoch: [0][1312/5004] Time 0.239 ( 0.240) Data 0.030 ( 0.029) Loss 6.3506e+00 (6.7775e+00) Acc@1 1.17 ( 0.34) Acc@5 5.08 ( 1.52) +Epoch: [0][1313/5004] Time 0.240 ( 0.240) Data 0.028 ( 0.029) Loss 6.4241e+00 (6.7773e+00) Acc@1 0.78 ( 0.35) Acc@5 1.95 ( 1.52) +Epoch: [0][1314/5004] Time 0.235 ( 0.240) Data 0.026 ( 0.029) Loss 6.4375e+00 (6.7770e+00) Acc@1 0.39 ( 0.35) Acc@5 3.52 ( 1.52) +Epoch: [0][1315/5004] Time 0.237 ( 0.240) Data 0.029 ( 0.029) Loss 6.3504e+00 (6.7767e+00) Acc@1 1.95 ( 0.35) Acc@5 3.91 ( 1.52) +Epoch: [0][1316/5004] Time 0.241 ( 0.240) Data 0.031 ( 0.029) Loss 6.4617e+00 (6.7765e+00) Acc@1 0.39 ( 0.35) Acc@5 2.34 ( 1.52) +Epoch: [0][1317/5004] Time 0.242 ( 0.240) Data 0.028 ( 0.029) Loss 6.4214e+00 (6.7762e+00) Acc@1 1.56 ( 0.35) Acc@5 3.12 ( 1.52) +Epoch: [0][1318/5004] Time 0.235 ( 0.240) Data 0.026 ( 0.029) Loss 6.3391e+00 (6.7759e+00) Acc@1 1.56 ( 0.35) Acc@5 4.30 ( 
1.52) +Epoch: [0][1319/5004] Time 0.243 ( 0.240) Data 0.028 ( 0.029) Loss 6.4163e+00 (6.7756e+00) Acc@1 0.39 ( 0.35) Acc@5 2.34 ( 1.53) +Epoch: [0][1320/5004] Time 0.237 ( 0.240) Data 0.025 ( 0.029) Loss 6.3841e+00 (6.7753e+00) Acc@1 1.56 ( 0.35) Acc@5 6.25 ( 1.53) +Epoch: [0][1321/5004] Time 0.247 ( 0.240) Data 0.028 ( 0.029) Loss 6.3572e+00 (6.7750e+00) Acc@1 1.17 ( 0.35) Acc@5 5.08 ( 1.53) +Epoch: [0][1322/5004] Time 0.233 ( 0.240) Data 0.022 ( 0.029) Loss 6.3798e+00 (6.7747e+00) Acc@1 0.00 ( 0.35) Acc@5 3.52 ( 1.53) +Epoch: [0][1323/5004] Time 0.239 ( 0.240) Data 0.027 ( 0.029) Loss 6.3804e+00 (6.7744e+00) Acc@1 0.39 ( 0.35) Acc@5 3.12 ( 1.53) +Epoch: [0][1324/5004] Time 0.241 ( 0.240) Data 0.026 ( 0.029) Loss 6.3625e+00 (6.7741e+00) Acc@1 2.34 ( 0.35) Acc@5 5.47 ( 1.54) +Epoch: [0][1325/5004] Time 0.238 ( 0.240) Data 0.024 ( 0.029) Loss 6.4310e+00 (6.7738e+00) Acc@1 0.00 ( 0.35) Acc@5 1.95 ( 1.54) +Epoch: [0][1326/5004] Time 0.238 ( 0.240) Data 0.025 ( 0.029) Loss 6.4898e+00 (6.7736e+00) Acc@1 0.39 ( 0.35) Acc@5 3.91 ( 1.54) +Epoch: [0][1327/5004] Time 0.241 ( 0.240) Data 0.025 ( 0.029) Loss 6.4075e+00 (6.7733e+00) Acc@1 0.78 ( 0.35) Acc@5 2.73 ( 1.54) +Epoch: [0][1328/5004] Time 0.239 ( 0.240) Data 0.023 ( 0.029) Loss 6.5419e+00 (6.7731e+00) Acc@1 0.78 ( 0.35) Acc@5 3.52 ( 1.54) +Epoch: [0][1329/5004] Time 0.246 ( 0.240) Data 0.025 ( 0.029) Loss 6.3940e+00 (6.7729e+00) Acc@1 1.17 ( 0.35) Acc@5 2.73 ( 1.54) +Epoch: [0][1330/5004] Time 0.242 ( 0.240) Data 0.021 ( 0.029) Loss 6.3722e+00 (6.7726e+00) Acc@1 0.78 ( 0.35) Acc@5 4.69 ( 1.55) +Epoch: [0][1331/5004] Time 0.238 ( 0.240) Data 0.023 ( 0.029) Loss 6.5138e+00 (6.7724e+00) Acc@1 1.17 ( 0.35) Acc@5 3.91 ( 1.55) +Epoch: [0][1332/5004] Time 0.238 ( 0.240) Data 0.024 ( 0.029) Loss 6.5274e+00 (6.7722e+00) Acc@1 0.39 ( 0.35) Acc@5 1.17 ( 1.55) +Epoch: [0][1333/5004] Time 0.240 ( 0.240) Data 0.024 ( 0.029) Loss 6.4615e+00 (6.7719e+00) Acc@1 1.17 ( 0.35) Acc@5 3.91 ( 1.55) +Epoch: [0][1334/5004] Time 0.241 ( 0.240) 
Data 0.025 ( 0.029) Loss 6.3853e+00 (6.7717e+00) Acc@1 0.78 ( 0.35) Acc@5 3.91 ( 1.55) +Epoch: [0][1335/5004] Time 0.248 ( 0.240) Data 0.025 ( 0.029) Loss 6.5334e+00 (6.7715e+00) Acc@1 0.78 ( 0.35) Acc@5 3.52 ( 1.55) +Epoch: [0][1336/5004] Time 0.237 ( 0.240) Data 0.018 ( 0.029) Loss 6.4043e+00 (6.7712e+00) Acc@1 0.78 ( 0.35) Acc@5 3.12 ( 1.55) +Epoch: [0][1337/5004] Time 0.246 ( 0.240) Data 0.022 ( 0.029) Loss 6.3497e+00 (6.7709e+00) Acc@1 0.78 ( 0.36) Acc@5 3.91 ( 1.55) +Epoch: [0][1338/5004] Time 0.241 ( 0.240) Data 0.021 ( 0.029) Loss 6.3474e+00 (6.7706e+00) Acc@1 0.39 ( 0.36) Acc@5 5.47 ( 1.56) +Epoch: [0][1339/5004] Time 0.247 ( 0.240) Data 0.024 ( 0.029) Loss 6.4821e+00 (6.7704e+00) Acc@1 1.56 ( 0.36) Acc@5 4.30 ( 1.56) +Epoch: [0][1340/5004] Time 0.245 ( 0.240) Data 0.023 ( 0.029) Loss 6.4784e+00 (6.7701e+00) Acc@1 0.00 ( 0.36) Acc@5 3.12 ( 1.56) +Epoch: [0][1341/5004] Time 0.244 ( 0.240) Data 0.024 ( 0.029) Loss 6.3568e+00 (6.7698e+00) Acc@1 1.17 ( 0.36) Acc@5 4.30 ( 1.56) +Epoch: [0][1342/5004] Time 0.245 ( 0.240) Data 0.024 ( 0.029) Loss 6.3914e+00 (6.7695e+00) Acc@1 1.17 ( 0.36) Acc@5 4.69 ( 1.57) +Epoch: [0][1343/5004] Time 0.244 ( 0.240) Data 0.024 ( 0.029) Loss 6.5115e+00 (6.7694e+00) Acc@1 0.39 ( 0.36) Acc@5 1.56 ( 1.57) +Epoch: [0][1344/5004] Time 0.243 ( 0.240) Data 0.024 ( 0.029) Loss 6.3340e+00 (6.7690e+00) Acc@1 0.39 ( 0.36) Acc@5 4.69 ( 1.57) +Epoch: [0][1345/5004] Time 0.249 ( 0.240) Data 0.024 ( 0.029) Loss 6.3611e+00 (6.7687e+00) Acc@1 0.78 ( 0.36) Acc@5 4.30 ( 1.57) +Epoch: [0][1346/5004] Time 0.243 ( 0.240) Data 0.024 ( 0.029) Loss 6.3682e+00 (6.7684e+00) Acc@1 1.17 ( 0.36) Acc@5 3.52 ( 1.57) +Epoch: [0][1347/5004] Time 0.247 ( 0.240) Data 0.024 ( 0.029) Loss 6.3658e+00 (6.7681e+00) Acc@1 0.39 ( 0.36) Acc@5 2.34 ( 1.57) +Epoch: [0][1348/5004] Time 0.244 ( 0.240) Data 0.024 ( 0.029) Loss 6.4214e+00 (6.7679e+00) Acc@1 0.39 ( 0.36) Acc@5 3.12 ( 1.57) +Epoch: [0][1349/5004] Time 0.244 ( 0.240) Data 0.023 ( 0.029) Loss 6.2681e+00 (6.7675e+00) 
Acc@1 1.17 ( 0.36) Acc@5 4.30 ( 1.57) +Epoch: [0][1350/5004] Time 0.244 ( 0.240) Data 0.024 ( 0.029) Loss 6.3793e+00 (6.7672e+00) Acc@1 0.39 ( 0.36) Acc@5 2.73 ( 1.58) +Epoch: [0][1351/5004] Time 0.243 ( 0.240) Data 0.024 ( 0.029) Loss 6.3695e+00 (6.7669e+00) Acc@1 0.78 ( 0.36) Acc@5 3.12 ( 1.58) +Epoch: [0][1352/5004] Time 0.247 ( 0.240) Data 0.025 ( 0.029) Loss 6.2861e+00 (6.7666e+00) Acc@1 0.78 ( 0.36) Acc@5 3.52 ( 1.58) +Epoch: [0][1353/5004] Time 0.243 ( 0.240) Data 0.024 ( 0.029) Loss 6.4391e+00 (6.7663e+00) Acc@1 0.00 ( 0.36) Acc@5 3.12 ( 1.58) +Epoch: [0][1354/5004] Time 0.238 ( 0.240) Data 0.024 ( 0.029) Loss 6.3437e+00 (6.7660e+00) Acc@1 0.00 ( 0.36) Acc@5 3.12 ( 1.58) +Epoch: [0][1355/5004] Time 0.239 ( 0.240) Data 0.026 ( 0.029) Loss 6.3736e+00 (6.7657e+00) Acc@1 1.56 ( 0.36) Acc@5 4.69 ( 1.58) +Epoch: [0][1356/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.3678e+00 (6.7654e+00) Acc@1 0.78 ( 0.36) Acc@5 5.08 ( 1.59) +Epoch: [0][1357/5004] Time 0.240 ( 0.240) Data 0.025 ( 0.029) Loss 6.4270e+00 (6.7652e+00) Acc@1 1.56 ( 0.36) Acc@5 5.08 ( 1.59) +Epoch: [0][1358/5004] Time 0.237 ( 0.240) Data 0.024 ( 0.029) Loss 6.5202e+00 (6.7650e+00) Acc@1 1.17 ( 0.36) Acc@5 2.34 ( 1.59) +Epoch: [0][1359/5004] Time 0.239 ( 0.240) Data 0.026 ( 0.029) Loss 6.4663e+00 (6.7648e+00) Acc@1 0.39 ( 0.36) Acc@5 2.73 ( 1.59) +Epoch: [0][1360/5004] Time 0.240 ( 0.240) Data 0.026 ( 0.029) Loss 6.4155e+00 (6.7645e+00) Acc@1 1.56 ( 0.36) Acc@5 3.91 ( 1.59) +Epoch: [0][1361/5004] Time 0.245 ( 0.240) Data 0.025 ( 0.029) Loss 6.3921e+00 (6.7643e+00) Acc@1 0.00 ( 0.36) Acc@5 2.34 ( 1.59) +Epoch: [0][1362/5004] Time 0.239 ( 0.240) Data 0.025 ( 0.029) Loss 6.3030e+00 (6.7639e+00) Acc@1 0.39 ( 0.36) Acc@5 2.73 ( 1.59) +Epoch: [0][1363/5004] Time 0.241 ( 0.240) Data 0.026 ( 0.029) Loss 6.3896e+00 (6.7636e+00) Acc@1 0.00 ( 0.36) Acc@5 3.91 ( 1.59) +Epoch: [0][1364/5004] Time 0.240 ( 0.240) Data 0.026 ( 0.029) Loss 6.4997e+00 (6.7634e+00) Acc@1 0.39 ( 0.36) Acc@5 1.17 ( 1.59) +Epoch: 
[0][1365/5004] Time 0.240 ( 0.240) Data 0.025 ( 0.029) Loss 6.3610e+00 (6.7632e+00) Acc@1 2.34 ( 0.36) Acc@5 4.69 ( 1.60) +Epoch: [0][1366/5004] Time 0.238 ( 0.240) Data 0.026 ( 0.029) Loss 6.4195e+00 (6.7629e+00) Acc@1 0.78 ( 0.36) Acc@5 3.52 ( 1.60) +Epoch: [0][1367/5004] Time 0.240 ( 0.240) Data 0.026 ( 0.029) Loss 6.4650e+00 (6.7627e+00) Acc@1 1.17 ( 0.36) Acc@5 1.95 ( 1.60) +Epoch: [0][1368/5004] Time 0.240 ( 0.240) Data 0.026 ( 0.029) Loss 6.4012e+00 (6.7624e+00) Acc@1 0.00 ( 0.36) Acc@5 3.12 ( 1.60) +Epoch: [0][1369/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.3806e+00 (6.7621e+00) Acc@1 0.39 ( 0.36) Acc@5 2.34 ( 1.60) +Epoch: [0][1370/5004] Time 0.240 ( 0.240) Data 0.026 ( 0.029) Loss 6.3803e+00 (6.7619e+00) Acc@1 1.56 ( 0.36) Acc@5 4.30 ( 1.60) +Epoch: [0][1371/5004] Time 0.240 ( 0.240) Data 0.026 ( 0.029) Loss 6.4176e+00 (6.7616e+00) Acc@1 0.78 ( 0.37) Acc@5 4.30 ( 1.60) +Epoch: [0][1372/5004] Time 0.241 ( 0.240) Data 0.025 ( 0.029) Loss 6.3044e+00 (6.7613e+00) Acc@1 1.17 ( 0.37) Acc@5 3.52 ( 1.60) +Epoch: [0][1373/5004] Time 0.241 ( 0.240) Data 0.026 ( 0.029) Loss 6.3093e+00 (6.7609e+00) Acc@1 1.17 ( 0.37) Acc@5 5.86 ( 1.61) +Epoch: [0][1374/5004] Time 0.240 ( 0.240) Data 0.025 ( 0.029) Loss 6.3978e+00 (6.7607e+00) Acc@1 0.39 ( 0.37) Acc@5 3.12 ( 1.61) +Epoch: [0][1375/5004] Time 0.246 ( 0.240) Data 0.025 ( 0.029) Loss 6.3523e+00 (6.7604e+00) Acc@1 1.95 ( 0.37) Acc@5 4.69 ( 1.61) +Epoch: [0][1376/5004] Time 0.246 ( 0.240) Data 0.023 ( 0.029) Loss 6.3853e+00 (6.7601e+00) Acc@1 0.39 ( 0.37) Acc@5 1.56 ( 1.61) +Epoch: [0][1377/5004] Time 0.242 ( 0.240) Data 0.022 ( 0.029) Loss 6.3014e+00 (6.7598e+00) Acc@1 1.95 ( 0.37) Acc@5 4.69 ( 1.61) +Epoch: [0][1378/5004] Time 0.242 ( 0.240) Data 0.024 ( 0.029) Loss 6.5711e+00 (6.7596e+00) Acc@1 1.56 ( 0.37) Acc@5 3.52 ( 1.61) +Epoch: [0][1379/5004] Time 0.238 ( 0.240) Data 0.025 ( 0.029) Loss 6.3389e+00 (6.7593e+00) Acc@1 1.17 ( 0.37) Acc@5 5.08 ( 1.62) +Epoch: [0][1380/5004] Time 0.241 ( 0.240) Data 0.026 ( 
0.029) Loss 6.3254e+00 (6.7590e+00) Acc@1 2.34 ( 0.37) Acc@5 5.08 ( 1.62) +Epoch: [0][1381/5004] Time 0.252 ( 0.240) Data 0.025 ( 0.029) Loss 6.2977e+00 (6.7587e+00) Acc@1 0.78 ( 0.37) Acc@5 4.69 ( 1.62) +Epoch: [0][1382/5004] Time 0.244 ( 0.240) Data 0.023 ( 0.029) Loss 6.3900e+00 (6.7584e+00) Acc@1 1.95 ( 0.37) Acc@5 5.08 ( 1.62) +Epoch: [0][1383/5004] Time 0.245 ( 0.240) Data 0.026 ( 0.029) Loss 6.3432e+00 (6.7581e+00) Acc@1 0.39 ( 0.37) Acc@5 6.25 ( 1.63) +Epoch: [0][1384/5004] Time 0.247 ( 0.240) Data 0.026 ( 0.029) Loss 6.3229e+00 (6.7578e+00) Acc@1 1.17 ( 0.37) Acc@5 3.52 ( 1.63) +Epoch: [0][1385/5004] Time 0.246 ( 0.240) Data 0.024 ( 0.029) Loss 6.3834e+00 (6.7575e+00) Acc@1 0.39 ( 0.37) Acc@5 3.52 ( 1.63) +Epoch: [0][1386/5004] Time 0.245 ( 0.240) Data 0.024 ( 0.029) Loss 6.2609e+00 (6.7572e+00) Acc@1 1.95 ( 0.37) Acc@5 6.25 ( 1.63) +Epoch: [0][1387/5004] Time 0.244 ( 0.240) Data 0.025 ( 0.029) Loss 6.3489e+00 (6.7569e+00) Acc@1 0.00 ( 0.37) Acc@5 2.34 ( 1.63) +Epoch: [0][1388/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.5541e+00 (6.7567e+00) Acc@1 0.78 ( 0.37) Acc@5 2.34 ( 1.63) +Epoch: [0][1389/5004] Time 0.244 ( 0.240) Data 0.026 ( 0.029) Loss 6.3816e+00 (6.7565e+00) Acc@1 1.17 ( 0.38) Acc@5 3.12 ( 1.64) +Epoch: [0][1390/5004] Time 0.244 ( 0.240) Data 0.026 ( 0.029) Loss 6.3931e+00 (6.7562e+00) Acc@1 1.17 ( 0.38) Acc@5 4.30 ( 1.64) +Epoch: [0][1391/5004] Time 0.247 ( 0.240) Data 0.025 ( 0.029) Loss 6.3263e+00 (6.7559e+00) Acc@1 1.17 ( 0.38) Acc@5 3.91 ( 1.64) +Epoch: [0][1392/5004] Time 0.254 ( 0.240) Data 0.024 ( 0.029) Loss 6.3166e+00 (6.7556e+00) Acc@1 0.78 ( 0.38) Acc@5 6.25 ( 1.64) +Epoch: [0][1393/5004] Time 0.250 ( 0.240) Data 0.022 ( 0.029) Loss 6.3621e+00 (6.7553e+00) Acc@1 0.78 ( 0.38) Acc@5 4.30 ( 1.64) +Epoch: [0][1394/5004] Time 0.239 ( 0.240) Data 0.019 ( 0.029) Loss 6.2188e+00 (6.7549e+00) Acc@1 2.73 ( 0.38) Acc@5 7.03 ( 1.65) +Epoch: [0][1395/5004] Time 0.242 ( 0.240) Data 0.024 ( 0.029) Loss 6.4002e+00 (6.7547e+00) Acc@1 0.78 ( 
0.38) Acc@5 3.12 ( 1.65) +Epoch: [0][1396/5004] Time 0.244 ( 0.240) Data 0.026 ( 0.029) Loss 6.3129e+00 (6.7544e+00) Acc@1 0.78 ( 0.38) Acc@5 3.91 ( 1.65) +Epoch: [0][1397/5004] Time 0.244 ( 0.240) Data 0.026 ( 0.029) Loss 6.3375e+00 (6.7541e+00) Acc@1 1.95 ( 0.38) Acc@5 5.86 ( 1.65) +Epoch: [0][1398/5004] Time 0.244 ( 0.240) Data 0.026 ( 0.029) Loss 6.3984e+00 (6.7538e+00) Acc@1 1.17 ( 0.38) Acc@5 3.52 ( 1.66) +Epoch: [0][1399/5004] Time 0.246 ( 0.240) Data 0.026 ( 0.029) Loss 6.3931e+00 (6.7535e+00) Acc@1 0.00 ( 0.38) Acc@5 3.91 ( 1.66) +Epoch: [0][1400/5004] Time 0.245 ( 0.240) Data 0.026 ( 0.029) Loss 6.2205e+00 (6.7532e+00) Acc@1 0.78 ( 0.38) Acc@5 3.91 ( 1.66) +Epoch: [0][1401/5004] Time 0.251 ( 0.240) Data 0.028 ( 0.029) Loss 6.4339e+00 (6.7529e+00) Acc@1 1.17 ( 0.38) Acc@5 4.30 ( 1.66) +Epoch: [0][1402/5004] Time 0.241 ( 0.240) Data 0.024 ( 0.029) Loss 6.3401e+00 (6.7526e+00) Acc@1 0.78 ( 0.38) Acc@5 2.73 ( 1.66) +Epoch: [0][1403/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.3700e+00 (6.7524e+00) Acc@1 1.17 ( 0.38) Acc@5 4.69 ( 1.66) +Epoch: [0][1404/5004] Time 0.244 ( 0.240) Data 0.026 ( 0.029) Loss 6.3224e+00 (6.7521e+00) Acc@1 0.39 ( 0.38) Acc@5 5.08 ( 1.67) +Epoch: [0][1405/5004] Time 0.244 ( 0.240) Data 0.025 ( 0.029) Loss 6.3305e+00 (6.7518e+00) Acc@1 1.17 ( 0.38) Acc@5 4.69 ( 1.67) +Epoch: [0][1406/5004] Time 0.247 ( 0.240) Data 0.025 ( 0.029) Loss 6.4470e+00 (6.7515e+00) Acc@1 0.78 ( 0.38) Acc@5 3.12 ( 1.67) +Epoch: [0][1407/5004] Time 0.245 ( 0.240) Data 0.025 ( 0.029) Loss 6.3360e+00 (6.7512e+00) Acc@1 1.95 ( 0.38) Acc@5 3.91 ( 1.67) +Epoch: [0][1408/5004] Time 0.251 ( 0.240) Data 0.026 ( 0.029) Loss 6.1795e+00 (6.7508e+00) Acc@1 2.34 ( 0.39) Acc@5 4.69 ( 1.67) +Epoch: [0][1409/5004] Time 0.243 ( 0.240) Data 0.023 ( 0.029) Loss 6.2694e+00 (6.7505e+00) Acc@1 0.00 ( 0.39) Acc@5 4.30 ( 1.67) +Epoch: [0][1410/5004] Time 0.247 ( 0.240) Data 0.025 ( 0.029) Loss 6.2787e+00 (6.7502e+00) Acc@1 0.78 ( 0.39) Acc@5 4.30 ( 1.68) +Epoch: [0][1411/5004] 
Time 0.247 ( 0.240) Data 0.026 ( 0.029) Loss 6.2532e+00 (6.7498e+00) Acc@1 1.56 ( 0.39) Acc@5 5.86 ( 1.68) +Epoch: [0][1412/5004] Time 0.247 ( 0.240) Data 0.024 ( 0.029) Loss 6.4125e+00 (6.7496e+00) Acc@1 0.39 ( 0.39) Acc@5 3.52 ( 1.68) +Epoch: [0][1413/5004] Time 0.244 ( 0.240) Data 0.024 ( 0.029) Loss 6.3096e+00 (6.7493e+00) Acc@1 1.56 ( 0.39) Acc@5 3.12 ( 1.68) +Epoch: [0][1414/5004] Time 0.243 ( 0.240) Data 0.024 ( 0.029) Loss 6.2590e+00 (6.7489e+00) Acc@1 1.17 ( 0.39) Acc@5 5.08 ( 1.68) +Epoch: [0][1415/5004] Time 0.243 ( 0.240) Data 0.025 ( 0.029) Loss 6.3638e+00 (6.7486e+00) Acc@1 1.95 ( 0.39) Acc@5 6.25 ( 1.69) +Epoch: [0][1416/5004] Time 0.243 ( 0.240) Data 0.026 ( 0.029) Loss 6.2412e+00 (6.7483e+00) Acc@1 1.17 ( 0.39) Acc@5 3.91 ( 1.69) +Epoch: [0][1417/5004] Time 0.245 ( 0.240) Data 0.026 ( 0.029) Loss 6.2491e+00 (6.7479e+00) Acc@1 0.00 ( 0.39) Acc@5 3.91 ( 1.69) +Epoch: [0][1418/5004] Time 0.253 ( 0.240) Data 0.026 ( 0.029) Loss 6.3082e+00 (6.7476e+00) Acc@1 1.17 ( 0.39) Acc@5 7.03 ( 1.69) +Epoch: [0][1419/5004] Time 0.250 ( 0.240) Data 0.023 ( 0.029) Loss 6.2790e+00 (6.7473e+00) Acc@1 0.39 ( 0.39) Acc@5 3.12 ( 1.70) +Epoch: [0][1420/5004] Time 0.246 ( 0.241) Data 0.023 ( 0.029) Loss 6.3492e+00 (6.7470e+00) Acc@1 1.17 ( 0.39) Acc@5 5.86 ( 1.70) +Epoch: [0][1421/5004] Time 0.249 ( 0.241) Data 0.024 ( 0.029) Loss 6.4170e+00 (6.7468e+00) Acc@1 2.34 ( 0.39) Acc@5 5.08 ( 1.70) +Epoch: [0][1422/5004] Time 0.242 ( 0.241) Data 0.024 ( 0.029) Loss 6.3644e+00 (6.7465e+00) Acc@1 1.17 ( 0.39) Acc@5 3.52 ( 1.70) +Epoch: [0][1423/5004] Time 0.252 ( 0.241) Data 0.026 ( 0.029) Loss 6.3260e+00 (6.7462e+00) Acc@1 1.56 ( 0.39) Acc@5 4.30 ( 1.70) +Epoch: [0][1424/5004] Time 0.240 ( 0.241) Data 0.020 ( 0.029) Loss 6.2833e+00 (6.7459e+00) Acc@1 1.56 ( 0.39) Acc@5 5.47 ( 1.71) +Epoch: [0][1425/5004] Time 0.243 ( 0.241) Data 0.026 ( 0.029) Loss 6.2706e+00 (6.7456e+00) Acc@1 1.17 ( 0.39) Acc@5 4.69 ( 1.71) +Epoch: [0][1426/5004] Time 0.245 ( 0.241) Data 0.026 ( 0.029) Loss 
6.4572e+00 (6.7454e+00) Acc@1 0.39 ( 0.39) Acc@5 2.34 ( 1.71) +Epoch: [0][1427/5004] Time 0.243 ( 0.241) Data 0.025 ( 0.029) Loss 6.3763e+00 (6.7451e+00) Acc@1 1.17 ( 0.40) Acc@5 4.69 ( 1.71) +Epoch: [0][1428/5004] Time 0.245 ( 0.241) Data 0.026 ( 0.029) Loss 6.2526e+00 (6.7448e+00) Acc@1 0.00 ( 0.39) Acc@5 5.47 ( 1.71) +Epoch: [0][1429/5004] Time 0.247 ( 0.241) Data 0.026 ( 0.029) Loss 6.2076e+00 (6.7444e+00) Acc@1 1.56 ( 0.40) Acc@5 5.47 ( 1.72) +Epoch: [0][1430/5004] Time 0.242 ( 0.241) Data 0.024 ( 0.029) Loss 6.3113e+00 (6.7441e+00) Acc@1 1.17 ( 0.40) Acc@5 5.86 ( 1.72) +Epoch: [0][1431/5004] Time 0.246 ( 0.241) Data 0.026 ( 0.029) Loss 6.2884e+00 (6.7438e+00) Acc@1 0.39 ( 0.40) Acc@5 3.91 ( 1.72) +Epoch: [0][1432/5004] Time 0.243 ( 0.241) Data 0.026 ( 0.029) Loss 6.3492e+00 (6.7435e+00) Acc@1 1.17 ( 0.40) Acc@5 3.91 ( 1.72) +Epoch: [0][1433/5004] Time 0.244 ( 0.241) Data 0.026 ( 0.029) Loss 6.2837e+00 (6.7432e+00) Acc@1 0.39 ( 0.40) Acc@5 6.64 ( 1.73) +Epoch: [0][1434/5004] Time 0.244 ( 0.241) Data 0.026 ( 0.029) Loss 6.4013e+00 (6.7429e+00) Acc@1 0.00 ( 0.40) Acc@5 4.30 ( 1.73) +Epoch: [0][1435/5004] Time 0.248 ( 0.241) Data 0.026 ( 0.029) Loss 6.3221e+00 (6.7426e+00) Acc@1 0.39 ( 0.40) Acc@5 3.91 ( 1.73) +Epoch: [0][1436/5004] Time 0.248 ( 0.241) Data 0.024 ( 0.029) Loss 6.3740e+00 (6.7424e+00) Acc@1 1.17 ( 0.40) Acc@5 2.73 ( 1.73) +Epoch: [0][1437/5004] Time 0.249 ( 0.241) Data 0.024 ( 0.029) Loss 6.3742e+00 (6.7421e+00) Acc@1 0.78 ( 0.40) Acc@5 3.52 ( 1.73) +Epoch: [0][1438/5004] Time 0.249 ( 0.241) Data 0.023 ( 0.029) Loss 6.2466e+00 (6.7418e+00) Acc@1 0.39 ( 0.40) Acc@5 4.30 ( 1.73) +Epoch: [0][1439/5004] Time 0.246 ( 0.241) Data 0.023 ( 0.029) Loss 6.3644e+00 (6.7415e+00) Acc@1 1.56 ( 0.40) Acc@5 3.12 ( 1.73) +Epoch: [0][1440/5004] Time 0.244 ( 0.241) Data 0.025 ( 0.029) Loss 6.2651e+00 (6.7412e+00) Acc@1 0.78 ( 0.40) Acc@5 2.73 ( 1.73) +Epoch: [0][1441/5004] Time 0.245 ( 0.241) Data 0.026 ( 0.029) Loss 6.2780e+00 (6.7409e+00) Acc@1 0.78 ( 0.40) Acc@5 
2.34 ( 1.73) +Epoch: [0][1442/5004] Time 0.244 ( 0.241) Data 0.026 ( 0.029) Loss 6.2412e+00 (6.7405e+00) Acc@1 1.95 ( 0.40) Acc@5 5.47 ( 1.74) +Epoch: [0][1443/5004] Time 0.243 ( 0.241) Data 0.026 ( 0.029) Loss 6.3895e+00 (6.7403e+00) Acc@1 1.17 ( 0.40) Acc@5 3.12 ( 1.74) +Epoch: [0][1444/5004] Time 0.245 ( 0.241) Data 0.026 ( 0.029) Loss 6.2770e+00 (6.7400e+00) Acc@1 0.78 ( 0.40) Acc@5 4.69 ( 1.74) +Epoch: [0][1445/5004] Time 0.247 ( 0.241) Data 0.025 ( 0.029) Loss 6.3611e+00 (6.7397e+00) Acc@1 1.17 ( 0.40) Acc@5 4.69 ( 1.74) +Epoch: [0][1446/5004] Time 0.247 ( 0.241) Data 0.024 ( 0.029) Loss 6.3003e+00 (6.7394e+00) Acc@1 0.78 ( 0.40) Acc@5 4.30 ( 1.74) +Epoch: [0][1447/5004] Time 0.248 ( 0.241) Data 0.025 ( 0.029) Loss 6.1737e+00 (6.7390e+00) Acc@1 1.56 ( 0.40) Acc@5 5.86 ( 1.75) +Epoch: [0][1448/5004] Time 0.243 ( 0.241) Data 0.025 ( 0.029) Loss 6.2777e+00 (6.7387e+00) Acc@1 0.78 ( 0.40) Acc@5 4.30 ( 1.75) +Epoch: [0][1449/5004] Time 0.244 ( 0.241) Data 0.026 ( 0.029) Loss 6.2934e+00 (6.7384e+00) Acc@1 1.17 ( 0.40) Acc@5 3.91 ( 1.75) +Epoch: [0][1450/5004] Time 0.244 ( 0.241) Data 0.025 ( 0.029) Loss 6.3127e+00 (6.7381e+00) Acc@1 0.78 ( 0.40) Acc@5 3.12 ( 1.75) +Epoch: [0][1451/5004] Time 0.244 ( 0.241) Data 0.026 ( 0.029) Loss 6.2478e+00 (6.7377e+00) Acc@1 1.17 ( 0.40) Acc@5 4.69 ( 1.75) +Epoch: [0][1452/5004] Time 0.246 ( 0.241) Data 0.026 ( 0.029) Loss 6.2638e+00 (6.7374e+00) Acc@1 2.34 ( 0.40) Acc@5 6.25 ( 1.76) +Epoch: [0][1453/5004] Time 0.255 ( 0.241) Data 0.025 ( 0.029) Loss 6.2570e+00 (6.7371e+00) Acc@1 0.78 ( 0.41) Acc@5 5.08 ( 1.76) +Epoch: [0][1454/5004] Time 0.229 ( 0.241) Data 0.017 ( 0.029) Loss 6.3286e+00 (6.7368e+00) Acc@1 1.95 ( 0.41) Acc@5 5.47 ( 1.76) +Epoch: [0][1455/5004] Time 0.216 ( 0.241) Data 0.034 ( 0.029) Loss 6.2761e+00 (6.7365e+00) Acc@1 1.56 ( 0.41) Acc@5 6.64 ( 1.76) +Epoch: [0][1456/5004] Time 0.233 ( 0.241) Data 0.055 ( 0.029) Loss 6.2900e+00 (6.7362e+00) Acc@1 2.73 ( 0.41) Acc@5 7.42 ( 1.77) +Epoch: [0][1457/5004] Time 0.242 ( 
0.241) Data 0.059 ( 0.029) Loss 6.2390e+00 (6.7358e+00) Acc@1 0.39 ( 0.41) Acc@5 7.42 ( 1.77) +Epoch: [0][1458/5004] Time 0.236 ( 0.241) Data 0.054 ( 0.029) Loss 6.3252e+00 (6.7356e+00) Acc@1 0.78 ( 0.41) Acc@5 2.73 ( 1.77) +Epoch: [0][1459/5004] Time 0.237 ( 0.241) Data 0.056 ( 0.029) Loss 6.2377e+00 (6.7352e+00) Acc@1 0.78 ( 0.41) Acc@5 6.25 ( 1.78) +Epoch: [0][1460/5004] Time 0.238 ( 0.241) Data 0.056 ( 0.029) Loss 6.3278e+00 (6.7349e+00) Acc@1 0.78 ( 0.41) Acc@5 3.91 ( 1.78) +Epoch: [0][1461/5004] Time 0.240 ( 0.241) Data 0.055 ( 0.029) Loss 6.2638e+00 (6.7346e+00) Acc@1 1.17 ( 0.41) Acc@5 5.08 ( 1.78) +Epoch: [0][1462/5004] Time 0.235 ( 0.241) Data 0.053 ( 0.029) Loss 6.3727e+00 (6.7344e+00) Acc@1 1.56 ( 0.41) Acc@5 5.08 ( 1.78) +Epoch: [0][1463/5004] Time 0.239 ( 0.241) Data 0.057 ( 0.029) Loss 6.2691e+00 (6.7341e+00) Acc@1 1.95 ( 0.41) Acc@5 7.42 ( 1.79) +Epoch: [0][1464/5004] Time 0.242 ( 0.241) Data 0.055 ( 0.029) Loss 6.2055e+00 (6.7337e+00) Acc@1 1.95 ( 0.41) Acc@5 6.64 ( 1.79) +Epoch: [0][1465/5004] Time 0.238 ( 0.241) Data 0.053 ( 0.029) Loss 6.2526e+00 (6.7334e+00) Acc@1 1.17 ( 0.41) Acc@5 3.12 ( 1.79) +Epoch: [0][1466/5004] Time 0.234 ( 0.241) Data 0.052 ( 0.029) Loss 6.3752e+00 (6.7331e+00) Acc@1 1.17 ( 0.41) Acc@5 4.69 ( 1.79) +Epoch: [0][1467/5004] Time 0.233 ( 0.241) Data 0.055 ( 0.029) Loss 6.1857e+00 (6.7327e+00) Acc@1 0.39 ( 0.41) Acc@5 4.69 ( 1.79) +Epoch: [0][1468/5004] Time 0.241 ( 0.241) Data 0.060 ( 0.029) Loss 6.3265e+00 (6.7325e+00) Acc@1 1.17 ( 0.41) Acc@5 3.91 ( 1.80) +Epoch: [0][1469/5004] Time 0.237 ( 0.241) Data 0.056 ( 0.029) Loss 6.3754e+00 (6.7322e+00) Acc@1 0.00 ( 0.41) Acc@5 2.34 ( 1.80) +Epoch: [0][1470/5004] Time 0.238 ( 0.241) Data 0.056 ( 0.029) Loss 6.3165e+00 (6.7319e+00) Acc@1 1.95 ( 0.42) Acc@5 6.25 ( 1.80) +Epoch: [0][1471/5004] Time 0.241 ( 0.241) Data 0.057 ( 0.029) Loss 6.2717e+00 (6.7316e+00) Acc@1 1.17 ( 0.42) Acc@5 3.52 ( 1.80) +Epoch: [0][1472/5004] Time 0.236 ( 0.241) Data 0.053 ( 0.029) Loss 6.2634e+00 
(6.7313e+00) Acc@1 1.17 ( 0.42) Acc@5 5.47 ( 1.80) +Epoch: [0][1473/5004] Time 0.238 ( 0.241) Data 0.054 ( 0.029) Loss 6.3670e+00 (6.7311e+00) Acc@1 1.56 ( 0.42) Acc@5 4.69 ( 1.80) +Epoch: [0][1474/5004] Time 0.238 ( 0.241) Data 0.054 ( 0.029) Loss 6.2475e+00 (6.7307e+00) Acc@1 1.56 ( 0.42) Acc@5 4.69 ( 1.81) +Epoch: [0][1475/5004] Time 0.236 ( 0.241) Data 0.054 ( 0.029) Loss 6.2395e+00 (6.7304e+00) Acc@1 2.34 ( 0.42) Acc@5 5.08 ( 1.81) +Epoch: [0][1476/5004] Time 0.239 ( 0.241) Data 0.056 ( 0.029) Loss 6.2867e+00 (6.7301e+00) Acc@1 0.39 ( 0.42) Acc@5 1.95 ( 1.81) +Epoch: [0][1477/5004] Time 0.235 ( 0.241) Data 0.054 ( 0.029) Loss 6.3297e+00 (6.7298e+00) Acc@1 0.78 ( 0.42) Acc@5 4.30 ( 1.81) +Epoch: [0][1478/5004] Time 0.234 ( 0.241) Data 0.055 ( 0.029) Loss 6.1355e+00 (6.7294e+00) Acc@1 1.95 ( 0.42) Acc@5 7.81 ( 1.81) +Epoch: [0][1479/5004] Time 0.242 ( 0.241) Data 0.058 ( 0.029) Loss 6.1980e+00 (6.7291e+00) Acc@1 0.39 ( 0.42) Acc@5 5.47 ( 1.82) +Epoch: [0][1480/5004] Time 0.233 ( 0.241) Data 0.052 ( 0.029) Loss 6.3071e+00 (6.7288e+00) Acc@1 0.39 ( 0.42) Acc@5 4.69 ( 1.82) +Epoch: [0][1481/5004] Time 0.265 ( 0.241) Data 0.055 ( 0.029) Loss 6.1472e+00 (6.7284e+00) Acc@1 1.56 ( 0.42) Acc@5 3.91 ( 1.82) +Epoch: [0][1482/5004] Time 0.237 ( 0.241) Data 0.028 ( 0.029) Loss 6.3563e+00 (6.7281e+00) Acc@1 1.56 ( 0.42) Acc@5 5.86 ( 1.82) +Epoch: [0][1483/5004] Time 0.240 ( 0.241) Data 0.029 ( 0.029) Loss 6.2142e+00 (6.7278e+00) Acc@1 0.39 ( 0.42) Acc@5 5.86 ( 1.83) +Epoch: [0][1484/5004] Time 0.239 ( 0.241) Data 0.027 ( 0.029) Loss 6.2932e+00 (6.7275e+00) Acc@1 0.00 ( 0.42) Acc@5 3.91 ( 1.83) +Epoch: [0][1485/5004] Time 0.240 ( 0.241) Data 0.027 ( 0.029) Loss 6.2359e+00 (6.7272e+00) Acc@1 0.39 ( 0.42) Acc@5 4.30 ( 1.83) +Epoch: [0][1486/5004] Time 0.237 ( 0.241) Data 0.026 ( 0.029) Loss 6.3353e+00 (6.7269e+00) Acc@1 1.17 ( 0.42) Acc@5 4.69 ( 1.83) +Epoch: [0][1487/5004] Time 0.240 ( 0.241) Data 0.027 ( 0.029) Loss 6.2622e+00 (6.7266e+00) Acc@1 1.17 ( 0.42) Acc@5 3.91 ( 
1.83) +Epoch: [0][1488/5004] Time 0.245 ( 0.241) Data 0.026 ( 0.029) Loss 6.4513e+00 (6.7264e+00) Acc@1 0.39 ( 0.42) Acc@5 2.73 ( 1.83) +Epoch: [0][1489/5004] Time 0.238 ( 0.241) Data 0.024 ( 0.029) Loss 6.2995e+00 (6.7261e+00) Acc@1 0.39 ( 0.42) Acc@5 4.30 ( 1.83) +Epoch: [0][1490/5004] Time 0.239 ( 0.241) Data 0.025 ( 0.029) Loss 6.2314e+00 (6.7258e+00) Acc@1 1.17 ( 0.42) Acc@5 5.08 ( 1.84) +Epoch: [0][1491/5004] Time 0.245 ( 0.241) Data 0.026 ( 0.029) Loss 6.2915e+00 (6.7255e+00) Acc@1 0.39 ( 0.42) Acc@5 4.69 ( 1.84) +Epoch: [0][1492/5004] Time 0.241 ( 0.241) Data 0.024 ( 0.029) Loss 6.2190e+00 (6.7252e+00) Acc@1 0.78 ( 0.42) Acc@5 4.69 ( 1.84) +Epoch: [0][1493/5004] Time 0.247 ( 0.241) Data 0.024 ( 0.029) Loss 6.3534e+00 (6.7249e+00) Acc@1 1.17 ( 0.42) Acc@5 2.73 ( 1.84) +Epoch: [0][1494/5004] Time 0.241 ( 0.241) Data 0.023 ( 0.029) Loss 6.3487e+00 (6.7247e+00) Acc@1 1.17 ( 0.42) Acc@5 5.47 ( 1.84) +Epoch: [0][1495/5004] Time 0.241 ( 0.241) Data 0.023 ( 0.029) Loss 6.3157e+00 (6.7244e+00) Acc@1 1.56 ( 0.42) Acc@5 4.69 ( 1.85) +Epoch: [0][1496/5004] Time 0.237 ( 0.241) Data 0.024 ( 0.029) Loss 6.1931e+00 (6.7240e+00) Acc@1 0.78 ( 0.43) Acc@5 6.25 ( 1.85) +Epoch: [0][1497/5004] Time 0.238 ( 0.241) Data 0.026 ( 0.029) Loss 6.4112e+00 (6.7238e+00) Acc@1 1.17 ( 0.43) Acc@5 4.30 ( 1.85) +Epoch: [0][1498/5004] Time 0.237 ( 0.241) Data 0.026 ( 0.029) Loss 6.2980e+00 (6.7235e+00) Acc@1 2.34 ( 0.43) Acc@5 5.08 ( 1.85) +Epoch: [0][1499/5004] Time 0.249 ( 0.241) Data 0.027 ( 0.029) Loss 6.1800e+00 (6.7232e+00) Acc@1 1.95 ( 0.43) Acc@5 6.25 ( 1.85) +Epoch: [0][1500/5004] Time 0.252 ( 0.241) Data 0.024 ( 0.029) Loss 6.2490e+00 (6.7229e+00) Acc@1 0.39 ( 0.43) Acc@5 4.30 ( 1.86) +Epoch: [0][1501/5004] Time 0.248 ( 0.241) Data 0.022 ( 0.029) Loss 6.1101e+00 (6.7225e+00) Acc@1 1.95 ( 0.43) Acc@5 5.86 ( 1.86) +Epoch: [0][1502/5004] Time 0.247 ( 0.241) Data 0.023 ( 0.029) Loss 6.2451e+00 (6.7221e+00) Acc@1 2.34 ( 0.43) Acc@5 4.69 ( 1.86) +Epoch: [0][1503/5004] Time 0.244 ( 0.241) 
Data 0.024 ( 0.029) Loss 6.2390e+00 (6.7218e+00) Acc@1 1.56 ( 0.43) Acc@5 7.03 ( 1.86) +Epoch: [0][1504/5004] Time 0.244 ( 0.241) Data 0.024 ( 0.029) Loss 6.2240e+00 (6.7215e+00) Acc@1 1.95 ( 0.43) Acc@5 5.47 ( 1.87) +Epoch: [0][1505/5004] Time 0.244 ( 0.241) Data 0.024 ( 0.029) Loss 6.1914e+00 (6.7211e+00) Acc@1 1.95 ( 0.43) Acc@5 7.03 ( 1.87) +Epoch: [0][1506/5004] Time 0.247 ( 0.241) Data 0.023 ( 0.029) Loss 6.2158e+00 (6.7208e+00) Acc@1 1.17 ( 0.43) Acc@5 5.86 ( 1.87) +Epoch: [0][1507/5004] Time 0.244 ( 0.241) Data 0.023 ( 0.029) Loss 6.2695e+00 (6.7205e+00) Acc@1 0.78 ( 0.43) Acc@5 0.78 ( 1.87) +Epoch: [0][1508/5004] Time 0.246 ( 0.241) Data 0.024 ( 0.029) Loss 6.2473e+00 (6.7202e+00) Acc@1 1.17 ( 0.43) Acc@5 4.30 ( 1.87) +Epoch: [0][1509/5004] Time 0.247 ( 0.241) Data 0.024 ( 0.029) Loss 6.3158e+00 (6.7199e+00) Acc@1 1.95 ( 0.44) Acc@5 3.52 ( 1.88) +Epoch: [0][1510/5004] Time 0.244 ( 0.241) Data 0.023 ( 0.029) Loss 6.2713e+00 (6.7196e+00) Acc@1 0.78 ( 0.44) Acc@5 3.52 ( 1.88) +Epoch: [0][1511/5004] Time 0.246 ( 0.241) Data 0.024 ( 0.029) Loss 6.2241e+00 (6.7193e+00) Acc@1 1.17 ( 0.44) Acc@5 3.52 ( 1.88) +Epoch: [0][1512/5004] Time 0.244 ( 0.241) Data 0.023 ( 0.029) Loss 6.3024e+00 (6.7190e+00) Acc@1 1.56 ( 0.44) Acc@5 5.86 ( 1.88) +Epoch: [0][1513/5004] Time 0.245 ( 0.241) Data 0.024 ( 0.029) Loss 6.3270e+00 (6.7188e+00) Acc@1 1.95 ( 0.44) Acc@5 2.73 ( 1.88) +Epoch: [0][1514/5004] Time 0.244 ( 0.241) Data 0.024 ( 0.029) Loss 6.3048e+00 (6.7185e+00) Acc@1 1.17 ( 0.44) Acc@5 4.30 ( 1.88) +Epoch: [0][1515/5004] Time 0.248 ( 0.241) Data 0.024 ( 0.029) Loss 6.3105e+00 (6.7182e+00) Acc@1 0.00 ( 0.44) Acc@5 4.69 ( 1.88) +Epoch: [0][1516/5004] Time 0.247 ( 0.241) Data 0.022 ( 0.029) Loss 6.2362e+00 (6.7179e+00) Acc@1 1.56 ( 0.44) Acc@5 3.91 ( 1.89) +Epoch: [0][1517/5004] Time 0.245 ( 0.241) Data 0.023 ( 0.029) Loss 6.1931e+00 (6.7176e+00) Acc@1 1.17 ( 0.44) Acc@5 5.47 ( 1.89) +Epoch: [0][1518/5004] Time 0.249 ( 0.241) Data 0.023 ( 0.029) Loss 6.2352e+00 (6.7172e+00) 
Acc@1 3.52 ( 0.44) Acc@5 6.25 ( 1.89) +Epoch: [0][1519/5004] Time 0.247 ( 0.241) Data 0.022 ( 0.029) Loss 6.2094e+00 (6.7169e+00) Acc@1 1.56 ( 0.44) Acc@5 4.30 ( 1.89) +Epoch: [0][1520/5004] Time 0.246 ( 0.241) Data 0.021 ( 0.029) Loss 6.3642e+00 (6.7167e+00) Acc@1 1.56 ( 0.44) Acc@5 3.91 ( 1.89) +Epoch: [0][1521/5004] Time 0.248 ( 0.241) Data 0.023 ( 0.029) Loss 6.2692e+00 (6.7164e+00) Acc@1 0.39 ( 0.44) Acc@5 3.52 ( 1.89) +Epoch: [0][1522/5004] Time 0.253 ( 0.241) Data 0.021 ( 0.029) Loss 6.2933e+00 (6.7161e+00) Acc@1 1.95 ( 0.44) Acc@5 5.47 ( 1.90) +Epoch: [0][1523/5004] Time 0.251 ( 0.241) Data 0.017 ( 0.029) Loss 6.3833e+00 (6.7159e+00) Acc@1 0.78 ( 0.44) Acc@5 5.47 ( 1.90) +Epoch: [0][1524/5004] Time 0.256 ( 0.241) Data 0.019 ( 0.029) Loss 6.1545e+00 (6.7155e+00) Acc@1 0.39 ( 0.44) Acc@5 6.25 ( 1.90) +Epoch: [0][1525/5004] Time 0.241 ( 0.241) Data 0.014 ( 0.029) Loss 6.3310e+00 (6.7153e+00) Acc@1 1.56 ( 0.44) Acc@5 4.69 ( 1.90) +Epoch: [0][1526/5004] Time 0.248 ( 0.241) Data 0.020 ( 0.029) Loss 6.2399e+00 (6.7149e+00) Acc@1 1.17 ( 0.44) Acc@5 3.12 ( 1.90) +Epoch: [0][1527/5004] Time 0.249 ( 0.241) Data 0.021 ( 0.029) Loss 6.2470e+00 (6.7146e+00) Acc@1 0.78 ( 0.45) Acc@5 5.08 ( 1.91) +Epoch: [0][1528/5004] Time 0.251 ( 0.241) Data 0.021 ( 0.029) Loss 6.2505e+00 (6.7143e+00) Acc@1 1.17 ( 0.45) Acc@5 5.47 ( 1.91) +Epoch: [0][1529/5004] Time 0.253 ( 0.241) Data 0.020 ( 0.029) Loss 6.2150e+00 (6.7140e+00) Acc@1 1.95 ( 0.45) Acc@5 4.69 ( 1.91) +Epoch: [0][1530/5004] Time 0.250 ( 0.241) Data 0.019 ( 0.029) Loss 6.2503e+00 (6.7137e+00) Acc@1 0.78 ( 0.45) Acc@5 5.08 ( 1.91) +Epoch: [0][1531/5004] Time 0.247 ( 0.241) Data 0.020 ( 0.029) Loss 6.3077e+00 (6.7134e+00) Acc@1 1.95 ( 0.45) Acc@5 3.52 ( 1.91) +Epoch: [0][1532/5004] Time 0.245 ( 0.241) Data 0.020 ( 0.029) Loss 6.3023e+00 (6.7132e+00) Acc@1 1.95 ( 0.45) Acc@5 4.69 ( 1.92) +Epoch: [0][1533/5004] Time 0.248 ( 0.241) Data 0.021 ( 0.029) Loss 6.2406e+00 (6.7129e+00) Acc@1 1.95 ( 0.45) Acc@5 6.64 ( 1.92) +Epoch: 
[0][1534/5004] Time 0.251 ( 0.241) Data 0.020 ( 0.029) Loss 6.2438e+00 (6.7126e+00) Acc@1 1.56 ( 0.45) Acc@5 4.69 ( 1.92) +Epoch: [0][1535/5004] Time 0.248 ( 0.241) Data 0.019 ( 0.029) Loss 6.2934e+00 (6.7123e+00) Acc@1 0.00 ( 0.45) Acc@5 4.30 ( 1.92) +Epoch: [0][1536/5004] Time 0.251 ( 0.241) Data 0.020 ( 0.029) Loss 6.3063e+00 (6.7120e+00) Acc@1 1.95 ( 0.45) Acc@5 6.25 ( 1.92) +Epoch: [0][1537/5004] Time 0.245 ( 0.241) Data 0.019 ( 0.029) Loss 6.2277e+00 (6.7117e+00) Acc@1 0.78 ( 0.45) Acc@5 4.30 ( 1.93) +Epoch: [0][1538/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.029) Loss 6.2253e+00 (6.7114e+00) Acc@1 0.78 ( 0.45) Acc@5 4.69 ( 1.93) +Epoch: [0][1539/5004] Time 0.248 ( 0.241) Data 0.021 ( 0.029) Loss 6.2888e+00 (6.7111e+00) Acc@1 0.39 ( 0.45) Acc@5 4.30 ( 1.93) +Epoch: [0][1540/5004] Time 0.250 ( 0.241) Data 0.020 ( 0.029) Loss 6.1503e+00 (6.7108e+00) Acc@1 1.95 ( 0.45) Acc@5 6.64 ( 1.93) +Epoch: [0][1541/5004] Time 0.246 ( 0.241) Data 0.017 ( 0.029) Loss 6.1702e+00 (6.7104e+00) Acc@1 1.17 ( 0.45) Acc@5 5.86 ( 1.94) +Epoch: [0][1542/5004] Time 0.246 ( 0.241) Data 0.019 ( 0.029) Loss 6.2719e+00 (6.7101e+00) Acc@1 0.78 ( 0.45) Acc@5 3.91 ( 1.94) +Epoch: [0][1543/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.029) Loss 6.3179e+00 (6.7099e+00) Acc@1 1.56 ( 0.45) Acc@5 4.69 ( 1.94) +Epoch: [0][1544/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.029) Loss 6.2715e+00 (6.7096e+00) Acc@1 1.56 ( 0.45) Acc@5 5.47 ( 1.94) +Epoch: [0][1545/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.029) Loss 6.3820e+00 (6.7094e+00) Acc@1 0.39 ( 0.45) Acc@5 3.52 ( 1.94) +Epoch: [0][1546/5004] Time 0.250 ( 0.241) Data 0.020 ( 0.029) Loss 6.3044e+00 (6.7091e+00) Acc@1 0.78 ( 0.45) Acc@5 2.73 ( 1.94) +Epoch: [0][1547/5004] Time 0.248 ( 0.241) Data 0.020 ( 0.029) Loss 6.2277e+00 (6.7088e+00) Acc@1 0.39 ( 0.45) Acc@5 5.86 ( 1.94) +Epoch: [0][1548/5004] Time 0.257 ( 0.241) Data 0.022 ( 0.029) Loss 6.1795e+00 (6.7085e+00) Acc@1 1.17 ( 0.46) Acc@5 3.52 ( 1.95) +Epoch: [0][1549/5004] Time 0.254 ( 0.241) Data 0.020 ( 
0.029) Loss 6.3450e+00 (6.7082e+00) Acc@1 1.56 ( 0.46) Acc@5 2.73 ( 1.95) +Epoch: [0][1550/5004] Time 0.249 ( 0.241) Data 0.021 ( 0.029) Loss 6.1779e+00 (6.7079e+00) Acc@1 1.56 ( 0.46) Acc@5 5.47 ( 1.95) +Epoch: [0][1551/5004] Time 0.246 ( 0.241) Data 0.021 ( 0.029) Loss 6.1738e+00 (6.7075e+00) Acc@1 0.78 ( 0.46) Acc@5 5.08 ( 1.95) +Epoch: [0][1552/5004] Time 0.248 ( 0.241) Data 0.023 ( 0.029) Loss 6.1556e+00 (6.7072e+00) Acc@1 2.34 ( 0.46) Acc@5 7.42 ( 1.95) +Epoch: [0][1553/5004] Time 0.248 ( 0.241) Data 0.023 ( 0.029) Loss 6.2513e+00 (6.7069e+00) Acc@1 0.78 ( 0.46) Acc@5 5.08 ( 1.96) +Epoch: [0][1554/5004] Time 0.249 ( 0.241) Data 0.022 ( 0.029) Loss 6.2816e+00 (6.7066e+00) Acc@1 1.17 ( 0.46) Acc@5 3.91 ( 1.96) +Epoch: [0][1555/5004] Time 0.249 ( 0.241) Data 0.023 ( 0.029) Loss 6.2324e+00 (6.7063e+00) Acc@1 1.17 ( 0.46) Acc@5 5.86 ( 1.96) +Epoch: [0][1556/5004] Time 0.255 ( 0.241) Data 0.021 ( 0.029) Loss 6.2892e+00 (6.7060e+00) Acc@1 0.39 ( 0.46) Acc@5 1.95 ( 1.96) +Epoch: [0][1557/5004] Time 0.256 ( 0.241) Data 0.020 ( 0.029) Loss 6.2318e+00 (6.7057e+00) Acc@1 0.78 ( 0.46) Acc@5 5.08 ( 1.96) +Epoch: [0][1558/5004] Time 0.248 ( 0.241) Data 0.019 ( 0.029) Loss 6.1499e+00 (6.7054e+00) Acc@1 1.17 ( 0.46) Acc@5 7.03 ( 1.97) +Epoch: [0][1559/5004] Time 0.261 ( 0.241) Data 0.022 ( 0.029) Loss 6.1558e+00 (6.7050e+00) Acc@1 1.95 ( 0.46) Acc@5 6.64 ( 1.97) +Epoch: [0][1560/5004] Time 0.250 ( 0.241) Data 0.018 ( 0.029) Loss 6.3091e+00 (6.7048e+00) Acc@1 1.95 ( 0.46) Acc@5 7.81 ( 1.97) +Epoch: [0][1561/5004] Time 0.253 ( 0.241) Data 0.021 ( 0.029) Loss 6.1107e+00 (6.7044e+00) Acc@1 1.17 ( 0.46) Acc@5 7.03 ( 1.98) +Epoch: [0][1562/5004] Time 0.254 ( 0.241) Data 0.020 ( 0.029) Loss 6.3404e+00 (6.7042e+00) Acc@1 0.39 ( 0.46) Acc@5 3.12 ( 1.98) +Epoch: [0][1563/5004] Time 0.247 ( 0.241) Data 0.020 ( 0.029) Loss 6.2402e+00 (6.7039e+00) Acc@1 1.56 ( 0.46) Acc@5 3.91 ( 1.98) +Epoch: [0][1564/5004] Time 0.249 ( 0.241) Data 0.022 ( 0.029) Loss 6.2289e+00 (6.7036e+00) Acc@1 0.78 ( 
0.46) Acc@5 4.30 ( 1.98) +Epoch: [0][1565/5004] Time 0.251 ( 0.241) Data 0.022 ( 0.029) Loss 6.3021e+00 (6.7033e+00) Acc@1 0.39 ( 0.46) Acc@5 5.08 ( 1.98) +Epoch: [0][1566/5004] Time 0.252 ( 0.241) Data 0.020 ( 0.029) Loss 6.2117e+00 (6.7030e+00) Acc@1 1.17 ( 0.46) Acc@5 5.86 ( 1.98) +Epoch: [0][1567/5004] Time 0.252 ( 0.241) Data 0.020 ( 0.029) Loss 6.1836e+00 (6.7027e+00) Acc@1 1.17 ( 0.46) Acc@5 5.08 ( 1.99) +Epoch: [0][1568/5004] Time 0.246 ( 0.241) Data 0.020 ( 0.029) Loss 6.2755e+00 (6.7024e+00) Acc@1 1.95 ( 0.46) Acc@5 4.69 ( 1.99) +Epoch: [0][1569/5004] Time 0.248 ( 0.241) Data 0.022 ( 0.029) Loss 6.2979e+00 (6.7021e+00) Acc@1 0.78 ( 0.47) Acc@5 4.69 ( 1.99) +Epoch: [0][1570/5004] Time 0.249 ( 0.241) Data 0.022 ( 0.029) Loss 6.2580e+00 (6.7018e+00) Acc@1 0.00 ( 0.46) Acc@5 5.08 ( 1.99) +Epoch: [0][1571/5004] Time 0.250 ( 0.241) Data 0.022 ( 0.029) Loss 6.0091e+00 (6.7014e+00) Acc@1 1.17 ( 0.47) Acc@5 7.42 ( 1.99) +Epoch: [0][1572/5004] Time 0.248 ( 0.241) Data 0.021 ( 0.029) Loss 6.1477e+00 (6.7011e+00) Acc@1 1.17 ( 0.47) Acc@5 6.25 ( 2.00) +Epoch: [0][1573/5004] Time 0.246 ( 0.241) Data 0.022 ( 0.029) Loss 6.2548e+00 (6.7008e+00) Acc@1 1.17 ( 0.47) Acc@5 4.69 ( 2.00) +Epoch: [0][1574/5004] Time 0.253 ( 0.241) Data 0.023 ( 0.029) Loss 6.2265e+00 (6.7005e+00) Acc@1 0.78 ( 0.47) Acc@5 5.08 ( 2.00) +Epoch: [0][1575/5004] Time 0.243 ( 0.241) Data 0.018 ( 0.029) Loss 6.1972e+00 (6.7002e+00) Acc@1 0.39 ( 0.47) Acc@5 4.69 ( 2.00) +Epoch: [0][1576/5004] Time 0.248 ( 0.241) Data 0.023 ( 0.029) Loss 6.2239e+00 (6.6998e+00) Acc@1 1.17 ( 0.47) Acc@5 5.86 ( 2.00) +Epoch: [0][1577/5004] Time 0.249 ( 0.241) Data 0.023 ( 0.029) Loss 6.2418e+00 (6.6996e+00) Acc@1 0.39 ( 0.47) Acc@5 6.25 ( 2.01) +Epoch: [0][1578/5004] Time 0.250 ( 0.241) Data 0.022 ( 0.029) Loss 6.1506e+00 (6.6992e+00) Acc@1 1.56 ( 0.47) Acc@5 7.03 ( 2.01) +Epoch: [0][1579/5004] Time 0.249 ( 0.241) Data 0.023 ( 0.029) Loss 6.2323e+00 (6.6989e+00) Acc@1 0.78 ( 0.47) Acc@5 7.42 ( 2.01) +Epoch: [0][1580/5004] 
Time 0.248 ( 0.241) Data 0.022 ( 0.029) Loss 6.1737e+00 (6.6986e+00) Acc@1 1.95 ( 0.47) Acc@5 6.25 ( 2.02) +Epoch: [0][1581/5004] Time 0.254 ( 0.241) Data 0.024 ( 0.029) Loss 6.1320e+00 (6.6982e+00) Acc@1 1.17 ( 0.47) Acc@5 5.47 ( 2.02) +Epoch: [0][1582/5004] Time 0.247 ( 0.241) Data 0.020 ( 0.029) Loss 6.2560e+00 (6.6979e+00) Acc@1 0.78 ( 0.47) Acc@5 4.30 ( 2.02) +Epoch: [0][1583/5004] Time 0.250 ( 0.241) Data 0.022 ( 0.029) Loss 6.1821e+00 (6.6976e+00) Acc@1 1.17 ( 0.47) Acc@5 4.30 ( 2.02) +Epoch: [0][1584/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.029) Loss 6.1980e+00 (6.6973e+00) Acc@1 2.34 ( 0.47) Acc@5 7.03 ( 2.02) +Epoch: [0][1585/5004] Time 0.248 ( 0.241) Data 0.020 ( 0.029) Loss 6.3082e+00 (6.6971e+00) Acc@1 1.17 ( 0.47) Acc@5 4.30 ( 2.03) +Epoch: [0][1586/5004] Time 0.249 ( 0.241) Data 0.021 ( 0.029) Loss 6.1652e+00 (6.6967e+00) Acc@1 1.95 ( 0.47) Acc@5 4.69 ( 2.03) +Epoch: [0][1587/5004] Time 0.252 ( 0.241) Data 0.020 ( 0.029) Loss 6.1305e+00 (6.6964e+00) Acc@1 1.95 ( 0.47) Acc@5 8.20 ( 2.03) +Epoch: [0][1588/5004] Time 0.249 ( 0.241) Data 0.019 ( 0.029) Loss 6.2058e+00 (6.6961e+00) Acc@1 2.34 ( 0.47) Acc@5 6.25 ( 2.03) +Epoch: [0][1589/5004] Time 0.252 ( 0.241) Data 0.019 ( 0.029) Loss 6.2434e+00 (6.6958e+00) Acc@1 0.78 ( 0.47) Acc@5 4.69 ( 2.04) +Epoch: [0][1590/5004] Time 0.247 ( 0.241) Data 0.018 ( 0.029) Loss 6.2432e+00 (6.6955e+00) Acc@1 1.17 ( 0.47) Acc@5 3.91 ( 2.04) +Epoch: [0][1591/5004] Time 0.253 ( 0.241) Data 0.020 ( 0.029) Loss 6.1922e+00 (6.6952e+00) Acc@1 2.73 ( 0.48) Acc@5 6.64 ( 2.04) +Epoch: [0][1592/5004] Time 0.246 ( 0.241) Data 0.018 ( 0.029) Loss 6.2959e+00 (6.6949e+00) Acc@1 1.17 ( 0.48) Acc@5 5.86 ( 2.04) +Epoch: [0][1593/5004] Time 0.250 ( 0.241) Data 0.020 ( 0.029) Loss 6.3707e+00 (6.6947e+00) Acc@1 1.17 ( 0.48) Acc@5 2.34 ( 2.04) +Epoch: [0][1594/5004] Time 0.246 ( 0.241) Data 0.019 ( 0.029) Loss 6.2363e+00 (6.6944e+00) Acc@1 2.34 ( 0.48) Acc@5 5.86 ( 2.04) +Epoch: [0][1595/5004] Time 0.248 ( 0.241) Data 0.020 ( 0.029) Loss 
6.1636e+00 (6.6941e+00) Acc@1 2.34 ( 0.48) Acc@5 5.08 ( 2.05) +Epoch: [0][1596/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.029) Loss 6.1462e+00 (6.6938e+00) Acc@1 2.34 ( 0.48) Acc@5 8.98 ( 2.05) +Epoch: [0][1597/5004] Time 0.248 ( 0.241) Data 0.020 ( 0.029) Loss 6.2544e+00 (6.6935e+00) Acc@1 2.34 ( 0.48) Acc@5 6.64 ( 2.05) +Epoch: [0][1598/5004] Time 0.251 ( 0.241) Data 0.020 ( 0.029) Loss 6.3015e+00 (6.6932e+00) Acc@1 1.95 ( 0.48) Acc@5 5.08 ( 2.06) +Epoch: [0][1599/5004] Time 0.247 ( 0.241) Data 0.019 ( 0.029) Loss 6.3610e+00 (6.6930e+00) Acc@1 0.00 ( 0.48) Acc@5 3.91 ( 2.06) +Epoch: [0][1600/5004] Time 0.248 ( 0.241) Data 0.020 ( 0.029) Loss 6.2046e+00 (6.6927e+00) Acc@1 1.56 ( 0.48) Acc@5 5.08 ( 2.06) +Epoch: [0][1601/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.029) Loss 6.3110e+00 (6.6925e+00) Acc@1 0.78 ( 0.48) Acc@5 3.12 ( 2.06) +Epoch: [0][1602/5004] Time 0.248 ( 0.241) Data 0.020 ( 0.029) Loss 6.1909e+00 (6.6922e+00) Acc@1 1.17 ( 0.48) Acc@5 5.86 ( 2.06) +Epoch: [0][1603/5004] Time 0.254 ( 0.241) Data 0.020 ( 0.029) Loss 6.1586e+00 (6.6918e+00) Acc@1 1.95 ( 0.48) Acc@5 7.03 ( 2.07) +Epoch: [0][1604/5004] Time 0.251 ( 0.241) Data 0.020 ( 0.029) Loss 6.2207e+00 (6.6915e+00) Acc@1 0.39 ( 0.48) Acc@5 6.25 ( 2.07) +Epoch: [0][1605/5004] Time 0.251 ( 0.241) Data 0.020 ( 0.029) Loss 6.2503e+00 (6.6913e+00) Acc@1 0.78 ( 0.48) Acc@5 2.73 ( 2.07) +Epoch: [0][1606/5004] Time 0.247 ( 0.241) Data 0.019 ( 0.029) Loss 6.0979e+00 (6.6909e+00) Acc@1 1.95 ( 0.49) Acc@5 5.47 ( 2.07) +Epoch: [0][1607/5004] Time 0.246 ( 0.241) Data 0.020 ( 0.029) Loss 6.3842e+00 (6.6907e+00) Acc@1 0.39 ( 0.49) Acc@5 5.47 ( 2.07) +Epoch: [0][1608/5004] Time 0.251 ( 0.241) Data 0.020 ( 0.029) Loss 6.3328e+00 (6.6905e+00) Acc@1 0.78 ( 0.49) Acc@5 3.91 ( 2.07) +Epoch: [0][1609/5004] Time 0.248 ( 0.241) Data 0.019 ( 0.028) Loss 6.2704e+00 (6.6902e+00) Acc@1 2.34 ( 0.49) Acc@5 6.25 ( 2.08) +Epoch: [0][1610/5004] Time 0.247 ( 0.241) Data 0.020 ( 0.028) Loss 6.2203e+00 (6.6899e+00) Acc@1 1.95 ( 0.49) Acc@5 
5.47 ( 2.08) +Epoch: [0][1611/5004] Time 0.250 ( 0.241) Data 0.021 ( 0.028) Loss 6.2013e+00 (6.6896e+00) Acc@1 2.34 ( 0.49) Acc@5 5.08 ( 2.08) +Epoch: [0][1612/5004] Time 0.249 ( 0.241) Data 0.019 ( 0.028) Loss 6.2946e+00 (6.6894e+00) Acc@1 1.95 ( 0.49) Acc@5 3.91 ( 2.08) +Epoch: [0][1613/5004] Time 0.253 ( 0.241) Data 0.021 ( 0.028) Loss 6.1973e+00 (6.6891e+00) Acc@1 2.34 ( 0.49) Acc@5 5.08 ( 2.08) +Epoch: [0][1614/5004] Time 0.248 ( 0.241) Data 0.016 ( 0.028) Loss 6.1691e+00 (6.6888e+00) Acc@1 1.17 ( 0.49) Acc@5 4.69 ( 2.08) +Epoch: [0][1615/5004] Time 0.247 ( 0.241) Data 0.018 ( 0.028) Loss 6.3083e+00 (6.6885e+00) Acc@1 2.34 ( 0.49) Acc@5 5.47 ( 2.09) +Epoch: [0][1616/5004] Time 0.247 ( 0.241) Data 0.019 ( 0.028) Loss 6.2538e+00 (6.6883e+00) Acc@1 2.34 ( 0.49) Acc@5 5.86 ( 2.09) +Epoch: [0][1617/5004] Time 0.247 ( 0.241) Data 0.020 ( 0.028) Loss 6.1893e+00 (6.6879e+00) Acc@1 0.78 ( 0.49) Acc@5 4.69 ( 2.09) +Epoch: [0][1618/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.028) Loss 6.3059e+00 (6.6877e+00) Acc@1 1.17 ( 0.49) Acc@5 4.30 ( 2.09) +Epoch: [0][1619/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.028) Loss 6.3325e+00 (6.6875e+00) Acc@1 1.95 ( 0.50) Acc@5 4.69 ( 2.09) +Epoch: [0][1620/5004] Time 0.252 ( 0.241) Data 0.021 ( 0.028) Loss 6.1155e+00 (6.6871e+00) Acc@1 0.78 ( 0.50) Acc@5 5.86 ( 2.10) +Epoch: [0][1621/5004] Time 0.250 ( 0.241) Data 0.019 ( 0.028) Loss 6.3222e+00 (6.6869e+00) Acc@1 1.17 ( 0.50) Acc@5 4.30 ( 2.10) +Epoch: [0][1622/5004] Time 0.248 ( 0.241) Data 0.018 ( 0.028) Loss 6.2158e+00 (6.6866e+00) Acc@1 0.78 ( 0.50) Acc@5 4.30 ( 2.10) +Epoch: [0][1623/5004] Time 0.253 ( 0.241) Data 0.019 ( 0.028) Loss 6.2599e+00 (6.6864e+00) Acc@1 0.78 ( 0.50) Acc@5 3.12 ( 2.10) +Epoch: [0][1624/5004] Time 0.259 ( 0.241) Data 0.019 ( 0.028) Loss 6.2362e+00 (6.6861e+00) Acc@1 1.56 ( 0.50) Acc@5 6.64 ( 2.10) +Epoch: [0][1625/5004] Time 0.244 ( 0.241) Data 0.015 ( 0.028) Loss 6.2405e+00 (6.6858e+00) Acc@1 0.78 ( 0.50) Acc@5 3.91 ( 2.10) +Epoch: [0][1626/5004] Time 0.254 ( 
0.241) Data 0.020 ( 0.028) Loss 6.2532e+00 (6.6855e+00) Acc@1 1.56 ( 0.50) Acc@5 5.08 ( 2.11) +Epoch: [0][1627/5004] Time 0.246 ( 0.241) Data 0.015 ( 0.028) Loss 6.2763e+00 (6.6853e+00) Acc@1 1.17 ( 0.50) Acc@5 3.91 ( 2.11) +Epoch: [0][1628/5004] Time 0.248 ( 0.241) Data 0.019 ( 0.028) Loss 6.1422e+00 (6.6850e+00) Acc@1 1.95 ( 0.50) Acc@5 7.42 ( 2.11) +Epoch: [0][1629/5004] Time 0.253 ( 0.241) Data 0.020 ( 0.028) Loss 6.2447e+00 (6.6847e+00) Acc@1 0.78 ( 0.50) Acc@5 4.69 ( 2.11) +Epoch: [0][1630/5004] Time 0.244 ( 0.241) Data 0.016 ( 0.028) Loss 6.2154e+00 (6.6844e+00) Acc@1 0.78 ( 0.50) Acc@5 5.47 ( 2.11) +Epoch: [0][1631/5004] Time 0.254 ( 0.241) Data 0.021 ( 0.028) Loss 6.1191e+00 (6.6841e+00) Acc@1 1.17 ( 0.50) Acc@5 7.03 ( 2.12) +Epoch: [0][1632/5004] Time 0.260 ( 0.241) Data 0.018 ( 0.028) Loss 6.1192e+00 (6.6837e+00) Acc@1 3.52 ( 0.50) Acc@5 8.20 ( 2.12) +Epoch: [0][1633/5004] Time 0.242 ( 0.241) Data 0.013 ( 0.028) Loss 6.1314e+00 (6.6834e+00) Acc@1 1.95 ( 0.50) Acc@5 5.47 ( 2.12) +Epoch: [0][1634/5004] Time 0.248 ( 0.241) Data 0.020 ( 0.028) Loss 6.2342e+00 (6.6831e+00) Acc@1 1.17 ( 0.50) Acc@5 5.47 ( 2.12) +Epoch: [0][1635/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.028) Loss 6.2519e+00 (6.6828e+00) Acc@1 3.52 ( 0.50) Acc@5 9.38 ( 2.13) +Epoch: [0][1636/5004] Time 0.250 ( 0.241) Data 0.019 ( 0.028) Loss 6.1882e+00 (6.6825e+00) Acc@1 1.17 ( 0.51) Acc@5 4.69 ( 2.13) +Epoch: [0][1637/5004] Time 0.254 ( 0.241) Data 0.020 ( 0.028) Loss 6.2924e+00 (6.6823e+00) Acc@1 1.56 ( 0.51) Acc@5 6.25 ( 2.13) +Epoch: [0][1638/5004] Time 0.246 ( 0.241) Data 0.015 ( 0.028) Loss 6.1657e+00 (6.6820e+00) Acc@1 0.78 ( 0.51) Acc@5 6.25 ( 2.13) +Epoch: [0][1639/5004] Time 0.253 ( 0.241) Data 0.019 ( 0.028) Loss 6.1998e+00 (6.6817e+00) Acc@1 0.78 ( 0.51) Acc@5 2.73 ( 2.14) +Epoch: [0][1640/5004] Time 0.259 ( 0.241) Data 0.018 ( 0.028) Loss 6.1405e+00 (6.6814e+00) Acc@1 0.78 ( 0.51) Acc@5 4.30 ( 2.14) +Epoch: [0][1641/5004] Time 0.249 ( 0.241) Data 0.019 ( 0.028) Loss 6.2610e+00 
(6.6811e+00) Acc@1 0.39 ( 0.51) Acc@5 5.08 ( 2.14) +Epoch: [0][1642/5004] Time 0.249 ( 0.241) Data 0.019 ( 0.028) Loss 6.1499e+00 (6.6808e+00) Acc@1 1.95 ( 0.51) Acc@5 5.47 ( 2.14) +Epoch: [0][1643/5004] Time 0.246 ( 0.241) Data 0.019 ( 0.028) Loss 6.3120e+00 (6.6806e+00) Acc@1 0.39 ( 0.51) Acc@5 6.25 ( 2.14) +Epoch: [0][1644/5004] Time 0.250 ( 0.241) Data 0.021 ( 0.028) Loss 6.2073e+00 (6.6803e+00) Acc@1 1.95 ( 0.51) Acc@5 4.69 ( 2.14) +Epoch: [0][1645/5004] Time 0.248 ( 0.241) Data 0.018 ( 0.028) Loss 6.1608e+00 (6.6799e+00) Acc@1 0.78 ( 0.51) Acc@5 3.91 ( 2.15) +Epoch: [0][1646/5004] Time 0.248 ( 0.241) Data 0.019 ( 0.028) Loss 6.1852e+00 (6.6796e+00) Acc@1 1.56 ( 0.51) Acc@5 3.91 ( 2.15) +Epoch: [0][1647/5004] Time 0.250 ( 0.241) Data 0.020 ( 0.028) Loss 6.3017e+00 (6.6794e+00) Acc@1 1.95 ( 0.51) Acc@5 5.86 ( 2.15) +Epoch: [0][1648/5004] Time 0.251 ( 0.241) Data 0.020 ( 0.028) Loss 6.1838e+00 (6.6791e+00) Acc@1 1.95 ( 0.51) Acc@5 5.47 ( 2.15) +Epoch: [0][1649/5004] Time 0.250 ( 0.241) Data 0.019 ( 0.028) Loss 6.2170e+00 (6.6788e+00) Acc@1 0.78 ( 0.51) Acc@5 3.91 ( 2.15) +Epoch: [0][1650/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.028) Loss 6.2075e+00 (6.6786e+00) Acc@1 0.39 ( 0.51) Acc@5 5.47 ( 2.15) +Epoch: [0][1651/5004] Time 0.252 ( 0.241) Data 0.020 ( 0.028) Loss 6.1706e+00 (6.6782e+00) Acc@1 3.12 ( 0.51) Acc@5 7.42 ( 2.16) +Epoch: [0][1652/5004] Time 0.250 ( 0.241) Data 0.019 ( 0.028) Loss 6.1297e+00 (6.6779e+00) Acc@1 1.56 ( 0.51) Acc@5 5.86 ( 2.16) +Epoch: [0][1653/5004] Time 0.255 ( 0.241) Data 0.019 ( 0.028) Loss 6.2389e+00 (6.6776e+00) Acc@1 1.56 ( 0.51) Acc@5 5.86 ( 2.16) +Epoch: [0][1654/5004] Time 0.252 ( 0.241) Data 0.019 ( 0.028) Loss 6.1402e+00 (6.6773e+00) Acc@1 1.95 ( 0.51) Acc@5 6.64 ( 2.16) +Epoch: [0][1655/5004] Time 0.248 ( 0.241) Data 0.018 ( 0.028) Loss 6.2667e+00 (6.6771e+00) Acc@1 1.56 ( 0.51) Acc@5 4.69 ( 2.17) +Epoch: [0][1656/5004] Time 0.249 ( 0.241) Data 0.019 ( 0.028) Loss 6.2832e+00 (6.6768e+00) Acc@1 1.95 ( 0.52) Acc@5 5.08 ( 
2.17) +Epoch: [0][1657/5004] Time 0.247 ( 0.241) Data 0.020 ( 0.028) Loss 6.1954e+00 (6.6765e+00) Acc@1 0.00 ( 0.52) Acc@5 5.08 ( 2.17) +Epoch: [0][1658/5004] Time 0.252 ( 0.241) Data 0.020 ( 0.028) Loss 6.1516e+00 (6.6762e+00) Acc@1 0.39 ( 0.52) Acc@5 4.69 ( 2.17) +Epoch: [0][1659/5004] Time 0.247 ( 0.241) Data 0.019 ( 0.028) Loss 6.2377e+00 (6.6760e+00) Acc@1 0.78 ( 0.52) Acc@5 5.08 ( 2.17) +Epoch: [0][1660/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.028) Loss 6.1120e+00 (6.6756e+00) Acc@1 0.78 ( 0.52) Acc@5 7.81 ( 2.18) +Epoch: [0][1661/5004] Time 0.250 ( 0.241) Data 0.020 ( 0.028) Loss 6.3711e+00 (6.6754e+00) Acc@1 0.39 ( 0.52) Acc@5 3.12 ( 2.18) +Epoch: [0][1662/5004] Time 0.248 ( 0.241) Data 0.020 ( 0.028) Loss 6.0187e+00 (6.6750e+00) Acc@1 1.17 ( 0.52) Acc@5 7.42 ( 2.18) +Epoch: [0][1663/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.028) Loss 6.2844e+00 (6.6748e+00) Acc@1 1.95 ( 0.52) Acc@5 6.25 ( 2.18) +Epoch: [0][1664/5004] Time 0.248 ( 0.241) Data 0.020 ( 0.028) Loss 6.1311e+00 (6.6745e+00) Acc@1 1.56 ( 0.52) Acc@5 5.47 ( 2.18) +Epoch: [0][1665/5004] Time 0.255 ( 0.241) Data 0.020 ( 0.028) Loss 6.1943e+00 (6.6742e+00) Acc@1 1.17 ( 0.52) Acc@5 6.64 ( 2.19) +Epoch: [0][1666/5004] Time 0.252 ( 0.241) Data 0.019 ( 0.028) Loss 6.1019e+00 (6.6739e+00) Acc@1 1.56 ( 0.52) Acc@5 5.86 ( 2.19) +Epoch: [0][1667/5004] Time 0.251 ( 0.241) Data 0.019 ( 0.028) Loss 6.2087e+00 (6.6736e+00) Acc@1 1.17 ( 0.52) Acc@5 6.25 ( 2.19) +Epoch: [0][1668/5004] Time 0.251 ( 0.241) Data 0.019 ( 0.028) Loss 6.0433e+00 (6.6732e+00) Acc@1 1.17 ( 0.52) Acc@5 6.64 ( 2.19) +Epoch: [0][1669/5004] Time 0.248 ( 0.241) Data 0.019 ( 0.028) Loss 6.1208e+00 (6.6729e+00) Acc@1 2.34 ( 0.52) Acc@5 7.81 ( 2.20) +Epoch: [0][1670/5004] Time 0.251 ( 0.241) Data 0.020 ( 0.028) Loss 6.1447e+00 (6.6726e+00) Acc@1 2.34 ( 0.52) Acc@5 7.03 ( 2.20) +Epoch: [0][1671/5004] Time 0.248 ( 0.241) Data 0.020 ( 0.028) Loss 6.2045e+00 (6.6723e+00) Acc@1 0.78 ( 0.52) Acc@5 6.25 ( 2.20) +Epoch: [0][1672/5004] Time 0.248 ( 0.241) 
Data 0.020 ( 0.028) Loss 6.1229e+00 (6.6719e+00) Acc@1 3.91 ( 0.52) Acc@5 8.59 ( 2.21) +Epoch: [0][1673/5004] Time 0.248 ( 0.241) Data 0.020 ( 0.028) Loss 6.1547e+00 (6.6716e+00) Acc@1 1.17 ( 0.52) Acc@5 5.47 ( 2.21) +Epoch: [0][1674/5004] Time 0.249 ( 0.241) Data 0.020 ( 0.028) Loss 6.2099e+00 (6.6714e+00) Acc@1 2.34 ( 0.53) Acc@5 3.91 ( 2.21) +Epoch: [0][1675/5004] Time 0.246 ( 0.241) Data 0.020 ( 0.028) Loss 6.2601e+00 (6.6711e+00) Acc@1 0.78 ( 0.53) Acc@5 5.08 ( 2.21) +Epoch: [0][1676/5004] Time 0.249 ( 0.241) Data 0.021 ( 0.028) Loss 6.0653e+00 (6.6708e+00) Acc@1 1.95 ( 0.53) Acc@5 7.03 ( 2.21) +Epoch: [0][1677/5004] Time 0.232 ( 0.241) Data 0.020 ( 0.028) Loss 6.1612e+00 (6.6705e+00) Acc@1 0.39 ( 0.53) Acc@5 3.52 ( 2.22) +Epoch: [0][1678/5004] Time 0.242 ( 0.241) Data 0.030 ( 0.028) Loss 6.1563e+00 (6.6701e+00) Acc@1 1.17 ( 0.53) Acc@5 6.25 ( 2.22) +Epoch: [0][1679/5004] Time 0.242 ( 0.241) Data 0.030 ( 0.028) Loss 6.1880e+00 (6.6699e+00) Acc@1 1.17 ( 0.53) Acc@5 7.42 ( 2.22) +Epoch: [0][1680/5004] Time 0.244 ( 0.241) Data 0.030 ( 0.028) Loss 6.2453e+00 (6.6696e+00) Acc@1 1.56 ( 0.53) Acc@5 5.08 ( 2.22) +Epoch: [0][1681/5004] Time 0.244 ( 0.241) Data 0.030 ( 0.028) Loss 6.0937e+00 (6.6693e+00) Acc@1 2.73 ( 0.53) Acc@5 7.42 ( 2.23) +Epoch: [0][1682/5004] Time 0.241 ( 0.241) Data 0.027 ( 0.028) Loss 6.2553e+00 (6.6690e+00) Acc@1 0.78 ( 0.53) Acc@5 4.69 ( 2.23) +Epoch: [0][1683/5004] Time 0.243 ( 0.241) Data 0.030 ( 0.028) Loss 6.1586e+00 (6.6687e+00) Acc@1 0.78 ( 0.53) Acc@5 4.69 ( 2.23) +Epoch: [0][1684/5004] Time 0.239 ( 0.241) Data 0.029 ( 0.028) Loss 6.2422e+00 (6.6685e+00) Acc@1 1.95 ( 0.53) Acc@5 5.47 ( 2.23) +Epoch: [0][1685/5004] Time 0.243 ( 0.241) Data 0.030 ( 0.028) Loss 6.2275e+00 (6.6682e+00) Acc@1 1.17 ( 0.53) Acc@5 5.08 ( 2.23) +Epoch: [0][1686/5004] Time 0.245 ( 0.241) Data 0.030 ( 0.028) Loss 6.2906e+00 (6.6680e+00) Acc@1 1.17 ( 0.53) Acc@5 3.52 ( 2.23) +Epoch: [0][1687/5004] Time 0.241 ( 0.241) Data 0.029 ( 0.028) Loss 6.2221e+00 (6.6677e+00) 
Acc@1 0.78 ( 0.53) Acc@5 5.08 ( 2.23) +Epoch: [0][1688/5004] Time 0.246 ( 0.241) Data 0.030 ( 0.028) Loss 6.0901e+00 (6.6674e+00) Acc@1 3.52 ( 0.53) Acc@5 7.81 ( 2.24) +Epoch: [0][1689/5004] Time 0.241 ( 0.241) Data 0.029 ( 0.028) Loss 6.1900e+00 (6.6671e+00) Acc@1 0.78 ( 0.53) Acc@5 5.47 ( 2.24) +Epoch: [0][1690/5004] Time 0.256 ( 0.241) Data 0.030 ( 0.028) Loss 6.1244e+00 (6.6668e+00) Acc@1 2.34 ( 0.53) Acc@5 5.86 ( 2.24) +Epoch: [0][1691/5004] Time 0.242 ( 0.241) Data 0.023 ( 0.028) Loss 6.1125e+00 (6.6664e+00) Acc@1 2.73 ( 0.54) Acc@5 6.25 ( 2.24) +Epoch: [0][1692/5004] Time 0.243 ( 0.241) Data 0.027 ( 0.028) Loss 6.1855e+00 (6.6662e+00) Acc@1 2.34 ( 0.54) Acc@5 6.25 ( 2.25) +Epoch: [0][1693/5004] Time 0.242 ( 0.241) Data 0.028 ( 0.028) Loss 6.1510e+00 (6.6658e+00) Acc@1 2.73 ( 0.54) Acc@5 5.08 ( 2.25) +Epoch: [0][1694/5004] Time 0.245 ( 0.241) Data 0.029 ( 0.028) Loss 6.1798e+00 (6.6656e+00) Acc@1 1.17 ( 0.54) Acc@5 5.86 ( 2.25) +Epoch: [0][1695/5004] Time 0.242 ( 0.241) Data 0.028 ( 0.028) Loss 6.1608e+00 (6.6653e+00) Acc@1 0.78 ( 0.54) Acc@5 7.03 ( 2.25) +Epoch: [0][1696/5004] Time 0.242 ( 0.241) Data 0.030 ( 0.028) Loss 6.0458e+00 (6.6649e+00) Acc@1 3.91 ( 0.54) Acc@5 11.33 ( 2.26) +Epoch: [0][1697/5004] Time 0.243 ( 0.241) Data 0.030 ( 0.028) Loss 6.1492e+00 (6.6646e+00) Acc@1 2.34 ( 0.54) Acc@5 4.69 ( 2.26) +Epoch: [0][1698/5004] Time 0.245 ( 0.241) Data 0.029 ( 0.028) Loss 6.0680e+00 (6.6642e+00) Acc@1 2.34 ( 0.54) Acc@5 6.64 ( 2.26) +Epoch: [0][1699/5004] Time 0.249 ( 0.241) Data 0.029 ( 0.028) Loss 6.2087e+00 (6.6640e+00) Acc@1 1.17 ( 0.54) Acc@5 3.91 ( 2.26) +Epoch: [0][1700/5004] Time 0.241 ( 0.241) Data 0.028 ( 0.028) Loss 6.1310e+00 (6.6637e+00) Acc@1 1.17 ( 0.54) Acc@5 5.47 ( 2.27) +Epoch: [0][1701/5004] Time 0.243 ( 0.241) Data 0.030 ( 0.028) Loss 6.1316e+00 (6.6634e+00) Acc@1 2.73 ( 0.54) Acc@5 7.03 ( 2.27) +Epoch: [0][1702/5004] Time 0.246 ( 0.241) Data 0.029 ( 0.028) Loss 6.1747e+00 (6.6631e+00) Acc@1 1.95 ( 0.54) Acc@5 4.69 ( 2.27) +Epoch: 
[0][1703/5004] Time 0.243 ( 0.241) Data 0.028 ( 0.028) Loss 6.1172e+00 (6.6627e+00) Acc@1 1.95 ( 0.55) Acc@5 6.64 ( 2.27) +Epoch: [0][1704/5004] Time 0.241 ( 0.241) Data 0.029 ( 0.028) Loss 6.1110e+00 (6.6624e+00) Acc@1 2.34 ( 0.55) Acc@5 8.20 ( 2.28) +Epoch: [0][1705/5004] Time 0.243 ( 0.241) Data 0.030 ( 0.028) Loss 6.1383e+00 (6.6621e+00) Acc@1 1.95 ( 0.55) Acc@5 4.69 ( 2.28) +Epoch: [0][1706/5004] Time 0.242 ( 0.241) Data 0.030 ( 0.028) Loss 6.1165e+00 (6.6618e+00) Acc@1 1.17 ( 0.55) Acc@5 7.03 ( 2.28) +Epoch: [0][1707/5004] Time 0.248 ( 0.241) Data 0.029 ( 0.028) Loss 6.0949e+00 (6.6615e+00) Acc@1 3.52 ( 0.55) Acc@5 9.77 ( 2.28) +Epoch: [0][1708/5004] Time 0.246 ( 0.241) Data 0.029 ( 0.028) Loss 6.2561e+00 (6.6612e+00) Acc@1 1.17 ( 0.55) Acc@5 3.91 ( 2.29) +Epoch: [0][1709/5004] Time 0.245 ( 0.241) Data 0.029 ( 0.028) Loss 6.2271e+00 (6.6610e+00) Acc@1 0.78 ( 0.55) Acc@5 3.52 ( 2.29) +Epoch: [0][1710/5004] Time 0.240 ( 0.241) Data 0.028 ( 0.028) Loss 6.1370e+00 (6.6607e+00) Acc@1 0.78 ( 0.55) Acc@5 6.25 ( 2.29) +Epoch: [0][1711/5004] Time 0.244 ( 0.242) Data 0.030 ( 0.028) Loss 6.1528e+00 (6.6604e+00) Acc@1 1.95 ( 0.55) Acc@5 8.59 ( 2.29) +Epoch: [0][1712/5004] Time 0.241 ( 0.242) Data 0.028 ( 0.028) Loss 6.1146e+00 (6.6600e+00) Acc@1 1.17 ( 0.55) Acc@5 7.03 ( 2.29) +Epoch: [0][1713/5004] Time 0.243 ( 0.242) Data 0.029 ( 0.028) Loss 6.1604e+00 (6.6598e+00) Acc@1 1.56 ( 0.55) Acc@5 6.64 ( 2.30) +Epoch: [0][1714/5004] Time 0.242 ( 0.242) Data 0.029 ( 0.028) Loss 6.1704e+00 (6.6595e+00) Acc@1 0.78 ( 0.55) Acc@5 3.91 ( 2.30) +Epoch: [0][1715/5004] Time 0.242 ( 0.242) Data 0.029 ( 0.028) Loss 6.1208e+00 (6.6592e+00) Acc@1 0.78 ( 0.55) Acc@5 5.86 ( 2.30) +Epoch: [0][1716/5004] Time 0.243 ( 0.242) Data 0.029 ( 0.028) Loss 6.2547e+00 (6.6589e+00) Acc@1 1.56 ( 0.55) Acc@5 4.69 ( 2.30) +Epoch: [0][1717/5004] Time 0.248 ( 0.242) Data 0.029 ( 0.028) Loss 6.1287e+00 (6.6586e+00) Acc@1 1.17 ( 0.55) Acc@5 7.81 ( 2.30) +Epoch: [0][1718/5004] Time 0.257 ( 0.242) Data 0.028 ( 
0.028) Loss 6.1600e+00 (6.6583e+00) Acc@1 2.34 ( 0.55) Acc@5 6.64 ( 2.31)
+Epoch: [0][1719/5004] Time 0.258 ( 0.242) Data 0.025 ( 0.028) Loss 6.1823e+00 (6.6580e+00) Acc@1 0.39 ( 0.55) Acc@5 5.86 ( 2.31)
+Epoch: [0][1720/5004] Time 0.269 ( 0.242) Data 0.025 ( 0.028) Loss 6.1246e+00 (6.6577e+00) Acc@1 1.95 ( 0.56) Acc@5 5.86 ( 2.31)
+Epoch: [0][1721/5004] Time 0.262 ( 0.242) Data 0.016 ( 0.028) Loss 6.1511e+00 (6.6574e+00) Acc@1 2.34 ( 0.56) Acc@5 8.59 ( 2.31)
+Epoch: [0][1722/5004] Time 0.259 ( 0.242) Data 0.016 ( 0.028) Loss 6.0582e+00 (6.6571e+00) Acc@1 2.73 ( 0.56) Acc@5 7.03 ( 2.32)
+Epoch: [0][1723/5004] Time 0.261 ( 0.242) Data 0.018 ( 0.028) Loss 6.1691e+00 (6.6568e+00) Acc@1 1.17 ( 0.56) Acc@5 5.86 ( 2.32)
+Epoch: [0][1724/5004] Time 0.258 ( 0.242) Data 0.018 ( 0.028) Loss 6.2556e+00 (6.6566e+00) Acc@1 1.95 ( 0.56) Acc@5 5.08 ( 2.32)
+Epoch: [0][1725/5004] Time 0.260 ( 0.242) Data 0.018 ( 0.028) Loss 6.0511e+00 (6.6562e+00) Acc@1 0.78 ( 0.56) Acc@5 6.25 ( 2.32)
+Epoch: [0][1726/5004] Time 0.261 ( 0.242) Data 0.019 ( 0.028) Loss 6.0548e+00 (6.6559e+00) Acc@1 1.95 ( 0.56) Acc@5 5.86 ( 2.33)
+Epoch: [0][1727/5004] Time 0.259 ( 0.242) Data 0.018 ( 0.028) Loss 6.1456e+00 (6.6556e+00) Acc@1 1.56 ( 0.56) Acc@5 5.86 ( 2.33)
+Epoch: [0][1728/5004] Time 0.259 ( 0.242) Data 0.019 ( 0.028) Loss 6.1391e+00 (6.6553e+00) Acc@1 1.56 ( 0.56) Acc@5 5.86 ( 2.33)
+Epoch: [0][1729/5004] Time 0.243 ( 0.242) Data 0.018 ( 0.028) Loss 6.1972e+00 (6.6550e+00) Acc@1 1.17 ( 0.56) Acc@5 6.64 ( 2.33)
+Epoch: [0][1730/5004] Time 0.246 ( 0.242) Data 0.023 ( 0.028) Loss 6.1704e+00 (6.6547e+00) Acc@1 1.17 ( 0.56) Acc@5 5.08 ( 2.33)
+Epoch: [0][1731/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.028) Loss 6.1348e+00 (6.6544e+00) Acc@1 1.95 ( 0.56) Acc@5 7.81 ( 2.34)
+Epoch: [0][1732/5004] Time 0.248 ( 0.242) Data 0.023 ( 0.028) Loss 6.1420e+00 (6.6541e+00) Acc@1 1.56 ( 0.56) Acc@5 5.08 ( 2.34)
+Epoch: [0][1733/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.028) Loss 6.1865e+00 (6.6539e+00) Acc@1 3.52 ( 0.56) Acc@5 8.20 ( 2.34)
+Epoch: [0][1734/5004] Time 0.248 ( 0.242) Data 0.021 ( 0.028) Loss 6.0599e+00 (6.6535e+00) Acc@1 2.34 ( 0.57) Acc@5 6.64 ( 2.34)
+Epoch: [0][1735/5004] Time 0.247 ( 0.242) Data 0.022 ( 0.028) Loss 6.1069e+00 (6.6532e+00) Acc@1 1.17 ( 0.57) Acc@5 5.08 ( 2.35)
+Epoch: [0][1736/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.028) Loss 6.1867e+00 (6.6530e+00) Acc@1 1.17 ( 0.57) Acc@5 4.69 ( 2.35)
+Epoch: [0][1737/5004] Time 0.248 ( 0.242) Data 0.024 ( 0.028) Loss 6.1279e+00 (6.6526e+00) Acc@1 1.56 ( 0.57) Acc@5 5.86 ( 2.35)
+Epoch: [0][1738/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.028) Loss 6.0872e+00 (6.6523e+00) Acc@1 2.34 ( 0.57) Acc@5 5.86 ( 2.35)
+Epoch: [0][1739/5004] Time 0.244 ( 0.242) Data 0.021 ( 0.028) Loss 6.1161e+00 (6.6520e+00) Acc@1 0.78 ( 0.57) Acc@5 3.91 ( 2.35)
+Epoch: [0][1740/5004] Time 0.249 ( 0.242) Data 0.023 ( 0.028) Loss 6.1532e+00 (6.6517e+00) Acc@1 1.17 ( 0.57) Acc@5 7.03 ( 2.35)
+Epoch: [0][1741/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.028) Loss 6.1203e+00 (6.6514e+00) Acc@1 1.17 ( 0.57) Acc@5 5.86 ( 2.36)
+Epoch: [0][1742/5004] Time 0.253 ( 0.242) Data 0.021 ( 0.028) Loss 6.0742e+00 (6.6511e+00) Acc@1 3.12 ( 0.57) Acc@5 8.59 ( 2.36)
+Epoch: [0][1743/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.028) Loss 6.0750e+00 (6.6508e+00) Acc@1 0.39 ( 0.57) Acc@5 5.08 ( 2.36)
+Epoch: [0][1744/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.028) Loss 6.2615e+00 (6.6505e+00) Acc@1 0.39 ( 0.57) Acc@5 3.12 ( 2.36)
+Epoch: [0][1745/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.028) Loss 6.0949e+00 (6.6502e+00) Acc@1 2.73 ( 0.57) Acc@5 5.47 ( 2.36)
+Epoch: [0][1746/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.028) Loss 6.2251e+00 (6.6500e+00) Acc@1 0.39 ( 0.57) Acc@5 3.12 ( 2.36)
+Epoch: [0][1747/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.028) Loss 6.2348e+00 (6.6497e+00) Acc@1 0.78 ( 0.57) Acc@5 5.08 ( 2.37)
+Epoch: [0][1748/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.028) Loss 6.2127e+00 (6.6495e+00) Acc@1 2.34 ( 0.57) Acc@5 7.42 ( 2.37)
+Epoch: [0][1749/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.028) Loss 6.0923e+00 (6.6492e+00) Acc@1 1.56 ( 0.57) Acc@5 8.59 ( 2.37)
+Epoch: [0][1750/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.028) Loss 6.1035e+00 (6.6489e+00) Acc@1 0.78 ( 0.57) Acc@5 3.91 ( 2.37)
+Epoch: [0][1751/5004] Time 0.251 ( 0.242) Data 0.022 ( 0.028) Loss 6.2271e+00 (6.6486e+00) Acc@1 1.56 ( 0.57) Acc@5 5.47 ( 2.38)
+Epoch: [0][1752/5004] Time 0.243 ( 0.242) Data 0.020 ( 0.028) Loss 6.1105e+00 (6.6483e+00) Acc@1 1.95 ( 0.57) Acc@5 5.86 ( 2.38)
+Epoch: [0][1753/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.028) Loss 6.0466e+00 (6.6480e+00) Acc@1 1.95 ( 0.58) Acc@5 7.42 ( 2.38)
+Epoch: [0][1754/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.028) Loss 6.1138e+00 (6.6477e+00) Acc@1 0.78 ( 0.58) Acc@5 5.86 ( 2.38)
+Epoch: [0][1755/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.028) Loss 6.2065e+00 (6.6474e+00) Acc@1 0.00 ( 0.57) Acc@5 5.08 ( 2.38)
+Epoch: [0][1756/5004] Time 0.257 ( 0.242) Data 0.020 ( 0.028) Loss 6.1114e+00 (6.6471e+00) Acc@1 1.95 ( 0.58) Acc@5 5.47 ( 2.39)
+Epoch: [0][1757/5004] Time 0.242 ( 0.242) Data 0.016 ( 0.028) Loss 6.1120e+00 (6.6468e+00) Acc@1 3.12 ( 0.58) Acc@5 9.38 ( 2.39)
+Epoch: [0][1758/5004] Time 0.250 ( 0.242) Data 0.020 ( 0.028) Loss 6.2484e+00 (6.6466e+00) Acc@1 2.34 ( 0.58) Acc@5 8.59 ( 2.39)
+Epoch: [0][1759/5004] Time 0.242 ( 0.242) Data 0.017 ( 0.028) Loss 6.1748e+00 (6.6463e+00) Acc@1 1.56 ( 0.58) Acc@5 6.64 ( 2.40)
+Epoch: [0][1760/5004] Time 0.254 ( 0.242) Data 0.021 ( 0.028) Loss 6.2653e+00 (6.6461e+00) Acc@1 2.34 ( 0.58) Acc@5 5.86 ( 2.40)
+Epoch: [0][1761/5004] Time 0.242 ( 0.242) Data 0.017 ( 0.028) Loss 6.1549e+00 (6.6458e+00) Acc@1 2.34 ( 0.58) Acc@5 6.25 ( 2.40)
+Epoch: [0][1762/5004] Time 0.248 ( 0.242) Data 0.021 ( 0.028) Loss 6.2230e+00 (6.6456e+00) Acc@1 0.00 ( 0.58) Acc@5 2.73 ( 2.40)
+Epoch: [0][1763/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.028) Loss 6.1842e+00 (6.6453e+00) Acc@1 1.56 ( 0.58) Acc@5 5.08 ( 2.40)
+Epoch: [0][1764/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.028) Loss 6.0838e+00 (6.6450e+00) Acc@1 2.34 ( 0.58) Acc@5 7.42 ( 2.40)
+Epoch: [0][1765/5004] Time 0.246 ( 0.242) Data 0.020 ( 0.028) Loss 6.0433e+00 (6.6447e+00) Acc@1 1.17 ( 0.58) Acc@5 7.42 ( 2.41)
+Epoch: [0][1766/5004] Time 0.249 ( 0.242) Data 0.020 ( 0.028) Loss 6.0443e+00 (6.6443e+00) Acc@1 0.78 ( 0.58) Acc@5 6.64 ( 2.41)
+Epoch: [0][1767/5004] Time 0.248 ( 0.242) Data 0.020 ( 0.028) Loss 6.1498e+00 (6.6440e+00) Acc@1 1.17 ( 0.58) Acc@5 6.64 ( 2.41)
+Epoch: [0][1768/5004] Time 0.254 ( 0.242) Data 0.020 ( 0.028) Loss 6.0596e+00 (6.6437e+00) Acc@1 0.78 ( 0.58) Acc@5 4.69 ( 2.41)
+Epoch: [0][1769/5004] Time 0.247 ( 0.242) Data 0.016 ( 0.028) Loss 6.1914e+00 (6.6434e+00) Acc@1 2.34 ( 0.58) Acc@5 5.86 ( 2.42)
+Epoch: [0][1770/5004] Time 0.274 ( 0.242) Data 0.019 ( 0.028) Loss 6.0309e+00 (6.6431e+00) Acc@1 1.17 ( 0.58) Acc@5 7.03 ( 2.42)
+Epoch: [0][1771/5004] Time 0.247 ( 0.242) Data 0.007 ( 0.028) Loss 6.0861e+00 (6.6428e+00) Acc@1 1.56 ( 0.58) Acc@5 5.08 ( 2.42)
+Epoch: [0][1772/5004] Time 0.243 ( 0.242) Data 0.018 ( 0.028) Loss 5.9340e+00 (6.6424e+00) Acc@1 3.91 ( 0.59) Acc@5 8.59 ( 2.42)
+Epoch: [0][1773/5004] Time 0.248 ( 0.242) Data 0.020 ( 0.028) Loss 6.0812e+00 (6.6421e+00) Acc@1 2.34 ( 0.59) Acc@5 6.64 ( 2.43)
+Epoch: [0][1774/5004] Time 0.252 ( 0.242) Data 0.021 ( 0.028) Loss 6.1495e+00 (6.6418e+00) Acc@1 1.56 ( 0.59) Acc@5 5.47 ( 2.43)
+Epoch: [0][1775/5004] Time 0.248 ( 0.242) Data 0.020 ( 0.028) Loss 6.1003e+00 (6.6415e+00) Acc@1 1.56 ( 0.59) Acc@5 7.42 ( 2.43)
+Epoch: [0][1776/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.028) Loss 6.0937e+00 (6.6412e+00) Acc@1 2.73 ( 0.59) Acc@5 5.47 ( 2.43)
+Epoch: [0][1777/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.028) Loss 5.9172e+00 (6.6408e+00) Acc@1 2.34 ( 0.59) Acc@5 8.98 ( 2.43)
+Epoch: [0][1778/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.028) Loss 5.9841e+00 (6.6404e+00) Acc@1 2.34 ( 0.59) Acc@5 6.64 ( 2.44)
+Epoch: [0][1779/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.028) Loss 6.1055e+00 (6.6401e+00) Acc@1 1.56 ( 0.59) Acc@5 6.64 ( 2.44)
+Epoch: [0][1780/5004] Time 0.239 ( 0.242) Data 0.022 ( 0.028) Loss 6.0576e+00 (6.6398e+00) Acc@1 1.17 ( 0.59) Acc@5 3.12 ( 2.44)
+Epoch: [0][1781/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.028) Loss 6.1577e+00 (6.6395e+00) Acc@1 1.56 ( 0.59) Acc@5 4.69 ( 2.44)
+Epoch: [0][1782/5004] Time 0.242 ( 0.242) Data 0.024 ( 0.028) Loss 6.0781e+00 (6.6392e+00) Acc@1 0.00 ( 0.59) Acc@5 6.64 ( 2.44)
+Epoch: [0][1783/5004] Time 0.251 ( 0.242) Data 0.024 ( 0.028) Loss 6.2285e+00 (6.6390e+00) Acc@1 0.78 ( 0.59) Acc@5 5.08 ( 2.45)
+Epoch: [0][1784/5004] Time 0.244 ( 0.242) Data 0.018 ( 0.028) Loss 6.0485e+00 (6.6386e+00) Acc@1 2.34 ( 0.59) Acc@5 5.47 ( 2.45)
+Epoch: [0][1785/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.028) Loss 6.1462e+00 (6.6384e+00) Acc@1 3.52 ( 0.60) Acc@5 8.59 ( 2.45)
+Epoch: [0][1786/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.028) Loss 6.0954e+00 (6.6381e+00) Acc@1 1.17 ( 0.60) Acc@5 7.42 ( 2.45)
+Epoch: [0][1787/5004] Time 0.249 ( 0.242) Data 0.023 ( 0.028) Loss 6.1748e+00 (6.6378e+00) Acc@1 2.73 ( 0.60) Acc@5 8.20 ( 2.46)
+Epoch: [0][1788/5004] Time 0.238 ( 0.242) Data 0.021 ( 0.028) Loss 6.2707e+00 (6.6376e+00) Acc@1 1.17 ( 0.60) Acc@5 6.25 ( 2.46)
+Epoch: [0][1789/5004] Time 0.241 ( 0.242) Data 0.024 ( 0.028) Loss 6.1001e+00 (6.6373e+00) Acc@1 3.52 ( 0.60) Acc@5 7.42 ( 2.46)
+Epoch: [0][1790/5004] Time 0.248 ( 0.242) Data 0.024 ( 0.028) Loss 6.1933e+00 (6.6370e+00) Acc@1 0.78 ( 0.60) Acc@5 4.30 ( 2.46)
+Epoch: [0][1791/5004] Time 0.236 ( 0.242) Data 0.019 ( 0.028) Loss 6.2395e+00 (6.6368e+00) Acc@1 0.78 ( 0.60) Acc@5 3.91 ( 2.46)
+Epoch: [0][1792/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.028) Loss 6.0913e+00 (6.6365e+00) Acc@1 0.78 ( 0.60) Acc@5 7.42 ( 2.47)
+Epoch: [0][1793/5004] Time 0.242 ( 0.242) Data 0.024 ( 0.028) Loss 6.1195e+00 (6.6362e+00) Acc@1 1.95 ( 0.60) Acc@5 5.08 ( 2.47)
+Epoch: [0][1794/5004] Time 0.247 ( 0.242) Data 0.024 ( 0.028) Loss 6.0370e+00 (6.6359e+00) Acc@1 3.12 ( 0.60) Acc@5 7.03 ( 2.47)
+Epoch: [0][1795/5004] Time 0.241 ( 0.242) Data 0.024 ( 0.028) Loss 6.0369e+00 (6.6356e+00) Acc@1 1.17 ( 0.60) Acc@5 5.47 ( 2.47)
+Epoch: [0][1796/5004] Time 0.242 ( 0.242) Data 0.025 ( 0.028) Loss 6.1822e+00 (6.6353e+00) Acc@1 1.95 ( 0.60) Acc@5 8.98 ( 2.48)
+Epoch: [0][1797/5004] Time 0.240 ( 0.242) Data 0.024 ( 0.028) Loss 6.2212e+00 (6.6351e+00) Acc@1 1.17 ( 0.60) Acc@5 4.30 ( 2.48)
+Epoch: [0][1798/5004] Time 0.240 ( 0.242) Data 0.024 ( 0.028) Loss 6.0950e+00 (6.6348e+00) Acc@1 3.12 ( 0.60) Acc@5 5.47 ( 2.48)
+Epoch: [0][1799/5004] Time 0.244 ( 0.242) Data 0.025 ( 0.028) Loss 6.0732e+00 (6.6345e+00) Acc@1 1.17 ( 0.60) Acc@5 5.86 ( 2.48)
+Epoch: [0][1800/5004] Time 0.246 ( 0.242) Data 0.024 ( 0.028) Loss 6.1707e+00 (6.6342e+00) Acc@1 1.95 ( 0.61) Acc@5 6.64 ( 2.48)
+Epoch: [0][1801/5004] Time 0.247 ( 0.242) Data 0.022 ( 0.028) Loss 6.1881e+00 (6.6340e+00) Acc@1 0.78 ( 0.61) Acc@5 6.25 ( 2.48)
+Epoch: [0][1802/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.028) Loss 6.1594e+00 (6.6337e+00) Acc@1 1.56 ( 0.61) Acc@5 6.25 ( 2.49)
+Epoch: [0][1803/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.028) Loss 5.8065e+00 (6.6332e+00) Acc@1 2.34 ( 0.61) Acc@5 10.94 ( 2.49)
+Epoch: [0][1804/5004] Time 0.241 ( 0.242) Data 0.021 ( 0.028) Loss 6.0659e+00 (6.6329e+00) Acc@1 3.52 ( 0.61) Acc@5 6.64 ( 2.49)
+Epoch: [0][1805/5004] Time 0.241 ( 0.242) Data 0.025 ( 0.028) Loss 5.9919e+00 (6.6326e+00) Acc@1 1.56 ( 0.61) Acc@5 6.25 ( 2.50)
+Epoch: [0][1806/5004] Time 0.241 ( 0.242) Data 0.025 ( 0.028) Loss 6.0433e+00 (6.6322e+00) Acc@1 3.12 ( 0.61) Acc@5 8.59 ( 2.50)
+Epoch: [0][1807/5004] Time 0.243 ( 0.242) Data 0.026 ( 0.028) Loss 6.1407e+00 (6.6320e+00) Acc@1 1.95 ( 0.61) Acc@5 5.47 ( 2.50)
+Epoch: [0][1808/5004] Time 0.242 ( 0.242) Data 0.024 ( 0.028) Loss 6.2211e+00 (6.6317e+00) Acc@1 2.34 ( 0.61) Acc@5 5.47 ( 2.50)
+Epoch: [0][1809/5004] Time 0.240 ( 0.242) Data 0.024 ( 0.028) Loss 5.9636e+00 (6.6314e+00) Acc@1 1.56 ( 0.61) Acc@5 9.38 ( 2.51)
+Epoch: [0][1810/5004] Time 0.240 ( 0.242) Data 0.025 ( 0.028) Loss 5.9459e+00 (6.6310e+00) Acc@1 3.12 ( 0.61) Acc@5 10.16 ( 2.51)
+Epoch: [0][1811/5004] Time 0.241 ( 0.242) Data 0.024 ( 0.028) Loss 6.0432e+00 (6.6307e+00) Acc@1 1.95 ( 0.61) Acc@5 7.42 ( 2.51)
+Epoch: [0][1812/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.028) Loss 6.1375e+00 (6.6304e+00) Acc@1 0.00 ( 0.61) Acc@5 1.56 ( 2.51)
+Epoch: [0][1813/5004] Time 0.246 ( 0.242) Data 0.024 ( 0.028) Loss 6.0789e+00 (6.6301e+00) Acc@1 1.17 ( 0.61) Acc@5 5.86 ( 2.51)
+Epoch: [0][1814/5004] Time 0.247 ( 0.242) Data 0.024 ( 0.028) Loss 6.1382e+00 (6.6298e+00) Acc@1 1.17 ( 0.62) Acc@5 8.59 ( 2.52)
+Epoch: [0][1815/5004] Time 0.248 ( 0.242) Data 0.023 ( 0.028) Loss 6.0653e+00 (6.6295e+00) Acc@1 1.17 ( 0.62) Acc@5 6.25 ( 2.52)
+Epoch: [0][1816/5004] Time 0.292 ( 0.242) Data 0.023 ( 0.028) Loss 6.0367e+00 (6.6292e+00) Acc@1 2.34 ( 0.62) Acc@5 7.42 ( 2.52)
+Epoch: [0][1817/5004] Time 0.249 ( 0.242) Data 0.015 ( 0.028) Loss 6.0638e+00 (6.6289e+00) Acc@1 2.34 ( 0.62) Acc@5 6.25 ( 2.52)
+Epoch: [0][1818/5004] Time 0.253 ( 0.242) Data 0.020 ( 0.028) Loss 6.1218e+00 (6.6286e+00) Acc@1 1.17 ( 0.62) Acc@5 5.47 ( 2.53)
+Epoch: [0][1819/5004] Time 0.249 ( 0.242) Data 0.020 ( 0.028) Loss 6.0586e+00 (6.6283e+00) Acc@1 1.17 ( 0.62) Acc@5 5.86 ( 2.53)
+Epoch: [0][1820/5004] Time 0.248 ( 0.242) Data 0.020 ( 0.028) Loss 6.1248e+00 (6.6280e+00) Acc@1 2.73 ( 0.62) Acc@5 5.47 ( 2.53)
+Epoch: [0][1821/5004] Time 0.248 ( 0.242) Data 0.020 ( 0.028) Loss 6.0495e+00 (6.6277e+00) Acc@1 3.12 ( 0.62) Acc@5 7.81 ( 2.53)
+Epoch: [0][1822/5004] Time 0.250 ( 0.242) Data 0.020 ( 0.028) Loss 6.0315e+00 (6.6274e+00) Acc@1 0.39 ( 0.62) Acc@5 5.86 ( 2.53)
+Epoch: [0][1823/5004] Time 0.248 ( 0.242) Data 0.019 ( 0.028) Loss 6.0527e+00 (6.6270e+00) Acc@1 4.69 ( 0.62) Acc@5 7.81 ( 2.54)
+Epoch: [0][1824/5004] Time 0.249 ( 0.242) Data 0.020 ( 0.028) Loss 6.0548e+00 (6.6267e+00) Acc@1 3.91 ( 0.62) Acc@5 8.20 ( 2.54)
+Epoch: [0][1825/5004] Time 0.250 ( 0.242) Data 0.020 ( 0.028) Loss 5.9845e+00 (6.6264e+00) Acc@1 2.34 ( 0.63) Acc@5 9.77 ( 2.54)
+Epoch: [0][1826/5004] Time 0.252 ( 0.242) Data 0.019 ( 0.028) Loss 6.0269e+00 (6.6261e+00) Acc@1 1.56 ( 0.63) Acc@5 6.25 ( 2.55)
+Epoch: [0][1827/5004] Time 0.250 ( 0.242) Data 0.018 ( 0.028) Loss 6.1214e+00 (6.6258e+00) Acc@1 0.78 ( 0.63) Acc@5 3.91 ( 2.55)
+Epoch: [0][1828/5004] Time 0.252 ( 0.242) Data 0.021 ( 0.028) Loss 5.9683e+00 (6.6254e+00) Acc@1 1.17 ( 0.63) Acc@5 5.86 ( 2.55)
+Epoch: [0][1829/5004] Time 0.251 ( 0.242) Data 0.021 ( 0.028) Loss 6.0032e+00 (6.6251e+00) Acc@1 3.12 ( 0.63) Acc@5 8.98 ( 2.55)
+Epoch: [0][1830/5004] Time 0.251 ( 0.242) Data 0.022 ( 0.028) Loss 6.0998e+00 (6.6248e+00) Acc@1 1.56 ( 0.63) Acc@5 7.42 ( 2.55)
+Epoch: [0][1831/5004] Time 0.252 ( 0.242) Data 0.022 ( 0.028) Loss 6.0621e+00 (6.6245e+00) Acc@1 1.17 ( 0.63) Acc@5 7.42 ( 2.56)
+Epoch: [0][1832/5004] Time 0.250 ( 0.242) Data 0.022 ( 0.028) Loss 5.9694e+00 (6.6241e+00) Acc@1 2.73 ( 0.63) Acc@5 8.59 ( 2.56)
+Epoch: [0][1833/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.028) Loss 6.1650e+00 (6.6239e+00) Acc@1 1.17 ( 0.63) Acc@5 5.08 ( 2.56)
+Epoch: [0][1834/5004] Time 0.252 ( 0.242) Data 0.021 ( 0.028) Loss 6.2593e+00 (6.6237e+00) Acc@1 1.56 ( 0.63) Acc@5 6.25 ( 2.56)
+Epoch: [0][1835/5004] Time 0.255 ( 0.242) Data 0.020 ( 0.028) Loss 5.9943e+00 (6.6233e+00) Acc@1 1.56 ( 0.63) Acc@5 5.86 ( 2.57)
+Epoch: [0][1836/5004] Time 0.250 ( 0.242) Data 0.020 ( 0.028) Loss 5.9903e+00 (6.6230e+00) Acc@1 2.34 ( 0.63) Acc@5 7.81 ( 2.57)
+Epoch: [0][1837/5004] Time 0.251 ( 0.242) Data 0.021 ( 0.028) Loss 6.0900e+00 (6.6227e+00) Acc@1 1.56 ( 0.63) Acc@5 8.20 ( 2.57)
+Epoch: [0][1838/5004] Time 0.249 ( 0.242) Data 0.021 ( 0.028) Loss 6.0092e+00 (6.6224e+00) Acc@1 1.56 ( 0.63) Acc@5 5.08 ( 2.57)
+Epoch: [0][1839/5004] Time 0.254 ( 0.242) Data 0.021 ( 0.028) Loss 6.0758e+00 (6.6221e+00) Acc@1 3.12 ( 0.63) Acc@5 7.81 ( 2.58)
+Epoch: [0][1840/5004] Time 0.249 ( 0.242) Data 0.020 ( 0.028) Loss 5.9855e+00 (6.6217e+00) Acc@1 1.56 ( 0.63) Acc@5 7.42 ( 2.58)
+Epoch: [0][1841/5004] Time 0.249 ( 0.242) Data 0.021 ( 0.028) Loss 6.1232e+00 (6.6215e+00) Acc@1 1.95 ( 0.64) Acc@5 5.47 ( 2.58)
+Epoch: [0][1842/5004] Time 0.252 ( 0.242) Data 0.021 ( 0.028) Loss 6.1651e+00 (6.6212e+00) Acc@1 1.56 ( 0.64) Acc@5 5.08 ( 2.58)
+Epoch: [0][1843/5004] Time 0.246 ( 0.242) Data 0.019 ( 0.028) Loss 6.0622e+00 (6.6209e+00) Acc@1 1.56 ( 0.64) Acc@5 6.25 ( 2.58)
+Epoch: [0][1844/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.028) Loss 5.9588e+00 (6.6205e+00) Acc@1 2.34 ( 0.64) Acc@5 10.94 ( 2.59)
+Epoch: [0][1845/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.028) Loss 5.9450e+00 (6.6202e+00) Acc@1 2.34 ( 0.64) Acc@5 10.55 ( 2.59)
+Epoch: [0][1846/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.028) Loss 5.9684e+00 (6.6198e+00) Acc@1 1.56 ( 0.64) Acc@5 3.52 ( 2.59)
+Epoch: [0][1847/5004] Time 0.254 ( 0.242) Data 0.023 ( 0.028) Loss 5.9261e+00 (6.6194e+00) Acc@1 2.34 ( 0.64) Acc@5 8.59 ( 2.60)
+Epoch: [0][1848/5004] Time 0.248 ( 0.242) Data 0.021 ( 0.028) Loss 6.1173e+00 (6.6192e+00) Acc@1 1.95 ( 0.64) Acc@5 4.69 ( 2.60)
+Epoch: [0][1849/5004] Time 0.250 ( 0.242) Data 0.021 ( 0.028) Loss 6.0845e+00 (6.6189e+00) Acc@1 1.56 ( 0.64) Acc@5 7.42 ( 2.60)
+Epoch: [0][1850/5004] Time 0.253 ( 0.242) Data 0.021 ( 0.028) Loss 6.0911e+00 (6.6186e+00) Acc@1 2.34 ( 0.64) Acc@5 6.64 ( 2.60)
+Epoch: [0][1851/5004] Time 0.245 ( 0.242) Data 0.018 ( 0.028) Loss 5.9977e+00 (6.6183e+00) Acc@1 0.78 ( 0.64) Acc@5 7.03 ( 2.60)
+Epoch: [0][1852/5004] Time 0.249 ( 0.242) Data 0.023 ( 0.028) Loss 6.0402e+00 (6.6180e+00) Acc@1 3.12 ( 0.64) Acc@5 8.59 ( 2.61)
+Epoch: [0][1853/5004] Time 0.250 ( 0.242) Data 0.022 ( 0.028) Loss 6.0924e+00 (6.6177e+00) Acc@1 1.56 ( 0.64) Acc@5 6.64 ( 2.61)
+Epoch: [0][1854/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.028) Loss 6.0855e+00 (6.6174e+00) Acc@1 2.34 ( 0.64) Acc@5 7.03 ( 2.61)
+Epoch: [0][1855/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.028) Loss 6.0890e+00 (6.6171e+00) Acc@1 2.34 ( 0.65) Acc@5 8.59 ( 2.62)
+Epoch: [0][1856/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.028) Loss 5.9302e+00 (6.6167e+00) Acc@1 1.95 ( 0.65) Acc@5 6.64 ( 2.62)
+Epoch: [0][1857/5004] Time 0.248 ( 0.242) Data 0.021 ( 0.028) Loss 6.2334e+00 (6.6165e+00) Acc@1 2.73 ( 0.65) Acc@5 5.86 ( 2.62)
+Epoch: [0][1858/5004] Time 0.250 ( 0.242) Data 0.022 ( 0.028) Loss 6.1144e+00 (6.6163e+00) Acc@1 1.17 ( 0.65) Acc@5 5.47 ( 2.62)
+Epoch: [0][1859/5004] Time 0.251 ( 0.242) Data 0.022 ( 0.028) Loss 5.9035e+00 (6.6159e+00) Acc@1 1.56 ( 0.65) Acc@5 9.77 ( 2.62)
+Epoch: [0][1860/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.028) Loss 6.2051e+00 (6.6157e+00) Acc@1 0.78 ( 0.65) Acc@5 6.64 ( 2.63)
+Epoch: [0][1861/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.028) Loss 6.0656e+00 (6.6154e+00) Acc@1 1.56 ( 0.65) Acc@5 5.86 ( 2.63)
+Epoch: [0][1862/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.028) Loss 5.9300e+00 (6.6150e+00) Acc@1 2.73 ( 0.65) Acc@5 8.20 ( 2.63)
+Epoch: [0][1863/5004] Time 0.252 ( 0.242) Data 0.022 ( 0.028) Loss 5.9045e+00 (6.6146e+00) Acc@1 1.17 ( 0.65) Acc@5 8.98 ( 2.63)
+Epoch: [0][1864/5004] Time 0.248 ( 0.242) Data 0.021 ( 0.028) Loss 5.9920e+00 (6.6143e+00) Acc@1 2.34 ( 0.65) Acc@5 8.59 ( 2.64)
+Epoch: [0][1865/5004] Time 0.253 ( 0.242) Data 0.022 ( 0.028) Loss 6.0314e+00 (6.6140e+00) Acc@1 0.78 ( 0.65) Acc@5 7.03 ( 2.64)
+Epoch: [0][1866/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.028) Loss 5.9396e+00 (6.6136e+00) Acc@1 2.34 ( 0.65) Acc@5 6.25 ( 2.64)
+Epoch: [0][1867/5004] Time 0.252 ( 0.242) Data 0.023 ( 0.028) Loss 6.1314e+00 (6.6133e+00) Acc@1 1.56 ( 0.65) Acc@5 7.03 ( 2.64)
+Epoch: [0][1868/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.028) Loss 6.1426e+00 (6.6131e+00) Acc@1 1.17 ( 0.65) Acc@5 4.69 ( 2.65)
+Epoch: [0][1869/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.028) Loss 6.1165e+00 (6.6128e+00) Acc@1 1.95 ( 0.65) Acc@5 6.25 ( 2.65)
+Epoch: [0][1870/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.028) Loss 6.0661e+00 (6.6125e+00) Acc@1 3.12 ( 0.65) Acc@5 7.03 ( 2.65)
+Epoch: [0][1871/5004] Time 0.253 ( 0.242) Data 0.021 ( 0.028) Loss 5.9897e+00 (6.6122e+00) Acc@1 3.12 ( 0.66) Acc@5 8.59 ( 2.65)
+Epoch: [0][1872/5004] Time 0.253 ( 0.242) Data 0.021 ( 0.028) Loss 6.0113e+00 (6.6119e+00) Acc@1 1.95 ( 0.66) Acc@5 8.59 ( 2.66)
+Epoch: [0][1873/5004] Time 0.250 ( 0.242) Data 0.021 ( 0.028) Loss 5.9804e+00 (6.6115e+00) Acc@1 2.34 ( 0.66) Acc@5 8.20 ( 2.66)
+Epoch: [0][1874/5004] Time 0.253 ( 0.242) Data 0.022 ( 0.028) Loss 5.9925e+00 (6.6112e+00) Acc@1 0.78 ( 0.66) Acc@5 8.59 ( 2.66)
+Epoch: [0][1875/5004] Time 0.251 ( 0.242) Data 0.021 ( 0.028) Loss 6.0146e+00 (6.6109e+00) Acc@1 1.95 ( 0.66) Acc@5 6.64 ( 2.66)
+Epoch: [0][1876/5004] Time 0.253 ( 0.242) Data 0.022 ( 0.028) Loss 6.1596e+00 (6.6107e+00) Acc@1 2.34 ( 0.66) Acc@5 6.25 ( 2.67)
+Epoch: [0][1877/5004] Time 0.250 ( 0.242) Data 0.022 ( 0.028) Loss 6.0826e+00 (6.6104e+00) Acc@1 1.56 ( 0.66) Acc@5 5.86 ( 2.67)
+Epoch: [0][1878/5004] Time 0.255 ( 0.242) Data 0.021 ( 0.028) Loss 5.8912e+00 (6.6100e+00) Acc@1 3.12 ( 0.66) Acc@5 9.38 ( 2.67)
+Epoch: [0][1879/5004] Time 0.244 ( 0.242) Data 0.017 ( 0.028) Loss 6.0025e+00 (6.6097e+00) Acc@1 3.52 ( 0.66) Acc@5 8.20 ( 2.67)
+Epoch: [0][1880/5004] Time 0.251 ( 0.242) Data 0.022 ( 0.028) Loss 6.1222e+00 (6.6094e+00) Acc@1 1.17 ( 0.66) Acc@5 6.64 ( 2.68)
+Epoch: [0][1881/5004] Time 0.251 ( 0.242) Data 0.019 ( 0.028) Loss 5.9999e+00 (6.6091e+00) Acc@1 1.95 ( 0.66) Acc@5 8.98 ( 2.68)
+Epoch: [0][1882/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.028) Loss 6.0775e+00 (6.6088e+00) Acc@1 2.73 ( 0.66) Acc@5 7.03 ( 2.68)
+Epoch: [0][1883/5004] Time 0.245 ( 0.242) Data 0.024 ( 0.028) Loss 6.1352e+00 (6.6085e+00) Acc@1 2.73 ( 0.67) Acc@5 7.03 ( 2.68)
+Epoch: [0][1884/5004] Time 0.245 ( 0.242) Data 0.024 ( 0.028) Loss 6.0019e+00 (6.6082e+00) Acc@1 1.56 ( 0.67) Acc@5 7.03 ( 2.69)
+Epoch: [0][1885/5004] Time 0.241 ( 0.242) Data 0.024 ( 0.028) Loss 6.0047e+00 (6.6079e+00) Acc@1 2.73 ( 0.67) Acc@5 7.81 ( 2.69)
+Epoch: [0][1886/5004] Time 0.250 ( 0.242) Data 0.028 ( 0.028) Loss 6.1633e+00 (6.6077e+00) Acc@1 2.73 ( 0.67) Acc@5 7.42 ( 2.69)
+Epoch: [0][1887/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.028) Loss 5.9517e+00 (6.6073e+00) Acc@1 1.95 ( 0.67) Acc@5 6.64 ( 2.69)
+Epoch: [0][1888/5004] Time 0.243 ( 0.242) Data 0.029 ( 0.028) Loss 5.9941e+00 (6.6070e+00) Acc@1 1.17 ( 0.67) Acc@5 6.64 ( 2.70)
+Epoch: [0][1889/5004] Time 0.249 ( 0.242) Data 0.030 ( 0.028) Loss 5.9421e+00 (6.6066e+00) Acc@1 1.95 ( 0.67) Acc@5 8.59 ( 2.70)
+Epoch: [0][1890/5004] Time 0.252 ( 0.242) Data 0.028 ( 0.028) Loss 5.9198e+00 (6.6063e+00) Acc@1 1.17 ( 0.67) Acc@5 6.64 ( 2.70)
+Epoch: [0][1891/5004] Time 0.247 ( 0.242) Data 0.027 ( 0.028) Loss 6.0120e+00 (6.6060e+00) Acc@1 2.73 ( 0.67) Acc@5 5.47 ( 2.70)
+Epoch: [0][1892/5004] Time 0.247 ( 0.242) Data 0.026 ( 0.028) Loss 5.9919e+00 (6.6056e+00) Acc@1 1.95 ( 0.67) Acc@5 7.42 ( 2.71)
+Epoch: [0][1893/5004] Time 0.248 ( 0.242) Data 0.027 ( 0.028) Loss 6.0123e+00 (6.6053e+00) Acc@1 2.34 ( 0.67) Acc@5 7.42 ( 2.71)
+Epoch: [0][1894/5004] Time 0.250 ( 0.242) Data 0.028 ( 0.028) Loss 6.0338e+00 (6.6050e+00) Acc@1 1.95 ( 0.67) Acc@5 8.59 ( 2.71)
+Epoch: [0][1895/5004] Time 0.251 ( 0.242) Data 0.028 ( 0.028) Loss 5.9574e+00 (6.6047e+00) Acc@1 2.34 ( 0.67) Acc@5 9.77 ( 2.72)
+Epoch: [0][1896/5004] Time 0.244 ( 0.242) Data 0.027 ( 0.028) Loss 6.0715e+00 (6.6044e+00) Acc@1 1.95 ( 0.68) Acc@5 6.25 ( 2.72)
+Epoch: [0][1897/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.028) Loss 5.9572e+00 (6.6041e+00) Acc@1 1.56 ( 0.68) Acc@5 7.03 ( 2.72)
+Epoch: [0][1898/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.028) Loss 6.0069e+00 (6.6038e+00) Acc@1 4.30 ( 0.68) Acc@5 11.33 ( 2.72)
+Epoch: [0][1899/5004] Time 0.246 ( 0.242) Data 0.027 ( 0.028) Loss 5.9899e+00 (6.6034e+00) Acc@1 5.08 ( 0.68) Acc@5 9.77 ( 2.73)
+Epoch: [0][1900/5004] Time 0.244 ( 0.242) Data 0.028 ( 0.028) Loss 6.1474e+00 (6.6032e+00) Acc@1 1.56 ( 0.68) Acc@5 8.59 ( 2.73)
+Epoch: [0][1901/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.028) Loss 6.2308e+00 (6.6030e+00) Acc@1 1.17 ( 0.68) Acc@5 4.69 ( 2.73)
+Epoch: [0][1902/5004] Time 0.244 ( 0.242) Data 0.028 ( 0.028) Loss 6.0127e+00 (6.6027e+00) Acc@1 1.17 ( 0.68) Acc@5 6.64 ( 2.73)
+Epoch: [0][1903/5004] Time 0.250 ( 0.242) Data 0.029 ( 0.028) Loss 6.0737e+00 (6.6024e+00) Acc@1 2.73 ( 0.68) Acc@5 7.42 ( 2.74)
+Epoch: [0][1904/5004] Time 0.243 ( 0.242) Data 0.026 ( 0.028) Loss 6.1558e+00 (6.6022e+00) Acc@1 1.17 ( 0.68) Acc@5 6.64 ( 2.74)
+Epoch: [0][1905/5004] Time 0.247 ( 0.242) Data 0.028 ( 0.028) Loss 6.0268e+00 (6.6019e+00) Acc@1 0.78 ( 0.68) Acc@5 5.86 ( 2.74)
+Epoch: [0][1906/5004] Time 0.263 ( 0.242) Data 0.029 ( 0.028) Loss 6.1370e+00 (6.6016e+00) Acc@1 1.17 ( 0.68) Acc@5 3.12 ( 2.74)
+Epoch: [0][1907/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.028) Loss 5.9557e+00 (6.6013e+00) Acc@1 1.95 ( 0.68) Acc@5 8.20 ( 2.74)
+Epoch: [0][1908/5004] Time 0.245 ( 0.242) Data 0.025 ( 0.028) Loss 5.9012e+00 (6.6009e+00) Acc@1 1.95 ( 0.68) Acc@5 6.25 ( 2.74)
+Epoch: [0][1909/5004] Time 0.249 ( 0.242) Data 0.028 ( 0.028) Loss 6.0767e+00 (6.6006e+00) Acc@1 1.56 ( 0.68) Acc@5 5.47 ( 2.75)
+Epoch: [0][1910/5004] Time 0.245 ( 0.242) Data 0.028 ( 0.028) Loss 6.0052e+00 (6.6003e+00) Acc@1 2.73 ( 0.69) Acc@5 7.42 ( 2.75)
+Epoch: [0][1911/5004] Time 0.252 ( 0.242) Data 0.029 ( 0.028) Loss 6.0097e+00 (6.6000e+00) Acc@1 2.34 ( 0.69) Acc@5 5.86 ( 2.75)
+Epoch: [0][1912/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.028) Loss 6.0374e+00 (6.5997e+00) Acc@1 2.73 ( 0.69) Acc@5 7.42 ( 2.75)
+Epoch: [0][1913/5004] Time 0.248 ( 0.242) Data 0.029 ( 0.028) Loss 6.1072e+00 (6.5995e+00) Acc@1 2.73 ( 0.69) Acc@5 5.47 ( 2.75)
+Epoch: [0][1914/5004] Time 0.248 ( 0.242) Data 0.028 ( 0.028) Loss 5.9392e+00 (6.5991e+00) Acc@1 3.12 ( 0.69) Acc@5 8.20 ( 2.76)
+Epoch: [0][1915/5004] Time 0.243 ( 0.242) Data 0.029 ( 0.028) Loss 6.1855e+00 (6.5989e+00) Acc@1 1.56 ( 0.69) Acc@5 5.86 ( 2.76)
+Epoch: [0][1916/5004] Time 0.244 ( 0.242) Data 0.030 ( 0.028) Loss 6.0241e+00 (6.5986e+00) Acc@1 1.17 ( 0.69) Acc@5 6.25 ( 2.76)
+Epoch: [0][1917/5004] Time 0.245 ( 0.242) Data 0.030 ( 0.028) Loss 5.9896e+00 (6.5983e+00) Acc@1 1.95 ( 0.69) Acc@5 8.20 ( 2.76)
+Epoch: [0][1918/5004] Time 0.247 ( 0.242) Data 0.031 ( 0.028) Loss 6.1185e+00 (6.5980e+00) Acc@1 1.56 ( 0.69) Acc@5 6.64 ( 2.77)
+Epoch: [0][1919/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.028) Loss 5.9917e+00 (6.5977e+00) Acc@1 2.73 ( 0.69) Acc@5 5.47 ( 2.77)
+Epoch: [0][1920/5004] Time 0.244 ( 0.242) Data 0.028 ( 0.028) Loss 6.0658e+00 (6.5975e+00) Acc@1 1.56 ( 0.69) Acc@5 8.20 ( 2.77)
+Epoch: [0][1921/5004] Time 0.244 ( 0.242) Data 0.029 ( 0.028) Loss 6.0110e+00 (6.5971e+00) Acc@1 1.95 ( 0.69) Acc@5 6.25 ( 2.77)
+Epoch: [0][1922/5004] Time 0.245 ( 0.242) Data 0.030 ( 0.028) Loss 6.0169e+00 (6.5968e+00) Acc@1 4.30 ( 0.70) Acc@5 8.20 ( 2.77)
+Epoch: [0][1923/5004] Time 0.247 ( 0.242) Data 0.030 ( 0.028) Loss 6.0112e+00 (6.5965e+00) Acc@1 2.34 ( 0.70) Acc@5 8.20 ( 2.78)
+Epoch: [0][1924/5004] Time 0.243 ( 0.242) Data 0.029 ( 0.028) Loss 6.0556e+00 (6.5963e+00) Acc@1 1.56 ( 0.70) Acc@5 6.25 ( 2.78)
+Epoch: [0][1925/5004] Time 0.251 ( 0.242) Data 0.031 ( 0.028) Loss 5.8792e+00 (6.5959e+00) Acc@1 1.56 ( 0.70) Acc@5 8.20 ( 2.78)
+Epoch: [0][1926/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 6.0188e+00 (6.5956e+00) Acc@1 4.69 ( 0.70) Acc@5 9.77 ( 2.79)
+Epoch: [0][1927/5004] Time 0.247 ( 0.242) Data 0.031 ( 0.028) Loss 6.1237e+00 (6.5953e+00) Acc@1 3.12 ( 0.70) Acc@5 8.98 ( 2.79)
+Epoch: [0][1928/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 6.0402e+00 (6.5951e+00) Acc@1 2.34 ( 0.70) Acc@5 7.03 ( 2.79)
+Epoch: [0][1929/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 5.9867e+00 (6.5947e+00) Acc@1 5.08 ( 0.70) Acc@5 8.20 ( 2.79)
+Epoch: [0][1930/5004] Time 0.248 ( 0.242) Data 0.029 ( 0.028) Loss 5.9492e+00 (6.5944e+00) Acc@1 3.52 ( 0.71) Acc@5 9.38 ( 2.80)
+Epoch: [0][1931/5004] Time 0.245 ( 0.242) Data 0.028 ( 0.028) Loss 6.1214e+00 (6.5942e+00) Acc@1 1.95 ( 0.71) Acc@5 4.69 ( 2.80)
+Epoch: [0][1932/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.028) Loss 5.9829e+00 (6.5938e+00) Acc@1 1.56 ( 0.71) Acc@5 7.81 ( 2.80)
+Epoch: [0][1933/5004] Time 0.244 ( 0.242) Data 0.028 ( 0.028) Loss 5.9468e+00 (6.5935e+00) Acc@1 1.17 ( 0.71) Acc@5 5.86 ( 2.80)
+Epoch: [0][1934/5004] Time 0.247 ( 0.242) Data 0.030 ( 0.028) Loss 5.8616e+00 (6.5931e+00) Acc@1 1.56 ( 0.71) Acc@5 9.38 ( 2.81)
+Epoch: [0][1935/5004] Time 0.250 ( 0.242) Data 0.029 ( 0.028) Loss 5.8081e+00 (6.5927e+00) Acc@1 3.12 ( 0.71) Acc@5 9.38 ( 2.81)
+Epoch: [0][1936/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.028) Loss 6.0418e+00 (6.5924e+00) Acc@1 2.34 ( 0.71) Acc@5 5.86 ( 2.81)
+Epoch: [0][1937/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.028) Loss 5.9114e+00 (6.5921e+00) Acc@1 1.95 ( 0.71) Acc@5 11.33 ( 2.81)
+Epoch: [0][1938/5004] Time 0.244 ( 0.242) Data 0.027 ( 0.028) Loss 6.0987e+00 (6.5918e+00) Acc@1 0.39 ( 0.71) Acc@5 5.47 ( 2.82)
+Epoch: [0][1939/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.028) Loss 5.8739e+00 (6.5915e+00) Acc@1 4.30 ( 0.71) Acc@5 9.38 ( 2.82)
+Epoch: [0][1940/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.028) Loss 5.9725e+00 (6.5911e+00) Acc@1 5.08 ( 0.71) Acc@5 8.20 ( 2.82)
+Epoch: [0][1941/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.028) Loss 6.1321e+00 (6.5909e+00) Acc@1 1.95 ( 0.71) Acc@5 7.42 ( 2.82)
+Epoch: [0][1942/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.028) Loss 5.9141e+00 (6.5906e+00) Acc@1 4.69 ( 0.72) Acc@5 12.89 ( 2.83)
+Epoch: [0][1943/5004] Time 0.246 ( 0.242) Data 0.030 ( 0.028) Loss 5.9960e+00 (6.5903e+00) Acc@1 3.91 ( 0.72) Acc@5 5.47 ( 2.83)
+Epoch: [0][1944/5004] Time 0.250 ( 0.242) Data 0.030 ( 0.028) Loss 5.9443e+00 (6.5899e+00) Acc@1 1.95 ( 0.72) Acc@5 8.59 ( 2.83)
+Epoch: [0][1945/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 6.0335e+00 (6.5896e+00) Acc@1 1.17 ( 0.72) Acc@5 6.25 ( 2.84)
+Epoch: [0][1946/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.028) Loss 6.1037e+00 (6.5894e+00) Acc@1 2.34 ( 0.72) Acc@5 7.03 ( 2.84)
+Epoch: [0][1947/5004] Time 0.243 ( 0.242) Data 0.028 ( 0.028) Loss 5.9687e+00 (6.5891e+00) Acc@1 1.95 ( 0.72) Acc@5 8.20 ( 2.84)
+Epoch: [0][1948/5004] Time 0.250 ( 0.242) Data 0.029 ( 0.028) Loss 5.9866e+00 (6.5888e+00) Acc@1 4.30 ( 0.72) Acc@5 8.59 ( 2.84)
+Epoch: [0][1949/5004] Time 0.249 ( 0.242) Data 0.027 ( 0.028) Loss 5.9921e+00 (6.5885e+00) Acc@1 1.56 ( 0.72) Acc@5 6.64 ( 2.85)
+Epoch: [0][1950/5004] Time 0.246 ( 0.242) Data 0.027 ( 0.028) Loss 6.0327e+00 (6.5882e+00) Acc@1 3.52 ( 0.72) Acc@5 7.03 ( 2.85)
+Epoch: [0][1951/5004] Time 0.256 ( 0.242) Data 0.029 ( 0.028) Loss 5.9912e+00 (6.5879e+00) Acc@1 1.95 ( 0.72) Acc@5 8.98 ( 2.85)
+Epoch: [0][1952/5004] Time 0.248 ( 0.242) Data 0.026 ( 0.028) Loss 5.9756e+00 (6.5876e+00) Acc@1 1.95 ( 0.73) Acc@5 7.42 ( 2.85)
+Epoch: [0][1953/5004] Time 0.249 ( 0.242) Data 0.029 ( 0.028) Loss 6.0451e+00 (6.5873e+00) Acc@1 3.12 ( 0.73) Acc@5 7.03 ( 2.86)
+Epoch: [0][1954/5004] Time 0.249 ( 0.242) Data 0.028 ( 0.028) Loss 6.0157e+00 (6.5870e+00) Acc@1 2.34 ( 0.73) Acc@5 6.64 ( 2.86)
+Epoch: [0][1955/5004] Time 0.248 ( 0.242) Data 0.027 ( 0.028) Loss 5.9688e+00 (6.5867e+00) Acc@1 2.34 ( 0.73) Acc@5 7.81 ( 2.86)
+Epoch: [0][1956/5004] Time 0.244 ( 0.242) Data 0.028 ( 0.028) Loss 5.8886e+00 (6.5863e+00) Acc@1 3.91 ( 0.73) Acc@5 10.55 ( 2.86)
+Epoch: [0][1957/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.028) Loss 5.8466e+00 (6.5859e+00) Acc@1 2.34 ( 0.73) Acc@5 9.38 ( 2.87)
+Epoch: [0][1958/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.028) Loss 5.8956e+00 (6.5856e+00) Acc@1 1.56 ( 0.73) Acc@5 9.77 ( 2.87)
+Epoch: [0][1959/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.028) Loss 5.9688e+00 (6.5853e+00) Acc@1 1.95 ( 0.73) Acc@5 8.20 ( 2.87)
+Epoch: [0][1960/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 5.9942e+00 (6.5850e+00) Acc@1 1.17 ( 0.73) Acc@5 5.47 ( 2.87)
+Epoch: [0][1961/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.028) Loss 6.0276e+00 (6.5847e+00) Acc@1 1.56 ( 0.73) Acc@5 7.03 ( 2.88)
+Epoch: [0][1962/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.028) Loss 5.9884e+00 (6.5844e+00) Acc@1 2.34 ( 0.73) Acc@5 7.03 ( 2.88)
+Epoch: [0][1963/5004] Time 0.248 ( 0.242) Data 0.028 ( 0.028) Loss 5.7954e+00 (6.5840e+00) Acc@1 2.73 ( 0.73) Acc@5 10.16 ( 2.88)
+Epoch: [0][1964/5004] Time 0.253 ( 0.242) Data 0.030 ( 0.028) Loss 5.9892e+00 (6.5837e+00) Acc@1 0.78 ( 0.73) Acc@5 4.30 ( 2.88)
+Epoch: [0][1965/5004] Time 0.245 ( 0.242) Data 0.026 ( 0.028) Loss 6.0576e+00 (6.5834e+00) Acc@1 1.56 ( 0.73) Acc@5 5.47 ( 2.88)
+Epoch: [0][1966/5004] Time 0.245 ( 0.242) Data 0.027 ( 0.028) Loss 6.0374e+00 (6.5831e+00) Acc@1 0.78 ( 0.73) Acc@5 6.25 ( 2.89)
+Epoch: [0][1967/5004] Time 0.243 ( 0.242) Data 0.028 ( 0.028) Loss 6.1042e+00 (6.5829e+00) Acc@1 1.56 ( 0.74) Acc@5 7.81 ( 2.89)
+Epoch: [0][1968/5004] Time 0.246 ( 0.242) Data 0.030 ( 0.028) Loss 6.0168e+00 (6.5826e+00) Acc@1 0.39 ( 0.73) Acc@5 6.64 ( 2.89)
+Epoch: [0][1969/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 6.1416e+00 (6.5824e+00) Acc@1 1.56 ( 0.74) Acc@5 7.03 ( 2.89)
+Epoch: [0][1970/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 6.0121e+00 (6.5821e+00) Acc@1 1.56 ( 0.74) Acc@5 6.64 ( 2.89)
+Epoch: [0][1971/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 6.0888e+00 (6.5818e+00) Acc@1 2.73 ( 0.74) Acc@5 6.25 ( 2.90)
+Epoch: [0][1972/5004] Time 0.248 ( 0.242) Data 0.029 ( 0.028) Loss 6.1364e+00 (6.5816e+00) Acc@1 1.56 ( 0.74) Acc@5 5.86 ( 2.90)
+Epoch: [0][1973/5004] Time 0.246 ( 0.242) Data 0.027 ( 0.028) Loss 5.9949e+00 (6.5813e+00) Acc@1 1.95 ( 0.74) Acc@5 5.08 ( 2.90)
+Epoch: [0][1974/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.028) Loss 6.0629e+00 (6.5810e+00) Acc@1 1.56 ( 0.74) Acc@5 6.64 ( 2.90)
+Epoch: [0][1975/5004] Time 0.251 ( 0.242) Data 0.030 ( 0.028) Loss 6.1133e+00 (6.5808e+00) Acc@1 1.17 ( 0.74) Acc@5 5.08 ( 2.90)
+Epoch: [0][1976/5004] Time 0.247 ( 0.242) Data 0.028 ( 0.028) Loss 5.9585e+00 (6.5805e+00) Acc@1 2.73 ( 0.74) Acc@5 7.81 ( 2.90)
+Epoch: [0][1977/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.028) Loss 6.1257e+00 (6.5803e+00) Acc@1 3.12 ( 0.74) Acc@5 9.38 ( 2.91)
+Epoch: [0][1978/5004] Time 0.243 ( 0.242) Data 0.029 ( 0.028) Loss 5.9141e+00 (6.5799e+00) Acc@1 2.73 ( 0.74) Acc@5 8.20 ( 2.91)
+Epoch: [0][1979/5004] Time 0.245 ( 0.242) Data 0.030 ( 0.028) Loss 5.9143e+00 (6.5796e+00) Acc@1 2.73 ( 0.74) Acc@5 9.38 ( 2.91)
+Epoch: [0][1980/5004] Time 0.245 ( 0.242) Data 0.030 ( 0.028) Loss 6.0018e+00 (6.5793e+00) Acc@1 1.56 ( 0.74) Acc@5 10.94 ( 2.92)
+Epoch: [0][1981/5004] Time 0.245 ( 0.242) Data 0.030 ( 0.028) Loss 5.9938e+00 (6.5790e+00) Acc@1 1.56 ( 0.74) Acc@5 7.42 ( 2.92)
+Epoch: [0][1982/5004] Time 0.245 ( 0.242) Data 0.030 ( 0.028) Loss 6.0544e+00 (6.5787e+00) Acc@1 1.56 ( 0.74) Acc@5 7.81 ( 2.92)
+Epoch: [0][1983/5004] Time 0.244 ( 0.242) Data 0.030 ( 0.028) Loss 5.9936e+00 (6.5784e+00) Acc@1 1.17 ( 0.74) Acc@5 7.42 ( 2.92)
+Epoch: [0][1984/5004] Time 0.246 ( 0.242) Data 0.030 ( 0.028) Loss 6.0161e+00 (6.5782e+00) Acc@1 3.12 ( 0.75) Acc@5 8.59 ( 2.93)
+Epoch: [0][1985/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.028) Loss 5.9068e+00 (6.5778e+00) Acc@1 2.34 ( 0.75) Acc@5 8.59 ( 2.93)
+Epoch: [0][1986/5004] Time 0.248 ( 0.242) Data 0.029 ( 0.028) Loss 6.0421e+00 (6.5776e+00) Acc@1 1.56 ( 0.75) Acc@5 7.42 ( 2.93)
+Epoch: [0][1987/5004] Time 0.242 ( 0.242) Data 0.026 ( 0.028) Loss 5.9807e+00 (6.5773e+00) Acc@1 1.95 ( 0.75) Acc@5 8.98 ( 2.94)
+Epoch: [0][1988/5004] Time 0.247 ( 0.242) Data 0.030 ( 0.028) Loss 5.9405e+00 (6.5769e+00) Acc@1 3.91 ( 0.75) Acc@5 10.16 ( 2.94)
+Epoch: [0][1989/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.028) Loss 5.8365e+00 (6.5766e+00) Acc@1 2.34 ( 0.75) Acc@5 11.33 ( 2.94)
+Epoch: [0][1990/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.028) Loss 5.8715e+00 (6.5762e+00) Acc@1 3.52 ( 0.75) Acc@5 8.98 ( 2.95)
+Epoch: [0][1991/5004] Time 0.244 ( 0.242) Data 0.029 ( 0.028) Loss 6.0791e+00 (6.5760e+00) Acc@1 3.12 ( 0.75) Acc@5 8.20 ( 2.95)
+Epoch: [0][1992/5004] Time 0.251 ( 0.242) Data 0.029 ( 0.028) Loss 5.9537e+00 (6.5756e+00) Acc@1 1.95 ( 0.75) Acc@5 7.81 ( 2.95)
+Epoch: [0][1993/5004] Time 0.244 ( 0.242) Data 0.029 ( 0.028) Loss 5.8042e+00 (6.5753e+00) Acc@1 4.30 ( 0.75) Acc@5 12.89 ( 2.96)
+Epoch: [0][1994/5004] Time 0.250 ( 0.242) Data 0.029 ( 0.028) Loss 6.0225e+00 (6.5750e+00) Acc@1 1.95 ( 0.76) Acc@5 7.03 ( 2.96)
+Epoch: [0][1995/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.028) Loss 5.9325e+00 (6.5747e+00) Acc@1 3.52 ( 0.76) Acc@5 5.86 ( 2.96)
+Epoch: [0][1996/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 5.9694e+00 (6.5744e+00) Acc@1 2.73 ( 0.76) Acc@5 6.64 ( 2.96)
+Epoch: [0][1997/5004] Time 0.244 ( 0.242) Data 0.029 ( 0.028) Loss 5.9379e+00 (6.5740e+00) Acc@1 2.73 ( 0.76) Acc@5 6.25 ( 2.96)
+Epoch: [0][1998/5004] Time 0.246 ( 0.242) Data 0.030 ( 0.028) Loss 6.1620e+00 (6.5738e+00) Acc@1 2.73 ( 0.76) Acc@5 6.25 ( 2.97)
+Epoch: [0][1999/5004] Time 0.246 ( 0.242) Data 0.030 ( 0.028) Loss 6.0152e+00 (6.5736e+00) Acc@1 1.95 ( 0.76) Acc@5 6.25 ( 2.97)
+Epoch: [0][2000/5004] Time 0.247 ( 0.242) Data 0.030 ( 0.028) Loss 5.9895e+00 (6.5733e+00) Acc@1 2.73 ( 0.76) Acc@5 7.42 ( 2.97)
+Epoch: [0][2001/5004] Time 0.249 ( 0.242) Data 0.028 ( 0.028) Loss 5.8409e+00 (6.5729e+00) Acc@1 3.52 ( 0.76) Acc@5 9.38 ( 2.97)
+Epoch: [0][2002/5004] Time 0.244 ( 0.242) Data 0.026 ( 0.028) Loss 5.9589e+00 (6.5726e+00) Acc@1 3.12 ( 0.76) Acc@5 10.55 ( 2.98)
+Epoch: [0][2003/5004] Time 0.248 ( 0.242) Data 0.028 ( 0.028) Loss 5.8399e+00 (6.5722e+00) Acc@1 1.56 ( 0.76) Acc@5 12.11 ( 2.98)
+Epoch: [0][2004/5004] Time 0.255 ( 0.242) Data 0.028 ( 0.028) Loss 5.8988e+00 (6.5719e+00) Acc@1 2.73 ( 0.76) Acc@5 9.38 ( 2.98)
+Epoch: [0][2005/5004] Time 0.249 ( 0.242) Data 0.025 ( 0.028) Loss 5.9288e+00 (6.5716e+00) Acc@1 3.12 ( 0.77) Acc@5 8.98 ( 2.99)
+Epoch: [0][2006/5004] Time 0.247 ( 0.242) Data 0.027 ( 0.028) Loss 6.0114e+00 (6.5713e+00) Acc@1 1.56 ( 0.77) Acc@5 5.86 ( 2.99)
+Epoch: [0][2007/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.028) Loss 5.9575e+00 (6.5710e+00) Acc@1 2.73 ( 0.77) Acc@5 9.38 ( 2.99)
+Epoch: [0][2008/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 5.9549e+00 (6.5707e+00) Acc@1 2.73 ( 0.77) Acc@5 9.77 ( 2.99)
+Epoch: [0][2009/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.028) Loss 5.9477e+00 (6.5704e+00) Acc@1 3.91 ( 0.77) Acc@5 7.81 ( 3.00)
+Epoch: [0][2010/5004] Time 0.244 ( 0.242) Data 0.029 ( 0.028) Loss 5.8826e+00 (6.5700e+00) Acc@1 5.47 ( 0.77) Acc@5 10.94 ( 3.00)
+Epoch: [0][2011/5004] Time 0.245 ( 0.242) Data 0.030 ( 0.028) Loss 6.0914e+00 (6.5698e+00) Acc@1 0.78 ( 0.77) Acc@5 6.25 ( 3.00)
+Epoch: [0][2012/5004] Time 0.248 ( 0.242) Data 0.030 ( 0.028) Loss 5.8012e+00 (6.5694e+00) Acc@1 3.52 ( 0.77) Acc@5 10.16 ( 3.01)
+Epoch: [0][2013/5004] Time 0.246 ( 0.242) Data 0.030 ( 0.028) Loss 5.9086e+00 (6.5691e+00) Acc@1 1.95 ( 0.77) Acc@5 10.94 ( 3.01)
+Epoch: [0][2014/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.028) Loss 5.9897e+00 (6.5688e+00) Acc@1 3.52 ( 0.78) Acc@5 7.42 ( 3.01)
+Epoch: [0][2015/5004] Time 0.250 ( 0.242) Data 0.029 ( 0.028) Loss 5.8610e+00 (6.5684e+00) Acc@1 2.34 ( 0.78) Acc@5 6.25 ( 3.01)
+Epoch: [0][2016/5004] Time 0.249 ( 0.242) Data 0.028 ( 0.028) Loss 5.9041e+00 (6.5681e+00) Acc@1 3.12 ( 0.78) Acc@5 7.81 ( 3.02)
+Epoch: [0][2017/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.028) Loss 5.8540e+00 (6.5678e+00) Acc@1 2.34 ( 0.78) Acc@5 7.81 ( 3.02)
+Epoch: [0][2018/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 5.9422e+00 (6.5674e+00) Acc@1 3.91 ( 0.78) Acc@5 8.59 ( 3.02)
+Epoch: [0][2019/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.028) Loss 6.0304e+00 (6.5672e+00) Acc@1 1.56 ( 0.78) Acc@5 8.20 ( 3.02)
+Epoch: [0][2020/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 5.8840e+00 (6.5668e+00) Acc@1 3.91 ( 0.78) Acc@5 10.94 ( 3.03)
+Epoch: [0][2021/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.028) Loss 5.9754e+00 (6.5665e+00) Acc@1 0.78 ( 0.78) Acc@5 4.69 ( 3.03)
+Epoch: [0][2022/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.028) Loss 5.8764e+00 (6.5662e+00) Acc@1 3.12 ( 0.78) Acc@5 10.94 ( 3.03)
+Epoch: [0][2023/5004] Time 0.245 ( 0.242) Data 0.029 ( 0.028) Loss 5.9434e+00 (6.5659e+00) Acc@1 3.91 ( 0.78) Acc@5 9.38 ( 3.04)
+Epoch: [0][2024/5004] Time 0.249 ( 0.242) Data 0.029 ( 0.028) Loss 5.8338e+00 (6.5655e+00) Acc@1 2.34 ( 0.79) Acc@5 9.77 ( 3.04)
+Epoch: [0][2025/5004] Time 0.246 ( 0.242) Data 
0.029 ( 0.028) Loss 5.8620e+00 (6.5652e+00) Acc@1 3.12 ( 0.79) Acc@5 11.33 ( 3.04) +Epoch: [0][2026/5004] Time 0.248 ( 0.242) Data 0.029 ( 0.028) Loss 6.0048e+00 (6.5649e+00) Acc@1 1.56 ( 0.79) Acc@5 6.25 ( 3.05) +Epoch: [0][2027/5004] Time 0.244 ( 0.242) Data 0.029 ( 0.028) Loss 5.8864e+00 (6.5646e+00) Acc@1 1.95 ( 0.79) Acc@5 9.38 ( 3.05) +Epoch: [0][2028/5004] Time 0.258 ( 0.242) Data 0.030 ( 0.028) Loss 6.0824e+00 (6.5643e+00) Acc@1 3.12 ( 0.79) Acc@5 6.64 ( 3.05) +Epoch: [0][2029/5004] Time 0.244 ( 0.242) Data 0.018 ( 0.028) Loss 5.9145e+00 (6.5640e+00) Acc@1 3.52 ( 0.79) Acc@5 10.55 ( 3.05) +Epoch: [0][2030/5004] Time 0.250 ( 0.242) Data 0.023 ( 0.028) Loss 5.8193e+00 (6.5637e+00) Acc@1 3.52 ( 0.79) Acc@5 7.81 ( 3.06) +Epoch: [0][2031/5004] Time 0.252 ( 0.242) Data 0.022 ( 0.028) Loss 6.0788e+00 (6.5634e+00) Acc@1 2.34 ( 0.79) Acc@5 10.94 ( 3.06) +Epoch: [0][2032/5004] Time 0.246 ( 0.242) Data 0.019 ( 0.028) Loss 5.8428e+00 (6.5631e+00) Acc@1 1.95 ( 0.79) Acc@5 10.16 ( 3.06) +Epoch: [0][2033/5004] Time 0.242 ( 0.242) Data 0.021 ( 0.028) Loss 5.8613e+00 (6.5627e+00) Acc@1 1.56 ( 0.79) Acc@5 7.81 ( 3.07) +Epoch: [0][2034/5004] Time 0.248 ( 0.242) Data 0.024 ( 0.028) Loss 5.7906e+00 (6.5623e+00) Acc@1 4.69 ( 0.79) Acc@5 14.45 ( 3.07) +Epoch: [0][2035/5004] Time 0.253 ( 0.242) Data 0.021 ( 0.028) Loss 5.8872e+00 (6.5620e+00) Acc@1 1.95 ( 0.80) Acc@5 12.50 ( 3.08) +Epoch: [0][2036/5004] Time 0.243 ( 0.242) Data 0.017 ( 0.028) Loss 5.8682e+00 (6.5617e+00) Acc@1 2.34 ( 0.80) Acc@5 7.81 ( 3.08) +Epoch: [0][2037/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.028) Loss 5.9745e+00 (6.5614e+00) Acc@1 2.34 ( 0.80) Acc@5 8.59 ( 3.08) +Epoch: [0][2038/5004] Time 0.241 ( 0.242) Data 0.021 ( 0.028) Loss 5.7857e+00 (6.5610e+00) Acc@1 3.91 ( 0.80) Acc@5 10.16 ( 3.08) +Epoch: [0][2039/5004] Time 0.246 ( 0.242) Data 0.023 ( 0.028) Loss 5.9021e+00 (6.5607e+00) Acc@1 2.73 ( 0.80) Acc@5 7.42 ( 3.09) +Epoch: [0][2040/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.028) Loss 5.8503e+00 
(6.5603e+00) Acc@1 5.47 ( 0.80) Acc@5 10.94 ( 3.09) +Epoch: [0][2041/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.028) Loss 5.8827e+00 (6.5600e+00) Acc@1 3.12 ( 0.80) Acc@5 11.33 ( 3.09) +Epoch: [0][2042/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.028) Loss 6.0456e+00 (6.5597e+00) Acc@1 3.12 ( 0.80) Acc@5 7.81 ( 3.10) +Epoch: [0][2043/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.028) Loss 5.7930e+00 (6.5594e+00) Acc@1 0.78 ( 0.80) Acc@5 8.98 ( 3.10) +Epoch: [0][2044/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.028) Loss 6.0142e+00 (6.5591e+00) Acc@1 1.56 ( 0.80) Acc@5 5.86 ( 3.10) +Epoch: [0][2045/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.028) Loss 6.0560e+00 (6.5589e+00) Acc@1 2.73 ( 0.81) Acc@5 8.20 ( 3.10) +Epoch: [0][2046/5004] Time 0.247 ( 0.242) Data 0.023 ( 0.028) Loss 5.8860e+00 (6.5585e+00) Acc@1 2.34 ( 0.81) Acc@5 9.77 ( 3.11) +Epoch: [0][2047/5004] Time 0.248 ( 0.242) Data 0.021 ( 0.028) Loss 5.9062e+00 (6.5582e+00) Acc@1 2.34 ( 0.81) Acc@5 9.77 ( 3.11) +Epoch: [0][2048/5004] Time 0.232 ( 0.242) Data 0.018 ( 0.028) Loss 5.7776e+00 (6.5578e+00) Acc@1 2.73 ( 0.81) Acc@5 12.11 ( 3.11) +Epoch: [0][2049/5004] Time 0.243 ( 0.242) Data 0.026 ( 0.028) Loss 6.0026e+00 (6.5576e+00) Acc@1 2.34 ( 0.81) Acc@5 7.03 ( 3.12) +Epoch: [0][2050/5004] Time 0.238 ( 0.242) Data 0.025 ( 0.028) Loss 5.8630e+00 (6.5572e+00) Acc@1 3.12 ( 0.81) Acc@5 11.33 ( 3.12) +Epoch: [0][2051/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.028) Loss 5.8193e+00 (6.5569e+00) Acc@1 1.56 ( 0.81) Acc@5 10.16 ( 3.12) +Epoch: [0][2052/5004] Time 0.239 ( 0.242) Data 0.026 ( 0.028) Loss 5.9056e+00 (6.5565e+00) Acc@1 1.95 ( 0.81) Acc@5 9.38 ( 3.13) +Epoch: [0][2053/5004] Time 0.244 ( 0.242) Data 0.026 ( 0.028) Loss 5.9643e+00 (6.5563e+00) Acc@1 1.95 ( 0.81) Acc@5 10.55 ( 3.13) +Epoch: [0][2054/5004] Time 0.241 ( 0.242) Data 0.025 ( 0.028) Loss 5.8846e+00 (6.5559e+00) Acc@1 2.34 ( 0.81) Acc@5 8.59 ( 3.13) +Epoch: [0][2055/5004] Time 0.241 ( 0.242) Data 0.025 ( 0.028) Loss 5.8601e+00 (6.5556e+00) Acc@1 2.34 ( 0.81) Acc@5 10.55 
( 3.14) +Epoch: [0][2056/5004] Time 0.248 ( 0.242) Data 0.025 ( 0.028) Loss 5.9434e+00 (6.5553e+00) Acc@1 3.12 ( 0.81) Acc@5 10.55 ( 3.14) +Epoch: [0][2057/5004] Time 0.233 ( 0.242) Data 0.020 ( 0.028) Loss 5.9664e+00 (6.5550e+00) Acc@1 1.95 ( 0.81) Acc@5 5.86 ( 3.14) +Epoch: [0][2058/5004] Time 0.239 ( 0.242) Data 0.026 ( 0.028) Loss 5.9856e+00 (6.5547e+00) Acc@1 0.78 ( 0.81) Acc@5 8.20 ( 3.14) +Epoch: [0][2059/5004] Time 0.240 ( 0.242) Data 0.026 ( 0.028) Loss 5.8161e+00 (6.5544e+00) Acc@1 2.34 ( 0.82) Acc@5 9.38 ( 3.15) +Epoch: [0][2060/5004] Time 0.244 ( 0.242) Data 0.027 ( 0.028) Loss 5.8699e+00 (6.5540e+00) Acc@1 3.12 ( 0.82) Acc@5 9.38 ( 3.15) +Epoch: [0][2061/5004] Time 0.245 ( 0.242) Data 0.026 ( 0.028) Loss 5.9174e+00 (6.5537e+00) Acc@1 2.34 ( 0.82) Acc@5 9.38 ( 3.15) +Epoch: [0][2062/5004] Time 0.239 ( 0.242) Data 0.024 ( 0.028) Loss 5.9008e+00 (6.5534e+00) Acc@1 2.34 ( 0.82) Acc@5 8.20 ( 3.16) +Epoch: [0][2063/5004] Time 0.244 ( 0.242) Data 0.026 ( 0.028) Loss 6.0884e+00 (6.5532e+00) Acc@1 1.56 ( 0.82) Acc@5 5.08 ( 3.16) +Epoch: [0][2064/5004] Time 0.245 ( 0.242) Data 0.025 ( 0.028) Loss 5.8733e+00 (6.5529e+00) Acc@1 3.91 ( 0.82) Acc@5 10.16 ( 3.16) +Epoch: [0][2065/5004] Time 0.239 ( 0.242) Data 0.025 ( 0.028) Loss 5.9692e+00 (6.5526e+00) Acc@1 2.73 ( 0.82) Acc@5 7.03 ( 3.16) +Epoch: [0][2066/5004] Time 0.244 ( 0.242) Data 0.026 ( 0.028) Loss 5.8624e+00 (6.5522e+00) Acc@1 3.12 ( 0.82) Acc@5 8.20 ( 3.16) +Epoch: [0][2067/5004] Time 0.241 ( 0.242) Data 0.025 ( 0.028) Loss 5.9450e+00 (6.5519e+00) Acc@1 3.91 ( 0.82) Acc@5 8.98 ( 3.17) +Epoch: [0][2068/5004] Time 0.247 ( 0.242) Data 0.024 ( 0.028) Loss 5.8981e+00 (6.5516e+00) Acc@1 3.12 ( 0.82) Acc@5 9.77 ( 3.17) +Epoch: [0][2069/5004] Time 0.250 ( 0.242) Data 0.023 ( 0.028) Loss 6.0162e+00 (6.5514e+00) Acc@1 2.34 ( 0.82) Acc@5 8.59 ( 3.17) +Epoch: [0][2070/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.028) Loss 5.9350e+00 (6.5511e+00) Acc@1 3.52 ( 0.83) Acc@5 6.25 ( 3.17) +Epoch: [0][2071/5004] Time 0.250 ( 
0.242) Data 0.021 ( 0.028) Loss 5.9278e+00 (6.5508e+00) Acc@1 3.91 ( 0.83) Acc@5 9.77 ( 3.18) +Epoch: [0][2072/5004] Time 0.245 ( 0.242) Data 0.020 ( 0.028) Loss 5.9670e+00 (6.5505e+00) Acc@1 0.78 ( 0.83) Acc@5 5.47 ( 3.18) +Epoch: [0][2073/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.028) Loss 5.9824e+00 (6.5502e+00) Acc@1 1.95 ( 0.83) Acc@5 5.86 ( 3.18) +Epoch: [0][2074/5004] Time 0.241 ( 0.242) Data 0.021 ( 0.028) Loss 5.8647e+00 (6.5499e+00) Acc@1 2.73 ( 0.83) Acc@5 12.50 ( 3.18) +Epoch: [0][2075/5004] Time 0.245 ( 0.242) Data 0.024 ( 0.028) Loss 5.9682e+00 (6.5496e+00) Acc@1 1.95 ( 0.83) Acc@5 8.20 ( 3.19) +Epoch: [0][2076/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.028) Loss 5.9358e+00 (6.5493e+00) Acc@1 1.95 ( 0.83) Acc@5 11.72 ( 3.19) +Epoch: [0][2077/5004] Time 0.251 ( 0.242) Data 0.024 ( 0.028) Loss 5.8024e+00 (6.5490e+00) Acc@1 4.69 ( 0.83) Acc@5 13.67 ( 3.20) +Epoch: [0][2078/5004] Time 0.236 ( 0.242) Data 0.016 ( 0.028) Loss 6.0844e+00 (6.5487e+00) Acc@1 1.56 ( 0.83) Acc@5 5.86 ( 3.20) +Epoch: [0][2079/5004] Time 0.242 ( 0.242) Data 0.024 ( 0.028) Loss 5.9816e+00 (6.5485e+00) Acc@1 2.34 ( 0.83) Acc@5 10.16 ( 3.20) +Epoch: [0][2080/5004] Time 0.246 ( 0.242) Data 0.024 ( 0.027) Loss 5.8619e+00 (6.5481e+00) Acc@1 3.12 ( 0.83) Acc@5 7.81 ( 3.20) +Epoch: [0][2081/5004] Time 0.247 ( 0.242) Data 0.023 ( 0.027) Loss 6.0808e+00 (6.5479e+00) Acc@1 3.52 ( 0.84) Acc@5 7.81 ( 3.20) +Epoch: [0][2082/5004] Time 0.239 ( 0.242) Data 0.019 ( 0.027) Loss 6.0119e+00 (6.5476e+00) Acc@1 2.34 ( 0.84) Acc@5 7.03 ( 3.21) +Epoch: [0][2083/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 5.9539e+00 (6.5474e+00) Acc@1 2.34 ( 0.84) Acc@5 9.77 ( 3.21) +Epoch: [0][2084/5004] Time 0.249 ( 0.242) Data 0.024 ( 0.027) Loss 5.8887e+00 (6.5470e+00) Acc@1 3.91 ( 0.84) Acc@5 8.98 ( 3.21) +Epoch: [0][2085/5004] Time 0.241 ( 0.242) Data 0.021 ( 0.027) Loss 5.8032e+00 (6.5467e+00) Acc@1 3.91 ( 0.84) Acc@5 8.59 ( 3.22) +Epoch: [0][2086/5004] Time 0.253 ( 0.242) Data 0.024 ( 0.027) Loss 5.8994e+00 
(6.5464e+00) Acc@1 3.52 ( 0.84) Acc@5 9.77 ( 3.22) +Epoch: [0][2087/5004] Time 0.250 ( 0.242) Data 0.020 ( 0.027) Loss 5.9913e+00 (6.5461e+00) Acc@1 3.52 ( 0.84) Acc@5 8.59 ( 3.22) +Epoch: [0][2088/5004] Time 0.243 ( 0.242) Data 0.017 ( 0.027) Loss 5.7999e+00 (6.5458e+00) Acc@1 1.56 ( 0.84) Acc@5 10.55 ( 3.22) +Epoch: [0][2089/5004] Time 0.246 ( 0.243) Data 0.022 ( 0.027) Loss 5.9834e+00 (6.5455e+00) Acc@1 1.56 ( 0.84) Acc@5 7.81 ( 3.23) +Epoch: [0][2090/5004] Time 0.252 ( 0.243) Data 0.022 ( 0.027) Loss 5.9327e+00 (6.5452e+00) Acc@1 3.91 ( 0.84) Acc@5 9.77 ( 3.23) +Epoch: [0][2091/5004] Time 0.243 ( 0.243) Data 0.019 ( 0.027) Loss 5.9495e+00 (6.5449e+00) Acc@1 1.95 ( 0.85) Acc@5 11.33 ( 3.23) +Epoch: [0][2092/5004] Time 0.244 ( 0.243) Data 0.021 ( 0.027) Loss 5.8878e+00 (6.5446e+00) Acc@1 0.78 ( 0.85) Acc@5 7.42 ( 3.24) +Epoch: [0][2093/5004] Time 0.247 ( 0.243) Data 0.022 ( 0.027) Loss 6.0339e+00 (6.5443e+00) Acc@1 3.12 ( 0.85) Acc@5 7.03 ( 3.24) +Epoch: [0][2094/5004] Time 0.250 ( 0.243) Data 0.021 ( 0.027) Loss 6.0302e+00 (6.5441e+00) Acc@1 3.12 ( 0.85) Acc@5 7.42 ( 3.24) +Epoch: [0][2095/5004] Time 0.235 ( 0.243) Data 0.015 ( 0.027) Loss 5.9281e+00 (6.5438e+00) Acc@1 2.34 ( 0.85) Acc@5 10.16 ( 3.24) +Epoch: [0][2096/5004] Time 0.248 ( 0.243) Data 0.023 ( 0.027) Loss 5.9763e+00 (6.5435e+00) Acc@1 2.34 ( 0.85) Acc@5 6.25 ( 3.24) +Epoch: [0][2097/5004] Time 0.237 ( 0.243) Data 0.018 ( 0.027) Loss 5.8048e+00 (6.5432e+00) Acc@1 2.34 ( 0.85) Acc@5 8.59 ( 3.25) +Epoch: [0][2098/5004] Time 0.242 ( 0.243) Data 0.024 ( 0.027) Loss 6.0138e+00 (6.5429e+00) Acc@1 0.39 ( 0.85) Acc@5 6.25 ( 3.25) +Epoch: [0][2099/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.9200e+00 (6.5426e+00) Acc@1 3.52 ( 0.85) Acc@5 9.77 ( 3.25) +Epoch: [0][2100/5004] Time 0.244 ( 0.243) Data 0.023 ( 0.027) Loss 5.8339e+00 (6.5423e+00) Acc@1 2.34 ( 0.85) Acc@5 10.16 ( 3.25) +Epoch: [0][2101/5004] Time 0.240 ( 0.243) Data 0.023 ( 0.027) Loss 5.8638e+00 (6.5420e+00) Acc@1 2.73 ( 0.85) Acc@5 10.94 ( 
3.26) +Epoch: [0][2102/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.8095e+00 (6.5416e+00) Acc@1 3.52 ( 0.85) Acc@5 13.28 ( 3.26) +Epoch: [0][2103/5004] Time 0.239 ( 0.243) Data 0.023 ( 0.027) Loss 5.8449e+00 (6.5413e+00) Acc@1 4.69 ( 0.86) Acc@5 8.59 ( 3.27) +Epoch: [0][2104/5004] Time 0.253 ( 0.243) Data 0.025 ( 0.027) Loss 5.9644e+00 (6.5410e+00) Acc@1 3.12 ( 0.86) Acc@5 6.64 ( 3.27) +Epoch: [0][2105/5004] Time 0.233 ( 0.243) Data 0.016 ( 0.027) Loss 5.7913e+00 (6.5407e+00) Acc@1 3.12 ( 0.86) Acc@5 8.98 ( 3.27) +Epoch: [0][2106/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.8011e+00 (6.5403e+00) Acc@1 2.73 ( 0.86) Acc@5 10.16 ( 3.27) +Epoch: [0][2107/5004] Time 0.245 ( 0.243) Data 0.022 ( 0.027) Loss 5.9184e+00 (6.5400e+00) Acc@1 1.17 ( 0.86) Acc@5 8.59 ( 3.28) +Epoch: [0][2108/5004] Time 0.249 ( 0.243) Data 0.022 ( 0.027) Loss 5.8091e+00 (6.5397e+00) Acc@1 3.12 ( 0.86) Acc@5 10.55 ( 3.28) +Epoch: [0][2109/5004] Time 0.241 ( 0.243) Data 0.021 ( 0.027) Loss 5.8889e+00 (6.5394e+00) Acc@1 2.34 ( 0.86) Acc@5 11.33 ( 3.28) +Epoch: [0][2110/5004] Time 0.246 ( 0.243) Data 0.023 ( 0.027) Loss 5.8140e+00 (6.5390e+00) Acc@1 1.95 ( 0.86) Acc@5 6.25 ( 3.28) +Epoch: [0][2111/5004] Time 0.239 ( 0.243) Data 0.018 ( 0.027) Loss 5.7781e+00 (6.5387e+00) Acc@1 1.95 ( 0.86) Acc@5 9.77 ( 3.29) +Epoch: [0][2112/5004] Time 0.244 ( 0.243) Data 0.022 ( 0.027) Loss 5.8968e+00 (6.5384e+00) Acc@1 2.34 ( 0.86) Acc@5 7.03 ( 3.29) +Epoch: [0][2113/5004] Time 0.248 ( 0.243) Data 0.023 ( 0.027) Loss 5.8197e+00 (6.5380e+00) Acc@1 2.73 ( 0.86) Acc@5 11.72 ( 3.29) +Epoch: [0][2114/5004] Time 0.241 ( 0.243) Data 0.021 ( 0.027) Loss 5.9279e+00 (6.5377e+00) Acc@1 2.73 ( 0.86) Acc@5 6.64 ( 3.29) +Epoch: [0][2115/5004] Time 0.243 ( 0.243) Data 0.023 ( 0.027) Loss 5.8592e+00 (6.5374e+00) Acc@1 4.30 ( 0.87) Acc@5 9.77 ( 3.30) +Epoch: [0][2116/5004] Time 0.250 ( 0.243) Data 0.023 ( 0.027) Loss 5.8295e+00 (6.5371e+00) Acc@1 2.73 ( 0.87) Acc@5 9.38 ( 3.30) +Epoch: [0][2117/5004] Time 0.249 ( 
0.243) Data 0.020 ( 0.027) Loss 5.7980e+00 (6.5367e+00) Acc@1 2.73 ( 0.87) Acc@5 10.55 ( 3.30) +Epoch: [0][2118/5004] Time 0.244 ( 0.243) Data 0.020 ( 0.027) Loss 5.6988e+00 (6.5363e+00) Acc@1 3.12 ( 0.87) Acc@5 11.72 ( 3.31) +Epoch: [0][2119/5004] Time 0.242 ( 0.243) Data 0.022 ( 0.027) Loss 5.8664e+00 (6.5360e+00) Acc@1 2.73 ( 0.87) Acc@5 9.38 ( 3.31) +Epoch: [0][2120/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.9330e+00 (6.5357e+00) Acc@1 1.95 ( 0.87) Acc@5 8.59 ( 3.31) +Epoch: [0][2121/5004] Time 0.244 ( 0.243) Data 0.023 ( 0.027) Loss 6.0490e+00 (6.5355e+00) Acc@1 0.39 ( 0.87) Acc@5 5.86 ( 3.31) +Epoch: [0][2122/5004] Time 0.243 ( 0.243) Data 0.022 ( 0.027) Loss 5.7167e+00 (6.5351e+00) Acc@1 3.52 ( 0.87) Acc@5 6.64 ( 3.32) +Epoch: [0][2123/5004] Time 0.250 ( 0.243) Data 0.026 ( 0.027) Loss 5.8518e+00 (6.5348e+00) Acc@1 2.73 ( 0.87) Acc@5 8.59 ( 3.32) +Epoch: [0][2124/5004] Time 0.243 ( 0.243) Data 0.020 ( 0.027) Loss 6.0362e+00 (6.5346e+00) Acc@1 2.34 ( 0.87) Acc@5 7.03 ( 3.32) +Epoch: [0][2125/5004] Time 0.242 ( 0.243) Data 0.023 ( 0.027) Loss 5.7891e+00 (6.5342e+00) Acc@1 3.12 ( 0.87) Acc@5 7.42 ( 3.32) +Epoch: [0][2126/5004] Time 0.248 ( 0.243) Data 0.024 ( 0.027) Loss 5.7855e+00 (6.5339e+00) Acc@1 2.34 ( 0.87) Acc@5 10.16 ( 3.33) +Epoch: [0][2127/5004] Time 0.241 ( 0.243) Data 0.023 ( 0.027) Loss 5.8951e+00 (6.5336e+00) Acc@1 3.91 ( 0.88) Acc@5 13.67 ( 3.33) +Epoch: [0][2128/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.9346e+00 (6.5333e+00) Acc@1 4.30 ( 0.88) Acc@5 11.72 ( 3.33) +Epoch: [0][2129/5004] Time 0.242 ( 0.243) Data 0.023 ( 0.027) Loss 5.9133e+00 (6.5330e+00) Acc@1 1.56 ( 0.88) Acc@5 8.98 ( 3.34) +Epoch: [0][2130/5004] Time 0.242 ( 0.243) Data 0.023 ( 0.027) Loss 5.8255e+00 (6.5327e+00) Acc@1 2.34 ( 0.88) Acc@5 7.81 ( 3.34) +Epoch: [0][2131/5004] Time 0.244 ( 0.243) Data 0.023 ( 0.027) Loss 5.9455e+00 (6.5324e+00) Acc@1 2.73 ( 0.88) Acc@5 8.59 ( 3.34) +Epoch: [0][2132/5004] Time 0.243 ( 0.243) Data 0.023 ( 0.027) Loss 5.8678e+00 
(6.5321e+00) Acc@1 3.12 ( 0.88) Acc@5 11.72 ( 3.35) +Epoch: [0][2133/5004] Time 0.244 ( 0.243) Data 0.022 ( 0.027) Loss 6.0361e+00 (6.5318e+00) Acc@1 2.34 ( 0.88) Acc@5 8.98 ( 3.35) +Epoch: [0][2134/5004] Time 0.242 ( 0.243) Data 0.023 ( 0.027) Loss 5.6729e+00 (6.5314e+00) Acc@1 3.91 ( 0.88) Acc@5 12.11 ( 3.35) +Epoch: [0][2135/5004] Time 0.247 ( 0.243) Data 0.023 ( 0.027) Loss 5.9632e+00 (6.5312e+00) Acc@1 4.30 ( 0.88) Acc@5 10.55 ( 3.36) +Epoch: [0][2136/5004] Time 0.243 ( 0.243) Data 0.020 ( 0.027) Loss 5.8744e+00 (6.5309e+00) Acc@1 2.34 ( 0.88) Acc@5 8.20 ( 3.36) +Epoch: [0][2137/5004] Time 0.241 ( 0.243) Data 0.023 ( 0.027) Loss 5.8039e+00 (6.5305e+00) Acc@1 3.91 ( 0.89) Acc@5 9.38 ( 3.36) +Epoch: [0][2138/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 6.1014e+00 (6.5303e+00) Acc@1 1.17 ( 0.89) Acc@5 6.64 ( 3.36) +Epoch: [0][2139/5004] Time 0.248 ( 0.243) Data 0.023 ( 0.027) Loss 5.9379e+00 (6.5300e+00) Acc@1 3.12 ( 0.89) Acc@5 8.98 ( 3.36) +Epoch: [0][2140/5004] Time 0.241 ( 0.243) Data 0.021 ( 0.027) Loss 5.8855e+00 (6.5297e+00) Acc@1 3.91 ( 0.89) Acc@5 9.77 ( 3.37) +Epoch: [0][2141/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.8670e+00 (6.5294e+00) Acc@1 2.34 ( 0.89) Acc@5 8.20 ( 3.37) +Epoch: [0][2142/5004] Time 0.239 ( 0.243) Data 0.023 ( 0.027) Loss 5.8268e+00 (6.5291e+00) Acc@1 4.69 ( 0.89) Acc@5 10.94 ( 3.37) +Epoch: [0][2143/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.8919e+00 (6.5288e+00) Acc@1 1.95 ( 0.89) Acc@5 8.59 ( 3.38) +Epoch: [0][2144/5004] Time 0.246 ( 0.243) Data 0.022 ( 0.027) Loss 5.8918e+00 (6.5285e+00) Acc@1 3.91 ( 0.89) Acc@5 8.98 ( 3.38) +Epoch: [0][2145/5004] Time 0.241 ( 0.243) Data 0.022 ( 0.027) Loss 5.9585e+00 (6.5282e+00) Acc@1 0.78 ( 0.89) Acc@5 8.20 ( 3.38) +Epoch: [0][2146/5004] Time 0.249 ( 0.243) Data 0.024 ( 0.027) Loss 6.0008e+00 (6.5280e+00) Acc@1 2.73 ( 0.89) Acc@5 8.98 ( 3.38) +Epoch: [0][2147/5004] Time 0.239 ( 0.243) Data 0.020 ( 0.027) Loss 6.0698e+00 (6.5278e+00) Acc@1 1.56 ( 0.89) Acc@5 9.77 ( 
3.39) +Epoch: [0][2148/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.9662e+00 (6.5275e+00) Acc@1 1.95 ( 0.89) Acc@5 10.55 ( 3.39) +Epoch: [0][2149/5004] Time 0.242 ( 0.243) Data 0.020 ( 0.027) Loss 5.9095e+00 (6.5272e+00) Acc@1 3.52 ( 0.90) Acc@5 9.38 ( 3.39) +Epoch: [0][2150/5004] Time 0.242 ( 0.243) Data 0.022 ( 0.027) Loss 5.8971e+00 (6.5269e+00) Acc@1 1.56 ( 0.90) Acc@5 8.59 ( 3.40) +Epoch: [0][2151/5004] Time 0.243 ( 0.243) Data 0.023 ( 0.027) Loss 5.7985e+00 (6.5266e+00) Acc@1 4.30 ( 0.90) Acc@5 12.11 ( 3.40) +Epoch: [0][2152/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.8820e+00 (6.5263e+00) Acc@1 2.34 ( 0.90) Acc@5 8.98 ( 3.40) +Epoch: [0][2153/5004] Time 0.242 ( 0.243) Data 0.020 ( 0.027) Loss 5.8930e+00 (6.5260e+00) Acc@1 4.69 ( 0.90) Acc@5 9.77 ( 3.40) +Epoch: [0][2154/5004] Time 0.241 ( 0.243) Data 0.022 ( 0.027) Loss 5.9247e+00 (6.5257e+00) Acc@1 3.12 ( 0.90) Acc@5 7.42 ( 3.41) +Epoch: [0][2155/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.9281e+00 (6.5255e+00) Acc@1 5.86 ( 0.90) Acc@5 9.38 ( 3.41) +Epoch: [0][2156/5004] Time 0.244 ( 0.243) Data 0.023 ( 0.027) Loss 5.9409e+00 (6.5252e+00) Acc@1 1.56 ( 0.90) Acc@5 6.64 ( 3.41) +Epoch: [0][2157/5004] Time 0.243 ( 0.243) Data 0.022 ( 0.027) Loss 5.8453e+00 (6.5249e+00) Acc@1 1.17 ( 0.90) Acc@5 8.59 ( 3.41) +Epoch: [0][2158/5004] Time 0.247 ( 0.243) Data 0.022 ( 0.027) Loss 5.9049e+00 (6.5246e+00) Acc@1 3.52 ( 0.90) Acc@5 9.77 ( 3.42) +Epoch: [0][2159/5004] Time 0.244 ( 0.243) Data 0.020 ( 0.027) Loss 5.7563e+00 (6.5242e+00) Acc@1 3.91 ( 0.91) Acc@5 12.11 ( 3.42) +Epoch: [0][2160/5004] Time 0.254 ( 0.243) Data 0.022 ( 0.027) Loss 5.7883e+00 (6.5239e+00) Acc@1 3.12 ( 0.91) Acc@5 7.42 ( 3.42) +Epoch: [0][2161/5004] Time 0.240 ( 0.243) Data 0.018 ( 0.027) Loss 5.9086e+00 (6.5236e+00) Acc@1 4.69 ( 0.91) Acc@5 12.11 ( 3.43) +Epoch: [0][2162/5004] Time 0.242 ( 0.243) Data 0.023 ( 0.027) Loss 6.0362e+00 (6.5234e+00) Acc@1 2.34 ( 0.91) Acc@5 7.42 ( 3.43) +Epoch: [0][2163/5004] Time 0.243 ( 
0.243) Data 0.023 ( 0.027) Loss 5.9312e+00 (6.5231e+00) Acc@1 3.12 ( 0.91) Acc@5 13.28 ( 3.43) +Epoch: [0][2164/5004] Time 0.238 ( 0.243) Data 0.022 ( 0.027) Loss 5.8872e+00 (6.5228e+00) Acc@1 3.52 ( 0.91) Acc@5 11.33 ( 3.44) +Epoch: [0][2165/5004] Time 0.245 ( 0.243) Data 0.026 ( 0.027) Loss 5.7634e+00 (6.5225e+00) Acc@1 4.69 ( 0.91) Acc@5 13.28 ( 3.44) +Epoch: [0][2166/5004] Time 0.236 ( 0.243) Data 0.020 ( 0.027) Loss 5.8251e+00 (6.5221e+00) Acc@1 4.30 ( 0.92) Acc@5 10.55 ( 3.44) +Epoch: [0][2167/5004] Time 0.240 ( 0.243) Data 0.026 ( 0.027) Loss 5.8679e+00 (6.5218e+00) Acc@1 2.73 ( 0.92) Acc@5 10.16 ( 3.45) +Epoch: [0][2168/5004] Time 0.242 ( 0.243) Data 0.026 ( 0.027) Loss 5.9161e+00 (6.5216e+00) Acc@1 2.34 ( 0.92) Acc@5 9.38 ( 3.45) +Epoch: [0][2169/5004] Time 0.241 ( 0.243) Data 0.026 ( 0.027) Loss 5.9183e+00 (6.5213e+00) Acc@1 2.73 ( 0.92) Acc@5 7.03 ( 3.45) +Epoch: [0][2170/5004] Time 0.241 ( 0.243) Data 0.026 ( 0.027) Loss 5.7592e+00 (6.5209e+00) Acc@1 2.34 ( 0.92) Acc@5 10.55 ( 3.45) +Epoch: [0][2171/5004] Time 0.248 ( 0.243) Data 0.029 ( 0.027) Loss 5.8421e+00 (6.5206e+00) Acc@1 4.30 ( 0.92) Acc@5 10.16 ( 3.46) +Epoch: [0][2172/5004] Time 0.243 ( 0.243) Data 0.025 ( 0.027) Loss 5.8326e+00 (6.5203e+00) Acc@1 4.69 ( 0.92) Acc@5 10.55 ( 3.46) +Epoch: [0][2173/5004] Time 0.249 ( 0.243) Data 0.029 ( 0.027) Loss 5.9787e+00 (6.5200e+00) Acc@1 3.12 ( 0.92) Acc@5 11.72 ( 3.46) +Epoch: [0][2174/5004] Time 0.235 ( 0.243) Data 0.021 ( 0.027) Loss 6.0362e+00 (6.5198e+00) Acc@1 1.56 ( 0.92) Acc@5 7.81 ( 3.47) +Epoch: [0][2175/5004] Time 0.240 ( 0.243) Data 0.026 ( 0.027) Loss 5.9595e+00 (6.5196e+00) Acc@1 2.34 ( 0.92) Acc@5 10.55 ( 3.47) +Epoch: [0][2176/5004] Time 0.242 ( 0.243) Data 0.027 ( 0.027) Loss 5.8831e+00 (6.5193e+00) Acc@1 1.95 ( 0.92) Acc@5 9.77 ( 3.47) +Epoch: [0][2177/5004] Time 0.241 ( 0.243) Data 0.026 ( 0.027) Loss 5.9568e+00 (6.5190e+00) Acc@1 1.17 ( 0.92) Acc@5 7.03 ( 3.47) +Epoch: [0][2178/5004] Time 0.241 ( 0.243) Data 0.026 ( 0.027) Loss 
5.9892e+00 (6.5188e+00) Acc@1 1.17 ( 0.92) Acc@5 6.64 ( 3.48) +Epoch: [0][2179/5004] Time 0.241 ( 0.243) Data 0.026 ( 0.027) Loss 5.8595e+00 (6.5185e+00) Acc@1 3.52 ( 0.93) Acc@5 9.38 ( 3.48) +Epoch: [0][2180/5004] Time 0.243 ( 0.243) Data 0.026 ( 0.027) Loss 5.9504e+00 (6.5182e+00) Acc@1 1.17 ( 0.93) Acc@5 9.38 ( 3.48) +Epoch: [0][2181/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.9089e+00 (6.5179e+00) Acc@1 0.78 ( 0.93) Acc@5 5.47 ( 3.48) +Epoch: [0][2182/5004] Time 0.243 ( 0.243) Data 0.022 ( 0.027) Loss 5.8373e+00 (6.5176e+00) Acc@1 3.12 ( 0.93) Acc@5 8.98 ( 3.48) +Epoch: [0][2183/5004] Time 0.245 ( 0.243) Data 0.022 ( 0.027) Loss 5.8971e+00 (6.5173e+00) Acc@1 2.34 ( 0.93) Acc@5 9.77 ( 3.49) +Epoch: [0][2184/5004] Time 0.240 ( 0.243) Data 0.021 ( 0.027) Loss 5.8542e+00 (6.5170e+00) Acc@1 1.56 ( 0.93) Acc@5 10.16 ( 3.49) +Epoch: [0][2185/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.8316e+00 (6.5167e+00) Acc@1 3.12 ( 0.93) Acc@5 10.55 ( 3.49) +Epoch: [0][2186/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.8221e+00 (6.5164e+00) Acc@1 1.17 ( 0.93) Acc@5 8.20 ( 3.50) +Epoch: [0][2187/5004] Time 0.245 ( 0.243) Data 0.023 ( 0.027) Loss 5.8195e+00 (6.5161e+00) Acc@1 1.56 ( 0.93) Acc@5 10.16 ( 3.50) +Epoch: [0][2188/5004] Time 0.243 ( 0.243) Data 0.023 ( 0.027) Loss 6.0695e+00 (6.5159e+00) Acc@1 3.91 ( 0.93) Acc@5 8.98 ( 3.50) +Epoch: [0][2189/5004] Time 0.241 ( 0.243) Data 0.022 ( 0.027) Loss 5.9728e+00 (6.5156e+00) Acc@1 1.95 ( 0.93) Acc@5 7.42 ( 3.50) +Epoch: [0][2190/5004] Time 0.241 ( 0.243) Data 0.023 ( 0.027) Loss 5.8639e+00 (6.5153e+00) Acc@1 3.52 ( 0.93) Acc@5 8.59 ( 3.51) +Epoch: [0][2191/5004] Time 0.242 ( 0.243) Data 0.023 ( 0.027) Loss 5.8431e+00 (6.5150e+00) Acc@1 3.52 ( 0.93) Acc@5 10.55 ( 3.51) +Epoch: [0][2192/5004] Time 0.243 ( 0.243) Data 0.022 ( 0.027) Loss 5.9416e+00 (6.5148e+00) Acc@1 0.39 ( 0.93) Acc@5 6.25 ( 3.51) +Epoch: [0][2193/5004] Time 0.241 ( 0.243) Data 0.023 ( 0.027) Loss 5.8024e+00 (6.5144e+00) Acc@1 3.91 ( 0.93) 
Acc@5 9.77 ( 3.51) +Epoch: [0][2194/5004] Time 0.246 ( 0.243) Data 0.023 ( 0.027) Loss 5.8913e+00 (6.5142e+00) Acc@1 2.73 ( 0.93) Acc@5 10.16 ( 3.52) +Epoch: [0][2195/5004] Time 0.252 ( 0.243) Data 0.022 ( 0.027) Loss 5.8488e+00 (6.5139e+00) Acc@1 2.73 ( 0.94) Acc@5 9.77 ( 3.52) +Epoch: [0][2196/5004] Time 0.247 ( 0.243) Data 0.016 ( 0.027) Loss 5.9110e+00 (6.5136e+00) Acc@1 3.12 ( 0.94) Acc@5 8.98 ( 3.52) +Epoch: [0][2197/5004] Time 0.242 ( 0.243) Data 0.014 ( 0.027) Loss 5.8074e+00 (6.5133e+00) Acc@1 2.34 ( 0.94) Acc@5 7.42 ( 3.52) +Epoch: [0][2198/5004] Time 0.243 ( 0.243) Data 0.020 ( 0.027) Loss 5.7967e+00 (6.5129e+00) Acc@1 1.95 ( 0.94) Acc@5 8.59 ( 3.53) +Epoch: [0][2199/5004] Time 0.245 ( 0.243) Data 0.021 ( 0.027) Loss 5.6602e+00 (6.5125e+00) Acc@1 2.34 ( 0.94) Acc@5 11.72 ( 3.53) +Epoch: [0][2200/5004] Time 0.245 ( 0.243) Data 0.021 ( 0.027) Loss 5.8477e+00 (6.5122e+00) Acc@1 3.12 ( 0.94) Acc@5 11.72 ( 3.53) +Epoch: [0][2201/5004] Time 0.246 ( 0.243) Data 0.020 ( 0.027) Loss 5.8152e+00 (6.5119e+00) Acc@1 4.30 ( 0.94) Acc@5 9.38 ( 3.54) +Epoch: [0][2202/5004] Time 0.247 ( 0.243) Data 0.020 ( 0.027) Loss 5.8749e+00 (6.5116e+00) Acc@1 2.34 ( 0.94) Acc@5 8.98 ( 3.54) +Epoch: [0][2203/5004] Time 0.250 ( 0.243) Data 0.021 ( 0.027) Loss 5.8151e+00 (6.5113e+00) Acc@1 3.12 ( 0.94) Acc@5 12.11 ( 3.54) +Epoch: [0][2204/5004] Time 0.248 ( 0.243) Data 0.018 ( 0.027) Loss 5.8646e+00 (6.5110e+00) Acc@1 3.12 ( 0.94) Acc@5 8.59 ( 3.54) +Epoch: [0][2205/5004] Time 0.244 ( 0.243) Data 0.018 ( 0.027) Loss 5.8609e+00 (6.5107e+00) Acc@1 5.08 ( 0.95) Acc@5 11.72 ( 3.55) +Epoch: [0][2206/5004] Time 0.250 ( 0.243) Data 0.021 ( 0.027) Loss 5.8023e+00 (6.5104e+00) Acc@1 4.69 ( 0.95) Acc@5 9.77 ( 3.55) +Epoch: [0][2207/5004] Time 0.243 ( 0.243) Data 0.019 ( 0.027) Loss 5.7522e+00 (6.5101e+00) Acc@1 2.73 ( 0.95) Acc@5 10.55 ( 3.55) +Epoch: [0][2208/5004] Time 0.245 ( 0.243) Data 0.020 ( 0.027) Loss 5.6834e+00 (6.5097e+00) Acc@1 5.08 ( 0.95) Acc@5 13.67 ( 3.56) +Epoch: [0][2209/5004] 
Time 0.245 ( 0.243) Data 0.022 ( 0.027) Loss 5.8786e+00 (6.5094e+00) Acc@1 2.34 ( 0.95) Acc@5 8.59 ( 3.56) +Epoch: [0][2210/5004] Time 0.250 ( 0.243) Data 0.025 ( 0.027) Loss 5.7914e+00 (6.5091e+00) Acc@1 3.52 ( 0.95) Acc@5 12.50 ( 3.56) +Epoch: [0][2211/5004] Time 0.242 ( 0.243) Data 0.020 ( 0.027) Loss 5.9762e+00 (6.5088e+00) Acc@1 0.39 ( 0.95) Acc@5 8.20 ( 3.57) +Epoch: [0][2212/5004] Time 0.244 ( 0.243) Data 0.021 ( 0.027) Loss 5.8877e+00 (6.5086e+00) Acc@1 1.56 ( 0.95) Acc@5 8.59 ( 3.57) +Epoch: [0][2213/5004] Time 0.248 ( 0.243) Data 0.021 ( 0.027) Loss 5.9374e+00 (6.5083e+00) Acc@1 3.12 ( 0.95) Acc@5 8.59 ( 3.57) +Epoch: [0][2214/5004] Time 0.242 ( 0.243) Data 0.020 ( 0.027) Loss 5.8876e+00 (6.5080e+00) Acc@1 1.95 ( 0.95) Acc@5 8.59 ( 3.57) +Epoch: [0][2215/5004] Time 0.246 ( 0.243) Data 0.022 ( 0.027) Loss 5.8752e+00 (6.5077e+00) Acc@1 2.34 ( 0.95) Acc@5 8.59 ( 3.58) +Epoch: [0][2216/5004] Time 0.244 ( 0.243) Data 0.021 ( 0.027) Loss 5.9519e+00 (6.5075e+00) Acc@1 3.91 ( 0.95) Acc@5 9.38 ( 3.58) +Epoch: [0][2217/5004] Time 0.242 ( 0.243) Data 0.021 ( 0.027) Loss 5.9109e+00 (6.5072e+00) Acc@1 2.34 ( 0.96) Acc@5 7.42 ( 3.58) +Epoch: [0][2218/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.9505e+00 (6.5070e+00) Acc@1 2.73 ( 0.96) Acc@5 8.98 ( 3.58) +Epoch: [0][2219/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.8879e+00 (6.5067e+00) Acc@1 3.52 ( 0.96) Acc@5 7.42 ( 3.58) +Epoch: [0][2220/5004] Time 0.249 ( 0.243) Data 0.024 ( 0.027) Loss 5.8594e+00 (6.5064e+00) Acc@1 2.73 ( 0.96) Acc@5 10.94 ( 3.59) +Epoch: [0][2221/5004] Time 0.243 ( 0.243) Data 0.021 ( 0.027) Loss 5.8694e+00 (6.5061e+00) Acc@1 3.52 ( 0.96) Acc@5 10.94 ( 3.59) +Epoch: [0][2222/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.8475e+00 (6.5058e+00) Acc@1 3.12 ( 0.96) Acc@5 10.55 ( 3.59) +Epoch: [0][2223/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.7873e+00 (6.5055e+00) Acc@1 3.52 ( 0.96) Acc@5 12.89 ( 3.60) +Epoch: [0][2224/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 
5.7836e+00 (6.5052e+00) Acc@1 2.73 ( 0.96) Acc@5 10.55 ( 3.60) +Epoch: [0][2225/5004] Time 0.245 ( 0.243) Data 0.026 ( 0.027) Loss 5.7481e+00 (6.5048e+00) Acc@1 1.17 ( 0.96) Acc@5 7.42 ( 3.60) +Epoch: [0][2226/5004] Time 0.248 ( 0.243) Data 0.025 ( 0.027) Loss 5.6289e+00 (6.5044e+00) Acc@1 2.73 ( 0.96) Acc@5 13.67 ( 3.61) +Epoch: [0][2227/5004] Time 0.242 ( 0.243) Data 0.023 ( 0.027) Loss 5.8971e+00 (6.5042e+00) Acc@1 3.12 ( 0.96) Acc@5 7.42 ( 3.61) +Epoch: [0][2228/5004] Time 0.243 ( 0.243) Data 0.025 ( 0.027) Loss 5.7366e+00 (6.5038e+00) Acc@1 3.12 ( 0.97) Acc@5 12.11 ( 3.61) +Epoch: [0][2229/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.7377e+00 (6.5035e+00) Acc@1 3.91 ( 0.97) Acc@5 10.16 ( 3.62) +Epoch: [0][2230/5004] Time 0.255 ( 0.243) Data 0.024 ( 0.027) Loss 5.5516e+00 (6.5030e+00) Acc@1 3.12 ( 0.97) Acc@5 13.28 ( 3.62) +Epoch: [0][2231/5004] Time 0.250 ( 0.243) Data 0.022 ( 0.027) Loss 5.8798e+00 (6.5028e+00) Acc@1 2.73 ( 0.97) Acc@5 10.55 ( 3.62) +Epoch: [0][2232/5004] Time 0.246 ( 0.243) Data 0.022 ( 0.027) Loss 5.8011e+00 (6.5025e+00) Acc@1 2.73 ( 0.97) Acc@5 8.59 ( 3.63) +Epoch: [0][2233/5004] Time 0.247 ( 0.243) Data 0.024 ( 0.027) Loss 5.9569e+00 (6.5022e+00) Acc@1 2.73 ( 0.97) Acc@5 10.55 ( 3.63) +Epoch: [0][2234/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.7557e+00 (6.5019e+00) Acc@1 4.30 ( 0.97) Acc@5 12.89 ( 3.63) +Epoch: [0][2235/5004] Time 0.247 ( 0.243) Data 0.024 ( 0.027) Loss 5.8123e+00 (6.5016e+00) Acc@1 5.08 ( 0.97) Acc@5 12.89 ( 3.64) +Epoch: [0][2236/5004] Time 0.243 ( 0.243) Data 0.025 ( 0.027) Loss 5.7836e+00 (6.5012e+00) Acc@1 1.56 ( 0.97) Acc@5 11.33 ( 3.64) +Epoch: [0][2237/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.9352e+00 (6.5010e+00) Acc@1 2.34 ( 0.97) Acc@5 9.38 ( 3.64) +Epoch: [0][2238/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.7749e+00 (6.5007e+00) Acc@1 4.69 ( 0.98) Acc@5 9.77 ( 3.65) +Epoch: [0][2239/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.8955e+00 (6.5004e+00) Acc@1 1.95 ( 
0.98) Acc@5 10.16 ( 3.65) +Epoch: [0][2240/5004] Time 0.251 ( 0.243) Data 0.023 ( 0.027) Loss 5.8109e+00 (6.5001e+00) Acc@1 2.34 ( 0.98) Acc@5 10.16 ( 3.65) +Epoch: [0][2241/5004] Time 0.246 ( 0.243) Data 0.020 ( 0.027) Loss 5.7214e+00 (6.4997e+00) Acc@1 6.64 ( 0.98) Acc@5 16.80 ( 3.66) +Epoch: [0][2242/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.7767e+00 (6.4994e+00) Acc@1 1.95 ( 0.98) Acc@5 8.59 ( 3.66) +Epoch: [0][2243/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.8281e+00 (6.4991e+00) Acc@1 1.95 ( 0.98) Acc@5 6.25 ( 3.66) +Epoch: [0][2244/5004] Time 0.248 ( 0.243) Data 0.025 ( 0.027) Loss 5.7322e+00 (6.4988e+00) Acc@1 2.34 ( 0.98) Acc@5 12.50 ( 3.67) +Epoch: [0][2245/5004] Time 0.248 ( 0.243) Data 0.027 ( 0.027) Loss 5.9186e+00 (6.4985e+00) Acc@1 2.73 ( 0.98) Acc@5 7.03 ( 3.67) +Epoch: [0][2246/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.8811e+00 (6.4982e+00) Acc@1 2.73 ( 0.98) Acc@5 10.55 ( 3.67) +Epoch: [0][2247/5004] Time 0.247 ( 0.243) Data 0.024 ( 0.027) Loss 5.7061e+00 (6.4979e+00) Acc@1 3.12 ( 0.98) Acc@5 10.55 ( 3.67) +Epoch: [0][2248/5004] Time 0.243 ( 0.243) Data 0.021 ( 0.027) Loss 5.8876e+00 (6.4976e+00) Acc@1 3.12 ( 0.98) Acc@5 7.42 ( 3.67) +Epoch: [0][2249/5004] Time 0.250 ( 0.243) Data 0.023 ( 0.027) Loss 5.7820e+00 (6.4973e+00) Acc@1 3.91 ( 0.99) Acc@5 10.55 ( 3.68) +Epoch: [0][2250/5004] Time 0.246 ( 0.243) Data 0.021 ( 0.027) Loss 5.8479e+00 (6.4970e+00) Acc@1 2.34 ( 0.99) Acc@5 11.33 ( 3.68) +Epoch: [0][2251/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.8339e+00 (6.4967e+00) Acc@1 4.30 ( 0.99) Acc@5 10.16 ( 3.68) +Epoch: [0][2252/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.8795e+00 (6.4964e+00) Acc@1 4.69 ( 0.99) Acc@5 10.55 ( 3.69) +Epoch: [0][2253/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.7583e+00 (6.4961e+00) Acc@1 2.34 ( 0.99) Acc@5 12.11 ( 3.69) +Epoch: [0][2254/5004] Time 0.247 ( 0.243) Data 0.025 ( 0.027) Loss 5.8102e+00 (6.4958e+00) Acc@1 4.30 ( 0.99) Acc@5 10.94 ( 3.69) +Epoch: 
[0][2255/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.7982e+00 (6.4955e+00) Acc@1 1.56 ( 0.99) Acc@5 8.59 ( 3.70) +Epoch: [0][2256/5004] Time 0.248 ( 0.243) Data 0.023 ( 0.027) Loss 5.7103e+00 (6.4952e+00) Acc@1 2.34 ( 0.99) Acc@5 7.81 ( 3.70) +Epoch: [0][2257/5004] Time 0.246 ( 0.243) Data 0.022 ( 0.027) Loss 5.7080e+00 (6.4948e+00) Acc@1 2.73 ( 0.99) Acc@5 10.55 ( 3.70) +Epoch: [0][2258/5004] Time 0.244 ( 0.243) Data 0.022 ( 0.027) Loss 5.9405e+00 (6.4946e+00) Acc@1 3.52 ( 0.99) Acc@5 10.55 ( 3.70) +Epoch: [0][2259/5004] Time 0.248 ( 0.243) Data 0.024 ( 0.027) Loss 5.7672e+00 (6.4942e+00) Acc@1 5.86 ( 1.00) Acc@5 11.72 ( 3.71) +Epoch: [0][2260/5004] Time 0.245 ( 0.243) Data 0.022 ( 0.027) Loss 5.7154e+00 (6.4939e+00) Acc@1 2.73 ( 1.00) Acc@5 7.81 ( 3.71) +Epoch: [0][2261/5004] Time 0.244 ( 0.243) Data 0.023 ( 0.027) Loss 5.7353e+00 (6.4936e+00) Acc@1 2.73 ( 1.00) Acc@5 12.50 ( 3.71) +Epoch: [0][2262/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.9803e+00 (6.4933e+00) Acc@1 3.91 ( 1.00) Acc@5 8.59 ( 3.72) +Epoch: [0][2263/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.7328e+00 (6.4930e+00) Acc@1 3.52 ( 1.00) Acc@5 10.16 ( 3.72) +Epoch: [0][2264/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.9542e+00 (6.4928e+00) Acc@1 2.34 ( 1.00) Acc@5 6.64 ( 3.72) +Epoch: [0][2265/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.7586e+00 (6.4924e+00) Acc@1 1.95 ( 1.00) Acc@5 10.16 ( 3.72) +Epoch: [0][2266/5004] Time 0.243 ( 0.243) Data 0.022 ( 0.027) Loss 5.8726e+00 (6.4922e+00) Acc@1 3.91 ( 1.00) Acc@5 10.94 ( 3.73) +Epoch: [0][2267/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.8167e+00 (6.4919e+00) Acc@1 1.95 ( 1.00) Acc@5 10.94 ( 3.73) +Epoch: [0][2268/5004] Time 0.242 ( 0.243) Data 0.023 ( 0.027) Loss 5.8700e+00 (6.4916e+00) Acc@1 1.95 ( 1.00) Acc@5 10.94 ( 3.73) +Epoch: [0][2269/5004] Time 0.245 ( 0.243) Data 0.025 ( 0.027) Loss 5.7448e+00 (6.4913e+00) Acc@1 3.91 ( 1.00) Acc@5 12.89 ( 3.74) +Epoch: [0][2270/5004] Time 0.245 ( 0.243) Data 
0.024 ( 0.027) Loss 5.8537e+00 (6.4910e+00) Acc@1 2.73 ( 1.01) Acc@5 7.81 ( 3.74) +Epoch: [0][2271/5004] Time 0.258 ( 0.243) Data 0.024 ( 0.027) Loss 5.8186e+00 (6.4907e+00) Acc@1 2.73 ( 1.01) Acc@5 11.33 ( 3.74) +Epoch: [0][2272/5004] Time 0.244 ( 0.243) Data 0.022 ( 0.027) Loss 5.7740e+00 (6.4904e+00) Acc@1 1.95 ( 1.01) Acc@5 9.38 ( 3.74) +Epoch: [0][2273/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.6877e+00 (6.4900e+00) Acc@1 4.69 ( 1.01) Acc@5 13.67 ( 3.75) +Epoch: [0][2274/5004] Time 0.245 ( 0.243) Data 0.023 ( 0.027) Loss 5.8320e+00 (6.4897e+00) Acc@1 2.73 ( 1.01) Acc@5 11.33 ( 3.75) +Epoch: [0][2275/5004] Time 0.245 ( 0.243) Data 0.023 ( 0.027) Loss 5.9345e+00 (6.4895e+00) Acc@1 3.12 ( 1.01) Acc@5 11.72 ( 3.75) +Epoch: [0][2276/5004] Time 0.246 ( 0.243) Data 0.025 ( 0.027) Loss 5.7928e+00 (6.4892e+00) Acc@1 2.34 ( 1.01) Acc@5 9.38 ( 3.76) +Epoch: [0][2277/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.8356e+00 (6.4889e+00) Acc@1 1.56 ( 1.01) Acc@5 7.42 ( 3.76) +Epoch: [0][2278/5004] Time 0.245 ( 0.243) Data 0.023 ( 0.027) Loss 5.7444e+00 (6.4886e+00) Acc@1 3.91 ( 1.01) Acc@5 11.33 ( 3.76) +Epoch: [0][2279/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.8790e+00 (6.4883e+00) Acc@1 2.73 ( 1.01) Acc@5 10.16 ( 3.76) +Epoch: [0][2280/5004] Time 0.251 ( 0.243) Data 0.024 ( 0.027) Loss 5.9467e+00 (6.4881e+00) Acc@1 2.73 ( 1.01) Acc@5 11.72 ( 3.77) +Epoch: [0][2281/5004] Time 0.247 ( 0.243) Data 0.022 ( 0.027) Loss 5.8602e+00 (6.4878e+00) Acc@1 3.52 ( 1.01) Acc@5 10.55 ( 3.77) +Epoch: [0][2282/5004] Time 0.245 ( 0.243) Data 0.023 ( 0.027) Loss 5.6681e+00 (6.4874e+00) Acc@1 3.91 ( 1.02) Acc@5 12.11 ( 3.77) +Epoch: [0][2283/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.6842e+00 (6.4871e+00) Acc@1 3.12 ( 1.02) Acc@5 11.72 ( 3.78) +Epoch: [0][2284/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.6730e+00 (6.4867e+00) Acc@1 3.12 ( 1.02) Acc@5 12.11 ( 3.78) +Epoch: [0][2285/5004] Time 0.246 ( 0.243) Data 0.025 ( 0.027) Loss 5.8115e+00 
(6.4864e+00) Acc@1 3.91 ( 1.02) Acc@5 10.16 ( 3.78) +Epoch: [0][2286/5004] Time 0.248 ( 0.243) Data 0.024 ( 0.027) Loss 5.6809e+00 (6.4861e+00) Acc@1 5.47 ( 1.02) Acc@5 15.62 ( 3.79) +Epoch: [0][2287/5004] Time 0.245 ( 0.243) Data 0.023 ( 0.027) Loss 5.8397e+00 (6.4858e+00) Acc@1 1.95 ( 1.02) Acc@5 10.94 ( 3.79) +Epoch: [0][2288/5004] Time 0.249 ( 0.243) Data 0.024 ( 0.027) Loss 5.7063e+00 (6.4854e+00) Acc@1 4.30 ( 1.02) Acc@5 11.33 ( 3.80) +Epoch: [0][2289/5004] Time 0.236 ( 0.243) Data 0.020 ( 0.027) Loss 5.7830e+00 (6.4851e+00) Acc@1 3.12 ( 1.02) Acc@5 12.89 ( 3.80) +Epoch: [0][2290/5004] Time 0.243 ( 0.243) Data 0.029 ( 0.027) Loss 5.8422e+00 (6.4849e+00) Acc@1 4.30 ( 1.02) Acc@5 12.11 ( 3.80) +Epoch: [0][2291/5004] Time 0.249 ( 0.243) Data 0.029 ( 0.027) Loss 5.6356e+00 (6.4845e+00) Acc@1 5.08 ( 1.03) Acc@5 13.28 ( 3.81) +Epoch: [0][2292/5004] Time 0.239 ( 0.243) Data 0.024 ( 0.027) Loss 5.7182e+00 (6.4842e+00) Acc@1 6.64 ( 1.03) Acc@5 14.84 ( 3.81) +Epoch: [0][2293/5004] Time 0.243 ( 0.243) Data 0.028 ( 0.027) Loss 5.7925e+00 (6.4839e+00) Acc@1 3.12 ( 1.03) Acc@5 12.50 ( 3.82) +Epoch: [0][2294/5004] Time 0.250 ( 0.243) Data 0.030 ( 0.027) Loss 5.7765e+00 (6.4835e+00) Acc@1 3.91 ( 1.03) Acc@5 10.55 ( 3.82) +Epoch: [0][2295/5004] Time 0.248 ( 0.243) Data 0.027 ( 0.027) Loss 5.8168e+00 (6.4833e+00) Acc@1 2.73 ( 1.03) Acc@5 10.55 ( 3.82) +Epoch: [0][2296/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.8697e+00 (6.4830e+00) Acc@1 1.95 ( 1.03) Acc@5 9.77 ( 3.83) +Epoch: [0][2297/5004] Time 0.247 ( 0.243) Data 0.029 ( 0.027) Loss 5.7969e+00 (6.4827e+00) Acc@1 2.34 ( 1.03) Acc@5 8.20 ( 3.83) +Epoch: [0][2298/5004] Time 0.249 ( 0.243) Data 0.028 ( 0.027) Loss 5.7744e+00 (6.4824e+00) Acc@1 3.12 ( 1.03) Acc@5 12.11 ( 3.83) +Epoch: [0][2299/5004] Time 0.245 ( 0.243) Data 0.028 ( 0.027) Loss 5.7220e+00 (6.4821e+00) Acc@1 3.12 ( 1.03) Acc@5 14.06 ( 3.84) +Epoch: [0][2300/5004] Time 0.250 ( 0.243) Data 0.028 ( 0.027) Loss 5.8251e+00 (6.4818e+00) Acc@1 3.52 ( 1.04) 
Acc@5 13.28 ( 3.84) +Epoch: [0][2301/5004] Time 0.246 ( 0.243) Data 0.025 ( 0.027) Loss 5.7214e+00 (6.4814e+00) Acc@1 3.91 ( 1.04) Acc@5 10.55 ( 3.84) +Epoch: [0][2302/5004] Time 0.244 ( 0.243) Data 0.027 ( 0.027) Loss 5.7628e+00 (6.4811e+00) Acc@1 4.69 ( 1.04) Acc@5 14.06 ( 3.85) +Epoch: [0][2303/5004] Time 0.245 ( 0.243) Data 0.028 ( 0.027) Loss 5.6990e+00 (6.4808e+00) Acc@1 4.69 ( 1.04) Acc@5 12.50 ( 3.85) +Epoch: [0][2304/5004] Time 0.242 ( 0.243) Data 0.028 ( 0.027) Loss 5.9394e+00 (6.4805e+00) Acc@1 3.52 ( 1.04) Acc@5 8.59 ( 3.85) +Epoch: [0][2305/5004] Time 0.244 ( 0.243) Data 0.029 ( 0.027) Loss 5.7466e+00 (6.4802e+00) Acc@1 4.69 ( 1.04) Acc@5 13.28 ( 3.86) +Epoch: [0][2306/5004] Time 0.246 ( 0.243) Data 0.029 ( 0.027) Loss 5.8545e+00 (6.4800e+00) Acc@1 1.95 ( 1.04) Acc@5 9.77 ( 3.86) +Epoch: [0][2307/5004] Time 0.246 ( 0.243) Data 0.029 ( 0.027) Loss 5.7716e+00 (6.4797e+00) Acc@1 3.12 ( 1.04) Acc@5 10.94 ( 3.86) +Epoch: [0][2308/5004] Time 0.245 ( 0.243) Data 0.028 ( 0.027) Loss 5.9270e+00 (6.4794e+00) Acc@1 3.12 ( 1.05) Acc@5 9.38 ( 3.86) +Epoch: [0][2309/5004] Time 0.253 ( 0.243) Data 0.028 ( 0.027) Loss 5.8033e+00 (6.4791e+00) Acc@1 5.08 ( 1.05) Acc@5 14.06 ( 3.87) +Epoch: [0][2310/5004] Time 0.243 ( 0.243) Data 0.020 ( 0.027) Loss 5.7611e+00 (6.4788e+00) Acc@1 2.34 ( 1.05) Acc@5 10.94 ( 3.87) +Epoch: [0][2311/5004] Time 0.249 ( 0.243) Data 0.021 ( 0.027) Loss 5.8547e+00 (6.4785e+00) Acc@1 3.12 ( 1.05) Acc@5 10.94 ( 3.87) +Epoch: [0][2312/5004] Time 0.248 ( 0.243) Data 0.020 ( 0.027) Loss 5.8473e+00 (6.4783e+00) Acc@1 3.52 ( 1.05) Acc@5 9.77 ( 3.88) +Epoch: [0][2313/5004] Time 0.246 ( 0.243) Data 0.020 ( 0.027) Loss 5.8194e+00 (6.4780e+00) Acc@1 3.52 ( 1.05) Acc@5 12.50 ( 3.88) +Epoch: [0][2314/5004] Time 0.244 ( 0.243) Data 0.020 ( 0.027) Loss 5.8866e+00 (6.4777e+00) Acc@1 3.12 ( 1.05) Acc@5 11.33 ( 3.88) +Epoch: [0][2315/5004] Time 0.245 ( 0.243) Data 0.021 ( 0.027) Loss 5.7877e+00 (6.4774e+00) Acc@1 3.91 ( 1.05) Acc@5 12.50 ( 3.89) +Epoch: 
[0][2316/5004] Time 0.247 ( 0.243) Data 0.021 ( 0.027) Loss 5.8022e+00 (6.4771e+00) Acc@1 3.12 ( 1.05) Acc@5 10.55 ( 3.89) +Epoch: [0][2317/5004] Time 0.246 ( 0.243) Data 0.021 ( 0.027) Loss 5.8525e+00 (6.4769e+00) Acc@1 3.12 ( 1.05) Acc@5 10.94 ( 3.89) +Epoch: [0][2318/5004] Time 0.244 ( 0.243) Data 0.020 ( 0.027) Loss 5.9062e+00 (6.4766e+00) Acc@1 2.73 ( 1.06) Acc@5 7.81 ( 3.90) +Epoch: [0][2319/5004] Time 0.246 ( 0.243) Data 0.021 ( 0.027) Loss 5.8226e+00 (6.4763e+00) Acc@1 2.34 ( 1.06) Acc@5 9.77 ( 3.90) +Epoch: [0][2320/5004] Time 0.242 ( 0.243) Data 0.020 ( 0.027) Loss 5.7480e+00 (6.4760e+00) Acc@1 4.69 ( 1.06) Acc@5 12.89 ( 3.90) +Epoch: [0][2321/5004] Time 0.250 ( 0.243) Data 0.021 ( 0.027) Loss 5.7339e+00 (6.4757e+00) Acc@1 3.91 ( 1.06) Acc@5 10.55 ( 3.91) +Epoch: [0][2322/5004] Time 0.248 ( 0.243) Data 0.020 ( 0.027) Loss 5.7679e+00 (6.4754e+00) Acc@1 3.52 ( 1.06) Acc@5 10.94 ( 3.91) +Epoch: [0][2323/5004] Time 0.246 ( 0.243) Data 0.021 ( 0.027) Loss 5.7813e+00 (6.4751e+00) Acc@1 4.30 ( 1.06) Acc@5 12.89 ( 3.91) +Epoch: [0][2324/5004] Time 0.245 ( 0.243) Data 0.021 ( 0.027) Loss 5.6454e+00 (6.4747e+00) Acc@1 4.69 ( 1.06) Acc@5 10.94 ( 3.91) +Epoch: [0][2325/5004] Time 0.245 ( 0.243) Data 0.021 ( 0.027) Loss 5.9081e+00 (6.4745e+00) Acc@1 3.12 ( 1.06) Acc@5 7.42 ( 3.92) +Epoch: [0][2326/5004] Time 0.245 ( 0.243) Data 0.020 ( 0.027) Loss 5.8419e+00 (6.4742e+00) Acc@1 3.12 ( 1.06) Acc@5 10.55 ( 3.92) +Epoch: [0][2327/5004] Time 0.248 ( 0.243) Data 0.020 ( 0.027) Loss 5.8306e+00 (6.4740e+00) Acc@1 3.52 ( 1.07) Acc@5 11.33 ( 3.92) +Epoch: [0][2328/5004] Time 0.244 ( 0.243) Data 0.019 ( 0.027) Loss 5.6953e+00 (6.4736e+00) Acc@1 3.12 ( 1.07) Acc@5 10.16 ( 3.93) +Epoch: [0][2329/5004] Time 0.243 ( 0.243) Data 0.020 ( 0.027) Loss 5.8237e+00 (6.4733e+00) Acc@1 3.12 ( 1.07) Acc@5 11.33 ( 3.93) +Epoch: [0][2330/5004] Time 0.244 ( 0.243) Data 0.022 ( 0.027) Loss 5.7322e+00 (6.4730e+00) Acc@1 3.12 ( 1.07) Acc@5 9.77 ( 3.93) +Epoch: [0][2331/5004] Time 0.244 ( 0.243) 
Data 0.024 ( 0.027) Loss 5.8549e+00 (6.4728e+00) Acc@1 3.12 ( 1.07) Acc@5 10.94 ( 3.93) +Epoch: [0][2332/5004] Time 0.246 ( 0.243) Data 0.025 ( 0.027) Loss 5.8175e+00 (6.4725e+00) Acc@1 3.12 ( 1.07) Acc@5 11.72 ( 3.94) +Epoch: [0][2333/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.7747e+00 (6.4722e+00) Acc@1 3.12 ( 1.07) Acc@5 10.16 ( 3.94) +Epoch: [0][2334/5004] Time 0.246 ( 0.243) Data 0.025 ( 0.027) Loss 5.7112e+00 (6.4719e+00) Acc@1 4.69 ( 1.07) Acc@5 12.50 ( 3.94) +Epoch: [0][2335/5004] Time 0.243 ( 0.243) Data 0.023 ( 0.027) Loss 5.6554e+00 (6.4715e+00) Acc@1 3.52 ( 1.07) Acc@5 11.72 ( 3.95) +Epoch: [0][2336/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.7736e+00 (6.4712e+00) Acc@1 5.08 ( 1.07) Acc@5 10.94 ( 3.95) +Epoch: [0][2337/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.9161e+00 (6.4710e+00) Acc@1 1.56 ( 1.08) Acc@5 8.20 ( 3.95) +Epoch: [0][2338/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.7442e+00 (6.4707e+00) Acc@1 1.95 ( 1.08) Acc@5 12.11 ( 3.96) +Epoch: [0][2339/5004] Time 0.247 ( 0.243) Data 0.024 ( 0.027) Loss 5.8682e+00 (6.4704e+00) Acc@1 2.73 ( 1.08) Acc@5 9.38 ( 3.96) +Epoch: [0][2340/5004] Time 0.246 ( 0.243) Data 0.025 ( 0.027) Loss 5.6368e+00 (6.4700e+00) Acc@1 3.91 ( 1.08) Acc@5 11.33 ( 3.96) +Epoch: [0][2341/5004] Time 0.244 ( 0.243) Data 0.023 ( 0.027) Loss 5.6190e+00 (6.4697e+00) Acc@1 7.03 ( 1.08) Acc@5 15.23 ( 3.97) +Epoch: [0][2342/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.8432e+00 (6.4694e+00) Acc@1 3.91 ( 1.08) Acc@5 8.20 ( 3.97) +Epoch: [0][2343/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.6145e+00 (6.4690e+00) Acc@1 2.34 ( 1.08) Acc@5 12.89 ( 3.97) +Epoch: [0][2344/5004] Time 0.244 ( 0.243) Data 0.025 ( 0.027) Loss 5.6722e+00 (6.4687e+00) Acc@1 3.12 ( 1.08) Acc@5 10.55 ( 3.97) +Epoch: [0][2345/5004] Time 0.250 ( 0.243) Data 0.026 ( 0.027) Loss 5.8362e+00 (6.4684e+00) Acc@1 1.17 ( 1.08) Acc@5 8.59 ( 3.98) +Epoch: [0][2346/5004] Time 0.247 ( 0.243) Data 0.025 ( 0.027) Loss 5.6734e+00 
(6.4681e+00) Acc@1 3.52 ( 1.08) Acc@5 14.45 ( 3.98) +Epoch: [0][2347/5004] Time 0.242 ( 0.243) Data 0.022 ( 0.027) Loss 5.7668e+00 (6.4678e+00) Acc@1 4.69 ( 1.09) Acc@5 13.28 ( 3.98) +Epoch: [0][2348/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.7404e+00 (6.4675e+00) Acc@1 3.12 ( 1.09) Acc@5 13.67 ( 3.99) +Epoch: [0][2349/5004] Time 0.243 ( 0.243) Data 0.023 ( 0.027) Loss 5.8578e+00 (6.4672e+00) Acc@1 2.73 ( 1.09) Acc@5 11.72 ( 3.99) +Epoch: [0][2350/5004] Time 0.249 ( 0.243) Data 0.024 ( 0.027) Loss 5.8260e+00 (6.4670e+00) Acc@1 3.12 ( 1.09) Acc@5 10.55 ( 3.99) +Epoch: [0][2351/5004] Time 0.245 ( 0.243) Data 0.023 ( 0.027) Loss 5.7593e+00 (6.4667e+00) Acc@1 3.52 ( 1.09) Acc@5 11.72 ( 4.00) +Epoch: [0][2352/5004] Time 0.241 ( 0.243) Data 0.021 ( 0.027) Loss 5.6559e+00 (6.4663e+00) Acc@1 3.52 ( 1.09) Acc@5 10.16 ( 4.00) +Epoch: [0][2353/5004] Time 0.249 ( 0.243) Data 0.024 ( 0.027) Loss 5.6654e+00 (6.4660e+00) Acc@1 1.56 ( 1.09) Acc@5 9.38 ( 4.00) +Epoch: [0][2354/5004] Time 0.248 ( 0.243) Data 0.023 ( 0.027) Loss 5.7191e+00 (6.4657e+00) Acc@1 5.47 ( 1.09) Acc@5 13.67 ( 4.01) +Epoch: [0][2355/5004] Time 0.240 ( 0.243) Data 0.020 ( 0.027) Loss 5.5765e+00 (6.4653e+00) Acc@1 3.91 ( 1.09) Acc@5 12.89 ( 4.01) +Epoch: [0][2356/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.8470e+00 (6.4650e+00) Acc@1 1.95 ( 1.09) Acc@5 8.20 ( 4.01) +Epoch: [0][2357/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.6849e+00 (6.4647e+00) Acc@1 3.12 ( 1.09) Acc@5 12.89 ( 4.02) +Epoch: [0][2358/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.7346e+00 (6.4644e+00) Acc@1 3.12 ( 1.10) Acc@5 11.33 ( 4.02) +Epoch: [0][2359/5004] Time 0.248 ( 0.243) Data 0.024 ( 0.027) Loss 5.6279e+00 (6.4640e+00) Acc@1 4.30 ( 1.10) Acc@5 10.94 ( 4.02) +Epoch: [0][2360/5004] Time 0.246 ( 0.243) Data 0.022 ( 0.027) Loss 5.7687e+00 (6.4637e+00) Acc@1 4.30 ( 1.10) Acc@5 12.50 ( 4.03) +Epoch: [0][2361/5004] Time 0.250 ( 0.243) Data 0.023 ( 0.027) Loss 5.8686e+00 (6.4635e+00) Acc@1 4.69 ( 1.10) 
Acc@5 11.33 ( 4.03) +Epoch: [0][2362/5004] Time 0.244 ( 0.243) Data 0.021 ( 0.027) Loss 5.6768e+00 (6.4631e+00) Acc@1 3.91 ( 1.10) Acc@5 10.55 ( 4.03) +Epoch: [0][2363/5004] Time 0.252 ( 0.243) Data 0.024 ( 0.027) Loss 5.8109e+00 (6.4629e+00) Acc@1 1.56 ( 1.10) Acc@5 8.59 ( 4.03) +Epoch: [0][2364/5004] Time 0.248 ( 0.243) Data 0.023 ( 0.027) Loss 5.7149e+00 (6.4625e+00) Acc@1 3.91 ( 1.10) Acc@5 8.59 ( 4.04) +Epoch: [0][2365/5004] Time 0.244 ( 0.243) Data 0.021 ( 0.027) Loss 5.6717e+00 (6.4622e+00) Acc@1 3.52 ( 1.10) Acc@5 8.59 ( 4.04) +Epoch: [0][2366/5004] Time 0.242 ( 0.243) Data 0.023 ( 0.027) Loss 5.8093e+00 (6.4619e+00) Acc@1 3.52 ( 1.10) Acc@5 12.11 ( 4.04) +Epoch: [0][2367/5004] Time 0.252 ( 0.243) Data 0.025 ( 0.027) Loss 5.7417e+00 (6.4616e+00) Acc@1 5.47 ( 1.11) Acc@5 12.50 ( 4.04) +Epoch: [0][2368/5004] Time 0.244 ( 0.243) Data 0.018 ( 0.027) Loss 5.9887e+00 (6.4614e+00) Acc@1 2.73 ( 1.11) Acc@5 8.98 ( 4.05) +Epoch: [0][2369/5004] Time 0.252 ( 0.243) Data 0.023 ( 0.027) Loss 5.7669e+00 (6.4611e+00) Acc@1 3.12 ( 1.11) Acc@5 14.06 ( 4.05) +Epoch: [0][2370/5004] Time 0.243 ( 0.243) Data 0.018 ( 0.027) Loss 5.7300e+00 (6.4608e+00) Acc@1 3.91 ( 1.11) Acc@5 14.06 ( 4.05) +Epoch: [0][2371/5004] Time 0.243 ( 0.243) Data 0.022 ( 0.027) Loss 5.8640e+00 (6.4606e+00) Acc@1 3.52 ( 1.11) Acc@5 10.94 ( 4.06) +Epoch: [0][2372/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.7658e+00 (6.4603e+00) Acc@1 3.12 ( 1.11) Acc@5 10.94 ( 4.06) +Epoch: [0][2373/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.9303e+00 (6.4601e+00) Acc@1 1.56 ( 1.11) Acc@5 7.81 ( 4.06) +Epoch: [0][2374/5004] Time 0.239 ( 0.243) Data 0.023 ( 0.027) Loss 5.9165e+00 (6.4598e+00) Acc@1 4.30 ( 1.11) Acc@5 12.50 ( 4.07) +Epoch: [0][2375/5004] Time 0.245 ( 0.243) Data 0.025 ( 0.027) Loss 5.7229e+00 (6.4595e+00) Acc@1 2.34 ( 1.11) Acc@5 12.11 ( 4.07) +Epoch: [0][2376/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.5696e+00 (6.4592e+00) Acc@1 5.86 ( 1.11) Acc@5 16.80 ( 4.07) +Epoch: 
[0][2377/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.7818e+00 (6.4589e+00) Acc@1 2.73 ( 1.12) Acc@5 8.59 ( 4.08) +Epoch: [0][2378/5004] Time 0.244 ( 0.243) Data 0.025 ( 0.027) Loss 5.5873e+00 (6.4585e+00) Acc@1 3.91 ( 1.12) Acc@5 14.06 ( 4.08) +Epoch: [0][2379/5004] Time 0.241 ( 0.243) Data 0.024 ( 0.027) Loss 5.7112e+00 (6.4582e+00) Acc@1 3.12 ( 1.12) Acc@5 10.55 ( 4.08) +Epoch: [0][2380/5004] Time 0.247 ( 0.243) Data 0.025 ( 0.027) Loss 5.6107e+00 (6.4578e+00) Acc@1 1.95 ( 1.12) Acc@5 10.94 ( 4.09) +Epoch: [0][2381/5004] Time 0.238 ( 0.243) Data 0.020 ( 0.027) Loss 5.9027e+00 (6.4576e+00) Acc@1 3.52 ( 1.12) Acc@5 8.98 ( 4.09) +Epoch: [0][2382/5004] Time 0.240 ( 0.243) Data 0.024 ( 0.027) Loss 5.7982e+00 (6.4573e+00) Acc@1 4.30 ( 1.12) Acc@5 11.72 ( 4.09) +Epoch: [0][2383/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.8966e+00 (6.4571e+00) Acc@1 2.34 ( 1.12) Acc@5 7.42 ( 4.09) +Epoch: [0][2384/5004] Time 0.246 ( 0.243) Data 0.025 ( 0.027) Loss 5.8789e+00 (6.4568e+00) Acc@1 3.52 ( 1.12) Acc@5 12.11 ( 4.10) +Epoch: [0][2385/5004] Time 0.246 ( 0.243) Data 0.022 ( 0.027) Loss 5.6537e+00 (6.4565e+00) Acc@1 5.08 ( 1.12) Acc@5 12.50 ( 4.10) +Epoch: [0][2386/5004] Time 0.242 ( 0.243) Data 0.020 ( 0.027) Loss 5.7866e+00 (6.4562e+00) Acc@1 3.12 ( 1.12) Acc@5 9.77 ( 4.10) +Epoch: [0][2387/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.7875e+00 (6.4559e+00) Acc@1 2.73 ( 1.12) Acc@5 10.94 ( 4.10) +Epoch: [0][2388/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.7833e+00 (6.4557e+00) Acc@1 3.91 ( 1.13) Acc@5 11.33 ( 4.11) +Epoch: [0][2389/5004] Time 0.243 ( 0.243) Data 0.024 ( 0.027) Loss 5.7224e+00 (6.4554e+00) Acc@1 3.52 ( 1.13) Acc@5 12.11 ( 4.11) +Epoch: [0][2390/5004] Time 0.244 ( 0.243) Data 0.023 ( 0.027) Loss 5.7960e+00 (6.4551e+00) Acc@1 2.73 ( 1.13) Acc@5 10.94 ( 4.11) +Epoch: [0][2391/5004] Time 0.240 ( 0.243) Data 0.022 ( 0.027) Loss 5.6809e+00 (6.4548e+00) Acc@1 4.69 ( 1.13) Acc@5 15.62 ( 4.12) +Epoch: [0][2392/5004] Time 0.249 ( 0.243) 
Data 0.024 ( 0.027) Loss 5.6610e+00 (6.4544e+00) Acc@1 4.30 ( 1.13) Acc@5 13.28 ( 4.12) +Epoch: [0][2393/5004] Time 0.242 ( 0.243) Data 0.019 ( 0.027) Loss 5.6848e+00 (6.4541e+00) Acc@1 2.34 ( 1.13) Acc@5 12.11 ( 4.13) +Epoch: [0][2394/5004] Time 0.245 ( 0.243) Data 0.022 ( 0.027) Loss 5.4916e+00 (6.4537e+00) Acc@1 7.81 ( 1.13) Acc@5 15.23 ( 4.13) +Epoch: [0][2395/5004] Time 0.238 ( 0.243) Data 0.020 ( 0.027) Loss 5.8598e+00 (6.4535e+00) Acc@1 0.39 ( 1.13) Acc@5 10.55 ( 4.13) +Epoch: [0][2396/5004] Time 0.243 ( 0.243) Data 0.025 ( 0.027) Loss 5.6660e+00 (6.4531e+00) Acc@1 2.34 ( 1.13) Acc@5 15.23 ( 4.14) +Epoch: [0][2397/5004] Time 0.243 ( 0.243) Data 0.025 ( 0.027) Loss 5.6889e+00 (6.4528e+00) Acc@1 2.73 ( 1.13) Acc@5 10.16 ( 4.14) +Epoch: [0][2398/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.7352e+00 (6.4525e+00) Acc@1 3.91 ( 1.14) Acc@5 10.16 ( 4.14) +Epoch: [0][2399/5004] Time 0.242 ( 0.243) Data 0.026 ( 0.027) Loss 5.6614e+00 (6.4522e+00) Acc@1 2.73 ( 1.14) Acc@5 11.33 ( 4.15) +Epoch: [0][2400/5004] Time 0.243 ( 0.243) Data 0.025 ( 0.027) Loss 5.7704e+00 (6.4519e+00) Acc@1 5.08 ( 1.14) Acc@5 10.94 ( 4.15) +Epoch: [0][2401/5004] Time 0.245 ( 0.243) Data 0.028 ( 0.027) Loss 5.8702e+00 (6.4517e+00) Acc@1 5.08 ( 1.14) Acc@5 13.28 ( 4.15) +Epoch: [0][2402/5004] Time 0.245 ( 0.243) Data 0.025 ( 0.027) Loss 5.7488e+00 (6.4514e+00) Acc@1 1.95 ( 1.14) Acc@5 11.72 ( 4.16) +Epoch: [0][2403/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.8080e+00 (6.4511e+00) Acc@1 3.12 ( 1.14) Acc@5 9.77 ( 4.16) +Epoch: [0][2404/5004] Time 0.238 ( 0.243) Data 0.021 ( 0.027) Loss 5.7534e+00 (6.4508e+00) Acc@1 3.52 ( 1.14) Acc@5 10.55 ( 4.16) +Epoch: [0][2405/5004] Time 0.246 ( 0.243) Data 0.025 ( 0.027) Loss 5.6712e+00 (6.4505e+00) Acc@1 3.12 ( 1.14) Acc@5 12.89 ( 4.16) +Epoch: [0][2406/5004] Time 0.238 ( 0.243) Data 0.022 ( 0.027) Loss 5.6623e+00 (6.4502e+00) Acc@1 4.30 ( 1.14) Acc@5 11.72 ( 4.17) +Epoch: [0][2407/5004] Time 0.244 ( 0.243) Data 0.025 ( 0.027) Loss 5.6661e+00 
(6.4498e+00) Acc@1 3.12 ( 1.14) Acc@5 9.77 ( 4.17) +Epoch: [0][2408/5004] Time 0.240 ( 0.243) Data 0.023 ( 0.027) Loss 5.7583e+00 (6.4495e+00) Acc@1 3.52 ( 1.15) Acc@5 12.11 ( 4.17) +Epoch: [0][2409/5004] Time 0.241 ( 0.243) Data 0.025 ( 0.027) Loss 5.6571e+00 (6.4492e+00) Acc@1 3.12 ( 1.15) Acc@5 11.33 ( 4.18) +Epoch: [0][2410/5004] Time 0.245 ( 0.243) Data 0.025 ( 0.027) Loss 5.5408e+00 (6.4488e+00) Acc@1 4.30 ( 1.15) Acc@5 13.67 ( 4.18) +Epoch: [0][2411/5004] Time 0.243 ( 0.243) Data 0.023 ( 0.027) Loss 5.7324e+00 (6.4485e+00) Acc@1 2.73 ( 1.15) Acc@5 12.89 ( 4.18) +Epoch: [0][2412/5004] Time 0.240 ( 0.243) Data 0.023 ( 0.027) Loss 5.8428e+00 (6.4483e+00) Acc@1 3.12 ( 1.15) Acc@5 11.33 ( 4.19) +Epoch: [0][2413/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.7963e+00 (6.4480e+00) Acc@1 3.12 ( 1.15) Acc@5 11.72 ( 4.19) +Epoch: [0][2414/5004] Time 0.241 ( 0.243) Data 0.025 ( 0.027) Loss 5.7108e+00 (6.4477e+00) Acc@1 2.73 ( 1.15) Acc@5 13.28 ( 4.19) +Epoch: [0][2415/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.8415e+00 (6.4475e+00) Acc@1 2.34 ( 1.15) Acc@5 8.98 ( 4.20) +Epoch: [0][2416/5004] Time 0.243 ( 0.243) Data 0.025 ( 0.027) Loss 5.8575e+00 (6.4472e+00) Acc@1 4.69 ( 1.15) Acc@5 9.77 ( 4.20) +Epoch: [0][2417/5004] Time 0.245 ( 0.243) Data 0.025 ( 0.027) Loss 5.7555e+00 (6.4469e+00) Acc@1 3.52 ( 1.15) Acc@5 12.89 ( 4.20) +Epoch: [0][2418/5004] Time 0.243 ( 0.243) Data 0.021 ( 0.027) Loss 5.8408e+00 (6.4467e+00) Acc@1 3.52 ( 1.15) Acc@5 11.72 ( 4.20) +Epoch: [0][2419/5004] Time 0.244 ( 0.243) Data 0.023 ( 0.027) Loss 5.7737e+00 (6.4464e+00) Acc@1 2.34 ( 1.15) Acc@5 10.16 ( 4.21) +Epoch: [0][2420/5004] Time 0.250 ( 0.243) Data 0.023 ( 0.027) Loss 5.7271e+00 (6.4461e+00) Acc@1 1.95 ( 1.16) Acc@5 12.89 ( 4.21) +Epoch: [0][2421/5004] Time 0.239 ( 0.243) Data 0.019 ( 0.027) Loss 5.7704e+00 (6.4458e+00) Acc@1 3.12 ( 1.16) Acc@5 10.16 ( 4.21) +Epoch: [0][2422/5004] Time 0.239 ( 0.243) Data 0.024 ( 0.027) Loss 5.7522e+00 (6.4455e+00) Acc@1 2.73 ( 1.16) Acc@5 
8.20 ( 4.21) +Epoch: [0][2423/5004] Time 0.238 ( 0.243) Data 0.026 ( 0.027) Loss 5.7147e+00 (6.4452e+00) Acc@1 3.12 ( 1.16) Acc@5 12.89 ( 4.22) +Epoch: [0][2424/5004] Time 0.237 ( 0.243) Data 0.027 ( 0.027) Loss 5.7009e+00 (6.4449e+00) Acc@1 2.73 ( 1.16) Acc@5 13.28 ( 4.22) +Epoch: [0][2425/5004] Time 0.242 ( 0.243) Data 0.028 ( 0.027) Loss 5.8046e+00 (6.4447e+00) Acc@1 3.91 ( 1.16) Acc@5 12.11 ( 4.23) +Epoch: [0][2426/5004] Time 0.235 ( 0.243) Data 0.024 ( 0.027) Loss 5.6721e+00 (6.4443e+00) Acc@1 2.73 ( 1.16) Acc@5 10.94 ( 4.23) +Epoch: [0][2427/5004] Time 0.240 ( 0.243) Data 0.027 ( 0.027) Loss 5.7000e+00 (6.4440e+00) Acc@1 3.52 ( 1.16) Acc@5 13.67 ( 4.23) +Epoch: [0][2428/5004] Time 0.237 ( 0.243) Data 0.025 ( 0.027) Loss 5.7326e+00 (6.4437e+00) Acc@1 2.34 ( 1.16) Acc@5 11.72 ( 4.24) +Epoch: [0][2429/5004] Time 0.238 ( 0.243) Data 0.027 ( 0.027) Loss 5.6179e+00 (6.4434e+00) Acc@1 2.73 ( 1.16) Acc@5 8.20 ( 4.24) +Epoch: [0][2430/5004] Time 0.240 ( 0.243) Data 0.028 ( 0.027) Loss 5.7015e+00 (6.4431e+00) Acc@1 3.12 ( 1.16) Acc@5 13.28 ( 4.24) +Epoch: [0][2431/5004] Time 0.236 ( 0.243) Data 0.027 ( 0.027) Loss 5.6379e+00 (6.4428e+00) Acc@1 4.30 ( 1.16) Acc@5 13.67 ( 4.24) +Epoch: [0][2432/5004] Time 0.240 ( 0.243) Data 0.030 ( 0.027) Loss 5.7351e+00 (6.4425e+00) Acc@1 2.73 ( 1.16) Acc@5 12.89 ( 4.25) +Epoch: [0][2433/5004] Time 0.243 ( 0.243) Data 0.028 ( 0.027) Loss 5.6324e+00 (6.4421e+00) Acc@1 2.34 ( 1.17) Acc@5 13.28 ( 4.25) +Epoch: [0][2434/5004] Time 0.237 ( 0.243) Data 0.027 ( 0.027) Loss 5.6208e+00 (6.4418e+00) Acc@1 3.91 ( 1.17) Acc@5 13.28 ( 4.26) +Epoch: [0][2435/5004] Time 0.243 ( 0.243) Data 0.029 ( 0.027) Loss 5.6391e+00 (6.4415e+00) Acc@1 3.12 ( 1.17) Acc@5 10.55 ( 4.26) +Epoch: [0][2436/5004] Time 0.237 ( 0.243) Data 0.026 ( 0.027) Loss 5.6547e+00 (6.4412e+00) Acc@1 5.47 ( 1.17) Acc@5 15.62 ( 4.26) +Epoch: [0][2437/5004] Time 0.238 ( 0.243) Data 0.027 ( 0.027) Loss 5.7248e+00 (6.4409e+00) Acc@1 2.34 ( 1.17) Acc@5 13.67 ( 4.27) +Epoch: [0][2438/5004] 
Time 0.243 ( 0.243) Data 0.027 ( 0.027) Loss 5.6013e+00 (6.4405e+00) Acc@1 3.12 ( 1.17) Acc@5 11.72 ( 4.27) +Epoch: [0][2439/5004] Time 0.241 ( 0.243) Data 0.024 ( 0.027) Loss 5.8119e+00 (6.4403e+00) Acc@1 2.73 ( 1.17) Acc@5 9.38 ( 4.27) +Epoch: [0][2440/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.7953e+00 (6.4400e+00) Acc@1 4.30 ( 1.17) Acc@5 9.38 ( 4.27) +Epoch: [0][2441/5004] Time 0.239 ( 0.243) Data 0.025 ( 0.027) Loss 5.7688e+00 (6.4397e+00) Acc@1 5.86 ( 1.17) Acc@5 13.67 ( 4.28) +Epoch: [0][2442/5004] Time 0.243 ( 0.243) Data 0.026 ( 0.027) Loss 5.7161e+00 (6.4394e+00) Acc@1 4.69 ( 1.18) Acc@5 12.50 ( 4.28) +Epoch: [0][2443/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.6105e+00 (6.4391e+00) Acc@1 5.08 ( 1.18) Acc@5 14.45 ( 4.29) +Epoch: [0][2444/5004] Time 0.246 ( 0.243) Data 0.025 ( 0.027) Loss 5.7362e+00 (6.4388e+00) Acc@1 3.12 ( 1.18) Acc@5 13.28 ( 4.29) +Epoch: [0][2445/5004] Time 0.245 ( 0.243) Data 0.025 ( 0.027) Loss 5.7480e+00 (6.4385e+00) Acc@1 5.08 ( 1.18) Acc@5 12.50 ( 4.29) +Epoch: [0][2446/5004] Time 0.241 ( 0.243) Data 0.023 ( 0.027) Loss 5.6802e+00 (6.4382e+00) Acc@1 4.69 ( 1.18) Acc@5 12.50 ( 4.30) +Epoch: [0][2447/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.6911e+00 (6.4379e+00) Acc@1 4.69 ( 1.18) Acc@5 13.28 ( 4.30) +Epoch: [0][2448/5004] Time 0.244 ( 0.243) Data 0.025 ( 0.027) Loss 5.6380e+00 (6.4376e+00) Acc@1 3.52 ( 1.18) Acc@5 16.02 ( 4.30) +Epoch: [0][2449/5004] Time 0.241 ( 0.243) Data 0.025 ( 0.027) Loss 5.5802e+00 (6.4372e+00) Acc@1 5.08 ( 1.18) Acc@5 17.19 ( 4.31) +Epoch: [0][2450/5004] Time 0.241 ( 0.243) Data 0.025 ( 0.027) Loss 5.7112e+00 (6.4369e+00) Acc@1 2.73 ( 1.19) Acc@5 10.16 ( 4.31) +Epoch: [0][2451/5004] Time 0.245 ( 0.243) Data 0.027 ( 0.027) Loss 5.5101e+00 (6.4366e+00) Acc@1 3.91 ( 1.19) Acc@5 13.67 ( 4.32) +Epoch: [0][2452/5004] Time 0.248 ( 0.243) Data 0.024 ( 0.027) Loss 5.6625e+00 (6.4362e+00) Acc@1 3.12 ( 1.19) Acc@5 10.55 ( 4.32) +Epoch: [0][2453/5004] Time 0.243 ( 0.243) Data 0.021 ( 
0.027) Loss 5.4723e+00 (6.4358e+00) Acc@1 5.86 ( 1.19) Acc@5 14.84 ( 4.32) +Epoch: [0][2454/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.7478e+00 (6.4356e+00) Acc@1 2.73 ( 1.19) Acc@5 10.55 ( 4.32) +Epoch: [0][2455/5004] Time 0.240 ( 0.243) Data 0.023 ( 0.027) Loss 5.8014e+00 (6.4353e+00) Acc@1 2.34 ( 1.19) Acc@5 9.38 ( 4.33) +Epoch: [0][2456/5004] Time 0.244 ( 0.243) Data 0.024 ( 0.027) Loss 5.8564e+00 (6.4351e+00) Acc@1 1.95 ( 1.19) Acc@5 8.59 ( 4.33) +Epoch: [0][2457/5004] Time 0.238 ( 0.243) Data 0.021 ( 0.027) Loss 5.7715e+00 (6.4348e+00) Acc@1 4.30 ( 1.19) Acc@5 11.72 ( 4.33) +Epoch: [0][2458/5004] Time 0.240 ( 0.243) Data 0.024 ( 0.027) Loss 5.7377e+00 (6.4345e+00) Acc@1 3.91 ( 1.19) Acc@5 13.67 ( 4.34) +Epoch: [0][2459/5004] Time 0.241 ( 0.243) Data 0.025 ( 0.027) Loss 5.8873e+00 (6.4343e+00) Acc@1 3.12 ( 1.19) Acc@5 9.77 ( 4.34) +Epoch: [0][2460/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.7294e+00 (6.4340e+00) Acc@1 3.12 ( 1.19) Acc@5 12.50 ( 4.34) +Epoch: [0][2461/5004] Time 0.239 ( 0.243) Data 0.023 ( 0.027) Loss 5.7086e+00 (6.4337e+00) Acc@1 3.12 ( 1.20) Acc@5 10.55 ( 4.34) +Epoch: [0][2462/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.5912e+00 (6.4334e+00) Acc@1 5.86 ( 1.20) Acc@5 14.45 ( 4.35) +Epoch: [0][2463/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.5802e+00 (6.4330e+00) Acc@1 5.47 ( 1.20) Acc@5 14.45 ( 4.35) +Epoch: [0][2464/5004] Time 0.245 ( 0.243) Data 0.025 ( 0.027) Loss 5.6710e+00 (6.4327e+00) Acc@1 2.73 ( 1.20) Acc@5 15.23 ( 4.36) +Epoch: [0][2465/5004] Time 0.240 ( 0.243) Data 0.024 ( 0.027) Loss 5.7036e+00 (6.4324e+00) Acc@1 2.73 ( 1.20) Acc@5 10.55 ( 4.36) +Epoch: [0][2466/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.5364e+00 (6.4321e+00) Acc@1 6.25 ( 1.20) Acc@5 16.41 ( 4.36) +Epoch: [0][2467/5004] Time 0.245 ( 0.243) Data 0.023 ( 0.027) Loss 5.7077e+00 (6.4318e+00) Acc@1 5.08 ( 1.20) Acc@5 14.45 ( 4.37) +Epoch: [0][2468/5004] Time 0.239 ( 0.243) Data 0.023 ( 0.027) Loss 5.7537e+00 (6.4315e+00) 
Acc@1 3.52 ( 1.20) Acc@5 10.94 ( 4.37) +Epoch: [0][2469/5004] Time 0.240 ( 0.243) Data 0.024 ( 0.027) Loss 5.6396e+00 (6.4312e+00) Acc@1 3.52 ( 1.21) Acc@5 12.50 ( 4.37) +Epoch: [0][2470/5004] Time 0.241 ( 0.243) Data 0.025 ( 0.027) Loss 5.7370e+00 (6.4309e+00) Acc@1 3.91 ( 1.21) Acc@5 12.89 ( 4.38) +Epoch: [0][2471/5004] Time 0.247 ( 0.243) Data 0.026 ( 0.027) Loss 5.6188e+00 (6.4306e+00) Acc@1 3.91 ( 1.21) Acc@5 14.45 ( 4.38) +Epoch: [0][2472/5004] Time 0.235 ( 0.243) Data 0.019 ( 0.027) Loss 5.7263e+00 (6.4303e+00) Acc@1 3.52 ( 1.21) Acc@5 12.50 ( 4.38) +Epoch: [0][2473/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.6227e+00 (6.4299e+00) Acc@1 3.12 ( 1.21) Acc@5 13.28 ( 4.39) +Epoch: [0][2474/5004] Time 0.244 ( 0.243) Data 0.025 ( 0.027) Loss 5.5957e+00 (6.4296e+00) Acc@1 3.52 ( 1.21) Acc@5 11.33 ( 4.39) +Epoch: [0][2475/5004] Time 0.245 ( 0.243) Data 0.022 ( 0.027) Loss 5.6438e+00 (6.4293e+00) Acc@1 2.34 ( 1.21) Acc@5 10.94 ( 4.39) +Epoch: [0][2476/5004] Time 0.244 ( 0.243) Data 0.021 ( 0.027) Loss 5.5680e+00 (6.4289e+00) Acc@1 3.91 ( 1.21) Acc@5 14.84 ( 4.40) +Epoch: [0][2477/5004] Time 0.241 ( 0.243) Data 0.018 ( 0.027) Loss 5.7263e+00 (6.4287e+00) Acc@1 4.30 ( 1.21) Acc@5 12.89 ( 4.40) +Epoch: [0][2478/5004] Time 0.246 ( 0.243) Data 0.024 ( 0.027) Loss 5.7532e+00 (6.4284e+00) Acc@1 2.73 ( 1.21) Acc@5 10.94 ( 4.40) +Epoch: [0][2479/5004] Time 0.244 ( 0.243) Data 0.020 ( 0.027) Loss 5.5513e+00 (6.4280e+00) Acc@1 7.81 ( 1.22) Acc@5 16.02 ( 4.41) +Epoch: [0][2480/5004] Time 0.241 ( 0.243) Data 0.021 ( 0.027) Loss 5.7482e+00 (6.4278e+00) Acc@1 1.95 ( 1.22) Acc@5 10.94 ( 4.41) +Epoch: [0][2481/5004] Time 0.240 ( 0.243) Data 0.024 ( 0.027) Loss 5.7415e+00 (6.4275e+00) Acc@1 2.73 ( 1.22) Acc@5 8.98 ( 4.41) +Epoch: [0][2482/5004] Time 0.241 ( 0.243) Data 0.024 ( 0.027) Loss 5.5947e+00 (6.4271e+00) Acc@1 5.08 ( 1.22) Acc@5 17.19 ( 4.42) +Epoch: [0][2483/5004] Time 0.240 ( 0.243) Data 0.024 ( 0.027) Loss 5.7218e+00 (6.4269e+00) Acc@1 1.95 ( 1.22) Acc@5 12.89 ( 
4.42)
+Epoch: [0][2484/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.7164e+00 (6.4266e+00) Acc@1 3.12 ( 1.22) Acc@5 13.28 ( 4.42)
+Epoch: [0][2485/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.5793e+00 (6.4262e+00) Acc@1 3.91 ( 1.22) Acc@5 15.23 ( 4.43)
+Epoch: [0][2486/5004] Time 0.246 ( 0.243) Data 0.026 ( 0.027) Loss 5.6363e+00 (6.4259e+00) Acc@1 2.73 ( 1.22) Acc@5 12.89 ( 4.43)
+Epoch: [0][2487/5004] Time 0.241 ( 0.243) Data 0.022 ( 0.027) Loss 5.7699e+00 (6.4257e+00) Acc@1 2.73 ( 1.22) Acc@5 9.77 ( 4.43)
+Epoch: [0][2488/5004] Time 0.238 ( 0.243) Data 0.023 ( 0.027) Loss 5.7461e+00 (6.4254e+00) Acc@1 3.12 ( 1.22) Acc@5 12.89 ( 4.44)
+Epoch: [0][2489/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.8458e+00 (6.4252e+00) Acc@1 6.25 ( 1.23) Acc@5 14.06 ( 4.44)
+Epoch: [0][2490/5004] Time 0.241 ( 0.243) Data 0.024 ( 0.027) Loss 5.6628e+00 (6.4248e+00) Acc@1 4.30 ( 1.23) Acc@5 13.67 ( 4.45)
+Epoch: [0][2491/5004] Time 0.240 ( 0.243) Data 0.024 ( 0.027) Loss 5.4944e+00 (6.4245e+00) Acc@1 5.86 ( 1.23) Acc@5 15.23 ( 4.45)
+Epoch: [0][2492/5004] Time 0.241 ( 0.243) Data 0.025 ( 0.027) Loss 5.6208e+00 (6.4242e+00) Acc@1 5.47 ( 1.23) Acc@5 10.16 ( 4.45)
+Epoch: [0][2493/5004] Time 0.245 ( 0.243) Data 0.025 ( 0.027) Loss 5.7045e+00 (6.4239e+00) Acc@1 3.91 ( 1.23) Acc@5 9.77 ( 4.45)
+Epoch: [0][2494/5004] Time 0.239 ( 0.243) Data 0.022 ( 0.027) Loss 5.7516e+00 (6.4236e+00) Acc@1 3.52 ( 1.23) Acc@5 11.72 ( 4.46)
+Epoch: [0][2495/5004] Time 0.243 ( 0.243) Data 0.025 ( 0.027) Loss 5.5241e+00 (6.4232e+00) Acc@1 3.91 ( 1.23) Acc@5 15.23 ( 4.46)
+Epoch: [0][2496/5004] Time 0.240 ( 0.243) Data 0.024 ( 0.027) Loss 5.7907e+00 (6.4230e+00) Acc@1 5.08 ( 1.23) Acc@5 11.33 ( 4.46)
+Epoch: [0][2497/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.6990e+00 (6.4227e+00) Acc@1 4.69 ( 1.24) Acc@5 13.28 ( 4.47)
+Epoch: [0][2498/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.4271e+00 (6.4223e+00) Acc@1 4.69 ( 1.24) Acc@5 14.45 ( 4.47)
+Epoch: [0][2499/5004] Time 0.244 ( 0.243) Data 0.025 ( 0.027) Loss 5.6516e+00 (6.4220e+00) Acc@1 2.73 ( 1.24) Acc@5 12.89 ( 4.48)
+Epoch: [0][2500/5004] Time 0.239 ( 0.243) Data 0.022 ( 0.027) Loss 5.5812e+00 (6.4216e+00) Acc@1 4.30 ( 1.24) Acc@5 9.77 ( 4.48)
+Epoch: [0][2501/5004] Time 0.245 ( 0.243) Data 0.025 ( 0.027) Loss 5.6165e+00 (6.4213e+00) Acc@1 5.86 ( 1.24) Acc@5 11.72 ( 4.48)
+Epoch: [0][2502/5004] Time 0.242 ( 0.243) Data 0.024 ( 0.027) Loss 5.8023e+00 (6.4211e+00) Acc@1 3.91 ( 1.24) Acc@5 11.72 ( 4.48)
+Epoch: [0][2503/5004] Time 0.244 ( 0.243) Data 0.025 ( 0.027) Loss 5.6450e+00 (6.4208e+00) Acc@1 3.91 ( 1.24) Acc@5 13.28 ( 4.49)
+Epoch: [0][2504/5004] Time 0.239 ( 0.243) Data 0.024 ( 0.027) Loss 5.5793e+00 (6.4204e+00) Acc@1 5.08 ( 1.24) Acc@5 11.33 ( 4.49)
+Epoch: [0][2505/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.5731e+00 (6.4201e+00) Acc@1 2.73 ( 1.25) Acc@5 11.33 ( 4.49)
+Epoch: [0][2506/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.6767e+00 (6.4198e+00) Acc@1 4.30 ( 1.25) Acc@5 15.23 ( 4.50)
+Epoch: [0][2507/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.6407e+00 (6.4195e+00) Acc@1 3.52 ( 1.25) Acc@5 12.50 ( 4.50)
+Epoch: [0][2508/5004] Time 0.240 ( 0.243) Data 0.024 ( 0.027) Loss 5.5821e+00 (6.4192e+00) Acc@1 3.12 ( 1.25) Acc@5 11.33 ( 4.50)
+Epoch: [0][2509/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.9506e+00 (6.4190e+00) Acc@1 1.56 ( 1.25) Acc@5 8.59 ( 4.50)
+Epoch: [0][2510/5004] Time 0.241 ( 0.243) Data 0.025 ( 0.027) Loss 5.5767e+00 (6.4186e+00) Acc@1 2.73 ( 1.25) Acc@5 12.50 ( 4.51)
+Epoch: [0][2511/5004] Time 0.241 ( 0.243) Data 0.025 ( 0.027) Loss 5.4659e+00 (6.4183e+00) Acc@1 1.95 ( 1.25) Acc@5 12.89 ( 4.51)
+Epoch: [0][2512/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.6190e+00 (6.4179e+00) Acc@1 5.08 ( 1.25) Acc@5 14.84 ( 4.51)
+Epoch: [0][2513/5004] Time 0.241 ( 0.243) Data 0.025 ( 0.027) Loss 5.6685e+00 (6.4176e+00) Acc@1 6.64 ( 1.25) Acc@5 13.67 ( 4.52)
+Epoch: [0][2514/5004] Time 0.249 ( 0.243) Data 0.025 ( 0.027) Loss 5.6642e+00 (6.4173e+00) Acc@1 5.08 ( 1.25) Acc@5 14.84 ( 4.52)
+Epoch: [0][2515/5004] Time 0.242 ( 0.243) Data 0.020 ( 0.027) Loss 5.6944e+00 (6.4170e+00) Acc@1 4.69 ( 1.26) Acc@5 12.11 ( 4.53)
+Epoch: [0][2516/5004] Time 0.245 ( 0.243) Data 0.020 ( 0.027) Loss 5.5965e+00 (6.4167e+00) Acc@1 4.30 ( 1.26) Acc@5 14.06 ( 4.53)
+Epoch: [0][2517/5004] Time 0.253 ( 0.243) Data 0.022 ( 0.027) Loss 5.6002e+00 (6.4164e+00) Acc@1 3.91 ( 1.26) Acc@5 13.28 ( 4.53)
+Epoch: [0][2518/5004] Time 0.246 ( 0.243) Data 0.021 ( 0.027) Loss 5.6646e+00 (6.4161e+00) Acc@1 5.08 ( 1.26) Acc@5 14.06 ( 4.54)
+Epoch: [0][2519/5004] Time 0.252 ( 0.243) Data 0.022 ( 0.027) Loss 5.7039e+00 (6.4158e+00) Acc@1 4.69 ( 1.26) Acc@5 12.11 ( 4.54)
+Epoch: [0][2520/5004] Time 0.258 ( 0.243) Data 0.020 ( 0.027) Loss 5.7781e+00 (6.4156e+00) Acc@1 3.12 ( 1.26) Acc@5 11.72 ( 4.54)
+Epoch: [0][2521/5004] Time 0.252 ( 0.243) Data 0.019 ( 0.027) Loss 5.5814e+00 (6.4152e+00) Acc@1 2.73 ( 1.26) Acc@5 14.84 ( 4.55)
+Epoch: [0][2522/5004] Time 0.246 ( 0.243) Data 0.019 ( 0.027) Loss 5.6087e+00 (6.4149e+00) Acc@1 4.69 ( 1.26) Acc@5 14.45 ( 4.55)
+Epoch: [0][2523/5004] Time 0.242 ( 0.243) Data 0.022 ( 0.027) Loss 5.6295e+00 (6.4146e+00) Acc@1 3.12 ( 1.26) Acc@5 10.55 ( 4.55)
+Epoch: [0][2524/5004] Time 0.255 ( 0.243) Data 0.024 ( 0.027) Loss 5.7009e+00 (6.4143e+00) Acc@1 4.69 ( 1.27) Acc@5 11.33 ( 4.56)
+Epoch: [0][2525/5004] Time 0.231 ( 0.243) Data 0.018 ( 0.027) Loss 5.6387e+00 (6.4140e+00) Acc@1 3.52 ( 1.27) Acc@5 14.45 ( 4.56)
+Epoch: [0][2526/5004] Time 0.253 ( 0.243) Data 0.030 ( 0.027) Loss 5.4810e+00 (6.4136e+00) Acc@1 2.73 ( 1.27) Acc@5 17.97 ( 4.56)
+Epoch: [0][2527/5004] Time 0.239 ( 0.243) Data 0.026 ( 0.027) Loss 5.5667e+00 (6.4133e+00) Acc@1 4.69 ( 1.27) Acc@5 12.11 ( 4.57)
+Epoch: [0][2528/5004] Time 0.249 ( 0.243) Data 0.030 ( 0.027) Loss 5.5793e+00 (6.4130e+00) Acc@1 5.08 ( 1.27) Acc@5 14.84 ( 4.57)
+Epoch: [0][2529/5004] Time 0.237 ( 0.243) Data 0.024 ( 0.027) Loss 5.6519e+00 (6.4127e+00) Acc@1 3.91 ( 1.27) Acc@5 13.67 ( 4.58)
+Epoch: [0][2530/5004] Time 0.245 ( 0.243) Data 0.030 ( 0.027) Loss 5.6839e+00 (6.4124e+00) Acc@1 3.91 ( 1.27) Acc@5 10.16 ( 4.58)
+Epoch: [0][2531/5004] Time 0.243 ( 0.243) Data 0.029 ( 0.027) Loss 5.5718e+00 (6.4121e+00) Acc@1 5.86 ( 1.27) Acc@5 12.11 ( 4.58)
+Epoch: [0][2532/5004] Time 0.244 ( 0.243) Data 0.029 ( 0.027) Loss 5.5450e+00 (6.4117e+00) Acc@1 4.69 ( 1.28) Acc@5 16.02 ( 4.58)
+Epoch: [0][2533/5004] Time 0.246 ( 0.243) Data 0.029 ( 0.027) Loss 5.5872e+00 (6.4114e+00) Acc@1 3.12 ( 1.28) Acc@5 14.06 ( 4.59)
+Epoch: [0][2534/5004] Time 0.244 ( 0.243) Data 0.029 ( 0.027) Loss 5.6402e+00 (6.4111e+00) Acc@1 6.25 ( 1.28) Acc@5 14.06 ( 4.59)
+Epoch: [0][2535/5004] Time 0.243 ( 0.243) Data 0.029 ( 0.027) Loss 5.5815e+00 (6.4108e+00) Acc@1 3.12 ( 1.28) Acc@5 10.94 ( 4.59)
+Epoch: [0][2536/5004] Time 0.247 ( 0.243) Data 0.029 ( 0.027) Loss 5.4138e+00 (6.4104e+00) Acc@1 5.47 ( 1.28) Acc@5 17.97 ( 4.60)
+Epoch: [0][2537/5004] Time 0.240 ( 0.243) Data 0.027 ( 0.027) Loss 5.5982e+00 (6.4100e+00) Acc@1 3.12 ( 1.28) Acc@5 13.67 ( 4.60)
+Epoch: [0][2538/5004] Time 0.243 ( 0.243) Data 0.029 ( 0.027) Loss 5.6555e+00 (6.4097e+00) Acc@1 3.91 ( 1.28) Acc@5 14.45 ( 4.61)
+Epoch: [0][2539/5004] Time 0.245 ( 0.243) Data 0.029 ( 0.027) Loss 5.4861e+00 (6.4094e+00) Acc@1 5.47 ( 1.28) Acc@5 16.02 ( 4.61)
+Epoch: [0][2540/5004] Time 0.244 ( 0.243) Data 0.028 ( 0.027) Loss 5.6619e+00 (6.4091e+00) Acc@1 5.08 ( 1.29) Acc@5 13.67 ( 4.62)
+Epoch: [0][2541/5004] Time 0.243 ( 0.243) Data 0.028 ( 0.027) Loss 5.6179e+00 (6.4088e+00) Acc@1 5.47 ( 1.29) Acc@5 14.45 ( 4.62)
+Epoch: [0][2542/5004] Time 0.251 ( 0.243) Data 0.029 ( 0.027) Loss 5.4754e+00 (6.4084e+00) Acc@1 4.30 ( 1.29) Acc@5 14.45 ( 4.62)
+Epoch: [0][2543/5004] Time 0.244 ( 0.243) Data 0.027 ( 0.027) Loss 5.6467e+00 (6.4081e+00) Acc@1 4.30 ( 1.29) Acc@5 12.11 ( 4.63)
+Epoch: [0][2544/5004] Time 0.248 ( 0.243) Data 0.029 ( 0.027) Loss 5.7469e+00 (6.4079e+00) Acc@1 5.86 ( 1.29) Acc@5 12.89 ( 4.63)
+Epoch: [0][2545/5004] Time 0.248 ( 0.243) Data 0.028 ( 0.027) Loss 5.6236e+00 (6.4075e+00) Acc@1 5.08 ( 1.29) Acc@5 14.06 ( 4.63)
+Epoch: [0][2546/5004] Time 0.248 ( 0.243) Data 0.029 ( 0.027) Loss 5.6221e+00 (6.4072e+00) Acc@1 4.69 ( 1.29) Acc@5 15.23 ( 4.64)
+Epoch: [0][2547/5004] Time 0.243 ( 0.243) Data 0.028 ( 0.027) Loss 5.4767e+00 (6.4069e+00) Acc@1 4.30 ( 1.30) Acc@5 14.06 ( 4.64)
+Epoch: [0][2548/5004] Time 0.247 ( 0.243) Data 0.029 ( 0.027) Loss 5.4763e+00 (6.4065e+00) Acc@1 7.42 ( 1.30) Acc@5 17.58 ( 4.65)
+Epoch: [0][2549/5004] Time 0.247 ( 0.243) Data 0.028 ( 0.027) Loss 5.5149e+00 (6.4062e+00) Acc@1 3.12 ( 1.30) Acc@5 11.72 ( 4.65)
+Epoch: [0][2550/5004] Time 0.250 ( 0.243) Data 0.031 ( 0.027) Loss 5.6314e+00 (6.4059e+00) Acc@1 5.47 ( 1.30) Acc@5 13.67 ( 4.65)
+Epoch: [0][2551/5004] Time 0.244 ( 0.243) Data 0.028 ( 0.027) Loss 5.6109e+00 (6.4055e+00) Acc@1 4.30 ( 1.30) Acc@5 12.11 ( 4.66)
+Epoch: [0][2552/5004] Time 0.244 ( 0.243) Data 0.029 ( 0.027) Loss 5.7464e+00 (6.4053e+00) Acc@1 3.52 ( 1.30) Acc@5 11.33 ( 4.66)
+Epoch: [0][2553/5004] Time 0.246 ( 0.243) Data 0.029 ( 0.027) Loss 5.5769e+00 (6.4050e+00) Acc@1 4.30 ( 1.30) Acc@5 15.23 ( 4.66)
+Epoch: [0][2554/5004] Time 0.243 ( 0.243) Data 0.029 ( 0.027) Loss 5.6348e+00 (6.4047e+00) Acc@1 3.91 ( 1.30) Acc@5 13.67 ( 4.67)
+Epoch: [0][2555/5004] Time 0.243 ( 0.243) Data 0.030 ( 0.027) Loss 5.6738e+00 (6.4044e+00) Acc@1 1.56 ( 1.30) Acc@5 12.11 ( 4.67)
+Epoch: [0][2556/5004] Time 0.245 ( 0.243) Data 0.030 ( 0.027) Loss 5.6209e+00 (6.4041e+00) Acc@1 5.08 ( 1.31) Acc@5 14.84 ( 4.67)
+Epoch: [0][2557/5004] Time 0.243 ( 0.243) Data 0.030 ( 0.027) Loss 5.6809e+00 (6.4038e+00) Acc@1 2.73 ( 1.31) Acc@5 13.67 ( 4.68)
+Epoch: [0][2558/5004] Time 0.247 ( 0.243) Data 0.032 ( 0.027) Loss 5.5252e+00 (6.4034e+00) Acc@1 4.30 ( 1.31) Acc@5 14.84 ( 4.68)
+Epoch: [0][2559/5004] Time 0.247 ( 0.243) Data 0.030 ( 0.027) Loss 5.7629e+00 (6.4032e+00) Acc@1 5.08 ( 1.31) Acc@5 12.11 ( 4.68)
+Epoch: [0][2560/5004] Time 0.251 ( 0.243) Data 0.029 ( 0.027) Loss 5.6656e+00 (6.4029e+00) Acc@1 5.86 ( 1.31) Acc@5 12.89 ( 4.69)
+Epoch: [0][2561/5004] Time 0.250 ( 0.243) Data 0.027 ( 0.027) Loss 5.6626e+00 (6.4026e+00) Acc@1 3.52 ( 1.31) Acc@5 13.28 ( 4.69)
+Epoch: [0][2562/5004] Time 0.246 ( 0.243) Data 0.027 ( 0.027) Loss 5.6098e+00 (6.4023e+00) Acc@1 6.25 ( 1.31) Acc@5 14.06 ( 4.69)
+Epoch: [0][2563/5004] Time 0.244 ( 0.243) Data 0.028 ( 0.027) Loss 5.6514e+00 (6.4020e+00) Acc@1 4.30 ( 1.31) Acc@5 12.50 ( 4.70)
+Epoch: [0][2564/5004] Time 0.244 ( 0.243) Data 0.029 ( 0.027) Loss 5.6496e+00 (6.4017e+00) Acc@1 4.30 ( 1.32) Acc@5 12.11 ( 4.70)
+Epoch: [0][2565/5004] Time 0.244 ( 0.243) Data 0.029 ( 0.027) Loss 5.5320e+00 (6.4014e+00) Acc@1 3.52 ( 1.32) Acc@5 14.06 ( 4.70)
+Epoch: [0][2566/5004] Time 0.246 ( 0.243) Data 0.029 ( 0.027) Loss 5.7541e+00 (6.4011e+00) Acc@1 4.69 ( 1.32) Acc@5 13.28 ( 4.71)
+Epoch: [0][2567/5004] Time 0.246 ( 0.243) Data 0.029 ( 0.027) Loss 5.6402e+00 (6.4008e+00) Acc@1 3.12 ( 1.32) Acc@5 10.94 ( 4.71)
+Epoch: [0][2568/5004] Time 0.243 ( 0.243) Data 0.028 ( 0.027) Loss 5.4814e+00 (6.4005e+00) Acc@1 3.12 ( 1.32) Acc@5 13.67 ( 4.71)
+Epoch: [0][2569/5004] Time 0.246 ( 0.243) Data 0.030 ( 0.027) Loss 5.5759e+00 (6.4001e+00) Acc@1 4.30 ( 1.32) Acc@5 14.45 ( 4.72)
+Epoch: [0][2570/5004] Time 0.247 ( 0.243) Data 0.027 ( 0.027) Loss 5.5292e+00 (6.3998e+00) Acc@1 3.12 ( 1.32) Acc@5 10.55 ( 4.72)
+Epoch: [0][2571/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.7214e+00 (6.3995e+00) Acc@1 3.91 ( 1.32) Acc@5 9.77 ( 4.72)
+Epoch: [0][2572/5004] Time 0.242 ( 0.243) Data 0.029 ( 0.027) Loss 5.6209e+00 (6.3992e+00) Acc@1 3.91 ( 1.32) Acc@5 13.28 ( 4.72)
+Epoch: [0][2573/5004] Time 0.244 ( 0.243) Data 0.029 ( 0.027) Loss 5.6770e+00 (6.3990e+00) Acc@1 4.69 ( 1.32) Acc@5 13.67 ( 4.73)
+Epoch: [0][2574/5004] Time 0.245 ( 0.243) Data 0.029 ( 0.027) Loss 5.8087e+00 (6.3987e+00) Acc@1 3.12 ( 1.33) Acc@5 11.72 ( 4.73)
+Epoch: [0][2575/5004] Time 0.243 ( 0.243) Data 0.030 ( 0.027) Loss 5.7043e+00 (6.3985e+00) Acc@1 5.47 ( 1.33) Acc@5 13.67 ( 4.73)
+Epoch: [0][2576/5004] Time 0.243 ( 0.243) Data 0.030 ( 0.027) Loss 5.6896e+00 (6.3982e+00) Acc@1 1.95 ( 1.33) Acc@5 10.16 ( 4.74)
+Epoch: [0][2577/5004] Time 0.249 ( 0.243) Data 0.030 ( 0.027) Loss 5.8109e+00 (6.3980e+00) Acc@1 3.52 ( 1.33) Acc@5 9.77 ( 4.74)
+Epoch: [0][2578/5004] Time 0.240 ( 0.243) Data 0.026 ( 0.027) Loss 5.7212e+00 (6.3977e+00) Acc@1 3.52 ( 1.33) Acc@5 13.28 ( 4.74)
+Epoch: [0][2579/5004] Time 0.246 ( 0.243) Data 0.030 ( 0.027) Loss 5.5977e+00 (6.3974e+00) Acc@1 3.12 ( 1.33) Acc@5 14.84 ( 4.74)
+Epoch: [0][2580/5004] Time 0.245 ( 0.243) Data 0.028 ( 0.027) Loss 5.5067e+00 (6.3970e+00) Acc@1 4.30 ( 1.33) Acc@5 13.67 ( 4.75)
+Epoch: [0][2581/5004] Time 0.247 ( 0.243) Data 0.028 ( 0.027) Loss 5.5789e+00 (6.3967e+00) Acc@1 3.91 ( 1.33) Acc@5 12.11 ( 4.75)
+Epoch: [0][2582/5004] Time 0.243 ( 0.243) Data 0.027 ( 0.027) Loss 5.4021e+00 (6.3963e+00) Acc@1 5.08 ( 1.33) Acc@5 15.23 ( 4.75)
+Epoch: [0][2583/5004] Time 0.241 ( 0.243) Data 0.028 ( 0.027) Loss 5.4917e+00 (6.3960e+00) Acc@1 4.30 ( 1.33) Acc@5 16.02 ( 4.76)
+Epoch: [0][2584/5004] Time 0.251 ( 0.243) Data 0.030 ( 0.027) Loss 5.5326e+00 (6.3957e+00) Acc@1 5.08 ( 1.34) Acc@5 16.02 ( 4.76)
+Epoch: [0][2585/5004] Time 0.249 ( 0.243) Data 0.027 ( 0.027) Loss 5.6026e+00 (6.3954e+00) Acc@1 3.12 ( 1.34) Acc@5 12.89 ( 4.77)
+Epoch: [0][2586/5004] Time 0.246 ( 0.243) Data 0.028 ( 0.027) Loss 5.5076e+00 (6.3950e+00) Acc@1 5.08 ( 1.34) Acc@5 12.11 ( 4.77)
+Epoch: [0][2587/5004] Time 0.244 ( 0.243) Data 0.027 ( 0.027) Loss 5.3627e+00 (6.3946e+00) Acc@1 3.12 ( 1.34) Acc@5 16.02 ( 4.77)
+Epoch: [0][2588/5004] Time 0.250 ( 0.243) Data 0.029 ( 0.027) Loss 5.4157e+00 (6.3942e+00) Acc@1 7.03 ( 1.34) Acc@5 17.58 ( 4.78)
+Epoch: [0][2589/5004] Time 0.245 ( 0.243) Data 0.027 ( 0.027) Loss 5.5322e+00 (6.3939e+00) Acc@1 3.91 ( 1.34) Acc@5 16.41 ( 4.78)
+Epoch: [0][2590/5004] Time 0.242 ( 0.243) Data 0.028 ( 0.027) Loss 5.4933e+00 (6.3936e+00) Acc@1 3.52 ( 1.34) Acc@5 16.02 ( 4.79)
+Epoch: [0][2591/5004] Time 0.244 ( 0.243) Data 0.030 ( 0.027) Loss 5.7870e+00 (6.3933e+00) Acc@1 3.12 ( 1.34) Acc@5 9.38 ( 4.79)
+Epoch: [0][2592/5004] Time 0.245 ( 0.243) Data 0.030 ( 0.027) Loss 5.5421e+00 (6.3930e+00) Acc@1 3.52 ( 1.34) Acc@5 11.33 ( 4.79)
+Epoch: [0][2593/5004] Time 0.246 ( 0.243) Data 0.029 ( 0.027) Loss 5.5379e+00 (6.3927e+00) Acc@1 3.91 ( 1.35) Acc@5 16.02 ( 4.80)
+Epoch: [0][2594/5004] Time 0.245 ( 0.243) Data 0.028 ( 0.027) Loss 5.4681e+00 (6.3923e+00) Acc@1 2.73 ( 1.35) Acc@5 11.33 ( 4.80)
+Epoch: [0][2595/5004] Time 0.241 ( 0.243) Data 0.027 ( 0.027) Loss 5.6624e+00 (6.3920e+00) Acc@1 3.12 ( 1.35) Acc@5 12.50 ( 4.80)
+Epoch: [0][2596/5004] Time 0.242 ( 0.243) Data 0.029 ( 0.027) Loss 5.5409e+00 (6.3917e+00) Acc@1 3.91 ( 1.35) Acc@5 14.84 ( 4.81)
+Epoch: [0][2597/5004] Time 0.243 ( 0.243) Data 0.030 ( 0.027) Loss 5.6333e+00 (6.3914e+00) Acc@1 5.86 ( 1.35) Acc@5 14.84 ( 4.81)
+Epoch: [0][2598/5004] Time 0.247 ( 0.243) Data 0.029 ( 0.027) Loss 5.5780e+00 (6.3911e+00) Acc@1 3.52 ( 1.35) Acc@5 12.89 ( 4.81)
+Epoch: [0][2599/5004] Time 0.247 ( 0.243) Data 0.031 ( 0.027) Loss 5.4301e+00 (6.3907e+00) Acc@1 4.30 ( 1.35) Acc@5 13.28 ( 4.82)
+Epoch: [0][2600/5004] Time 0.245 ( 0.243) Data 0.029 ( 0.027) Loss 5.6246e+00 (6.3904e+00) Acc@1 5.47 ( 1.35) Acc@5 13.28 ( 4.82)
+Epoch: [0][2601/5004] Time 0.241 ( 0.243) Data 0.027 ( 0.027) Loss 5.7060e+00 (6.3902e+00) Acc@1 3.52 ( 1.35) Acc@5 8.20 ( 4.82)
+Epoch: [0][2602/5004] Time 0.244 ( 0.243) Data 0.029 ( 0.027) Loss 5.6888e+00 (6.3899e+00) Acc@1 3.52 ( 1.35) Acc@5 12.11 ( 4.82)
+Epoch: [0][2603/5004] Time 0.233 ( 0.243) Data 0.028 ( 0.027) Loss 5.6844e+00 (6.3896e+00) Acc@1 2.73 ( 1.35) Acc@5 9.38 ( 4.82)
+Epoch: [0][2604/5004] Time 0.239 ( 0.243) Data 0.032 ( 0.027) Loss 5.7482e+00 (6.3894e+00) Acc@1 1.56 ( 1.35) Acc@5 8.59 ( 4.83)
+Epoch: [0][2605/5004] Time 0.247 ( 0.243) Data 0.031 ( 0.027) Loss 5.7045e+00 (6.3891e+00) Acc@1 3.91 ( 1.36) Acc@5 13.28 ( 4.83)
+Epoch: [0][2606/5004] Time 0.240 ( 0.243) Data 0.031 ( 0.027) Loss 5.5713e+00 (6.3888e+00) Acc@1 4.69 ( 1.36) Acc@5 12.11 ( 4.83)
+Epoch: [0][2607/5004] Time 0.236 ( 0.243) Data 0.031 ( 0.027) Loss 5.5047e+00 (6.3885e+00) Acc@1 5.08 ( 1.36) Acc@5 19.92 ( 4.84)
+Epoch: [0][2608/5004] Time 0.244 ( 0.243) Data 0.033 ( 0.027) Loss 5.5366e+00 (6.3881e+00) Acc@1 4.30 ( 1.36) Acc@5 11.72 ( 4.84)
+Epoch: [0][2609/5004] Time 0.236 ( 0.243) Data 0.029 ( 0.027) Loss 5.5611e+00 (6.3878e+00) Acc@1 5.47 ( 1.36) Acc@5 12.89 ( 4.84)
+Epoch: [0][2610/5004] Time 0.239 ( 0.243) Data 0.030 ( 0.027) Loss 5.5764e+00 (6.3875e+00) Acc@1 2.34 ( 1.36) Acc@5 11.72 ( 4.85)
+Epoch: [0][2611/5004] Time 0.237 ( 0.243) Data 0.031 ( 0.027) Loss 5.6121e+00 (6.3872e+00) Acc@1 3.52 ( 1.36) Acc@5 13.67 ( 4.85)
+Epoch: [0][2612/5004] Time 0.238 ( 0.243) Data 0.032 ( 0.027) Loss 5.4857e+00 (6.3869e+00) Acc@1 5.86 ( 1.36) Acc@5 17.19 ( 4.85)
+Epoch: [0][2613/5004] Time 0.241 ( 0.243) Data 0.032 ( 0.027) Loss 5.3908e+00 (6.3865e+00) Acc@1 5.08 ( 1.37) Acc@5 17.58 ( 4.86)
+Epoch: [0][2614/5004] Time 0.238 ( 0.243) Data 0.031 ( 0.027) Loss 5.5464e+00 (6.3862e+00) Acc@1 8.20 ( 1.37) Acc@5 16.02 ( 4.86)
+Epoch: [0][2615/5004] Time 0.243 ( 0.243) Data 0.035 ( 0.027) Loss 5.6996e+00 (6.3859e+00) Acc@1 7.42 ( 1.37) Acc@5 14.84 ( 4.87)
+Epoch: [0][2616/5004] Time 0.239 ( 0.243) Data 0.032 ( 0.027) Loss 5.5246e+00 (6.3856e+00) Acc@1 5.47 ( 1.37) Acc@5 17.97 ( 4.87)
+Epoch: [0][2617/5004] Time 0.236 ( 0.243) Data 0.031 ( 0.027) Loss 5.5423e+00 (6.3852e+00) Acc@1 4.69 ( 1.37) Acc@5 13.28 ( 4.88)
+Epoch: [0][2618/5004] Time 0.238 ( 0.243) Data 0.032 ( 0.027) Loss 5.4692e+00 (6.3849e+00) Acc@1 6.25 ( 1.38) Acc@5 19.14 ( 4.88)
+Epoch: [0][2619/5004] Time 0.240 ( 0.243) Data 0.032 ( 0.027) Loss 5.5765e+00 (6.3846e+00) Acc@1 3.52 ( 1.38) Acc@5 12.50 ( 4.88)
+Epoch: [0][2620/5004] Time 0.242 ( 0.243) Data 0.032 ( 0.027) Loss 5.7091e+00 (6.3843e+00) Acc@1 3.52 ( 1.38) Acc@5 12.50 ( 4.89)
+Epoch: [0][2621/5004] Time 0.245 ( 0.243) Data 0.030 ( 0.027) Loss 5.6296e+00 (6.3840e+00) Acc@1 4.69 ( 1.38) Acc@5 10.94 ( 4.89)
+Epoch: [0][2622/5004] Time 0.243 ( 0.243) Data 0.029 ( 0.027) Loss 5.5420e+00 (6.3837e+00) Acc@1 3.12 ( 1.38) Acc@5 14.84 ( 4.89)
+Epoch: [0][2623/5004] Time 0.246 ( 0.243) Data 0.029 ( 0.027) Loss 5.4733e+00 (6.3834e+00) Acc@1 5.86 ( 1.38) Acc@5 18.36 ( 4.90)
+Epoch: [0][2624/5004] Time 0.245 ( 0.243) Data 0.029 ( 0.027) Loss 5.5778e+00 (6.3831e+00) Acc@1 4.69 ( 1.38) Acc@5 12.89 ( 4.90)
+Epoch: [0][2625/5004] Time 0.245 ( 0.243) Data 0.028 ( 0.027) Loss 5.6948e+00 (6.3828e+00) Acc@1 5.86 ( 1.38) Acc@5 16.02 ( 4.91)
+Epoch: [0][2626/5004] Time 0.246 ( 0.243) Data 0.030 ( 0.027) Loss 5.4892e+00 (6.3825e+00) Acc@1 4.30 ( 1.38) Acc@5 13.28 ( 4.91)
+Epoch: [0][2627/5004] Time 0.246 ( 0.243) Data 0.028 ( 0.027) Loss 5.6200e+00 (6.3822e+00) Acc@1 3.91 ( 1.39) Acc@5 12.11 ( 4.91)
+Epoch: [0][2628/5004] Time 0.248 ( 0.243) Data 0.029 ( 0.027) Loss 5.4041e+00 (6.3818e+00) Acc@1 4.69 ( 1.39) Acc@5 16.41 ( 4.92)
+Epoch: [0][2629/5004] Time 0.239 ( 0.243) Data 0.028 ( 0.027) Loss 5.5881e+00 (6.3815e+00) Acc@1 3.52 ( 1.39) Acc@5 16.80 ( 4.92)
+Epoch: [0][2630/5004] Time 0.240 ( 0.243) Data 0.030 ( 0.027) Loss 5.4618e+00 (6.3812e+00) Acc@1 5.86 ( 1.39) Acc@5 16.02 ( 4.92)
+Epoch: [0][2631/5004] Time 0.238 ( 0.243) Data 0.031 ( 0.027) Loss 5.5187e+00 (6.3808e+00) Acc@1 7.03 ( 1.39) Acc@5 18.36 ( 4.93)
+Epoch: [0][2632/5004] Time 0.243 ( 0.243) Data 0.031 ( 0.027) Loss 5.5043e+00 (6.3805e+00) Acc@1 3.91 ( 1.39) Acc@5 14.45 ( 4.93)
+Epoch: [0][2633/5004] Time 0.244 ( 0.243) Data 0.029 ( 0.027) Loss 5.4989e+00 (6.3802e+00) Acc@1 4.69 ( 1.39) Acc@5 14.06 ( 4.94)
+Epoch: [0][2634/5004] Time 0.255 ( 0.243) Data 0.030 ( 0.027) Loss 5.4519e+00 (6.3798e+00) Acc@1 7.81 ( 1.40) Acc@5 16.02 ( 4.94)
+Epoch: [0][2635/5004] Time 0.244 ( 0.243) Data 0.026 ( 0.027) Loss 5.5346e+00 (6.3795e+00) Acc@1 4.30 ( 1.40) Acc@5 14.84 ( 4.94)
+Epoch: [0][2636/5004] Time 0.237 ( 0.243) Data 0.030 ( 0.027) Loss 5.5494e+00 (6.3792e+00) Acc@1 1.95 ( 1.40) Acc@5 11.72 ( 4.95)
+Epoch: [0][2637/5004] Time 0.240 ( 0.243) Data 0.031 ( 0.027) Loss 5.3751e+00 (6.3788e+00) Acc@1 5.47 ( 1.40) Acc@5 17.97 ( 4.95)
+Epoch: [0][2638/5004] Time 0.243 ( 0.243) Data 0.031 ( 0.027) Loss 5.5735e+00 (6.3785e+00) Acc@1 3.52 ( 1.40) Acc@5 12.50 ( 4.96)
+Epoch: [0][2639/5004] Time 0.234 ( 0.243) Data 0.028 ( 0.027) Loss 5.5896e+00 (6.3782e+00) Acc@1 6.64 ( 1.40) Acc@5 14.06 ( 4.96)
+Epoch: [0][2640/5004] Time 0.245 ( 0.243) Data 0.032 ( 0.027) Loss 5.6385e+00 (6.3779e+00) Acc@1 3.52 ( 1.40) Acc@5 12.11 ( 4.96)
+Epoch: [0][2641/5004] Time 0.235 ( 0.243) Data 0.030 ( 0.027) Loss 5.4820e+00 (6.3776e+00) Acc@1 3.91 ( 1.40) Acc@5 16.41 ( 4.97)
+Epoch: [0][2642/5004] Time 0.240 ( 0.243) Data 0.032 ( 0.027) Loss 5.7612e+00 (6.3773e+00) Acc@1 4.30 ( 1.40) Acc@5 12.11 ( 4.97)
+Epoch: [0][2643/5004] Time 0.238 ( 0.243) Data 0.032 ( 0.027) Loss 5.4444e+00 (6.3770e+00) Acc@1 6.64 ( 1.41) Acc@5 16.02 ( 4.97)
+Epoch: [0][2644/5004] Time 0.241 ( 0.243) Data 0.031 ( 0.027) Loss 5.5449e+00 (6.3767e+00) Acc@1 6.25 ( 1.41) Acc@5 15.23 ( 4.98)
+Epoch: [0][2645/5004] Time 0.244 ( 0.243) Data 0.029 ( 0.027) Loss 5.6478e+00 (6.3764e+00) Acc@1 3.91 ( 1.41) Acc@5 12.50 ( 4.98)
+Epoch: [0][2646/5004] Time 0.239 ( 0.243) Data 0.028 ( 0.027) Loss 5.5188e+00 (6.3761e+00) Acc@1 5.86 ( 1.41) Acc@5 15.62 ( 4.98)
+Epoch: [0][2647/5004] Time 0.238 ( 0.243) Data 0.031 ( 0.027) Loss 5.4425e+00 (6.3757e+00) Acc@1 6.25 ( 1.41) Acc@5 16.80 ( 4.99)
+Epoch: [0][2648/5004] Time 0.240 ( 0.243) Data 0.031 ( 0.027) Loss 5.7404e+00 (6.3755e+00) Acc@1 3.91 ( 1.41) Acc@5 10.55 ( 4.99)
+Epoch: [0][2649/5004] Time 0.236 ( 0.243) Data 0.030 ( 0.027) Loss 5.5389e+00 (6.3752e+00) Acc@1 5.86 ( 1.42) Acc@5 14.45 ( 4.99)
+Epoch: [0][2650/5004] Time 0.237 ( 0.243) Data 0.032 ( 0.027) Loss 5.6001e+00 (6.3749e+00) Acc@1 5.08 ( 1.42) Acc@5 14.45 ( 5.00)
+Epoch: [0][2651/5004] Time 0.239 ( 0.243) Data 0.032 ( 0.027) Loss 5.6984e+00 (6.3746e+00) Acc@1 1.17 ( 1.42) Acc@5 12.11 ( 5.00)
+Epoch: [0][2652/5004] Time 0.240 ( 0.243) Data 0.031 ( 0.027) Loss 5.5366e+00 (6.3743e+00) Acc@1 7.03 ( 1.42) Acc@5 14.45 ( 5.00)
+Epoch: [0][2653/5004] Time 0.241 ( 0.243) Data 0.030 ( 0.027) Loss 5.5431e+00 (6.3740e+00) Acc@1 5.86 ( 1.42) Acc@5 15.23 ( 5.01)
+Epoch: [0][2654/5004] Time 0.249 ( 0.243) Data 0.032 ( 0.027) Loss 5.4298e+00 (6.3736e+00) Acc@1 3.52 ( 1.42) Acc@5 15.23 ( 5.01)
+Epoch: [0][2655/5004] Time 0.237 ( 0.243) Data 0.027 ( 0.027) Loss 5.5158e+00 (6.3733e+00) Acc@1 4.69 ( 1.42) Acc@5 13.67 ( 5.01)
+Epoch: [0][2656/5004] Time 0.240 ( 0.243) Data 0.030 ( 0.027) Loss 5.6845e+00 (6.3730e+00) Acc@1 3.12 ( 1.42) Acc@5 10.94 ( 5.02)
+Epoch: [0][2657/5004] Time 0.239 ( 0.243) Data 0.030 ( 0.027) Loss 5.6816e+00 (6.3728e+00) Acc@1 3.52 ( 1.42) Acc@5 10.55 ( 5.02)
+Epoch: [0][2658/5004] Time 0.237 ( 0.243) Data 0.031 ( 0.027) Loss 5.6418e+00 (6.3725e+00) Acc@1 3.52 ( 1.42) Acc@5 14.84 ( 5.02)
+Epoch: [0][2659/5004] Time 0.238 ( 0.243) Data 0.032 ( 0.027) Loss 5.5253e+00 (6.3722e+00) Acc@1 4.30 ( 1.43) Acc@5 17.19 ( 5.03)
+Epoch: [0][2660/5004] Time 0.238 ( 0.243) Data 0.032 ( 0.027) Loss 5.6050e+00 (6.3719e+00) Acc@1 1.95 ( 1.43) Acc@5 13.28 ( 5.03)
+Epoch: [0][2661/5004] Time 0.239 ( 0.243) Data 0.033 ( 0.027) Loss 5.5862e+00 (6.3716e+00) Acc@1 4.69 ( 1.43) Acc@5 12.89 ( 5.03)
+Epoch: [0][2662/5004] Time 0.244 ( 0.243) Data 0.032 ( 0.027) Loss 5.5705e+00 (6.3713e+00) Acc@1 4.69 ( 1.43) Acc@5 11.72 ( 5.04)
+Epoch: [0][2663/5004] Time 0.236 ( 0.243) Data 0.029 ( 0.027) Loss 5.6061e+00 (6.3710e+00) Acc@1 4.30 ( 1.43) Acc@5 11.72 ( 5.04)
+Epoch: [0][2664/5004] Time 0.248 ( 0.243) Data 0.031 ( 0.027) Loss 5.5167e+00 (6.3707e+00) Acc@1 5.86 ( 1.43) Acc@5 15.62 ( 5.04)
+Epoch: [0][2665/5004] Time 0.236 ( 0.243) Data 0.022 ( 0.027) Loss 5.8128e+00 (6.3705e+00) Acc@1 2.73 ( 1.43) Acc@5 11.33 ( 5.04)
+Epoch: [0][2666/5004] Time 0.238 ( 0.243) Data 0.025 ( 0.027) Loss 5.4409e+00 (6.3701e+00) Acc@1 6.64 ( 1.43) Acc@5 17.58 ( 5.05)
+Epoch: [0][2667/5004] Time 0.239 ( 0.243) Data 0.027 ( 0.027) Loss 5.7127e+00 (6.3699e+00) Acc@1 4.69 ( 1.43) Acc@5 11.33 ( 5.05)
+Epoch: [0][2668/5004] Time 0.240 ( 0.243) Data 0.026 ( 0.027) Loss 5.4060e+00 (6.3695e+00) Acc@1 7.03 ( 1.44) Acc@5 17.19 ( 5.06)
+Epoch: [0][2669/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.5275e+00 (6.3692e+00) Acc@1 4.69 ( 1.44) Acc@5 11.72 ( 5.06)
+Epoch: [0][2670/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.4829e+00 (6.3689e+00) Acc@1 4.69 ( 1.44) Acc@5 11.72 ( 5.06)
+Epoch: [0][2671/5004] Time 0.237 ( 0.243) Data 0.024 ( 0.027) Loss 5.4720e+00 (6.3686e+00) Acc@1 5.08 ( 1.44) Acc@5 17.19 ( 5.07)
+Epoch: [0][2672/5004] Time 0.239 ( 0.243) Data 0.025 ( 0.027) Loss 5.6083e+00 (6.3683e+00) Acc@1 3.52 ( 1.44) Acc@5 14.45 ( 5.07)
+Epoch: [0][2673/5004] Time 0.243 ( 0.243) Data 0.026 ( 0.027) Loss 5.5894e+00 (6.3680e+00) Acc@1 5.47 ( 1.44) Acc@5 15.23 ( 5.07)
+Epoch: [0][2674/5004] Time 0.237 ( 0.243) Data 0.023 ( 0.027) Loss 5.6123e+00 (6.3677e+00) Acc@1 4.69 ( 1.44) Acc@5 13.28 ( 5.08)
+Epoch: [0][2675/5004] Time 0.250 ( 0.243) Data 0.025 ( 0.027) Loss 5.6190e+00 (6.3674e+00) Acc@1 2.34 ( 1.44) Acc@5 12.89 ( 5.08)
+Epoch: [0][2676/5004] Time 0.238 ( 0.243) Data 0.016 ( 0.027) Loss 5.5182e+00 (6.3671e+00) Acc@1 4.69 ( 1.45) Acc@5 14.06 ( 5.08)
+Epoch: [0][2677/5004] Time 0.239 ( 0.243) Data 0.021 ( 0.027) Loss 5.6375e+00 (6.3668e+00) Acc@1 5.86 ( 1.45) Acc@5 13.28 ( 5.08)
+Epoch: [0][2678/5004] Time 0.240 ( 0.243) Data 0.022 ( 0.027) Loss 5.8143e+00 (6.3666e+00) Acc@1 4.30 ( 1.45) Acc@5 9.38 ( 5.09)
+Epoch: [0][2679/5004] Time 0.236 ( 0.243) Data 0.021 ( 0.027) Loss 5.5764e+00 (6.3663e+00) Acc@1 4.30 ( 1.45) Acc@5 13.67 ( 5.09)
+Epoch: [0][2680/5004] Time 0.238 ( 0.243) Data 0.024 ( 0.027) Loss 5.7318e+00 (6.3661e+00) Acc@1 3.91 ( 1.45) Acc@5 13.28 ( 5.09)
+Epoch: [0][2681/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.5502e+00 (6.3658e+00) Acc@1 5.86 ( 1.45) Acc@5 14.45 ( 5.10)
+Epoch: [0][2682/5004] Time 0.238 ( 0.243) Data 0.024 ( 0.027) Loss 5.6694e+00 (6.3655e+00) Acc@1 3.12 ( 1.45) Acc@5 12.89 ( 5.10)
+Epoch: [0][2683/5004] Time 0.238 ( 0.243) Data 0.025 ( 0.027) Loss 5.7690e+00 (6.3653e+00) Acc@1 2.34 ( 1.45) Acc@5 9.77 ( 5.10)
+Epoch: [0][2684/5004] Time 0.239 ( 0.243) Data 0.025 ( 0.027) Loss 5.7304e+00 (6.3651e+00) Acc@1 2.34 ( 1.45) Acc@5 11.33 ( 5.10)
+Epoch: [0][2685/5004] Time 0.238 ( 0.243) Data 0.025 ( 0.027) Loss 5.6667e+00 (6.3648e+00) Acc@1 3.52 ( 1.45) Acc@5 13.28 ( 5.11)
+Epoch: [0][2686/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.5439e+00 (6.3645e+00) Acc@1 3.91 ( 1.46) Acc@5 15.23 ( 5.11)
+Epoch: [0][2687/5004] Time 0.241 ( 0.243) Data 0.024 ( 0.027) Loss 5.3422e+00 (6.3641e+00) Acc@1 6.64 ( 1.46) Acc@5 17.58 ( 5.11)
+Epoch: [0][2688/5004] Time 0.244 ( 0.243) Data 0.026 ( 0.027) Loss 5.5405e+00 (6.3638e+00) Acc@1 4.69 ( 1.46) Acc@5 13.28 ( 5.12)
+Epoch: [0][2689/5004] Time 0.248 ( 0.243) Data 0.022 ( 0.027) Loss 5.4955e+00 (6.3635e+00) Acc@1 3.52 ( 1.46) Acc@5 15.23 ( 5.12)
+Epoch: [0][2690/5004] Time 0.246 ( 0.243) Data 0.020 ( 0.027) Loss 5.4818e+00 (6.3632e+00) Acc@1 3.52 ( 1.46) Acc@5 13.67 ( 5.12)
+Epoch: [0][2691/5004] Time 0.228 ( 0.243) Data 0.014 ( 0.027) Loss 5.5071e+00 (6.3628e+00) Acc@1 5.47 ( 1.46) Acc@5 15.62 ( 5.13)
+Epoch: [0][2692/5004] Time 0.242 ( 0.243) Data 0.026 ( 0.027) Loss 5.6807e+00 (6.3626e+00) Acc@1 4.30 ( 1.46) Acc@5 12.11 ( 5.13)
+Epoch: [0][2693/5004] Time 0.236 ( 0.243) Data 0.023 ( 0.027) Loss 5.5701e+00 (6.3623e+00) Acc@1 5.08 ( 1.46) Acc@5 14.06 ( 5.13)
+Epoch: [0][2694/5004] Time 0.240 ( 0.243) Data 0.026 ( 0.027) Loss 5.5699e+00 (6.3620e+00) Acc@1 4.69 ( 1.46) Acc@5 14.06 ( 5.14)
+Epoch: [0][2695/5004] Time 0.239 ( 0.243) Data 0.024 ( 0.027) Loss 5.4725e+00 (6.3617e+00) Acc@1 7.03 ( 1.47) Acc@5 16.02 ( 5.14)
+Epoch: [0][2696/5004] Time 0.239 ( 0.243) Data 0.025 ( 0.027) Loss 5.6065e+00 (6.3614e+00) Acc@1 2.73 ( 1.47) Acc@5 12.89 ( 5.14)
+Epoch: [0][2697/5004] Time 0.245 ( 0.243) Data 0.025 ( 0.027) Loss 5.6271e+00 (6.3611e+00) Acc@1 5.86 ( 1.47) Acc@5 14.45 ( 5.15)
+Epoch: [0][2698/5004] Time 0.235 ( 0.243) Data 0.020 ( 0.027) Loss 5.6482e+00 (6.3609e+00) Acc@1 4.30 ( 1.47) Acc@5 14.45 ( 5.15)
+Epoch: [0][2699/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.4829e+00 (6.3605e+00) Acc@1 4.30 ( 1.47) Acc@5 14.06 ( 5.15)
+Epoch: [0][2700/5004] Time 0.241 ( 0.243) Data 0.021 ( 0.027) Loss 5.4110e+00 (6.3602e+00) Acc@1 6.25 ( 1.47) Acc@5 17.58 ( 5.16)
+Epoch: [0][2701/5004] Time 0.243 ( 0.243) Data 0.022 ( 0.027) Loss 5.4644e+00 (6.3598e+00) Acc@1 3.91 ( 1.47) Acc@5 17.58 ( 5.16)
+Epoch: [0][2702/5004] Time 0.239 ( 0.243) Data 0.021 ( 0.027) Loss 5.5126e+00 (6.3595e+00) Acc@1 3.52 ( 1.47) Acc@5 9.77 ( 5.17)
+Epoch: [0][2703/5004] Time 0.245 ( 0.243) Data 0.024 ( 0.027) Loss 5.5120e+00 (6.3592e+00) Acc@1 4.69 ( 1.48) Acc@5 16.41 ( 5.17)
+Epoch: [0][2704/5004] Time 0.246 ( 0.243) Data 0.021 ( 0.027) Loss 5.5408e+00 (6.3589e+00) Acc@1 6.64 ( 1.48) Acc@5 15.23 ( 5.17)
+Epoch: [0][2705/5004] Time 0.239 ( 0.243) Data 0.017 ( 0.027) Loss 5.7053e+00 (6.3587e+00) Acc@1 3.52 ( 1.48) Acc@5 15.62 ( 5.18)
+Epoch: [0][2706/5004] Time 0.239 ( 0.243) Data 0.022 ( 0.027) Loss 5.5243e+00 (6.3584e+00) Acc@1 3.52 ( 1.48) Acc@5 12.89 ( 5.18)
+Epoch: [0][2707/5004] Time 0.246 ( 0.243) Data 0.026 ( 0.027) Loss 5.5293e+00 (6.3581e+00) Acc@1 5.08 ( 1.48) Acc@5 15.23 ( 5.18)
+Epoch: [0][2708/5004] Time 0.240 ( 0.243) Data 0.022 ( 0.027) Loss 5.6164e+00 (6.3578e+00) Acc@1 5.86 ( 1.48) Acc@5 13.28 ( 5.19)
+Epoch: [0][2709/5004] Time 0.246 ( 0.243) Data 0.023 ( 0.027) Loss 5.5472e+00 (6.3575e+00) Acc@1 4.69 ( 1.48) Acc@5 14.45 ( 5.19)
+Epoch: [0][2710/5004] Time 0.240 ( 0.243) Data 0.022 ( 0.027) Loss 5.5133e+00 (6.3572e+00) Acc@1 4.69 ( 1.48) Acc@5 16.80 ( 5.19)
+Epoch: [0][2711/5004] Time 0.237 ( 0.243) Data 0.022 ( 0.027) Loss 5.5592e+00 (6.3569e+00) Acc@1 4.30 ( 1.49) Acc@5 14.06 ( 5.20)
+Epoch: [0][2712/5004] Time 0.248 ( 0.243) Data 0.025 ( 0.027) Loss 5.5166e+00 (6.3566e+00) Acc@1 5.86 ( 1.49) Acc@5 18.75 ( 5.20)
+Epoch: [0][2713/5004] Time 0.252 ( 0.243) Data 0.018 ( 0.027) Loss 5.4530e+00 (6.3562e+00) Acc@1 4.30 ( 1.49) Acc@5 14.45 ( 5.21)
+Epoch: [0][2714/5004] Time 0.235 ( 0.243) Data 0.019 ( 0.027) Loss 5.4844e+00 (6.3559e+00) Acc@1 5.86 ( 1.49) Acc@5 16.02 ( 5.21)
+Epoch: [0][2715/5004] Time 0.236 ( 0.243) Data 0.022 ( 0.027) Loss 5.5095e+00 (6.3556e+00) Acc@1 7.03 ( 1.49) Acc@5 17.97 ( 5.22)
+Epoch: [0][2716/5004] Time 0.239 ( 0.243) Data 0.025 ( 0.027) Loss 5.4880e+00 (6.3553e+00) Acc@1 3.52 ( 1.49) Acc@5 14.84 ( 5.22)
+Epoch: [0][2717/5004] Time 0.238 ( 0.243) Data 0.024 ( 0.027) Loss 5.5076e+00 (6.3550e+00) Acc@1 4.69 ( 1.49) Acc@5 14.84 ( 5.22)
+Epoch: [0][2718/5004] Time 0.241 ( 0.243) Data 0.025 ( 0.027) Loss 5.5062e+00 (6.3547e+00) Acc@1 5.47 ( 1.50) Acc@5 14.45 ( 5.23)
+Epoch: [0][2719/5004] Time 0.239 ( 0.243) Data 0.023 ( 0.027) Loss 5.4397e+00 (6.3543e+00) Acc@1 6.25 ( 1.50) Acc@5 18.36 ( 5.23)
+Epoch: [0][2720/5004] Time 0.237 ( 0.243) Data 0.023 ( 0.027) Loss 5.5843e+00 (6.3540e+00) Acc@1 4.69 ( 1.50) Acc@5 14.45 ( 5.23)
+Epoch: [0][2721/5004] Time 0.237 ( 0.243) Data 0.024 ( 0.027) Loss 5.5366e+00 (6.3537e+00) Acc@1 2.73 ( 1.50) Acc@5 13.67 ( 5.24)
+Epoch: [0][2722/5004] Time 0.240 ( 0.243) Data 0.026 ( 0.027) Loss 5.4960e+00 (6.3534e+00) Acc@1 3.52 ( 1.50) Acc@5 13.28 ( 5.24)
+Epoch: [0][2723/5004] Time 0.238 ( 0.243) Data 0.025 ( 0.027) Loss 5.5619e+00 (6.3531e+00) Acc@1 5.08 ( 1.50) Acc@5 12.11 ( 5.24)
+Epoch: [0][2724/5004] Time 0.238 ( 0.243) Data 0.025 ( 0.027) Loss 5.5510e+00 (6.3528e+00) Acc@1 5.08 ( 1.50) Acc@5 14.06 ( 5.25)
+Epoch: [0][2725/5004] Time 0.238 ( 0.243) Data 0.026 ( 0.027) Loss 5.2300e+00 (6.3524e+00) Acc@1 9.38 ( 1.50) Acc@5 19.92 ( 5.25)
+Epoch: [0][2726/5004] Time 0.242 ( 0.243) Data 0.026 ( 0.027) Loss 5.4360e+00 (6.3521e+00) Acc@1 5.08 ( 1.51) Acc@5 18.75 ( 5.26)
+Epoch: [0][2727/5004] Time 0.243 ( 0.243) Data 0.023 ( 0.027) Loss 5.4144e+00 (6.3518e+00) Acc@1 6.25 ( 1.51) Acc@5 17.58 ( 5.26)
+Epoch: [0][2728/5004] Time 0.233 ( 0.243) Data 0.019 ( 0.027) Loss 5.6379e+00 (6.3515e+00) Acc@1 3.91 ( 1.51) Acc@5 14.84 ( 5.26)
+Epoch: [0][2729/5004] Time 0.238 ( 0.243) Data 0.025 ( 0.027) Loss 5.4937e+00 (6.3512e+00) Acc@1 4.69 ( 1.51) Acc@5 14.45 ( 5.27)
+Epoch: [0][2730/5004] Time 0.239 ( 0.243) Data 0.025 ( 0.027) Loss 5.4999e+00 (6.3509e+00) Acc@1 5.86 ( 1.51) Acc@5 12.50 ( 5.27)
+Epoch: [0][2731/5004] Time 0.238 ( 0.243) Data 0.025 ( 0.027) Loss 5.6881e+00 (6.3506e+00) Acc@1 3.52 ( 1.51) Acc@5 13.67 ( 5.27)
+Epoch: [0][2732/5004] Time 0.240 ( 0.243) Data 0.026 ( 0.027) Loss 5.4238e+00 (6.3503e+00) Acc@1 7.03 ( 1.51) Acc@5 17.97 ( 5.28)
+Epoch: [0][2733/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.5799e+00 (6.3500e+00) Acc@1 5.08 ( 1.52) Acc@5 15.23 ( 5.28)
+Epoch: [0][2734/5004] Time 0.235 ( 0.243) Data 0.023 ( 0.027) Loss 5.5692e+00 (6.3497e+00) Acc@1 3.91 ( 1.52) Acc@5 13.28 ( 5.28)
+Epoch: [0][2735/5004] Time 0.242 ( 0.243) Data 0.026 ( 0.027) Loss 5.5608e+00 (6.3494e+00) Acc@1 3.91 ( 1.52) Acc@5 13.67 ( 5.29)
+Epoch: [0][2736/5004] Time 0.241 ( 0.243) Data 0.024 ( 0.027) Loss 5.4211e+00 (6.3491e+00) Acc@1 4.69 ( 1.52) Acc@5 16.80 ( 5.29)
+Epoch: [0][2737/5004] Time 0.238 ( 0.243) Data 0.025 ( 0.027) Loss 5.6344e+00 (6.3488e+00) Acc@1 5.08 ( 1.52) Acc@5 12.89 ( 5.29)
+Epoch: [0][2738/5004] Time 0.239 ( 0.243) Data 0.027 ( 0.027) Loss 5.6397e+00 (6.3486e+00) Acc@1 3.12 ( 1.52) Acc@5 12.11 ( 5.30)
+Epoch: [0][2739/5004] Time 0.241 ( 0.243) Data 0.027 ( 0.027) Loss 5.3938e+00 (6.3482e+00) Acc@1 4.69 ( 1.52) Acc@5 14.45 ( 5.30)
+Epoch: [0][2740/5004] Time 0.242 ( 0.243) Data 0.027 ( 0.027) Loss 5.4130e+00 (6.3479e+00) Acc@1 7.03 ( 1.52) Acc@5 17.97 ( 5.30)
+Epoch: [0][2741/5004] Time 0.239 ( 0.243) Data 0.025 ( 0.027) Loss 5.4991e+00 (6.3476e+00) Acc@1 5.08 ( 1.52) Acc@5 16.02 ( 5.31)
+Epoch: [0][2742/5004] Time 0.237 ( 0.243) Data 0.025 ( 0.027) Loss 5.5721e+00 (6.3473e+00) Acc@1 7.03 ( 1.53) Acc@5 17.97 ( 5.31)
+Epoch: [0][2743/5004] Time 0.240 ( 0.243) Data 0.026 ( 0.027) Loss 5.5447e+00 (6.3470e+00) Acc@1 3.91 ( 1.53) Acc@5 14.06 ( 5.32)
+Epoch: [0][2744/5004] Time 0.244 ( 0.243) Data 0.025 ( 0.027) Loss 5.4605e+00 (6.3467e+00) Acc@1 5.86 ( 1.53) Acc@5 16.41 ( 5.32)
+Epoch: [0][2745/5004] Time 0.241 ( 0.243) Data 0.026 ( 0.027) Loss 5.6474e+00 (6.3464e+00) Acc@1 4.30 ( 1.53) Acc@5 11.72 ( 5.32)
+Epoch: [0][2746/5004] Time 0.236 ( 0.243) Data 0.025 ( 0.027) Loss 5.5223e+00 (6.3461e+00) Acc@1 4.30 ( 1.53) Acc@5 16.80 ( 5.33)
+Epoch: [0][2747/5004] Time 0.238 ( 0.243) Data 0.027 ( 0.027) Loss 5.4501e+00 (6.3458e+00) Acc@1 4.69 ( 1.53) Acc@5 14.06 ( 5.33)
+Epoch: [0][2748/5004] Time 0.241 ( 0.243) Data 0.027 ( 0.027) Loss 5.4746e+00 (6.3455e+00) Acc@1 3.91 ( 1.53) Acc@5 18.75 ( 5.34)
+Epoch: [0][2749/5004] Time 0.238 ( 0.243) Data 0.025 ( 0.027) Loss 5.6476e+00 (6.3452e+00) Acc@1 5.47 ( 1.53) Acc@5 15.62 ( 5.34)
+Epoch: [0][2750/5004] Time 0.238 ( 0.243) Data 0.026 ( 0.027) Loss 5.5029e+00 (6.3449e+00) Acc@1 5.86 ( 1.54) Acc@5 17.97 ( 5.34)
+Epoch: [0][2751/5004] Time 0.242 ( 0.243) Data 0.027 ( 0.027) Loss 5.5109e+00 (6.3446e+00) Acc@1 5.08 ( 1.54) Acc@5 15.62 ( 5.35)
+Epoch: [0][2752/5004] Time 0.243 ( 0.243) Data 0.023 ( 0.027) Loss 5.5150e+00 (6.3443e+00) Acc@1 4.69 ( 1.54) Acc@5 15.23 ( 5.35)
+Epoch: [0][2753/5004] Time 0.235 ( 0.243) Data 0.021 ( 0.027) Loss 5.6181e+00 (6.3440e+00) Acc@1 3.91 ( 1.54) Acc@5 14.84 ( 5.35)
+Epoch: [0][2754/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.5420e+00 (6.3438e+00) Acc@1 5.47 ( 1.54) Acc@5 15.62 ( 5.36)
+Epoch: [0][2755/5004] Time 0.242 ( 0.243) Data 0.024 ( 0.027) Loss 5.4154e+00 (6.3434e+00) Acc@1 4.30 ( 1.54) Acc@5 14.06 ( 5.36)
+Epoch: [0][2756/5004] Time 0.247 ( 0.243) Data 0.026 ( 0.027) Loss 5.4307e+00 (6.3431e+00) Acc@1 5.47 ( 1.54) Acc@5 14.84 ( 5.36)
+Epoch: [0][2757/5004] Time 0.245 ( 0.243) Data 0.022 ( 0.027) Loss 5.4150e+00 (6.3427e+00) Acc@1 5.86 ( 1.54) Acc@5 19.14 ( 5.37)
+Epoch: [0][2758/5004] Time 0.247 ( 0.243) Data 0.022 ( 0.027) Loss 5.5497e+00 (6.3425e+00) Acc@1 5.08 ( 1.55) Acc@5 14.84 ( 5.37)
+Epoch: [0][2759/5004] Time 0.242 ( 0.243) Data 0.020 ( 0.027) Loss 5.4709e+00 (6.3421e+00) Acc@1 5.08 ( 1.55) Acc@5 16.41 ( 5.38)
+Epoch: [0][2760/5004] Time 0.244 ( 0.243) Data 0.022 ( 0.027) Loss 5.3756e+00 (6.3418e+00) Acc@1 6.25 ( 1.55) Acc@5 18.75 ( 5.38)
+Epoch: [0][2761/5004] Time 0.245 ( 0.243) Data 0.022 ( 0.027) Loss 5.5434e+00 (6.3415e+00) Acc@1 2.73 ( 1.55) Acc@5 13.67 ( 5.38)
+Epoch: [0][2762/5004] Time 0.243 ( 0.243) Data 0.021 ( 0.027) Loss 5.6508e+00 (6.3413e+00) Acc@1 4.30 ( 1.55) Acc@5 14.06 ( 5.39)
+Epoch: [0][2763/5004] Time 0.243 ( 0.243) Data 0.021 ( 0.027) Loss 5.6226e+00 (6.3410e+00) Acc@1 5.86 ( 1.55) Acc@5 16.80 ( 5.39)
+Epoch: [0][2764/5004] Time 0.245 ( 0.243) Data 0.022 ( 0.027) Loss 5.6177e+00 (6.3407e+00) Acc@1 5.47 ( 1.55) Acc@5 13.28 ( 5.40)
+Epoch: [0][2765/5004] Time 0.245 ( 0.243) Data 0.022 ( 0.027) Loss 5.5283e+00 (6.3404e+00) Acc@1 2.73 ( 1.55) Acc@5 8.98 ( 5.40)
+Epoch: [0][2766/5004] Time 0.243 ( 0.243) Data 0.021 ( 0.027) Loss 5.5611e+00 (6.3402e+00) Acc@1 3.52 ( 1.55) Acc@5 10.94 ( 5.40)
+Epoch: [0][2767/5004] Time 0.246 ( 0.243) Data 0.022 ( 0.027) Loss 5.6421e+00 (6.3399e+00) Acc@1 3.52 ( 1.56) Acc@5 13.28 ( 5.40)
+Epoch: [0][2768/5004] Time 0.247 ( 0.243) Data 0.020 ( 0.027) Loss 5.5173e+00 (6.3396e+00) Acc@1 3.91 ( 1.56) Acc@5 14.06 ( 5.40)
+Epoch: [0][2769/5004] Time 0.245 ( 0.243) Data 0.020 ( 0.027) Loss 5.5629e+00 (6.3393e+00) Acc@1 5.47 ( 1.56) Acc@5 14.06 ( 5.41)
+Epoch: [0][2770/5004] Time 0.243 ( 0.243) Data 0.020 ( 0.027) Loss 5.3512e+00 (6.3390e+00) Acc@1 5.86 ( 1.56) Acc@5 20.70 ( 5.41)
+Epoch: [0][2771/5004] Time 0.240 ( 0.243) Data 0.021 ( 0.027) Loss 5.4446e+00 (6.3387e+00) Acc@1 5.47 ( 1.56) Acc@5 14.45 ( 5.42)
+Epoch: [0][2772/5004] Time 0.236 ( 0.243) Data 0.022 ( 0.027) Loss 5.6021e+00 (6.3384e+00) Acc@1 3.52 ( 1.56) Acc@5 14.06 ( 5.42)
+Epoch: [0][2773/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.5749e+00 (6.3381e+00) Acc@1 4.30 ( 1.56) Acc@5 14.84 ( 5.42)
+Epoch: [0][2774/5004] Time 0.236 ( 0.243) Data 0.024 ( 0.027) Loss 5.5260e+00 (6.3378e+00) Acc@1 2.73 ( 1.56) Acc@5 14.84 ( 5.43)
+Epoch: [0][2775/5004] Time 0.238 ( 0.243) Data 0.026 ( 0.027) Loss 5.6156e+00 (6.3376e+00) Acc@1 3.12 ( 1.56) Acc@5 10.94 ( 5.43)
+Epoch: [0][2776/5004] Time 0.238 ( 0.243) Data 0.027 ( 0.027) Loss 5.5842e+00 (6.3373e+00) Acc@1 4.69 ( 1.56) Acc@5 13.28 ( 5.43)
+Epoch: [0][2777/5004] Time 0.239 ( 0.243) Data 0.028 ( 0.027) Loss 5.5411e+00 (6.3370e+00) Acc@1 4.30 ( 1.57) Acc@5 10.94 ( 5.43)
+Epoch: [0][2778/5004] Time 0.241 ( 0.243) Data 0.028 ( 0.027) Loss 5.3671e+00 (6.3367e+00) Acc@1 6.64 ( 1.57) Acc@5 19.14 ( 5.44)
+Epoch: [0][2779/5004] Time 0.237 ( 0.243) Data 0.026 ( 0.027) Loss 5.5530e+00 (6.3364e+00) Acc@1 5.08 ( 1.57) Acc@5 14.06 ( 5.44)
+Epoch: [0][2780/5004] Time 0.237 ( 0.243) Data 0.028 ( 0.027) Loss 5.4305e+00 (6.3360e+00) Acc@1 4.30 ( 1.57) Acc@5 16.02 ( 5.44)
+Epoch: [0][2781/5004] Time 0.244 ( 0.243) Data 0.029 ( 0.027) Loss 5.4829e+00 (6.3357e+00) Acc@1 5.86 ( 1.57) Acc@5 16.80 ( 5.45)
+Epoch: [0][2782/5004] Time 0.237 ( 0.243) Data 0.026 ( 0.027) Loss 5.5394e+00 (6.3355e+00) Acc@1 4.69 ( 1.57) Acc@5 14.45 ( 5.45)
+Epoch: [0][2783/5004] Time 0.240 ( 0.243) Data 0.028 ( 0.027) Loss 5.5694e+00 (6.3352e+00) Acc@1 5.08 ( 1.57) Acc@5 14.84 ( 5.46)
+Epoch: [0][2784/5004] Time 0.241 ( 0.243) Data 0.026 ( 0.027) Loss 5.5532e+00 (6.3349e+00) Acc@1 5.47 ( 1.57) Acc@5 17.19 ( 5.46)
+Epoch: [0][2785/5004] Time 0.238 ( 0.243) Data 0.025 ( 0.027) Loss 5.4671e+00 (6.3346e+00) Acc@1 5.47 ( 1.58) Acc@5 13.67 ( 5.46)
+Epoch: [0][2786/5004] Time 0.236 ( 0.243) Data 0.026 ( 0.027) Loss 5.7900e+00 (6.3344e+00) Acc@1 2.73 ( 1.58) Acc@5 9.77 ( 5.46)
+Epoch: [0][2787/5004] Time 0.241 ( 0.243) Data 0.029 ( 0.027) Loss 5.5179e+00 (6.3341e+00) Acc@1 3.52 ( 1.58) Acc@5 14.06 ( 5.47)
+Epoch: [0][2788/5004] Time 0.240 ( 0.243) Data 0.027 ( 0.027) Loss 5.3545e+00 (6.3337e+00) Acc@1 5.86 ( 1.58) Acc@5 16.41 ( 5.47)
+Epoch:
[0][2789/5004] Time 0.238 ( 0.243) Data 0.025 ( 0.027) Loss 5.5847e+00 (6.3335e+00) Acc@1 4.30 ( 1.58) Acc@5 14.45 ( 5.47) +Epoch: [0][2790/5004] Time 0.239 ( 0.243) Data 0.025 ( 0.027) Loss 5.4458e+00 (6.3332e+00) Acc@1 5.47 ( 1.58) Acc@5 18.75 ( 5.48) +Epoch: [0][2791/5004] Time 0.237 ( 0.243) Data 0.027 ( 0.027) Loss 5.4857e+00 (6.3329e+00) Acc@1 4.69 ( 1.58) Acc@5 16.41 ( 5.48) +Epoch: [0][2792/5004] Time 0.244 ( 0.243) Data 0.028 ( 0.027) Loss 5.5886e+00 (6.3326e+00) Acc@1 4.30 ( 1.58) Acc@5 16.02 ( 5.49) +Epoch: [0][2793/5004] Time 0.238 ( 0.243) Data 0.022 ( 0.027) Loss 5.5443e+00 (6.3323e+00) Acc@1 3.52 ( 1.58) Acc@5 13.67 ( 5.49) +Epoch: [0][2794/5004] Time 0.236 ( 0.243) Data 0.023 ( 0.027) Loss 5.5259e+00 (6.3320e+00) Acc@1 3.91 ( 1.58) Acc@5 18.36 ( 5.49) +Epoch: [0][2795/5004] Time 0.236 ( 0.243) Data 0.026 ( 0.027) Loss 5.3363e+00 (6.3317e+00) Acc@1 6.64 ( 1.59) Acc@5 19.53 ( 5.50) +Epoch: [0][2796/5004] Time 0.241 ( 0.243) Data 0.029 ( 0.027) Loss 5.4068e+00 (6.3313e+00) Acc@1 7.81 ( 1.59) Acc@5 17.97 ( 5.50) +Epoch: [0][2797/5004] Time 0.240 ( 0.243) Data 0.028 ( 0.027) Loss 5.5196e+00 (6.3310e+00) Acc@1 5.86 ( 1.59) Acc@5 17.97 ( 5.51) +Epoch: [0][2798/5004] Time 0.239 ( 0.243) Data 0.028 ( 0.027) Loss 5.3645e+00 (6.3307e+00) Acc@1 5.47 ( 1.59) Acc@5 19.53 ( 5.51) +Epoch: [0][2799/5004] Time 0.238 ( 0.243) Data 0.028 ( 0.027) Loss 5.4029e+00 (6.3304e+00) Acc@1 3.91 ( 1.59) Acc@5 17.19 ( 5.52) +Epoch: [0][2800/5004] Time 0.242 ( 0.243) Data 0.028 ( 0.027) Loss 5.3594e+00 (6.3300e+00) Acc@1 1.95 ( 1.59) Acc@5 14.45 ( 5.52) +Epoch: [0][2801/5004] Time 0.239 ( 0.243) Data 0.026 ( 0.027) Loss 5.5000e+00 (6.3297e+00) Acc@1 5.47 ( 1.59) Acc@5 16.41 ( 5.52) +Epoch: [0][2802/5004] Time 0.240 ( 0.243) Data 0.026 ( 0.027) Loss 5.4159e+00 (6.3294e+00) Acc@1 5.08 ( 1.60) Acc@5 15.23 ( 5.53) +Epoch: [0][2803/5004] Time 0.240 ( 0.243) Data 0.026 ( 0.027) Loss 5.3969e+00 (6.3291e+00) Acc@1 7.42 ( 1.60) Acc@5 15.23 ( 5.53) +Epoch: [0][2804/5004] Time 0.238 ( 0.243) 
Data 0.025 ( 0.027) Loss 5.4478e+00 (6.3287e+00) Acc@1 5.47 ( 1.60) Acc@5 16.80 ( 5.54) +Epoch: [0][2805/5004] Time 0.236 ( 0.243) Data 0.026 ( 0.027) Loss 5.5034e+00 (6.3285e+00) Acc@1 4.69 ( 1.60) Acc@5 18.36 ( 5.54) +Epoch: [0][2806/5004] Time 0.212 ( 0.243) Data 0.029 ( 0.027) Loss 5.4595e+00 (6.3281e+00) Acc@1 5.86 ( 1.60) Acc@5 17.58 ( 5.54) +Epoch: [0][2807/5004] Time 0.238 ( 0.243) Data 0.055 ( 0.027) Loss 5.3213e+00 (6.3278e+00) Acc@1 5.86 ( 1.60) Acc@5 19.92 ( 5.55) +Epoch: [0][2808/5004] Time 0.238 ( 0.243) Data 0.055 ( 0.027) Loss 5.4445e+00 (6.3275e+00) Acc@1 5.08 ( 1.60) Acc@5 16.41 ( 5.55) +Epoch: [0][2809/5004] Time 0.238 ( 0.243) Data 0.055 ( 0.027) Loss 5.5040e+00 (6.3272e+00) Acc@1 4.69 ( 1.61) Acc@5 14.06 ( 5.56) +Epoch: [0][2810/5004] Time 0.236 ( 0.243) Data 0.054 ( 0.027) Loss 5.1955e+00 (6.3268e+00) Acc@1 8.98 ( 1.61) Acc@5 17.97 ( 5.56) +Epoch: [0][2811/5004] Time 0.243 ( 0.243) Data 0.055 ( 0.027) Loss 5.3897e+00 (6.3264e+00) Acc@1 4.69 ( 1.61) Acc@5 15.23 ( 5.56) +Epoch: [0][2812/5004] Time 0.233 ( 0.243) Data 0.052 ( 0.027) Loss 5.6003e+00 (6.3262e+00) Acc@1 5.08 ( 1.61) Acc@5 12.11 ( 5.57) +Epoch: [0][2813/5004] Time 0.242 ( 0.243) Data 0.058 ( 0.027) Loss 5.3884e+00 (6.3259e+00) Acc@1 7.81 ( 1.61) Acc@5 16.80 ( 5.57) +Epoch: [0][2814/5004] Time 0.234 ( 0.243) Data 0.054 ( 0.027) Loss 5.6413e+00 (6.3256e+00) Acc@1 7.42 ( 1.61) Acc@5 13.28 ( 5.57) +Epoch: [0][2815/5004] Time 0.240 ( 0.243) Data 0.057 ( 0.027) Loss 5.5729e+00 (6.3253e+00) Acc@1 1.56 ( 1.61) Acc@5 13.67 ( 5.58) +Epoch: [0][2816/5004] Time 0.237 ( 0.243) Data 0.055 ( 0.027) Loss 5.4203e+00 (6.3250e+00) Acc@1 4.69 ( 1.62) Acc@5 16.02 ( 5.58) +Epoch: [0][2817/5004] Time 0.238 ( 0.243) Data 0.056 ( 0.027) Loss 5.3905e+00 (6.3247e+00) Acc@1 3.91 ( 1.62) Acc@5 17.19 ( 5.58) +Epoch: [0][2818/5004] Time 0.241 ( 0.243) Data 0.056 ( 0.027) Loss 5.5983e+00 (6.3244e+00) Acc@1 5.47 ( 1.62) Acc@5 19.53 ( 5.59) +Epoch: [0][2819/5004] Time 0.235 ( 0.243) Data 0.054 ( 0.027) Loss 
5.5902e+00 (6.3242e+00) Acc@1 4.69 ( 1.62) Acc@5 14.45 ( 5.59) +Epoch: [0][2820/5004] Time 0.243 ( 0.243) Data 0.056 ( 0.027) Loss 5.5005e+00 (6.3239e+00) Acc@1 3.91 ( 1.62) Acc@5 17.58 ( 5.60) +Epoch: [0][2821/5004] Time 0.235 ( 0.243) Data 0.055 ( 0.027) Loss 5.5453e+00 (6.3236e+00) Acc@1 3.91 ( 1.62) Acc@5 13.28 ( 5.60) +Epoch: [0][2822/5004] Time 0.242 ( 0.243) Data 0.056 ( 0.027) Loss 5.5852e+00 (6.3233e+00) Acc@1 3.91 ( 1.62) Acc@5 17.19 ( 5.60) +Epoch: [0][2823/5004] Time 0.240 ( 0.243) Data 0.057 ( 0.027) Loss 5.3651e+00 (6.3230e+00) Acc@1 7.42 ( 1.62) Acc@5 19.14 ( 5.61) +Epoch: [0][2824/5004] Time 0.239 ( 0.243) Data 0.056 ( 0.027) Loss 5.2997e+00 (6.3226e+00) Acc@1 5.08 ( 1.62) Acc@5 18.36 ( 5.61) +Epoch: [0][2825/5004] Time 0.233 ( 0.243) Data 0.055 ( 0.027) Loss 5.3870e+00 (6.3223e+00) Acc@1 5.86 ( 1.63) Acc@5 16.41 ( 5.62) +Epoch: [0][2826/5004] Time 0.242 ( 0.243) Data 0.059 ( 0.027) Loss 5.4461e+00 (6.3220e+00) Acc@1 5.08 ( 1.63) Acc@5 16.41 ( 5.62) +Epoch: [0][2827/5004] Time 0.238 ( 0.243) Data 0.056 ( 0.027) Loss 5.3513e+00 (6.3217e+00) Acc@1 4.30 ( 1.63) Acc@5 14.84 ( 5.62) +Epoch: [0][2828/5004] Time 0.237 ( 0.243) Data 0.056 ( 0.027) Loss 5.4330e+00 (6.3213e+00) Acc@1 4.30 ( 1.63) Acc@5 16.80 ( 5.63) +Epoch: [0][2829/5004] Time 0.238 ( 0.243) Data 0.056 ( 0.027) Loss 5.4855e+00 (6.3210e+00) Acc@1 4.69 ( 1.63) Acc@5 12.89 ( 5.63) +Epoch: [0][2830/5004] Time 0.236 ( 0.243) Data 0.055 ( 0.027) Loss 5.5043e+00 (6.3208e+00) Acc@1 3.52 ( 1.63) Acc@5 16.80 ( 5.63) +Epoch: [0][2831/5004] Time 0.236 ( 0.243) Data 0.056 ( 0.027) Loss 5.4354e+00 (6.3204e+00) Acc@1 7.03 ( 1.63) Acc@5 15.62 ( 5.64) +Epoch: [0][2832/5004] Time 0.239 ( 0.243) Data 0.057 ( 0.027) Loss 5.5261e+00 (6.3202e+00) Acc@1 5.86 ( 1.63) Acc@5 12.50 ( 5.64) +Epoch: [0][2833/5004] Time 0.241 ( 0.243) Data 0.055 ( 0.027) Loss 5.6217e+00 (6.3199e+00) Acc@1 3.12 ( 1.63) Acc@5 10.16 ( 5.64) +Epoch: [0][2834/5004] Time 0.239 ( 0.243) Data 0.053 ( 0.027) Loss 5.4648e+00 (6.3196e+00) Acc@1 3.91 
( 1.64) Acc@5 16.80 ( 5.65) +Epoch: [0][2835/5004] Time 0.238 ( 0.243) Data 0.052 ( 0.027) Loss 5.4722e+00 (6.3193e+00) Acc@1 7.42 ( 1.64) Acc@5 13.67 ( 5.65) +Epoch: [0][2836/5004] Time 0.233 ( 0.243) Data 0.052 ( 0.027) Loss 5.5891e+00 (6.3191e+00) Acc@1 3.12 ( 1.64) Acc@5 13.28 ( 5.65) +Epoch: [0][2837/5004] Time 0.237 ( 0.243) Data 0.056 ( 0.027) Loss 5.4522e+00 (6.3188e+00) Acc@1 5.86 ( 1.64) Acc@5 14.06 ( 5.65) +Epoch: [0][2838/5004] Time 0.239 ( 0.243) Data 0.057 ( 0.027) Loss 5.4833e+00 (6.3185e+00) Acc@1 3.91 ( 1.64) Acc@5 14.84 ( 5.66) +Epoch: [0][2839/5004] Time 0.237 ( 0.243) Data 0.056 ( 0.027) Loss 5.5560e+00 (6.3182e+00) Acc@1 3.91 ( 1.64) Acc@5 17.19 ( 5.66) +Epoch: [0][2840/5004] Time 0.237 ( 0.243) Data 0.057 ( 0.027) Loss 5.5456e+00 (6.3179e+00) Acc@1 4.69 ( 1.64) Acc@5 16.80 ( 5.66) +Epoch: [0][2841/5004] Time 0.271 ( 0.243) Data 0.058 ( 0.027) Loss 5.4709e+00 (6.3176e+00) Acc@1 5.08 ( 1.64) Acc@5 14.45 ( 5.67) +Epoch: [0][2842/5004] Time 0.240 ( 0.243) Data 0.026 ( 0.027) Loss 5.4660e+00 (6.3173e+00) Acc@1 5.08 ( 1.64) Acc@5 17.19 ( 5.67) +Epoch: [0][2843/5004] Time 0.236 ( 0.243) Data 0.025 ( 0.027) Loss 5.4564e+00 (6.3170e+00) Acc@1 5.86 ( 1.65) Acc@5 16.02 ( 5.68) +Epoch: [0][2844/5004] Time 0.240 ( 0.243) Data 0.028 ( 0.027) Loss 5.2912e+00 (6.3167e+00) Acc@1 7.42 ( 1.65) Acc@5 19.92 ( 5.68) +Epoch: [0][2845/5004] Time 0.238 ( 0.243) Data 0.027 ( 0.027) Loss 5.6210e+00 (6.3164e+00) Acc@1 6.25 ( 1.65) Acc@5 12.11 ( 5.68) +Epoch: [0][2846/5004] Time 0.238 ( 0.243) Data 0.027 ( 0.027) Loss 5.5201e+00 (6.3161e+00) Acc@1 5.08 ( 1.65) Acc@5 12.50 ( 5.69) +Epoch: [0][2847/5004] Time 0.239 ( 0.243) Data 0.029 ( 0.027) Loss 5.7217e+00 (6.3159e+00) Acc@1 5.08 ( 1.65) Acc@5 12.89 ( 5.69) +Epoch: [0][2848/5004] Time 0.239 ( 0.243) Data 0.028 ( 0.027) Loss 5.5895e+00 (6.3157e+00) Acc@1 2.34 ( 1.65) Acc@5 15.23 ( 5.69) +Epoch: [0][2849/5004] Time 0.239 ( 0.243) Data 0.027 ( 0.027) Loss 5.4528e+00 (6.3154e+00) Acc@1 6.64 ( 1.65) Acc@5 18.75 ( 5.70) 
+Epoch: [0][2850/5004] Time 0.237 ( 0.243) Data 0.027 ( 0.027) Loss 5.6624e+00 (6.3151e+00) Acc@1 4.69 ( 1.66) Acc@5 10.94 ( 5.70) +Epoch: [0][2851/5004] Time 0.242 ( 0.243) Data 0.028 ( 0.027) Loss 5.5503e+00 (6.3149e+00) Acc@1 4.30 ( 1.66) Acc@5 16.41 ( 5.70) +Epoch: [0][2852/5004] Time 0.238 ( 0.243) Data 0.026 ( 0.027) Loss 5.5822e+00 (6.3146e+00) Acc@1 3.91 ( 1.66) Acc@5 15.62 ( 5.70) +Epoch: [0][2853/5004] Time 0.245 ( 0.243) Data 0.027 ( 0.027) Loss 5.3537e+00 (6.3143e+00) Acc@1 7.42 ( 1.66) Acc@5 17.97 ( 5.71) +Epoch: [0][2854/5004] Time 0.234 ( 0.243) Data 0.024 ( 0.027) Loss 5.5319e+00 (6.3140e+00) Acc@1 5.08 ( 1.66) Acc@5 14.84 ( 5.71) +Epoch: [0][2855/5004] Time 0.239 ( 0.243) Data 0.028 ( 0.027) Loss 5.4063e+00 (6.3137e+00) Acc@1 3.52 ( 1.66) Acc@5 17.58 ( 5.72) +Epoch: [0][2856/5004] Time 0.239 ( 0.243) Data 0.027 ( 0.027) Loss 5.3493e+00 (6.3133e+00) Acc@1 6.64 ( 1.66) Acc@5 17.19 ( 5.72) +Epoch: [0][2857/5004] Time 0.237 ( 0.243) Data 0.026 ( 0.027) Loss 5.4879e+00 (6.3131e+00) Acc@1 3.12 ( 1.66) Acc@5 17.58 ( 5.72) +Epoch: [0][2858/5004] Time 0.238 ( 0.243) Data 0.027 ( 0.027) Loss 5.5127e+00 (6.3128e+00) Acc@1 3.52 ( 1.66) Acc@5 13.28 ( 5.73) +Epoch: [0][2859/5004] Time 0.239 ( 0.243) Data 0.029 ( 0.027) Loss 5.3385e+00 (6.3124e+00) Acc@1 5.08 ( 1.67) Acc@5 18.75 ( 5.73) +Epoch: [0][2860/5004] Time 0.238 ( 0.243) Data 0.027 ( 0.027) Loss 5.4472e+00 (6.3121e+00) Acc@1 6.25 ( 1.67) Acc@5 16.80 ( 5.74) +Epoch: [0][2861/5004] Time 0.240 ( 0.243) Data 0.027 ( 0.027) Loss 5.5861e+00 (6.3119e+00) Acc@1 5.08 ( 1.67) Acc@5 11.33 ( 5.74) +Epoch: [0][2862/5004] Time 0.238 ( 0.243) Data 0.027 ( 0.027) Loss 5.4788e+00 (6.3116e+00) Acc@1 7.81 ( 1.67) Acc@5 17.97 ( 5.74) +Epoch: [0][2863/5004] Time 0.243 ( 0.243) Data 0.027 ( 0.027) Loss 5.4935e+00 (6.3113e+00) Acc@1 5.86 ( 1.67) Acc@5 15.23 ( 5.75) +Epoch: [0][2864/5004] Time 0.244 ( 0.243) Data 0.025 ( 0.027) Loss 5.5278e+00 (6.3110e+00) Acc@1 3.52 ( 1.67) Acc@5 17.19 ( 5.75) +Epoch: [0][2865/5004] Time 0.236 
( 0.243) Data 0.023 ( 0.027) Loss 5.4913e+00 (6.3107e+00) Acc@1 4.69 ( 1.67) Acc@5 18.36 ( 5.75) +Epoch: [0][2866/5004] Time 0.239 ( 0.243) Data 0.026 ( 0.027) Loss 5.3615e+00 (6.3104e+00) Acc@1 5.47 ( 1.67) Acc@5 22.27 ( 5.76) +Epoch: [0][2867/5004] Time 0.237 ( 0.243) Data 0.027 ( 0.027) Loss 5.5630e+00 (6.3102e+00) Acc@1 5.47 ( 1.68) Acc@5 14.06 ( 5.76) +Epoch: [0][2868/5004] Time 0.239 ( 0.243) Data 0.027 ( 0.027) Loss 5.4878e+00 (6.3099e+00) Acc@1 5.47 ( 1.68) Acc@5 19.14 ( 5.77) +Epoch: [0][2869/5004] Time 0.240 ( 0.243) Data 0.028 ( 0.027) Loss 5.3506e+00 (6.3095e+00) Acc@1 6.25 ( 1.68) Acc@5 17.58 ( 5.77) +Epoch: [0][2870/5004] Time 0.242 ( 0.243) Data 0.026 ( 0.027) Loss 5.3701e+00 (6.3092e+00) Acc@1 5.86 ( 1.68) Acc@5 14.06 ( 5.77) +Epoch: [0][2871/5004] Time 0.233 ( 0.243) Data 0.023 ( 0.027) Loss 5.3887e+00 (6.3089e+00) Acc@1 7.81 ( 1.68) Acc@5 17.58 ( 5.78) +Epoch: [0][2872/5004] Time 0.241 ( 0.243) Data 0.027 ( 0.027) Loss 5.3971e+00 (6.3086e+00) Acc@1 6.64 ( 1.68) Acc@5 18.36 ( 5.78) +Epoch: [0][2873/5004] Time 0.237 ( 0.243) Data 0.026 ( 0.027) Loss 5.4881e+00 (6.3083e+00) Acc@1 3.91 ( 1.68) Acc@5 14.45 ( 5.79) +Epoch: [0][2874/5004] Time 0.240 ( 0.243) Data 0.028 ( 0.027) Loss 5.5177e+00 (6.3080e+00) Acc@1 3.52 ( 1.69) Acc@5 13.67 ( 5.79) +Epoch: [0][2875/5004] Time 0.239 ( 0.243) Data 0.027 ( 0.027) Loss 5.2957e+00 (6.3077e+00) Acc@1 5.47 ( 1.69) Acc@5 17.19 ( 5.79) +Epoch: [0][2876/5004] Time 0.240 ( 0.243) Data 0.027 ( 0.027) Loss 5.4828e+00 (6.3074e+00) Acc@1 6.25 ( 1.69) Acc@5 16.80 ( 5.80) +Epoch: [0][2877/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.4084e+00 (6.3071e+00) Acc@1 7.03 ( 1.69) Acc@5 17.19 ( 5.80) +Epoch: [0][2878/5004] Time 0.238 ( 0.243) Data 0.024 ( 0.027) Loss 5.4352e+00 (6.3068e+00) Acc@1 5.47 ( 1.69) Acc@5 17.19 ( 5.80) +Epoch: [0][2879/5004] Time 0.238 ( 0.243) Data 0.026 ( 0.027) Loss 5.4042e+00 (6.3064e+00) Acc@1 6.64 ( 1.69) Acc@5 19.53 ( 5.81) +Epoch: [0][2880/5004] Time 0.244 ( 0.243) Data 0.027 ( 0.027) Loss 
5.3254e+00 (6.3061e+00) Acc@1 8.59 ( 1.70) Acc@5 17.97 ( 5.81) +Epoch: [0][2881/5004] Time 0.235 ( 0.243) Data 0.024 ( 0.027) Loss 5.4713e+00 (6.3058e+00) Acc@1 5.47 ( 1.70) Acc@5 12.50 ( 5.82) +Epoch: [0][2882/5004] Time 0.245 ( 0.243) Data 0.028 ( 0.027) Loss 5.3663e+00 (6.3055e+00) Acc@1 6.25 ( 1.70) Acc@5 17.19 ( 5.82) +Epoch: [0][2883/5004] Time 0.234 ( 0.243) Data 0.022 ( 0.027) Loss 5.5400e+00 (6.3052e+00) Acc@1 2.73 ( 1.70) Acc@5 14.06 ( 5.82) +Epoch: [0][2884/5004] Time 0.241 ( 0.243) Data 0.028 ( 0.027) Loss 5.3939e+00 (6.3049e+00) Acc@1 6.64 ( 1.70) Acc@5 19.14 ( 5.83) +Epoch: [0][2885/5004] Time 0.236 ( 0.243) Data 0.026 ( 0.027) Loss 5.4908e+00 (6.3046e+00) Acc@1 6.64 ( 1.70) Acc@5 14.84 ( 5.83) +Epoch: [0][2886/5004] Time 0.240 ( 0.243) Data 0.028 ( 0.027) Loss 5.4202e+00 (6.3043e+00) Acc@1 5.86 ( 1.70) Acc@5 17.19 ( 5.83) +Epoch: [0][2887/5004] Time 0.237 ( 0.243) Data 0.026 ( 0.027) Loss 5.3408e+00 (6.3040e+00) Acc@1 5.47 ( 1.71) Acc@5 17.97 ( 5.84) +Epoch: [0][2888/5004] Time 0.239 ( 0.243) Data 0.029 ( 0.027) Loss 5.3400e+00 (6.3036e+00) Acc@1 5.47 ( 1.71) Acc@5 16.80 ( 5.84) +Epoch: [0][2889/5004] Time 0.238 ( 0.243) Data 0.028 ( 0.027) Loss 5.3907e+00 (6.3033e+00) Acc@1 5.86 ( 1.71) Acc@5 16.41 ( 5.85) +Epoch: [0][2890/5004] Time 0.241 ( 0.243) Data 0.029 ( 0.027) Loss 5.3519e+00 (6.3030e+00) Acc@1 5.47 ( 1.71) Acc@5 20.70 ( 5.85) +Epoch: [0][2891/5004] Time 0.237 ( 0.243) Data 0.027 ( 0.027) Loss 5.3613e+00 (6.3027e+00) Acc@1 3.91 ( 1.71) Acc@5 16.41 ( 5.85) +Epoch: [0][2892/5004] Time 0.245 ( 0.243) Data 0.028 ( 0.027) Loss 5.4650e+00 (6.3024e+00) Acc@1 5.86 ( 1.71) Acc@5 17.19 ( 5.86) +Epoch: [0][2893/5004] Time 0.232 ( 0.243) Data 0.022 ( 0.027) Loss 5.5757e+00 (6.3021e+00) Acc@1 4.30 ( 1.71) Acc@5 13.67 ( 5.86) +Epoch: [0][2894/5004] Time 0.241 ( 0.243) Data 0.028 ( 0.027) Loss 5.3454e+00 (6.3018e+00) Acc@1 6.25 ( 1.71) Acc@5 16.02 ( 5.86) +Epoch: [0][2895/5004] Time 0.239 ( 0.243) Data 0.029 ( 0.027) Loss 5.3719e+00 (6.3015e+00) Acc@1 3.91 
( 1.71) Acc@5 14.84 ( 5.87) +Epoch: [0][2896/5004] Time 0.242 ( 0.243) Data 0.028 ( 0.027) Loss 5.5288e+00 (6.3012e+00) Acc@1 5.86 ( 1.72) Acc@5 14.84 ( 5.87) +Epoch: [0][2897/5004] Time 0.239 ( 0.243) Data 0.025 ( 0.027) Loss 5.5250e+00 (6.3009e+00) Acc@1 3.52 ( 1.72) Acc@5 13.28 ( 5.87) +Epoch: [0][2898/5004] Time 0.244 ( 0.243) Data 0.025 ( 0.027) Loss 5.3922e+00 (6.3006e+00) Acc@1 3.91 ( 1.72) Acc@5 13.67 ( 5.88) +Epoch: [0][2899/5004] Time 0.243 ( 0.243) Data 0.023 ( 0.027) Loss 5.3985e+00 (6.3003e+00) Acc@1 5.86 ( 1.72) Acc@5 17.19 ( 5.88) +Epoch: [0][2900/5004] Time 0.240 ( 0.243) Data 0.025 ( 0.027) Loss 5.7236e+00 (6.3001e+00) Acc@1 5.86 ( 1.72) Acc@5 12.89 ( 5.88) +Epoch: [0][2901/5004] Time 0.242 ( 0.243) Data 0.025 ( 0.027) Loss 5.2940e+00 (6.2998e+00) Acc@1 7.03 ( 1.72) Acc@5 19.92 ( 5.89) +Epoch: [0][2902/5004] Time 0.247 ( 0.243) Data 0.024 ( 0.027) Loss 5.5180e+00 (6.2995e+00) Acc@1 3.12 ( 1.72) Acc@5 11.33 ( 5.89) +Epoch: [0][2903/5004] Time 0.241 ( 0.243) Data 0.021 ( 0.027) Loss 5.4061e+00 (6.2992e+00) Acc@1 6.25 ( 1.72) Acc@5 16.41 ( 5.89) +Epoch: [0][2904/5004] Time 0.235 ( 0.243) Data 0.021 ( 0.027) Loss 5.4278e+00 (6.2989e+00) Acc@1 3.12 ( 1.72) Acc@5 14.45 ( 5.90) +Epoch: [0][2905/5004] Time 0.239 ( 0.243) Data 0.027 ( 0.027) Loss 5.5769e+00 (6.2987e+00) Acc@1 4.30 ( 1.73) Acc@5 17.19 ( 5.90) +Epoch: [0][2906/5004] Time 0.237 ( 0.243) Data 0.026 ( 0.027) Loss 5.4313e+00 (6.2984e+00) Acc@1 5.47 ( 1.73) Acc@5 14.06 ( 5.90) +Epoch: [0][2907/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 5.3502e+00 (6.2980e+00) Acc@1 6.25 ( 1.73) Acc@5 20.70 ( 5.91) +Epoch: [0][2908/5004] Time 0.241 ( 0.242) Data 0.028 ( 0.027) Loss 5.4663e+00 (6.2977e+00) Acc@1 4.30 ( 1.73) Acc@5 14.45 ( 5.91) +Epoch: [0][2909/5004] Time 0.237 ( 0.242) Data 0.026 ( 0.027) Loss 5.4264e+00 (6.2974e+00) Acc@1 6.25 ( 1.73) Acc@5 16.80 ( 5.91) +Epoch: [0][2910/5004] Time 0.237 ( 0.242) Data 0.026 ( 0.027) Loss 5.3804e+00 (6.2971e+00) Acc@1 6.25 ( 1.73) Acc@5 17.97 ( 5.92) 
+Epoch: [0][2911/5004] Time 0.243 ( 0.242) Data 0.028 ( 0.027) Loss 5.4622e+00 (6.2968e+00) Acc@1 6.64 ( 1.73) Acc@5 14.45 ( 5.92) +Epoch: [0][2912/5004] Time 0.236 ( 0.242) Data 0.025 ( 0.027) Loss 5.2964e+00 (6.2965e+00) Acc@1 5.86 ( 1.74) Acc@5 17.19 ( 5.92) +Epoch: [0][2913/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 5.3137e+00 (6.2962e+00) Acc@1 4.69 ( 1.74) Acc@5 15.23 ( 5.93) +Epoch: [0][2914/5004] Time 0.237 ( 0.242) Data 0.025 ( 0.027) Loss 5.4008e+00 (6.2959e+00) Acc@1 3.12 ( 1.74) Acc@5 13.67 ( 5.93) +Epoch: [0][2915/5004] Time 0.237 ( 0.242) Data 0.027 ( 0.027) Loss 5.4299e+00 (6.2956e+00) Acc@1 3.12 ( 1.74) Acc@5 14.06 ( 5.93) +Epoch: [0][2916/5004] Time 0.241 ( 0.242) Data 0.028 ( 0.027) Loss 5.3693e+00 (6.2952e+00) Acc@1 5.86 ( 1.74) Acc@5 13.28 ( 5.94) +Epoch: [0][2917/5004] Time 0.236 ( 0.242) Data 0.026 ( 0.027) Loss 5.4160e+00 (6.2949e+00) Acc@1 4.30 ( 1.74) Acc@5 12.89 ( 5.94) +Epoch: [0][2918/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 5.3446e+00 (6.2946e+00) Acc@1 5.86 ( 1.74) Acc@5 15.23 ( 5.94) +Epoch: [0][2919/5004] Time 0.236 ( 0.242) Data 0.027 ( 0.027) Loss 5.3282e+00 (6.2943e+00) Acc@1 5.47 ( 1.74) Acc@5 19.53 ( 5.95) +Epoch: [0][2920/5004] Time 0.241 ( 0.242) Data 0.029 ( 0.027) Loss 5.4229e+00 (6.2940e+00) Acc@1 3.91 ( 1.74) Acc@5 16.41 ( 5.95) +Epoch: [0][2921/5004] Time 0.238 ( 0.242) Data 0.026 ( 0.027) Loss 5.3748e+00 (6.2937e+00) Acc@1 4.30 ( 1.74) Acc@5 19.53 ( 5.95) +Epoch: [0][2922/5004] Time 0.237 ( 0.242) Data 0.026 ( 0.027) Loss 5.3991e+00 (6.2934e+00) Acc@1 7.42 ( 1.75) Acc@5 19.92 ( 5.96) +Epoch: [0][2923/5004] Time 0.245 ( 0.242) Data 0.027 ( 0.027) Loss 5.2430e+00 (6.2930e+00) Acc@1 8.20 ( 1.75) Acc@5 20.31 ( 5.96) +Epoch: [0][2924/5004] Time 0.234 ( 0.242) Data 0.021 ( 0.027) Loss 5.2679e+00 (6.2927e+00) Acc@1 6.25 ( 1.75) Acc@5 17.97 ( 5.97) +Epoch: [0][2925/5004] Time 0.237 ( 0.242) Data 0.026 ( 0.027) Loss 5.4037e+00 (6.2923e+00) Acc@1 7.42 ( 1.75) Acc@5 17.58 ( 5.97) +Epoch: [0][2926/5004] Time 0.237 
( 0.242) Data 0.028 ( 0.027) Loss 5.5488e+00 (6.2921e+00) Acc@1 3.52 ( 1.75) Acc@5 13.67 ( 5.97) +Epoch: [0][2927/5004] Time 0.240 ( 0.242) Data 0.029 ( 0.027) Loss 5.4912e+00 (6.2918e+00) Acc@1 4.69 ( 1.75) Acc@5 15.62 ( 5.98) +Epoch: [0][2928/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 5.2646e+00 (6.2915e+00) Acc@1 7.03 ( 1.75) Acc@5 19.92 ( 5.98) +Epoch: [0][2929/5004] Time 0.234 ( 0.242) Data 0.023 ( 0.027) Loss 5.3080e+00 (6.2911e+00) Acc@1 8.59 ( 1.76) Acc@5 21.88 ( 5.99) +Epoch: [0][2930/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 5.3856e+00 (6.2908e+00) Acc@1 7.42 ( 1.76) Acc@5 16.02 ( 5.99) +Epoch: [0][2931/5004] Time 0.237 ( 0.242) Data 0.027 ( 0.027) Loss 5.4552e+00 (6.2905e+00) Acc@1 5.47 ( 1.76) Acc@5 15.23 ( 5.99) +Epoch: [0][2932/5004] Time 0.240 ( 0.242) Data 0.029 ( 0.027) Loss 5.3989e+00 (6.2902e+00) Acc@1 5.86 ( 1.76) Acc@5 16.02 ( 6.00) +Epoch: [0][2933/5004] Time 0.237 ( 0.242) Data 0.026 ( 0.027) Loss 5.5097e+00 (6.2900e+00) Acc@1 5.47 ( 1.76) Acc@5 17.58 ( 6.00) +Epoch: [0][2934/5004] Time 0.238 ( 0.242) Data 0.028 ( 0.027) Loss 5.5043e+00 (6.2897e+00) Acc@1 6.25 ( 1.76) Acc@5 16.02 ( 6.01) +Epoch: [0][2935/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 5.4710e+00 (6.2894e+00) Acc@1 4.69 ( 1.77) Acc@5 14.06 ( 6.01) +Epoch: [0][2936/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 5.2499e+00 (6.2891e+00) Acc@1 7.03 ( 1.77) Acc@5 19.14 ( 6.01) +Epoch: [0][2937/5004] Time 0.243 ( 0.242) Data 0.027 ( 0.027) Loss 5.3945e+00 (6.2888e+00) Acc@1 5.86 ( 1.77) Acc@5 16.02 ( 6.02) +Epoch: [0][2938/5004] Time 0.238 ( 0.242) Data 0.024 ( 0.027) Loss 5.4260e+00 (6.2885e+00) Acc@1 5.47 ( 1.77) Acc@5 16.80 ( 6.02) +Epoch: [0][2939/5004] Time 0.241 ( 0.242) Data 0.025 ( 0.027) Loss 5.2636e+00 (6.2881e+00) Acc@1 5.47 ( 1.77) Acc@5 20.70 ( 6.02) +Epoch: [0][2940/5004] Time 0.241 ( 0.242) Data 0.023 ( 0.027) Loss 5.4050e+00 (6.2878e+00) Acc@1 3.91 ( 1.77) Acc@5 14.06 ( 6.03) +Epoch: [0][2941/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 
5.2780e+00 (6.2875e+00) Acc@1 5.86 ( 1.77) Acc@5 18.75 ( 6.03) +Epoch: [0][2942/5004] Time 0.238 ( 0.242) Data 0.023 ( 0.027) Loss 5.4545e+00 (6.2872e+00) Acc@1 7.03 ( 1.78) Acc@5 14.84 ( 6.03) +Epoch: [0][2943/5004] Time 0.235 ( 0.242) Data 0.025 ( 0.027) Loss 5.2939e+00 (6.2869e+00) Acc@1 7.03 ( 1.78) Acc@5 19.53 ( 6.04) +Epoch: [0][2944/5004] Time 0.241 ( 0.242) Data 0.028 ( 0.027) Loss 5.4338e+00 (6.2866e+00) Acc@1 3.52 ( 1.78) Acc@5 15.62 ( 6.04) +Epoch: [0][2945/5004] Time 0.236 ( 0.242) Data 0.027 ( 0.027) Loss 5.3662e+00 (6.2863e+00) Acc@1 7.03 ( 1.78) Acc@5 17.97 ( 6.05) +Epoch: [0][2946/5004] Time 0.239 ( 0.242) Data 0.029 ( 0.027) Loss 5.3914e+00 (6.2860e+00) Acc@1 8.59 ( 1.78) Acc@5 19.92 ( 6.05) +Epoch: [0][2947/5004] Time 0.237 ( 0.242) Data 0.028 ( 0.027) Loss 5.5127e+00 (6.2857e+00) Acc@1 7.81 ( 1.78) Acc@5 16.41 ( 6.06) +Epoch: [0][2948/5004] Time 0.241 ( 0.242) Data 0.029 ( 0.027) Loss 5.3989e+00 (6.2854e+00) Acc@1 5.47 ( 1.79) Acc@5 14.84 ( 6.06) +Epoch: [0][2949/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 5.4034e+00 (6.2851e+00) Acc@1 5.47 ( 1.79) Acc@5 18.75 ( 6.06) +Epoch: [0][2950/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 5.3608e+00 (6.2848e+00) Acc@1 4.69 ( 1.79) Acc@5 16.02 ( 6.07) +Epoch: [0][2951/5004] Time 0.239 ( 0.242) Data 0.024 ( 0.027) Loss 5.4795e+00 (6.2845e+00) Acc@1 5.86 ( 1.79) Acc@5 15.62 ( 6.07) +Epoch: [0][2952/5004] Time 0.236 ( 0.242) Data 0.025 ( 0.027) Loss 5.4331e+00 (6.2842e+00) Acc@1 7.03 ( 1.79) Acc@5 13.67 ( 6.07) +Epoch: [0][2953/5004] Time 0.247 ( 0.242) Data 0.027 ( 0.027) Loss 5.3470e+00 (6.2839e+00) Acc@1 5.86 ( 1.79) Acc@5 17.19 ( 6.08) +Epoch: [0][2954/5004] Time 0.244 ( 0.242) Data 0.025 ( 0.027) Loss 5.3208e+00 (6.2836e+00) Acc@1 7.42 ( 1.79) Acc@5 19.14 ( 6.08) +Epoch: [0][2955/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 5.4101e+00 (6.2833e+00) Acc@1 5.47 ( 1.79) Acc@5 17.58 ( 6.08) +Epoch: [0][2956/5004] Time 0.242 ( 0.242) Data 0.025 ( 0.027) Loss 5.4228e+00 (6.2830e+00) Acc@1 5.47 
( 1.80) Acc@5 16.41 ( 6.09) +Epoch: [0][2957/5004] Time 0.245 ( 0.242) Data 0.025 ( 0.027) Loss 5.5065e+00 (6.2827e+00) Acc@1 4.30 ( 1.80) Acc@5 14.45 ( 6.09) +Epoch: [0][2958/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 5.4883e+00 (6.2825e+00) Acc@1 6.25 ( 1.80) Acc@5 19.14 ( 6.09) +Epoch: [0][2959/5004] Time 0.237 ( 0.242) Data 0.021 ( 0.027) Loss 5.3340e+00 (6.2821e+00) Acc@1 6.64 ( 1.80) Acc@5 17.97 ( 6.10) +Epoch: [0][2960/5004] Time 0.246 ( 0.242) Data 0.025 ( 0.027) Loss 5.3206e+00 (6.2818e+00) Acc@1 5.47 ( 1.80) Acc@5 16.41 ( 6.10) +Epoch: [0][2961/5004] Time 0.248 ( 0.242) Data 0.023 ( 0.027) Loss 5.4163e+00 (6.2815e+00) Acc@1 5.86 ( 1.80) Acc@5 12.89 ( 6.10) +Epoch: [0][2962/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.027) Loss 5.4716e+00 (6.2812e+00) Acc@1 4.69 ( 1.80) Acc@5 14.84 ( 6.11) +Epoch: [0][2963/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 5.6193e+00 (6.2810e+00) Acc@1 6.25 ( 1.81) Acc@5 11.72 ( 6.11) +Epoch: [0][2964/5004] Time 0.244 ( 0.242) Data 0.025 ( 0.027) Loss 5.2621e+00 (6.2807e+00) Acc@1 6.25 ( 1.81) Acc@5 17.19 ( 6.11) +Epoch: [0][2965/5004] Time 0.246 ( 0.242) Data 0.025 ( 0.027) Loss 5.4024e+00 (6.2804e+00) Acc@1 3.91 ( 1.81) Acc@5 14.84 ( 6.12) +Epoch: [0][2966/5004] Time 0.241 ( 0.242) Data 0.021 ( 0.027) Loss 5.3712e+00 (6.2801e+00) Acc@1 4.30 ( 1.81) Acc@5 16.02 ( 6.12) +Epoch: [0][2967/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.027) Loss 5.3353e+00 (6.2798e+00) Acc@1 6.64 ( 1.81) Acc@5 16.41 ( 6.12) +Epoch: [0][2968/5004] Time 0.249 ( 0.242) Data 0.024 ( 0.027) Loss 5.5602e+00 (6.2795e+00) Acc@1 5.08 ( 1.81) Acc@5 13.28 ( 6.12) +Epoch: [0][2969/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 5.4046e+00 (6.2792e+00) Acc@1 4.30 ( 1.81) Acc@5 18.75 ( 6.13) +Epoch: [0][2970/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 5.4133e+00 (6.2789e+00) Acc@1 5.47 ( 1.81) Acc@5 15.62 ( 6.13) +Epoch: [0][2971/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 5.4720e+00 (6.2787e+00) Acc@1 5.08 ( 1.81) Acc@5 17.58 ( 6.14) 
+Epoch: [0][2972/5004] Time 0.246 ( 0.242) Data 0.023 ( 0.027) Loss 5.3278e+00 (6.2783e+00) Acc@1 4.69 ( 1.82) Acc@5 17.19 ( 6.14) +Epoch: [0][2973/5004] Time 0.254 ( 0.242) Data 0.023 ( 0.027) Loss 5.2897e+00 (6.2780e+00) Acc@1 6.64 ( 1.82) Acc@5 14.45 ( 6.14) +Epoch: [0][2974/5004] Time 0.241 ( 0.242) Data 0.017 ( 0.027) Loss 5.4048e+00 (6.2777e+00) Acc@1 5.86 ( 1.82) Acc@5 14.45 ( 6.15) +Epoch: [0][2975/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 5.4188e+00 (6.2774e+00) Acc@1 3.91 ( 1.82) Acc@5 15.62 ( 6.15) +Epoch: [0][2976/5004] Time 0.241 ( 0.242) Data 0.024 ( 0.027) Loss 5.6094e+00 (6.2772e+00) Acc@1 3.12 ( 1.82) Acc@5 13.67 ( 6.15) +Epoch: [0][2977/5004] Time 0.244 ( 0.242) Data 0.025 ( 0.027) Loss 5.2076e+00 (6.2768e+00) Acc@1 7.81 ( 1.82) Acc@5 19.92 ( 6.16) +Epoch: [0][2978/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 5.2628e+00 (6.2765e+00) Acc@1 8.20 ( 1.82) Acc@5 19.92 ( 6.16) +Epoch: [0][2979/5004] Time 0.244 ( 0.242) Data 0.025 ( 0.027) Loss 5.4837e+00 (6.2762e+00) Acc@1 6.25 ( 1.82) Acc@5 16.80 ( 6.16) +Epoch: [0][2980/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.027) Loss 5.2911e+00 (6.2759e+00) Acc@1 6.25 ( 1.83) Acc@5 16.80 ( 6.17) +Epoch: [0][2981/5004] Time 0.249 ( 0.242) Data 0.024 ( 0.027) Loss 5.3177e+00 (6.2756e+00) Acc@1 7.42 ( 1.83) Acc@5 17.58 ( 6.17) +Epoch: [0][2982/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 5.1802e+00 (6.2752e+00) Acc@1 6.64 ( 1.83) Acc@5 19.53 ( 6.18) +Epoch: [0][2983/5004] Time 0.238 ( 0.242) Data 0.023 ( 0.027) Loss 5.3232e+00 (6.2749e+00) Acc@1 6.64 ( 1.83) Acc@5 17.58 ( 6.18) +Epoch: [0][2984/5004] Time 0.245 ( 0.242) Data 0.027 ( 0.027) Loss 5.3511e+00 (6.2746e+00) Acc@1 4.69 ( 1.83) Acc@5 18.75 ( 6.18) +Epoch: [0][2985/5004] Time 0.234 ( 0.242) Data 0.022 ( 0.027) Loss 5.2323e+00 (6.2742e+00) Acc@1 5.86 ( 1.83) Acc@5 19.14 ( 6.19) +Epoch: [0][2986/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 5.3964e+00 (6.2739e+00) Acc@1 7.03 ( 1.84) Acc@5 16.41 ( 6.19) +Epoch: [0][2987/5004] Time 0.240 
( 0.242) Data 0.026 ( 0.027) Loss 5.3551e+00 (6.2736e+00) Acc@1 6.25 ( 1.84) Acc@5 18.36 ( 6.20) +Epoch: [0][2988/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 5.4209e+00 (6.2734e+00) Acc@1 4.69 ( 1.84) Acc@5 17.19 ( 6.20) +Epoch: [0][2989/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 5.4102e+00 (6.2731e+00) Acc@1 3.91 ( 1.84) Acc@5 15.62 ( 6.20) +Epoch: [0][2990/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 5.4260e+00 (6.2728e+00) Acc@1 3.12 ( 1.84) Acc@5 13.67 ( 6.21) +Epoch: [0][2991/5004] Time 0.238 ( 0.242) Data 0.026 ( 0.027) Loss 5.3026e+00 (6.2725e+00) Acc@1 6.64 ( 1.84) Acc@5 17.58 ( 6.21) +Epoch: [0][2992/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 5.3540e+00 (6.2721e+00) Acc@1 7.03 ( 1.84) Acc@5 19.92 ( 6.21) +Epoch: [0][2993/5004] Time 0.239 ( 0.242) Data 0.026 ( 0.027) Loss 5.2809e+00 (6.2718e+00) Acc@1 5.86 ( 1.84) Acc@5 16.41 ( 6.22) +Epoch: [0][2994/5004] Time 0.239 ( 0.242) Data 0.026 ( 0.027) Loss 5.2714e+00 (6.2715e+00) Acc@1 5.86 ( 1.85) Acc@5 16.02 ( 6.22) +Epoch: [0][2995/5004] Time 0.243 ( 0.242) Data 0.027 ( 0.027) Loss 5.3237e+00 (6.2712e+00) Acc@1 6.64 ( 1.85) Acc@5 17.19 ( 6.22) +Epoch: [0][2996/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 5.4204e+00 (6.2709e+00) Acc@1 7.42 ( 1.85) Acc@5 18.36 ( 6.23) +Epoch: [0][2997/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 5.2262e+00 (6.2705e+00) Acc@1 7.81 ( 1.85) Acc@5 18.75 ( 6.23) +Epoch: [0][2998/5004] Time 0.249 ( 0.242) Data 0.038 ( 0.027) Loss 5.2698e+00 (6.2702e+00) Acc@1 4.69 ( 1.85) Acc@5 17.19 ( 6.24) +Epoch: [0][2999/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 5.4529e+00 (6.2699e+00) Acc@1 5.86 ( 1.85) Acc@5 12.11 ( 6.24) +Epoch: [0][3000/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 5.2379e+00 (6.2696e+00) Acc@1 10.94 ( 1.86) Acc@5 20.70 ( 6.24) +Epoch: [0][3001/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 5.5157e+00 (6.2693e+00) Acc@1 2.73 ( 1.86) Acc@5 14.06 ( 6.25) +Epoch: [0][3002/5004] Time 0.243 ( 0.242) Data 0.027 ( 0.027) Loss 
5.4178e+00 (6.2691e+00) Acc@1 6.64 ( 1.86) Acc@5 17.19 ( 6.25)
+Epoch: [0][3003/5004] Time 0.239 ( 0.242) Data 0.025 ( 0.027) Loss 5.2853e+00 (6.2687e+00) Acc@1 6.64 ( 1.86) Acc@5 19.92 ( 6.25)
+Epoch: [0][3004/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 5.4097e+00 (6.2684e+00) Acc@1 6.25 ( 1.86) Acc@5 15.23 ( 6.26)
+Epoch: [0][3005/5004] Time 0.244 ( 0.242) Data 0.027 ( 0.027) Loss 5.3940e+00 (6.2681e+00) Acc@1 5.08 ( 1.86) Acc@5 12.11 ( 6.26)
+[... iterations 3006-3303 omitted: per-iteration Time holds near the 0.242 s average, the running Loss falls from 6.2678 to 6.1810, Acc@1 rises from 1.86 to 2.27, and Acc@5 from 6.26 to 7.36 ...]
+Epoch: [0][3304/5004] Time 0.238 ( 0.242) Data 0.035 ( 0.027) Loss 5.1893e+00 (6.1807e+00) Acc@1 7.42 ( 2.27) Acc@5 23.44 ( 7.37)
+Epoch: [0][3305/5004] Time 0.238 ( 0.242) Data 0.035 ( 0.027) Loss 5.1049e+00 (6.1804e+00) Acc@1 7.03 ( 2.27) Acc@5 22.66 ( 7.37)
+Epoch: [0][3306/5004] Time 0.238 ( 0.242) Data 0.034 ( 0.027) Loss 5.1826e+00 (6.1801e+00) Acc@1 7.42 ( 2.27) Acc@5 21.48 ( 7.37)
+Epoch: [0][3307/5004] Time 0.237 ( 0.242) Data 0.033 ( 0.027) Loss 
5.1057e+00 (6.1797e+00) Acc@1 5.08 ( 2.27) Acc@5 20.70 ( 7.38) +Epoch: [0][3308/5004] Time 0.238 ( 0.242) Data 0.036 ( 0.027) Loss 5.2461e+00 (6.1795e+00) Acc@1 6.64 ( 2.27) Acc@5 18.75 ( 7.38) +Epoch: [0][3309/5004] Time 0.246 ( 0.242) Data 0.036 ( 0.027) Loss 5.2087e+00 (6.1792e+00) Acc@1 5.47 ( 2.27) Acc@5 20.70 ( 7.39) +Epoch: [0][3310/5004] Time 0.239 ( 0.242) Data 0.036 ( 0.027) Loss 5.1260e+00 (6.1788e+00) Acc@1 9.38 ( 2.27) Acc@5 19.92 ( 7.39) +Epoch: [0][3311/5004] Time 0.237 ( 0.242) Data 0.036 ( 0.027) Loss 5.1644e+00 (6.1785e+00) Acc@1 7.81 ( 2.28) Acc@5 19.53 ( 7.39) +Epoch: [0][3312/5004] Time 0.239 ( 0.242) Data 0.036 ( 0.027) Loss 5.2802e+00 (6.1783e+00) Acc@1 8.98 ( 2.28) Acc@5 20.70 ( 7.40) +Epoch: [0][3313/5004] Time 0.239 ( 0.242) Data 0.036 ( 0.027) Loss 5.1672e+00 (6.1780e+00) Acc@1 7.42 ( 2.28) Acc@5 20.70 ( 7.40) +Epoch: [0][3314/5004] Time 0.235 ( 0.242) Data 0.034 ( 0.027) Loss 5.0806e+00 (6.1776e+00) Acc@1 9.77 ( 2.28) Acc@5 21.88 ( 7.41) +Epoch: [0][3315/5004] Time 0.239 ( 0.242) Data 0.036 ( 0.027) Loss 5.2014e+00 (6.1773e+00) Acc@1 3.12 ( 2.28) Acc@5 21.88 ( 7.41) +Epoch: [0][3316/5004] Time 0.238 ( 0.242) Data 0.036 ( 0.027) Loss 5.2962e+00 (6.1771e+00) Acc@1 4.69 ( 2.28) Acc@5 16.41 ( 7.41) +Epoch: [0][3317/5004] Time 0.237 ( 0.242) Data 0.035 ( 0.027) Loss 5.0462e+00 (6.1767e+00) Acc@1 9.77 ( 2.29) Acc@5 24.61 ( 7.42) +Epoch: [0][3318/5004] Time 0.239 ( 0.242) Data 0.035 ( 0.027) Loss 5.0174e+00 (6.1764e+00) Acc@1 7.81 ( 2.29) Acc@5 24.22 ( 7.42) +Epoch: [0][3319/5004] Time 0.238 ( 0.242) Data 0.035 ( 0.027) Loss 5.3311e+00 (6.1761e+00) Acc@1 5.47 ( 2.29) Acc@5 17.58 ( 7.43) +Epoch: [0][3320/5004] Time 0.236 ( 0.242) Data 0.036 ( 0.027) Loss 5.1525e+00 (6.1758e+00) Acc@1 9.77 ( 2.29) Acc@5 24.61 ( 7.43) +Epoch: [0][3321/5004] Time 0.239 ( 0.242) Data 0.036 ( 0.027) Loss 5.2538e+00 (6.1755e+00) Acc@1 4.30 ( 2.29) Acc@5 23.44 ( 7.44) +Epoch: [0][3322/5004] Time 0.237 ( 0.242) Data 0.035 ( 0.027) Loss 5.3946e+00 (6.1753e+00) Acc@1 6.64 
( 2.29) Acc@5 20.31 ( 7.44) +Epoch: [0][3323/5004] Time 0.239 ( 0.242) Data 0.036 ( 0.027) Loss 5.1940e+00 (6.1750e+00) Acc@1 5.86 ( 2.29) Acc@5 17.97 ( 7.44) +Epoch: [0][3324/5004] Time 0.240 ( 0.242) Data 0.036 ( 0.027) Loss 5.1160e+00 (6.1747e+00) Acc@1 7.42 ( 2.30) Acc@5 21.88 ( 7.45) +Epoch: [0][3325/5004] Time 0.238 ( 0.242) Data 0.037 ( 0.027) Loss 5.2953e+00 (6.1744e+00) Acc@1 5.08 ( 2.30) Acc@5 19.92 ( 7.45) +Epoch: [0][3326/5004] Time 0.239 ( 0.242) Data 0.035 ( 0.027) Loss 5.3710e+00 (6.1742e+00) Acc@1 6.25 ( 2.30) Acc@5 15.62 ( 7.45) +Epoch: [0][3327/5004] Time 0.241 ( 0.242) Data 0.033 ( 0.027) Loss 5.2212e+00 (6.1739e+00) Acc@1 7.81 ( 2.30) Acc@5 19.92 ( 7.46) +Epoch: [0][3328/5004] Time 0.236 ( 0.242) Data 0.032 ( 0.027) Loss 5.2767e+00 (6.1736e+00) Acc@1 5.86 ( 2.30) Acc@5 14.06 ( 7.46) +Epoch: [0][3329/5004] Time 0.239 ( 0.242) Data 0.034 ( 0.027) Loss 5.2039e+00 (6.1733e+00) Acc@1 7.03 ( 2.30) Acc@5 21.88 ( 7.46) +Epoch: [0][3330/5004] Time 0.235 ( 0.242) Data 0.032 ( 0.027) Loss 5.3114e+00 (6.1731e+00) Acc@1 9.38 ( 2.30) Acc@5 17.97 ( 7.47) +Epoch: [0][3331/5004] Time 0.239 ( 0.242) Data 0.035 ( 0.027) Loss 4.9402e+00 (6.1727e+00) Acc@1 9.38 ( 2.31) Acc@5 23.44 ( 7.47) +Epoch: [0][3332/5004] Time 0.235 ( 0.242) Data 0.034 ( 0.027) Loss 5.0742e+00 (6.1724e+00) Acc@1 9.38 ( 2.31) Acc@5 23.05 ( 7.48) +Epoch: [0][3333/5004] Time 0.241 ( 0.242) Data 0.036 ( 0.027) Loss 5.2774e+00 (6.1721e+00) Acc@1 7.42 ( 2.31) Acc@5 21.48 ( 7.48) +Epoch: [0][3334/5004] Time 0.234 ( 0.242) Data 0.032 ( 0.027) Loss 5.4668e+00 (6.1719e+00) Acc@1 8.59 ( 2.31) Acc@5 17.19 ( 7.48) +Epoch: [0][3335/5004] Time 0.237 ( 0.242) Data 0.036 ( 0.027) Loss 4.9199e+00 (6.1715e+00) Acc@1 8.20 ( 2.31) Acc@5 25.39 ( 7.49) +Epoch: [0][3336/5004] Time 0.246 ( 0.242) Data 0.036 ( 0.027) Loss 5.0854e+00 (6.1712e+00) Acc@1 6.64 ( 2.31) Acc@5 22.27 ( 7.49) +Epoch: [0][3337/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 5.1059e+00 (6.1709e+00) Acc@1 8.59 ( 2.32) Acc@5 23.05 ( 7.50) 
+Epoch: [0][3338/5004] Time 0.238 ( 0.242) Data 0.026 ( 0.027) Loss 5.1247e+00 (6.1706e+00) Acc@1 7.03 ( 2.32) Acc@5 21.88 ( 7.50) +Epoch: [0][3339/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 5.3543e+00 (6.1703e+00) Acc@1 6.64 ( 2.32) Acc@5 18.36 ( 7.51) +Epoch: [0][3340/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 5.1731e+00 (6.1700e+00) Acc@1 6.64 ( 2.32) Acc@5 19.53 ( 7.51) +Epoch: [0][3341/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 5.2543e+00 (6.1697e+00) Acc@1 5.86 ( 2.32) Acc@5 19.92 ( 7.51) +Epoch: [0][3342/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 5.2259e+00 (6.1695e+00) Acc@1 8.59 ( 2.32) Acc@5 18.75 ( 7.52) +Epoch: [0][3343/5004] Time 0.238 ( 0.242) Data 0.026 ( 0.027) Loss 5.2601e+00 (6.1692e+00) Acc@1 8.98 ( 2.32) Acc@5 17.19 ( 7.52) +Epoch: [0][3344/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 5.1054e+00 (6.1689e+00) Acc@1 7.81 ( 2.33) Acc@5 19.92 ( 7.52) +Epoch: [0][3345/5004] Time 0.235 ( 0.242) Data 0.024 ( 0.027) Loss 5.2202e+00 (6.1686e+00) Acc@1 8.20 ( 2.33) Acc@5 19.53 ( 7.53) +Epoch: [0][3346/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 5.2116e+00 (6.1683e+00) Acc@1 6.64 ( 2.33) Acc@5 20.70 ( 7.53) +Epoch: [0][3347/5004] Time 0.238 ( 0.242) Data 0.028 ( 0.027) Loss 5.3449e+00 (6.1681e+00) Acc@1 8.20 ( 2.33) Acc@5 16.80 ( 7.53) +Epoch: [0][3348/5004] Time 0.246 ( 0.242) Data 0.027 ( 0.027) Loss 5.1109e+00 (6.1677e+00) Acc@1 5.86 ( 2.33) Acc@5 21.48 ( 7.54) +Epoch: [0][3349/5004] Time 0.235 ( 0.242) Data 0.023 ( 0.027) Loss 5.1751e+00 (6.1674e+00) Acc@1 5.86 ( 2.33) Acc@5 20.31 ( 7.54) +Epoch: [0][3350/5004] Time 0.239 ( 0.242) Data 0.026 ( 0.027) Loss 5.2317e+00 (6.1672e+00) Acc@1 7.81 ( 2.34) Acc@5 21.88 ( 7.55) +Epoch: [0][3351/5004] Time 0.239 ( 0.242) Data 0.025 ( 0.027) Loss 5.2145e+00 (6.1669e+00) Acc@1 7.42 ( 2.34) Acc@5 19.14 ( 7.55) +Epoch: [0][3352/5004] Time 0.242 ( 0.242) Data 0.026 ( 0.027) Loss 5.1664e+00 (6.1666e+00) Acc@1 6.64 ( 2.34) Acc@5 19.92 ( 7.55) +Epoch: [0][3353/5004] Time 0.242 
( 0.242) Data 0.024 ( 0.027) Loss 5.1751e+00 (6.1663e+00) Acc@1 7.03 ( 2.34) Acc@5 18.36 ( 7.56) +Epoch: [0][3354/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.027) Loss 5.0659e+00 (6.1660e+00) Acc@1 8.98 ( 2.34) Acc@5 23.44 ( 7.56) +Epoch: [0][3355/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 5.2115e+00 (6.1657e+00) Acc@1 9.38 ( 2.34) Acc@5 17.19 ( 7.56) +Epoch: [0][3356/5004] Time 0.249 ( 0.242) Data 0.023 ( 0.027) Loss 5.1774e+00 (6.1654e+00) Acc@1 6.64 ( 2.34) Acc@5 20.70 ( 7.57) +Epoch: [0][3357/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 5.3628e+00 (6.1651e+00) Acc@1 6.25 ( 2.35) Acc@5 17.97 ( 7.57) +Epoch: [0][3358/5004] Time 0.256 ( 0.242) Data 0.021 ( 0.027) Loss 5.3434e+00 (6.1649e+00) Acc@1 6.64 ( 2.35) Acc@5 16.80 ( 7.57) +Epoch: [0][3359/5004] Time 0.245 ( 0.242) Data 0.020 ( 0.027) Loss 5.2857e+00 (6.1646e+00) Acc@1 6.25 ( 2.35) Acc@5 19.53 ( 7.58) +Epoch: [0][3360/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.027) Loss 5.1885e+00 (6.1643e+00) Acc@1 11.33 ( 2.35) Acc@5 23.44 ( 7.58) +Epoch: [0][3361/5004] Time 0.255 ( 0.242) Data 0.022 ( 0.027) Loss 5.1027e+00 (6.1640e+00) Acc@1 10.16 ( 2.35) Acc@5 20.31 ( 7.59) +Epoch: [0][3362/5004] Time 0.245 ( 0.242) Data 0.017 ( 0.027) Loss 5.4400e+00 (6.1638e+00) Acc@1 2.73 ( 2.35) Acc@5 16.02 ( 7.59) +Epoch: [0][3363/5004] Time 0.251 ( 0.242) Data 0.022 ( 0.027) Loss 5.2231e+00 (6.1635e+00) Acc@1 8.59 ( 2.36) Acc@5 22.27 ( 7.59) +Epoch: [0][3364/5004] Time 0.247 ( 0.242) Data 0.018 ( 0.027) Loss 5.1827e+00 (6.1632e+00) Acc@1 7.42 ( 2.36) Acc@5 22.27 ( 7.60) +Epoch: [0][3365/5004] Time 0.242 ( 0.242) Data 0.020 ( 0.027) Loss 5.2219e+00 (6.1630e+00) Acc@1 8.59 ( 2.36) Acc@5 19.53 ( 7.60) +Epoch: [0][3366/5004] Time 0.248 ( 0.242) Data 0.023 ( 0.027) Loss 5.1517e+00 (6.1627e+00) Acc@1 8.98 ( 2.36) Acc@5 22.66 ( 7.60) +Epoch: [0][3367/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 5.0581e+00 (6.1623e+00) Acc@1 8.98 ( 2.36) Acc@5 24.61 ( 7.61) +Epoch: [0][3368/5004] Time 0.257 ( 0.242) Data 0.021 ( 0.027) 
Loss 5.1117e+00 (6.1620e+00) Acc@1 8.59 ( 2.36) Acc@5 22.27 ( 7.61) +Epoch: [0][3369/5004] Time 0.251 ( 0.242) Data 0.016 ( 0.027) Loss 5.4010e+00 (6.1618e+00) Acc@1 5.08 ( 2.37) Acc@5 15.62 ( 7.62) +Epoch: [0][3370/5004] Time 0.247 ( 0.242) Data 0.016 ( 0.027) Loss 5.1922e+00 (6.1615e+00) Acc@1 8.20 ( 2.37) Acc@5 20.31 ( 7.62) +Epoch: [0][3371/5004] Time 0.245 ( 0.242) Data 0.020 ( 0.027) Loss 5.3058e+00 (6.1613e+00) Acc@1 7.03 ( 2.37) Acc@5 17.58 ( 7.62) +Epoch: [0][3372/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 5.2347e+00 (6.1610e+00) Acc@1 4.30 ( 2.37) Acc@5 17.19 ( 7.63) +Epoch: [0][3373/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 5.2640e+00 (6.1607e+00) Acc@1 6.64 ( 2.37) Acc@5 15.23 ( 7.63) +Epoch: [0][3374/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.027) Loss 5.2083e+00 (6.1604e+00) Acc@1 7.81 ( 2.37) Acc@5 25.00 ( 7.63) +Epoch: [0][3375/5004] Time 0.238 ( 0.242) Data 0.017 ( 0.027) Loss 5.2605e+00 (6.1602e+00) Acc@1 7.81 ( 2.37) Acc@5 23.05 ( 7.64) +Epoch: [0][3376/5004] Time 0.247 ( 0.242) Data 0.023 ( 0.027) Loss 5.1308e+00 (6.1599e+00) Acc@1 7.03 ( 2.37) Acc@5 24.61 ( 7.64) +Epoch: [0][3377/5004] Time 0.248 ( 0.242) Data 0.020 ( 0.027) Loss 5.1635e+00 (6.1596e+00) Acc@1 7.03 ( 2.38) Acc@5 21.09 ( 7.65) +Epoch: [0][3378/5004] Time 0.245 ( 0.242) Data 0.018 ( 0.027) Loss 5.2303e+00 (6.1593e+00) Acc@1 5.86 ( 2.38) Acc@5 20.70 ( 7.65) +Epoch: [0][3379/5004] Time 0.240 ( 0.242) Data 0.018 ( 0.027) Loss 5.1370e+00 (6.1590e+00) Acc@1 5.86 ( 2.38) Acc@5 22.27 ( 7.66) +Epoch: [0][3380/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 5.2376e+00 (6.1587e+00) Acc@1 5.86 ( 2.38) Acc@5 14.45 ( 7.66) +Epoch: [0][3381/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 5.0763e+00 (6.1584e+00) Acc@1 8.20 ( 2.38) Acc@5 23.83 ( 7.66) +Epoch: [0][3382/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 5.1912e+00 (6.1581e+00) Acc@1 8.20 ( 2.38) Acc@5 23.44 ( 7.67) +Epoch: [0][3383/5004] Time 0.250 ( 0.242) Data 0.022 ( 0.027) Loss 5.0964e+00 (6.1578e+00) Acc@1 
8.59 ( 2.38) Acc@5 23.44 ( 7.67) +Epoch: [0][3384/5004] Time 0.253 ( 0.242) Data 0.020 ( 0.027) Loss 5.0866e+00 (6.1575e+00) Acc@1 9.38 ( 2.39) Acc@5 22.27 ( 7.68) +Epoch: [0][3385/5004] Time 0.245 ( 0.242) Data 0.018 ( 0.027) Loss 5.0196e+00 (6.1571e+00) Acc@1 9.77 ( 2.39) Acc@5 21.09 ( 7.68) +Epoch: [0][3386/5004] Time 0.261 ( 0.242) Data 0.020 ( 0.027) Loss 5.2437e+00 (6.1569e+00) Acc@1 7.81 ( 2.39) Acc@5 19.53 ( 7.68) +Epoch: [0][3387/5004] Time 0.264 ( 0.242) Data 0.016 ( 0.027) Loss 5.1181e+00 (6.1566e+00) Acc@1 7.42 ( 2.39) Acc@5 21.48 ( 7.69) +Epoch: [0][3388/5004] Time 0.258 ( 0.242) Data 0.013 ( 0.027) Loss 5.1725e+00 (6.1563e+00) Acc@1 8.20 ( 2.39) Acc@5 22.66 ( 7.69) +Epoch: [0][3389/5004] Time 0.259 ( 0.242) Data 0.014 ( 0.027) Loss 5.1077e+00 (6.1560e+00) Acc@1 9.38 ( 2.40) Acc@5 21.88 ( 7.70) +Epoch: [0][3390/5004] Time 0.254 ( 0.242) Data 0.016 ( 0.027) Loss 5.1901e+00 (6.1557e+00) Acc@1 6.25 ( 2.40) Acc@5 16.41 ( 7.70) +Epoch: [0][3391/5004] Time 0.264 ( 0.242) Data 0.018 ( 0.027) Loss 5.2250e+00 (6.1554e+00) Acc@1 8.59 ( 2.40) Acc@5 19.92 ( 7.70) +Epoch: [0][3392/5004] Time 0.240 ( 0.242) Data 0.015 ( 0.027) Loss 5.1456e+00 (6.1551e+00) Acc@1 5.86 ( 2.40) Acc@5 23.44 ( 7.71) +Epoch: [0][3393/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 5.0884e+00 (6.1548e+00) Acc@1 7.81 ( 2.40) Acc@5 21.88 ( 7.71) +Epoch: [0][3394/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 5.0610e+00 (6.1545e+00) Acc@1 12.11 ( 2.40) Acc@5 23.05 ( 7.72) +Epoch: [0][3395/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 5.1603e+00 (6.1542e+00) Acc@1 7.81 ( 2.41) Acc@5 20.70 ( 7.72) +Epoch: [0][3396/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 5.1046e+00 (6.1539e+00) Acc@1 5.86 ( 2.41) Acc@5 23.05 ( 7.72) +Epoch: [0][3397/5004] Time 0.251 ( 0.242) Data 0.022 ( 0.027) Loss 5.2850e+00 (6.1536e+00) Acc@1 8.59 ( 2.41) Acc@5 25.39 ( 7.73) +Epoch: [0][3398/5004] Time 0.253 ( 0.242) Data 0.021 ( 0.027) Loss 5.0392e+00 (6.1533e+00) Acc@1 8.98 ( 2.41) Acc@5 25.39 ( 7.73) 
+Epoch: [0][3399/5004] Time 0.247 ( 0.242) Data 0.019 ( 0.027) Loss 5.2057e+00 (6.1530e+00) Acc@1 7.03 ( 2.41) Acc@5 16.41 ( 7.74) +Epoch: [0][3400/5004] Time 0.255 ( 0.242) Data 0.021 ( 0.027) Loss 5.3909e+00 (6.1528e+00) Acc@1 5.08 ( 2.41) Acc@5 16.80 ( 7.74) +Epoch: [0][3401/5004] Time 0.248 ( 0.242) Data 0.017 ( 0.027) Loss 5.1630e+00 (6.1525e+00) Acc@1 7.03 ( 2.41) Acc@5 17.58 ( 7.74) +Epoch: [0][3402/5004] Time 0.255 ( 0.242) Data 0.020 ( 0.027) Loss 5.3578e+00 (6.1523e+00) Acc@1 8.20 ( 2.42) Acc@5 16.80 ( 7.74) +Epoch: [0][3403/5004] Time 0.257 ( 0.242) Data 0.020 ( 0.027) Loss 5.0994e+00 (6.1520e+00) Acc@1 7.42 ( 2.42) Acc@5 23.05 ( 7.75) +Epoch: [0][3404/5004] Time 0.265 ( 0.242) Data 0.017 ( 0.027) Loss 5.2576e+00 (6.1517e+00) Acc@1 7.81 ( 2.42) Acc@5 21.88 ( 7.75) +Epoch: [0][3405/5004] Time 0.260 ( 0.242) Data 0.016 ( 0.027) Loss 5.1564e+00 (6.1514e+00) Acc@1 5.86 ( 2.42) Acc@5 20.31 ( 7.76) +Epoch: [0][3406/5004] Time 0.259 ( 0.242) Data 0.018 ( 0.027) Loss 5.2837e+00 (6.1511e+00) Acc@1 6.25 ( 2.42) Acc@5 17.97 ( 7.76) +Epoch: [0][3407/5004] Time 0.266 ( 0.242) Data 0.016 ( 0.027) Loss 5.0499e+00 (6.1508e+00) Acc@1 8.20 ( 2.42) Acc@5 19.92 ( 7.76) +Epoch: [0][3408/5004] Time 0.257 ( 0.242) Data 0.016 ( 0.027) Loss 5.1259e+00 (6.1505e+00) Acc@1 8.59 ( 2.42) Acc@5 21.48 ( 7.77) +Epoch: [0][3409/5004] Time 0.261 ( 0.242) Data 0.017 ( 0.027) Loss 5.3186e+00 (6.1503e+00) Acc@1 7.81 ( 2.43) Acc@5 18.75 ( 7.77) +Epoch: [0][3410/5004] Time 0.241 ( 0.242) Data 0.017 ( 0.027) Loss 5.1551e+00 (6.1500e+00) Acc@1 7.81 ( 2.43) Acc@5 18.75 ( 7.77) +Epoch: [0][3411/5004] Time 0.244 ( 0.242) Data 0.021 ( 0.027) Loss 5.2289e+00 (6.1497e+00) Acc@1 6.64 ( 2.43) Acc@5 19.53 ( 7.78) +Epoch: [0][3412/5004] Time 0.244 ( 0.242) Data 0.021 ( 0.027) Loss 5.3278e+00 (6.1495e+00) Acc@1 5.08 ( 2.43) Acc@5 16.02 ( 7.78) +Epoch: [0][3413/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.027) Loss 5.1628e+00 (6.1492e+00) Acc@1 6.25 ( 2.43) Acc@5 19.14 ( 7.78) +Epoch: [0][3414/5004] Time 0.248 
( 0.242) Data 0.019 ( 0.027) Loss 5.3254e+00 (6.1489e+00) Acc@1 4.69 ( 2.43) Acc@5 16.02 ( 7.79) +Epoch: [0][3415/5004] Time 0.240 ( 0.242) Data 0.018 ( 0.027) Loss 5.3343e+00 (6.1487e+00) Acc@1 6.25 ( 2.43) Acc@5 14.84 ( 7.79) +Epoch: [0][3416/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 5.1085e+00 (6.1484e+00) Acc@1 8.98 ( 2.43) Acc@5 26.17 ( 7.79) +Epoch: [0][3417/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 5.3296e+00 (6.1482e+00) Acc@1 5.47 ( 2.44) Acc@5 12.89 ( 7.79) +Epoch: [0][3418/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 5.2131e+00 (6.1479e+00) Acc@1 4.69 ( 2.44) Acc@5 19.92 ( 7.80) +Epoch: [0][3419/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 5.1543e+00 (6.1476e+00) Acc@1 9.38 ( 2.44) Acc@5 20.70 ( 7.80) +Epoch: [0][3420/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 5.1571e+00 (6.1473e+00) Acc@1 7.81 ( 2.44) Acc@5 21.48 ( 7.81) +Epoch: [0][3421/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 5.2698e+00 (6.1471e+00) Acc@1 8.59 ( 2.44) Acc@5 20.31 ( 7.81) +Epoch: [0][3422/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 5.0696e+00 (6.1467e+00) Acc@1 6.64 ( 2.44) Acc@5 19.53 ( 7.81) +Epoch: [0][3423/5004] Time 0.241 ( 0.242) Data 0.024 ( 0.027) Loss 5.4698e+00 (6.1465e+00) Acc@1 5.08 ( 2.44) Acc@5 16.02 ( 7.82) +Epoch: [0][3424/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 5.0696e+00 (6.1462e+00) Acc@1 8.20 ( 2.44) Acc@5 25.00 ( 7.82) +Epoch: [0][3425/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.027) Loss 5.2288e+00 (6.1460e+00) Acc@1 8.98 ( 2.45) Acc@5 22.27 ( 7.82) +Epoch: [0][3426/5004] Time 0.241 ( 0.242) Data 0.023 ( 0.027) Loss 5.3288e+00 (6.1457e+00) Acc@1 7.42 ( 2.45) Acc@5 17.58 ( 7.83) +Epoch: [0][3427/5004] Time 0.247 ( 0.242) Data 0.023 ( 0.027) Loss 5.1847e+00 (6.1454e+00) Acc@1 5.47 ( 2.45) Acc@5 23.83 ( 7.83) +Epoch: [0][3428/5004] Time 0.244 ( 0.242) Data 0.021 ( 0.027) Loss 5.2376e+00 (6.1452e+00) Acc@1 7.42 ( 2.45) Acc@5 21.48 ( 7.84) +Epoch: [0][3429/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 
5.0674e+00 (6.1449e+00) Acc@1 6.64 ( 2.45) Acc@5 20.31 ( 7.84) +Epoch: [0][3430/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.027) Loss 5.3215e+00 (6.1446e+00) Acc@1 6.64 ( 2.45) Acc@5 18.75 ( 7.84) +Epoch: [0][3431/5004] Time 0.242 ( 0.242) Data 0.021 ( 0.027) Loss 5.1094e+00 (6.1443e+00) Acc@1 6.64 ( 2.45) Acc@5 25.00 ( 7.85) +Epoch: [0][3432/5004] Time 0.247 ( 0.242) Data 0.023 ( 0.027) Loss 5.0744e+00 (6.1440e+00) Acc@1 7.03 ( 2.46) Acc@5 18.75 ( 7.85) +Epoch: [0][3433/5004] Time 0.243 ( 0.242) Data 0.019 ( 0.027) Loss 5.0267e+00 (6.1437e+00) Acc@1 7.81 ( 2.46) Acc@5 24.61 ( 7.86) +Epoch: [0][3434/5004] Time 0.239 ( 0.242) Data 0.017 ( 0.027) Loss 5.1679e+00 (6.1434e+00) Acc@1 5.47 ( 2.46) Acc@5 17.97 ( 7.86) +Epoch: [0][3435/5004] Time 0.248 ( 0.242) Data 0.023 ( 0.027) Loss 5.0828e+00 (6.1431e+00) Acc@1 7.81 ( 2.46) Acc@5 21.48 ( 7.86) +Epoch: [0][3436/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 5.0764e+00 (6.1428e+00) Acc@1 7.81 ( 2.46) Acc@5 23.05 ( 7.87) +Epoch: [0][3437/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 5.1355e+00 (6.1425e+00) Acc@1 7.81 ( 2.46) Acc@5 22.66 ( 7.87) +Epoch: [0][3438/5004] Time 0.241 ( 0.242) Data 0.023 ( 0.027) Loss 5.1431e+00 (6.1422e+00) Acc@1 7.03 ( 2.46) Acc@5 19.92 ( 7.88) +Epoch: [0][3439/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.027) Loss 5.1497e+00 (6.1419e+00) Acc@1 7.81 ( 2.47) Acc@5 25.39 ( 7.88) +Epoch: [0][3440/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 5.0386e+00 (6.1416e+00) Acc@1 5.86 ( 2.47) Acc@5 26.17 ( 7.89) +Epoch: [0][3441/5004] Time 0.245 ( 0.242) Data 0.024 ( 0.027) Loss 5.0577e+00 (6.1413e+00) Acc@1 8.98 ( 2.47) Acc@5 22.66 ( 7.89) +Epoch: [0][3442/5004] Time 0.241 ( 0.242) Data 0.024 ( 0.027) Loss 5.0471e+00 (6.1410e+00) Acc@1 6.64 ( 2.47) Acc@5 19.92 ( 7.89) +Epoch: [0][3443/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 5.0764e+00 (6.1406e+00) Acc@1 10.16 ( 2.47) Acc@5 21.88 ( 7.90) +Epoch: [0][3444/5004] Time 0.239 ( 0.242) Data 0.023 ( 0.027) Loss 5.0295e+00 (6.1403e+00) Acc@1 
9.38 ( 2.47) Acc@5 22.27 ( 7.90) +Epoch: [0][3445/5004] Time 0.247 ( 0.242) Data 0.024 ( 0.027) Loss 5.1989e+00 (6.1400e+00) Acc@1 6.64 ( 2.48) Acc@5 18.36 ( 7.90) +Epoch: [0][3446/5004] Time 0.234 ( 0.242) Data 0.017 ( 0.027) Loss 5.3488e+00 (6.1398e+00) Acc@1 5.47 ( 2.48) Acc@5 16.02 ( 7.91) +Epoch: [0][3447/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.027) Loss 5.3270e+00 (6.1396e+00) Acc@1 6.64 ( 2.48) Acc@5 19.14 ( 7.91) +Epoch: [0][3448/5004] Time 0.240 ( 0.242) Data 0.020 ( 0.027) Loss 5.1003e+00 (6.1393e+00) Acc@1 9.77 ( 2.48) Acc@5 23.44 ( 7.91) +Epoch: [0][3449/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 5.0801e+00 (6.1390e+00) Acc@1 6.25 ( 2.48) Acc@5 21.48 ( 7.92) +Epoch: [0][3450/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 5.0469e+00 (6.1387e+00) Acc@1 7.81 ( 2.48) Acc@5 19.92 ( 7.92) +Epoch: [0][3451/5004] Time 0.244 ( 0.242) Data 0.020 ( 0.027) Loss 5.1190e+00 (6.1384e+00) Acc@1 7.42 ( 2.48) Acc@5 21.09 ( 7.93) +Epoch: [0][3452/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.027) Loss 5.0296e+00 (6.1380e+00) Acc@1 6.64 ( 2.48) Acc@5 19.14 ( 7.93) +Epoch: [0][3453/5004] Time 0.244 ( 0.242) Data 0.020 ( 0.027) Loss 5.1177e+00 (6.1377e+00) Acc@1 6.64 ( 2.49) Acc@5 20.70 ( 7.93) +Epoch: [0][3454/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 4.9235e+00 (6.1374e+00) Acc@1 8.59 ( 2.49) Acc@5 23.83 ( 7.94) +Epoch: [0][3455/5004] Time 0.248 ( 0.242) Data 0.021 ( 0.027) Loss 5.1060e+00 (6.1371e+00) Acc@1 7.81 ( 2.49) Acc@5 20.31 ( 7.94) +Epoch: [0][3456/5004] Time 0.244 ( 0.242) Data 0.019 ( 0.027) Loss 5.0978e+00 (6.1368e+00) Acc@1 8.59 ( 2.49) Acc@5 21.88 ( 7.95) +Epoch: [0][3457/5004] Time 0.240 ( 0.242) Data 0.020 ( 0.027) Loss 5.1309e+00 (6.1365e+00) Acc@1 7.03 ( 2.49) Acc@5 22.27 ( 7.95) +Epoch: [0][3458/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 5.4068e+00 (6.1363e+00) Acc@1 5.47 ( 2.49) Acc@5 14.84 ( 7.95) +Epoch: [0][3459/5004] Time 0.238 ( 0.242) Data 0.019 ( 0.027) Loss 5.1553e+00 (6.1360e+00) Acc@1 6.64 ( 2.49) Acc@5 21.48 ( 7.96) 
+Epoch: [0][3460/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 4.9329e+00 (6.1357e+00) Acc@1 10.16 ( 2.50) Acc@5 22.66 ( 7.96) +Epoch: [0][3461/5004] Time 0.239 ( 0.242) Data 0.019 ( 0.027) Loss 5.1362e+00 (6.1354e+00) Acc@1 8.20 ( 2.50) Acc@5 18.75 ( 7.96) +Epoch: [0][3462/5004] Time 0.241 ( 0.242) Data 0.023 ( 0.027) Loss 5.1539e+00 (6.1351e+00) Acc@1 6.25 ( 2.50) Acc@5 18.36 ( 7.97) +Epoch: [0][3463/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 5.1986e+00 (6.1348e+00) Acc@1 5.86 ( 2.50) Acc@5 19.92 ( 7.97) +Epoch: [0][3464/5004] Time 0.241 ( 0.242) Data 0.022 ( 0.027) Loss 5.2630e+00 (6.1346e+00) Acc@1 7.81 ( 2.50) Acc@5 20.31 ( 7.97) +Epoch: [0][3465/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 5.2755e+00 (6.1343e+00) Acc@1 5.08 ( 2.50) Acc@5 16.41 ( 7.98) +Epoch: [0][3466/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 5.2487e+00 (6.1341e+00) Acc@1 4.30 ( 2.50) Acc@5 19.14 ( 7.98) +Epoch: [0][3467/5004] Time 0.251 ( 0.242) Data 0.021 ( 0.027) Loss 4.9718e+00 (6.1337e+00) Acc@1 10.94 ( 2.51) Acc@5 23.05 ( 7.98) +Epoch: [0][3468/5004] Time 0.243 ( 0.242) Data 0.016 ( 0.027) Loss 5.0844e+00 (6.1334e+00) Acc@1 8.98 ( 2.51) Acc@5 25.78 ( 7.99) +Epoch: [0][3469/5004] Time 0.240 ( 0.242) Data 0.021 ( 0.027) Loss 5.1962e+00 (6.1332e+00) Acc@1 6.64 ( 2.51) Acc@5 19.92 ( 7.99) +Epoch: [0][3470/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 5.2524e+00 (6.1329e+00) Acc@1 5.47 ( 2.51) Acc@5 19.53 ( 7.99) +Epoch: [0][3471/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 5.0956e+00 (6.1326e+00) Acc@1 7.03 ( 2.51) Acc@5 19.53 ( 8.00) +Epoch: [0][3472/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 5.0667e+00 (6.1323e+00) Acc@1 9.77 ( 2.51) Acc@5 25.78 ( 8.00) +Epoch: [0][3473/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.9123e+00 (6.1319e+00) Acc@1 8.98 ( 2.51) Acc@5 26.17 ( 8.01) +Epoch: [0][3474/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 5.1216e+00 (6.1317e+00) Acc@1 7.81 ( 2.52) Acc@5 22.66 ( 8.01) +Epoch: [0][3475/5004] Time 
0.242 ( 0.242) Data 0.021 ( 0.027) Loss 5.0830e+00 (6.1314e+00) Acc@1 8.98 ( 2.52) Acc@5 23.05 ( 8.02) +Epoch: [0][3476/5004] Time 0.248 ( 0.242) Data 0.021 ( 0.027) Loss 4.9868e+00 (6.1310e+00) Acc@1 10.16 ( 2.52) Acc@5 27.73 ( 8.02) +Epoch: [0][3477/5004] Time 0.239 ( 0.242) Data 0.017 ( 0.027) Loss 4.9403e+00 (6.1307e+00) Acc@1 11.33 ( 2.52) Acc@5 25.78 ( 8.03) +Epoch: [0][3478/5004] Time 0.240 ( 0.242) Data 0.021 ( 0.027) Loss 5.2500e+00 (6.1304e+00) Acc@1 5.47 ( 2.52) Acc@5 20.70 ( 8.03) +Epoch: [0][3479/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 5.3101e+00 (6.1302e+00) Acc@1 6.25 ( 2.52) Acc@5 17.58 ( 8.03) +Epoch: [0][3480/5004] Time 0.240 ( 0.242) Data 0.020 ( 0.027) Loss 5.2926e+00 (6.1300e+00) Acc@1 5.08 ( 2.53) Acc@5 16.41 ( 8.04) +Epoch: [0][3481/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.027) Loss 5.0975e+00 (6.1297e+00) Acc@1 8.20 ( 2.53) Acc@5 19.14 ( 8.04) +Epoch: [0][3482/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 4.9314e+00 (6.1293e+00) Acc@1 9.77 ( 2.53) Acc@5 24.61 ( 8.04) +Epoch: [0][3483/5004] Time 0.242 ( 0.242) Data 0.021 ( 0.027) Loss 5.0674e+00 (6.1290e+00) Acc@1 7.42 ( 2.53) Acc@5 27.34 ( 8.05) +Epoch: [0][3484/5004] Time 0.251 ( 0.242) Data 0.021 ( 0.027) Loss 5.3445e+00 (6.1288e+00) Acc@1 6.64 ( 2.53) Acc@5 18.75 ( 8.05) +Epoch: [0][3485/5004] Time 0.247 ( 0.242) Data 0.019 ( 0.027) Loss 5.2835e+00 (6.1285e+00) Acc@1 6.64 ( 2.53) Acc@5 21.88 ( 8.06) +Epoch: [0][3486/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 5.1838e+00 (6.1283e+00) Acc@1 6.64 ( 2.53) Acc@5 21.48 ( 8.06) +Epoch: [0][3487/5004] Time 0.257 ( 0.242) Data 0.020 ( 0.027) Loss 5.2196e+00 (6.1280e+00) Acc@1 5.08 ( 2.53) Acc@5 20.31 ( 8.06) +Epoch: [0][3488/5004] Time 0.245 ( 0.242) Data 0.016 ( 0.027) Loss 5.0954e+00 (6.1277e+00) Acc@1 6.64 ( 2.54) Acc@5 23.05 ( 8.07) +Epoch: [0][3489/5004] Time 0.246 ( 0.242) Data 0.018 ( 0.027) Loss 5.1376e+00 (6.1274e+00) Acc@1 4.69 ( 2.54) Acc@5 21.09 ( 8.07) +Epoch: [0][3490/5004] Time 0.245 ( 0.242) Data 0.018 ( 
0.027) Loss 5.1707e+00 (6.1272e+00) Acc@1 8.20 ( 2.54) Acc@5 19.92 ( 8.08) +Epoch: [0][3491/5004] Time 0.237 ( 0.242) Data 0.018 ( 0.027) Loss 5.0177e+00 (6.1268e+00) Acc@1 11.72 ( 2.54) Acc@5 21.09 ( 8.08) +Epoch: [0][3492/5004] Time 0.238 ( 0.242) Data 0.023 ( 0.027) Loss 5.1527e+00 (6.1266e+00) Acc@1 5.47 ( 2.54) Acc@5 21.09 ( 8.08) +Epoch: [0][3493/5004] Time 0.241 ( 0.242) Data 0.024 ( 0.027) Loss 5.2179e+00 (6.1263e+00) Acc@1 7.42 ( 2.54) Acc@5 18.36 ( 8.09) +Epoch: [0][3494/5004] Time 0.238 ( 0.242) Data 0.024 ( 0.027) Loss 5.2892e+00 (6.1261e+00) Acc@1 8.59 ( 2.54) Acc@5 19.53 ( 8.09) +Epoch: [0][3495/5004] Time 0.243 ( 0.242) Data 0.026 ( 0.027) Loss 5.1961e+00 (6.1258e+00) Acc@1 7.42 ( 2.55) Acc@5 19.53 ( 8.09) +Epoch: [0][3496/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 5.1069e+00 (6.1255e+00) Acc@1 8.59 ( 2.55) Acc@5 22.27 ( 8.10) +Epoch: [0][3497/5004] Time 0.238 ( 0.242) Data 0.024 ( 0.027) Loss 5.2900e+00 (6.1253e+00) Acc@1 5.47 ( 2.55) Acc@5 18.75 ( 8.10) +Epoch: [0][3498/5004] Time 0.246 ( 0.242) Data 0.024 ( 0.027) Loss 5.2465e+00 (6.1250e+00) Acc@1 6.64 ( 2.55) Acc@5 20.70 ( 8.10) +Epoch: [0][3499/5004] Time 0.232 ( 0.242) Data 0.018 ( 0.027) Loss 5.1499e+00 (6.1247e+00) Acc@1 8.59 ( 2.55) Acc@5 24.61 ( 8.11) +Epoch: [0][3500/5004] Time 0.244 ( 0.242) Data 0.025 ( 0.027) Loss 5.0728e+00 (6.1244e+00) Acc@1 9.38 ( 2.55) Acc@5 21.88 ( 8.11) +Epoch: [0][3501/5004] Time 0.253 ( 0.242) Data 0.023 ( 0.027) Loss 5.1198e+00 (6.1241e+00) Acc@1 7.42 ( 2.55) Acc@5 20.31 ( 8.12) +Epoch: [0][3502/5004] Time 0.248 ( 0.242) Data 0.018 ( 0.027) Loss 4.9859e+00 (6.1238e+00) Acc@1 8.98 ( 2.56) Acc@5 27.34 ( 8.12) +Epoch: [0][3503/5004] Time 0.241 ( 0.242) Data 0.015 ( 0.027) Loss 5.0052e+00 (6.1235e+00) Acc@1 10.94 ( 2.56) Acc@5 25.39 ( 8.13) +Epoch: [0][3504/5004] Time 0.239 ( 0.242) Data 0.020 ( 0.027) Loss 5.0911e+00 (6.1232e+00) Acc@1 6.25 ( 2.56) Acc@5 20.31 ( 8.13) +Epoch: [0][3505/5004] Time 0.240 ( 0.242) Data 0.022 ( 0.027) Loss 5.0821e+00 
(6.1229e+00) Acc@1 6.64 ( 2.56) Acc@5 21.88 ( 8.13) +Epoch: [0][3506/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 5.0229e+00 (6.1226e+00) Acc@1 11.72 ( 2.56) Acc@5 26.56 ( 8.14) +Epoch: [0][3507/5004] Time 0.253 ( 0.242) Data 0.022 ( 0.027) Loss 5.1811e+00 (6.1223e+00) Acc@1 7.42 ( 2.57) Acc@5 19.14 ( 8.14) +Epoch: [0][3508/5004] Time 0.242 ( 0.242) Data 0.019 ( 0.027) Loss 5.1126e+00 (6.1220e+00) Acc@1 7.03 ( 2.57) Acc@5 19.92 ( 8.14) +Epoch: [0][3509/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 5.0753e+00 (6.1217e+00) Acc@1 8.20 ( 2.57) Acc@5 20.70 ( 8.15) +Epoch: [0][3510/5004] Time 0.237 ( 0.242) Data 0.018 ( 0.027) Loss 5.0947e+00 (6.1215e+00) Acc@1 10.94 ( 2.57) Acc@5 25.00 ( 8.15) +Epoch: [0][3511/5004] Time 0.239 ( 0.242) Data 0.022 ( 0.027) Loss 5.3039e+00 (6.1212e+00) Acc@1 8.98 ( 2.57) Acc@5 17.97 ( 8.16) +Epoch: [0][3512/5004] Time 0.239 ( 0.242) Data 0.022 ( 0.027) Loss 5.0886e+00 (6.1209e+00) Acc@1 6.64 ( 2.57) Acc@5 19.14 ( 8.16) +Epoch: [0][3513/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 5.0403e+00 (6.1206e+00) Acc@1 8.98 ( 2.58) Acc@5 20.31 ( 8.16) +Epoch: [0][3514/5004] Time 0.257 ( 0.242) Data 0.021 ( 0.027) Loss 5.2160e+00 (6.1204e+00) Acc@1 4.69 ( 2.58) Acc@5 18.75 ( 8.17) +Epoch: [0][3515/5004] Time 0.253 ( 0.242) Data 0.019 ( 0.027) Loss 5.3323e+00 (6.1201e+00) Acc@1 7.81 ( 2.58) Acc@5 16.41 ( 8.17) +Epoch: [0][3516/5004] Time 0.256 ( 0.242) Data 0.019 ( 0.027) Loss 5.2256e+00 (6.1199e+00) Acc@1 5.47 ( 2.58) Acc@5 20.31 ( 8.17) +Epoch: [0][3517/5004] Time 0.246 ( 0.242) Data 0.018 ( 0.027) Loss 5.2800e+00 (6.1196e+00) Acc@1 7.81 ( 2.58) Acc@5 21.88 ( 8.18) +Epoch: [0][3518/5004] Time 0.258 ( 0.242) Data 0.020 ( 0.027) Loss 5.1513e+00 (6.1194e+00) Acc@1 5.86 ( 2.58) Acc@5 21.48 ( 8.18) +Epoch: [0][3519/5004] Time 0.258 ( 0.242) Data 0.019 ( 0.027) Loss 5.0390e+00 (6.1191e+00) Acc@1 7.42 ( 2.58) Acc@5 20.70 ( 8.18) +Epoch: [0][3520/5004] Time 0.250 ( 0.242) Data 0.019 ( 0.027) Loss 4.8641e+00 (6.1187e+00) Acc@1 9.77 ( 2.58) 
Acc@5 22.66 ( 8.19) +Epoch: [0][3521/5004] Time 0.224 ( 0.242) Data 0.019 ( 0.027) Loss 5.3697e+00 (6.1185e+00) Acc@1 4.69 ( 2.58) Acc@5 19.92 ( 8.19) +Epoch: [0][3522/5004] Time 0.239 ( 0.242) Data 0.034 ( 0.027) Loss 5.3946e+00 (6.1183e+00) Acc@1 4.69 ( 2.59) Acc@5 16.80 ( 8.19) +Epoch: [0][3523/5004] Time 0.235 ( 0.242) Data 0.033 ( 0.027) Loss 5.1636e+00 (6.1180e+00) Acc@1 6.64 ( 2.59) Acc@5 22.27 ( 8.20) +Epoch: [0][3524/5004] Time 0.240 ( 0.242) Data 0.036 ( 0.027) Loss 5.2333e+00 (6.1178e+00) Acc@1 8.20 ( 2.59) Acc@5 19.92 ( 8.20) +Epoch: [0][3525/5004] Time 0.236 ( 0.242) Data 0.035 ( 0.027) Loss 5.3166e+00 (6.1175e+00) Acc@1 6.25 ( 2.59) Acc@5 15.23 ( 8.20) +Epoch: [0][3526/5004] Time 0.238 ( 0.242) Data 0.035 ( 0.027) Loss 5.1667e+00 (6.1173e+00) Acc@1 6.64 ( 2.59) Acc@5 19.53 ( 8.21) +Epoch: [0][3527/5004] Time 0.236 ( 0.242) Data 0.035 ( 0.027) Loss 5.1770e+00 (6.1170e+00) Acc@1 7.03 ( 2.59) Acc@5 23.83 ( 8.21) +Epoch: [0][3528/5004] Time 0.237 ( 0.242) Data 0.036 ( 0.027) Loss 5.1636e+00 (6.1167e+00) Acc@1 8.59 ( 2.59) Acc@5 20.70 ( 8.21) +Epoch: [0][3529/5004] Time 0.238 ( 0.242) Data 0.036 ( 0.027) Loss 5.1613e+00 (6.1165e+00) Acc@1 10.94 ( 2.60) Acc@5 20.31 ( 8.22) +Epoch: [0][3530/5004] Time 0.239 ( 0.242) Data 0.034 ( 0.027) Loss 4.8699e+00 (6.1161e+00) Acc@1 12.11 ( 2.60) Acc@5 25.78 ( 8.22) +Epoch: [0][3531/5004] Time 0.239 ( 0.242) Data 0.032 ( 0.027) Loss 5.2663e+00 (6.1159e+00) Acc@1 6.64 ( 2.60) Acc@5 19.53 ( 8.22) +Epoch: [0][3532/5004] Time 0.233 ( 0.242) Data 0.032 ( 0.027) Loss 5.2415e+00 (6.1156e+00) Acc@1 7.42 ( 2.60) Acc@5 19.53 ( 8.23) +Epoch: [0][3533/5004] Time 0.241 ( 0.242) Data 0.036 ( 0.027) Loss 5.1654e+00 (6.1153e+00) Acc@1 8.20 ( 2.60) Acc@5 18.75 ( 8.23) +Epoch: [0][3534/5004] Time 0.237 ( 0.242) Data 0.032 ( 0.027) Loss 4.9860e+00 (6.1150e+00) Acc@1 9.38 ( 2.60) Acc@5 25.00 ( 8.24) +Epoch: [0][3535/5004] Time 0.234 ( 0.242) Data 0.033 ( 0.027) Loss 5.0866e+00 (6.1147e+00) Acc@1 7.03 ( 2.61) Acc@5 19.53 ( 8.24) +Epoch: 
[0][3536/5004] Time 0.239 ( 0.242) Data 0.036 ( 0.027) Loss 5.3185e+00 (6.1145e+00) Acc@1 6.64 ( 2.61) Acc@5 17.97 ( 8.24)
[per-iteration training-log records for epoch 0, iterations 3537-3839, elided; each record is one added line of the form "+Epoch: [0][N/5004] Time t ( avg) Data t ( avg) Loss l (avg) Acc@1 a ( avg) Acc@5 a ( avg)"; over this span the running-average loss falls from 6.1145 to 6.0306 and the running-average Acc@5 rises from 8.24 to 9.39]
+Epoch: [0][3840/5004] Time 0.240 ( 0.242) Data 0.025 (
0.027) Loss 4.9398e+00 (6.0303e+00) Acc@1 10.94 ( 3.08) Acc@5 26.17 ( 9.40) +Epoch: [0][3841/5004] Time 0.239 ( 0.242) Data 0.026 ( 0.027) Loss 4.9334e+00 (6.0300e+00) Acc@1 11.33 ( 3.08) Acc@5 26.56 ( 9.40) +Epoch: [0][3842/5004] Time 0.243 ( 0.242) Data 0.027 ( 0.027) Loss 5.2178e+00 (6.0298e+00) Acc@1 5.47 ( 3.08) Acc@5 18.36 ( 9.40) +Epoch: [0][3843/5004] Time 0.242 ( 0.242) Data 0.025 ( 0.027) Loss 5.0000e+00 (6.0296e+00) Acc@1 8.59 ( 3.08) Acc@5 23.05 ( 9.41) +Epoch: [0][3844/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 4.9344e+00 (6.0293e+00) Acc@1 8.59 ( 3.09) Acc@5 22.66 ( 9.41) +Epoch: [0][3845/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.8908e+00 (6.0290e+00) Acc@1 10.94 ( 3.09) Acc@5 24.22 ( 9.41) +Epoch: [0][3846/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 4.8450e+00 (6.0287e+00) Acc@1 11.33 ( 3.09) Acc@5 30.86 ( 9.42) +Epoch: [0][3847/5004] Time 0.245 ( 0.242) Data 0.026 ( 0.027) Loss 5.0535e+00 (6.0284e+00) Acc@1 7.81 ( 3.09) Acc@5 21.88 ( 9.42) +Epoch: [0][3848/5004] Time 0.248 ( 0.242) Data 0.024 ( 0.027) Loss 5.0838e+00 (6.0282e+00) Acc@1 7.42 ( 3.09) Acc@5 22.27 ( 9.43) +Epoch: [0][3849/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 4.9285e+00 (6.0279e+00) Acc@1 10.94 ( 3.09) Acc@5 26.95 ( 9.43) +Epoch: [0][3850/5004] Time 0.241 ( 0.242) Data 0.024 ( 0.027) Loss 5.1434e+00 (6.0277e+00) Acc@1 5.47 ( 3.09) Acc@5 21.48 ( 9.43) +Epoch: [0][3851/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 4.9512e+00 (6.0274e+00) Acc@1 7.42 ( 3.10) Acc@5 23.83 ( 9.44) +Epoch: [0][3852/5004] Time 0.241 ( 0.242) Data 0.025 ( 0.027) Loss 4.9705e+00 (6.0271e+00) Acc@1 13.67 ( 3.10) Acc@5 25.00 ( 9.44) +Epoch: [0][3853/5004] Time 0.245 ( 0.242) Data 0.027 ( 0.027) Loss 5.0775e+00 (6.0269e+00) Acc@1 6.25 ( 3.10) Acc@5 22.66 ( 9.45) +Epoch: [0][3854/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 5.0305e+00 (6.0266e+00) Acc@1 8.98 ( 3.10) Acc@5 18.75 ( 9.45) +Epoch: [0][3855/5004] Time 0.242 ( 0.242) Data 0.026 ( 0.027) Loss 5.1316e+00 
(6.0264e+00) Acc@1 7.81 ( 3.10) Acc@5 21.48 ( 9.45) +Epoch: [0][3856/5004] Time 0.240 ( 0.242) Data 0.026 ( 0.027) Loss 5.1173e+00 (6.0261e+00) Acc@1 10.16 ( 3.10) Acc@5 22.66 ( 9.45) +Epoch: [0][3857/5004] Time 0.243 ( 0.242) Data 0.027 ( 0.027) Loss 5.1362e+00 (6.0259e+00) Acc@1 8.59 ( 3.11) Acc@5 23.05 ( 9.46) +Epoch: [0][3858/5004] Time 0.243 ( 0.242) Data 0.029 ( 0.027) Loss 5.1090e+00 (6.0257e+00) Acc@1 5.86 ( 3.11) Acc@5 21.88 ( 9.46) +Epoch: [0][3859/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 4.9922e+00 (6.0254e+00) Acc@1 8.98 ( 3.11) Acc@5 23.83 ( 9.46) +Epoch: [0][3860/5004] Time 0.245 ( 0.242) Data 0.027 ( 0.027) Loss 5.0611e+00 (6.0251e+00) Acc@1 9.38 ( 3.11) Acc@5 24.61 ( 9.47) +Epoch: [0][3861/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.027) Loss 5.0699e+00 (6.0249e+00) Acc@1 7.42 ( 3.11) Acc@5 22.66 ( 9.47) +Epoch: [0][3862/5004] Time 0.241 ( 0.242) Data 0.025 ( 0.027) Loss 5.1117e+00 (6.0247e+00) Acc@1 7.03 ( 3.11) Acc@5 22.66 ( 9.48) +Epoch: [0][3863/5004] Time 0.245 ( 0.242) Data 0.026 ( 0.027) Loss 4.7997e+00 (6.0243e+00) Acc@1 10.55 ( 3.11) Acc@5 26.95 ( 9.48) +Epoch: [0][3864/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 4.8490e+00 (6.0240e+00) Acc@1 8.59 ( 3.11) Acc@5 27.34 ( 9.48) +Epoch: [0][3865/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 4.9554e+00 (6.0238e+00) Acc@1 11.33 ( 3.12) Acc@5 28.91 ( 9.49) +Epoch: [0][3866/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 4.7643e+00 (6.0234e+00) Acc@1 12.89 ( 3.12) Acc@5 29.30 ( 9.49) +Epoch: [0][3867/5004] Time 0.246 ( 0.242) Data 0.025 ( 0.027) Loss 4.9496e+00 (6.0232e+00) Acc@1 9.77 ( 3.12) Acc@5 26.95 ( 9.50) +Epoch: [0][3868/5004] Time 0.239 ( 0.242) Data 0.023 ( 0.027) Loss 5.1001e+00 (6.0229e+00) Acc@1 7.03 ( 3.12) Acc@5 23.83 ( 9.50) +Epoch: [0][3869/5004] Time 0.244 ( 0.242) Data 0.025 ( 0.027) Loss 4.8991e+00 (6.0226e+00) Acc@1 11.72 ( 3.12) Acc@5 28.52 ( 9.51) +Epoch: [0][3870/5004] Time 0.237 ( 0.242) Data 0.022 ( 0.027) Loss 4.6592e+00 (6.0223e+00) Acc@1 10.55 ( 
3.13) Acc@5 32.03 ( 9.51) +Epoch: [0][3871/5004] Time 0.243 ( 0.242) Data 0.026 ( 0.027) Loss 4.8646e+00 (6.0220e+00) Acc@1 14.45 ( 3.13) Acc@5 28.12 ( 9.52) +Epoch: [0][3872/5004] Time 0.246 ( 0.242) Data 0.026 ( 0.027) Loss 5.0075e+00 (6.0217e+00) Acc@1 9.77 ( 3.13) Acc@5 24.22 ( 9.52) +Epoch: [0][3873/5004] Time 0.248 ( 0.242) Data 0.025 ( 0.027) Loss 4.8757e+00 (6.0214e+00) Acc@1 9.38 ( 3.13) Acc@5 25.39 ( 9.53) +Epoch: [0][3874/5004] Time 0.238 ( 0.242) Data 0.020 ( 0.027) Loss 4.9626e+00 (6.0211e+00) Acc@1 10.94 ( 3.13) Acc@5 24.61 ( 9.53) +Epoch: [0][3875/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 5.0051e+00 (6.0209e+00) Acc@1 7.81 ( 3.14) Acc@5 22.27 ( 9.53) +Epoch: [0][3876/5004] Time 0.247 ( 0.242) Data 0.027 ( 0.027) Loss 4.9849e+00 (6.0206e+00) Acc@1 7.81 ( 3.14) Acc@5 21.48 ( 9.54) +Epoch: [0][3877/5004] Time 0.241 ( 0.242) Data 0.023 ( 0.027) Loss 5.0060e+00 (6.0204e+00) Acc@1 7.81 ( 3.14) Acc@5 24.22 ( 9.54) +Epoch: [0][3878/5004] Time 0.242 ( 0.242) Data 0.025 ( 0.027) Loss 5.0898e+00 (6.0201e+00) Acc@1 7.03 ( 3.14) Acc@5 19.53 ( 9.54) +Epoch: [0][3879/5004] Time 0.244 ( 0.242) Data 0.025 ( 0.027) Loss 4.9435e+00 (6.0198e+00) Acc@1 10.16 ( 3.14) Acc@5 22.66 ( 9.55) +Epoch: [0][3880/5004] Time 0.245 ( 0.242) Data 0.024 ( 0.027) Loss 4.8295e+00 (6.0195e+00) Acc@1 10.16 ( 3.14) Acc@5 25.78 ( 9.55) +Epoch: [0][3881/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.027) Loss 5.1745e+00 (6.0193e+00) Acc@1 4.69 ( 3.14) Acc@5 16.80 ( 9.55) +Epoch: [0][3882/5004] Time 0.248 ( 0.242) Data 0.024 ( 0.027) Loss 5.1026e+00 (6.0191e+00) Acc@1 7.42 ( 3.14) Acc@5 22.66 ( 9.56) +Epoch: [0][3883/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.9560e+00 (6.0188e+00) Acc@1 9.77 ( 3.15) Acc@5 24.22 ( 9.56) +Epoch: [0][3884/5004] Time 0.254 ( 0.242) Data 0.024 ( 0.027) Loss 4.7347e+00 (6.0185e+00) Acc@1 8.59 ( 3.15) Acc@5 28.12 ( 9.56) +Epoch: [0][3885/5004] Time 0.240 ( 0.242) Data 0.021 ( 0.027) Loss 5.1080e+00 (6.0182e+00) Acc@1 8.20 ( 3.15) Acc@5 23.05 ( 9.57) 
+Epoch: [0][3886/5004] Time 0.245 ( 0.242) Data 0.025 ( 0.027) Loss 4.9012e+00 (6.0180e+00) Acc@1 13.28 ( 3.15) Acc@5 25.78 ( 9.57) +Epoch: [0][3887/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 4.8229e+00 (6.0176e+00) Acc@1 16.80 ( 3.15) Acc@5 29.30 ( 9.58) +Epoch: [0][3888/5004] Time 0.247 ( 0.242) Data 0.025 ( 0.027) Loss 4.8969e+00 (6.0174e+00) Acc@1 9.38 ( 3.16) Acc@5 22.66 ( 9.58) +Epoch: [0][3889/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 4.9931e+00 (6.0171e+00) Acc@1 11.72 ( 3.16) Acc@5 27.73 ( 9.58) +Epoch: [0][3890/5004] Time 0.240 ( 0.242) Data 0.018 ( 0.027) Loss 4.9918e+00 (6.0168e+00) Acc@1 7.42 ( 3.16) Acc@5 21.48 ( 9.59) +Epoch: [0][3891/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.027) Loss 4.9567e+00 (6.0166e+00) Acc@1 8.98 ( 3.16) Acc@5 23.44 ( 9.59) +Epoch: [0][3892/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 4.9697e+00 (6.0163e+00) Acc@1 11.72 ( 3.16) Acc@5 24.22 ( 9.60) +Epoch: [0][3893/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.8516e+00 (6.0160e+00) Acc@1 10.55 ( 3.17) Acc@5 28.12 ( 9.60) +Epoch: [0][3894/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 4.8884e+00 (6.0157e+00) Acc@1 10.94 ( 3.17) Acc@5 22.66 ( 9.60) +Epoch: [0][3895/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.027) Loss 4.9851e+00 (6.0154e+00) Acc@1 7.03 ( 3.17) Acc@5 20.70 ( 9.61) +Epoch: [0][3896/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.027) Loss 4.8466e+00 (6.0151e+00) Acc@1 13.28 ( 3.17) Acc@5 27.73 ( 9.61) +Epoch: [0][3897/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.9025e+00 (6.0148e+00) Acc@1 8.20 ( 3.17) Acc@5 25.00 ( 9.61) +Epoch: [0][3898/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.027) Loss 5.0126e+00 (6.0146e+00) Acc@1 9.77 ( 3.17) Acc@5 25.00 ( 9.62) +Epoch: [0][3899/5004] Time 0.248 ( 0.242) Data 0.020 ( 0.027) Loss 4.8736e+00 (6.0143e+00) Acc@1 8.20 ( 3.17) Acc@5 26.56 ( 9.62) +Epoch: [0][3900/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 4.8762e+00 (6.0140e+00) Acc@1 9.38 ( 3.18) Acc@5 26.95 ( 9.63) +Epoch: [0][3901/5004] Time 
0.244 ( 0.242) Data 0.022 ( 0.027) Loss 5.1986e+00 (6.0138e+00) Acc@1 8.59 ( 3.18) Acc@5 19.53 ( 9.63) +Epoch: [0][3902/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 4.8916e+00 (6.0135e+00) Acc@1 10.94 ( 3.18) Acc@5 25.78 ( 9.63) +Epoch: [0][3903/5004] Time 0.250 ( 0.242) Data 0.023 ( 0.027) Loss 5.0010e+00 (6.0133e+00) Acc@1 6.25 ( 3.18) Acc@5 23.83 ( 9.64) +Epoch: [0][3904/5004] Time 0.246 ( 0.242) Data 0.017 ( 0.027) Loss 4.8815e+00 (6.0130e+00) Acc@1 11.72 ( 3.18) Acc@5 23.44 ( 9.64) +Epoch: [0][3905/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 4.9159e+00 (6.0127e+00) Acc@1 6.64 ( 3.18) Acc@5 21.48 ( 9.64) +Epoch: [0][3906/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 4.6775e+00 (6.0123e+00) Acc@1 12.11 ( 3.19) Acc@5 29.69 ( 9.65) +Epoch: [0][3907/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.8360e+00 (6.0120e+00) Acc@1 11.72 ( 3.19) Acc@5 29.30 ( 9.65) +Epoch: [0][3908/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.027) Loss 5.0660e+00 (6.0118e+00) Acc@1 8.98 ( 3.19) Acc@5 25.39 ( 9.66) +Epoch: [0][3909/5004] Time 0.247 ( 0.242) Data 0.022 ( 0.027) Loss 5.0212e+00 (6.0115e+00) Acc@1 11.72 ( 3.19) Acc@5 23.83 ( 9.66) +Epoch: [0][3910/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 4.9139e+00 (6.0113e+00) Acc@1 8.59 ( 3.19) Acc@5 25.39 ( 9.67) +Epoch: [0][3911/5004] Time 0.250 ( 0.242) Data 0.022 ( 0.027) Loss 4.8917e+00 (6.0110e+00) Acc@1 12.50 ( 3.20) Acc@5 23.83 ( 9.67) +Epoch: [0][3912/5004] Time 0.240 ( 0.242) Data 0.018 ( 0.027) Loss 4.8418e+00 (6.0107e+00) Acc@1 9.77 ( 3.20) Acc@5 25.39 ( 9.67) +Epoch: [0][3913/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.8042e+00 (6.0104e+00) Acc@1 7.81 ( 3.20) Acc@5 22.66 ( 9.68) +Epoch: [0][3914/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.027) Loss 4.7540e+00 (6.0100e+00) Acc@1 12.11 ( 3.20) Acc@5 26.17 ( 9.68) +Epoch: [0][3915/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 4.7650e+00 (6.0097e+00) Acc@1 9.77 ( 3.20) Acc@5 30.47 ( 9.69) +Epoch: [0][3916/5004] Time 0.246 ( 0.242) Data 0.022 ( 
0.027) Loss 5.1054e+00 (6.0095e+00) Acc@1 8.20 ( 3.20) Acc@5 21.09 ( 9.69) +Epoch: [0][3917/5004] Time 0.244 ( 0.242) Data 0.020 ( 0.027) Loss 4.8123e+00 (6.0092e+00) Acc@1 9.77 ( 3.21) Acc@5 26.56 ( 9.69) +Epoch: [0][3918/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.9078e+00 (6.0089e+00) Acc@1 8.98 ( 3.21) Acc@5 24.61 ( 9.70) +Epoch: [0][3919/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.027) Loss 5.2231e+00 (6.0087e+00) Acc@1 7.03 ( 3.21) Acc@5 21.09 ( 9.70) +Epoch: [0][3920/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 5.0292e+00 (6.0085e+00) Acc@1 8.20 ( 3.21) Acc@5 23.83 ( 9.70) +Epoch: [0][3921/5004] Time 0.241 ( 0.242) Data 0.018 ( 0.027) Loss 4.9288e+00 (6.0082e+00) Acc@1 8.59 ( 3.21) Acc@5 25.39 ( 9.71) +Epoch: [0][3922/5004] Time 0.251 ( 0.242) Data 0.021 ( 0.027) Loss 4.9203e+00 (6.0079e+00) Acc@1 7.42 ( 3.21) Acc@5 25.39 ( 9.71) +Epoch: [0][3923/5004] Time 0.244 ( 0.242) Data 0.021 ( 0.027) Loss 5.0606e+00 (6.0077e+00) Acc@1 6.25 ( 3.21) Acc@5 23.05 ( 9.72) +Epoch: [0][3924/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.8595e+00 (6.0074e+00) Acc@1 10.94 ( 3.21) Acc@5 28.12 ( 9.72) +Epoch: [0][3925/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 5.0775e+00 (6.0071e+00) Acc@1 5.86 ( 3.21) Acc@5 23.05 ( 9.72) +Epoch: [0][3926/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 5.0497e+00 (6.0069e+00) Acc@1 9.77 ( 3.22) Acc@5 24.61 ( 9.73) +Epoch: [0][3927/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.027) Loss 4.8232e+00 (6.0066e+00) Acc@1 7.42 ( 3.22) Acc@5 25.78 ( 9.73) +Epoch: [0][3928/5004] Time 0.240 ( 0.242) Data 0.020 ( 0.027) Loss 4.9110e+00 (6.0063e+00) Acc@1 12.11 ( 3.22) Acc@5 32.81 ( 9.74) +Epoch: [0][3929/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.027) Loss 5.0393e+00 (6.0061e+00) Acc@1 10.55 ( 3.22) Acc@5 25.00 ( 9.74) +Epoch: [0][3930/5004] Time 0.249 ( 0.242) Data 0.023 ( 0.027) Loss 4.8549e+00 (6.0058e+00) Acc@1 9.38 ( 3.22) Acc@5 26.56 ( 9.75) +Epoch: [0][3931/5004] Time 0.242 ( 0.242) Data 0.019 ( 0.027) Loss 5.0224e+00 
(6.0055e+00) Acc@1 9.38 ( 3.22) Acc@5 23.44 ( 9.75) +Epoch: [0][3932/5004] Time 0.240 ( 0.242) Data 0.020 ( 0.027) Loss 4.8077e+00 (6.0052e+00) Acc@1 10.94 ( 3.23) Acc@5 25.00 ( 9.75) +Epoch: [0][3933/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.027) Loss 5.1577e+00 (6.0050e+00) Acc@1 5.47 ( 3.23) Acc@5 20.70 ( 9.76) +Epoch: [0][3934/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 5.1156e+00 (6.0048e+00) Acc@1 9.77 ( 3.23) Acc@5 23.83 ( 9.76) +Epoch: [0][3935/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.027) Loss 4.9132e+00 (6.0045e+00) Acc@1 9.38 ( 3.23) Acc@5 25.39 ( 9.76) +Epoch: [0][3936/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 5.0345e+00 (6.0043e+00) Acc@1 6.25 ( 3.23) Acc@5 22.27 ( 9.77) +Epoch: [0][3937/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 5.0873e+00 (6.0040e+00) Acc@1 6.25 ( 3.23) Acc@5 24.22 ( 9.77) +Epoch: [0][3938/5004] Time 0.248 ( 0.242) Data 0.023 ( 0.027) Loss 4.8087e+00 (6.0037e+00) Acc@1 10.16 ( 3.23) Acc@5 28.12 ( 9.77) +Epoch: [0][3939/5004] Time 0.239 ( 0.242) Data 0.021 ( 0.027) Loss 4.9255e+00 (6.0034e+00) Acc@1 10.55 ( 3.24) Acc@5 25.39 ( 9.78) +Epoch: [0][3940/5004] Time 0.245 ( 0.242) Data 0.024 ( 0.027) Loss 4.9256e+00 (6.0032e+00) Acc@1 7.42 ( 3.24) Acc@5 22.66 ( 9.78) +Epoch: [0][3941/5004] Time 0.238 ( 0.242) Data 0.021 ( 0.027) Loss 5.0400e+00 (6.0029e+00) Acc@1 7.42 ( 3.24) Acc@5 22.27 ( 9.79) +Epoch: [0][3942/5004] Time 0.242 ( 0.242) Data 0.024 ( 0.027) Loss 4.9462e+00 (6.0027e+00) Acc@1 9.77 ( 3.24) Acc@5 25.00 ( 9.79) +Epoch: [0][3943/5004] Time 0.249 ( 0.242) Data 0.024 ( 0.027) Loss 4.9314e+00 (6.0024e+00) Acc@1 10.16 ( 3.24) Acc@5 24.22 ( 9.79) +Epoch: [0][3944/5004] Time 0.239 ( 0.242) Data 0.019 ( 0.027) Loss 4.8956e+00 (6.0021e+00) Acc@1 8.59 ( 3.24) Acc@5 27.73 ( 9.80) +Epoch: [0][3945/5004] Time 0.249 ( 0.242) Data 0.020 ( 0.027) Loss 5.0175e+00 (6.0019e+00) Acc@1 7.03 ( 3.24) Acc@5 24.61 ( 9.80) +Epoch: [0][3946/5004] Time 0.237 ( 0.242) Data 0.015 ( 0.027) Loss 4.8819e+00 (6.0016e+00) Acc@1 10.94 ( 
3.25) Acc@5 28.12 ( 9.81) +Epoch: [0][3947/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 5.0446e+00 (6.0013e+00) Acc@1 7.03 ( 3.25) Acc@5 23.05 ( 9.81) +Epoch: [0][3948/5004] Time 0.247 ( 0.242) Data 0.023 ( 0.027) Loss 5.0438e+00 (6.0011e+00) Acc@1 7.42 ( 3.25) Acc@5 20.31 ( 9.81) +Epoch: [0][3949/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 4.8729e+00 (6.0008e+00) Acc@1 10.16 ( 3.25) Acc@5 24.61 ( 9.82) +Epoch: [0][3950/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 4.9314e+00 (6.0005e+00) Acc@1 10.16 ( 3.25) Acc@5 23.83 ( 9.82) +Epoch: [0][3951/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 4.9856e+00 (6.0003e+00) Acc@1 7.42 ( 3.25) Acc@5 25.00 ( 9.82) +Epoch: [0][3952/5004] Time 0.248 ( 0.242) Data 0.023 ( 0.027) Loss 4.9297e+00 (6.0000e+00) Acc@1 9.77 ( 3.25) Acc@5 25.78 ( 9.83) +Epoch: [0][3953/5004] Time 0.249 ( 0.242) Data 0.023 ( 0.027) Loss 4.8876e+00 (5.9997e+00) Acc@1 11.33 ( 3.26) Acc@5 27.73 ( 9.83) +Epoch: [0][3954/5004] Time 0.239 ( 0.242) Data 0.018 ( 0.027) Loss 4.6881e+00 (5.9994e+00) Acc@1 11.72 ( 3.26) Acc@5 27.34 ( 9.84) +Epoch: [0][3955/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 5.0620e+00 (5.9992e+00) Acc@1 7.81 ( 3.26) Acc@5 19.14 ( 9.84) +Epoch: [0][3956/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 4.7945e+00 (5.9989e+00) Acc@1 10.94 ( 3.26) Acc@5 28.91 ( 9.84) +Epoch: [0][3957/5004] Time 0.256 ( 0.242) Data 0.022 ( 0.027) Loss 5.0100e+00 (5.9986e+00) Acc@1 8.59 ( 3.26) Acc@5 23.44 ( 9.85) +Epoch: [0][3958/5004] Time 0.245 ( 0.242) Data 0.017 ( 0.027) Loss 4.9018e+00 (5.9983e+00) Acc@1 11.33 ( 3.26) Acc@5 24.61 ( 9.85) +Epoch: [0][3959/5004] Time 0.249 ( 0.242) Data 0.021 ( 0.027) Loss 4.9256e+00 (5.9981e+00) Acc@1 9.77 ( 3.27) Acc@5 24.22 ( 9.85) +Epoch: [0][3960/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 4.9002e+00 (5.9978e+00) Acc@1 10.55 ( 3.27) Acc@5 25.78 ( 9.86) +Epoch: [0][3961/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 4.8614e+00 (5.9975e+00) Acc@1 10.16 ( 3.27) Acc@5 26.17 ( 9.86) 
+Epoch: [0][3962/5004] Time 0.248 ( 0.242) Data 0.024 ( 0.027) Loss 5.1905e+00 (5.9973e+00) Acc@1 7.03 ( 3.27) Acc@5 18.75 ( 9.86) +Epoch: [0][3963/5004] Time 0.241 ( 0.242) Data 0.022 ( 0.027) Loss 4.8404e+00 (5.9970e+00) Acc@1 9.38 ( 3.27) Acc@5 25.78 ( 9.87) +Epoch: [0][3964/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.027) Loss 4.8837e+00 (5.9967e+00) Acc@1 8.20 ( 3.27) Acc@5 22.27 ( 9.87) +Epoch: [0][3965/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.027) Loss 4.6434e+00 (5.9964e+00) Acc@1 11.72 ( 3.28) Acc@5 29.30 ( 9.88) +Epoch: [0][3966/5004] Time 0.250 ( 0.242) Data 0.023 ( 0.027) Loss 5.0951e+00 (5.9961e+00) Acc@1 5.86 ( 3.28) Acc@5 21.09 ( 9.88) +Epoch: [0][3967/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 4.9372e+00 (5.9959e+00) Acc@1 7.42 ( 3.28) Acc@5 23.44 ( 9.88) +Epoch: [0][3968/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 5.1052e+00 (5.9957e+00) Acc@1 7.42 ( 3.28) Acc@5 21.48 ( 9.89) +Epoch: [0][3969/5004] Time 0.246 ( 0.242) Data 0.024 ( 0.027) Loss 4.9433e+00 (5.9954e+00) Acc@1 11.33 ( 3.28) Acc@5 28.12 ( 9.89) +Epoch: [0][3970/5004] Time 0.250 ( 0.242) Data 0.023 ( 0.027) Loss 4.9717e+00 (5.9951e+00) Acc@1 7.81 ( 3.28) Acc@5 25.00 ( 9.89) +Epoch: [0][3971/5004] Time 0.249 ( 0.242) Data 0.021 ( 0.027) Loss 5.0378e+00 (5.9949e+00) Acc@1 9.38 ( 3.28) Acc@5 25.00 ( 9.90) +Epoch: [0][3972/5004] Time 0.251 ( 0.242) Data 0.020 ( 0.027) Loss 4.7210e+00 (5.9946e+00) Acc@1 11.33 ( 3.29) Acc@5 25.39 ( 9.90) +Epoch: [0][3973/5004] Time 0.247 ( 0.242) Data 0.019 ( 0.027) Loss 4.9659e+00 (5.9943e+00) Acc@1 11.72 ( 3.29) Acc@5 26.95 ( 9.91) +Epoch: [0][3974/5004] Time 0.251 ( 0.242) Data 0.021 ( 0.027) Loss 4.8542e+00 (5.9940e+00) Acc@1 9.77 ( 3.29) Acc@5 28.12 ( 9.91) +Epoch: [0][3975/5004] Time 0.244 ( 0.242) Data 0.021 ( 0.027) Loss 4.8578e+00 (5.9937e+00) Acc@1 13.28 ( 3.29) Acc@5 27.34 ( 9.91) +Epoch: [0][3976/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.7709e+00 (5.9934e+00) Acc@1 12.11 ( 3.29) Acc@5 26.56 ( 9.92) +Epoch: [0][3977/5004] Time 
0.231 ( 0.242) Data 0.023 ( 0.027) Loss 4.9954e+00 (5.9932e+00) Acc@1 10.94 ( 3.30) Acc@5 26.56 ( 9.92) +Epoch: [0][3978/5004] Time 0.224 ( 0.242) Data 0.036 ( 0.027) Loss 4.8573e+00 (5.9929e+00) Acc@1 8.98 ( 3.30) Acc@5 23.44 ( 9.93) +Epoch: [0][3979/5004] Time 0.238 ( 0.242) Data 0.049 ( 0.027) Loss 5.0022e+00 (5.9926e+00) Acc@1 8.20 ( 3.30) Acc@5 23.83 ( 9.93) +Epoch: [0][3980/5004] Time 0.239 ( 0.242) Data 0.049 ( 0.027) Loss 4.9776e+00 (5.9924e+00) Acc@1 10.94 ( 3.30) Acc@5 22.66 ( 9.93) +Epoch: [0][3981/5004] Time 0.226 ( 0.242) Data 0.049 ( 0.027) Loss 4.9320e+00 (5.9921e+00) Acc@1 10.55 ( 3.30) Acc@5 25.00 ( 9.94) +Epoch: [0][3982/5004] Time 0.240 ( 0.242) Data 0.060 ( 0.027) Loss 4.7609e+00 (5.9918e+00) Acc@1 14.84 ( 3.30) Acc@5 29.30 ( 9.94) +Epoch: [0][3983/5004] Time 0.235 ( 0.242) Data 0.059 ( 0.027) Loss 5.0484e+00 (5.9916e+00) Acc@1 10.16 ( 3.31) Acc@5 27.34 ( 9.95) +Epoch: [0][3984/5004] Time 0.232 ( 0.242) Data 0.059 ( 0.027) Loss 4.9427e+00 (5.9913e+00) Acc@1 8.59 ( 3.31) Acc@5 27.34 ( 9.95) +Epoch: [0][3985/5004] Time 0.241 ( 0.242) Data 0.063 ( 0.027) Loss 5.0486e+00 (5.9911e+00) Acc@1 9.77 ( 3.31) Acc@5 21.88 ( 9.95) +Epoch: [0][3986/5004] Time 0.233 ( 0.242) Data 0.059 ( 0.027) Loss 5.0304e+00 (5.9908e+00) Acc@1 6.64 ( 3.31) Acc@5 22.66 ( 9.96) +Epoch: [0][3987/5004] Time 0.238 ( 0.242) Data 0.063 ( 0.027) Loss 4.8379e+00 (5.9905e+00) Acc@1 11.72 ( 3.31) Acc@5 25.78 ( 9.96) +Epoch: [0][3988/5004] Time 0.247 ( 0.242) Data 0.062 ( 0.027) Loss 4.7186e+00 (5.9902e+00) Acc@1 12.11 ( 3.31) Acc@5 29.30 ( 9.97) +Epoch: [0][3989/5004] Time 0.242 ( 0.242) Data 0.059 ( 0.027) Loss 4.9491e+00 (5.9900e+00) Acc@1 7.42 ( 3.32) Acc@5 25.00 ( 9.97) +Epoch: [0][3990/5004] Time 0.237 ( 0.242) Data 0.055 ( 0.027) Loss 5.0571e+00 (5.9897e+00) Acc@1 7.81 ( 3.32) Acc@5 24.61 ( 9.97) +Epoch: [0][3991/5004] Time 0.276 ( 0.242) Data 0.059 ( 0.027) Loss 4.8663e+00 (5.9895e+00) Acc@1 8.98 ( 3.32) Acc@5 26.17 ( 9.98) +Epoch: [0][3992/5004] Time 0.246 ( 0.242) Data 0.028 ( 
0.027) Loss 4.9015e+00 (5.9892e+00) Acc@1 8.59 ( 3.32) Acc@5 24.61 ( 9.98) +Epoch: [0][3993/5004] Time 0.249 ( 0.242) Data 0.028 ( 0.027) Loss 4.8330e+00 (5.9889e+00) Acc@1 11.72 ( 3.32) Acc@5 26.17 ( 9.98) +Epoch: [0][3994/5004] Time 0.252 ( 0.242) Data 0.027 ( 0.027) Loss 4.7622e+00 (5.9886e+00) Acc@1 10.94 ( 3.32) Acc@5 27.34 ( 9.99) +Epoch: [0][3995/5004] Time 0.247 ( 0.242) Data 0.026 ( 0.027) Loss 5.1000e+00 (5.9884e+00) Acc@1 7.42 ( 3.32) Acc@5 22.27 ( 9.99) +Epoch: [0][3996/5004] Time 0.246 ( 0.242) Data 0.027 ( 0.027) Loss 5.2169e+00 (5.9882e+00) Acc@1 7.81 ( 3.33) Acc@5 19.92 ( 9.99) +Epoch: [0][3997/5004] Time 0.251 ( 0.242) Data 0.029 ( 0.027) Loss 4.7650e+00 (5.9879e+00) Acc@1 12.89 ( 3.33) Acc@5 30.47 ( 10.00) +Epoch: [0][3998/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.027) Loss 5.1141e+00 (5.9876e+00) Acc@1 6.25 ( 3.33) Acc@5 21.09 ( 10.00) +Epoch: [0][3999/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.027) Loss 4.8038e+00 (5.9873e+00) Acc@1 10.55 ( 3.33) Acc@5 28.52 ( 10.01) +Epoch: [0][4000/5004] Time 0.251 ( 0.242) Data 0.029 ( 0.027) Loss 4.8809e+00 (5.9871e+00) Acc@1 9.38 ( 3.33) Acc@5 22.66 ( 10.01) +Epoch: [0][4001/5004] Time 0.246 ( 0.242) Data 0.027 ( 0.027) Loss 4.7161e+00 (5.9868e+00) Acc@1 10.16 ( 3.33) Acc@5 24.61 ( 10.01) +Epoch: [0][4002/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.9274e+00 (5.9865e+00) Acc@1 10.16 ( 3.34) Acc@5 24.22 ( 10.02) +Epoch: [0][4003/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.9185e+00 (5.9862e+00) Acc@1 10.55 ( 3.34) Acc@5 28.91 ( 10.02) +Epoch: [0][4004/5004] Time 0.250 ( 0.242) Data 0.027 ( 0.027) Loss 4.7182e+00 (5.9859e+00) Acc@1 10.16 ( 3.34) Acc@5 25.78 ( 10.03) +Epoch: [0][4005/5004] Time 0.227 ( 0.242) Data 0.020 ( 0.027) Loss 4.8768e+00 (5.9856e+00) Acc@1 8.59 ( 3.34) Acc@5 23.05 ( 10.03) +Epoch: [0][4006/5004] Time 0.238 ( 0.242) Data 0.031 ( 0.027) Loss 4.7774e+00 (5.9853e+00) Acc@1 11.72 ( 3.34) Acc@5 26.95 ( 10.03) +Epoch: [0][4007/5004] Time 0.239 ( 0.242) Data 0.032 ( 0.027) Loss 
5.0049e+00 (5.9851e+00) Acc@1 10.16 ( 3.34) Acc@5 26.56 ( 10.04) +Epoch: [0][4008/5004] Time 0.237 ( 0.242) Data 0.031 ( 0.027) Loss 4.8563e+00 (5.9848e+00) Acc@1 11.72 ( 3.35) Acc@5 26.17 ( 10.04) +Epoch: [0][4009/5004] Time 0.240 ( 0.242) Data 0.032 ( 0.027) Loss 4.6943e+00 (5.9845e+00) Acc@1 11.33 ( 3.35) Acc@5 30.47 ( 10.05) +Epoch: [0][4010/5004] Time 0.236 ( 0.242) Data 0.031 ( 0.027) Loss 4.9221e+00 (5.9842e+00) Acc@1 11.72 ( 3.35) Acc@5 26.17 ( 10.05) +Epoch: [0][4011/5004] Time 0.241 ( 0.242) Data 0.033 ( 0.027) Loss 4.9081e+00 (5.9839e+00) Acc@1 10.94 ( 3.35) Acc@5 25.39 ( 10.05) +Epoch: [0][4012/5004] Time 0.238 ( 0.242) Data 0.031 ( 0.027) Loss 4.9415e+00 (5.9837e+00) Acc@1 7.81 ( 3.35) Acc@5 23.44 ( 10.06) +Epoch: [0][4013/5004] Time 0.240 ( 0.242) Data 0.031 ( 0.027) Loss 5.1030e+00 (5.9835e+00) Acc@1 6.64 ( 3.35) Acc@5 22.66 ( 10.06) +Epoch: [0][4014/5004] Time 0.239 ( 0.242) Data 0.032 ( 0.027) Loss 4.7834e+00 (5.9832e+00) Acc@1 10.55 ( 3.36) Acc@5 28.52 ( 10.07) +Epoch: [0][4015/5004] Time 0.240 ( 0.242) Data 0.032 ( 0.027) Loss 4.8249e+00 (5.9829e+00) Acc@1 10.94 ( 3.36) Acc@5 27.73 ( 10.07) +Epoch: [0][4016/5004] Time 0.240 ( 0.242) Data 0.032 ( 0.027) Loss 4.8540e+00 (5.9826e+00) Acc@1 10.55 ( 3.36) Acc@5 25.00 ( 10.07) +Epoch: [0][4017/5004] Time 0.240 ( 0.242) Data 0.031 ( 0.027) Loss 4.9058e+00 (5.9823e+00) Acc@1 9.77 ( 3.36) Acc@5 23.05 ( 10.08) +Epoch: [0][4018/5004] Time 0.240 ( 0.242) Data 0.029 ( 0.027) Loss 5.0029e+00 (5.9821e+00) Acc@1 8.59 ( 3.36) Acc@5 24.61 ( 10.08) +Epoch: [0][4019/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.027) Loss 4.8639e+00 (5.9818e+00) Acc@1 7.42 ( 3.36) Acc@5 25.39 ( 10.08) +Epoch: [0][4020/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.8660e+00 (5.9815e+00) Acc@1 11.72 ( 3.37) Acc@5 29.69 ( 10.09) +Epoch: [0][4021/5004] Time 0.238 ( 0.242) Data 0.028 ( 0.027) Loss 4.8267e+00 (5.9812e+00) Acc@1 8.98 ( 3.37) Acc@5 25.00 ( 10.09) +Epoch: [0][4022/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.6555e+00 
(5.9809e+00) Acc@1 15.23 ( 3.37) Acc@5 31.25 ( 10.10) +Epoch: [0][4023/5004] Time 0.237 ( 0.242) Data 0.027 ( 0.027) Loss 4.8995e+00 (5.9806e+00) Acc@1 10.16 ( 3.37) Acc@5 24.61 ( 10.10) +Epoch: [0][4024/5004] Time 0.241 ( 0.242) Data 0.029 ( 0.027) Loss 5.1105e+00 (5.9804e+00) Acc@1 8.98 ( 3.37) Acc@5 25.00 ( 10.11) +Epoch: [0][4025/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.9628e+00 (5.9802e+00) Acc@1 7.03 ( 3.37) Acc@5 21.09 ( 10.11) +Epoch: [0][4026/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 5.0365e+00 (5.9799e+00) Acc@1 8.98 ( 3.38) Acc@5 24.22 ( 10.11) +Epoch: [0][4027/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.9751e+00 (5.9797e+00) Acc@1 9.38 ( 3.38) Acc@5 26.56 ( 10.12) +Epoch: [0][4028/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.9376e+00 (5.9794e+00) Acc@1 10.16 ( 3.38) Acc@5 29.30 ( 10.12) +Epoch: [0][4029/5004] Time 0.245 ( 0.242) Data 0.027 ( 0.027) Loss 4.7531e+00 (5.9791e+00) Acc@1 8.59 ( 3.38) Acc@5 26.56 ( 10.12) +Epoch: [0][4030/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 4.7970e+00 (5.9788e+00) Acc@1 10.55 ( 3.38) Acc@5 27.73 ( 10.13) +Epoch: [0][4031/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.9426e+00 (5.9786e+00) Acc@1 7.03 ( 3.38) Acc@5 22.27 ( 10.13) +Epoch: [0][4032/5004] Time 0.243 ( 0.242) Data 0.028 ( 0.027) Loss 4.9024e+00 (5.9783e+00) Acc@1 13.28 ( 3.38) Acc@5 26.56 ( 10.14) +Epoch: [0][4033/5004] Time 0.238 ( 0.242) Data 0.024 ( 0.027) Loss 4.8525e+00 (5.9780e+00) Acc@1 9.77 ( 3.39) Acc@5 23.83 ( 10.14) +Epoch: [0][4034/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.8035e+00 (5.9777e+00) Acc@1 11.72 ( 3.39) Acc@5 30.08 ( 10.14) +Epoch: [0][4035/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.7688e+00 (5.9774e+00) Acc@1 13.28 ( 3.39) Acc@5 28.52 ( 10.15) +Epoch: [0][4036/5004] Time 0.238 ( 0.242) Data 0.028 ( 0.027) Loss 4.7535e+00 (5.9771e+00) Acc@1 10.16 ( 3.39) Acc@5 28.12 ( 10.15) +Epoch: [0][4037/5004] Time 0.240 ( 0.242) Data 0.029 ( 0.027) Loss 4.9894e+00 
(5.9769e+00) Acc@1 7.03 ( 3.39) Acc@5 25.00 ( 10.16) +Epoch: [0][4038/5004] Time 0.238 ( 0.242) Data 0.028 ( 0.027) Loss 4.9087e+00 (5.9766e+00) Acc@1 11.72 ( 3.40) Acc@5 24.22 ( 10.16) +Epoch: [0][4039/5004] Time 0.237 ( 0.242) Data 0.028 ( 0.027) Loss 5.1030e+00 (5.9764e+00) Acc@1 8.98 ( 3.40) Acc@5 20.70 ( 10.16) +Epoch: [0][4040/5004] Time 0.240 ( 0.242) Data 0.029 ( 0.027) Loss 4.8476e+00 (5.9761e+00) Acc@1 10.16 ( 3.40) Acc@5 25.39 ( 10.17) +Epoch: [0][4041/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.7660e+00 (5.9758e+00) Acc@1 10.16 ( 3.40) Acc@5 25.00 ( 10.17) +Epoch: [0][4042/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.7712e+00 (5.9755e+00) Acc@1 9.38 ( 3.40) Acc@5 27.73 ( 10.18) +Epoch: [0][4043/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 5.0316e+00 (5.9753e+00) Acc@1 8.98 ( 3.40) Acc@5 22.27 ( 10.18) +Epoch: [0][4044/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.9323e+00 (5.9750e+00) Acc@1 8.20 ( 3.40) Acc@5 27.73 ( 10.18) +Epoch: [0][4045/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 4.8536e+00 (5.9748e+00) Acc@1 10.94 ( 3.41) Acc@5 29.30 ( 10.19) +Epoch: [0][4046/5004] Time 0.242 ( 0.242) Data 0.026 ( 0.027) Loss 4.8955e+00 (5.9745e+00) Acc@1 11.72 ( 3.41) Acc@5 26.56 ( 10.19) +Epoch: [0][4047/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 4.8502e+00 (5.9742e+00) Acc@1 5.86 ( 3.41) Acc@5 25.00 ( 10.20) +Epoch: [0][4048/5004] Time 0.237 ( 0.242) Data 0.024 ( 0.027) Loss 4.7833e+00 (5.9739e+00) Acc@1 12.11 ( 3.41) Acc@5 31.64 ( 10.20) +Epoch: [0][4049/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.8528e+00 (5.9737e+00) Acc@1 10.55 ( 3.41) Acc@5 28.12 ( 10.20) +Epoch: [0][4050/5004] Time 0.247 ( 0.242) Data 0.027 ( 0.027) Loss 4.7581e+00 (5.9734e+00) Acc@1 13.28 ( 3.42) Acc@5 30.08 ( 10.21) +Epoch: [0][4051/5004] Time 0.257 ( 0.242) Data 0.025 ( 0.027) Loss 4.7238e+00 (5.9730e+00) Acc@1 13.67 ( 3.42) Acc@5 28.52 ( 10.21) +Epoch: [0][4052/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 4.8571e+00 
(5.9728e+00) Acc@1 10.16 ( 3.42) Acc@5 26.95 ( 10.22)
+Epoch: [0][4053/5004] Time 0.239 ( 0.242) Data 0.025 ( 0.027) Loss 4.8010e+00 (5.9725e+00) Acc@1 10.16 ( 3.42) Acc@5 27.34 ( 10.22)
+Epoch: [0][4054/5004] Time 0.240 ( 0.242) Data 0.026 ( 0.027) Loss 5.0922e+00 (5.9723e+00) Acc@1 8.98 ( 3.42) Acc@5 22.66 ( 10.23)
+Epoch: [0][4055/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.9844e+00 (5.9720e+00) Acc@1 8.20 ( 3.42) Acc@5 23.44 ( 10.23)
+Epoch: [0][4056/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 5.0105e+00 (5.9718e+00) Acc@1 9.77 ( 3.43) Acc@5 23.44 ( 10.23)
+Epoch: [0][4057/5004] Time 0.241 ( 0.242) Data 0.029 ( 0.027) Loss 4.8529e+00 (5.9715e+00) Acc@1 12.89 ( 3.43) Acc@5 30.47 ( 10.24)
+Epoch: [0][4058/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 5.1632e+00 (5.9713e+00) Acc@1 8.59 ( 3.43) Acc@5 22.66 ( 10.24)
+Epoch: [0][4059/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 5.0048e+00 (5.9711e+00) Acc@1 11.33 ( 3.43) Acc@5 23.83 ( 10.24)
+Epoch: [0][4060/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 4.9288e+00 (5.9708e+00) Acc@1 13.28 ( 3.43) Acc@5 25.00 ( 10.25)
+Epoch: [0][4061/5004] Time 0.239 ( 0.242) Data 0.026 ( 0.027) Loss 4.9683e+00 (5.9706e+00) Acc@1 9.77 ( 3.43) Acc@5 22.66 ( 10.25)
+Epoch: [0][4062/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 5.0147e+00 (5.9703e+00) Acc@1 8.98 ( 3.44) Acc@5 24.61 ( 10.25)
+Epoch: [0][4063/5004] Time 0.238 ( 0.242) Data 0.026 ( 0.027) Loss 4.8247e+00 (5.9701e+00) Acc@1 8.20 ( 3.44) Acc@5 28.12 ( 10.26)
+Epoch: [0][4064/5004] Time 0.241 ( 0.242) Data 0.028 ( 0.027) Loss 4.6903e+00 (5.9697e+00) Acc@1 9.38 ( 3.44) Acc@5 26.17 ( 10.26)
+Epoch: [0][4065/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 5.0856e+00 (5.9695e+00) Acc@1 7.03 ( 3.44) Acc@5 23.83 ( 10.27)
+Epoch: [0][4066/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.8859e+00 (5.9693e+00) Acc@1 11.33 ( 3.44) Acc@5 27.73 ( 10.27)
+Epoch: [0][4067/5004] Time 0.244 ( 0.242) Data 0.028 ( 0.027) Loss 4.9378e+00 (5.9690e+00) Acc@1 9.77 ( 3.44) Acc@5 23.05 ( 10.27)
+Epoch: [0][4068/5004] Time 0.235 ( 0.242) Data 0.023 ( 0.027) Loss 4.7873e+00 (5.9687e+00) Acc@1 11.33 ( 3.45) Acc@5 28.91 ( 10.28)
+Epoch: [0][4069/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 5.0487e+00 (5.9685e+00) Acc@1 7.03 ( 3.45) Acc@5 22.66 ( 10.28)
+Epoch: [0][4070/5004] Time 0.239 ( 0.242) Data 0.026 ( 0.027) Loss 4.6776e+00 (5.9682e+00) Acc@1 14.06 ( 3.45) Acc@5 30.47 ( 10.29)
+Epoch: [0][4071/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 4.8475e+00 (5.9679e+00) Acc@1 10.16 ( 3.45) Acc@5 25.39 ( 10.29)
+Epoch: [0][4072/5004] Time 0.236 ( 0.242) Data 0.025 ( 0.027) Loss 4.8526e+00 (5.9676e+00) Acc@1 8.59 ( 3.45) Acc@5 25.39 ( 10.29)
+Epoch: [0][4073/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.8074e+00 (5.9673e+00) Acc@1 11.33 ( 3.45) Acc@5 28.12 ( 10.30)
+Epoch: [0][4074/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.8539e+00 (5.9671e+00) Acc@1 10.16 ( 3.46) Acc@5 29.30 ( 10.30)
+Epoch: [0][4075/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.8416e+00 (5.9668e+00) Acc@1 9.77 ( 3.46) Acc@5 25.78 ( 10.31)
+Epoch: [0][4076/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 5.0144e+00 (5.9665e+00) Acc@1 7.03 ( 3.46) Acc@5 21.88 ( 10.31)
+Epoch: [0][4077/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 4.8899e+00 (5.9663e+00) Acc@1 8.59 ( 3.46) Acc@5 26.17 ( 10.31)
+Epoch: [0][4078/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 4.7135e+00 (5.9660e+00) Acc@1 12.50 ( 3.46) Acc@5 28.52 ( 10.32)
+Epoch: [0][4079/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.7365e+00 (5.9657e+00) Acc@1 10.55 ( 3.46) Acc@5 25.00 ( 10.32)
+Epoch: [0][4080/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.9676e+00 (5.9654e+00) Acc@1 10.55 ( 3.46) Acc@5 24.61 ( 10.32)
+Epoch: [0][4081/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 5.0160e+00 (5.9652e+00) Acc@1 7.42 ( 3.47) Acc@5 20.31 ( 10.33)
+Epoch: [0][4082/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 5.0294e+00 (5.9650e+00) Acc@1 8.98 ( 3.47) Acc@5 21.09 ( 10.33)
+Epoch: [0][4083/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 5.0992e+00 (5.9648e+00) Acc@1 5.86 ( 3.47) Acc@5 22.27 ( 10.33)
+Epoch: [0][4084/5004] Time 0.237 ( 0.242) Data 0.028 ( 0.027) Loss 4.9506e+00 (5.9645e+00) Acc@1 7.81 ( 3.47) Acc@5 25.00 ( 10.34)
+Epoch: [0][4085/5004] Time 0.241 ( 0.242) Data 0.029 ( 0.027) Loss 4.6098e+00 (5.9642e+00) Acc@1 10.16 ( 3.47) Acc@5 29.69 ( 10.34)
+Epoch: [0][4086/5004] Time 0.237 ( 0.242) Data 0.027 ( 0.027) Loss 4.8354e+00 (5.9639e+00) Acc@1 9.38 ( 3.47) Acc@5 24.61 ( 10.34)
+Epoch: [0][4087/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.8966e+00 (5.9636e+00) Acc@1 10.16 ( 3.47) Acc@5 26.17 ( 10.35)
+Epoch: [0][4088/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.7828e+00 (5.9634e+00) Acc@1 15.23 ( 3.48) Acc@5 25.39 ( 10.35)
+Epoch: [0][4089/5004] Time 0.244 ( 0.242) Data 0.028 ( 0.027) Loss 5.1990e+00 (5.9632e+00) Acc@1 7.03 ( 3.48) Acc@5 20.31 ( 10.35)
+Epoch: [0][4090/5004] Time 0.236 ( 0.242) Data 0.024 ( 0.027) Loss 4.9557e+00 (5.9629e+00) Acc@1 10.55 ( 3.48) Acc@5 25.39 ( 10.36)
+Epoch: [0][4091/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 4.8174e+00 (5.9626e+00) Acc@1 9.77 ( 3.48) Acc@5 26.56 ( 10.36)
+Epoch: [0][4092/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.7285e+00 (5.9623e+00) Acc@1 12.89 ( 3.48) Acc@5 27.34 ( 10.37)
+Epoch: [0][4093/5004] Time 0.237 ( 0.242) Data 0.027 ( 0.027) Loss 5.0580e+00 (5.9621e+00) Acc@1 8.59 ( 3.48) Acc@5 19.92 ( 10.37)
+Epoch: [0][4094/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 4.9251e+00 (5.9619e+00) Acc@1 9.77 ( 3.49) Acc@5 25.39 ( 10.37)
+Epoch: [0][4095/5004] Time 0.238 ( 0.242) Data 0.025 ( 0.027) Loss 4.8522e+00 (5.9616e+00) Acc@1 10.16 ( 3.49) Acc@5 26.56 ( 10.38)
+Epoch: [0][4096/5004] Time 0.244 ( 0.242) Data 0.028 ( 0.027) Loss 5.0766e+00 (5.9614e+00) Acc@1 6.25 ( 3.49) Acc@5 21.88 ( 10.38)
+Epoch: [0][4097/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.8613e+00 (5.9611e+00) Acc@1 9.77 ( 3.49) Acc@5 26.56 ( 10.38)
+Epoch: [0][4098/5004] Time 0.243 ( 0.242) Data 0.027 ( 0.027) Loss 4.8521e+00 (5.9608e+00) Acc@1 13.67 ( 3.49) Acc@5 29.30 ( 10.39)
+Epoch: [0][4099/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 4.7365e+00 (5.9605e+00) Acc@1 12.89 ( 3.49) Acc@5 33.20 ( 10.39)
+Epoch: [0][4100/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.9443e+00 (5.9603e+00) Acc@1 11.33 ( 3.50) Acc@5 26.17 ( 10.40)
+Epoch: [0][4101/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.7161e+00 (5.9600e+00) Acc@1 9.77 ( 3.50) Acc@5 28.52 ( 10.40)
+Epoch: [0][4102/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.7820e+00 (5.9597e+00) Acc@1 9.38 ( 3.50) Acc@5 26.95 ( 10.40)
+Epoch: [0][4103/5004] Time 0.241 ( 0.242) Data 0.028 ( 0.027) Loss 4.7147e+00 (5.9594e+00) Acc@1 9.77 ( 3.50) Acc@5 26.56 ( 10.41)
+Epoch: [0][4104/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.8447e+00 (5.9591e+00) Acc@1 10.94 ( 3.50) Acc@5 26.95 ( 10.41)
+Epoch: [0][4105/5004] Time 0.238 ( 0.242) Data 0.028 ( 0.027) Loss 4.7615e+00 (5.9588e+00) Acc@1 9.77 ( 3.50) Acc@5 25.39 ( 10.42)
+Epoch: [0][4106/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.6530e+00 (5.9585e+00) Acc@1 11.33 ( 3.51) Acc@5 28.91 ( 10.42)
+Epoch: [0][4107/5004] Time 0.237 ( 0.242) Data 0.028 ( 0.027) Loss 4.8877e+00 (5.9583e+00) Acc@1 10.94 ( 3.51) Acc@5 27.34 ( 10.42)
+Epoch: [0][4108/5004] Time 0.240 ( 0.242) Data 0.029 ( 0.027) Loss 4.8212e+00 (5.9580e+00) Acc@1 14.06 ( 3.51) Acc@5 26.56 ( 10.43)
+Epoch: [0][4109/5004] Time 0.239 ( 0.242) Data 0.032 ( 0.027) Loss 4.7842e+00 (5.9577e+00) Acc@1 12.11 ( 3.51) Acc@5 27.34 ( 10.43)
+Epoch: [0][4110/5004] Time 0.236 ( 0.242) Data 0.030 ( 0.027) Loss 4.8895e+00 (5.9574e+00) Acc@1 12.11 ( 3.51) Acc@5 27.34 ( 10.44)
+Epoch: [0][4111/5004] Time 0.245 ( 0.242) Data 0.033 ( 0.027) Loss 4.8026e+00 (5.9572e+00) Acc@1 7.42 ( 3.51) Acc@5 27.34 ( 10.44)
+Epoch: [0][4112/5004] Time 0.238 ( 0.242) Data 0.029 ( 0.027) Loss 5.0084e+00 (5.9569e+00) Acc@1 9.38 ( 3.52) Acc@5 23.83 ( 10.44)
+Epoch: [0][4113/5004] Time 0.237 ( 0.242) Data 0.030 ( 0.027) Loss 4.7093e+00 (5.9566e+00) Acc@1 12.89 ( 3.52) Acc@5 28.52 ( 10.45)
+Epoch: [0][4114/5004] Time 0.238 ( 0.242) Data 0.031 ( 0.027) Loss 4.8350e+00 (5.9563e+00) Acc@1 12.50 ( 3.52) Acc@5 27.34 ( 10.45)
+Epoch: [0][4115/5004] Time 0.249 ( 0.242) Data 0.031 ( 0.027) Loss 5.0481e+00 (5.9561e+00) Acc@1 7.03 ( 3.52) Acc@5 22.27 ( 10.46)
+Epoch: [0][4116/5004] Time 0.233 ( 0.242) Data 0.022 ( 0.027) Loss 4.8087e+00 (5.9558e+00) Acc@1 9.38 ( 3.52) Acc@5 26.17 ( 10.46)
+Epoch: [0][4117/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.9881e+00 (5.9556e+00) Acc@1 9.77 ( 3.52) Acc@5 24.22 ( 10.46)
+Epoch: [0][4118/5004] Time 0.233 ( 0.242) Data 0.027 ( 0.027) Loss 4.6102e+00 (5.9553e+00) Acc@1 14.84 ( 3.53) Acc@5 30.86 ( 10.47)
+Epoch: [0][4119/5004] Time 0.240 ( 0.242) Data 0.031 ( 0.027) Loss 5.1350e+00 (5.9551e+00) Acc@1 6.64 ( 3.53) Acc@5 25.00 ( 10.47)
+Epoch: [0][4120/5004] Time 0.237 ( 0.242) Data 0.030 ( 0.027) Loss 4.9811e+00 (5.9548e+00) Acc@1 6.64 ( 3.53) Acc@5 22.27 ( 10.47)
+Epoch: [0][4121/5004] Time 0.239 ( 0.242) Data 0.030 ( 0.027) Loss 4.8343e+00 (5.9546e+00) Acc@1 9.38 ( 3.53) Acc@5 25.78 ( 10.48)
+Epoch: [0][4122/5004] Time 0.236 ( 0.242) Data 0.030 ( 0.027) Loss 4.7268e+00 (5.9543e+00) Acc@1 12.11 ( 3.53) Acc@5 27.73 ( 10.48)
+Epoch: [0][4123/5004] Time 0.238 ( 0.242) Data 0.032 ( 0.027) Loss 4.8368e+00 (5.9540e+00) Acc@1 9.38 ( 3.53) Acc@5 29.30 ( 10.49)
+Epoch: [0][4124/5004] Time 0.239 ( 0.242) Data 0.031 ( 0.027) Loss 4.9270e+00 (5.9538e+00) Acc@1 10.55 ( 3.54) Acc@5 28.12 ( 10.49)
+Epoch: [0][4125/5004] Time 0.249 ( 0.242) Data 0.029 ( 0.027) Loss 4.8360e+00 (5.9535e+00) Acc@1 12.11 ( 3.54) Acc@5 26.17 ( 10.49)
+Epoch: [0][4126/5004] Time 0.242 ( 0.242) Data 0.024 ( 0.027) Loss 4.7841e+00 (5.9532e+00) Acc@1 11.72 ( 3.54) Acc@5 27.73 ( 10.50)
+Epoch: [0][4127/5004] Time 0.242 ( 0.242) Data 0.026 ( 0.027) Loss 4.7841e+00 (5.9529e+00) Acc@1 9.77 ( 3.54) Acc@5 24.22 ( 10.50)
+Epoch: [0][4128/5004] Time 0.237 ( 0.242) Data 0.024 ( 0.027) Loss 4.8784e+00 (5.9527e+00) Acc@1 10.16 ( 3.54) Acc@5 27.73 ( 10.51)
+Epoch: [0][4129/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.7650e+00 (5.9524e+00) Acc@1 10.94 ( 3.54) Acc@5 28.91 ( 10.51)
+Epoch: [0][4130/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 5.0443e+00 (5.9522e+00) Acc@1 10.94 ( 3.55) Acc@5 26.95 ( 10.52)
+Epoch: [0][4131/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 4.9057e+00 (5.9519e+00) Acc@1 15.62 ( 3.55) Acc@5 26.56 ( 10.52)
+Epoch: [0][4132/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.8060e+00 (5.9516e+00) Acc@1 11.33 ( 3.55) Acc@5 25.39 ( 10.52)
+Epoch: [0][4133/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.8533e+00 (5.9514e+00) Acc@1 14.84 ( 3.55) Acc@5 26.56 ( 10.53)
+Epoch: [0][4134/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 4.9556e+00 (5.9511e+00) Acc@1 8.98 ( 3.56) Acc@5 24.22 ( 10.53)
+Epoch: [0][4135/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.7142e+00 (5.9508e+00) Acc@1 10.94 ( 3.56) Acc@5 30.47 ( 10.53)
+Epoch: [0][4136/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.9018e+00 (5.9506e+00) Acc@1 9.38 ( 3.56) Acc@5 28.91 ( 10.54)
+Epoch: [0][4137/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.7980e+00 (5.9503e+00) Acc@1 8.98 ( 3.56) Acc@5 24.61 ( 10.54)
+Epoch: [0][4138/5004] Time 0.241 ( 0.242) Data 0.028 ( 0.027) Loss 4.8382e+00 (5.9500e+00) Acc@1 12.11 ( 3.56) Acc@5 25.39 ( 10.55)
+Epoch: [0][4139/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.8941e+00 (5.9498e+00) Acc@1 9.38 ( 3.56) Acc@5 22.66 ( 10.55)
+Epoch: [0][4140/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 4.8679e+00 (5.9495e+00) Acc@1 6.25 ( 3.56) Acc@5 24.22 ( 10.55)
+Epoch: [0][4141/5004] Time 0.241 ( 0.242) Data 0.025 ( 0.027) Loss 4.9110e+00 (5.9493e+00) Acc@1 10.55 ( 3.57) Acc@5 26.95 ( 10.56)
+Epoch: [0][4142/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 4.8198e+00 (5.9490e+00) Acc@1 11.33 ( 3.57) Acc@5 27.73 ( 10.56)
+Epoch: [0][4143/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 4.9128e+00 (5.9487e+00) Acc@1 10.55 ( 3.57) Acc@5 23.44 ( 10.56)
+Epoch: [0][4144/5004] Time 0.243 ( 0.242) Data 0.027 ( 0.027) Loss 4.8738e+00 (5.9485e+00) Acc@1 12.50 ( 3.57) Acc@5 26.56 ( 10.57)
+Epoch: [0][4145/5004] Time 0.238 ( 0.242) Data 0.026 ( 0.027) Loss 4.9060e+00 (5.9482e+00) Acc@1 10.94 ( 3.57) Acc@5 26.17 ( 10.57)
+Epoch: [0][4146/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 5.0036e+00 (5.9480e+00) Acc@1 9.38 ( 3.57) Acc@5 23.83 ( 10.57)
+Epoch: [0][4147/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 4.7496e+00 (5.9477e+00) Acc@1 12.89 ( 3.58) Acc@5 28.91 ( 10.58)
+Epoch: [0][4148/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 5.0046e+00 (5.9475e+00) Acc@1 11.33 ( 3.58) Acc@5 25.39 ( 10.58)
+Epoch: [0][4149/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 4.7154e+00 (5.9472e+00) Acc@1 6.64 ( 3.58) Acc@5 29.30 ( 10.59)
+Epoch: [0][4150/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.7487e+00 (5.9469e+00) Acc@1 8.98 ( 3.58) Acc@5 28.12 ( 10.59)
+Epoch: [0][4151/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 4.8441e+00 (5.9466e+00) Acc@1 11.72 ( 3.58) Acc@5 26.17 ( 10.59)
+Epoch: [0][4152/5004] Time 0.243 ( 0.242) Data 0.027 ( 0.027) Loss 4.7793e+00 (5.9463e+00) Acc@1 12.89 ( 3.58) Acc@5 28.52 ( 10.60)
+Epoch: [0][4153/5004] Time 0.239 ( 0.242) Data 0.026 ( 0.027) Loss 4.8739e+00 (5.9461e+00) Acc@1 10.55 ( 3.59) Acc@5 23.83 ( 10.60)
+Epoch: [0][4154/5004] Time 0.240 ( 0.242) Data 0.026 ( 0.027) Loss 4.7562e+00 (5.9458e+00) Acc@1 11.72 ( 3.59) Acc@5 28.91 ( 10.61)
+Epoch: [0][4155/5004] Time 0.245 ( 0.242) Data 0.027 ( 0.027) Loss 4.9036e+00 (5.9455e+00) Acc@1 8.98 ( 3.59) Acc@5 26.17 ( 10.61)
+Epoch: [0][4156/5004] Time 0.241 ( 0.242) Data 0.024 ( 0.027) Loss 4.8576e+00 (5.9453e+00) Acc@1 9.38 ( 3.59) Acc@5 25.39 ( 10.61)
+Epoch: [0][4157/5004] Time 0.242 ( 0.242) Data 0.024 ( 0.027) Loss 4.6800e+00 (5.9450e+00) Acc@1 10.94 ( 3.59) Acc@5 27.73 ( 10.62)
+Epoch: [0][4158/5004] Time 0.245 ( 0.242) Data 0.025 ( 0.027) Loss 4.8495e+00 (5.9447e+00) Acc@1 11.72 ( 3.59) Acc@5 30.47 ( 10.62)
+Epoch: [0][4159/5004] Time 0.244 ( 0.242) Data 0.025 ( 0.027) Loss 4.9749e+00 (5.9445e+00) Acc@1 10.55 ( 3.60) Acc@5 27.34 ( 10.63)
+Epoch: [0][4160/5004] Time 0.245 ( 0.242) Data 0.027 ( 0.027) Loss 5.0087e+00 (5.9443e+00) Acc@1 9.77 ( 3.60) Acc@5 26.17 ( 10.63)
+Epoch: [0][4161/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 4.8370e+00 (5.9440e+00) Acc@1 10.55 ( 3.60) Acc@5 27.34 ( 10.63)
+Epoch: [0][4162/5004] Time 0.237 ( 0.242) Data 0.023 ( 0.027) Loss 4.8301e+00 (5.9437e+00) Acc@1 12.89 ( 3.60) Acc@5 30.08 ( 10.64)
+Epoch: [0][4163/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 4.8103e+00 (5.9435e+00) Acc@1 11.72 ( 3.60) Acc@5 26.56 ( 10.64)
+Epoch: [0][4164/5004] Time 0.243 ( 0.242) Data 0.026 ( 0.027) Loss 4.7218e+00 (5.9432e+00) Acc@1 11.33 ( 3.61) Acc@5 25.00 ( 10.65)
+Epoch: [0][4165/5004] Time 0.241 ( 0.242) Data 0.025 ( 0.027) Loss 4.8191e+00 (5.9429e+00) Acc@1 11.72 ( 3.61) Acc@5 28.52 ( 10.65)
+Epoch: [0][4166/5004] Time 0.247 ( 0.242) Data 0.026 ( 0.027) Loss 4.9641e+00 (5.9427e+00) Acc@1 7.81 ( 3.61) Acc@5 26.17 ( 10.65)
+Epoch: [0][4167/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 4.8907e+00 (5.9424e+00) Acc@1 9.38 ( 3.61) Acc@5 24.61 ( 10.66)
+Epoch: [0][4168/5004] Time 0.243 ( 0.242) Data 0.019 ( 0.027) Loss 4.7300e+00 (5.9421e+00) Acc@1 8.59 ( 3.61) Acc@5 28.12 ( 10.66)
+Epoch: [0][4169/5004] Time 0.240 ( 0.242) Data 0.021 ( 0.027) Loss 4.8998e+00 (5.9419e+00) Acc@1 11.33 ( 3.61) Acc@5 24.22 ( 10.67)
+Epoch: [0][4170/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 4.8525e+00 (5.9416e+00) Acc@1 10.94 ( 3.61) Acc@5 27.34 ( 10.67)
+Epoch: [0][4171/5004] Time 0.241 ( 0.242) Data 0.022 ( 0.027) Loss 4.8219e+00 (5.9413e+00) Acc@1 10.55 ( 3.62) Acc@5 27.73 ( 10.67)
+Epoch: [0][4172/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 4.9279e+00 (5.9411e+00) Acc@1 10.94 ( 3.62) Acc@5 26.95 ( 10.68)
+Epoch: [0][4173/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.6517e+00 (5.9408e+00) Acc@1 14.45 ( 3.62) Acc@5 30.86 ( 10.68)
+Epoch: [0][4174/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.027) Loss 4.9346e+00 (5.9405e+00) Acc@1 8.98 ( 3.62) Acc@5 26.95 ( 10.69)
+Epoch: [0][4175/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.9728e+00 (5.9403e+00) Acc@1 9.38 ( 3.62) Acc@5 23.44 ( 10.69)
+Epoch: [0][4176/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.9828e+00 (5.9401e+00) Acc@1 8.20 ( 3.62) Acc@5 20.70 ( 10.69)
+Epoch: [0][4177/5004] Time 0.251 ( 0.242) Data 0.022 ( 0.027) Loss 4.7761e+00 (5.9398e+00) Acc@1 13.67 ( 3.63) Acc@5 28.12 ( 10.70)
+Epoch: [0][4178/5004] Time 0.240 ( 0.242) Data 0.020 ( 0.027) Loss 4.9656e+00 (5.9396e+00) Acc@1 8.20 ( 3.63) Acc@5 21.88 ( 10.70)
+Epoch: [0][4179/5004] Time 0.247 ( 0.242) Data 0.022 ( 0.027) Loss 4.7258e+00 (5.9393e+00) Acc@1 10.94 ( 3.63) Acc@5 25.78 ( 10.70)
+Epoch: [0][4180/5004] Time 0.240 ( 0.242) Data 0.021 ( 0.027) Loss 4.7851e+00 (5.9390e+00) Acc@1 8.59 ( 3.63) Acc@5 25.00 ( 10.71)
+Epoch: [0][4181/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.7929e+00 (5.9387e+00) Acc@1 9.38 ( 3.63) Acc@5 25.78 ( 10.71)
+Epoch: [0][4182/5004] Time 0.239 ( 0.242) Data 0.022 ( 0.027) Loss 4.9279e+00 (5.9385e+00) Acc@1 9.77 ( 3.63) Acc@5 26.17 ( 10.71)
+Epoch: [0][4183/5004] Time 0.248 ( 0.242) Data 0.023 ( 0.027) Loss 4.7627e+00 (5.9382e+00) Acc@1 10.94 ( 3.64) Acc@5 30.08 ( 10.72)
+Epoch: [0][4184/5004] Time 0.242 ( 0.242) Data 0.020 ( 0.027) Loss 4.7712e+00 (5.9379e+00) Acc@1 11.72 ( 3.64) Acc@5 26.95 ( 10.72)
+Epoch: [0][4185/5004] Time 0.241 ( 0.242) Data 0.021 ( 0.027) Loss 4.6684e+00 (5.9376e+00) Acc@1 12.11 ( 3.64) Acc@5 29.30 ( 10.73)
+Epoch: [0][4186/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.027) Loss 4.8564e+00 (5.9374e+00) Acc@1 12.89 ( 3.64) Acc@5 24.61 ( 10.73)
+Epoch: [0][4187/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.6489e+00 (5.9371e+00) Acc@1 11.33 ( 3.64) Acc@5 30.47 ( 10.73)
+Epoch: [0][4188/5004] Time 0.241 ( 0.242) Data 0.022 ( 0.027) Loss 4.7767e+00 (5.9368e+00) Acc@1 9.38 ( 3.64) Acc@5 25.39 ( 10.74)
+Epoch: [0][4189/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 4.8087e+00 (5.9365e+00) Acc@1 10.94 ( 3.65) Acc@5 29.30 ( 10.74)
+Epoch: [0][4190/5004] Time 0.240 ( 0.242) Data 0.022 ( 0.027) Loss 4.9606e+00 (5.9363e+00) Acc@1 9.77 ( 3.65) Acc@5 23.83 ( 10.74)
+Epoch: [0][4191/5004] Time 0.246 ( 0.242) Data 0.027 ( 0.027) Loss 4.8464e+00 (5.9360e+00) Acc@1 11.72 ( 3.65) Acc@5 25.00 ( 10.75)
+Epoch: [0][4192/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 4.7988e+00 (5.9357e+00) Acc@1 11.33 ( 3.65) Acc@5 25.39 ( 10.75)
+Epoch: [0][4193/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 4.8571e+00 (5.9355e+00) Acc@1 9.77 ( 3.65) Acc@5 29.30 ( 10.76)
+Epoch: [0][4194/5004] Time 0.241 ( 0.242) Data 0.022 ( 0.027) Loss 4.8865e+00 (5.9352e+00) Acc@1 8.59 ( 3.65) Acc@5 29.69 ( 10.76)
+Epoch: [0][4195/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 4.9389e+00 (5.9350e+00) Acc@1 9.38 ( 3.66) Acc@5 25.78 ( 10.76)
+Epoch: [0][4196/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 4.9274e+00 (5.9348e+00) Acc@1 10.16 ( 3.66) Acc@5 26.56 ( 10.77)
+Epoch: [0][4197/5004] Time 0.241 ( 0.242) Data 0.023 ( 0.027) Loss 4.6202e+00 (5.9344e+00) Acc@1 12.89 ( 3.66) Acc@5 30.47 ( 10.77)
+Epoch: [0][4198/5004] Time 0.241 ( 0.242) Data 0.023 ( 0.027) Loss 4.9006e+00 (5.9342e+00) Acc@1 9.38 ( 3.66) Acc@5 25.39 ( 10.78)
+Epoch: [0][4199/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 4.9399e+00 (5.9340e+00) Acc@1 8.98 ( 3.66) Acc@5 26.17 ( 10.78)
+Epoch: [0][4200/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.027) Loss 4.6898e+00 (5.9337e+00) Acc@1 10.55 ( 3.66) Acc@5 27.73 ( 10.78)
+Epoch: [0][4201/5004] Time 0.239 ( 0.242) Data 0.021 ( 0.027) Loss 4.8640e+00 (5.9334e+00) Acc@1 8.59 ( 3.67) Acc@5 27.73 ( 10.79)
+Epoch: [0][4202/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 4.7640e+00 (5.9331e+00) Acc@1 12.11 ( 3.67) Acc@5 26.95 ( 10.79)
+Epoch: [0][4203/5004] Time 0.240 ( 0.242) Data 0.021 ( 0.027) Loss 4.6238e+00 (5.9328e+00) Acc@1 10.16 ( 3.67) Acc@5 30.08 ( 10.80)
+Epoch: [0][4204/5004] Time 0.248 ( 0.242) Data 0.023 ( 0.027) Loss 4.5923e+00 (5.9325e+00) Acc@1 12.11 ( 3.67) Acc@5 29.69 ( 10.80)
+Epoch: [0][4205/5004] Time 0.243 ( 0.242) Data 0.020 ( 0.027) Loss 4.9729e+00 (5.9323e+00) Acc@1 8.98 ( 3.67) Acc@5 21.88 ( 10.80)
+Epoch: [0][4206/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 4.8715e+00 (5.9320e+00) Acc@1 12.50 ( 3.67) Acc@5 25.00 ( 10.81)
+Epoch: [0][4207/5004] Time 0.241 ( 0.242) Data 0.022 ( 0.027) Loss 4.6214e+00 (5.9317e+00) Acc@1 10.94 ( 3.68) Acc@5 32.81 ( 10.81)
+Epoch: [0][4208/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.7989e+00 (5.9314e+00) Acc@1 12.89 ( 3.68) Acc@5 25.00 ( 10.82)
+Epoch: [0][4209/5004] Time 0.239 ( 0.242) Data 0.021 ( 0.027) Loss 4.8222e+00 (5.9312e+00) Acc@1 10.16 ( 3.68) Acc@5 27.34 ( 10.82)
+Epoch: [0][4210/5004] Time 0.245 ( 0.242) Data 0.026 ( 0.027) Loss 4.9261e+00 (5.9309e+00) Acc@1 7.42 ( 3.68) Acc@5 25.78 ( 10.82)
+Epoch: [0][4211/5004] Time 0.241 ( 0.242) Data 0.023 ( 0.027) Loss 4.7473e+00 (5.9307e+00) Acc@1 8.98 ( 3.68) Acc@5 28.52 ( 10.83)
+Epoch: [0][4212/5004] Time 0.246 ( 0.242) Data 0.023 ( 0.027) Loss 4.7753e+00 (5.9304e+00) Acc@1 9.77 ( 3.68) Acc@5 30.86 ( 10.83)
+Epoch: [0][4213/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 4.8018e+00 (5.9301e+00) Acc@1 10.94 ( 3.68) Acc@5 27.34 ( 10.84)
+Epoch: [0][4214/5004] Time 0.244 ( 0.242) Data 0.021 ( 0.027) Loss 4.5895e+00 (5.9298e+00) Acc@1 13.67 ( 3.69) Acc@5 30.08 ( 10.84)
+Epoch: [0][4215/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.027) Loss 4.7872e+00 (5.9295e+00) Acc@1 8.20 ( 3.69) Acc@5 29.30 ( 10.84)
+Epoch: [0][4216/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 4.7412e+00 (5.9292e+00) Acc@1 12.11 ( 3.69) Acc@5 28.12 ( 10.85)
+Epoch: [0][4217/5004] Time 0.240 ( 0.242) Data 0.022 ( 0.027) Loss 4.8826e+00 (5.9290e+00) Acc@1 11.72 ( 3.69) Acc@5 26.17 ( 10.85)
+Epoch: [0][4218/5004] Time 0.241 ( 0.242) Data 0.023 ( 0.027) Loss 4.8314e+00 (5.9287e+00) Acc@1 10.55 ( 3.69) Acc@5 29.69 ( 10.86)
+Epoch: [0][4219/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 4.9671e+00 (5.9285e+00) Acc@1 10.55 ( 3.70) Acc@5 25.00 ( 10.86)
+Epoch: [0][4220/5004] Time 0.240 ( 0.242) Data 0.021 ( 0.027) Loss 5.0379e+00 (5.9283e+00) Acc@1 12.11 ( 3.70) Acc@5 25.00 ( 10.86)
+Epoch: [0][4221/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.6965e+00 (5.9280e+00) Acc@1 11.72 ( 3.70) Acc@5 30.47 ( 10.87)
+Epoch: [0][4222/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 4.8594e+00 (5.9278e+00) Acc@1 11.33 ( 3.70) Acc@5 27.73 ( 10.87)
+Epoch: [0][4223/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.7897e+00 (5.9275e+00) Acc@1 8.59 ( 3.70) Acc@5 25.78 ( 10.88)
+Epoch: [0][4224/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 4.7054e+00 (5.9272e+00) Acc@1 11.33 ( 3.70) Acc@5 30.86 ( 10.88)
+Epoch: [0][4225/5004] Time 0.255 ( 0.242) Data 0.020 ( 0.027) Loss 4.7445e+00 (5.9269e+00) Acc@1 10.16 ( 3.71) Acc@5 23.44 ( 10.88)
+Epoch: [0][4226/5004] Time 0.251 ( 0.242) Data 0.018 ( 0.027) Loss 4.7427e+00 (5.9266e+00) Acc@1 10.55 ( 3.71) Acc@5 29.69 ( 10.89)
+Epoch: [0][4227/5004] Time 0.238 ( 0.242) Data 0.020 ( 0.027) Loss 4.7659e+00 (5.9264e+00) Acc@1 9.77 ( 3.71) Acc@5 26.56 ( 10.89)
+Epoch: [0][4228/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 4.8434e+00 (5.9261e+00) Acc@1 10.16 ( 3.71) Acc@5 26.17 ( 10.90)
+Epoch: [0][4229/5004] Time 0.248 ( 0.242) Data 0.021 ( 0.027) Loss 4.7966e+00 (5.9258e+00) Acc@1 11.33 ( 3.71) Acc@5 27.34 ( 10.90)
+Epoch: [0][4230/5004] Time 0.259 ( 0.242) Data 0.019 ( 0.027) Loss 4.8604e+00 (5.9256e+00) Acc@1 8.98 ( 3.71) Acc@5 23.05 ( 10.90)
+Epoch: [0][4231/5004] Time 0.245 ( 0.242) Data 0.017 ( 0.027) Loss 4.9604e+00 (5.9254e+00) Acc@1 10.16 ( 3.71) Acc@5 26.56 ( 10.91)
+Epoch: [0][4232/5004] Time 0.240 ( 0.242) Data 0.020 ( 0.027) Loss 4.6941e+00 (5.9251e+00) Acc@1 12.11 ( 3.72) Acc@5 26.56 ( 10.91)
+Epoch: [0][4233/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.9741e+00 (5.9248e+00) Acc@1 11.33 ( 3.72) Acc@5 26.95 ( 10.91)
+Epoch: [0][4234/5004] Time 0.249 ( 0.242) Data 0.021 ( 0.027) Loss 4.8375e+00 (5.9246e+00) Acc@1 12.50 ( 3.72) Acc@5 25.78 ( 10.92)
+Epoch: [0][4235/5004] Time 0.244 ( 0.242) Data 0.020 ( 0.027) Loss 4.6897e+00 (5.9243e+00) Acc@1 12.50 ( 3.72) Acc@5 30.86 ( 10.92)
+Epoch: [0][4236/5004] Time 0.239 ( 0.242) Data 0.021 ( 0.027) Loss 5.0310e+00 (5.9241e+00) Acc@1 6.64 ( 3.72) Acc@5 25.00 ( 10.92)
+Epoch: [0][4237/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 4.8477e+00 (5.9238e+00) Acc@1 7.81 ( 3.72) Acc@5 28.91 ( 10.93)
+Epoch: [0][4238/5004] Time 0.241 ( 0.242) Data 0.022 ( 0.027) Loss 4.8085e+00 (5.9236e+00) Acc@1 9.77 ( 3.73) Acc@5 23.44 ( 10.93)
+Epoch: [0][4239/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 4.7398e+00 (5.9233e+00) Acc@1 11.33 ( 3.73) Acc@5 31.25 ( 10.94)
+Epoch: [0][4240/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 5.1197e+00 (5.9231e+00) Acc@1 7.03 ( 3.73) Acc@5 21.48 ( 10.94)
+Epoch: [0][4241/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.6595e+00 (5.9228e+00) Acc@1 12.89 ( 3.73) Acc@5 30.47 ( 10.94)
+Epoch: [0][4242/5004] Time 0.240 ( 0.242) Data 0.020 ( 0.027) Loss 4.7503e+00 (5.9225e+00) Acc@1 9.38 ( 3.73) Acc@5 26.95 ( 10.95)
+Epoch: [0][4243/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.8639e+00 (5.9223e+00) Acc@1 14.84 ( 3.73) Acc@5 28.52 ( 10.95)
+Epoch: [0][4244/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 4.8968e+00 (5.9220e+00) Acc@1 10.16 ( 3.74) Acc@5 27.73 ( 10.96)
+Epoch: [0][4245/5004] Time 0.241 ( 0.242) Data 0.020 ( 0.027) Loss 4.7176e+00 (5.9218e+00) Acc@1 12.11 ( 3.74) Acc@5 30.47 ( 10.96)
+Epoch: [0][4246/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.8005e+00 (5.9215e+00) Acc@1 12.50 ( 3.74) Acc@5 25.00 ( 10.96)
+Epoch: [0][4247/5004] Time 0.241 ( 0.242) Data 0.021 ( 0.027) Loss 4.8796e+00 (5.9212e+00) Acc@1 11.33 ( 3.74) Acc@5 27.34 ( 10.97)
+Epoch: [0][4248/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 4.8194e+00 (5.9210e+00) Acc@1 9.77 ( 3.74) Acc@5 26.17 ( 10.97)
+Epoch: [0][4249/5004] Time 0.249 ( 0.242) Data 0.019 ( 0.027) Loss 4.7286e+00 (5.9207e+00) Acc@1 10.16 ( 3.74) Acc@5 27.34 ( 10.97)
+Epoch: [0][4250/5004] Time 0.234 ( 0.242) Data 0.019 ( 0.027) Loss 4.8192e+00 (5.9204e+00) Acc@1 11.72 ( 3.75) Acc@5 28.12 ( 10.98)
+Epoch: [0][4251/5004] Time 0.242 ( 0.242) Data 0.030 ( 0.027) Loss 4.9174e+00 (5.9202e+00) Acc@1 7.03 ( 3.75) Acc@5 26.17 ( 10.98)
+Epoch: [0][4252/5004] Time 0.241 ( 0.242) Data 0.030 ( 0.027) Loss 4.7714e+00 (5.9199e+00) Acc@1 12.50 ( 3.75) Acc@5 28.91 ( 10.99)
+Epoch: [0][4253/5004] Time 0.246 ( 0.242) Data 0.031 ( 0.027) Loss 4.8381e+00 (5.9197e+00) Acc@1 10.55 ( 3.75) Acc@5 28.52 ( 10.99)
+Epoch: [0][4254/5004] Time 0.244 ( 0.242) Data 0.031 ( 0.027) Loss 4.5979e+00 (5.9194e+00) Acc@1 13.67 ( 3.75) Acc@5 31.25 ( 11.00)
+Epoch: [0][4255/5004] Time 0.244 ( 0.242) Data 0.029 ( 0.027) Loss 4.8908e+00 (5.9191e+00) Acc@1 7.42 ( 3.75) Acc@5 27.34 ( 11.00)
+Epoch: [0][4256/5004] Time 0.242 ( 0.242) Data 0.029 ( 0.027) Loss 4.9048e+00 (5.9189e+00) Acc@1 8.20 ( 3.76) Acc@5 23.83 ( 11.00)
+Epoch: [0][4257/5004] Time 0.241 ( 0.242) Data 0.030 ( 0.027) Loss 4.7775e+00 (5.9186e+00) Acc@1 7.42 ( 3.76) Acc@5 28.12 ( 11.01)
+Epoch: [0][4258/5004] Time 0.233 ( 0.242) Data 0.030 ( 0.027) Loss 4.7138e+00 (5.9183e+00) Acc@1 12.50 ( 3.76) Acc@5 27.73 ( 11.01)
+Epoch: [0][4259/5004] Time 0.226 ( 0.242) Data 0.038 ( 0.027) Loss 4.7287e+00 (5.9181e+00) Acc@1 12.11 ( 3.76) Acc@5 26.95 ( 11.01)
+Epoch: [0][4260/5004] Time 0.237 ( 0.242) Data 0.049 ( 0.027) Loss 4.7377e+00 (5.9178e+00) Acc@1 13.28 ( 3.76) Acc@5 23.83 ( 11.02)
+Epoch: [0][4261/5004] Time 0.238 ( 0.242) Data 0.051 ( 0.027) Loss 4.8354e+00 (5.9175e+00) Acc@1 8.59 ( 3.76) Acc@5 27.73 ( 11.02)
+Epoch: [0][4262/5004] Time 0.243 ( 0.242) Data 0.052 ( 0.027) Loss 4.7454e+00 (5.9173e+00) Acc@1 11.33 ( 3.77) Acc@5 28.52 ( 11.03)
+Epoch: [0][4263/5004] Time 0.240 ( 0.242) Data 0.051 ( 0.027) Loss 4.7975e+00 (5.9170e+00) Acc@1 12.11 ( 3.77) Acc@5 31.64 ( 11.03)
+Epoch: [0][4264/5004] Time 0.240 ( 0.242) Data 0.050 ( 0.027) Loss 4.8793e+00 (5.9167e+00) Acc@1 11.33 ( 3.77) Acc@5 23.44 ( 11.03)
+Epoch: [0][4265/5004] Time 0.241 ( 0.242) Data 0.049 ( 0.027) Loss 4.8561e+00 (5.9165e+00) Acc@1 9.77 ( 3.77) Acc@5 26.17 ( 11.04)
+Epoch: [0][4266/5004] Time 0.236 ( 0.242) Data 0.048 ( 0.027) Loss 4.8661e+00 (5.9163e+00) Acc@1 10.16 ( 3.77) Acc@5 24.61 ( 11.04)
+Epoch: [0][4267/5004] Time 0.238 ( 0.242) Data 0.049 ( 0.027) Loss 4.9215e+00 (5.9160e+00) Acc@1 12.50 ( 3.77) Acc@5 26.17 ( 11.04)
+Epoch: [0][4268/5004] Time 0.237 ( 0.242) Data 0.048 ( 0.027) Loss 4.7058e+00 (5.9157e+00) Acc@1 12.89 ( 3.78) Acc@5 30.47 ( 11.05)
+Epoch: [0][4269/5004] Time 0.235 ( 0.242) Data 0.050 ( 0.027) Loss 4.7123e+00 (5.9155e+00) Acc@1 14.45 ( 3.78) Acc@5 30.86 ( 11.05)
+Epoch: [0][4270/5004] Time 0.242 ( 0.242) Data 0.052 ( 0.027) Loss 4.8565e+00 (5.9152e+00) Acc@1 13.67 ( 3.78) Acc@5 26.95 ( 11.06)
+Epoch: [0][4271/5004] Time 0.238 ( 0.242) Data 0.050 ( 0.027) Loss 4.7593e+00 (5.9149e+00) Acc@1 11.33 ( 3.78) Acc@5 25.00 ( 11.06)
+Epoch: [0][4272/5004] Time 0.237 ( 0.242) Data 0.051 ( 0.027) Loss 4.6200e+00 (5.9146e+00) Acc@1 13.28 ( 3.78) Acc@5 28.52 ( 11.06)
+Epoch: [0][4273/5004] Time 0.237 ( 0.242) Data 0.051 ( 0.027) Loss 4.7595e+00 (5.9144e+00) Acc@1 14.06 ( 3.79) Acc@5 30.08 ( 11.07)
+Epoch: [0][4274/5004] Time 0.279 ( 0.242) Data 0.055 ( 0.027) Loss 4.9182e+00 (5.9141e+00) Acc@1 7.42 ( 3.79) Acc@5 23.44 ( 11.07)
+Epoch: [0][4275/5004] Time 0.244 ( 0.242) Data 0.021 ( 0.027) Loss 4.7593e+00 (5.9139e+00) Acc@1 10.94 ( 3.79) Acc@5 32.42 ( 11.08)
+Epoch: [0][4276/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.7934e+00 (5.9136e+00) Acc@1 11.72 ( 3.79) Acc@5 26.56 ( 11.08)
+Epoch: [0][4277/5004] Time 0.254 ( 0.242) Data 0.022 ( 0.027) Loss 4.7614e+00 (5.9133e+00) Acc@1 12.11 ( 3.79) Acc@5 28.52 ( 11.08)
+Epoch: [0][4278/5004] Time 0.248 ( 0.242) Data 0.018 ( 0.027) Loss 4.5701e+00 (5.9130e+00) Acc@1 9.77 ( 3.79) Acc@5 27.73 ( 11.09)
+Epoch: [0][4279/5004] Time 0.239 ( 0.242) Data 0.018 ( 0.027) Loss 4.9376e+00 (5.9128e+00) Acc@1 8.98 ( 3.80) Acc@5 26.17 ( 11.09)
+Epoch: [0][4280/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 4.7880e+00 (5.9125e+00) Acc@1 11.33 ( 3.80) Acc@5 28.52 ( 11.09)
+Epoch: [0][4281/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.027) Loss 4.7368e+00 (5.9123e+00) Acc@1 10.16 ( 3.80) Acc@5 25.78 ( 11.10)
+Epoch: [0][4282/5004] Time 0.239 ( 0.242) Data 0.024 ( 0.027) Loss 4.7460e+00 (5.9120e+00) Acc@1 13.28 ( 3.80) Acc@5 27.73 ( 11.10)
+Epoch: [0][4283/5004] Time 0.245 ( 0.242) Data 0.025 ( 0.027) Loss 4.8431e+00 (5.9117e+00) Acc@1 9.38 ( 3.80) Acc@5 29.30 ( 11.11)
+Epoch: [0][4284/5004] Time 0.240 ( 0.242) Data 0.024 ( 0.027) Loss 4.8211e+00 (5.9115e+00) Acc@1 12.11 ( 3.80) Acc@5 25.00 ( 11.11)
+Epoch: [0][4285/5004] Time 0.240 ( 0.242) Data 0.025 ( 0.027) Loss 4.6950e+00 (5.9112e+00) Acc@1 12.11 ( 3.81) Acc@5 28.91 ( 11.11)
+Epoch: [0][4286/5004] Time 0.246 ( 0.242) Data 0.024 ( 0.027) Loss 4.7502e+00 (5.9109e+00) Acc@1 9.77 ( 3.81) Acc@5 30.08 ( 11.12)
+Epoch: [0][4287/5004] Time 0.240 ( 0.242) Data 0.021 ( 0.027) Loss 4.9691e+00 (5.9107e+00) Acc@1 10.94 ( 3.81) Acc@5 24.61 ( 11.12)
+Epoch: [0][4288/5004] Time 0.241 ( 0.242) Data 0.023 ( 0.027) Loss 4.8992e+00 (5.9105e+00) Acc@1 10.16 ( 3.81) Acc@5 26.17 ( 11.12)
+Epoch: [0][4289/5004] Time 0.239 ( 0.242) Data 0.023 ( 0.027) Loss 4.9013e+00 (5.9102e+00) Acc@1 9.38 ( 3.81) Acc@5 25.00 ( 11.13)
+Epoch: [0][4290/5004] Time 0.250 ( 0.242) Data 0.025 ( 0.027) Loss 4.6599e+00 (5.9099e+00) Acc@1 12.11 ( 3.81) Acc@5 31.25 ( 11.13)
+Epoch: [0][4291/5004] Time 0.235 ( 0.242) Data 0.019 ( 0.027) Loss 4.5909e+00 (5.9096e+00) Acc@1 10.94 ( 3.82) Acc@5 33.20 ( 11.14)
+Epoch: [0][4292/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 4.4896e+00 (5.9093e+00) Acc@1 11.33 ( 3.82) Acc@5 34.77 ( 11.14)
+Epoch: [0][4293/5004] Time 0.250 ( 0.242) Data 0.024 ( 0.027) Loss 4.5771e+00 (5.9090e+00) Acc@1 11.72 ( 3.82) Acc@5 28.52 ( 11.15)
+Epoch: [0][4294/5004] Time 0.236 ( 0.242) Data 0.018 ( 0.027) Loss 4.7886e+00 (5.9087e+00) Acc@1 7.42 ( 3.82) Acc@5 29.69 ( 11.15)
+Epoch: [0][4295/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.027) Loss 4.6953e+00 (5.9084e+00) Acc@1 13.67 ( 3.82) Acc@5 28.91 ( 11.16)
+Epoch: [0][4296/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.027) Loss 4.7537e+00 (5.9082e+00) Acc@1 12.50 ( 3.82) Acc@5 30.08 ( 11.16)
+Epoch: [0][4297/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 4.7052e+00 (5.9079e+00) Acc@1 14.84 ( 3.83) Acc@5 28.52 ( 11.16)
+Epoch: [0][4298/5004] Time 0.242 ( 0.242) Data 0.024 ( 0.027) Loss 4.6409e+00 (5.9076e+00) Acc@1 9.77 ( 3.83) Acc@5 29.30 ( 11.17)
+Epoch: [0][4299/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 4.5253e+00 (5.9073e+00) Acc@1 15.23 ( 3.83) Acc@5 31.25 ( 11.17)
+Epoch: [0][4300/5004] Time 0.246 ( 0.242) Data 0.023 ( 0.027) Loss 4.8865e+00 (5.9070e+00) Acc@1 9.77 ( 3.83) Acc@5 27.34 ( 11.18)
+Epoch: [0][4301/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.6540e+00 (5.9068e+00) Acc@1 10.94 ( 3.83) Acc@5 31.64 ( 11.18)
+Epoch: [0][4302/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 4.9764e+00 (5.9065e+00) Acc@1 11.72 ( 3.84) Acc@5 26.17 ( 11.19)
+Epoch: [0][4303/5004] Time 0.238 ( 0.242) Data 0.021 ( 0.027) Loss 4.6258e+00 (5.9062e+00) Acc@1 13.67 ( 3.84) Acc@5 30.86 ( 11.19)
+Epoch: [0][4304/5004] Time 0.246 ( 0.242) Data 0.024 ( 0.027) Loss 4.8290e+00 (5.9060e+00) Acc@1 9.38 ( 3.84) Acc@5 27.73 ( 11.19)
+Epoch: [0][4305/5004] Time 0.250 ( 0.242) Data 0.022 ( 0.027) Loss 4.7701e+00 (5.9057e+00) Acc@1 10.16 ( 3.84) Acc@5 26.95 ( 11.20)
+Epoch: [0][4306/5004] Time 0.254 ( 0.242) Data 0.021 ( 0.027) Loss 4.7567e+00 (5.9055e+00) Acc@1 9.38 ( 3.84) Acc@5 27.34 ( 11.20)
+Epoch: [0][4307/5004] Time 0.239 ( 0.242) Data 0.018 ( 0.027) Loss 4.7334e+00 (5.9052e+00) Acc@1 13.28 ( 3.84) Acc@5 25.00 ( 11.20)
+Epoch: [0][4308/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.027) Loss 4.8292e+00 (5.9049e+00) Acc@1 10.55 ( 3.85) Acc@5 25.78 ( 11.21)
+Epoch: [0][4309/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.027) Loss 4.8248e+00 (5.9047e+00) Acc@1 10.16 ( 3.85) Acc@5 26.17 ( 11.21)
+Epoch: [0][4310/5004] Time 0.245 ( 0.242) Data 0.024 ( 0.027) Loss 4.8139e+00 (5.9044e+00) Acc@1 9.77 ( 3.85) Acc@5 22.27 ( 11.21)
+Epoch: [0][4311/5004] Time 0.241 ( 0.242) Data 0.022 ( 0.027) Loss 4.7347e+00 (5.9042e+00) Acc@1 8.98 ( 3.85) Acc@5 26.95 ( 11.22)
+Epoch: [0][4312/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.027) Loss 4.8306e+00 (5.9039e+00) Acc@1 11.33 ( 3.85) Acc@5 27.34 ( 11.22)
+Epoch: [0][4313/5004] Time 0.241 ( 0.242) Data 0.023 ( 0.027) Loss 4.7741e+00 (5.9037e+00) Acc@1 10.94 ( 3.85) Acc@5 28.12 ( 11.22)
+Epoch: [0][4314/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.027) Loss 4.9588e+00 (5.9034e+00) Acc@1 7.42 ( 3.85) Acc@5 23.83 ( 11.23)
+Epoch: [0][4315/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 4.9207e+00 (5.9032e+00) Acc@1 11.33 ( 3.86) Acc@5 26.95 ( 11.23)
+Epoch: [0][4316/5004] Time 0.249 ( 0.242) Data 0.024 ( 0.027) Loss 4.9889e+00 (5.9030e+00) Acc@1 7.81 ( 3.86) Acc@5 23.44 ( 11.23)
+Epoch: [0][4317/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 4.7923e+00 (5.9027e+00) Acc@1 9.77 ( 3.86) Acc@5 26.56 ( 11.24)
+Epoch: [0][4318/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.027) Loss 4.8128e+00 (5.9025e+00) Acc@1 10.94 ( 3.86) Acc@5 26.17 ( 11.24)
+Epoch: [0][4319/5004] Time 0.240 ( 0.242) Data 0.022 ( 0.027) Loss 4.7896e+00 (5.9022e+00) Acc@1 10.16 ( 3.86) Acc@5 24.22 ( 11.24)
+Epoch: [0][4320/5004] Time 0.246 ( 0.242) Data 0.024 ( 0.027) Loss 4.7552e+00 (5.9020e+00) Acc@1 11.72 ( 3.86) Acc@5 30.08 ( 11.25)
+Epoch: [0][4321/5004] Time 0.248 ( 0.242) Data 0.023 ( 0.027) Loss 4.7996e+00 (5.9017e+00) Acc@1 10.55 ( 3.87) Acc@5 30.08 ( 11.25)
+Epoch: [0][4322/5004] Time 0.251 ( 0.242) Data 0.022 ( 0.027) Loss 4.7925e+00 (5.9014e+00) Acc@1 12.11 ( 3.87) Acc@5 25.78 ( 11.26)
+Epoch: [0][4323/5004] Time 0.244 ( 0.242) Data 0.021 ( 0.027) Loss 4.9689e+00 (5.9012e+00) Acc@1 7.03 ( 3.87) Acc@5 23.05 ( 11.26)
+Epoch: [0][4324/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 4.6224e+00 (5.9009e+00) Acc@1 13.28 ( 3.87) Acc@5 29.69 ( 11.26)
+Epoch: [0][4325/5004] Time 0.242 ( 0.242) Data 0.025 ( 0.027) Loss 4.9648e+00 (5.9007e+00) Acc@1 13.67 ( 3.87) Acc@5 21.48 ( 11.27)
+Epoch: [0][4326/5004] Time 0.245 ( 0.242) Data 0.024 ( 0.027) Loss 4.8443e+00 (5.9005e+00) Acc@1 9.77 ( 3.87) Acc@5 25.78 ( 11.27)
+Epoch: [0][4327/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.027) Loss 4.8629e+00 (5.9002e+00) Acc@1 10.55 ( 3.88) Acc@5 27.34 ( 11.27)
+Epoch: [0][4328/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 4.7824e+00 (5.9000e+00) Acc@1 9.77 ( 3.88) Acc@5 28.52 ( 11.28)
+Epoch: [0][4329/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.027) Loss 4.7636e+00 (5.8997e+00) Acc@1 12.11 ( 3.88) Acc@5 30.08 ( 11.28)
+Epoch: [0][4330/5004] Time 0.250 ( 0.242) Data 0.023 ( 0.027) Loss 4.7994e+00 (5.8995e+00) Acc@1 11.33 ( 3.88) Acc@5 27.73 ( 11.29)
+Epoch: [0][4331/5004] Time 0.256 ( 0.242) Data 0.020 ( 0.027) Loss 4.8181e+00 (5.8992e+00) Acc@1 10.94 ( 3.88) Acc@5 26.56 ( 11.29)
+Epoch: [0][4332/5004] Time 0.257 ( 0.242) Data 0.015 ( 0.027) Loss 4.6602e+00 (5.8989e+00) Acc@1 10.55 ( 3.88) Acc@5 30.08 ( 11.29)
+Epoch: [0][4333/5004] Time 0.252 ( 0.242) Data 0.013 ( 0.027) Loss 5.0688e+00 (5.8987e+00) Acc@1 7.42 ( 3.88) Acc@5 23.05 ( 11.30)
+Epoch: [0][4334/5004] Time 0.247 ( 0.242) Data 0.019 ( 0.027) Loss 4.5669e+00 (5.8984e+00) Acc@1 13.67 ( 3.89) Acc@5 35.94 ( 11.30)
+Epoch: [0][4335/5004] Time 0.246 ( 0.242) Data 0.024 ( 0.027) Loss 4.6091e+00 (5.8981e+00) Acc@1 16.41 ( 3.89) Acc@5 31.25 ( 11.31)
+Epoch: [0][4336/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 4.6903e+00 (5.8979e+00) Acc@1 14.45 ( 3.89) Acc@5 30.08 ( 11.31)
+Epoch: [0][4337/5004] Time 0.239 ( 0.242) Data 0.021 ( 0.027) Loss 4.8456e+00 (5.8976e+00) Acc@1 11.72 ( 3.89) Acc@5 32.03 ( 11.31)
+Epoch: [0][4338/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.027) Loss 4.9356e+00 (5.8974e+00) Acc@1 10.94 ( 3.90) Acc@5 23.83 ( 11.32)
+Epoch: [0][4339/5004] Time 0.239 ( 0.242) Data 0.023 ( 0.027) Loss 4.6169e+00 (5.8971e+00) Acc@1 14.45 ( 3.90) Acc@5 29.30 ( 11.32)
+Epoch: [0][4340/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.027) Loss 4.9525e+00 (5.8969e+00) Acc@1 8.98 ( 3.90) Acc@5 25.39 ( 11.33)
+Epoch: [0][4341/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.027) Loss 4.9203e+00 (5.8966e+00) Acc@1 11.72 ( 3.90) Acc@5 26.56 ( 11.33)
+Epoch: [0][4342/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.027) Loss 4.8464e+00 (5.8964e+00) Acc@1 8.98 ( 3.90) Acc@5 28.52 ( 11.33)
+Epoch: [0][4343/5004] Time 0.246 ( 0.242) Data 0.027 ( 0.027) Loss 4.7011e+00 (5.8961e+00) Acc@1 8.59 ( 3.90) Acc@5 30.08 ( 11.34)
+Epoch: [0][4344/5004] Time 0.249 ( 0.242) Data 0.024 ( 0.027) Loss 4.8028e+00 (5.8959e+00) Acc@1 9.38 ( 3.90) Acc@5 26.17 ( 11.34)
+Epoch: [0][4345/5004] Time 0.237 ( 0.242) Data 0.018 ( 0.027) Loss 4.7744e+00 (5.8956e+00) Acc@1 12.50 ( 3.91) Acc@5 30.08 ( 11.34)
+Epoch: [0][4346/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.027) Loss 4.9243e+00 (5.8954e+00) Acc@1 9.38 ( 3.91) Acc@5 22.66 ( 11.35)
+Epoch: [0][4347/5004] Time 0.245 ( 0.242) Data 0.025 ( 0.027) Loss 4.7480e+00 (5.8951e+00) Acc@1 13.67 ( 3.91) Acc@5 29.30 ( 11.35)
+Epoch: [0][4348/5004] Time 0.245 ( 0.242) Data 0.025 ( 0.027) Loss 4.7489e+00 (5.8949e+00) Acc@1 9.77 ( 3.91) Acc@5 27.34 ( 11.36)
+Epoch: [0][4349/5004] Time 0.246 ( 0.242) Data 0.024 ( 0.027) Loss 4.6675e+00 (5.8946e+00) Acc@1 14.06 ( 3.91) Acc@5 28.52 ( 11.36)
+Epoch: [0][4350/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 4.7124e+00 (5.8943e+00) Acc@1 14.06 ( 3.92) Acc@5 30.08 ( 11.36)
+Epoch: [0][4351/5004] Time 0.247 ( 0.242) Data 0.025 ( 0.027) Loss 4.6712e+00 (5.8940e+00) Acc@1 12.50 ( 3.92) Acc@5 27.73 ( 11.37)
+Epoch: [0][4352/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 4.9541e+00 (5.8938e+00) Acc@1 11.33 ( 3.92) Acc@5 25.39 ( 11.37)
+Epoch: 
[0][4353/5004] Time 0.247 ( 0.242) Data 0.026 ( 0.027) Loss 4.6457e+00 (5.8935e+00) Acc@1 12.89 ( 3.92) Acc@5 30.08 ( 11.37) +Epoch: [0][4354/5004] Time 0.256 ( 0.242) Data 0.025 ( 0.027) Loss 4.8151e+00 (5.8933e+00) Acc@1 10.16 ( 3.92) Acc@5 24.22 ( 11.38) +Epoch: [0][4355/5004] Time 0.255 ( 0.242) Data 0.022 ( 0.027) Loss 4.5125e+00 (5.8930e+00) Acc@1 15.62 ( 3.93) Acc@5 32.81 ( 11.38) +Epoch: [0][4356/5004] Time 0.239 ( 0.242) Data 0.023 ( 0.027) Loss 4.5891e+00 (5.8927e+00) Acc@1 12.11 ( 3.93) Acc@5 32.42 ( 11.39) +Epoch: [0][4357/5004] Time 0.247 ( 0.242) Data 0.027 ( 0.027) Loss 4.4711e+00 (5.8923e+00) Acc@1 16.80 ( 3.93) Acc@5 34.77 ( 11.39) +Epoch: [0][4358/5004] Time 0.236 ( 0.242) Data 0.022 ( 0.027) Loss 4.8306e+00 (5.8921e+00) Acc@1 9.77 ( 3.93) Acc@5 28.12 ( 11.40) +Epoch: [0][4359/5004] Time 0.244 ( 0.242) Data 0.027 ( 0.027) Loss 4.7197e+00 (5.8918e+00) Acc@1 10.16 ( 3.93) Acc@5 30.08 ( 11.40) +Epoch: [0][4360/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.8753e+00 (5.8916e+00) Acc@1 12.11 ( 3.93) Acc@5 25.39 ( 11.40) +Epoch: [0][4361/5004] Time 0.243 ( 0.242) Data 0.027 ( 0.027) Loss 4.8066e+00 (5.8913e+00) Acc@1 8.59 ( 3.94) Acc@5 28.12 ( 11.41) +Epoch: [0][4362/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 4.9068e+00 (5.8911e+00) Acc@1 10.94 ( 3.94) Acc@5 28.12 ( 11.41) +Epoch: [0][4363/5004] Time 0.240 ( 0.242) Data 0.026 ( 0.027) Loss 4.8610e+00 (5.8909e+00) Acc@1 9.77 ( 3.94) Acc@5 24.61 ( 11.41) +Epoch: [0][4364/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 4.7009e+00 (5.8906e+00) Acc@1 10.55 ( 3.94) Acc@5 30.08 ( 11.42) +Epoch: [0][4365/5004] Time 0.242 ( 0.242) Data 0.026 ( 0.027) Loss 4.7882e+00 (5.8904e+00) Acc@1 15.62 ( 3.94) Acc@5 28.91 ( 11.42) +Epoch: [0][4366/5004] Time 0.243 ( 0.242) Data 0.027 ( 0.027) Loss 4.8030e+00 (5.8901e+00) Acc@1 10.16 ( 3.94) Acc@5 27.34 ( 11.43) +Epoch: [0][4367/5004] Time 0.244 ( 0.242) Data 0.026 ( 0.027) Loss 4.6859e+00 (5.8898e+00) Acc@1 13.67 ( 3.95) Acc@5 28.12 ( 11.43) +Epoch: 
[0][4368/5004] Time 0.247 ( 0.242) Data 0.026 ( 0.027) Loss 4.6165e+00 (5.8895e+00) Acc@1 14.45 ( 3.95) Acc@5 32.03 ( 11.44) +Epoch: [0][4369/5004] Time 0.236 ( 0.242) Data 0.022 ( 0.027) Loss 4.9743e+00 (5.8893e+00) Acc@1 10.94 ( 3.95) Acc@5 23.44 ( 11.44) +Epoch: [0][4370/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 4.8503e+00 (5.8891e+00) Acc@1 11.72 ( 3.95) Acc@5 26.95 ( 11.44) +Epoch: [0][4371/5004] Time 0.240 ( 0.242) Data 0.026 ( 0.027) Loss 4.7944e+00 (5.8888e+00) Acc@1 10.94 ( 3.95) Acc@5 27.73 ( 11.45) +Epoch: [0][4372/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 4.7091e+00 (5.8886e+00) Acc@1 12.50 ( 3.96) Acc@5 28.52 ( 11.45) +Epoch: [0][4373/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 4.6182e+00 (5.8883e+00) Acc@1 13.28 ( 3.96) Acc@5 30.47 ( 11.45) +Epoch: [0][4374/5004] Time 0.211 ( 0.242) Data 0.027 ( 0.027) Loss 4.7312e+00 (5.8880e+00) Acc@1 12.89 ( 3.96) Acc@5 29.69 ( 11.46) +Epoch: [0][4375/5004] Time 0.235 ( 0.242) Data 0.054 ( 0.027) Loss 4.7046e+00 (5.8878e+00) Acc@1 14.06 ( 3.96) Acc@5 30.08 ( 11.46) +Epoch: [0][4376/5004] Time 0.235 ( 0.242) Data 0.056 ( 0.027) Loss 4.7626e+00 (5.8875e+00) Acc@1 8.98 ( 3.96) Acc@5 24.61 ( 11.46) +Epoch: [0][4377/5004] Time 0.239 ( 0.242) Data 0.058 ( 0.027) Loss 4.7343e+00 (5.8872e+00) Acc@1 10.94 ( 3.97) Acc@5 28.12 ( 11.47) +Epoch: [0][4378/5004] Time 0.242 ( 0.242) Data 0.058 ( 0.027) Loss 4.9357e+00 (5.8870e+00) Acc@1 9.77 ( 3.97) Acc@5 25.00 ( 11.47) +Epoch: [0][4379/5004] Time 0.237 ( 0.242) Data 0.053 ( 0.027) Loss 4.6885e+00 (5.8867e+00) Acc@1 10.16 ( 3.97) Acc@5 25.78 ( 11.48) +Epoch: [0][4380/5004] Time 0.241 ( 0.242) Data 0.053 ( 0.027) Loss 4.9012e+00 (5.8865e+00) Acc@1 11.33 ( 3.97) Acc@5 28.52 ( 11.48) +Epoch: [0][4381/5004] Time 0.232 ( 0.242) Data 0.049 ( 0.027) Loss 4.8345e+00 (5.8863e+00) Acc@1 10.55 ( 3.97) Acc@5 26.95 ( 11.48) +Epoch: [0][4382/5004] Time 0.236 ( 0.242) Data 0.055 ( 0.027) Loss 4.8070e+00 (5.8860e+00) Acc@1 12.50 ( 3.97) Acc@5 29.30 ( 11.49) +Epoch: 
[0][4383/5004] Time 0.240 ( 0.242) Data 0.056 ( 0.027) Loss 4.7272e+00 (5.8858e+00) Acc@1 11.72 ( 3.97) Acc@5 30.08 ( 11.49) +Epoch: [0][4384/5004] Time 0.235 ( 0.242) Data 0.054 ( 0.027) Loss 4.7214e+00 (5.8855e+00) Acc@1 11.72 ( 3.98) Acc@5 30.08 ( 11.50) +Epoch: [0][4385/5004] Time 0.239 ( 0.242) Data 0.055 ( 0.027) Loss 4.9454e+00 (5.8853e+00) Acc@1 9.77 ( 3.98) Acc@5 23.83 ( 11.50) +Epoch: [0][4386/5004] Time 0.236 ( 0.242) Data 0.054 ( 0.027) Loss 4.6405e+00 (5.8850e+00) Acc@1 12.89 ( 3.98) Acc@5 31.64 ( 11.50) +Epoch: [0][4387/5004] Time 0.238 ( 0.242) Data 0.055 ( 0.027) Loss 4.6987e+00 (5.8847e+00) Acc@1 14.84 ( 3.98) Acc@5 27.73 ( 11.51) +Epoch: [0][4388/5004] Time 0.235 ( 0.242) Data 0.054 ( 0.027) Loss 4.6975e+00 (5.8845e+00) Acc@1 12.11 ( 3.98) Acc@5 29.69 ( 11.51) +Epoch: [0][4389/5004] Time 0.245 ( 0.242) Data 0.055 ( 0.027) Loss 4.7785e+00 (5.8842e+00) Acc@1 9.38 ( 3.99) Acc@5 28.12 ( 11.51) +Epoch: [0][4390/5004] Time 0.229 ( 0.242) Data 0.052 ( 0.027) Loss 4.5341e+00 (5.8839e+00) Acc@1 16.80 ( 3.99) Acc@5 32.03 ( 11.52) +Epoch: [0][4391/5004] Time 0.243 ( 0.242) Data 0.059 ( 0.027) Loss 4.8412e+00 (5.8837e+00) Acc@1 10.55 ( 3.99) Acc@5 26.56 ( 11.52) +Epoch: [0][4392/5004] Time 0.238 ( 0.242) Data 0.055 ( 0.027) Loss 4.7058e+00 (5.8834e+00) Acc@1 14.84 ( 3.99) Acc@5 28.12 ( 11.53) +Epoch: [0][4393/5004] Time 0.238 ( 0.242) Data 0.055 ( 0.027) Loss 4.8107e+00 (5.8832e+00) Acc@1 14.06 ( 3.99) Acc@5 28.91 ( 11.53) +Epoch: [0][4394/5004] Time 0.237 ( 0.242) Data 0.054 ( 0.027) Loss 4.4474e+00 (5.8828e+00) Acc@1 12.50 ( 4.00) Acc@5 32.81 ( 11.53) +Epoch: [0][4395/5004] Time 0.237 ( 0.242) Data 0.053 ( 0.027) Loss 4.7510e+00 (5.8826e+00) Acc@1 12.11 ( 4.00) Acc@5 26.95 ( 11.54) +Epoch: [0][4396/5004] Time 0.236 ( 0.242) Data 0.054 ( 0.027) Loss 4.7525e+00 (5.8823e+00) Acc@1 12.11 ( 4.00) Acc@5 27.73 ( 11.54) +Epoch: [0][4397/5004] Time 0.238 ( 0.242) Data 0.055 ( 0.027) Loss 4.8148e+00 (5.8821e+00) Acc@1 11.33 ( 4.00) Acc@5 26.56 ( 11.55) +Epoch: 
[0][4398/5004] Time 0.246 ( 0.242) Data 0.055 ( 0.027) Loss 4.7304e+00 (5.8818e+00) Acc@1 12.11 ( 4.00) Acc@5 27.34 ( 11.55) +Epoch: [0][4399/5004] Time 0.229 ( 0.242) Data 0.047 ( 0.027) Loss 4.6771e+00 (5.8815e+00) Acc@1 12.11 ( 4.01) Acc@5 32.42 ( 11.55) +Epoch: [0][4400/5004] Time 0.238 ( 0.242) Data 0.055 ( 0.027) Loss 4.8744e+00 (5.8813e+00) Acc@1 8.98 ( 4.01) Acc@5 23.83 ( 11.56) +Epoch: [0][4401/5004] Time 0.236 ( 0.242) Data 0.053 ( 0.027) Loss 4.8977e+00 (5.8811e+00) Acc@1 9.77 ( 4.01) Acc@5 25.00 ( 11.56) +Epoch: [0][4402/5004] Time 0.237 ( 0.242) Data 0.054 ( 0.027) Loss 4.7922e+00 (5.8808e+00) Acc@1 8.98 ( 4.01) Acc@5 25.39 ( 11.56) +Epoch: [0][4403/5004] Time 0.232 ( 0.242) Data 0.054 ( 0.027) Loss 5.0195e+00 (5.8806e+00) Acc@1 9.38 ( 4.01) Acc@5 23.44 ( 11.57) +Epoch: [0][4404/5004] Time 0.243 ( 0.242) Data 0.058 ( 0.027) Loss 4.5749e+00 (5.8803e+00) Acc@1 12.50 ( 4.01) Acc@5 33.20 ( 11.57) +Epoch: [0][4405/5004] Time 0.235 ( 0.242) Data 0.053 ( 0.027) Loss 4.6754e+00 (5.8801e+00) Acc@1 10.16 ( 4.01) Acc@5 28.91 ( 11.57) +Epoch: [0][4406/5004] Time 0.236 ( 0.242) Data 0.054 ( 0.027) Loss 4.7931e+00 (5.8798e+00) Acc@1 10.55 ( 4.02) Acc@5 31.64 ( 11.58) +Epoch: [0][4407/5004] Time 0.242 ( 0.242) Data 0.055 ( 0.027) Loss 4.8249e+00 (5.8796e+00) Acc@1 11.33 ( 4.02) Acc@5 28.12 ( 11.58) +Epoch: [0][4408/5004] Time 0.234 ( 0.242) Data 0.050 ( 0.027) Loss 4.6917e+00 (5.8793e+00) Acc@1 13.67 ( 4.02) Acc@5 30.86 ( 11.59) +Epoch: [0][4409/5004] Time 0.234 ( 0.242) Data 0.054 ( 0.027) Loss 4.7552e+00 (5.8791e+00) Acc@1 10.55 ( 4.02) Acc@5 31.25 ( 11.59) +Epoch: [0][4410/5004] Time 0.235 ( 0.242) Data 0.056 ( 0.027) Loss 4.7324e+00 (5.8788e+00) Acc@1 12.50 ( 4.02) Acc@5 28.12 ( 11.60) +Epoch: [0][4411/5004] Time 0.240 ( 0.242) Data 0.057 ( 0.027) Loss 4.7785e+00 (5.8785e+00) Acc@1 10.94 ( 4.02) Acc@5 30.47 ( 11.60) +Epoch: [0][4412/5004] Time 0.239 ( 0.242) Data 0.055 ( 0.027) Loss 4.8963e+00 (5.8783e+00) Acc@1 10.94 ( 4.03) Acc@5 24.61 ( 11.60) +Epoch: 
[0][4413/5004] Time 0.238 ( 0.242) Data 0.054 ( 0.027) Loss 4.7772e+00 (5.8781e+00) Acc@1 11.33 ( 4.03) Acc@5 32.42 ( 11.61) +Epoch: [0][4414/5004] Time 0.238 ( 0.242) Data 0.054 ( 0.027) Loss 4.7115e+00 (5.8778e+00) Acc@1 10.16 ( 4.03) Acc@5 32.81 ( 11.61) +Epoch: [0][4415/5004] Time 0.237 ( 0.242) Data 0.053 ( 0.027) Loss 4.8106e+00 (5.8776e+00) Acc@1 9.77 ( 4.03) Acc@5 26.95 ( 11.62) +Epoch: [0][4416/5004] Time 0.238 ( 0.242) Data 0.054 ( 0.027) Loss 4.7229e+00 (5.8773e+00) Acc@1 12.11 ( 4.03) Acc@5 31.25 ( 11.62) +Epoch: [0][4417/5004] Time 0.232 ( 0.242) Data 0.054 ( 0.027) Loss 4.6480e+00 (5.8770e+00) Acc@1 12.50 ( 4.03) Acc@5 32.81 ( 11.62) +Epoch: [0][4418/5004] Time 0.241 ( 0.242) Data 0.058 ( 0.027) Loss 4.8067e+00 (5.8768e+00) Acc@1 14.06 ( 4.04) Acc@5 28.52 ( 11.63) +Epoch: [0][4419/5004] Time 0.237 ( 0.242) Data 0.055 ( 0.027) Loss 4.5676e+00 (5.8765e+00) Acc@1 16.02 ( 4.04) Acc@5 29.30 ( 11.63) +Epoch: [0][4420/5004] Time 0.238 ( 0.242) Data 0.055 ( 0.027) Loss 4.7684e+00 (5.8762e+00) Acc@1 13.67 ( 4.04) Acc@5 28.91 ( 11.64) +Epoch: [0][4421/5004] Time 0.236 ( 0.242) Data 0.054 ( 0.027) Loss 4.7182e+00 (5.8760e+00) Acc@1 13.28 ( 4.04) Acc@5 28.52 ( 11.64) +Epoch: [0][4422/5004] Time 0.238 ( 0.242) Data 0.055 ( 0.027) Loss 4.7117e+00 (5.8757e+00) Acc@1 10.16 ( 4.04) Acc@5 32.42 ( 11.64) +Epoch: [0][4423/5004] Time 0.235 ( 0.242) Data 0.054 ( 0.027) Loss 4.8450e+00 (5.8755e+00) Acc@1 8.59 ( 4.05) Acc@5 26.95 ( 11.65) +Epoch: [0][4424/5004] Time 0.242 ( 0.242) Data 0.055 ( 0.027) Loss 4.6870e+00 (5.8752e+00) Acc@1 10.94 ( 4.05) Acc@5 27.34 ( 11.65) +Epoch: [0][4425/5004] Time 0.233 ( 0.242) Data 0.051 ( 0.027) Loss 4.7259e+00 (5.8750e+00) Acc@1 10.16 ( 4.05) Acc@5 26.56 ( 11.66) +Epoch: [0][4426/5004] Time 0.246 ( 0.242) Data 0.058 ( 0.027) Loss 4.7873e+00 (5.8747e+00) Acc@1 8.59 ( 4.05) Acc@5 27.34 ( 11.66) +Epoch: [0][4427/5004] Time 0.237 ( 0.242) Data 0.049 ( 0.027) Loss 4.7378e+00 (5.8745e+00) Acc@1 13.67 ( 4.05) Acc@5 27.73 ( 11.66) +Epoch: 
[0][4428/5004] Time 0.231 ( 0.242) Data 0.050 ( 0.027) Loss 4.8891e+00 (5.8742e+00) Acc@1 9.77 ( 4.05) Acc@5 25.00 ( 11.67) +Epoch: [0][4429/5004] Time 0.239 ( 0.242) Data 0.055 ( 0.027) Loss 4.7970e+00 (5.8740e+00) Acc@1 12.11 ( 4.05) Acc@5 27.73 ( 11.67) +Epoch: [0][4430/5004] Time 0.238 ( 0.242) Data 0.053 ( 0.027) Loss 4.7698e+00 (5.8737e+00) Acc@1 10.94 ( 4.06) Acc@5 26.56 ( 11.67) +Epoch: [0][4431/5004] Time 0.237 ( 0.242) Data 0.055 ( 0.027) Loss 4.8313e+00 (5.8735e+00) Acc@1 10.94 ( 4.06) Acc@5 26.95 ( 11.68) +Epoch: [0][4432/5004] Time 0.238 ( 0.242) Data 0.055 ( 0.027) Loss 4.5979e+00 (5.8732e+00) Acc@1 11.72 ( 4.06) Acc@5 33.20 ( 11.68) +Epoch: [0][4433/5004] Time 0.236 ( 0.242) Data 0.054 ( 0.027) Loss 4.6025e+00 (5.8729e+00) Acc@1 12.89 ( 4.06) Acc@5 32.03 ( 11.69) +Epoch: [0][4434/5004] Time 0.238 ( 0.242) Data 0.055 ( 0.027) Loss 4.8463e+00 (5.8727e+00) Acc@1 8.59 ( 4.06) Acc@5 23.44 ( 11.69) +Epoch: [0][4435/5004] Time 0.246 ( 0.242) Data 0.054 ( 0.027) Loss 4.6303e+00 (5.8724e+00) Acc@1 13.28 ( 4.06) Acc@5 32.03 ( 11.69) +Epoch: [0][4436/5004] Time 0.236 ( 0.242) Data 0.051 ( 0.027) Loss 4.6430e+00 (5.8721e+00) Acc@1 10.94 ( 4.07) Acc@5 32.81 ( 11.70) +Epoch: [0][4437/5004] Time 0.269 ( 0.242) Data 0.053 ( 0.027) Loss 4.6193e+00 (5.8719e+00) Acc@1 14.45 ( 4.07) Acc@5 30.47 ( 11.70) +Epoch: [0][4438/5004] Time 0.241 ( 0.242) Data 0.025 ( 0.027) Loss 4.8254e+00 (5.8716e+00) Acc@1 12.50 ( 4.07) Acc@5 25.78 ( 11.70) +Epoch: [0][4439/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 4.6807e+00 (5.8714e+00) Acc@1 10.16 ( 4.07) Acc@5 28.91 ( 11.71) +Epoch: [0][4440/5004] Time 0.239 ( 0.242) Data 0.024 ( 0.027) Loss 4.7728e+00 (5.8711e+00) Acc@1 13.28 ( 4.07) Acc@5 29.30 ( 11.71) +Epoch: [0][4441/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 4.5857e+00 (5.8708e+00) Acc@1 11.33 ( 4.08) Acc@5 32.42 ( 11.72) +Epoch: [0][4442/5004] Time 0.244 ( 0.242) Data 0.025 ( 0.027) Loss 4.6396e+00 (5.8705e+00) Acc@1 12.50 ( 4.08) Acc@5 30.47 ( 11.72) +Epoch: 
[0][4443/5004] Time 0.238 ( 0.242) Data 0.024 ( 0.027) Loss 4.5800e+00 (5.8703e+00) Acc@1 12.11 ( 4.08) Acc@5 30.86 ( 11.73) +Epoch: [0][4444/5004] Time 0.240 ( 0.242) Data 0.026 ( 0.027) Loss 4.8073e+00 (5.8700e+00) Acc@1 10.16 ( 4.08) Acc@5 26.95 ( 11.73) +Epoch: [0][4445/5004] Time 0.240 ( 0.242) Data 0.025 ( 0.027) Loss 4.5509e+00 (5.8697e+00) Acc@1 11.33 ( 4.08) Acc@5 30.08 ( 11.73) +Epoch: [0][4446/5004] Time 0.239 ( 0.242) Data 0.025 ( 0.027) Loss 4.9992e+00 (5.8695e+00) Acc@1 8.98 ( 4.08) Acc@5 23.05 ( 11.74) +Epoch: [0][4447/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 4.7653e+00 (5.8693e+00) Acc@1 11.72 ( 4.08) Acc@5 28.52 ( 11.74) +Epoch: [0][4448/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 4.7472e+00 (5.8690e+00) Acc@1 10.16 ( 4.09) Acc@5 30.47 ( 11.74) +Epoch: [0][4449/5004] Time 0.242 ( 0.242) Data 0.026 ( 0.027) Loss 4.6330e+00 (5.8687e+00) Acc@1 9.77 ( 4.09) Acc@5 27.34 ( 11.75) +Epoch: [0][4450/5004] Time 0.240 ( 0.242) Data 0.025 ( 0.027) Loss 4.6733e+00 (5.8685e+00) Acc@1 11.33 ( 4.09) Acc@5 26.56 ( 11.75) +Epoch: [0][4451/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.027) Loss 4.6680e+00 (5.8682e+00) Acc@1 12.11 ( 4.09) Acc@5 30.86 ( 11.75) +Epoch: [0][4452/5004] Time 0.242 ( 0.242) Data 0.025 ( 0.027) Loss 4.7679e+00 (5.8680e+00) Acc@1 12.11 ( 4.09) Acc@5 25.78 ( 11.76) +Epoch: [0][4453/5004] Time 0.240 ( 0.242) Data 0.026 ( 0.027) Loss 4.7522e+00 (5.8677e+00) Acc@1 14.06 ( 4.10) Acc@5 27.73 ( 11.76) +Epoch: [0][4454/5004] Time 0.246 ( 0.242) Data 0.026 ( 0.027) Loss 4.5239e+00 (5.8674e+00) Acc@1 12.50 ( 4.10) Acc@5 35.94 ( 11.77) +Epoch: [0][4455/5004] Time 0.247 ( 0.242) Data 0.022 ( 0.027) Loss 4.8313e+00 (5.8672e+00) Acc@1 9.77 ( 4.10) Acc@5 25.00 ( 11.77) +Epoch: [0][4456/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 4.5582e+00 (5.8669e+00) Acc@1 17.97 ( 4.10) Acc@5 35.16 ( 11.78) +Epoch: [0][4457/5004] Time 0.239 ( 0.242) Data 0.024 ( 0.027) Loss 4.6424e+00 (5.8666e+00) Acc@1 10.16 ( 4.10) Acc@5 32.03 ( 11.78) +Epoch: 
[0][4458/5004] Time 0.246 ( 0.242) Data 0.025 ( 0.027) Loss 4.4859e+00 (5.8663e+00) Acc@1 10.55 ( 4.10) Acc@5 32.81 ( 11.78) +Epoch: [0][4459/5004] Time 0.246 ( 0.242) Data 0.026 ( 0.027) Loss 4.6577e+00 (5.8660e+00) Acc@1 12.11 ( 4.11) Acc@5 30.86 ( 11.79) +Epoch: [0][4460/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 4.9651e+00 (5.8658e+00) Acc@1 8.20 ( 4.11) Acc@5 23.83 ( 11.79) +Epoch: [0][4461/5004] Time 0.240 ( 0.242) Data 0.021 ( 0.027) Loss 4.6415e+00 (5.8655e+00) Acc@1 7.81 ( 4.11) Acc@5 30.47 ( 11.80) +Epoch: [0][4462/5004] Time 0.239 ( 0.242) Data 0.022 ( 0.027) Loss 4.8349e+00 (5.8653e+00) Acc@1 14.06 ( 4.11) Acc@5 31.64 ( 11.80) +Epoch: [0][4463/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 4.8724e+00 (5.8651e+00) Acc@1 7.81 ( 4.11) Acc@5 23.83 ( 11.80) +Epoch: [0][4464/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 4.6700e+00 (5.8648e+00) Acc@1 11.33 ( 4.11) Acc@5 30.47 ( 11.81) +Epoch: [0][4465/5004] Time 0.241 ( 0.242) Data 0.023 ( 0.027) Loss 4.6703e+00 (5.8646e+00) Acc@1 11.33 ( 4.11) Acc@5 28.91 ( 11.81) +Epoch: [0][4466/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.027) Loss 4.6118e+00 (5.8643e+00) Acc@1 14.84 ( 4.12) Acc@5 27.73 ( 11.81) +Epoch: [0][4467/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.6393e+00 (5.8640e+00) Acc@1 10.94 ( 4.12) Acc@5 30.08 ( 11.82) +Epoch: [0][4468/5004] Time 0.239 ( 0.242) Data 0.023 ( 0.027) Loss 4.7994e+00 (5.8638e+00) Acc@1 10.94 ( 4.12) Acc@5 29.69 ( 11.82) +Epoch: [0][4469/5004] Time 0.239 ( 0.242) Data 0.022 ( 0.027) Loss 4.6168e+00 (5.8635e+00) Acc@1 14.84 ( 4.12) Acc@5 31.25 ( 11.83) +Epoch: [0][4470/5004] Time 0.239 ( 0.242) Data 0.024 ( 0.027) Loss 4.6344e+00 (5.8632e+00) Acc@1 8.20 ( 4.12) Acc@5 30.08 ( 11.83) +Epoch: [0][4471/5004] Time 0.241 ( 0.242) Data 0.025 ( 0.027) Loss 4.5356e+00 (5.8629e+00) Acc@1 14.45 ( 4.13) Acc@5 32.03 ( 11.84) +Epoch: [0][4472/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 4.8693e+00 (5.8627e+00) Acc@1 11.33 ( 4.13) Acc@5 25.78 ( 11.84) +Epoch: 
[0][4473/5004] Time 0.239 ( 0.242) Data 0.024 ( 0.027) Loss 4.7713e+00 (5.8624e+00) Acc@1 11.72 ( 4.13) Acc@5 28.52 ( 11.84) +Epoch: [0][4474/5004] Time 0.243 ( 0.242) Data 0.025 ( 0.027) Loss 4.8009e+00 (5.8622e+00) Acc@1 10.55 ( 4.13) Acc@5 26.95 ( 11.85) +Epoch: [0][4475/5004] Time 0.242 ( 0.242) Data 0.025 ( 0.027) Loss 4.7548e+00 (5.8620e+00) Acc@1 10.16 ( 4.13) Acc@5 28.91 ( 11.85) +Epoch: [0][4476/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 4.7440e+00 (5.8617e+00) Acc@1 13.28 ( 4.13) Acc@5 25.00 ( 11.85) +Epoch: [0][4477/5004] Time 0.240 ( 0.242) Data 0.024 ( 0.027) Loss 4.6899e+00 (5.8615e+00) Acc@1 12.50 ( 4.14) Acc@5 26.95 ( 11.86) +Epoch: [0][4478/5004] Time 0.238 ( 0.242) Data 0.024 ( 0.027) Loss 4.8985e+00 (5.8612e+00) Acc@1 10.16 ( 4.14) Acc@5 26.17 ( 11.86) +Epoch: [0][4479/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.9043e+00 (5.8610e+00) Acc@1 8.59 ( 4.14) Acc@5 28.12 ( 11.86) +Epoch: [0][4480/5004] Time 0.240 ( 0.242) Data 0.026 ( 0.027) Loss 4.8511e+00 (5.8608e+00) Acc@1 8.20 ( 4.14) Acc@5 28.52 ( 11.87) +Epoch: [0][4481/5004] Time 0.246 ( 0.242) Data 0.026 ( 0.027) Loss 4.7635e+00 (5.8606e+00) Acc@1 12.11 ( 4.14) Acc@5 30.47 ( 11.87) +Epoch: [0][4482/5004] Time 0.237 ( 0.242) Data 0.023 ( 0.027) Loss 4.8584e+00 (5.8603e+00) Acc@1 11.33 ( 4.14) Acc@5 27.73 ( 11.87) +Epoch: [0][4483/5004] Time 0.242 ( 0.242) Data 0.026 ( 0.027) Loss 4.6103e+00 (5.8601e+00) Acc@1 10.55 ( 4.14) Acc@5 30.86 ( 11.88) +Epoch: [0][4484/5004] Time 0.236 ( 0.242) Data 0.024 ( 0.027) Loss 4.7633e+00 (5.8598e+00) Acc@1 7.81 ( 4.14) Acc@5 27.34 ( 11.88) +Epoch: [0][4485/5004] Time 0.242 ( 0.242) Data 0.026 ( 0.027) Loss 4.7892e+00 (5.8596e+00) Acc@1 8.98 ( 4.15) Acc@5 24.22 ( 11.88) +Epoch: [0][4486/5004] Time 0.250 ( 0.242) Data 0.024 ( 0.027) Loss 4.7905e+00 (5.8593e+00) Acc@1 10.16 ( 4.15) Acc@5 30.86 ( 11.89) +Epoch: [0][4487/5004] Time 0.247 ( 0.242) Data 0.022 ( 0.027) Loss 4.7219e+00 (5.8591e+00) Acc@1 14.45 ( 4.15) Acc@5 29.30 ( 11.89) +Epoch: 
[0][4488/5004] Time 0.250 ( 0.242) Data 0.022 ( 0.027) Loss 4.7513e+00 (5.8588e+00) Acc@1 9.77 ( 4.15) Acc@5 28.12 ( 11.90) +Epoch: [0][4489/5004] Time 0.250 ( 0.242) Data 0.020 ( 0.027) Loss 4.7836e+00 (5.8586e+00) Acc@1 10.94 ( 4.15) Acc@5 27.34 ( 11.90) +Epoch: [0][4490/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.027) Loss 4.6031e+00 (5.8583e+00) Acc@1 15.23 ( 4.15) Acc@5 27.34 ( 11.90) +Epoch: [0][4491/5004] Time 0.250 ( 0.242) Data 0.021 ( 0.027) Loss 4.6707e+00 (5.8580e+00) Acc@1 14.84 ( 4.16) Acc@5 28.91 ( 11.91) +Epoch: [0][4492/5004] Time 0.256 ( 0.242) Data 0.020 ( 0.027) Loss 4.7315e+00 (5.8578e+00) Acc@1 12.50 ( 4.16) Acc@5 29.69 ( 11.91) +Epoch: [0][4493/5004] Time 0.243 ( 0.242) Data 0.017 ( 0.027) Loss 4.9562e+00 (5.8576e+00) Acc@1 10.16 ( 4.16) Acc@5 30.08 ( 11.91) +Epoch: [0][4494/5004] Time 0.250 ( 0.242) Data 0.022 ( 0.027) Loss 4.6998e+00 (5.8573e+00) Acc@1 12.11 ( 4.16) Acc@5 30.47 ( 11.92) +Epoch: [0][4495/5004] Time 0.244 ( 0.242) Data 0.020 ( 0.027) Loss 4.7004e+00 (5.8571e+00) Acc@1 12.11 ( 4.16) Acc@5 29.30 ( 11.92) +Epoch: [0][4496/5004] Time 0.250 ( 0.242) Data 0.022 ( 0.027) Loss 4.6539e+00 (5.8568e+00) Acc@1 8.20 ( 4.16) Acc@5 32.42 ( 11.93) +Epoch: [0][4497/5004] Time 0.253 ( 0.242) Data 0.021 ( 0.027) Loss 4.6790e+00 (5.8565e+00) Acc@1 14.45 ( 4.17) Acc@5 27.34 ( 11.93) +Epoch: [0][4498/5004] Time 0.247 ( 0.242) Data 0.017 ( 0.027) Loss 4.5157e+00 (5.8563e+00) Acc@1 12.89 ( 4.17) Acc@5 33.20 ( 11.94) +Epoch: [0][4499/5004] Time 0.250 ( 0.242) Data 0.021 ( 0.027) Loss 4.4823e+00 (5.8559e+00) Acc@1 16.02 ( 4.17) Acc@5 35.16 ( 11.94) +Epoch: [0][4500/5004] Time 0.249 ( 0.242) Data 0.021 ( 0.027) Loss 4.5413e+00 (5.8557e+00) Acc@1 15.23 ( 4.17) Acc@5 31.64 ( 11.95) +Epoch: [0][4501/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 4.8054e+00 (5.8554e+00) Acc@1 10.55 ( 4.17) Acc@5 29.30 ( 11.95) +Epoch: [0][4502/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.027) Loss 4.7300e+00 (5.8552e+00) Acc@1 14.06 ( 4.18) Acc@5 28.12 ( 11.95) +Epoch: 
[0][4503/5004] Time 0.247 ( 0.242) Data 0.022 ( 0.027) Loss 4.7236e+00 (5.8549e+00) Acc@1 12.50 ( 4.18) Acc@5 26.95 ( 11.96) +Epoch: [0][4504/5004] Time 0.248 ( 0.242) Data 0.021 ( 0.027) Loss 4.6037e+00 (5.8546e+00) Acc@1 12.11 ( 4.18) Acc@5 33.59 ( 11.96) +Epoch: [0][4505/5004] Time 0.249 ( 0.242) Data 0.021 ( 0.027) Loss 4.5645e+00 (5.8544e+00) Acc@1 11.72 ( 4.18) Acc@5 31.64 ( 11.97) +Epoch: [0][4506/5004] Time 0.249 ( 0.242) Data 0.020 ( 0.027) Loss 4.6089e+00 (5.8541e+00) Acc@1 9.77 ( 4.18) Acc@5 31.64 ( 11.97) +Epoch: [0][4507/5004] Time 0.251 ( 0.242) Data 0.020 ( 0.027) Loss 4.6632e+00 (5.8538e+00) Acc@1 14.84 ( 4.19) Acc@5 33.20 ( 11.97) +Epoch: [0][4508/5004] Time 0.249 ( 0.242) Data 0.020 ( 0.027) Loss 4.8380e+00 (5.8536e+00) Acc@1 12.11 ( 4.19) Acc@5 32.03 ( 11.98) +Epoch: [0][4509/5004] Time 0.250 ( 0.242) Data 0.020 ( 0.027) Loss 4.8603e+00 (5.8534e+00) Acc@1 10.94 ( 4.19) Acc@5 28.12 ( 11.98) +Epoch: [0][4510/5004] Time 0.253 ( 0.242) Data 0.021 ( 0.027) Loss 4.7580e+00 (5.8531e+00) Acc@1 10.55 ( 4.19) Acc@5 27.34 ( 11.99) +Epoch: [0][4511/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.7079e+00 (5.8529e+00) Acc@1 15.62 ( 4.19) Acc@5 28.91 ( 11.99) +Epoch: [0][4512/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 4.6875e+00 (5.8526e+00) Acc@1 14.84 ( 4.20) Acc@5 31.64 ( 11.99) +Epoch: [0][4513/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.7821e+00 (5.8524e+00) Acc@1 12.11 ( 4.20) Acc@5 28.12 ( 12.00) +Epoch: [0][4514/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.027) Loss 4.7040e+00 (5.8521e+00) Acc@1 12.89 ( 4.20) Acc@5 27.73 ( 12.00) +Epoch: [0][4515/5004] Time 0.244 ( 0.242) Data 0.020 ( 0.027) Loss 4.6857e+00 (5.8519e+00) Acc@1 12.50 ( 4.20) Acc@5 32.03 ( 12.01) +Epoch: [0][4516/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.6967e+00 (5.8516e+00) Acc@1 12.50 ( 4.20) Acc@5 29.30 ( 12.01) +Epoch: [0][4517/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 4.5437e+00 (5.8513e+00) Acc@1 11.33 ( 4.20) Acc@5 33.98 ( 12.01) +Epoch: 
[0][4518/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.5659e+00 (5.8510e+00) Acc@1 14.06 ( 4.21) Acc@5 35.16 ( 12.02) +Epoch: [0][4519/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.027) Loss 4.8928e+00 (5.8508e+00) Acc@1 9.77 ( 4.21) Acc@5 27.34 ( 12.02) +Epoch: [0][4520/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.4583e+00 (5.8505e+00) Acc@1 14.84 ( 4.21) Acc@5 33.98 ( 12.03) +Epoch: [0][4521/5004] Time 0.247 ( 0.242) Data 0.022 ( 0.027) Loss 4.6374e+00 (5.8502e+00) Acc@1 14.84 ( 4.21) Acc@5 33.98 ( 12.03) +Epoch: [0][4522/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.027) Loss 4.8041e+00 (5.8500e+00) Acc@1 12.50 ( 4.21) Acc@5 27.73 ( 12.04) +Epoch: [0][4523/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.027) Loss 4.7803e+00 (5.8498e+00) Acc@1 10.55 ( 4.22) Acc@5 26.17 ( 12.04) +Epoch: [0][4524/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 4.8303e+00 (5.8496e+00) Acc@1 9.77 ( 4.22) Acc@5 23.83 ( 12.04) +Epoch: [0][4525/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.7299e+00 (5.8493e+00) Acc@1 16.80 ( 4.22) Acc@5 30.47 ( 12.05) +Epoch: [0][4526/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 4.9272e+00 (5.8491e+00) Acc@1 9.38 ( 4.22) Acc@5 26.17 ( 12.05) +Epoch: [0][4527/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.027) Loss 5.0180e+00 (5.8489e+00) Acc@1 10.55 ( 4.22) Acc@5 26.17 ( 12.05) +Epoch: [0][4528/5004] Time 0.253 ( 0.242) Data 0.020 ( 0.027) Loss 4.8121e+00 (5.8487e+00) Acc@1 9.77 ( 4.22) Acc@5 25.78 ( 12.05) +Epoch: [0][4529/5004] Time 0.245 ( 0.242) Data 0.018 ( 0.027) Loss 4.7466e+00 (5.8484e+00) Acc@1 12.11 ( 4.23) Acc@5 31.64 ( 12.06) +Epoch: [0][4530/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.6130e+00 (5.8482e+00) Acc@1 12.11 ( 4.23) Acc@5 26.95 ( 12.06) +Epoch: [0][4531/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.8138e+00 (5.8479e+00) Acc@1 11.72 ( 4.23) Acc@5 25.78 ( 12.07) +Epoch: [0][4532/5004] Time 0.248 ( 0.242) Data 0.020 ( 0.027) Loss 4.7587e+00 (5.8477e+00) Acc@1 12.11 ( 4.23) Acc@5 27.34 ( 12.07) +Epoch: 
[0][4533/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 4.7756e+00 (5.8475e+00) Acc@1 13.67 ( 4.23) Acc@5 26.95 ( 12.07) +Epoch: [0][4534/5004] Time 0.244 ( 0.242) Data 0.021 ( 0.027) Loss 4.7280e+00 (5.8472e+00) Acc@1 11.33 ( 4.23) Acc@5 28.12 ( 12.08) +Epoch: [0][4535/5004] Time 0.247 ( 0.242) Data 0.022 ( 0.027) Loss 4.8034e+00 (5.8470e+00) Acc@1 12.11 ( 4.24) Acc@5 26.56 ( 12.08) +Epoch: [0][4536/5004] Time 0.243 ( 0.242) Data 0.020 ( 0.027) Loss 4.7340e+00 (5.8467e+00) Acc@1 10.55 ( 4.24) Acc@5 28.52 ( 12.08) +Epoch: [0][4537/5004] Time 0.248 ( 0.242) Data 0.021 ( 0.027) Loss 4.6928e+00 (5.8465e+00) Acc@1 10.16 ( 4.24) Acc@5 28.12 ( 12.09) +Epoch: [0][4538/5004] Time 0.246 ( 0.242) Data 0.020 ( 0.027) Loss 4.8180e+00 (5.8463e+00) Acc@1 11.33 ( 4.24) Acc@5 27.34 ( 12.09) +Epoch: [0][4539/5004] Time 0.250 ( 0.242) Data 0.021 ( 0.027) Loss 4.7520e+00 (5.8460e+00) Acc@1 11.33 ( 4.24) Acc@5 30.47 ( 12.09) +Epoch: [0][4540/5004] Time 0.251 ( 0.242) Data 0.019 ( 0.027) Loss 4.4803e+00 (5.8457e+00) Acc@1 13.28 ( 4.24) Acc@5 35.16 ( 12.10) +Epoch: [0][4541/5004] Time 0.247 ( 0.242) Data 0.019 ( 0.027) Loss 4.7077e+00 (5.8455e+00) Acc@1 12.89 ( 4.25) Acc@5 28.12 ( 12.10) +Epoch: [0][4542/5004] Time 0.251 ( 0.242) Data 0.021 ( 0.027) Loss 4.7105e+00 (5.8452e+00) Acc@1 10.94 ( 4.25) Acc@5 26.56 ( 12.10) +Epoch: [0][4543/5004] Time 0.244 ( 0.242) Data 0.020 ( 0.027) Loss 4.7805e+00 (5.8450e+00) Acc@1 7.03 ( 4.25) Acc@5 28.52 ( 12.11) +Epoch: [0][4544/5004] Time 0.249 ( 0.242) Data 0.022 ( 0.027) Loss 4.8452e+00 (5.8448e+00) Acc@1 7.03 ( 4.25) Acc@5 24.61 ( 12.11) +Epoch: [0][4545/5004] Time 0.244 ( 0.242) Data 0.020 ( 0.027) Loss 4.4804e+00 (5.8445e+00) Acc@1 12.50 ( 4.25) Acc@5 32.42 ( 12.12) +Epoch: [0][4546/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 4.6616e+00 (5.8442e+00) Acc@1 9.38 ( 4.25) Acc@5 28.12 ( 12.12) +Epoch: [0][4547/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 4.7444e+00 (5.8440e+00) Acc@1 10.16 ( 4.25) Acc@5 28.12 ( 12.12) +Epoch: 
[0][4548/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 4.8081e+00 (5.8437e+00) Acc@1 8.20 ( 4.25) Acc@5 26.95 ( 12.13)
+Epoch: [0][4549/5004] Time 0.251 ( 0.242) Data 0.021 ( 0.027) Loss 4.6724e+00 (5.8435e+00) Acc@1 15.62 ( 4.26) Acc@5 33.98 ( 12.13)
+Epoch: [0][4550/5004] Time 0.244 ( 0.242) Data 0.019 ( 0.027) Loss 4.6040e+00 (5.8432e+00) Acc@1 14.06 ( 4.26) Acc@5 29.69 ( 12.13)
+Epoch: [0][4551/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 4.7745e+00 (5.8430e+00) Acc@1 11.72 ( 4.26) Acc@5 26.17 ( 12.14)
+Epoch: [0][4552/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.027) Loss 4.7786e+00 (5.8427e+00) Acc@1 10.55 ( 4.26) Acc@5 29.30 ( 12.14)
+Epoch: [0][4553/5004] Time 0.247 ( 0.242) Data 0.022 ( 0.027) Loss 4.5311e+00 (5.8425e+00) Acc@1 12.89 ( 4.26) Acc@5 31.25 ( 12.15)
+Epoch: [0][4554/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 4.6532e+00 (5.8422e+00) Acc@1 12.11 ( 4.26) Acc@5 30.08 ( 12.15)
+Epoch: [0][4555/5004] Time 0.250 ( 0.242) Data 0.020 ( 0.027) Loss 4.9156e+00 (5.8420e+00) Acc@1 10.55 ( 4.27) Acc@5 26.56 ( 12.15)
+Epoch: [0][4556/5004] Time 0.242 ( 0.242) Data 0.016 ( 0.027) Loss 4.5898e+00 (5.8417e+00) Acc@1 10.94 ( 4.27) Acc@5 30.86 ( 12.16)
+Epoch: [0][4557/5004] Time 0.249 ( 0.242) Data 0.020 ( 0.027) Loss 4.6241e+00 (5.8414e+00) Acc@1 16.02 ( 4.27) Acc@5 32.81 ( 12.16)
+Epoch: [0][4558/5004] Time 0.249 ( 0.242) Data 0.020 ( 0.027) Loss 4.7498e+00 (5.8412e+00) Acc@1 14.06 ( 4.27) Acc@5 29.30 ( 12.17)
+Epoch: [0][4559/5004] Time 0.246 ( 0.242) Data 0.020 ( 0.027) Loss 4.8515e+00 (5.8410e+00) Acc@1 11.33 ( 4.27) Acc@5 26.17 ( 12.17)
+Epoch: [0][4560/5004] Time 0.249 ( 0.242) Data 0.025 ( 0.027) Loss 4.5701e+00 (5.8407e+00) Acc@1 12.11 ( 4.28) Acc@5 31.64 ( 12.17)
+Epoch: [0][4561/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.027) Loss 4.7026e+00 (5.8405e+00) Acc@1 13.28 ( 4.28) Acc@5 29.30 ( 12.18)
+Epoch: [0][4562/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 4.7499e+00 (5.8402e+00) Acc@1 11.72 ( 4.28) Acc@5 28.91 ( 12.18)
+Epoch: [0][4563/5004] Time 0.248 ( 0.242) Data 0.021 ( 0.027) Loss 4.6097e+00 (5.8400e+00) Acc@1 14.45 ( 4.28) Acc@5 32.42 ( 12.18)
+Epoch: [0][4564/5004] Time 0.252 ( 0.242) Data 0.020 ( 0.027) Loss 4.7902e+00 (5.8397e+00) Acc@1 10.55 ( 4.28) Acc@5 30.08 ( 12.19)
+Epoch: [0][4565/5004] Time 0.249 ( 0.242) Data 0.020 ( 0.027) Loss 4.8753e+00 (5.8395e+00) Acc@1 9.38 ( 4.28) Acc@5 27.34 ( 12.19)
+Epoch: [0][4566/5004] Time 0.247 ( 0.242) Data 0.020 ( 0.027) Loss 4.5807e+00 (5.8392e+00) Acc@1 14.84 ( 4.29) Acc@5 33.20 ( 12.20)
+Epoch: [0][4567/5004] Time 0.253 ( 0.242) Data 0.021 ( 0.027) Loss 4.6874e+00 (5.8390e+00) Acc@1 12.50 ( 4.29) Acc@5 27.34 ( 12.20)
+Epoch: [0][4568/5004] Time 0.247 ( 0.242) Data 0.017 ( 0.027) Loss 4.5665e+00 (5.8387e+00) Acc@1 14.84 ( 4.29) Acc@5 30.08 ( 12.20)
+Epoch: [0][4569/5004] Time 0.249 ( 0.242) Data 0.020 ( 0.027) Loss 4.5320e+00 (5.8384e+00) Acc@1 16.02 ( 4.29) Acc@5 34.38 ( 12.21)
+Epoch: [0][4570/5004] Time 0.245 ( 0.242) Data 0.020 ( 0.027) Loss 5.0587e+00 (5.8382e+00) Acc@1 8.20 ( 4.29) Acc@5 22.66 ( 12.21)
+Epoch: [0][4571/5004] Time 0.247 ( 0.242) Data 0.021 ( 0.027) Loss 4.6365e+00 (5.8380e+00) Acc@1 11.72 ( 4.29) Acc@5 29.69 ( 12.21)
+Epoch: [0][4572/5004] Time 0.242 ( 0.242) Data 0.020 ( 0.027) Loss 4.7112e+00 (5.8377e+00) Acc@1 12.11 ( 4.30) Acc@5 32.03 ( 12.22)
+Epoch: [0][4573/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 4.4903e+00 (5.8374e+00) Acc@1 11.72 ( 4.30) Acc@5 32.81 ( 12.22)
+Epoch: [0][4574/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.4488e+00 (5.8371e+00) Acc@1 13.28 ( 4.30) Acc@5 33.59 ( 12.23)
+Epoch: [0][4575/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.027) Loss 4.7798e+00 (5.8369e+00) Acc@1 12.89 ( 4.30) Acc@5 27.73 ( 12.23)
+Epoch: [0][4576/5004] Time 0.256 ( 0.242) Data 0.022 ( 0.027) Loss 4.6224e+00 (5.8366e+00) Acc@1 14.45 ( 4.30) Acc@5 29.69 ( 12.24)
+Epoch: [0][4577/5004] Time 0.243 ( 0.242) Data 0.016 ( 0.027) Loss 4.4521e+00 (5.8363e+00) Acc@1 13.67 ( 4.31) Acc@5 35.16 ( 12.24)
+Epoch: [0][4578/5004] Time 0.243 ( 0.242) Data 0.020 ( 0.027) Loss 4.6405e+00 (5.8361e+00) Acc@1 15.62 ( 4.31) Acc@5 31.25 ( 12.24)
+Epoch: [0][4579/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 4.6285e+00 (5.8358e+00) Acc@1 12.89 ( 4.31) Acc@5 33.98 ( 12.25)
+Epoch: [0][4580/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.027) Loss 4.7266e+00 (5.8356e+00) Acc@1 14.84 ( 4.31) Acc@5 30.86 ( 12.25)
+Epoch: [0][4581/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.8152e+00 (5.8354e+00) Acc@1 9.77 ( 4.31) Acc@5 29.69 ( 12.26)
+Epoch: [0][4582/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.027) Loss 4.8745e+00 (5.8351e+00) Acc@1 9.77 ( 4.32) Acc@5 25.39 ( 12.26)
+Epoch: [0][4583/5004] Time 0.242 ( 0.242) Data 0.021 ( 0.027) Loss 4.5143e+00 (5.8349e+00) Acc@1 15.23 ( 4.32) Acc@5 32.81 ( 12.26)
+Epoch: [0][4584/5004] Time 0.247 ( 0.242) Data 0.029 ( 0.027) Loss 4.6275e+00 (5.8346e+00) Acc@1 14.45 ( 4.32) Acc@5 33.59 ( 12.27)
+Epoch: [0][4585/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.027) Loss 4.7700e+00 (5.8344e+00) Acc@1 14.45 ( 4.32) Acc@5 28.91 ( 12.27)
+Epoch: [0][4586/5004] Time 0.250 ( 0.242) Data 0.028 ( 0.027) Loss 4.5827e+00 (5.8341e+00) Acc@1 14.84 ( 4.32) Acc@5 31.64 ( 12.28)
+Epoch: [0][4587/5004] Time 0.252 ( 0.242) Data 0.028 ( 0.027) Loss 4.9815e+00 (5.8339e+00) Acc@1 11.72 ( 4.33) Acc@5 26.56 ( 12.28)
+Epoch: [0][4588/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.027) Loss 4.7681e+00 (5.8337e+00) Acc@1 12.89 ( 4.33) Acc@5 25.78 ( 12.28)
+Epoch: [0][4589/5004] Time 0.247 ( 0.242) Data 0.027 ( 0.027) Loss 4.7479e+00 (5.8334e+00) Acc@1 10.94 ( 4.33) Acc@5 28.12 ( 12.29)
+Epoch: [0][4590/5004] Time 0.247 ( 0.242) Data 0.027 ( 0.027) Loss 4.5086e+00 (5.8331e+00) Acc@1 15.62 ( 4.33) Acc@5 33.98 ( 12.29)
+Epoch: [0][4591/5004] Time 0.248 ( 0.242) Data 0.029 ( 0.027) Loss 4.5688e+00 (5.8329e+00) Acc@1 15.23 ( 4.33) Acc@5 29.30 ( 12.29)
+Epoch: [0][4592/5004] Time 0.244 ( 0.242) Data 0.028 ( 0.027) Loss 4.5949e+00 (5.8326e+00) Acc@1 13.28 ( 4.34) Acc@5 33.20 ( 12.30)
+Epoch: [0][4593/5004] Time 0.248 ( 0.242) Data 0.029 ( 0.027) Loss 4.7700e+00 (5.8324e+00) Acc@1 12.89 ( 4.34) Acc@5 29.30 ( 12.30)
+Epoch: [0][4594/5004] Time 0.250 ( 0.242) Data 0.027 ( 0.027) Loss 4.7095e+00 (5.8321e+00) Acc@1 9.77 ( 4.34) Acc@5 27.34 ( 12.31)
+Epoch: [0][4595/5004] Time 0.244 ( 0.242) Data 0.027 ( 0.027) Loss 4.8128e+00 (5.8319e+00) Acc@1 9.38 ( 4.34) Acc@5 26.56 ( 12.31)
+Epoch: [0][4596/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.027) Loss 4.7492e+00 (5.8317e+00) Acc@1 12.89 ( 4.34) Acc@5 29.69 ( 12.31)
+Epoch: [0][4597/5004] Time 0.249 ( 0.242) Data 0.029 ( 0.027) Loss 4.4384e+00 (5.8314e+00) Acc@1 15.62 ( 4.34) Acc@5 32.42 ( 12.32)
+Epoch: [0][4598/5004] Time 0.242 ( 0.242) Data 0.025 ( 0.027) Loss 4.6424e+00 (5.8311e+00) Acc@1 12.50 ( 4.35) Acc@5 30.86 ( 12.32)
+Epoch: [0][4599/5004] Time 0.252 ( 0.242) Data 0.028 ( 0.027) Loss 4.6497e+00 (5.8308e+00) Acc@1 10.16 ( 4.35) Acc@5 30.86 ( 12.33)
+Epoch: [0][4600/5004] Time 0.239 ( 0.242) Data 0.022 ( 0.027) Loss 4.5436e+00 (5.8306e+00) Acc@1 12.89 ( 4.35) Acc@5 29.30 ( 12.33)
+Epoch: [0][4601/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.027) Loss 4.7843e+00 (5.8303e+00) Acc@1 13.67 ( 4.35) Acc@5 28.12 ( 12.33)
+Epoch: [0][4602/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.027) Loss 4.5062e+00 (5.8301e+00) Acc@1 15.23 ( 4.35) Acc@5 33.59 ( 12.34)
+Epoch: [0][4603/5004] Time 0.243 ( 0.242) Data 0.028 ( 0.027) Loss 4.6245e+00 (5.8298e+00) Acc@1 10.55 ( 4.36) Acc@5 28.52 ( 12.34)
+Epoch: [0][4604/5004] Time 0.246 ( 0.242) Data 0.029 ( 0.027) Loss 4.8145e+00 (5.8296e+00) Acc@1 10.94 ( 4.36) Acc@5 25.78 ( 12.34)
+Epoch: [0][4605/5004] Time 0.250 ( 0.242) Data 0.029 ( 0.027) Loss 4.8745e+00 (5.8294e+00) Acc@1 12.11 ( 4.36) Acc@5 24.61 ( 12.35)
+Epoch: [0][4606/5004] Time 0.246 ( 0.242) Data 0.027 ( 0.027) Loss 4.6017e+00 (5.8291e+00) Acc@1 16.02 ( 4.36) Acc@5 33.20 ( 12.35)
+Epoch: [0][4607/5004] Time 0.249 ( 0.242) Data 0.029 ( 0.027) Loss 4.6129e+00 (5.8288e+00) Acc@1 14.84 ( 4.36) Acc@5 34.38 ( 12.36)
+Epoch: [0][4608/5004] Time 0.244 ( 0.242) Data 0.026 ( 0.027) Loss 4.4833e+00 (5.8285e+00) Acc@1 17.19 ( 4.37) Acc@5 36.33 ( 12.36)
+Epoch: [0][4609/5004] Time 0.249 ( 0.242) Data 0.028 ( 0.027) Loss 4.7052e+00 (5.8283e+00) Acc@1 11.33 ( 4.37) Acc@5 28.52 ( 12.36)
+Epoch: [0][4610/5004] Time 0.249 ( 0.242) Data 0.028 ( 0.027) Loss 4.5251e+00 (5.8280e+00) Acc@1 14.06 ( 4.37) Acc@5 33.98 ( 12.37)
+Epoch: [0][4611/5004] Time 0.250 ( 0.242) Data 0.027 ( 0.027) Loss 4.6615e+00 (5.8278e+00) Acc@1 10.16 ( 4.37) Acc@5 27.34 ( 12.37)
+Epoch: [0][4612/5004] Time 0.246 ( 0.242) Data 0.027 ( 0.027) Loss 4.2295e+00 (5.8274e+00) Acc@1 20.31 ( 4.37) Acc@5 41.02 ( 12.38)
+Epoch: [0][4613/5004] Time 0.243 ( 0.242) Data 0.028 ( 0.027) Loss 4.5539e+00 (5.8271e+00) Acc@1 12.50 ( 4.38) Acc@5 33.98 ( 12.38)
+Epoch: [0][4614/5004] Time 0.239 ( 0.242) Data 0.024 ( 0.027) Loss 4.7313e+00 (5.8269e+00) Acc@1 11.33 ( 4.38) Acc@5 27.73 ( 12.39)
+Epoch: [0][4615/5004] Time 0.246 ( 0.242) Data 0.028 ( 0.027) Loss 4.5597e+00 (5.8266e+00) Acc@1 18.36 ( 4.38) Acc@5 34.77 ( 12.39)
+Epoch: [0][4616/5004] Time 0.235 ( 0.242) Data 0.023 ( 0.027) Loss 4.8348e+00 (5.8264e+00) Acc@1 8.59 ( 4.38) Acc@5 29.69 ( 12.40)
+Epoch: [0][4617/5004] Time 0.238 ( 0.242) Data 0.026 ( 0.027) Loss 4.8232e+00 (5.8262e+00) Acc@1 11.33 ( 4.38) Acc@5 28.52 ( 12.40)
+Epoch: [0][4618/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.5981e+00 (5.8259e+00) Acc@1 8.98 ( 4.38) Acc@5 28.91 ( 12.40)
+Epoch: [0][4619/5004] Time 0.237 ( 0.242) Data 0.024 ( 0.027) Loss 4.4609e+00 (5.8256e+00) Acc@1 14.84 ( 4.39) Acc@5 33.59 ( 12.41)
+Epoch: [0][4620/5004] Time 0.241 ( 0.242) Data 0.028 ( 0.027) Loss 4.7186e+00 (5.8254e+00) Acc@1 9.38 ( 4.39) Acc@5 31.64 ( 12.41)
+Epoch: [0][4621/5004] Time 0.239 ( 0.242) Data 0.025 ( 0.027) Loss 4.6939e+00 (5.8252e+00) Acc@1 11.33 ( 4.39) Acc@5 30.86 ( 12.42)
+Epoch: [0][4622/5004] Time 0.237 ( 0.242) Data 0.025 ( 0.027) Loss 4.5211e+00 (5.8249e+00) Acc@1 14.84 ( 4.39) Acc@5 34.38 ( 12.42)
+Epoch: [0][4623/5004] Time 0.243 ( 0.242) Data 0.027 ( 0.027) Loss 4.5953e+00 (5.8246e+00) Acc@1 15.62 ( 4.39) Acc@5 30.86 ( 12.42)
+Epoch: [0][4624/5004] Time 0.235 ( 0.242) Data 0.023 ( 0.027) Loss 4.6104e+00 (5.8243e+00) Acc@1 16.02 ( 4.40) Acc@5 30.47 ( 12.43)
+Epoch: [0][4625/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.6115e+00 (5.8241e+00) Acc@1 10.55 ( 4.40) Acc@5 29.69 ( 12.43)
+Epoch: [0][4626/5004] Time 0.235 ( 0.242) Data 0.024 ( 0.027) Loss 4.8671e+00 (5.8239e+00) Acc@1 13.28 ( 4.40) Acc@5 26.95 ( 12.43)
+Epoch: [0][4627/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.8126e+00 (5.8237e+00) Acc@1 11.72 ( 4.40) Acc@5 29.30 ( 12.44)
+Epoch: [0][4628/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.6334e+00 (5.8234e+00) Acc@1 16.02 ( 4.40) Acc@5 34.38 ( 12.44)
+Epoch: [0][4629/5004] Time 0.237 ( 0.242) Data 0.026 ( 0.027) Loss 4.6718e+00 (5.8231e+00) Acc@1 12.89 ( 4.41) Acc@5 26.17 ( 12.45)
+Epoch: [0][4630/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 4.6376e+00 (5.8229e+00) Acc@1 10.16 ( 4.41) Acc@5 31.64 ( 12.45)
+Epoch: [0][4631/5004] Time 0.238 ( 0.242) Data 0.026 ( 0.027) Loss 4.8069e+00 (5.8227e+00) Acc@1 12.50 ( 4.41) Acc@5 28.91 ( 12.45)
+Epoch: [0][4632/5004] Time 0.238 ( 0.242) Data 0.026 ( 0.027) Loss 4.6435e+00 (5.8224e+00) Acc@1 13.67 ( 4.41) Acc@5 36.33 ( 12.46)
+Epoch: [0][4633/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.4340e+00 (5.8221e+00) Acc@1 15.62 ( 4.41) Acc@5 33.59 ( 12.46)
+Epoch: [0][4634/5004] Time 0.240 ( 0.242) Data 0.026 ( 0.027) Loss 4.5987e+00 (5.8219e+00) Acc@1 14.84 ( 4.41) Acc@5 30.86 ( 12.47)
+Epoch: [0][4635/5004] Time 0.238 ( 0.242) Data 0.025 ( 0.027) Loss 4.6143e+00 (5.8216e+00) Acc@1 14.06 ( 4.42) Acc@5 30.08 ( 12.47)
+Epoch: [0][4636/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.6637e+00 (5.8213e+00) Acc@1 14.84 ( 4.42) Acc@5 33.20 ( 12.48)
+Epoch: [0][4637/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 4.5957e+00 (5.8211e+00) Acc@1 14.06 ( 4.42) Acc@5 28.91 ( 12.48)
+Epoch: [0][4638/5004] Time 0.238 ( 0.242) Data 0.025 ( 0.027) Loss 4.6917e+00 (5.8208e+00) Acc@1 16.41 ( 4.42) Acc@5 32.03 ( 12.48)
+Epoch: [0][4639/5004] Time 0.236 ( 0.242) Data 0.025 ( 0.027) Loss 4.5770e+00 (5.8206e+00) Acc@1 16.80 ( 4.43) Acc@5 34.77 ( 12.49)
+Epoch: [0][4640/5004] Time 0.237 ( 0.242) Data 0.027 ( 0.027) Loss 4.8146e+00 (5.8204e+00) Acc@1 10.16 ( 4.43) Acc@5 23.83 ( 12.49)
+Epoch: [0][4641/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 4.8137e+00 (5.8201e+00) Acc@1 9.38 ( 4.43) Acc@5 25.39 ( 12.49)
+Epoch: [0][4642/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 4.5229e+00 (5.8199e+00) Acc@1 12.50 ( 4.43) Acc@5 35.55 ( 12.50)
+Epoch: [0][4643/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.7612e+00 (5.8196e+00) Acc@1 10.94 ( 4.43) Acc@5 27.34 ( 12.50)
+Epoch: [0][4644/5004] Time 0.240 ( 0.242) Data 0.026 ( 0.027) Loss 4.6934e+00 (5.8194e+00) Acc@1 12.89 ( 4.43) Acc@5 28.52 ( 12.50)
+Epoch: [0][4645/5004] Time 0.236 ( 0.242) Data 0.025 ( 0.027) Loss 4.7597e+00 (5.8192e+00) Acc@1 8.59 ( 4.43) Acc@5 26.56 ( 12.51)
+Epoch: [0][4646/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.7563e+00 (5.8189e+00) Acc@1 10.55 ( 4.44) Acc@5 24.22 ( 12.51)
+Epoch: [0][4647/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 4.5309e+00 (5.8186e+00) Acc@1 13.67 ( 4.44) Acc@5 32.03 ( 12.51)
+Epoch: [0][4648/5004] Time 0.244 ( 0.242) Data 0.024 ( 0.027) Loss 4.8763e+00 (5.8184e+00) Acc@1 11.33 ( 4.44) Acc@5 26.56 ( 12.52)
+Epoch: [0][4649/5004] Time 0.235 ( 0.242) Data 0.019 ( 0.027) Loss 4.7729e+00 (5.8182e+00) Acc@1 10.16 ( 4.44) Acc@5 30.08 ( 12.52)
+Epoch: [0][4650/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 4.5213e+00 (5.8179e+00) Acc@1 14.45 ( 4.44) Acc@5 30.08 ( 12.53)
+Epoch: [0][4651/5004] Time 0.238 ( 0.242) Data 0.022 ( 0.027) Loss 4.5484e+00 (5.8177e+00) Acc@1 15.23 ( 4.45) Acc@5 29.69 ( 12.53)
+Epoch: [0][4652/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 4.7515e+00 (5.8174e+00) Acc@1 12.89 ( 4.45) Acc@5 26.56 ( 12.53)
+Epoch: [0][4653/5004] Time 0.241 ( 0.242) Data 0.022 ( 0.027) Loss 4.8079e+00 (5.8172e+00) Acc@1 8.98 ( 4.45) Acc@5 24.61 ( 12.53)
+Epoch: [0][4654/5004] Time 0.239 ( 0.242) Data 0.021 ( 0.027) Loss 4.4443e+00 (5.8169e+00) Acc@1 13.67 ( 4.45) Acc@5 33.59 ( 12.54)
+Epoch: [0][4655/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 4.7435e+00 (5.8167e+00) Acc@1 11.33 ( 4.45) Acc@5 29.69 ( 12.54)
+Epoch: [0][4656/5004] Time 0.238 ( 0.242) Data 0.022 ( 0.027) Loss 4.5957e+00 (5.8164e+00) Acc@1 10.94 ( 4.45) Acc@5 29.30 ( 12.55)
+Epoch: [0][4657/5004] Time 0.239 ( 0.242) Data 0.024 ( 0.027) Loss 4.6892e+00 (5.8162e+00) Acc@1 9.38 ( 4.45) Acc@5 27.73 ( 12.55)
+Epoch: [0][4658/5004] Time 0.241 ( 0.242) Data 0.024 ( 0.027) Loss 4.7305e+00 (5.8160e+00) Acc@1 11.72 ( 4.46) Acc@5 30.86 ( 12.55)
+Epoch: [0][4659/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.7977e+00 (5.8157e+00) Acc@1 7.03 ( 4.46) Acc@5 25.00 ( 12.56)
+Epoch: [0][4660/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 4.5781e+00 (5.8155e+00) Acc@1 12.89 ( 4.46) Acc@5 28.91 ( 12.56)
+Epoch: [0][4661/5004] Time 0.245 ( 0.242) Data 0.019 ( 0.027) Loss 4.8025e+00 (5.8153e+00) Acc@1 10.94 ( 4.46) Acc@5 28.52 ( 12.56)
+Epoch: [0][4662/5004] Time 0.233 ( 0.242) Data 0.016 ( 0.027) Loss 4.7691e+00 (5.8150e+00) Acc@1 11.33 ( 4.46) Acc@5 31.25 ( 12.57)
+Epoch: [0][4663/5004] Time 0.249 ( 0.242) Data 0.023 ( 0.027) Loss 4.5089e+00 (5.8148e+00) Acc@1 15.62 ( 4.46) Acc@5 30.86 ( 12.57)
+Epoch: [0][4664/5004] Time 0.250 ( 0.242) Data 0.019 ( 0.027) Loss 4.7003e+00 (5.8145e+00) Acc@1 10.16 ( 4.46) Acc@5 28.91 ( 12.57)
+Epoch: [0][4665/5004] Time 0.241 ( 0.242) Data 0.018 ( 0.027) Loss 4.6087e+00 (5.8143e+00) Acc@1 16.80 ( 4.47) Acc@5 35.16 ( 12.58)
+Epoch: [0][4666/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.5580e+00 (5.8140e+00) Acc@1 10.94 ( 4.47) Acc@5 29.69 ( 12.58)
+Epoch: [0][4667/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 4.4228e+00 (5.8137e+00) Acc@1 11.72 ( 4.47) Acc@5 32.03 ( 12.59)
+Epoch: [0][4668/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.7961e+00 (5.8135e+00) Acc@1 9.77 ( 4.47) Acc@5 27.73 ( 12.59)
+Epoch: [0][4669/5004] Time 0.246 ( 0.242) Data 0.023 ( 0.027) Loss 4.5645e+00 (5.8132e+00) Acc@1 14.45 ( 4.47) Acc@5 31.25 ( 12.59)
+Epoch: [0][4670/5004] Time 0.247 ( 0.242) Data 0.022 ( 0.027) Loss 4.8219e+00 (5.8130e+00) Acc@1 15.62 ( 4.48) Acc@5 26.56 ( 12.60)
+Epoch: [0][4671/5004] Time 0.242 ( 0.242) Data 0.019 ( 0.027) Loss 4.7460e+00 (5.8128e+00) Acc@1 11.72 ( 4.48) Acc@5 25.00 ( 12.60)
+Epoch: [0][4672/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.027) Loss 4.6655e+00 (5.8125e+00) Acc@1 12.50 ( 4.48) Acc@5 29.69 ( 12.60)
+Epoch: [0][4673/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 4.7843e+00 (5.8123e+00) Acc@1 12.11 ( 4.48) Acc@5 27.34 ( 12.61)
+Epoch: [0][4674/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.6681e+00 (5.8121e+00) Acc@1 14.06 ( 4.48) Acc@5 32.81 ( 12.61)
+Epoch: [0][4675/5004] Time 0.244 ( 0.242) Data 0.021 ( 0.027) Loss 4.5005e+00 (5.8118e+00) Acc@1 11.72 ( 4.48) Acc@5 33.20 ( 12.62)
+Epoch: [0][4676/5004] Time 0.248 ( 0.242) Data 0.019 ( 0.027) Loss 4.6895e+00 (5.8115e+00) Acc@1 9.77 ( 4.49) Acc@5 26.95 ( 12.62)
+Epoch: [0][4677/5004] Time 0.245 ( 0.242) Data 0.016 ( 0.027) Loss 4.6347e+00 (5.8113e+00) Acc@1 11.33 ( 4.49) Acc@5 30.86 ( 12.62)
+Epoch: [0][4678/5004] Time 0.242 ( 0.242) Data 0.021 ( 0.027) Loss 4.8816e+00 (5.8111e+00) Acc@1 10.94 ( 4.49) Acc@5 25.00 ( 12.63)
+Epoch: [0][4679/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.6986e+00 (5.8108e+00) Acc@1 13.28 ( 4.49) Acc@5 29.69 ( 12.63)
+Epoch: [0][4680/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 4.5890e+00 (5.8106e+00) Acc@1 15.23 ( 4.49) Acc@5 32.03 ( 12.63)
+Epoch: [0][4681/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.6973e+00 (5.8103e+00) Acc@1 10.55 ( 4.49) Acc@5 28.12 ( 12.64)
+Epoch: [0][4682/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.8257e+00 (5.8101e+00) Acc@1 11.72 ( 4.50) Acc@5 30.47 ( 12.64)
+Epoch: [0][4683/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.7764e+00 (5.8099e+00) Acc@1 8.20 ( 4.50) Acc@5 25.78 ( 12.64)
+Epoch: [0][4684/5004] Time 0.248 ( 0.242) Data 0.022 ( 0.027) Loss 4.4778e+00 (5.8096e+00) Acc@1 11.72 ( 4.50) Acc@5 32.81 ( 12.65)
+Epoch: [0][4685/5004] Time 0.245 ( 0.242) Data 0.019 ( 0.027) Loss 4.5912e+00 (5.8094e+00) Acc@1 12.50 ( 4.50) Acc@5 28.12 ( 12.65)
+Epoch: [0][4686/5004] Time 0.248 ( 0.242) Data 0.020 ( 0.027) Loss 4.7656e+00 (5.8091e+00) Acc@1 12.11 ( 4.50) Acc@5 27.73 ( 12.65)
+Epoch: [0][4687/5004] Time 0.242 ( 0.242) Data 0.020 ( 0.027) Loss 4.7324e+00 (5.8089e+00) Acc@1 8.98 ( 4.50) Acc@5 27.73 ( 12.66)
+Epoch: [0][4688/5004] Time 0.245 ( 0.242) Data 0.022 ( 0.027) Loss 4.5279e+00 (5.8086e+00) Acc@1 13.67 ( 4.50) Acc@5 33.59 ( 12.66)
+Epoch: [0][4689/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.8671e+00 (5.8084e+00) Acc@1 8.98 ( 4.50) Acc@5 22.66 ( 12.66)
+Epoch: [0][4690/5004] Time 0.243 ( 0.242) Data 0.021 ( 0.027) Loss 4.5185e+00 (5.8082e+00) Acc@1 17.58 ( 4.51) Acc@5 32.42 ( 12.67)
+Epoch: [0][4691/5004] Time 0.247 ( 0.242) Data 0.022 ( 0.027) Loss 4.7548e+00 (5.8079e+00) Acc@1 9.38 ( 4.51) Acc@5 27.73 ( 12.67)
+Epoch: [0][4692/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.5431e+00 (5.8077e+00) Acc@1 12.50 ( 4.51) Acc@5 31.64 ( 12.68)
+Epoch: [0][4693/5004] Time 0.245 ( 0.242) Data 0.021 ( 0.027) Loss 4.7991e+00 (5.8075e+00) Acc@1 12.89 ( 4.51) Acc@5 27.34 ( 12.68)
+Epoch: [0][4694/5004] Time 0.246 ( 0.242) Data 0.020 ( 0.027) Loss 4.7156e+00 (5.8072e+00) Acc@1 8.98 ( 4.51) Acc@5 31.64 ( 12.68)
+Epoch: [0][4695/5004] Time 0.241 ( 0.242) Data 0.022 ( 0.027) Loss 4.6825e+00 (5.8070e+00) Acc@1 14.84 ( 4.52) Acc@5 32.03 ( 12.69)
+Epoch: [0][4696/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.6453e+00 (5.8067e+00) Acc@1 8.98 ( 4.52) Acc@5 27.73 ( 12.69)
+Epoch: [0][4697/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 4.6232e+00 (5.8065e+00) Acc@1 10.16 ( 4.52) Acc@5 31.25 ( 12.69)
+Epoch: [0][4698/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 4.7666e+00 (5.8063e+00) Acc@1 12.11 ( 4.52) Acc@5 29.69 ( 12.70)
+Epoch: [0][4699/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.027) Loss 4.6799e+00 (5.8060e+00) Acc@1 15.23 ( 4.52) Acc@5 30.86 ( 12.70)
+Epoch: [0][4700/5004] Time 0.248 ( 0.242) Data 0.025 ( 0.027) Loss 4.6130e+00 (5.8058e+00) Acc@1 12.11 ( 4.52) Acc@5 32.03 ( 12.71)
+Epoch: [0][4701/5004] Time 0.237 ( 0.242) Data 0.020 ( 0.027) Loss 4.6456e+00 (5.8055e+00) Acc@1 16.41 ( 4.53) Acc@5 32.42 ( 12.71)
+Epoch: [0][4702/5004] Time 0.240 ( 0.242) Data 0.024 ( 0.027) Loss 4.8592e+00 (5.8053e+00) Acc@1 10.16 ( 4.53) Acc@5 26.56 ( 12.71)
+Epoch: [0][4703/5004] Time 0.250 ( 0.242) Data 0.025 ( 0.027) Loss 4.6683e+00 (5.8051e+00) Acc@1 10.16 ( 4.53) Acc@5 31.25 ( 12.72)
+Epoch: [0][4704/5004] Time 0.240 ( 0.242) Data 0.023 ( 0.027) Loss 4.7765e+00 (5.8049e+00) Acc@1 13.67 ( 4.53) Acc@5 26.56 ( 12.72)
+Epoch: [0][4705/5004] Time 0.242 ( 0.242) Data 0.024 ( 0.027) Loss 4.5973e+00 (5.8046e+00) Acc@1 16.41 ( 4.53) Acc@5 33.20 ( 12.72)
+Epoch: [0][4706/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.027) Loss 4.7375e+00 (5.8044e+00) Acc@1 12.50 ( 4.53) Acc@5 30.86 ( 12.73)
+Epoch: [0][4707/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 4.8271e+00 (5.8042e+00) Acc@1 12.11 ( 4.54) Acc@5 27.34 ( 12.73)
+Epoch: [0][4708/5004] Time 0.246 ( 0.242) Data 0.024 ( 0.027) Loss 4.7242e+00 (5.8039e+00) Acc@1 13.67 ( 4.54) Acc@5 30.86 ( 12.73)
+Epoch: [0][4709/5004] Time 0.244 ( 0.242) Data 0.022 ( 0.027) Loss 4.5052e+00 (5.8037e+00) Acc@1 9.77 ( 4.54) Acc@5 32.81 ( 12.74)
+Epoch: [0][4710/5004] Time 0.243 ( 0.242) Data 0.023 ( 0.027) Loss 4.4145e+00 (5.8034e+00) Acc@1 14.45 ( 4.54) Acc@5 36.33 ( 12.74)
+Epoch: [0][4711/5004] Time 0.245 ( 0.242) Data 0.023 ( 0.027) Loss 4.8311e+00 (5.8032e+00) Acc@1 14.84 ( 4.54) Acc@5 29.30 ( 12.75)
+Epoch: [0][4712/5004] Time 0.244 ( 0.242) Data 0.023 ( 0.027) Loss 4.5904e+00 (5.8029e+00) Acc@1 14.84 ( 4.54) Acc@5 28.91 ( 12.75)
+Epoch: [0][4713/5004] Time 0.250 ( 0.242) Data 0.022 ( 0.027) Loss 4.6694e+00 (5.8027e+00) Acc@1 11.72 ( 4.55) Acc@5 27.73 ( 12.75)
+Epoch: [0][4714/5004] Time 0.234 ( 0.242) Data 0.021 ( 0.027) Loss 4.5762e+00 (5.8024e+00) Acc@1 13.67 ( 4.55) Acc@5 31.64 ( 12.76)
+Epoch: [0][4715/5004] Time 0.244 ( 0.242) Data 0.027 ( 0.027) Loss 4.6069e+00 (5.8022e+00) Acc@1 14.06 ( 4.55) Acc@5 28.52 ( 12.76)
+Epoch: [0][4716/5004] Time 0.237 ( 0.242) Data 0.023 ( 0.027) Loss 4.4011e+00 (5.8019e+00) Acc@1 16.41 ( 4.55) Acc@5 33.98 ( 12.77)
+Epoch: [0][4717/5004] Time 0.237 ( 0.242) Data 0.028 ( 0.027) Loss 4.4635e+00 (5.8016e+00) Acc@1 15.23 ( 4.56) Acc@5 34.77 ( 12.77)
+Epoch: [0][4718/5004] Time 0.241 ( 0.242) Data 0.028 ( 0.027) Loss 4.7182e+00 (5.8013e+00) Acc@1 13.28 ( 4.56) Acc@5 30.47 ( 12.77)
+Epoch: [0][4719/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 4.7779e+00 (5.8011e+00) Acc@1 14.84 ( 4.56) Acc@5 28.52 ( 12.78)
+Epoch: [0][4720/5004] Time 0.237 ( 0.242) Data 0.027 ( 0.027) Loss 4.5998e+00 (5.8009e+00) Acc@1 10.94 ( 4.56) Acc@5 28.12 ( 12.78)
+Epoch: [0][4721/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.8518e+00 (5.8007e+00) Acc@1 8.59 ( 4.56) Acc@5 26.56 ( 12.78)
+Epoch: [0][4722/5004] Time 0.238 ( 0.242) Data 0.027 ( 0.027) Loss 4.6066e+00 (5.8004e+00) Acc@1 14.06 ( 4.56) Acc@5 31.64 ( 12.79)
+Epoch: [0][4723/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 4.6736e+00 (5.8002e+00) Acc@1 10.16 ( 4.56) Acc@5 28.12 ( 12.79)
+Epoch: [0][4724/5004] Time 0.239 ( 0.242) Data 0.026 ( 0.027) Loss 4.7091e+00 (5.8000e+00) Acc@1 10.94 ( 4.57) Acc@5 32.03 ( 12.79)
+Epoch: [0][4725/5004] Time 0.236 ( 0.242) Data 0.026 ( 0.027) Loss 4.7494e+00 (5.7997e+00) Acc@1 10.94 ( 4.57) Acc@5 28.52 ( 12.80)
+Epoch: [0][4726/5004] Time 0.239 ( 0.242) Data 0.028 ( 0.027) Loss 4.7879e+00 (5.7995e+00) Acc@1 10.55 ( 4.57) Acc@5 27.34 ( 12.80)
+Epoch: [0][4727/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.6470e+00 (5.7993e+00) Acc@1 12.11 ( 4.57) Acc@5 28.91 ( 12.80)
+Epoch: [0][4728/5004] Time 0.245 ( 0.242) Data 0.026 ( 0.027) Loss 4.6169e+00 (5.7990e+00) Acc@1 12.50 ( 4.57) Acc@5 30.86 ( 12.81)
+Epoch: [0][4729/5004] Time 0.243 ( 0.242) Data 0.024 ( 0.027) Loss 4.5448e+00 (5.7988e+00) Acc@1 11.33 ( 4.57) Acc@5 31.25 ( 12.81)
+Epoch: [0][4730/5004] Time 0.245 ( 0.242) Data 0.025 ( 0.027) Loss 4.9068e+00 (5.7986e+00) Acc@1 9.77 ( 4.57) Acc@5 25.39 ( 12.81)
+Epoch: [0][4731/5004] Time 0.234 ( 0.242) Data 0.022 ( 0.027) Loss 4.8506e+00 (5.7984e+00) Acc@1 8.20 ( 4.58) Acc@5 23.05 ( 12.82)
+Epoch: [0][4732/5004] Time 0.237 ( 0.242) Data 0.026 ( 0.027) Loss 4.6110e+00 (5.7981e+00) Acc@1 14.06 ( 4.58) Acc@5 31.25 ( 12.82)
+Epoch: [0][4733/5004] Time 0.241 ( 0.242) Data 0.027 ( 0.027) Loss 4.7534e+00 (5.7979e+00) Acc@1 13.67 ( 4.58) Acc@5 29.69 ( 12.82)
+Epoch: [0][4734/5004] Time 0.238 ( 0.242) Data 0.026 ( 0.027) Loss 4.8506e+00 (5.7977e+00) Acc@1 8.98 ( 4.58) Acc@5 26.17 ( 12.83)
+Epoch: [0][4735/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.5530e+00 (5.7974e+00) Acc@1 14.45 ( 4.58) Acc@5 32.03 ( 12.83)
+Epoch: [0][4736/5004] Time 0.243 ( 0.242) Data 0.028 ( 0.027) Loss 4.6197e+00 (5.7972e+00) Acc@1 11.72 ( 4.58) Acc@5 28.52 ( 12.83)
+Epoch: [0][4737/5004] Time 0.242 ( 0.242) Data 0.026 ( 0.027) Loss 4.5474e+00 (5.7969e+00) Acc@1 9.77 ( 4.58) Acc@5 30.86 ( 12.84)
+Epoch: [0][4738/5004] Time 0.236 ( 0.242) Data 0.025 ( 0.027) Loss 4.6254e+00 (5.7967e+00) Acc@1 13.28 ( 4.59) Acc@5 29.30 ( 12.84)
+Epoch: [0][4739/5004] Time 0.245 ( 0.242) Data 0.031 ( 0.027) Loss 4.4978e+00 (5.7964e+00) Acc@1 12.89 ( 4.59) Acc@5 35.16 ( 12.85)
+Epoch: [0][4740/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.7968e+00 (5.7962e+00) Acc@1 7.42 ( 4.59) Acc@5 28.12 ( 12.85)
+Epoch: [0][4741/5004] Time 0.242 ( 0.242) Data 0.027 ( 0.027) Loss 4.5368e+00 (5.7959e+00) Acc@1 9.77 ( 4.59) Acc@5 32.81 ( 12.85)
+Epoch: [0][4742/5004] Time 0.240 ( 0.242) Data 0.027 ( 0.027) Loss 4.6949e+00 (5.7957e+00) Acc@1 14.45 ( 4.59) Acc@5 30.47 ( 12.86)
+Epoch: [0][4743/5004] Time 0.237 ( 0.242) Data 0.025 ( 0.027) Loss 4.5748e+00 (5.7954e+00) Acc@1 14.45 ( 4.59) Acc@5 33.20 ( 12.86)
+Epoch: [0][4744/5004] Time 0.242 ( 0.242) Data 0.028 ( 0.027) Loss 4.5211e+00 (5.7952e+00) Acc@1 12.50 ( 4.60) Acc@5 30.47 ( 12.87)
+Epoch: [0][4745/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.5851e+00 (5.7949e+00) Acc@1 12.50 ( 4.60) Acc@5 34.77 ( 12.87)
+Epoch: [0][4746/5004] Time 0.244 ( 0.242) Data 0.028 ( 0.027) Loss 4.4701e+00 (5.7946e+00) Acc@1 18.75 ( 4.60) Acc@5 37.50 ( 12.88)
+Epoch: [0][4747/5004] Time 0.235 ( 0.242) Data 0.022 ( 0.027) Loss 4.6483e+00 (5.7944e+00) Acc@1 13.28 ( 4.60) Acc@5 32.03 ( 12.88)
+Epoch: [0][4748/5004] Time 0.263 ( 0.242) Data 0.050 ( 0.027) Loss 4.5629e+00 (5.7941e+00) Acc@1 13.28 ( 4.60) Acc@5 33.20 ( 12.88)
+Epoch: [0][4749/5004] Time 0.236 ( 0.242) Data 0.025 ( 0.027) Loss 4.8159e+00 (5.7939e+00) Acc@1 14.45 ( 4.61) Acc@5 32.81 ( 12.89)
+Epoch: [0][4750/5004] Time 0.239 ( 0.242) Data 0.027 ( 0.027) Loss 4.6783e+00 (5.7937e+00) Acc@1 11.72 ( 4.61) Acc@5 30.47 ( 12.89)
+Epoch: [0][4751/5004] Time 0.241 ( 0.242) Data 0.026 ( 0.027) Loss 4.6726e+00 (5.7935e+00) Acc@1 12.89 ( 4.61) Acc@5 25.00 ( 12.89)
+Epoch: [0][4752/5004] Time 0.238 ( 0.242) Data 0.025 ( 0.027) Loss 4.5729e+00 (5.7932e+00) Acc@1 14.06 ( 4.61) Acc@5 30.08 ( 12.90)
+Epoch: [0][4753/5004] Time 0.237 ( 0.242) Data 0.028 ( 0.027) Loss 4.5145e+00 (5.7929e+00) Acc@1 15.62 ( 4.61) Acc@5 32.03 ( 12.90)
+Epoch: [0][4754/5004] Time 0.241 ( 0.242) Data 0.029 ( 0.027) Loss 4.4904e+00 (5.7927e+00) Acc@1 17.19 ( 4.62) Acc@5 32.81 ( 12.91)
+Epoch: [0][4755/5004] Time 0.240 ( 0.242) Data 0.028 ( 0.027) Loss 4.7165e+00 (5.7924e+00) Acc@1 13.28 ( 4.62) Acc@5 27.34 ( 12.91)
+Epoch: [0][4756/5004] Time 0.243 ( 0.242) Data 0.026 ( 0.027) Loss 4.9451e+00 (5.7923e+00) Acc@1 8.20 ( 4.62) Acc@5 24.22 ( 12.91)
+Epoch: [0][4757/5004] Time 0.247 ( 0.242) Data 0.023 ( 0.027) Loss 4.8064e+00 (5.7920e+00) Acc@1 10.16 ( 4.62) Acc@5 28.91 ( 12.92)
+Epoch: [0][4758/5004] Time 0.239 ( 0.242) Data 0.023 ( 0.027) Loss 4.6810e+00 (5.7918e+00) Acc@1 14.06 ( 4.62) Acc@5 27.34 ( 12.92)
+Epoch: [0][4759/5004] Time 0.244 ( 0.242) Data 0.025 ( 0.027) Loss 4.5679e+00 (5.7916e+00) Acc@1 10.55 ( 4.62) Acc@5 29.69 ( 12.92)
+Epoch: [0][4760/5004] Time 0.281 ( 0.242) Data 0.021 ( 0.027) Loss 4.8969e+00 (5.7914e+00) Acc@1 8.98 ( 4.62) Acc@5 30.08 ( 12.93)
+Epoch: [0][4761/5004] Time 0.255 ( 0.242) Data 0.011 ( 0.027) Loss 4.5524e+00 (5.7911e+00) Acc@1 10.94 ( 4.63) Acc@5 30.47 ( 12.93)
+Epoch: [0][4762/5004] Time 0.248 ( 0.242) Data 0.014 ( 0.027) Loss 4.6866e+00 (5.7909e+00) Acc@1 11.72 ( 4.63) Acc@5 25.39 ( 12.93)
+Epoch: [0][4763/5004] Time 0.242 ( 0.242) Data 0.012 ( 0.027) Loss 4.5822e+00 (5.7906e+00) Acc@1 14.84 ( 4.63) Acc@5 31.25 ( 12.94)
+Epoch: [0][4764/5004] Time 0.241 ( 0.242) Data 0.020 ( 0.027) Loss 4.3487e+00 (5.7903e+00) Acc@1 17.97 ( 4.63) Acc@5 36.33 ( 12.94)
+Epoch: [0][4765/5004] Time 0.239 ( 0.242) Data 0.019 ( 0.027) Loss 4.6049e+00 (5.7901e+00) Acc@1 14.84 ( 4.63) Acc@5 32.03 ( 12.94)
+Epoch: [0][4766/5004] Time 0.246 ( 0.242) Data 0.021 ( 0.027) Loss 4.7398e+00 (5.7898e+00) Acc@1 10.55 ( 4.64) Acc@5 29.69 ( 12.95)
+Epoch: [0][4767/5004] Time 0.241 ( 0.242) Data 0.017 ( 0.027) Loss 4.8197e+00 (5.7896e+00) Acc@1 10.94 ( 4.64) Acc@5 29.69 ( 12.95)
+Epoch: [0][4768/5004] Time 0.247 ( 0.242) Data 0.017 ( 0.027) Loss 4.4371e+00 (5.7894e+00) Acc@1 15.23 ( 4.64) Acc@5 39.45 ( 12.96)
+Epoch: [0][4769/5004] Time 0.244 ( 0.242) Data 0.014 ( 0.027) Loss 4.5410e+00 (5.7891e+00) Acc@1 13.28 ( 4.64) Acc@5 36.72 ( 12.96)
+Epoch: [0][4770/5004] Time 0.244 ( 0.242) Data 0.017 ( 0.027) Loss 4.5339e+00 (5.7888e+00) Acc@1 13.67 ( 4.64) Acc@5 31.64 ( 12.97)
+Epoch: [0][4771/5004] Time 0.241 ( 0.242) Data 0.018 ( 0.027) Loss 4.7093e+00 (5.7886e+00) Acc@1 10.94 ( 4.64) Acc@5 27.34 ( 12.97)
+Epoch: [0][4772/5004] Time 0.246 ( 0.242) Data 0.019 ( 0.027) Loss 4.4355e+00 (5.7883e+00) Acc@1 11.33 ( 4.65) Acc@5 37.89 ( 12.97)
+Epoch: [0][4773/5004] Time 0.239 ( 0.242) Data 0.017 ( 0.027) Loss 4.5648e+00 (5.7881e+00) Acc@1 13.28 ( 4.65) Acc@5 32.03 ( 12.98)
+Epoch: [0][4774/5004] Time 0.243 ( 0.242) Data 0.019 ( 0.027) Loss 4.7759e+00 (5.7879e+00) Acc@1 13.28 ( 4.65) Acc@5 28.52 ( 12.98)
+Epoch: [0][4775/5004] Time 0.243 ( 0.242) Data 0.019 ( 0.027) Loss 4.7626e+00 (5.7876e+00) Acc@1 10.94 ( 4.65) Acc@5 28.52 ( 12.98)
+Epoch: [0][4776/5004] Time 0.241 ( 0.242) Data 0.018 ( 0.027) Loss 4.4855e+00 (5.7874e+00) Acc@1 16.41 ( 4.65) Acc@5 34.38 ( 12.99)
+Epoch: [0][4777/5004] Time 0.243 ( 0.242) Data 0.019 ( 0.027) Loss 4.6731e+00 (5.7871e+00) Acc@1 12.89 ( 4.65) Acc@5 29.69 ( 12.99)
+Epoch: [0][4778/5004] Time 0.235 ( 0.242) Data 0.016 ( 0.027) Loss 4.6207e+00 (5.7869e+00) Acc@1 16.02 ( 4.66) Acc@5 34.38 ( 13.00)
+Epoch: [0][4779/5004] Time 0.242 ( 0.242) Data 0.022 ( 0.027) Loss 4.7113e+00 (5.7867e+00) Acc@1 11.33 ( 4.66) Acc@5 28.12 ( 13.00)
+Epoch: [0][4780/5004] Time 0.237 ( 0.242) Data 0.021 ( 0.027) Loss 4.6095e+00 (5.7864e+00) Acc@1 14.06 ( 4.66) Acc@5 29.69 ( 13.00)
+Epoch: [0][4781/5004] Time 0.239 ( 0.242) Data 0.023 ( 0.027) Loss 4.7878e+00 (5.7862e+00) Acc@1 14.84 ( 4.66) Acc@5 30.08 ( 13.01)
+Epoch: [0][4782/5004] Time 0.238 ( 0.242) Data 0.023 ( 0.027) Loss 4.5264e+00 (5.7860e+00) Acc@1 10.55 ( 4.66) Acc@5 33.20 ( 13.01)
+Epoch: [0][4783/5004] Time 0.242 ( 0.242) Data 0.023 ( 0.027) Loss 4.6610e+00 (5.7857e+00) Acc@1 15.23 ( 4.67) Acc@5 31.64 ( 13.02)
+Epoch: [0][4784/5004] Time 0.243 ( 0.242) Data 0.022 ( 0.027) Loss 4.7308e+00 (5.7855e+00) Acc@1 13.28 ( 4.67) Acc@5 25.00 ( 13.02)
+Epoch: [0][4785/5004] Time 0.246 ( 0.242) Data 0.022 ( 0.027) Loss 4.6948e+00 (5.7853e+00) Acc@1 14.06 ( 4.67) Acc@5 34.38 ( 13.02)
+Epoch: [0][4786/5004] Time 0.245 ( 0.242) Data 0.019 ( 0.027) Loss 4.5016e+00 (5.7850e+00) Acc@1 16.41 ( 4.67) Acc@5 37.11 ( 13.03)
+Epoch: [0][4787/5004] Time 0.243 ( 0.242) Data 0.017 ( 0.027) Loss 4.6564e+00 (5.7848e+00) Acc@1 14.06 ( 4.67) Acc@5 32.03 ( 13.03)
+Epoch: [0][4788/5004] Time 0.239 ( 0.242) Data 0.017 ( 0.027) Loss 4.5723e+00 (5.7845e+00) Acc@1 14.84 ( 4.68) Acc@5 31.25 ( 13.04)
+Epoch: [0][4789/5004] Time 0.245 ( 0.242) Data 0.018 ( 0.027) Loss 4.7626e+00 (5.7843e+00) Acc@1 10.16 ( 4.68) Acc@5 25.78 ( 13.04)
+Epoch: [0][4790/5004] Time 0.250 ( 0.242) Data 0.015 ( 0.027) Loss 4.6500e+00 (5.7841e+00) Acc@1 10.94 ( 4.68) Acc@5 28.52 ( 13.04)
+Epoch: [0][4791/5004] Time 0.250 ( 0.242) Data 0.012 ( 0.027) Loss 4.6446e+00 (5.7838e+00) Acc@1 12.89 ( 4.68) Acc@5 30.08 ( 13.04)
+Epoch: [0][4792/5004] Time 0.255 ( 0.242) Data 0.015 ( 0.027) Loss 4.6717e+00 (5.7836e+00) Acc@1 14.06 ( 4.68) Acc@5 33.98 ( 13.05)
+Epoch: [0][4793/5004] Time 0.251 ( 0.242) Data 0.012 ( 0.027) Loss 4.8052e+00 (5.7834e+00) Acc@1 10.55 ( 4.68) Acc@5 28.52 ( 13.05)
+Epoch: [0][4794/5004] Time 0.246 ( 0.242) Data 0.014 ( 0.027) Loss 4.7691e+00 (5.7832e+00) Acc@1 10.94 ( 4.68) Acc@5 28.91 ( 13.06)
+Epoch: [0][4795/5004] Time 0.245 ( 0.242) Data 0.015 ( 0.027) Loss 4.8222e+00 (5.7830e+00) Acc@1 10.94 ( 4.69) Acc@5 26.95 ( 13.06)
+Epoch: [0][4796/5004] Time 0.242 ( 0.242) Data 0.016 ( 0.027) Loss 4.4712e+00 (5.7827e+00) Acc@1 11.72 ( 4.69) Acc@5 37.11 ( 13.06)
+Epoch: [0][4797/5004] Time 0.246 ( 0.242) Data 0.015 ( 0.027) Loss 4.6917e+00 (5.7825e+00) Acc@1 12.11 ( 4.69) Acc@5 32.03 ( 13.07)
+Epoch: [0][4798/5004] Time 0.250 ( 0.242) Data 0.013 ( 0.027) Loss 4.5093e+00 (5.7822e+00) Acc@1 14.84 ( 4.69) Acc@5 32.81 ( 13.07)
+Epoch: [0][4799/5004] Time 0.250 ( 0.242) Data 0.013 ( 0.027) Loss 4.6869e+00 (5.7820e+00) Acc@1 9.38 ( 4.69) Acc@5 24.61 ( 13.07)
+Epoch: [0][4800/5004] Time 0.252 ( 0.242) Data 0.014 ( 0.027) Loss 4.4076e+00 (5.7817e+00) Acc@1 15.23 ( 4.69) Acc@5 37.11 ( 13.08)
+Epoch: [0][4801/5004] Time 0.254 ( 0.242) Data 0.014 ( 0.027) Loss 4.9126e+00 (5.7815e+00) Acc@1 10.94 ( 4.70) Acc@5 25.78 ( 13.08)
+Epoch: [0][4802/5004] Time 0.251 ( 0.242) Data 0.013 ( 0.027) Loss 4.4704e+00 (5.7812e+00) Acc@1 10.94 ( 4.70) Acc@5 31.64 ( 13.09)
+Epoch: [0][4803/5004] Time 0.247 ( 0.242) Data 0.014 ( 0.027) Loss 4.6121e+00 (5.7810e+00) Acc@1 14.06 ( 4.70) Acc@5 30.08 ( 13.09)
+Epoch: [0][4804/5004] Time 0.241 ( 0.242) Data 0.013 ( 0.027) Loss 4.6575e+00 (5.7808e+00) Acc@1 13.28 ( 4.70) Acc@5 31.25 ( 13.09)
+Epoch: [0][4805/5004] Time 0.245 ( 0.242) Data 0.016 ( 0.027) Loss 4.5546e+00 (5.7805e+00) Acc@1 13.67 ( 4.70) Acc@5 29.69 ( 13.10)
+Epoch: [0][4806/5004] Time 0.246 ( 0.242) Data 0.015 ( 0.027) Loss 4.7320e+00 (5.7803e+00) Acc@1 12.89 ( 4.70) Acc@5 32.03 ( 13.10)
+Epoch: [0][4807/5004] Time 0.243 ( 0.242) Data 0.014 ( 0.027) Loss 4.5531e+00 (5.7800e+00) Acc@1 9.38 ( 4.71) Acc@5 31.64 ( 13.10)
+Epoch: [0][4808/5004] Time 0.252 ( 0.242) Data 0.015 ( 0.027) Loss 4.4344e+00 (5.7798e+00) Acc@1 15.62 ( 4.71) Acc@5 32.42 ( 13.11)
+Epoch: [0][4809/5004] Time 0.253 ( 0.242) Data 0.011 ( 0.027) Loss 4.3269e+00 (5.7795e+00) Acc@1 17.97 ( 4.71) Acc@5 33.20 ( 13.11)
+Epoch: [0][4810/5004] Time 0.254 ( 0.242) Data 0.013 ( 0.027) Loss 4.5659e+00 (5.7792e+00) Acc@1 16.02 ( 4.71) Acc@5 32.42 ( 13.12)
+Epoch: [0][4811/5004] Time 0.253 ( 0.242) Data 0.012 ( 0.027) Loss 4.4466e+00 (5.7789e+00) Acc@1 17.58 ( 4.72) Acc@5 37.11 ( 13.12)
+Epoch: [0][4812/5004] Time 0.258 ( 0.242) Data 0.012 ( 0.027) Loss 4.7130e+00 (5.7787e+00) Acc@1 9.38 ( 4.72) Acc@5 28.91 ( 13.12)
+Epoch: [0][4813/5004] Time 0.260 ( 0.242) Data 0.011 ( 0.027) Loss 4.3553e+00 (5.7784e+00) Acc@1 15.62 ( 4.72) Acc@5 34.38 ( 13.13)
+Epoch: [0][4814/5004] Time 0.259 ( 0.242) Data 0.008 ( 0.027) Loss 4.4090e+00 (5.7781e+00) Acc@1 18.36 ( 4.72) Acc@5 35.16 ( 13.13)
+Epoch: [0][4815/5004] Time 0.256 ( 0.242) Data 0.010 ( 0.027) Loss 4.6086e+00 (5.7779e+00) Acc@1 14.06 ( 4.72) Acc@5 28.91 ( 13.14)
+Epoch: [0][4816/5004] Time 0.255 ( 0.242) Data 0.010 ( 0.027) Loss 4.8338e+00 (5.7777e+00) Acc@1 12.11 ( 4.72) Acc@5 28.52 ( 13.14)
+Epoch: [0][4817/5004] Time 0.244 ( 0.242) Data 0.015 ( 0.027) Loss 4.5407e+00 (5.7774e+00) Acc@1 10.55 ( 4.73) Acc@5 33.59 ( 13.14)
+Epoch: [0][4818/5004] Time 0.244 ( 0.242) Data 0.018 ( 0.027) Loss 4.5816e+00 (5.7772e+00) Acc@1 12.50 ( 4.73) Acc@5 30.86 ( 13.15)
+Epoch: [0][4819/5004] Time 0.243 ( 0.242) Data 0.017 ( 0.027) Loss 4.6388e+00 (5.7769e+00) Acc@1 14.06 ( 4.73) Acc@5 31.25 ( 13.15)
+Epoch: [0][4820/5004] Time 0.244 ( 0.242) Data 0.017 ( 0.027) Loss 4.5388e+00 (5.7767e+00) Acc@1 14.45 ( 4.73) Acc@5 34.77 ( 13.16)
+Epoch: [0][4821/5004] Time 0.247 ( 0.242) Data 0.016 ( 0.027) Loss 4.5858e+00 (5.7764e+00) Acc@1 12.50 ( 4.73) Acc@5 32.03 ( 13.16)
+Epoch: [0][4822/5004] Time 0.255 ( 0.242) Data 0.015 ( 0.027) Loss 4.6169e+00 (5.7762e+00) Acc@1 15.23 ( 4.74) Acc@5 33.98 ( 13.16)
+Epoch: [0][4823/5004] Time 0.249 ( 0.242) Data 0.012 ( 0.027) Loss 4.6365e+00 (5.7760e+00) Acc@1 11.33 ( 4.74) Acc@5 32.81 ( 13.17)
+Epoch: [0][4824/5004] Time 0.285 ( 0.243) Data 0.015 ( 0.027) Loss 4.4171e+00 (5.7757e+00) Acc@1 15.23 ( 4.74) Acc@5 33.59 ( 13.17)
+Epoch: [0][4825/5004] Time 0.270 ( 0.243) Data 0.014 ( 0.027) Loss 4.6697e+00 (5.7755e+00) Acc@1 13.28 ( 4.74) Acc@5 29.30 ( 13.18)
+Epoch: [0][4826/5004] Time 0.248 ( 0.243) Data 0.012 ( 0.027) Loss 4.6705e+00 (5.7752e+00) Acc@1 9.77 ( 4.74) Acc@5 31.25 ( 13.18)
+Epoch: [0][4827/5004] Time 0.249 ( 0.243) Data 0.014 ( 0.027) Loss 4.6242e+00 (5.7750e+00) Acc@1 13.67 ( 4.74) Acc@5 30.08 ( 13.18)
+Epoch: [0][4828/5004] Time 0.247 ( 0.243) Data 0.015 ( 0.027) Loss 4.7404e+00 (5.7748e+00) Acc@1 11.72 ( 4.75) Acc@5 27.34 ( 13.19)
+Epoch: [0][4829/5004] Time 0.262 ( 0.243) Data 0.014 ( 0.027) Loss 4.4498e+00 (5.7745e+00) Acc@1 12.50 ( 4.75) Acc@5 30.47 ( 13.19)
+Epoch: [0][4830/5004] Time 0.259 ( 0.243) Data 0.010 ( 0.027) Loss 4.5445e+00 (5.7742e+00) Acc@1 13.28 ( 4.75) Acc@5 32.42 ( 13.19)
+Epoch: [0][4831/5004] Time 0.242 ( 0.243) Data 0.013 ( 0.027) Loss 4.7426e+00 (5.7740e+00) Acc@1 12.11 ( 4.75) Acc@5 27.34 ( 13.20)
+Epoch: [0][4832/5004] Time 0.241 ( 0.243) Data 0.017 ( 0.027) Loss 4.4984e+00 (5.7738e+00) Acc@1 12.89 ( 4.75) Acc@5 33.20 ( 13.20)
+Epoch: [0][4833/5004] Time 0.249 ( 0.243) Data 0.015 ( 0.027) Loss 4.5299e+00 (5.7735e+00) Acc@1 11.72 ( 4.75) Acc@5 34.38 ( 13.21)
+Epoch: [0][4834/5004] Time 0.242 ( 0.243) Data 0.012 ( 0.027) Loss 4.5691e+00 (5.7733e+00) Acc@1 11.72 ( 4.75) Acc@5 32.42 ( 13.21)
+Epoch: [0][4835/5004] Time 0.249 ( 0.243) Data 0.016 ( 0.027) Loss 4.6108e+00 (5.7730e+00) Acc@1 13.28 ( 4.76) Acc@5 30.47 ( 13.21)
+Epoch: [0][4836/5004] Time 0.253 ( 0.243) Data 0.014 ( 0.027) Loss 4.5647e+00 (5.7728e+00) Acc@1 15.23 ( 4.76) Acc@5 33.59 ( 13.22)
+Epoch: [0][4837/5004] Time 0.250 ( 0.243) Data 0.013 ( 0.027) Loss 4.5791e+00 (5.7725e+00) Acc@1 14.84 ( 4.76) Acc@5 30.47 ( 13.22)
+Epoch: [0][4838/5004] Time 0.261 ( 0.243) Data 0.014 ( 0.027) Loss 4.6123e+00 (5.7723e+00) Acc@1 11.72 ( 4.76) Acc@5 30.86 ( 13.22)
+Epoch: [0][4839/5004] Time 0.261 ( 0.243) Data 0.008 ( 0.027) Loss 4.5320e+00 (5.7720e+00) Acc@1 12.11 ( 4.76) Acc@5 28.52 ( 13.23)
+Epoch: [0][4840/5004] Time 0.260 ( 0.243) Data 0.010 ( 0.027) Loss 4.7110e+00 (5.7718e+00) Acc@1 12.11 ( 4.76) Acc@5 30.08 ( 13.23)
+Epoch: [0][4841/5004] Time 0.257 ( 0.243) Data 0.009 ( 0.027) Loss 4.4755e+00 (5.7715e+00) Acc@1 14.06 ( 4.77) Acc@5 33.59 ( 13.23)
+Epoch: [0][4842/5004] Time 0.257 ( 0.243) Data 0.011 ( 0.027) Loss 4.6455e+00 (5.7713e+00) Acc@1 13.67 ( 4.77) Acc@5 30.08 ( 13.24)
+Epoch: [0][4843/5004] Time 0.256 ( 0.243) Data 0.010 ( 0.027) Loss 4.5337e+00 (5.7710e+00) Acc@1 15.62 ( 4.77) Acc@5 32.42 ( 13.24)
+Epoch: [0][4844/5004] Time 0.252 ( 0.243) Data 0.010 ( 0.027) Loss 4.5384e+00 (5.7708e+00) Acc@1 15.62 ( 4.77) Acc@5 31.64 ( 13.25)
+Epoch: [0][4845/5004] Time 0.259 ( 0.243) Data 0.012 ( 0.027) Loss 4.3732e+00 (5.7705e+00) Acc@1 14.84 ( 4.78) Acc@5 36.33 ( 13.25)
+Epoch: [0][4846/5004] Time 0.260 ( 0.243) Data 0.011 ( 0.027) Loss 4.6405e+00 (5.7703e+00) Acc@1 14.06 ( 4.78) Acc@5 29.30 ( 13.25)
+Epoch: [0][4847/5004] Time 0.258 ( 0.243) Data 0.010 ( 0.027) Loss 4.5166e+00 (5.7700e+00) Acc@1 14.45 ( 4.78) Acc@5 33.98 ( 13.26)
+Epoch: [0][4848/5004] Time 0.259 ( 0.243) Data 0.010 ( 0.027) Loss 4.6196e+00 (5.7698e+00) Acc@1 10.94 ( 4.78) Acc@5 30.08 ( 13.26)
+Epoch: [0][4849/5004] Time 0.258 ( 0.243) Data 0.009 ( 0.027) Loss 4.6249e+00 (5.7695e+00) Acc@1 9.77 ( 4.78) Acc@5 29.30 ( 13.27)
+Epoch: [0][4850/5004] Time 0.248 ( 0.243) Data 0.014 ( 0.027) Loss 4.4691e+00 (5.7693e+00) Acc@1 14.06 ( 4.78) Acc@5 35.16 ( 13.27)
+Epoch: [0][4851/5004] Time 0.253 ( 0.243) Data 0.011 ( 0.027) Loss 4.4654e+00 (5.7690e+00) Acc@1 12.11 ( 4.78) Acc@5 32.81 ( 13.27)
+Epoch: [0][4852/5004] Time 0.262 ( 0.243) Data 0.010 ( 0.027) Loss 4.6854e+00 (5.7688e+00) Acc@1 12.50 ( 4.79) Acc@5 28.12 ( 13.28)
+Epoch: [0][4853/5004] Time 0.266 ( 0.243) Data 0.010 ( 0.027) Loss 4.3882e+00 (5.7685e+00) Acc@1 13.67 ( 4.79) Acc@5 35.16 ( 13.28)
+Epoch: [0][4854/5004] Time 0.267 ( 0.243) Data 0.009 ( 0.027) Loss 4.5227e+00 (5.7682e+00) Acc@1 14.45 ( 4.79) Acc@5 35.16 ( 13.29)
+Epoch: [0][4855/5004] Time 0.273 ( 0.243) Data 0.008 ( 0.027) Loss 4.6148e+00 (5.7680e+00) Acc@1 13.28 ( 4.79) Acc@5 28.52 ( 13.29)
+Epoch: [0][4856/5004] Time 0.275 ( 0.243) Data 0.008 ( 0.027) Loss 4.4053e+00 (5.7677e+00) Acc@1 15.62 ( 4.79) Acc@5 32.42 ( 13.29)
+Epoch: [0][4857/5004] Time 0.275 ( 0.243) Data 0.007 ( 0.027) Loss 4.5672e+00 (5.7675e+00) Acc@1 14.45 ( 4.80) Acc@5 33.59 ( 13.30)
+Epoch: [0][4858/5004] Time 0.258 ( 0.243) Data 0.010 ( 0.027) Loss 4.4880e+00 (5.7672e+00) Acc@1 16.02 ( 4.80) Acc@5 33.20 ( 13.30)
+Epoch: [0][4859/5004] Time 0.248 ( 0.243) Data 0.011 ( 0.027) Loss 4.7655e+00 (5.7670e+00) Acc@1 11.72 ( 4.80) Acc@5 31.64 ( 13.30)
+Epoch: [0][4860/5004] Time 0.253 ( 0.243) Data 0.012 ( 0.027) Loss 4.5967e+00 (5.7668e+00) Acc@1 14.45 ( 4.80) Acc@5 32.81 ( 13.31)
+Epoch: [0][4861/5004] Time 0.256 ( 0.243) Data 0.012 ( 0.027) Loss 4.5829e+00 (5.7665e+00) Acc@1 12.11 ( 4.80) Acc@5 32.03 ( 13.31)
+Epoch: [0][4862/5004] Time 0.260 ( 0.243) Data 0.011 ( 0.027) Loss 4.9673e+00 (5.7664e+00) Acc@1 11.72 ( 4.80) Acc@5 23.44 ( 13.31)
+Epoch: 
[0][4863/5004] Time 0.252 ( 0.243) Data 0.010 ( 0.027) Loss 4.6727e+00 (5.7661e+00) Acc@1 15.62 ( 4.81) Acc@5 31.64 ( 13.32) +Epoch: [0][4864/5004] Time 0.246 ( 0.243) Data 0.013 ( 0.027) Loss 4.6186e+00 (5.7659e+00) Acc@1 12.50 ( 4.81) Acc@5 28.91 ( 13.32) +Epoch: [0][4865/5004] Time 0.247 ( 0.243) Data 0.020 ( 0.027) Loss 4.4727e+00 (5.7656e+00) Acc@1 14.06 ( 4.81) Acc@5 37.11 ( 13.33) +Epoch: [0][4866/5004] Time 0.244 ( 0.243) Data 0.018 ( 0.027) Loss 4.5440e+00 (5.7654e+00) Acc@1 15.23 ( 4.81) Acc@5 35.16 ( 13.33) +Epoch: [0][4867/5004] Time 0.249 ( 0.243) Data 0.018 ( 0.027) Loss 4.5577e+00 (5.7651e+00) Acc@1 14.06 ( 4.81) Acc@5 35.55 ( 13.34) +Epoch: [0][4868/5004] Time 0.238 ( 0.243) Data 0.017 ( 0.027) Loss 4.8827e+00 (5.7649e+00) Acc@1 10.16 ( 4.82) Acc@5 26.17 ( 13.34) +Epoch: [0][4869/5004] Time 0.242 ( 0.243) Data 0.020 ( 0.027) Loss 4.6104e+00 (5.7647e+00) Acc@1 12.11 ( 4.82) Acc@5 30.47 ( 13.34) +Epoch: [0][4870/5004] Time 0.242 ( 0.243) Data 0.019 ( 0.027) Loss 4.6816e+00 (5.7645e+00) Acc@1 11.72 ( 4.82) Acc@5 30.08 ( 13.35) +Epoch: [0][4871/5004] Time 0.243 ( 0.243) Data 0.018 ( 0.027) Loss 4.5894e+00 (5.7642e+00) Acc@1 13.67 ( 4.82) Acc@5 33.20 ( 13.35) +Epoch: [0][4872/5004] Time 0.249 ( 0.243) Data 0.019 ( 0.027) Loss 4.7424e+00 (5.7640e+00) Acc@1 12.89 ( 4.82) Acc@5 26.95 ( 13.35) +Epoch: [0][4873/5004] Time 0.239 ( 0.243) Data 0.016 ( 0.027) Loss 4.6217e+00 (5.7638e+00) Acc@1 13.67 ( 4.82) Acc@5 30.86 ( 13.36) +Epoch: [0][4874/5004] Time 0.249 ( 0.243) Data 0.017 ( 0.027) Loss 4.4479e+00 (5.7635e+00) Acc@1 17.19 ( 4.83) Acc@5 38.28 ( 13.36) +Epoch: [0][4875/5004] Time 0.261 ( 0.243) Data 0.013 ( 0.027) Loss 4.6359e+00 (5.7633e+00) Acc@1 16.02 ( 4.83) Acc@5 33.20 ( 13.37) +Epoch: [0][4876/5004] Time 0.268 ( 0.243) Data 0.008 ( 0.027) Loss 4.5989e+00 (5.7631e+00) Acc@1 11.72 ( 4.83) Acc@5 30.47 ( 13.37) +Epoch: [0][4877/5004] Time 0.285 ( 0.243) Data 0.007 ( 0.027) Loss 4.7555e+00 (5.7629e+00) Acc@1 10.16 ( 4.83) Acc@5 30.86 ( 13.37) +Epoch: 
[0][4878/5004] Time 0.287 ( 0.243) Data 0.007 ( 0.027) Loss 4.6454e+00 (5.7626e+00) Acc@1 13.28 ( 4.83) Acc@5 31.64 ( 13.38) +Epoch: [0][4879/5004] Time 0.284 ( 0.243) Data 0.007 ( 0.027) Loss 4.6968e+00 (5.7624e+00) Acc@1 8.20 ( 4.83) Acc@5 30.08 ( 13.38) +Epoch: [0][4880/5004] Time 0.288 ( 0.243) Data 0.007 ( 0.027) Loss 4.5547e+00 (5.7622e+00) Acc@1 16.80 ( 4.84) Acc@5 34.38 ( 13.38) +Epoch: [0][4881/5004] Time 0.286 ( 0.243) Data 0.007 ( 0.027) Loss 4.5432e+00 (5.7619e+00) Acc@1 12.50 ( 4.84) Acc@5 32.03 ( 13.39) +Epoch: [0][4882/5004] Time 0.280 ( 0.243) Data 0.008 ( 0.027) Loss 4.4506e+00 (5.7616e+00) Acc@1 12.11 ( 4.84) Acc@5 29.69 ( 13.39) +Epoch: [0][4883/5004] Time 0.293 ( 0.243) Data 0.008 ( 0.027) Loss 4.6579e+00 (5.7614e+00) Acc@1 14.06 ( 4.84) Acc@5 33.20 ( 13.39) +Epoch: [0][4884/5004] Time 0.279 ( 0.243) Data 0.007 ( 0.027) Loss 4.3975e+00 (5.7611e+00) Acc@1 14.06 ( 4.84) Acc@5 34.77 ( 13.40) +Epoch: [0][4885/5004] Time 0.280 ( 0.243) Data 0.008 ( 0.027) Loss 4.6904e+00 (5.7609e+00) Acc@1 15.23 ( 4.85) Acc@5 29.69 ( 13.40) +Epoch: [0][4886/5004] Time 0.284 ( 0.243) Data 0.007 ( 0.027) Loss 4.7022e+00 (5.7607e+00) Acc@1 11.33 ( 4.85) Acc@5 32.03 ( 13.41) +Epoch: [0][4887/5004] Time 0.286 ( 0.243) Data 0.007 ( 0.027) Loss 4.4446e+00 (5.7604e+00) Acc@1 12.11 ( 4.85) Acc@5 31.64 ( 13.41) +Epoch: [0][4888/5004] Time 0.287 ( 0.243) Data 0.007 ( 0.027) Loss 4.6475e+00 (5.7602e+00) Acc@1 11.72 ( 4.85) Acc@5 30.08 ( 13.41) +Epoch: [0][4889/5004] Time 0.274 ( 0.243) Data 0.008 ( 0.027) Loss 4.4356e+00 (5.7599e+00) Acc@1 13.67 ( 4.85) Acc@5 32.42 ( 13.42) +Epoch: [0][4890/5004] Time 0.245 ( 0.243) Data 0.012 ( 0.027) Loss 4.6290e+00 (5.7597e+00) Acc@1 14.84 ( 4.85) Acc@5 31.64 ( 13.42) +Epoch: [0][4891/5004] Time 0.241 ( 0.243) Data 0.015 ( 0.027) Loss 4.5330e+00 (5.7595e+00) Acc@1 13.28 ( 4.85) Acc@5 29.30 ( 13.42) +Epoch: [0][4892/5004] Time 0.244 ( 0.243) Data 0.016 ( 0.027) Loss 4.4766e+00 (5.7592e+00) Acc@1 14.06 ( 4.86) Acc@5 36.72 ( 13.43) +Epoch: 
[0][4893/5004] Time 0.243 ( 0.243) Data 0.016 ( 0.027) Loss 4.5269e+00 (5.7589e+00) Acc@1 14.06 ( 4.86) Acc@5 35.94 ( 13.43) +Epoch: [0][4894/5004] Time 0.252 ( 0.243) Data 0.015 ( 0.027) Loss 4.4815e+00 (5.7587e+00) Acc@1 16.02 ( 4.86) Acc@5 30.47 ( 13.44) +Epoch: [0][4895/5004] Time 0.252 ( 0.243) Data 0.011 ( 0.027) Loss 4.5655e+00 (5.7584e+00) Acc@1 12.50 ( 4.86) Acc@5 28.91 ( 13.44) +Epoch: [0][4896/5004] Time 0.253 ( 0.243) Data 0.013 ( 0.027) Loss 4.4843e+00 (5.7582e+00) Acc@1 12.89 ( 4.86) Acc@5 30.86 ( 13.44) +Epoch: [0][4897/5004] Time 0.251 ( 0.243) Data 0.013 ( 0.027) Loss 4.8117e+00 (5.7580e+00) Acc@1 10.55 ( 4.87) Acc@5 30.86 ( 13.45) +Epoch: [0][4898/5004] Time 0.249 ( 0.243) Data 0.013 ( 0.027) Loss 4.5891e+00 (5.7577e+00) Acc@1 16.41 ( 4.87) Acc@5 30.47 ( 13.45) +Epoch: [0][4899/5004] Time 0.256 ( 0.243) Data 0.015 ( 0.027) Loss 4.3867e+00 (5.7575e+00) Acc@1 12.89 ( 4.87) Acc@5 37.89 ( 13.46) +Epoch: [0][4900/5004] Time 0.259 ( 0.243) Data 0.008 ( 0.027) Loss 4.6257e+00 (5.7572e+00) Acc@1 12.50 ( 4.87) Acc@5 32.81 ( 13.46) +Epoch: [0][4901/5004] Time 0.264 ( 0.243) Data 0.009 ( 0.027) Loss 4.5897e+00 (5.7570e+00) Acc@1 13.67 ( 4.87) Acc@5 32.81 ( 13.46) +Epoch: [0][4902/5004] Time 0.240 ( 0.243) Data 0.013 ( 0.027) Loss 4.6242e+00 (5.7568e+00) Acc@1 13.28 ( 4.87) Acc@5 32.42 ( 13.47) +Epoch: [0][4903/5004] Time 0.244 ( 0.243) Data 0.016 ( 0.027) Loss 4.5667e+00 (5.7565e+00) Acc@1 16.02 ( 4.88) Acc@5 35.94 ( 13.47) +Epoch: [0][4904/5004] Time 0.243 ( 0.243) Data 0.016 ( 0.027) Loss 4.6983e+00 (5.7563e+00) Acc@1 12.11 ( 4.88) Acc@5 26.95 ( 13.48) +Epoch: [0][4905/5004] Time 0.249 ( 0.243) Data 0.014 ( 0.027) Loss 4.4406e+00 (5.7560e+00) Acc@1 14.84 ( 4.88) Acc@5 35.94 ( 13.48) +Epoch: [0][4906/5004] Time 0.254 ( 0.243) Data 0.013 ( 0.027) Loss 4.5575e+00 (5.7558e+00) Acc@1 13.28 ( 4.88) Acc@5 36.72 ( 13.48) +Epoch: [0][4907/5004] Time 0.253 ( 0.243) Data 0.013 ( 0.027) Loss 4.6343e+00 (5.7556e+00) Acc@1 12.11 ( 4.88) Acc@5 31.64 ( 13.49) +Epoch: 
[0][4908/5004] Time 0.253 ( 0.243) Data 0.013 ( 0.027) Loss 4.5578e+00 (5.7553e+00) Acc@1 11.33 ( 4.88) Acc@5 30.86 ( 13.49) +Epoch: [0][4909/5004] Time 0.248 ( 0.243) Data 0.009 ( 0.027) Loss 4.6144e+00 (5.7551e+00) Acc@1 12.50 ( 4.89) Acc@5 30.47 ( 13.50) +Epoch: [0][4910/5004] Time 0.255 ( 0.243) Data 0.013 ( 0.027) Loss 4.6127e+00 (5.7549e+00) Acc@1 14.06 ( 4.89) Acc@5 29.30 ( 13.50) +Epoch: [0][4911/5004] Time 0.242 ( 0.243) Data 0.015 ( 0.027) Loss 4.7344e+00 (5.7546e+00) Acc@1 10.55 ( 4.89) Acc@5 30.08 ( 13.50) +Epoch: [0][4912/5004] Time 0.242 ( 0.243) Data 0.016 ( 0.027) Loss 4.5696e+00 (5.7544e+00) Acc@1 13.28 ( 4.89) Acc@5 29.69 ( 13.50) +Epoch: [0][4913/5004] Time 0.251 ( 0.243) Data 0.018 ( 0.027) Loss 4.5533e+00 (5.7542e+00) Acc@1 12.50 ( 4.89) Acc@5 32.03 ( 13.51) +Epoch: [0][4914/5004] Time 0.249 ( 0.243) Data 0.010 ( 0.027) Loss 4.5035e+00 (5.7539e+00) Acc@1 13.67 ( 4.89) Acc@5 37.89 ( 13.51) +Epoch: [0][4915/5004] Time 0.246 ( 0.243) Data 0.015 ( 0.027) Loss 4.6079e+00 (5.7537e+00) Acc@1 10.16 ( 4.90) Acc@5 30.47 ( 13.52) +Epoch: [0][4916/5004] Time 0.238 ( 0.243) Data 0.015 ( 0.027) Loss 4.6263e+00 (5.7534e+00) Acc@1 10.55 ( 4.90) Acc@5 31.64 ( 13.52) +Epoch: [0][4917/5004] Time 0.245 ( 0.243) Data 0.018 ( 0.027) Loss 4.4661e+00 (5.7532e+00) Acc@1 8.98 ( 4.90) Acc@5 32.42 ( 13.52) +Epoch: [0][4918/5004] Time 0.240 ( 0.243) Data 0.017 ( 0.027) Loss 4.6097e+00 (5.7530e+00) Acc@1 11.72 ( 4.90) Acc@5 27.34 ( 13.53) +Epoch: [0][4919/5004] Time 0.266 ( 0.243) Data 0.020 ( 0.027) Loss 4.5960e+00 (5.7527e+00) Acc@1 13.67 ( 4.90) Acc@5 33.98 ( 13.53) +Epoch: [0][4920/5004] Time 0.260 ( 0.243) Data 0.013 ( 0.027) Loss 4.5500e+00 (5.7525e+00) Acc@1 12.11 ( 4.90) Acc@5 33.20 ( 13.54) +Epoch: [0][4921/5004] Time 0.261 ( 0.243) Data 0.013 ( 0.027) Loss 4.5563e+00 (5.7522e+00) Acc@1 11.72 ( 4.90) Acc@5 32.03 ( 13.54) +Epoch: [0][4922/5004] Time 0.264 ( 0.243) Data 0.014 ( 0.027) Loss 4.5823e+00 (5.7520e+00) Acc@1 10.94 ( 4.90) Acc@5 35.16 ( 13.54) +Epoch: 
[0][4923/5004] Time 0.242 ( 0.243) Data 0.012 ( 0.027) Loss 4.5370e+00 (5.7517e+00) Acc@1 14.45 ( 4.91) Acc@5 33.59 ( 13.55) +Epoch: [0][4924/5004] Time 0.255 ( 0.243) Data 0.017 ( 0.027) Loss 4.4510e+00 (5.7515e+00) Acc@1 13.28 ( 4.91) Acc@5 34.38 ( 13.55) +Epoch: [0][4925/5004] Time 0.252 ( 0.243) Data 0.014 ( 0.027) Loss 4.7238e+00 (5.7513e+00) Acc@1 13.28 ( 4.91) Acc@5 28.52 ( 13.56) +Epoch: [0][4926/5004] Time 0.243 ( 0.243) Data 0.013 ( 0.027) Loss 4.4939e+00 (5.7510e+00) Acc@1 14.45 ( 4.91) Acc@5 34.38 ( 13.56) +Epoch: [0][4927/5004] Time 0.244 ( 0.243) Data 0.015 ( 0.027) Loss 4.5541e+00 (5.7508e+00) Acc@1 13.28 ( 4.91) Acc@5 30.08 ( 13.56) +Epoch: [0][4928/5004] Time 0.251 ( 0.243) Data 0.016 ( 0.027) Loss 4.3895e+00 (5.7505e+00) Acc@1 16.80 ( 4.92) Acc@5 34.38 ( 13.57) +Epoch: [0][4929/5004] Time 0.251 ( 0.243) Data 0.013 ( 0.027) Loss 4.3665e+00 (5.7502e+00) Acc@1 14.45 ( 4.92) Acc@5 34.77 ( 13.57) +Epoch: [0][4930/5004] Time 0.231 ( 0.243) Data 0.013 ( 0.027) Loss 4.3743e+00 (5.7499e+00) Acc@1 17.19 ( 4.92) Acc@5 35.55 ( 13.58) +Epoch: [0][4931/5004] Time 0.245 ( 0.243) Data 0.028 ( 0.027) Loss 4.4109e+00 (5.7497e+00) Acc@1 15.62 ( 4.92) Acc@5 38.28 ( 13.58) +Epoch: [0][4932/5004] Time 0.246 ( 0.243) Data 0.028 ( 0.027) Loss 4.6802e+00 (5.7494e+00) Acc@1 13.28 ( 4.92) Acc@5 32.03 ( 13.58) +Epoch: [0][4933/5004] Time 0.253 ( 0.243) Data 0.028 ( 0.027) Loss 4.6507e+00 (5.7492e+00) Acc@1 13.67 ( 4.93) Acc@5 29.30 ( 13.59) +Epoch: [0][4934/5004] Time 0.244 ( 0.243) Data 0.025 ( 0.027) Loss 4.8022e+00 (5.7490e+00) Acc@1 10.94 ( 4.93) Acc@5 25.39 ( 13.59) +Epoch: [0][4935/5004] Time 0.243 ( 0.243) Data 0.027 ( 0.027) Loss 4.3392e+00 (5.7487e+00) Acc@1 15.23 ( 4.93) Acc@5 37.50 ( 13.59) +Epoch: [0][4936/5004] Time 0.242 ( 0.243) Data 0.030 ( 0.027) Loss 4.6373e+00 (5.7485e+00) Acc@1 14.06 ( 4.93) Acc@5 35.16 ( 13.60) +Epoch: [0][4937/5004] Time 0.246 ( 0.243) Data 0.029 ( 0.027) Loss 4.6260e+00 (5.7483e+00) Acc@1 12.50 ( 4.93) Acc@5 32.03 ( 13.60) +Epoch: 
[0][4938/5004] Time 0.243 ( 0.243) Data 0.028 ( 0.027) Loss 4.3373e+00 (5.7480e+00) Acc@1 14.45 ( 4.93) Acc@5 32.81 ( 13.61) +Epoch: [0][4939/5004] Time 0.242 ( 0.243) Data 0.028 ( 0.027) Loss 4.6592e+00 (5.7478e+00) Acc@1 14.84 ( 4.94) Acc@5 30.47 ( 13.61) +Epoch: [0][4940/5004] Time 0.248 ( 0.243) Data 0.029 ( 0.027) Loss 4.4453e+00 (5.7475e+00) Acc@1 14.84 ( 4.94) Acc@5 33.20 ( 13.61) +Epoch: [0][4941/5004] Time 0.243 ( 0.243) Data 0.028 ( 0.027) Loss 4.4258e+00 (5.7473e+00) Acc@1 17.58 ( 4.94) Acc@5 37.89 ( 13.62) +Epoch: [0][4942/5004] Time 0.251 ( 0.243) Data 0.029 ( 0.027) Loss 4.4990e+00 (5.7470e+00) Acc@1 16.02 ( 4.94) Acc@5 35.94 ( 13.62) +Epoch: [0][4943/5004] Time 0.249 ( 0.243) Data 0.024 ( 0.027) Loss 4.4506e+00 (5.7467e+00) Acc@1 15.23 ( 4.95) Acc@5 32.03 ( 13.63) +Epoch: [0][4944/5004] Time 0.257 ( 0.243) Data 0.027 ( 0.027) Loss 4.5634e+00 (5.7465e+00) Acc@1 12.50 ( 4.95) Acc@5 34.77 ( 13.63) +Epoch: [0][4945/5004] Time 0.249 ( 0.243) Data 0.025 ( 0.027) Loss 4.3528e+00 (5.7462e+00) Acc@1 14.45 ( 4.95) Acc@5 39.06 ( 13.64) +Epoch: [0][4946/5004] Time 0.244 ( 0.243) Data 0.028 ( 0.027) Loss 4.5430e+00 (5.7460e+00) Acc@1 16.02 ( 4.95) Acc@5 35.16 ( 13.64) +Epoch: [0][4947/5004] Time 0.239 ( 0.243) Data 0.028 ( 0.027) Loss 4.6044e+00 (5.7457e+00) Acc@1 11.72 ( 4.95) Acc@5 34.77 ( 13.65) +Epoch: [0][4948/5004] Time 0.244 ( 0.243) Data 0.030 ( 0.027) Loss 4.5335e+00 (5.7455e+00) Acc@1 15.62 ( 4.95) Acc@5 35.55 ( 13.65) +Epoch: [0][4949/5004] Time 0.250 ( 0.243) Data 0.029 ( 0.027) Loss 4.6772e+00 (5.7453e+00) Acc@1 11.72 ( 4.96) Acc@5 26.17 ( 13.65) +Epoch: [0][4950/5004] Time 0.241 ( 0.243) Data 0.024 ( 0.027) Loss 4.7081e+00 (5.7451e+00) Acc@1 11.33 ( 4.96) Acc@5 30.86 ( 13.66) +Epoch: [0][4951/5004] Time 0.247 ( 0.243) Data 0.027 ( 0.027) Loss 4.5440e+00 (5.7448e+00) Acc@1 15.23 ( 4.96) Acc@5 32.42 ( 13.66) +Epoch: [0][4952/5004] Time 0.269 ( 0.243) Data 0.027 ( 0.027) Loss 4.5197e+00 (5.7446e+00) Acc@1 13.28 ( 4.96) Acc@5 33.59 ( 13.66) +Epoch: 
[0][4953/5004] Time 0.276 ( 0.243) Data 0.008 ( 0.027) Loss 4.6776e+00 (5.7444e+00) Acc@1 13.67 ( 4.96) Acc@5 31.25 ( 13.67) +Epoch: [0][4954/5004] Time 0.273 ( 0.243) Data 0.009 ( 0.027) Loss 4.4708e+00 (5.7441e+00) Acc@1 13.28 ( 4.96) Acc@5 34.77 ( 13.67) +Epoch: [0][4955/5004] Time 0.295 ( 0.243) Data 0.008 ( 0.027) Loss 4.4678e+00 (5.7439e+00) Acc@1 13.67 ( 4.97) Acc@5 33.59 ( 13.68) +Epoch: [0][4956/5004] Time 0.298 ( 0.243) Data 0.009 ( 0.027) Loss 4.3981e+00 (5.7436e+00) Acc@1 15.23 ( 4.97) Acc@5 35.55 ( 13.68) +Epoch: [0][4957/5004] Time 0.286 ( 0.243) Data 0.009 ( 0.027) Loss 4.5114e+00 (5.7433e+00) Acc@1 14.84 ( 4.97) Acc@5 32.81 ( 13.68) +Epoch: [0][4958/5004] Time 0.283 ( 0.243) Data 0.009 ( 0.027) Loss 4.4686e+00 (5.7431e+00) Acc@1 14.84 ( 4.97) Acc@5 32.81 ( 13.69) +Epoch: [0][4959/5004] Time 0.272 ( 0.243) Data 0.008 ( 0.027) Loss 4.4492e+00 (5.7428e+00) Acc@1 12.11 ( 4.97) Acc@5 28.52 ( 13.69) +Epoch: [0][4960/5004] Time 0.267 ( 0.243) Data 0.009 ( 0.027) Loss 4.5289e+00 (5.7426e+00) Acc@1 13.28 ( 4.98) Acc@5 31.25 ( 13.69) +Epoch: [0][4961/5004] Time 0.264 ( 0.243) Data 0.010 ( 0.027) Loss 4.5884e+00 (5.7423e+00) Acc@1 15.23 ( 4.98) Acc@5 32.42 ( 13.70) +Epoch: [0][4962/5004] Time 0.247 ( 0.243) Data 0.014 ( 0.027) Loss 4.5294e+00 (5.7421e+00) Acc@1 14.06 ( 4.98) Acc@5 31.64 ( 13.70) +Epoch: [0][4963/5004] Time 0.254 ( 0.243) Data 0.017 ( 0.027) Loss 4.4891e+00 (5.7418e+00) Acc@1 15.62 ( 4.98) Acc@5 32.03 ( 13.71) +Epoch: [0][4964/5004] Time 0.255 ( 0.243) Data 0.013 ( 0.027) Loss 4.4317e+00 (5.7416e+00) Acc@1 18.75 ( 4.98) Acc@5 37.50 ( 13.71) +Epoch: [0][4965/5004] Time 0.252 ( 0.243) Data 0.013 ( 0.027) Loss 4.5477e+00 (5.7413e+00) Acc@1 17.19 ( 4.99) Acc@5 33.59 ( 13.71) +Epoch: [0][4966/5004] Time 0.256 ( 0.243) Data 0.015 ( 0.027) Loss 4.4628e+00 (5.7411e+00) Acc@1 15.23 ( 4.99) Acc@5 33.98 ( 13.72) +Epoch: [0][4967/5004] Time 0.246 ( 0.243) Data 0.012 ( 0.027) Loss 4.5768e+00 (5.7409e+00) Acc@1 15.62 ( 4.99) Acc@5 34.38 ( 13.72) +Epoch: 
[0][4968/5004] Time 0.254 ( 0.243) Data 0.018 ( 0.027) Loss 4.5854e+00 (5.7406e+00) Acc@1 12.11 ( 4.99) Acc@5 28.91 ( 13.73) +Epoch: [0][4969/5004] Time 0.247 ( 0.243) Data 0.014 ( 0.027) Loss 4.6123e+00 (5.7404e+00) Acc@1 15.62 ( 4.99) Acc@5 32.42 ( 13.73) +Epoch: [0][4970/5004] Time 0.254 ( 0.243) Data 0.021 ( 0.027) Loss 4.5571e+00 (5.7402e+00) Acc@1 14.06 ( 5.00) Acc@5 32.81 ( 13.73) +Epoch: [0][4971/5004] Time 0.254 ( 0.243) Data 0.014 ( 0.027) Loss 4.3591e+00 (5.7399e+00) Acc@1 15.62 ( 5.00) Acc@5 33.98 ( 13.74) +Epoch: [0][4972/5004] Time 0.264 ( 0.243) Data 0.014 ( 0.027) Loss 4.5048e+00 (5.7396e+00) Acc@1 11.72 ( 5.00) Acc@5 31.64 ( 13.74) +Epoch: [0][4973/5004] Time 0.253 ( 0.243) Data 0.012 ( 0.027) Loss 4.5768e+00 (5.7394e+00) Acc@1 13.28 ( 5.00) Acc@5 32.81 ( 13.74) +Epoch: [0][4974/5004] Time 0.254 ( 0.243) Data 0.012 ( 0.027) Loss 4.3737e+00 (5.7391e+00) Acc@1 14.84 ( 5.00) Acc@5 35.55 ( 13.75) +Epoch: [0][4975/5004] Time 0.257 ( 0.243) Data 0.015 ( 0.027) Loss 4.3508e+00 (5.7388e+00) Acc@1 15.62 ( 5.01) Acc@5 36.33 ( 13.75) +Epoch: [0][4976/5004] Time 0.247 ( 0.243) Data 0.015 ( 0.026) Loss 4.5052e+00 (5.7386e+00) Acc@1 12.11 ( 5.01) Acc@5 32.81 ( 13.76) +Epoch: [0][4977/5004] Time 0.252 ( 0.243) Data 0.015 ( 0.026) Loss 4.5838e+00 (5.7384e+00) Acc@1 12.50 ( 5.01) Acc@5 30.47 ( 13.76) +Epoch: [0][4978/5004] Time 0.236 ( 0.243) Data 0.018 ( 0.026) Loss 4.4569e+00 (5.7381e+00) Acc@1 14.06 ( 5.01) Acc@5 38.28 ( 13.77) +Epoch: [0][4979/5004] Time 0.241 ( 0.243) Data 0.023 ( 0.026) Loss 4.5200e+00 (5.7379e+00) Acc@1 16.41 ( 5.01) Acc@5 33.20 ( 13.77) +Epoch: [0][4980/5004] Time 0.252 ( 0.243) Data 0.024 ( 0.026) Loss 4.3250e+00 (5.7376e+00) Acc@1 15.23 ( 5.01) Acc@5 35.94 ( 13.77) +Epoch: [0][4981/5004] Time 0.250 ( 0.243) Data 0.018 ( 0.026) Loss 4.5707e+00 (5.7373e+00) Acc@1 14.84 ( 5.02) Acc@5 34.77 ( 13.78) +Epoch: [0][4982/5004] Time 0.261 ( 0.243) Data 0.022 ( 0.026) Loss 4.4293e+00 (5.7371e+00) Acc@1 16.41 ( 5.02) Acc@5 35.94 ( 13.78) +Epoch: 
[0][4983/5004] Time 0.252 ( 0.243) Data 0.017 ( 0.026) Loss 4.7000e+00 (5.7369e+00) Acc@1 12.50 ( 5.02) Acc@5 32.81 ( 13.79) +Epoch: [0][4984/5004] Time 0.265 ( 0.243) Data 0.019 ( 0.026) Loss 4.5700e+00 (5.7366e+00) Acc@1 12.50 ( 5.02) Acc@5 30.86 ( 13.79) +Epoch: [0][4985/5004] Time 0.252 ( 0.243) Data 0.015 ( 0.026) Loss 4.7652e+00 (5.7364e+00) Acc@1 10.16 ( 5.02) Acc@5 29.30 ( 13.79) +Epoch: [0][4986/5004] Time 0.261 ( 0.243) Data 0.017 ( 0.026) Loss 4.7401e+00 (5.7362e+00) Acc@1 11.72 ( 5.02) Acc@5 32.42 ( 13.80) +Epoch: [0][4987/5004] Time 0.252 ( 0.243) Data 0.016 ( 0.026) Loss 4.4364e+00 (5.7360e+00) Acc@1 16.80 ( 5.03) Acc@5 32.42 ( 13.80) +Epoch: [0][4988/5004] Time 0.251 ( 0.243) Data 0.017 ( 0.026) Loss 4.8072e+00 (5.7358e+00) Acc@1 8.98 ( 5.03) Acc@5 26.56 ( 13.80) +Epoch: [0][4989/5004] Time 0.257 ( 0.243) Data 0.021 ( 0.026) Loss 4.7605e+00 (5.7356e+00) Acc@1 13.28 ( 5.03) Acc@5 28.91 ( 13.81) +Epoch: [0][4990/5004] Time 0.253 ( 0.243) Data 0.019 ( 0.026) Loss 4.5413e+00 (5.7354e+00) Acc@1 12.50 ( 5.03) Acc@5 32.81 ( 13.81) +Epoch: [0][4991/5004] Time 0.261 ( 0.243) Data 0.020 ( 0.026) Loss 4.6063e+00 (5.7351e+00) Acc@1 13.28 ( 5.03) Acc@5 33.98 ( 13.81) +Epoch: [0][4992/5004] Time 0.252 ( 0.243) Data 0.017 ( 0.026) Loss 4.4528e+00 (5.7349e+00) Acc@1 16.80 ( 5.03) Acc@5 33.20 ( 13.82) +Epoch: [0][4993/5004] Time 0.258 ( 0.243) Data 0.018 ( 0.026) Loss 4.4926e+00 (5.7346e+00) Acc@1 18.36 ( 5.04) Acc@5 32.03 ( 13.82) +Epoch: [0][4994/5004] Time 0.266 ( 0.243) Data 0.018 ( 0.026) Loss 4.7293e+00 (5.7344e+00) Acc@1 14.45 ( 5.04) Acc@5 31.64 ( 13.82) +Epoch: [0][4995/5004] Time 0.252 ( 0.243) Data 0.014 ( 0.026) Loss 4.5468e+00 (5.7342e+00) Acc@1 17.58 ( 5.04) Acc@5 35.55 ( 13.83) +Epoch: [0][4996/5004] Time 0.258 ( 0.243) Data 0.019 ( 0.026) Loss 4.5155e+00 (5.7339e+00) Acc@1 16.02 ( 5.04) Acc@5 33.98 ( 13.83) +Epoch: [0][4997/5004] Time 0.248 ( 0.243) Data 0.018 ( 0.026) Loss 4.5619e+00 (5.7337e+00) Acc@1 12.11 ( 5.05) Acc@5 32.42 ( 13.84) +Epoch: 
[0][4998/5004] Time 0.265 ( 0.243) Data 0.020 ( 0.026) Loss 4.6329e+00 (5.7335e+00) Acc@1 11.33 ( 5.05) Acc@5 26.95 ( 13.84) +Epoch: [0][4999/5004] Time 0.264 ( 0.243) Data 0.013 ( 0.026) Loss 4.5627e+00 (5.7333e+00) Acc@1 12.89 ( 5.05) Acc@5 30.08 ( 13.84) +Epoch: [0][5000/5004] Time 0.271 ( 0.243) Data 0.010 ( 0.026) Loss 4.5127e+00 (5.7330e+00) Acc@1 16.02 ( 5.05) Acc@5 33.20 ( 13.85) +Epoch: [0][5001/5004] Time 0.261 ( 0.243) Data 0.009 ( 0.026) Loss 4.3150e+00 (5.7327e+00) Acc@1 15.23 ( 5.05) Acc@5 35.94 ( 13.85) +Epoch: [0][5002/5004] Time 0.265 ( 0.243) Data 0.012 ( 0.026) Loss 4.6050e+00 (5.7325e+00) Acc@1 14.06 ( 5.05) Acc@5 32.42 ( 13.85) +Epoch: [0][5003/5004] Time 0.269 ( 0.243) Data 0.013 ( 0.026) Loss 4.7062e+00 (5.7323e+00) Acc@1 12.50 ( 5.06) Acc@5 29.30 ( 13.86) +[npu id: 0 ] batch_size: 256 Time: 0.243 * FPS@all 1053.555 diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/output/0/train_ResNet101_ID1595_for_PyTorch_bs256_1p_perf_loss.txt" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/output/0/train_ResNet101_ID1595_for_PyTorch_bs256_1p_perf_loss.txt" new file mode 100644 index 0000000000000000000000000000000000000000..1e2c531173c76c10bdc83c2e6fc9150398657178 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/output/0/train_ResNet101_ID1595_for_PyTorch_bs256_1p_perf_loss.txt" @@ -0,0 +1,10008 @@ +7.1284e+00 +8.6593e+00 +9.5272e+00 +8.3551e+00 +8.6900e+00 +8.8442e+00 +7.5124e+00 +8.7513e+00 +8.4683e+00 +8.1676e+00 +8.2323e+00 +7.7600e+00 +7.6141e+00 +8.0500e+00 +7.8809e+00 +7.6010e+00 +7.5687e+00 +7.7687e+00 +9.0109e+00 +7.4484e+00 +7.3605e+00 +7.1803e+00 +7.7433e+00 +8.3131e+00 +7.5259e+00 +7.1664e+00 +7.1321e+00 +7.1734e+00 +7.0932e+00 +7.2980e+00 +7.0408e+00 +7.0054e+00 +6.9824e+00 +7.0612e+00 +6.9541e+00 +6.9516e+00 +6.9389e+00 +6.9487e+00 +6.9494e+00 +6.9794e+00 +6.9615e+00 +6.9375e+00 
+7.0985e+00
+6.9116e+00
+6.9217e+00
+6.9310e+00
+[... further per-step loss values of the 10008-line file omitted, one value per line; over this range the loss drifts down from about 6.9 to the 6.4-6.6 band ...]
+6.5691e+00
+6.4427e+00
+6.5158e+00 +6.5275e+00 +6.4744e+00 +6.5385e+00 +6.4491e+00 +6.4146e+00 +6.5301e+00 +6.5043e+00 +6.4290e+00 +6.4379e+00 +6.5327e+00 +6.4306e+00 +6.5953e+00 +6.5162e+00 +6.4772e+00 +6.5291e+00 +6.4888e+00 +6.3933e+00 +6.4377e+00 +6.5765e+00 +6.4773e+00 +6.4497e+00 +6.4928e+00 +6.3957e+00 +6.4891e+00 +6.4502e+00 +6.5640e+00 +6.3804e+00 +6.5117e+00 +6.4300e+00 +6.4365e+00 +6.5048e+00 +6.5424e+00 +6.4618e+00 +6.3403e+00 +6.4037e+00 +6.4161e+00 +6.3947e+00 +6.4388e+00 +6.4944e+00 +6.4316e+00 +6.4433e+00 +6.4282e+00 +6.5178e+00 +6.5146e+00 +6.4876e+00 +6.4250e+00 +6.4470e+00 +6.4201e+00 +6.4905e+00 +6.3618e+00 +6.4406e+00 +6.4198e+00 +6.4886e+00 +6.3112e+00 +6.5404e+00 +6.5018e+00 +6.4281e+00 +6.4048e+00 +6.5059e+00 +6.3292e+00 +6.3653e+00 +6.3779e+00 +6.5100e+00 +6.4875e+00 +6.4479e+00 +6.3697e+00 +6.4246e+00 +6.5503e+00 +6.3958e+00 +6.4417e+00 +6.4559e+00 +6.5690e+00 +6.5855e+00 +6.5444e+00 +6.3557e+00 +6.3755e+00 +6.3811e+00 +6.4712e+00 +6.5466e+00 +6.4431e+00 +6.4809e+00 +6.5216e+00 +6.4891e+00 +6.3798e+00 +6.5055e+00 +6.4041e+00 +6.4691e+00 +6.5379e+00 +6.4660e+00 +6.4280e+00 +6.4608e+00 +6.4110e+00 +6.5028e+00 +6.4340e+00 +6.4096e+00 +6.3859e+00 +6.3549e+00 +6.3428e+00 +6.3967e+00 +6.4403e+00 +6.4673e+00 +6.2918e+00 +6.5052e+00 +6.3467e+00 +6.4065e+00 +6.4357e+00 +6.3781e+00 +6.3506e+00 +6.4241e+00 +6.4375e+00 +6.3504e+00 +6.4617e+00 +6.4214e+00 +6.3391e+00 +6.4163e+00 +6.3841e+00 +6.3572e+00 +6.3798e+00 +6.3804e+00 +6.3625e+00 +6.4310e+00 +6.4898e+00 +6.4075e+00 +6.5419e+00 +6.3940e+00 +6.3722e+00 +6.5138e+00 +6.5274e+00 +6.4615e+00 +6.3853e+00 +6.5334e+00 +6.4043e+00 +6.3497e+00 +6.3474e+00 +6.4821e+00 +6.4784e+00 +6.3568e+00 +6.3914e+00 +6.5115e+00 +6.3340e+00 +6.3611e+00 +6.3682e+00 +6.3658e+00 +6.4214e+00 +6.2681e+00 +6.3793e+00 +6.3695e+00 +6.2861e+00 +6.4391e+00 +6.3437e+00 +6.3736e+00 +6.3678e+00 +6.4270e+00 +6.5202e+00 +6.4663e+00 +6.4155e+00 +6.3921e+00 +6.3030e+00 +6.3896e+00 +6.4997e+00 +6.3610e+00 +6.4195e+00 +6.4650e+00 +6.4012e+00 +6.3806e+00 
+6.3803e+00 +6.4176e+00 +6.3044e+00 +6.3093e+00 +6.3978e+00 +6.3523e+00 +6.3853e+00 +6.3014e+00 +6.5711e+00 +6.3389e+00 +6.3254e+00 +6.2977e+00 +6.3900e+00 +6.3432e+00 +6.3229e+00 +6.3834e+00 +6.2609e+00 +6.3489e+00 +6.5541e+00 +6.3816e+00 +6.3931e+00 +6.3263e+00 +6.3166e+00 +6.3621e+00 +6.2188e+00 +6.4002e+00 +6.3129e+00 +6.3375e+00 +6.3984e+00 +6.3931e+00 +6.2205e+00 +6.4339e+00 +6.3401e+00 +6.3700e+00 +6.3224e+00 +6.3305e+00 +6.4470e+00 +6.3360e+00 +6.1795e+00 +6.2694e+00 +6.2787e+00 +6.2532e+00 +6.4125e+00 +6.3096e+00 +6.2590e+00 +6.3638e+00 +6.2412e+00 +6.2491e+00 +6.3082e+00 +6.2790e+00 +6.3492e+00 +6.4170e+00 +6.3644e+00 +6.3260e+00 +6.2833e+00 +6.2706e+00 +6.4572e+00 +6.3763e+00 +6.2526e+00 +6.2076e+00 +6.3113e+00 +6.2884e+00 +6.3492e+00 +6.2837e+00 +6.4013e+00 +6.3221e+00 +6.3740e+00 +6.3742e+00 +6.2466e+00 +6.3644e+00 +6.2651e+00 +6.2780e+00 +6.2412e+00 +6.3895e+00 +6.2770e+00 +6.3611e+00 +6.3003e+00 +6.1737e+00 +6.2777e+00 +6.2934e+00 +6.3127e+00 +6.2478e+00 +6.2638e+00 +6.2570e+00 +6.3286e+00 +6.2761e+00 +6.2900e+00 +6.2390e+00 +6.3252e+00 +6.2377e+00 +6.3278e+00 +6.2638e+00 +6.3727e+00 +6.2691e+00 +6.2055e+00 +6.2526e+00 +6.3752e+00 +6.1857e+00 +6.3265e+00 +6.3754e+00 +6.3165e+00 +6.2717e+00 +6.2634e+00 +6.3670e+00 +6.2475e+00 +6.2395e+00 +6.2867e+00 +6.3297e+00 +6.1355e+00 +6.1980e+00 +6.3071e+00 +6.1472e+00 +6.3563e+00 +6.2142e+00 +6.2932e+00 +6.2359e+00 +6.3353e+00 +6.2622e+00 +6.4513e+00 +6.2995e+00 +6.2314e+00 +6.2915e+00 +6.2190e+00 +6.3534e+00 +6.3487e+00 +6.3157e+00 +6.1931e+00 +6.4112e+00 +6.2980e+00 +6.1800e+00 +6.2490e+00 +6.1101e+00 +6.2451e+00 +6.2390e+00 +6.2240e+00 +6.1914e+00 +6.2158e+00 +6.2695e+00 +6.2473e+00 +6.3158e+00 +6.2713e+00 +6.2241e+00 +6.3024e+00 +6.3270e+00 +6.3048e+00 +6.3105e+00 +6.2362e+00 +6.1931e+00 +6.2352e+00 +6.2094e+00 +6.3642e+00 +6.2692e+00 +6.2933e+00 +6.3833e+00 +6.1545e+00 +6.3310e+00 +6.2399e+00 +6.2470e+00 +6.2505e+00 +6.2150e+00 +6.2503e+00 +6.3077e+00 +6.3023e+00 +6.2406e+00 +6.2438e+00 +6.2934e+00 
+6.3063e+00 +6.2277e+00 +6.2253e+00 +6.2888e+00 +6.1503e+00 +6.1702e+00 +6.2719e+00 +6.3179e+00 +6.2715e+00 +6.3820e+00 +6.3044e+00 +6.2277e+00 +6.1795e+00 +6.3450e+00 +6.1779e+00 +6.1738e+00 +6.1556e+00 +6.2513e+00 +6.2816e+00 +6.2324e+00 +6.2892e+00 +6.2318e+00 +6.1499e+00 +6.1558e+00 +6.3091e+00 +6.1107e+00 +6.3404e+00 +6.2402e+00 +6.2289e+00 +6.3021e+00 +6.2117e+00 +6.1836e+00 +6.2755e+00 +6.2979e+00 +6.2580e+00 +6.0091e+00 +6.1477e+00 +6.2548e+00 +6.2265e+00 +6.1972e+00 +6.2239e+00 +6.2418e+00 +6.1506e+00 +6.2323e+00 +6.1737e+00 +6.1320e+00 +6.2560e+00 +6.1821e+00 +6.1980e+00 +6.3082e+00 +6.1652e+00 +6.1305e+00 +6.2058e+00 +6.2434e+00 +6.2432e+00 +6.1922e+00 +6.2959e+00 +6.3707e+00 +6.2363e+00 +6.1636e+00 +6.1462e+00 +6.2544e+00 +6.3015e+00 +6.3610e+00 +6.2046e+00 +6.3110e+00 +6.1909e+00 +6.1586e+00 +6.2207e+00 +6.2503e+00 +6.0979e+00 +6.3842e+00 +6.3328e+00 +6.2704e+00 +6.2203e+00 +6.2013e+00 +6.2946e+00 +6.1973e+00 +6.1691e+00 +6.3083e+00 +6.2538e+00 +6.1893e+00 +6.3059e+00 +6.3325e+00 +6.1155e+00 +6.3222e+00 +6.2158e+00 +6.2599e+00 +6.2362e+00 +6.2405e+00 +6.2532e+00 +6.2763e+00 +6.1422e+00 +6.2447e+00 +6.2154e+00 +6.1191e+00 +6.1192e+00 +6.1314e+00 +6.2342e+00 +6.2519e+00 +6.1882e+00 +6.2924e+00 +6.1657e+00 +6.1998e+00 +6.1405e+00 +6.2610e+00 +6.1499e+00 +6.3120e+00 +6.2073e+00 +6.1608e+00 +6.1852e+00 +6.3017e+00 +6.1838e+00 +6.2170e+00 +6.2075e+00 +6.1706e+00 +6.1297e+00 +6.2389e+00 +6.1402e+00 +6.2667e+00 +6.2832e+00 +6.1954e+00 +6.1516e+00 +6.2377e+00 +6.1120e+00 +6.3711e+00 +6.0187e+00 +6.2844e+00 +6.1311e+00 +6.1943e+00 +6.1019e+00 +6.2087e+00 +6.0433e+00 +6.1208e+00 +6.1447e+00 +6.2045e+00 +6.1229e+00 +6.1547e+00 +6.2099e+00 +6.2601e+00 +6.0653e+00 +6.1612e+00 +6.1563e+00 +6.1880e+00 +6.2453e+00 +6.0937e+00 +6.2553e+00 +6.1586e+00 +6.2422e+00 +6.2275e+00 +6.2906e+00 +6.2221e+00 +6.0901e+00 +6.1900e+00 +6.1244e+00 +6.1125e+00 +6.1855e+00 +6.1510e+00 +6.1798e+00 +6.1608e+00 +6.0458e+00 +6.1492e+00 +6.0680e+00 +6.2087e+00 +6.1310e+00 +6.1316e+00 
+6.1747e+00 +6.1172e+00 +6.1110e+00 +6.1383e+00 +6.1165e+00 +6.0949e+00 +6.2561e+00 +6.2271e+00 +6.1370e+00 +6.1528e+00 +6.1146e+00 +6.1604e+00 +6.1704e+00 +6.1208e+00 +6.2547e+00 +6.1287e+00 +6.1600e+00 +6.1823e+00 +6.1246e+00 +6.1511e+00 +6.0582e+00 +6.1691e+00 +6.2556e+00 +6.0511e+00 +6.0548e+00 +6.1456e+00 +6.1391e+00 +6.1972e+00 +6.1704e+00 +6.1348e+00 +6.1420e+00 +6.1865e+00 +6.0599e+00 +6.1069e+00 +6.1867e+00 +6.1279e+00 +6.0872e+00 +6.1161e+00 +6.1532e+00 +6.1203e+00 +6.0742e+00 +6.0750e+00 +6.2615e+00 +6.0949e+00 +6.2251e+00 +6.2348e+00 +6.2127e+00 +6.0923e+00 +6.1035e+00 +6.2271e+00 +6.1105e+00 +6.0466e+00 +6.1138e+00 +6.2065e+00 +6.1114e+00 +6.1120e+00 +6.2484e+00 +6.1748e+00 +6.2653e+00 +6.1549e+00 +6.2230e+00 +6.1842e+00 +6.0838e+00 +6.0433e+00 +6.0443e+00 +6.1498e+00 +6.0596e+00 +6.1914e+00 +6.0309e+00 +6.0861e+00 +5.9340e+00 +6.0812e+00 +6.1495e+00 +6.1003e+00 +6.0937e+00 +5.9172e+00 +5.9841e+00 +6.1055e+00 +6.0576e+00 +6.1577e+00 +6.0781e+00 +6.2285e+00 +6.0485e+00 +6.1462e+00 +6.0954e+00 +6.1748e+00 +6.2707e+00 +6.1001e+00 +6.1933e+00 +6.2395e+00 +6.0913e+00 +6.1195e+00 +6.0370e+00 +6.0369e+00 +6.1822e+00 +6.2212e+00 +6.0950e+00 +6.0732e+00 +6.1707e+00 +6.1881e+00 +6.1594e+00 +5.8065e+00 +6.0659e+00 +5.9919e+00 +6.0433e+00 +6.1407e+00 +6.2211e+00 +5.9636e+00 +5.9459e+00 +6.0432e+00 +6.1375e+00 +6.0789e+00 +6.1382e+00 +6.0653e+00 +6.0367e+00 +6.0638e+00 +6.1218e+00 +6.0586e+00 +6.1248e+00 +6.0495e+00 +6.0315e+00 +6.0527e+00 +6.0548e+00 +5.9845e+00 +6.0269e+00 +6.1214e+00 +5.9683e+00 +6.0032e+00 +6.0998e+00 +6.0621e+00 +5.9694e+00 +6.1650e+00 +6.2593e+00 +5.9943e+00 +5.9903e+00 +6.0900e+00 +6.0092e+00 +6.0758e+00 +5.9855e+00 +6.1232e+00 +6.1651e+00 +6.0622e+00 +5.9588e+00 +5.9450e+00 +5.9684e+00 +5.9261e+00 +6.1173e+00 +6.0845e+00 +6.0911e+00 +5.9977e+00 +6.0402e+00 +6.0924e+00 +6.0855e+00 +6.0890e+00 +5.9302e+00 +6.2334e+00 +6.1144e+00 +5.9035e+00 +6.2051e+00 +6.0656e+00 +5.9300e+00 +5.9045e+00 +5.9920e+00 +6.0314e+00 +5.9396e+00 +6.1314e+00 
+6.1426e+00 +6.1165e+00 +6.0661e+00 +5.9897e+00 +6.0113e+00 +5.9804e+00 +5.9925e+00 +6.0146e+00 +6.1596e+00 +6.0826e+00 +5.8912e+00 +6.0025e+00 +6.1222e+00 +5.9999e+00 +6.0775e+00 +6.1352e+00 +6.0019e+00 +6.0047e+00 +6.1633e+00 +5.9517e+00 +5.9941e+00 +5.9421e+00 +5.9198e+00 +6.0120e+00 +5.9919e+00 +6.0123e+00 +6.0338e+00 +5.9574e+00 +6.0715e+00 +5.9572e+00 +6.0069e+00 +5.9899e+00 +6.1474e+00 +6.2308e+00 +6.0127e+00 +6.0737e+00 +6.1558e+00 +6.0268e+00 +6.1370e+00 +5.9557e+00 +5.9012e+00 +6.0767e+00 +6.0052e+00 +6.0097e+00 +6.0374e+00 +6.1072e+00 +5.9392e+00 +6.1855e+00 +6.0241e+00 +5.9896e+00 +6.1185e+00 +5.9917e+00 +6.0658e+00 +6.0110e+00 +6.0169e+00 +6.0112e+00 +6.0556e+00 +5.8792e+00 +6.0188e+00 +6.1237e+00 +6.0402e+00 +5.9867e+00 +5.9492e+00 +6.1214e+00 +5.9829e+00 +5.9468e+00 +5.8616e+00 +5.8081e+00 +6.0418e+00 +5.9114e+00 +6.0987e+00 +5.8739e+00 +5.9725e+00 +6.1321e+00 +5.9141e+00 +5.9960e+00 +5.9443e+00 +6.0335e+00 +6.1037e+00 +5.9687e+00 +5.9866e+00 +5.9921e+00 +6.0327e+00 +5.9912e+00 +5.9756e+00 +6.0451e+00 +6.0157e+00 +5.9688e+00 +5.8886e+00 +5.8466e+00 +5.8956e+00 +5.9688e+00 +5.9942e+00 +6.0276e+00 +5.9884e+00 +5.7954e+00 +5.9892e+00 +6.0576e+00 +6.0374e+00 +6.1042e+00 +6.0168e+00 +6.1416e+00 +6.0121e+00 +6.0888e+00 +6.1364e+00 +5.9949e+00 +6.0629e+00 +6.1133e+00 +5.9585e+00 +6.1257e+00 +5.9141e+00 +5.9143e+00 +6.0018e+00 +5.9938e+00 +6.0544e+00 +5.9936e+00 +6.0161e+00 +5.9068e+00 +6.0421e+00 +5.9807e+00 +5.9405e+00 +5.8365e+00 +5.8715e+00 +6.0791e+00 +5.9537e+00 +5.8042e+00 +6.0225e+00 +5.9325e+00 +5.9694e+00 +5.9379e+00 +6.1620e+00 +6.0152e+00 +5.9895e+00 +5.8409e+00 +5.9589e+00 +5.8399e+00 +5.8988e+00 +5.9288e+00 +6.0114e+00 +5.9575e+00 +5.9549e+00 +5.9477e+00 +5.8826e+00 +6.0914e+00 +5.8012e+00 +5.9086e+00 +5.9897e+00 +5.8610e+00 +5.9041e+00 +5.8540e+00 +5.9422e+00 +6.0304e+00 +5.8840e+00 +5.9754e+00 +5.8764e+00 +5.9434e+00 +5.8338e+00 +5.8620e+00 +6.0048e+00 +5.8864e+00 +6.0824e+00 +5.9145e+00 +5.8193e+00 +6.0788e+00 +5.8428e+00 +5.8613e+00 
+5.7906e+00 +5.8872e+00 +5.8682e+00 +5.9745e+00 +5.7857e+00 +5.9021e+00 +5.8503e+00 +5.8827e+00 +6.0456e+00 +5.7930e+00 +6.0142e+00 +6.0560e+00 +5.8860e+00 +5.9062e+00 +5.7776e+00 +6.0026e+00 +5.8630e+00 +5.8193e+00 +5.9056e+00 +5.9643e+00 +5.8846e+00 +5.8601e+00 +5.9434e+00 +5.9664e+00 +5.9856e+00 +5.8161e+00 +5.8699e+00 +5.9174e+00 +5.9008e+00 +6.0884e+00 +5.8733e+00 +5.9692e+00 +5.8624e+00 +5.9450e+00 +5.8981e+00 +6.0162e+00 +5.9350e+00 +5.9278e+00 +5.9670e+00 +5.9824e+00 +5.8647e+00 +5.9682e+00 +5.9358e+00 +5.8024e+00 +6.0844e+00 +5.9816e+00 +5.8619e+00 +6.0808e+00 +6.0119e+00 +5.9539e+00 +5.8887e+00 +5.8032e+00 +5.8994e+00 +5.9913e+00 +5.7999e+00 +5.9834e+00 +5.9327e+00 +5.9495e+00 +5.8878e+00 +6.0339e+00 +6.0302e+00 +5.9281e+00 +5.9763e+00 +5.8048e+00 +6.0138e+00 +5.9200e+00 +5.8339e+00 +5.8638e+00 +5.8095e+00 +5.8449e+00 +5.9644e+00 +5.7913e+00 +5.8011e+00 +5.9184e+00 +5.8091e+00 +5.8889e+00 +5.8140e+00 +5.7781e+00 +5.8968e+00 +5.8197e+00 +5.9279e+00 +5.8592e+00 +5.8295e+00 +5.7980e+00 +5.6988e+00 +5.8664e+00 +5.9330e+00 +6.0490e+00 +5.7167e+00 +5.8518e+00 +6.0362e+00 +5.7891e+00 +5.7855e+00 +5.8951e+00 +5.9346e+00 +5.9133e+00 +5.8255e+00 +5.9455e+00 +5.8678e+00 +6.0361e+00 +5.6729e+00 +5.9632e+00 +5.8744e+00 +5.8039e+00 +6.1014e+00 +5.9379e+00 +5.8855e+00 +5.8670e+00 +5.8268e+00 +5.8919e+00 +5.8918e+00 +5.9585e+00 +6.0008e+00 +6.0698e+00 +5.9662e+00 +5.9095e+00 +5.8971e+00 +5.7985e+00 +5.8820e+00 +5.8930e+00 +5.9247e+00 +5.9281e+00 +5.9409e+00 +5.8453e+00 +5.9049e+00 +5.7563e+00 +5.7883e+00 +5.9086e+00 +6.0362e+00 +5.9312e+00 +5.8872e+00 +5.7634e+00 +5.8251e+00 +5.8679e+00 +5.9161e+00 +5.9183e+00 +5.7592e+00 +5.8421e+00 +5.8326e+00 +5.9787e+00 +6.0362e+00 +5.9595e+00 +5.8831e+00 +5.9568e+00 +5.9892e+00 +5.8595e+00 +5.9504e+00 +5.9089e+00 +5.8373e+00 +5.8971e+00 +5.8542e+00 +5.8316e+00 +5.8221e+00 +5.8195e+00 +6.0695e+00 +5.9728e+00 +5.8639e+00 +5.8431e+00 +5.9416e+00 +5.8024e+00 +5.8913e+00 +5.8488e+00 +5.9110e+00 +5.8074e+00 +5.7967e+00 +5.6602e+00 
+5.8477e+00 +5.8152e+00 +5.8749e+00 +5.8151e+00 +5.8646e+00 +5.8609e+00 +5.8023e+00 +5.7522e+00 +5.6834e+00 +5.8786e+00 +5.7914e+00 +5.9762e+00 +5.8877e+00 +5.9374e+00 +5.8876e+00 +5.8752e+00 +5.9519e+00 +5.9109e+00 +5.9505e+00 +5.8879e+00 +5.8594e+00 +5.8694e+00 +5.8475e+00 +5.7873e+00 +5.7836e+00 +5.7481e+00 +5.6289e+00 +5.8971e+00 +5.7366e+00 +5.7377e+00 +5.5516e+00 +5.8798e+00 +5.8011e+00 +5.9569e+00 +5.7557e+00 +5.8123e+00 +5.7836e+00 +5.9352e+00 +5.7749e+00 +5.8955e+00 +5.8109e+00 +5.7214e+00 +5.7767e+00 +5.8281e+00 +5.7322e+00 +5.9186e+00 +5.8811e+00 +5.7061e+00 +5.8876e+00 +5.7820e+00 +5.8479e+00 +5.8339e+00 +5.8795e+00 +5.7583e+00 +5.8102e+00 +5.7982e+00 +5.7103e+00 +5.7080e+00 +5.9405e+00 +5.7672e+00 +5.7154e+00 +5.7353e+00 +5.9803e+00 +5.7328e+00 +5.9542e+00 +5.7586e+00 +5.8726e+00 +5.8167e+00 +5.8700e+00 +5.7448e+00 +5.8537e+00 +5.8186e+00 +5.7740e+00 +5.6877e+00 +5.8320e+00 +5.9345e+00 +5.7928e+00 +5.8356e+00 +5.7444e+00 +5.8790e+00 +5.9467e+00 +5.8602e+00 +5.6681e+00 +5.6842e+00 +5.6730e+00 +5.8115e+00 +5.6809e+00 +5.8397e+00 +5.7063e+00 +5.7830e+00 +5.8422e+00 +5.6356e+00 +5.7182e+00 +5.7925e+00 +5.7765e+00 +5.8168e+00 +5.8697e+00 +5.7969e+00 +5.7744e+00 +5.7220e+00 +5.8251e+00 +5.7214e+00 +5.7628e+00 +5.6990e+00 +5.9394e+00 +5.7466e+00 +5.8545e+00 +5.7716e+00 +5.9270e+00 +5.8033e+00 +5.7611e+00 +5.8547e+00 +5.8473e+00 +5.8194e+00 +5.8866e+00 +5.7877e+00 +5.8022e+00 +5.8525e+00 +5.9062e+00 +5.8226e+00 +5.7480e+00 +5.7339e+00 +5.7679e+00 +5.7813e+00 +5.6454e+00 +5.9081e+00 +5.8419e+00 +5.8306e+00 +5.6953e+00 +5.8237e+00 +5.7322e+00 +5.8549e+00 +5.8175e+00 +5.7747e+00 +5.7112e+00 +5.6554e+00 +5.7736e+00 +5.9161e+00 +5.7442e+00 +5.8682e+00 +5.6368e+00 +5.6190e+00 +5.8432e+00 +5.6145e+00 +5.6722e+00 +5.8362e+00 +5.6734e+00 +5.7668e+00 +5.7404e+00 +5.8578e+00 +5.8260e+00 +5.7593e+00 +5.6559e+00 +5.6654e+00 +5.7191e+00 +5.5765e+00 +5.8470e+00 +5.6849e+00 +5.7346e+00 +5.6279e+00 +5.7687e+00 +5.8686e+00 +5.6768e+00 +5.8109e+00 +5.7149e+00 +5.6717e+00 
+5.8093e+00 +5.7417e+00 +5.9887e+00 +5.7669e+00 +5.7300e+00 +5.8640e+00 +5.7658e+00 +5.9303e+00 +5.9165e+00 +5.7229e+00 +5.5696e+00 +5.7818e+00 +5.5873e+00 +5.7112e+00 +5.6107e+00 +5.9027e+00 +5.7982e+00 +5.8966e+00 +5.8789e+00 +5.6537e+00 +5.7866e+00 +5.7875e+00 +5.7833e+00 +5.7224e+00 +5.7960e+00 +5.6809e+00 +5.6610e+00 +5.6848e+00 +5.4916e+00 +5.8598e+00 +5.6660e+00 +5.6889e+00 +5.7352e+00 +5.6614e+00 +5.7704e+00 +5.8702e+00 +5.7488e+00 +5.8080e+00 +5.7534e+00 +5.6712e+00 +5.6623e+00 +5.6661e+00 +5.7583e+00 +5.6571e+00 +5.5408e+00 +5.7324e+00 +5.8428e+00 +5.7963e+00 +5.7108e+00 +5.8415e+00 +5.8575e+00 +5.7555e+00 +5.8408e+00 +5.7737e+00 +5.7271e+00 +5.7704e+00 +5.7522e+00 +5.7147e+00 +5.7009e+00 +5.8046e+00 +5.6721e+00 +5.7000e+00 +5.7326e+00 +5.6179e+00 +5.7015e+00 +5.6379e+00 +5.7351e+00 +5.6324e+00 +5.6208e+00 +5.6391e+00 +5.6547e+00 +5.7248e+00 +5.6013e+00 +5.8119e+00 +5.7953e+00 +5.7688e+00 +5.7161e+00 +5.6105e+00 +5.7362e+00 +5.7480e+00 +5.6802e+00 +5.6911e+00 +5.6380e+00 +5.5802e+00 +5.7112e+00 +5.5101e+00 +5.6625e+00 +5.4723e+00 +5.7478e+00 +5.8014e+00 +5.8564e+00 +5.7715e+00 +5.7377e+00 +5.8873e+00 +5.7294e+00 +5.7086e+00 +5.5912e+00 +5.5802e+00 +5.6710e+00 +5.7036e+00 +5.5364e+00 +5.7077e+00 +5.7537e+00 +5.6396e+00 +5.7370e+00 +5.6188e+00 +5.7263e+00 +5.6227e+00 +5.5957e+00 +5.6438e+00 +5.5680e+00 +5.7263e+00 +5.7532e+00 +5.5513e+00 +5.7482e+00 +5.7415e+00 +5.5947e+00 +5.7218e+00 +5.7164e+00 +5.5793e+00 +5.6363e+00 +5.7699e+00 +5.7461e+00 +5.8458e+00 +5.6628e+00 +5.4944e+00 +5.6208e+00 +5.7045e+00 +5.7516e+00 +5.5241e+00 +5.7907e+00 +5.6990e+00 +5.4271e+00 +5.6516e+00 +5.5812e+00 +5.6165e+00 +5.8023e+00 +5.6450e+00 +5.5793e+00 +5.5731e+00 +5.6767e+00 +5.6407e+00 +5.5821e+00 +5.9506e+00 +5.5767e+00 +5.4659e+00 +5.6190e+00 +5.6685e+00 +5.6642e+00 +5.6944e+00 +5.5965e+00 +5.6002e+00 +5.6646e+00 +5.7039e+00 +5.7781e+00 +5.5814e+00 +5.6087e+00 +5.6295e+00 +5.7009e+00 +5.6387e+00 +5.4810e+00 +5.5667e+00 +5.5793e+00 +5.6519e+00 +5.6839e+00 +5.5718e+00 
+5.5450e+00 +5.5872e+00 +5.6402e+00 +5.5815e+00 +5.4138e+00 +5.5982e+00 +5.6555e+00 +5.4861e+00 +5.6619e+00 +5.6179e+00 +5.4754e+00 +5.6467e+00 +5.7469e+00 +5.6236e+00 +5.6221e+00 +5.4767e+00 +5.4763e+00 +5.5149e+00 +5.6314e+00 +5.6109e+00 +5.7464e+00 +5.5769e+00 +5.6348e+00 +5.6738e+00 +5.6209e+00 +5.6809e+00 +5.5252e+00 +5.7629e+00 +5.6656e+00 +5.6626e+00 +5.6098e+00 +5.6514e+00 +5.6496e+00 +5.5320e+00 +5.7541e+00 +5.6402e+00 +5.4814e+00 +5.5759e+00 +5.5292e+00 +5.7214e+00 +5.6209e+00 +5.6770e+00 +5.8087e+00 +5.7043e+00 +5.6896e+00 +5.8109e+00 +5.7212e+00 +5.5977e+00 +5.5067e+00 +5.5789e+00 +5.4021e+00 +5.4917e+00 +5.5326e+00 +5.6026e+00 +5.5076e+00 +5.3627e+00 +5.4157e+00 +5.5322e+00 +5.4933e+00 +5.7870e+00 +5.5421e+00 +5.5379e+00 +5.4681e+00 +5.6624e+00 +5.5409e+00 +5.6333e+00 +5.5780e+00 +5.4301e+00 +5.6246e+00 +5.7060e+00 +5.6888e+00 +5.6844e+00 +5.7482e+00 +5.7045e+00 +5.5713e+00 +5.5047e+00 +5.5366e+00 +5.5611e+00 +5.5764e+00 +5.6121e+00 +5.4857e+00 +5.3908e+00 +5.5464e+00 +5.6996e+00 +5.5246e+00 +5.5423e+00 +5.4692e+00 +5.5765e+00 +5.7091e+00 +5.6296e+00 +5.5420e+00 +5.4733e+00 +5.5778e+00 +5.6948e+00 +5.4892e+00 +5.6200e+00 +5.4041e+00 +5.5881e+00 +5.4618e+00 +5.5187e+00 +5.5043e+00 +5.4989e+00 +5.4519e+00 +5.5346e+00 +5.5494e+00 +5.3751e+00 +5.5735e+00 +5.5896e+00 +5.6385e+00 +5.4820e+00 +5.7612e+00 +5.4444e+00 +5.5449e+00 +5.6478e+00 +5.5188e+00 +5.4425e+00 +5.7404e+00 +5.5389e+00 +5.6001e+00 +5.6984e+00 +5.5366e+00 +5.5431e+00 +5.4298e+00 +5.5158e+00 +5.6845e+00 +5.6816e+00 +5.6418e+00 +5.5253e+00 +5.6050e+00 +5.5862e+00 +5.5705e+00 +5.6061e+00 +5.5167e+00 +5.8128e+00 +5.4409e+00 +5.7127e+00 +5.4060e+00 +5.5275e+00 +5.4829e+00 +5.4720e+00 +5.6083e+00 +5.5894e+00 +5.6123e+00 +5.6190e+00 +5.5182e+00 +5.6375e+00 +5.8143e+00 +5.5764e+00 +5.7318e+00 +5.5502e+00 +5.6694e+00 +5.7690e+00 +5.7304e+00 +5.6667e+00 +5.5439e+00 +5.3422e+00 +5.5405e+00 +5.4955e+00 +5.4818e+00 +5.5071e+00 +5.6807e+00 +5.5701e+00 +5.5699e+00 +5.4725e+00 +5.6065e+00 +5.6271e+00 
+5.6482e+00 +5.4829e+00 +5.4110e+00 +5.4644e+00 +5.5126e+00 +5.5120e+00 +5.5408e+00 +5.7053e+00 +5.5243e+00 +5.5293e+00 +5.6164e+00 +5.5472e+00 +5.5133e+00 +5.5592e+00 +5.5166e+00 +5.4530e+00 +5.4844e+00 +5.5095e+00 +5.4880e+00 +5.5076e+00 +5.5062e+00 +5.4397e+00 +5.5843e+00 +5.5366e+00 +5.4960e+00 +5.5619e+00 +5.5510e+00 +5.2300e+00 +5.4360e+00 +5.4144e+00 +5.6379e+00 +5.4937e+00 +5.4999e+00 +5.6881e+00 +5.4238e+00 +5.5799e+00 +5.5692e+00 +5.5608e+00 +5.4211e+00 +5.6344e+00 +5.6397e+00 +5.3938e+00 +5.4130e+00 +5.4991e+00 +5.5721e+00 +5.5447e+00 +5.4605e+00 +5.6474e+00 +5.5223e+00 +5.4501e+00 +5.4746e+00 +5.6476e+00 +5.5029e+00 +5.5109e+00 +5.5150e+00 +5.6181e+00 +5.5420e+00 +5.4154e+00 +5.4307e+00 +5.4150e+00 +5.5497e+00 +5.4709e+00 +5.3756e+00 +5.5434e+00 +5.6508e+00 +5.6226e+00 +5.6177e+00 +5.5283e+00 +5.5611e+00 +5.6421e+00 +5.5173e+00 +5.5629e+00 +5.3512e+00 +5.4446e+00 +5.6021e+00 +5.5749e+00 +5.5260e+00 +5.6156e+00 +5.5842e+00 +5.5411e+00 +5.3671e+00 +5.5530e+00 +5.4305e+00 +5.4829e+00 +5.5394e+00 +5.5694e+00 +5.5532e+00 +5.4671e+00 +5.7900e+00 +5.5179e+00 +5.3545e+00 +5.5847e+00 +5.4458e+00 +5.4857e+00 +5.5886e+00 +5.5443e+00 +5.5259e+00 +5.3363e+00 +5.4068e+00 +5.5196e+00 +5.3645e+00 +5.4029e+00 +5.3594e+00 +5.5000e+00 +5.4159e+00 +5.3969e+00 +5.4478e+00 +5.5034e+00 +5.4595e+00 +5.3213e+00 +5.4445e+00 +5.5040e+00 +5.1955e+00 +5.3897e+00 +5.6003e+00 +5.3884e+00 +5.6413e+00 +5.5729e+00 +5.4203e+00 +5.3905e+00 +5.5983e+00 +5.5902e+00 +5.5005e+00 +5.5453e+00 +5.5852e+00 +5.3651e+00 +5.2997e+00 +5.3870e+00 +5.4461e+00 +5.3513e+00 +5.4330e+00 +5.4855e+00 +5.5043e+00 +5.4354e+00 +5.5261e+00 +5.6217e+00 +5.4648e+00 +5.4722e+00 +5.5891e+00 +5.4522e+00 +5.4833e+00 +5.5560e+00 +5.5456e+00 +5.4709e+00 +5.4660e+00 +5.4564e+00 +5.2912e+00 +5.6210e+00 +5.5201e+00 +5.7217e+00 +5.5895e+00 +5.4528e+00 +5.6624e+00 +5.5503e+00 +5.5822e+00 +5.3537e+00 +5.5319e+00 +5.4063e+00 +5.3493e+00 +5.4879e+00 +5.5127e+00 +5.3385e+00 +5.4472e+00 +5.5861e+00 +5.4788e+00 +5.4935e+00 
+5.5278e+00 +5.4913e+00 +5.3615e+00 +5.5630e+00 +5.4878e+00 +5.3506e+00 +5.3701e+00 +5.3887e+00 +5.3971e+00 +5.4881e+00 +5.5177e+00 +5.2957e+00 +5.4828e+00 +5.4084e+00 +5.4352e+00 +5.4042e+00 +5.3254e+00 +5.4713e+00 +5.3663e+00 +5.5400e+00 +5.3939e+00 +5.4908e+00 +5.4202e+00 +5.3408e+00 +5.3400e+00 +5.3907e+00 +5.3519e+00 +5.3613e+00 +5.4650e+00 +5.5757e+00 +5.3454e+00 +5.3719e+00 +5.5288e+00 +5.5250e+00 +5.3922e+00 +5.3985e+00 +5.7236e+00 +5.2940e+00 +5.5180e+00 +5.4061e+00 +5.4278e+00 +5.5769e+00 +5.4313e+00 +5.3502e+00 +5.4663e+00 +5.4264e+00 +5.3804e+00 +5.4622e+00 +5.2964e+00 +5.3137e+00 +5.4008e+00 +5.4299e+00 +5.3693e+00 +5.4160e+00 +5.3446e+00 +5.3282e+00 +5.4229e+00 +5.3748e+00 +5.3991e+00 +5.2430e+00 +5.2679e+00 +5.4037e+00 +5.5488e+00 +5.4912e+00 +5.2646e+00 +5.3080e+00 +5.3856e+00 +5.4552e+00 +5.3989e+00 +5.5097e+00 +5.5043e+00 +5.4710e+00 +5.2499e+00 +5.3945e+00 +5.4260e+00 +5.2636e+00 +5.4050e+00 +5.2780e+00 +5.4545e+00 +5.2939e+00 +5.4338e+00 +5.3662e+00 +5.3914e+00 +5.5127e+00 +5.3989e+00 +5.4034e+00 +5.3608e+00 +5.4795e+00 +5.4331e+00 +5.3470e+00 +5.3208e+00 +5.4101e+00 +5.4228e+00 +5.5065e+00 +5.4883e+00 +5.3340e+00 +5.3206e+00 +5.4163e+00 +5.4716e+00 +5.6193e+00 +5.2621e+00 +5.4024e+00 +5.3712e+00 +5.3353e+00 +5.5602e+00 +5.4046e+00 +5.4133e+00 +5.4720e+00 +5.3278e+00 +5.2897e+00 +5.4048e+00 +5.4188e+00 +5.6094e+00 +5.2076e+00 +5.2628e+00 +5.4837e+00 +5.2911e+00 +5.3177e+00 +5.1802e+00 +5.3232e+00 +5.3511e+00 +5.2323e+00 +5.3964e+00 +5.3551e+00 +5.4209e+00 +5.4102e+00 +5.4260e+00 +5.3026e+00 +5.3540e+00 +5.2809e+00 +5.2714e+00 +5.3237e+00 +5.4204e+00 +5.2262e+00 +5.2698e+00 +5.4529e+00 +5.2379e+00 +5.5157e+00 +5.4178e+00 +5.2853e+00 +5.4097e+00 +5.3940e+00 +5.3662e+00 +5.4162e+00 +5.3764e+00 +5.5505e+00 +5.4857e+00 +5.3703e+00 +5.2707e+00 +5.3883e+00 +5.4378e+00 +5.4900e+00 +5.2901e+00 +5.3477e+00 +5.3785e+00 +5.3469e+00 +5.5609e+00 +5.3755e+00 +5.4281e+00 +5.3540e+00 +5.4375e+00 +5.4063e+00 +5.3263e+00 +5.2232e+00 +5.3213e+00 +5.2392e+00 
+5.4342e+00 +5.4526e+00 +5.3844e+00 +5.2067e+00 +5.3480e+00 +5.2924e+00 +5.2671e+00 +5.3618e+00 +5.4111e+00 +5.2376e+00 +5.2681e+00 +5.4208e+00 +5.3527e+00 +5.4202e+00 +5.2807e+00 +5.5875e+00 +5.3934e+00 +5.2516e+00 +5.3143e+00 +5.4688e+00 +5.4616e+00 +5.3243e+00 +5.4356e+00 +5.3772e+00 +5.2772e+00 +5.3617e+00 +5.3188e+00 +5.2169e+00 +5.5780e+00 +5.4543e+00 +5.2350e+00 +5.4442e+00 +5.3877e+00 +5.2340e+00 +5.4465e+00 +5.1853e+00 +5.4095e+00 +5.3860e+00 +5.2640e+00 +5.3330e+00 +5.3579e+00 +5.3815e+00 +5.4502e+00 +5.3002e+00 +5.4184e+00 +5.3110e+00 +5.3716e+00 +5.4439e+00 +5.3353e+00 +5.2374e+00 +5.3908e+00 +5.3059e+00 +5.2275e+00 +5.2064e+00 +5.3105e+00 +5.3734e+00 +5.3029e+00 +5.1956e+00 +5.4250e+00 +5.4084e+00 +5.3162e+00 +5.1575e+00 +5.4050e+00 +5.2803e+00 +5.4374e+00 +5.3819e+00 +5.1674e+00 +5.3698e+00 +5.3868e+00 +5.2644e+00 +5.3592e+00 +5.4195e+00 +5.2441e+00 +5.2496e+00 +5.3562e+00 +5.1501e+00 +5.2969e+00 +5.2271e+00 +5.2978e+00 +5.3450e+00 +5.7157e+00 +5.2185e+00 +5.3085e+00 +5.2713e+00 +5.4389e+00 +5.3996e+00 +5.3144e+00 +5.2632e+00 +5.3343e+00 +5.3250e+00 +5.3577e+00 +5.3739e+00 +5.2091e+00 +5.2693e+00 +5.3516e+00 +5.3207e+00 +5.4037e+00 +5.2016e+00 +5.2613e+00 +5.4281e+00 +5.2416e+00 +5.2449e+00 +5.2453e+00 +5.3289e+00 +5.1866e+00 +5.2891e+00 +5.3540e+00 +5.4794e+00 +5.1856e+00 +5.3564e+00 +5.5364e+00 +5.2314e+00 +5.3093e+00 +5.2633e+00 +5.2325e+00 +5.1324e+00 +5.1913e+00 +5.4492e+00 +5.3431e+00 +5.3389e+00 +5.4463e+00 +5.4397e+00 +5.4118e+00 +5.3233e+00 +5.2830e+00 +5.2119e+00 +5.2833e+00 +5.1654e+00 +5.1072e+00 +5.3968e+00 +5.2560e+00 +5.1880e+00 +5.2710e+00 +5.2774e+00 +5.1745e+00 +5.4196e+00 +5.1803e+00 +5.4254e+00 +5.2471e+00 +5.2218e+00 +5.1938e+00 +5.2783e+00 +5.2202e+00 +5.4397e+00 +5.3047e+00 +5.2389e+00 +5.3022e+00 +5.4241e+00 +5.3449e+00 +5.1631e+00 +5.1113e+00 +5.1579e+00 +5.3244e+00 +5.2779e+00 +5.0636e+00 +5.1918e+00 +5.1429e+00 +5.1764e+00 +5.2186e+00 +5.0388e+00 +5.3882e+00 +5.3151e+00 +5.3881e+00 +5.2746e+00 +5.2081e+00 +5.2942e+00 
+5.1979e+00 +5.2670e+00 +5.3643e+00 +5.3583e+00 +5.3860e+00 +5.1949e+00 +5.3686e+00 +5.4515e+00 +5.1244e+00 +5.4234e+00 +5.2519e+00 +5.2743e+00 +5.0673e+00 +5.2246e+00 +5.2527e+00 +5.3532e+00 +5.2981e+00 +5.3718e+00 +5.3687e+00 +5.2702e+00 +5.3553e+00 +5.4427e+00 +5.1159e+00 +5.4590e+00 +5.2143e+00 +5.2417e+00 +5.3780e+00 +5.2368e+00 +5.3280e+00 +5.2181e+00 +5.1903e+00 +5.3402e+00 +5.3408e+00 +5.2688e+00 +5.5336e+00 +5.3098e+00 +5.3599e+00 +5.3050e+00 +5.1977e+00 +5.2304e+00 +5.3147e+00 +5.2724e+00 +5.1592e+00 +5.2717e+00 +5.2973e+00 +5.3092e+00 +5.1036e+00 +5.1150e+00 +5.1121e+00 +5.3515e+00 +5.1430e+00 +5.1968e+00 +5.2892e+00 +5.1811e+00 +5.2268e+00 +5.2563e+00 +5.2647e+00 +5.4898e+00 +5.3881e+00 +5.3122e+00 +5.2313e+00 +5.0335e+00 +5.1506e+00 +5.2188e+00 +5.2608e+00 +5.2546e+00 +5.2532e+00 +5.3182e+00 +5.3049e+00 +5.2289e+00 +5.1550e+00 +5.2890e+00 +5.3095e+00 +5.1766e+00 +5.3264e+00 +5.1534e+00 +5.2337e+00 +5.2976e+00 +5.1505e+00 +5.2715e+00 +5.3642e+00 +5.2867e+00 +5.2536e+00 +5.2498e+00 +5.1784e+00 +5.2254e+00 +5.1879e+00 +5.3855e+00 +5.2426e+00 +5.2017e+00 +5.1316e+00 +5.2794e+00 +5.2517e+00 +5.4223e+00 +5.2327e+00 +5.4211e+00 +5.1692e+00 +5.3621e+00 +5.1408e+00 +5.3775e+00 +5.3708e+00 +5.2152e+00 +5.2261e+00 +5.1836e+00 +5.2684e+00 +5.1620e+00 +5.1312e+00 +5.1477e+00 +5.1893e+00 +5.1049e+00 +5.1826e+00 +5.1057e+00 +5.2461e+00 +5.2087e+00 +5.1260e+00 +5.1644e+00 +5.2802e+00 +5.1672e+00 +5.0806e+00 +5.2014e+00 +5.2962e+00 +5.0462e+00 +5.0174e+00 +5.3311e+00 +5.1525e+00 +5.2538e+00 +5.3946e+00 +5.1940e+00 +5.1160e+00 +5.2953e+00 +5.3710e+00 +5.2212e+00 +5.2767e+00 +5.2039e+00 +5.3114e+00 +4.9402e+00 +5.0742e+00 +5.2774e+00 +5.4668e+00 +4.9199e+00 +5.0854e+00 +5.1059e+00 +5.1247e+00 +5.3543e+00 +5.1731e+00 +5.2543e+00 +5.2259e+00 +5.2601e+00 +5.1054e+00 +5.2202e+00 +5.2116e+00 +5.3449e+00 +5.1109e+00 +5.1751e+00 +5.2317e+00 +5.2145e+00 +5.1664e+00 +5.1751e+00 +5.0659e+00 +5.2115e+00 +5.1774e+00 +5.3628e+00 +5.3434e+00 +5.2857e+00 +5.1885e+00 +5.1027e+00 
[Training loss log excerpt elided: per-step loss values, beginning around 5.44 and declining steadily to roughly 4.0 over the logged interval.]
+3.9892e+00 +3.8010e+00 +4.1200e+00 +3.9267e+00 +4.0034e+00 +3.8992e+00 +3.9975e+00 +4.1795e+00 +3.9199e+00 +3.8047e+00 +3.8882e+00 +3.9090e+00 +4.0789e+00 +4.2747e+00 +4.0563e+00 +4.0187e+00 +4.0385e+00 +4.1308e+00 +4.0332e+00 +3.9637e+00 +3.9986e+00 +4.1354e+00 +4.0704e+00 +3.8404e+00 +4.1824e+00 +3.6622e+00 +3.9742e+00 +3.9047e+00 +3.9892e+00 +3.9965e+00 +3.9457e+00 +3.9550e+00 +4.0214e+00 +3.8235e+00 +4.1256e+00 +4.3371e+00 +4.2509e+00 +4.1755e+00 +3.8911e+00 +4.1727e+00 +4.1882e+00 +4.0396e+00 +4.2598e+00 +4.0827e+00 +3.9064e+00 +4.2317e+00 +3.8765e+00 +4.1222e+00 +4.2821e+00 +4.2049e+00 +4.0057e+00 +3.9456e+00 +3.9320e+00 +3.8607e+00 +3.9519e+00 +3.9643e+00 +4.2029e+00 +4.0618e+00 +3.8223e+00 +4.0010e+00 +3.9756e+00 +4.2547e+00 +4.1677e+00 +3.9168e+00 +3.8208e+00 +4.0857e+00 +3.9479e+00 +3.8462e+00 +3.9416e+00 +4.1276e+00 +3.8763e+00 +3.7994e+00 +4.0434e+00 +3.9777e+00 +3.9476e+00 +3.7584e+00 +4.1359e+00 +3.8094e+00 +3.9933e+00 +3.9480e+00 +3.9182e+00 +4.0496e+00 +3.8871e+00 +4.0592e+00 +3.9440e+00 +3.9579e+00 +4.0457e+00 +4.1326e+00 +3.9793e+00 +3.7745e+00 +4.1138e+00 +4.1379e+00 +4.1210e+00 +3.8280e+00 +3.9273e+00 +3.9335e+00 +3.8580e+00 +4.0158e+00 +3.9685e+00 +4.0112e+00 +4.1766e+00 +3.9275e+00 +4.0006e+00 +3.8630e+00 +4.0983e+00 +3.9908e+00 +4.1716e+00 +3.7807e+00 +4.0037e+00 +3.9859e+00 +3.9179e+00 +4.1984e+00 +3.7683e+00 +3.9429e+00 +3.5846e+00 +3.8339e+00 +4.0209e+00 +3.7208e+00 +4.0132e+00 +4.0288e+00 +4.0673e+00 +4.2262e+00 +4.1895e+00 +3.9804e+00 +4.1530e+00 +3.9463e+00 +3.8510e+00 +3.7640e+00 +3.9863e+00 +4.0237e+00 +3.7748e+00 +3.9231e+00 +3.8381e+00 +3.9071e+00 +3.8581e+00 +3.7666e+00 +3.7673e+00 +3.9637e+00 +4.0223e+00 +4.0950e+00 +3.7960e+00 +3.7352e+00 +4.2004e+00 +4.0210e+00 +3.9393e+00 +4.0824e+00 +4.1418e+00 +3.8582e+00 +4.1624e+00 +4.1411e+00 +4.1310e+00 +4.1616e+00 +3.8949e+00 +4.2241e+00 +3.8519e+00 +3.9241e+00 +3.9530e+00 +3.9285e+00 +3.9220e+00 +3.9586e+00 +3.9207e+00 +3.9589e+00 +4.0569e+00 +3.7881e+00 +4.0319e+00 +4.0139e+00 
+4.0096e+00 +4.0515e+00 +3.7967e+00 +4.0836e+00 +3.9624e+00 +4.0476e+00 +3.8822e+00 +3.8407e+00 +4.1197e+00 +4.0152e+00 +3.8491e+00 +3.8963e+00 +3.7975e+00 +3.7293e+00 +3.7819e+00 +4.0099e+00 +3.8632e+00 +3.9830e+00 +3.9698e+00 +3.7319e+00 +4.3173e+00 +3.7713e+00 +4.1857e+00 +4.2283e+00 +3.9850e+00 +4.0330e+00 +3.8677e+00 +3.9621e+00 +3.7292e+00 +3.9157e+00 +3.7416e+00 +3.8358e+00 +3.9901e+00 +3.9948e+00 +3.8078e+00 +4.1348e+00 +3.8378e+00 +4.2789e+00 +3.9107e+00 +3.9802e+00 +3.9011e+00 +3.7574e+00 +3.8840e+00 +4.0941e+00 +3.8219e+00 +3.8619e+00 +4.0421e+00 +3.9701e+00 +3.8680e+00 +3.8595e+00 +4.1338e+00 +4.1357e+00 +3.8076e+00 +3.8863e+00 +4.0642e+00 +4.0304e+00 +4.1039e+00 +4.0806e+00 +3.9896e+00 +3.8877e+00 +3.7646e+00 +4.0346e+00 +3.9295e+00 +4.0276e+00 +4.0080e+00 +3.7983e+00 +4.0897e+00 +3.8656e+00 +3.9699e+00 +3.7870e+00 +3.9631e+00 +3.8981e+00 +3.8367e+00 +3.8844e+00 +3.7797e+00 +3.9683e+00 +4.0244e+00 +3.9611e+00 +3.9171e+00 +3.9027e+00 +3.7128e+00 +4.0872e+00 +4.0607e+00 +3.9528e+00 +3.9915e+00 +3.8824e+00 +4.0949e+00 +3.9444e+00 +3.9374e+00 +3.9520e+00 +3.7486e+00 +4.0927e+00 +3.8484e+00 +3.9762e+00 +4.1425e+00 +4.0208e+00 +4.0394e+00 +4.0026e+00 +3.9394e+00 +3.7199e+00 +4.0064e+00 +3.9427e+00 +3.8382e+00 +3.8703e+00 +4.0295e+00 +3.7935e+00 +4.0830e+00 +3.6728e+00 +4.0981e+00 +3.9522e+00 +3.8412e+00 +3.8898e+00 +4.0796e+00 +4.3633e+00 +3.7919e+00 +3.9468e+00 +4.0317e+00 +3.8599e+00 +4.0885e+00 +3.7691e+00 +4.0647e+00 +3.9852e+00 +3.8217e+00 +3.9502e+00 +4.0393e+00 +3.8211e+00 +4.0287e+00 +3.9697e+00 +3.9348e+00 +3.7958e+00 +4.0929e+00 +3.8527e+00 +3.8915e+00 +3.7877e+00 +3.6447e+00 +3.8322e+00 +3.8774e+00 +3.9521e+00 +4.1158e+00 +4.0348e+00 +3.9536e+00 +3.9742e+00 +4.0626e+00 +3.8429e+00 +3.7177e+00 +3.9808e+00 +3.7822e+00 +3.9529e+00 +3.8669e+00 +3.8766e+00 +3.9801e+00 +4.0054e+00 +3.8048e+00 +4.1387e+00 +3.7886e+00 +3.9095e+00 +4.1216e+00 +4.1313e+00 +3.9069e+00 +3.9531e+00 +4.0115e+00 +3.8867e+00 +4.1771e+00 +3.9281e+00 +3.9958e+00 +3.8850e+00 
+3.8346e+00 +3.8114e+00 +3.7833e+00 +3.8634e+00 +3.9485e+00 +3.9702e+00 +3.7520e+00 +3.7678e+00 +3.9387e+00 +4.1211e+00 +3.7599e+00 +4.0094e+00 +4.0808e+00 +3.9216e+00 +3.9671e+00 +3.9575e+00 +3.7860e+00 +3.8823e+00 +3.8247e+00 +3.8254e+00 +4.1803e+00 +3.9355e+00 +3.7034e+00 +3.9866e+00 +3.9451e+00 +3.7387e+00 +3.8945e+00 +3.9974e+00 +3.9883e+00 +3.7344e+00 +4.0093e+00 +4.2092e+00 +4.0856e+00 +3.8614e+00 +4.0219e+00 +3.9703e+00 +3.7463e+00 +4.0915e+00 +3.9495e+00 +3.8650e+00 +3.8839e+00 +4.0681e+00 +3.9007e+00 +3.8986e+00 +3.7160e+00 +3.8187e+00 +3.9049e+00 +3.9219e+00 +3.8721e+00 +3.6703e+00 +3.9337e+00 +3.8475e+00 +3.6945e+00 +4.0563e+00 +4.1731e+00 +4.1093e+00 +4.1815e+00 +3.8991e+00 +4.0566e+00 +3.7979e+00 +3.7198e+00 +4.0190e+00 +3.9501e+00 +3.8941e+00 +3.7849e+00 +4.0427e+00 +3.8219e+00 +3.7197e+00 +4.0213e+00 +3.8759e+00 +3.9790e+00 +3.9073e+00 +4.1513e+00 +3.9621e+00 +3.7978e+00 +3.9002e+00 +3.8176e+00 +3.8539e+00 +3.8668e+00 +3.9985e+00 +3.8728e+00 +3.7543e+00 +3.9602e+00 +3.8898e+00 +4.0404e+00 +3.9249e+00 +3.9263e+00 +3.7317e+00 +4.1859e+00 +4.0236e+00 +4.0457e+00 +3.8387e+00 +3.8349e+00 +4.0237e+00 +3.9723e+00 +3.8526e+00 +3.9144e+00 +4.1672e+00 +3.9686e+00 +3.9382e+00 +4.1386e+00 +3.9389e+00 +3.5482e+00 +3.8062e+00 +4.0990e+00 +3.8644e+00 +3.9616e+00 +3.9203e+00 +4.1111e+00 +4.1820e+00 +3.9930e+00 +3.9781e+00 +3.9452e+00 +3.9451e+00 +3.8276e+00 +3.7955e+00 +3.9200e+00 +3.7472e+00 +3.9380e+00 +3.8219e+00 +3.9247e+00 +4.0709e+00 +3.9069e+00 +3.7589e+00 +3.8452e+00 +3.9186e+00 +3.8944e+00 +3.8882e+00 +3.9656e+00 +4.2074e+00 +3.9373e+00 +3.9133e+00 +3.9913e+00 +3.8063e+00 +3.9657e+00 +3.5698e+00 +3.7794e+00 +3.8103e+00 +3.8675e+00 +3.8899e+00 +3.7330e+00 +3.9954e+00 +3.9350e+00 +4.0419e+00 +4.1429e+00 +4.1841e+00 +3.8131e+00 +3.9017e+00 +4.0238e+00 +3.9972e+00 +3.7397e+00 +3.9003e+00 +3.8582e+00 +4.0137e+00 +3.8272e+00 +3.7465e+00 +3.8436e+00 +3.8219e+00 +3.9275e+00 +3.8100e+00 +3.5549e+00 +3.8426e+00 +3.7018e+00 +3.8570e+00 +3.8406e+00 +3.7927e+00 
+3.7754e+00 +4.0281e+00 +3.7557e+00 +4.1015e+00 +3.8642e+00 +3.9523e+00 +4.0444e+00 +3.9575e+00 +4.0592e+00 +3.8999e+00 +3.8239e+00 +3.7614e+00 +4.0735e+00 +3.8866e+00 +3.7762e+00 +3.6904e+00 +3.9410e+00 +3.8967e+00 +3.7227e+00 +4.0712e+00 +3.9357e+00 +3.9193e+00 +4.1295e+00 +3.8952e+00 +3.9332e+00 +4.1021e+00 +3.5885e+00 +3.9301e+00 +3.8073e+00 +4.0580e+00 +4.0783e+00 +3.8039e+00 +3.8792e+00 +3.8739e+00 +3.5995e+00 +4.0319e+00 +3.9595e+00 +4.1487e+00 +3.7158e+00 +3.9258e+00 +4.0761e+00 +3.8467e+00 +3.8952e+00 +4.3390e+00 +3.7935e+00 +3.9704e+00 +3.7089e+00 +3.9947e+00 +3.8771e+00 +4.0038e+00 +3.9838e+00 +3.8746e+00 +3.6826e+00 +3.7311e+00 +3.7519e+00 +3.6797e+00 +3.8574e+00 +3.8958e+00 +3.9025e+00 +3.8004e+00 +3.8487e+00 +3.9424e+00 +3.6065e+00 +3.7279e+00 +3.9246e+00 +4.0030e+00 +3.7908e+00 +3.5713e+00 +4.0610e+00 +3.8411e+00 +3.8787e+00 +3.7512e+00 +4.1149e+00 +3.7423e+00 +4.0545e+00 +4.0062e+00 +3.9991e+00 +3.5902e+00 +3.8029e+00 +4.0812e+00 +4.0481e+00 +3.9636e+00 +3.8468e+00 +3.9086e+00 +4.1533e+00 +3.6800e+00 +3.8914e+00 +3.7964e+00 +3.7511e+00 +4.0298e+00 +3.7505e+00 +3.7053e+00 +3.8009e+00 +3.8100e+00 +4.0219e+00 +3.9316e+00 +3.8034e+00 +3.7015e+00 +4.0490e+00 +3.8749e+00 +3.7108e+00 +3.8981e+00 +3.9110e+00 +3.9875e+00 +4.1259e+00 +3.8075e+00 +3.9564e+00 +3.9433e+00 +3.8173e+00 +3.8514e+00 +3.7572e+00 +4.0160e+00 +3.8194e+00 +3.8383e+00 +3.7262e+00 +3.9837e+00 +3.8499e+00 +3.9812e+00 +3.6123e+00 +3.8551e+00 +4.0803e+00 +3.9819e+00 +3.8575e+00 +3.8495e+00 +4.0055e+00 +3.8545e+00 +4.1123e+00 +3.8127e+00 +3.9344e+00 +4.0002e+00 +3.9354e+00 +3.8912e+00 +3.6398e+00 +3.6618e+00 +3.8971e+00 +3.9539e+00 +3.8527e+00 +3.7120e+00 +3.9243e+00 +3.7665e+00 +4.1406e+00 +3.8816e+00 +3.5754e+00 +4.0210e+00 +3.7422e+00 +4.0474e+00 +3.8088e+00 +3.8050e+00 +3.7029e+00 +3.7456e+00 +3.9162e+00 +4.0459e+00 +3.7554e+00 +3.7623e+00 +3.8089e+00 +3.9461e+00 +3.8830e+00 +3.9028e+00 +4.0445e+00 +3.8504e+00 +3.7139e+00 +3.8751e+00 +4.2640e+00 +4.0574e+00 +3.9711e+00 +3.9678e+00 
+4.0768e+00 +3.9372e+00 +4.0507e+00 +4.0142e+00 +4.0132e+00 +3.6534e+00 +4.0284e+00 +3.9456e+00 +3.5860e+00 +3.8582e+00 +3.8421e+00 +3.5788e+00 +3.9140e+00 +3.8235e+00 +3.8111e+00 +4.1812e+00 +3.6924e+00 +3.6744e+00 +4.0110e+00 +3.9122e+00 +3.8987e+00 +3.8180e+00 +3.8074e+00 +3.8866e+00 +3.7069e+00 +4.1272e+00 +3.9708e+00 +3.6108e+00 +3.5848e+00 +3.5522e+00 +3.7357e+00 +3.9137e+00 +3.9543e+00 +3.8146e+00 +3.7544e+00 +3.8679e+00 +3.8952e+00 +4.0198e+00 +4.0627e+00 +4.0598e+00 +4.0232e+00 +3.7901e+00 +3.8921e+00 +3.9749e+00 +3.8050e+00 +3.9160e+00 +3.7548e+00 +3.8555e+00 +3.8166e+00 +3.8743e+00 +4.1411e+00 +3.8627e+00 +3.6596e+00 +3.6686e+00 +3.8881e+00 +3.8176e+00 +3.8095e+00 +3.9272e+00 +3.6926e+00 +3.8317e+00 +3.8744e+00 +3.7723e+00 +4.0869e+00 +3.8232e+00 +3.6857e+00 +3.8139e+00 +3.8957e+00 +3.7163e+00 +4.1781e+00 +3.9609e+00 +4.0165e+00 +3.7307e+00 +3.7151e+00 +3.8773e+00 +3.7896e+00 +3.8356e+00 +4.1436e+00 +3.7015e+00 +3.7803e+00 +3.7592e+00 +3.5953e+00 +3.8963e+00 +3.8661e+00 +3.6298e+00 +3.6394e+00 +3.9672e+00 +3.9876e+00 +3.9585e+00 +3.9068e+00 +3.9304e+00 +4.0076e+00 +3.8009e+00 +4.0772e+00 +3.8325e+00 +3.7051e+00 +4.0349e+00 +3.9632e+00 +3.9312e+00 +3.8835e+00 +3.9980e+00 +3.8910e+00 +3.9290e+00 +3.9447e+00 +3.7328e+00 +3.7253e+00 +3.7356e+00 +3.7976e+00 +3.8281e+00 +3.5754e+00 +3.6489e+00 +3.6918e+00 +3.9818e+00 +3.6226e+00 +4.2102e+00 +3.9258e+00 +4.0622e+00 +3.8461e+00 +3.8677e+00 +3.9481e+00 +4.0748e+00 +4.2925e+00 +3.9743e+00 +3.7496e+00 +3.9242e+00 +3.7878e+00 +3.9384e+00 +3.5425e+00 +3.9354e+00 +3.7675e+00 +3.9403e+00 +3.5551e+00 +3.6300e+00 +3.9455e+00 +3.6457e+00 +3.8709e+00 +3.7596e+00 +3.7867e+00 +3.9022e+00 +3.6495e+00 +3.9596e+00 +3.6315e+00 +3.6155e+00 +3.8189e+00 +3.8882e+00 +3.8348e+00 +3.6160e+00 +3.7082e+00 +3.9127e+00 +3.7097e+00 +3.8929e+00 +3.9537e+00 +3.8423e+00 +3.8145e+00 +3.6043e+00 +4.0197e+00 +3.8193e+00 +3.8474e+00 +3.9288e+00 +4.0865e+00 +3.6897e+00 +3.9616e+00 +3.6801e+00 +3.6325e+00 +3.5977e+00 +3.5945e+00 +3.9175e+00 
+3.6556e+00 +3.6748e+00 +3.8851e+00 +3.5879e+00 +3.9325e+00 +3.5862e+00 +3.9572e+00 +4.0049e+00 +3.9156e+00 +3.8007e+00 +3.8807e+00 +3.6794e+00 +4.0352e+00 +3.6848e+00 +3.8667e+00 +3.8689e+00 +4.0368e+00 +3.8746e+00 +4.0169e+00 +3.9941e+00 +4.0686e+00 +3.9781e+00 +3.8119e+00 +3.6891e+00 +3.8633e+00 +4.1974e+00 +3.7979e+00 +3.5859e+00 +3.9699e+00 +3.8603e+00 +3.8219e+00 +3.7837e+00 +3.8419e+00 +3.9259e+00 +3.9431e+00 +3.8691e+00 +3.7373e+00 +3.8979e+00 +3.9501e+00 +3.8413e+00 +4.1093e+00 +3.7338e+00 +4.1261e+00 +3.9670e+00 +3.7749e+00 +3.9935e+00 +3.8084e+00 +3.8766e+00 +4.0016e+00 +3.6468e+00 +3.7975e+00 +4.3767e+00 +3.8875e+00 +3.9819e+00 +3.8563e+00 +3.7666e+00 +3.6696e+00 +3.9680e+00 +3.6918e+00 +3.7225e+00 +3.6119e+00 +3.8380e+00 +3.6378e+00 +3.8320e+00 +3.5257e+00 +3.9778e+00 +3.7677e+00 +3.6910e+00 +3.8753e+00 +3.8923e+00 +3.8155e+00 +3.8078e+00 +3.9041e+00 +3.9915e+00 +3.7675e+00 +3.6306e+00 +3.9551e+00 +3.6738e+00 +3.7302e+00 +3.5917e+00 +3.6792e+00 +3.9345e+00 +3.9334e+00 +3.9132e+00 +3.8668e+00 +3.7356e+00 +3.6730e+00 +3.9681e+00 +3.7271e+00 +3.9026e+00 +3.8718e+00 +3.6370e+00 +3.7748e+00 +3.7376e+00 +3.7487e+00 +3.8178e+00 +3.8667e+00 +3.6588e+00 +3.6179e+00 +3.7702e+00 +3.5909e+00 +4.0863e+00 +3.7504e+00 +4.0612e+00 +3.5103e+00 +3.8976e+00 +3.8723e+00 +3.9782e+00 +3.7315e+00 +3.8618e+00 +3.8935e+00 +3.6432e+00 +3.6950e+00 +3.6715e+00 +3.7463e+00 +3.7854e+00 +3.5527e+00 +3.8836e+00 +3.7940e+00 +3.7184e+00 +3.7155e+00 +3.5446e+00 +3.6722e+00 +3.8272e+00 +3.8738e+00 +4.0006e+00 +3.8707e+00 +3.9286e+00 +3.8115e+00 +3.9551e+00 +3.7264e+00 +3.8979e+00 +3.7534e+00 +3.9695e+00 +3.6208e+00 +3.8166e+00 +3.8776e+00 +3.7968e+00 +3.9064e+00 +3.9560e+00 +3.9981e+00 +3.8053e+00 +3.7804e+00 +4.1002e+00 +3.8159e+00 +3.4924e+00 +3.6831e+00 +3.9967e+00 +3.8602e+00 +3.7543e+00 +3.7503e+00 +3.6601e+00 +3.8185e+00 +3.8763e+00 +4.0810e+00 +3.8419e+00 +3.7038e+00 +3.8714e+00 +4.0242e+00 +3.5979e+00 +4.0314e+00 +3.7768e+00 +3.7471e+00 +3.6464e+00 +3.9180e+00 +3.8973e+00 
+3.8087e+00 +3.8973e+00 +3.8175e+00 +3.8645e+00 +3.5433e+00 +3.6216e+00 +3.6471e+00 +3.5826e+00 +3.8906e+00 +3.9742e+00 +3.5280e+00 +3.6921e+00 +3.7666e+00 +3.5744e+00 +3.7846e+00 +3.7579e+00 +3.6300e+00 +3.8173e+00 +3.7385e+00 +3.8969e+00 +3.6610e+00 +3.7564e+00 +3.7784e+00 +3.6424e+00 +3.7688e+00 +3.7148e+00 +4.0385e+00 +3.8617e+00 +3.5421e+00 +3.6682e+00 +3.5863e+00 +3.8027e+00 +3.9081e+00 +3.6447e+00 +3.5816e+00 +3.7138e+00 +3.9020e+00 +3.7917e+00 +3.6166e+00 +3.6461e+00 +3.7845e+00 +3.7823e+00 +3.7689e+00 +3.3800e+00 +3.9641e+00 +3.8646e+00 +3.6122e+00 +3.8184e+00 +3.8117e+00 +4.0309e+00 +3.7310e+00 +3.9019e+00 +3.7288e+00 +4.0360e+00 +3.7173e+00 +3.7649e+00 +3.6728e+00 +4.0173e+00 +3.6526e+00 +3.7573e+00 +3.8614e+00 +3.6254e+00 +3.7264e+00 +3.8100e+00 +3.6935e+00 +3.9548e+00 +3.7384e+00 +3.8125e+00 +3.4738e+00 +3.9534e+00 +3.8951e+00 +4.0091e+00 +3.7752e+00 +3.8175e+00 +3.7484e+00 +3.8186e+00 +3.9469e+00 +3.7303e+00 +3.7744e+00 +3.8223e+00 +3.7803e+00 +3.8927e+00 +3.9215e+00 +3.4861e+00 +3.6370e+00 +3.9590e+00 +3.7915e+00 +3.9232e+00 +3.5777e+00 +3.9123e+00 +3.6276e+00 +3.8130e+00 +3.9168e+00 +3.9142e+00 +3.7956e+00 +3.6285e+00 +3.7946e+00 +3.7037e+00 +3.8572e+00 +3.6836e+00 +3.8802e+00 +3.7669e+00 +3.6567e+00 +3.6485e+00 +3.8630e+00 +3.7916e+00 +3.8306e+00 +3.5473e+00 +3.4370e+00 +3.8382e+00 +3.6453e+00 +3.8074e+00 +3.7952e+00 +3.7054e+00 +3.6550e+00 +4.0103e+00 +3.8055e+00 +3.5207e+00 +3.7462e+00 +3.8612e+00 +4.1221e+00 +3.8314e+00 +3.9647e+00 +3.9875e+00 +3.6524e+00 +3.7730e+00 +3.9908e+00 +3.7945e+00 +3.6282e+00 +3.7770e+00 +3.7447e+00 +3.6452e+00 +3.5352e+00 +3.8525e+00 +3.6919e+00 +3.8197e+00 +3.7267e+00 +3.7481e+00 +3.8067e+00 +3.8334e+00 +3.7286e+00 +3.8530e+00 +3.8498e+00 +3.8469e+00 +3.6314e+00 +3.8638e+00 +3.6532e+00 +3.6266e+00 +3.7152e+00 +3.7097e+00 +3.8024e+00 +3.8355e+00 +3.6341e+00 +3.8283e+00 +3.9258e+00 +3.6027e+00 +3.8156e+00 +3.9721e+00 +3.4990e+00 +3.8604e+00 +3.7202e+00 +3.5414e+00 +3.7134e+00 +3.7053e+00 +3.4748e+00 +3.6025e+00 
+3.7220e+00 +3.7223e+00 +3.9695e+00 +3.7943e+00 +3.8592e+00 +3.7341e+00 +3.6335e+00 +3.6124e+00 +3.9818e+00 +3.9549e+00 +3.7696e+00 +3.4805e+00 +3.7977e+00 +3.5364e+00 +3.8950e+00 +3.5328e+00 +3.9085e+00 +3.7900e+00 +3.6879e+00 +3.9018e+00 +3.5762e+00 +3.8880e+00 +3.7048e+00 +3.9708e+00 +3.6997e+00 +3.7113e+00 +3.7723e+00 +3.8089e+00 +3.7030e+00 +3.8108e+00 +3.8482e+00 +3.7818e+00 +3.7175e+00 +3.8295e+00 +3.8594e+00 +3.9351e+00 +3.8691e+00 +3.7771e+00 +3.7396e+00 +3.8981e+00 +3.9048e+00 +3.5548e+00 +3.5603e+00 +3.9845e+00 +4.0021e+00 +3.7372e+00 +3.9407e+00 +3.7676e+00 +3.7294e+00 +3.6995e+00 +3.9449e+00 +3.5643e+00 +3.6031e+00 +3.9458e+00 +3.6087e+00 +3.6555e+00 +3.7382e+00 +3.6166e+00 +3.4568e+00 +3.7365e+00 +3.6731e+00 +3.8932e+00 +3.8512e+00 +3.6099e+00 +3.6931e+00 +3.8253e+00 +3.8571e+00 +3.6511e+00 +3.7194e+00 +3.6364e+00 +3.8725e+00 +3.6630e+00 +3.7737e+00 +3.8066e+00 +3.8633e+00 +3.7687e+00 +3.8235e+00 +3.6660e+00 +3.7174e+00 +3.5950e+00 +3.6361e+00 +3.6186e+00 +3.7470e+00 +3.9090e+00 +3.6449e+00 +3.8814e+00 +3.6477e+00 +3.9590e+00 +3.7090e+00 +3.5851e+00 +3.7684e+00 +3.7405e+00 +3.4736e+00 +3.4360e+00 +3.7041e+00 +3.6570e+00 +3.7248e+00 +3.8561e+00 +3.8193e+00 +3.8137e+00 +3.9044e+00 +3.7603e+00 +3.7553e+00 +3.4965e+00 +3.7659e+00 +3.8147e+00 +3.7624e+00 +3.3844e+00 +3.5955e+00 +3.6247e+00 +3.5614e+00 +3.6773e+00 +3.7500e+00 +3.6277e+00 +3.8290e+00 +3.6613e+00 +3.6337e+00 +3.8565e+00 +3.8809e+00 +3.6849e+00 +3.8450e+00 +3.6064e+00 +3.7413e+00 +3.6343e+00 +3.8005e+00 +3.5828e+00 +3.5406e+00 +3.6768e+00 +3.8691e+00 +3.6364e+00 +3.8573e+00 +4.1025e+00 +3.5957e+00 +3.8642e+00 +3.5456e+00 +3.6112e+00 +3.8939e+00 +3.6832e+00 +3.8979e+00 +3.7800e+00 +3.4934e+00 +3.9469e+00 +3.7331e+00 +3.6561e+00 +3.5984e+00 +3.9714e+00 +3.7121e+00 +3.8823e+00 +3.6108e+00 +3.8572e+00 +3.7597e+00 +3.7336e+00 +3.7961e+00 +3.8197e+00 +3.6569e+00 +3.7079e+00 +3.7054e+00 +3.6535e+00 +3.8142e+00 +3.8719e+00 +3.9928e+00 +3.8202e+00 +4.0257e+00 +3.4291e+00 +3.7031e+00 +3.8753e+00 
+3.7386e+00 +3.9888e+00 +3.8998e+00 +3.8106e+00 +3.8258e+00 +3.9893e+00 +3.8570e+00 +3.7972e+00 +3.8780e+00 +3.6488e+00 +3.7416e+00 +3.6915e+00 +3.6111e+00 +3.6624e+00 +3.7041e+00 +3.9100e+00 +3.6374e+00 +3.6351e+00 +3.8139e+00 +3.4905e+00 +3.6406e+00 +3.9803e+00 +3.5528e+00 +3.7716e+00 +3.8790e+00 +3.7808e+00 +3.8001e+00 +3.8370e+00 +3.7495e+00 +3.5814e+00 +3.9137e+00 +3.8101e+00 +3.7367e+00 +3.8614e+00 +3.7105e+00 +3.8749e+00 +3.7595e+00 +3.5520e+00 +3.6053e+00 +3.6746e+00 +3.8229e+00 +3.6460e+00 +3.6565e+00 +3.7202e+00 +3.7512e+00 +3.8505e+00 +3.7371e+00 +3.8592e+00 +3.4863e+00 +3.9447e+00 +3.9024e+00 +3.7081e+00 +3.6868e+00 +3.7254e+00 +3.5815e+00 +3.9821e+00 +3.6846e+00 +3.7687e+00 +3.7079e+00 +3.6884e+00 +3.6102e+00 +3.8858e+00 +3.7349e+00 +3.9371e+00 +3.6656e+00 +3.7173e+00 +3.7431e+00 +3.7509e+00 +3.8146e+00 +3.5878e+00 +3.7112e+00 +3.7838e+00 +3.6834e+00 +3.8581e+00 +3.7485e+00 +3.7749e+00 +3.6269e+00 +3.8616e+00 +3.7776e+00 +3.5467e+00 +3.7517e+00 +3.8495e+00 +3.7148e+00 +3.9840e+00 +3.6892e+00 +3.6589e+00 +3.5228e+00 +3.7454e+00 +3.6216e+00 +3.9060e+00 +3.4618e+00 +3.6250e+00 +3.5544e+00 +3.7379e+00 +3.6019e+00 +3.6970e+00 +3.6055e+00 +3.6961e+00 +3.8814e+00 +3.6186e+00 +3.6329e+00 +3.8830e+00 +3.8224e+00 +3.8586e+00 +3.3884e+00 +3.7878e+00 +3.7381e+00 +3.5799e+00 +3.6788e+00 +3.6515e+00 +3.7411e+00 +3.4858e+00 +3.8181e+00 +3.9071e+00 +3.8459e+00 +3.6137e+00 +3.9053e+00 +3.4398e+00 +3.6076e+00 +4.0780e+00 +3.6371e+00 +3.6323e+00 +3.7771e+00 +3.9926e+00 +3.8330e+00 +3.7610e+00 +3.7912e+00 +3.8624e+00 +3.8217e+00 +3.8190e+00 +3.6589e+00 +3.6685e+00 +3.9646e+00 +3.5551e+00 +3.6724e+00 +3.8493e+00 +3.7407e+00 +3.8687e+00 +3.5642e+00 +3.5736e+00 +3.5001e+00 +3.6458e+00 +3.6893e+00 +3.5692e+00 +3.6024e+00 +3.5114e+00 +3.8587e+00 +3.4850e+00 +3.6662e+00 +3.5465e+00 +3.4955e+00 +3.6897e+00 +3.5596e+00 +3.7416e+00 +3.6902e+00 +3.6728e+00 +3.7333e+00 +3.8235e+00 +3.6543e+00 +3.8134e+00 +3.7854e+00 +3.6402e+00 +3.6436e+00 +3.9897e+00 +3.5192e+00 +3.6910e+00 
+3.7744e+00 +3.4692e+00 +3.6960e+00 +3.6852e+00 +3.6995e+00 +3.5274e+00 +3.6362e+00 +3.8343e+00 +3.5620e+00 +3.7228e+00 +3.8176e+00 +3.6752e+00 +3.7763e+00 +3.6788e+00 +3.8655e+00 +3.8913e+00 +3.7500e+00 +3.5996e+00 +3.7261e+00 +3.7908e+00 +3.7571e+00 +3.8147e+00 +3.6316e+00 +3.7402e+00 +3.5545e+00 +3.5847e+00 +3.5669e+00 +3.6180e+00 +3.6273e+00 +3.5966e+00 +3.7180e+00 +3.6497e+00 +3.6267e+00 +3.7704e+00 +3.6144e+00 +3.8194e+00 +3.5586e+00 +3.7830e+00 +3.6164e+00 +3.5738e+00 +3.3891e+00 +3.5856e+00 +3.8380e+00 +3.9705e+00 +3.8130e+00 +3.6991e+00 +3.6705e+00 +3.8000e+00 +3.5031e+00 +3.5705e+00 +3.9172e+00 +3.4705e+00 +3.8159e+00 +3.6896e+00 +3.8675e+00 +3.5394e+00 +3.6310e+00 +3.7568e+00 +3.7142e+00 +3.8212e+00 +3.5225e+00 +3.8674e+00 +3.6205e+00 +3.6732e+00 +3.7268e+00 +3.9274e+00 +3.5888e+00 +3.8569e+00 +3.4381e+00 +3.6008e+00 +3.8891e+00 +3.6904e+00 +3.8416e+00 +3.7835e+00 +3.9026e+00 +3.7960e+00 +3.6263e+00 +3.7445e+00 +3.6330e+00 +3.4287e+00 +3.8555e+00 +3.8483e+00 +3.9001e+00 +3.5817e+00 +3.8575e+00 +4.1121e+00 +3.9188e+00 +3.4998e+00 +3.7674e+00 +3.5486e+00 +3.7077e+00 +3.8045e+00 +3.5562e+00 +3.4721e+00 +3.8293e+00 +3.7461e+00 +3.7377e+00 +3.6406e+00 +3.6739e+00 +3.7650e+00 +3.7387e+00 +3.6163e+00 +3.6924e+00 +3.8490e+00 +3.9459e+00 +3.6232e+00 +3.9566e+00 +3.6597e+00 +3.5820e+00 +3.8415e+00 +3.5676e+00 +3.6498e+00 +3.5248e+00 +3.6733e+00 +3.8412e+00 +3.5054e+00 +3.4469e+00 +3.5579e+00 +3.6670e+00 +3.6141e+00 +3.6763e+00 +3.9655e+00 +3.6836e+00 +3.5403e+00 +3.7726e+00 +3.6782e+00 +3.9354e+00 +3.6293e+00 +3.6457e+00 +3.7144e+00 +3.6387e+00 +3.6126e+00 +3.6181e+00 +3.6415e+00 +3.5322e+00 +3.6727e+00 +3.5769e+00 +3.6052e+00 +3.5407e+00 +3.4361e+00 +3.7760e+00 +3.7931e+00 +3.4955e+00 +3.7245e+00 +3.6297e+00 +3.7508e+00 +3.4922e+00 +3.5170e+00 +3.8087e+00 +3.6181e+00 +3.7158e+00 +3.4899e+00 +3.7122e+00 +3.7467e+00 +3.5042e+00 +3.6093e+00 +3.6921e+00 +3.6556e+00 +4.0269e+00 +3.6326e+00 +3.8377e+00 +3.7678e+00 +3.5318e+00 +3.7484e+00 +3.6551e+00 +3.4409e+00 
+3.7998e+00 +3.4402e+00 +3.7280e+00 +3.8106e+00 +3.7440e+00 +3.6653e+00 +3.4772e+00 +3.6434e+00 +3.4223e+00 +3.5875e+00 +3.7244e+00 +3.3742e+00 +3.8401e+00 +3.5467e+00 +3.6626e+00 +3.3705e+00 +3.6543e+00 +3.6337e+00 +3.6927e+00 +3.5074e+00 +3.6635e+00 +3.8092e+00 +3.8747e+00 +3.5054e+00 +3.6438e+00 +3.3772e+00 +3.8372e+00 +3.6675e+00 +3.6686e+00 +3.7293e+00 +3.6573e+00 +3.8325e+00 +3.7733e+00 +3.8494e+00 +3.4867e+00 +3.6456e+00 +3.9498e+00 +3.6600e+00 +3.5092e+00 +3.6392e+00 +3.6300e+00 +3.3424e+00 +3.7951e+00 +3.6943e+00 +3.5955e+00 +3.8393e+00 +3.4852e+00 +3.7802e+00 +3.9367e+00 +3.4812e+00 +3.3453e+00 +3.7009e+00 +3.6304e+00 +3.4581e+00 +3.7241e+00 +3.7096e+00 +3.5645e+00 +3.8085e+00 +3.5764e+00 +3.4496e+00 +3.7462e+00 +3.5906e+00 +3.5658e+00 +3.8006e+00 +3.6116e+00 +3.8003e+00 +3.4646e+00 +3.7046e+00 +3.6724e+00 +3.8201e+00 +3.6343e+00 +3.4803e+00 +3.7328e+00 +3.7237e+00 +3.5800e+00 +3.4901e+00 +3.6659e+00 +3.6344e+00 +3.9383e+00 +3.3556e+00 +3.8178e+00 +3.4538e+00 +3.7588e+00 +3.7156e+00 +3.6135e+00 +3.7208e+00 +3.4293e+00 +3.4075e+00 +3.6793e+00 +3.6617e+00 +3.6520e+00 +3.8177e+00 +3.8964e+00 +3.7610e+00 +3.8287e+00 +3.6999e+00 +3.5630e+00 +3.6506e+00 +3.7459e+00 +3.5633e+00 +3.4431e+00 +3.6163e+00 +3.6608e+00 +3.5361e+00 +3.5706e+00 +3.8900e+00 +3.7877e+00 +3.4953e+00 +3.5337e+00 +3.8667e+00 +3.4322e+00 +3.6744e+00 +3.6254e+00 +3.8271e+00 +3.6448e+00 +3.5452e+00 +3.6918e+00 +3.7387e+00 +3.5784e+00 +3.7212e+00 +3.7479e+00 +3.5662e+00 +3.5339e+00 +3.7446e+00 +3.7298e+00 +3.9255e+00 +3.5160e+00 +3.5539e+00 +3.4857e+00 +3.7979e+00 +3.6542e+00 +3.6464e+00 +3.7720e+00 +3.8460e+00 +3.7069e+00 +3.5896e+00 +3.7640e+00 +3.7178e+00 +3.8078e+00 +3.6445e+00 +3.9215e+00 +3.5395e+00 +3.6357e+00 +3.4529e+00 +3.6078e+00 +3.7709e+00 +3.7337e+00 +3.6677e+00 +3.3848e+00 +3.5457e+00 +3.5565e+00 +3.5370e+00 +3.5592e+00 +3.4673e+00 +3.4210e+00 +3.7530e+00 +3.3108e+00 +3.5183e+00 +3.4771e+00 +3.4230e+00 +3.7020e+00 +3.6624e+00 +3.6926e+00 +3.6646e+00 +3.6132e+00 +4.0036e+00 
+3.8836e+00 +3.6498e+00 +3.6676e+00 +3.6098e+00 +3.7770e+00 +3.5986e+00 +3.5383e+00 +3.3849e+00 +3.9746e+00 +3.5150e+00 +3.8229e+00 +3.6722e+00 +3.6489e+00 +3.6220e+00 +3.6655e+00 +3.7401e+00 +3.7393e+00 +3.8802e+00 +3.6067e+00 +3.6763e+00 +3.6575e+00 +3.6121e+00 +3.6981e+00 +3.4960e+00 +3.8049e+00 +3.7178e+00 +3.6596e+00 +3.7896e+00 +3.6130e+00 +3.5630e+00 +3.8056e+00 +3.8824e+00 +3.6070e+00 +3.6164e+00 +3.5387e+00 +3.7177e+00 +3.5311e+00 +3.7900e+00 +3.4770e+00 +3.8059e+00 +3.5162e+00 +3.6962e+00 +3.9981e+00 +3.6662e+00 +3.7453e+00 +3.7125e+00 +3.5906e+00 +3.5748e+00 +3.5491e+00 +3.6079e+00 +3.8001e+00 +3.5493e+00 +3.4956e+00 +3.4672e+00 +3.5272e+00 +3.6869e+00 +3.6572e+00 +3.6223e+00 +3.6815e+00 +3.7448e+00 +3.6960e+00 +3.7787e+00 +3.7937e+00 +3.3240e+00 +3.7361e+00 +3.4792e+00 +3.2931e+00 +3.4123e+00 +3.7779e+00 +3.4242e+00 +3.8064e+00 +3.7091e+00 +3.4095e+00 +3.4820e+00 +3.3700e+00 +3.6866e+00 +3.5723e+00 +3.4491e+00 +3.6898e+00 +3.8114e+00 +3.4092e+00 +3.6762e+00 +3.5543e+00 +3.8470e+00 +3.7681e+00 +3.5680e+00 +3.5173e+00 +3.6421e+00 +3.5477e+00 +3.7776e+00 +3.4714e+00 +3.6589e+00 +3.8880e+00 +3.6981e+00 +3.5690e+00 +3.8325e+00 +3.7465e+00 +3.5332e+00 +3.6348e+00 +3.6250e+00 +3.5239e+00 +3.5245e+00 +3.6104e+00 +3.4142e+00 +3.4457e+00 +3.7024e+00 +3.5789e+00 +3.4502e+00 +3.5298e+00 +3.4266e+00 +3.5783e+00 +3.5589e+00 +3.6564e+00 +3.8458e+00 +3.4827e+00 +3.3803e+00 +3.5311e+00 +3.4945e+00 +3.4641e+00 +3.5416e+00 +3.6744e+00 +3.6286e+00 +3.5139e+00 +3.7379e+00 +3.6962e+00 +3.6442e+00 +3.6086e+00 +3.6354e+00 +3.7378e+00 +3.6749e+00 +3.3761e+00 +3.4631e+00 +3.6285e+00 +3.6669e+00 +3.8400e+00 +3.7243e+00 +3.4651e+00 +3.3953e+00 +3.7042e+00 +3.4501e+00 +3.4488e+00 +3.7019e+00 +3.7525e+00 +3.6220e+00 +3.7931e+00 +3.4648e+00 +3.5861e+00 +3.5432e+00 +3.7708e+00 +3.4895e+00 +3.6778e+00 +3.6617e+00 +3.7313e+00 +3.5787e+00 +3.4904e+00 +3.3048e+00 +3.7910e+00 +3.4893e+00 +3.7321e+00 +3.7909e+00 +3.6479e+00 +3.7296e+00 +3.8059e+00 +3.4220e+00 +3.8284e+00 +3.7946e+00 
+3.6609e+00 +3.5192e+00 +3.5481e+00 +3.5832e+00 +3.5050e+00 +3.4732e+00 +3.7199e+00 +3.5727e+00 +3.7167e+00 +3.4661e+00 +3.7182e+00 +3.4791e+00 +3.5027e+00 +3.7240e+00 +3.5447e+00 +3.5345e+00 +3.6703e+00 +3.4159e+00 +3.7020e+00 +3.3808e+00 +3.7698e+00 +3.5637e+00 +3.7874e+00 +3.6175e+00 +3.3683e+00 +3.5108e+00 +3.5977e+00 +3.5020e+00 +3.6571e+00 +3.5615e+00 +3.5057e+00 +3.6059e+00 +3.5374e+00 +3.6524e+00 +3.5558e+00 +3.5550e+00 +3.3979e+00 +3.5036e+00 +3.6631e+00 +3.6150e+00 +3.8461e+00 +3.4026e+00 +3.6278e+00 +3.8024e+00 +3.5999e+00 +3.6532e+00 +3.5941e+00 +3.5263e+00 +3.7573e+00 +3.3619e+00 +3.7530e+00 +3.7058e+00 +3.3212e+00 +3.3973e+00 +3.4879e+00 +3.4893e+00 +3.6312e+00 +3.7044e+00 +3.5064e+00 +3.4949e+00 +3.4872e+00 +3.3989e+00 +3.4513e+00 +3.5766e+00 +3.6018e+00 +3.7030e+00 +3.4095e+00 +3.5198e+00 +3.4075e+00 +3.6168e+00 +3.6572e+00 +3.8006e+00 +3.6535e+00 +3.5580e+00 +3.5259e+00 +3.7709e+00 +3.5028e+00 +3.5113e+00 +3.5558e+00 +3.6653e+00 +3.4553e+00 +3.5410e+00 +3.7459e+00 +3.8250e+00 +3.6060e+00 +3.6695e+00 +3.6213e+00 +3.9090e+00 +3.5822e+00 +3.8731e+00 +3.4542e+00 +3.5125e+00 +3.4823e+00 +3.6424e+00 +3.5138e+00 +3.5960e+00 +3.6486e+00 +3.5764e+00 +3.7466e+00 +3.2682e+00 +3.4842e+00 +3.4702e+00 +3.6204e+00 +3.7204e+00 +3.8085e+00 +3.6149e+00 +3.4357e+00 +3.6129e+00 +3.6046e+00 +3.5563e+00 +3.7274e+00 +3.6784e+00 +3.4715e+00 +3.4713e+00 +3.4706e+00 +3.4388e+00 +3.5775e+00 +3.7246e+00 +3.3820e+00 +3.7270e+00 +3.6358e+00 +3.5716e+00 +3.6736e+00 +3.8898e+00 +3.5931e+00 +3.6010e+00 +3.5854e+00 +3.6548e+00 +3.6664e+00 +3.4989e+00 +3.3197e+00 +3.7341e+00 +3.7517e+00 +3.5124e+00 +3.9396e+00 +3.4879e+00 +3.6444e+00 +3.3542e+00 +3.4836e+00 +3.6806e+00 +3.5720e+00 +3.7403e+00 +3.4042e+00 +3.6715e+00 +3.6235e+00 +3.7761e+00 +3.3825e+00 +3.6022e+00 +3.7461e+00 +3.4633e+00 +3.5022e+00 +3.4326e+00 +3.7812e+00 +3.3502e+00 +3.9817e+00 +3.6796e+00 +3.3984e+00 +3.6871e+00 +3.7625e+00 +3.4434e+00 +3.6553e+00 +3.4363e+00 +3.4724e+00 +3.5564e+00 +3.8044e+00 +3.4724e+00 
+[... roughly 1,000 additional per-iteration training loss values (all between about 3.07 and 4.03) omitted ...]
diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_eval_8p.sh" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_eval_8p.sh" new file mode 100644 index 0000000000000000000000000000000000000000..182d4e9ab5d3ef6cdab81132ca7eea83381a2a43 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_eval_8p.sh" @@ -0,0 +1,128 @@ +#!/usr/bin/env bash + +##################README################## +# Note: update the (required) items and make sure the configuration is correct +# The final output step need not follow this example verbatim; it only has to produce the same directory structure and outputs as the sample run + +##################Basic configuration parameters; review and modify per model################## +# Required fields (parameters that must be defined here): Network batch_size RANK_SIZE +# Network name, same as the directory name (required) +Network="ResNet101_ID1595_for_PyTorch" +# Number of training epochs +train_epochs=110 +# Training batch_size (required) +batch_size=2048 +# Training steps +#train_steps=`expr 1281167 / ${batch_size}` +# Learning rate +learning_rate=0.4 +# Number of NPUs used for training +export RANK_SIZE=8 +# Dataset path; keep empty, no modification needed +data_path="" +# Path of the checkpoint file generated by the previous training run +resume=checkpoint.pth.tar + +# Parameter validation: data_path is required; other parameters may be added or removed per model, and any new parameter must be defined and assigned above +for para in $* +do + if [[ $para == --world_size* ]];then + world_size=`echo ${para#*=}` + elif [[ $para == --data_path* ]];then + data_path=`echo ${para#*=}` + fi +done + +# Check that data_path was passed in; no modification needed +if [[ $data_path == "" ]];then + echo "[Error] para \"data_path\" must be configured" + exit 1 +fi + + +##################Set the training-script execution path################## +# cd to the directory at the same level as test before running the script, for better compatibility; test_path_dir is the path that contains the test folder +cur_path=`pwd` +cur_path_last_dirname=${cur_path##*/} +if [ x"${cur_path_last_dirname}" == x"test" ]; then + test_path_dir=${cur_path} + cd .. + cur_path=`pwd` +else + test_path_dir=${cur_path}/test +fi + + +##################Create the log output directory; review per model################## +# This model launches multi-card training without a loop, so the log output directory is created as below; models that launch multi-card training in a loop should create the log directory inside the loop (see the CRNN model for reference) +# In the non-loop mode, the ASCEND_DEVICE_ID in the 8-card training log path defaults to 0; it is only a folder name and does not affect the training itself +ASCEND_DEVICE_ID=0 +if [ -d ${test_path_dir}/output/$ASCEND_DEVICE_ID ];then + rm -rf ${test_path_dir}/output/$ASCEND_DEVICE_ID + mkdir -p ${test_path_dir}/output/$ASCEND_DEVICE_ID +else + mkdir -p ${test_path_dir}/output/$ASCEND_DEVICE_ID +fi + +##################Launch the training script################## +# Training start time; no modification needed +start_time=$(date +%s) +# Source the environment variables +source ${test_path_dir}/env.sh +# Launch script (required) +python3.7 ./main.py \ + ${data_path} \ + -a resnet101 \ + --evaluate \ + --resume ${resume} \ + --addr=$(hostname -I |awk '{print $1}') \ + --seed=49 \ + --workers=160 \ + --learning-rate=${learning_rate} \ + --mom=0.9 \ + --weight-decay=1.0e-04 \ + --print-freq=1 \ + --dist-url='tcp://127.0.0.1:50000' \ + --multiprocessing-distributed \ + --world-size=1 \ + --rank=0 \ + --device='npu' \ + --dist-backend='hccl' \ + --epochs=${train_epochs} \ + --batch-size=${batch_size} \ + --amp \ + --loss-scale=1024 > ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log 2>&1 & + +wait + +##################Collect the training results##################(required) +# Training end time; no modification needed +end_time=$(date +%s) +e2e_time=$(( $end_time - $start_time )) + +# Print results; no modification needed +echo "------------------ Final result ------------------" + +# Extract the training accuracy; review and modify per model +train_accuracy=`grep -a '* Acc@1' ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log|awk 'END {print}'|awk -F "Acc@1" '{print $NF}'|awk -F " " '{print $1}'` +# Print; no modification needed +echo "Final Train Accuracy : ${train_accuracy}" +echo "E2E Training Duration sec : $e2e_time" + +# Training case info; no modification needed +BatchSize=${batch_size} +DeviceType=`uname -m` +CaseName=${Network}_bs${BatchSize}_${RANK_SIZE}'p'_'eval' + + +# Loss of the last iteration; no modification needed +ActualLoss=`grep Test ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_${ASCEND_DEVICE_ID}.log |awk -F "Loss" '{print $NF}' | awk -F " " '{print $1}' | awk 'END {print}'` +# Write the key info into ${CaseName}.log; no modification needed +echo "Network = ${Network}" > ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "RankSize = ${RANK_SIZE}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "BatchSize = ${BatchSize}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "DeviceType = ${DeviceType}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "CaseName = ${CaseName}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "TrainAccuracy = ${train_accuracy}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "ActualLoss = ${ActualLoss}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "E2ETrainingTime = ${e2e_time}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_full_1p.sh" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_full_1p.sh" new file mode 100644 index 0000000000000000000000000000000000000000..da5a23696328ff5a1633cdfe5d6598d7622751ff --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_full_1p.sh" @@ -0,0 +1,173 @@ +#!/usr/bin/env bash + +##################README################## +# Note: update the (required) items and make sure the configuration is correct +# The final output step need not follow this example verbatim; it only has to produce the same directory structure and outputs as the sample run + +##################Basic configuration parameters; review and modify per model################## +# Required fields (parameters that must be defined here): Network batch_size RANK_SIZE +# Network name, same as the directory name (required) +Network="ResNet101_ID1595_for_PyTorch" +# Number of training epochs +train_epochs=110 +# Training batch_size (required) +batch_size=256 +# Training steps +#train_steps=`expr 1281167 / ${batch_size}` +# Learning rate +learning_rate=0.1 +# Number of NPUs used for training +export RANK_SIZE=1 +# Dataset path; keep empty, no modification needed +data_path="" + +# Maintenance/debug parameters; precision_mode should be reviewed and modified per model +#precision_mode="allow_mix_precision" +# Keep the following parameters as-is; no modification needed +over_dump=False +data_dump_flag=False
+data_dump_step="10" +profiling=False + +# Help message; no modification needed +if [[ $1 == --help || $1 == -h ]];then + echo "usage: ./train_full_1p.sh " + echo " " + echo "parameter explain: + --precision_mode precision mode(allow_fp32_to_fp16/force_fp16/must_keep_origin_dtype/allow_mix_precision) + --over_dump whether to enable overflow detection, default is False + --data_dump_flag data dump flag, default is False + --data_dump_step data dump step, default is 10 + --profiling whether to enable profiling for performance debugging, default is False + --data_path source data path for training + -h/--help show help message + " + exit 1 +fi + +# Parameter validation; no modification needed +for para in $* +do + if [[ $para == --device_id* ]];then + device_id=`echo ${para#*=}` + elif [[ $para == --data_path* ]];then + data_path=`echo ${para#*=}` + fi +done + +# Check that data_path was passed in; no modification needed +if [[ $data_path == "" ]];then + echo "[Error] para \"data_path\" must be configured" + exit 1 +fi + +# Check whether a device id was specified for single-card training (dynamically assigned vs. manually specified); no modification needed +if [ $ASCEND_DEVICE_ID ];then + echo "device id is ${ASCEND_DEVICE_ID}" +elif [ ${device_id} ]; then + export ASCEND_DEVICE_ID=${device_id} + echo "device id is ${ASCEND_DEVICE_ID}" +else + echo "[Error] device id must be configured" + exit 1 +fi + + +##################Set the training-script execution path################## +# cd to the directory at the same level as test before running the script, for better compatibility; test_path_dir is the path that contains the test folder +cur_path=`pwd` +cur_path_last_dirname=${cur_path##*/} +if [ x"${cur_path_last_dirname}" == x"test" ]; then + test_path_dir=${cur_path} + cd .. + cur_path=`pwd` +else + test_path_dir=${cur_path}/test +fi + + +##################Create the log output directory; no modification needed################## +if [ -d ${test_path_dir}/output/$ASCEND_DEVICE_ID ];then + rm -rf ${test_path_dir}/output/$ASCEND_DEVICE_ID + mkdir -p ${test_path_dir}/output/$ASCEND_DEVICE_ID +else + mkdir -p ${test_path_dir}/output/$ASCEND_DEVICE_ID +fi + +##################Launch the training script################## +# Training start time; no modification needed +start_time=$(date +%s) +# Source the environment variables +source ${test_path_dir}/env.sh +# Launch script (required) +python3.7 ./main.py \ + ${data_path} \ + -a resnet101 \ + --addr=$(hostname -I |awk '{print $1}') \ + --seed=49 \ + --workers=128 \ + --learning-rate=${learning_rate} \ + --mom=0.9 \ + --weight-decay=1.0e-04 \ + --print-freq=1 \ + --device='npu' \ + --gpu=${ASCEND_DEVICE_ID} \ + --dist-backend='hccl' \ + --epochs=${train_epochs} \ + --amp \ + --FusedSGD \ + --batch-size=${batch_size} > ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log 2>&1 & + +wait + +##################Collect the training results##################(required) +# Training end time; no modification needed +end_time=$(date +%s) +e2e_time=$(( $end_time - $start_time )) + +# Print results to the terminal; no modification needed +echo "------------------ Final result ------------------" +# Extract the performance FPS; review and modify per model +FPS=`grep -a 'FPS' ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log|awk -F " " '{print $NF}'|awk 'END {print}'` +# Print; no modification needed +echo "Final Performance images/sec : $FPS" + +# Extract the training accuracy; review and modify per model +train_accuracy=`grep -a '* Acc@1' ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log|awk 'END {print}'|awk -F "Acc@1" '{print $NF}'|awk -F " " '{print $1}'` +# Print; no modification needed +echo "Final Train Accuracy : ${train_accuracy}" +echo "E2E Training Duration sec : $e2e_time" + +# Performance monitoring summary +# Training case info; no modification needed +BatchSize=${batch_size} +DeviceType=`uname -m` +CaseName=${Network}_bs${BatchSize}_${RANK_SIZE}'p'_'acc' + +# Collect performance data; no modification needed +# Throughput +ActualFPS=${FPS} +# Time per training iteration +TrainingTime=`awk 'BEGIN{printf "%.2f\n", '${batch_size}'*1000/'${FPS}'}'` + +# Extract Loss from train_$ASCEND_DEVICE_ID.log into train_${CaseName}_loss.txt; review per model +grep Epoch: ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_$ASCEND_DEVICE_ID.log|grep -v Test|awk -F "Loss" '{print $NF}' | awk -F " " '{print $1}' >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_${CaseName}_loss.txt + +# Loss of the last iteration; no modification needed +ActualLoss=`awk 'END {print}' ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_${CaseName}_loss.txt` + + +##################Save the training results to file################## +# Write the key info into ${CaseName}.log; no modification needed +echo "Network = ${Network}" > ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "RankSize = ${RANK_SIZE}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "BatchSize = ${BatchSize}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "DeviceType = ${DeviceType}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "CaseName = ${CaseName}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "ActualFPS = ${ActualFPS}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "TrainingTime = ${TrainingTime}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "TrainAccuracy = ${train_accuracy}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "ActualLoss = ${ActualLoss}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "E2ETrainingTime = ${e2e_time}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log \ No newline at end of file diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_full_8p.sh" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_full_8p.sh" new file mode 100644 index 0000000000000000000000000000000000000000..471ac69e1dab173b8a2947ea35cad8fbef31269f --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_full_8p.sh" @@ -0,0 +1,165 @@ +#!/usr/bin/env bash + 
+##################README################## +# Note: update the (required) items and make sure the configuration is correct +# The final output step need not follow this example verbatim; it only has to produce the same directory structure and outputs as the sample run + +##################Basic configuration parameters; review and modify per model################## +# Required fields (parameters that must be defined here): Network batch_size RANK_SIZE +# Network name, same as the directory name (required) +Network="ResNet101_ID1595_for_PyTorch" +# Number of training epochs +train_epochs=110 +# Training batch_size (required) +batch_size=2048 +# Training steps +#train_steps=`expr 1281167 / ${batch_size}` +# Learning rate +learning_rate=0.5 +# Number of NPUs used for training +export RANK_SIZE=8 +# Dataset path; keep empty, no modification needed +data_path="" + +# Maintenance/debug parameters; precision_mode should be reviewed and modified per model +#precision_mode="allow_mix_precision" +# Keep the following parameters as-is; no modification needed +over_dump=False +data_dump_flag=False +data_dump_step="10" +profiling=False + +# Help message; no modification needed +if [[ $1 == --help || $1 == -h ]];then + echo "usage: ./train_full_8p.sh " + echo " " + echo "parameter explain: + --precision_mode precision mode(allow_fp32_to_fp16/force_fp16/must_keep_origin_dtype/allow_mix_precision) + --over_dump whether to enable overflow detection, default is False + --data_dump_flag data dump flag, default is False + --data_dump_step data dump step, default is 10 + --profiling whether to enable profiling for performance debugging, default is False + --data_path source data path for training + -h/--help show help message + " + exit 1 +fi + +# Parameter validation; no modification needed +for para in $* +do + if [[ $para == --workers* ]];then + workers=`echo ${para#*=}` + elif [[ $para == --data_path* ]];then + data_path=`echo ${para#*=}` + fi +done + +# Check that data_path was passed in; no modification needed +if [[ $data_path == "" ]];then + echo "[Error] para \"data_path\" must be configured" + exit 1 +fi + +##################Set the training-script execution path################## +# cd to the directory at the same level as test before running the script, for better compatibility; test_path_dir is the path that contains the test folder +cur_path=`pwd` +cur_path_last_dirname=${cur_path##*/} +if [ x"${cur_path_last_dirname}" == x"test" ]; then + test_path_dir=${cur_path} + cd .. + cur_path=`pwd` +else + test_path_dir=${cur_path}/test +fi + + +##################Create the log output directory; review per model################## +# This model launches multi-card training without a loop, so the log output directory is created as below; models that launch multi-card training in a loop should create the log directory inside the loop (see the CRNN model for reference) +# In the non-loop mode, the ASCEND_DEVICE_ID in the 8-card training log path defaults to 0; it is only a folder name and does not affect the training itself +ASCEND_DEVICE_ID=0 +if [ -d ${test_path_dir}/output/$ASCEND_DEVICE_ID ];then + rm -rf ${test_path_dir}/output/$ASCEND_DEVICE_ID + mkdir -p ${test_path_dir}/output/$ASCEND_DEVICE_ID +else + mkdir -p ${test_path_dir}/output/$ASCEND_DEVICE_ID +fi + +##################Launch the training script################## +# Training start time; no modification needed +start_time=$(date +%s) +# Source the environment variables +source ${test_path_dir}/env.sh +# Launch script (required) +python3.7 ./main.py \ + ${data_path} \ + -a resnet101 \ + --addr=$(hostname -I |awk '{print $1}') \ + --seed=49 \ + --workers=128 \ + --learning-rate=${learning_rate} \ + --mom=0.9 \ + --weight-decay=1.0e-04 \ + --print-freq=1 \ + --dist-url='tcp://127.0.0.1:50000' \ + --multiprocessing-distributed \ + --world-size=1 \ + --rank=0 \ + --device='npu' \ + --dist-backend='hccl' \ + --epochs=${train_epochs} \ + --batch-size=${batch_size} \ + --amp \ + --device_list=0,1,2,3,4,5,6,7 \ + --FusedSGD \ + --loss-scale=1024 > ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log 2>&1 & + +wait + +##################Collect the training results##################(required) +# Training end time; no modification needed +end_time=$(date +%s) +e2e_time=$(( $end_time - $start_time )) + +# Print results; no modification needed +echo "------------------ Final result ------------------" +# Extract the performance FPS; review and modify per model +FPS=`grep -a 'FPS' ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log|awk -F " " '{print $NF}'|awk 'END {print}'` +# Print; no modification needed +echo "Final Performance images/sec : $FPS" + +# Extract the training accuracy; review and modify per model +train_accuracy=`grep -a '* Acc@1' ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log|awk 'END {print}'|awk -F "Acc@1" '{print $NF}'|awk -F " " '{print $1}'` +# Print; no modification needed +echo "Final Train Accuracy : ${train_accuracy}" +echo "E2E Training Duration sec : $e2e_time" + +# Performance monitoring summary +# Training case info; no modification needed +BatchSize=${batch_size} +DeviceType=`uname -m` +CaseName=${Network}_bs${BatchSize}_${RANK_SIZE}'p'_'acc' + +# Collect performance data; no modification needed +# Throughput +ActualFPS=${FPS} +# Time per training iteration +TrainingTime=`awk 'BEGIN{printf "%.2f\n", '${batch_size}'*1000/'${FPS}'}'` + +# Extract Loss from train_$ASCEND_DEVICE_ID.log into train_${CaseName}_loss.txt; review per model +grep Epoch: ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_$ASCEND_DEVICE_ID.log|grep -v Test|awk -F "Loss" '{print $NF}' | awk -F " " '{print $1}' >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_${CaseName}_loss.txt + +# Loss of the last iteration; no modification needed +ActualLoss=`awk 'END {print}' ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_${CaseName}_loss.txt` + +# Write the key info into ${CaseName}.log; no modification needed +echo "Network = ${Network}" > ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "RankSize = ${RANK_SIZE}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "BatchSize = ${BatchSize}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "DeviceType = ${DeviceType}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "CaseName = ${CaseName}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "ActualFPS = ${ActualFPS}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "TrainingTime = ${TrainingTime}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "TrainAccuracy = ${train_accuracy}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "ActualLoss = ${ActualLoss}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "E2ETrainingTime = ${e2e_time}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_performance_1p.sh" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_performance_1p.sh" new file mode 100644 index 
0000000000000000000000000000000000000000..90b0d7ee8c4bbacf07e9665aab638becba3a92de --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_performance_1p.sh" @@ -0,0 +1,171 @@ +#!/usr/bin/env bash + +##################README################## +# Note: update the (required) items and make sure the configuration is correct +# The final output step need not follow this example verbatim; it only has to produce the same directory structure and outputs as the sample run + +##################Basic configuration parameters; review and modify per model################## +# Required fields (parameters that must be defined here): Network batch_size RANK_SIZE +# Network name, same as the directory name (required) +Network="ResNet101_ID1595_for_PyTorch" +# Number of training epochs +train_epochs=1 +# Training batch_size (required) +batch_size=256 +# Training steps +#train_steps=`expr 1281167 / ${batch_size}` +# Learning rate +learning_rate=0.1 +# Number of NPUs used for training +export RANK_SIZE=1 +# Dataset path; keep empty, no modification needed +data_path="" + +# Maintenance/debug parameters; precision_mode should be reviewed and modified per model +#precision_mode="allow_mix_precision" +# Keep the following parameters as-is; no modification needed +over_dump=False +data_dump_flag=False +data_dump_step="10" +profiling=False + +# Help message; no modification needed +if [[ $1 == --help || $1 == -h ]];then + echo "usage: ./train_performance_1p.sh " + echo " " + echo "parameter explain: + --precision_mode precision mode(allow_fp32_to_fp16/force_fp16/must_keep_origin_dtype/allow_mix_precision) + --over_dump whether to enable overflow detection, default is False + --data_dump_flag data dump flag, default is False + --data_dump_step data dump step, default is 10 + --profiling whether to enable profiling for performance debugging, default is False + --data_path source data path for training + -h/--help show help message + " + exit 1 +fi + +# Parameter validation; no modification needed +for para in $* +do + if [[ $para == --device_id* ]];then + device_id=`echo ${para#*=}` + elif [[ $para == --data_path* ]];then + data_path=`echo ${para#*=}` + fi +done + +# Check that data_path was passed in; no modification needed +if [[ $data_path == "" ]];then + echo "[Error] para \"data_path\" must be configured" + exit 1 +fi + +# Check whether a device id was specified for single-card training (dynamically assigned vs. manually specified); no modification needed +if [ $ASCEND_DEVICE_ID ];then + echo "device id is ${ASCEND_DEVICE_ID}" +elif [ ${device_id} ]; then + export ASCEND_DEVICE_ID=${device_id} + echo "device id is ${ASCEND_DEVICE_ID}" +else + echo "[Error] device id must be configured" + exit 1 +fi + + +##################Set the training-script execution path################## +# cd to the directory at the same level as test before running the script, for better compatibility; test_path_dir is the path that contains the test folder +cur_path=`pwd` +cur_path_last_dirname=${cur_path##*/} +if [ x"${cur_path_last_dirname}" == x"test" ]; then + test_path_dir=${cur_path} + cd .. + cur_path=`pwd` +else + test_path_dir=${cur_path}/test +fi + + +##################Create the log output directory; no modification needed################## +if [ -d ${test_path_dir}/output/$ASCEND_DEVICE_ID ];then + rm -rf ${test_path_dir}/output/$ASCEND_DEVICE_ID + mkdir -p ${test_path_dir}/output/$ASCEND_DEVICE_ID +else + mkdir -p ${test_path_dir}/output/$ASCEND_DEVICE_ID +fi + +##################Launch the training script################## +# Training start time; no modification needed +start_time=$(date +%s) +# Source the environment variables +source ${test_path_dir}/env.sh +# Launch script (required) +python3.7 ./main.py \ + ${data_path} \ + -a resnet101 \ + --addr=$(hostname -I |awk '{print $1}') \ + --seed=49 \ + --workers=128 \ + --learning-rate=${learning_rate} \ + --mom=0.9 \ + --weight-decay=1.0e-04 \ + --print-freq=1 \ + --device='npu' \ + --gpu=${ASCEND_DEVICE_ID} \ + --dist-backend='hccl' \ + --epochs=${train_epochs} \ + --amp \ + --FusedSGD \ + --batch-size=${batch_size} > ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log 2>&1 & + +wait + +##################Collect the training results##################(required) +# Training end time; no modification needed +end_time=$(date +%s) +e2e_time=$(( $end_time - $start_time )) + +# Print results to the terminal; no modification needed +echo "------------------ Final result ------------------" +# Extract the performance FPS; review and modify per model +FPS=`grep -a 'FPS' ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log|awk -F " " '{print $NF}'|awk 'END {print}'` +# Print; no modification needed +echo "Final Performance images/sec : $FPS" + +# Extract the training accuracy; review and modify per model +train_accuracy=`grep -a '* Acc@1' ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log|awk 'END {print}'|awk -F "Acc@1" '{print $NF}'|awk -F " " '{print $1}'` +# Print; no modification needed +echo "Final Train Accuracy : ${train_accuracy}" +echo "E2E Training Duration sec : $e2e_time" + +# Performance monitoring summary +# Training case info; no modification needed +BatchSize=${batch_size} +DeviceType=`uname -m` +CaseName=${Network}_bs${BatchSize}_${RANK_SIZE}'p'_'perf' + +# Collect performance data; no modification needed +# Throughput +ActualFPS=${FPS} +# Time per training iteration +TrainingTime=`awk 'BEGIN{printf "%.2f\n", '${batch_size}'*1000/'${FPS}'}'` + +# Extract Loss from train_$ASCEND_DEVICE_ID.log into train_${CaseName}_loss.txt; review per model +grep Epoch: ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_$ASCEND_DEVICE_ID.log|grep -v Test|awk -F "Loss" '{print $NF}' | awk -F " " '{print $1}' >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_${CaseName}_loss.txt + +# Loss of the last iteration; no modification needed +ActualLoss=`awk 'END {print}' ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_${CaseName}_loss.txt` + + +##################Save the training results to file################## +# Write the key info into ${CaseName}.log; no modification needed +echo "Network = ${Network}" > ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "RankSize = ${RANK_SIZE}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "BatchSize = ${BatchSize}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "DeviceType = ${DeviceType}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "CaseName = ${CaseName}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "ActualFPS = ${ActualFPS}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "TrainingTime = ${TrainingTime}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "ActualLoss = ${ActualLoss}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "E2ETrainingTime = ${e2e_time}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log diff --git "a/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_performance_8p.sh" "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_performance_8p.sh" new file mode 100644 index 
0000000000000000000000000000000000000000..811bf6c43409270214cc443df1648004e0ab0cd0 --- /dev/null +++ "b/Pytorch\350\256\255\347\273\203\347\244\272\344\276\213/ResNet101_ID1595_for_PyTorch/test/train_performance_8p.sh" @@ -0,0 +1,164 @@ +#!/usr/bin/env bash + +##################README################## +# Note: update the (required) items and make sure the configuration is correct +# The final output step need not follow this example verbatim; it only has to produce the same directory structure and outputs as the sample run + +##################Basic configuration parameters; review and modify per model################## +# Required fields (parameters that must be defined here): Network batch_size RANK_SIZE +# Network name, same as the directory name (required) +Network="ResNet101_ID1595_for_PyTorch" +# Number of training epochs +train_epochs=2 +# Training batch_size (required) +batch_size=2048 +# Training steps +#train_steps=`expr 1281167 / ${batch_size}` +# Learning rate +learning_rate=0.5 +# Number of NPUs used for training +export RANK_SIZE=8 +# Dataset path; keep empty, no modification needed +data_path="" + +# Maintenance/debug parameters; precision_mode should be reviewed and modified per model +#precision_mode="allow_mix_precision" +# Keep the following parameters as-is; no modification needed +over_dump=False +data_dump_flag=False +data_dump_step="10" +profiling=False + +# Help message; no modification needed +if [[ $1 == --help || $1 == -h ]];then + echo "usage: ./train_performance_8p.sh " + echo " " + echo "parameter explain: + --precision_mode precision mode(allow_fp32_to_fp16/force_fp16/must_keep_origin_dtype/allow_mix_precision) + --over_dump whether to enable overflow detection, default is False + --data_dump_flag data dump flag, default is False + --data_dump_step data dump step, default is 10 + --profiling whether to enable profiling for performance debugging, default is False + --data_path source data path for training + -h/--help show help message + " + exit 1 +fi + +# Parameter validation; no modification needed +for para in $* +do + if [[ $para == --workers* ]];then + workers=`echo ${para#*=}` + elif [[ $para == --data_path* ]];then + data_path=`echo ${para#*=}` + fi +done + +# Check that data_path was passed in; no modification needed +if [[ $data_path == "" ]];then + echo "[Error] para \"data_path\" must be configured" + exit 1 +fi + +##################Set the training-script execution path################## +# cd to the directory at the same level as test before running the script, for better compatibility; test_path_dir is the path that contains the test folder +cur_path=`pwd` +cur_path_last_dirname=${cur_path##*/} +if [ x"${cur_path_last_dirname}" == x"test" ]; then + test_path_dir=${cur_path} + cd .. + cur_path=`pwd` +else + test_path_dir=${cur_path}/test +fi + + +##################Create the log output directory; review per model################## +# This model launches multi-card training without a loop, so the log output directory is created as below; models that launch multi-card training in a loop should create the log directory inside the loop (see the CRNN model for reference) +# In the non-loop mode, the ASCEND_DEVICE_ID in the 8-card training log path defaults to 0; it is only a folder name and does not affect the training itself +ASCEND_DEVICE_ID=0 +if [ -d ${test_path_dir}/output/$ASCEND_DEVICE_ID ];then + rm -rf ${test_path_dir}/output/$ASCEND_DEVICE_ID + mkdir -p ${test_path_dir}/output/$ASCEND_DEVICE_ID +else + mkdir -p ${test_path_dir}/output/$ASCEND_DEVICE_ID +fi + +##################Launch the training script################## +# Training start time; no modification needed +start_time=$(date +%s) +# Source the environment variables +source ${test_path_dir}/env.sh +# Launch script (required) +python3.7 ./main.py \ + ${data_path} \ + -a resnet101 \ + --addr=$(hostname -I |awk '{print $1}') \ + --seed=49 \ + --workers=128 \ + --learning-rate=${learning_rate} \ + --mom=0.9 \ + --weight-decay=1.0e-04 \ + --print-freq=1 \ + --dist-url='tcp://127.0.0.1:50000' \ + --multiprocessing-distributed \ + --world-size=1 \ + --rank=0 \ + --device='npu' \ + --dist-backend='hccl' \ + --epochs=${train_epochs} \ + --batch-size=${batch_size} \ + --amp \ + --device_list=0,1,2,3,4,5,6,7 \ + --FusedSGD \ + --loss-scale=1024 > ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log 2>&1 & + +wait + +##################Collect the training results##################(required) +# Training end time; no modification needed +end_time=$(date +%s) +e2e_time=$(( $end_time - $start_time )) + +# Print results; no modification needed +echo "------------------ Final result ------------------" +# Extract the performance FPS; review and modify per model +FPS=`grep -a 'FPS' ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log|awk -F " " '{print $NF}'|awk 'END {print}'` +# Print; no modification needed +echo "Final Performance images/sec : $FPS" + +# Extract the training accuracy; review and modify per model +train_accuracy=`grep -a '* Acc@1' ${test_path_dir}/output/${ASCEND_DEVICE_ID}/train_${ASCEND_DEVICE_ID}.log|awk 'END {print}'|awk -F "Acc@1" '{print $NF}'|awk -F " " '{print $1}'` +# Print; no modification needed +echo "Final Train Accuracy : ${train_accuracy}" +echo "E2E Training Duration sec : $e2e_time" + +# Performance monitoring summary +# Training case info; no modification needed +BatchSize=${batch_size} +DeviceType=`uname -m` +CaseName=${Network}_bs${BatchSize}_${RANK_SIZE}'p'_'perf' + +# Collect performance data; no modification needed +# Throughput +ActualFPS=${FPS} +# Time per training iteration +TrainingTime=`awk 'BEGIN{printf "%.2f\n", '${batch_size}'*1000/'${FPS}'}'` + +# Extract Loss from train_$ASCEND_DEVICE_ID.log into train_${CaseName}_loss.txt; review per model +grep Epoch: ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_$ASCEND_DEVICE_ID.log|grep -v Test|awk -F "Loss" '{print $NF}' | awk -F " " '{print $1}' >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_${CaseName}_loss.txt + +# Loss of the last iteration; no modification needed +ActualLoss=`awk 'END {print}' ${test_path_dir}/output/$ASCEND_DEVICE_ID/train_${CaseName}_loss.txt` + +# Write the key info into ${CaseName}.log; no modification needed +echo "Network = ${Network}" > ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "RankSize = ${RANK_SIZE}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "BatchSize = ${BatchSize}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "DeviceType = ${DeviceType}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "CaseName = ${CaseName}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "ActualFPS = ${ActualFPS}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "TrainingTime = ${TrainingTime}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "ActualLoss = ${ActualLoss}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log +echo "E2ETrainingTime = ${e2e_time}" >> ${test_path_dir}/output/$ASCEND_DEVICE_ID/${CaseName}.log diff --git a/README.md b/README.md index c49d9ffd81d48fbc3f8aaf505213eedf9a074ad3..af69bbd8e897e51a39e28340a0947b2cdf6e78b5 100644 --- a/README.md +++ b/README.md @@ -9,7 +9,10 @@ PyTorch model crowd-intelligence documentation management repo, used for collaborative development [Ascend Pytorch 模型众智 FAQ](./AscendPytorch模型众智FAQ.md)
-[模型众筹历史交流培训录屏归档地址](./%E6%A8%A1%E5%9E%8B%E4%BC%97%E7%AD%B9%E5%8E%86%E5%8F%B2%E4%BA%A4%E6%B5%81%E5%9F%B9%E8%AE%AD%E5%BD%95%E5%B1%8F%E5%BD%92%E6%A1%A3%E5%9C%B0%E5%9D%80.md) + +#### 录屏 +[2020-11-03_PyTorch模型众筹培训和交流](https://ascend-release.obs.cn-north-4.myhuaweicloud.com/Training/2020-11-03_PyTorch%E6%A8%A1%E5%9E%8B%E4%BC%97%E7%AD%B9%E5%9F%B9%E8%AE%AD%E5%92%8C%E4%BA%A4%E6%B5%81.mp4) +[2021-03-09_PyTorch模型众筹培训和交流](https://ascend-release.obs.cn-north-4.myhuaweicloud.com/Training/2021-03-09%20PyTorch%E6%A8%A1%E5%9E%8B%E4%BC%97%E6%99%BA%E5%9F%B9%E8%AE%AD.mp4) #### 更新说明 @@ -21,3 +24,25 @@ Pytorch模型众智文档管理仓,用于协作开发 2. fork 该仓到个人仓 3. push, 提交PR +##### FAQ贡献格式 + +``` +1. 图片统一放置在```./figures/```目录下,命名格式为 model_faq${number}_${图片序号}.png + +2. FAQ样板如下 + +### FAQ${number}、${title}。 + +* 现象描述 + +![](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/figures/model_faq${number}_${图片序号}.png) + +* 原因分析 + + xxxx + +* 处理方法 + + xxxx +``` + diff --git a/figures/aicore_profiling_fig1.png b/figures/aicore_profiling_fig1.png new file mode 100644 index 0000000000000000000000000000000000000000..cccb56eab432c1ac8c363bfc3d48e582c4f4d67c Binary files /dev/null and b/figures/aicore_profiling_fig1.png differ diff --git a/figures/aicore_profiling_fig2.png b/figures/aicore_profiling_fig2.png new file mode 100644 index 0000000000000000000000000000000000000000..f983683ff7fd2e15d2081ecdc8ea2410508e625a Binary files /dev/null and b/figures/aicore_profiling_fig2.png differ diff --git a/figures/aicore_profiling_fig3.png b/figures/aicore_profiling_fig3.png new file mode 100644 index 0000000000000000000000000000000000000000..5e4ae0c5b57d36b334e168e182fb5e19aabc3251 Binary files /dev/null and b/figures/aicore_profiling_fig3.png differ diff --git a/figures/aicore_profiling_fig4.png b/figures/aicore_profiling_fig4.png new file mode 100644 index 0000000000000000000000000000000000000000..2192c7959db28c963dc69bd7c623e683ae02b69f Binary files /dev/null and 
b/figures/aicore_profiling_fig4.png differ diff --git a/figures/aicore_profiling_fig5.png b/figures/aicore_profiling_fig5.png new file mode 100644 index 0000000000000000000000000000000000000000..c59ec9af203862b97e1e366262a3d9e815c14387 Binary files /dev/null and b/figures/aicore_profiling_fig5.png differ diff --git a/figures/aicore_profiling_fig6.png b/figures/aicore_profiling_fig6.png new file mode 100644 index 0000000000000000000000000000000000000000..65440a7b8763b801a0d7da66022b5785e8dfd7d2 Binary files /dev/null and b/figures/aicore_profiling_fig6.png differ diff --git a/figures/aicore_profiling_fig7.png b/figures/aicore_profiling_fig7.png new file mode 100644 index 0000000000000000000000000000000000000000..81e4b8a286b36bbd5fad513c360bf14ac19a741c Binary files /dev/null and b/figures/aicore_profiling_fig7.png differ diff --git a/figures/aicore_profiling_fig8.png b/figures/aicore_profiling_fig8.png new file mode 100644 index 0000000000000000000000000000000000000000..32ffd4f03fbab677008cd1db3aeacfc9461250c2 Binary files /dev/null and b/figures/aicore_profiling_fig8.png differ diff --git a/figures/dump_autotune_fig1.PNG b/figures/dump_autotune_fig1.PNG new file mode 100644 index 0000000000000000000000000000000000000000..9b045498baa59bcec9058fdc4d2cdb98ae73506a Binary files /dev/null and b/figures/dump_autotune_fig1.PNG differ diff --git a/figures/dump_autotune_fig2.PNG b/figures/dump_autotune_fig2.PNG new file mode 100644 index 0000000000000000000000000000000000000000..23ce0a3223995f91da2a0bb5b426c9e37bd18f2c Binary files /dev/null and b/figures/dump_autotune_fig2.PNG differ diff --git a/figures/dump_autotune_fig3.PNG b/figures/dump_autotune_fig3.PNG new file mode 100644 index 0000000000000000000000000000000000000000..46edf3bcfa8a53077344dd3fbe91c7e2c5c752d2 Binary files /dev/null and b/figures/dump_autotune_fig3.PNG differ diff --git a/figures/dump_autotune_fig4.PNG b/figures/dump_autotune_fig4.PNG new file mode 100644 index 
0000000000000000000000000000000000000000..1640f6d0e5404af15dbf38628fa33cbaeafc56b6 Binary files /dev/null and b/figures/dump_autotune_fig4.PNG differ diff --git a/figures/model_faq19_0527.png b/figures/model_faq19_0527.png new file mode 100644 index 0000000000000000000000000000000000000000..a48bc1ecaa6ba24e17fa6eb8509825703bf046ec Binary files /dev/null and b/figures/model_faq19_0527.png differ diff --git a/figures/model_faq1_2901.png b/figures/model_faq1_2901.png new file mode 100644 index 0000000000000000000000000000000000000000..73c57c2a218516b47182dd3de03fd9a883049a05 Binary files /dev/null and b/figures/model_faq1_2901.png differ diff --git a/figures/model_faq20_0528.png b/figures/model_faq20_0528.png new file mode 100644 index 0000000000000000000000000000000000000000..739a7308a39a345ef4ced30de765e4344adedcb1 Binary files /dev/null and b/figures/model_faq20_0528.png differ diff --git a/figures/model_faq21_0529.PNG b/figures/model_faq21_0529.PNG new file mode 100644 index 0000000000000000000000000000000000000000..6f48dc45c7d18f23c60402768641cca4c4f3fc3b Binary files /dev/null and b/figures/model_faq21_0529.PNG differ diff --git a/figures/model_faq22_0604.PNG b/figures/model_faq22_0604.PNG new file mode 100644 index 0000000000000000000000000000000000000000..d7958269a53f0ec0ea1374bb89963d8b6ce7d0e6 Binary files /dev/null and b/figures/model_faq22_0604.PNG differ diff --git a/figures/model_faq27_0618.png b/figures/model_faq27_0618.png new file mode 100644 index 0000000000000000000000000000000000000000..89277d60786e0c3dc3513086471b749126830c6b Binary files /dev/null and b/figures/model_faq27_0618.png differ diff --git a/figures/model_faq28_0618.png b/figures/model_faq28_0618.png new file mode 100644 index 0000000000000000000000000000000000000000..4d02f95c6d2dd751844fe990ce15b480f9ab5959 Binary files /dev/null and b/figures/model_faq28_0618.png differ diff --git a/figures/model_faq2_0201.png b/figures/model_faq2_0201.png new file mode 100644 index 
0000000000000000000000000000000000000000..cfeb1129522fc980bf885b88c54c0b7ba79dbe65 Binary files /dev/null and b/figures/model_faq2_0201.png differ diff --git a/figures/model_faq2_0301.png b/figures/model_faq2_0301.png new file mode 100644 index 0000000000000000000000000000000000000000..bcab2fe32f75ef55317dcdaa4001d9829c4213b2 Binary files /dev/null and b/figures/model_faq2_0301.png differ diff --git a/figures/moder_faq20_0607.png b/figures/moder_faq20_0607.png new file mode 100644 index 0000000000000000000000000000000000000000..3767984fb5dcc44ff8d2ef4d0d2c7aef765c5e96 Binary files /dev/null and b/figures/moder_faq20_0607.png differ diff --git a/figures/moder_faq21_0607.png b/figures/moder_faq21_0607.png new file mode 100644 index 0000000000000000000000000000000000000000..c48695de2adaffb0b4a301a629d94844a7cfb4f6 Binary files /dev/null and b/figures/moder_faq21_0607.png differ diff --git a/figures/pyotrch_offline_infer.png b/figures/pyotrch_offline_infer.png new file mode 100644 index 0000000000000000000000000000000000000000..bc5458e4c1edce001bdbd2193100ff18327318c9 Binary files /dev/null and b/figures/pyotrch_offline_infer.png differ diff --git a/img.png b/img.png new file mode 100644 index 0000000000000000000000000000000000000000..21b57a6cdb11d770ed321f5c6c2bd3d86c9bbb6e Binary files /dev/null and b/img.png differ diff --git a/img_1.png b/img_1.png new file mode 100644 index 0000000000000000000000000000000000000000..21b57a6cdb11d770ed321f5c6c2bd3d86c9bbb6e Binary files /dev/null and b/img_1.png differ diff --git a/img_2.png b/img_2.png new file mode 100644 index 0000000000000000000000000000000000000000..1861c94be6d90e8e6e51d4b4940d2c5bbd4681dc Binary files /dev/null and b/img_2.png differ diff --git "a/officefile/\351\231\204\344\273\2661_\351\253\230\346\240\241\345\220\210\344\275\234xx\347\275\221\347\273\234\350\275\254\351\252\214\346\224\266_CheckList.xlsx" 
"b/officefile/\351\231\204\344\273\2661_\351\253\230\346\240\241\345\220\210\344\275\234xx\347\275\221\347\273\234\350\275\254\351\252\214\346\224\266_CheckList.xlsx" index 4e3215333819b638466981df563fdfc1c06776d0..4e48d1fb01e92c6f7e9118594b66d1367185272e 100644 Binary files "a/officefile/\351\231\204\344\273\2661_\351\253\230\346\240\241\345\220\210\344\275\234xx\347\275\221\347\273\234\350\275\254\351\252\214\346\224\266_CheckList.xlsx" and "b/officefile/\351\231\204\344\273\2661_\351\253\230\346\240\241\345\220\210\344\275\234xx\347\275\221\347\273\234\350\275\254\351\252\214\346\224\266_CheckList.xlsx" differ diff --git "a/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/official/.keep" "b/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/official/.keep" deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git "a/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/research/.keep" "b/onnx\347\253\257\345\210\260\347\253\257\346\216\250\347\220\206\346\214\207\345\257\274/research/.keep" deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/src/accuracy_comparision.py b/src/accuracy_comparision.py index 977d039a724236ba0bc1a79dd54d8283ae35c6a4..590104a3d0ce950e77206c81673b2292b5f880de 100644 --- a/src/accuracy_comparision.py +++ b/src/accuracy_comparision.py @@ -1,3 +1,20 @@ +# Copyright (c) Soumith Chintala 2016, +# All rights reserved +# +# Copyright 2020 Huawei Technologies Co., Ltd +# +# Licensed under the BSD 3-Clause License (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# https://spdx.org/licenses/BSD-3-Clause.html +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + # -*- coding: utf-8 -*- """用于精度比对 """ @@ -5,7 +22,7 @@ import torch import torchvision from apex import amp - +import copy ##### 需自行改写部分 start ##### # 获得模型 @@ -30,9 +47,19 @@ NPU_PROF = True # 设置hook def hook_func(name, save_dict, module): def hook_function(module, inputs, outputs): - save_dict[name + '_inputs'] = inputs - save_dict[name + '_outputs'] = outputs - + inputs_key = name + '_inputs' + idx = 0 + while inputs_key in save_dict: + inputs_key = inputs_key.split('-')[0] + '-%d'%idx + idx +=1 + save_dict[inputs_key] = inputs + + outputs_key = name + '_outputs' + idx = 0 + while outputs_key in save_dict: + outputs_key = outputs_key.split('-')[0] + '-%d'%idx + idx +=1 + save_dict[outputs_key] = outputs return hook_function @@ -40,7 +67,7 @@ def hook_func(name, save_dict, module): # CPU固定输入和权重 model = get_model() optimizer = torch.optim.SGD(model.parameters(), 0.1) -state_dict = model.state_dict() +state_dict = copy.deepcopy(model.state_dict()) # CPU注册hook,cpu_dict用于存储对比对象 cpu_dict = {} @@ -50,7 +77,7 @@ for name, module in model.named_modules(): # CPU运行正反向,获取正反向每个module的输入输出和所有参数的grad out = model(input_tensor) -loss = out.sum() +loss = out.mean() optimizer.zero_grad() loss.backward() optimizer.step() @@ -80,12 +107,11 @@ if AMP_MODE: # NPU运行正反向,获取正反向每个module的输入输出和所有参数的grad out = model(input_tensor) -loss = out.sum() +loss = out.mean() optimizer.zero_grad() if AMP_MODE: with amp.scale_loss(loss, optimizer) as scaled_loss: scaled_loss.backward() - optimizer.step() else: loss.backward() optimizer.step() @@ -97,12 +123,20 @@ for name, param in 
model.named_parameters(): # 递归得到对比值 def compare(x1, x2, prefix=''): if isinstance(x1, tuple): - for idx in range(len(x1)): - return compare(x1[idx], x2[idx], prefix=prefix+'_%d'%idx) - elif isinstance(x1, torch.Tensor): + if x1: + for idx in range(len(x1)): + try: + compare(x1[idx], x2[idx], prefix=prefix + '.%d' % idx) + except Exception as e: + print(str(e)) + print(prefix, 'failed.') + elif isinstance(x1, torch.Tensor) and isinstance(x2, torch.Tensor): try: l1_error = (x1 - x2.cpu()).abs().mean() - print(prefix, l1_error) + rel_error = l1_error / (x1.abs().mean()) + print(prefix, 'l1_error: ', l1_error, 'rel_error', rel_error) + if l1_error * rel_error > 10: + print('\n###\n', prefix, 'should be checked!', '\n###\n') except Exception as e: print(str(e)) print(prefix, 'failed.') @@ -114,7 +148,7 @@ for k in cpu_dict: if NPU_PROF: with torch.autograd.profiler.profile(use_npu=True) as prof: out = model(input_tensor) - loss = out.sum() + loss = out.mean() optimizer.zero_grad() if AMP_MODE: with amp.scale_loss(loss, optimizer) as scaled_loss: diff --git a/src/bn.py b/src/bn.py index c5c018b45d3b292c0e5daadc0eeb9f4b23560c0a..fdea25783a7b811b619d6c83fca1bcdcf35d2642 100644 --- a/src/bn.py +++ b/src/bn.py @@ -1,3 +1,20 @@ +# Copyright (c) Soumith Chintala 2016, +# All rights reserved +# +# Copyright 2020 Huawei Technologies Co., Ltd +# +# Licensed under the BSD 3-Clause License (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://spdx.org/licenses/BSD-3-Clause.html +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License.
+ import torch import torch.nn as nn torch.npu.set_device("npu:0") diff --git a/src/demo.py b/src/demo.py index 84ead9654ff260d50688609e3c4aa2af9fadf571..f124c81303be5a8a9ea29e25a9df1aa676162fc2 100644 --- a/src/demo.py +++ b/src/demo.py @@ -1,3 +1,20 @@ +# Copyright (c) Soumith Chintala 2016, +# All rights reserved +# +# Copyright 2020 Huawei Technologies Co., Ltd +# +# Licensed under the BSD 3-Clause License (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://spdx.org/licenses/BSD-3-Clause.html +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + # -*- coding: utf-8 -*- """demo.py """ diff --git a/src/env.sh b/src/env.sh new file mode 100644 index 0000000000000000000000000000000000000000..ef01f3ca7eeb0e05ee2c95ccd53d64e07058caaa --- /dev/null +++ b/src/env.sh @@ -0,0 +1,66 @@ +#!/bin/bash +export install_path=/usr/local/Ascend + +if [ -d ${install_path}/toolkit ]; then + export LD_LIBRARY_PATH=/usr/include/hdf5/lib/:/usr/local/:/usr/local/lib/:/usr/lib/:${install_path}/fwkacllib/lib64/:${install_path}/driver/lib64/common/:${install_path}/driver/lib64/driver/:${install_path}/add-ons:${path_lib}:${LD_LIBRARY_PATH} + export PATH=${install_path}/fwkacllib/ccec_compiler/bin:${install_path}/fwkacllib/bin:$PATH + export PYTHONPATH=${install_path}/fwkacllib/python/site-packages:${install_path}/tfplugin/python/site-packages:${install_path}/toolkit/python/site-packages:$PYTHONPATH + export PYTHONPATH=/usr/local/python3.7.5/lib/python3.7/site-packages:$PYTHONPATH + export ASCEND_OPP_PATH=${install_path}/opp +else + if [ -d ${install_path}/nnae/latest ];then + export 
LD_LIBRARY_PATH=/usr/local/:/usr/local/python3.7.5/lib/:/usr/local/openblas/lib:/usr/local/lib/:/usr/lib64/:/usr/lib/:${install_path}/nnae/latest/fwkacllib/lib64/:${install_path}/driver/lib64/common/:${install_path}/driver/lib64/driver/:${install_path}/add-ons/:/usr/lib/aarch64-linux-gnu:$LD_LIBRARY_PATH + export PATH=$PATH:${install_path}/nnae/latest/fwkacllib/ccec_compiler/bin/:${install_path}/nnae/latest/toolkit/tools/ide_daemon/bin/ + export ASCEND_OPP_PATH=${install_path}/nnae/latest/opp/ + export OPTION_EXEC_EXTERN_PLUGIN_PATH=${install_path}/nnae/latest/fwkacllib/lib64/plugin/opskernel/libfe.so:${install_path}/nnae/latest/fwkacllib/lib64/plugin/opskernel/libaicpu_engine.so:${install_path}/nnae/latest/fwkacllib/lib64/plugin/opskernel/libge_local_engine.so + export PYTHONPATH=${install_path}/nnae/latest/fwkacllib/python/site-packages/:${install_path}/nnae/latest/fwkacllib/python/site-packages/auto_tune.egg/auto_tune:${install_path}/nnae/latest/fwkacllib/python/site-packages/schedule_search.egg:$PYTHONPATH + export ASCEND_AICPU_PATH=${install_path}/nnae/latest + else + export LD_LIBRARY_PATH=/usr/local/:/usr/local/lib/:/usr/lib64/:/usr/lib/:/usr/local/python3.7.5/lib/:/usr/local/openblas/lib:${install_path}/ascend-toolkit/latest/fwkacllib/lib64/:${install_path}/driver/lib64/common/:${install_path}/driver/lib64/driver/:${install_path}/add-ons/:/usr/lib/aarch64-linux-gnu:$LD_LIBRARY_PATH + export PATH=$PATH:${install_path}/ascend-toolkit/latest/fwkacllib/ccec_compiler/bin/:${install_path}/ascend-toolkit/latest/toolkit/tools/ide_daemon/bin/ + export ASCEND_OPP_PATH=${install_path}/ascend-toolkit/latest/opp/ + export OPTION_EXEC_EXTERN_PLUGIN_PATH=${install_path}/ascend-toolkit/latest/fwkacllib/lib64/plugin/opskernel/libfe.so:${install_path}/ascend-toolkit/latest/fwkacllib/lib64/plugin/opskernel/libaicpu_engine.so:${install_path}/ascend-toolkit/latest/fwkacllib/lib64/plugin/opskernel/libge_local_engine.so + export 
PYTHONPATH=${install_path}/ascend-toolkit/latest/fwkacllib/python/site-packages/:${install_path}/ascend-toolkit/latest/fwkacllib/python/site-packages/auto_tune.egg/auto_tune:${install_path}/ascend-toolkit/latest/fwkacllib/python/site-packages/schedule_search.egg:$PYTHONPATH + export ASCEND_AICPU_PATH=${install_path}/ascend-toolkit/latest + fi +fi + + +#将Host日志输出到串口,0-关闭/1-开启 +export ASCEND_SLOG_PRINT_TO_STDOUT=0 +#设置默认日志级别,0-debug/1-info/2-warning/3-error +export ASCEND_GLOBAL_LOG_LEVEL=3 +#设置Event日志开启标志,0-关闭/1-开启 +export ASCEND_GLOBAL_EVENT_ENABLE=0 +#设置是否开启taskque,0-关闭/1-开启 +export TASK_QUEUE_ENABLE=1 +#设置是否开启PTCopy,0-关闭/1-开启 +export PTCOPY_ENABLE=1 +#设置是否开启combined标志,0-关闭/1-开启 +export COMBINED_ENABLE=1 +#设置特殊场景是否需要重新编译,不需要修改 +export DYNAMIC_OP="ADD#MUL" +#HCCL白名单开关,1-关闭/0-开启 +export HCCL_WHITELIST_DISABLE=1 + + +path_lib=$(python3.7 -c """ +import sys +import re +result='' +for index in range(len(sys.path)): + match_sit = re.search('-packages', sys.path[index]) + if match_sit is not None: + match_lib = re.search('lib', sys.path[index]) + + if match_lib is not None: + end=match_lib.span()[1] + result += sys.path[index][0:end] + ':' + + result+=sys.path[index] + '/torch/lib:' +print(result)""" +) + +echo ${path_lib} + +export LD_LIBRARY_PATH=/usr/local/python3.7.5/lib/:${path_lib}:$LD_LIBRARY_PATH diff --git a/src/get_ascend_op_info.py b/src/get_ascend_op_info.py index 1c99d991d660ef5e4e6502d056b4e65b1accb4f1..9eaf9903cc47b4bb1add7a72caebfb2d03e02e23 100644 --- a/src/get_ascend_op_info.py +++ b/src/get_ascend_op_info.py @@ -1,3 +1,20 @@ +# Copyright (c) Soumith Chintala 2016, +# All rights reserved +# +# Copyright 2020 Huawei Technologies Co., Ltd +# +# Licensed under the BSD 3-Clause License (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# https://spdx.org/licenses/BSD-3-Clause.html +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + # -*- coding: utf-8 -*- """用于导出OPINFO """ diff --git a/src/model_prof.py b/src/model_prof.py index b11f5af9d3e657da6e6f2652f55c0af9cd132433..c85a70a900f5df03ec6974228150214fe91f6dd9 100644 --- a/src/model_prof.py +++ b/src/model_prof.py @@ -1,3 +1,20 @@ +# Copyright (c) Soumith Chintala 2016, +# All rights reserved +# +# Copyright 2020 Huawei Technologies Co., Ltd +# +# Licensed under the BSD 3-Clause License (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://spdx.org/licenses/BSD-3-Clause.html +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + # -*- coding: utf-8 -*- """pytorch_prof.py """ diff --git a/src/op_prof.py b/src/op_prof.py index dbee981147ef6a0239715ac5d3ee2741424643ad..8ee73fc2aedd9b004459ecac5af6823bc17214b9 100644 --- a/src/op_prof.py +++ b/src/op_prof.py @@ -1,3 +1,20 @@ +# Copyright (c) Soumith Chintala 2016, +# All rights reserved +# +# Copyright 2020 Huawei Technologies Co., Ltd +# +# Licensed under the BSD 3-Clause License (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# https://spdx.org/licenses/BSD-3-Clause.html +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + import torch import time import argparse diff --git a/src/pthtar2onnx.py b/src/pthtar2onnx.py new file mode 100644 index 0000000000000000000000000000000000000000..e54d46416dcf72e3de8ee54c2c75fcbbf6c03046 --- /dev/null +++ b/src/pthtar2onnx.py @@ -0,0 +1,50 @@ +# Copyright 2020 Huawei Technologies Co., Ltd +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# ============================================================================ + +import torch +from DistributedResnet50.image_classification import resnet +import torch.onnx + +from collections import OrderedDict + + +def proc_node_module(checkpoint, AttrName): + new_state_dict = OrderedDict() + for k, v in checkpoint[AttrName].items(): + if(k[0:7] == "module."): + name = k[7:] + else: + name = k[0:] + new_state_dict[name] = v + return new_state_dict + + +def convert(): + checkpoint = torch.load("./resnet50checkpoint.pth.tar", map_location='cpu') + checkpoint['state_dict'] = proc_node_module(checkpoint, 'state_dict') + model = resnet.build_resnet("resnet50", "classic") + model.load_state_dict(checkpoint['state_dict']) + model.eval() + print(model) + + input_names = ["actual_input_1"] + output_names = ["output1"] + dummy_input = torch.randn(16, 3, 224, 224) + torch.onnx.export(model, dummy_input, "resnet50_npu_16.onnx", input_names=input_names, output_names=output_names, + opset_version=11) + + +if __name__ == "__main__": + convert() \ No newline at end of file diff --git a/src/recompiled_op.py b/src/recompiled_op.py index 5dfb6c99fe122895fcf8b4745ced31ed3fe5cfb1..840d502517a657be270256a36760a0913de878da 100644 --- a/src/recompiled_op.py +++ b/src/recompiled_op.py @@ -1,5 +1,22 @@ +# Copyright (c) Soumith Chintala 2016, +# All rights reserved +# +# Copyright 2020 Huawei Technologies Co., Ltd +# +# Licensed under the BSD 3-Clause License (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://spdx.org/licenses/BSD-3-Clause.html +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ # -*- coding: utf-8 -*- -"""用于导出OPINFO +"""用于导出动态shape算子 """ import os diff --git a/src/set_npu_env.sh b/src/set_npu_env.sh deleted file mode 100644 index 0963d2264dbfe8e737ebd00198db056270247bda..0000000000000000000000000000000000000000 --- a/src/set_npu_env.sh +++ /dev/null @@ -1,26 +0,0 @@ -export LD_LIBRARY_PATH=/usr/local/:/usr/local/lib/:/usr/lib64/:/usr/lib/:/usr/local/python3.7.5/lib/:/usr/local/openblas/lib:/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/lib64/:/usr/local/Ascend/driver/lib64/common/:/usr/local/Ascend/driver/lib64/driver/:/usr/local/Ascend/add-ons/:/usr/lib/aarch64-linux-gnu:$LD_LIBRARY_PATH -export PATH=$PATH:/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/ccec_compiler/bin/:/usr/local/Ascend/ascend-toolkit/latest/toolkit/tools/ide_daemon/bin/ -export ASCEND_OPP_PATH=/usr/local/Ascend/ascend-toolkit/latest/opp/ -export OPTION_EXEC_EXTERN_PLUGIN_PATH=/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/lib64/plugin/opskernel/libfe.so:/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/lib64/plugin/opskernel/libaicpu_engine.so:/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/lib64/plugin/opskernel/libge_local_engine.so -export PYTHONPATH=/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/python/site-packages/:/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/python/site-packages/auto_tune.egg/auto_tune:/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/python/site-packages/schedule_search.egg:$PYTHONPATH - -export TASK_QUEUE_ENABLE=1 -export ASCEND_SLOG_PRINT_TO_STDOUT=0 -export ASCEND_GLOBAL_LOG_LEVEL=3 -export ASCEND_AICPU_PATH=/usr/local/Ascend/ascend-toolkit/latest - -path_lib=$(python3.7 -c """ -import sys -import re -result='' -for index in range(len(sys.path)): - match_sit = re.search('-packages', sys.path[index]) - if match_sit is not None: - match_lib = re.search('lib', sys.path[index]) - if match_lib is not None: - end=match_lib.span()[1] - result += sys.path[index][0:end] + ':' - result+=sys.path[index] + '/torch/lib:' 
-print(result)""" -) -export LD_LIBRARY_PATH=/usr/local/python3.7.5/lib/:${path_lib}:$LD_LIBRARY_PATH \ No newline at end of file diff --git a/src/test.py b/src/test.py index 48f85f3fa24411f31ad8e9674beadf5121b21e78..1fba9c146e9125426192548ce3f8dfaae9efd9f6 100644 --- a/src/test.py +++ b/src/test.py @@ -1,4 +1,21 @@ -#!/usr/bin/env python +# Copyright (c) Soumith Chintala 2016, +# All rights reserved +# +# Copyright 2020 Huawei Technologies Co., Ltd +# +# Licensed under the BSD 3-Clause License (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://spdx.org/licenses/BSD-3-Clause.html +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +#!/usr/bin/env python # -*- coding: utf-8 -*- import torch diff --git "a/src/\345\233\272\345\256\232\345\212\250\346\200\201shape\350\214\203\344\276\213\346\226\207\346\241\243.md" "b/src/\345\233\272\345\256\232\345\212\250\346\200\201shape\350\214\203\344\276\213\346\226\207\346\241\243.md" index 6da9dec86ee2136cf44b5afd73be0d3d6bedc2cc..a8ab7890a2e59b674e5f652ff752bd4890137360 100644 --- "a/src/\345\233\272\345\256\232\345\212\250\346\200\201shape\350\214\203\344\276\213\346\226\207\346\241\243.md" +++ "b/src/\345\233\272\345\256\232\345\212\250\346\200\201shape\350\214\203\344\276\213\346\226\207\346\241\243.md" @@ -53,7 +53,6 @@ for i in gt_bboxes: ``` - ### 计算类修改固定shape - 计算类修改固定shape常见于loss计算,其核心逻辑在于将取切片相关操作转化为乘法操作,有以下几种常见的场景 @@ -107,6 +106,24 @@ d = c.sum() # [2, 3, 4].sum() ``` + +4. 
重复索引取值 +```python +# 原始代码 +a = torch.tensor([1, 2, 3]) +b = torch.tensor([2, 1, 1, 2, 3, 3, 1]) +c = a[b - 1] # [2, 1, 1, 2, 3, 3, 1] +d = c.sum() # 13 + +# 修改代码 +a = torch.tensor([1, 2, 3]) +b = torch.tensor([2, 1, 1, 2, 3, 3, 1]) +c = a.index_select(0, (b-1.0).long()) # [2, 1, 1, 2, 3, 3, 1] +d = c.sum() # 13 +``` + + + ##### 计算类修改固定shape样例,以FCOS网络为例 对于动态shape问题,在输入数据尺寸是固定的前提下,每个batch的动态变化主要在计算loss处产生,比如对于检测类问题,每幅图像预测出的目标框的个数是会动态变化的,因此我们可以从计算loss这部分代码开始分析。以fcos模型为例,进入loss函数后我们对于“可疑”变量,也就是说可能发生shape变化的变量,依次打印其value和shape等信息,观察每个step的shape是否一致。
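上面这段排查思路可以归结为一个简单的流程:每个step记录一次"可疑"变量的shape,训练若干step后汇总哪些变量的shape发生过变化。下面是一个纯Python的示意实现(`ShapeWatcher` 这个名称和接口是本文档为演示而假设的,并非任何框架的API),任何带 `.shape` 属性的对象(如 torch.Tensor、numpy 数组)都可以记录:

```python
from collections import defaultdict


class ShapeWatcher:
    """Record the shapes of "suspect" variables across steps and flag the dynamic ones."""

    def __init__(self):
        # variable name -> set of shapes observed so far
        self.history = defaultdict(set)

    def watch(self, name, tensor):
        # works for torch.Tensor, numpy arrays, or anything exposing a .shape attribute
        self.history[name].add(tuple(getattr(tensor, "shape", ())))

    def dynamic_names(self):
        # a variable whose shape varied between steps is a dynamic-shape suspect
        return sorted(n for n, shapes in self.history.items() if len(shapes) > 1)
```

用法示意:在loss函数内部对每个可疑变量调用 `watcher.watch('pos_inds', pos_inds)`(变量名仅为举例),跑几个step之后调用 `watcher.dynamic_names()`,即可列出shape不固定、需要按上文方法改写的变量。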
diff --git a/tmp b/tmp
new file mode 100644
index 0000000000000000000000000000000000000000..5aeaf5c28058bb0b2d9f200c4ad5a6324e9eefce
--- /dev/null
+++ b/tmp
@@ -0,0 +1,16 @@
+https://ascend-model-file.obs.cn-north-4.myhuaweicloud.com/%E9%AA%8C%E6%94%B6-%E6%8E%A8%E7%90%86/perf/nasnet.tar.gz
+https://ascend-model-file.obs.cn-north-4.myhuaweicloud.com/%E9%AA%8C%E6%94%B6-%E6%8E%A8%E7%90%86/perf/AlexNet%E6%80%A7%E8%83%BD%E4%B8%8D%E8%BE%BE%E6%A0%87%E5%88%9D%E6%AD%A5%E5%88%86%E6%9E%90.rar
+
+Deliverable samples:
+https://gitee.com/pengyeqing/modelzoo/tree/master/built-in/ACL_PyTorch/Benchmark/cv/classification/ResNext50
+https://gitee.com/pengyeqing/modelzoo/blob/master/built-in/ACL_PyTorch/Benchmark/cv/classification/ResNext50/README.md
+https://gitee.com/pengyeqing/modelzoo/blob/master/built-in/ACL_PyTorch/Benchmark/cv/classification/ResNext50/test/README.md
+Mainly follow the template README.md and test/README.md
+
+Code submission references:
+https://gitee.com/ascend/modelzoo/pulls/2122
+https://gitee.com/ascend/modelzoo/pulls/2061
+https://gitee.com/ascend/modelzoo/pulls/2137
+https://gitee.com/ascend/modelzoo/pulls/2309
+https://gitee.com/ascend/modelzoo/pulls/2328
+https://gitee.com/ascend/modelzoo/pulls/2585
\ No newline at end of file
diff --git "a/\347\246\273\347\272\277autotune\344\275\277\347\224\250\346\214\207\345\257\274\344\271\246.md" "b/\347\246\273\347\272\277autotune\344\275\277\347\224\250\346\214\207\345\257\274\344\271\246.md"
new file mode 100644
index 0000000000000000000000000000000000000000..a79c18d19b16029587a899f9d58e91c85327f5f6
--- /dev/null
+++ "b/\347\246\273\347\272\277autotune\344\275\277\347\224\250\346\214\207\345\257\274\344\271\246.md"
@@ -0,0 +1,66 @@
+- Applicable scenarios
+1. Performance drops sharply for one specific shape
+2. You are squeezing out the last bit of performance and care even about gains on the order of 10 ms
+
+## 1 Set the environment variables for dumping data
+```
+# Use the CANN combined-package installation path here
+export install_path=/usr/local/Ascend/ascend-toolkit/latest/
+export LD_LIBRARY_PATH=${install_path}/fwkacllib/lib64:$LD_LIBRARY_PATH
+export PYTHONPATH=${install_path}/fwkacllib/python/site-packages:${install_path}/fwkacllib/python/site-packages/auto_tune.egg/auto_tune:${install_path}/fwkacllib/python/site-packages/schedule_search.egg:$PYTHONPATH
+export ASCEND_OPP_PATH=${install_path}/opp
+
+# Enable data dumping
+export ENABLE_TUNE_DUMP=True
+
+# Set the dump data path
+export TUNE_DUMP_PATH=/home/HwHiAiUser/DumpData
+```
+
+## 2 Dump the data
+- Run the network script you want to dump data for; a few steps are enough.
+After the run finishes, the data is generated under the dump path:
+
+![](https://gitee.com/zwx5317131/ascend-pytorch-crowdintelligence-doc/raw/master/figures/dump_autotune_fig1.PNG)
+
+![](https://gitee.com/zwx5317131/ascend-pytorch-crowdintelligence-doc/raw/master/figures/dump_autotune_fig2.PNG)
+
+## 3 Run autotune
+
+```
+export install_path=/usr/local/Ascend/ascend-toolkit/latest/
+export PATH=${PATH}:${install_path}/fwkacllib/bin/:${install_path}/atc/ccec_compiler/bin:${install_path}/atc/bin
+
+# Force-tune flag: when enabled, tuning is redone even if a knowledge base already exists
+export REPEAT_TUNE=True
+
+python3.7.5 ${install_path}/fwkacllib/python/site-packages/schedule_search.egg/schedule_search/msoptune.py --start /home/HwHiAiUser/DumpData
+```
+- Requires tensorflow==1.15.0
+- By default, autotune runs for cube ops first, followed by RL tuning for vector ops; the total duration (about 18 h in our test) depends on the task
+- The cube autotune results default to /usr/local/Ascend/ascend-toolkit/latest/fwkacllib/data/tiling/ascend910/custom/; to use them on another machine, copy them to the same location on that machine
+- The vector RL results default to /usr/local/Ascend/ascend-toolkit/latest/fwkacllib/data/rl/Ascend910/custom/; to use them on another machine, copy them to the same location on that machine
+- Run `cat *.json | wc -l` to see how many tuning records exist
+- Run the following on the two directories above to avoid permission problems:
+```
+chown -R HwHiAiUser:HwHiAiUser * && chmod 777 *
+```
+
+### 3.1 Problems encountered during debugging
+- You may hit the following error while debugging
+![](https://gitee.com/zwx5317131/ascend-pytorch-crowdintelligence-doc/raw/master/figures/dump_autotune_fig3.PNG)
+
+  Install the required dependencies with:
+```
+pip3.7.5 install /usr/local/Ascend/ascend-toolkit/latest/fwkacllib/lib64/*.whl
+```
+- Running autotune fails with the following error
+  ![](https://gitee.com/zwx5317131/ascend-pytorch-crowdintelligence-doc/raw/master/figures/dump_autotune_fig4.PNG)
+
+  Source the environment variables first; see the [reference environment setup script](https://gitee.com/wangjiangben_hw/ascend-pytorch-crowdintelligence-doc/raw/master/src/env.sh)
+
+- To autotune a different model, clear the data under the following three paths before running autotune again:
+  (1) dump data path: /home/HwHiAiUser/DumpData
+  (2) cube results: /usr/local/Ascend/ascend-toolkit/latest/fwkacllib/data/tiling/ascend910/custom/
+  (3) vector RL results: /usr/local/Ascend/ascend-toolkit/latest/fwkacllib/data/rl/Ascend910/custom/
\ No newline at end of file
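The cleanup step above can be scripted. The following is a minimal sketch, assuming the three default paths from this guide (adjust them to your installation); `clear_tune_data` is a hypothetical helper name, not part of the autotune tooling:

```python
import shutil
from pathlib import Path

def clear_tune_data(*dirs):
    """Empty each directory that exists, so the next autotune run
    starts from a clean state instead of reusing stale results.
    Returns the list of directories that were actually cleared."""
    cleared = []
    for d in dirs:
        p = Path(d)
        if not p.is_dir():
            continue  # skip paths that do not exist on this machine
        for child in p.iterdir():
            if child.is_dir():
                shutil.rmtree(child)
            else:
                child.unlink()
        cleared.append(str(p))
    return cleared

# Default locations used in this guide; adjust for your installation.
cleared = clear_tune_data(
    "/home/HwHiAiUser/DumpData",
    "/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/data/tiling/ascend910/custom",
    "/usr/local/Ascend/ascend-toolkit/latest/fwkacllib/data/rl/Ascend910/custom",
)
print("cleared:", cleared)
```

Nonexistent paths are skipped rather than treated as errors, so the same script can run unchanged on machines where only some of the directories are present.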