From 0d3ceb3058201868765ff3aa1126685f3f7f9ecc Mon Sep 17 00:00:00 2001
From: Andrew Calvano <calvano@fb.com>
Date: Fri, 17 Nov 2023 17:29:04 +0000
Subject: [PATCH] Fix for PyTorch mobile flatbuffer loader out of bounds reads
(#110162)
Summary:
The mobile_ivalue_size field in the mobile_bytecode flatbuffer schema can be larger than the ivalues vector. This introduces the potential for memory corruption when parsing the mobile_bytecode Module.

This diff fixes the issue by ensuring that mobile_ivalue_size does not exceed the size of the ivalues vector.
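For context, a minimal standalone sketch of the pattern the fix enforces (hypothetical names such as clamped_ivalue_count and declared_size, with std::vector<int> standing in for the flatbuffer vector; this is not the actual loader code): a size field read from untrusted input is clamped to the real vector length before it is used as a loop bound.

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Sketch only: `declared_size` stands in for the untrusted
    // mobile_ivalue_size field, `ivalues` for the flatbuffer ivalues vector.
    size_t clamped_ivalue_count(uint32_t declared_size,
                                const std::vector<int>& ivalues) {
      size_t n = declared_size;
      // Mirrors the patched check: treat 0 and any oversized value as
      // "use the real vector length" so later loops never read past the end.
      if (n == 0 || n > ivalues.size()) {
        n = ivalues.size();
      }
      return n;
    }

    int main() {
      std::vector<int> ivalues{1, 2, 3};
      // A crafted module could declare 1000 ivalues while only shipping 3.
      std::printf("%zu\n", clamped_ivalue_count(1000, ivalues));  // prints 3
      std::printf("%zu\n", clamped_ivalue_count(0, ivalues));     // prints 3
      std::printf("%zu\n", clamped_ivalue_count(2, ivalues));     // prints 2
      return 0;
    }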
Test Plan: contbuild & OSS CI
Differential Revision: D49687548
Pull Request resolved: https://github.com/pytorch/pytorch/pull/110162
Approved by: https://github.com/malfet
---
torch/csrc/jit/mobile/flatbuffer_loader.cpp | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/torch/csrc/jit/mobile/flatbuffer_loader.cpp b/torch/csrc/jit/mobile/flatbuffer_loader.cpp
index 2fb12a4f..2069330b 100644
--- a/torch/csrc/jit/mobile/flatbuffer_loader.cpp
+++ b/torch/csrc/jit/mobile/flatbuffer_loader.cpp
@@ -302,7 +302,7 @@ mobile::Module FlatbufferLoader::parseModule(
   storage_loaded_.resize(module->storage_data_size(), false);
 
   mobile_ivalue_size_ = module_->mobile_ivalue_size();
-  if (mobile_ivalue_size_ == 0) {
+  if (mobile_ivalue_size_ == 0 || mobile_ivalue_size_ > ivalues->size()) {
     mobile_ivalue_size_ = ivalues->size();
   }
 
--
2.43.0