Cannot find `include file "head.v" in directories

This post describes a problem encountered while using ModelSim, in which `include files referenced by a Verilog program are not found, and shows how to fix it by setting a compile option.

Cannot find `include file "timescale.v" in directories

 

How to fix ModelSim not finding `include files

Cannot find `include file "timescale.v" in directories

The problem, originally shown in a screenshot of the ModelSim transcript: the referenced file timescale.v and the macro definition file i2c_master_defines.v cannot be found, even though both files are located in the same directory as the source file (also shown in a screenshot of the project view).


The macro files are pulled in with `include directives at the top of the module (a code screenshot appeared here in the original post), and compiling the design then produces the error shown above.
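For reference, a minimal sketch of what the top of such a file looks like. The `include file names are the ones from the screenshots; the module name below is only illustrative:

    `include "timescale.v"
    `include "i2c_master_defines.v"

    module i2c_master_top;   // module name is hypothetical here
        // module body omitted
    endmodule

ModelSim resolves these `include file names against its configured include directories rather than the current source file's location, which appears to be why compilation fails here even though the files sit next to the source.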

Solution:

Open Compile -> Compile Options. In the dialog that pops up, add the include files option, pointing it at the directory that contains the `include files (the exact settings were shown in a screenshot in the original post).

After this option is set, the design compiles with no errors.
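For those who prefer the transcript or a do-file, the GUI option corresponds to vlog's include-directory switch. A rough sketch (the path and source file name are placeholders, not taken from the original post):

    vlog +incdir+D:/path/to/include_files i2c_master_top.v

Here +incdir+ tells vlog which directories to search when expanding `include directives, so timescale.v and i2c_master_defines.v are found regardless of which directory ModelSim is invoked from.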
