Warning: Multiple build commands for output file /

This post gives a one-step fix for the Xcode warning "Multiple build commands for output file": remove the offending file from the Copy Bundle Resources build phase.

Xcode sometimes reports a warning of the form:

Warning: Multiple build commands for output file /xxx

The fix is straightforward:

1. Select your project.
2. Select the target.
3. Click Build Phases.
4. Expand Copy Bundle Resources.
5. Delete the file(s) named in the warning (typically the entries whose names appear in red).
(stable-diffusion) G:\modelhub\yuliu11\stable-diffusion-webui>pip install gfpgan
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting gfpgan
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/80/a2/84bb50a2655fda1e6f35ae57399526051b8a8b96ad730aea82abeaac4de8/gfpgan-1.3.8-py3-none-any.whl (52 kB)
Requirement already satisfied: basicsr>=1.4.2 in d:\ac\envs\stable-diffusion\lib\site-packages (from gfpgan) (1.4.2)
Collecting facexlib>=0.2.5 (from gfpgan)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/36/7b/2147339dafe1c4800514c9c21ee4444f8b419ce51dfc7695220a8e0069a6/facexlib-0.3.0-py3-none-any.whl (59 kB)
Requirement already satisfied: lmdb in d:\ac\envs\stable-diffusion\lib\site-packages (from gfpgan) (1.7.5)
Requirement already satisfied: numpy in d:\ac\envs\stable-diffusion\lib\site-packages (from gfpgan) (1.26.4)
Requirement already satisfied: opencv-python in d:\ac\envs\stable-diffusion\lib\site-packages (from gfpgan) (4.12.0.88)
Requirement already satisfied: pyyaml in d:\ac\envs\stable-diffusion\lib\site-packages (from gfpgan) (6.0.3)
Requirement already satisfied: scipy in d:\ac\envs\stable-diffusion\lib\site-packages (from gfpgan) (1.15.3)
INFO: pip is looking at multiple versions of gfpgan to determine which version is compatible with other requirements. This could take a while.
Collecting gfpgan
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/31/88/a7651885208ce76e16f41c41fa429133716a0828eac0335f513052962ada/gfpgan-1.3.7-py3-none-any.whl (58 kB)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/7d/34/a708e205ec90e3639d61599f1fd305bf52da68676bf6d632cfafcaf0dcd9/gfpgan-1.3.6-py3-none-any.whl (62 kB)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/c4/36/d611103427877ede4dc426e4c427ebd48f83882ca121f472a63f90b0077d/gfpgan-1.3.5-py3-none-any.whl (47 kB)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/65/70/f88a1631812ddb3f35e7c83de4037de47017e9a8c6fec972851f2c454eb1/gfpgan-1.3.4-py3-none-any.whl (47 kB)
Collecting numpy<1.21 (from gfpgan)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/f3/1f/fe9459e39335e7d0e372b5e5dcd60f4381d3d1b42f0b9c8222102ff29ded/numpy-1.20.3.zip (7.8 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Preparing metadata (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [272 lines of output]
      setup.py:66: RuntimeWarning: NumPy 1.20.3 may not yet support Python 3.10.
        warnings.warn(
      Running from numpy source directory.
      setup.py:485: UserWarning: Unrecognized setuptools command, proceeding with generating Cython sources and expanding templates
        run_build = parse_setuppy_commands()
      Processing numpy/random\_bounded_integers.pxd.in
      Processing numpy/random\bit_generator.pyx
      Processing numpy/random\mtrand.pyx
      Processing numpy/random\_bounded_integers.pyx.in
      Processing numpy/random\_common.pyx
      Processing numpy/random\_generator.pyx
      Processing numpy/random\_mt19937.pyx
      Processing numpy/random\_pcg64.pyx
      Processing numpy/random\_philox.pyx
      Processing numpy/random\_sfc64.pyx
      Cythonizing sources
      blas_opt_info:
      blas_mkl_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries mkl_rt not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
        NOT AVAILABLE
      blis_info:
        libraries blis not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
        NOT AVAILABLE
      openblas_info:
        libraries openblas not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
      get_default_fcompiler: matching types: '['gnu', 'intelv', 'absoft', 'compaqv', 'intelev', 'gnu95', 'g95', 'intelvem', 'intelem', 'flang']'
      customize GnuFCompiler
      Could not locate executable g77
      Could not locate executable f77
      customize IntelVisualFCompiler
      Could not locate executable ifort
      Could not locate executable ifl
      customize AbsoftFCompiler
      Could not locate executable f90
      customize CompaqVisualFCompiler
      Could not locate executable DF
      customize IntelItaniumVisualFCompiler
      Could not locate executable efl
      customize Gnu95FCompiler
      Could not locate executable gfortran
      Could not locate executable f95
      customize G95FCompiler
      Could not locate executable g95
      customize IntelEM64VisualFCompiler
      customize IntelEM64TFCompiler
      Could not locate executable efort
      Could not locate executable efc
      customize PGroupFlangCompiler
      Could not locate executable flang
      don't know how to compile Fortran code on platform 'nt'
        NOT AVAILABLE
      atlas_3_10_blas_threads_info:
      Setting PTATLAS=ATLAS
        libraries tatlas not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
        NOT AVAILABLE
      atlas_3_10_blas_info:
        libraries satlas not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
        NOT AVAILABLE
      atlas_blas_threads_info:
      Setting PTATLAS=ATLAS
        libraries ptf77blas,ptcblas,atlas not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
        NOT AVAILABLE
      atlas_blas_info:
        libraries f77blas,cblas,atlas not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
        NOT AVAILABLE
      C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\system_info.py:1989: UserWarning:
          Optimized (vendor) Blas libraries are not found.
          Falls back to netlib Blas library which has worse performance.
          A better performance should be easily gained by switching Blas library.
        if self._calc_info(blas):
      blas_info:
        libraries blas not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
        NOT AVAILABLE
      C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\system_info.py:1989: UserWarning:
          Blas (http://www.netlib.org/blas/) libraries not found.
          Directories to search for the libraries can be specified in the
          numpy/distutils/site.cfg file (section [blas]) or by setting
          the BLAS environment variable.
        if self._calc_info(blas):
      blas_src_info:
        NOT AVAILABLE
      C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\system_info.py:1989: UserWarning:
          Blas (http://www.netlib.org/blas/) sources not found.
          Directories to search for the sources can be specified in the
          numpy/distutils/site.cfg file (section [blas_src]) or by setting
          the BLAS_SRC environment variable.
        if self._calc_info(blas):
        NOT AVAILABLE
      non-existing path in 'numpy\\distutils': 'site.cfg'
      lapack_opt_info:
      lapack_mkl_info:
        libraries mkl_rt not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
        NOT AVAILABLE
      openblas_lapack_info:
        libraries openblas not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
        NOT AVAILABLE
      openblas_clapack_info:
        libraries openblas,lapack not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
        NOT AVAILABLE
      flame_info:
        libraries flame not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
        NOT AVAILABLE
      atlas_3_10_threads_info:
      Setting PTATLAS=ATLAS
        libraries lapack_atlas not found in D:\ac\envs\stable-diffusion\lib
        libraries tatlas,tatlas not found in D:\ac\envs\stable-diffusion\lib
        libraries lapack_atlas not found in C:\
        libraries tatlas,tatlas not found in C:\
        libraries lapack_atlas not found in D:\ac\envs\stable-diffusion\libs
        libraries tatlas,tatlas not found in D:\ac\envs\stable-diffusion\libs
        libraries lapack_atlas not found in D:\ac\Library\lib
        libraries tatlas,tatlas not found in D:\ac\Library\lib
      <class 'numpy.distutils.system_info.atlas_3_10_threads_info'>
        NOT AVAILABLE
      atlas_3_10_info:
        libraries lapack_atlas not found in D:\ac\envs\stable-diffusion\lib
        libraries satlas,satlas not found in D:\ac\envs\stable-diffusion\lib
        libraries lapack_atlas not found in C:\
        libraries satlas,satlas not found in C:\
        libraries lapack_atlas not found in D:\ac\envs\stable-diffusion\libs
        libraries satlas,satlas not found in D:\ac\envs\stable-diffusion\libs
        libraries lapack_atlas not found in D:\ac\Library\lib
        libraries satlas,satlas not found in D:\ac\Library\lib
      <class 'numpy.distutils.system_info.atlas_3_10_info'>
        NOT AVAILABLE
      atlas_threads_info:
      Setting PTATLAS=ATLAS
        libraries lapack_atlas not found in D:\ac\envs\stable-diffusion\lib
        libraries ptf77blas,ptcblas,atlas not found in D:\ac\envs\stable-diffusion\lib
        libraries lapack_atlas not found in C:\
        libraries ptf77blas,ptcblas,atlas not found in C:\
        libraries lapack_atlas not found in D:\ac\envs\stable-diffusion\libs
        libraries ptf77blas,ptcblas,atlas not found in D:\ac\envs\stable-diffusion\libs
        libraries lapack_atlas not found in D:\ac\Library\lib
        libraries ptf77blas,ptcblas,atlas not found in D:\ac\Library\lib
      <class 'numpy.distutils.system_info.atlas_threads_info'>
        NOT AVAILABLE
      atlas_info:
        libraries lapack_atlas not found in D:\ac\envs\stable-diffusion\lib
        libraries f77blas,cblas,atlas not found in D:\ac\envs\stable-diffusion\lib
        libraries lapack_atlas not found in C:\
        libraries f77blas,cblas,atlas not found in C:\
        libraries lapack_atlas not found in D:\ac\envs\stable-diffusion\libs
        libraries f77blas,cblas,atlas not found in D:\ac\envs\stable-diffusion\libs
        libraries lapack_atlas not found in D:\ac\Library\lib
        libraries f77blas,cblas,atlas not found in D:\ac\Library\lib
      <class 'numpy.distutils.system_info.atlas_info'>
        NOT AVAILABLE
      lapack_info:
        libraries lapack not found in ['D:\\ac\\envs\\stable-diffusion\\lib', 'C:\\', 'D:\\ac\\envs\\stable-diffusion\\libs', 'D:\\ac\\Library\\lib']
        NOT AVAILABLE
      C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\system_info.py:1849: UserWarning:
          Lapack (http://www.netlib.org/lapack/) libraries not found.
          Directories to search for the libraries can be specified in the
          numpy/distutils/site.cfg file (section [lapack]) or by setting
          the LAPACK environment variable.
        return getattr(self, '_calc_info_{}'.format(name))()
      lapack_src_info:
        NOT AVAILABLE
      C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\system_info.py:1849: UserWarning:
          Lapack (http://www.netlib.org/lapack/) sources not found.
          Directories to search for the sources can be specified in the
          numpy/distutils/site.cfg file (section [lapack_src]) or by setting
          the LAPACK_SRC environment variable.
        return getattr(self, '_calc_info_{}'.format(name))()
        NOT AVAILABLE
      numpy_linalg_lapack_lite:
        FOUND:
          language = c
          define_macros = [('HAVE_BLAS_ILP64', None), ('BLAS_SYMBOL_SUFFIX', '64_')]
      C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\_distutils\dist.py:275: UserWarning: Unknown distribution option: 'define_macros'
        warnings.warn(msg)
      running dist_info
      running build_src
      build_src
      building py_modules sources
      creating build
      creating build\src.win-amd64-3.10
      creating build\src.win-amd64-3.10\numpy
      creating build\src.win-amd64-3.10\numpy\distutils
      building library "npymath" sources
      Traceback (most recent call last):
        File "D:\ac\envs\stable-diffusion\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 389, in <module>
          main()
        File "D:\ac\envs\stable-diffusion\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 373, in main
          json_out["return_val"] = hook(**hook_input["kwargs"])
        File "D:\ac\envs\stable-diffusion\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 175, in prepare_metadata_for_build_wheel
          return hook(metadata_directory, config_settings)
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\build_meta.py", line 157, in prepare_metadata_for_build_wheel
          self.run_setup()
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\build_meta.py", line 248, in run_setup
          super(_BuildMetaLegacyBackend,
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\build_meta.py", line 142, in run_setup
          exec(compile(code, __file__, 'exec'), locals())
        File "setup.py", line 513, in <module>
          setup_package()
        File "setup.py", line 505, in setup_package
          setup(**metadata)
        File "C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\core.py", line 169, in setup
          return old_setup(**new_attr)
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\__init__.py", line 165, in setup
          return distutils.core.setup(**attrs)
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\_distutils\core.py", line 148, in setup
          dist.run_commands()
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 967, in run_commands
          self.run_command(cmd)
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 986, in run_command
          cmd_obj.run()
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\command\dist_info.py", line 31, in run
          egg_info.run()
        File "C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\command\egg_info.py", line 24, in run
          self.run_command("build_src")
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\_distutils\cmd.py", line 313, in run_command
          self.distribution.run_command(command)
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 986, in run_command
          cmd_obj.run()
        File "C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\command\build_src.py", line 144, in run
          self.build_sources()
        File "C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\command\build_src.py", line 155, in build_sources
          self.build_library_sources(*libname_info)
        File "C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\command\build_src.py", line 288, in build_library_sources
          sources = self.generate_sources(sources, (lib_name, build_info))
        File "C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\command\build_src.py", line 378, in generate_sources
          source = func(extension, build_dir)
        File "numpy\core\setup.py", line 671, in get_mathlib_info
          st = config_cmd.try_link('int main(void) { return 0;}')
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\_distutils\command\config.py", line 243, in try_link
          self._link(body, headers, include_dirs,
        File "C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\command\config.py", line 162, in _link
          return self._wrap_method(old_config._link, lang,
        File "C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\command\config.py", line 96, in _wrap_method
          ret = mth(*((self,)+args))
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\_distutils\command\config.py", line 137, in _link
          (src, obj) = self._compile(body, headers, include_dirs, lang)
        File "C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\command\config.py", line 105, in _compile
          src, obj = self._wrap_method(old_config._compile, lang,
        File "C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\command\config.py", line 96, in _wrap_method
          ret = mth(*((self,)+args))
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\_distutils\command\config.py", line 132, in _compile
          self.compiler.compile([src], include_dirs=include_dirs)
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\_distutils\_msvccompiler.py", line 401, in compile
          self.spawn(args)
        File "C:\Users\Administrator\AppData\Local\Temp\pip-build-env-v2l4bhja\overlay\Lib\site-packages\setuptools\_distutils\_msvccompiler.py", line 505, in spawn
          return super().spawn(cmd, env=env)
        File "C:\Users\Administrator\AppData\Local\Temp\pip-install-gor1kse0\numpy_92ade539c0ac480f82e2525154ceb2c0\numpy\distutils\ccompiler.py", line 90, in <lambda>
          m = lambda self, *args, **kw: func(self, *args, **kw)
      TypeError: CCompiler_spawn() got an unexpected keyword argument 'env'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> numpy

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
(stable-diffusion) G:\modelhub\yuliu11\stable-diffusion-webui>pip list
Package                   Version
------------------------- ------------
absl-py                   2.3.1
accelerate                1.12.0
addict                    2.4.0
aenum                     3.1.16
aiofiles                  23.2.1
aiohappyeyeballs          2.6.1
aiohttp                   3.13.2
aiosignal                 1.4.0
altair                    5.5.0
annotated-doc             0.0.4
annotated-types           0.7.0
antlr4-python3-runtime    4.9.3
anyio                     4.12.0
async-timeout             5.0.1
attrs                     25.4.0
basicsr                   1.4.2
blendmodes                2025
certifi                   2025.11.12
charset-normalizer        3.4.4
clean-fid                 0.1.35
click                     8.3.1
colorama                  0.4.6
contourpy                 1.3.2
cycler                    0.12.1
einops                    0.8.1
exceptiongroup            1.3.1
fastapi                   0.123.0
ffmpy                     1.0.0
filelock                  3.19.1
fonttools                 4.61.0
frozenlist                1.8.0
fsspec                    2025.9.0
ftfy                      6.3.1
future                    1.0.0
gitdb                     4.0.12
GitPython                 3.1.45
gradio                    3.41.2
gradio_client             0.5.0
grpcio                    1.76.0
h11                       0.16.0
hf-xet                    1.2.0
httpcore                  1.0.9
httpx                     0.28.1
huggingface-hub           0.36.0
idna                      3.11
ImageIO                   2.37.2
importlib_resources       6.5.2
inflection                0.5.1
Jinja2                    3.1.6
jsonmerge                 1.9.2
jsonschema                4.25.1
jsonschema-specifications 2025.9.1
kiwisolver                1.4.9
kornia                    0.8.2
kornia_rs                 0.1.10
lark                      1.3.1
lazy_loader               0.4
lightning-utilities       0.15.2
lmdb                      1.7.5
Markdown                  3.10
MarkupSafe                2.1.5
matplotlib                3.10.7
modelscope                1.32.0
mpmath                    1.3.0
multidict                 6.7.0
narwhals                  2.13.0
networkx                  3.3
numpy                     1.26.4
omegaconf                 2.3.0
open_clip_torch           3.2.0
opencv-python             4.12.0.88
orjson                    3.11.4
packaging                 25.0
pandas                    2.3.3
piexif                    1.1.3
pillow                    10.4.0
pip                       25.3
propcache                 0.4.1
protobuf                  6.33.1
psutil                    7.1.3
pydantic                  2.12.5
pydantic_core             2.41.5
pydub                     0.25.1
pyparsing                 3.2.5
python-dateutil           2.9.0.post0
python-multipart          0.0.20
pytorch-lightning         2.6.0
pytz                      2025.2
PyYAML                    6.0.3
referencing               0.37.0
regex                     2025.11.3
requests                  2.32.5
resize-right              0.0.2
rpds-py                   0.30.0
safetensors               0.7.0
scikit-image              0.25.2
scipy                     1.15.3
semantic-version          2.10.0
setuptools                80.9.0
shellingham               1.5.4
six                       1.17.0
smmap                     5.0.2
starlette                 0.50.0
sympy                     1.14.0
tensorboard               2.20.0
tensorboard-data-server   0.7.2
tifffile                  2025.5.10
timm                      1.0.22
tokenizers                0.13.3
tomesd                    0.1.3
torch                     2.9.1+cu128
torchdiffeq               0.2.5
torchmetrics              1.8.2
torchsde                  0.2.6
torchvision               0.24.1+cu128
tqdm                      4.67.1
trampoline                0.1.2
transformers              4.30.2
typer-slim                0.20.0
typing_extensions         4.15.0
typing-inspection         0.4.2
tzdata                    2025.2
urllib3                   2.5.0
uvicorn                   0.38.0
wcwidth                   0.2.14
websockets                11.0.3
Werkzeug                  3.1.4
wheel                     0.45.1
yarl                      1.22.0

(stable-diffusion) G:\modelhub\yuliu11\stable-diffusion-webui>
Analyze the following function:

# NB: excerpt from a larger Code_Aster (asrun) module; it assumes module-level
# imports (os, os.path as osp, re, glob.glob, warnings.warn) and helpers such
# as Runner, ufmt, _, check_limits, getdbgcmd, to_unicode, add_import_commands,
# update_cmd_memtps, on_64bits and confdir defined elsewhere.

def execute(reptrav, multiple=False, with_dbg=False, only_env=False,
            follow_output=True, fpara=None, facmtps=1., runner=None, **kargs):
    """Run a Code_Aster execution in 'reptrav'.
    Arguments :
        multiple : False if only one execution is run (so stop if it fails),
            True if several executions are run (don't stop when error occurs)
        with_dbg : start debugger or not,
        fpara : deprecated,
        follow_output : print output to follow the execution,
        kargs give "run, conf, prof, build" instances + exec name
    Return a tuple (diag, tcpu, tsys, ttot, validbase).
    """
    # 1. ----- initializations
    run = kargs['run']
    conf = kargs['conf']
    prof = kargs['prof']
    build = kargs['build']
    exetmp = kargs['exe']
    ctest = prof['parent'][0] == "astout"
    waf_inst = build.support('waf')
    waf_nosupv = build.support('nosuperv')
    waf_noresu = build.support('noresu')
    waf_orb = build.support('orbinitref')
    use_numthreads = build.support('use_numthreads')
    run.DBG("version supports: waf ({0}), nosuperv ({1}), "
            "orbinitref ({2}), numthreads ({3})"
            .format(waf_inst, waf_nosupv, waf_orb, use_numthreads))
    if not waf_inst:
        exetmp = osp.join('.', osp.basename(exetmp))
    tcpu = 0.
    tsys = 0.
    ttot = 0.
    validbase = True
    if runner is None:
        runner = Runner()
    runner.set_rep_trav(reptrav)
    interact = ('interact' in prof.args
                or prof.args.get('args', '').find('-interact') > -1)
    hide_command = ("hide-command" in prof.args
                    or prof.args.get('args', '').find('--hide-command') > -1)
    os.chdir(reptrav)

    # 2. ----- list of command files
    list_comm = glob('fort.1.*')
    list_comm.sort()
    if osp.exists('fort.1'):
        list_comm.insert(0, 'fort.1')
    if waf_nosupv:
        for fcomm in list_comm:
            add_import_commands(fcomm)

    # 3. ----- arguments list
    drep = {'REPOUT': 'rep_outils', 'REPMAT': 'rep_mat', 'REPDEX': 'rep_dex'}
    cmd = []
    if waf_nosupv:
        if interact:
            cmd.append('-i')
        cmd.append('fort.1')
    else:
        if waf_inst:
            cmd.append(osp.join(conf['SRCPY'][0], conf['ARGPYT'][0]))
        else:
            cmd.append(osp.join(conf['REPPY'][0], conf['ARGPYT'][0]))
        cmd.extend(conf['ARGEXE'])
        # warning: using --commandes will turn off backward compatibility
        cmd.append('-commandes')
        cmd.append('fort.1')
        # cmd.append(_fmtoption('command', 'fort.1'))
    # remove deprecated options
    long_opts_rm = ['rep', 'mem', 'mxmemdy', 'memory_stat', 'memjeveux_stat',
                    'type_alloc', 'taille', 'partition', 'origine',
                    'eficas_path']
    # for version < 12.6/13.2 that does not support --ORBInitRef=, ignore it
    if not waf_orb:
        long_opts_rm.append('ORBInitRef')
    cmd_memtps = {}
    for k, v in list(prof.args.items()):
        if k == 'args':
            cmd.append(prof.args[k])
        elif k in long_opts_rm:
            warn("this command line option is deprecated : --%s" % k,
                 DeprecationWarning, stacklevel=3)
        elif k in ('memjeveux', 'tpmax'):
            cmd_memtps[k] = v
        elif v.strip() == '' and k in list(drep.values()):
            run.Mess(_('empty value not allowed for "%s"') % k,
                     '<A>_INVALID_PARAMETER')
        else:
            cmd.append(_fmtoption(k, v))
    # add arguments to find the process (for as_actu/as_del)
    if not 'astout' in prof['actions'] and not 'distribution' in prof['actions']:
        cmd.append(_fmtoption('num_job', run['num_job']))
        cmd.append(_fmtoption('mode', prof['mode'][0]))
    # arguments which can be set in file 'config.txt'
    for kconf, karg in list(drep.items()):
        if conf[kconf][0] != '' and not karg in list(prof.args.keys()):
            cmd.append(_fmtoption(karg, conf[kconf][0]))
    ncpus = prof['ncpus'][0]
    try:
        ncpus = max(1, int(ncpus))
    except ValueError:
        ncpus = ''
    if use_numthreads:
        if ncpus == '':
            ncpus = max([run[prof['mode'][0] + '_nbpmax'] // 2, 1])
        cmd.append(_fmtoption('numthreads', ncpus))
    elif ncpus == '':
        ncpus = '1'

    # 4. ----- add parameters from prof
    if on_64bits():
        facW = 8
    else:
        facW = 4
    tps = 0
    memj = 0
    nbp = 0
    try:
        tps = int(float(prof.args['tpmax']))
    except KeyError:
        run.Mess(_('tpmax not provided in profile'), '<E>_INCORRECT_PARA')
    except ValueError:
        run.Mess(_('incorrect value for tpmax (%s) in profile')
                 % prof.args['tpmax'], '<E>_INCORRECT_PARA')
    try:
        memj = float(prof.args['memjeveux'])
    except KeyError:
        run.Mess(_('memjeveux not provided in profile'), '<E>_INCORRECT_PARA')
    except ValueError:
        run.Mess(_('incorrect value for memjeveux (%s) in profile')
                 % prof.args['memjeveux'], '<E>_INCORRECT_PARA')
    try:
        nbp = int(ncpus)
    except ValueError:
        run.Mess(_('incorrect value for ncpus (%s) in profile')
                 % prof['ncpus'][0], '<E>_INCORRECT_PARA')
    # 4.1. check for memory, time and procs limits
    run.Mess(_('Parameters : memory %d MB - time limit %d s')
             % (memj * facW, tps))
    check_limits(run, prof['mode'][0], tps, memj * facW, nbp,
                 runner.nbnode(), runner.nbcpu())
    # check for previous errors (parameters)
    if not multiple:
        run.CheckOK()
    elif run.GetGrav(run.diag) > run.GetGrav('<A>'):
        run.Mess(_('error in parameters : %s') % run.diag)
        return run.diag, tcpu, tsys, ttot, validbase

    # 5. ----- only environment, print command lines to execute
    if only_env:
        run.Mess(ufmt(_('Code_Aster environment prepared in %s'), reptrav), 'OK')
        run.Mess(_('To start execution copy/paste following lines in a ksh/bash shell :'))
        run.Mess(' cd %s' % reptrav, 'SILENT')
        run.Mess(' . %s' % osp.join(confdir, 'profile.sh'), 'SILENT')
        tmp_profile = "profile_tmp.sh"
        with open(tmp_profile, 'w') as f:
            f.write("""
export PYTHONPATH=$PYTHONPATH:.
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:.
""")
        # set per version environment
        for f in conf.get_with_absolute_path('ENV_SH') + [tmp_profile]:
            run.Mess(' . %s' % f, 'SILENT')
        cmd.insert(0, exetmp)
        # add memjeveux and tpmax
        cmd.extend([_fmtoption(k, v) for k, v in list(cmd_memtps.items())])
        cmdline = ' '.join(cmd)
        # get pdb.py path
        pdbpy_cmd = ("import os, sys ; "
                     "pdbpy = os.path.join(sys.prefix, 'lib', "
                     "'python' + sys.version[:3], 'pdb.py')")
        d = {}
        exec(pdbpy_cmd, d)
        pdbpy = d['pdbpy']
        if runner.really():  # XXX and if not ? perhaps because of exit_code
            cmdline = runner.get_exec_command(
                cmdline, add_tee=False,
                env=conf.get_with_absolute_path('ENV_SH'))
        # print command lines
        k = 0
        for fcomm in list_comm:
            k += 1
            run.Mess(_("Command line %d :") % k)
            run.Mess("cp %s fort.1" % fcomm, 'SILENT')
            run.Mess(cmdline, 'SILENT')
            # how to start the Python debugger
            if not runner.really():
                run.Mess(_('To start execution in the Python debugger '
                           'you could type :'), 'SILENT')
                pdb_cmd = cmdline.replace(
                    exetmp, '%s %s' % (exetmp, ' '.join(pdbpy.splitlines())))
                run.Mess("cp %s fort.1" % fcomm, 'SILENT')
                run.Mess(pdb_cmd, 'SILENT')
        diag = 'OK'

    # 6. ----- really execute
    else:
        # 6.1. content of reptrav
        if not ctest:
            run.Mess(ufmt(_('Content of %s before execution'), reptrav), 'TITLE')
            out = run.Shell(cmd='ls -la')[1]
            print(out)
        if len(list_comm) == 0:
            run.Mess(_('no .comm file found'), '<E>_NO_COMM_FILE')
        # check for previous errors (copy datas)
        if not multiple:
            run.CheckOK()
        elif run.GetGrav(run.diag) > run.GetGrav('<A>'):
            run.Mess(_('error occurs during preparing data : %s') % run.diag)
            return run.diag, tcpu, tsys, ttot, validbase

        # 6.2. complete command line
        cmd.append('--suivi_batch')
        add_tee = False
        if with_dbg:
            # how to run the debugger
            cmd_dbg = run.config.get('cmd_dbg', '')
            if not cmd_dbg:
                run.Mess(_('command line to run the debugger not defined'),
                         '<F>_DEBUG_ERROR')
            if cmd_dbg.find('gdb') > -1:
                ldbg = ['break main', ]
            else:
                ldbg = ['stop in main', ]
            # add memjeveux and tpmax
            update_cmd_memtps(cmd_memtps)
            cmd.extend([_fmtoption(k, v) for k, v in list(cmd_memtps.items())])
            pos_memtps = -1
            cmd_args = ' '.join(cmd)
            ldbg.append('run ' + cmd_args)
            cmd = getdbgcmd(cmd_dbg, exetmp, '', ldbg, cmd_args)
        else:
            add_tee = True
            cmd.insert(0, exetmp)
            # position where insert memjeveux+tpmax
            pos_memtps = len(cmd)
        # keep for compatibility with version < 13.1
        os.environ['OMP_NUM_THREADS'] = str(ncpus)
        # unlimit coredump size
        try:
            corefilesize = int(prof['corefilesize'][0]) * 1024 * 1024
        except ValueError:
            corefilesize = 'unlimited'
        run.SetLimit('core', corefilesize)

        # 6.3. if multiple .comm files, keep previous bases
        if len(list_comm) > 1:
            run.Mess(_('%d command files') % len(list_comm))
            validbase = False
            BASE_PREC = osp.join(reptrav, 'BASE_PREC')
            run.MkDir(BASE_PREC)

        # 6.4. for each .comm file
        diag = '?'
        diag_ok = None
        k = 0
        for fcomm in list_comm:
            k += 1
            os.chdir(runner.reptrav())
            run.Copy('fort.1', fcomm)
            # start execution
            tit = _('Code_Aster run')
            run.timer.Start(tit)
            # add memjeveux and tpmax at the right position
            if os.name != 'nt' or not add_tee:
                cmd_i = cmd[:]
            else:
                print("cmd", cmd)
                cmd_i = [cm.replace(" ", "\" \"")
                         if not cm.startswith("--max_") else cm
                         for cm in cmd]
            update_cmd_memtps(cmd_memtps)
            if pos_memtps > -1:
                for key, value in list(cmd_memtps.items()):
                    cmd_i.insert(pos_memtps, _fmtoption(key, value))
            if True or not ctest:
                run.Mess(tit, 'TITLE')
                run.Mess(_('Command line %d :') % k)
                if not run['verbose']:
                    run.Mess(' '.join(cmd_i))
                if waf_nosupv and not hide_command:
                    dash = "# " + "-" * 90
                    with open('fort.1', 'rb') as f:
                        content = [_("Content of the file to execute"), dash,
                                   to_unicode(f.read()), dash]
                    run.Mess(os.linesep.join(content))
            cmd_exec = runner.get_exec_command(
                ' '.join(cmd_i), add_tee=add_tee,
                env=conf.get_with_absolute_path('ENV_SH'))
            # go
            iret, exec_output = run.Shell(cmd_exec, follow_output=follow_output,
                                          interact=interact)
            if iret != 0:
                cats = ['fort.6']
                if not waf_noresu:
                    cats.extend(['fort.8', 'fort.9'])
                for f in cats:
                    run.FileCat(text="""\n <I>_EXIT_CODE = %s""" % iret, dest=f)
            if not follow_output and not ctest:
                print(exec_output)
            # mpirun does not include cpu/sys time of childrens, add it in timer
            if exec_output:
                runner.add_to_timer(exec_output, tit)
            run.timer.Stop(tit)
            if k < len(list_comm):
                for b in glob('vola.*') + glob('loca.*'):
                    run.Delete(b, remove_dirs=False)
            if len(list_comm) > 1:
                ldiag = build.getDiag(cas_test=ctest)
                diag_k = ldiag[0]
                tcpu += ldiag[1]
                tsys += ldiag[2]
                ttot += ldiag[3]
                run.FileCat('fort.6', 'fort_bis.6')
                run.Delete('fort.6', remove_dirs=False)
                if not waf_noresu:
                    run.FileCat('fort.8', 'fort_bis.8')
                    run.Delete('fort.8', remove_dirs=False)
                    run.FileCat('fort.9', 'fort_bis.9')
                    run.Delete('fort.9', remove_dirs=False)
                if re.search('<[ESF]{1}>', diag_k):
                    # switch <F> to <E> if multiple .comm
                    if diag_k.find('<F>') > -1:
                        diag_k = diag_k.replace('<F>', '<E>')
                    # ...and try to restore previous bases
                    run.Mess(ufmt(_('restore bases from %s'), BASE_PREC))
                    lbas = glob(osp.join(BASE_PREC, 'glob.*')) + \
                           glob(osp.join(BASE_PREC, 'bhdf.*')) + \
                           glob(osp.join(BASE_PREC, 'pick.*'))
                    if len(lbas) > 0:
                        run.Copy(os.getcwd(), niverr='INFO',
                                 verbose=follow_output, *lbas)
                    else:
                        run.Mess(_('no glob/bhdf base to restore'), '<A>_ALARM')
                    run.Mess(_('execution aborted (comm file #%d)') % k, diag_k)
                    diag = diag_k
                    break
                else:
                    # save bases in BASE_PREC if next execution fails
                    validbase = True
                    if k < len(list_comm):
                        if not ctest:
                            run.Mess(ufmt(_('save bases into %s'), BASE_PREC))
                        lbas = glob('glob.*') + glob('bhdf.*') + glob('pick.*')
                        run.Copy(BASE_PREC, niverr='INFO',
                                 verbose=follow_output, *lbas)
                    run.Mess(_('execution ended (comm file #%d)') % k, diag_k)
                # at least one is ok/alarm ? keep the "worse good" status!
                if run.GetGrav(diag_k) in (0, 1):
                    diag_ok = diag_ok or 'OK'
                    if run.GetGrav(diag_ok) < run.GetGrav(diag_k):
                        diag_ok = diag_k
                # the worst diagnostic
                if run.GetGrav(diag) < run.GetGrav(diag_k):
                    diag = diag_k

        # 6.5. global diagnostic
        if len(list_comm) > 1:
            run.Rename('fort_bis.6', 'fort.6')
            run.Rename('fort_bis.8', 'fort.8')
            run.Rename('fort_bis.9', 'fort.9')
        else:
            diag, tcpu, tsys, ttot = build.getDiag(cas_test=ctest)[:4]
            validbase = run.GetGrav(diag) <= run.GetGrav('<S>')
            if ctest and run.GetGrav(diag) < 0:
                diag = '<F>_' + diag
        if ctest and diag == 'NO_TEST_RESU' and diag_ok:
            diag = diag_ok
            run.ReinitDiag(diag)
        # expected diagnostic ?
        if prof['expected_diag'][0]:
            expect = prof['expected_diag'][0]
            if run.GetGrav(diag) >= run.GetGrav('<E>'):
                diag = '<F>_ERROR'
            if run.GetGrav(diag) == run.GetGrav(expect):
                run.Mess(_('Diagnostic is as expected.'))
                diag = 'OK'
            else:
                run.Mess(_("Diagnostic is not as expected (got '%s').") % diag)
                diag = 'NOOK_TEST_RESU'
            run.ReinitDiag(diag)
        run.Mess(_('Code_Aster run ended, diagnostic : %s') % diag)

        # 6.6. post-mortem analysis of the core file
        if not with_dbg:
            cmd_dbg = run.config.get('cmd_post', '')
            lcor = glob('core*')
            if cmd_dbg and lcor:
                run.Mess(_('Code_Aster run created a coredump'), '<E>_CORE_FILE')
                if not multiple:
                    # take the first one if several core files
                    core = lcor[0]
                    run.Mess(ufmt(_('core file name : %s'), core))
                    cmd = getdbgcmd(cmd_dbg, exetmp, core, ('where', 'quit'), '')
                    tit = _('Coredump analysis')
                    run.Mess(tit, 'TITLE')
                    run.timer.Start(tit)
                    iret, output = run.Shell(' '.join(cmd),
                                             alt_comment='coredump analysis...',
                                             verbose=True)
                    if iret == 0 and not ctest:
                        print(output)
                    run.timer.Stop(tit)
        if not ctest:
            # 6.7. content of reptrav
            run.Mess(ufmt(_('Content of %s after execution'), os.getcwd()),
                     'TITLE')
            out = run.Shell(cmd='ls -la . REPE_OUT')[1]
            print(out)
            # 6.8. print some informations
            run.Mess(_('Size of bases'), 'TITLE')
            lf = glob('vola.*')
            lf.extend(glob('loca.*'))
            lf.extend(glob('glob.*'))
            lf.extend(glob('bhdf.*'))
            lf.extend(glob('pick.*'))
            for f in lf:
                run.Mess(_('size of %s : %12d bytes') % (f, os.stat(f).st_size))
    return diag, tcpu, tsys, ttot, validbase


def _fmtoption(key, value=None):
    """Format an option"""
    key = key.lstrip('-')
    if value is None or (type(value) is str and not value.strip()):
        fmt = '--{0}'.format(key)
    else:
        fmt = '--{0}={1}'.format(key, value)
    return fmt
```
> hvigor ERROR: Failed :entry:default@CompileArkTS...
> hvigor ERROR: ArkTS Compiler Error
1 WARN: ArkTS:WARN File: D:/Deveco/HM/Day3/entry/src/main/ets/cateability/CateAbility.ets:10:5 "globalThis" is not supported (arkts-no-globalthis)
2 WARN: ArkTS:WARN File: D:/Deveco/HM/Day3/entry/src/main/ets/goodability/GoodAbility.ets:10:5 "globalThis" is not supported (arkts-no-globalthis)
3 WARN: ArkTS:WARN File: D:/Deveco/HM/Day3/entry/src/main/ets/cateability/CatePage.ets:10:29 "globalThis" is not supported (arkts-no-globalthis)
4 WARN: ArkTS:WARN File: D:/Deveco/HM/Day3/entry/src/main/ets/pages/LoginPage.ets:11:25 "globalThis" is not supported (arkts-no-globalthis)
5 WARN: ArkTS:WARN File: D:/Deveco/HM/Day3/entry/src/main/ets/pages/LoginPage.ets:12:24 "globalThis" is not supported (arkts-no-globalthis)
6 WARN: ArkTS:WARN: For details about ArkTS syntax errors, see FAQs
7 WARN: ArkTS:WARN File: D:/Deveco/HM/Day3/entry/src/main/ets/pages/MinePage.ets:91:20 'SetOrCreate' has been deprecated.
8 WARN: ArkTS:WARN File: D:/Deveco/HM/Day3/entry/src/main/ets/pages/MinePage.ets:94:20 'SetOrCreate' has been deprecated.
9 WARN: ArkTS:WARN File: D:/Deveco/HM/Day3/entry/src/main/ets/goodability/DetailsPage.ets:264:20 'SetOrCreate' has been deprecated.
10 WARN: ArkTS:WARN File: D:/Deveco/HM/Day3/entry/src/main/ets/pages/LoginPage.ets:67:22 'SetOrCreate' has been deprecated.
1 ERROR: ArkTS:ERROR File: D:/Deveco/HM/Day3/entry/src/main/ets/db/SearchPage.ets:40:13 The component 'List' can only have the child component ListItem and Section and ListItemGroup.
COMPILE RESULT:FAIL {ERROR:2 WARN:10}
> hvigor ERROR: BUILD FAILED in 2 s 420 ms
Process finished with exit code -1
```
### ArkTS Compile Error Solutions

In HarmonyOS application development, the ArkTS compiler (a stricter subset of TypeScript) rejects several patterns that plain TypeScript allows. Below are fixes for the three diagnostics in the log above: `"globalThis" is not supported`, `'SetOrCreate' has been deprecated`, and the `List` child-component error.

#### `"globalThis" is not supported (arkts-no-globalthis)`

ArkTS has no dynamically typed global object: properties attached to one cannot be statically type-checked, so `globalThis` is rejected by design rather than merely "not yet supported". Share state by passing it explicitly, or keep it in the framework's application-level storage, `AppStorage`:

```typescript
// Rejected by the compiler (arkts-no-globalthis)
// const myVar = globalThis.myVar;

// Use application-level storage instead
AppStorage.setOrCreate<number>('myVar', 42);
const myVar: number | undefined = AppStorage.get<number>('myVar');
```

Note that the `(global as any)` / `(window as any)` workarounds familiar from web TypeScript do not exist in the ArkTS runtime, and `as any` itself violates the `arkts-no-any-unknown` rule.

#### `'SetOrCreate' has been deprecated`

This warning points at the PascalCase `AppStorage.SetOrCreate` API. Newer HarmonyOS SDKs deprecate the PascalCase `AppStorage` methods in favor of camelCase equivalents, so the direct fix is to call `AppStorage.setOrCreate(key, value)` instead. For state that belongs to a single component, prefer the decorator-based state management (`@State`, `@Prop`, `@Link`) over any global storage:

```typescript
@Component
struct MyComponent {
  @State message: string = "Initial Message"

  build() {
    Column() {
      Text(this.message)
        .fontSize(30)
      Button('Update')
        .onClick(() => {
          this.message = "Updated Message"
        })
    }
    .width('100%')
    .height('100%')
  }
}
```

The `@State` decorator makes `message` reactive: reassigning it re-renders the `Text`, with no deprecated API involved.

#### `List` child-component restriction

As the error message states, a `List` may only contain `ListItem`, `ListItemGroup`, or `Section` as direct children. The following therefore fails to compile:

```typescript
List() {
  Text("Item 1") // error: Text is not a valid direct child of List
}
```

Wrap each item in a `ListItem`:

```typescript
List() {
  ListItem() {
    Text("Item 1")
      .fontSize(24)
  }
}
```

This satisfies the structural constraint the ArkUI framework places on `List` and lets rendering proceed normally.

---
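For values that do not fit in `AppStorage` (for example, non-serializable objects), a common replacement for `globalThis` is a process-wide singleton holder, a pattern along the lines of the one in HarmonyOS's migration guidance. This is an illustrative sketch, not a fixed API — the class and method names are the author's choice:

```typescript
// GlobalContext.ets (illustrative): a typed singleton replacing globalThis
export class GlobalContext {
  private static instance: GlobalContext;
  private objects = new Map<string, Object>();

  private constructor() {}

  // Always returns the same instance for the whole process
  static getContext(): GlobalContext {
    if (!GlobalContext.instance) {
      GlobalContext.instance = new GlobalContext();
    }
    return GlobalContext.instance;
  }

  getObject(key: string): Object | undefined {
    return this.objects.get(key);
  }

  setObject(key: string, value: Object): void {
    this.objects.set(key, value);
  }
}
```

Usage from any file is then `GlobalContext.getContext().setObject('token', 'abc123')` and `GlobalContext.getContext().getObject('token')`, which the ArkTS compiler can fully type-check.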