About "Rsync"

From Wikipedia, the free encyclopedia

 

rsync is a software application for Unix-like and Windows systems which synchronizes files and directories from one location to another while minimizing data transfer by using delta encoding when appropriate. An important feature of rsync not found in most similar programs/protocols is that the mirroring takes place with only one transmission in each direction. rsync can copy or display directory contents and copy files, optionally using compression and recursion.
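The delta-encoding idea can be illustrated with a toy sketch. This is not rsync's actual implementation: real rsync uses a cheap rolling weak checksum plus a strong checksum so candidate matches can be probed at every byte offset without rehashing, and it picks its block size adaptively. `BLOCK`, `delta`, and `apply_delta` below are illustrative names only.

```python
import hashlib

BLOCK = 4  # toy block size; real rsync uses much larger, adaptively chosen blocks

def block_sums(data):
    """Checksum each fixed-size block of the receiver's existing copy."""
    return {hashlib.md5(data[i:i + BLOCK]).hexdigest(): i
            for i in range(0, len(data), BLOCK)}

def delta(old, new):
    """Sender side: emit ('copy', offset) for blocks the receiver already
    has, and ('data', bytes) for literal bytes that must cross the wire."""
    sums = block_sums(old)
    out, i, literal = [], 0, b""
    while i < len(new):
        h = hashlib.md5(new[i:i + BLOCK]).hexdigest()
        if h in sums:
            if literal:                      # flush pending literal bytes
                out.append(("data", literal))
                literal = b""
            out.append(("copy", sums[h]))    # receiver copies from its old file
            i += BLOCK
        else:
            literal += new[i:i + 1]          # no match at this offset; slide by one
            i += 1
    if literal:
        out.append(("data", literal))
    return out

def apply_delta(old, instructions):
    """Receiver side: rebuild the new file from its old copy plus the delta."""
    parts = []
    for op, arg in instructions:
        parts.append(old[arg:arg + BLOCK] if op == "copy" else arg)
    return b"".join(parts)
```

For `old = b"abcdefgh"` and `new = b"abcdXYefgh"`, only the two inserted bytes travel as literal data; the two matching blocks are sent as copy references.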

In daemon mode, rsync listens on TCP port 873 by default, serving files in the native rsync protocol; it can also run over a remote shell such as RSH or SSH. In the latter case, the rsync client executable must be installed on both the local and the remote host.
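rsync chooses between these transports from the syntax of the file spec: a double colon (or an `rsync://` URL) selects the daemon, a single colon selects a remote shell. A minimal sketch of that classification (simplified: it ignores edge cases such as Windows drive letters and rsync's rule that a colon after a slash does not count; `rsync_transport` is a hypothetical helper, not part of rsync):

```python
def rsync_transport(spec):
    """Roughly classify an rsync source/destination spec:
    'host::module/path' or 'rsync://host/module' -> rsync daemon (TCP 873),
    'user@host:path'                             -> remote shell (ssh/rsh),
    anything without a colon                     -> local path."""
    if spec.startswith('rsync://') or '::' in spec:
        return 'daemon'
    if ':' in spec:
        return 'remote-shell'
    return 'local'
```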

 

 


 

Source: http://en.wikipedia.org/wiki/Rsync

# Copyright (C) 2017 Mediatek
# Author: Richard Sun
# Some code and influence taken from externalsrc.bbclass:
# Copyright (C) 2012 Linux Foundation
# Author: Richard Purdie
# Some code and influence taken from srctree.bbclass:
# Copyright (C) 2009 Chris Larson <clarson@kergoth.com>
# Released under the MIT license (see COPYING.MIT for the terms)
#
# workonsrc.bbclass enables use of an existing source tree, usually external to
# the build system, to build a piece of software rather than the usual
# fetch/unpack/patch process.
#
# To use, add workonsrc to the global inherit and set WORKONSRC to point at the
# directory containing the sources you want to use, e.g. from local.conf for a
# recipe called "myrecipe" you would do:
#
# INHERIT += "workonsrc"
# WORKONSRC_pn-myrecipe = "/path/to/my/source/tree"
#
# In order to make this class work for both target and native versions (or with
# multilibs/cross or other BBCLASSEXTEND variants), B is set to point to a separate
# directory under the work directory (split source and build directories). This is
# the default, but the build directory can be set to the source directory if
# circumstances dictate by setting WORKONSRC_BUILD to the same value, e.g.:
#
# WORKONSRC_BUILD_pn-myrecipe = "/path/to/my/source/tree"

SRCTREECOVEREDTASKS ?= "do_patch do_unpack do_fetch"
EXTERNALSRC_SYMLINKS ?= "oe-workdir:${WORKDIR} oe-logs:${T}"

python () {
    import subprocess, os.path

    depends = d.getVar("DEPENDS")
    depends = "%s rsync-native" % depends
    d.setVar("DEPENDS", depends)

    pn = d.getVar('PN')
    d.appendVarFlag('do_populate_lic', 'depends', ' %s:do_configure' % pn)

    workonsrc = d.getVar('WORKONSRC')
    workonsrcbuild = d.getVar('WORKONSRC_BUILD')
    if workonsrc and not workonsrc.startswith("/"):
        bb.error("WORKONSRC must be an absolute path")
    if workonsrcbuild and not workonsrcbuild.startswith("/"):
        bb.error("WORKONSRC_BUILD must be an absolute path")

    # Fall back to a prebuilt tree if the source tree is absent. Guard against
    # WORKONSRC being unset, since this class may be inherited globally.
    if workonsrc:
        workonprebuilt = workonsrc.replace("../src/", "../prebuilt/")
        if not os.path.exists(workonsrc):
            if os.path.exists(workonprebuilt):
                workonsrc = workonprebuilt
            else:
                bb.note("Neither %s nor %s exists" % (workonsrc, workonprebuilt))

    # If this is the base recipe and WORKONSRC is set for it or any of its
    # derivatives, then enable BB_DONT_CACHE to force the recipe to always be
    # re-parsed so that the file-checksums function for do_compile is run every
    # time.
    bpn = d.getVar('BPN')
    classextend = (d.getVar('BBCLASSEXTEND') or '').split()
    if bpn == d.getVar('PN') or not classextend:
        if (workonsrc or
                ('native' in classextend and d.getVar('WORKONSRC_pn-%s-native' % bpn)) or
                ('nativesdk' in classextend and d.getVar('WORKONSRC_pn-nativesdk-%s' % bpn)) or
                ('cross' in classextend and d.getVar('WORKONSRC_pn-%s-cross' % bpn))):
            d.setVar('BB_DONT_CACHE', '1')

    if workonsrc:
        import oe.recipeutils
        import oe.path

        d.setVar('S', workonsrc)

        if workonsrcbuild:
            d.setVar('B', workonsrcbuild)
        else:
            d.setVar('B', '${WORKDIR}/${BPN}-${PV}/')
            workonsrcbuild = d.getVar('B')
        if workonsrc != workonsrcbuild:
            d.setVar('S', workonsrcbuild)

        local_srcuri = []
        fetch = bb.fetch2.Fetch((d.getVar('SRC_URI') or '').split(), d)
        for url in fetch.urls:
            url_data = fetch.ud[url]
            parm = url_data.parm
            if url_data.type == 'file' or ('type' in parm and parm['type'] == 'kmeta'):
                local_srcuri.append(url)

        d.setVar('SRC_URI', ' '.join(local_srcuri))

        if '{SRCPV}' in d.getVar('PV', False):
            # Dummy value because the default function can't be called with blank SRC_URI
            d.setVar('SRCPV', '999')

        if d.getVar('CONFIGUREOPT_DEPTRACK') == '--disable-dependency-tracking':
            d.setVar('CONFIGUREOPT_DEPTRACK', '')

        tasks = filter(lambda k: d.getVarFlag(k, "task"), d.keys())

        for task in tasks:
            if task.endswith("_setscene"):
                # sstate is never going to work for workon source trees, disable it
                bb.build.deltask(task, d)
            else:
                # Since configure will likely touch ${S}, ensure only we lock so one task has access at a time
                d.appendVarFlag(task, "lockfiles", " ${S}/singletask.lock")

            # We do not want our source to be wiped out, ever (kernel.bbclass does this for do_clean)
            cleandirs = oe.recipeutils.split_var_value(d.getVarFlag(task, 'cleandirs', False) or '')
            setvalue = False
            for cleandir in cleandirs[:]:
                if oe.path.is_path_parent(workonsrc, d.expand(cleandir)):
                    cleandirs.remove(cleandir)
                    setvalue = True
            if setvalue:
                d.setVarFlag(task, 'cleandirs', ' '.join(cleandirs))

        fetch_tasks = ['do_fetch', 'do_unpack']
        # If we deltask do_patch, there's no dependency to ensure do_unpack gets run, so add one
        # Note that we cannot use d.appendVarFlag() here because deps is expected to be a list object, not a string
        d.setVarFlag('do_configure', 'deps', (d.getVarFlag('do_configure', 'deps', False) or []) + ['do_unpack'])

        for task in d.getVar("SRCTREECOVEREDTASKS").split():
            if local_srcuri and task in fetch_tasks:
                continue
            bb.build.deltask(task, d)

        d.prependVarFlag('do_compile', 'prefuncs', "workonsrc_compile_prefunc ")
        d.prependVarFlag('do_configure', 'prefuncs', "workonsrc_configure_prefunc ")

        d.setVarFlag('do_compile', 'file-checksums', '${@srctree_hash_files_(d)}')
        d.setVarFlag('do_configure', 'file-checksums', '${@srctree_configure_hash_files_(d)}')

        # We don't want the workdir to go away
        d.appendVar('RM_WORK_EXCLUDE', ' ' + d.getVar('PN'))

        bb.build.addtask('do_buildclean',
                         'do_clean' if d.getVar('S') == d.getVar('B') else None,
                         None, d)

        # If B=S the same builddir is used even for different architectures.
        # Thus, use a shared CONFIGURESTAMPFILE and STAMP directory so that
        # change of do_configure task hash is correctly detected and stamps are
        # invalidated if e.g. MACHINE changes.
        if d.getVar('S') == d.getVar('B'):
            configstamp = '${TMPDIR}/work-shared/${PN}/${EXTENDPE}${PV}-${PR}/configure.sstate'
            d.setVar('CONFIGURESTAMPFILE', configstamp)
            d.setVar('STAMP', '${STAMPS_DIR}/work-shared/${PN}/${EXTENDPE}${PV}-${PR}')
            d.setVar('STAMPCLEAN', '${STAMPS_DIR}/work-shared/${PN}/*-*')
}

python workonsrc_configure_prefunc() {
    srctree_rsync_files(d)
    s_dir = d.getVar('S')
    # Create desired symlinks
    symlinks = (d.getVar('EXTERNALSRC_SYMLINKS') or '').split()
    newlinks = []
    for symlink in symlinks:
        symsplit = symlink.split(':', 1)
        lnkfile = os.path.join(s_dir, symsplit[0])
        target = d.expand(symsplit[1])
        if len(symsplit) > 1:
            if os.path.islink(lnkfile):
                # Link already exists, leave it if it points to the right location already
                if os.readlink(lnkfile) == target:
                    continue
                os.unlink(lnkfile)
            elif os.path.exists(lnkfile):
                # File/dir exists with same name as link, just leave it alone
                continue
            os.symlink(target, lnkfile)
            newlinks.append(symsplit[0])
    # Hide the symlinks from git
    try:
        git_exclude_file = os.path.join(s_dir, '.git/info/exclude')
        if os.path.exists(git_exclude_file):
            with open(git_exclude_file, 'r+') as efile:
                elines = efile.readlines()
                for link in newlinks:
                    if link in elines or '/' + link in elines:
                        continue
                    efile.write('/' + link + '\n')
    except IOError:
        bb.note('Failed to hide EXTERNALSRC_SYMLINKS from git')
}

python workonsrc_compile_prefunc() {
    srctree_rsync_files(d)
    # Make it obvious that this is happening, since forgetting about it could lead to much confusion
    bb.plain('NOTE: %s: compiling from workon source tree %s' % (d.getVar('PN'), d.getVar('WORKONSRC')))
}

do_buildclean[dirs] = "${S} ${B}"
do_buildclean[nostamp] = "1"
do_buildclean[doc] = "Call 'make clean' or equivalent in ${B}"
workonsrc_do_buildclean() {
    if [ -e Makefile -o -e makefile -o -e GNUmakefile ]; then
        rm -f ${@' '.join([x.split(':')[0] for x in (d.getVar('EXTERNALSRC_SYMLINKS') or '').split()])}
        if [ "${CLEANBROKEN}" != "1" ]; then
            oe_runmake clean || die "make failed"
        fi
    else
        bbnote "nothing to do - no makefile found"
    fi
}

def srctree_rsync_files(d):
    import subprocess, os.path

    workonsrc = d.getVar('WORKONSRC')
    workonprebuilt = workonsrc.replace("../src/", "../prebuilt/")
    if not os.path.exists(workonsrc):
        if os.path.exists(workonprebuilt):
            workonsrc = workonprebuilt
        else:
            bb.note("Neither %s nor %s exists" % (workonsrc, workonprebuilt))

    if workonsrc:
        d.setVar('S', workonsrc)
        workonsrcbuild = d.getVar('WORKONSRC_BUILD')
        if workonsrcbuild:
            d.setVar('B', workonsrcbuild)
        else:
            d.setVar('B', '${WORKDIR}/${BPN}-${PV}/')
            workonsrcbuild = d.getVar('B')
        if workonsrc != workonsrcbuild:
            cmd = "mkdir -p %s" % workonsrcbuild
            subprocess.call(cmd, shell=True)
            if os.path.exists(workonsrc):
                workonsrc_rsync_appended_flag = d.getVar('WORKONSRC_RSYNC_APPENDED_FLAG')
                if workonsrc_rsync_appended_flag is None:
                    workonsrc_rsync_appended_flag = ""
                cmd = "rsync -aL %s %s/* %s" % (workonsrc_rsync_appended_flag, workonsrc, workonsrcbuild)
                subprocess.call(cmd, shell=True)
            d.setVar('S', workonsrcbuild)

def srctree_hash_files_(d, srcdir=None):
    import shutil
    import subprocess
    import tempfile

    s_dir = srcdir or d.getVar('WORKONSRC')
    git_dir = None

    try:
        git_dir = os.path.join(s_dir,
            subprocess.check_output(['git', '-C', s_dir, 'rev-parse', '--git-dir'],
                                    stderr=subprocess.DEVNULL).decode("utf-8").rstrip())
    except subprocess.CalledProcessError:
        pass

    if git_dir is not None:
        oe_hash_file = os.path.join(git_dir, 'oe-devtool-tree-sha1')
        with tempfile.NamedTemporaryFile(prefix='oe-devtool-index') as tmp_index:
            # Clone the index so hashing never touches the real one
            shutil.copyfile(os.path.join(git_dir, 'index'), tmp_index.name)
            # Update our custom index
            env = os.environ.copy()
            env['GIT_INDEX_FILE'] = tmp_index.name
            subprocess.check_output(['git', 'add', '-A', '.'], cwd=s_dir, env=env)
            sha1 = subprocess.check_output(['git', 'write-tree'], cwd=s_dir, env=env).decode("utf-8")
        with open(oe_hash_file, 'w') as fobj:
            fobj.write(sha1)
        ret = oe_hash_file + ':True'
    else:
        ret = s_dir + '/*:True'
    return ret

def srctree_configure_hash_files_(d):
    """
    Get the list of files that should trigger do_configure to re-execute,
    based on the value of CONFIGURE_FILES
    """
    in_files = (d.getVar('CONFIGURE_FILES') or '').split()
    out_items = []
    search_files = []
    for entry in in_files:
        if entry.startswith('/'):
            out_items.append('%s:%s' % (entry, os.path.exists(entry)))
        else:
            search_files.append(entry)
    if search_files:
        s_dir = d.getVar('WORKONSRC')
        for root, _, files in os.walk(s_dir):
            for f in files:
                if f in search_files:
                    out_items.append('%s:True' % os.path.join(root, f))
    return ' '.join(out_items)

EXPORT_FUNCTIONS do_buildclean
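As the class's header comments note, it is driven entirely from configuration. A minimal sketch for a hypothetical recipe named `myrecipe` (the recipe name and paths are placeholders; `WORKONSRC_RSYNC_APPENDED_FLAG` is the variable read by `srctree_rsync_files` above to extend the rsync that seeds the build directory):

```conf
INHERIT += "workonsrc"
WORKONSRC_pn-myrecipe = "/path/to/my/source/tree"

# Optional: build inside the source tree instead of a separate ${B}
WORKONSRC_BUILD_pn-myrecipe = "/path/to/my/source/tree"

# Optional: extra flags appended to the rsync invocation that copies
# ${WORKONSRC} into the build directory
WORKONSRC_RSYNC_APPENDED_FLAG_pn-myrecipe = "--exclude=.git"
```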