About "Rsync"

From Wikipedia, the free encyclopedia

 

rsync is a software application for Unix and Windows systems which synchronizes files and directories from one location to another while minimizing data transfer by using delta encoding when appropriate. An important feature of rsync not found in most similar programs/protocols is that the mirroring takes place with only one transmission in each direction. rsync can copy or display directory contents and copy files, optionally using compression and recursion.
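As a quick illustration of typical usage (a minimal sketch, not drawn from the article; the paths are hypothetical), the following Python snippet shells out to rsync to mirror a directory with archive mode and compression:

```python
import subprocess

# Hypothetical source and destination; a trailing slash on the source means
# "copy the directory's contents" rather than the directory itself.
src = "/srv/data/project/"
dst = "backup-host:/srv/mirror/project/"

# -a        archive mode: recursion plus permissions, timestamps, symlinks, etc.
# -z        compress file data during the transfer
# --delete  remove destination files that no longer exist in the source
subprocess.check_call(["rsync", "-az", "--delete", src, dst])
```

On an unchanged tree a re-run transfers almost nothing, since the delta algorithm only sends the pieces that differ.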

In daemon mode, rsync listens on TCP port 873 by default and serves files in its native rsync protocol; transfers can also run over a remote shell such as RSH or SSH. In the latter case, the rsync client executable must be installed on both the local and the remote host.
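To make the two transports concrete, here is a hedged sketch (host, user and module names are made up) contrasting a pull over SSH with a pull from an rsync daemon on port 873:

```python
import subprocess

# Remote-shell transport: rsync is started on the far end through SSH,
# so the rsync executable must exist on both hosts.
subprocess.check_call([
    "rsync", "-az", "-e", "ssh",
    "user@example.com:/var/www/site/", "./site/",
])

# Native daemon transport: the remote side is an rsync daemon listening on
# TCP 873 and exporting a module named "site" (rsync:// URL syntax).
subprocess.check_call([
    "rsync", "-az",
    "rsync://example.com/site/", "./site/",
])
```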

 

 

Source: http://en.wikipedia.org/wiki/Rsync

Below is workonsrc.bbclass, a BitBake class derived from OpenEmbedded's externalsrc.bbclass, which relies on rsync to mirror an existing ("workon") source tree into the build directory:

# Copyright (C) 2017 Mediatek
# Author: Richard Sun
# Some code and influence taken from externalsrc.bbclass:
# Copyright (C) 2012 Linux Foundation
# Author: Richard Purdie
# Some code and influence taken from srctree.bbclass:
# Copyright (C) 2009 Chris Larson <clarson@kergoth.com>
# Released under the MIT license (see COPYING.MIT for the terms)
#
# workonsrc.bbclass enables use of an existing source tree, usually external to
# the build system, to build a piece of software rather than going through the
# usual fetch/unpack/patch process.
#
# To use, add workonsrc to the global inherit and set WORKONSRC to point at the
# directory containing the sources, e.g. from local.conf for a recipe called
# "myrecipe" you would do:
#
# INHERIT += "workonsrc"
# WORKONSRC_pn-myrecipe = "/path/to/my/source/tree"
#
# In order to make this class work for both target and native versions (or with
# multilibs/cross or other BBCLASSEXTEND variants), B is set to point to a separate
# directory under the work directory (split source and build directories). This is
# the default, but the build directory can be set to the source directory if
# circumstances dictate by setting WORKONSRC_BUILD to the same value, e.g.:
#
# WORKONSRC_BUILD_pn-myrecipe = "/path/to/my/source/tree"
#

SRCTREECOVEREDTASKS ?= "do_patch do_unpack do_fetch"
EXTERNALSRC_SYMLINKS ?= "oe-workdir:${WORKDIR} oe-logs:${T}"

python () {
    import subprocess, os.path

    depends = d.getVar("DEPENDS")
    depends = "%s rsync-native" % depends
    d.setVar("DEPENDS", depends)

    pn = d.getVar('PN')
    d.appendVarFlag('do_populate_lic', 'depends', ' %s:do_configure' % pn)

    workonsrc = d.getVar('WORKONSRC')
    workonsrcbuild = d.getVar('WORKONSRC_BUILD')
    if workonsrc and not workonsrc.startswith("/"):
        bb.error("WORKONSRC must be an absolute path")
    if workonsrcbuild and not workonsrcbuild.startswith("/"):
        bb.error("WORKONSRC_BUILD must be an absolute path")

    # Fall back to a prebuilt tree if the source tree is not present
    workonprebuilt = workonsrc.replace("../src/", "../prebuilt/")
    if not os.path.exists(workonsrc):
        if os.path.exists(workonprebuilt):
            workonsrc = workonprebuilt
        else:
            bb.note("Neither %s nor %s exists" % (workonsrc, workonprebuilt))

    # If this is the base recipe and WORKONSRC is set for it or any of its
    # derivatives, then enable BB_DONT_CACHE to force the recipe to always be
    # re-parsed so that the file-checksums function for do_compile is run every
    # time.
    bpn = d.getVar('BPN')
    classextend = (d.getVar('BBCLASSEXTEND') or '').split()
    if bpn == d.getVar('PN') or not classextend:
        if (workonsrc or
                ('native' in classextend and
                 d.getVar('WORKONSRC_pn-%s-native' % bpn)) or
                ('nativesdk' in classextend and
                 d.getVar('WORKONSRC_pn-nativesdk-%s' % bpn)) or
                ('cross' in classextend and
                 d.getVar('WORKONSRC_pn-%s-cross' % bpn))):
            d.setVar('BB_DONT_CACHE', '1')

    if workonsrc:
        import oe.recipeutils
        import oe.path

        d.setVar('S', workonsrc)
        if workonsrcbuild:
            d.setVar('B', workonsrcbuild)
        else:
            d.setVar('B', '${WORKDIR}/${BPN}-${PV}/')
        workonsrcbuild = d.getVar('B')
        if workonsrc != workonsrcbuild:
            d.setVar('S', workonsrcbuild)

        local_srcuri = []
        fetch = bb.fetch2.Fetch((d.getVar('SRC_URI') or '').split(), d)
        for url in fetch.urls:
            url_data = fetch.ud[url]
            parm = url_data.parm
            if (url_data.type == 'file' or
                    'type' in parm and parm['type'] == 'kmeta'):
                local_srcuri.append(url)

        d.setVar('SRC_URI', ' '.join(local_srcuri))

        if '{SRCPV}' in d.getVar('PV', False):
            # Dummy value because the default function can't be called with blank SRC_URI
            d.setVar('SRCPV', '999')

        if d.getVar('CONFIGUREOPT_DEPTRACK') == '--disable-dependency-tracking':
            d.setVar('CONFIGUREOPT_DEPTRACK', '')

        tasks = filter(lambda k: d.getVarFlag(k, "task"), d.keys())

        for task in tasks:
            if task.endswith("_setscene"):
                # sstate is never going to work for workon source trees, disable it
                bb.build.deltask(task, d)
            else:
                # Since configure will likely touch ${S}, ensure only we lock so one task has access at a time
                d.appendVarFlag(task, "lockfiles", " ${S}/singletask.lock")

            # We do not want our source to be wiped out, ever (kernel.bbclass does this for do_clean)
            cleandirs = oe.recipeutils.split_var_value(d.getVarFlag(task, 'cleandirs', False) or '')
            setvalue = False
            for cleandir in cleandirs[:]:
                if oe.path.is_path_parent(workonsrc, d.expand(cleandir)):
                    cleandirs.remove(cleandir)
                    setvalue = True
            if setvalue:
                d.setVarFlag(task, 'cleandirs', ' '.join(cleandirs))

        fetch_tasks = ['do_fetch', 'do_unpack']
        # If we deltask do_patch, there's no dependency to ensure do_unpack gets run, so add one
        # Note that we cannot use d.appendVarFlag() here because deps is expected to be a list object, not a string
        d.setVarFlag('do_configure', 'deps', (d.getVarFlag('do_configure', 'deps', False) or []) + ['do_unpack'])

        for task in d.getVar("SRCTREECOVEREDTASKS").split():
            if local_srcuri and task in fetch_tasks:
                continue
            bb.build.deltask(task, d)

        d.prependVarFlag('do_compile', 'prefuncs', "workonsrc_compile_prefunc ")
        d.prependVarFlag('do_configure', 'prefuncs', "workonsrc_configure_prefunc ")

        d.setVarFlag('do_compile', 'file-checksums', '${@srctree_hash_files_(d)}')
        d.setVarFlag('do_configure', 'file-checksums', '${@srctree_configure_hash_files_(d)}')

        # We don't want the workdir to go away
        d.appendVar('RM_WORK_EXCLUDE', ' ' + d.getVar('PN'))

        bb.build.addtask('do_buildclean',
                         'do_clean' if d.getVar('S') == d.getVar('B') else None,
                         None, d)

        # If B=S the same builddir is used even for different architectures.
        # Thus, use a shared CONFIGURESTAMPFILE and STAMP directory so that
        # change of do_configure task hash is correctly detected and stamps are
        # invalidated if e.g. MACHINE changes.
        if d.getVar('S') == d.getVar('B'):
            configstamp = '${TMPDIR}/work-shared/${PN}/${EXTENDPE}${PV}-${PR}/configure.sstate'
            d.setVar('CONFIGURESTAMPFILE', configstamp)
            d.setVar('STAMP', '${STAMPS_DIR}/work-shared/${PN}/${EXTENDPE}${PV}-${PR}')
            d.setVar('STAMPCLEAN', '${STAMPS_DIR}/work-shared/${PN}/*-*')
}

python workonsrc_configure_prefunc() {
    srctree_rsync_files(d)
    s_dir = d.getVar('S')
    # Create desired symlinks
    symlinks = (d.getVar('EXTERNALSRC_SYMLINKS') or '').split()
    newlinks = []
    for symlink in symlinks:
        symsplit = symlink.split(':', 1)
        lnkfile = os.path.join(s_dir, symsplit[0])
        target = d.expand(symsplit[1])
        if len(symsplit) > 1:
            if os.path.islink(lnkfile):
                # Link already exists, leave it if it points to the right location already
                if os.readlink(lnkfile) == target:
                    continue
                os.unlink(lnkfile)
            elif os.path.exists(lnkfile):
                # File/dir exists with same name as link, just leave it alone
                continue
            os.symlink(target, lnkfile)
            newlinks.append(symsplit[0])
    # Hide the symlinks from git
    try:
        git_exclude_file = os.path.join(s_dir, '.git/info/exclude')
        if os.path.exists(git_exclude_file):
            with open(git_exclude_file, 'r+') as efile:
                elines = efile.readlines()
                for link in newlinks:
                    if link in elines or '/' + link in elines:
                        continue
                    efile.write('/' + link + '\n')
    except IOError as ioe:
        bb.note('Failed to hide EXTERNALSRC_SYMLINKS from git')
}

python workonsrc_compile_prefunc() {
    srctree_rsync_files(d)
    # Make it obvious that this is happening, since forgetting about it could lead to much confusion
    bb.plain('NOTE: %s: compiling from workon source tree %s' % (d.getVar('PN'), d.getVar('WORKONSRC')))
}

do_buildclean[dirs] = "${S} ${B}"
do_buildclean[nostamp] = "1"
do_buildclean[doc] = "Call 'make clean' or equivalent in ${B}"
workonsrc_do_buildclean() {
    if [ -e Makefile -o -e makefile -o -e GNUmakefile ]; then
        rm -f ${@' '.join([x.split(':')[0] for x in (d.getVar('EXTERNALSRC_SYMLINKS') or '').split()])}
        if [ "${CLEANBROKEN}" != "1" ]; then
            oe_runmake clean || die "make failed"
        fi
    else
        bbnote "nothing to do - no makefile found"
    fi
}

def srctree_rsync_files(d):
    import subprocess, os.path

    workonsrc = d.getVar('WORKONSRC')
    workonprebuilt = workonsrc.replace("../src/", "../prebuilt/")
    if not os.path.exists(workonsrc):
        if os.path.exists(workonprebuilt):
            workonsrc = workonprebuilt
        else:
            bb.note("Neither %s nor %s exists" % (workonsrc, workonprebuilt))

    if workonsrc:
        d.setVar('S', workonsrc)
        workonsrcbuild = d.getVar('WORKONSRC_BUILD')
        if workonsrcbuild:
            d.setVar('B', workonsrcbuild)
        else:
            d.setVar('B', '${WORKDIR}/${BPN}-${PV}/')
        workonsrcbuild = d.getVar('B')
        if workonsrc != workonsrcbuild:
            cmd = "mkdir -p %s" % (workonsrcbuild)
            subprocess.call(cmd, shell=True)
            if os.path.exists(workonsrc):
                workonsrc_rsync_appended_flag = d.getVar('WORKONSRC_RSYNC_APPENDED_FLAG')
                if workonsrc_rsync_appended_flag is None:
                    workonsrc_rsync_appended_flag = ""
                # Mirror the workon source tree into the separate build directory
                cmd = "rsync -aL %s %s/* %s" % (workonsrc_rsync_appended_flag, workonsrc, workonsrcbuild)
                ret = subprocess.call(cmd, shell=True)
            d.setVar('S', workonsrcbuild)

def srctree_hash_files_(d, srcdir=None):
    import shutil
    import subprocess
    import tempfile

    s_dir = srcdir or d.getVar('WORKONSRC')
    git_dir = None

    try:
        git_dir = os.path.join(s_dir,
            subprocess.check_output(['git', '-C', s_dir, 'rev-parse', '--git-dir'],
                                    stderr=subprocess.DEVNULL).decode("utf-8").rstrip())
    except subprocess.CalledProcessError:
        pass

    ret = " "
    if git_dir is not None:
        oe_hash_file = os.path.join(git_dir, 'oe-devtool-tree-sha1')
        with tempfile.NamedTemporaryFile(prefix='oe-devtool-index') as tmp_index:
            # Clone index
            shutil.copyfile(os.path.join(git_dir, 'index'), tmp_index.name)
            # Update our custom index
            env = os.environ.copy()
            env['GIT_INDEX_FILE'] = tmp_index.name
            subprocess.check_output(['git', 'add', '-A', '.'], cwd=s_dir, env=env)
            sha1 = subprocess.check_output(['git', 'write-tree'], cwd=s_dir, env=env).decode("utf-8")
        with open(oe_hash_file, 'w') as fobj:
            fobj.write(sha1)
        ret = oe_hash_file + ':True'
    else:
        ret = s_dir + '/*:True'
    return ret

def srctree_configure_hash_files_(d):
    """
    Get the list of files that should trigger do_configure to re-execute,
    based on the value of CONFIGURE_FILES
    """
    in_files = (d.getVar('CONFIGURE_FILES') or '').split()
    out_items = []
    search_files = []
    for entry in in_files:
        if entry.startswith('/'):
            out_items.append('%s:%s' % (entry, os.path.exists(entry)))
        else:
            search_files.append(entry)
    if search_files:
        s_dir = d.getVar('WORKONSRC')
        for root, _, files in os.walk(s_dir):
            for f in files:
                if f in search_files:
                    out_items.append('%s:True' % os.path.join(root, f))
    return ' '.join(out_items)

EXPORT_FUNCTIONS do_buildclean
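A notable detail of the class is how do_compile's file-checksums are derived: srctree_hash_files_ stages the whole work tree into a throwaway git index and hashes it with git write-tree, so any edit in the workon tree changes the task hash. The sketch below reproduces that idea as a standalone script; it is an illustrative reimplementation under the same assumption (a git checkout exists at the given, hypothetical path), not part of the class itself.

```python
import os
import shutil
import subprocess
import tempfile

def tree_sha1(repo):
    """Content hash of a git work tree, computed without touching the real index."""
    git_dir = os.path.join(repo, subprocess.check_output(
        ["git", "-C", repo, "rev-parse", "--git-dir"]).decode().strip())
    with tempfile.NamedTemporaryFile(prefix="tree-hash-index") as tmp_index:
        # Work on a copy of the index so the user's staging area is untouched.
        shutil.copyfile(os.path.join(git_dir, "index"), tmp_index.name)
        env = dict(os.environ, GIT_INDEX_FILE=tmp_index.name)
        # Stage every change (including untracked files) into the scratch index.
        subprocess.check_call(["git", "add", "-A", "."], cwd=repo, env=env)
        # write-tree hashes the staged tree; any file edit changes the result.
        return subprocess.check_output(
            ["git", "write-tree"], cwd=repo, env=env).decode().strip()

if __name__ == "__main__":
    print(tree_sha1("/path/to/some/git/checkout"))  # hypothetical path
```

Because the recorded hash changes whenever the tree changes, BitBake re-runs do_compile on every edit; the BB_DONT_CACHE setting in the class exists precisely so this check is re-evaluated on each parse.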