/hadoop/src/contrib/build.xml

This document walks through the build process for the Hadoop contrib modules (src/contrib/build.xml), covering the compile, package, and test phases, and shows how each of these operations is managed and executed through Ant targets.
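With Apache Ant installed, these phases are driven by running `ant <target>` from this directory. The sketch below (hypothetical directory names, not the real source tree) illustrates how the `includes="*/build.xml"` fileset used by the targets selects one child build file per contrib subdirectory:

```shell
#!/bin/sh
# The targets in this build file are invoked from src/contrib as:
#   ant compile | ant package | ant test | ant clean
# Sketch (hypothetical directories, not the real tree): how the
# includes="*/build.xml" fileset picks one build file per contrib subdirectory.
root=$(mktemp -d)
mkdir -p "$root/streaming" "$root/gridmix" "$root/docs"
touch "$root/streaming/build.xml" "$root/gridmix/build.xml"  # docs/ has no build file
for f in "$root"/*/build.xml; do
  echo "subant would run the target on: ${f#$root/}"
done
```

Only directories that actually contain a `build.xml` are visited, which is why a broken contrib can be skipped simply by excluding its build file from the fileset.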

<?xml version="1.0"?>

<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->

<project name="hadoopcontrib" default="compile" basedir=".">
 
  <!-- If one of the contrib subdirectories breaks the build or -->
  <!-- test targets and you cannot fix it, exclude it from the  -->
  <!-- filesets below with excludes="badcontrib/build.xml".     -->
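  <!-- For example (badcontrib is a hypothetical contrib name), the  -->
  <!-- fileset in any target below would become:                     -->
  <!--   <fileset dir="." includes="*/build.xml"                     -->
  <!--            excludes="badcontrib/build.xml"/>                  -->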

  <!-- ====================================================== -->
  <!-- Compile contribs.                                      -->
  <!-- ====================================================== -->
  <target name="compile">
    <subant target="compile">
      <fileset dir="." includes="*/build.xml"/>
    </subant>
  </target>
 
  <!-- ====================================================== -->
  <!-- Package contrib jars.                                  -->
  <!-- ====================================================== -->
  <target name="package">
    <subant target="package">
      <fileset dir="." includes="*/build.xml"/>
    </subant>
  </target>
 
  <!-- ====================================================== -->
  <!-- Test all the contribs.                                 -->
  <!-- ====================================================== -->
  <target name="test">
    <property name="hadoop.root" location="${root}/../../../"/>
    <property name="build.contrib.dir" location="${hadoop.root}/build/contrib"/>
    <delete file="${build.contrib.dir}/testsfailed"/>
    <subant target="test">
      <property name="continueOnFailure" value="true"/>
      <fileset dir="." includes="hdfsproxy/build.xml"/>
      <fileset dir="." includes="streaming/build.xml"/>
      <fileset dir="." includes="fairscheduler/build.xml"/>
      <fileset dir="." includes="capacity-scheduler/build.xml"/>
      <fileset dir="." includes="gridmix/build.xml"/>
    </subant>
    <available file="${build.contrib.dir}/testsfailed" property="testsfailed"/>
    <fail if="testsfailed">Tests failed!</fail>
  </target>
 
 
  <!-- ====================================================== -->
  <!-- Test all the contrib system tests.                     -->
  <!-- ====================================================== -->
  <target name="test-system-contrib">
    <property name="hadoop.root" location="${root}/../../../"/>
    <property name="build.contrib.dir" location="${hadoop.root}/build/contrib"/>
    <delete file="${build.contrib.dir}/testsfailed"/>
    <subant target="test-system">
      <property name="continueOnFailure" value="true"/>
      <property name="hadoop.home" value="${hadoop.home}"/>
      <property name="hadoop.conf.dir" value="${hadoop.conf.dir}"/>
      <property name="hadoop.conf.dir.deployed"
          value="${hadoop.conf.dir.deployed}"/>
      <fileset dir="." includes="hdfsproxy/build.xml"/>
      <fileset dir="." includes="streaming/build.xml"/>
      <fileset dir="." includes="fairscheduler/build.xml"/>
      <fileset dir="." includes="capacity-scheduler/build.xml"/>
      <fileset dir="." includes="gridmix/build.xml"/>
    </subant>
    <available file="${build.contrib.dir}/testsfailed" property="testsfailed"/>
    <fail if="testsfailed">Tests failed!</fail>
  </target>

  <!-- ====================================================== -->
  <!-- Clean all the contribs.                                -->
  <!-- ====================================================== -->
  <target name="clean">
    <subant target="clean">
      <fileset dir="." includes="*/build.xml"/>
    </subant>
  </target>

</project>
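The `test` and `test-system-contrib` targets deliberately keep going after a failing contrib (`continueOnFailure=true`) and fail only once at the end, using a marker file to aggregate results. A minimal shell sketch of that pattern, where `run_contrib_tests` is a hypothetical stand-in for the `<subant target="test">` call:

```shell
#!/bin/sh
# Sketch of the marker-file pattern used by the test targets above:
# every contrib's tests run to completion; a failure only touches
# ${build.contrib.dir}/testsfailed, and the build fails once at the
# end if that marker exists.
run_contrib_tests() {              # hypothetical stand-in for <subant>
  [ "$1" != "streaming" ]          # pretend only the streaming tests fail
}

build_contrib_dir=$(mktemp -d)
rm -f "$build_contrib_dir/testsfailed"                # <delete file=.../>
for contrib in hdfsproxy streaming fairscheduler; do
  run_contrib_tests "$contrib" || touch "$build_contrib_dir/testsfailed"
done
if [ -e "$build_contrib_dir/testsfailed" ]; then      # <available/> + <fail if>
  echo "Tests failed!"
fi
```

Deleting the marker up front matters: `<available>` would otherwise see a stale `testsfailed` file from a previous run and fail a clean build.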

