%global _hardened_build 1
%global hadoop_version %{version}
%global hdfs_services hadoop-zkfc.service hadoop-datanode.service hadoop-secondarynamenode.service hadoop-namenode.service hadoop-journalnode.service
%global mapreduce_services hadoop-historyserver.service
%global yarn_services hadoop-proxyserver.service hadoop-resourcemanager.service hadoop-nodemanager.service hadoop-timelineserver.service
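# Unit lists for each daemon group; grouping them here lets the rest of the
# spec (presumably the systemd scriptlets) act on a whole service set at once.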
# Filter out undesired provides and requires
%global __requires_exclude_from ^%{_libdir}/%{name}/libhadoop.so$
%global __provides_exclude_from ^%{_libdir}/%{name}/.*$
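# Without these filters, rpmbuild's dependency generator would emit automatic
# Provides/Requires (e.g. "libhadoop.so()(64bit)") for the private JNI
# libraries under %{_libdir}/%{name}, which other packages could wrongly
# depend on.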
%define _binaries_in_noarch_packages_terminate_build 0
%define huawei_repo https://repo.huaweicloud.com/repository
Name: hadoop
Version: 3.3.6
Release: 8
Summary: A software platform for processing vast amounts of data
# The BSD license file is missing
# https://issues.apache.org/jira/browse/HADOOP-9849
License: Apache-2.0 and BSD and Zlib and BSL-1.0 and MPL-2.0 and EPL-1.0 and MIT
URL: https://%{name}.apache.org
Source0: https://www.apache.org/dist/%{name}/core/%{name}-%{version}/%{name}-%{version}-src.tar.gz
Source1: %{name}-layout.sh
Source2: %{name}-hdfs.service.template
Source3: %{name}-mapreduce.service.template
Source4: %{name}-yarn.service.template
Source5: context.xml
Source6: %{name}.logrotate
Source7: %{name}-httpfs.sysconfig
Source8: hdfs-create-dirs
Source9: %{name}-tomcat-users.xml
Source10: %{name}-core-site.xml
Source11: %{name}-hdfs-site.xml
Source12: %{name}-mapred-site.xml
Source13: %{name}-yarn-site.xml
Source14: yarn-v1.22.5.tar.gz
Source15: node-12.22.1-linux-x64.tar.gz
Source16: node-v12.22.1-linux-arm64.tar.gz
Source17: https://github.com/protocolbuffers/protobuf/archive/refs/tags/v3.7.1.tar.gz
Source18: https://github.com/grpc/grpc-java/archive/refs/tags/v1.26.0.tar.gz
Source19: https://services.gradle.org/distributions/gradle-5.6.2-bin.zip
Source20: https://github.com/google/protobuf/releases/download/v3.11.0/protobuf-all-3.11.0.tar.gz
Source21: https://github.com/protocolbuffers/protobuf/archive/refs/tags/v3.11.0.tar.gz
Patch0: 01-lock-triple-beam-version-to-1.3.0.patch
Patch1: 02-Upgrade-os-maven-plugin-to-1.7.1.patch
Patch2: 03-Fix-build-on-riscv.patch
Patch3: 04-Enhance-access-control-for-RunJar.patch
%ifarch riscv64
Patch1000: 1000-Added-support-for-building-the-riscv64-protoc-binari.patch
Patch1001: 1001-Added-support-for-building-the-riscv64-protoc-gen-gr.patch
Patch1002: 1002-Added-support-for-building-the-riscv64-protoc-binari.patch
%endif
BuildRoot: %{_tmppath}/%{name}-%{version}-%{release}-root
BuildRequires: java-1.8.0-openjdk-devel maven hostname maven-local tomcat cmake snappy openssl-devel
BuildRequires: cyrus-sasl-devel chrpath systemd protobuf2-compiler protobuf2-devel protobuf2-java protobuf2
BuildRequires: leveldbjni leveldb-java hawtjni-runtime gcc-c++
BuildRequires: npm
%ifarch riscv64
BuildRequires: autoconf automake libtool pkgconfig zlib-devel libstdc++-static
%endif
Requires: java-1.8.0-openjdk protobuf2-java apache-zookeeper
%description
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
%package client
Summary: Libraries for Apache Hadoop clients
BuildArch: noarch
Requires: %{name}-common = %{version}-%{release}
Requires: %{name}-hdfs = %{version}-%{release}
Requires: %{name}-mapreduce = %{version}-%{release}
Requires: %{name}-yarn = %{version}-%{release}
%description client
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package provides libraries for Apache Hadoop clients.
%package common
Summary: Common files needed by Apache Hadoop daemons
BuildArch: noarch
Requires(pre): /usr/sbin/useradd
Obsoletes: %{name}-javadoc < 2.4.1-22%{?dist}
Requires: apache-zookeeper
Requires: leveldb
Requires: protobuf2-java
Conflicts: hadoop-3.1-client
%description common
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package contains common files and utilities needed by other Apache
Hadoop modules.
%package common-native
Summary: The native Apache Hadoop library file
Requires: %{name}-common = %{version}-%{release}
Conflicts: hadoop-3.1-common
%description common-native
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package contains the native-hadoop library.
%package devel
Summary: Headers for Apache Hadoop
Requires: libhdfs%{?_isa} = %{version}-%{release}
Conflicts: hadoop-3.1-common-native
%description devel
Header files for Apache Hadoop's HDFS library (libhdfs) and other utilities.
%package hdfs
Summary: The Apache Hadoop Distributed File System
BuildArch: noarch
Requires: apache-commons-daemon-jsvc
Requires: %{name}-common = %{version}-%{release}
Requires(post): systemd
Requires(preun): systemd
Requires(postun): systemd
Conflicts: hadoop-3.1-hdfs
%description hdfs
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
The Hadoop Distributed File System (HDFS) is the primary storage system
used by Apache Hadoop applications.
%package httpfs
Summary: Provides web access to HDFS
BuildArch: noarch
Requires: apache-commons-dbcp
Requires: ecj >= 1:4.2.1-6
Requires: json_simple
Requires: tomcat
Requires: tomcat-lib
Requires: tcnative
Requires(post): systemd
Requires(preun): systemd
Requires(postun): systemd
Conflicts: hadoop-3.1-httpfs
%description httpfs
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package provides a server with HTTP REST API support for the complete
FileSystem/FileContext interface in HDFS.
%package -n libhdfs
Summary: The Apache Hadoop Filesystem Library
Requires: %{name}-hdfs = %{version}-%{release}
Requires: lzo
Conflicts: hadoop-3.1-libhdfs
%description -n libhdfs
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package provides the Apache Hadoop Filesystem Library.
%package mapreduce
Summary: Apache Hadoop MapReduce (MRv2)
BuildArch: noarch
Requires: %{name}-common = %{version}-%{release}
Requires: %{name}-mapreduce-examples = %{version}-%{release}
Requires(post): systemd
Requires(preun): systemd
Requires(postun): systemd
Conflicts: hadoop-3.1-mapreduce
%description mapreduce
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package provides Apache Hadoop MapReduce (MRv2).
%package mapreduce-examples
Summary: Apache Hadoop MapReduce (MRv2) examples
BuildArch: noarch
Requires: hsqldb
Conflicts: hadoop-3.1-mapreduce-examples
%description mapreduce-examples
This package contains Apache Hadoop MapReduce (MRv2) examples.
%package maven-plugin
Summary: Apache Hadoop Maven plugin
BuildArch: noarch
Requires: maven
Conflicts: hadoop-3.1-maven-plugin
%description maven-plugin
The Apache Hadoop Maven plugin.
%package tests
Summary: Apache Hadoop test resources
BuildArch: noarch
Requires: %{name}-common = %{version}-%{release}
Requires: %{name}-hdfs = %{version}-%{release}
Requires: %{name}-mapreduce = %{version}-%{release}
Requires: %{name}-yarn = %{version}-%{release}
Conflicts: hadoop-3.1-tests
%description tests
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package contains test related resources for Apache Hadoop.
%package yarn
Summary: Apache Hadoop YARN
Requires: %{name}-common = %{version}-%{release}
Requires: %{name}-mapreduce = %{version}-%{release}
Requires: aopalliance
Requires: atinject
Requires: hamcrest
Requires: hawtjni
Requires: leveldbjni
Requires(post): systemd
Requires(preun): systemd
Requires(postun): systemd
Conflicts: hadoop-3.1-yarn nodejs-yarn
%description yarn
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package contains Apache Hadoop YARN.
%package yarn-security
Summary: The ability to run Apache Hadoop YARN in secure mode
Requires: %{name}-yarn = %{version}-%{release}
Conflicts: hadoop-3.1-yarn-security
%description yarn-security
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package contains files needed to run Apache Hadoop YARN in secure mode.
%prep
%ifarch riscv64
mkdir -p ${HOME}/%{name}-prep_dir
# protoc
tar -mxf %{SOURCE17} -C ${HOME}/%{name}-prep_dir
pushd ${HOME}/%{name}-prep_dir/protobuf-3.7.1
%patch 1000 -p1
./autogen.sh
./protoc-artifacts/build-protoc.sh linux riscv64 protoc
mvn install:install-file -DgroupId=com.google.protobuf -DartifactId=protoc -Dversion=3.7.1 -Dclassifier=linux-riscv64 -Dpackaging=exe -Dfile=protoc-artifacts/target/linux/riscv64/protoc.exe
popd
# prepare gradle and protobuf for build protoc-gen-grpc-java
mkdir -p %{_tmppath}/source
cp %{SOURCE19} %{_tmppath}/source
tar xzf %{SOURCE20} -C %{_tmppath}/source
tar -mxf %{SOURCE21} -C ${HOME}/%{name}-prep_dir
pushd ${HOME}/%{name}-prep_dir/protobuf-3.11.0
%patch 1002 -p1
./autogen.sh
./protoc-artifacts/build-protoc.sh linux riscv64 protoc
mvn install:install-file -DgroupId=com.google.protobuf -DartifactId=protoc -Dversion=3.11.0 -Dclassifier=linux-riscv64 -Dpackaging=exe -Dfile=protoc-artifacts/target/linux/riscv64/protoc.exe
popd
# protoc-gen-grpc-java
tar -mxf %{SOURCE18} -C ${HOME}/%{name}-prep_dir
pushd ${HOME}/%{name}-prep_dir/grpc-java-1.26.0
%patch 1001 -p1
sed -i "s,@HOME@,${HOME},g" build.gradle
sed -i 's|https\\://services.gradle.org/distributions|file://%{_tmppath}/source|g' gradle/wrapper/gradle-wrapper.properties
SKIP_TESTS=true ARCH=riscv64 ./buildscripts/kokoro/unix.sh
mvn install:install-file -DgroupId=io.grpc -DartifactId=protoc-gen-grpc-java -Dversion=1.26.0 -Dclassifier=linux-riscv64 -Dpackaging=exe -Dfile=mvn-artifacts/io/grpc/protoc-gen-grpc-java/1.26.0/protoc-gen-grpc-java-1.26.0-linux-riscv64.exe
popd
mvn install:install-file -DgroupId=com.google.protobuf -DartifactId=protoc -Dversion=2.5.0 -Dclassifier=linux-riscv64 -Dpackaging=exe -Dfile=/usr/bin/protoc
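# With the default local-repository layout, the artifacts installed above
# should now resolve from paths such as:
#   ~/.m2/repository/com/google/protobuf/protoc/3.7.1/protoc-3.7.1-linux-riscv64.exe
#   ~/.m2/repository/io/grpc/protoc-gen-grpc-java/1.26.0/protoc-gen-grpc-java-1.26.0-linux-riscv64.exe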
%setup -qn %{name}-%{version}-src
%patch 0 -p1
%patch 1 -p1
%patch 2 -p1
%else
%autosetup -p1 -n %{name}-%{version}-src
%endif
mvn install:install-file -DgroupId=org.fusesource.leveldbjni -DartifactId=leveldbjni-all -Dversion=1.8 -Dpackaging=jar -Dfile=/usr/lib/java/leveldbjni-all.jar
mvn install:install-file -DgroupId=org.fusesource.leveldbjni -DartifactId=leveldbjni -Dversion=1.8 -Dpackaging=jar -Dfile=/usr/lib/java/leveldbjni/leveldbjni.jar
mvn install:install-file -DgroupId=org.iq80.leveldb -DartifactId=leveldb-api -Dversion=0.7 -Dpackaging=jar -Dfile=/usr/share/java/leveldb-java/leveldb-api.jar
mvn install:install-file -DgroupId=org.iq80.leveldb -DartifactId=leveldb-benchmark -Dversion=0.7 -Dpackaging=jar -Dfile=/usr/share/java/leveldb-java/leveldb-benchmark.jar
mvn install:install-file -DgroupId=org.iq80.leveldb -DartifactId=leveldb -Dversion=0.7 -Dpackaging=jar -Dfile=/usr/share/java/leveldb-java/leveldb.jar
mvn install:install-file -DgroupId=org.fusesource.hawtjni -DartifactId=hawtjni-runtime -Dversion=1.16 -Dpackaging=jar -Dfile=/usr/lib/java/hawtjni/hawtjni-runtime.jar
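# The install:install-file calls above register the distribution's system jars
# (leveldbjni, leveldb-java, hawtjni-runtime) in the local Maven repository so
# the offline build can resolve them without network access.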
mkdir -p ${HOME}/.m2/repository/com/github/eirslett/node/12.22.1/
cp %{SOURCE15} ${HOME}/.m2/repository/com/github/eirslett/node/12.22.1/
cp %{SOURCE16} ${HOME}/.m2/repository/com/github/eirslett/node/12.22.1/
mv ${HOME}/.m2/repository/com/github/eirslett/node/12.22.1/node-v12.22.1-linux-arm64.tar.gz ${HOME}/.m2/repository/com/github/eirslett/node/12.22.1/node-12.22.1-linux-arm64.tar.gz
mkdir -p ${HOME}/.m2/repository/com/github/eirslett/yarn/1.22.5/
cp %{SOURCE14} ${HOME}/.m2/repository/com/github/eirslett/yarn/1.22.5/
mv ${HOME}/.m2/repository/com/github/eirslett/yarn/1.22.5/yarn-v1.22.5.tar.gz ${HOME}/.m2/repository/com/github/eirslett/yarn/1.22.5/yarn-1.22.5.tar.gz
tar -xzvf ${HOME}/.m2/repository/com/github/eirslett/yarn/1.22.5/yarn-1.22.5.tar.gz -C ${HOME}/.m2/repository/com/github/eirslett/yarn/1.22.5/
npm config set registry %{huawei_repo}/npm/
npm cache clean -f
${HOME}/.m2/repository/com/github/eirslett/yarn/1.22.5/yarn-v1.22.5/bin/yarn config set registry %{huawei_repo}/npm/ -g
${HOME}/.m2/repository/com/github/eirslett/yarn/1.22.5/yarn-v1.22.5/bin/yarn config set ignore-engines true
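# Point npm and the bundled yarn at the Huawei mirror so the frontend builds
# (driven by frontend-maven-plugin, hence the com.github.eirslett artifacts
# staged above) can reach a registry; ignore-engines relaxes Node version
# checks for packages that pin a newer Node.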
%pom_add_dep org.iq80.leveldb:leveldb-api:0.7 hadoop-hdfs-project/hadoop-hdfs
%pom_add_dep org.iq80.leveldb:leveldb-api:0.7 hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-web-proxy
%pom_add_dep org.iq80.leveldb:leveldb-api:0.7 hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-applicationhistoryservice
%pom_add_dep org.iq80.leveldb:leveldb-api:0.7 hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common
%pom_add_dep org.fusesource.leveldbjni:leveldbjni:1.8 hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server
%pom_add_dep org.fusesource.hawtjni:hawtjni-runtime:1.16 hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-applicationhistoryservice
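# Each %pom_add_dep g:a:v call injects a dependency element into the listed
# module's pom.xml, roughly:
#   <dependency>
#     <groupId>org.iq80.leveldb</groupId>
#     <artifactId>leveldb-api</artifactId>
#     <version>0.7</version>
#   </dependency>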
%pom_disable_module hadoop-minikdc hadoop-common-project
%pom_disable_module hadoop-pipes hadoop-tools
%pom_disable_module hadoop-azure hadoop-tools
%pom_disable_module hadoop-yarn-server-timelineservice-hbase-tests hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/pom.xml
%ifarch riscv64
%pom_disable_module hadoop-yarn-applications-catalog-webapp hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-catalog/pom.xml
%endif
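# %pom_disable_module drops the named <module> entry from the given parent
# POM, so the Maven reactor skips building that module entirely.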
# War files we don't want
%mvn_package :%{name}-auth-examples __noinstall
%mvn_package :%{name}-hdfs-httpfs __noinstall
# Parts we don't want to distribute
%mvn_package :%{name}-assemblies __noinstall
# Workaround for bz1012059
%mvn_package :%{name}-project-dist __noinstall
# Create separate file lists for packaging
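# xmvn artifact patterns below read groupId:artifactId:extension:classifier:version;
# an empty field matches anything, so e.g. ":::tests:" selects every artifact
# carrying the "tests" classifier.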
%mvn_package :::tests: %{name}-tests
%mvn_package :%{name}-*-tests::{}: %{name}-tests
%mvn_package :%{name}-client*::{}: %{name}-client
%mvn_package :%{name}-hdfs*::{}: %{name}-hdfs
%mvn_package :%{name}-mapreduce-examples*::{}: %{name}-mapreduce-examples
%mvn_package :%{name}-mapreduce*::{}: %{name}-mapreduce
%mvn_package :%{name}-archives::{}: %{name}-mapreduce
%mvn_package :%{name}-datajoin::{}: %{name}-mapreduce
%mvn_package :%{name}-distcp::{}: %{name}-mapreduce
%mvn_package :%{name}-extras::{}: %{name}-mapreduce
%mvn_package :%{name}-gridmix::{}: %{name}-mapreduce
%mvn_package :%{name}-openstack::{}: %{name}-mapreduce
%mvn_package :%{name}-rumen::{}: %{name}-mapreduce
%mvn_package :%{name}-sls::{}: %{name}-mapreduce
%mvn_package :%{name}-streaming::{}: %{name}-mapreduce
%mvn_package :%{name}-tools*::{}: %{name}-mapreduce
%mvn_package :%{name}-maven-plugins::{}: %{name}-maven-plugin
%mvn_package :%{name}-minicluster::{}: %{name}-tests
%mvn_package :%{name}-yarn*::{}: %{name}-yarn
# Jar files that need to be overridden due to installation location
%mvn_file :%{name}-common::tests: %{name}/%{name}-common
%build
%ifarch riscv64
export MAVEN_OPTS="-Xms2048M -Xmx8000M"
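# keepalive prints a timestamp every hour so the long-running riscv64 build
# keeps writing output (assumption: the build host terminates jobs that stay
# silent too long).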
keepalive() { while true; do date; sleep 3600; done; }
keepalive &
%endif
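# Build the full native distribution; unit tests, integration tests and
# javadoc generation are skipped to keep the package build fast.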
mvn clean -Dsnappy.lib=/usr/lib64 -Dbundle.snappy -Dcontainer-executor.conf.dir=%{_sysconfdir}/%{name} -Pdist,native -DskipTests -DskipIT -Dmaven.javadoc.skip=true package
%install
# Copy all jar files except those generated by the build
# $1 the src directory
# $2 the dest directory
copy_dep_jars()
{
find $1 ! -name "hadoop-*.jar" -name "*.jar" | xargs install -m 0644 -t $2
rm -f $2/tools-*.jar
}
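# Usage sketch (illustrative paths; actual calls come later in %install):
#   copy_dep_jars ${basedir}/share/%{name}/common/lib %{buildroot}%{_datadir}/%{name}/common/lib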
# Create symlinks for jars from the build
# $1 the location to create the symlink
link_hadoop_jars()
{
for f in `ls hadoop-* | grep -v tests | grep -v examples`
do
n=`echo $f | sed -e "s/-%{version}//" -e "s/1.1.1//"`
if [ -L $1/$n ]
then
continue
elif [ -e $1/$f ]
then
rm -f $1/$f $1/$n
fi
p=`find %{buildroot}%{_jnidir} %{buildroot}%{_javadir}/%{name} -name $n | sed "s#%{buildroot}##"`
%{__ln_s} $p $1/$n
done
}
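# Usage sketch (illustrative): run from a directory holding the built
# hadoop-*.jar files; each jar in $1 is replaced by a symlink to the copy
# installed under %{_javadir}/%{name} or %{_jnidir}:
#   pushd ${basedir}/share/%{name}/common/lib
#   link_hadoop_jars %{buildroot}%{_datadir}/%{name}/common/lib
#   popd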
%mvn_install
install -d -m 0755 %{buildroot}%{_libdir}/%{name}
install -d -m 0755 %{buildroot}%{_includedir}/%{name}
install -d -m 0755 %{buildroot}%{_jnidir}/%{name}
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/client/lib
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/common/lib
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/hdfs/lib
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/hdfs/webapps
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/httpfs/tomcat/webapps
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/mapreduce/lib
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/yarn/lib
install -d -m 0755 %{buildroot}%{_sysconfdir}/%{name}/tomcat/Catalina/localhost
install -d -m 0755 %{buildroot}%{_sysconfdir}/logrotate.d
install -d -m 0755 %{buildroot}%{_sysconfdir}/sysconfig
install -d -m 0755 %{buildroot}%{_tmpfilesdir}
install -d -m 0755 %{buildroot}%{_sharedstatedir}/%{name}-hdfs
install -d -m 0755 %{buildroot}%{_sharedstatedir}/tomcats/httpfs
install -d -m 0755 %{buildroot}%{_var}/cache/%{name}-yarn
install -d -m 0755 %{buildroot}%{_var}/cache/%{name}-httpfs/temp
install -d -m 0755 %{buildroot}%{_var}/cache/%{name}-httpfs/work
install -d -m 0755 %{buildroot}%{_var}/cache/%{name}-mapreduce
install -d -m 0755 %{buildroot}%{_var}/log/%{name}-yarn
install -d -m 0755 %{buildroot}%{_var}/log/%{name}-hdfs
install -d -m 0755 %{buildroot}%{_var}/log/%{name}-httpfs
install -d -m 0755 %{buildroot}%{_var}/log/%{name}-mapreduce
install -d -m 0755 %{buildroot}%{_var}/run/%{name}-yarn
install -d -m 0755 %{buildroot}%{_var}/run/%{name}-hdfs
install -d -m 0755 %{buildroot}%{_var}/run/%{name}-mapreduce
basedir='%{name}-common-project/%{name}-common/target/%{name}-common-%{hadoop_version}'
hdfsdir='%{name}-hdfs-project/%{name}-hdfs/target/%{name}-hdfs-%{hadoop_version}'
httpfsdir='%{name}-hdfs-project/%{name}-hdfs-httpfs/target/%{name}-hdfs-httpfs-%{hadoop_version}'
mapreddir='%{name}-mapreduce-project/target/%{name}-mapreduce-%{hadoop_version}'
yarndir='%{name}-yarn-project/target/%{name}-yarn-project-%{hadoop_version}'
# copy jar package
install -d -m 0755 %{buildroot}%{_datadir}/java/%{name}
install -d -m 0755 %{buildroot}%{_datadir}/maven-poms/%{name}
# client
install -m 0755 %{name}-client-modules/%{name}-client/target/hadoop-client-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-client.jar
echo %{_datadir}/java/%{name}/hadoop-client.jar >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-client.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-client.pom >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client-api/target/hadoop-client-api-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-client-api.jar
echo %{_datadir}/java/%{name}/hadoop-client-api.jar >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client-api/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-client-api.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-client-api.pom >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client-minicluster/target/hadoop-client-minicluster-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-client-minicluster.jar
echo %{_datadir}/java/%{name}/hadoop-client-minicluster.jar >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client-minicluster/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-client-minicluster.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-client-minicluster.pom >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client-runtime/target/hadoop-client-runtime-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-client-runtime.jar
echo %{_datadir}/java/%{name}/hadoop-client-runtime.jar >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client-runtime/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-client-runtime.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-client-runtime.pom >> .mfiles-hadoop-client
# common
install -m 0755 %{name}-common-project/%{name}-annotations/target/hadoop-annotations-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-annotations.jar
echo %{_datadir}/java/%{name}/hadoop-annotations.jar >> .mfiles
install -m 0755 %{name}-common-project/%{name}-auth/target/hadoop-auth-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-auth.jar
echo %{_datadir}/java/%{name}/hadoop-auth.jar >> .mfiles
install -m 0755 %{name}-tools/%{name}-aws/target/hadoop-aws-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-aws.jar
echo %{_datadir}/java/%{name}/hadoop-aws.jar >> .mfiles
install -m 0755 %{name}-build-tools/target/hadoop-build-tools-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-build-tools.jar
echo %{_datadir}/java/%{name}/hadoop-build-tools.jar >> .mfiles
install -m 0755 %{name}-common-project/%{name}-nfs/target/hadoop-nfs-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-nfs.jar
echo %{_datadir}/java/%{name}/hadoop-nfs.jar >> .mfiles
install -m 0755 %{name}-common-project/%{name}-common/target/hadoop-common-%{version}.jar %{buildroot}%{_prefix}/lib/java/hadoop/hadoop-common.jar
echo %{_prefix}/lib/java/hadoop/hadoop-common.jar >> .mfiles
install -m 0755 %{name}-common-project/%{name}-kms/target/hadoop-kms-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-kms.jar
echo %{_datadir}/java/%{name}/hadoop-kms.jar >> .mfiles
install -m 0755 %{name}-common-project/%{name}-annotations/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-annotations.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-annotations.pom >> .mfiles
install -m 0755 %{name}-common-project/%{name}-auth/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-auth.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-auth.pom >> .mfiles
install -m 0755 %{name}-tools/%{name}-aws/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-aws.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-aws.pom >> .mfiles
install -m 0755 %{name}-build-tools/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-build-tools.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-build-tools.pom >> .mfiles
install -m 0755 %{name}-common-project/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-common-project.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-common-project.pom >> .mfiles
install -m 0755 %{name}-common-project/%{name}-common/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-common.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-common.pom >> .mfiles
install -m 0755 %{name}-dist/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-dist.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-dist.pom >> .mfiles
install -m 0755 %{name}-common-project/%{name}-nfs/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-nfs.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-nfs.pom >> .mfiles
install -m 0755 %{name}-project/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-project.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-project.pom >> .mfiles
install -m 0755 %{name}-common-project/%{name}-kms/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-kms.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-kms.pom >> .mfiles
echo %{_sysconfdir}/%{name}/hadoop-user-functions.sh.example >> .mfiles
echo %{_sysconfdir}/%{name}/shellprofile.d/example.sh >> .mfiles
echo %{_sysconfdir}/%{name}/workers >> .mfiles
echo %{_prefix}/libexec/hadoop-functions.sh >> .mfiles
echo %{_prefix}/libexec/hadoop-layout.sh.example >> .mfiles
echo %{_prefix}/sbin/workers.sh >> .mfiles
echo %{_datadir}/%{name}/common/hadoop-common.jar >> .mfiles
# hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-client/target/hadoop-hdfs-client-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-client.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-client.jar >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-nfs/target/hadoop-hdfs-nfs-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-nfs.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-nfs.jar >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs/target/hadoop-hdfs-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs.jar >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-nfs/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-hdfs-nfs.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-hdfs-nfs.pom >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-hdfs-project.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-hdfs-project.pom >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-hdfs.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-hdfs.pom >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-httpfs/target/hadoop-hdfs-httpfs-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-httpfs.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-httpfs.jar >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-httpfs/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-hdfs-httpfs.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-hdfs-httpfs.pom >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-native-client/target/hadoop-hdfs-native-client-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-native-client.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-native-client.jar >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-native-client/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-hdfs-native-client.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-hdfs-native-client.pom >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-rbf/target/hadoop-hdfs-rbf-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-rbf.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-rbf.jar >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-rbf/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-hdfs-rbf.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-hdfs-rbf.pom >> .mfiles-hadoop-hdfs
echo %{_prefix}/libexec/shellprofile.d/hadoop-hdfs.sh >> .mfiles-hadoop-hdfs
# mapreduce
install -m 0755 %{name}-tools/%{name}-archives/target/hadoop-archives-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-archives.jar
echo %{_datadir}/java/%{name}/hadoop-archives.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-datajoin/target/hadoop-datajoin-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-datajoin.jar
echo %{_datadir}/java/%{name}/hadoop-datajoin.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-distcp/target/hadoop-distcp-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-distcp.jar
echo %{_datadir}/java/%{name}/hadoop-distcp.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-extras/target/hadoop-extras-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-extras.jar
echo %{_datadir}/java/%{name}/hadoop-extras.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-gridmix/target/hadoop-gridmix-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-gridmix.jar
echo %{_datadir}/java/%{name}/hadoop-gridmix.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-app/target/hadoop-mapreduce-client-app-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-app.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-app.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-common/target/hadoop-mapreduce-client-common-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-common.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-common.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-core/target/hadoop-mapreduce-client-core-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-core.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-core.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-hs-plugins/target/hadoop-mapreduce-client-hs-plugins-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-hs-plugins.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-hs-plugins.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-hs/target/hadoop-mapreduce-client-hs-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-hs.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-hs.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-jobclient/target/hadoop-mapreduce-client-jobclient-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-jobclient.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-jobclient.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-shuffle/target/hadoop-mapreduce-client-shuffle-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-shuffle.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-shuffle.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-openstack/target/hadoop-openstack-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-openstack.jar
echo %{_datadir}/java/%{name}/hadoop-openstack.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-rumen/target/hadoop-rumen-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-rumen.jar
echo %{_datadir}/java/%{name}/hadoop-rumen.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-sls/target/hadoop-sls-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-sls.jar
echo %{_datadir}/java/%{name}/hadoop-sls.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-streaming/target/hadoop-streaming-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-streaming.jar
echo %{_datadir}/java/%{name}/hadoop-streaming.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-tools-dist/target/hadoop-tools-dist-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-tools-dist.jar
echo %{_datadir}/java/%{name}/hadoop-tools-dist.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-archives/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-archives.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-archives.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-datajoin/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-datajoin.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-datajoin.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-distcp/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-distcp.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-distcp.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-extras/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-extras.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-extras.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-gridmix/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-gridmix.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-gridmix.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-app/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-app.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-app.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-common/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-common.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-common.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-core/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-core.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-core.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-hs-plugins/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-hs-plugins.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-hs-plugins.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-hs/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-hs.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-hs.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-jobclient/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-jobclient.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-jobclient.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-shuffle/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-shuffle.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-shuffle.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-openstack/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-openstack.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-openstack.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-rumen/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-rumen.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-rumen.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-sls/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-sls.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-sls.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-streaming/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-streaming.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-streaming.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-tools-dist/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-tools-dist.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-tools-dist.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-tools.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-tools.pom >> .mfiles-hadoop-mapreduce
echo %{_prefix}/libexec/shellprofile.d/hadoop-mapreduce.sh >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-nativetask/target/hadoop-mapreduce-client-nativetask-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-nativetask.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-nativetask.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-nativetask/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-nativetask.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-nativetask.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-uploader/target/hadoop-mapreduce-client-uploader-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-uploader.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-uploader.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-uploader/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-uploader.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-uploader.pom >> .mfiles-hadoop-mapreduce
# mapreduce-examples
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-examples/target/hadoop-mapreduce-examples-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-examples.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-examples.jar >> .mfiles-hadoop-mapreduce-examples
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-examples/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-examples.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-examples.pom >> .mfiles-hadoop-mapreduce-examples
# maven-plugin
install -m 0755 %{name}-maven-plugins/target/hadoop-maven-plugins-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-maven-plugins.jar
echo %{_datadir}/java/%{name}/hadoop-maven-plugins.jar >> .mfiles-hadoop-maven-plugin
install -m 0755 %{name}-maven-plugins/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-maven-plugins.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-maven-plugins.pom >> .mfiles-hadoop-maven-plugin
# tests
install -m 0755 %{name}-client-modules/%{name}-client/target/hadoop-client-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-client-tests.jar
echo %{_datadir}/java/%{name}/hadoop-client-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-common-project/%{name}-common/target/hadoop-common-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-common-tests.jar
echo %{_datadir}/java/%{name}/hadoop-common-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-hdfs-project/%{name}-hdfs/target/hadoop-hdfs-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-tests.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-app/target/hadoop-mapreduce-client-app-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-app-tests.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-app-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-jobclient/target/hadoop-mapreduce-client-jobclient-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-jobclient-tests.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-jobclient-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-minicluster/target/hadoop-minicluster-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-minicluster.jar
echo %{_datadir}/java/%{name}/hadoop-minicluster.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-tools/%{name}-tools-dist/target/hadoop-tools-dist-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-tools-dist-tests.jar
echo %{_datadir}/java/%{name}/hadoop-tools-dist-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-common/target/hadoop-yarn-common-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-common-tests.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-common-tests.jar >> .mfiles-hadoop-tests
#install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-registry/target/hadoop-yarn-registry-%{version}-test-sources.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-registry-test-sources.jar
#echo %{_datadir}/java/%{name}/hadoop-yarn-registry-test-sources.jar >> .mfiles-hadoop-test-sources
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-resourcemanager/target/hadoop-yarn-server-resourcemanager-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-resourcemanager-tests.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-resourcemanager-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-sharedcachemanager/target/hadoop-yarn-server-sharedcachemanager-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-sharedcachemanager-tests.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-sharedcachemanager-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-tests/target/hadoop-yarn-server-tests-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-tests-tests.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-tests-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-tests/target/hadoop-yarn-server-tests-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-tests.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-minicluster/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-minicluster.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-minicluster.pom >> .mfiles-hadoop-tests
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-tests/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-tests.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-tests.pom >> .mfiles-hadoop-tests
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-client/target/hadoop-hdfs-client-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-client-tests.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-client-tests.jar >> .mfiles-hadoop-tests
# yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-api/target/hadoop-yarn-api-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-api.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-api.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-applications-distributedshell/target/hadoop-yarn-applications-distributedshell-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-applications-distributedshell.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-applications-distributedshell.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-applications-unmanaged-am-launcher/target/hadoop-yarn-applications-unmanaged-am-launcher-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-applications-unmanaged-am-launcher.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-applications-unmanaged-am-launcher.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-client/target/hadoop-yarn-client-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-client.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-client.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-common/target/hadoop-yarn-common-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-common.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-common.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-registry/target/hadoop-yarn-registry-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-registry.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-registry.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-applicationhistoryservice/target/hadoop-yarn-server-applicationhistoryservice-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-applicationhistoryservice.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-applicationhistoryservice.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-common/target/hadoop-yarn-server-common-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-common.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-common.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-resourcemanager/target/hadoop-yarn-server-resourcemanager-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-resourcemanager.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-resourcemanager.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-sharedcachemanager/target/hadoop-yarn-server-sharedcachemanager-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-sharedcachemanager.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-sharedcachemanager.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-web-proxy/target/hadoop-yarn-server-web-proxy-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-web-proxy.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-web-proxy.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-nodemanager/target/hadoop-yarn-server-nodemanager-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-nodemanager.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-nodemanager.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-router/target/hadoop-yarn-server-router-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-router.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-router.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-router/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-router.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-router.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timeline-pluginstorage/target/hadoop-yarn-server-timeline-pluginstorage-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-timeline-pluginstorage.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-timeline-pluginstorage.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice/target/hadoop-yarn-server-timelineservice-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice-hbase/%{name}-yarn-server-timelineservice-hbase-client/target/hadoop-yarn-server-timelineservice-hbase-client-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice-hbase-client.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice-hbase-client.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice-hbase/%{name}-yarn-server-timelineservice-hbase-common/target/hadoop-yarn-server-timelineservice-hbase-common-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice-hbase-common.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice-hbase-common.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/target/%{name}-yarn-project-%{version}/share/%{name}/yarn/timelineservice/hadoop-yarn-server-timelineservice-hbase-coprocessor-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice-hbase-coprocessor.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice-hbase-coprocessor.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timeline-pluginstorage/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timeline-pluginstorage.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timeline-pluginstorage.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-services/%{name}-yarn-services-api/target/hadoop-yarn-services-api-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-services-api.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-services-api.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-services/%{name}-yarn-services-api/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-services-api.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-services-api.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-services/%{name}-yarn-services-core/target/hadoop-yarn-services-core-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-services-core.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-services-core.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-services/%{name}-yarn-services-core/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-services-core.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-services-core.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-api/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-api.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-api.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/hadoop-yarn-applications-distributedshell/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-applications-distributedshell.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-applications-distributedshell.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/hadoop-yarn-applications-unmanaged-am-launcher/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-applications-unmanaged-am-launcher.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-applications-unmanaged-am-launcher.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-applications.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-applications.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-client/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-client.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-client.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-common/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-common.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-common.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-registry/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-registry.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-registry.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/hadoop-yarn-server-applicationhistoryservice/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-applicationhistoryservice.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-applicationhistoryservice.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/hadoop-yarn-server-common/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-common.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-common.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/hadoop-yarn-server-nodemanager/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-nodemanager.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-nodemanager.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/hadoop-yarn-server-resourcemanager/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-resourcemanager.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-resourcemanager.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/hadoop-yarn-server-sharedcachemanager/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-sharedcachemanager.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-sharedcachemanager.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/hadoop-yarn-server-web-proxy/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-web-proxy.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-web-proxy.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-site/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-site.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-site.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timelineservice.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timelineservice.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice-hbase/%{name}-yarn-server-timelineservice-hbase-client/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timelineservice-hbase-client.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timelineservice-hbase-client.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice-hbase/%{name}-yarn-server-timelineservice-hbase-common/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timelineservice-hbase-common.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timelineservice-hbase-common.pom >> .mfiles-hadoop-yarn
echo %{_sysconfdir}/%{name}/yarnservice-log4j.properties >> .mfiles-hadoop-yarn
echo %{_prefix}/bin/container-executor >> .mfiles-hadoop-yarn
echo %{_prefix}/bin/oom-listener >> .mfiles-hadoop-yarn
echo %{_prefix}/bin/test-container-executor >> .mfiles-hadoop-yarn
echo %{_prefix}/libexec/shellprofile.d/hadoop-yarn.sh >> .mfiles-hadoop-yarn
echo %{_prefix}/sbin/FederationStateStore/* >> .mfiles-hadoop-yarn
# copy script folders
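# The base, hdfs, mapreduce and yarn layouts each ship their own copy of these
# trees; cp -f means that where filenames collide, the last copy wins.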
for dir in bin libexec sbin
do
cp -arf $basedir/$dir %{buildroot}%{_prefix}
cp -arf $hdfsdir/$dir %{buildroot}%{_prefix}
cp -arf $mapreddir/$dir %{buildroot}%{_prefix}
cp -arf $yarndir/$dir %{buildroot}%{_prefix}
done
# This binary is obsolete and conflicts with qt-devel
rm -rf %{buildroot}%{_bindir}/rcc
# Remove duplicate files
rm -f %{buildroot}%{_sbindir}/hdfs-config.sh
# copy config files
cp -arf $basedir/etc/* %{buildroot}%{_sysconfdir}
cp -arf $httpfsdir/etc/* %{buildroot}%{_sysconfdir}
cp -arf $mapreddir/etc/* %{buildroot}%{_sysconfdir}
cp -arf $yarndir/etc/* %{buildroot}%{_sysconfdir}
# copy binaries
cp -arf $basedir/lib/native/libhadoop.so* %{buildroot}%{_libdir}/%{name}
chrpath --delete %{buildroot}%{_libdir}/%{name}/*
cp -arf ./hadoop-hdfs-project/hadoop-hdfs-native-client/target/hadoop-hdfs-native-client-%{version}/include/hdfs.h %{buildroot}%{_includedir}/%{name}
cp -arf ./hadoop-hdfs-project/hadoop-hdfs-native-client/target/hadoop-hdfs-native-client-%{version}/lib/native/libhdfs.so* %{buildroot}%{_libdir}
chrpath --delete %{buildroot}%{_libdir}/libhdfs*
# Not needed since httpfs is deployed with existing systemd setup
rm -f %{buildroot}%{_sbindir}/httpfs.sh
rm -f %{buildroot}%{_libexecdir}/httpfs-config.sh
rm -f %{buildroot}%{_bindir}/httpfs-env.sh
# Remove Windows .cmd scripts
find %{buildroot} -name '*.cmd' | xargs rm -f
# Modify hadoop-env.sh to point to correct locations for JAVA_HOME
# and JSVC_HOME.
sed -i "s|\${JAVA_HOME}|/usr/lib/jvm/jre|" %{buildroot}%{_sysconfdir}/%{name}/%{name}-env.sh
sed -i "s|\${JSVC_HOME}|/usr/bin|" %{buildroot}%{_sysconfdir}/%{name}/%{name}-env.sh
# Ensure the JDK-provided DocumentBuilderFactory is used
sed -i "s|\(HADOOP_OPTS.*=.*\)\$HADOOP_CLIENT_OPTS|\1 -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl \$HADOOP_CLIENT_OPTS|" %{buildroot}%{_sysconfdir}/%{name}/%{name}-env.sh
echo "export YARN_OPTS=\"\$YARN_OPTS -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl\"" >> %{buildroot}%{_sysconfdir}/%{name}/yarn-env.sh
# Workaround for bz1012059
install -d -m 0755 %{buildroot}%{_mavenpomdir}/
install -pm 644 hadoop-project-dist/pom.xml %{buildroot}%{_mavenpomdir}/JPP.%{name}-%{name}-project-dist.pom
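# Link the installed jars back into the expected Hadoop share layout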
%{__ln_s} %{_jnidir}/%{name}/hadoop-common.jar %{buildroot}%{_datadir}/%{name}/common
%{__ln_s} %{_javadir}/%{name}/hadoop-hdfs.jar %{buildroot}%{_datadir}/%{name}/hdfs
%{__ln_s} %{_javadir}/%{name}/hadoop-client.jar %{buildroot}%{_datadir}/%{name}/client
# client jar dependencies
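# copy_dep_jars and link_hadoop_jars are helper functions defined earlier in
# this spec; presumably the former copies third-party jars into place and the
# latter swaps bundled Hadoop jars for symlinks to the system-installed copies.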
copy_dep_jars hadoop-client-modules/%{name}-client/target/%{name}-client-%{hadoop_version}/share/%{name}/client/lib %{buildroot}%{_datadir}/%{name}/client/lib
pushd hadoop-client-modules/%{name}-client/target/%{name}-client-%{hadoop_version}/share/%{name}/client/lib
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/client/lib
popd
cp -f hadoop-client-modules/%{name}-client-api/target/hadoop-client-api-%{version}.jar hadoop-client-modules/%{name}-client/target/%{name}-client-%{hadoop_version}/share/%{name}/client
cp -f hadoop-client-modules/%{name}-client-minicluster/target/hadoop-client-minicluster-%{version}.jar hadoop-client-modules/%{name}-client/target/%{name}-client-%{hadoop_version}/share/%{name}/client
cp -f hadoop-client-modules/%{name}-client-runtime/target/hadoop-client-runtime-%{version}.jar hadoop-client-modules/%{name}-client/target/%{name}-client-%{hadoop_version}/share/%{name}/client
pushd hadoop-client-modules/%{name}-client/target/%{name}-client-%{hadoop_version}/share/%{name}/client
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/client
popd
# common jar dependencies
copy_dep_jars $basedir/share/%{name}/common/lib %{buildroot}%{_datadir}/%{name}/common/lib
cp -f hadoop-common-project/%{name}-kms/target/hadoop-kms-%{version}.jar $basedir/share/%{name}/common
cp -f hadoop-common-project/%{name}-nfs/target/hadoop-nfs-%{version}.jar $basedir/share/%{name}/common
cp -f hadoop-common-project/%{name}-auth/target/hadoop-auth-%{version}.jar $basedir/share/%{name}/common
pushd $basedir/share/%{name}/common
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/common
popd
pushd $basedir/share/%{name}/common/lib
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/common/lib
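# The leading backslash on cp below bypasses any shell alias, so existing
# files are overwritten without prompting.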
\cp -r %{_builddir}/hadoop-%{version}-src/hadoop-dist/target/hadoop-%{version}/share/hadoop/common/lib/* %{buildroot}%{_datadir}/%{name}/common/lib
popd
# hdfs jar dependencies
copy_dep_jars $hdfsdir/share/%{name}/hdfs/lib %{buildroot}%{_datadir}/%{name}/hdfs/lib
%{__ln_s} %{_jnidir}/%{name}/%{name}-hdfs-bkjournal.jar %{buildroot}%{_datadir}/%{name}/hdfs/lib
cp -f hadoop-hdfs-project/%{name}-hdfs-client/target/hadoop-hdfs-client-%{version}.jar $hdfsdir/share/%{name}/hdfs
cp -f hadoop-hdfs-project/%{name}-hdfs-httpfs/target/hadoop-hdfs-httpfs-%{version}.jar $hdfsdir/share/%{name}/hdfs
cp -f hadoop-hdfs-project/%{name}-hdfs-native-client/target/hadoop-hdfs-native-client-%{version}.jar $hdfsdir/share/%{name}/hdfs
cp -f hadoop-hdfs-project/%{name}-hdfs-nfs/target/hadoop-hdfs-nfs-%{version}.jar $hdfsdir/share/%{name}/hdfs
cp -f hadoop-hdfs-project/%{name}-hdfs-rbf/target/hadoop-hdfs-rbf-%{version}.jar $hdfsdir/share/%{name}/hdfs
pushd $hdfsdir/share/%{name}/hdfs
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/hdfs
\cp -r %{_builddir}/hadoop-%{version}-src/hadoop-dist/target/hadoop-%{version}/share/hadoop/hdfs/lib %{buildroot}%{_datadir}/%{name}/hdfs/lib
popd
# httpfs
# Create the webapp directory structure
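# This builds a CATALINA_BASE-style instance directory for the tomcat@httpfs
# service; Fedora's tomcat@ units use /var/lib/tomcats/<name> as CATALINA_BASE.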
pushd %{buildroot}%{_sharedstatedir}/tomcats/httpfs
%{__ln_s} %{_datadir}/%{name}/httpfs/tomcat/conf conf
%{__ln_s} %{_datadir}/%{name}/httpfs/tomcat/lib lib
%{__ln_s} %{_datadir}/%{name}/httpfs/tomcat/logs logs
%{__ln_s} %{_datadir}/%{name}/httpfs/tomcat/temp temp
%{__ln_s} %{_datadir}/%{name}/httpfs/tomcat/webapps webapps
%{__ln_s} %{_datadir}/%{name}/httpfs/tomcat/work work
popd
# Copy the tomcat configuration and overlay it with httpfs-specific bits.
# This is needed so the httpfs instance won't collide with a system tomcat
# instance.
for cfgfile in catalina.policy catalina.properties context.xml \
tomcat.conf web.xml server.xml logging.properties;
do
cp -a %{_sysconfdir}/tomcat/$cfgfile %{buildroot}%{_sysconfdir}/%{name}/tomcat
done
# Patch, in place, the Tomcat configuration files delivered with the current
# Fedora release. See BZ#1295968 for the rationale.
sed -i -e 's/8005/${httpfs.admin.port}/g' -e 's/8080/${httpfs.http.port}/g' %{buildroot}%{_sysconfdir}/%{name}/tomcat/server.xml
sed -i -e 's/catalina.base/httpfs.log.dir/g' %{buildroot}%{_sysconfdir}/%{name}/tomcat/logging.properties
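# The literal ports become ${httpfs.admin.port} and ${httpfs.http.port}
# placeholders, which Tomcat resolves from system properties set in the
# service environment (presumably the sysconfig file installed below).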
# With mode 660, only the root and tomcat users can access that file, not the
# build user, so copying the system copy would fail the build; install a
# bundled copy instead.
install -m 660 %{SOURCE9} %{buildroot}%{_sysconfdir}/%{name}/tomcat/tomcat-users.xml
# Copy the httpfs webapp
cp -arf %{name}-hdfs-project/%{name}-hdfs-httpfs/target/classes/webapps/webhdfs %{buildroot}%{_datadir}/%{name}/httpfs/tomcat/webapps
# Tell tomcat to follow symlinks
install -d -m 0766 %{buildroot}%{_datadir}/%{name}/httpfs/tomcat/webapps/webhdfs/META-INF/
cp %{SOURCE5} %{buildroot}%{_datadir}/%{name}/httpfs/tomcat/webapps/webhdfs/META-INF/
# Remove the jars included in the webapp and create symlinks
rm -f %{buildroot}%{_datadir}/%{name}/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/tools*.jar
rm -f %{buildroot}%{_datadir}/%{name}/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/tomcat-*.jar
pushd %{buildroot}%{_datadir}/%{name}/httpfs/tomcat
%{__ln_s} %{_datadir}/tomcat/bin bin
%{__ln_s} %{_sysconfdir}/%{name}/tomcat conf
%{__ln_s} %{_datadir}/tomcat/lib lib
%{__ln_s} %{_var}/cache/%{name}-httpfs/temp temp
%{__ln_s} %{_var}/cache/%{name}-httpfs/work work
%{__ln_s} %{_var}/log/%{name}-httpfs logs
popd
# mapreduce jar dependencies
mrdir='%{name}-mapreduce-project/target/%{name}-mapreduce-%{hadoop_version}'
copy_dep_jars $mrdir/share/%{name}/mapreduce/lib %{buildroot}%{_datadir}/%{name}/mapreduce/lib
%{__ln_s} %{_javadir}/%{name}/%{name}-annotations.jar %{buildroot}%{_datadir}/%{name}/mapreduce/lib
cp -f hadoop-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-nativetask/target/hadoop-mapreduce-client-nativetask-%{version}.jar $mrdir/share/%{name}/mapreduce
cp -f hadoop-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-uploader/target/hadoop-mapreduce-client-uploader-%{version}.jar $mrdir/share/%{name}/mapreduce
cp -f hadoop-mapreduce-project/%{name}-mapreduce-examples/target/hadoop-mapreduce-examples-%{version}.jar $mrdir/share/%{name}/mapreduce
pushd $mrdir/share/%{name}/mapreduce
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/mapreduce
popd
# yarn jar dependencies
yarndir='%{name}-yarn-project/target/%{name}-yarn-project-%{hadoop_version}'
copy_dep_jars $yarndir/share/%{name}/yarn/lib %{buildroot}%{_datadir}/%{name}/yarn/lib
%{__ln_s} %{_javadir}/%{name}/%{name}-annotations.jar %{buildroot}%{_datadir}/%{name}/yarn/lib
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-nodemanager/target/hadoop-yarn-server-nodemanager-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-router/target/hadoop-yarn-server-router-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timeline-pluginstorage/target/hadoop-yarn-server-timeline-pluginstorage-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-services/%{name}-yarn-services-api/target/hadoop-yarn-services-api-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-services/%{name}-yarn-services-core/target/hadoop-yarn-services-core-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice/target/hadoop-yarn-server-timelineservice-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice-hbase/%{name}-yarn-server-timelineservice-hbase-client/target/hadoop-yarn-server-timelineservice-hbase-client-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice-hbase/%{name}-yarn-server-timelineservice-hbase-common/target/hadoop-yarn-server-timelineservice-hbase-common-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/target/%{name}-yarn-project-%{version}/share/%{name}/yarn/timelineservice/hadoop-yarn-server-timelineservice-hbase-coprocessor-%{version}.jar $yarndir/share/%{name}/yarn
pushd $yarndir/share/%{name}/yarn
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/yarn
popd
# Install hdfs webapp bits
cp -arf hadoop-hdfs-project/hadoop-hdfs/target/webapps/* %{buildroot}%{_datadir}/%{name}/hdfs/webapps
# hadoop layout: convert to the appropriate lib location for 32- and 64-bit archs
lib=$(echo %{?_libdir} | sed -e 's:/usr/\(.*\):\1:')
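# e.g. /usr/lib64 -> lib64, /usr/lib -> lib (illustrative)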
if [ "$lib" = "%_libdir" ]; then
echo "_libdir is not located in /usr. Lib location is wrong"
exit 1
fi
sed -e "s|HADOOP_COMMON_LIB_NATIVE_DIR\s*=.*|HADOOP_COMMON_LIB_NATIVE_DIR=$lib/%{name}|" %{SOURCE1} > %{buildroot}%{_libexecdir}/%{name}-layout.sh
# Default config
cp -f %{SOURCE10} %{buildroot}%{_sysconfdir}/%{name}/core-site.xml
cp -f %{SOURCE11} %{buildroot}%{_sysconfdir}/%{name}/hdfs-site.xml
cp -f %{SOURCE12} %{buildroot}%{_sysconfdir}/%{name}/mapred-site.xml
cp -f %{SOURCE13} %{buildroot}%{_sysconfdir}/%{name}/yarn-site.xml
# systemd configuration
install -d -m 0755 %{buildroot}%{_unitdir}/
for service in %{hdfs_services} %{mapreduce_services} %{yarn_services}
do
s=$(echo $service | cut -d'-' -f 2 | cut -d'.' -f 1)
daemon=$s
if [[ "%{hdfs_services}" == *$service* ]]
then
src=%{SOURCE2}
elif [[ "%{mapreduce_services}" == *$service* ]]
then
src=%{SOURCE3}
elif [[ "%{yarn_services}" == *$service* ]]
then
if [[ "$s" == "timelineserver" ]]
then
daemon='historyserver'
fi
src=%{SOURCE4}
else
echo "Failed to determine type of service for %service"
exit 1
fi
sed -e "s|DAEMON|$daemon|g" -e "/LimitNPROC/a\SuccessExitStatus=SIGKILL" $src > %{buildroot}%{_unitdir}/%{name}-$s.service
done
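# Illustrative expansion: hadoop-datanode.service comes from the hdfs template
# with DAEMON=datanode, while hadoop-timelineserver.service maps to the
# historyserver daemon per the special case above.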
cp -f %{SOURCE7} %{buildroot}%{_sysconfdir}/sysconfig/tomcat@httpfs
# Ensure /var/run directories are recreated on boot
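# tmpfiles.d(5) entry format: Type Path Mode User Group Age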
echo "d %{_var}/run/%{name}-yarn 0775 yarn hadoop -" > %{buildroot}%{_tmpfilesdir}/%{name}-yarn.conf
echo "d %{_var}/run/%{name}-hdfs 0775 hdfs hadoop -" > %{buildroot}%{_tmpfilesdir}/%{name}-hdfs.conf
echo "d %{_var}/run/%{name}-mapreduce 0775 mapred hadoop -" > %{buildroot}%{_tmpfilesdir}/%{name}-mapreduce.conf
# logrotate config
for type in hdfs httpfs yarn mapreduce
do
sed -e "s|NAME|$type|" %{SOURCE6} > %{buildroot}%{_sysconfdir}/logrotate.d/%{name}-$type
done
sed -i "s|{|%{_var}/log/hadoop-hdfs/*.audit\n{|" %{buildroot}%{_sysconfdir}/logrotate.d/%{name}-hdfs
# hdfs init script
install -m 755 %{SOURCE8} %{buildroot}%{_sbindir}
chrpath -d %{buildroot}%{_bindir}/container-executor
chrpath -d %{buildroot}%{_bindir}/test-container-executor
%pretrans -p <lua> hdfs
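-- RPM cannot replace a symlink with a directory on upgrade; if a previous
-- install left the webapps path as a symlink, remove it before the
-- transaction runs.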
path = "%{_datadir}/%{name}/hdfs/webapps"
st = posix.stat(path)
if st and st.type == "link" then
os.remove(path)
end
%pre common
getent group hadoop >/dev/null || groupadd -r hadoop
%pre hdfs
getent group hdfs >/dev/null || groupadd -r hdfs
getent passwd hdfs >/dev/null || /usr/sbin/useradd --comment "Apache Hadoop HDFS" --shell /sbin/nologin -M -r -g hdfs -G hadoop --home %{_sharedstatedir}/%{name}-hdfs hdfs
%pre mapreduce
getent group mapred >/dev/null || groupadd -r mapred
getent passwd mapred >/dev/null || /usr/sbin/useradd --comment "Apache Hadoop MapReduce" --shell /sbin/nologin -M -r -g mapred -G hadoop --home %{_var}/cache/%{name}-mapreduce mapred
%pre yarn
getent group yarn >/dev/null || groupadd -r yarn
getent passwd yarn >/dev/null || /usr/sbin/useradd --comment "Apache Hadoop Yarn" --shell /sbin/nologin -M -r -g yarn -G hadoop --home %{_var}/cache/%{name}-yarn yarn
%preun hdfs
%systemd_preun %{hdfs_services}
%preun mapreduce
%systemd_preun %{mapreduce_services}
%preun yarn
%systemd_preun %{yarn_services}
%post common-native -p /sbin/ldconfig
%post hdfs
# Change the home directory for the hdfs user
if [[ $(getent passwd hdfs | cut -d: -f 6) != "%{_sharedstatedir}/%{name}-hdfs" ]]
then
/usr/sbin/usermod -d %{_sharedstatedir}/%{name}-hdfs hdfs
fi
if [ $1 -gt 1 ]
then
if [ -d %{_var}/cache/%{name}-hdfs ] && [ ! -L %{_var}/cache/%{name}-hdfs ]
then
# Move the existing hdfs data to the new location
mv -f %{_var}/cache/%{name}-hdfs/* %{_sharedstatedir}/%{name}-hdfs/
fi
fi
%systemd_post %{hdfs_services}
%post -n libhdfs -p /sbin/ldconfig
%post mapreduce
%systemd_post %{mapreduce_services}
%post yarn
%systemd_post %{yarn_services}
%postun common-native -p /sbin/ldconfig
%postun hdfs
%systemd_postun_with_restart %{hdfs_services}
if [ $1 -lt 1 ]
then
# Remove the compatibility symlink
rm -f %{_var}/cache/%{name}-hdfs
fi
%postun -n libhdfs -p /sbin/ldconfig
%postun mapreduce
%systemd_postun_with_restart %{mapreduce_services}
%postun yarn
%systemd_postun_with_restart %{yarn_services}
%posttrans hdfs
# Create a compatibility symlink to the new hdfs data location in case the
# user kept a modified configuration file that still points at the old path
if [ ! -e %{_var}/cache/%{name}-hdfs ]
then
%{__ln_s} %{_sharedstatedir}/%{name}-hdfs %{_var}/cache
fi
%files -f .mfiles-%{name}-client client
%{_datadir}/%{name}/client
%files -f .mfiles common
%doc LICENSE.txt
%doc NOTICE.txt
%doc README.txt
%config(noreplace) %{_sysconfdir}/%{name}/core-site.xml
%config(noreplace) %{_sysconfdir}/%{name}/%{name}-env.sh
%config(noreplace) %{_sysconfdir}/%{name}/%{name}-metrics2.properties
%config(noreplace) %{_sysconfdir}/%{name}/%{name}-policy.xml
%config(noreplace) %{_sysconfdir}/%{name}/log4j.properties
%config(noreplace) %{_sysconfdir}/%{name}/ssl-client.xml.example
%config(noreplace) %{_sysconfdir}/%{name}/ssl-server.xml.example
%config(noreplace) %{_sysconfdir}/%{name}/configuration.xsl
%dir %{_datadir}/%{name}
%dir %{_datadir}/%{name}/common
%{_datadir}/%{name}/common/lib
%{_datadir}/%{name}/common/hadoop-kms.jar
%{_datadir}/%{name}/common/hadoop-nfs.jar
%{_datadir}/%{name}/common/hadoop-auth.jar
%{_libexecdir}/%{name}-config.sh
%{_libexecdir}/%{name}-layout.sh
# Workaround for bz1012059
%{_mavenpomdir}/JPP.%{name}-%{name}-project-dist.pom
%{_bindir}/%{name}
%{_sbindir}/%{name}-daemon.sh
%{_sbindir}/%{name}-daemons.sh
%{_sbindir}/start-all.sh
%{_sbindir}/start-balancer.sh
%{_sbindir}/start-dfs.sh
%{_sbindir}/start-secure-dns.sh
%{_sbindir}/stop-all.sh
%{_sbindir}/stop-balancer.sh
%{_sbindir}/stop-dfs.sh
%{_sbindir}/stop-secure-dns.sh
%files common-native
%{_libdir}/%{name}/libhadoop.*
%files devel
%{_includedir}/%{name}
%{_libdir}/libhdfs.so
%files -f .mfiles-%{name}-hdfs hdfs
%config(noreplace) %{_sysconfdir}/%{name}/hdfs-site.xml
%{_datadir}/%{name}/hdfs
%{_unitdir}/%{name}-datanode.service
%{_unitdir}/%{name}-namenode.service
%{_unitdir}/%{name}-journalnode.service
%{_unitdir}/%{name}-secondarynamenode.service
%{_unitdir}/%{name}-zkfc.service
%{_libexecdir}/hdfs-config.sh
%{_bindir}/hdfs
%{_sbindir}/distribute-exclude.sh
%{_sbindir}/refresh-namenodes.sh
%{_sbindir}/hdfs-create-dirs
%{_tmpfilesdir}/%{name}-hdfs.conf
%config(noreplace) %attr(644, root, root) %{_sysconfdir}/logrotate.d/%{name}-hdfs
%attr(0755,hdfs,hadoop) %dir %{_var}/run/%{name}-hdfs
%attr(0755,hdfs,hadoop) %dir %{_var}/log/%{name}-hdfs
%attr(0755,hdfs,hadoop) %dir %{_sharedstatedir}/%{name}-hdfs
%files httpfs
%config(noreplace) %{_sysconfdir}/sysconfig/tomcat@httpfs
%config(noreplace) %{_sysconfdir}/%{name}/httpfs-env.sh
%config(noreplace) %{_sysconfdir}/%{name}/httpfs-log4j.properties
%config(noreplace) %{_sysconfdir}/%{name}/httpfs-site.xml
%attr(-,tomcat,tomcat) %config(noreplace) %{_sysconfdir}/%{name}/tomcat/*.*
%attr(0775,root,tomcat) %dir %{_sysconfdir}/%{name}/tomcat
%attr(0775,root,tomcat) %dir %{_sysconfdir}/%{name}/tomcat/Catalina
%attr(0775,root,tomcat) %dir %{_sysconfdir}/%{name}/tomcat/Catalina/localhost
%{_datadir}/%{name}/httpfs
%{_sharedstatedir}/tomcats/httpfs
%config(noreplace) %attr(644, root, root) %{_sysconfdir}/logrotate.d/%{name}-httpfs
%attr(0775,root,tomcat) %dir %{_var}/log/%{name}-httpfs
%attr(0775,root,tomcat) %dir %{_var}/cache/%{name}-httpfs
%attr(0775,root,tomcat) %dir %{_var}/cache/%{name}-httpfs/temp
%attr(0775,root,tomcat) %dir %{_var}/cache/%{name}-httpfs/work
%files -n libhdfs
%{_libdir}/libhdfs.so.*
%files -f .mfiles-%{name}-mapreduce mapreduce
%config(noreplace) %{_sysconfdir}/%{name}/mapred-env.sh
%config(noreplace) %{_sysconfdir}/%{name}/mapred-queues.xml.template
%config(noreplace) %{_sysconfdir}/%{name}/mapred-site.xml
%{_datadir}/%{name}/mapreduce
%{_libexecdir}/mapred-config.sh
%{_unitdir}/%{name}-historyserver.service
%{_bindir}/mapred
%{_sbindir}/mr-jobhistory-daemon.sh
%{_tmpfilesdir}/%{name}-mapreduce.conf
%config(noreplace) %attr(644, root, root) %{_sysconfdir}/logrotate.d/%{name}-mapreduce
%attr(0755,mapred,hadoop) %dir %{_var}/run/%{name}-mapreduce
%attr(0755,mapred,hadoop) %dir %{_var}/log/%{name}-mapreduce
%attr(0755,mapred,hadoop) %dir %{_var}/cache/%{name}-mapreduce
%files -f .mfiles-%{name}-mapreduce-examples mapreduce-examples
%files -f .mfiles-%{name}-maven-plugin maven-plugin
%files -f .mfiles-%{name}-tests tests
%files -f .mfiles-%{name}-yarn yarn
%config(noreplace) %{_sysconfdir}/%{name}/capacity-scheduler.xml
%config(noreplace) %{_sysconfdir}/%{name}/yarn-env.sh
%config(noreplace) %{_sysconfdir}/%{name}/yarn-site.xml
%{_unitdir}/%{name}-nodemanager.service
%{_unitdir}/%{name}-proxyserver.service
%{_unitdir}/%{name}-resourcemanager.service
%{_unitdir}/%{name}-timelineserver.service
%{_libexecdir}/yarn-config.sh
%{_datadir}/%{name}/yarn
%{_bindir}/yarn
%{_sbindir}/yarn-daemon.sh
%{_sbindir}/yarn-daemons.sh
%{_sbindir}/start-yarn.sh
%{_sbindir}/stop-yarn.sh
%{_tmpfilesdir}/%{name}-yarn.conf
%config(noreplace) %attr(644, root, root) %{_sysconfdir}/logrotate.d/%{name}-yarn
%attr(0755,yarn,hadoop) %dir %{_var}/run/%{name}-yarn
%attr(0755,yarn,hadoop) %dir %{_var}/log/%{name}-yarn
%attr(0755,yarn,hadoop) %dir %{_var}/cache/%{name}-yarn
%files yarn-security
%config(noreplace) %{_sysconfdir}/%{name}/container-executor.cfg
%changelog
* Mon Dec 16 2024 litian <dev11105@linx-info.com> - 3.3.6-8
- fix the "%patchN is deprecated" warning
* Fri Dec 13 2024 wenweijian <weijian.wen@foxmail.com> - 3.3.6-7
- Add huawei repository
* Mon Nov 11 2024 xiaochn <xiaochuannan@inspur.com> - 3.3.6-6
- Remove npm repo in spec file and use unified configuration of the build environment
* Thu Sep 26 2024 Deyuan Fan <fandeyuan@kylinos.cn> - 3.3.6-5
- fix CVE-2024-23454
* Tue Jun 25 2024 Dingli Zhang <dingli@iscas.ac.cn> - 3.3.6-4
- Remove riscv64 prebuilt files
- Build protoc and protoc-gen-grpc-java in the prep stage for riscv64
* Sun May 12 2024 Dingli Zhang <dingli@iscas.ac.cn> - 3.3.6-3
- Fix build on riscv64
- Upgrade os-maven-plugin to 1.7.1
- Remove settings.xml and use ${HOME}/.m2/repository as maven repo
* Fri Dec 15 2023 xiexing <xiexing4@hisilicon.com> - 3.3.6-2
- add conflicts to hadoop spec
* Mon Nov 27 2023 wenweijian <wenweijian2@huawei.com> - 3.3.6-1
- fix cve CVE-2023-26031
* Wed Aug 16 2023 Jia Chao <jiac13@chinaunicom.cn> - 3.3.4-4
- fix: use $HOME rather than /home/abuild so it works with all build tools.
- fix: yarn ships ELF binaries, so it is not noarch.
* Thu Jul 13 2023 sunyanan <sunyanan@xfusion.com> - 3.3.4-3
- lock triple-beam version to 1.3.0
* Thu Mar 9 2023 xiexing <xiexing4@hisilicon.com> - 3.3.4-2
- fix EBS install problem
* Mon Sep 19 2022 xiasenlin <xiasenlin1@huawei.com> - 3.3.4-1
- fix cve CVE-2021-25642
* Thu Sep 8 2022 xiasenlin <xiasenlin1@huawei.com> - 3.3.3-2
- add chrpath to solve check_rpath warning
* Thu Aug 11 2022 xiexing <xiexing4@hisilicon.com> - 3.3.3-1
- update version
* Mon Jul 12 2021 lingsheng <lingsheng@huawei.com> - 3.2.1-5
- Fix stop service failure
* Sat Jul 10 2021 wangyue <wangyue92@huawei.com> - 3.2.1-4
- Add gcc-c++ to build dependency
* Fri Jun 25 2021 wangyue <wangyue92@huawei.com> - 3.2.1-3
- Fix CVE-2019-17195
* Fri May 14 2021 wangyue <wangyue92@huawei.com> - 3.2.1-2
- Fix CVE-2020-9492
* Thu May 13 2021 Ge Wang <wangge20@huawei.com> - 3.2.1-1
- Init package