%global _hardened_build 1
%global hadoop_version %{version}
%global hdfs_services hadoop-zkfc.service hadoop-datanode.service hadoop-secondarynamenode.service hadoop-namenode.service hadoop-journalnode.service
%global mapreduce_services hadoop-historyserver.service
%global yarn_services hadoop-proxyserver.service hadoop-resourcemanager.service hadoop-nodemanager.service hadoop-timelineserver.service
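# Service unit lists for the hdfs, mapreduce and yarn subpackages; these are
# presumably expanded by the systemd scriptlets (post/preun/postun) later in
# the spec.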
# Filter out undesired provides and requires
%global __requires_exclude_from ^%{_libdir}/%{name}/libhadoop.so$
%global __provides_exclude_from ^%{_libdir}/%{name}/.*$
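# libhadoop.so is a private JNI library loaded from %{_libdir}/%{name}, not a
# system-wide shared library, so its auto-generated provides/requires would
# only pollute the package metadata.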
%define _binaries_in_noarch_packages_terminate_build 0
Name: hadoop
Version: 3.3.6
Release: 2
Summary: A software platform for processing vast amounts of data
# The BSD license file is missing
# https://issues.apache.org/jira/browse/HADOOP-9849
License: Apache-2.0 and BSD and Zlib and BSL-1.0 and MPL-2.0 and EPL-1.0 and MIT
URL: https://%{name}.apache.org
Source0: https://www.apache.org/dist/%{name}/core/%{name}-%{version}/%{name}-%{version}-src.tar.gz
Source1: %{name}-layout.sh
Source2: %{name}-hdfs.service.template
Source3: %{name}-mapreduce.service.template
Source4: %{name}-yarn.service.template
Source5: context.xml
Source6: %{name}.logrotate
Source7: %{name}-httpfs.sysconfig
Source8: hdfs-create-dirs
Source9: %{name}-tomcat-users.xml
Source10: %{name}-core-site.xml
Source11: %{name}-hdfs-site.xml
Source12: %{name}-mapred-site.xml
Source13: %{name}-yarn-site.xml
Source14: yarn-v1.22.5.tar.gz
Source15: node-12.22.1-linux-x64.tar.gz
Source16: node-v12.22.1-linux-arm64.tar.gz
Source17: settings.xml
Patch0: 01-lock-triple-beam-version-to-1.3.0.patch
BuildRoot: %{_tmppath}/%{name}-%{version}-%{release}-root
BuildRequires: java-1.8.0-openjdk-devel maven hostname maven-local tomcat cmake snappy openssl-devel
BuildRequires: cyrus-sasl-devel chrpath systemd protobuf2-compiler protobuf2-devel protobuf2-java protobuf2
BuildRequires: leveldbjni leveldb-java hawtjni-runtime gcc-c++
BuildRequires: npm
Requires: java-1.8.0-openjdk protobuf2-java apache-zookeeper
%description
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
%package client
Summary: Libraries for Apache Hadoop clients
BuildArch: noarch
Requires: %{name}-common = %{version}-%{release}
Requires: %{name}-hdfs = %{version}-%{release}
Requires: %{name}-mapreduce = %{version}-%{release}
Requires: %{name}-yarn = %{version}-%{release}
%description client
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package provides libraries for Apache Hadoop clients.
%package common
Summary: Common files needed by Apache Hadoop daemons
BuildArch: noarch
Requires(pre): /usr/sbin/useradd
Obsoletes: %{name}-javadoc < 2.4.1-22%{?dist}
Requires: apache-zookeeper
Requires: leveldb
Requires: protobuf2-java
Conflicts: hadoop-3.1-client
%description common
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package contains common files and utilities needed by other Apache
Hadoop modules.
%package common-native
Summary: The native Apache Hadoop library file
Requires: %{name}-common = %{version}-%{release}
Conflicts: hadoop-3.1-common
%description common-native
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package contains the native-hadoop library.
%package devel
Summary: Headers for Apache Hadoop
Requires: libhdfs%{?_isa} = %{version}-%{release}
Conflicts: hadoop-3.1-common-native
%description devel
Header files for Apache Hadoop's HDFS library and other utilities.
%package hdfs
Summary: The Apache Hadoop Distributed File System
BuildArch: noarch
Requires: apache-commons-daemon-jsvc
Requires: %{name}-common = %{version}-%{release}
Requires(post): systemd
Requires(preun): systemd
Requires(postun): systemd
Conflicts: hadoop-3.1-hdfs
%description hdfs
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
The Hadoop Distributed File System (HDFS) is the primary storage system
used by Apache Hadoop applications.
%package httpfs
Summary: Provides web access to HDFS
BuildArch: noarch
Requires: apache-commons-dbcp
Requires: ecj >= 1:4.2.1-6
Requires: json_simple
Requires: tomcat
Requires: tomcat-lib
Requires: tcnative
Requires(post): systemd
Requires(preun): systemd
Requires(postun): systemd
Conflicts: hadoop-3.1-httpfs
%description httpfs
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package provides a server with HTTP REST API support for the
complete FileSystem/FileContext interface in HDFS.
%package -n libhdfs
Summary: The Apache Hadoop Filesystem Library
Requires: %{name}-hdfs = %{version}-%{release}
Requires: lzo
Conflicts: hadoop-3.1-libhdfs
%description -n libhdfs
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package provides the Apache Hadoop Filesystem Library.
%package mapreduce
Summary: Apache Hadoop MapReduce (MRv2)
BuildArch: noarch
Requires: %{name}-common = %{version}-%{release}
Requires: %{name}-mapreduce-examples = %{version}-%{release}
Requires(post): systemd
Requires(preun): systemd
Requires(postun): systemd
Conflicts: hadoop-3.1-mapreduce
%description mapreduce
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package provides Apache Hadoop MapReduce (MRv2).
%package mapreduce-examples
Summary: Apache Hadoop MapReduce (MRv2) examples
BuildArch: noarch
Requires: hsqldb
Conflicts: hadoop-3.1-mapreduce-examples
%description mapreduce-examples
This package contains Apache Hadoop MapReduce (MRv2) examples.
%package maven-plugin
Summary: Apache Hadoop Maven plugin
BuildArch: noarch
Requires: maven
Conflicts: hadoop-3.1-maven-plugin
%description maven-plugin
The Apache Hadoop Maven plugin.
%package tests
Summary: Apache Hadoop test resources
BuildArch: noarch
Requires: %{name}-common = %{version}-%{release}
Requires: %{name}-hdfs = %{version}-%{release}
Requires: %{name}-mapreduce = %{version}-%{release}
Requires: %{name}-yarn = %{version}-%{release}
Conflicts: hadoop-3.1-tests
%description tests
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package contains test related resources for Apache Hadoop.
%package yarn
Summary: Apache Hadoop YARN
Requires: %{name}-common = %{version}-%{release}
Requires: %{name}-mapreduce = %{version}-%{release}
Requires: aopalliance
Requires: atinject
Requires: hamcrest
Requires: hawtjni
Requires: leveldbjni
Requires(post): systemd
Requires(preun): systemd
Requires(postun): systemd
Conflicts: hadoop-3.1-yarn nodejs-yarn
%description yarn
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package contains Apache Hadoop YARN.
%package yarn-security
Summary: The ability to run Apache Hadoop YARN in secure mode
Requires: %{name}-yarn = %{version}-%{release}
Conflicts: hadoop-3.1-yarn-security
%description yarn-security
Apache Hadoop is a framework that allows for the distributed processing of
large data sets across clusters of computers using simple programming models.
It is designed to scale up from single servers to thousands of machines, each
offering local computation and storage.
This package contains files needed to run Apache Hadoop YARN in secure mode.
%prep
%autosetup -p1 -n %{name}-%{version}-src
cp %{SOURCE17} ./
sed -i "s,@HOME@,${HOME},g" settings.xml
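# Pre-install the system-provided JARs into the local Maven repository so the
# offline build can resolve them; the -Dfile paths assume the layout of the
# distribution's leveldbjni, leveldb-java and hawtjni packages.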
mvn install:install-file -DgroupId=org.fusesource.leveldbjni -DartifactId=leveldbjni-all -Dversion=1.8 -Dpackaging=jar -Dfile=/usr/lib/java/leveldbjni-all.jar -s settings.xml
mvn install:install-file -DgroupId=org.fusesource.leveldbjni -DartifactId=leveldbjni -Dversion=1.8 -Dpackaging=jar -Dfile=/usr/lib/java/leveldbjni/leveldbjni.jar -s settings.xml
mvn install:install-file -DgroupId=org.iq80.leveldb -DartifactId=leveldb-api -Dversion=0.7 -Dpackaging=jar -Dfile=/usr/share/java/leveldb-java/leveldb-api.jar -s settings.xml
mvn install:install-file -DgroupId=org.iq80.leveldb -DartifactId=leveldb-benchmark -Dversion=0.7 -Dpackaging=jar -Dfile=/usr/share/java/leveldb-java/leveldb-benchmark.jar -s settings.xml
mvn install:install-file -DgroupId=org.iq80.leveldb -DartifactId=leveldb -Dversion=0.7 -Dpackaging=jar -Dfile=/usr/share/java/leveldb-java/leveldb.jar -s settings.xml
mvn install:install-file -DgroupId=org.fusesource.hawtjni -DartifactId=hawtjni-runtime -Dversion=1.16 -Dpackaging=jar -Dfile=/usr/lib/java/hawtjni/hawtjni-runtime.jar -s settings.xml
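# Pre-seed the local repository with the node and yarn tarballs so the
# frontend-maven-plugin (com.github.eirslett coordinates) does not download
# them while building the YARN UI, then point npm/yarn at a mirror registry.
# The renames below match the artifact file names the plugin resolves
# (an assumption based on its repository layout).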
mkdir -p ${HOME}/repository/com/github/eirslett/node/12.22.1/
cp %{SOURCE15} ${HOME}/repository/com/github/eirslett/node/12.22.1/
cp %{SOURCE16} ${HOME}/repository/com/github/eirslett/node/12.22.1/
mv ${HOME}/repository/com/github/eirslett/node/12.22.1/node-v12.22.1-linux-arm64.tar.gz ${HOME}/repository/com/github/eirslett/node/12.22.1/node-12.22.1-linux-arm64.tar.gz
mkdir -p ${HOME}/repository/com/github/eirslett/yarn/1.22.5/
cp %{SOURCE14} ${HOME}/repository/com/github/eirslett/yarn/1.22.5/
mv ${HOME}/repository/com/github/eirslett/yarn/1.22.5/yarn-v1.22.5.tar.gz ${HOME}/repository/com/github/eirslett/yarn/1.22.5/yarn-1.22.5.tar.gz
tar -xzvf ${HOME}/repository/com/github/eirslett/yarn/1.22.5/yarn-1.22.5.tar.gz -C ${HOME}/repository/com/github/eirslett/yarn/1.22.5/
npm config set registry https://repo.huaweicloud.com/repository/npm/
npm cache clean -f
${HOME}/repository/com/github/eirslett/yarn/1.22.5/yarn-v1.22.5/bin/yarn config set registry https://repo.huaweicloud.com/repository/npm/ -g
${HOME}/repository/com/github/eirslett/yarn/1.22.5/yarn-v1.22.5/bin/yarn config set ignore-engines true
%pom_add_dep org.iq80.leveldb:leveldb-api:0.7 hadoop-hdfs-project/hadoop-hdfs
%pom_add_dep org.iq80.leveldb:leveldb-api:0.7 hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-web-proxy
%pom_add_dep org.iq80.leveldb:leveldb-api:0.7 hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-applicationhistoryservice
%pom_add_dep org.iq80.leveldb:leveldb-api:0.7 hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common
%pom_add_dep org.fusesource.leveldbjni:leveldbjni:1.8 hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server
%pom_add_dep org.fusesource.hawtjni:hawtjni-runtime:1.16 hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-applicationhistoryservice
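# Disable modules that cannot be built here, presumably because their build
# or test dependencies (e.g. MiniKDC's ApacheDS, the Azure SDK) are not
# packaged for this distribution.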
%pom_disable_module hadoop-minikdc hadoop-common-project
%pom_disable_module hadoop-pipes hadoop-tools
%pom_disable_module hadoop-azure hadoop-tools
%pom_disable_module hadoop-yarn-server-timelineservice-hbase-tests hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/pom.xml
# WAR files we don't want
%mvn_package :%{name}-auth-examples __noinstall
%mvn_package :%{name}-hdfs-httpfs __noinstall
# Parts we don't want to distribute
%mvn_package :%{name}-assemblies __noinstall
# Workaround for bz1012059
%mvn_package :%{name}-project-dist __noinstall
# Create separate file lists for packaging
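# (The mvn_package patterns below are xmvn artifact coordinates of the form
# groupId:artifactId:extension:classifier:version, so ":::tests:" matches
# every artifact carrying the "tests" classifier.)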
%mvn_package :::tests: %{name}-tests
%mvn_package :%{name}-*-tests::{}: %{name}-tests
%mvn_package :%{name}-client*::{}: %{name}-client
%mvn_package :%{name}-hdfs*::{}: %{name}-hdfs
%mvn_package :%{name}-mapreduce-examples*::{}: %{name}-mapreduce-examples
%mvn_package :%{name}-mapreduce*::{}: %{name}-mapreduce
%mvn_package :%{name}-archives::{}: %{name}-mapreduce
%mvn_package :%{name}-datajoin::{}: %{name}-mapreduce
%mvn_package :%{name}-distcp::{}: %{name}-mapreduce
%mvn_package :%{name}-extras::{}: %{name}-mapreduce
%mvn_package :%{name}-gridmix::{}: %{name}-mapreduce
%mvn_package :%{name}-openstack::{}: %{name}-mapreduce
%mvn_package :%{name}-rumen::{}: %{name}-mapreduce
%mvn_package :%{name}-sls::{}: %{name}-mapreduce
%mvn_package :%{name}-streaming::{}: %{name}-mapreduce
%mvn_package :%{name}-tools*::{}: %{name}-mapreduce
%mvn_package :%{name}-maven-plugins::{}: %{name}-maven-plugin
%mvn_package :%{name}-minicluster::{}: %{name}-tests
%mvn_package :%{name}-yarn*::{}: %{name}-yarn
# Jar files that need to be overridden due to installation location
%mvn_file :%{name}-common::tests: %{name}/%{name}-common
%build
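# Single-pass build: -Pdist,native produces the distribution layout plus the
# native bits (libhadoop, container-executor); tests, integration tests and
# javadoc are skipped, and the settings.xml prepared during prep keeps
# dependency resolution local.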
mvn clean -Dsnappy.lib=/usr/lib64 -Dbundle.snappy -Dcontainer-executor.conf.dir=%{_sysconfdir}/%{name} -Pdist,native -DskipTests -DskipIT -Dmaven.javadoc.skip=true package -s settings.xml
%install
# Copy all jar files except those generated by the build
# $1 the src directory
# $2 the dest directory
copy_dep_jars()
{
find "$1" ! -name "hadoop-*.jar" -name "*.jar" | xargs install -m 0644 -t "$2"
rm -f "$2"/tools-*.jar
}
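# Usage: copy_dep_jars SRC_DIR DEST_DIR -- copies every third-party *.jar
# from SRC_DIR into DEST_DIR but skips the hadoop-*.jar artifacts, which are
# installed once under /usr/share/java/hadoop (or the JNI dir) and symlinked
# back instead.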
# Create symlinks for jars from the build
# $1 the location to create the symlink
link_hadoop_jars()
{
for f in `ls hadoop-* | grep -v tests | grep -v examples`
do
n=`echo $f | sed -e "s/-%{version}//" -e "s/1.1.1//"`
if [ -L $1/$n ]
then
continue
elif [ -e $1/$f ]
then
rm -f $1/$f $1/$n
fi
p=`find %{buildroot}%{_jnidir} %{buildroot}%{_javadir}/%{name} -name $n | sed "s#%{buildroot}##"`
%{__ln_s} $p $1/$n
done
}
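# Usage: run inside a directory holding versioned hadoop-*.jar files;
# for each jar, link_hadoop_jars DEST_DIR creates a versionless symlink in
# DEST_DIR pointing at the copy installed under the JNI or java directory.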
%mvn_install
install -d -m 0755 %{buildroot}%{_libdir}/%{name}
install -d -m 0755 %{buildroot}%{_includedir}/%{name}
install -d -m 0755 %{buildroot}%{_jnidir}/%{name}
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/client/lib
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/common/lib
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/hdfs/lib
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/hdfs/webapps
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/httpfs/tomcat/webapps
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/mapreduce/lib
install -d -m 0755 %{buildroot}%{_datadir}/%{name}/yarn/lib
install -d -m 0755 %{buildroot}%{_sysconfdir}/%{name}/tomcat/Catalina/localhost
install -d -m 0755 %{buildroot}%{_sysconfdir}/logrotate.d
install -d -m 0755 %{buildroot}%{_sysconfdir}/sysconfig
install -d -m 0755 %{buildroot}%{_tmpfilesdir}
install -d -m 0755 %{buildroot}%{_sharedstatedir}/%{name}-hdfs
install -d -m 0755 %{buildroot}%{_sharedstatedir}/tomcats/httpfs
install -d -m 0755 %{buildroot}%{_var}/cache/%{name}-yarn
install -d -m 0755 %{buildroot}%{_var}/cache/%{name}-httpfs/temp
install -d -m 0755 %{buildroot}%{_var}/cache/%{name}-httpfs/work
install -d -m 0755 %{buildroot}%{_var}/cache/%{name}-mapreduce
install -d -m 0755 %{buildroot}%{_var}/log/%{name}-yarn
install -d -m 0755 %{buildroot}%{_var}/log/%{name}-hdfs
install -d -m 0755 %{buildroot}%{_var}/log/%{name}-httpfs
install -d -m 0755 %{buildroot}%{_var}/log/%{name}-mapreduce
install -d -m 0755 %{buildroot}%{_var}/run/%{name}-yarn
install -d -m 0755 %{buildroot}%{_var}/run/%{name}-hdfs
install -d -m 0755 %{buildroot}%{_var}/run/%{name}-mapreduce
basedir='%{name}-common-project/%{name}-common/target/%{name}-common-%{hadoop_version}'
hdfsdir='%{name}-hdfs-project/%{name}-hdfs/target/%{name}-hdfs-%{hadoop_version}'
httpfsdir='%{name}-hdfs-project/%{name}-hdfs-httpfs/target/%{name}-hdfs-httpfs-%{hadoop_version}'
mapreddir='%{name}-mapreduce-project/target/%{name}-mapreduce-%{hadoop_version}'
yarndir='%{name}-yarn-project/target/%{name}-yarn-project-%{hadoop_version}'
# copy jar package
install -d -m 0755 %{buildroot}%{_datadir}/java/%{name}
install -d -m 0755 %{buildroot}%{_datadir}/maven-poms/%{name}
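# Every artifact installed below is also appended to an .mfiles* list; the
# files sections (outside this excerpt) presumably consume these lists with
# -f, alongside the lists generated by mvn_install.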
# client
install -m 0755 %{name}-client-modules/%{name}-client/target/hadoop-client-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-client.jar
echo %{_datadir}/java/%{name}/hadoop-client.jar >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-client.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-client.pom >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client-api/target/hadoop-client-api-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-client-api.jar
echo %{_datadir}/java/%{name}/hadoop-client-api.jar >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client-api/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-client-api.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-client-api.pom >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client-minicluster/target/hadoop-client-minicluster-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-client-minicluster.jar
echo %{_datadir}/java/%{name}/hadoop-client-minicluster.jar >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client-minicluster/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-client-minicluster.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-client-minicluster.pom >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client-runtime/target/hadoop-client-runtime-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-client-runtime.jar
echo %{_datadir}/java/%{name}/hadoop-client-runtime.jar >> .mfiles-hadoop-client
install -m 0755 %{name}-client-modules/%{name}-client-runtime/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-client-runtime.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-client-runtime.pom >> .mfiles-hadoop-client
# common
install -m 0755 %{name}-common-project/%{name}-annotations/target/hadoop-annotations-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-annotations.jar
echo %{_datadir}/java/%{name}/hadoop-annotations.jar >> .mfiles
install -m 0755 %{name}-common-project/%{name}-auth/target/hadoop-auth-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-auth.jar
echo %{_datadir}/java/%{name}/hadoop-auth.jar >> .mfiles
install -m 0755 %{name}-tools/%{name}-aws/target/hadoop-aws-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-aws.jar
echo %{_datadir}/java/%{name}/hadoop-aws.jar >> .mfiles
install -m 0755 %{name}-build-tools/target/hadoop-build-tools-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-build-tools.jar
echo %{_datadir}/java/%{name}/hadoop-build-tools.jar >> .mfiles
install -m 0755 %{name}-common-project/%{name}-nfs/target/hadoop-nfs-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-nfs.jar
echo %{_datadir}/java/%{name}/hadoop-nfs.jar >> .mfiles
install -m 0755 %{name}-common-project/%{name}-common/target/hadoop-common-%{version}.jar %{buildroot}%{_prefix}/lib/java/hadoop/hadoop-common.jar
echo %{_prefix}/lib/java/hadoop/hadoop-common.jar >> .mfiles
install -m 0755 %{name}-common-project/%{name}-kms/target/hadoop-kms-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-kms.jar
echo %{_datadir}/java/%{name}/hadoop-kms.jar >> .mfiles
install -m 0755 %{name}-common-project/%{name}-annotations/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-annotations.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-annotations.pom >> .mfiles
install -m 0755 %{name}-common-project/%{name}-auth/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-auth.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-auth.pom >> .mfiles
install -m 0755 %{name}-tools/%{name}-aws/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-aws.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-aws.pom >> .mfiles
install -m 0755 %{name}-build-tools/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-build-tools.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-build-tools.pom >> .mfiles
install -m 0755 %{name}-common-project/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-common-project.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-common-project.pom >> .mfiles
install -m 0755 %{name}-common-project/%{name}-common/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-common.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-common.pom >> .mfiles
install -m 0755 %{name}-dist/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-dist.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-dist.pom >> .mfiles
install -m 0755 %{name}-common-project/%{name}-nfs/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-nfs.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-nfs.pom >> .mfiles
install -m 0755 %{name}-project/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-project.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-project.pom >> .mfiles
install -m 0755 %{name}-common-project/%{name}-kms/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-kms.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-kms.pom >> .mfiles
echo %{_sysconfdir}/%{name}/hadoop-user-functions.sh.example >> .mfiles
echo %{_sysconfdir}/%{name}/shellprofile.d/example.sh >> .mfiles
echo %{_sysconfdir}/%{name}/workers >> .mfiles
echo %{_prefix}/libexec/hadoop-functions.sh >> .mfiles
echo %{_prefix}/libexec/hadoop-layout.sh.example >> .mfiles
echo %{_prefix}/sbin/workers.sh >> .mfiles
echo %{_datadir}/%{name}/common/hadoop-common.jar >> .mfiles
# hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-client/target/hadoop-hdfs-client-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-client.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-client.jar >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-nfs/target/hadoop-hdfs-nfs-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-nfs.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-nfs.jar >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs/target/hadoop-hdfs-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs.jar >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-nfs/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-hdfs-nfs.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-hdfs-nfs.pom >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-hdfs-project.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-hdfs-project.pom >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-hdfs.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-hdfs.pom >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-httpfs/target/hadoop-hdfs-httpfs-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-httpfs.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-httpfs.jar >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-httpfs/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-hdfs-httpfs.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-hdfs-httpfs.pom >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-native-client/target/hadoop-hdfs-native-client-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-native-client.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-native-client.jar >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-native-client/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-hdfs-native-client.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-hdfs-native-client.pom >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-rbf/target/hadoop-hdfs-rbf-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-rbf.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-rbf.jar >> .mfiles-hadoop-hdfs
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-rbf/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-hdfs-rbf.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-hdfs-rbf.pom >> .mfiles-hadoop-hdfs
echo %{_prefix}/libexec/shellprofile.d/hadoop-hdfs.sh >> .mfiles-hadoop-hdfs
# mapreduce
install -m 0755 %{name}-tools/%{name}-archives/target/hadoop-archives-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-archives.jar
echo %{_datadir}/java/%{name}/hadoop-archives.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-datajoin/target/hadoop-datajoin-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-datajoin.jar
echo %{_datadir}/java/%{name}/hadoop-datajoin.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-distcp/target/hadoop-distcp-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-distcp.jar
echo %{_datadir}/java/%{name}/hadoop-distcp.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-extras/target/hadoop-extras-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-extras.jar
echo %{_datadir}/java/%{name}/hadoop-extras.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-gridmix/target/hadoop-gridmix-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-gridmix.jar
echo %{_datadir}/java/%{name}/hadoop-gridmix.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-app/target/hadoop-mapreduce-client-app-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-app.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-app.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-common/target/hadoop-mapreduce-client-common-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-common.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-common.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-core/target/hadoop-mapreduce-client-core-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-core.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-core.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-hs-plugins/target/hadoop-mapreduce-client-hs-plugins-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-hs-plugins.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-hs-plugins.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-hs/target/hadoop-mapreduce-client-hs-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-hs.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-hs.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-jobclient/target/hadoop-mapreduce-client-jobclient-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-jobclient.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-jobclient.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-shuffle/target/hadoop-mapreduce-client-shuffle-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-shuffle.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-shuffle.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-openstack/target/hadoop-openstack-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-openstack.jar
echo %{_datadir}/java/%{name}/hadoop-openstack.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-rumen/target/hadoop-rumen-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-rumen.jar
echo %{_datadir}/java/%{name}/hadoop-rumen.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-sls/target/hadoop-sls-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-sls.jar
echo %{_datadir}/java/%{name}/hadoop-sls.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-streaming/target/hadoop-streaming-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-streaming.jar
echo %{_datadir}/java/%{name}/hadoop-streaming.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-tools-dist/target/hadoop-tools-dist-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-tools-dist.jar
echo %{_datadir}/java/%{name}/hadoop-tools-dist.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-archives/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-archives.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-archives.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-datajoin/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-datajoin.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-datajoin.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-distcp/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-distcp.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-distcp.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-extras/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-extras.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-extras.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-gridmix/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-gridmix.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-gridmix.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-app/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-app.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-app.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-common/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-common.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-common.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-core/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-core.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-core.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-hs-plugins/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-hs-plugins.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-hs-plugins.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-hs/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-hs.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-hs.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-jobclient/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-jobclient.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-jobclient.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-shuffle/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-shuffle.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-shuffle.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-openstack/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-openstack.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-openstack.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-rumen/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-rumen.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-rumen.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-sls/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-sls.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-sls.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-streaming/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-streaming.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-streaming.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/%{name}-tools-dist/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-tools-dist.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-tools-dist.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-tools/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-tools.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-tools.pom >> .mfiles-hadoop-mapreduce
echo %{_prefix}/libexec/shellprofile.d/hadoop-mapreduce.sh >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-nativetask/target/hadoop-mapreduce-client-nativetask-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-nativetask.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-nativetask.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-nativetask/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-nativetask.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-nativetask.pom >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-uploader/target/hadoop-mapreduce-client-uploader-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-uploader.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-uploader.jar >> .mfiles-hadoop-mapreduce
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-uploader/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-uploader.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-client-uploader.pom >> .mfiles-hadoop-mapreduce
# mapreduce-examples
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-examples/target/hadoop-mapreduce-examples-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-examples.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-examples.jar >> .mfiles-hadoop-mapreduce-examples
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-examples/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-mapreduce-examples.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-mapreduce-examples.pom >> .mfiles-hadoop-mapreduce-examples
# maven-plugin
install -m 0755 %{name}-maven-plugins/target/hadoop-maven-plugins-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-maven-plugins.jar
echo %{_datadir}/java/%{name}/hadoop-maven-plugins.jar >> .mfiles-hadoop-maven-plugin
install -m 0755 %{name}-maven-plugins/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-maven-plugins.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-maven-plugins.pom >> .mfiles-hadoop-maven-plugin
# tests
install -m 0755 %{name}-client-modules/%{name}-client/target/hadoop-client-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-client-tests.jar
echo %{_datadir}/java/%{name}/hadoop-client-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-common-project/%{name}-common/target/hadoop-common-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-common-tests.jar
echo %{_datadir}/java/%{name}/hadoop-common-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-hdfs-project/%{name}-hdfs/target/hadoop-hdfs-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-tests.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-app/target/hadoop-mapreduce-client-app-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-app-tests.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-app-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-jobclient/target/hadoop-mapreduce-client-jobclient-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-mapreduce-client-jobclient-tests.jar
echo %{_datadir}/java/%{name}/hadoop-mapreduce-client-jobclient-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-minicluster/target/hadoop-minicluster-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-minicluster.jar
echo %{_datadir}/java/%{name}/hadoop-minicluster.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-tools/%{name}-tools-dist/target/hadoop-tools-dist-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-tools-dist-tests.jar
echo %{_datadir}/java/%{name}/hadoop-tools-dist-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-common/target/hadoop-yarn-common-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-common-tests.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-common-tests.jar >> .mfiles-hadoop-tests
#install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-registry/target/hadoop-yarn-registry-%{version}-test-sources.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-registry-test-sources.jar
#echo %{_datadir}/java/%{name}/hadoop-yarn-registry-test-sources.jar >> .mfiles-hadoop-test-sources
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-resourcemanager/target/hadoop-yarn-server-resourcemanager-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-resourcemanager-tests.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-resourcemanager-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-sharedcachemanager/target/hadoop-yarn-server-sharedcachemanager-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-sharedcachemanager-tests.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-sharedcachemanager-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-tests/target/hadoop-yarn-server-tests-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-tests-tests.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-tests-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-tests/target/hadoop-yarn-server-tests-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-tests.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-tests.jar >> .mfiles-hadoop-tests
install -m 0755 %{name}-minicluster/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-minicluster.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-minicluster.pom >> .mfiles-hadoop-tests
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-tests/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-tests.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-tests.pom >> .mfiles-hadoop-tests
install -m 0755 %{name}-hdfs-project/%{name}-hdfs-client/target/hadoop-hdfs-client-%{version}-tests.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-hdfs-client-tests.jar
echo %{_datadir}/java/%{name}/hadoop-hdfs-client-tests.jar >> .mfiles-hadoop-tests
# yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-api/target/hadoop-yarn-api-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-api.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-api.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-applications-distributedshell/target/hadoop-yarn-applications-distributedshell-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-applications-distributedshell.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-applications-distributedshell.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-applications-unmanaged-am-launcher/target/hadoop-yarn-applications-unmanaged-am-launcher-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-applications-unmanaged-am-launcher.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-applications-unmanaged-am-launcher.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-client/target/hadoop-yarn-client-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-client.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-client.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-common/target/hadoop-yarn-common-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-common.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-common.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-registry/target/hadoop-yarn-registry-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-registry.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-registry.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-applicationhistoryservice/target/hadoop-yarn-server-applicationhistoryservice-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-applicationhistoryservice.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-applicationhistoryservice.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-common/target/hadoop-yarn-server-common-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-common.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-common.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-resourcemanager/target/hadoop-yarn-server-resourcemanager-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-resourcemanager.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-resourcemanager.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-sharedcachemanager/target/hadoop-yarn-server-sharedcachemanager-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-sharedcachemanager.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-sharedcachemanager.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-web-proxy/target/hadoop-yarn-server-web-proxy-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-web-proxy.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-web-proxy.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-nodemanager/target/hadoop-yarn-server-nodemanager-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-nodemanager.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-nodemanager.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-router/target/hadoop-yarn-server-router-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-router.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-router.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-router/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-router.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-router.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timeline-pluginstorage/target/hadoop-yarn-server-timeline-pluginstorage-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-timeline-pluginstorage.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-timeline-pluginstorage.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice/target/hadoop-yarn-server-timelineservice-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice-hbase/%{name}-yarn-server-timelineservice-hbase-client/target/hadoop-yarn-server-timelineservice-hbase-client-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice-hbase-client.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice-hbase-client.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice-hbase/%{name}-yarn-server-timelineservice-hbase-common/target/hadoop-yarn-server-timelineservice-hbase-common-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice-hbase-common.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice-hbase-common.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/target/%{name}-yarn-project-%{version}/share/%{name}/yarn/timelineservice/hadoop-yarn-server-timelineservice-hbase-coprocessor-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice-hbase-coprocessor.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-server-timelineservice-hbase-coprocessor.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timeline-pluginstorage/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timeline-pluginstorage.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timeline-pluginstorage.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-services/%{name}-yarn-services-api/target/hadoop-yarn-services-api-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-services-api.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-services-api.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-services/%{name}-yarn-services-api/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-services-api.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-services-api.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-services/%{name}-yarn-services-core/target/hadoop-yarn-services-core-%{version}.jar %{buildroot}%{_datadir}/java/%{name}/hadoop-yarn-services-core.jar
echo %{_datadir}/java/%{name}/hadoop-yarn-services-core.jar >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-services/%{name}-yarn-services-core/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-services-core.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-services-core.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-api/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-api.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-api.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/hadoop-yarn-applications-distributedshell/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-applications-distributedshell.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-applications-distributedshell.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/hadoop-yarn-applications-unmanaged-am-launcher/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-applications-unmanaged-am-launcher.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-applications-unmanaged-am-launcher.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-applications/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-applications.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-applications.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-client/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-client.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-client.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-common/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-common.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-common.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-registry/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-registry.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-registry.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/hadoop-yarn-server-applicationhistoryservice/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-applicationhistoryservice.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-applicationhistoryservice.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/hadoop-yarn-server-common/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-common.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-common.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/hadoop-yarn-server-nodemanager/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-nodemanager.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-nodemanager.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/hadoop-yarn-server-resourcemanager/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-resourcemanager.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-resourcemanager.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/hadoop-yarn-server-sharedcachemanager/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-sharedcachemanager.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-sharedcachemanager.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/hadoop-yarn-server-web-proxy/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-web-proxy.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-web-proxy.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-site/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-site.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-site.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timelineservice.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timelineservice.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice-hbase/%{name}-yarn-server-timelineservice-hbase-client/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timelineservice-hbase-client.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timelineservice-hbase-client.pom >> .mfiles-hadoop-yarn
install -m 0755 %{name}-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice-hbase/%{name}-yarn-server-timelineservice-hbase-common/pom.xml %{buildroot}%{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timelineservice-hbase-common.pom
echo %{_datadir}/maven-poms/%{name}/hadoop-yarn-server-timelineservice-hbase-common.pom >> .mfiles-hadoop-yarn
echo %{_sysconfdir}/%{name}/yarnservice-log4j.properties >> .mfiles-hadoop-yarn
echo %{_prefix}/bin/container-executor >> .mfiles-hadoop-yarn
echo %{_prefix}/bin/oom-listener >> .mfiles-hadoop-yarn
echo %{_prefix}/bin/test-container-executor >> .mfiles-hadoop-yarn
echo %{_prefix}/libexec/shellprofile.d/hadoop-yarn.sh >> .mfiles-hadoop-yarn
echo %{_prefix}/sbin/FederationStateStore/* >> .mfiles-hadoop-yarn
# copy script folders
for dir in bin libexec sbin
do
cp -arf $basedir/$dir %{buildroot}%{_prefix}
cp -arf $hdfsdir/$dir %{buildroot}%{_prefix}
cp -arf $mapreddir/$dir %{buildroot}%{_prefix}
cp -arf $yarndir/$dir %{buildroot}%{_prefix}
done
# This binary is obsolete and conflicts with qt-devel
rm -rf %{buildroot}%{_bindir}/rcc
# Duplicate files
rm -f %{buildroot}%{_sbindir}/hdfs-config.sh
# copy config files
cp -arf $basedir/etc/* %{buildroot}%{_sysconfdir}
cp -arf $httpfsdir/etc/* %{buildroot}%{_sysconfdir}
cp -arf $mapreddir/etc/* %{buildroot}%{_sysconfdir}
cp -arf $yarndir/etc/* %{buildroot}%{_sysconfdir}
# copy binaries
cp -arf $basedir/lib/native/libhadoop.so* %{buildroot}%{_libdir}/%{name}
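# Strip the build-time RPATH from the native libraries; shipping binaries
# with an RPATH is flagged by rpmlint and unnecessary on the default
# library path.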
chrpath --delete %{buildroot}%{_libdir}/%{name}/*
cp -arf ./hadoop-hdfs-project/hadoop-hdfs-native-client/target/hadoop-hdfs-native-client-%{version}/include/hdfs.h %{buildroot}%{_includedir}/%{name}
cp -arf ./hadoop-hdfs-project/hadoop-hdfs-native-client/target/hadoop-hdfs-native-client-%{version}/lib/native/libhdfs.so* %{buildroot}%{_libdir}
chrpath --delete %{buildroot}%{_libdir}/libhdfs*
# Not needed, since httpfs is deployed via the existing systemd setup
rm -f %{buildroot}%{_sbindir}/httpfs.sh
rm -f %{buildroot}%{_libexecdir}/httpfs-config.sh
rm -f %{buildroot}%{_bindir}/httpfs-env.sh
# Remove files with .cmd extension
find %{buildroot} -name "*.cmd" | xargs rm -f
# Modify hadoop-env.sh to point to correct locations for JAVA_HOME
# and JSVC_HOME.
sed -i "s|\${JAVA_HOME}|/usr/lib/jvm/jre|" %{buildroot}%{_sysconfdir}/%{name}/%{name}-env.sh
sed -i "s|\${JSVC_HOME}|/usr/bin|" %{buildroot}%{_sysconfdir}/%{name}/%{name}-env.sh
# Ensure the Java-provided DocumentBuilderFactory is used
sed -i "s|\(HADOOP_OPTS.*=.*\)\$HADOOP_CLIENT_OPTS|\1 -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl \$HADOOP_CLIENT_OPTS|" %{buildroot}%{_sysconfdir}/%{name}/%{name}-env.sh
echo "export YARN_OPTS=\"\$YARN_OPTS -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl\"" >> %{buildroot}%{_sysconfdir}/%{name}/yarn-env.sh
# Workaround for bz1012059
install -d -m 0755 %{buildroot}%{_mavenpomdir}/
install -pm 644 hadoop-project-dist/pom.xml %{buildroot}%{_mavenpomdir}/JPP.%{name}-%{name}-project-dist.pom
%{__ln_s} %{_jnidir}/%{name}/hadoop-common.jar %{buildroot}%{_datadir}/%{name}/common
%{__ln_s} %{_javadir}/%{name}/hadoop-hdfs.jar %{buildroot}%{_datadir}/%{name}/hdfs
%{__ln_s} %{_javadir}/%{name}/hadoop-client.jar %{buildroot}%{_datadir}/%{name}/client
# client jar dependencies
copy_dep_jars hadoop-client-modules/%{name}-client/target/%{name}-client-%{hadoop_version}/share/%{name}/client/lib %{buildroot}%{_datadir}/%{name}/client/lib
pushd hadoop-client-modules/%{name}-client/target/%{name}-client-%{hadoop_version}/share/%{name}/client/lib
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/client/lib
popd
cp -f hadoop-client-modules/%{name}-client-api/target/hadoop-client-api-%{version}.jar hadoop-client-modules/%{name}-client/target/%{name}-client-%{hadoop_version}/share/%{name}/client
cp -f hadoop-client-modules/%{name}-client-minicluster/target/hadoop-client-minicluster-%{version}.jar hadoop-client-modules/%{name}-client/target/%{name}-client-%{hadoop_version}/share/%{name}/client
cp -f hadoop-client-modules/%{name}-client-runtime/target/hadoop-client-runtime-%{version}.jar hadoop-client-modules/%{name}-client/target/%{name}-client-%{hadoop_version}/share/%{name}/client
pushd hadoop-client-modules/%{name}-client/target/%{name}-client-%{hadoop_version}/share/%{name}/client
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/client
popd
# common jar dependencies
copy_dep_jars $basedir/share/%{name}/common/lib %{buildroot}%{_datadir}/%{name}/common/lib
cp -f hadoop-common-project/%{name}-kms/target/hadoop-kms-%{version}.jar $basedir/share/%{name}/common
cp -f hadoop-common-project/%{name}-nfs/target/hadoop-nfs-%{version}.jar $basedir/share/%{name}/common
cp -f hadoop-common-project/%{name}-auth/target/hadoop-auth-%{version}.jar $basedir/share/%{name}/common
pushd $basedir/share/%{name}/common
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/common
popd
pushd $basedir/share/%{name}/common/lib
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/common/lib
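# the leading backslash bypasses any cp alias so existing files are overwritten without prompting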
\cp -r %{_builddir}/hadoop-%{version}-src/hadoop-dist/target/hadoop-%{version}/share/hadoop/common/lib/* %{buildroot}%{_datadir}/%{name}/common/lib
popd
# hdfs jar dependencies
copy_dep_jars $hdfsdir/share/%{name}/hdfs/lib %{buildroot}%{_datadir}/%{name}/hdfs/lib
%{__ln_s} %{_jnidir}/%{name}/%{name}-hdfs-bkjournal.jar %{buildroot}%{_datadir}/%{name}/hdfs/lib
cp -f hadoop-hdfs-project/%{name}-hdfs-client/target/hadoop-hdfs-client-%{version}.jar $hdfsdir/share/%{name}/hdfs
cp -f hadoop-hdfs-project/%{name}-hdfs-httpfs/target/hadoop-hdfs-httpfs-%{version}.jar $hdfsdir/share/%{name}/hdfs
cp -f hadoop-hdfs-project/%{name}-hdfs-native-client/target/hadoop-hdfs-native-client-%{version}.jar $hdfsdir/share/%{name}/hdfs
cp -f hadoop-hdfs-project/%{name}-hdfs-nfs/target/hadoop-hdfs-nfs-%{version}.jar $hdfsdir/share/%{name}/hdfs
cp -f hadoop-hdfs-project/%{name}-hdfs-rbf/target/hadoop-hdfs-rbf-%{version}.jar $hdfsdir/share/%{name}/hdfs
pushd $hdfsdir/share/%{name}/hdfs
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/hdfs
\cp -r %{_builddir}/hadoop-%{version}-src/hadoop-dist/target/hadoop-%{version}/share/hadoop/hdfs/lib/* %{buildroot}%{_datadir}/%{name}/hdfs/lib
popd
# httpfs
# Create the webapp directory structure
pushd %{buildroot}%{_sharedstatedir}/tomcats/httpfs
%{__ln_s} %{_datadir}/%{name}/httpfs/tomcat/conf conf
%{__ln_s} %{_datadir}/%{name}/httpfs/tomcat/lib lib
%{__ln_s} %{_datadir}/%{name}/httpfs/tomcat/logs logs
%{__ln_s} %{_datadir}/%{name}/httpfs/tomcat/temp temp
%{__ln_s} %{_datadir}/%{name}/httpfs/tomcat/webapps webapps
%{__ln_s} %{_datadir}/%{name}/httpfs/tomcat/work work
popd
# Copy the tomcat configuration and overlay with specific configuration bits.
# This is needed so the httpfs instance won't collide with a tomcat instance
# already running on the system
for cfgfile in catalina.policy catalina.properties context.xml \
tomcat.conf web.xml server.xml logging.properties;
do
cp -a %{_sysconfdir}/tomcat/$cfgfile %{buildroot}%{_sysconfdir}/%{name}/tomcat
done
# Adjust, in place, the Tomcat configuration files delivered with the current
# Fedora release. See BZ#1295968 for the rationale.
sed -i -e 's/8005/${httpfs.admin.port}/g' -e 's/8080/${httpfs.http.port}/g' %{buildroot}%{_sysconfdir}/%{name}/tomcat/server.xml
sed -i -e 's/catalina.base/httpfs.log.dir/g' %{buildroot}%{_sysconfdir}/%{name}/tomcat/logging.properties
# The system tomcat-users.xml is readable only by the root and tomcat users,
# not the build user, so copying it from the system location would fail the
# build; install our bundled copy instead.
install -m 660 %{SOURCE9} %{buildroot}%{_sysconfdir}/%{name}/tomcat/tomcat-users.xml
# Copy the httpfs webapp
cp -arf %{name}-hdfs-project/%{name}-hdfs-httpfs/target/classes/webapps/webhdfs %{buildroot}%{_datadir}/%{name}/httpfs/tomcat/webapps
# Tell tomcat to follow symlinks
install -d -m 0766 %{buildroot}%{_datadir}/%{name}/httpfs/tomcat/webapps/webhdfs/META-INF/
cp %{SOURCE5} %{buildroot}%{_datadir}/%{name}/httpfs/tomcat/webapps/webhdfs/META-INF/
# Remove the jars included in the webapp and create symlinks
rm -f %{buildroot}%{_datadir}/%{name}/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/tools*.jar
rm -f %{buildroot}%{_datadir}/%{name}/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/tomcat-*.jar
pushd %{buildroot}%{_datadir}/%{name}/httpfs/tomcat
%{__ln_s} %{_datadir}/tomcat/bin bin
%{__ln_s} %{_sysconfdir}/%{name}/tomcat conf
%{__ln_s} %{_datadir}/tomcat/lib lib
%{__ln_s} %{_var}/cache/%{name}-httpfs/temp temp
%{__ln_s} %{_var}/cache/%{name}-httpfs/work work
%{__ln_s} %{_var}/log/%{name}-httpfs logs
popd
# mapreduce jar dependencies
mrdir='%{name}-mapreduce-project/target/%{name}-mapreduce-%{hadoop_version}'
copy_dep_jars $mrdir/share/%{name}/mapreduce/lib %{buildroot}%{_datadir}/%{name}/mapreduce/lib
%{__ln_s} %{_javadir}/%{name}/%{name}-annotations.jar %{buildroot}%{_datadir}/%{name}/mapreduce/lib
cp -f hadoop-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-nativetask/target/hadoop-mapreduce-client-nativetask-%{version}.jar $mrdir/share/%{name}/mapreduce
cp -f hadoop-mapreduce-project/%{name}-mapreduce-client/%{name}-mapreduce-client-uploader/target/hadoop-mapreduce-client-uploader-%{version}.jar $mrdir/share/%{name}/mapreduce
cp -f hadoop-mapreduce-project/%{name}-mapreduce-examples/target/hadoop-mapreduce-examples-%{version}.jar $mrdir/share/%{name}/mapreduce
pushd $mrdir/share/%{name}/mapreduce
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/mapreduce
popd
# yarn jar dependencies
yarndir='%{name}-yarn-project/target/%{name}-yarn-project-%{hadoop_version}'
copy_dep_jars $yarndir/share/%{name}/yarn/lib %{buildroot}%{_datadir}/%{name}/yarn/lib
%{__ln_s} %{_javadir}/%{name}/%{name}-annotations.jar %{buildroot}%{_datadir}/%{name}/yarn/lib
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-nodemanager/target/hadoop-yarn-server-nodemanager-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-router/target/hadoop-yarn-server-router-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timeline-pluginstorage/target/hadoop-yarn-server-timeline-pluginstorage-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-services/%{name}-yarn-services-api/target/hadoop-yarn-services-api-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-applications/%{name}-yarn-services/%{name}-yarn-services-core/target/hadoop-yarn-services-core-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice/target/hadoop-yarn-server-timelineservice-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice-hbase/%{name}-yarn-server-timelineservice-hbase-client/target/hadoop-yarn-server-timelineservice-hbase-client-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/%{name}-yarn/%{name}-yarn-server/%{name}-yarn-server-timelineservice-hbase/%{name}-yarn-server-timelineservice-hbase-common/target/hadoop-yarn-server-timelineservice-hbase-common-%{version}.jar $yarndir/share/%{name}/yarn
cp -f hadoop-yarn-project/target/%{name}-yarn-project-%{version}/share/%{name}/yarn/timelineservice/hadoop-yarn-server-timelineservice-hbase-coprocessor-%{version}.jar $yarndir/share/%{name}/yarn
pushd $yarndir/share/%{name}/yarn
link_hadoop_jars %{buildroot}%{_datadir}/%{name}/yarn
popd
# Install hdfs webapp bits
cp -arf hadoop-hdfs-project/hadoop-hdfs/target/webapps/* %{buildroot}%{_datadir}/%{name}/hdfs/webapps
# hadoop layout: convert _libdir to the matching lib location for 32- and 64-bit archs
lib=$(echo %{?_libdir} | sed -e 's:/usr/\(.*\):\1:')
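# If sed left the value unchanged, _libdir does not live under /usr and the conversion failed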
if [ "$lib" = "%_libdir" ]; then
echo "_libdir is not located in /usr. Lib location is wrong"
exit 1
fi
sed -e "s|HADOOP_COMMON_LIB_NATIVE_DIR\s*=.*|HADOOP_COMMON_LIB_NATIVE_DIR=$lib/%{name}|" %{SOURCE1} > %{buildroot}%{_libexecdir}/%{name}-layout.sh
# Default config
cp -f %{SOURCE10} %{buildroot}%{_sysconfdir}/%{name}/core-site.xml
cp -f %{SOURCE11} %{buildroot}%{_sysconfdir}/%{name}/hdfs-site.xml
cp -f %{SOURCE12} %{buildroot}%{_sysconfdir}/%{name}/mapred-site.xml
cp -f %{SOURCE13} %{buildroot}%{_sysconfdir}/%{name}/yarn-site.xml
# systemd configuration
install -d -m 0755 %{buildroot}%{_unitdir}/
for service in %{hdfs_services} %{mapreduce_services} %{yarn_services}
do
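# e.g. hadoop-datanode.service -> datanode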
s=`echo $service | cut -d'-' -f 2 | cut -d'.' -f 1`
daemon=$s
if [[ "%{hdfs_services}" == *$service* ]]
then
src=%{SOURCE2}
elif [[ "%{mapreduce_services}" == *$service* ]]
then
src=%{SOURCE3}
elif [[ "%{yarn_services}" == *$service* ]]
then
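# the timeline server is started under the older "historyserver" daemon name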
if [[ "$s" == "timelineserver" ]]
then
daemon='historyserver'
fi
src=%{SOURCE4}
else
echo "Failed to determine type of service for %service"
exit 1
fi
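# Fill the daemon name into the unit template and add SuccessExitStatus=SIGKILL:
# hadoop-daemon.sh stops daemons with SIGKILL, and systemd would otherwise
# report the unit as failed (see the 3.2.1-5 "Fix stop service failure" entry)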
sed -e "s|DAEMON|$daemon|g" -e "/LimitNPROC/a\SuccessExitStatus=SIGKILL" $src > %{buildroot}%{_unitdir}/%{name}-$s.service
done
cp -f %{SOURCE7} %{buildroot}%{_sysconfdir}/sysconfig/tomcat@httpfs
# Ensure /var/run directories are recreated on boot
echo "d %{_var}/run/%{name}-yarn 0775 yarn hadoop -" > %{buildroot}%{_tmpfilesdir}/%{name}-yarn.conf
echo "d %{_var}/run/%{name}-hdfs 0775 hdfs hadoop -" > %{buildroot}%{_tmpfilesdir}/%{name}-hdfs.conf
echo "d %{_var}/run/%{name}-mapreduce 0775 mapred hadoop -" > %{buildroot}%{_tmpfilesdir}/%{name}-mapreduce.conf
# logrotate config
for type in hdfs httpfs yarn mapreduce
do
sed -e "s|NAME|$type|" %{SOURCE6} > %{buildroot}%{_sysconfdir}/logrotate.d/%{name}-$type
done
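# Also rotate the HDFS audit logs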
sed -i "s|{|%{_var}/log/hadoop-hdfs/*.audit\n{|" %{buildroot}%{_sysconfdir}/logrotate.d/%{name}-hdfs
# hdfs init script
install -m 755 %{SOURCE8} %{buildroot}%{_sbindir}
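# Strip RPATHs from the container-executor helpers as well, to silence
# check_rpath warnings (see the 3.3.3-2 changelog entry)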
chrpath -d %{buildroot}%{_bindir}/container-executor
chrpath -d %{buildroot}%{_bindir}/test-container-executor
%pretrans -p <lua> hdfs
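-- If an earlier install left the hdfs webapps path behind as a symlink,
-- remove it first: rpm cannot replace a symlink with a real directory
-- during the transaction.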
path = "%{_datadir}/%{name}/hdfs/webapps"
st = posix.stat(path)
if st and st.type == "link" then
os.remove(path)
end
%pre common
getent group hadoop >/dev/null || groupadd -r hadoop
%pre hdfs
getent group hdfs >/dev/null || groupadd -r hdfs
getent passwd hdfs >/dev/null || /usr/sbin/useradd --comment "Apache Hadoop HDFS" --shell /sbin/nologin -M -r -g hdfs -G hadoop --home %{_sharedstatedir}/%{name}-hdfs hdfs
%pre mapreduce
getent group mapred >/dev/null || groupadd -r mapred
getent passwd mapred >/dev/null || /usr/sbin/useradd --comment "Apache Hadoop MapReduce" --shell /sbin/nologin -M -r -g mapred -G hadoop --home %{_var}/cache/%{name}-mapreduce mapred
%pre yarn
getent group yarn >/dev/null || groupadd -r yarn
getent passwd yarn >/dev/null || /usr/sbin/useradd --comment "Apache Hadoop Yarn" --shell /sbin/nologin -M -r -g yarn -G hadoop --home %{_var}/cache/%{name}-yarn yarn
%preun hdfs
%systemd_preun %{hdfs_services}
%preun mapreduce
%systemd_preun %{mapreduce_services}
%preun yarn
%systemd_preun %{yarn_services}
%post common-native -p /sbin/ldconfig
%post hdfs
# Change the home directory for the hdfs user
if [[ `getent passwd hdfs | cut -d: -f 6` != "%{_sharedstatedir}/%{name}-hdfs" ]]
then
/usr/sbin/usermod -d %{_sharedstatedir}/%{name}-hdfs hdfs
fi
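# $1 > 1 means this is an upgrade; migrate data left at the old location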
if [ $1 -gt 1 ]
then
if [ -d %{_var}/cache/%{name}-hdfs ] && [ ! -L %{_var}/cache/%{name}-hdfs ]
then
# Move the existing hdfs data to the new location
mv -f %{_var}/cache/%{name}-hdfs/* %{_sharedstatedir}/%{name}-hdfs/
fi
fi
%systemd_post %{hdfs_services}
%post -n libhdfs -p /sbin/ldconfig
%post mapreduce
%systemd_post %{mapreduce_services}
%post yarn
%systemd_post %{yarn_services}
%postun common-native -p /sbin/ldconfig
%postun hdfs
%systemd_postun_with_restart %{hdfs_services}
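# $1 == 0 means the package is being erased rather than upgraded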
if [ $1 -lt 1 ]
then
# Remove the compatibility symlink
rm -f %{_var}/cache/%{name}-hdfs
fi
%postun -n libhdfs -p /sbin/ldconfig
%postun mapreduce
%systemd_postun_with_restart %{mapreduce_services}
%postun yarn
%systemd_postun_with_restart %{yarn_services}
%posttrans hdfs
# Create a symlink at the old HDFS data location pointing to the new one, in
# case a locally modified configuration file still references the old path
if [ ! -e %{_var}/cache/%{name}-hdfs ]
then
%{__ln_s} %{_sharedstatedir}/%{name}-hdfs %{_var}/cache
fi
%files -f .mfiles-%{name}-client client
%{_datadir}/%{name}/client
%files -f .mfiles common
%doc LICENSE.txt
%doc NOTICE.txt
%doc README.txt
%config(noreplace) %{_sysconfdir}/%{name}/core-site.xml
%config(noreplace) %{_sysconfdir}/%{name}/%{name}-env.sh
%config(noreplace) %{_sysconfdir}/%{name}/%{name}-metrics2.properties
%config(noreplace) %{_sysconfdir}/%{name}/%{name}-policy.xml
%config(noreplace) %{_sysconfdir}/%{name}/log4j.properties
%config(noreplace) %{_sysconfdir}/%{name}/ssl-client.xml.example
%config(noreplace) %{_sysconfdir}/%{name}/ssl-server.xml.example
%config(noreplace) %{_sysconfdir}/%{name}/configuration.xsl
%dir %{_datadir}/%{name}
%dir %{_datadir}/%{name}/common
%{_datadir}/%{name}/common/lib
%{_datadir}/%{name}/common/hadoop-kms.jar
%{_datadir}/%{name}/common/hadoop-nfs.jar
%{_datadir}/%{name}/common/hadoop-auth.jar
%{_libexecdir}/%{name}-config.sh
%{_libexecdir}/%{name}-layout.sh
# Workaround for bz1012059
%{_mavenpomdir}/JPP.%{name}-%{name}-project-dist.pom
%{_bindir}/%{name}
%{_sbindir}/%{name}-daemon.sh
%{_sbindir}/%{name}-daemons.sh
%{_sbindir}/start-all.sh
%{_sbindir}/start-balancer.sh
%{_sbindir}/start-dfs.sh
%{_sbindir}/start-secure-dns.sh
%{_sbindir}/stop-all.sh
%{_sbindir}/stop-balancer.sh
%{_sbindir}/stop-dfs.sh
%{_sbindir}/stop-secure-dns.sh
%files common-native
%{_libdir}/%{name}/libhadoop.*
%files devel
%{_includedir}/%{name}
%{_libdir}/libhdfs.so
%files -f .mfiles-%{name}-hdfs hdfs
%config(noreplace) %{_sysconfdir}/%{name}/hdfs-site.xml
%{_datadir}/%{name}/hdfs
%{_unitdir}/%{name}-datanode.service
%{_unitdir}/%{name}-namenode.service
%{_unitdir}/%{name}-journalnode.service
%{_unitdir}/%{name}-secondarynamenode.service
%{_unitdir}/%{name}-zkfc.service
%{_libexecdir}/hdfs-config.sh
%{_bindir}/hdfs
%{_sbindir}/distribute-exclude.sh
%{_sbindir}/refresh-namenodes.sh
%{_sbindir}/hdfs-create-dirs
%{_tmpfilesdir}/%{name}-hdfs.conf
%config(noreplace) %attr(644, root, root) %{_sysconfdir}/logrotate.d/%{name}-hdfs
%attr(0755,hdfs,hadoop) %dir %{_var}/run/%{name}-hdfs
%attr(0755,hdfs,hadoop) %dir %{_var}/log/%{name}-hdfs
%attr(0755,hdfs,hadoop) %dir %{_sharedstatedir}/%{name}-hdfs
%files httpfs
%config(noreplace) %{_sysconfdir}/sysconfig/tomcat@httpfs
%config(noreplace) %{_sysconfdir}/%{name}/httpfs-env.sh
%config(noreplace) %{_sysconfdir}/%{name}/httpfs-log4j.properties
%config(noreplace) %{_sysconfdir}/%{name}/httpfs-site.xml
%attr(-,tomcat,tomcat) %config(noreplace) %{_sysconfdir}/%{name}/tomcat/*.*
%attr(0775,root,tomcat) %dir %{_sysconfdir}/%{name}/tomcat
%attr(0775,root,tomcat) %dir %{_sysconfdir}/%{name}/tomcat/Catalina
%attr(0775,root,tomcat) %dir %{_sysconfdir}/%{name}/tomcat/Catalina/localhost
%{_datadir}/%{name}/httpfs
%{_sharedstatedir}/tomcats/httpfs
%config(noreplace) %attr(644, root, root) %{_sysconfdir}/logrotate.d/%{name}-httpfs
%attr(0775,root,tomcat) %dir %{_var}/log/%{name}-httpfs
%attr(0775,root,tomcat) %dir %{_var}/cache/%{name}-httpfs
%attr(0775,root,tomcat) %dir %{_var}/cache/%{name}-httpfs/temp
%attr(0775,root,tomcat) %dir %{_var}/cache/%{name}-httpfs/work
%files -n libhdfs
%{_libdir}/libhdfs.so.*
%files -f .mfiles-%{name}-mapreduce mapreduce
%config(noreplace) %{_sysconfdir}/%{name}/mapred-env.sh
%config(noreplace) %{_sysconfdir}/%{name}/mapred-queues.xml.template
%config(noreplace) %{_sysconfdir}/%{name}/mapred-site.xml
%{_datadir}/%{name}/mapreduce
%{_libexecdir}/mapred-config.sh
%{_unitdir}/%{name}-historyserver.service
%{_bindir}/mapred
%{_sbindir}/mr-jobhistory-daemon.sh
%{_tmpfilesdir}/%{name}-mapreduce.conf
%config(noreplace) %attr(644, root, root) %{_sysconfdir}/logrotate.d/%{name}-mapreduce
%attr(0755,mapred,hadoop) %dir %{_var}/run/%{name}-mapreduce
%attr(0755,mapred,hadoop) %dir %{_var}/log/%{name}-mapreduce
%attr(0755,mapred,hadoop) %dir %{_var}/cache/%{name}-mapreduce
%files -f .mfiles-%{name}-mapreduce-examples mapreduce-examples
%files -f .mfiles-%{name}-maven-plugin maven-plugin
%files -f .mfiles-%{name}-tests tests
%files -f .mfiles-%{name}-yarn yarn
%config(noreplace) %{_sysconfdir}/%{name}/capacity-scheduler.xml
%config(noreplace) %{_sysconfdir}/%{name}/yarn-env.sh
%config(noreplace) %{_sysconfdir}/%{name}/yarn-site.xml
%{_unitdir}/%{name}-nodemanager.service
%{_unitdir}/%{name}-proxyserver.service
%{_unitdir}/%{name}-resourcemanager.service
%{_unitdir}/%{name}-timelineserver.service
%{_libexecdir}/yarn-config.sh
%{_datadir}/%{name}/yarn
%{_bindir}/yarn
%{_sbindir}/yarn-daemon.sh
%{_sbindir}/yarn-daemons.sh
%{_sbindir}/start-yarn.sh
%{_sbindir}/stop-yarn.sh
%{_tmpfilesdir}/%{name}-yarn.conf
%config(noreplace) %attr(644, root, root) %{_sysconfdir}/logrotate.d/%{name}-yarn
%attr(0755,yarn,hadoop) %dir %{_var}/run/%{name}-yarn
%attr(0755,yarn,hadoop) %dir %{_var}/log/%{name}-yarn
%attr(0755,yarn,hadoop) %dir %{_var}/cache/%{name}-yarn
%files yarn-security
%config(noreplace) %{_sysconfdir}/%{name}/container-executor.cfg
%changelog
* Fri Dec 15 2023 xiexing <xiexing4@hisilicon.com> - 3.3.6-2
- add conflicts to hadoop spec
* Mon Nov 27 2023 wenweijian <wenweijian2@huawei.com> - 3.3.6-1
- fix cve CVE-2023-26031
* Wed Aug 16 2023 Jia Chao <jiac13@chinaunicom.cn> - 3.3.4-4
- fix: use $HOME rather than /home/abuild to suit all build tools.
- fix: yarn ships ELF binaries, so it is not noarch.
* Thu Jul 13 2023 sunyanan <sunyanan@xfusion.com> - 3.3.4-3
- lock triple-beam version to 1.3.0
* Thu Mar 9 2023 xiexing <xiexing4@hisilicon.com> - 3.3.4-2
- fix EBS install problem
* Mon Sep 19 2022 xiasenlin <xiasenlin1@huawei.com> - 3.3.4-1
- fix cve CVE-2021-25642
* Thu Sep 8 2022 xiasenlin <xiasenlin1@huawei.com> - 3.3.3-2
- add chrpath to solve check_rpath warning
* Thu Aug 11 2022 xiexing <xiexing4@hisilicon.com> - 3.3.3-1
- update version
* Mon Jul 12 2021 lingsheng <lingsheng@huawei.com> - 3.2.1-5
- Fix stop service failure
* Sat Jul 10 2021 wangyue <wangyue92@huawei.com> - 3.2.1-4
- Add gcc-c++ to build dependency
* Fri Jun 25 2021 wangyue <wangyue92@huawei.com> - 3.2.1-3
- Fix CVE-2019-17195
* Fri May 14 2021 wangyue <wangyue92@huawei.com> - 3.2.1-2
- Fix CVE-2020-9492
* Thu May 13 2021 Ge Wang <wangge20@huawei.com> - 3.2.1-1
- Init package