I installed Sqoop on an EC2 instance, following http://kontext.tech/docs/dataandbusinessintelligence/p/configure-sqoop-in-a-edge-node-of-hadoop-cluster. My Hadoop cluster is also running fine.
At first I ran into this error: Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
I fixed that by following the method in the article above. Unfortunately, when I run sqoop import I now hit another error:

Container exited with a non-zero exit code 1. Error file: prelaunch.err. Last 4096 bytes of prelaunch.err : Last 4096 bytes of stderr : Error: Could not find or load main class org.apache.hadoop.mapred.YarnChild

Please suggest how I can get past this error.
Here is what my sqoop-env.template.sh looks like:
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# included in all the hadoop scripts with source command
# should not be executable directly
# also should not be passed any arguments, since we need original $*
# Set Hadoop-specific environment variables here.
# Set path to where bin/hadoop is available
# export HADOOP_COMMON_HOME=$HOME/hadoop-3.1.0
# Set path to where hadoop-*-core.jar is available
# export HADOOP_MAPRED_HOME=$HOME/hadoop-3.1.0
# set the path to where bin/hbase is available
# export HBASE_HOME=
# Set the path to where bin/hive is available
# export HIVE_HOME=
# Set the path for where zookeper config dir is
# export ZOOCFGDIR=
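For context, Sqoop only sources conf/sqoop-env.sh (a copy of this template), so every export shown above is still commented out in my setup. A minimal active sqoop-env.sh would look something like the sketch below; the hadoop-3.1.0 path is an assumption taken from the template's own comments and may differ on another install:

```shell
# Minimal sketch of an active conf/sqoop-env.sh (copied from the
# template and uncommented). The hadoop-3.1.0 path is an assumption
# from the template's comments; point it at the real Hadoop install dir.
export HADOOP_COMMON_HOME=$HOME/hadoop-3.1.0
export HADOOP_MAPRED_HOME=$HOME/hadoop-3.1.0
```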