Error when connecting remotely from Eclipse to Hadoop 2.7.7 on a Linux virtual machine

Posted by fantasy_4 on 2019-04-03

I am using Eclipse on Windows to connect remotely to Hadoop 2.7.7 on a virtual machine, and I get the following error:
19/04/93 21:32:01 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation.UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[Rate of successful kerberos logins and latency (milliseconds)], valueName=Time)
19/04/93 21:32:01 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation.UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[Rate of failed kerberos logins and latency (milliseconds)], valueName=Time)
19/04/93 21:32:01 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation.UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[GetGroups], valueName=Time)
19/04/93 21:32:01 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
19/04/93 21:32:01 DEBUG util.KerberosName: Kerberos krb5 configuration not found, setting default realm to empty
19/04/93 21:32:01 DEBUG security.Groups: Creating new Groups object
19/04/93 21:32:01 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library…
19/04/93 21:32:01 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
19/04/93 21:32:01 DEBUG security.JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution
19/04/93 21:32:01 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
19/04/93 21:32:01 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
19/04/93 21:32:01 DEBUG security.UserGroupInformation: hadoop login
19/04/93 21:32:01 DEBUG security.UserGroupInformation: hadoop login commit
19/04/93 21:32:01 DEBUG security.UserGroupInformation: Using user: "hadoop" with name hadoop
19/04/93 21:32:01 DEBUG security.UserGroupInformation: User entry: "hadoop"
19/04/93 21:32:01 DEBUG security.UserGroupInformation: Assuming keytab is managed externally since logged in from subject.
19/04/93 21:32:01 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
19/04/93 21:32:01 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
19/04/93 21:32:01 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
19/04/93 21:32:01 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
19/04/93 21:32:01 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
19/04/93 21:32:02 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
19/04/93 21:32:02 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine.RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngineServerProtoBufRpcInvoker@e7edb54
19/04/93 21:32:02 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@306f16f3
19/04/93 21:32:02 DEBUG util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
19/04/93 21:32:02 DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
19/04/93 21:32:06 DEBUG security.UserGroupInformation: PrivilegedAction as:hadoop (auth:SIMPLE) from:org.apache.hadoop.mapreduce.Job.connect(Job.java:1255)
19/04/93 21:32:06 DEBUG mapreduce.Cluster: Trying ClientProtocolProvider : org.apache.hadoop.mapred.YarnClientProtocolProvider
19/04/93 21:32:06 DEBUG service.AbstractService: Service: org.apache.hadoop.mapred.ResourceMgrDelegate entered state INITED
19/04/93 21:32:06 DEBUG service.AbstractService: Service: org.apache.hadoop.yarn.client.api.impl.YarnClientImpl entered state INITED
19/04/93 21:32:13 INFO client.RMProxy: Connecting to ResourceManager at master:8032
19/04/93 21:32:13 DEBUG security.UserGroupInformation: PrivilegedAction as:hadoop (auth:SIMPLE) from:org.apache.hadoop.yarn.client.RMProxy.getProxy(RMProxy.java:136)
19/04/93 21:32:13 DEBUG ipc.YarnRPC: Creating YarnRPC for org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC
19/04/93 21:32:13 DEBUG ipc.HadoopYarnProtoRPC: Creating a HadoopYarnProtoRpc proxy for protocol interface org.apache.hadoop.yarn.api.ApplicationClientProtocol
19/04/93 21:32:13 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@306f16f3
19/04/93 21:32:13 DEBUG service.AbstractService: Service org.apache.hadoop.yarn.client.api.impl.YarnClientImpl is started
19/04/93 21:32:13 DEBUG service.AbstractService: Service org.apache.hadoop.mapred.ResourceMgrDelegate is started
19/04/93 21:32:13 DEBUG security.UserGroupInformation: PrivilegedAction as:hadoop (auth:SIMPLE) from:org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:331)
19/04/93 21:32:13 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
19/04/93 21:32:13 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
19/04/93 21:32:13 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
19/04/93 21:32:13 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
19/04/93 21:32:13 INFO mapreduce.Cluster: Failed to use org.apache.hadoop.mapred.YarnClientProtocolProvider due to error: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.fs.AbstractFileSystem.newInstance(AbstractFileSystem.java:135)
at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:164)
at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:249)
at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:334)
at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:331)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:331)
at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:448)
at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:474)
at org.apache.hadoop.mapred.YARNRunner.&lt;init&gt;(YARNRunner.java:148)
at org.apache.hadoop.mapred.YARNRunner.&lt;init&gt;(YARNRunner.java:132)
at org.apache.hadoop.mapred.YARNRunner.&lt;init&gt;(YARNRunner.java:122)
at org.apache.hadoop.mapred.YarnClientProtocolProvider.create(YarnClientProtocolProvider.java:34)
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:95)
at org.apache.hadoop.mapreduce.Cluster.&lt;init&gt;(Cluster.java:82)
at org.apache.hadoop.mapreduce.Cluster.&lt;init&gt;(Cluster.java:75)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1260)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1256)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1255)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1284)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at com.wf.demo1.WordCount.main(WordCount.java:80)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at org.apache.hadoop.fs.AbstractFileSystem.newInstance(AbstractFileSystem.java:133)
... 26 more
Caused by: java.lang.IllegalArgumentException: java.net.UnknownHostException: master
at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:377)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:320)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
at org.apache.hadoop.hdfs.DFSClient.&lt;init&gt;(DFSClient.java:687)
at org.apache.hadoop.hdfs.DFSClient.&lt;init&gt;(DFSClient.java:628)
at org.apache.hadoop.fs.Hdfs.&lt;init&gt;(Hdfs.java:88)
... 31 more
Caused by: java.net.UnknownHostException: master
... 37 more
19/04/93 21:32:13 DEBUG mapreduce.Cluster: Trying ClientProtocolProvider : org.apache.hadoop.mapred.LocalClientProtocolProvider
19/04/93 21:32:13 DEBUG mapreduce.Cluster: Cannot pick org.apache.hadoop.mapred.LocalClientProtocolProvider as the ClientProtocolProvider - returned null protocol
19/04/93 21:32:13 DEBUG security.UserGroupInformation: PrivilegedActionException as:hadoop (auth:SIMPLE) cause:java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
Exception in thread "main" java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:120)
at org.apache.hadoop.mapreduce.Cluster.&lt;init&gt;(Cluster.java:82)
at org.apache.hadoop.mapreduce.Cluster.&lt;init&gt;(Cluster.java:75)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1260)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1256)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1255)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1284)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at com.wf.demo1.WordCount.main(WordCount.java:80)
19/04/93 21:32:13 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@306f16f3

I'm a complete beginner and have no idea what's causing this; any advice would be much appreciated. My core-site.xml, hdfs-site.xml, mapred-site.xml, and yarn-site.xml are attached below.

core-site.xml

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/hadoop/tmp</value>
  </property>
  <property>
    <name>io.native.lib.available</name>
    <value>true</value>
  </property>
</configuration>
hdfs-site.xml

<configuration>
  <property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>master:9001</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>
mapred-site.xml

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.address</name>
    <value>slave1:10020</value>
  </property>
  <property>
    <name>mapred.remote.os</name>
    <value>Linux</value>
  </property>
</configuration>

yarn-site.xml

<configuration>
  <property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>slave1:19888</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>master</value>
  </property>
  <property>
    <name>yarn.resourcemanager.address</name>
    <value>master:8032</value>
  </property>
  <property>
    <name>yarn.resourcemanager.scheduler.address</name>
    <value>master:8030</value>
  </property>
  <property>
    <name>yarn.resourcemanager.resource-tracker.address</name>
    <value>master:8031</value>
  </property>
  <property>
    <name>yarn.resourcemanager.admin.address</name>
    <value>master:8033</value>
  </property>
  <property>
    <name>yarn.resourcemanager.webapp.address</name>
    <value>master:8088</value>
  </property>
  <property>
    <name>mapreduce.application.classpath</name>
    <value>
      $HADOOP_CONF_DIR,
      $HADOOP_COMMON_HOME/share/hadoop/common/*,
      $HADOOP_COMMON_HOME/share/hadoop/common/lib/*,
      $HADOOP_HDFS_HOME/share/hadoop/hdfs/*,
      $HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,
      $HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*,
      $HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*,
      $HADOOP_YARN_HOME/share/hadoop/yarn/*,
      $HADOOP_YARN_HOME/share/hadoop/yarn/lib/*
    </value>
  </property>
</configuration>

The Eclipse plugin I'm using was downloaded from the Internet. The error tells me to check mapreduce.framework.name, but I have it set to yarn, and I keep hitting this same error every time. Any help would be appreciated.
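One thing I noticed: the deepest cause in the stack trace is java.net.UnknownHostException: master, which suggests the Windows machine running Eclipse may not be able to resolve the hostname master at all (the "Cannot initialize Cluster" message would then just be the downstream symptom). A minimal stand-alone check, runnable on the Windows side (HostCheck is my own throwaway class, not part of Hadoop):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class HostCheck {
    // Returns true if this machine can resolve the given hostname to an
    // IP address, i.e. what the HDFS/YARN client needs to do for "master".
    static boolean canResolve(String host) {
        try {
            InetAddress.getByName(host);
            return true;
        } catch (UnknownHostException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String host = args.length > 0 ? args[0] : "master";
        System.out.println(host + " resolvable: " + canResolve(host));
    }
}
```

If this prints false for master, then presumably a line mapping master to the VM's IP would need to be added to C:\Windows\System32\drivers\etc\hosts on the Windows side before the job can be submitted.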
