I have also seen the following two similar links, but they are different from the problem I describe in this post:
- Apache Atlas: Http 503 Service Unavailable Error when connecting from Java Client
- HTTP apache server Error 503 service unavailable
I'm trying to run Apache Atlas on my local machine, so I have cloned it from the official repository.
Following the repository's README.md, I ran the following two commands, and the build completed successfully:
mvn clean install -DskipTests -X
mvn clean -DskipTests package -Pdist,embedded-hbase-solr
Then I extracted the generated distribution and started Atlas with the following commands:
tar xzvf apache-atlas-3.0.0-SNAPSHOT-server.tar.gz
cd apache-atlas-3.0.0-SNAPSHOT/bin/
python2.7 atlas_start.py
After running the Python script, I got this log:
configured for local hbase.
hbase started.
configured for local solr.
solr.xml doesn't exist in /bigdata/atlas/distro/target/apache-atlas-3.0.0-SNAPSHOT/data/solr, copying from /bigdata/atlas/distro/target/apache-atlas-3.0.0-SNAPSHOT/solr/server/solr/solr.xml
solr started.
setting up solr collections...
starting atlas on host localhost
starting atlas on port 21000
Apache Atlas Server started!!!
It seems to start without any problem, but whenever I test it with curl, I get an error:
curl -u admin:admin http://localhost:21000/api/atlas/v2/types/typedefs/head
Error:
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 503 Service Unavailable</title>
</head>
<body><h2>HTTP ERROR 503 Service Unavailable</h2>
<table>
<tr><th>URI:</th><td>/api/atlas/v2/types/typedefs/head</td></tr>
<tr><th>STATUS:</th><td>503</td></tr>
<tr><th>MESSAGE:</th><td>Service Unavailable</td></tr>
<tr><th>SERVLET:</th><td>-</td></tr>
</table>
<hr><a href="http://eclipse.org/jetty">Powered by Jetty:// 9.4.31.v20200723</a><hr/>
</body>
</html>
Here is the application.log file:
2021-03-13 17:05:37,484 INFO - [main:] ~ Loading atlas-application.properties from file:/bigdata/atlas/distro/target/apache-atlas-3.0.0-SNAPSHOT/conf/atlas-application.properties (ApplicationProperties:137)
2021-03-13 17:05:37,503 INFO - [main:] ~ Using graphdb backend 'janus' (ApplicationProperties:317)
2021-03-13 17:05:37,503 INFO - [main:] ~ Using storage backend 'hbase2' (ApplicationProperties:328)
2021-03-13 17:05:37,503 INFO - [main:] ~ Using index backend 'solr' (ApplicationProperties:339)
2021-03-13 17:05:37,515 INFO - [main:] ~ Atlas is running in MODE: PROD. (ApplicationProperties:343)
2021-03-13 17:05:37,516 INFO - [main:] ~ Setting solr-wait-searcher property 'true' (ApplicationProperties:349)
2021-03-13 17:05:37,516 INFO - [main:] ~ Setting index.search.map-name property 'false' (ApplicationProperties:353)
2021-03-13 17:05:37,516 INFO - [main:] ~ Setting atlas.graph.index.search.max-result-set-size = 150 (ApplicationProperties:363)
2021-03-13 17:05:37,517 INFO - [main:] ~ Property (set to default) atlas.graph.cache.db-cache = true (ApplicationProperties:375)
2021-03-13 17:05:37,517 INFO - [main:] ~ Property (set to default) atlas.graph.cache.db-cache-clean-wait = 20 (ApplicationProperties:375)
2021-03-13 17:05:37,517 INFO - [main:] ~ Property (set to default) atlas.graph.cache.db-cache-size = 0.5 (ApplicationProperties:375)
2021-03-13 17:05:37,517 INFO - [main:] ~ Property (set to default) atlas.graph.cache.tx-cache-size = 15000 (ApplicationProperties:375)
2021-03-13 17:05:37,518 INFO - [main:] ~ Property (set to default) atlas.graph.cache.tx-dirty-size = 120 (ApplicationProperties:375)
2021-03-13 17:05:37,535 INFO - [main:] ~
########################################################################################
Atlas Server (STARTUP)
project.name: apache-atlas
project.description: Metadata Management and Data Governance Platform over Hadoop
build.user: root
build.epoch: 1615641603895
project.version: 3.0.0-SNAPSHOT
build.version: 3.0.0-SNAPSHOT
vc.revision: 7eab2cb8d53ca4c86366e896119a1d7906ccb5b3
vc.source.url: scm:git:git://git.apache.org/atlas.git/atlas-webapp
######################################################################################## (Atlas:215)
2021-03-13 17:05:37,535 INFO - [main:] ~ >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> (Atlas:216)
2021-03-13 17:05:37,535 INFO - [main:] ~ Server starting with TLS ? false on port 21000 (Atlas:217)
2021-03-13 17:05:37,538 INFO - [main:] ~ <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< (Atlas:218)
2021-03-13 17:05:39,801 INFO - [main:] ~ No authentication method configured. Defaulting to simple authentication (LoginProcessor:102)
2021-03-13 17:05:40,136 WARN - [main:] ~ Unable to load native-hadoop library for your platform... using builtin-java classes where applicable (NativeCodeLoader:60)
2021-03-13 17:05:40,351 INFO - [main:] ~ Logged in user root (auth:SIMPLE) (LoginProcessor:77)
2021-03-13 17:05:41,716 INFO - [main:] ~ Not running setup per configuration atlas.server.run.setup.on.start. (SetupSteps$SetupRequired:189)
2021-03-13 17:05:43,892 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 1 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:44,989 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 2 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:46,092 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 3 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:47,194 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 4 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:48,295 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 5 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:49,397 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 6 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:50,499 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 7 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:51,601 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 8 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:52,703 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 9 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:53,805 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 10 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:54,907 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 11 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:56,009 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 12 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:57,110 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 13 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:58,212 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 14 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:05:59,313 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 15 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:06:00,414 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 16 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:06:01,517 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 17 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:06:02,618 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 18 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:06:03,720 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 19 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:06:04,822 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 20 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:06:16,941 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for get of /hbase/hbaseid, code = CONNECTIONLOSS, retries = 30, give up (ReadOnlyZKClient$ZKTask$1:196)
2021-03-13 17:06:16,958 WARN - [main:] ~ Retrieve cluster id failed (ConnectionImplementation:576)
java.util.concurrent.ExecutionException: org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1999)
at org.apache.hadoop.hbase.client.ConnectionImplementation.retrieveClusterId(ConnectionImplementation.java:574)
at org.apache.hadoop.hbase.client.ConnectionImplementation.<init>(ConnectionImplementation.java:307)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.apache.hadoop.hbase.client.ConnectionFactory.lambda$createConnection$0(ConnectionFactory.java:230)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:347)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:228)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:128)
at org.janusgraph.diskstorage.hbase2.HBaseCompat2_0.createConnection(HBaseCompat2_0.java:46)
at org.janusgraph.diskstorage.hbase2.HBaseStoreManager.<init>(HBaseStoreManager.java:314)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.janusgraph.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:58)
at org.janusgraph.diskstorage.Backend.getImplementationClass(Backend.java:440)
at org.janusgraph.diskstorage.Backend.getStorageManager(Backend.java:411)
at org.janusgraph.graphdb.configuration.builder.GraphDatabaseConfigurationBuilder.build(GraphDatabaseConfigurationBuilder.java:50)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:161)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:132)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:112)
at org.apache.atlas.repository.graphdb.janus.AtlasJanusGraphDatabase.initJanusGraph(AtlasJanusGraphDatabase.java:182)
at org.apache.atlas.repository.graphdb.janus.AtlasJanusGraphDatabase.getGraphInstance(AtlasJanusGraphDatabase.java:169)
at org.apache.atlas.repository.graphdb.janus.AtlasJanusGraphDatabase.getGraph(AtlasJanusGraphDatabase.java:278)
at org.apache.atlas.repository.graph.AtlasGraphProvider.getGraphInstance(AtlasGraphProvider.java:52)
at org.apache.atlas.repository.graph.AtlasGraphProvider.get(AtlasGraphProvider.java:98)
at org.apache.atlas.repository.graph.AtlasGraphProvider$$EnhancerBySpringCGLIB$$55698c0f.CGLIB$get$0(<generated>)
at org.apache.atlas.repository.graph.AtlasGraphProvider$$EnhancerBySpringCGLIB$$55698c0f$$FastClassBySpringCGLIB$$ec479fde.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:358)
at org.apache.atlas.repository.graph.AtlasGraphProvider$$EnhancerBySpringCGLIB$$55698c0f.get(<generated>)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:162)
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:588)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1176)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1071)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:511)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:481)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:312)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:308)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:211)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1134)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1062)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:835)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:741)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:189)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1196)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1098)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:511)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:481)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:312)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:308)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:211)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1134)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1062)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:835)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:741)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:189)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1196)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1098)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:511)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:481)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:312)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:308)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
at org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:89)
at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:102)
at org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:103)
at org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:245)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:1041)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:1015)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:471)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:312)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:308)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:756)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:867)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:542)
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:443)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:325)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:107)
at org.apache.atlas.web.setup.KerberosAwareListener.contextInitialized(KerberosAwareListener.java:31)
at org.eclipse.jetty.server.handler.ContextHandler.callContextInitialized(ContextHandler.java:1013)
at org.eclipse.jetty.servlet.ServletContextHandler.callContextInitialized(ServletContextHandler.java:553)
at org.eclipse.jetty.server.handler.ContextHandler.contextInitialized(ContextHandler.java:942)
at org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:782)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:360)
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1445)
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1409)
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:855)
at org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:275)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:524)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:169)
at org.eclipse.jetty.server.Server.start(Server.java:408)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:110)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:97)
at org.eclipse.jetty.server.Server.doStart(Server.java:372)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
at org.apache.atlas.web.service.EmbeddedServer.start(EmbeddedServer.java:113)
at org.apache.atlas.Atlas.main(Atlas.java:133)
Caused by: org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
at org.apache.zookeeper.KeeperException.create(KeeperException.java:102)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:54)
at org.apache.hadoop.hbase.zookeeper.ReadOnlyZKClient$ZKTask$1.exec(ReadOnlyZKClient.java:198)
at org.apache.hadoop.hbase.zookeeper.ReadOnlyZKClient.run(ReadOnlyZKClient.java:342)
at java.base/java.lang.Thread.run(Thread.java:834)
2021-03-13 17:06:18,044 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for list of /hbase, code = CONNECTIONLOSS, retries = 1 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:06:19,144 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for list of /hbase, code = CONNECTIONLOSS, retries = 2 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:06:20,246 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for list of /hbase, code = CONNECTIONLOSS, retries = 3 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:06:21,348 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for list of /hbase, code = CONNECTIONLOSS, retries = 4 (ReadOnlyZKClient$ZKTask$1:192)
2021-03-13 17:06:22,449 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for list of /hbase, code = CONNECTIONLOSS, retries = 5 (ReadOnlyZKClient$ZKTask$1:192)
.
.
.
2021-03-13T13:58:36.264Z, RpcRetryingCaller{globalStartTime=1615643246874, pause=100, maxAttempts=16}, java.io.IOException: org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase
at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:145)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:383)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:357)
at org.apache.hadoop.hbase.MetaTableAccessor.getTableState(MetaTableAccessor.java:1164)
at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:461)
at org.apache.hadoop.hbase.client.HBaseAdmin$6.rpcCall(HBaseAdmin.java:467)
at org.apache.hadoop.hbase.client.HBaseAdmin$6.rpcCall(HBaseAdmin.java:464)
at org.apache.hadoop.hbase.client.RpcRetryingCallable.call(RpcRetryingCallable.java:58)
at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:107)
... 101 more
Caused by: java.io.IOException: org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase
at org.apache.hadoop.hbase.client.ConnectionImplementation.get(ConnectionImplementation.java:2117)
at org.apache.hadoop.hbase.client.ConnectionImplementation.locateMeta(ConnectionImplementation.java:814)
at org.apache.hadoop.hbase.client.ConnectionImplementation.locateRegion(ConnectionImplementation.java:781)
at org.apache.hadoop.hbase.client.HRegionLocator.getRegionLocation(HRegionLocator.java:64)
at org.apache.hadoop.hbase.client.RegionLocator.getRegionLocation(RegionLocator.java:58)
at org.apache.hadoop.hbase.client.RegionLocator.getRegionLocation(RegionLocator.java:47)
at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:223)
at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:105)
... 109 more
Caused by: org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase
at org.apache.zookeeper.KeeperException.create(KeeperException.java:102)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:54)
at org.apache.hadoop.hbase.zookeeper.ReadOnlyZKClient$ZKTask$1.exec(ReadOnlyZKClient.java:198)
at org.apache.hadoop.hbase.zookeeper.ReadOnlyZKClient.run(ReadOnlyZKClient.java:342)
at java.base/java.lang.Thread.run(Thread.java:834)
2021-03-13 17:28:37,365 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for list of /hbase, code = CONNECTIONLOSS, retries = 1 (ReadOnlyZKClient$ZKTask$1:192)
Could someone help me install Apache Atlas properly?
PS: I have also tried version 2.1.0. Whenever I run it through the atlas_start.py script, the following error occurs:
Exception: [Errno 13] Permission denied
Traceback (most recent call last):
File "atlas_start.py", line 163, in <module>
returncode = main()
File "atlas_start.py", line 73, in main
mc.expandWebApp(atlas_home)
File "./distro/target/apache-atlas-2.1.0-bin/apache-atlas-2.1.0/bin/atlas_config.py", line 162, in expandWebApp
jar(atlasWarPath)
File "./distro/target/apache-atlas-2.1.0-bin/apache-atlas-2.1.0/bin/atlas_config.py", line 215, in jar
process = runProcess(commandline)
File "./distro/target/apache-atlas-2.1.0-bin/apache-atlas-2.1.0/bin/atlas_config.py", line 251, in runProcess
p = subprocess.Popen(commandline, stdout=stdoutFile, stderr=stderrFile, shell=shell)
File "/usr/lib/python2.7/subprocess.py", line 394, in __init__
errread, errwrite)
File "/usr/lib/python2.7/subprocess.py", line 1047, in _execute_child
raise child_exception
OSError: [Errno 13] Permission denied
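From the traceback, the Errno 13 is raised while subprocess.Popen tries to exec the external jar command (expandWebApp -> jar -> runProcess), so presumably the jar binary the script resolves is not executable by my user. A quick check, assuming that is the cause (and assuming the script picks up jar either from PATH or from JAVA_HOME):
which jar && ls -l "$(which jar)"
echo "$JAVA_HOME" && ls -l "$JAVA_HOME/bin/jar"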
2 Answers
olmpazwi1#
Try this, it worked for me.
In this file: {atlas}/hbase/conf/hbase-env.sh
https://serverfault.com/questions/599661/could-not-start-zk-at-requested-port-of-2181-while-export-hbase-manages-zk-fals
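The linked ServerFault question is about "Could not start ZK at requested port of 2181" when HBASE_MANAGES_ZK is set to false, so presumably the suggested change is to let the embedded HBase manage its own ZooKeeper. A minimal sketch of that edit, assuming this is indeed the intended fix:
# in {atlas}/hbase/conf/hbase-env.sh
export HBASE_MANAGES_ZK=true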
ruyhziif2#
2021-03-13 17:06:18,044 WARN - [ReadOnlyZKClient-localhost:2181@0x39acd1f1:] ~ 0x39acd1f1 to localhost:2181 failed for list of /hbase, code = CONNECTIONLOSS, retries = 1 (ReadOnlyZKClient$ZKTask$1:192)
This error is about failing to get the /hbase znode from ZooKeeper. Can you verify whether HBase is up and running? The HBase logs should be under the apache-atlas-3.0.0-SNAPSHOT/hbase/logs directory.
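A rough way to check both, assuming jps and nc are available (the exact log file name under hbase/logs will vary):
jps | grep HMaster                                   # is the embedded HBase JVM running?
nc -z localhost 2181 && echo "ZooKeeper port 2181 is open"
ls -lt apache-atlas-3.0.0-SNAPSHOT/hbase/logs/       # then tail the newest log file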
By the way, I built the same profile, embedded-hbase-solr, from the latest master a few minutes ago and was able to start it successfully.