2019-04-04 17:29:50 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-04-04 17:29:55 INFO HiveThriftServer2:2566 - Started daemon with process name: [email protected]
2019-04-04 17:29:55 INFO SignalUtils:54 - Registered signal handler for TERM
2019-04-04 17:29:55 INFO SignalUtils:54 - Registered signal handler for HUP
2019-04-04 17:29:55 INFO SignalUtils:54 - Registered signal handler for INT
2019-04-04 17:29:55 INFO HiveThriftServer2:54 - Starting SparkContext
2019-04-04 17:29:55 INFO SparkContext:54 - Running Spark version 2.4.0
2019-04-04 17:29:55 INFO SparkContext:54 - Submitted application: Thrift JDBC/ODBC Server
2019-04-04 17:29:55 INFO SecurityManager:54 - Changing view acls to: ibm
2019-04-04 17:29:55 INFO SecurityManager:54 - Changing modify acls to: ibm
2019-04-04 17:29:55 INFO SecurityManager:54 - Changing view acls groups to:
2019-04-04 17:29:55 INFO SecurityManager:54 - Changing modify acls groups to:
2019-04-04 17:29:55 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ibm); groups with view permissions: Set(); users with modify permissions: Set(ibm); groups with modify permissions: Set()
2019-04-04 17:30:01 INFO Utils:54 - Successfully started service 'sparkDriver' on port 54993.
2019-04-04 17:30:01 INFO SparkEnv:54 - Registering MapOutputTracker
2019-04-04 17:30:01 INFO SparkEnv:54 - Registering BlockManagerMaster
2019-04-04 17:30:01 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-04-04 17:30:01 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2019-04-04 17:30:01 INFO DiskBlockManager:54 - Created local directory at /private/var/folders/x6/5rj4r3f94m96hr12q2srdz700000gn/T/blockmgr-88cba71c-88de-4080-83ed-cba14ce6b8d8
2019-04-04 17:30:01 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2019-04-04 17:30:01 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2019-04-04 17:30:01 INFO log:192 - Logging initialized @22420ms
2019-04-04 17:30:01 INFO Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2019-04-04 17:30:01 INFO Server:419 - Started @22524ms
2019-04-04 17:30:01 INFO AbstractConnector:278 - Started ServerConnector@bf71cec{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-04-04 17:30:01 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1500e009{/jobs,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5bdaf2ce{/jobs/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@42d236fb{/jobs/job,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@19f21b6b{/jobs/job/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1532c619{/stages,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@46044faa{/stages/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1358b28e{/stages/stage,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7de4a01f{/stages/stage/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2bfeb1ef{/stages/pool,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@778ca8ef{/stages/pool/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@208e9ef6{/storage,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@78b236a0{/storage/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@261d8190{/storage/rdd,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@34448e6c{/storage/rdd/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@60e9df3c{/environment,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@907f2b7{/environment/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@435ce306{/executors,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@537b32ef{/executors/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7dc51783{/executors/threadDump,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4b61d0c6{/executors/threadDump/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6f815e7f{/static,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@110844f6{/,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6f89f665{/api,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3a43d133{/jobs/job/kill,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@39ce27f2{/stages/stage/kill,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://10.16.2.83:4040
2019-04-04 17:30:01 INFO Executor:54 - Starting executor ID driver on host localhost
2019-04-04 17:30:01 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54994.
2019-04-04 17:30:01 INFO NettyBlockTransferService:54 - Server created on 10.16.2.83:54994
2019-04-04 17:30:01 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2019-04-04 17:30:02 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, 10.16.2.83, 54994, None)
2019-04-04 17:30:02 INFO BlockManagerMasterEndpoint:54 - Registering block manager 10.16.2.83:54994 with 366.3 MB RAM, BlockManagerId(driver, 10.16.2.83, 54994, None)
2019-04-04 17:30:02 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, 10.16.2.83, 54994, None)
2019-04-04 17:30:02 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, 10.16.2.83, 54994, None)
2019-04-04 17:30:02 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@54562ea6{/metrics/json,null,AVAILABLE,@spark}
2019-04-04 17:30:02 INFO SharedState:54 - loading hive config file: file:/Users/ibm/atif/spark/conf/hive-site.xml
2019-04-04 17:30:02 INFO SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/Users/ibm/atif/spark/spark-warehouse').
2019-04-04 17:30:02 INFO SharedState:54 - Warehouse path is 'file:/Users/ibm/atif/spark/spark-warehouse'.
2019-04-04 17:30:02 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6a9950f1{/SQL,null,AVAILABLE,@spark}
2019-04-04 17:30:02 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7ad54c55{/SQL/json,null,AVAILABLE,@spark}
2019-04-04 17:30:02 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1a3e5f23{/SQL/execution,null,AVAILABLE,@spark}
2019-04-04 17:30:02 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6293e39e{/SQL/execution/json,null,AVAILABLE,@spark}
2019-04-04 17:30:02 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3daa82be{/static/sql,null,AVAILABLE,@spark}
2019-04-04 17:30:02 INFO HiveUtils:54 - Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
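The connection attempts that follow are driven by the `hive.metastore.uris` property in the `hive-site.xml` loaded above. An illustrative fragment (the reporter's actual file is not shown) that would produce exactly these `thrift://localhost:9083` attempts:

```xml
<!-- Illustrative hive-site.xml fragment, placed in $SPARK_HOME/conf/.
     hive.metastore.uris tells the Hive client to use a remote metastore
     service instead of an embedded one. -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://localhost:9083</value>
  </property>
</configuration>
```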
2019-04-04 17:30:03 INFO metastore:376 - Trying to connect to metastore with URI thrift://localhost:9083
2019-04-04 17:30:03 WARN metastore:428 - Failed to connect to the MetaStore Server...
2019-04-04 17:30:03 INFO metastore:459 - Waiting 1 seconds before next connection attempt.
2019-04-04 17:30:04 INFO metastore:376 - Trying to connect to metastore with URI thrift://localhost:9083
2019-04-04 17:30:04 WARN metastore:428 - Failed to connect to the MetaStore Server...
2019-04-04 17:30:04 INFO metastore:459 - Waiting 1 seconds before next connection attempt.
2019-04-04 17:30:05 INFO metastore:376 - Trying to connect to metastore with URI thrift://localhost:9083
2019-04-04 17:30:05 WARN metastore:428 - Failed to connect to the MetaStore Server...
2019-04-04 17:30:05 INFO metastore:459 - Waiting 1 seconds before next connection attempt.
2019-04-04 17:30:06 WARN Hive:168 - Failed to access metastore. This class should not accessed in runtime.
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
at org.apache.hadoop.hive.ql.metadata.Hive.&lt;init&gt;(Hive.java:166)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.&lt;init&gt;(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:53)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:79)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&lt;init&gt;(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
... 36 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
... 42 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused (Connection refused)
at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&lt;init&gt;(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.&lt;init&gt;(SessionHiveMetaStoreClient.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&lt;init&gt;(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
at org.apache.hadoop.hive.ql.metadata.Hive.&lt;init&gt;(Hive.java:166)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.&lt;init&gt;(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:53)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:79)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
... 50 more
)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:466)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&lt;init&gt;(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.&lt;init&gt;(SessionHiveMetaStoreClient.java:74)
... 47 more
2019-04-04 17:30:06 INFO metastore:376 - Trying to connect to metastore with URI thrift://localhost:9083
2019-04-04 17:30:06 WARN metastore:428 - Failed to connect to the MetaStore Server...
2019-04-04 17:30:06 INFO metastore:459 - Waiting 1 seconds before next connection attempt.
2019-04-04 17:30:07 INFO metastore:376 - Trying to connect to metastore with URI thrift://localhost:9083
2019-04-04 17:30:07 WARN metastore:428 - Failed to connect to the MetaStore Server...
2019-04-04 17:30:07 INFO metastore:459 - Waiting 1 seconds before next connection attempt.
2019-04-04 17:30:08 INFO metastore:376 - Trying to connect to metastore with URI thrift://localhost:9083
2019-04-04 17:30:08 WARN metastore:428 - Failed to connect to the MetaStore Server...
2019-04-04 17:30:08 INFO metastore:459 - Waiting 1 seconds before next connection attempt.
Exception in thread "main" org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient;
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:53)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:79)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.&lt;init&gt;(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
... 18 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&lt;init&gt;(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
... 33 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
... 39 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused (Connection refused)
at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&lt;init&gt;(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.&lt;init&gt;(SessionHiveMetaStoreClient.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&lt;init&gt;(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.&lt;init&gt;(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:53)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:79)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
... 47 more
)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:466)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&lt;init&gt;(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.&lt;init&gt;(SessionHiveMetaStoreClient.java:74)
... 44 more
2019-04-04 17:30:09 INFO SparkContext:54 - Invoking stop() from shutdown hook
2019-04-04 17:30:09 INFO AbstractConnector:318 - Stopped Spark@bf71cec{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-04-04 17:30:09 INFO SparkUI:54 - Stopped Spark web UI at http://10.16.2.83:4040
2019-04-04 17:30:09 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2019-04-04 17:30:09 INFO MemoryStore:54 - MemoryStore cleared
2019-04-04 17:30:09 INFO BlockManager:54 - BlockManager stopped
2019-04-04 17:30:09 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2019-04-04 17:30:09 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2019-04-04 17:30:09 INFO SparkContext:54 - Successfully stopped SparkContext
2019-04-04 17:30:09 INFO ShutdownHookManager:54 - Shutdown hook called
2019-04-04 17:30:09 INFO ShutdownHookManager:54 - Deleting directory /private/var/folders/x6/5rj4r3f94m96hr12q2srdz700000gn/T/spark-d554e64a-4cfd-46db-a63d-27da4372ed0c
2019-04-04 17:30:09 INFO ShutdownHookManager:54 - Deleting directory /private/var/folders/x6/5rj4r3f94m96hr12q2srdz700000gn/T/spark-70a9d853-cd5e-49d1-8526-2dcc293d2e4f
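The repeated failure in this log is a plain TCP `Connection refused` against `thrift://localhost:9083`: nothing is listening on the metastore port, which typically means the Hive metastore service was never started (commonly done with `hive --service metastore` before launching the Thrift server). A minimal Python sketch of the same reachability check the metastore client is effectively performing (the function name is illustrative):

```python
import socket

def metastore_reachable(host: str = "localhost", port: int = 9083,
                        timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds.

    A False result corresponds to the 'Connection refused' in the stack
    traces above: no process is accepting connections on the port.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Running this before starting HiveThriftServer2 makes the failure mode obvious without waiting through the client's retry loop.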
Spark Command: /Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/bin/java -cp /Users/ibm/atif/spark/conf/:/Users/ibm/atif/spark/jars/* -Xmx1g org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 --name Thrift JDBC/ODBC Server spark-internal
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5bdaf2ce{/jobs/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@42d236fb{/jobs/job,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@19f21b6b{/jobs/job/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1532c619{/stages,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@46044faa{/stages/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1358b28e{/stages/stage,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7de4a01f{/stages/stage/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2bfeb1ef{/stages/pool,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@778ca8ef{/stages/pool/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@208e9ef6{/storage,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@78b236a0{/storage/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@261d8190{/storage/rdd,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@34448e6c{/storage/rdd/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@60e9df3c{/environment,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@907f2b7{/environment/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@435ce306{/executors,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@537b32ef{/executors/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7dc51783{/executors/threadDump,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4b61d0c6{/executors/threadDump/json,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6f815e7f{/static,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@110844f6{/,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6f89f665{/api,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3a43d133{/jobs/job/kill,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@39ce27f2{/stages/stage/kill,null,AVAILABLE,@spark}
2019-04-04 17:30:01 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://10.16.2.83:4040
2019-04-04 17:30:01 INFO Executor:54 - Starting executor ID driver on host localhost
2019-04-04 17:30:01 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54994.
2019-04-04 17:30:01 INFO NettyBlockTransferService:54 - Server created on 10.16.2.83:54994
2019-04-04 17:30:01 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2019-04-04 17:30:02 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, 10.16.2.83, 54994, None)
2019-04-04 17:30:02 INFO BlockManagerMasterEndpoint:54 - Registering block manager 10.16.2.83:54994 with 366.3 MB RAM, BlockManagerId(driver, 10.16.2.83, 54994, None)
2019-04-04 17:30:02 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, 10.16.2.83, 54994, None)
2019-04-04 17:30:02 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, 10.16.2.83, 54994, None)
2019-04-04 17:30:02 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@54562ea6{/metrics/json,null,AVAILABLE,@spark}
2019-04-04 17:30:02 INFO SharedState:54 - loading hive config file: file:/Users/ibm/atif/spark/conf/hive-site.xml
2019-04-04 17:30:02 INFO SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/Users/ibm/atif/spark/spark-warehouse').
2019-04-04 17:30:02 INFO SharedState:54 - Warehouse path is 'file:/Users/ibm/atif/spark/spark-warehouse'.
2019-04-04 17:30:02 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6a9950f1{/SQL,null,AVAILABLE,@spark}
2019-04-04 17:30:02 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7ad54c55{/SQL/json,null,AVAILABLE,@spark}
2019-04-04 17:30:02 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1a3e5f23{/SQL/execution,null,AVAILABLE,@spark}
2019-04-04 17:30:02 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6293e39e{/SQL/execution/json,null,AVAILABLE,@spark}
2019-04-04 17:30:02 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3daa82be{/static/sql,null,AVAILABLE,@spark}
2019-04-04 17:30:02 INFO HiveUtils:54 - Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
2019-04-04 17:30:03 INFO metastore:376 - Trying to connect to metastore with URI thrift://localhost:9083
2019-04-04 17:30:03 WARN metastore:428 - Failed to connect to the MetaStore Server...
2019-04-04 17:30:03 INFO metastore:459 - Waiting 1 seconds before next connection attempt.
2019-04-04 17:30:04 INFO metastore:376 - Trying to connect to metastore with URI thrift://localhost:9083
2019-04-04 17:30:04 WARN metastore:428 - Failed to connect to the MetaStore Server...
2019-04-04 17:30:04 INFO metastore:459 - Waiting 1 seconds before next connection attempt.
2019-04-04 17:30:05 INFO metastore:376 - Trying to connect to metastore with URI thrift://localhost:9083
2019-04-04 17:30:05 WARN metastore:428 - Failed to connect to the MetaStore Server...
2019-04-04 17:30:05 INFO metastore:459 - Waiting 1 seconds before next connection attempt.
2019-04-04 17:30:06 WARN Hive:168 - Failed to access metastore. This class should not accessed in runtime.
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:166)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:53)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:79)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
... 36 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
... 42 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused (Connection refused)
at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:166)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:53)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:79)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
... 50 more
)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:466)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
... 47 more
2019-04-04 17:30:06 INFO metastore:376 - Trying to connect to metastore with URI thrift://localhost:9083
2019-04-04 17:30:06 WARN metastore:428 - Failed to connect to the MetaStore Server...
2019-04-04 17:30:06 INFO metastore:459 - Waiting 1 seconds before next connection attempt.
2019-04-04 17:30:07 INFO metastore:376 - Trying to connect to metastore with URI thrift://localhost:9083
2019-04-04 17:30:07 WARN metastore:428 - Failed to connect to the MetaStore Server...
2019-04-04 17:30:07 INFO metastore:459 - Waiting 1 seconds before next connection attempt.
2019-04-04 17:30:08 INFO metastore:376 - Trying to connect to metastore with URI thrift://localhost:9083
2019-04-04 17:30:08 WARN metastore:428 - Failed to connect to the MetaStore Server...
2019-04-04 17:30:08 INFO metastore:459 - Waiting 1 seconds before next connection attempt.
Exception in thread "main" org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient;
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:53)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:79)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
... 18 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
... 33 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
... 39 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused (Connection refused)
at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:53)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:79)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
... 47 more
)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:466)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
... 44 more
2019-04-04 17:30:09 INFO SparkContext:54 - Invoking stop() from shutdown hook
2019-04-04 17:30:09 INFO AbstractConnector:318 - Stopped Spark@bf71cec{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-04-04 17:30:09 INFO SparkUI:54 - Stopped Spark web UI at http://10.16.2.83:4040
2019-04-04 17:30:09 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2019-04-04 17:30:09 INFO MemoryStore:54 - MemoryStore cleared
2019-04-04 17:30:09 INFO BlockManager:54 - BlockManager stopped
2019-04-04 17:30:09 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2019-04-04 17:30:09 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2019-04-04 17:30:09 INFO SparkContext:54 - Successfully stopped SparkContext
2019-04-04 17:30:09 INFO ShutdownHookManager:54 - Shutdown hook called
2019-04-04 17:30:09 INFO ShutdownHookManager:54 - Deleting directory /private/var/folders/x6/5rj4r3f94m96hr12q2srdz700000gn/T/spark-d554e64a-4cfd-46db-a63d-27da4372ed0c
2019-04-04 17:30:09 INFO ShutdownHookManager:54 - Deleting directory /private/var/folders/x6/5rj4r3f94m96hr12q2srdz700000gn/T/spark-70a9d853-cd5e-49d1-8526-2dcc293d2e4f
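For what it's worth: the root cause in the trace above is the repeated `java.net.ConnectException: Connection refused` against `thrift://localhost:9083`, i.e. nothing is listening on the Hive metastore port when HiveThriftServer2 starts. A quick way to confirm before relaunching is to probe the port; this is a minimal sketch (the `port_open` helper is hypothetical, not part of Spark or Hive), assuming the default metastore URI from `conf/hive-site.xml`:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP listener accepts connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers "Connection refused" as well as timeouts / unreachable hosts.
        return False

# The log shows hive.metastore.uris pointing at thrift://localhost:9083.
if not port_open("localhost", 9083):
    print("No metastore listening on localhost:9083 -- start it "
          "(e.g. `hive --service metastore`) or fix hive.metastore.uris "
          "in conf/hive-site.xml before launching HiveThriftServer2")
```

If the probe fails, starting the standalone metastore service (or correcting `hive.metastore.uris` if the metastore runs on another host) should let the thrift server come up past `HiveExternalCatalog.databaseExists`.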