PLEASE HELP: ./shark-withinfo not connecting to spark master

PLEASE HELP: ./shark-withinfo not connecting to spark master

danoomistmatiste
Hi,  I have posted this query a couple of times but not received any responses.    

I have the following components installed and running for Spark:

scala-2.9.3
spark-0.8.1-incubating-bin-cdh4

I am able to start the Spark master (running on port 7077) and one worker. I have also installed Shark (shark-0.8.0-bin-cdh4). I have set the following in my shark-env.sh:

export HADOOP_HOME=/Users/hadoop/hadoop-2.0.0-cdh4.2.0
export HIVE_HOME=/Users/hadoop/shark-0.8.0-bin-cdh4/hive-0.9.0-shark-0.8.0-bin
export MASTER=spark://localhost:7077
export SPARK_HOME=/Users/hadoop/spark-0.8.1-incubating-bin-cdh4
export SPARK_MEM=1g
export SCALA_HOME=/Users/hadoop/scala-2.9.3

However, when I try to run the Shark shell with ./shark-withinfo, I get the following errors (buried within a lot of other INFO messages):

14/01/08 15:32:28 ERROR client.Client$ClientActor: Connection to master failed; stopping client
14/01/08 15:32:28 ERROR cluster.SparkDeploySchedulerBackend: Disconnected from Spark cluster!
14/01/08 15:32:28 ERROR cluster.ClusterScheduler: Exiting due to error from cluster scheduler: Disconnected from Spark cluster

Anyone run into this issue before?

Re: PLEASE HELP: ./shark-withinfo not connecting to spark master

Andrew Ash
Hello,

Shark doesn't yet have a release matching the recent Spark 0.8.1. If you want to run Shark, you'll need to stick with Spark 0.8.0 for the moment until Shark 0.8.1 is released. I'd guess that dropping back to Spark 0.8.0 would fix your problem.
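Roughly, the only change needed is pointing SPARK_HOME (and the running master/worker) at a 0.8.0 build instead of the 0.8.1 one. In shark-env.sh that would look something like the lines below; the directory names are only illustrative, based on the layout from your message:

# shark-env.sh: swap the Spark build (directory name is illustrative)
export SPARK_HOME=/Users/hadoop/spark-0.8.0-incubating-bin-cdh4
export MASTER=spark://localhost:7077

# restart the standalone master and worker from that same 0.8.0 build so the
# client and cluster versions match (in Spark 0.8.x these scripts live under bin/)
$SPARK_HOME/bin/stop-all.sh
$SPARK_HOME/bin/start-all.sh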

Andrew

Re: PLEASE HELP: ./shark-withinfo not connecting to spark master

danoomistmatiste
Andrew, thank you very much. That is exactly what I did: I downloaded Spark 0.8.0, rebuilt it, and now I am able to connect to the Spark master successfully. However, I am running into another issue when trying to run commands from the Shark shell.

A simple show tables; command gives me the error below. I have configured hive-default.xml and placed it in the Hive directory packaged with Shark.


     > show tables;
java.lang.NoSuchFieldError: METASTORE_MODE
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:110)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2092)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2102)
        at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1076)
        at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1065)
        at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1992)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:323)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:134)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1312)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1104)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:937)
        at shark.SharkCliDriver.processCmd(SharkCliDriver.scala:294)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:341)
        at shark.SharkCliDriver$.main(SharkCliDriver.scala:203)
        at shark.SharkCliDriver.main(SharkCliDriver.scala)
FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.DDLTask


Re: PLEASE HELP: ./shark-withinfo not connecting to spark master

Andrew Ash
I haven't seen that particular error before, but Shark only works with its bundled version of Hive 0.9.0, not any other version. The reason is that Shark had to patch Hive 0.9.0, so it isn't vanilla 0.9.0, and moving Shark to later versions of Hive takes some dev work that hasn't quite landed yet.

Make sure you're using hive-0.9.0-shark-0.8.0 and don't have any other hive jars on your classpath anywhere.
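A quick way to sanity-check that (paths are illustrative, matching your earlier message):

# HIVE_HOME in shark-env.sh should point at the Hive build bundled with Shark
export HIVE_HOME=/Users/hadoop/shark-0.8.0-bin-cdh4/hive-0.9.0-shark-0.8.0-bin

# look for any other Hive jars that could be picked up ahead of it,
# e.g. from a standalone Hive install or under HADOOP_HOME
find /Users/hadoop -name "hive-*.jar" 2>/dev/null | grep -v hive-0.9.0-shark-0.8.0-bin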


Re: PLEASE HELP: ./shark-withinfo not connecting to spark master

danoomistmatiste
I have now pointed all Hive references to the version packaged with Shark. When I try to run any command from the Shark shell, I get this error:

Caused by: java.sql.SQLException: null,  message from server: "Host '192.168.1.172' is not allowed to connect to this MySQL server"
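From what I can tell this message comes from MySQL itself, so it looks like a privilege issue on the metastore database rather than anything Shark-specific. I am guessing I need a grant along these lines on the MySQL server; the user, password and database names below are placeholders for whatever my metastore config actually uses:

# run on the machine hosting the metastore MySQL database; all names are placeholders
mysql -u root -p -e "GRANT ALL PRIVILEGES ON metastore.* TO 'hiveuser'@'192.168.1.172' IDENTIFIED BY 'hivepassword'; FLUSH PRIVILEGES;"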