
Problem with spark-iot-analytics #1

Open
felansu opened this issue Nov 23, 2016 · 3 comments
Comments

felansu commented Nov 23, 2016

All the commands in the README.md file work and run fine, but after about 2 minutes with ./bin/runStreaming.sh running, this error appears and kills the process...

felansu@universia:~/Repository/bahir-iot-demo/spark-iot-analytics$ ./bin/runStreaming.sh 
 is NOT set
SPARK_HOME defined as '/home/felansu/Software/spark-2.0.2-bin-hadoop2.7'

[info] Loading project definition from /home/felansu/Repository/bahir-iot-demo/spark-iot-analytics/project
[info] Set current project to spark-iot-analytics (in build file:/home/felansu/Repository/bahir-iot-demo/spark-iot-analytics/)
[success] Total time: 1 s, completed 23/11/2016 01:46:52
[info] Updating {file:/home/felansu/Repository/bahir-iot-demo/spark-iot-analytics/}spark-iot-analytics...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 2 Scala sources to /home/felansu/Repository/bahir-iot-demo/spark-iot-analytics/target/scala-2.11/classes...
[success] Total time: 26 s, completed 23/11/2016 01:47:18
[warn] Multiple main classes detected.  Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/felansu/Repository/bahir-iot-demo/spark-iot-analytics/target/scala-2.11/spark-iot-analytics_2.11-1.0.jar ...
[info] Done packaging.
[success] Total time: 0 s, completed 23/11/2016 01:47:18
[warn] Multiple main classes detected.  Run 'show discoveredMainClasses' to see the list
[info] Including: spark-tags_2.11-2.0.1.jar
[info] Including: org.eclipse.paho.client.mqttv3-1.0.2.jar
[info] Including: scala-reflect-2.11.8.jar
[info] Including: scala-library-2.11.8.jar
[info] Including: scalatest_2.11-2.2.6.jar
[info] Run completed in 90 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Including: scala-xml_2.11-1.0.6.jar
[info] Including: unused-1.0.0.jar
[info] Including: org.eclipse.paho.client.mqttv3-1.0.2.jar
[info] Checking every *.class/*.jar file's SHA-1.
[info] Merging files...
[warn] Merging 'META-INF/NOTICE' with strategy 'rename'
[warn] Merging 'about.html' with strategy 'rename'
[warn] Merging 'META-INF/LICENSE' with strategy 'rename'
[warn] Merging 'META-INF/DEPENDENCIES' with strategy 'discard'
[warn] Merging 'META-INF/MANIFEST.MF' with strategy 'discard'
[warn] Merging 'META-INF/maven/org.apache.spark/spark-tags_2.11/pom.properties' with strategy 'discard'
[warn] Merging 'META-INF/maven/org.apache.spark/spark-tags_2.11/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/org.eclipse.paho/org.eclipse.paho.client.mqttv3/pom.properties' with strategy 'discard'
[warn] Merging 'META-INF/maven/org.eclipse.paho/org.eclipse.paho.client.mqttv3/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/org.spark-project.spark/unused/pom.properties' with strategy 'discard'
[warn] Merging 'META-INF/maven/org.spark-project.spark/unused/pom.xml' with strategy 'discard'
[error] 1 error was encountered during merge
java.lang.RuntimeException: deduplicate: different file contents found in the following:
/home/felansu/.ivy2/cache/org.apache.spark/spark-tags_2.11/jars/spark-tags_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
/home/felansu/.ivy2/cache/org.spark-project.spark/unused/jars/unused-1.0.0.jar:org/apache/spark/unused/UnusedStubClass.class
	at sbtassembly.Assembly$.applyStrategies(Assembly.scala:140)
	at sbtassembly.Assembly$.x$1$lzycompute$1(Assembly.scala:25)
	at sbtassembly.Assembly$.x$1$1(Assembly.scala:23)
	at sbtassembly.Assembly$.stratMapping$lzycompute$1(Assembly.scala:23)
	at sbtassembly.Assembly$.stratMapping$1(Assembly.scala:23)
	at sbtassembly.Assembly$.inputs$lzycompute$1(Assembly.scala:67)
	at sbtassembly.Assembly$.inputs$1(Assembly.scala:57)
	at sbtassembly.Assembly$.apply(Assembly.scala:83)
	at sbtassembly.Assembly$$anonfun$assemblyTask$1.apply(Assembly.scala:242)
	at sbtassembly.Assembly$$anonfun$assemblyTask$1.apply(Assembly.scala:239)
	at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
	at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
	at sbt.std.Transform$$anon$4.work(System.scala:63)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
	at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
	at sbt.Execute.work(Execute.scala:237)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
	at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
	at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
[error] (*:assembly) deduplicate: different file contents found in the following:
[error] /home/felansu/.ivy2/cache/org.apache.spark/spark-tags_2.11/jars/spark-tags_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] /home/felansu/.ivy2/cache/org.spark-project.spark/unused/jars/unused-1.0.0.jar:org/apache/spark/unused/UnusedStubClass.class
[error] Total time: 2 s, completed 23/11/2016 01:47:21
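The `deduplicate` failure above is a classic sbt-assembly conflict: both `spark-tags_2.11-2.0.1.jar` and `unused-1.0.0.jar` ship `org/apache/spark/unused/UnusedStubClass.class` with different contents. A common workaround (a sketch only, not this project's actual `build.sbt`; the key names assume the sbt-assembly 0.14.x plugin that was current at the time) is an explicit merge strategy that keeps one copy of the stub:

```scala
// Sketch for build.sbt, assuming sbt-assembly 0.14.x.
// Keep one copy of the duplicated Spark stub class and fall back to
// the plugin's default strategy for everything else.
assemblyMergeStrategy in assembly := {
  case PathList("org", "apache", "spark", "unused", _ @ _*) =>
    MergeStrategy.first
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```

With this in place, re-running `sbt assembly` should package the fat jar without the merge error.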
Starting Spark Application at /home/felansu/Software/spark-2.0.2-bin-hadoop2.7
Ivy Default Cache set to: /home/felansu/.ivy2/cache
The jars for the packages stored in: /home/felansu/.ivy2/jars
:: loading settings :: url = jar:file:/home/felansu/Software/spark-2.0.2-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.bahir#spark-streaming-mqtt_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
	confs: [default]
	found org.apache.bahir#spark-streaming-mqtt_2.11;2.0.1 in list
	found org.apache.spark#spark-tags_2.11;2.0.1 in central
	found org.scalatest#scalatest_2.11;2.2.6 in list
	found org.scala-lang#scala-reflect;2.11.8 in list
	[2.11.8] org.scala-lang#scala-reflect;2.11.8
	found org.scala-lang.modules#scala-xml_2.11;1.0.2 in list
	found org.spark-project.spark#unused;1.0.0 in list
	found org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.0.2 in list
:: resolution report :: resolve 4560ms :: artifacts dl 17ms
	:: modules in use:
	org.apache.bahir#spark-streaming-mqtt_2.11;2.0.1 from list in [default]
	org.apache.spark#spark-tags_2.11;2.0.1 from central in [default]
	org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.0.2 from list in [default]
	org.scala-lang#scala-reflect;2.11.8 from list in [default]
	org.scala-lang.modules#scala-xml_2.11;1.0.2 from list in [default]
	org.scalatest#scalatest_2.11;2.2.6 from list in [default]
	org.spark-project.spark#unused;1.0.0 from list in [default]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   7   |   1   |   1   |   0   ||   7   |   0   |
	---------------------------------------------------------------------

:: problems summary ::
:::: ERRORS
	unknown resolver sbt-chain

	unknown resolver sbt-chain

	unknown resolver null

	unknown resolver sbt-chain

	unknown resolver sbt-chain

	unknown resolver sbt-chain

	unknown resolver sbt-chain

	unknown resolver sbt-chain


:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent
	confs: [default]
	0 artifacts copied, 7 already retrieved (0kB/16ms)
>>> mqtt server tcp://localhost:1883
>>> topic       bahir/iot/id/simulator/evt/power
16/11/23 01:48:28 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
16/11/23 01:48:28 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:

org.apache.spark.SparkContext.<init>(SparkContext.scala:77)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:836)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
org.apache.bahir.iot.MQTTStreamingApplication$.main(MQTTStreamingApplication.scala:46)
org.apache.bahir.iot.MQTTStreamingApplication.main(MQTTStreamingApplication.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

The currently active SparkContext was created at:

(No active SparkContext.)
         
	at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:101)
	at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1658)
	at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2162)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:542)
	at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:836)
	at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
	at org.apache.bahir.iot.MQTTStreamingApplication$.main(MQTTStreamingApplication.scala:46)
	at org.apache.bahir.iot.MQTTStreamingApplication.main(MQTTStreamingApplication.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Exception in thread "main" java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:

org.apache.spark.SparkContext.<init>(SparkContext.scala:77)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:836)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
org.apache.bahir.iot.MQTTStreamingApplication$.main(MQTTStreamingApplication.scala:46)
org.apache.bahir.iot.MQTTStreamingApplication.main(MQTTStreamingApplication.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

The currently active SparkContext was created at:

(No active SparkContext.)
         
	at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:101)
	at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1658)
	at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2162)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:542)
	at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:836)
	at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
	at org.apache.bahir.iot.MQTTStreamingApplication$.main(MQTTStreamingApplication.scala:46)
	at org.apache.bahir.iot.MQTTStreamingApplication.main(MQTTStreamingApplication.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
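The later failure ("All masters are unresponsive! Giving up.") is separate from the merge error: `StandaloneSchedulerBackend` in the log means the application was submitted to a standalone master it could not reach, so the SparkContext stopped before the streaming job started. A hypothetical way to rule the cluster out (the class name and jar path are taken from the log above; the `--master` value is an assumption, not what runStreaming.sh actually passes) is to run the same application against a local master:

```shell
# Diagnostic sketch: run locally so no standalone master is required.
# Class name and jar path mirror the log output above.
$SPARK_HOME/bin/spark-submit \
  --master "local[2]" \
  --class org.apache.bahir.iot.MQTTStreamingApplication \
  --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.0.1 \
  target/scala-2.11/spark-iot-analytics_2.11-1.0.jar
```

If this runs, the problem is the master URL or a dead/unreachable standalone master rather than the application itself.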
lresende (Owner) commented

@felansu Thanks for reporting this; let me take a look and see if I can reproduce (and fix) it...
FYI, I had it running fine for a few hours of testing last week...

felansu commented Nov 23, 2016

@lresende I am in a corporate environment now, and I will try to reproduce it here.

felansu commented Nov 24, 2016

@lresende I sent an e-mail to [email protected]
