When I was testing monospark on our own cluster (instead of EC2), I encountered a NoSuchFileException when HdfsDiskMonotask tried to access the local disk.
I was using Scala 2.10.4 and JRE 1.8 (build 1.8.0_131-b11).
The reason is that sun.nio does not accept paths that start with "file:/". I can reproduce this NoSuchFileException with Scala 2.10.4.
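To make the failure concrete, here is a standalone reproduction; the path "file:/tmp" is illustrative (the real value came from the HDFS data-directory configuration):

```scala
import java.nio.file.{Files, NoSuchFileException, Paths}

object NoSuchFileRepro {
  def main(args: Array[String]): Unit = {
    // Hadoop's local FileSystem reports its data directories with a URI
    // scheme prefix, e.g. "file:/tmp". java.nio treats the whole string
    // as a literal path, so the file-store lookup fails.
    val rawDir = "file:/tmp" // illustrative; the real path came from the DataNode config
    try {
      Files.getFileStore(Paths.get(rawDir))
    } catch {
      case e: NoSuchFileException =>
        // Prints: java.nio.file.NoSuchFileException: file:/tmp
        println(s"${e.getClass.getName}: ${e.getMessage}")
    }
  }
}
```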
I fixed this by stripping the "file:" prefix from localDirectory at line 153 of core/src/main/scala/org/apache/spark/monotasks/disk/HdfsDiskMonotask.scala (commit ad17278):
I return "localDirectory.map(_.stripPrefix("file:"))" instead of "localDirectory".
17/11/19 12:50:24 ERROR LocalDagScheduler: LocalDagScheduler event loop failed
java.nio.file.NoSuchFileException: file:/home/hadoop/hadoop-2.0.0-cdh4.2.0/tmp/dfs/data
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileStore.devFor(UnixFileStore.java:57)
at sun.nio.fs.UnixFileStore.<init>(UnixFileStore.java:64)
at sun.nio.fs.LinuxFileStore.<init>(LinuxFileStore.java:44)
at sun.nio.fs.LinuxFileSystemProvider.getFileStore(LinuxFileSystemProvider.java:51)
at sun.nio.fs.LinuxFileSystemProvider.getFileStore(LinuxFileSystemProvider.java:39)
at sun.nio.fs.UnixFileSystemProvider.getFileStore(UnixFileSystemProvider.java:368)
at java.nio.file.Files.getFileStore(Files.java:1461)
at org.apache.spark.storage.BlockFileManager$.getDiskNameFromPath(BlockFileManager.scala:211)
at org.apache.spark.monotasks.disk.HdfsDiskMonotask$$anonfun$4.apply(HdfsDiskMonotask.scala:79)
at org.apache.spark.monotasks.disk.HdfsDiskMonotask$$anonfun$4.apply(HdfsDiskMonotask.scala:75)
at scala.Option.flatMap(Option.scala:170)
at org.apache.spark.monotasks.disk.HdfsDiskMonotask.chooseLocalDir(HdfsDiskMonotask.scala:75)
at org.apache.spark.monotasks.disk.DiskScheduler.submitTask(DiskScheduler.scala:158)
at org.apache.spark.monotasks.LocalDagScheduler.org$apache$spark$monotasks$LocalDagScheduler$$scheduleMonotask(LocalDagScheduler.scala:399)
at org.apache.spark.monotasks.LocalDagScheduler.org$apache$spark$monotasks$LocalDagScheduler$$submitMonotask(LocalDagScheduler.scala:200)
at org.apache.spark.monotasks.LocalDagScheduler$$anonfun$onReceive$1.apply(LocalDagScheduler.scala:180)
at org.apache.spark.monotasks.LocalDagScheduler$$anonfun$onReceive$1.apply(LocalDagScheduler.scala:180)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.monotasks.LocalDagScheduler.onReceive(LocalDagScheduler.scala:180)
at org.apache.spark.monotasks.LocalDagScheduler.onReceive(LocalDagScheduler.scala:43)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
17/11/19 12:50:24 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[local-dag-scheduler-event-loop,5,main]
java.nio.file.NoSuchFileException: file:/home/hadoop/hadoop-2.0.0-cdh4.2.0/tmp/dfs/data
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileStore.devFor(UnixFileStore.java:57)
at sun.nio.fs.UnixFileStore.<init>(UnixFileStore.java:64)
at sun.nio.fs.LinuxFileStore.<init>(LinuxFileStore.java:44)
at sun.nio.fs.LinuxFileSystemProvider.getFileStore(LinuxFileSystemProvider.java:51)
at sun.nio.fs.LinuxFileSystemProvider.getFileStore(LinuxFileSystemProvider.java:39)
at sun.nio.fs.UnixFileSystemProvider.getFileStore(UnixFileSystemProvider.java:368)
at java.nio.file.Files.getFileStore(Files.java:1461)
at org.apache.spark.storage.BlockFileManager$.getDiskNameFromPath(BlockFileManager.scala:211)
at org.apache.spark.monotasks.disk.HdfsDiskMonotask$$anonfun$4.apply(HdfsDiskMonotask.scala:79)
at org.apache.spark.monotasks.disk.HdfsDiskMonotask$$anonfun$4.apply(HdfsDiskMonotask.scala:75)
at scala.Option.flatMap(Option.scala:170)
at org.apache.spark.monotasks.disk.HdfsDiskMonotask.chooseLocalDir(HdfsDiskMonotask.scala:75)
at org.apache.spark.monotasks.disk.DiskScheduler.submitTask(DiskScheduler.scala:158)
at org.apache.spark.monotasks.LocalDagScheduler.org$apache$spark$monotasks$LocalDagScheduler$$scheduleMonotask(LocalDagScheduler.scala:399)
at org.apache.spark.monotasks.LocalDagScheduler.org$apache$spark$monotasks$LocalDagScheduler$$submitMonotask(LocalDagScheduler.scala:200)
at org.apache.spark.monotasks.LocalDagScheduler$$anonfun$onReceive$1.apply(LocalDagScheduler.scala:180)
at org.apache.spark.monotasks.LocalDagScheduler$$anonfun$onReceive$1.apply(LocalDagScheduler.scala:180)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.monotasks.LocalDagScheduler.onReceive(LocalDagScheduler.scala:180)
at org.apache.spark.monotasks.LocalDagScheduler.onReceive(LocalDagScheduler.scala:43)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)