Task not serializable: java.io.NotSerializableException: java.io.PrintStream (Java)


Post by Guest »

I am learning Spark with Java. I am doing very basic, built-in operations, but the job fails with a "Task not serializable" error. I don't know what went wrong, since no custom class is used:
import java.util.Arrays;
import java.util.List;

import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class FlatmapTest {

    public static void main(String[] args) {
        Logger.getLogger("org.apache").setLevel(Level.ERROR);

        SparkConf sparkConf = new SparkConf().setAppName("learningSpark").setMaster("local[*]");
        JavaSparkContext javaSparkContext = new JavaSparkContext(sparkConf);

        // FlatMap and Filter
        List<String> logs = Arrays.asList(
                "WARN: Tuesday 4 Sep 1024",
                "FATAL: Tuesday 4 Sep 1222",
                "WARN: Tuesday 4 Sep 2028",
                "FATAL: Tuesday 4 Sep 1428",
                "DEBUG: Tuesday 4 Sep 1812",
                "WARN: Tuesday 4 Sep 2218");

        javaSparkContext.parallelize(logs)
                .flatMap(log -> Arrays.asList(log.split(" ")).iterator())
                .filter(string -> string.length() > 1)
                .foreach(System.out::println);

        javaSparkContext.close();
    }
}
Error message:
Exception in thread "main" org.apache.spark.SparkException: Task not serializable
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:298)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2037)
at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:874)
at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:873)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
at org.apache.spark.rdd.RDD.foreach(RDD.scala:873)
at org.apache.spark.api.java.JavaRDDLike$class.foreach(JavaRDDLike.scala:350)
at org.apache.spark.api.java.AbstractJavaRDDLike.foreach(JavaRDDLike.scala:45)
at com.virtualpairprogrammers.rdd.FlatmapTest.main(FlatmapTest.java:47)

Caused by: java.io.NotSerializableException: java.io.PrintStream
Serialization stack:
- object not serializable (class: java.io.PrintStream, value: java.io.PrintStream@1d8e2eea)
- element of array (index: 0)
- array (class [Ljava.lang.Object;, size 1)
- field (class: java.lang.invoke.SerializedLambda, name: capturedArgs, type: class [Ljava.lang.Object;)
- object (class java.lang.invoke.SerializedLambda, SerializedLambda[capturingClass=class com.virtualpairprogrammers.rdd.FlatmapTest, functionalInterfaceMethod=org/apache/spark/api/java/function/VoidFunction.call:(Ljava/lang/Object;)V, implementation=invokeVirtual java/io/PrintStream.println:(Ljava/lang/String;)V, instantiatedMethodType=(Ljava/lang/String;)V, numCaptured=1])
- writeReplace data (class: java.lang.invoke.SerializedLambda)
- object (class com.virtualpairprogrammers.rdd.FlatmapTest$$Lambda$4/761863997, com.virtualpairprogrammers.rdd.FlatmapTest$$Lambda$4/761863997@347bdeef)
- field (class: org.apache.spark.api.java.JavaRDDLike$$anonfun$foreach$1, name: f$14, type: interface org.apache.spark.api.java.function.VoidFunction)
- object (class org.apache.spark.api.java.JavaRDDLike$$anonfun$foreach$1, )
at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:295)
... 12 more
Please guide me on how to solve this error, and on the approach I should take when debugging Spark code.

Thanks in advance.
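
The serialization stack above already contains the clue: the capturedArgs entry shows a java.io.PrintStream stored inside the closure. System.out::println is a bound method reference, so System.out (a PrintStream, which is not serializable) is evaluated on the driver and captured as part of the function that Spark then tries to ship to the executors. Below is a minimal sketch of the two usual workarounds, reusing the pipeline from the post:

// Sketch of two workarounds, using the same logs list as above.

// 1) Plain lambda instead of the method reference: the lambda captures
//    no state, so the closure serializes; the static field System.out
//    is only read when the function actually runs on an executor.
javaSparkContext.parallelize(logs)
        .flatMap(log -> Arrays.asList(log.split(" ")).iterator())
        .filter(word -> word.length() > 1)
        .foreach(word -> System.out.println(word));

// 2) Collect to the driver first: collect() returns an ordinary
//    java.util.List, whose forEach involves no Spark serialization,
//    so the bound method reference is harmless here.
javaSparkContext.parallelize(logs)
        .flatMap(log -> Arrays.asList(log.split(" ")).iterator())
        .filter(word -> word.length() > 1)
        .collect()
        .forEach(System.out::println);

Note that with foreach the printing happens on the executors; under local[*] that is the same JVM, but on a real cluster the output would land in the executor logs, which is why collecting first is often what you actually want for inspection. As a general debugging approach, read Spark's "Serialization stack" from the top: the first "object not serializable" line names the offending value, and the lines below it trace the chain of references (captured lambda arguments, fields, wrapper objects) through which it ended up inside the task closure.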
