Package-private scope in Scala visible from Java
While using, from Java, the bytecode generated by the Scala compiler, I stumbled on some surprising behavior of Scala's scoping. Consider the following code snippet using Spark (Spark 1.4, Hadoop 2.6):
```java
import java.util.Arrays;
import java.util.List;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.broadcast.Broadcast;

public class Test {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(new SparkConf()
                .setMaster("local[*]")
                .setAppName("test"));

        Broadcast<List<Integer>> broadcast = sc.broadcast(Arrays.asList(1, 2, 3));
        broadcast.destroy(true);

        // fails with java.io.IOException: org.apache.spark.SparkException:
        // Attempted to use Broadcast(0) after it was destroyed
        sc.parallelize(Arrays.asList("task1", "task2"), 2)
          .foreach(x -> System.out.println(broadcast.getValue()));
    }
}
```
This code fails because I deliberately destroy the broadcast before using it. That part is expected; what surprises me is that, in my mental model, this code should not even compile, let alone run.
Indeed, `Broadcast.destroy(Boolean)` is declared `private[spark]`, so it should not be visible from my code. I would try to inspect the bytecode of `Broadcast` myself, but bytecode is not my specialty, so I prefer to post this question. Also, apologies for being too lazy to build an example that doesn't depend on Spark, but at least you get the idea. Note that I can use all sorts of Spark's package-private methods, not just this one.
Any idea what is going on?
Solution
If we reconstruct this problem with a simpler example:
```scala
package yuvie

class X {
  private[yuvie] def destory(d: Boolean) = true
}
```
And inspect the compiled class with javap:
```
[yuvali@localhost yuvie]$ javap -p X.class
Compiled from "X.scala"
public class yuvie.X {
  public boolean destory(boolean);
  public yuvie.X();
}
```
We see that Scala's `private[package]` ends up `public` in the bytecode. Why? Because Scala's package-private is not the same as Java's: `private[yuvie]` also grants access from subpackages of `yuvie`, which neither Java's default (package-private) access nor any JVM access flag can express. So the Scala compiler emits the member as `public` and enforces the restriction only at Scala compile time, which is why javac never objects. There is a good explanation in this post:
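To connect this back to the original question, here is a minimal pure-Java sketch of what `scalac` effectively emits for the `X` class above. The class names `X` and `Demo` are hypothetical stand-ins for illustration, not the real generated classes; the point is that the method is a plain `public` method at the bytecode level, so `javac` happily lets outside code call it.

```java
import java.lang.reflect.Method;
import java.lang.reflect.Modifier;

// Hypothetical Java stand-in for what scalac emits for
//   class X { private[yuvie] def destory(d: Boolean) = true }
// The private[yuvie] qualifier leaves no trace in the access flags.
class X {
    public boolean destory(boolean d) { return true; }
}

public class Demo {
    public static void main(String[] args) throws Exception {
        Method m = X.class.getDeclaredMethod("destory", boolean.class);

        // Reflection confirms the method carries the PUBLIC access flag,
        // which is all the Java compiler looks at:
        System.out.println(Modifier.isPublic(m.getModifiers())); // true

        // So this call, forbidden by the Scala compiler outside the
        // yuvie package, is perfectly legal Java:
        System.out.println(new X().destory(true)); // true
    }
}
```

This is also why the Spark example compiles: `javac` checks the JVM access flags, and the only place `private[spark]` is enforced is inside `scalac` itself.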