Java – foreach function does not work in spark dataframe

According to the DataFrame API, the definition is:

public void foreach(scala.Function1<Row,scala.runtime.BoxedUnit> f)

Applies a function f to all rows.

But when I try:

DataFrame df = sql.read()
    .format("com.databricks.spark.csv")
    .option("header","true")
    .load("file:///home/hadoop/Desktop/examples.csv");

df.foreach(x->
{
   System.out.println(x);
});

I get a compile-time error. Is there anything wrong?

Solution

`DataFrame.foreach` expects a `scala.Function1`, which a Java lambda cannot target, while `JavaRDD.foreach` accepts a Java-friendly functional interface. You can convert the DataFrame to a JavaRDD to use the lambda, as follows:

df.toJavaRDD().foreach(x->
   System.out.println(x)
);
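The underlying issue is a Java language rule: a lambda can only target a *functional interface* (an interface with exactly one abstract method). In the Spark 1.x era, `scala.Function1`, as seen from Java, exposes more than one abstract method, so the compiler rejects the lambda; `JavaRDD.foreach` instead takes a single-method interface. A minimal plain-Java sketch of this distinction, using toy stand-in types (`RowAction`, `forEachRow` are hypothetical, not the real Spark API):

```java
import java.util.List;

public class LambdaTargets {
    // Hypothetical single-abstract-method interface, standing in for the
    // Java-friendly callback that JavaRDD.foreach accepts. Because it has
    // exactly one abstract method, a lambda can target it.
    interface RowAction {
        void call(String row);
    }

    // Hypothetical stand-in for JavaRDD.foreach: it accepts the functional
    // interface, so callers may pass a lambda.
    static void forEachRow(List<String> rows, RowAction action) {
        for (String row : rows) {
            action.call(row);
        }
    }

    // Collects rows through the lambda-friendly foreach, for demonstration.
    static String collectRows(List<String> rows) {
        StringBuilder out = new StringBuilder();
        forEachRow(rows, row -> out.append(row)); // lambda compiles fine here
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(collectRows(List.of("a", "b", "c"))); // prints abc
    }
}
```

Had `RowAction` declared a second abstract method, the lambda on the `forEachRow` call would fail to compile with the same kind of "target type must be a functional interface" error seen with `scala.Function1`.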