Apache Spark warts for wartremover.
Add the following to your project/plugins.sbt:
resolvers += Resolver.bintrayIvyRepo("omervk", "wartremover-spark")
libraryDependencies += "com.omervk" %% "wartremover-spark" % "0.0.3"
addSbtPlugin("com.omervk" % "sbt-wartremover-spark" % "0.0.3")
(this is pending a release into the default sbt-plugins repo)
And in your build.sbt:
wartremoverErrors ++= SparkWart.All
// Or alternatively
wartremoverErrors += SparkWart.UnserializableCapture
Here is a list of the warts, all under the com.omervk.wartremover.spark.warts package.

UnserializableCapture
One of the most painful things about Spark is getting an error that a captured value is not serializable, since this only ever surfaces at runtime.
Important Note: This wart is a work in progress that aims to cover as many of these cases as possible. Please open issues (both false positives and false negatives) with detailed reproductions.
val captured = new Unserializable(value = Random.nextInt())

dataset.map { value: Int =>
  value + captured.value // Won't compile: Functions sent to Spark may not close over unserializable values (captured)
}
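A common way to satisfy this check (a sketch; Unserializable and dataset are the hypothetical names from the example above, not part of this library) is to copy the serializable field into a local val before the closure, so the function captures only that value rather than the whole object:

```scala
import scala.util.Random

// Hypothetical unserializable wrapper, mirroring the example above.
class Unserializable(val value: Int)

val captured = new Unserializable(value = Random.nextInt())

// Copy the primitive out of the unserializable object up front;
// the closure below then closes over an Int, which serializes fine.
val capturedValue: Int = captured.value

dataset.map { value: Int =>
  value + capturedValue // OK: only an Int is captured
}
```

The same idea applies to any field of an unserializable object: bind what you actually need to a local val outside the closure, and let Spark ship only that.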