# Spark, Hadoop, and Kryo utilities
Classes that implement the `Registrar` interface can use various shorthands for registering classes with Kryo. Adapted from `RegistrationTest`:
```scala
register(
  cls[A],                  // comes with an AlsoRegister that loops in other classes
  arr[Foo],                // register a class and an Array of that class
  cls[B] → BSerializer(),  // use a custom Serializer
  CDRegistrar              // register all of another Registrar's registrations
)
```

Custom `Serializer`s and `AlsoRegister`s are picked up implicitly if not provided explicitly. `AlsoRegister`s are recursive, making it much easier to account robustly for what is registered and why, and ensuring that needed registrations aren't overlooked.
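As a rough illustration, the shorthands above could desugar into plain (class, optional serializer) pairs before being handed to Kryo. This is a minimal, hypothetical model — `Entry`, `Cls`, `Arr`, and `WithSerializer` are illustrative names, not the library's actual API:

```scala
sealed trait Entry
case class Cls(name: String) extends Entry                                 // register a single class
case class Arr(name: String) extends Entry                                 // class plus its Array type
case class WithSerializer(name: String, serializer: String) extends Entry  // class with a custom serializer

object Registrar {
  // Expand shorthands into the (class, optional serializer) pairs Kryo would see
  def register(entries: Entry*): Seq[(String, Option[String])] =
    entries.flatMap {
      case Cls(n)               => Seq(n -> None)
      case Arr(n)               => Seq(n -> None, s"Array[$n]" -> None)
      case WithSerializer(n, s) => Seq(n -> Some(s))
    }
}
```

Here `Arr` expands to two registrations (the class and its array type), mirroring the `arr[Foo]` shorthand.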
- `Configuration`: serializable Hadoop `Configuration` wrapper
- `Context`: `SparkContext` wrapper that is also a Hadoop `Configuration`, unifying "global configuration access" patterns
- `Conf`: load a `SparkConf` with settings from file(s) specified in the `SPARK_PROPERTIES_FILES` environment variable
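The `SPARK_PROPERTIES_FILES` idea amounts to merging key-values from a comma-separated list of `.properties` files. A hedged sketch of that merge step, using only the JDK (the `loadProps` helper and later-files-win policy are assumptions, not the library's actual implementation):

```scala
import java.nio.file.{Files, Paths}
import java.util.Properties
import scala.jdk.CollectionConverters._

object Conf {
  // Merge settings from a comma-separated list of .properties files;
  // entries from later files overwrite earlier ones
  def loadProps(paths: String): Map[String, String] =
    paths
      .split(",")
      .filter(_.nonEmpty)
      .foldLeft(Map.empty[String, String]) { (acc, path) =>
        val props = new Properties()
        val in = Files.newInputStream(Paths.get(path))
        try props.load(in) finally in.close()
        acc ++ props.asScala
      }
}
```

Real code would then copy the resulting map into a `SparkConf` via `SparkConf.set`.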
- `SparkConfBase`: trait that brokers setting config key-values and creating a `SparkConf`, with many mix-ins for common Spark-configuration groups
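A minimal sketch of the `SparkConfBase` pattern: each mix-in trait contributes one group of related settings, and a final call assembles them. A plain `Map` stands in for `SparkConf` here, and the names `ConfBase`/`KryoConfs` are illustrative:

```scala
trait ConfBase {
  private var settings = Map.empty[String, String]
  // Mix-ins call this to contribute key-value settings
  protected def set(kvs: (String, String)*): Unit = settings ++= kvs
  // Assemble everything contributed so far (a real version would build a SparkConf)
  def makeConf: Map[String, String] = settings
}

// One mix-in per group of related settings
trait KryoConfs extends ConfBase {
  set(
    "spark.serializer"             -> "org.apache.spark.serializer.KryoSerializer",
    "spark.kryo.referenceTracking" -> "false"
  )
}

object ExampleApp extends KryoConfs
```

Stacking more mix-ins (`extends KryoConfs with ShuffleConfs with …`) accumulates their settings in linearization order, so later traits can override earlier ones.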
- `KeyPartitioner` / `Partitioner`: shorthands for common Spark-`Partitioner`-creation patterns:
  - from the first field of tuple-like objects
  - from a partial function
  - from a function
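The three patterns above can be sketched against a minimal stand-in for Spark's `Partitioner` (real code would extend `org.apache.spark.Partitioner`; the `KeyPartitioner` factory names here are assumptions):

```scala
abstract class Partitioner {
  def numPartitions: Int
  def getPartition(key: Any): Int
}

object KeyPartitioner {
  // From a total function: map a key directly to a partition index
  def apply(n: Int)(fn: Any => Int): Partitioner =
    new Partitioner {
      def numPartitions = n
      def getPartition(key: Any) = fn(key)
    }

  // From a partial function: fall back to partition 0 for unmatched keys
  def partial(n: Int)(pf: PartialFunction[Any, Int]): Partitioner =
    apply(n)(key => pf.applyOrElse(key, (_: Any) => 0))

  // From the first field of tuple-like objects: hash-partition on the key
  def byFirst(n: Int): Partitioner =
    apply(n) {
      case (k, _) => math.abs(k.hashCode) % n
      case other  => math.abs(other.hashCode) % n
    }
}
```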
- `Histogram` accumulator
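A histogram accumulator boils down to two operations: adding a value on an executor and merging partial histograms on the driver. A stand-alone sketch of those semantics (a real version would extend `org.apache.spark.util.AccumulatorV2`; this class is illustrative):

```scala
import scala.collection.mutable

class Histogram {
  // value -> occurrence count
  private val counts = mutable.Map.empty[Long, Long].withDefaultValue(0L)
  // Record one occurrence of a value (executor-side)
  def add(v: Long): Unit = counts(v) += 1
  // Fold another partial histogram into this one (driver-side)
  def merge(other: Histogram): Unit =
    other.value.foreach { case (k, n) => counts(k) += n }
  // Immutable snapshot of the current counts
  def value: Map[Long, Long] = counts.toMap
}
```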