SBT plugin for Apache Spark on AWS EMR
- Added EBS volume size settings (`sparkMasterEbsSize`, `sparkCoreEbsSize`) for Master and Core nodes.
- Renamed `sparkSubmitJob` to `sparkSubmit`.
- Added the default value of `sparkAwsRegion` back, but set it to `"changeme"`. This prevents an error when using SBT Configs.
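The EBS settings above would be set in `build.sbt`. The following is a minimal sketch, not the plugin's documented API: the value types and the unit (GB) are assumptions, since the changelog does not state them.

```scala
// build.sbt — sketch only; value types and the GB unit are assumptions
sparkMasterEbsSize := 40 // EBS volume size for the Master node
sparkCoreEbsSize := 80   // EBS volume size for each Core node
```

With these in place, the renamed task is invoked as `sbt sparkSubmit` rather than the old `sparkSubmitJob`.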
- Refined several setting types and provided some shortcuts:
  - Changed `sparkSecurityGroupIds` and `sparkEmrConfigs` from `Option[T]` to `T`.
  - Changed `sparkInstanceBidPrice` to `Option[Double]`.
  - Renamed `sparkEc2KeyName` to `sparkInstanceKeyName`.
  - Added `sparkEmrApplications` and `sparkS3LogUri`.
  - Removed the default values of `sparkAwsRegion` and `sparkS3JarFolder`.
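Put together, the refined settings might look like this in `build.sbt`. This is a hedged sketch rather than the plugin's documented API: the bucket names, key name, and application list are hypothetical, and apart from `sparkInstanceBidPrice` (which the changelog states is `Option[Double]`) the exact setting types are assumptions.

```scala
// build.sbt — hedged sketch; all values are placeholders
sparkAwsRegion := "us-east-1"                 // no longer defaulted, so it must be set
sparkS3JarFolder := "s3://my-bucket/jars"     // hypothetical bucket
sparkInstanceKeyName := "my-ec2-key"          // renamed from sparkEc2KeyName
sparkSecurityGroupIds := Seq("sg-0123456789") // plain value now, no Option wrapper
sparkInstanceBidPrice := Some(0.5)            // Option[Double]: Some(...) for spot instances
sparkEmrApplications := Seq("Spark")          // assumed shape: EMR application names
sparkS3LogUri := "s3://my-bucket/emr-logs"    // hypothetical log location
```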
- Let users modify the Spark configs for each job using `sparkSubmitConfs`, e.g. `sparkSubmitConfs := Map("spark.executor.memory" -> "10G", "spark.executor.instances" -> "2")`.
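For example, a memory-hungry job can raise its executor settings just before submission while other jobs keep the defaults; the map keys are standard Spark configuration properties. The values below come straight from the changelog's own example:

```scala
// build.sbt — per-job Spark configuration overrides
sparkSubmitConfs := Map(
  "spark.executor.memory" -> "10G",
  "spark.executor.instances" -> "2"
)
```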
- Upgraded to SBT 1.0.