- Added `ConnectorInterface` in `com.jcdecaux.setl.internal` that simplifies the use of custom connectors
- Added `DeltaConnector.update`
- Added `DeltaConnector.partition`
- Deprecated `FileConnector.delete()` to avoid ambiguity (use `FileConnector.drop()` instead)
- Added `spark_3.0` support
- Added `StructuredStreamingConnector` (PR #119)
- Added `DeltaConnector` (PR #118)
- Added `ZipArchiver` that can zip files/directories (PR #124)
- Fixed `Setl.hasExternalInput` that always returns false (PR #121)
- `FileConnector` now lists paths correctly for nested directories (#97)
- Added `showDiagram()` method to `Pipeline` that prints the Mermaid code and generates the live editor URL 🎩🐰✨ (#52)
- Added `delete` method in `JDBCConnector` (#82)
- Added `drop` method in `DBConnector` (#83)
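The live-editor URL produced by `showDiagram()` is essentially the Mermaid code packed into the URL fragment. A minimal sketch, assuming (this is an assumption, not SETL's actual code) that the editor accepts a base64-encoded JSON state in its `#/edit/` fragment:

```scala
import java.nio.charset.StandardCharsets
import java.util.Base64

object MermaidLiveUrl {
  // Pack Mermaid code into a base64 URL fragment. The JSON state layout
  // here is a guess; quotes inside the diagram are not escaped in this
  // simplified sketch.
  def liveEditorUrl(mermaidCode: String): String = {
    val state =
      s"""{"code":"${mermaidCode.replace("\n", "\\n")}","mermaid":{"theme":"default"}}"""
    val b64 = Base64.getEncoder.encodeToString(state.getBytes(StandardCharsets.UTF_8))
    s"https://mermaid-js.github.io/mermaid-live-editor/#/edit/$b64"
  }
}
```

Decoding the fragment recovers the diagram source, which is what makes the link shareable without a server.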
Spark properties can now be set either inside a nested `spark` block or directly at the top level of the config object:

```hocon
setl.config {
  spark {
    spark.app.name = "my_app"
    spark.sql.shuffle.partitions = "1000"
  }
}

setl.config_2 {
  spark.app.name = "my_app"
  spark.sql.shuffle.partitions = "1000"
}
```
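A sketch of how a loader could treat the two styles as equivalent (an illustration only, not SETL's actual resolution logic): accept either a nested `spark { ... }` sub-block or flat keys, and return the same property map.

```scala
object SparkConfReader {
  // Illustration only: if the config object has a nested "spark" sub-map,
  // read the Spark properties from it; otherwise treat the top-level
  // scalar entries themselves as the Spark properties.
  def sparkProperties(conf: Map[String, Any]): Map[String, String] =
    conf.get("spark") match {
      case Some(nested: Map[String, Any] @unchecked) =>
        nested.map { case (k, v) => k -> v.toString }
      case _ =>
        conf.collect { case (k, v) if !v.isInstanceOf[Map[_, _]] => k -> v.toString }
    }
}
```

With this reading, `Map("spark" -> Map("spark.app.name" -> "my_app"))` and `Map("spark.app.name" -> "my_app")` yield the same property map.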
- Added `filter(cond: Set[Condition])` for `Dataset` and `DataFrame`
- Added `setUserDefinedSuffixKey` and `getUserDefinedSuffixKey` to `SparkRepository`
- Added the `Compress` annotation:

```scala
case class CompressionDemo(@Compress col1: Seq[Int],
                           @Compress(compressor = classOf[GZIPCompressor]) col2: Seq[String])
```
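The `GZIPCompressor` name suggests the annotated column is stored gzip-compressed. A self-contained round trip with the JDK's GZIP streams (an illustration of the underlying codec, not SETL's `GZIPCompressor` class):

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream}
import java.nio.charset.StandardCharsets
import java.util.zip.{GZIPInputStream, GZIPOutputStream}

object GzipSketch {
  // Compress a string to bytes with the JDK's GZIP output stream.
  def compress(s: String): Array[Byte] = {
    val bos = new ByteArrayOutputStream()
    val gz = new GZIPOutputStream(bos)
    gz.write(s.getBytes(StandardCharsets.UTF_8))
    gz.close() // flushes and writes the GZIP trailer
    bos.toByteArray
  }

  // Decompress the bytes back to the original string.
  def decompress(bytes: Array[Byte]): String = {
    val gz = new GZIPInputStream(new ByteArrayInputStream(bytes))
    try new String(gz.readAllBytes(), StandardCharsets.UTF_8)
    finally gz.close()
  }
}
```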
- Fixed `SparkRepository.findBy(conditions)` when filtering by case class field name instead of column name
- Added `readStandardJSON` and `writeStandardJSON` methods to `JSONConnector` to read/write standard JSON files
- Factories of a `Stage` can now run in parallel; users can turn it on by setting `parallel` to true
- Added `beforeAll` into `ConfigLoader`
- Added `addStage` and `addFactory` overloads that take a class object as input; the instantiation will be handled by the stage
- Added `get[A](cls: Class[_ <: Factory[_]]): A`
- Added the `Delivery` annotation to handle inputs of a `Factory`:

```scala
class Foo {
  @Delivery(producer = classOf[Factory1], optional = true)
  var input1: String = _

  @Delivery(producer = classOf[Factory2])
  var input2: String = _
}
```
- Added `suffix` in `FileConnector` and `SparkRepository`
- Added `partitionBy` in `FileConnector` and `SparkRepository`
- Added `filenamePattern` into the configuration file
- A `Conf` object can now be created from a `Map`: `Conf(Map("a" -> "A"))`
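A minimal sketch of what a Map-backed `Conf`-style wrapper looks like (illustration only; SETL's real `Conf` presumably offers more, such as typed getters):

```scala
// Illustration only, not SETL's actual Conf: a thin wrapper over an
// immutable Map[String, String] built directly from a Map literal.
case class Conf(settings: Map[String, String]) {
  def get(key: String): Option[String] = settings.get(key)
  def getOrElse(key: String, default: String): String =
    settings.getOrElse(key, default)
}
```

For example, `Conf(Map("a" -> "A")).get("a")` returns `Some("A")`.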
- Added `DispatchManager` class; it dispatches its deliverable objects to the setters (denoted by `@Delivery`) of a factory
- Added `Deliverable` class, which contains a payload to be delivered
- Added `PipelineInspector` to describe a pipeline
- Added `FileConnector` and `DBConnector`
- Added `EnrichedConnector`
- Added `Conf` into `SparkRepositoryBuilder` and changed all the set methods of `SparkRepositoryBuilder` to use the conf object
- Changed the package `com.jcdecaux.setl.annotations` to `com.jcdecaux.setl.annotation`
- Added the annotation `ColumnName`, which can be used to replace the current column name with an alias in the data storage
- Added the annotation `CompoundKey`, which can be used to define a compound key for databases that only allow one partition key
- Added `Condition`
- Added `Connector`
- Added `ConnectorBuilder` to directly build a connector from a Typesafe `Config` object
- `SparkRepositoryBuilder`
- `AppEnv`
- Added `SparkRepositoryBuilder`, which allows creating a `SparkRepository` for a given class without writing a dedicated `Repository` class
- Extended `SparkRepository` by creating `ExcelConnector`
- Added the `Logging` trait
- Fixed a `Factory` class covariance issue (0764d10d616c3171d9bfd58acfffafbd8b9dda15)
- Changed `.gitlab-ci.yml` to speed up CI
- `.gitlab-ci.yml`
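For the `CompoundKey` annotation mentioned above, the underlying idea can be sketched as collapsing several column values into a single partition-key string (illustration only; the separator and helper below are assumptions, not SETL's implementation):

```scala
object CompoundKeySketch {
  // Illustration only: join several column values into one key, since a
  // compound key must collapse to a single value for stores that accept
  // only one partition key. The "-" separator is an assumption.
  def compoundKey(columns: Seq[Any], sep: String = "-"): String =
    columns.map(_.toString).mkString(sep)
}
```

For example, `CompoundKeySketch.compoundKey(Seq("FR", "paris", 2020))` produces `"FR-paris-2020"`.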