Spark Streaming: large deserialization time with Random Forest

Backend · Unresolved · 0 replies · 1325 views
情书的邮戳 2021-02-02 03:26

I have a Spark program that processes a stream of data from Kafka with an ML model (an ensemble of two random forests). My overall RF is updated every few batches, adding (in one forest)
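The setup described (an ensemble of two forests, one of which receives new trees every few batches) could be sketched roughly as below. This is a minimal toy illustration, not the asker's actual code: the class name, the threshold-stump "trees", and the update schedule are all hypothetical stand-ins, since the question gives no model details.

```python
# Hypothetical sketch of the described pattern: an ensemble of two
# forests, where every N batches new trees are added to one forest.
# "Trees" here are toy threshold stumps, not real decision trees.

class TwoForestEnsemble:
    def __init__(self, update_every=3):
        self.stable = []    # forest kept fixed between updates
        self.growing = []   # forest that receives new trees
        self.update_every = update_every
        self.batches_seen = 0

    def _train_stump(self, batch):
        # Toy "tree": predict 1 if the feature exceeds the batch mean.
        mean = sum(x for x, _ in batch) / len(batch)
        return lambda x: 1 if x >= mean else 0

    def process_batch(self, batch):
        # batch: list of (feature, label) pairs from the stream.
        self.batches_seen += 1
        if self.batches_seen % self.update_every == 0:
            # Periodic update: add one new tree to the growing forest.
            self.growing.append(self._train_stump(batch))

    def predict(self, x):
        # Majority vote over all trees in both forests.
        votes = [t(x) for t in self.stable + self.growing]
        if not votes:
            return 0
        return 1 if sum(votes) * 2 >= len(votes) else 0
```

In a real Spark Streaming job the growing model object would be serialized and shipped to executors with each batch, which is where per-batch deserialization cost can become noticeable as the forest grows.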
