Description
Hi,
I'd like to consume/read data from Elasticsearch (ES) via Spark Structured Streaming.
The only solution I know of today is a workaround that chains Logstash and Kafka
(ES -> Logstash -> Kafka -> Spark Structured Streaming).
It would be very helpful to have a direct solution, so that one could simply invoke `readStream` against the ES backend.
Technically, the solution should make it possible to pass an ES query as a parameter when connecting, so that a filter and/or a timeframe can be applied on each micro-batch interval.
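To make the request concrete, a hypothetical invocation might look like the sketch below. No such streaming source exists yet, so the format name, option keys, host, index, and query are all assumptions, loosely modeled on the existing batch connector:

```python
# Hypothetical sketch of the requested API -- the streaming "es" source,
# the option names, and the query are assumptions, not an existing API.
stream = (spark.readStream
    .format("es")                          # assumed streaming source name
    .option("es.nodes", "localhost:9200")  # placeholder host
    # filter and/or timeframe evaluated on each micro-batch interval:
    .option("es.query",
            '{"query": {"range": {"@timestamp": {"gte": "now-5m"}}}}')
    .load("logs-index"))                   # placeholder index
```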
I don't think it should be that hard to implement, as the Spark batch variant of reading data from ES already exists, and much of that code could presumably be reused for the Structured Streaming version.
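For reference, the batch variant mentioned above (via the elasticsearch-hadoop connector) already supports passing a query; the host and index below are placeholders, and running this requires the connector jar on the classpath:

```python
# Existing batch read via the elasticsearch-hadoop connector
# (host, index, and query are placeholders).
df = (spark.read
    .format("org.elasticsearch.spark.sql")
    .option("es.nodes", "localhost:9200")
    .option("es.query", '{"query": {"match_all": {}}}')
    .load("logs-index"))
```

A streaming source exposing the same `es.query` option would cover the filter/timeframe requirement described above.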