
I am creating a streaming analytics application using Spark, Flink, and Kafka. Each analytics function will be implemented as a microservice so that it can be reused in other projects later.

I can run my Spark/Flink jobs perfectly from a simple Scala application, submitting them to the Spark and Flink clusters respectively. But I need to start/run such a job when a REST POST startJob() request is made to my web service.

How can I integrate my Spark and Flink data processing functionality into a service-oriented web application?
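One way to structure the "start a job on POST /startJob" requirement is to keep the web layer trivially thin and delegate to whatever actually submits the job. Below is a minimal sketch using only the JDK's built-in HTTP server; the class name `JobService` and the `submitJob` callback are placeholders for the cluster-specific submitter (e.g. `SparkLauncher` for Spark, or Flink's REST API):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.function.Supplier;

// Hypothetical sketch: a thin REST front that exposes POST /startJob and
// delegates to whatever submits the job to the cluster.
public class JobService {

    public static HttpServer start(int port, Supplier<String> submitJob) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/startJob", exchange -> {
            String reply = "POST".equals(exchange.getRequestMethod())
                    ? submitJob.get()           // delegate to the real submitter
                    : "only POST is supported";
            byte[] bytes = reply.getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, bytes.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(bytes);
            }
        });
        server.start();
        return server;
    }
}
```

In a real service the submit callback would return the job id, so later requests can poll or cancel that job.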

So far I have tried Lagom microservices, but I found many issues; you can check:

  1. Best approach to ingest Streaming Data in Lagom Microservice
  2. java.io.NotSerializableException using Apache Flink with Lagom

I think I am not taking the right direction for a stream processing microservice application. I am looking for the right way to implement these analytics behind a REST service.


2 Answers


Flink has a REST API you can use to submit and control jobs -- it's used by the Flink Web UI. See the docs here. See also this previous question.
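As a rough illustration of driving that API from a service, here is a sketch using plain HTTP against the documented `POST /jars/:jarid/run` endpoint, not Flink's client library. The host, port, and jar id are placeholders; the jar id is the filename that `POST /jars/upload` returns after you upload your job jar.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical sketch: start a previously uploaded job jar through Flink's
// REST API (the same API the Flink Web UI uses).
public class FlinkRestClient {

    // Build the "run jar" endpoint URL: POST /jars/:jarid/run
    public static URI runJarUri(String host, int port, String jarId) {
        return URI.create("http://" + host + ":" + port + "/jars/" + jarId + "/run");
    }

    // Fire the POST that actually starts the job; the entry class (and
    // optionally programArgs, parallelism) goes in the JSON body.
    public static String startJob(String host, int port, String jarId, String entryClass)
            throws Exception {
        String body = "{\"entryClass\":\"" + entryClass + "\"}";
        HttpRequest request = HttpRequest.newBuilder(runJarUri(host, port, jarId))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        return HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString())
                .body(); // JSON containing the id of the newly started job
    }
}
```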


I think the REST API only provides details about running jobs. Is there any Flink API that would let, say, a Spring Boot REST endpoint connect to Kafka streaming data and return that Kafka data?
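The REST API does more than report running details: it can also submit, cancel, and inspect jobs. Returning raw Kafka data straight through a REST endpoint is usually a poor fit for an unbounded stream; a more common pattern is to have the Flink job write results to a sink (e.g. another Kafka topic or a data store) that the endpoint serves, while the service relays job details via `GET /jobs/:jobid`. A sketch of the latter, where the class name and `localhost:8081` are assumptions for a local JobManager:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical sketch: relay a job's details from Flink's REST API
// (GET /jobs/:jobid) through a service endpoint.
public class FlinkJobStatus {

    // Build the job-details endpoint URL.
    public static URI statusUri(String host, int port, String jobId) {
        return URI.create("http://" + host + ":" + port + "/jobs/" + jobId);
    }

    // Fetch the job's details as returned by the JobManager.
    public static String fetchStatus(String host, int port, String jobId) throws Exception {
        HttpRequest request = HttpRequest.newBuilder(statusUri(host, port, jobId)).GET().build();
        return HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString())
                .body(); // JSON with the job's state, vertices, timestamps, ...
    }
}
```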