2 votes

New to the Kafka/Flink/Scala/sbt combo and trying to set up the following:

  • A multi-topic Kafka queue
  • A Flink streaming job using a Scala jar
  • A Scala jar that reads data from one topic, processes it, and then pushes it to another topic

Up until now:

  • Able to properly set up Kafka and Flink.
  • Able to read the Kafka queue using the Kafka.jar example that ships with the Flink binary.

Able to create a wordcount jar (thanks to ipoteka).
Now trying to create a streaming word-count jar before attempting the actual Kafka/Spark streaming example, but running into sbt issues. Any idea what I am overlooking?
Also let me know if I have any unnecessary declarations.
I would also appreciate it if someone could share a simple program that reads from and writes to a Kafka queue.

Project setup -

|- project/plugins.sbt
|- build.sbt
|- src/main/scala/WordCount.scala

build.sbt

name := "Kakfa-Flink Project"

version := "1.0"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

// Updated: correction pointed out by ipoteka
libraryDependencies += "org.apache.kafka" % "kafka_2.10" % "0.10.0.0"

libraryDependencies += "org.apache.flink" %% "flink-scala" % "1.0.0"

libraryDependencies += "org.apache.flink" %% "flink-clients" % "1.0.0"

libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" % "1.0.0"

// for jar building
mainClass in compile := Some("StreamWordCount")

plugins.sbt

// for creating a fat jar
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")

WordCount.scala

package prog

import org.apache.flink.api.scala._
import org.apache.flink.streaming.api.scala.DataStream
import org.apache.flink.streaming.api.windowing.time.Time

object WordCount {

  type WordCount = (String, Int)

  def main(lines: DataStream[String], stopWords: Set[String], window: Time): DataStream[WordCount] = {
    lines
      .flatMap(line => line.split(" "))
      .filter(word => !word.isEmpty)
      .map(word => word.toLowerCase)
      .filter(word => !stopWords.contains(word))
      .map(word => (word, 1))
      .keyBy(0)
      .timeWindow(window)
      .sum(1)
  }

}

StreamWordCount.scala

package prog

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

import org.apache.flink.api.scala._
import org.apache.flink.streaming.api.scala.DataStream
import org.apache.flink.streaming.api.windowing.time.Time



object Main {
  def main(args: Array[String]) {

    type WordCount = (String, Int)

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val properties = new Properties()
    properties.setProperty("bootstrap.servers", "localhost:9092")
    properties.setProperty("zookeeper.connect", "localhost:2181")
    properties.setProperty("group.id", "test")
    val stream = env
      .addSource(new FlinkKafkaConsumer082[String]("topic", new SimpleStringSchema(), properties))
      .flatMap(line => line.split(" "))
      .filter(word => !word.isEmpty)
      .map(word => word.toLowerCase)
      .filter(word => !stopWords.contains(word))
      .map(word => (word, 1))
      .keyBy(0)
      .timeWindow(window)
      .sum(1)
      .print

    env.execute("Flink Kafka Example")
  }
}

Error while creating the jar (UPDATED)

[vagrant@streaming ex]$ /opt/sbt/bin/sbt  package
[error] /home/vagrant/ex/src/main/scala/StreamWordCount.scala:4: object connectors is not a member of package org.apache.flink.streaming
[error] import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082
[error]                                   ^
[error] /home/vagrant/ex/src/main/scala/StreamWordCount.scala:18: not found: type Properties
[error]     val properties = new Properties()
[error]                          ^
[error] /home/vagrant/ex/src/main/scala/StreamWordCount.scala:23: not found: type FlinkKafkaConsumer082
[error]       .addSource(new FlinkKafkaConsumer082[String]("topic", new SimpleStringSchema(), properties))
[error]                      ^
[error] three errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 31 s, completed Jul 3, 2016 9:02:18 PM

1 Answer

2 votes

Where did you get these versions? I don't see a Kafka 1.0.0 release anywhere. Look it up on Maven (select the sbt tab for the dependency line):

libraryDependencies += "org.apache.kafka" % "kafka_2.10" % "0.10.0.0"
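The compile errors themselves point at two missing pieces, independent of the version question. Here is a sketch of the fixes, assuming Flink 1.0.0 against a Kafka 0.8.x broker (adjust the connector suffix to your broker version):

// build.sbt -- the Kafka connector ships as its own artifact; flink-scala and
// flink-streaming-scala do not pull it in, which is why the compiler reports
// "object connectors is not a member of package org.apache.flink.streaming".
libraryDependencies += "org.apache.flink" %% "flink-connector-kafka-0.8" % "1.0.0"

// With sbt-assembly, build via `sbt assembly` and point it at the fully
// qualified main class (the object in StreamWordCount.scala is prog.Main):
mainClass in assembly := Some("prog.Main")

And in StreamWordCount.scala (Flink 1.0.x renamed the consumer classes, so FlinkKafkaConsumer082 from the 0.9.x/0.10.x line became FlinkKafkaConsumer08):

// java.util.Properties is not in scope automatically -- this import fixes
// "not found: type Properties".
import java.util.Properties
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08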

I would also recommend checking all the other versions; Spark's current release is 1.6.2, for example.
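Since the question also asks for a simple program that reads from and writes to Kafka, here is a minimal sketch, assuming Flink 1.0.x plus the flink-connector-kafka-0.8 dependency above; the topic names input-topic and output-topic and the object name KafkaPipe are placeholders:

package prog

import java.util.Properties

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaConsumer08, FlinkKafkaProducer08}
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

// Minimal Kafka -> transform -> Kafka pipeline.
object KafkaPipe {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // The 0.8 consumer needs both the ZooKeeper and the broker addresses.
    val props = new Properties()
    props.setProperty("bootstrap.servers", "localhost:9092")
    props.setProperty("zookeeper.connect", "localhost:2181")
    props.setProperty("group.id", "test")

    // Read strings from one topic ...
    val lines: DataStream[String] = env
      .addSource(new FlinkKafkaConsumer08[String]("input-topic", new SimpleStringSchema(), props))

    // ... apply a trivial transformation ...
    val upper = lines.map(_.toUpperCase)

    // ... and write the result to another topic.
    upper.addSink(new FlinkKafkaProducer08[String]("localhost:9092", "output-topic", new SimpleStringSchema()))

    env.execute("Kafka read/transform/write example")
  }
}

Note that spark-core is not needed at all for a Flink job; dropping it also avoids pulling two different streaming stacks into the fat jar.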