0
votes

I have two questions:

1) I want to use Kafka with a Google Cloud Dataflow pipeline program. In my pipeline program I want to read data from Kafka. Is that possible?

2) I created an instance with BigQuery enabled. Now I want to enable Pub/Sub as well. How can I do that?


3 Answers

3
votes

(1) As mentioned by Raghu, support for writing to/reading from Kafka was added to Apache Beam in mid-2016 with the KafkaIO package. You can check the package's documentation[1] to see how to use it.

(2) I'm not quite sure what you mean. Can you provide more details?

[1] https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/io/kafka/KafkaIO.html
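A minimal sketch of what reading from Kafka with KafkaIO looks like in a Beam pipeline. The broker address (`broker-1:9092`) and topic name (`my-topic`) are placeholders, and the snippet assumes the Beam SDK and KafkaIO dependencies are on the classpath:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KafkaReadExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Read key/value records from a Kafka topic as a PCollection of KV pairs.
    PCollection<KV<Long, String>> records =
        p.apply(KafkaIO.<Long, String>read()
            .withBootstrapServers("broker-1:9092")   // placeholder broker address
            .withTopic("my-topic")                   // placeholder topic name
            .withKeyDeserializer(LongDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withoutMetadata());                     // drop Kafka metadata, keep KV pairs

    p.run().waitUntilFinish();
  }
}
```

Running this against a live broker requires a Beam runner (e.g. the Dataflow runner) configured via the pipeline options passed in `args`.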

2
votes

Kafka support was added to Dataflow (and Apache Beam) in mid-2016. You can read from and write to Kafka in streaming pipelines. See the JavaDoc for KafkaIO in Apache Beam.

1
vote

(2) As of April 27, 2015, you can enable the Cloud Pub/Sub API as follows:

  1. Go to your project page on the Developer Console
  2. Click APIs & auth -> APIs
  3. Click More within Google Cloud APIs
  4. Click Cloud Pub/Sub API
  5. Click Enable API
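The console steps above reflect the 2015 Developers Console UI. On current setups, the same API can typically be enabled from the command line with the gcloud CLI (`my-project` is a placeholder project ID, and the command assumes an authenticated gcloud install):

```shell
# Enable the Cloud Pub/Sub API on a project (my-project is a placeholder).
gcloud services enable pubsub.googleapis.com --project=my-project
```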