1 vote

I am using Dataflow at work to write data into Bigtable. Currently, I have a task to read rows from Bigtable.
However, whenever I try to read rows from Bigtable using bigtable-hbase-dataflow, it fails with the following error.

 Error:   (3218070e4dd208d3): java.lang.IllegalArgumentException: b <= a
at org.apache.hadoop.hbase.util.Bytes.iterateOnSplits(Bytes.java:1720)
at org.apache.hadoop.hbase.util.Bytes.split(Bytes.java:1683)
at org.apache.hadoop.hbase.util.Bytes.split(Bytes.java:1664)
at com.google.cloud.bigtable.dataflow.CloudBigtableIO$AbstractSource.split(CloudBigtableIO.java:512)
at com.google.cloud.bigtable.dataflow.CloudBigtableIO$AbstractSource.getSplits(CloudBigtableIO.java:358)
at com.google.cloud.bigtable.dataflow.CloudBigtableIO$Source.splitIntoBundles(CloudBigtableIO.java:593)
at com.google.cloud.dataflow.sdk.runners.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:413)
at com.google.cloud.dataflow.sdk.runners.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:171)
at com.google.cloud.dataflow.sdk.runners.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:149)
at com.google.cloud.dataflow.sdk.runners.worker.SourceOperationExecutor.execute(SourceOperationExecutor.java:58)
at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.executeWork(DataflowWorker.java:288)
at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.doWork(DataflowWorker.java:221)
at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:173)
at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.doWork(DataflowWorkerHarness.java:193)
at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.call(DataflowWorkerHarness.java:173)
at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.call(DataflowWorkerHarness.java:160)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

I am currently using 'com.google.cloud.dataflow:google-cloud-dataflow-java-sdk-all:1.6.0' and 'com.google.cloud.bigtable:bigtable-hbase-dataflow:0.9.0'.

Here's my code.

// Configure the Bigtable read: project, instance, and table to scan.
CloudBigtableScanConfiguration config = new CloudBigtableScanConfiguration.Builder()
    .withProjectId("project-id")
    .withInstanceId("instance-id")
    .withTableId("table")
    .build();

// Read every row from the table and hand each Result to the Test DoFn.
pipeline.apply(Read.<Result>from(CloudBigtableIO.read(config)))
    .apply(ParDo.of(new Test()));

FYI, I just read from Bigtable and count rows using an aggregator in the Test DoFn.

static class Test extends DoFn<Result, Result> {
    private static final long serialVersionUID = 0L;

    // Counts rows read from Bigtable; the value shows up in the Dataflow monitoring UI.
    private final Aggregator<Long, Long> rowCount = createAggregator("row_count", new Sum.SumLongFn());

    @Override
    public void processElement(ProcessContext c) {
        rowCount.addValue(1L);
        c.output(c.element());
    }
}

I just followed the tutorial in the Dataflow documentation, but it fails. Can anyone help me out?

Just checking the basics: in your actual code, you replaced project-id, instance-id, and table with the real values, yes? – Kenn Knowles
Yes, of course I did :) – Kyuntae Ethan Kim
This looks like a bug in the Cloud Bigtable client. I created a GitHub issue to track this: github.com/GoogleCloudPlatform/cloud-bigtable-client/issues/912 – Solomon Duskis
Thank you @Solomon! I will keep an eye on it. – Kyuntae Ethan Kim

1 Answer

1 vote

The root cause was a dependency issue: our build file omitted this dependency:

compile 'io.netty:netty-tcnative-boringssl-static:1.1.33.Fork22'

Adding it today resolved all the issues. I double-checked that the problem reappears whenever the dependency is removed from the build file.
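For reference, a Gradle dependencies block with the fix in place might look like this. This is only a sketch: the Dataflow and Bigtable versions are taken from the question above, and any other dependencies your project needs are omitted.

```groovy
dependencies {
    compile 'com.google.cloud.dataflow:google-cloud-dataflow-java-sdk-all:1.6.0'
    compile 'com.google.cloud.bigtable:bigtable-hbase-dataflow:0.9.0'

    // The missing piece: netty-tcnative-boringssl-static provides the native
    // TLS transport that the Bigtable client's gRPC channel relies on.
    compile 'io.netty:netty-tcnative-boringssl-static:1.1.33.Fork22'
}
```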

From https://github.com/GoogleCloudPlatform/cloud-bigtable-client/issues/912#issuecomment-249999380.