
How can we update Cassandra Counter value using Spark SQL?
I tried a Cassandra CQL query that updates a counter value using DBeaver, and it worked. However, when I tried the same thing through Spark SQL, it failed with an error saying the UPDATE statement is not expected.
Also, INSERT OVERWRITE doesn't seem to work here (I think), because I need to increment the counter value: counter_column = counter_column + 1
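
For reference, the counter update that works directly in CQL (via DBeaver) looks something like this; the table and column names are placeholders:

```sql
-- Plain CQL counter increment (works in cqlsh/DBeaver, but not through Spark SQL):
UPDATE some_table
SET counter_column = counter_column + 1
WHERE key = 'key0';
```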


1 Answer


I just found a way to do this: when you insert a value into a table with a counter column through Spark, Cassandra adds that value to the current counter value rather than overwriting it. So executing the following SQL:

INSERT INTO some_table VALUES ('key0', 1)
INSERT INTO some_table VALUES ('key0', 2)

will yield the following table state:

key0, 3
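
The accumulation behavior can be illustrated with a toy model (plain Python, not real Cassandra): each insert adds its value to whatever is already stored under the key, which is why two inserts of 1 and 2 leave the counter at 3.

```python
def insert_counter(table, key, value):
    # Counter-column semantics (toy model): the inserted value is ADDED to
    # any existing value for the key, not written over it.
    table[key] = table.get(key, 0) + value

some_table = {}
insert_counter(some_table, 'key0', 1)  # counter starts at 0 -> 1
insert_counter(some_table, 'key0', 2)  # 1 + 2 -> 3
print(some_table['key0'])  # prints 3
```

This mirrors the two INSERT statements above: the second insert does not replace the value 1 with 2, it accumulates into 3.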