I am saving my predicted results to a Cassandra database with the Spark Cassandra Connector, using the following code:
CassandraJavaUtil.javaFunctions(sensorDataRDD)
        .writerBuilder(modelParamter.keyspace, "sensor_data_2",
                CassandraJavaUtil.mapToRow(SensorData2Double.class))
        .saveToCassandra();
The data is keyed by timestamps at second granularity, while the predicted results are written back at hourly granularity. Because of that, I need to delete all previous records first. The delete should target a specific column in the Cassandra table, identified by its unique key.
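To make that concrete, here is a minimal sketch of the delete I have in mind, issued through the connector's own session before the save; the sensor_id partition key, the sensorId value, and the helper name are placeholders, not my actual schema:

import com.datastax.driver.core.Session;
import com.datastax.spark.connector.cql.CassandraConnector;
import org.apache.spark.api.java.JavaSparkContext;

// Hypothetical helper: remove all previously written rows for one sensor
// before re-inserting. Assumes sensor_id is the partition key of sensor_data_2.
void deletePreviousRecords(JavaSparkContext sc, String keyspace, String sensorId) {
    CassandraConnector connector = CassandraConnector.apply(sc.getConf());
    try (Session session = connector.openSession()) {
        // Deleting by partition key alone drops the whole partition
        // with a single partition-level tombstone.
        session.execute(
                "DELETE FROM " + keyspace + ".sensor_data_2 WHERE sensor_id = ?",
                sensorId);
    }
}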
I am not sure how to delete all previous records in a way that guarantees that, when I insert new records with the Java code above, they are not subsequently removed by the Cassandra delete query.
Is there any atomicity guarantee on Cassandra columns when I delete or insert rows (by primary key)?