
How does Spark SQL read the TTL of Apache Cassandra data with the DataFrame API? I can't find any example. Can I get one?

My previous question was about the RDD API: Spark: get TTL column from Cassandra

But now the question is about the DataFrame API.

1 Answer


The Spark Cassandra Connector doesn't support reading TTL and WriteTime in the DataFrame API. You can track JIRA SPARKC-528 for progress.

As a workaround, you can read the TTL and/or WriteTime using the RDD API (ideally mapping rows to a case class), and then convert the RDD into a DataFrame.
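A minimal sketch of that workaround, assuming a hypothetical keyspace `ks` with a table `kv` holding columns `id` and `value` (the keyspace, table, and column names are illustrative, not from your schema):

```scala
import com.datastax.spark.connector._   // brings in cassandraTable and the .ttl column selector
import org.apache.spark.sql.SparkSession

// Case class matching the selected columns; ttl is Option[Int]
// because rows written without a TTL return null.
case class KV(id: Int, value: String, ttl: Option[Int])

val spark = SparkSession.builder().getOrCreate()
import spark.implicits._

// Read via the RDD API, selecting the TTL of the "value" column
// alongside the regular columns, then convert to a DataFrame.
val df = spark.sparkContext
  .cassandraTable[KV]("ks", "kv")
  .select("id", "value", "value".ttl as "ttl")
  .toDF()
```

`"value".ttl` is the connector's RDD-level selector for a column's TTL; `"value".writeTime` works the same way for the write timestamp.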

Update, September 2020: support for TTL & WriteTime in the DataFrame API was released as part of SCC 2.5.0.
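With SCC 2.5.0+, the TTL can be exposed directly in a DataFrame read via a per-column option. A sketch, again using the hypothetical `ks.kv` table (check the SCC 2.5.0 documentation for the exact option spelling on your version):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()

// ttl.<columnName> / writetime.<columnName> add extra columns
// carrying the TTL and write timestamp of that Cassandra column.
val df = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "ks", "table" -> "kv"))
  .option("ttl.value", "value_ttl")
  .option("writetime.value", "value_writetime")
  .load()
```

SCC 2.5.0 also ships `CassandraSparkExtensions` (set via `spark.sql.extensions`), which lets you call `ttl(value)` and `writetime(value)` directly in Spark SQL queries.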