
There is an IgniteRDD that can reflect changes to the underlying cache, which is very nice in some cases because it overcomes the immutability of Spark's RDDs.
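
To make that concrete, here is a minimal sketch of the shared, writable behaviour; the cache name, cache configuration and application name are made up for illustration, and it assumes the ignite-spark module is on the classpath:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.ignite.spark.IgniteContext
    import org.apache.ignite.configuration.{CacheConfiguration, IgniteConfiguration}

    val sc = new SparkContext(new SparkConf().setAppName("ignite-shared-rdd"))

    // IgniteContext wraps the SparkContext and brings up Ignite on the executors.
    val ic = new IgniteContext(sc, () => new IgniteConfiguration())

    // Illustrative cache config; the indexed types are needed later for SQL queries.
    val cacheCfg = new CacheConfiguration[Integer, Integer]("sharedNumbers")
      .setIndexedTypes(classOf[Integer], classOf[Integer])
    val sharedRDD = ic.fromCache(cacheCfg)

    // Writes go into the underlying Ignite cache...
    sharedRDD.savePairs(
      sc.parallelize(1 to 1000, 8).map(i => (Integer.valueOf(i), Integer.valueOf(i))))

    // ...and are visible afterwards through the same IgniteRDD handle,
    // unlike a plain RDD, which is an immutable snapshot.
    println(sharedRDD.count())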

There is also an IgniteRDD.sql method that runs ANSI SQL (not Spark SQL) and returns a DataFrame, which is a Spark SQL concept. Once I have this DataFrame, can I use it as a normal DataFrame, without having to think about the fact that it comes from the Ignite world? That is, can I register it as a temp table and then do a distributed join with another DataFrame? When such SQL, e.g. a distributed join, is executed, does Ignite use the Spark SQL engine or the Ignite engine to run it?
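
Continuing the sketch above, this is the flow being asked about (the second DataFrame, its columns and the join condition are invented purely for illustration):

    import org.apache.spark.sql.SQLContext

    val sqlContext = new SQLContext(sc)

    // The SQL string here is parsed and executed by Ignite over the cache
    // (it is not Spark SQL); the result comes back wrapped as a Spark DataFrame.
    val numbersDf = sharedRDD.sql("select _key, _val from Integer where _val > ?", 500)

    // From here on it can be handled like any other DataFrame: register it
    // and join it with a DataFrame from a different source via Spark SQL.
    numbersDf.registerTempTable("ignite_numbers")

    val labelsDf = sqlContext
      .createDataFrame(Seq((600, "six hundred"), (700, "seven hundred")))
      .toDF("value", "label")
    labelsDf.registerTempTable("labels")

    val joined = sqlContext.sql(
      "select n._val, l.label from ignite_numbers n join labels l on n._val = l.value")
    joined.show()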


1 Answer


You can use the DataFrame API after you have executed a query, but in this case it will not be distributed, i.e. it will work with the local result set that has already been fetched to the driver.
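
In terms of the sketch in the question, one hedged way to see this is to print the plan Spark builds; the exact output depends on the Ignite and Spark versions, the point is only that the join over "ignite_numbers" is planned and executed by Spark over rows Ignite has already returned, not pushed back down to the Ignite cluster:

    // Inspect how Spark plans the join from the earlier sketch.
    println(joined.queryExecution.executedPlan)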

Full DataFrame support in Ignite will be available next year.