
I'm using spark-cassandra-connector_2.11 at version 2.3.0, running the latest Spark 2.3.0, and trying to read data from Cassandra 3.0.11.1485 (DSE 5.0.5).

Example read that works without a problem:

 JavaRDD<Customer> result = javaFunctions(sc).cassandraTable(MyKeyspaceName, "customers", mapRowTo(Customer.class));

Another read that works correctly, when done from a unit test (single thread, single read):

cassandraConnector.withSessionDo(new AbstractFunction1<Session, Void>() {
    @Override
    public Void apply(Session session) {
        // Read something from Cassandra via the session - works fine here as well.
        return null; // AbstractFunction1<Session, Void> still has to return a value
    }
});

Example read (mapPartitions + withSessionDo), the problematic code:

CassandraConnector cassandraConnector = CassandraConnector.apply(sc.getConf());

JavaRDD<CustomerEx> result = SomeSparkRDD.mapPartitions((FlatMapFunction<Iterator<Customer>, CustomerEx>) customerIterator ->
        cassandraConnector.withSessionDo(new AbstractFunction1<Session, Iterator<CustomerEx>>() {
            @Override
            public Iterator<CustomerEx> apply(Session session) {
                return asStream(customerIterator, false)
                        .map(customer -> fetchDataViaSession(customer, session))
                        .filter(x -> x != null)
                        .iterator();
            }
        }));


public static <T> Stream<T> asStream(Iterator<T> sourceIterator, boolean parallel) {
    Iterable<T> iterable = () -> sourceIterator;
    return StreamSupport.stream(iterable.spliterator(), parallel);
}

Some iterations of map(customer -> fetchDataViaSession(customer, session)) work, but the majority fail with NoHostAvailableException.
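One plausible explanation for that pattern (an assumption, not verified anywhere in this question): the stream built by asStream is lazy, so apply returns its Iterator before a single row has been fetched, and withSessionDo releases the session as soon as apply returns. Every later call to fetchDataViaSession then runs against a session that may already have been closed. A minimal sketch of an eager variant, reusing Customer, CustomerEx, fetchDataViaSession, and asStream from above (the variable name enriched is illustrative; needs java.util.List, java.util.Objects, and java.util.stream.Collectors imported):

JavaRDD<CustomerEx> enriched = SomeSparkRDD.mapPartitions((FlatMapFunction<Iterator<Customer>, CustomerEx>) customerIterator ->
        cassandraConnector.withSessionDo(new AbstractFunction1<Session, Iterator<CustomerEx>>() {
            @Override
            public Iterator<CustomerEx> apply(Session session) {
                // Collect into a List so every Cassandra call happens while the
                // session is still checked out; only then hand the iterator to Spark.
                List<CustomerEx> materialized = asStream(customerIterator, false)
                        .map(customer -> fetchDataViaSession(customer, session))
                        .filter(Objects::nonNull)
                        .collect(Collectors.toList());
                return materialized.iterator();
            }
        }));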

I tried various settings with no success (a sketch of how they are set follows the list):

spark.cassandra.connection.connections_per_executor_max
spark.cassandra.connection.keep_alive_ms
spark.cassandra.input.fetch.size_in_rows
spark.cassandra.input.split.size_in_mb
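
For context, a minimal sketch of how such settings are applied to the SparkConf; the property names are the ones listed above, but the values here are placeholders rather than the exact values tried:

SparkConf conf = new SparkConf()
        // All values below are illustrative only.
        .set("spark.cassandra.connection.connections_per_executor_max", "5")
        .set("spark.cassandra.connection.keep_alive_ms", "60000")
        .set("spark.cassandra.input.fetch.size_in_rows", "1000")
        .set("spark.cassandra.input.split.size_in_mb", "64");
JavaSparkContext sc = new JavaSparkContext(conf);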

I also tried reducing the number of partitions of the RDD on which I do mapPartitions + withSessionDo (one-line sketch below).
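
For reference, a one-liner for that (the variable name and the target count of 8 are arbitrary):

JavaRDD<Customer> fewerPartitions = SomeSparkRDD.coalesce(8); // 8 is just an example count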

2 Answers


Check whether your Cassandra cluster has SSL enabled. If it does, I've seen this same error when the correct certificate isn't configured.


Looks like this solved it:

.set("spark.cassandra.connection.keep_alive_ms", "1200000")