Is it possible to store big values (more than 65535 bytes) in Cassandra collection types?
The DataStax documentation on using collections (http://www.datastax.com/documentation/cql/3.1/cql/cql_using/use_collections_c.html) states: "The maximum size of an item in a collection is 64K."
On the other hand, the CQL limits reference (http://www.datastax.com/documentation/cql/3.1/cql/cql_reference/refLimits.html) says: "Collection item, value of: 2GB (Cassandra 2.1 v3 protocol), 64K (Cassandra 2.0.x and earlier)"
Since my Cassandra cluster runs version 2.1, I expected it to be possible to insert a big value, but I got this error:
<stdin>:12:code=2200 [Invalid query] message="Map value is too long. Map values are limited to 65535 bytes but 100000 bytes value provided"
My CQL shell version is:
[cqlsh 5.0.1 | Cassandra 2.1.2 | CQL spec 3.2.0 | Native protocol v3]
The sample script that caused the error above:
DROP KEYSPACE example;
CREATE KEYSPACE example WITH replication = {'class':'SimpleStrategy', 'replication_factor':1};
USE example;
CREATE TABLE IF NOT EXISTS big_map_values (
    person_id int,
    images map<text, blob>, -- map: md5(url) -> image
    PRIMARY KEY(person_id)
);
INSERT INTO big_map_values (person_id, images)
VALUES(17, {'6fa9093ec07a71f859cae269feee18ec' : textAsBlob('in real sample code I have 10000 a characters here')});
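For reference, here is a minimal Python sketch of the size check that trips the error; the 100000-byte payload size and the 65535-byte per-value limit are both taken from the error message above. This only sizes the payload locally, it does not talk to Cassandra:

```python
# Per-value limit for map values, as reported by the
# "Map values are limited to 65535 bytes" error message.
MAP_VALUE_LIMIT = 65535

# The payload from the INSERT: the error reports a
# "100000 bytes value provided".
payload = b"a" * 100000

print(len(payload))                    # 100000
print(len(payload) > MAP_VALUE_LIMIT)  # True -> the INSERT is rejected
```

So regardless of the 2GB figure in the CQL limits page, cqlsh rejects the value as soon as it exceeds 65535 bytes.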