I am trying to read a very large file (running into gigabytes) from a Google Cloud Storage bucket. I fetch it as a Blob and then open an InputStream from the Blob:
Blob blob = get_from_bucket("my-file"); // my helper that fetches the Blob
ReadChannel channel = blob.reader();
InputStream str = Channels.newInputStream(channel);
My question is: is the entire file loaded into the Blob object in one go, or is it fetched in chunks? In the former case, it could lead to an OutOfMemoryError, right?
Is there a way to read the object from the bucket the way we do with a FileInputStream, so that I can read files irrespective of their size?
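For context, the pattern I'm hoping works is an ordinary fixed-buffer read loop over the InputStream, so memory use stays bounded no matter how big the object is. A minimal, self-contained sketch of that loop is below (a ByteArrayInputStream stands in for the stream returned by Channels.newInputStream; `consume` and the buffer size are my own illustrative names, not anything from the GCS client):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ChunkedRead {
    // Reads the stream in fixed-size chunks; only `bufSize` bytes are held
    // in memory at a time, regardless of the total size of the source.
    static long consume(InputStream in, int bufSize) throws IOException {
        byte[] buf = new byte[bufSize];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            // Real code would process buf[0..n) here; this sketch just counts.
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the bucket object's bytes.
        byte[] data = new byte[1_000_000];
        try (InputStream in = new ByteArrayInputStream(data)) {
            System.out.println(consume(in, 64 * 1024)); // prints 1000000
        }
    }
}
```

If the GCS stream behaves like any other InputStream, this loop should work for arbitrarily large objects; my question is whether the Blob/ReadChannel side buffers the whole object before the loop ever runs.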