I'm trying to write temporary files on the workers executing Dataflow jobs, but it seems the files are being deleted while the job is still running. If I SSH into the running VM, I can execute the exact same file-generating command and the files are not destroyed, so perhaps this is a cleanup that happens only for the Dataflow runner user. Is it possible to use temp files, or is this a platform limitation?
Specifically, I'm attempting to write to the location returned by `Files.createTempDir()`, which is `/tmp/someidentifier`.
Edit: Not sure what was happening when I posted, but `java.nio.file.Files.createTempDirectory()` (as opposed to Guava's `Files.createTempDir()`) works...
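For reference, a minimal sketch of the NIO variant that worked for me (the `"dataflow-scratch-"` prefix is just an illustrative name, not anything Dataflow-specific):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TempDirExample {
    public static void main(String[] args) throws IOException {
        // java.nio.file.Files.createTempDirectory creates a fresh directory
        // under the default temp location (java.io.tmpdir, typically /tmp).
        Path tempDir = Files.createTempDirectory("dataflow-scratch-");

        // Create and write a scratch file inside that directory.
        Path tempFile = Files.createTempFile(tempDir, "part-", ".tmp");
        Files.write(tempFile, "scratch data".getBytes());

        // The file persists for the lifetime of the process unless
        // something explicitly deletes it.
        System.out.println(Files.exists(tempFile));
    }
}
```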