I ran a Dataflow job that reads from Google Cloud Storage and writes to my local machine, using the DirectPipelineRunner. The job completed successfully, but I don't see any files written on my local machine. Should I specify an IP address or hostname along with the local path in the output location parameter? How do I specify a location on my local machine?
The command I ran:
gcloud dataflow jobs run sampleJobname1 --gcs-location gs://bucket/templatename1 --parameters inputFilePattern=gs://samplegcsbucket/abc/*,outputLocation=C:\data\gcp\outer,runner=DirectPipelineRunner
CODE:
PCollection<String> textData = pipeline.apply("Read Text Data",
        TextIO.read().from(options.getInputFilePattern()));
textData.apply("Write Text Data",
        TextIO.write().to(options.getOutputLocation()));
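For context, here is a minimal sketch of what the full pipeline might look like. The `MyOptions` interface and the class name `GcsToLocal` are assumptions inferred from the `options.getInputFilePattern()` and `options.getOutputLocation()` calls in the snippet; `ValueProvider` is used because the job is launched from a template.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.ValueProvider;

public class GcsToLocal {

    // Hypothetical options interface matching the getters used in the snippet.
    // ValueProvider<String> lets the values be supplied at template run time
    // via --parameters inputFilePattern=...,outputLocation=...
    public interface MyOptions extends PipelineOptions {
        ValueProvider<String> getInputFilePattern();
        void setInputFilePattern(ValueProvider<String> value);

        ValueProvider<String> getOutputLocation();
        void setOutputLocation(ValueProvider<String> value);
    }

    public static void main(String[] args) {
        MyOptions options = PipelineOptionsFactory.fromArgs(args)
                .withValidation()
                .as(MyOptions.class);

        Pipeline pipeline = Pipeline.create(options);

        // Read matching files from GCS, then write the lines to the output location.
        pipeline.apply("Read Text Data", TextIO.read().from(options.getInputFilePattern()))
                .apply("Write Text Data", TextIO.write().to(options.getOutputLocation()));

        pipeline.run().waitUntilFinish();
    }
}
```

Note that a local path like `C:\data\gcp\outer` only makes sense to the machine where the pipeline actually executes, which is part of what the question above is asking.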