1 vote

When running the sample code from this posting by Google's Slaven Bilac in Cloud Shell, an error occurs.

chuck@wordone-wordtwo-1234567:~/google-cloud-ml/samples/flowers$ ./sample.sh
Your active configuration is: [cloudshell-270]


Using job id:  flowers_chuck_20170130_212715

# Takes about 30 mins to preprocess everything.  We serialize the two
# preprocess.py synchronous calls just for shell scripting ease; you could use
# --runner DataflowPipelineRunner to run them asynchronously.  Typically,
# the total worker time is higher when running on Cloud instead of your local
# machine due to increased network traffic and the use of more cost efficient
# CPU's.  Check progress here: https://console.cloud.google.com/dataflow
python trainer/preprocess.py \
  --input_dict "$DICT_FILE" \
  --input_path "gs://cloud-ml-data/img/flower_photos/eval_set.csv" \
  --output_path "${GCS_PATH}/preproc/eval" \
  --cloud
WARNING:root:Couldn't find python-snappy so the implementation of _TFRecordUtil._masked_crc32c is not as fast as it could be.
WARNING:root:BlockingDataflowPipelineRunner is deprecated, use BlockingDataflowRunner instead.
WARNING:root:BlockingDataflowRunner is deprecated, use DataflowRunner instead.
Traceback (most recent call last):
  File "trainer/preprocess.py", line 446, in <module>
    main(sys.argv[1:])
  File "trainer/preprocess.py", line 442, in main
    run(arg_dict)
  File "trainer/preprocess.py", line 361, in run
    p = beam.Pipeline(options=pipeline_options)
  File "/home/chuck/.local/lib/python2.7/site-packages/apache_beam/pipeline.py", line 131, in __init__
    'Pipeline has validations errors: \n' + '\n'.join(errors))
ValueError: Pipeline has validations errors: 
Missing GCS path option: temp_location.

How can one modify the file(s) from GoogleCloudPlatform/cloudml-samples/flowers to avoid this?


2 Answers

1 vote

Chuck,

It seems the problem you are seeing is due to a versioning mismatch. A simple fix is to replace staging_location with temp_location in the default_values dict inside preprocess.py:

default_values = {
    'project':
        get_cloud_project(),
    # renamed from 'staging_location'
    'temp_location':
        os.path.join(os.path.dirname(parsed_args.output_path), 'temp'),
    'runner':
        'DataflowRunner',
    'extra_package':
        Default.CML_PACKAGE,
    'save_main_session':
        True,
}
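For context, preprocess.py ultimately feeds these defaults to Beam as --flag style pipeline options, which is why a missing temp_location key surfaces as the "Missing GCS path option" validation error. A minimal sketch of that dict-to-flags conversion (the project id and bucket path below are illustrative stand-ins, not values from the sample):

```python
import os

# Illustrative stand-in; in the sample this comes from parsed_args.output_path.
output_path = 'gs://my-bucket/flowers/preproc/eval'

# A trimmed-down version of the defaults dict, with temp_location present.
default_values = {
    'project': 'my-project',  # placeholder project id
    'temp_location': os.path.join(os.path.dirname(output_path), 'temp'),
    'runner': 'DataflowRunner',
}

# Convert the dict into the --key=value arguments Beam's PipelineOptions parses.
pipeline_args = ['--%s=%s' % (k, v) for k, v in sorted(default_values.items())]
print(pipeline_args)
```

With temp_location in the dict, the resulting flag list includes --temp_location=gs://my-bucket/flowers/preproc/temp, so Beam's GCS path validation passes.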

We will be updating the sample shortly to reflect this.

0 votes

Have you tried running the Cloud Shell setup script? It helps manage some of the dependency versions, which seem to be the core issue here:

curl https://raw.githubusercontent.com/GoogleCloudPlatform/cloudml-samples/master/tools/setup_cloud_shell.sh | bash