As Mikhail Berlyant said,
BigQuery does not provide the ability to directly export/download query
results to GCS or a local file.
You can still export it using the Web UI in just three steps:
- Configure the query to save its results in a BigQuery table and run it.
- Export the table to a bucket in GCS.
- Download from the bucket.
Step 1
When in the BigQuery screen, before running the query, go to More > Query settings.

This opens the query settings panel.

Here you want to set:
- Destination: Set a destination table for query results
- Project name: select the project.
- Dataset name: select a dataset. If you don't have one, create it and come back.
- Table name: give whatever name you want (must contain only letters, numbers, or underscores).
- Result size: Allow large results (no size limit).
Then save it, and the query is configured to write its results to that specific table. Now you can run the query.
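If you prefer the command line, the same step can be sketched with the bq CLI. This is only a sketch: myproject, mydataset and mytable are hypothetical names, and the SELECT is a placeholder for your own query.

```shell
# Run a query and save its results to a destination table.
# myproject, mydataset and mytable are hypothetical names.
bq query \
  --destination_table=myproject:mydataset.mytable \
  --use_legacy_sql=false \
  'SELECT 1 AS example_column'
```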
Step 2
To export the table to GCS, you have to go to the table and click Export > Export to GCS.

This opens the export dialog.

In Select GCS location you define the bucket, the folder, and the file name.
For instance, suppose you have a bucket named daria_bucket (use only lowercase letters, numbers, hyphens (-), and underscores (_); dots (.) may be used to form a valid domain name) and want to save the file(s) in the root of the bucket with the name test. Then you write (in Select GCS location)
daria_bucket/test.csv
If the file is too big (more than 1 GB), you'll get an error. To fix it, you'll have to split the export into multiple files using a wildcard. So, you'll need to add a *, just like that
daria_bucket/test*.csv

This stores, inside the bucket daria_bucket, all the data extracted from the table, split across multiple files named test000000000000, test000000000001, test000000000002, ... testX.
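The export step also has a bq CLI equivalent, sketched here under the same assumptions (myproject, mydataset and mytable are hypothetical names; daria_bucket is the example bucket from above):

```shell
# Export the table to GCS as CSV; the * wildcard lets BigQuery
# shard exports larger than 1 GB into multiple files.
bq extract \
  --destination_format=CSV \
  myproject:mydataset.mytable \
  'gs://daria_bucket/test*.csv'
```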
Step 3
Then go to Storage and you'll see the bucket.

Go inside it and you'll find the file (or files). You can then download them from there.
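Downloading can also be done from the command line with gsutil (again, daria_bucket is the example bucket from above):

```shell
# Copy all exported shards from the bucket to the current directory.
gsutil cp 'gs://daria_bucket/test*.csv' .
```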