0 votes

I'm using Snowflake on a Windows PC.

For example: https://<my_org>.snowflakecomputing.com/console#/internal/worksheet

I have a bunch of queries, the collective output of which I want to capture and load into a file.

Apart from running the queries one-at-a-time and using copy-and-paste to populate the file, is there a way I can run all the queries at once and have the output logged to a file on my PC?

2
Can you perform a union and insert the result directly into a table? Also, if you want to write it using a programming language, there are other options. Let me know. Also, post the queries you are running and I can take a look. Thanks. – hopeIsTheonlyWeapon

2 Answers

3 votes

There are many ways to achieve the high-level outcome you are seeking, but you have not provided enough context to know which would be best suited to your situation. For example, by mentioning https://<my_org>.snowflakecomputing.com/console#/internal/worksheet, it is clear that you currently plan to execute the series of queries through the Snowflake web UI. Is using the web UI a strict requirement of your use case?

If not, I would recommend that you consider using a Python script (along with the Snowflake Connector for Python) for a task like this. One strategy would be to have the Python script serially process each query as follows:

  1. Execute the query
  2. Export the query's result set (as a CSV file) to a stage location in cloud storage via two of Snowflake's powerful features: RESULT_SCAN and COPY INTO <location>
  3. Download the CSV file to your local host via Snowflake's GET command

Here is a sample of what such a Python script might look like...

import snowflake.connector

# The queries whose collective output you want to capture
query_array = [r"""
SELECT ...
  FROM ...
 WHERE ...
""",r"""
SELECT ...
  FROM ...
 WHERE ...
"""
]

conn = snowflake.connector.connect(
  account = ...
 ,user = ...
 ,password = ...
 ,role = ...
 ,warehouse = ...
)

file_prefix = "query_results"  # prefix for the exported file names (choose your own)

file_number = 0
for query in query_array:
  file_number += 1
  file_name = f"{file_prefix}_{file_number}.csv.gz"

  # Execute the query and capture its query ID
  rs_query = conn.cursor(snowflake.connector.DictCursor).execute(query)
  query_id = rs_query.sfqid

  # Re-read the result set via RESULT_SCAN and export it to the stage
  # as a single, gzip-compressed CSV file
  sql_copy_into = f"""
COPY INTO @MY_STAGE/{file_name}
  FROM (SELECT * FROM TABLE(RESULT_SCAN('{query_id}')))
  DETAILED_OUTPUT = TRUE
  HEADER = TRUE
  SINGLE = TRUE
  OVERWRITE = TRUE
"""
  rs_copy_into = conn.cursor(snowflake.connector.DictCursor).execute(sql_copy_into)

  # DETAILED_OUTPUT = TRUE returns one row per file written, including
  # the FILE_NAME column used below
  for row_copy_into in rs_copy_into:
    file_name_in_stage = row_copy_into["FILE_NAME"]

    # Download the exported file from the stage to the current local directory
    sql_get_to_local = f"""
GET @MY_STAGE/{file_name_in_stage} file://.
"""
    rs_get_to_local = conn.cursor(snowflake.connector.DictCursor).execute(sql_get_to_local)

conn.close()

Note: I have chosen (for performance reasons) to export and transfer the files gzip-compressed (.gz); you could skip the compression by specifying COMPRESSION = NONE in the FILE_FORMAT clause of the COPY INTO <location> command.
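
For instance, the COPY INTO statement inside the loop above could become the following (a sketch, keeping the same hypothetical @MY_STAGE stage; you would also drop the .gz suffix when building file_name):

  sql_copy_into = f"""
COPY INTO @MY_STAGE/{file_name}
  FROM (SELECT * FROM TABLE(RESULT_SCAN('{query_id}')))
  FILE_FORMAT = (TYPE = CSV COMPRESSION = NONE)
  DETAILED_OUTPUT = TRUE
  HEADER = TRUE
  SINGLE = TRUE
  OVERWRITE = TRUE
"""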

Also, if your result sets are much smaller, you could use an entirely different strategy and simply have Python fetch the results of each query and write them directly to a local file. I assumed that your result sets might be larger, hence the export-and-download approach I have employed here.
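
Here is a minimal sketch of that direct-fetch approach, reusing the conn and query_array from the script above (the local file names are illustrative):

import csv

file_number = 0
for query in query_array:
  file_number += 1
  cur = conn.cursor()
  cur.execute(query)
  # Stream each result set straight into a local CSV file
  with open(f"query_results_{file_number}.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cur.description])  # header row from cursor metadata
    for row in cur:
      writer.writerow(row)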

2 votes

You can use the SnowSQL client for this. See https://docs.snowflake.com/en/user-guide/snowsql.html
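
For example, connection defaults can go in SnowSQL's configuration file (%USERPROFILE%\.snowsql\config on Windows); the values below are placeholders for your own account details:

[connections]
accountname = my_org
username = my_user
password = my_password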

Once you have it configured, you can make a batch file (or similar) that calls SnowSQL to run each of your queries and writes the output to a file. Something like:

@echo off
>output.txt (
  snowsql -q "select blah"
  snowsql -q "select blah"
  ...
  snowsql -q "select blah"
)
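
Alternatively, if you keep all of the queries in a single script file, SnowSQL can run them in one invocation and write the output itself (a sketch; queries.sql is a hypothetical file containing your statements):

snowsql -f queries.sql -o output_file=output.txt -o quiet=true -o friendly=false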