So I have been trying to set up a process to pull data from my Snowflake database in Python using the Snowflake Python Connector. I have written a method for requesting data (shown below):
import snowflake.connector

def request_data(s, query):
    snowflake_connection = snowflake.connector.connect(
        user='user',
        password='password',
        account='account',
        warehouse='warehouse',
        database='database',
        schema=s,
    )
    try:
        with snowflake_connection.cursor() as cursor:
            cursor.execute(query)
            data = cursor.fetch_pandas_all()
    finally:
        snowflake_connection.close()
    return data
I have been able to request data with this method and work with it in Python, as in the example below:
query = "select * from books.sales where date between '2020-08-01' and '2020-11-30'"
sales = request_data('BOOKS', query)
However, when I try to request a larger amount of data (for example, changing the date range to 2020-08-01 through 2021-07-31), I get an error:
250003: Failed to get the response. Hanging? method: get, url: <snowflake url>
I have tried looking through the documentation, and one thing I have tested is printing the cursor's rowcount attribute: I added print(cursor.rowcount) to the request_data method, between cursor.execute(query) and data = cursor.fetch_pandas_all(). When I did this, I saw that rowcount matched the number of rows I got when I tested the query in a worksheet on snowflakecomputing.com.
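To make the placement clear, here is a sketch with a throwaway stub cursor standing in for the real Snowflake cursor (the stub's rowcount value and return value are made up for illustration; the real attribute is set by execute() and the real method returns a pandas DataFrame):

```python
# Illustration only: a stub cursor showing where the rowcount check
# sits relative to execute() and fetch_pandas_all().
class StubCursor:
    def execute(self, query):
        # the real connector sets .rowcount after execute()
        self.rowcount = 39000

    def fetch_pandas_all(self):
        # the real method returns a pandas DataFrame; a list stands in
        return [("row",)]

cursor = StubCursor()
cursor.execute("select * from books.sales where ...")
print(cursor.rowcount)            # the line I added - prints the count fine
data = cursor.fetch_pandas_all()  # in the real code, this is where it fails
```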
So I suspect it has something to do with the amount of data. The query with the date range 2020-08-01 to 2021-07-31 returns about 39,000 rows. I have looked through the documentation for a limit on the amount of data per request with the Snowflake Python Connector, and the only number I ever saw was 10,000.
When I reduced my date range so that the number of rows would be under that limit, I still got the same error, so I am not sure what is wrong. I could break the one query into multiple queries, but I am trying to keep my requests to a minimum. If someone knows how to solve this issue, I would greatly appreciate it.
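For reference, the workaround I am trying to avoid would look something like this. It only builds the query strings (the window boundaries are an arbitrary choice of mine); each query would then go through request_data('BOOKS', q) and the resulting DataFrames be concatenated:

```python
# Sketch of the workaround: split the date range into smaller windows
# and build one query per window, instead of one large request.
def build_queries(windows):
    return [
        "select * from books.sales "
        f"where date between '{start}' and '{end}'"
        for start, end in windows
    ]

windows = [("2020-08-01", "2021-01-31"), ("2021-02-01", "2021-07-31")]
queries = build_queries(windows)
# each q in queries would then be run via request_data('BOOKS', q)
```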