
    cursor.execute("SELECT * FROM table")

I am using the above code to execute the select query, but it gets stuck because the table has 93 million records.

Is there any other method to extract all the data from a Snowflake table in a Python script?


2 Answers

1 vote

Depending on what you are trying to do with that data, it would probably be most efficient to run a COPY INTO <location> statement to unload the data into a file on a stage, and then run a GET via Python to bring that file down to wherever you are running Python.
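As a rough sketch of that approach (the table, stage, and directory names here are placeholders, not from the question), the two statements could be built and issued like this:

```python
def build_unload_statements(table: str, stage: str, local_dir: str):
    """Return the (COPY INTO, GET) SQL pair for unloading `table` via `stage`."""
    copy_sql = (
        f"COPY INTO @{stage}/{table}/ "
        f"FROM {table} "
        "FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP) "
        "OVERWRITE = TRUE"
    )
    get_sql = f"GET @{stage}/{table}/ file://{local_dir}/"
    return copy_sql, get_sql

# Usage (assumes an open snowflake.connector connection `con`):
# copy_sql, get_sql = build_unload_statements("large_table", "my_stage", "/tmp/export")
# con.cursor().execute(copy_sql)
# con.cursor().execute(get_sql)
```

This way Snowflake writes the 93M rows out in parallel as compressed files, and Python only has to download files rather than fetch rows one by one.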

However, you might want to provide more detail on how you are using the data in python after the cursor.execute statement. Are you going to iterate over that data set to do something (in which case, you may be better off issuing SQL statements directly to Snowflake, instead), loading it into Pandas to do something (there are better Snowflake functions for pandas in that case), or something else? If you are just creating a file from it, then my suggestion above will work.
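If pandas is the destination, one pattern (a sketch on my part, using the connector's `fetch_pandas_batches()` method rather than a single giant fetch) is to stream the result set as a sequence of DataFrames; `handle_batch` below is a hypothetical callback you would replace with your own processing:

```python
def process_in_batches(cursor, handle_batch):
    """Call handle_batch on each DataFrame batch; return the number of batches."""
    count = 0
    for df in cursor.fetch_pandas_batches():  # yields pandas DataFrames
        handle_batch(df)
        count += 1
    return count

# Usage (assumes an open snowflake.connector connection `con`):
# cur = con.cursor()
# cur.execute("SELECT * FROM large_table")
# process_in_batches(cur, lambda df: do_something(df))
```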

0 votes
  1. The problem is that when you fetch data from Snowflake into Python, the query gets stuck due to the volume of records and the Snowflake-to-Python data conversion.

  2. Are you trying to fetch all the data from the table? How you use the data downstream is what matters most. Restrict the number of columns if you can.

  3. Improving Query Performance by Bypassing Data Conversion

To improve query performance, use the SnowflakeNoConverterToPython class in the snowflake.connector.converter_null module to bypass data conversions from the Snowflake internal data type to the native Python data type, e.g.:

import snowflake.connector
from snowflake.connector.converter_null import SnowflakeNoConverterToPython

con = snowflake.connector.connect(
    ...,  # your connection parameters (account, user, password, etc.)
    converter_class=SnowflakeNoConverterToPython
)
for rec in con.cursor().execute("SELECT * FROM large_table"):
    # rec includes raw Snowflake data (no conversion to native Python types)
    pass
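Beyond skipping the conversion, it can also help to avoid materializing all 93 million rows at once. A sketch of that idea (my suggestion, not part of the answer above) using the standard cursor `fetchmany()` method to stream rows in fixed-size chunks:

```python
def iter_rows(cursor, batch_size=10_000):
    """Yield rows from an already-executed cursor in chunks of batch_size."""
    while True:
        rows = cursor.fetchmany(batch_size)
        if not rows:  # empty list means the result set is exhausted
            break
        for row in rows:
            yield row

# Usage (assumes an open connection `con`):
# cur = con.cursor()
# cur.execute("SELECT * FROM large_table")
# for row in iter_rows(cur):
#     process(row)
```

This keeps memory bounded by `batch_size` regardless of how large the table is.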