
I created a stream in Snowflake on a table, and a task to move the data into another table. Even after the task completes, the data in the stream is not being purged. Because of that, the task is never skipped and keeps reinserting the stream's data, so the final table keeps growing. What could be the reason? It was working yesterday, but as of today the stream is not purging even after a DML statement consumes it via the task.

create or replace stream test_stream on table test_table_raw append_only = true;

create or replace task test_task
  warehouse = test_warehouse
  schedule = '1 minute'
when
  system$stream_has_data('test_stream')
as
insert into test_table
select
    level1.FILE_NAME,
    level1.FILE_ROWNUMBER,
    GET(lvl, '@id')::string as app_id
from (select FILE_NAME, FILE_ROWNUMBER, src:"$" as lvl from test_table_raw) level1,
     lateral flatten(lvl:"$") level2
where level2.value like '%<test %';

alter task test_task resume;

select
    (select count(*) from test_table)  as table_count,
    (select count(*) from test_stream) as stream_count;

TABLE_COUNT  STREAM_COUNT
        500             1
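To narrow down why the offset is not advancing, it can help to inspect the stream's metadata columns and the task's run history. A diagnostic sketch, assuming the same object names as above:

```
-- stream rows carry metadata describing the captured change type
select metadata$action, count(*)
from test_stream
group by 1;

-- check whether recent task runs actually executed or were skipped
select scheduled_time, state, query_id
from table(information_schema.task_history(task_name => 'TEST_TASK'))
order by scheduled_time desc
limit 10;
```

If the task history shows successful runs yet the stream row count never drops, the task's DML is most likely not reading from the stream at all.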

1 Answer


It appears that you are not consuming the stream in the DML operation: you are inserting rows from the table on which the stream is built rather than from the stream itself. A stream's offset only advances when the stream is the source of a DML statement, so you'll need to change `FROM test_table_raw` to `FROM test_stream` inside the task's INSERT. Give that a try and let me know.
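A minimal sketch of the corrected task, reusing the names from your question; the only change is the inner FROM clause, so the INSERT reads from (and therefore consumes) the stream:

```
create or replace task test_task
  warehouse = test_warehouse
  schedule = '1 minute'
when
  system$stream_has_data('test_stream')
as
insert into test_table
select
    level1.FILE_NAME,
    level1.FILE_ROWNUMBER,
    GET(lvl, '@id')::string as app_id
-- read from the stream, not the base table, so this DML advances the stream offset
from (select FILE_NAME, FILE_ROWNUMBER, src:"$" as lvl from test_stream) level1,
     lateral flatten(lvl:"$") level2
where level2.value like '%<test %';
```

Once a run of this task commits successfully, `select count(*) from test_stream` should return 0 until new rows land in `test_table_raw`.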

Thanks.