0
votes

I have a select statement which returns 0 or more rows.

I'm trying to come up with a PL/SQL procedure with a cursor that produces XML output from all rows returned, 100 rows at a time. I'm chunking 100 rows at a time based on a requirement.

So basically my program should follow the logic below:

cursor c1 is select id, code_id, code_desc from table order by id; -- returns some rows

if c1%notfound
  then return; -- exit from the procedure
else
loop
  grab the first 100 rows from the select and append them into a variable;
  update that variable into a CLOB field in a table;
  grab the next 100 rows and append them into a variable;
  update that variable into a CLOB field in another row of the table; see the
  table data below;

  and so on;
  grab the remaining rows and append them into a variable;
  update that variable into a CLOB field in another row;
until no data found;
exit;

I'm trying to convert the output from the select statement into XML text.

The output should look something like below:

TABLE: STG_XML_DATA

LOOP_NO(NUMBER), XML_TEXT(CLOB), ROWS_PROCESSED
1                <XML><ID>1</ID><ID>2</ID>..<ID>100</ID></XML>     100
2                <XML><ID>101</ID><ID>102</ID>..<ID>200</ID></XML> 200
3                <XML><ID>301</ID><ID>302</ID>..<ID>320</ID></XML> 20

Can someone please help

1
Why are you trying to only process 100 rows at a time? And could you clarify what your rows look like, or what "append to a variable" looks like? You can store rows in PL/SQL collections for example, but that doesn't really make them any easier to print. - kfinity
I'm just trying to chunk the rows 100 at a time and append them into a variable that can be stored into a table, not necessarily printed. I'm working on generating some XML text which follows a pattern; the values returned from this select statement will be appended into a clob_column. - user1751356
Have a read here for some VERY useful information on working with cursors, specifically limiting rows retrieved as you want to do and how to properly know when you've reached the end (not by using NOTFOUND!) : oracle.com/technetwork/issue-archive/2008/08-mar/… - Gary_W

1 Answer

0
votes

First of all, can you do this with a single INSERT ... SELECT statement that does what you want with reasonable performance? If you're dealing with a million rows then yes, breaking them up into chunks may be a good idea; but if it's on the order of 100 rows, a single statement is probably your best bet.
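For example, if the whole result set can go into one staging row, a single statement along these lines might be enough. This is a sketch only: your_table stands in for your real source table, and I'm assuming STG_XML_DATA has exactly the three columns you show.

-- Sketch only: your_table is a placeholder for the real source table.
INSERT INTO stg_xml_data (loop_no, xml_text, rows_processed)
SELECT 1,                                        -- single chunk, so loop_no = 1
       XMLSERIALIZE(CONTENT
           XMLELEMENT("XML",
               XMLAGG(XMLELEMENT("ID", id) ORDER BY id))
           AS CLOB),                             -- whole result as one XML CLOB
       COUNT(*)                                  -- rows processed
FROM   your_table;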

For your actual question, you want to use BULK COLLECT into collection variables and possibly FORALL. So your procedure is going to look something like this:

DECLARE
    TYPE id_tt   IS TABLE OF NUMBER;
    TYPE desc_tt IS TABLE OF VARCHAR2(100);

    l_ids         id_tt;
    l_code_ids    id_tt;
    l_code_descs  desc_tt;

    -- "your_table" is a placeholder for the real source table in your question.
    CURSOR c1 IS SELECT id, code_id, code_desc FROM your_table ORDER BY id;
BEGIN
    OPEN c1;
    LOOP
        -- Fetch the next chunk of up to 100 rows into the collections.
        FETCH c1 BULK COLLECT INTO l_ids, l_code_ids, l_code_descs
        LIMIT 100;

        EXIT WHEN l_ids.COUNT = 0;

        -- Bulk-bind the whole chunk into a single DML statement.
        FORALL idx IN 1..l_ids.COUNT
        INSERT [... some insert statement here ...]

        [... maybe some other processing here...]
    END LOOP;
    CLOSE c1;
END;
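To tie that back to your STG_XML_DATA layout, here is a minimal sketch of how the loop body could build and store one XML CLOB per chunk. It assumes only the ID column goes into the XML (as in your sample output) and again uses your_table as a placeholder for the source table:

DECLARE
    -- Sketch only: names other than STG_XML_DATA are placeholders.
    CURSOR c1 IS
        SELECT id FROM your_table ORDER BY id;

    TYPE id_tt IS TABLE OF NUMBER;
    l_ids      id_tt;
    l_xml      CLOB;
    l_loop_no  PLS_INTEGER := 0;
BEGIN
    OPEN c1;
    LOOP
        FETCH c1 BULK COLLECT INTO l_ids LIMIT 100;
        EXIT WHEN l_ids.COUNT = 0;

        l_loop_no := l_loop_no + 1;

        -- Build one <XML>...</XML> document for this chunk of up to 100 ids.
        l_xml := '<XML>';
        FOR i IN 1 .. l_ids.COUNT LOOP
            l_xml := l_xml || '<ID>' || l_ids(i) || '</ID>';
        END LOOP;
        l_xml := l_xml || '</XML>';

        -- One staging row per chunk, as in the desired output.
        INSERT INTO stg_xml_data (loop_no, xml_text, rows_processed)
        VALUES (l_loop_no, l_xml, l_ids.COUNT);
    END LOOP;
    CLOSE c1;
END;

Building the per-chunk CLOB with XMLELEMENT/XMLAGG would also work; the string concatenation above just keeps the example short.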

What you absolutely do not want to do is fetch a row, process it, fetch another row, etc. SQL is a set-oriented language, so try to operate on sets. Every time you switch context from SQL to PL/SQL there is a cost and it can kill your performance.

See: http://www.oracle.com/technetwork/issue-archive/2012/12-sep/o52plsql-1709862.html