8 votes

Problem Background

Generating and accessing data of a fixed column layout is easy. You can create local temp tables up-front, and populate them by calling stored procedures.

On the other hand, if you want to generate data with a dynamic column layout, you generally have to build an SQL statement dynamically and execute it with "exec sp_executesql". Since the column layout isn't known ahead of time, you can't create a temp table up-front, and any temporary table created inside the "exec sp_executesql" statement is bound to that scope and vanishes when the call returns, so accessing the data is much more difficult (i.e. your options are more limited).
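A minimal repro of the scoping problem (the trivial SELECT here just stands in for the real dynamic query):

DECLARE @sql nvarchar(max) = N'SELECT 1 AS col INTO #temptable;';
EXEC sp_executesql @statement = @sql;   -- #temptable is created inside this scope
SELECT * FROM #temptable;               -- fails: Invalid object name '#temptable'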

My Specific Situation

I have a query that needs to access data in a dynamically generated table.

The table is generated by a stored procedure, which dynamically builds a query, stores it in a variable "@sql nvarchar(max)", and runs it by calling "exec sp_executesql @statement = @sql".

The @sql statement was something like "select * into #temptable from...", but #temptable was destroyed by the time "exec sp_executesql" returned. A quick fix for this was to just use "##temptable" instead (i.e. a global temp table), because it survives when the stored procedure returns AND I can easily access it in the calling scope (because it has a known/static name).

I don't like this solution because global temp tables aren't thread-safe (name collision-wise), and I don't want to have to mess with dynamically generated unique names, because I'll just end up having to use more dynamic SQL to access them... which puts me right back at square one, leaving the data inaccessible outside the SP.

I don't think returning table variables (through output parameters, a feature new to SQL Server 2008) is an option, unless it can be done without having to define a static table type. The tables my stored procedure generates are dynamic, and depend on the input parameter(s) passed.

Inline table-valued functions are not an option, because they are limited to a single SELECT statement, whereas I'm running code loops to build the @sql query and calling "exec sp_executesql".

Multi-statement table-valued functions (instead of the stored procedure) are also not an option, because such a function must have a well-defined table format, whereas I'm running dynamic SQL to return a table with a variable number of columns and column names depending on the input parameter values.

All I really want to do is select the result set of the dynamic query into a new table, but I'm finding it difficult, as none of the above works. Particularly irritating is how local temporary tables created in dynamic SQL aren't scoped to the session, but to the sp_executesql call, so they vanish upon returning. The only solution I've seen insists that OPENROWSET is the only way, but I don't want to mess with connection strings inside my stored procedure, for the same reason I don't want to include unique-name-management code... it's just way more complicated than it ought to be.
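For reference, that approach would look roughly like this (the loopback connection string and procedure name are placeholders, and it also requires 'Ad Hoc Distributed Queries' to be enabled), which is more plumbing than I want inside the procedure:

SELECT *
INTO #results
FROM OPENROWSET('SQLNCLI',
                'Server=(local);Trusted_Connection=yes;',
                'EXEC dbo.MyDynamicProc @param = 1');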

In summary, I just want to execute dynamic SQL that generates a dataset of an unknown format, and be able to easily access it from the calling scope.

Why the downvote? While the poster is trying to do something SQL is not really meant to do, he's obviously put some effort into considering the various options, and he's expressed his goal clearly. – user565869
This problem interests me, because it seems like something SQL should handle really well. The real problem here is how the scope of local temporary tables is too limited. What we could really use is a third type of temporary table that can be created in child scopes, but is bound to the top-level calling scope. Table variables are nice to pass around, and finally in SQL Server 2008 they can be passed as output parameters, but... they still require a fixed definition and pre-declared type to pass them as output parameters. – Triynko
I couldn't have said it better myself! I have exactly the same problem. BTW, I wish T-SQL supported a 'variant table variable' that doesn't require a pre-determined structure. – Pete Alvin

3 Answers

1 vote

Create the temp table before calling sp_executesql: it will still be in scope for "inner" scopes like the sp_executesql batch.

Change the SQL to do an INSERT rather than SELECT...INTO...
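Something along these lines (a rough sketch; the fixed column list and the source table dbo.SomeTable are made up):

CREATE TABLE #results (id int, name nvarchar(100));   -- created in the outer scope

DECLARE @sql nvarchar(max) =
    N'INSERT INTO #results (id, name)
      SELECT id, name FROM dbo.SomeTable;';            -- dbo.SomeTable is a placeholder

EXEC sp_executesql @statement = @sql;   -- #results is visible inside this inner scope

SELECT * FROM #results;                 -- and still exists after the call returns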

Edit:

Make the table wide enough to cover all options.

Frankly, SQL is designed to work with fixed table definitions: variable output signatures (tables) lead to the problem you have...

1 vote

One clunky but possible option is to create a single-column table in the caller, then alter it in the callee. This gets around the scope issue but makes things quite a bit harder...

If you build a stored procedure that accepts input and output table names and makes the second table look exactly like the first by dropping and adding columns, you might have a more general solution.
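A rough sketch of the basic idea, with hypothetical column names, a placeholder source table, and sp_executesql standing in for the callee (the ALTER and the INSERT are split into separate calls so the new columns resolve when the INSERT compiles):

CREATE TABLE #results (placeholder int);   -- caller creates the single-column table

-- Callee/dynamic SQL reshapes the caller's table...
EXEC sp_executesql N'ALTER TABLE #results ADD id int, name nvarchar(100);
                     ALTER TABLE #results DROP COLUMN placeholder;';

-- ...then fills it in a separate batch, compiled after the schema change.
EXEC sp_executesql N'INSERT INTO #results (id, name)
                     SELECT id, name FROM dbo.SomeTable;';

SELECT * FROM #results;   -- the reshaped, populated table is visible to the caller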

Selecting from tempdb.INFORMATION_SCHEMA.COLUMNS will let you find the column information about any temp table (which you can detect by whether the table name starts with #). Here's an example:

CREATE TABLE #blah (a int)
SELECT *
FROM tempdb.INFORMATION_SCHEMA.COLUMNS
WHERE Object_id('tempdb.dbo.' + TABLE_NAME) = Object_id('tempdb.dbo.#blah')
DROP TABLE #blah

Note that the table name in the view is not #blah (it will probably be something like #blah___{lots of underscores}___00000000021D), hence the use of Object_id() to correlate the two.

To use it practically: instead of filling the first table with data, then morphing the second table and copying the data into it, I would suggest first creating an empty table by running your process with an added TOP 0 or WHERE 1 = 0, then copying that table's structure to the correct one with your table-copying SP, and then running the data process for real so the data is inserted only once, directly into the correct table.

Like I said, clunky, but could be useful in some narrow situations where no other options are available.

1 vote

So, I am copying my answer from Insert results of a stored procedure into a temporary table. Hope it helps. Yes, it's a global temporary table again; the only difference is the GUID appended to the name.


I met the same problem, and here is what I did, following Paul's suggestion. The main part is using NEWID() to avoid collisions when multiple users run the stored procedures/scripts at the same time, which is the usual pain with global temporary tables.

DECLARE @sql varchar(max) = '',
        @tmp_global_table varchar(255) = '##global_tmp_' + CONVERT(varchar(36), NEWID())

-- Build and run the dynamic query, selecting into the uniquely named global temp table.
SET @sql = @sql + 'select * into [' + @tmp_global_table + '] from YOURTABLE'
EXEC(@sql)

-- The table survives the EXEC, so it can be read from the calling scope.
EXEC('SELECT * FROM [' + @tmp_global_table + ']')
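Presumably the uniquely named global table also needs an explicit cleanup once the caller is done with it, e.g.:

EXEC('DROP TABLE [' + @tmp_global_table + ']')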