In my Azure SQL DB I have an external table - let's call this tableName_origData - and I have another table which we'll refer to as tableName.

tableName was created using a generated CREATE script from tableName_origData (in its original location) so I can be sure that all the column types are identical.

However, when I run

INSERT INTO tableName (
    [list of column names]
)
SELECT
    [same list of column names]
FROM
    tableName_origData

I encounter the following exception:

Large object column support is limited to only nvarchar(max) data type.

As far as my understanding of Azure SQL's data types goes, I don't have anything larger than NVARCHAR(MAX). Furthermore, the message implies that NVARCHAR(MAX) is supported (and I can see that the same script works on other tables which contain NVARCHAR(MAX) columns).

Can anyone better explain the cause of this exception and what I might need to do in order to insert the external table's data into an identical local table?

Here is a list of all the column types used in the table(s):

BIGINT x 3
NCHAR(20) x 1
NVARCHAR(45) x 5
NVARCHAR(100) x 14
NVARCHAR(MAX) x 10
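To double-check that the two tables' column definitions really do match, a comparison against the standard INFORMATION_SCHEMA.COLUMNS view can help (a sketch only; tableName and tableName_origData are the table names from the question):

```sql
-- List the column definitions of both tables side by side.
-- CHARACTER_MAXIMUM_LENGTH is -1 for (MAX) types.
SELECT
    c.TABLE_NAME,
    c.COLUMN_NAME,
    c.DATA_TYPE,
    c.CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS AS c
WHERE c.TABLE_NAME IN ('tableName', 'tableName_origData')
ORDER BY c.TABLE_NAME, c.ORDINAL_POSITION;
```

Any row where the two tables disagree on DATA_TYPE or CHARACTER_MAXIMUM_LENGTH would be a candidate for the error.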

1 Answer

External tables are read-only: you can select data from them, but you cannot perform any form of DML against them.
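As a quick illustration of that restriction (a sketch, using tableName_origData, the external table from the question):

```sql
-- Reading from the external table works:
SELECT TOP (10) *
FROM tableName_origData;

-- But DML aimed at the external table itself is rejected, e.g.:
-- DELETE FROM tableName_origData;  -- fails: external tables are read-only
```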

To work around this limitation, one technique is described here:

https://technology.amis.nl/2005/04/05/updateable-external-tables/

Warning: except for the simplest of uses, we do not recommend this technique for any serious application.