0 votes

My setup is this: SSMS connecting to an Azure SQL server, which has a number of SSIS packages deployed on it to be run.

When a package is executing, I get the error shown below:

SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80004005 Description: "Resource ID : 1. The request limit for the database is 60 and has been reached. See 'http://go.microsoft.com/fwlink/?LinkId=267637' for assistance."


After doing much research, I found that a potential solution was to reduce MAXDOP by setting the value to either 1 or 0. Doing this had no effect and did not prevent the error.
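For reference, the statement I ran against the database that raises the error was along these lines (a database-scoped MAXDOP change, tried with both 0 and 1):

    -- Database-scoped MAXDOP setting, run against the affected database.
    -- 0 means unlimited parallelism, 1 disables parallelism; neither prevented the error.
    ALTER DATABASE SCOPED CONFIGURATION SET MAXDOP = 1;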

Then, also as a result of my research, I found that I could restart the running SQL services. After doing this, the SSIS package I execute completed without errors twice, but on the third attempt the execution failed again with the above error.

I have read online that there can be other reasons for this error. Could someone point me in the right direction?

Many thanks.

1
"Set the value between 1 and 0" so what did you set it to? What was the code you used? 0 won't fix your issue but 1 might.Nick.McDermaid
@Nick.McDermaid I've set it to 0 (then ran the package, got the error) and then 1 (still the same error). The code I used: ALTER DATABASE SCOPED CONFIGURATION SET MAXDOP = 1, run against the database I'm getting the error on. I initially used 0 as I thought that meant 'unlimited'. – user54287
0 means unlimited parallelism = lots of processes. 1 means only one. – Nick.McDermaid
@Nick.McDermaid Right, thanks. I only used 0 out of desperation. I still get the same error with 1, though. I don't know what else could cause the error; something else must be causing it, but I don't know what. – user54287

1 Answer

0 votes

There was a package with a data flow task in it that was writing to over 30 tables simultaneously. In the end, I broke that task down into smaller tasks to reduce the number of simultaneous database accesses.

This solved my issue.
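If it helps, a rough way to see how close you are to the 60-request limit while the package runs is to count the requests currently executing against the database, for example:

    -- Approximate count of requests currently executing against this database.
    -- If this approaches the limit quoted in the error (60), the data flow is
    -- opening too many concurrent requests at once.
    SELECT COUNT(*) AS running_requests
    FROM sys.dm_exec_requests;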