I've set up an Azure Data Factory pipeline to copy data from one table in our SQL Server database to our new Azure Search service. The copy activity consistently fails with the following error:
```
Copy activity encountered a user error at Sink side: GatewayNodeName=SQLMAIN01,ErrorCode=UserErrorAzuerSearchOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Error happened when writing data to Azure Search Index '001'.,Source=Microsoft.DataTransfer.ClientLibrary.AzureSearch,''Type=Microsoft.Rest.Azure.CloudException,Message=Operation returned an invalid status code 'RequestEntityTooLarge',Source=Microsoft.Azure.Search,'.
```
From what I've read so far, Request Entity Too Large is the standard HTTP 413 status code, returned by a REST API when the request payload exceeds the size the server will accept. Nothing I've found, though, helps me understand how to actually diagnose and resolve this error in this scenario.
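My working theory is that the copy activity batches rows into indexing requests against the target index, and the 413 means a serialized batch is too large. If that's right, the request Data Factory sends probably looks roughly like the sketch below (the index name '001' comes from the error message; the API version and field names are placeholders I'm assuming for illustration):

```
POST https://<service-name>.search.windows.net/indexes/001/docs/index?api-version=2016-09-01
Content-Type: application/json

{
  "value": [
    { "@search.action": "upload", "id": "1", "title": "...", "body": "..." },
    { "@search.action": "upload", "id": "2", "title": "...", "body": "..." }
  ]
}
```

With a default batch size of 1,000 documents, a table with wide text columns could plausibly push a single request body past the service's payload cap (which I believe is on the order of 16 MB), which would explain the RequestEntityTooLarge response.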
Has anyone dealt with this specifically in an Azure context? I'd like to find a reliable way to get all of our database data into our Azure Search service. If there are adjustments that can be made on the Azure side to increase the allowed request size, the process for doing so certainly isn't readily available anywhere I've looked, online or in the Azure documentation.
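For what it's worth, the only sink-side setting I've come across that looks related is writeBatchSize on the AzureSearchIndexSink. A trimmed-down sketch of what I understand the sink section of the copy activity to look like (the value 100 is just an illustration; the default is reportedly 1,000):

```
"sink": {
  "type": "AzureSearchIndexSink",
  "writeBehavior": "Upload",
  "writeBatchSize": 100
}
```

Lowering that should shrink each request, but I'd rather understand whether that's the intended fix or whether the size limit itself can be raised on the service.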