I am importing data from a CSV file into Azure Table Storage using the Microsoft Azure Storage Explorer application. The file is pretty wide, with 100+ columns.

When I import a sample of 10 rows, it works fine. When I try to load the complete data set (60K rows), Azure Storage Explorer crashes.

I am about to explore other tools (PowerShell, a console app, etc.). Before I do, I was wondering: is there some restriction I am missing or overlooking? For reference, the kind of fallback I have in mind is shown in the sketch below.
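This is only a rough sketch using the azure-data-tables Python package; the table name, partition key, CSV path, and connection-string environment variable are all placeholders:

```python
import csv
import itertools
import os

from azure.data.tables import TableServiceClient

service = TableServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]  # placeholder variable name
)
table = service.create_table_if_not_exists("CsvImport")  # placeholder table name

def entities(path):
    with open(path, newline="", encoding="utf-8") as f:
        for i, row in enumerate(csv.DictReader(f)):
            # Every entity needs a PartitionKey and RowKey; the row index
            # serves as a simple RowKey here (adjust to your data).
            yield {"PartitionKey": "csv-import", "RowKey": str(i), **row}

rows = entities("data.csv")  # placeholder path
while True:
    # A table transaction accepts at most 100 operations, all in the same
    # partition, and at most 4 MB total; shrink the batch for very wide rows.
    batch = [("upsert", e) for e in itertools.islice(rows, 100)]
    if not batch:
        break
    table.submit_transaction(batch)
```

Batching this way would keep a 60K-row import down to a few hundred round trips instead of 60K individual inserts.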

1 Answer

"I am importing data in a CSV to Azure Table Storage using the Azure Storage Explorer application"

I can also reproduce the issue with Azure Storage Explorer. As far as I know, that older tool has not been actively maintained for a long time.

It works correctly for me with Microsoft Azure Storage Explorer. Please try Microsoft Azure Storage Explorer instead; it should work as long as the CSV file stays within the Table Storage limits. For more information about Azure Table Storage limits, please refer to the documentation:

- Max size of a table entity: 1 MB
- Max number of properties in a table entity: 252
- Max size of a single table: 500 TB

The following is my test result:

[screenshot: the CSV imports successfully with Microsoft Azure Storage Explorer]

Update 1:

If there is an entity larger than 1 MB, the Microsoft Azure Storage Explorer tool also crashes for me. Please check whether any entity is larger than 1 MB. If so, the CSV file cannot be imported into the Azure storage table, because Table Storage requires each entity to be smaller than 1 MB.

[screenshot: Microsoft Azure Storage Explorer crashing on an entity larger than 1 MB]
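To check programmatically, here is a rough sketch in Python that approximates each row's entity size, counting strings at about 2 bytes per character plus a small per-property overhead (data.csv is a placeholder path):

```python
import csv

LIMIT = 1024 * 1024  # 1 MB per-entity limit

def approx_entity_size(row):
    # Rough estimate: Table Storage stores strings as UTF-16 (about
    # 2 bytes per character) plus a per-property overhead.
    size = 4  # fixed per-entity overhead
    for name, value in row.items():
        size += 8 + 2 * len(name) + 4 + 2 * len(str(value))
    return size

with open("data.csv", newline="", encoding="utf-8") as f:
    for i, row in enumerate(csv.DictReader(f), start=1):
        size = approx_entity_size(row)
        if size > LIMIT:
            print(f"row {i}: ~{size} bytes exceeds the 1 MB entity limit")
```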

We can also send feedback to the tool team if we run into any problems.

[screenshot: sending feedback to the tool team]