
I have a question regarding the media conversion tool for Sitecore.

With this module you can convert media items from a Sitecore database to a hard drive location and back again. But each time I convert some items, it keeps taking additional hard drive space.

So when I convert 3 GB to the hard drive, it adds an additional 3 GB (which seems logical: 6 GB total), but when I convert them back to blob format it adds another 3 GB (9 GB total) instead of overwriting the previous version in the database.

Is there a way to clean up the previous blobs? Because now it is using too much hard drive space.

Thanks in advance.

You can try logging in to the Sitecore shell, opening the Control Panel, choosing Database, and running the Clean Up Databases tool from there. It helped me in some scenarios, but I'm not sure if it will work in yours as well. – Marek Musielak

Yes. Use "Clean Up Databases", and in some instances you need to follow up with a database shrink from SQL Server Management Studio. – Mark Cassidy

1 Answer


Using "Clean Up Databases" should work, but if the size gets too large, as my client's blob table did, the clean up will fail due to either a SQL timeout or because SQL Server uses up all the available locks.

Another solution is to run a script to manually clean up the blobs table. We had this issue previously and Sitecore support was able to provide us with a script to do so:

DECLARE @UsableBlobs table(
    ID uniqueidentifier
);

INSERT INTO @UsableBlobs
SELECT convert(uniqueidentifier, [Value]) AS EmpID
FROM [Fields]
WHERE [Value] != ''
  AND (FieldId = '{40E50ED9-BA07-4702-992E-A912738D32DC}'
    OR FieldId = '{DBBE7D99-1388-4357-BB34-AD71EDF18ED3}');

DELETE FROM [Blobs]
WHERE [BlobId] NOT IN (SELECT * FROM @UsableBlobs);

This basically looks for blobs that are still in use and stores their IDs in a table variable. It then compares the Blobs table against that list and deletes the rows that aren't in it.

In our case, even this was bombing out due to the SQL Server locks problem, so I updated the delete statement to delete top (x) from [Blobs], where x is a number you feel is appropriate. I started at 1,000 and eventually went up to deleting 400,000 records at a time. (Yes, it was that large.)
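The batched delete can be wrapped in a loop so it keeps running until the table is clean. A T-SQL sketch, assuming the @UsableBlobs table variable from the script above has already been populated in the same batch (the batch size here is an arbitrary starting point, not a recommendation):

```sql
-- Delete in chunks to avoid lock escalation and SQL timeouts.
DECLARE @BatchSize int = 1000;

WHILE 1 = 1
BEGIN
    DELETE TOP (@BatchSize) FROM [Blobs]
    WHERE [BlobId] NOT IN (SELECT ID FROM @UsableBlobs);

    -- Stop once a pass deletes nothing.
    IF @@ROWCOUNT = 0
        BREAK;
END
```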

So try the built-in "Clean Up Databases" option first and failing that, try to run the script to manually clean the table.
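Note that deleting the blobs frees space inside the database, but the data file itself still holds on to it; a shrink hands it back to the operating system. A T-SQL sketch, where the database name is a placeholder for your own Sitecore database:

```sql
-- Reclaim the freed space from the data file.
-- 'Sitecore_Master' is a placeholder; use your database name.
DBCC SHRINKDATABASE ('Sitecore_Master');
```

Shrinking fragments indexes, so treat it as a one-off step after a large cleanup rather than something to run regularly.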
