
What I want to do is save thousands of sets of images.

Each image set has a variable number of images. I've read that the HDF5 format doesn't allocate disk space until actual data is written.

So I decided to create an array of shape (number of sets, maximum set size, imgX, imgY).

This array is really huge, so I can't write it all at once. Instead, I wrote code that writes one image at a time. But writing one image at a time doesn't seem to write anything at all.

Below are my code and its output. The plots show that I'm writing actual images, but once I read back from the HDF5 file, I get blank images.

import h5py
import matplotlib.pyplot as plt

with h5py.File("D:\\data_icon\\flaticon\\test.hdf5",'w') as hf:
    flt = hf.create_dataset("flaticon", (2,500,128,128))
    for idx, icon_image in enumerate(pack_image_list[0][:5]):
        flt[0][idx]=icon_image
        plt.subplot(1,10,idx+1)
        plt.imshow(icon_image, cmap=plt.get_cmap('gray'))
        plt.axis('off')
    plt.show()

    for idx, icon_image in enumerate(pack_image_list[0][5:15]):
        flt[0][idx]=icon_image
        plt.subplot(1,10,idx+1)
        plt.imshow(icon_image, cmap=plt.get_cmap('gray'))
        plt.axis('off')
    plt.show()

    for i in range(1,11):
        plt.subplot(1,10,i)
        plt.imshow(flt[0][i], cmap=plt.get_cmap('gray'))
        plt.axis('off')
    plt.show()

Output (screenshot): the images plotted directly from pack_image_list display correctly, but the images read back from flt are blank.


1 Answer


First, is there a reason you aren't writing separate datasets in groups? One of HDF5's strengths is the ability to store many arrays in a structure of named groups and datasets within a single file, as in the sketch below.
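Here is a minimal sketch of that approach. The file name, group name, and example data are assumptions on my part, and I'm guessing each entry of pack_image_list is a list of 128x128 arrays, as your dataset shape suggests:

import h5py
import numpy as np

# Hypothetical example data: two sets with different numbers of 128x128 images.
pack_image_list = [
    [np.random.rand(128, 128) for _ in range(5)],
    [np.random.rand(128, 128) for _ in range(12)],
]

with h5py.File("test_groups.hdf5", "w") as hf:
    icons = hf.create_group("flaticon")
    for set_idx, image_set in enumerate(pack_image_list):
        # One dataset per set, so each set keeps its own length and no padded
        # (n_sets, max_set_size, imgX, imgY) array is needed.
        icons.create_dataset("set_%05d" % set_idx, data=np.stack(image_set))

with h5py.File("test_groups.hdf5", "r") as hf:
    first_set = hf["flaticon/set_00000"][:]  # reads the whole set back as a NumPy array
    print(first_set.shape)                   # (5, 128, 128)

A dataset created with data= takes the shape and dtype of the array you pass in, so no space is wasted padding every set to the maximum size.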

Second, what exactly is the data in pack_image_list?

If you want further help, provide a copy of your input data :-)