I'm currently in the process of automating the conversion of directories of TIFFs (about 47k images per directory) into HDF5 format (until suite2p supports zarr as a backend). The way it's currently being done is through a MATLAB script that uses the built-in HDF5 writer. I'm hoping to make a converter that uses Python, so I can seamlessly convert all the TIFFs to HDF5 once they have been generated (I'm already using Python to run the Docker containers that make the TIFFs).

For some reason, when I run the converter, the size of the data balloons from about 24 GB to 100 GB! I'm thinking it's because the MATLAB code doesn't use compression at all at the moment, most likely because writing the .h5 file with compression enabled takes an enormous amount of time (from 25 minutes without compression to overnight with any level of compression).

```matlab
h5outputfile = 'D:20211011_LHE047_plane1_-419.55_raw.h5';
h5create(h5outputfile, '/2p', "Chunksize", )
% ...
img = imread(fullfile(allfiles(ii).folder, allfiles(ii).name));
h5write(h5outputfile, '/2p', img, start, count)
```

I added the Chunksize argument so you don't have to load the entirety of the file to look at or use it; maybe my Chunksize is bad? Any advice for getting an HDF5 file that's not so big, in a timely manner, using Python?

A reply to the question:

Try the following, which gives me ~400 MB/s uncompressed and ~210 MB/s LZF compressed for a SPIM dataset (only fragments of the snippet survive in this copy):

```python
import h5py

def ...(hdf5file, tifffiles, dataset_name='2p', chunksize=600, compression=None):
    """Write images from TIFF files to chunked dataset in HDF5 file."""
    ...
    # return first image in TIFF file as numpy array
```
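Since the reply's function body did not survive, here is a minimal sketch of the same technique — streaming frames into one growable, chunked, optionally LZF-compressed HDF5 dataset with h5py. The helper name `frames_to_hdf5` and its parameters are illustrative, not the original poster's code, and plain numpy arrays stand in for decoded TIFF frames (a real converter would produce each frame with something like `tifffile.imread`):

```python
import h5py
import numpy as np

def frames_to_hdf5(path, frames, dataset_name='2p', chunk_frames=16,
                   compression='lzf'):
    """Stream 2-D frames into one growable, chunked HDF5 dataset.

    Illustrative sketch: `frames` is any iterable of equally sized 2-D
    numpy arrays; in a real converter each one would come from reading
    a TIFF file (e.g. with tifffile.imread).
    """
    it = iter(frames)
    first = np.asarray(next(it))  # first frame fixes dtype and frame shape
    with h5py.File(path, 'w') as f:
        dset = f.create_dataset(
            dataset_name,
            shape=(1,) + first.shape,              # start with one frame...
            maxshape=(None,) + first.shape,        # ...growable along axis 0
            chunks=(chunk_frames,) + first.shape,  # whole frames per chunk
            dtype=first.dtype,
            compression=compression,  # 'lzf' is fast; None disables it
        )
        dset[0] = first
        for i, frame in enumerate(it, start=1):
            dset.resize(i + 1, axis=0)  # grow by one frame
            dset[i] = frame
```

Chunking along the frame axis means reading one frame touches only one chunk, so the file can be inspected without loading all of it — the same goal as the Chunksize argument in the MATLAB version. LZF trades compression ratio for speed, which is why it is a reasonable middle ground between uncompressed output and gzip's overnight runtimes.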