From sorench at gmail.com  Sat Oct 12 00:56:57 2013
From: sorench at gmail.com (Soren Christensen)
Date: Fri, 11 Oct 2013 21:56:57 -0700
Subject: [MINC-development] hdf attribute size
Message-ID:

Hi,
Has anyone else had problems with the 64 kB limit on attributes in the
MINC header?
I am storing a slice-time matrix of values from the originating DICOM
files in order not to lose information from some key DICOM attributes.
Often I will have thousands of DICOMs going into one MINC file. For
attributes that are uniform across all DICOMs I just write the attribute
once, but for varying attributes I fill in the above slice-time matrix.
So for thousands of files this gets pretty big; AcquisitionDateTime, for
example, takes more than 64 kB to store. I use strings only for header
items.
So I get errors like these from HDF (via miset_attr_values):

HDF5-DIAG: Error detected in HDF5 (1.8.7) thread 0:
  #000: /home/s/CODE/ITK/Modules/ThirdParty/HDF5/src/itkhdf5/src/H5Adeprec.c line 165 in H5Acreate1(): unable to create attribute
    major: Attribute
    minor: Unable to initialize object
  #001: /home/s/CODE/ITK/Modules/ThirdParty/HDF5/src/itkhdf5/src/H5A.c line 496 in H5A_create(): unable to create attribute in object header
    major: Attribute
    minor: Unable to insert object
  #002: /home/s/CODE/ITK/Modules/ThirdParty/HDF5/src/itkhdf5/src/H5Oattribute.c line 346 in H5O_attr_create(): unable to create new attribute in header
    major: Attribute
    minor: Unable to insert object
  #003: /home/s/CODE/ITK/Modules/ThirdParty/HDF5/src/itkhdf5/src/H5Omessage.c line 224 in H5O_msg_append_real(): unable to create new message
    major: Object header
    minor: No space available for allocation
  #004: /home/s/CODE/ITK/Modules/ThirdParty/HDF5/src/itkhdf5/src/H5Omessage.c line 1925 in H5O_msg_alloc(): unable to allocate space for message
    major: Object header
    minor: Unable to initialize object
  #005: /home/s/CODE/ITK/Modules/ThirdParty/HDF5/src/itkhdf5/src/H5Oalloc.c line 1136 in H5O_alloc(): object header message is too large
    major: Object header
    minor: Unable to initialize object
problem with key: "AcquisitionDateTime"

Does anyone know of an easy workaround for this?

It seems one can change the HDF call quite easily to do away with this
64 kB limitation, but I am not sure of any potential penalties. An
optimistic interpretation of the document below says it will be more
efficient and is simply a new feature of HDF 1.8 (for inserting new
header items, I suppose):
http://www.hdfgroup.org/HDF5/doc/UG/13_Attributes.html (see 8.5)
Compatibility with HDF < 1.8 would be lost.

So I suppose one could make an miset_attr_values_large, or somehow
optionally create a dense group? Or perhaps replace the setup with
"dense" groups (see the HDF document), provided it gives appropriate
backwards file compatibility?

I'd like to pursue this when I get more time available, but maybe
someone well versed in HDF could chime in?

Thanks
Soren
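
[For reference: the "dense" attribute storage Soren mentions (section 8.5
of the linked HDF5 1.8 user guide) can be requested per object through
H5Pset_attr_phase_change(), which is roughly what an
miset_attr_values_large-style change inside libminc would presumably boil
down to. A minimal stand-alone sketch in plain HDF5, with nothing
MINC-specific: the file name, the "slice_info" group and the 100 kB
payload are made up for illustration, and a file written this way is no
longer readable by HDF5 libraries older than 1.8.]

#include <stdlib.h>
#include <string.h>
#include "hdf5.h"

int main(void)
{
    hid_t  file, gcpl, grp, tid, sid, aid;
    size_t nbytes = 100 * 1024;              /* deliberately larger than 64 kB */
    char  *value  = malloc(nbytes);

    memset(value, 'x', nbytes - 1);          /* stand-in for the slice-time table */
    value[nbytes - 1] = '\0';

    file = H5Fcreate("big_attr.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);

    /* max_compact = 0 forces every attribute of this group into dense
     * (fractal-heap) storage instead of the object header, which is where
     * the ~64 kB message limit lives. */
    gcpl = H5Pcreate(H5P_GROUP_CREATE);
    H5Pset_attr_phase_change(gcpl, 0, 0);
    grp = H5Gcreate2(file, "slice_info", H5P_DEFAULT, gcpl, H5P_DEFAULT);

    /* One big fixed-length string attribute holding the whole table. */
    tid = H5Tcopy(H5T_C_S1);
    H5Tset_size(tid, nbytes);
    sid = H5Screate(H5S_SCALAR);
    aid = H5Acreate2(grp, "AcquisitionDateTime", tid, sid,
                     H5P_DEFAULT, H5P_DEFAULT);
    H5Awrite(aid, tid, value);

    H5Aclose(aid); H5Sclose(sid); H5Tclose(tid);
    H5Gclose(grp); H5Pclose(gcpl); H5Fclose(file);
    free(value);
    return 0;
}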
From a.janke at gmail.com  Sun Oct 13 16:33:28 2013
From: a.janke at gmail.com (Andrew Janke)
Date: Mon, 14 Oct 2013 06:33:28 +1000
Subject: [MINC-development] hdf attribute size
In-Reply-To:
References:
Message-ID:

Hi Soren,

No, I haven't hit the limit yet, but given the amount you are trying to
stuff in, you should probably be using a data variable for this instead
of a data definition (attribute).

An existing example of how this is done in MINC would be the slice time
position variable for MINC files with irregular spacing. There are
text/string HDF datatypes, so in my mind this would be the best
long-term solution.

a

On 12 October 2013 14:56, Soren Christensen wrote:
> Hi,
> Has anyone else had problems with the 64 kB limit on attributes in the
> MINC header?
> [...]
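
[To make the suggestion concrete: a data variable here is just an HDF5
dataset, and a string datatype lets it hold one text entry per slice, so
the object-header size limit never applies. A minimal sketch, again in
plain HDF5 rather than through the MINC API; the dataset name, the
32-byte timestamp width and the root-level placement are illustrative
only.]

#include <stdio.h>
#include <stdlib.h>
#include "hdf5.h"

#define NSLICES 4000     /* e.g. thousands of DICOM slices  */
#define TSLEN   32       /* fixed width per timestamp entry */

int main(void)
{
    hid_t   file, tid, sid, did;
    hsize_t dims[1] = { NSLICES };
    char  (*stamps)[TSLEN] = calloc(NSLICES, TSLEN);
    int     i;

    /* One fixed-width string per slice, e.g. "20131011 215657.000123". */
    for (i = 0; i < NSLICES; i++)
        snprintf(stamps[i], TSLEN, "20131011 215657.%06d", i);

    file = H5Fcreate("slice_text.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);

    tid = H5Tcopy(H5T_C_S1);
    H5Tset_size(tid, TSLEN);                 /* fixed-length C strings    */
    H5Tset_strpad(tid, H5T_STR_NULLTERM);

    sid = H5Screate_simple(1, dims, NULL);   /* 1-D: one entry per slice  */
    did = H5Dcreate2(file, "AcquisitionDateTime", tid, sid,
                     H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
    H5Dwrite(did, tid, H5S_ALL, H5S_ALL, H5P_DEFAULT, stamps);

    H5Dclose(did); H5Sclose(sid); H5Tclose(tid); H5Fclose(file);
    free(stamps);
    return 0;
}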
From sorench at gmail.com  Mon Oct 14 01:35:35 2013
From: sorench at gmail.com (Soren Christensen)
Date: Sun, 13 Oct 2013 22:35:35 -0700
Subject: [MINC-development] hdf attribute size
In-Reply-To:
References:
Message-ID:

Thanks, I will have a look at that - is that in dcm2mnc?

Soren

On Sun, Oct 13, 2013 at 1:33 PM, Andrew Janke wrote:
> No, I haven't hit the limit yet, but given the amount you are trying to
> stuff in, you should probably be using a data variable for this instead
> of a data definition (attribute).
> [...]

From a.janke at gmail.com  Mon Oct 14 01:42:23 2013
From: a.janke at gmail.com (Andrew Janke)
Date: Mon, 14 Oct 2013 15:42:23 +1000
Subject: [MINC-development] hdf attribute size
In-Reply-To:
References:
Message-ID:

On 14 October 2013 15:35, Soren Christensen wrote:
> Thanks, I will have a look at that - is that in dcm2mnc?

It will be, but this is for a list of values, not text. The same ideas
should apply though. See:

https://github.com/BIC-MNI/minc-tools/blob/master/conversion/dcm2mnc/minc_file.c#L567

and

https://github.com/BIC-MNI/minc-tools/blob/master/conversion/dcm2mnc/minc_file.c#L1227

a
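
[The dcm2mnc code linked above presumably goes through the classic MINC
variable machinery rather than raw HDF5; stripped down to the underlying
calls, the "list of values" version of the same idea is simply a 1-D
numeric dataset with one entry per slice. The dataset name and the
generated values below are illustrative, not what dcm2mnc actually
writes.]

#include <stdlib.h>
#include "hdf5.h"

#define NSLICES 4000

int main(void)
{
    hid_t   file, sid, did;
    hsize_t dims[1] = { NSLICES };
    double *times = malloc(NSLICES * sizeof(double));
    int     i;

    for (i = 0; i < NSLICES; i++)            /* irregular acquisition times */
        times[i] = 0.075 * i + (i % 7) * 1e-4;

    file = H5Fcreate("slice_times.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    sid  = H5Screate_simple(1, dims, NULL);
    did  = H5Dcreate2(file, "acquisition_time", H5T_NATIVE_DOUBLE, sid,
                      H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
    H5Dwrite(did, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, times);

    H5Dclose(did); H5Sclose(sid); H5Fclose(file);
    free(times);
    return 0;
}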
From sorench at gmail.com  Mon Oct 14 02:04:16 2013
From: sorench at gmail.com (Soren Christensen)
Date: Sun, 13 Oct 2013 23:04:16 -0700
Subject: [MINC-development] hdf attribute size
In-Reply-To:
References:
Message-ID:

Thanks for that.
I can use that for the slice times, which are the most important, but I
have a lot of other attributes that also exceed the allowed size. Would
it be a bad idea to make datasets for all the header attributes that you
know of?

Thanks
Soren

On Sun, Oct 13, 2013 at 10:42 PM, Andrew Janke wrote:
> It will be, but this is for a list of values, not text. The same ideas
> should apply though.
> [...]

From a.janke at gmail.com  Mon Oct 14 02:07:12 2013
From: a.janke at gmail.com (Andrew Janke)
Date: Mon, 14 Oct 2013 16:07:12 +1000
Subject: [MINC-development] hdf attribute size
In-Reply-To:
References:
Message-ID:

On 14 October 2013 16:04, Soren Christensen wrote:
> I can use that for the slice times, which are the most important, but I
> have a lot of other attributes that also exceed the allowed size. Would
> it be a bad idea to make datasets for all the header attributes that you
> know of?

No, I think that would be a very good idea!

a
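
[A closing sketch for the archive: a blanket "dataset for every header
field" policy could keep the existing attribute path for small values and
only switch representation when a value would blow the object-header
budget. The helper below is hypothetical - the name put_text_field, the
60000-byte threshold and the flat placement are all made up, and nothing
like this exists in libminc today.]

#include "hdf5.h"

/* Hypothetical helper: keep small text values as ordinary attributes and
 * fall back to a scalar string dataset when the value would exceed the
 * ~64 kB object-header budget.  The 60000-byte threshold is a rough
 * heuristic; the real limit applies to the whole header message,
 * including the attribute name and datatype description. */
static herr_t put_text_field(hid_t loc, const char *name,
                             const char *value, size_t len)
{
    hid_t  tid, sid, id;
    herr_t status = -1;

    tid = H5Tcopy(H5T_C_S1);
    H5Tset_size(tid, len);
    sid = H5Screate(H5S_SCALAR);

    if (len < 60000) {
        /* Small enough: store it the usual way, as an attribute. */
        id = H5Acreate2(loc, name, tid, sid, H5P_DEFAULT, H5P_DEFAULT);
        if (id >= 0) {
            status = H5Awrite(id, tid, value);
            H5Aclose(id);
        }
    } else {
        /* Too big for the object header: store it as a scalar dataset
         * holding one fixed-length string of 'len' bytes. */
        id = H5Dcreate2(loc, name, tid, sid, H5P_DEFAULT, H5P_DEFAULT,
                        H5P_DEFAULT);
        if (id >= 0) {
            status = H5Dwrite(id, tid, H5S_ALL, H5S_ALL, H5P_DEFAULT, value);
            H5Dclose(id);
        }
    }

    H5Sclose(sid);
    H5Tclose(tid);
    return status;
}

[A real version would presumably sit behind miset_attr_values(), need a
matching fallback on the read side, and settle where such datasets live
relative to the existing MINC 2 group layout.]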