File python-hdf5storage.changes of Package python-hdf5storage

-------------------------------------------------------------------
Thu May 24 17:34:42 UTC 2018 - toddrme2178@gmail.com

- Spec file cleanups

-------------------------------------------------------------------
Sun Jun 11 06:17:53 UTC 2017 - toddrme2178@gmail.com

- Implement single-spec version
- Fix source URL
- Update to version 0.1.14.
  + Bugfix release that also added a couple features.
    * Issue #45. Fixed syntax errors in unicode strings for Python 3.0 to 3.2.
    * Issues #44 and #47. Fixed bugs in testing of conversion and storage of string types.
    * Issue #46. Fixed raising of RuntimeWarnings in tests due to signalling NaNs.
    * Added requirements files for building documentation and running tests.
    * Made it so that Matlab compatibility tests are skipped if Matlab is not found, instead of raising errors.
- Update to version 0.1.13.
  + Bugfix release fixing the following bug.
    * Issue #36. Fixed bugs in writing int and long to HDF5 and their tests on 32 bit systems.
- Update to version 0.1.12. 
  + Bugfix release fixing the following bugs. In addition, the copyright years were updated and copyright notices were added to the Matlab files used for testing.
    *   Issue #32. Fixed transposing before reshaping np.ndarray when reading from HDF5 files where python metadata was stored but not Matlab metadata.
    *   Issue #33. Fixed the loss of the number of characters when reading empty numpy string arrays.
    *   Issue #34. Fixed a conversion error when np.chararray are written with Matlab metadata.
- Update to version 0.1.11. 
  + Bugfix release fixing the following.
    *   Issue #30. Fixed loadmat not opening files in read mode.
- Update to version 0.1.10. 
  + Minor feature/performance fix release doing the following.
    *   Issue #29. Added writes and reads functions to write and read more than one piece of data at a time and made savemat and loadmat use them to increase performance. Previously, the HDF5 file was being opened and closed for each piece of data, which impacted performance, especially for large files.
- Update to version 0.1.9. 
  + Bugfix and minor feature release doing the following.
    *   Issue #23. Fixed bug where a structured np.ndarray with a field name of 'O' could never be written as an HDF5 COMPOUND Dataset (falsely thought a field’s dtype was object).
    *   Issue #6. Added optional data compression and the storage of data checksums. Controlled by several new options.
- Update to version 0.1.8. 
  + Bugfix release fixing the following two bugs.
    *   Issue #21. Fixed bug where the 'MATLAB_class' Attribute was not set on dict types when writing MATLAB metadata.
    *   Issue #22. Fixed bug where null characters ('\x00') and forward slashes ('/') were allowed in dict keys and in the field names of structured np.ndarray (except that forward slashes are allowed when the structured_numpy_ndarray_as_struct option is not set, which is the case when the matlab_compatible option is not set). These characters cause problems for the h5py package and the HDF5 library, so a NotImplementedError is now raised in these cases.
- Update to version 0.1.7. 
  + Bugfix release with an added compatibility option and some added test code. Did the following.
    *   Fixed an issue reading variables larger than 2 GB in MATLAB MAT v7.3 files when no explicit variable names to read are given to hdf5storage.loadmat. Fix also reduces memory consumption and processing time a little bit by removing an unneeded memory copy.
    *   Options will now accept, and ignore, any additional keyword arguments it does not support, so as to remain API compatible with future package versions that add options.
    *   Added tests for reading data that has been compressed or had other HDF5 filters applied.
- Update to version 0.1.6. 
  + Bugfix release fixing a bug with determining the maximum size of a Python 2.x int on a 32-bit system.
- Update to version 0.1.5. 
  + Bugfix release fixing the following bugs.
    *   Fixed bug where an int could be stored that is too big to fit into an int when read back in Python 2.x; such a value is now converted to a long when read.
    *   Fixed a bug where an int or long that is too big to fit into an np.int64 raised the wrong exception.
    *   Fixed bug where field names for structured np.ndarray with non-ASCII characters (assumed to be UTF-8 encoded in Python 2.x) could not be read or written properly.
    *   Fixed bug where np.bytes_ with non-ASCII characters were converted incorrectly to UTF-16 when that option is set (set implicitly when doing MATLAB compatibility). Now, a NotImplementedError is raised in that case.
- Update to version 0.1.4. 
  + Bugfix release fixing the following bugs. Thanks to mrdomino for writing the bug fixes.
    *   Fixed bug where dtype was used as a keyword parameter of np.ndarray.astype when it is actually a positional argument.
    *   Fixed error caused by h5py.__version__ being absent on Ubuntu 12.04.
- Update to version 0.1.3. 
  + Bugfix release fixing the following bug.
    *   Fixed broken ability to correctly read and write an empty structured np.ndarray (one that has fields).
- Update to version 0.1.2. 
  + Bugfix release fixing the following bugs.
    *   Removed mistaken support for np.float16 for h5py versions before 2.2, since support for it was only introduced in h5py 2.2.
    *   Structured np.ndarray where one or more fields is of the 'object' dtype can now be written without an error when the structured_numpy_ndarray_as_struct option is not set. They are written as an HDF5 Group, as if the option was set.
    *   Support for the 'MATLAB_fields' Attribute for data types that are structures in MATLAB has been added for when the version of the h5py package being used is 2.3 or greater. Support is still missing for earlier versions (this package requires a minimum version of 2.1).
    *   The check for non-unicode string keys (str in Python 3 and unicode in Python 2) in dict types is done right before any changes are made to the HDF5 file, instead of in the middle, so that no changes are applied if an invalid key is present.
    *   The HDF5 userblock is now set with the proper metadata for MATLAB support at the very beginning of writing data to an HDF5 file instead of at the end, so that even if the write crashes, the file will still be a valid MATLAB file.

-------------------------------------------------------------------
Fri May 30 13:26:15 UTC 2014 - toddrme2178@gmail.com

- Initial version