"""Downsample using :func:`scipy.ndimage.zoom`."""
"""
Downsample a 3D volume using Dask and scipy.ndimage.zoom.
This method performs multi-scale downsampling on a 3D dataset, generating an image pyramid. The data is processed in chunks using Dask.
Args:
base (dask.array): The 3D array (volume) to be downsampled. Must be a Dask array for chunked processing.
Returns:
list of dask.array: A list of downsampled volumes, where each element represents a different scale. The first element corresponds to the original resolution, and subsequent elements represent progressively downsampled versions.
The downsampling process works scale by scale, using the following steps:
- For each scale, the array is resized by a downscale factor computed from the current scale level.
- Interpolation is performed with `scipy.ndimage.zoom`, with chunk-wise processing handled by Dask's `map_blocks` (see the sketch below).
- The output is rechunked to match the input volume's original chunk size.
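A minimal sketch of this per-scale step, assuming a zoom factor of `1 / downsample_rate**level` for pyramid level `level`; the helper name and its exact signature are illustrative, not the actual implementation:
```python
import dask.array as da
from scipy import ndimage

def downsample_scale(base: da.Array, factor: float, order: int = 1) -> da.Array:
    """Illustrative chunk-wise downsampling of one pyramid level."""
    # scipy.ndimage.zoom produces int(round(n * factor)) samples per axis,
    # so precompute matching output chunk sizes for Dask's metadata
    out_chunks = tuple(
        tuple(int(round(c * factor)) for c in dim_chunks)
        for dim_chunks in base.chunks
    )
    scaled = base.map_blocks(
        ndimage.zoom,       # interpolation applied independently to every chunk
        zoom=factor,
        order=order,        # 0 = nearest neighbour, 1 = linear
        dtype=base.dtype,
        chunks=out_chunks,
    )
    # bring the result back to the chunk layout of the input volume
    return scaled.rechunk(base.chunksize)
```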
Export 3D image data to OME-Zarr format with pyramidal downsampling.
This function generates a multi-scale OME-Zarr representation of the input data, which is commonly used for large imaging datasets. The number of downsampled scales is calculated automatically so that the smallest scale fits within the specified `chunk_size`.
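The exact rule for the number of scales is internal to the function; a rough sketch of the idea (the helper below is hypothetical, not part of the API):
```python
def n_scales(shape, chunk_size=256, downsample_rate=2):
    """Hypothetical: count pyramid levels until the largest axis fits in one chunk."""
    largest = max(shape)
    levels = 1  # the original, full-resolution level
    while largest > chunk_size:
        largest /= downsample_rate
        levels += 1
    return levels

# a (1000, 1000, 1000) volume with the defaults gives levels of
# size 1000, 500 and 250 along the largest axis
print(n_scales((1000, 1000, 1000)))  # 3
```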
Args:
path (str): The directory where the OME-Zarr data will be stored.
data (np.ndarray or dask.array): The 3D image data to be exported. Supports both NumPy and Dask arrays.
chunk_size (int, optional): The size of the chunks for storing data. This affects both the original data and the downsampled scales. Defaults to 256.
downsample_rate (int, optional): The factor by which to downsample the data for each scale. Must be greater than 1. Defaults to 2.
order (int, optional): The interpolation order to use when downsampling. Defaults to 1 (linear). Use 0 for faster nearest-neighbor interpolation.
replace (bool, optional): Whether to replace the existing directory if it already exists. Defaults to False.
method (str, optional): The method used for downsampling. If set to "dask", Dask arrays are used for chunking and downsampling. Defaults to "scaleZYX".
progress_bar (bool, optional): Whether to display a progress bar during export. Defaults to True.
progress_bar_repeat_time (str or int, optional): The repeat interval (in seconds) for updating the progress bar. Defaults to "auto".
Raises:
ValueError: If the directory already exists and `replace` is False.
ValueError: If `downsample_rate` is less than or equal to 1.
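A usage sketch based on the parameters above. The exporter's actual name and import path are not shown in this docstring, so `export_ome_zarr` here is a placeholder for the documented function:
```python
import numpy as np

# synthetic uint8 volume standing in for real image data
vol = np.random.randint(0, 256, size=(500, 400, 300), dtype=np.uint8)

# `export_ome_zarr` is a placeholder name for the function documented above
export_ome_zarr(
    path="volume.zarr",
    data=vol,
    chunk_size=256,        # chunk size used for every scale
    downsample_rate=2,     # halve each axis per pyramid level
    order=1,               # linear interpolation (use 0 for nearest neighbour)
    replace=True,          # overwrite the directory if it already exists
    progress_bar=True,
)
```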