diff --git a/.gitignore b/.gitignore
index dda45d250..e1caa7aa9 100644
--- a/.gitignore
+++ b/.gitignore
@@ -142,3 +142,5 @@ examples/data
 # pixi environments
 .pixi
+/imod/tests/mydask.png
+/imod/tests/*_report.xml
diff --git a/docs/api/changelog.rst b/docs/api/changelog.rst
index 991bc3258..86574b81c 100644
--- a/docs/api/changelog.rst
+++ b/docs/api/changelog.rst
@@ -6,8 +6,85 @@
 All notable changes to this project will be documented in this file.
 
 The format is based on `Keep a Changelog`_, and this project adheres to
 `Semantic Versioning`_.
 
-0.17.2
-------
+
+[Unreleased - feature branch]
+-----------------------------
+
+Fixed
+~~~~~
+- Multiple ``HorizontalFlowBarrier`` objects attached to
+  :class:`imod.mf6.GroundwaterFlowModel` are merged into a single horizontal
+  flow barrier for MODFLOW 6.
+- Bug where an error would be thrown when barriers in a ``HorizontalFlowBarrier``
+  would be snapped to the same cell edge. These are now summed.
+- Improved performance of validation upon Package initialization.
+- Improved performance of writing ``HorizontalFlowBarrier`` objects.
+
+Changed
+~~~~~~~
+- :class:`imod.mf6.Well` now also validates that the well filter top is above
+  the well filter bottom.
+- :func:`open_projectfile_data` now also imports well filter top and bottom.
+- :class:`imod.mf6.Well` now logs a warning if any wells are removed during writing.
+- :class:`imod.mf6.HorizontalFlowBarrierResistance`,
+  :class:`imod.mf6.HorizontalFlowBarrierMultiplier`, and
+  :class:`imod.mf6.HorizontalFlowBarrierHydraulicCharacteristic` now use
+  vertical Polygons instead of LineStrings as geometry, and the ``"ztop"`` and
+  ``"zbottom"`` variables are not used anymore. See
+  :func:`imod.prepare.linestring_to_square_zpolygons` and
+  :func:`imod.prepare.linestring_to_trapezoid_zpolygons` to generate these
+  polygons.
+- :func:`open_projectfile_data` now returns well data grouped by IPF name,
+  instead of a generic, separate number per entry.
+- :class:`imod.mf6.Well` now supports wells which have a filter with zero
+  length, where ``"screen_top"`` equals ``"screen_bottom"``.
+- :class:`imod.mf6.Well` shares the same default ``minimum_thickness`` as
+  :func:`imod.prepare.assign_wells`, which is 0.05; previously this was 1.0.
+- :func:`imod.prepare.allocate_drn_cells`,
+  :func:`imod.prepare.allocate_ghb_cells`, and
+  :func:`imod.prepare.allocate_riv_cells` now allocate to the first model layer
+  when elevations are above the model top for all methods in
+  :func:`imod.prepare.ALLOCATION_OPTION`.
+- :meth:`imod.mf6.Well.to_mf6_pkg` got a new argument,
+  ``strict_well_validation``, which controls the behavior when wells are
+  removed entirely during their assignment to layers. This replaces the
+  ``is_partitioned`` argument.
+
+Added
+~~~~~
+
+- :meth:`imod.mf6.Modflow6Simulation.from_imod5_data` to import iMOD5 data
+  loaded with :func:`imod.formats.prj.open_projectfile_data` as a MODFLOW 6
+  simulation.
+- :func:`imod.prepare.linestring_to_square_zpolygons` and
+  :func:`imod.prepare.linestring_to_trapezoid_zpolygons` to generate vertical
+  polygons that can be used to specify horizontal flow barriers, specifically:
+  :class:`imod.mf6.HorizontalFlowBarrierResistance`,
+  :class:`imod.mf6.HorizontalFlowBarrierMultiplier`, and
+  :class:`imod.mf6.HorizontalFlowBarrierHydraulicCharacteristic`.
+- :class:`imod.mf6.LayeredWell` to specify wells directly to layers instead of
+  assigning them with filter depths.
+- :func:`imod.prepare.cleanup_drn`, :func:`imod.prepare.cleanup_ghb`,
+  :func:`imod.prepare.cleanup_riv`, :func:`imod.prepare.cleanup_wel`.
+  These are utility functions to clean up drainage, general head boundaries,
+  rivers, and wells, respectively.
+- :meth:`imod.mf6.Drainage.cleanup`,
+  :meth:`imod.mf6.GeneralHeadBoundary.cleanup`, :meth:`imod.mf6.River.cleanup`,
+  :meth:`imod.mf6.Well.cleanup` convenience methods to call the corresponding
+  cleanup utility functions with the appropriate arguments.
+
+
+Removed
+~~~~~~~
+
+- :func:`imod.formats.prj.convert_to_disv` has been removed. This functionality
+  has been replaced by :meth:`imod.mf6.Modflow6Simulation.from_imod5_data`. To
+  convert a structured simulation to an unstructured simulation, call
+  :meth:`imod.mf6.Modflow6Simulation.regrid_like`.
+
+
+[0.17.2] - 2024-09-17
+---------------------
 
 Fixed
 ~~~~~
@@ -51,6 +128,16 @@ Changed
   :class:`imod.mf6.regrid.RiverRegridMethod`,
   :class:`imod.mf6.regrid.SpecificStorageRegridMethod`,
   :class:`imod.mf6.regrid.StorageCoefficientRegridMethod`.
+- Renamed the ``imod.mf6.LayeredHorizontalFlowBarrier`` classes to
+  :class:`imod.mf6.SingleLayerHorizontalFlowBarrierResistance`,
+  :class:`imod.mf6.SingleLayerHorizontalFlowBarrierHydraulicCharacteristic`, and
+  :class:`imod.mf6.SingleLayerHorizontalFlowBarrierMultiplier`.
+
+Fixed
+~~~~~
+- :func:`imod.formats.prj.open_projectfile_data` now reports the path to a
+  faulty IPF or IDF file in the error message.
+
@@ -292,9 +379,12 @@ Added
 - Added Python 3.11 support.
 - The GWF-GWF exchange options are derived from user created packages (NPF,
   OC) and set automatically.
-- Added the ``simulation_start_time`` and ``time_unit`` arguments. To the ``Modflow6Simulation.open_`` methods, and ``imod.mf6.out.open_`` functions. This converts the ``"time"`` coordinate to datetimes.
-- added :meth:`imod.mf6.Modflow6Simulation.mask_all_models` to apply a mask to all models under a simulation,
-  provided the simulation is not split and the models use the same discretization.
+- Added the ``simulation_start_time`` and ``time_unit`` arguments to the
+  ``Modflow6Simulation.open_`` methods and ``imod.mf6.out.open_`` functions.
+  This converts the ``"time"`` coordinate to datetimes.
+- Added :meth:`imod.mf6.Modflow6Simulation.mask_all_models` to apply a mask to
+  all models under a simulation, provided the simulation is not split and the
+  models use the same discretization.
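As a rough sketch of the polygon-based barrier input described in the changelog above: the call below assumes that ``linestring_to_square_zpolygons`` takes the vertex x and y coordinates of a barrier trace plus one ztop/zbottom value per segment and returns one vertical polygon per segment; the argument names and values are illustrative, so check the function's docstring before use.

.. code-block:: python

    # Sketch: a barrier of two segments built from a three-vertex trace.
    # Argument names (barrier_x, barrier_y, barrier_ztop, barrier_zbottom)
    # are assumed; verify against linestring_to_square_zpolygons' docstring.
    import geopandas as gpd

    import imod
    from imod.prepare import linestring_to_square_zpolygons

    barrier_x = [-10.0, 0.0, 10.0]
    barrier_y = [10.0, 0.0, -10.0]
    barrier_ztop = [10.0, 10.0]       # one value per segment
    barrier_zbottom = [-10.0, -10.0]  # one value per segment

    polygons = linestring_to_square_zpolygons(
        barrier_x, barrier_y, barrier_ztop, barrier_zbottom
    )
    # The vertical polygons carry the z-extent themselves, so separate
    # "ztop"/"zbottom" columns are no longer needed: only the geometry and
    # the resistance remain.
    geometry = gpd.GeoDataFrame(geometry=polygons, data={"resistance": [1e3, 1e3]})
    hfb = imod.mf6.HorizontalFlowBarrierResistance(geometry)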
Changed diff --git a/docs/api/io.rst b/docs/api/io.rst index 69a12d91b..2d10b3bce 100644 --- a/docs/api/io.rst +++ b/docs/api/io.rst @@ -29,4 +29,3 @@ Input/output prj.read_projectfile prj.open_projectfile_data prj.read_timfile - prj.convert_to_disv diff --git a/docs/api/mf6.rst b/docs/api/mf6.rst index db0b31292..aca682c48 100644 --- a/docs/api/mf6.rst +++ b/docs/api/mf6.rst @@ -70,19 +70,38 @@ Flow Packages Buoyancy ConstantHead Drainage + Drainage.mask + Drainage.regrid_like + Drainage.cleanup Evapotranspiration GeneralHeadBoundary + GeneralHeadBoundary.mask + GeneralHeadBoundary.regrid_like + GeneralHeadBoundary.cleanup HorizontalFlowBarrierHydraulicCharacteristic HorizontalFlowBarrierMultiplier HorizontalFlowBarrierResistance + LayeredWell + LayeredWell.from_imod5_data + LayeredWell.mask + LayeredWell.regrid_like + LayeredWell.to_mf6_pkg InitialConditions NodePropertyFlow Recharge River + River.mask + River.regrid_like + River.cleanup SpecificStorage StorageCoefficient UnsaturatedZoneFlow Well + Well.cleanup + Well.from_imod5_data + Well.mask + Well.regrid_like + Well.to_mf6_pkg WellDisStructured WellDisVertices diff --git a/docs/api/prepare.rst b/docs/api/prepare.rst index 47641986f..74c4aa75f 100644 --- a/docs/api/prepare.rst +++ b/docs/api/prepare.rst @@ -25,6 +25,9 @@ Prepare model input zonal_aggregate_polygons zonal_aggregate_raster + linestring_to_square_zpolygons + linestring_to_trapezoid_zpolygons + assign_wells get_lower_active_grid_cells @@ -44,3 +47,8 @@ Prepare model input distribute_drn_conductance distribute_ghb_conductance distribute_riv_conductance + + cleanup_drn + cleanup_ghb + cleanup_riv + cleanup_wel diff --git a/examples/imod5-backwards-compatibility/imod5_conversion.py b/examples/imod5-backwards-compatibility/imod5_conversion.py deleted file mode 100644 index 88000dbb8..000000000 --- a/examples/imod5-backwards-compatibility/imod5_conversion.py +++ /dev/null @@ -1,148 +0,0 @@ -""" -This is an example of how to convert an existing iMOD5 model into an -unstructured Modflow 6 model. For this we'll use the ``convert_to_disv`` -function, which is still an experimental feature. In this example, we -have to work around the following issues in the converter: - -1. It expects no layer to be assigned for the rch and riv package. - In the example iMOD5 model, a layer is assigned to the rch and riv package. -2. The BarycentricInterpolator, used to compute starting heads, - introduced nans at the edges, whereas OverlapRegridder used to find the active - cells, suffers from no such thing. -3. CHDs are separated into different systems for each layer -4. Due to a bug with broadcasting n_times too many wells are generated: - 94 times & 94 indices. - -""" - -# %% -# Imports -# ------- -import numpy as np -import xugrid as xu - -import imod - -# %% -# Read data -# --------- -# -# For this example we'll get our data from the data shipped with iMOD Python. -# To read your own iMOD5 model, you can call -# :doc:`/api/generated/io/imod.formats.prj.open_projectfile_data` -temp_dir = imod.util.temporary_directory() - -data_prj, repeat_stress = imod.data.imod5_projectfile_data(temp_dir) - -data_prj - -# %% -# Cleanup -# ------- -# -# Remove layers -# TODO: If layer assigned, do not assign at depth? 
- -data_prj["rch"]["rate"] = data_prj["rch"]["rate"].sel(layer=1) -data_prj["drn-1"]["conductance"] = data_prj["drn-1"]["conductance"].sel(layer=1) -data_prj["drn-1"]["elevation"] = data_prj["drn-1"]["elevation"].sel(layer=1) - -# %% -# Target grid -# ----------- -# -# The rch rate is defined on a coarse grid, -# so we use this grid to make a lightweight example. - -dummy = data_prj["rch"]["rate"] -dummy.load() - -target = xu.UgridDataArray.from_structured(dummy) -triangular_grid = target.grid.triangulate() -voronoi_grid = triangular_grid.tesselate_centroidal_voronoi() - -voronoi_grid.plot() - -# %% -# Convert -# ------- -# -# We can convert the iMOD5 model to a Modflow6 model on the unstructured grid -# with the following function: - -mf6_model = imod.prj.convert_to_disv(data_prj, voronoi_grid) - -mf6_model - -# %% -# Cleanup -# ------- -# -# Clean starting head. Due to regridding, there is an empty edge at the -# bottom grid in starting head. - -edge = np.isnan(mf6_model["shd"]["start"].sel(layer=1)) -edge = edge & (mf6_model["disv"]["idomain"] == 1) -shd_mean_layer = mf6_model["shd"]["start"].mean(dim="mesh2d_nFaces") -mf6_model["shd"]["start"] = mf6_model["shd"]["start"].where(~edge, shd_mean_layer) - -# For some reason, all wells were broadcasted n_time times to index, -# resulting in 94 duplicate wells in a single cells -# Select 1, to reduce this. -# Furthermore rates we constant in time, but not along index - -for pkgname in ["wel-1", "wel-2"]: - rates = mf6_model[pkgname].dataset.isel(time=[1])["rate"].values - mf6_model[pkgname].dataset = mf6_model[pkgname].dataset.sel(index=[1], drop=False) - # Assign varying rates through to time to dataset - mf6_model[pkgname].dataset["rate"].values = rates.T - -# %% -# Assign to simulation -# -------------------- -# -# A Modflow 6 model is not a complete simulation, we still have to define a -# Modflow6Simulation and have to include some extra information - -mf6_sim = imod.mf6.Modflow6Simulation(name="mf6sim") -mf6_sim["gwf1"] = mf6_model - -# %% -# Set solver -mf6_sim["ims"] = imod.mf6.SolutionPresetModerate(modelnames=["gwf1"]) - -# %% -# Create time discretization, we'll only have to specify the end time. iMOD -# Python will take the other time steps from the stress packages. - -endtime = np.datetime64("2013-04-01T00:00:00.000000000") - -mf6_sim.create_time_discretization(additional_times=[endtime]) - - -# %% -# Write modflow 6 data - -modeldir = temp_dir / "mf6" -mf6_sim.write(directory=modeldir) - -# %% -# Run Modflow 6 simulation - -mf6_sim.run() - -# %% -# Read results from Modflow 6 simulation - -hds = imod.mf6.open_hds( - modeldir / "gwf1" / "gwf1.hds", modeldir / "gwf1" / "disv.disv.grb" -) - -hds.load() - -# %% -# Visualize - -hds.isel(time=-1, layer=4).ugrid.plot() - -# %% diff --git a/examples/mf6/different_ways_to_regrid_models.py b/examples/mf6/different_ways_to_regrid_models.py index af0bbc774..d789766f8 100644 --- a/examples/mf6/different_ways_to_regrid_models.py +++ b/examples/mf6/different_ways_to_regrid_models.py @@ -100,10 +100,10 @@ # because initializing a regridder is costly. 
regridder_types = NodePropertyFlowRegridMethod(k=(RegridderType.CENTROIDLOCATOR,)) -regrid_context = RegridderWeightsCache() +regrid_cache = RegridderWeightsCache() npf_regridded = model["npf"].regrid_like( target_grid=target_grid, - regrid_context=regrid_context, + regrid_cache=regrid_cache, regridder_types=regridder_types, ) new_model["npf"] = npf_regridded diff --git a/examples/user-guide/08-regridding.py b/examples/user-guide/08-regridding.py index 178e6044f..ee81e832c 100644 --- a/examples/user-guide/08-regridding.py +++ b/examples/user-guide/08-regridding.py @@ -191,9 +191,9 @@ # undergo custom regridding at this stage. from imod.mf6.utilities.regrid import RegridderWeightsCache -regrid_context = RegridderWeightsCache() +regrid_cache = RegridderWeightsCache() -regrid_context +regrid_cache # %% # Regrid the recharge package with a custom regridder. In this case we opt @@ -206,7 +206,7 @@ regridded_recharge = original_rch_package.regrid_like( target_grid, - regrid_context=regrid_context, + regrid_cache=regrid_cache, regridder_types=regridder_types, ) diff --git a/imod/formats/ipf.py b/imod/formats/ipf.py index 35cf82c6b..2e1e03f63 100644 --- a/imod/formats/ipf.py +++ b/imod/formats/ipf.py @@ -17,6 +17,7 @@ import pandas as pd import imod +from imod.util.time import to_pandas_datetime_series def _infer_delimwhitespace(line, ncol): @@ -230,15 +231,7 @@ def read_associated(path, kwargs={}): if nrow > 0 and itype == 1: time_column = colnames[0] - len_date = len(df[time_column].iloc[0]) - if len_date == 14: - df[time_column] = pd.to_datetime(df[time_column], format="%Y%m%d%H%M%S") - elif len_date == 8: - df[time_column] = pd.to_datetime(df[time_column], format="%Y%m%d") - else: - raise ValueError( - f"{path.name}: datetime format must be yyyymmddhhmmss or yyyymmdd" - ) + df[time_column] = to_pandas_datetime_series(df[time_column]) return df diff --git a/imod/formats/prj/__init__.py b/imod/formats/prj/__init__.py index f5a1a5831..c2bd90f23 100644 --- a/imod/formats/prj/__init__.py +++ b/imod/formats/prj/__init__.py @@ -1,2 +1 @@ -from imod.formats.prj.disv_conversion import convert_to_disv from imod.formats.prj.prj import open_projectfile_data, read_projectfile, read_timfile diff --git a/imod/formats/prj/disv_conversion.py b/imod/formats/prj/disv_conversion.py deleted file mode 100644 index f6e9c2f29..000000000 --- a/imod/formats/prj/disv_conversion.py +++ /dev/null @@ -1,847 +0,0 @@ -""" -Most of the functionality here attempts to replicate what iMOD does with -project files. -""" - -from __future__ import annotations - -import itertools -import pickle -from collections import Counter -from datetime import datetime -from typing import Dict, List, Optional, Tuple, cast - -import numpy as np -import pandas as pd -import xarray as xr -import xugrid as xu - -import imod -from imod.mf6.model import Modflow6Model -from imod.mf6.utilities.package import get_repeat_stress -from imod.prepare.layer import get_upper_active_grid_cells -from imod.typing import GridDataArray -from imod.util.imports import MissingOptionalModule - -try: - import geopandas as gpd -except ImportError: - gpd = MissingOptionalModule("geopandas") - - -def hash_xy(da: xr.DataArray) -> Tuple[int]: - """ - Create a unique identifier based on the x and y coordinates of a DataArray. - """ - x = hash(pickle.dumps(da["x"].values)) - y = hash(pickle.dumps(da["y"].values)) - return x, y - - -class SingularTargetRegridderWeightsCache: - """ - Create a mapping of (source_coords, regridder_cls) => regridding weights. 
- - Allows re-use of the regridding weights, as computing the weights is the - most costly step. - - The regridder only processes x and y coordinates: we can hash these, - and get a unique identifier. The target is assumed to be constant. - """ - - def __init__(self, projectfile_data, target, cache_size: int): - # Collect the x-y coordinates of all x-y dimensioned DataArrays. - # Determine which regridding method to use. - # Count occurrences of both. - # Create and cache weights of the most common ones. - keys = [] - sources = {} - methods = {} - for pkgdict in projectfile_data.values(): - for variable, da in pkgdict.items(): - xydims = {"x", "y"} - - if isinstance(da, xr.DataArray) and xydims.issubset(da.dims): - # for initial condition, constant head, general head boundary - if variable == "head": - cls = xu.BarycentricInterpolator - method = None - elif variable == "conductance": - cls = xu.RelativeOverlapRegridder - method = "conductance" - else: - cls = xu.OverlapRegridder - method = "mean" - - x, y = hash_xy(da) - key = (x, y, cls) - keys.append(key) - sources[key] = da - methods[key] = method - - counter = Counter(keys) - self.target = target - self.weights = {} - for key, _ in counter.most_common(cache_size): - cls = key[2] - ugrid_source = xu.UgridDataArray.from_structured(sources[key]) - kwargs = {"source": ugrid_source, "target": target} - method = methods[key] - if method is not None: - kwargs["method"] = method - regridder = cls(**kwargs) - self.weights[key] = regridder.weights - - def regrid( - self, - source: xr.DataArray, - method: str = "mean", - original2d: Optional[xr.DataArray] = None, - ): - if source.dims[-2:] != ("y", "x"): # So it's a constant - if original2d is None: - raise ValueError("original2d must be provided for constant values") - source = source * xr.ones_like(original2d) - - kwargs = {"target": self.target} - if method == "barycentric": - cls = xu.BarycentricInterpolator - elif method == "conductance": - cls = xu.RelativeOverlapRegridder - kwargs["method"] = method - else: - cls = xu.OverlapRegridder - kwargs["method"] = method - - x, y = hash_xy(source) - key = (x, y, cls) - if key in self.weights: - kwargs["weights"] = self.weights[key] - regridder = cls.from_weights(**kwargs) - # Avoid creation of a UgridDataArray here - dims = source.dims[:-2] - coords = {k: source.coords[k] for k in dims} - facedim = self.target.face_dimension - face_source = xr.DataArray( - source.data.reshape(*source.shape[:-2], -1), - coords=coords, - dims=[*dims, facedim], - name=source.name, - ) - return xu.UgridDataArray( - regridder.regrid_dataarray(face_source, (facedim,)), - regridder._target.ugrid_topology, - ) - else: - ugrid_source = xu.UgridDataArray.from_structured(source) - kwargs["source"] = ugrid_source - regridder = cls(**kwargs) - return regridder.regrid(ugrid_source) - - -def raise_on_layer(value, variable: str): - da = value[variable] - if "layer" in da.dims: - raise ValueError(f"{variable} should not be assigned a layer") - return da - - -def finish(uda): - """ - Set dimension order, and drop empty layers. - """ - facedim = uda.ugrid.grid.face_dimension - dims = ("time", "layer", facedim) - return uda.transpose(*dims, missing_dims="ignore").dropna("layer", how="all") - - -def create_idomain(thickness): - """ - Find cells that should get a passthrough value: IDOMAIN = -1. - - We may find them by forward and back-filling: if they are filled in both, they - contain active cells in both directions, and the value should be set to -1. 
- """ - active = ( - thickness > 0 - ).compute() # TODO larger than a specific value for round off? - ones = xu.ones_like(active, dtype=float).where(active) - passthrough = ones.ffill("layer").notnull() & ones.bfill("layer").notnull() - idomain = ones.combine_first( - xu.full_like(active, -1.0, dtype=float).where(passthrough) - ) - return idomain.fillna(0).astype(int) - - -def create_disv( - cache, - top, - bottom, - ibound, -): - if top.dims == ("layer",): - if ibound.dims != ("layer", "y", "x"): - raise ValueError( - "Either ibound or top must have dimensions (layer, y, x) to " - "derive model extent. Both may not be provided as constants." - ) - top = top * xr.ones_like(ibound) - original2d = ibound.isel(layer=0, drop=True) - else: - original2d = top.isel(layer=0, drop=True) - - if bottom.dims == ("layer",): - bottom = bottom * xr.ones_like(ibound) - - top = top.compute() - bottom = bottom.compute() - disv_top = cache.regrid(top).compute() - disv_bottom = cache.regrid(bottom).compute() - thickness = disv_top - disv_bottom - idomain = create_idomain(thickness) - disv = imod.mf6.VerticesDiscretization( - top=disv_top.sel(layer=1), - bottom=disv_bottom, - idomain=idomain, - ) - active = idomain > 0 - return disv, disv_top, disv_bottom, active, original2d - - -def create_npf( - cache, - k, - vertical_anisotropy, - active, - original2d, -): - disv_k = cache.regrid(k, method="geometric_mean", original2d=original2d).where( - active - ) - k33 = k * vertical_anisotropy - disv_k33 = cache.regrid(k33, method="harmonic_mean", original2d=original2d).where( - active - ) - return imod.mf6.NodePropertyFlow( - icelltype=0, - k=disv_k, - k33=disv_k33, - ) - - -def create_chd( - cache, - model, - key, - value, - ibound, - active, - original2d, - repeat, - **kwargs, -): - head = value["head"] - - if "layer" in head.coords: - layer = head.layer - ibound = ibound.sel(layer=layer) - active = active.sel(layer=layer) - - disv_head = cache.regrid( - head, - method="barycentric", - original2d=original2d, - ) - valid = (ibound < 0) & active - - if not valid.any(): - return - - chd = imod.mf6.ConstantHead(head=disv_head.where(valid)) - if repeat is not None: - chd.dataset["repeat_stress"] = get_repeat_stress(repeat) - model[key] = chd - return - - -def create_drn( - cache, - model, - key, - value, - active, - top, - bottom, - original2d, - repeat, - **kwargs, -): - conductance = raise_on_layer(value, "conductance") - elevation = raise_on_layer(value, "elevation") - - disv_cond = cache.regrid( - conductance, - method="conductance", - original2d=original2d, - ) - disv_elev = cache.regrid(elevation, original2d=original2d) - valid = (disv_cond > 0) & disv_elev.notnull() & active - location = xu.ones_like(active, dtype=float) - location = location.where((disv_elev > bottom) & (disv_elev <= top)).where(valid) - disv_cond = finish(location * disv_cond) - disv_elev = finish(location * disv_elev) - - if disv_cond.isnull().all(): - return - - drn = imod.mf6.Drainage( - elevation=disv_elev, - conductance=disv_cond, - ) - if repeat is not None: - drn.dataset["repeat_stress"] = get_repeat_stress(repeat) - model[key] = drn - return - - -def create_ghb( - cache, - model, - key, - value, - active, - original2d, - repeat, - **kwargs, -): - conductance = value["conductance"] - head = value["head"] - - disv_cond = cache.regrid( - conductance, - method="conductance", - original2d=original2d, - ) - disv_head = cache.regrid( - head, - method="barycentric", - original2d=original2d, - ) - valid = (disv_cond > 0.0) & disv_head.notnull() & 
active - - ghb = imod.mf6.GeneralHeadBoundary( - conductance=disv_cond.where(valid), - head=disv_head.where(valid), - ) - if repeat is not None: - ghb.dataset["repeat_stress"] = get_repeat_stress(repeat) - model[key] = ghb - return - - -def create_riv( - cache, - model, - key, - value, - active, - original2d, - top, - bottom, - repeat, - **kwargs, -): - def assign_to_layer( - conductance, stage, elevation, infiltration_factor, top, bottom, active - ): - """ - Assign river boundary to multiple layers. Distribute the conductance based - on the vertical degree of overlap. - - Parameters - ---------- - conductance - stage: - water stage - elevation: - bottom elevation of the river - infiltration_factor - factor (generally <1) to reduce infiltration conductance compared - to drainage conductance. - top: - layer model top elevation - bottom: - layer model bottom elevation - active: - active or inactive cells (idomain > 0) - """ - valid = conductance > 0.0 - conductance = conductance.where(valid) - stage = stage.where(valid) - elevation = elevation.where(valid) - elevation = elevation.where(elevation <= stage, other=stage) - - # TODO: this removes too much when the stage is higher than the top... - # Instead: just cut through all layers until the bottom elevation. - # Then, assign a transmissivity weighted conductance. - water_top = stage.where(stage <= top) - water_bottom = elevation.where(elevation > bottom) - layer_height = top - bottom - layer_height = layer_height.where(active) # avoid 0 thickness layers - fraction = (water_top - water_bottom) / layer_height - # Set values of 0.0 to 1.0, but do not change NaN values: - fraction = fraction.where(~(fraction == 0.0), other=1.0) - location = xu.ones_like(fraction).where(fraction.notnull() & active) - - layered_conductance = finish(conductance * fraction) - layered_stage = finish(stage * location) - layered_elevation = finish(elevation * location) - infiltration_factor = finish(infiltration_factor * location) - - return ( - layered_conductance, - layered_stage, - layered_elevation, - infiltration_factor, - ) - - conductance = raise_on_layer(value, "conductance") - stage = raise_on_layer(value, "stage") - bottom_elevation = raise_on_layer(value, "bottom_elevation") - infiltration_factor = raise_on_layer(value, "infiltration_factor") - - disv_cond_2d = cache.regrid( - conductance, - method="conductance", - original2d=original2d, - ).compute() - disv_elev_2d = cache.regrid(bottom_elevation, original2d=original2d) - disv_stage_2d = cache.regrid(stage, original2d=original2d) - disv_inff_2d = cache.regrid(infiltration_factor, original2d=original2d) - - disv_cond, disv_stage, disv_elev, disv_inff = assign_to_layer( - conductance=disv_cond_2d, - stage=disv_stage_2d, - elevation=disv_elev_2d, - infiltration_factor=disv_inff_2d, - top=top, - bottom=bottom, - active=active, - ) - - if disv_cond.isnull().all(): - return - - # The infiltration factor may be 0. In that case, we need only create a DRN - # package. 
- drn = imod.mf6.Drainage( - conductance=(1.0 - disv_inff) * disv_cond, - elevation=disv_stage, - ) - if repeat is not None: - drn.dataset["repeat_stress"] = get_repeat_stress(repeat) - model[f"{key}-drn"] = drn - - riv_cond = disv_cond * disv_inff - riv_valid = riv_cond > 0.0 - if not riv_valid.any(): - return - - riv = imod.mf6.River( - stage=disv_stage.where(riv_valid), - conductance=riv_cond.where(riv_valid), - bottom_elevation=disv_elev.where(riv_valid), - ) - if repeat is not None: - riv.dataset["repeat_stress"] = get_repeat_stress(repeat) - model[key] = riv - - return - - -def create_rch( - cache, - model, - key, - value, - active, - original2d, - repeat, - **kwargs, -): - rate = raise_on_layer(value, "rate") * 0.001 - disv_rate = cache.regrid(rate, original2d=original2d).where(active) - # Find highest active layer - location = get_upper_active_grid_cells(active) - disv_rate = finish(disv_rate.where(location)) - - # Skip if there's no data - if disv_rate.isnull().all(): - return - - rch = imod.mf6.Recharge(rate=disv_rate) - if repeat is not None: - rch.dataset["repeat_stress"] = get_repeat_stress(repeat) - model[key] = rch - return - - -def create_evt( - cache, - model, - key, - value, - active, - original2d, - repeat, - **kwargs, -): - surface = raise_on_layer(value, "surface") - rate = raise_on_layer(value, "rate") * 0.001 - depth = raise_on_layer(value, "depth") - - # Find highest active layer - highest = active["layer"] == active["layer"].where(active).min() - location = highest.where(highest) - - disv_surface = cache.regrid(surface, original2d=original2d).where(active) - disv_surface = finish(disv_surface * location) - - disv_rate = cache.regrid(rate, original2d=original2d).where(active) - disv_rate = finish(disv_rate * location) - - disv_depth = cache.regrid(depth, original2d=original2d).where(active) - disv_depth = finish(disv_depth * location) - - # At depth 1.0, the rate is 0.0. 
- proportion_depth = xu.ones_like(disv_surface).where(disv_surface.notnull()) - proportion_rate = xu.zeros_like(disv_surface).where(disv_surface.notnull()) - - evt = imod.mf6.Evapotranspiration( - surface=disv_surface, - rate=disv_rate, - depth=disv_depth, - proportion_rate=proportion_rate, - proportion_depth=proportion_depth, - ) - if repeat is not None: - evt.dataset["repeat_stress"] = get_repeat_stress(repeat) - model[key] = evt - return - - -def create_sto( - cache, - storage_coefficient, - active, - original2d, - transient, -): - if storage_coefficient is None: - disv_coef = 0.0 - else: - disv_coef = cache.regrid(storage_coefficient, original2d=original2d).where( - active - ) - - sto = imod.mf6.StorageCoefficient( - storage_coefficient=disv_coef, - specific_yield=0.0, - transient=transient, - convertible=0, - ) - return sto - - -def create_wel( - cache, - model, - key, - value, - active, - top, - bottom, - k, - repeat, - **kwargs, -): - target = cache.target - dataframe = value["dataframe"] - layer = value["layer"] - - if layer <= 0: - dataframe = imod.prepare.assign_wells( - wells=dataframe, - top=top, - bottom=bottom, - k=k, - minimum_thickness=0.01, - minimum_k=1.0, - ) - else: - dataframe["index"] = np.arange(len(dataframe)) - dataframe["layer"] = layer - - first = dataframe.groupby("index").first() - well_layer = first["layer"].values - xy = np.column_stack([first["x"], first["y"]]) - cell2d = target.locate_points(xy) - valid = (cell2d >= 0) & active.values[well_layer - 1, cell2d] - - cell2d = cell2d[valid] + 1 - # Skip if no wells are located inside cells - if not valid.any(): - return - - if "time" in dataframe.columns: - # Ensure the well data is rectangular. - time = np.unique(dataframe["time"].values) - dataframe = dataframe.set_index("time") - # First ffill, then bfill! - dfs = [df.reindex(time).ffill().bfill() for _, df in dataframe.groupby("index")] - rate = ( - pd.concat(dfs) - .reset_index() - .set_index(["time", "index"])["rate"] - .to_xarray() - ) - else: - rate = xr.DataArray( - dataframe["rate"], coords={"index": dataframe["index"]}, dims=["index"] - ) - - # Don't forget to remove the out-of-bounds points. 
- rate = rate.where(xr.DataArray(valid, dims=["index"]), drop=True) - - wel = imod.mf6.WellDisVertices( - layer=well_layer, - cell2d=cell2d, - rate=rate, - ) - if repeat is not None: - wel.dataset["repeat_stress"] = get_repeat_stress(repeat) - - model[key] = wel - return - - -def create_ic(cache, model, key, value, active, **kwargs): - start = value["head"] - disv_start = cache.regrid(source=start, method="barycentric").where(active) - model[key] = imod.mf6.InitialConditions(start=disv_start) - return - - -def create_hfb( - model: Modflow6Model, - key: str, - value: Dict, - top: GridDataArray, - bottom: GridDataArray, - **kwargs, -) -> None: - dataframe = value["geodataframe"] - - barrier_gdf = gpd.GeoDataFrame( - geometry=dataframe["geometry"].values, - data={ - "resistance": dataframe["resistance"].values, - "ztop": np.ones_like(dataframe["geometry"].values) * top.max().values, - "zbottom": np.ones_like(dataframe["geometry"].values) * bottom.min().values, - }, - ) - - model[key] = imod.mf6.HorizontalFlowBarrierResistance(barrier_gdf) - - -def merge_hfbs( - horizontal_flow_barriers: List[imod.mf6.HorizontalFlowBarrierResistance], -): - datasets = [] - for horizontal_flow_barrier in horizontal_flow_barriers: - datasets.append(horizontal_flow_barrier.dataset) - - combined_dataset = xr.concat(datasets, "index") - combined_dataset.coords["index"] = np.arange(combined_dataset.sizes["index"]) - - combined_dataframe = cast(gpd.GeoDataFrame, combined_dataset.to_dataframe()) - combined_dataframe.drop("print_input", axis=1, inplace=True) # noqa: PD002 - - return imod.mf6.HorizontalFlowBarrierResistance(combined_dataframe) - - -PKG_CONVERSION = { - "chd": create_chd, - "drn": create_drn, - "evt": create_evt, - "ghb": create_ghb, - "hfb": create_hfb, - "shd": create_ic, - "rch": create_rch, - "riv": create_riv, - "wel": create_wel, -} - - -def expand_repetitions( - repeat_stress: List[datetime], time_min: datetime, time_max: datetime -) -> Dict[datetime, datetime]: - expanded = {} - for year, date in itertools.product( - range(time_min.year, time_max.year + 1), - repeat_stress, - ): - newdate = date.replace(year=year) - if newdate < time_max: - expanded[newdate] = date - return expanded - - -def convert_to_disv( - projectfile_data, target, time_min=None, time_max=None, repeat_stress=None -): - """ - Convert the contents of a project file to a MODFLOW6 DISV model. - - The ``time_min`` and ``time_max`` are **both** required when - ``repeat_stress`` is given. The entries in the Periods section of the - project file will be expanded to yearly repeats between ``time_min`` and - ``time_max``. - - Additionally, ``time_min`` and ``time_max`` may be used to slice the input - to a specific time domain. - - The returned model is steady-state if none of the packages contain a time - dimension. The model is transient if any of the packages contain a time - dimension. This can be changed by setting the "transient" value in the - storage package of the returned model. Storage coefficient input is - required for a transient model. - - Parameters - ---------- - projectfile_data: dict - Dictionary with the projectfile topics as keys, and the data - as xarray.DataArray, pandas.DataFrame, or geopandas.GeoDataFrame. - target: xu.Ugrid2d - The unstructured target topology. All data is transformed to match this - topology. - time_min: datetime, optional - Minimum starting time of a stress. - Required when ``repeat_stress`` is provided. - time_max: datetime, optional - Maximum starting time of a stress. 
- Required when ``repeat_stress`` is provided. - repeat_stress: dict of dict of string to datetime, optional - This dict contains contains, per topic, the period alias (a string) to - its datetime. - - Returns - ------- - disv_model: imod.mf6.GroundwaterFlowModel - - """ - if repeat_stress is not None: - if time_min is None or time_max is None: - raise ValueError( - "time_min and time_max are required when repeat_stress is given" - ) - - for arg in (time_min, time_max): - if arg is not None and not isinstance(arg, datetime): - raise TypeError( - "time_min and time_max must be datetime.datetime. " - f"Received: {type(arg).__name__}" - ) - - data = projectfile_data.copy() - model = imod.mf6.GroundwaterFlowModel() - - # Setup the regridding weights cache. - weights_cache = SingularTargetRegridderWeightsCache(data, target, cache_size=5) - - # Mandatory packages first. - ibound = data["bnd"]["ibound"].compute() - disv, top, bottom, active, original2d = create_disv( - cache=weights_cache, - top=data["top"]["top"], - bottom=data["bot"]["bottom"], - ibound=ibound, - ) - - npf = create_npf( - cache=weights_cache, - k=data["khv"]["kh"], - vertical_anisotropy=data["kva"]["vertical_anisotropy"], - active=active, - original2d=original2d, - ) - - model["npf"] = npf - model["disv"] = disv - model["oc"] = imod.mf6.OutputControl(save_head="all") - - # Used in other package construction: - k = npf["k"].compute() - new_ibound = weights_cache.regrid(source=ibound, method="minimum").compute() - - # Boundary conditions, one by one. - for key, value in data.items(): - pkg = key.split("-")[0] - convert = PKG_CONVERSION.get(pkg) - # Skip unsupported packages - if convert is None: - continue - - if repeat_stress is None: - repeat = None - else: - repeat = repeat_stress.get(key) - if repeat is not None: - repeat = expand_repetitions(repeat, time_min, time_max) - - try: - # conversion will update model instance - convert( - cache=weights_cache, - model=model, - key=key, - value=value, - ibound=new_ibound, - active=active, - original2d=original2d, - top=top, - bottom=bottom, - k=k, - repeat=repeat, - ) - except Exception as e: - raise type(e)(f"{e}\nduring conversion of {key}") - - # Treat hfb's separately: they must be merged into one, - # as MODFLOW6 only supports a single HFB. 
- hfb_keys = [key for key in model.keys() if key.split("-")[0] == "hfb"] - hfbs = [model.pop(key) for key in hfb_keys] - if hfbs: - model["hfb"] = merge_hfbs(hfbs) - - transient = any("time" in pkg.dataset.dims for pkg in model.values()) - if transient and (time_min is not None or time_max is not None): - model = model.clip_box(time_min=time_min, time_max=time_max) - - sto_entry = data.get("sto") - if sto_entry is None: - if transient: - raise ValueError("storage input is required for a transient run") - storage_coefficient = None - else: - storage_coefficient = sto_entry["storage_coefficient"] - - model["sto"] = create_sto( - cache=weights_cache, - storage_coefficient=storage_coefficient, - active=active, - original2d=original2d, - transient=transient, - ) - - return model diff --git a/imod/formats/prj/prj.py b/imod/formats/prj/prj.py index b71d66f4e..7dc838c2a 100644 --- a/imod/formats/prj/prj.py +++ b/imod/formats/prj/prj.py @@ -3,7 +3,10 @@ """ import shlex +import textwrap from collections import defaultdict +from dataclasses import asdict, dataclass +from dataclasses import field as data_field from datetime import datetime from itertools import chain from os import PathLike @@ -15,6 +18,8 @@ import xarray as xr import imod +import imod.logging +from imod.logging.loglevel import LogLevel FilePath = Union[str, "PathLike[str]"] @@ -519,7 +524,43 @@ def _try_read_with_func(func, path, *args, **kwargs): raise type(e)(f"{e}. Error thrown while opening file: {path}") -def _create_datarray_from_paths(paths: List[str], headers: List[Dict[str, Any]]): +def _get_array_transformation_parameters( + headers: List[Dict[str, Any]], key: str, dim: str +) -> Union[xr.DataArray | float]: + """ + In imod5 prj files one can add linear transformation parameters to transform + the data read from an idf file: we can specify a multiplication factor and a + constant that will be added to the values. The factor and addition + parameters can be can be scalar (if applied to 1 idf), or they can be + xr.DataArrays if the factor and addition are for example layer- or + time-dependent (if both we apply the transformations one at a time) + + Parameters + ---------- + headers: List[Dict[str, Any]] + prj-file lines which we want to import, serialized as a dictionary. + key: str + specifies the name of the transformation parameter in the idf file. + Usually "factor" or "addition" + dim: str + the name of the dimension over which transformation parameters are + expected to differ for the current import. 
Usually "time"or "layer" + """ + if dim in headers[0].keys(): + return xr.DataArray( + data=[header[key] for header in headers], + dims=(dim,), + coords={dim: [header[dim] for header in headers]}, + ) + else: + return headers[0][key] + + +def _create_dataarray_from_paths( + paths: List[str], headers: List[Dict[str, Any]], dim: str +) -> xr.DataArray: + factor = _get_array_transformation_parameters(headers, "factor", dim) + addition = _get_array_transformation_parameters(headers, "addition", dim) da = _try_read_with_func( imod.formats.array_io.reading._load, paths, @@ -527,22 +568,38 @@ def _create_datarray_from_paths(paths: List[str], headers: List[Dict[str, Any]]) _read=imod.idf._read, headers=headers, ) - return da + # Ensure factor and addition do not have more dimensions than da + if isinstance(factor, xr.DataArray): + missing_dims = set(factor.dims) - set(da.dims) + if missing_dims: + factor = factor.isel({d: 0 for d in missing_dims}, drop=True) + addition = addition.isel({d: 0 for d in missing_dims}, drop=True) + + return da * factor + addition -def _create_dataarray_from_values(values: List[float], headers: List[Dict[str, Any]]): + +def _create_dataarray_from_values( + values: List[float], headers: List[Dict[str, Any]], dim: str +): + factor = _get_array_transformation_parameters(headers, "factor", dim) + addition = _get_array_transformation_parameters(headers, "addition", dim) coords = _merge_coords(headers) firstdims = headers[0]["dims"] shape = [len(coord) for coord in coords.values()] da = xr.DataArray(np.reshape(values, shape), dims=firstdims, coords=coords) - return da + return da * factor + addition def _create_dataarray( - paths: List[str], headers: List[Dict[str, Any]], values: List[float] + paths: List[str], headers: List[Dict[str, Any]], values: List[float], dim: str ) -> xr.DataArray: """ Create a DataArray from a list of IDF paths, or from constant values. + + There are mixed cases possible, where some of the layers or stress periods + contain only a single constant value, and the others are specified as IDFs. + In that case, we cannot do a straightforward concatenation. """ values_valid = [] paths_valid = [] @@ -557,54 +614,19 @@ def _create_dataarray( paths_valid.append(path) if paths_valid and values_valid: - dap = _create_datarray_from_paths(paths_valid, headers_paths) - dav = _create_dataarray_from_values(values_valid, headers_values) + # Both lists contain entries: mixed case. 
+ dap = _create_dataarray_from_paths(paths_valid, headers_paths, dim=dim) + dav = _create_dataarray_from_values(values_valid, headers_values, dim=dim) dap.name = "tmp" dav.name = "tmp" da = xr.merge((dap, dav), join="outer")["tmp"] elif paths_valid: - da = _create_datarray_from_paths(paths_valid, headers_paths) + # Only paths provided + da = _create_dataarray_from_paths(paths_valid, headers_paths, dim=dim) elif values_valid: - da = _create_dataarray_from_values(values_valid, headers_values) + # Only scalar values provided + da = _create_dataarray_from_values(values_valid, headers_values, dim=dim) - da = apply_factor_and_addition(headers, da) - return da - - -def apply_factor_and_addition(headers, da): - if not ("layer" in da.coords or "time" in da.dims): - factor = headers[0]["factor"] - addition = headers[0]["addition"] - da = da * factor + addition - elif "layer" in da.coords and "time" not in da.dims: - da = apply_factor_and_addition_per_layer(headers, da) - else: - header_per_time = defaultdict(list) - for time in da.coords["time"].values: - for header in headers: - if np.datetime64(header["time"]) == time: - header_per_time[time].append(header) - - for time in da.coords["time"]: - da.loc[{"time": time}] = apply_factor_and_addition( - header_per_time[np.datetime64(time.values)], - da.sel(time=time, drop=True), - ) - return da - - -def apply_factor_and_addition_per_layer(headers, da): - layer = da.coords["layer"].values - header_per_layer = {} - for header in headers: - if header["layer"] in header_per_layer.keys(): - raise ValueError("error in project file: layer repetition") - header_per_layer[header["layer"]] = header - addition_values = [header_per_layer[lay]["addition"] for lay in layer] - factor_values = [header_per_layer[lay]["factor"] for lay in layer] - addition = xr.DataArray(addition_values, coords={"layer": layer}, dims=("layer")) - factor = xr.DataArray(factor_values, coords={"layer": layer}, dims=("layer",)) - da = da * factor + addition return da @@ -628,7 +650,7 @@ def _open_package_idf( headers.append(header) values.append(value) - das[variable] = _create_dataarray(paths, headers, values) + das[variable] = _create_dataarray(paths, headers, values, dim="layer") return [das] @@ -755,7 +777,7 @@ def _open_boundary_condition_idf( for i, (paths, headers, values) in enumerate( zip(system_paths.values(), system_headers.values(), system_values.values()) ): - das[i][variable] = _create_dataarray(paths, headers, values) + das[i][variable] = _create_dataarray(paths, headers, values, dim="time") repeats = sorted(all_repeats) return das, repeats @@ -779,11 +801,40 @@ def _read_package_gen( return out +@dataclass +class IpfResult: + has_associated: bool = data_field(default_factory=bool) + dataframe: list[pd.DataFrame] = data_field(default_factory=list) + layer: list[int] = data_field(default_factory=list) + time: list[str] = data_field(default_factory=list) + factor: list[float] = data_field(default_factory=list) + addition: list[float] = data_field(default_factory=list) + + def append( + self, + dataframe: pd.DataFrame, + layer: int, + time: str, + factor: float, + addition: float, + ): + self.dataframe.append(dataframe) + self.layer.append(layer) + self.time.append(time) + self.factor.append(factor) + self.addition.append(addition) + + def _read_package_ipf( block_content: Dict[str, Any], periods: Dict[str, datetime] -) -> Tuple[List[Dict[str, Any]], List[datetime]]: - out = [] +) -> Tuple[Dict[str, Dict], List[datetime]]: + out = defaultdict(IpfResult) repeats = [] + + # we will 
store in this set the tuples of (x, y, id, well_top, well_bot) + # which should be unique for each well + imported_wells = {} + for entry in block_content["ipf"]: timestring = entry["time"] layer = entry["layer"] @@ -800,14 +851,16 @@ def _read_package_ipf( ipf_df, indexcol, ext = _try_read_with_func(imod.ipf._read_ipf, path) if indexcol == 0: # No associated files + has_associated = False columns = ("x", "y", "rate") if layer <= 0: df = ipf_df.iloc[:, :5] - columns = columns + ("top", "bottom") + columns = columns + ("filt_top", "filt_bot") else: df = ipf_df.iloc[:, :3] df.columns = columns else: + has_associated = True dfs = [] for row in ipf_df.itertuples(): filename = row[indexcol] @@ -819,21 +872,46 @@ def _read_package_ipf( df_assoc["x"] = row[1] df_assoc["y"] = row[2] df_assoc["id"] = path_assoc.stem - if layer <= 0: - df_assoc["top"] = row[4] - df_assoc["bottom"] = row[5] + df_assoc["filt_top"] = row[4] + df_assoc["filt_bot"] = row[5] + + well_characteristics = ( + row[1], + row[2], + path_assoc.stem, + row[4], + row[5], + ) + if well_characteristics not in imported_wells.keys(): + imported_wells[well_characteristics] = 0 + else: + suffix = imported_wells[well_characteristics] + 1 + imported_wells[well_characteristics] = suffix + df_assoc["id"] = df_assoc["id"] + f"_{suffix}" + + log_message = textwrap.dedent( + f"""A well with the same x, y, id, filter_top and filter_bot was already imported. + This happened at x = {row[1]}, y = { row[2]}, id = {path_assoc.stem} + Now the ID for this new well was appended with the suffix _{suffix}) + """ + ) + + imod.logging.logger.log( + loglevel=LogLevel.WARNING, + message=log_message, + additional_depth=2, + ) + dfs.append(df_assoc) df = pd.concat(dfs, ignore_index=True, sort=False) df["rate"] = df["rate"] * factor + addition - d = { - "dataframe": df, - "layer": layer, - "time": time, - } - out.append(d) + out[path.stem].has_associated = has_associated + out[path.stem].append(df, layer, time, factor, addition) + + out_dict_ls: dict[str, dict] = {key: asdict(o) for key, o in out.items()} repeats = sorted(repeats) - return out, repeats + return out_dict_ls, repeats def read_projectfile(path: FilePath) -> Dict[str, Any]: @@ -969,7 +1047,15 @@ def open_projectfile_data(path: FilePath) -> Dict[str, Any]: data, repeats = _read_package_ipf(block_content, periods) elif key == "(cap)": variables = set(METASWAP_VARS).intersection(block_content.keys()) + # check for optional ipf input + read_ipf = False + if 'path' in block_content['artifical_recharge_layer'][0]: + if 'ipf' in block_content['artifical_recharge_layer'][0]['path'].suffix.lower(): + variables.remove('artifical_recharge_layer') + read_ipf = True data = _open_package_idf(block_content, variables) + if read_ipf: + data[0]['artifical_recharge_dataframe'] = imod.ipf.read(block_content['artifical_recharge_layer'][0]['path']) elif key in ("extra", "(pcg)"): data = [block_content] elif key in KEYS: @@ -986,7 +1072,12 @@ def open_projectfile_data(path: FilePath) -> Dict[str, Any]: raise type(e)(f"{e}. 
Errored while opening/reading data entries for: {key}") strippedkey = key.strip("(").strip(")") - if len(data) > 1: + if strippedkey == "wel": + for key, d in data.items(): + named_key = f"{strippedkey}-{key}" + prj_data[named_key] = d + repeat_stress[named_key] = repeats + elif len(data) > 1: for i, da in enumerate(data): numbered_key = f"{strippedkey}-{i + 1}" prj_data[numbered_key] = da diff --git a/imod/logging/ilogger.py b/imod/logging/ilogger.py index 115035d2f..8297a09ff 100644 --- a/imod/logging/ilogger.py +++ b/imod/logging/ilogger.py @@ -9,7 +9,7 @@ class ILogger: """ @abstractmethod - def debug(self, message: str, additional_depth: int) -> None: + def debug(self, message: str, additional_depth: int = 0) -> None: """ Log message with severity ':attr:`~imod.logging.loglevel.LogLevel.DEBUG`'. @@ -24,7 +24,7 @@ def debug(self, message: str, additional_depth: int) -> None: raise NotImplementedError @abstractmethod - def info(self, message: str, additional_depth: int) -> None: + def info(self, message: str, additional_depth: int = 0) -> None: """ Log message with severity ':attr:`~imod.logging.loglevel.LogLevel.INFO`'. @@ -39,7 +39,7 @@ def info(self, message: str, additional_depth: int) -> None: raise NotImplementedError @abstractmethod - def warning(self, message: str, additional_depth: int) -> None: + def warning(self, message: str, additional_depth: int = 0) -> None: """ Log message with severity ':attr:`~imod.logging.loglevel.LogLevel.WARNING`'. @@ -54,7 +54,7 @@ def warning(self, message: str, additional_depth: int) -> None: raise NotImplementedError @abstractmethod - def error(self, message: str, additional_depth: int) -> None: + def error(self, message: str, additional_depth: int = 0) -> None: """ Log message with severity ':attr:`~imod.logging.loglevel.LogLevel.ERROR`'. @@ -69,7 +69,7 @@ def error(self, message: str, additional_depth: int) -> None: raise NotImplementedError @abstractmethod - def critical(self, message: str, additional_depth: int) -> None: + def critical(self, message: str, additional_depth: int = 0) -> None: """ Log message with severity ':attr:`~imod.logging.loglevel.LogLevel.CRITICAL`'. @@ -83,7 +83,7 @@ def critical(self, message: str, additional_depth: int) -> None: """ raise NotImplementedError - def log(self, loglevel: LogLevel, message: str, additional_depth: int) -> None: + def log(self, loglevel: LogLevel, message: str, additional_depth: int = 0) -> None: """ logs a message with the specified urgency level. 
""" diff --git a/imod/mf6/__init__.py b/imod/mf6/__init__.py index d72b1bdb8..652602432 100644 --- a/imod/mf6/__init__.py +++ b/imod/mf6/__init__.py @@ -21,9 +21,9 @@ HorizontalFlowBarrierHydraulicCharacteristic, HorizontalFlowBarrierMultiplier, HorizontalFlowBarrierResistance, - LayeredHorizontalFlowBarrierHydraulicCharacteristic, - LayeredHorizontalFlowBarrierMultiplier, - LayeredHorizontalFlowBarrierResistance, + SingleLayerHorizontalFlowBarrierHydraulicCharacteristic, + SingleLayerHorizontalFlowBarrierMultiplier, + SingleLayerHorizontalFlowBarrierResistance, ) from imod.mf6.ic import InitialConditions from imod.mf6.ims import ( @@ -49,5 +49,5 @@ from imod.mf6.timedis import TimeDiscretization from imod.mf6.utilities.regrid import RegridderType, RegridderWeightsCache from imod.mf6.uzf import UnsaturatedZoneFlow -from imod.mf6.wel import Well, WellDisStructured, WellDisVertices +from imod.mf6.wel import LayeredWell, Well, WellDisStructured, WellDisVertices from imod.mf6.write_context import WriteContext diff --git a/imod/mf6/chd.py b/imod/mf6/chd.py index 65e1bf3e0..c171a558b 100644 --- a/imod/mf6/chd.py +++ b/imod/mf6/chd.py @@ -1,9 +1,15 @@ +from copy import deepcopy +from typing import Optional + import numpy as np from imod.logging import init_log_decorator +from imod.logging.logging_decorators import standard_log_decorator from imod.mf6.boundary_condition import BoundaryCondition +from imod.mf6.dis import StructuredDiscretization from imod.mf6.interfaces.iregridpackage import IRegridPackage from imod.mf6.regrid.regrid_schemes import ConstantHeadRegridMethod +from imod.mf6.utilities.regrid import RegridderWeightsCache, _regrid_package_data from imod.mf6.validation import BOUNDARY_DIMS_SCHEMA, CONC_DIMS_SCHEMA from imod.schemata import ( AllInsideNoDataSchema, @@ -15,6 +21,7 @@ IndexesSchema, OtherCoordsSchema, ) +from imod.typing import GridDataArray class ConstantHead(BoundaryCondition, IRegridPackage): @@ -141,3 +148,143 @@ def _validate(self, schemata, **kwargs): errors = super()._validate(schemata, **kwargs) return errors + + @classmethod + @standard_log_decorator() + def from_imod5_data( + cls, + key: str, + imod5_data: dict[str, dict[str, GridDataArray]], + target_discretization: StructuredDiscretization, + regridder_types: Optional[ConstantHeadRegridMethod] = None, + regrid_cache: RegridderWeightsCache = RegridderWeightsCache(), + ) -> "ConstantHead": + """ + Construct a ConstantHead-package from iMOD5 data, loaded with the + :func:`imod.formats.prj.open_projectfile_data` function. + + This function can be used if chd packages are defined in the imod5 data. + + If they are not, then imod5 assumed that at all the locations where ibound + = -1 a chd package is active with the starting head of the simulation + as a constant. In that case, use the from_imod5_shd_data function instead + of this one. + + The creation of a chd package from shd data should only be done if no chd + packages at all are present in the imod5_data + + + + Parameters + ---------- + key: str + The key used in the imod5 data dictionary that is used to refer + to the chd package that we want to import. + imod5_data: dict + Dictionary with iMOD5 data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. + target_discretization: StructuredDiscretization package + The grid that should be used for the new package. Does not + need to be identical to one of the input grids. 
+ regridder_types: RegridMethodType, optional + Optional dataclass with regridder types for a specific variable. + Use this to override default regridding methods. + + Returns + ------- + A list of Modflow 6 ConstantHead packages. + """ + return cls._from_head_data( + imod5_data[key]["head"], + imod5_data["bnd"]["ibound"], + target_discretization, + regridder_types, + regrid_cache, + ) + + @classmethod + @standard_log_decorator() + def from_imod5_shd_data( + cls, + imod5_data: dict[str, dict[str, GridDataArray]], + target_discretization: StructuredDiscretization, + regridder_types: Optional[ConstantHeadRegridMethod] = None, + regrid_cache: RegridderWeightsCache = RegridderWeightsCache(), + ) -> "ConstantHead": + """ + Construct a ConstantHead-package from iMOD5 data, loaded with the + :func:`imod.formats.prj.open_projectfile_data` function. + + This function can be used if no chd packages at all are defined in the imod5 data. + + In that case, imod5 assumed that at all the locations where ibound + = -1, a chd package is active with the starting head of the simulation + as a constant. + + So this function creates a single chd package that will be present at all locations where + ibound == -1. The assigned head will be the starting head, specified in the array "shd" + in the imod5 data. + + Parameters + ---------- + imod5_data: dict + Dictionary with iMOD5 data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. + target_discretization: StructuredDiscretization package + The grid that should be used for the new package. Does not + need to be identical to one of the input grids. + regridder_types: ConstantHeadRegridMethod, optional + Optional dataclass with regridder types for a specific variable. + Use this to override default regridding methods. + regrid_cache:Optional RegridderWeightsCache + stores regridder weights for different regridders. Can be used to speed up regridding, + if the same regridders are used several times for regridding different arrays. + + Returns + ------- + A Modflow 6 ConstantHead package. 
+ """ + return cls._from_head_data( + imod5_data["shd"]["head"], + imod5_data["bnd"]["ibound"], + target_discretization, + regridder_types, + regrid_cache, + ) + + @classmethod + def _from_head_data( + cls, + head: GridDataArray, + ibound: GridDataArray, + target_discretization: StructuredDiscretization, + regridder_types: Optional[ConstantHeadRegridMethod] = None, + regrid_cache: RegridderWeightsCache = RegridderWeightsCache(), + ) -> "ConstantHead": + target_idomain = target_discretization.dataset["idomain"] + + if regridder_types is None: + regridder_types = ConstantHead.get_regrid_methods() + + data = {"head": head, "ibound": ibound} + + regridded_package_data = _regrid_package_data( + data, target_idomain, regridder_types, regrid_cache, {} + ) + head = regridded_package_data["head"] + ibound = regridded_package_data["ibound"] + + # select locations where ibound < 0 + head = head.where(ibound < 0) + + # select locations where idomain > 0 + head = head.where(target_idomain > 0) + + regridded_package_data["head"] = head + regridded_package_data.pop("ibound") + + return cls(**regridded_package_data, validate=True) + + @classmethod + def get_regrid_methods(cls) -> ConstantHeadRegridMethod: + return deepcopy(cls._regrid_method) diff --git a/imod/mf6/dis.py b/imod/mf6/dis.py index 520643b66..aca17c137 100644 --- a/imod/mf6/dis.py +++ b/imod/mf6/dis.py @@ -1,14 +1,21 @@ import pathlib -from typing import Any, List +from copy import deepcopy +from typing import Any, List, Optional import numpy as np import imod -from imod.logging import init_log_decorator +from imod.logging import init_log_decorator, standard_log_decorator from imod.mf6.interfaces.imaskingsettings import IMaskingSettings from imod.mf6.interfaces.iregridpackage import IRegridPackage from imod.mf6.package import Package -from imod.mf6.regrid.regrid_schemes import DiscretizationRegridMethod +from imod.mf6.regrid.regrid_schemes import DiscretizationRegridMethod, RegridMethodType +from imod.mf6.utilities.grid import create_smallest_target_grid +from imod.mf6.utilities.imod5_converter import convert_ibound_to_idomain +from imod.mf6.utilities.regrid import ( + RegridderWeightsCache, + _regrid_package_data, +) from imod.mf6.validation import DisBottomSchema from imod.schemata import ( ActiveCellsConnectedSchema, @@ -18,7 +25,10 @@ DTypeSchema, IdentityNoDataSchema, IndexesSchema, + UniqueValuesSchema, + ValidationError, ) +from imod.typing.grid import GridDataArray class StructuredDiscretization(Package, IRegridPackage, IMaskingSettings): @@ -146,3 +156,81 @@ def _validate(self, schemata, **kwargs): errors = super()._validate(schemata, **kwargs) return errors + + @classmethod + @standard_log_decorator() + def from_imod5_data( + cls, + imod5_data: dict[str, dict[str, GridDataArray]], + regridder_types: Optional[RegridMethodType] = None, + regrid_cache: RegridderWeightsCache = RegridderWeightsCache(), + validate: bool = True, + ) -> "StructuredDiscretization": + """ + Construct package from iMOD5 data, loaded with the + :func:`imod.formats.prj.open_projectfile_data` function. + + Method regrids all variables to a target grid with the smallest extent + and smallest cellsize available in all the grids. Consequently it + converts iMODFLOW data to MODFLOW 6 data. + + .. note:: + + The method expects the iMOD5 model to be fully 3D, not quasi-3D. + + Parameters + ---------- + imod5_data: dict + Dictionary with iMOD5 data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. 
+ regridder_types: DiscretizationRegridMethod, optional + Optional dataclass with regridder types for a specific variable. + Use this to override default regridding methods. + regrid_cache:Optional RegridderWeightsCache + stores regridder weights for different regridders. Can be used to speed up regridding, + if the same regridders are used several times for regridding different arrays. + + Returns + ------- + Modflow 6 StructuredDiscretization package. + + """ + data = { + "idomain": imod5_data["bnd"]["ibound"].astype(np.int32), + "top": imod5_data["top"]["top"], + "bottom": imod5_data["bot"]["bottom"], + } + + target_grid = create_smallest_target_grid(*data.values()) + + if regridder_types is None: + regridder_types = StructuredDiscretization.get_regrid_methods() + + new_package_data = _regrid_package_data( + data, target_grid, regridder_types, regrid_cache + ) + + # Validate iMOD5 data + if validate: + UniqueValuesSchema([-1, 0, 1]).validate(imod5_data["bnd"]["ibound"]) + if not np.all( + new_package_data["top"][1:].data == new_package_data["bottom"][:-1].data + ): + raise ValidationError( + "Model discretization not fully 3D. Make sure TOP[n+1] matches BOT[n]" + ) + + thickness = new_package_data["top"] - new_package_data["bottom"] + new_package_data["idomain"] = convert_ibound_to_idomain( + new_package_data["idomain"], thickness + ) + + # TOP 3D -> TOP 2D + # Assume iMOD5 data provided as fully 3D and not Quasi-3D + new_package_data["top"] = new_package_data["top"].sel(layer=1, drop=True) + + return cls(**new_package_data, validate=True) + + @classmethod + def get_regrid_methods(cls) -> DiscretizationRegridMethod: + return deepcopy(cls._regrid_method) diff --git a/imod/mf6/drn.py b/imod/mf6/drn.py index a7cf5230b..cf1e1b1e4 100644 --- a/imod/mf6/drn.py +++ b/imod/mf6/drn.py @@ -1,10 +1,27 @@ +from copy import deepcopy +from datetime import datetime +from typing import Optional + import numpy as np -from imod.logging import init_log_decorator +from imod.logging import init_log_decorator, standard_log_decorator from imod.mf6.boundary_condition import BoundaryCondition +from imod.mf6.dis import StructuredDiscretization +from imod.mf6.disv import VerticesDiscretization from imod.mf6.interfaces.iregridpackage import IRegridPackage +from imod.mf6.npf import NodePropertyFlow from imod.mf6.regrid.regrid_schemes import DrainageRegridMethod +from imod.mf6.utilities.regrid import ( + RegridderWeightsCache, + _regrid_package_data, +) from imod.mf6.validation import BOUNDARY_DIMS_SCHEMA, CONC_DIMS_SCHEMA +from imod.prepare.cleanup import cleanup_drn +from imod.prepare.topsystem.allocation import ALLOCATION_OPTION, allocate_drn_cells +from imod.prepare.topsystem.conductance import ( + DISTRIBUTING_OPTION, + distribute_drn_conductance, +) from imod.schemata import ( AllInsideNoDataSchema, AllNoDataSchema, @@ -16,6 +33,9 @@ IndexesSchema, OtherCoordsSchema, ) +from imod.typing import GridDataArray +from imod.typing.grid import enforce_dim_order, is_planar_grid +from imod.util.expand_repetitions import expand_repetitions class Drainage(BoundaryCondition, IRegridPackage): @@ -144,3 +164,127 @@ def _validate(self, schemata, **kwargs): errors = super()._validate(schemata, **kwargs) return errors + + @standard_log_decorator() + def cleanup(self, dis: StructuredDiscretization | VerticesDiscretization) -> None: + """ + Clean up package inplace. This method calls + :func:`imod.prepare.cleanup.cleanup_drn`, see documentation of that + function for details on cleanup. 
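A short sketch of the new cleanup workflow, assuming ``gwf_model`` is an existing ``imod.mf6.GroundwaterFlowModel`` that already contains ``"dis"``, ``"drn"`` and ``"ghb"`` packages:

# Clean up boundary packages in place against the model discretization.
dis = gwf_model["dis"]
gwf_model["drn"].cleanup(dis)   # calls imod.prepare.cleanup_drn under the hood
gwf_model["ghb"].cleanup(dis)   # calls imod.prepare.cleanup_ghb under the hood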
+ + dis: imod.mf6.StructuredDiscretization | imod.mf6.VerticesDiscretization + Model discretization package. + """ + dis_dict = {"idomain": dis.dataset["idomain"]} + cleaned_dict = self._call_func_on_grids(cleanup_drn, dis_dict) + super().__init__(cleaned_dict) + + @classmethod + def from_imod5_data( + cls, + key: str, + imod5_data: dict[str, dict[str, GridDataArray]], + period_data: dict[str, list[datetime]], + target_discretization: StructuredDiscretization, + target_npf: NodePropertyFlow, + time_min: datetime, + time_max: datetime, + allocation_option: ALLOCATION_OPTION, + distributing_option: DISTRIBUTING_OPTION, + regridder_types: Optional[DrainageRegridMethod] = None, + regrid_cache: RegridderWeightsCache = RegridderWeightsCache(), + ) -> "Drainage": + """ + Construct a drainage-package from iMOD5 data, loaded with the + :func:`imod.formats.prj.open_projectfile_data` function. + + .. note:: + + The method expects the iMOD5 model to be fully 3D, not quasi-3D. + + Parameters + ---------- + imod5_data: dict + Dictionary with iMOD5 data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. + period_data: dict + Dictionary with iMOD5 period data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. + target_discretization: StructuredDiscretization package + The grid that should be used for the new package. Does not + need to be identical to one of the input grids. + target_npf: NodePropertyFlow package + The conductivity information, used to compute drainage flux + allocation_option: ALLOCATION_OPTION + allocation option. + distributing_option: dict[str, DISTRIBUTING_OPTION] + distributing option. + time_min: datetime + Begin-time of the simulation. Used for expanding period data. + time_max: datetime + End-time of the simulation. Used for expanding period data. + regridder_types: DrainageRegridMethod, optional + Optional dataclass with regridder types for a specific variable. + Use this to override default regridding methods. + regrid_cache:Optional RegridderWeightsCache + stores regridder weights for different regridders. Can be used to speed up regridding, + if the same regridders are used several times for regridding different arrays. + + Returns + ------- + A Modflow 6 Drainage package. 
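A usage sketch for ``Drainage.from_imod5_data``; the project file path, the ``"drn-1"`` key and the assumption that ``SimulationAllocationOptions``/``SimulationDistributingOptions`` can be instantiated with their defaults are ours:

from datetime import datetime

import imod
from imod.mf6 import Drainage, NodePropertyFlow, StructuredDiscretization
from imod.prepare.topsystem.default_allocation_methods import (
    SimulationAllocationOptions,
    SimulationDistributingOptions,
)

imod5_data, period_data = imod.formats.prj.open_projectfile_data("model.prj")
dis = StructuredDiscretization.from_imod5_data(imod5_data)
npf = NodePropertyFlow.from_imod5_data(imod5_data, dis.dataset["idomain"])

allocation_options = SimulationAllocationOptions()
distributing_options = SimulationDistributingOptions()

drn = Drainage.from_imod5_data(
    "drn-1",  # assumed key of one drainage system in the project file
    imod5_data,
    period_data,
    dis,
    npf,
    time_min=datetime(2000, 1, 1),
    time_max=datetime(2010, 1, 1),
    allocation_option=allocation_options.drn,
    distributing_option=distributing_options.drn,
)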
+ """ + + target_top = target_discretization.dataset["top"] + target_bottom = target_discretization.dataset["bottom"] + target_idomain = target_discretization.dataset["idomain"] + + data = { + "elevation": imod5_data[key]["elevation"], + "conductance": imod5_data[key]["conductance"], + } + is_planar = is_planar_grid(data["elevation"]) + + if regridder_types is None: + regridder_types = Drainage.get_regrid_methods() + + regridded_package_data = _regrid_package_data( + data, target_idomain, regridder_types, regrid_cache, {} + ) + + conductance = regridded_package_data["conductance"] + + if is_planar: + planar_elevation = regridded_package_data["elevation"] + + drn_allocation = allocate_drn_cells( + allocation_option, + target_idomain == 1, + target_top, + target_bottom, + planar_elevation, + ) + + layered_elevation = planar_elevation.where(drn_allocation) + layered_elevation = enforce_dim_order(layered_elevation) + regridded_package_data["elevation"] = layered_elevation + + regridded_package_data["conductance"] = distribute_drn_conductance( + distributing_option, + drn_allocation, + conductance, + target_top, + target_bottom, + target_npf.dataset["k"], + planar_elevation, + ) + + drn = Drainage(**regridded_package_data, validate=True) + repeat = period_data.get(key) + if repeat is not None: + drn.set_repeat_stress(expand_repetitions(repeat, time_min, time_max)) + return drn + + @classmethod + def get_regrid_methods(cls) -> DrainageRegridMethod: + return deepcopy(cls._regrid_method) diff --git a/imod/mf6/ghb.py b/imod/mf6/ghb.py index 68d8930f7..e40d32b86 100644 --- a/imod/mf6/ghb.py +++ b/imod/mf6/ghb.py @@ -1,10 +1,27 @@ +from copy import deepcopy +from datetime import datetime +from typing import Optional + import numpy as np -from imod.logging import init_log_decorator +from imod.logging import init_log_decorator, standard_log_decorator from imod.mf6.boundary_condition import BoundaryCondition +from imod.mf6.dis import StructuredDiscretization +from imod.mf6.disv import VerticesDiscretization from imod.mf6.interfaces.iregridpackage import IRegridPackage -from imod.mf6.regrid.regrid_schemes import GeneralHeadBoundaryRegridMethod +from imod.mf6.npf import NodePropertyFlow +from imod.mf6.regrid.regrid_schemes import ( + GeneralHeadBoundaryRegridMethod, + RegridMethodType, +) +from imod.mf6.utilities.regrid import RegridderWeightsCache, _regrid_package_data from imod.mf6.validation import BOUNDARY_DIMS_SCHEMA, CONC_DIMS_SCHEMA +from imod.prepare.cleanup import cleanup_ghb +from imod.prepare.topsystem.allocation import ALLOCATION_OPTION, allocate_ghb_cells +from imod.prepare.topsystem.conductance import ( + DISTRIBUTING_OPTION, + distribute_ghb_conductance, +) from imod.schemata import ( AllInsideNoDataSchema, AllNoDataSchema, @@ -16,6 +33,9 @@ IndexesSchema, OtherCoordsSchema, ) +from imod.typing import GridDataArray +from imod.typing.grid import enforce_dim_order, is_planar_grid +from imod.util.expand_repetitions import expand_repetitions class GeneralHeadBoundary(BoundaryCondition, IRegridPackage): @@ -147,3 +167,130 @@ def _validate(self, schemata, **kwargs): errors = super()._validate(schemata, **kwargs) return errors + + @standard_log_decorator() + def cleanup(self, dis: StructuredDiscretization | VerticesDiscretization) -> None: + """ + Clean up package inplace. This method calls + :func:`imod.prepare.cleanup.cleanup_ghb`, see documentation of that + function for details on cleanup. 
+ + dis: imod.mf6.StructuredDiscretization | imod.mf6.VerticesDiscretization + Model discretization package. + """ + dis_dict = {"idomain": dis.dataset["idomain"]} + cleaned_dict = self._call_func_on_grids(cleanup_ghb, dis_dict) + super().__init__(cleaned_dict) + + @classmethod + def from_imod5_data( + cls, + key: str, + imod5_data: dict[str, dict[str, GridDataArray]], + period_data: dict[str, list[datetime]], + target_discretization, + target_npf: NodePropertyFlow, + time_min: datetime, + time_max: datetime, + allocation_option: ALLOCATION_OPTION, + distributing_option: DISTRIBUTING_OPTION, + regrid_cache: RegridderWeightsCache = RegridderWeightsCache(), + regridder_types: Optional[RegridMethodType] = None, + ) -> "GeneralHeadBoundary": + """ + Construct a GeneralHeadBoundary-package from iMOD5 data, loaded with the + :func:`imod.formats.prj.open_projectfile_data` function. + + .. note:: + + The method expects the iMOD5 model to be fully 3D, not quasi-3D. + + Parameters + ---------- + imod5_data: dict + Dictionary with iMOD5 data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. + period_data: dict + Dictionary with iMOD5 period data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. + target_discretization: StructuredDiscretization package + The grid that should be used for the new package. Does not + need to be identical to one of the input grids. + target_npf: NodePropertyFlow package + The conductivity information, used to compute GHB flux + allocation_option: ALLOCATION_OPTION + allocation option. + time_min: datetime + Begin-time of the simulation. Used for expanding period data. + time_max: datetime + End-time of the simulation. Used for expanding period data. + distributing_option: dict[str, DISTRIBUTING_OPTION] + distributing option. + regrid_cache:Optional RegridderWeightsCache + stores regridder weights for different regridders. Can be used to speed up regridding, + if the same regridders are used several times for regridding different arrays. + regridder_types: RegridMethodType, optional + Optional dataclass with regridder types for a specific variable. + Use this to override default regridding methods. + + Returns + ------- + A Modflow 6 GeneralHeadBoundary packages. 
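The equivalent call for a general head boundary system mirrors the ``Drainage`` sketch given after the previous hunk; it reuses ``imod5_data``, ``period_data``, ``dis``, ``npf`` and the option objects from that sketch, and the ``"ghb-1"`` key is assumed:

from imod.mf6 import GeneralHeadBoundary

ghb = GeneralHeadBoundary.from_imod5_data(
    "ghb-1",
    imod5_data,
    period_data,
    dis,
    npf,
    time_min=datetime(2000, 1, 1),
    time_max=datetime(2010, 1, 1),
    allocation_option=allocation_options.ghb,
    distributing_option=distributing_options.ghb,
)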
+ """ + target_top = target_discretization.dataset["top"] + target_bottom = target_discretization.dataset["bottom"] + target_idomain = target_discretization.dataset["idomain"] + + idomain = target_discretization.dataset["idomain"] + data = { + "head": imod5_data[key]["head"], + "conductance": imod5_data[key]["conductance"], + } + is_planar = is_planar_grid(data["conductance"]) + + if regridder_types is None: + regridder_types = GeneralHeadBoundaryRegridMethod() + + regridded_package_data = _regrid_package_data( + data, idomain, regridder_types, regrid_cache, {} + ) + if is_planar: + conductance = regridded_package_data["conductance"] + + planar_head = regridded_package_data["head"] + k = target_npf.dataset["k"] + + ghb_allocation = allocate_ghb_cells( + allocation_option, + target_idomain == 1, + target_top, + target_bottom, + planar_head, + ) + + layered_head = planar_head.where(ghb_allocation) + layered_head = enforce_dim_order(layered_head) + + regridded_package_data["head"] = layered_head + + if "layer" in conductance.coords: + conductance = conductance.isel({"layer": 0}, drop=True) + + regridded_package_data["conductance"] = distribute_ghb_conductance( + distributing_option, + ghb_allocation, + conductance, + target_top, + target_bottom, + k, + ) + + ghb = GeneralHeadBoundary(**regridded_package_data, validate=True) + repeat = period_data.get(key) + if repeat is not None: + ghb.set_repeat_stress(expand_repetitions(repeat, time_min, time_max)) + return ghb + + @classmethod + def get_regrid_methods(cls) -> GeneralHeadBoundaryRegridMethod: + return deepcopy(cls._regrid_method) diff --git a/imod/mf6/hfb.py b/imod/mf6/hfb.py index 5163005fa..5713be0a7 100644 --- a/imod/mf6/hfb.py +++ b/imod/mf6/hfb.py @@ -4,10 +4,12 @@ import typing from copy import deepcopy from enum import Enum -from typing import TYPE_CHECKING, Optional, Tuple +from typing import TYPE_CHECKING, Dict, List, Optional, Tuple import cftime import numpy as np +import numpy.typing as npt +import pandas as pd import xarray as xr import xugrid as xu from fastcore.dispatch import typedispatch @@ -17,9 +19,16 @@ from imod.mf6.interfaces.ilinedatapackage import ILineDataPackage from imod.mf6.mf6_hfb_adapter import Mf6HorizontalFlowBarrier from imod.mf6.package import Package +from imod.mf6.utilities.clip import clip_line_gdf_by_grid from imod.mf6.utilities.grid import broadcast_to_full_domain -from imod.schemata import EmptyIndexesSchema -from imod.typing import GridDataArray +from imod.mf6.utilities.hfb import ( + _create_zlinestring_from_bound_df, + _extract_hfb_bounds_from_zpolygons, + _prepare_index_names, +) +from imod.schemata import EmptyIndexesSchema, MaxNUniqueValuesSchema +from imod.typing import GeoDataFrameType, GridDataArray, LineStringType +from imod.typing.grid import as_ugrid_dataarray from imod.util.imports import MissingOptionalModule if TYPE_CHECKING: @@ -30,16 +39,20 @@ except ImportError: gpd = MissingOptionalModule("geopandas") -try: + +if TYPE_CHECKING: import shapely -except ImportError: - shapely = MissingOptionalModule("shapely") +else: + try: + import shapely + except ImportError: + shapely = MissingOptionalModule("shapely") @typedispatch def _derive_connected_cell_ids( idomain: xr.DataArray, grid: xu.Ugrid2d, edge_index: np.ndarray -): +) -> xr.Dataset: """ Derive the cell ids of the connected cells of an edge on a structured grid. 
@@ -88,7 +101,7 @@ def _derive_connected_cell_ids( @typedispatch # type: ignore[no-redef] def _derive_connected_cell_ids( _: xu.UgridDataArray, grid: xu.Ugrid2d, edge_index: np.ndarray -): +) -> xr.Dataset: """ Derive the cell ids of the connected cells of an edge on an unstructured grid. @@ -175,21 +188,93 @@ def to_connected_cells_dataset( }, ) - barrier_dataset = ( - barrier_dataset.stack(cell_id=("layer", "edge_index"), create_index=False) - .drop_vars("edge_index") - .reset_coords() + barrier_dataset = barrier_dataset.stack( + cell_id=("layer", "edge_index"), create_index=True ) return barrier_dataset.dropna("cell_id") +def _make_linestring_from_polygon( + dataframe: GeoDataFrameType, +) -> List[LineStringType]: + """ + Make linestring from a polygon with one axis in the vertical dimension (z), + and one axis in the horizontal plane (x & y dimension). + """ + coordinates, index = shapely.get_coordinates(dataframe.geometry, return_index=True) + df = pd.DataFrame( + {"polygon_index": index, "x": coordinates[:, 0], "y": coordinates[:, 1]} + ) + df = df.drop_duplicates().reset_index(drop=True) + df = df.set_index("polygon_index") + + linestrings = [ + shapely.LineString(gb.values) for _, gb in df.groupby("polygon_index") + ] + + return linestrings + + +def _select_dataframe_with_snapped_line_index( + snapped_dataset: xr.Dataset, edge_index: np.ndarray, dataframe: GeoDataFrameType +): + """ + Select dataframe rows with line indices of snapped edges. Usually, the + broadcasting results in a larger dataframe where individual rows of input + dataframe are repeated for multiple edges. + """ + line_index = snapped_dataset["line_index"].values + line_index = line_index[edge_index].astype(int) + return dataframe.iloc[line_index] + + +def _extract_mean_hfb_bounds_from_dataframe( + dataframe: GeoDataFrameType, +) -> Tuple[pd.Series, pd.Series]: + """ + Extract hfb bounds from dataframe. Requires dataframe geometry to be of type + shapely "Z Polygon". + + For the upper z bounds, function takes the average of the depth of the two + upper nodes. The same holds for the lower z bounds, but then with the two + lower nodes. + + As a visual representation, this happens for each z bound: + + . . + \ >>> + \ . >>> --- . + \ / >>> - + . . + + """ + dataframe = _prepare_index_names(dataframe) + + if not dataframe.geometry.has_z.all(): + raise TypeError("GeoDataFrame geometry has no z, which is required.") + + lower, upper = _extract_hfb_bounds_from_zpolygons(dataframe) + # Compute means inbetween nodes. + index_names = lower.index.names + lower_mean = lower.groupby(index_names)["z"].mean() + upper_mean = upper.groupby(index_names)["z"].mean() + + # Assign to dataframe to map means to right index. + df_to_broadcast = dataframe.copy() + df_to_broadcast["lower"] = lower_mean + df_to_broadcast["upper"] = upper_mean + + return df_to_broadcast["lower"], df_to_broadcast["upper"] + + def _fraction_layer_overlap( snapped_dataset: xu.UgridDataset, edge_index: np.ndarray, + dataframe: GeoDataFrameType, top: xu.UgridDataArray, bottom: xu.UgridDataArray, -): +) -> xr.DataArray: """ Computes the fraction a barrier occupies inside a layer. 
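A small worked example of the two steps described above, with made-up numbers: the polygon's upper and lower node depths are averaged, and the layer overlap fraction is the clipped interval overlap divided by the layer height (the same arithmetic as ``_vectorized_overlap``):

import numpy as np

# Hypothetical vertical polygon: upper nodes at z = 10.0 and 20.0,
# lower nodes at z = 0.0 and 10.0.
zupper = np.mean([10.0, 20.0])  # 15.0
zlower = np.mean([0.0, 10.0])   # 5.0

# Fraction of a layer running from z = 10.0 (top) to z = 0.0 (bottom)
# occupied by the barrier; the overlap is clipped at zero when the
# intervals do not intersect.
layer_top, layer_bottom = 10.0, 0.0
overlap = max(0.0, min(zupper, layer_top) - max(zlower, layer_bottom))  # 5.0
fraction = overlap / (layer_top - layer_bottom)                         # 0.5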
""" @@ -202,11 +287,10 @@ def _fraction_layer_overlap( layer_bounds[..., 0] = typing.cast(np.ndarray, bottom_mean.values).T layer_bounds[..., 1] = typing.cast(np.ndarray, top_mean.values).T + zmin, zmax = _extract_mean_hfb_bounds_from_dataframe(dataframe) hfb_bounds = np.empty((n_edge, n_layer, 2), dtype=float) - hfb_bounds[..., 0] = ( - snapped_dataset["zbottom"].values[edge_index].reshape(n_edge, 1) - ) - hfb_bounds[..., 1] = snapped_dataset["ztop"].values[edge_index].reshape(n_edge, 1) + hfb_bounds[..., 0] = zmin.values[:, np.newaxis] + hfb_bounds[..., 1] = zmax.values[:, np.newaxis] overlap = _vectorized_overlap(hfb_bounds, layer_bounds) height = layer_bounds[..., 1] - layer_bounds[..., 0] @@ -245,7 +329,7 @@ def _mean_left_and_right( return xr.concat((uda_left, uda_right), dim="two").mean("two") -def _vectorized_overlap(bounds_a: np.ndarray, bounds_b: np.ndarray): +def _vectorized_overlap(bounds_a: np.ndarray, bounds_b: np.ndarray) -> np.ndarray: """ Vectorized overlap computation. Returns the overlap of 2 vectors along the same axis. If there is no overlap zero will be returned. @@ -272,6 +356,73 @@ def _vectorized_overlap(bounds_a: np.ndarray, bounds_b: np.ndarray): ) +def _prepare_barrier_dataset_for_mf6_adapter(dataset: xr.Dataset) -> xr.Dataset: + """ + Prepare barrier dataset for the initialization of Mf6HorizontalFlowBarrier. + The dataset is expected to have "edge_index" and "layer" coordinates and a + multi-index "cell_id" coordinate. The dataset contains as variables: + "cell_id1", "cell_id2", and "hydraulic_characteristic". + + - Reset coords to get a coordless "cell_id" dimension instead of a multi-index coord + - Assign "layer" as variable to dataset instead of as coord. + """ + # Store layer to work around multiindex issue where dropping the edge_index + # removes the layer as well. + layer = dataset.coords["layer"].values + # Drop leftover coordinate and reset cell_id. + dataset = dataset.drop_vars("edge_index").reset_coords() + # Attach layer again + dataset["layer"] = ("cell_id", layer) + return dataset + + +def _snap_to_grid_and_aggregate( + barrier_dataframe: GeoDataFrameType, grid2d: xu.Ugrid2d, vardict_agg: dict[str, str] +) -> tuple[xu.UgridDataset, npt.NDArray]: + """ + Snap barrier dataframe to grid and aggregate multiple lines with a list of + methods per variable. + + Parameters + ---------- + barrier_dataframe: geopandas.GeoDataFrame + GeoDataFrame with barriers, should have variable "line_index". 
+ grid2d: xugrid.Ugrid2d + Grid to snap lines to + vardict_agg: dict + Mapping of variable name to aggregation method + + Returns + ------- + snapping_dataset: xugrid.UgridDataset + Dataset with all variables snapped and aggregated to cell edges + edge_index: numpy.array + 1D array with indices of cell edges that lines were snapped to + """ + snapping_df = xu.create_snap_to_grid_dataframe( + barrier_dataframe, grid2d, max_snap_distance=0.5 + ) + # Map other variables to snapping_df with line indices + line_index = snapping_df["line_index"] + vars_to_snap = list(vardict_agg.keys()) + if "line_index" in vars_to_snap: + vars_to_snap.remove("line_index") # snapping_df already has line_index + for varname in vars_to_snap: + snapping_df[varname] = barrier_dataframe[varname].iloc[line_index].to_numpy() + # Aggregate to grid edges + gb_edge = snapping_df.groupby("edge_index") + # Initialize dataset and dataarray with the right shape and dims + snapped_dataset = xu.UgridDataset(grids=[grid2d]) + new = xr.DataArray(np.full(grid2d.n_edge, np.nan), dims=[grid2d.edge_dimension]) + edge_index = np.array(list(gb_edge.indices.keys())) + # Aggregate with different methods per variable + for varname, method in vardict_agg.items(): + snapped_dataset[varname] = new.copy() + snapped_dataset[varname].data[edge_index] = gb_edge[varname].aggregate(method) + + return snapped_dataset, edge_index + + class BarrierType(Enum): HydraulicCharacteristic = 0 Multiplier = 1 @@ -301,7 +452,7 @@ def _get_variable_names_for_gdf(self) -> list[str]: ] + self._get_vertical_variables() @property - def line_data(self) -> "gpd.GeoDataFrame": + def line_data(self) -> GeoDataFrameType: variables_for_gdf = self._get_variable_names_for_gdf() return gpd.GeoDataFrame( self.dataset[variables_for_gdf].to_dataframe(), @@ -309,7 +460,7 @@ def line_data(self) -> "gpd.GeoDataFrame": ) @line_data.setter - def line_data(self, value: "gpd.GeoDataFrame") -> None: + def line_data(self, value: GeoDataFrameType) -> None: variables_for_gdf = self._get_variable_names_for_gdf() self.dataset = self.dataset.merge( value.to_xarray(), overwrite_vars=variables_for_gdf, join="right" @@ -336,49 +487,37 @@ def _compute_barrier_values( ): raise NotImplementedError() - def to_mf6_pkg( + def _to_connected_cells_dataset( self, idomain: GridDataArray, top: GridDataArray, bottom: GridDataArray, k: GridDataArray, - validate: bool = False, - ) -> Mf6HorizontalFlowBarrier: + ) -> xr.Dataset: """ - Write package to Modflow 6 package. - - Based on the model grid, top and bottoms, the layers in which the barrier belong are computed. If the - barrier only partially occupies a layer an effective resistance or hydraulic conductivity for that layer is - calculated. This calculation is skipped for the Multiplier type. - - Parameters - ---------- - idomain: GridDataArray - Grid with active cells. - top: GridDataArray - Grid with top of model layers. - bottom: GridDataArray - Grid with bottom of model layers. - k: GridDataArray - Grid with hydraulic conductivities. 
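A toy pandas illustration of the per-edge aggregation performed by ``_snap_to_grid_and_aggregate`` (the numbers are made up): when several barriers snap to the same cell edge, their values are combined with the method from ``vardict_agg``, e.g. resistances are summed.

import pandas as pd

# Hypothetical snapping result: barrier lines 0 and 1 were both snapped to
# cell edge 7; edge 12 received a single barrier.
snapping_df = pd.DataFrame(
    {
        "edge_index": [7, 7, 12],
        "line_index": [0, 1, 1],
        "resistance": [100.0, 250.0, 50.0],
    }
)

# Per-variable aggregation, mirroring vardict_agg = {"line_index": "first",
# "resistance": "sum"}: resistances on a shared edge are summed.
aggregated = snapping_df.groupby("edge_index").agg(
    {"line_index": "first", "resistance": "sum"}
)
print(aggregated.loc[7, "resistance"])  # 350.0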
- validate: bool - Run validation before converting + Method does the following + - forces input grids to unstructured + - snaps lines to cell edges + - remove edge values connected to cell edges + - compute barrier values + - remove edge values to inactive cells + - finds connected cells in dataset Returns ------- - + dataset with connected cells, containing: + - cell_id1 + - cell_id2 + - layer + - value name """ - if validate: - self._validate(self._write_schemata) - top, bottom = broadcast_to_full_domain(idomain, top, bottom) k = idomain * k + # Enforce unstructured unstructured_grid, top, bottom, k = ( - self.__to_unstructured(idomain, top, bottom, k) - if isinstance(idomain, xr.DataArray) - else [idomain, top, bottom, k] + as_ugrid_dataarray(grid) for grid in [idomain, top, bottom, k] ) - snapped_dataset, edge_index = self.__snap_to_grid(idomain) + snapped_dataset, edge_index = self._snap_to_grid(idomain) edge_index = self.__remove_invalid_edges(unstructured_grid, edge_index) barrier_values = self._compute_barrier_values( @@ -401,7 +540,7 @@ def to_mf6_pkg( ) ) - barrier_dataset = to_connected_cells_dataset( + return to_connected_cells_dataset( idomain, unstructured_grid.ugrid.grid, edge_index, @@ -412,16 +551,65 @@ def to_mf6_pkg( }, ) - barrier_dataset["print_input"] = self.dataset["print_input"] + def _to_mf6_pkg(self, barrier_dataset: xr.Dataset) -> Mf6HorizontalFlowBarrier: + """ + Internal method, which does the following + - final coordinate cleanup of barrier dataset + - adds missing options to dataset + Parameters + ---------- + barrier_dataset: xr.Dataset + Xarray dataset with dimensions "cell_dims1", "cell_dims2", "cell_id". + Additional coordinates should be "layer" and "edge_index". + + Returns + ------- + Mf6HorizontalFlowBarrier + """ + barrier_dataset["print_input"] = self.dataset["print_input"] + barrier_dataset = _prepare_barrier_dataset_for_mf6_adapter(barrier_dataset) return Mf6HorizontalFlowBarrier(**barrier_dataset.data_vars) + def to_mf6_pkg( + self, + idomain: GridDataArray, + top: GridDataArray, + bottom: GridDataArray, + k: GridDataArray, + ) -> Mf6HorizontalFlowBarrier: + """ + Write package to Modflow 6 package. + + Based on the model grid, top and bottoms, the layers in which the barrier belong are computed. If the + barrier only partially occupies a layer an effective resistance or hydraulic conductivity for that layer is + calculated. This calculation is skipped for the Multiplier type. + + Parameters + ---------- + idomain: GridDataArray + Grid with active cells. + top: GridDataArray + Grid with top of model layers. + bottom: GridDataArray + Grid with bottom of model layers. + k: GridDataArray + Grid with hydraulic conductivities. + + Returns + ------- + Mf6HorizontalFlowBarrier + Low level representation of the HFB package as MODFLOW 6 expects it. 
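A usage sketch for the refactored ``to_mf6_pkg``, assuming ``model`` is an existing groundwater flow model whose ``"dis"`` and ``"npf"`` packages provide the grids; the barrier trace and values are made up:

import geopandas as gpd

import imod
from imod.prepare import linestring_to_trapezoid_zpolygons

x = [-10.0, 0.0, 10.0]
y = [10.0, 0.0, -10.0]
ztop = [10.0, 20.0, 15.0]
zbot = [-10.0, -20.0, 0.0]
polygons = linestring_to_trapezoid_zpolygons(x, y, ztop, zbot)

barrier_gdf = gpd.GeoDataFrame(geometry=polygons, data={"resistance": [1e3, 1e3]})
hfb = imod.mf6.HorizontalFlowBarrierResistance(barrier_gdf)

# Convert to the low-level MODFLOW 6 representation on the model grids.
mf6_hfb = hfb.to_mf6_pkg(
    idomain=model["dis"].dataset["idomain"],
    top=model["dis"].dataset["top"],
    bottom=model["dis"].dataset["bottom"],
    k=model["npf"].dataset["k"],
)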
+ """ + barrier_dataset = self._to_connected_cells_dataset(idomain, top, bottom, k) + return self._to_mf6_pkg(barrier_dataset) + def is_empty(self) -> bool: if super().is_empty(): return True linestrings = self.dataset["geometry"] - only_empty_lines = all(ls.is_empty for ls in linestrings.values) + only_empty_lines = all(ls.is_empty for ls in linestrings.values.ravel()) return only_empty_lines def _resistance_layer( @@ -484,7 +672,12 @@ def _resistance_layer_overlap( snapped_dataset[self._get_variable_name()] ).values[edge_index] - fraction = _fraction_layer_overlap(snapped_dataset, edge_index, top, bottom) + dataframe = _select_dataframe_with_snapped_line_index( + snapped_dataset, edge_index, self.line_data + ) + fraction = _fraction_layer_overlap( + snapped_dataset, edge_index, dataframe, top, bottom + ) c_aquifer = 1.0 / k_mean inverse_c = (fraction / resistance) + ((1.0 - fraction) / c_aquifer) @@ -580,8 +773,7 @@ def clip_box( sliced : Package """ cls = type(self) - new = cls.__new__(cls) - new.dataset = copy.deepcopy(self.dataset) + new = cls._from_dataset(copy.deepcopy(self.dataset)) new.line_data = self.line_data return new @@ -592,37 +784,36 @@ def mask(self, _) -> Package: """ return deepcopy(self) - @staticmethod - def __to_unstructured( - idomain: xr.DataArray, top: xr.DataArray, bottom: xr.DataArray, k: xr.DataArray - ) -> Tuple[ - xu.UgridDataArray, xu.UgridDataArray, xu.UgridDataArray, xu.UgridDataArray - ]: - unstruct = xu.UgridDataArray.from_structured(idomain) - top = xu.UgridDataArray.from_structured(top) - bottom = xu.UgridDataArray.from_structured(bottom) - k = xu.UgridDataArray.from_structured(k) - - return unstruct, top, bottom, k - - def __snap_to_grid( + def _snap_to_grid( self, idomain: GridDataArray ) -> Tuple[xu.UgridDataset, np.ndarray]: - if "layer" in self.dataset: - variable_names = [self._get_variable_name(), "geometry", "layer"] + variable_name = self._get_variable_name() + has_layer = "layer" in self._get_vertical_variables() + # Create geodataframe with barriers + if has_layer: + varnames_for_df = [variable_name, "geometry", "layer"] else: - variable_names = [self._get_variable_name(), "geometry", "ztop", "zbottom"] - barrier_dataframe = self.dataset[variable_names].to_dataframe() - - snapped_dataset, _ = typing.cast( - xu.UgridDataset, - xu.snap_to_grid(barrier_dataframe, grid=idomain, max_snap_distance=0.5), + varnames_for_df = [variable_name, "geometry"] + barrier_dataframe = gpd.GeoDataFrame( + self.dataset[varnames_for_df].to_dataframe() ) - edge_index = np.argwhere( - snapped_dataset[self._get_variable_name()].notnull().values - ).ravel() + # Convert vertical polygon to linestring + if not has_layer: + lower, _ = _extract_hfb_bounds_from_zpolygons(barrier_dataframe) + linestring = _create_zlinestring_from_bound_df(lower) + barrier_dataframe["geometry"] = linestring["geometry"] + # Clip barriers outside domain + barrier_dataframe = clip_line_gdf_by_grid( + barrier_dataframe, idomain.sel(layer=1) + ) + # Prepare variable names and methods for aggregation + vardict_agg = {"line_index": "first", variable_name: "sum"} + if has_layer: + vardict_agg["layer"] = "first" + # Create grid from structured + grid2d = as_ugrid_dataarray(idomain.sel(layer=1)).grid - return snapped_dataset, edge_index + return _snap_to_grid_and_aggregate(barrier_dataframe, grid2d, vardict_agg) @staticmethod def __remove_invalid_edges( @@ -679,7 +870,7 @@ def __remove_edge_values_connected_to_inactive_cells( class 
HorizontalFlowBarrierHydraulicCharacteristic(HorizontalFlowBarrierBase): """ - Horizontal Flow Barrier (HFB) package + Horizontal Flow Barrier (HFB) package Input to the Horizontal Flow Barrier (HFB) Package is read from the file that has type "HFB6" in the Name File. Only one HFB Package can be @@ -692,21 +883,20 @@ class HorizontalFlowBarrierHydraulicCharacteristic(HorizontalFlowBarrierBase): Dataframe that describes: - geometry: the geometries of the barriers, - hydraulic_characteristic: the hydraulic characteristic of the barriers - - ztop: the top z-value of the barriers - - zbottom: the bottom z-value of the barriers print_input: bool Examples -------- - >>> barrier_x = [-1000.0, 0.0, 1000.0] - >>> barrier_y = [500.0, 250.0, 500.0] + >>> x = [-10.0, 0.0, 10.0] + >>> y = [10.0, 0.0, -10.0] + >>> ztop = [10.0, 20.0, 15.0] + >>> zbot = [-10.0, -20.0, 0.0] + >>> polygons = linestring_to_trapezoid_zpolygons(x, y, ztop, zbot) >>> barrier_gdf = gpd.GeoDataFrame( - >>> geometry=[shapely.linestrings(barrier_x, barrier_y),], + >>> geometry=polygons, >>> data={ - >>> "hydraulic_characteristic": [1e-3,], - >>> "ztop": [10.0,], - >>> "zbottom": [0.0,], + >>> "resistance": [1e-3, 1e-3], >>> }, >>> ) >>> hfb = imod.mf6.HorizontalFlowBarrierHydraulicCharacteristic(barrier_gdf) @@ -728,7 +918,7 @@ def _get_variable_name(self) -> str: return "hydraulic_characteristic" def _get_vertical_variables(self) -> list: - return ["ztop", "zbottom"] + return [] def _compute_barrier_values( self, snapped_dataset, edge_index, idomain, top, bottom, k @@ -740,9 +930,11 @@ def _compute_barrier_values( return barrier_values -class LayeredHorizontalFlowBarrierHydraulicCharacteristic(HorizontalFlowBarrierBase): +class SingleLayerHorizontalFlowBarrierHydraulicCharacteristic( + HorizontalFlowBarrierBase +): """ - Horizontal Flow Barrier (HFB) package + Horizontal Flow Barrier (HFB) package Input to the Horizontal Flow Barrier (HFB) Package is read from the file that has type "HFB6" in the Name File. Only one HFB Package can be @@ -754,8 +946,10 @@ class LayeredHorizontalFlowBarrierHydraulicCharacteristic(HorizontalFlowBarrierB geometry: gpd.GeoDataFrame Dataframe that describes: - geometry: the geometries of the barriers, - - hydraulic_characteristic: the hydraulic characteristic of the barriers - - layer: model layer for the barrier + - hydraulic_characteristic: the hydraulic characteristic of the + barriers + - layer: model layer for the barrier, only 1 single layer can be + entered. print_input: bool Examples @@ -774,6 +968,11 @@ class LayeredHorizontalFlowBarrierHydraulicCharacteristic(HorizontalFlowBarrierB """ + _write_schemata = { + "geometry": [EmptyIndexesSchema()], + "layer": [MaxNUniqueValuesSchema(1)], + } + @init_log_decorator() def __init__( self, @@ -805,7 +1004,7 @@ def _compute_barrier_values( class HorizontalFlowBarrierMultiplier(HorizontalFlowBarrierBase): """ - Horizontal Flow Barrier (HFB) package + Horizontal Flow Barrier (HFB) package Input to the Horizontal Flow Barrier (HFB) Package is read from the file that has type "HFB6" in the Name File. 
Only one HFB Package can be @@ -820,21 +1019,20 @@ class HorizontalFlowBarrierMultiplier(HorizontalFlowBarrierBase): Dataframe that describes: - geometry: the geometries of the barriers, - multiplier: the multiplier of the barriers - - ztop: the top z-value of the barriers - - zbottom: the bottom z-value of the barriers print_input: bool Examples -------- - >>> barrier_x = [-1000.0, 0.0, 1000.0] - >>> barrier_y = [500.0, 250.0, 500.0] + >>> x = [-10.0, 0.0, 10.0] + >>> y = [10.0, 0.0, -10.0] + >>> ztop = [10.0, 20.0, 15.0] + >>> zbot = [-10.0, -20.0, 0.0] + >>> polygons = linestring_to_trapezoid_zpolygons(x, y, ztop, zbot) >>> barrier_gdf = gpd.GeoDataFrame( - >>> geometry=[shapely.linestrings(barrier_x, barrier_y),], + >>> geometry=polygons, >>> data={ - >>> "multiplier": [1.5,], - >>> "ztop": [10.0,], - >>> "zbottom": [0.0,], + >>> "multiplier": [10.0, 10.0], >>> }, >>> ) >>> hfb = imod.mf6.HorizontalFlowBarrierMultiplier(barrier_gdf) @@ -856,12 +1054,17 @@ def _get_variable_name(self) -> str: return "multiplier" def _get_vertical_variables(self) -> list: - return ["ztop", "zbottom"] + return [] def _compute_barrier_values( self, snapped_dataset, edge_index, idomain, top, bottom, k ): - fraction = _fraction_layer_overlap(snapped_dataset, edge_index, top, bottom) + dataframe = _select_dataframe_with_snapped_line_index( + snapped_dataset, edge_index, self.line_data + ) + fraction = _fraction_layer_overlap( + snapped_dataset, edge_index, dataframe, top, bottom + ) barrier_values = ( fraction.where(fraction) @@ -871,24 +1074,23 @@ def _compute_barrier_values( return barrier_values -class LayeredHorizontalFlowBarrierMultiplier(HorizontalFlowBarrierBase): +class SingleLayerHorizontalFlowBarrierMultiplier(HorizontalFlowBarrierBase): """ - Horizontal Flow Barrier (HFB) package + Horizontal Flow Barrier (HFB) package Input to the Horizontal Flow Barrier (HFB) Package is read from the file that has type "HFB6" in the Name File. Only one HFB Package can be specified for a GWF model. https://water.usgs.gov/water-resources/software/MODFLOW-6/mf6io_6.2.2.pdf - If parts of the barrier overlap a layer the multiplier is applied to the entire layer. - Parameters ---------- geometry: gpd.GeoDataFrame Dataframe that describes: - geometry: the geometries of the barriers, - multiplier: the multiplier of the barriers - - layer: model layer for the barrier + - layer: model layer for the barrier, only 1 single layer can be + entered. 
print_input: bool Examples @@ -907,6 +1109,11 @@ class LayeredHorizontalFlowBarrierMultiplier(HorizontalFlowBarrierBase): """ + _write_schemata = { + "geometry": [EmptyIndexesSchema()], + "layer": [MaxNUniqueValuesSchema(1)], + } + @init_log_decorator() def __init__( self, @@ -971,21 +1178,21 @@ class HorizontalFlowBarrierResistance(HorizontalFlowBarrierBase): Dataframe that describes: - geometry: the geometries of the barriers, - resistance: the resistance of the barriers - - ztop: the top z-value of the barriers - - zbottom: the bottom z-value of the barriers + print_input: bool Examples -------- - >>> barrier_x = [-1000.0, 0.0, 1000.0] - >>> barrier_y = [500.0, 250.0, 500.0] + >>> x = [-10.0, 0.0, 10.0] + >>> y = [10.0, 0.0, -10.0] + >>> ztop = [10.0, 20.0, 15.0] + >>> zbot = [-10.0, -20.0, 0.0] + >>> polygons = linestring_to_trapezoid_zpolygons(x, y, ztop, zbot) >>> barrier_gdf = gpd.GeoDataFrame( - >>> geometry=[shapely.linestrings(barrier_x, barrier_y),], + >>> geometry=polygons, >>> data={ - >>> "resistance": [1e3,], - >>> "ztop": [10.0,], - >>> "zbottom": [0.0,], + >>> "resistance": [1e3, 1e3], >>> }, >>> ) >>> hfb = imod.mf6.HorizontalFlowBarrierResistance(barrier_gdf) @@ -1008,7 +1215,7 @@ def _get_variable_name(self) -> str: return "resistance" def _get_vertical_variables(self) -> list: - return ["ztop", "zbottom"] + return [] def _compute_barrier_values( self, snapped_dataset, edge_index, idomain, top, bottom, k @@ -1020,7 +1227,7 @@ def _compute_barrier_values( return barrier_values -class LayeredHorizontalFlowBarrierResistance(HorizontalFlowBarrierBase): +class SingleLayerHorizontalFlowBarrierResistance(HorizontalFlowBarrierBase): """ Horizontal Flow Barrier (HFB) package @@ -1035,7 +1242,8 @@ class LayeredHorizontalFlowBarrierResistance(HorizontalFlowBarrierBase): Dataframe that describes: - geometry: the geometries of the barriers, - resistance: the resistance of the barriers - - layer: model layer for the barrier + - layer: model layer for the barrier, only 1 single layer can be + entered. print_input: bool Examples @@ -1055,6 +1263,11 @@ class LayeredHorizontalFlowBarrierResistance(HorizontalFlowBarrierBase): """ + _write_schemata = { + "geometry": [EmptyIndexesSchema()], + "layer": [MaxNUniqueValuesSchema(1)], + } + @init_log_decorator() def __init__( self, @@ -1080,5 +1293,27 @@ def _compute_barrier_values( edge_index, idomain, ) - return barrier_values + + @classmethod + def from_imod5_dataset( + cls, key: str, imod5_data: Dict[str, Dict[str, GridDataArray]] + ): + imod5_keys = list(imod5_data.keys()) + if key not in imod5_keys: + raise ValueError("hfb key not present.") + + hfb_dict = imod5_data[key] + if not list(hfb_dict.keys()) == ["geodataframe", "layer"]: + raise ValueError("hfb is not a SingleLayerHorizontalFlowBarrierResistance") + layer = hfb_dict["layer"] + if layer == 0: + raise ValueError( + "assigning to layer 0 is not supported for " + "SingleLayerHorizontalFlowBarrierResistance. " + "Try HorizontalFlowBarrierResistance class." 
+ ) + geometry_layer = hfb_dict["geodataframe"] + geometry_layer["layer"] = layer + + return cls(geometry_layer) diff --git a/imod/mf6/ic.py b/imod/mf6/ic.py index d3719a476..017463c95 100644 --- a/imod/mf6/ic.py +++ b/imod/mf6/ic.py @@ -1,14 +1,19 @@ import warnings -from typing import Any +from copy import deepcopy +from typing import Any, Optional import numpy as np from imod.logging import init_log_decorator from imod.mf6.interfaces.iregridpackage import IRegridPackage from imod.mf6.package import Package -from imod.mf6.regrid.regrid_schemes import InitialConditionsRegridMethod +from imod.mf6.regrid.regrid_schemes import ( + InitialConditionsRegridMethod, +) +from imod.mf6.utilities.regrid import RegridderWeightsCache, _regrid_package_data from imod.mf6.validation import PKG_DIMS_SCHEMA from imod.schemata import DTypeSchema, IdentityNoDataSchema, IndexesSchema +from imod.typing import GridDataArray class InitialConditions(Package, IRegridPackage): @@ -89,3 +94,55 @@ def render(self, directory, pkgname, globaltimes, binary): self["start"], icdirectory, "strt", binary=binary ) return self._template.render(d) + + @classmethod + def from_imod5_data( + cls, + imod5_data: dict[str, dict[str, GridDataArray]], + target_grid: GridDataArray, + regridder_types: Optional[InitialConditionsRegridMethod] = None, + regrid_cache: RegridderWeightsCache = RegridderWeightsCache(), + ) -> "InitialConditions": + """ + Construct an InitialConditions-package from iMOD5 data, loaded with the + :func:`imod.formats.prj.open_projectfile_data` function. + + .. note:: + + The method expects the iMOD5 model to be fully 3D, not quasi-3D. + + Parameters + ---------- + imod5_data: dict + Dictionary with iMOD5 data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. + target_grid: GridDataArray + The grid that should be used for the new package. Does not + need to be identical to one of the input grids. + regridder_types: InitialConditionsRegridMethod, optional + Optional dataclass with regridder types for a specific variable. + Use this to override default regridding methods. + regrid_cache:Optional RegridderWeightsCache + stores regridder weights for different regridders. Can be used to speed up regridding, + if the same regridders are used several times for regridding different arrays. + + Returns + ------- + Modflow 6 InitialConditions package. 
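A minimal sketch for ``InitialConditions.from_imod5_data``, with an assumed project file path:

import imod
from imod.mf6 import InitialConditions, StructuredDiscretization

imod5_data, _ = imod.formats.prj.open_projectfile_data("model.prj")
dis = StructuredDiscretization.from_imod5_data(imod5_data)

# The iMOD5 starting head ("shd") is regridded to the target grid and used
# as the "start" variable of the package.
ic = InitialConditions.from_imod5_data(imod5_data, target_grid=dis.dataset["idomain"])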
+ """ + + data = { + "start": imod5_data["shd"]["head"], + } + + if regridder_types is None: + regridder_types = InitialConditions.get_regrid_methods() + + new_package_data = _regrid_package_data( + data, target_grid, regridder_types, regrid_cache, {} + ) + return cls(**new_package_data, validate=True) + + @classmethod + def get_regrid_methods(cls) -> InitialConditionsRegridMethod: + return deepcopy(cls._regrid_method) diff --git a/imod/mf6/interfaces/ipackagebase.py b/imod/mf6/interfaces/ipackagebase.py index 038e20876..6351fe79d 100644 --- a/imod/mf6/interfaces/ipackagebase.py +++ b/imod/mf6/interfaces/ipackagebase.py @@ -2,6 +2,8 @@ import xarray as xr +from imod.typing import GridDataset + class IPackageBase(ABC): """ @@ -17,3 +19,8 @@ def dataset(self) -> xr.Dataset: @abstractmethod def dataset(self, value: xr.Dataset) -> None: raise NotImplementedError + + @classmethod + @abstractmethod + def _from_dataset(self, ds: GridDataset): + raise NotImplementedError diff --git a/imod/mf6/model.py b/imod/mf6/model.py index f3a5cc20a..54ac24b07 100644 --- a/imod/mf6/model.py +++ b/imod/mf6/model.py @@ -6,7 +6,7 @@ import pathlib from copy import deepcopy from pathlib import Path -from typing import Any, Optional, Tuple, Union +from typing import Any, List, Optional, Tuple, Union import cftime import jinja2 @@ -19,17 +19,23 @@ import imod from imod.logging import standard_log_decorator +from imod.mf6.hfb import HorizontalFlowBarrierBase from imod.mf6.interfaces.imodel import IModel from imod.mf6.package import Package from imod.mf6.statusinfo import NestedStatusInfo, StatusInfo, StatusInfoBase from imod.mf6.utilities.mask import _mask_all_packages +from imod.mf6.utilities.mf6hfb import merge_hfb_packages from imod.mf6.utilities.regrid import RegridderWeightsCache, _regrid_like from imod.mf6.validation import pkg_errors_to_status_info +from imod.mf6.validation_context import ValidationContext +from imod.mf6.wel import GridAgnosticWell from imod.mf6.write_context import WriteContext from imod.schemata import ValidationError from imod.typing import GridDataArray from imod.typing.grid import is_spatial_grid +HFB_PKGNAME = "hfb_merged" + class Modflow6Model(collections.UserDict, IModel, abc.ABC): _mandatory_packages: tuple[str, ...] 
= () @@ -145,12 +151,20 @@ def render(self, modelname: str, write_context: WriteContext): d = {k: v for k, v in self._options.items() if not (v is None or v is False)} packages = [] + has_hfb = False for pkgname, pkg in self.items(): # Add the six to the package id pkg_id = pkg._pkg_id + # Skip if hfb + if pkg_id == "hfb": + has_hfb = True + continue key = f"{pkg_id}6" path = dir_for_render / f"{pkgname}.{pkg_id}" packages.append((key, path.as_posix(), pkgname)) + if has_hfb: + path = dir_for_render / f"{HFB_PKGNAME}.hfb" + packages.append(("hfb6", path.as_posix(), HFB_PKGNAME)) d["packages"] = packages return self._template.render(d) @@ -227,7 +241,11 @@ def validate(self, model_name: str = "") -> StatusInfoBase: @standard_log_decorator() def write( - self, modelname, globaltimes, validate: bool, write_context: WriteContext + self, + modelname, + globaltimes, + write_context: WriteContext, + validate_context: ValidationContext, ) -> StatusInfoBase: """ Write model namefile @@ -237,7 +255,7 @@ def write( workdir = write_context.simulation_directory modeldirectory = workdir / modelname Path(modeldirectory).mkdir(exist_ok=True, parents=True) - if validate: + if validate_context.validate: model_status_info = self.validate(modelname) if model_status_info.has_errors(): return model_status_info @@ -252,34 +270,26 @@ def write( pkg_write_context = write_context.copy_with_new_write_directory( new_write_directory=modeldirectory ) + mf6_hfb_ls: List[HorizontalFlowBarrierBase] = [] for pkg_name, pkg in self.items(): try: - if isinstance(pkg, imod.mf6.Well): + if issubclass(type(pkg), GridAgnosticWell): top, bottom, idomain = self.__get_domain_geometry() k = self.__get_k() - mf6_well_pkg = pkg.to_mf6_pkg( + mf6_well_pkg = pkg._to_mf6_pkg( idomain, top, bottom, k, - validate, - pkg_write_context.is_partitioned, + validate_context, ) - mf6_well_pkg.write( pkgname=pkg_name, globaltimes=globaltimes, write_context=pkg_write_context, ) - elif isinstance(pkg, imod.mf6.HorizontalFlowBarrierBase): - top, bottom, idomain = self.__get_domain_geometry() - k = self.__get_k() - mf6_hfb_pkg = pkg.to_mf6_pkg(idomain, top, bottom, k, validate) - mf6_hfb_pkg.write( - pkgname=pkg_name, - globaltimes=globaltimes, - write_context=pkg_write_context, - ) + elif issubclass(type(pkg), imod.mf6.HorizontalFlowBarrierBase): + mf6_hfb_ls.append(pkg) else: pkg.write( pkgname=pkg_name, @@ -289,6 +299,20 @@ def write( except Exception as e: raise type(e)(f"{e}\nError occured while writing {pkg_name}") + if len(mf6_hfb_ls) > 0: + try: + pkg_name = HFB_PKGNAME + top, bottom, idomain = self.__get_domain_geometry() + k = self.__get_k() + mf6_hfb_pkg = merge_hfb_packages(mf6_hfb_ls, idomain, top, bottom, k) + mf6_hfb_pkg.write( + pkgname=pkg_name, + globaltimes=globaltimes, + write_context=pkg_write_context, + ) + except Exception as e: + raise type(e)(f"{e}\nError occured while writing {pkg_name}") + return NestedStatusInfo(modelname) @standard_log_decorator() @@ -505,7 +529,7 @@ def regrid_like( self, target_grid: GridDataArray, validate: bool = True, - regrid_context: Optional[RegridderWeightsCache] = None, + regrid_cache: Optional[RegridderWeightsCache] = None, ) -> "Modflow6Model": """ Creates a model by regridding the packages of this model to another discretization. 
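A sketch of the merging behaviour implemented in ``Modflow6Model.render``/``write`` above: several barrier packages on one model are written as a single ``hfb_merged`` MODFLOW 6 package. The simulation, the model and the barrier GeoDataFrames are assumed to exist already:

gwf_model["hfb_east"] = imod.mf6.HorizontalFlowBarrierResistance(barrier_gdf_east)
gwf_model["hfb_west"] = imod.mf6.HorizontalFlowBarrierResistance(barrier_gdf_west)

# On write, all HorizontalFlowBarrierBase packages are collected, merged by
# merge_hfb_packages, and written once as "hfb_merged.hfb".
simulation.write("mymodel")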
@@ -519,7 +543,7 @@ def regrid_like( a grid defined over the same discretization as the one we want to regrid the package to validate: bool set to true to validate the regridded packages - regrid_context: Optional RegridderWeightsCache + regrid_cache: Optional RegridderWeightsCache stores regridder weights for different regridders. Can be used to speed up regridding, if the same regridders are used several times for regridding different arrays. @@ -528,7 +552,7 @@ def regrid_like( a model with similar packages to the input model, and with all the data-arrays regridded to another discretization, similar to the one used in input argument "target_grid" """ - return _regrid_like(self, target_grid, validate, regrid_context) + return _regrid_like(self, target_grid, validate, regrid_cache) def mask_all_packages( self, diff --git a/imod/mf6/model_gwf.py b/imod/mf6/model_gwf.py index 76fa6318a..35e6b93b9 100644 --- a/imod/mf6/model_gwf.py +++ b/imod/mf6/model_gwf.py @@ -1,15 +1,47 @@ from __future__ import annotations -from typing import Optional +import textwrap +from datetime import datetime +from typing import Optional, cast import cftime import numpy as np from imod.logging import init_log_decorator +from imod.logging.logging_decorators import standard_log_decorator from imod.mf6 import ConstantHead from imod.mf6.clipped_boundary_condition_creator import create_clipped_boundary +from imod.mf6.dis import StructuredDiscretization +from imod.mf6.drn import Drainage +from imod.mf6.ghb import GeneralHeadBoundary +from imod.mf6.hfb import SingleLayerHorizontalFlowBarrierResistance +from imod.mf6.ic import InitialConditions from imod.mf6.model import Modflow6Model +from imod.mf6.npf import NodePropertyFlow +from imod.mf6.rch import Recharge +from imod.mf6.regrid.regrid_schemes import ( + ConstantHeadRegridMethod, + DiscretizationRegridMethod, + DrainageRegridMethod, + GeneralHeadBoundaryRegridMethod, + InitialConditionsRegridMethod, + NodePropertyFlowRegridMethod, + RechargeRegridMethod, + RegridMethodType, + RiverRegridMethod, + StorageCoefficientRegridMethod, +) +from imod.mf6.riv import River +from imod.mf6.sto import StorageCoefficient +from imod.mf6.utilities.chd_concat import concat_layered_chd_packages +from imod.mf6.utilities.regrid import RegridderWeightsCache +from imod.mf6.wel import LayeredWell, Well +from imod.prepare.topsystem.default_allocation_methods import ( + SimulationAllocationOptions, + SimulationDistributingOptions, +) from imod.typing import GridDataArray +from imod.typing.grid import zeros_like class GroundwaterFlowModel(Modflow6Model): @@ -153,3 +185,242 @@ def update_buoyancy_package(self, transport_models_per_flow_model) -> None: transport_models_old = buoyancy_package.get_transport_model_names() if len(transport_models_old) == len(transport_models_per_flow_model): buoyancy_package.update_transport_models(transport_models_per_flow_model) + + @classmethod + @standard_log_decorator() + def from_imod5_data( + cls, + imod5_data: dict[str, dict[str, GridDataArray]], + period_data: dict[str, list[datetime]], + allocation_options: SimulationAllocationOptions, + distributing_options: SimulationDistributingOptions, + times: list[datetime], + regridder_types: dict[str, RegridMethodType], + ) -> "GroundwaterFlowModel": + """ + Imports a GroundwaterFlowModel (GWF) from the data in an IMOD5 project file. + It adds the packages for which import from imod5 is supported. + Some packages (like OC) must be added manually later. 
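A usage sketch for ``GroundwaterFlowModel.from_imod5_data`` as a whole; the project file path, the simulation times and the default-constructed option objects are assumptions:

from datetime import datetime

import imod
from imod.mf6 import GroundwaterFlowModel
from imod.prepare.topsystem.default_allocation_methods import (
    SimulationAllocationOptions,
    SimulationDistributingOptions,
)

imod5_data, period_data = imod.formats.prj.open_projectfile_data("model.prj")
times = [datetime(2000, 1, 1), datetime(2005, 1, 1), datetime(2010, 1, 1)]

gwf = GroundwaterFlowModel.from_imod5_data(
    imod5_data,
    period_data,
    allocation_options=SimulationAllocationOptions(),
    distributing_options=SimulationDistributingOptions(),
    times=times,
    regridder_types={},  # fall back to the default regrid methods everywhere
)

# The OC package is not imported and must be added manually.
gwf["oc"] = imod.mf6.OutputControl(save_head="last", save_budget="last")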
+ + + Parameters + ---------- + imod5_data: dict[str, dict[str, GridDataArray]] + dictionary containing the arrays mentioned in the project file as xarray datasets, + under the key of the package type to which it belongs + allocation_options: SimulationAllocationOptions + object containing the allocation options per package type. + If you want a package to have a different allocation option, + then it should be imported separately + distributing_options: SimulationDistributingOptions + object containing the conductivity distribution options per package type. + If you want a package to have a different allocation option, + then it should be imported separately + time_min: datetime + Begin-time of the simulation. + time_max: datetime + End-time of the simulation. + regridder_types: dict[str, RegridMethodType] + the key is the package name. The value is a subclass of RegridMethodType. + + Returns + ------- + A GWF model containing the packages that could be imported form IMOD5. Users must still + add the OC package to the model. + + """ + # first import the singleton packages + # import discretization + regrid_cache = RegridderWeightsCache() + + dis_pkg = StructuredDiscretization.from_imod5_data( + imod5_data, + cast(DiscretizationRegridMethod, regridder_types.get("dis")), + regrid_cache, + False, + ) + grid = dis_pkg.dataset["idomain"] + + # import npf + npf_pkg = NodePropertyFlow.from_imod5_data( + imod5_data, + grid, + cast(NodePropertyFlowRegridMethod, regridder_types.get("npf")), + regrid_cache, + ) + + # import sto + sto_pkg = None + if "sto" in imod5_data.keys(): + sto_pkg = StorageCoefficient.from_imod5_data( + imod5_data, + grid, + cast(StorageCoefficientRegridMethod, regridder_types.get("sto")), + regrid_cache, + ) + + # import initial conditions + ic_pkg = InitialConditions.from_imod5_data( + imod5_data, + grid, + cast(InitialConditionsRegridMethod, regridder_types.get("ic")), + regrid_cache, + ) + + # import recharge + rch_pkg = None + if "rch" in imod5_data.keys(): + rch_pkg = Recharge.from_imod5_data( + imod5_data, + dis_pkg, + cast(RechargeRegridMethod, regridder_types.get("rch")), + regrid_cache, + ) + + result = GroundwaterFlowModel() + result["dis"] = dis_pkg + result["npf"] = npf_pkg + if sto_pkg is not None: + result["sto"] = sto_pkg + result["ic"] = ic_pkg + if rch_pkg is not None: + result["rch"] = rch_pkg + + # now import the non-singleton packages' + + # import wells + imod5_keys = list(imod5_data.keys()) + wel_keys = [key for key in imod5_keys if key[0:3] == "wel"] + for wel_key in wel_keys: + wel_key_truncated = wel_key[:16] + if wel_key_truncated in result.keys(): + # Remove this when https://github.com/Deltares/imod-python/issues/1167 + # is resolved + msg = textwrap.dedent( + f"""Truncated key: '{wel_key_truncated}' already assigned to + imported model, please rename wells so that unique names are + formed after they are truncated to 16 characters for MODFLOW + 6. 
+ """ + ) + raise KeyError(msg) + layer = np.array(imod5_data[wel_key]["layer"]) + if np.any(layer == 0): + result[wel_key_truncated] = Well.from_imod5_data( + wel_key, imod5_data, times + ) + else: + result[wel_key_truncated] = LayeredWell.from_imod5_data( + wel_key, imod5_data, times + ) + + # import ghb's + imod5_keys = list(imod5_data.keys()) + ghb_keys = [key for key in imod5_keys if key[0:3] == "ghb"] + for ghb_key in ghb_keys: + ghb_pkg = GeneralHeadBoundary.from_imod5_data( + ghb_key, + imod5_data, + period_data, + dis_pkg, + npf_pkg, + times[0], + times[-1], + allocation_options.ghb, + distributing_options.ghb, + regridder_types=cast( + GeneralHeadBoundaryRegridMethod, regridder_types.get(ghb_key) + ), + regrid_cache=regrid_cache, + ) + result[ghb_key] = ghb_pkg + + # import drainage + + drainage_keys = [key for key in imod5_keys if key[0:3] == "drn"] + for drn_key in drainage_keys: + drn_pkg = Drainage.from_imod5_data( + drn_key, + imod5_data, + period_data, + dis_pkg, + npf_pkg, + times[0], + times[-1], + allocation_options.drn, + distributing_option=distributing_options.drn, + regridder_types=cast( + DrainageRegridMethod, regridder_types.get(drn_key) + ), + regrid_cache=regrid_cache, + ) + result[drn_key] = drn_pkg + + # import rivers ( and drainage to account for infiltration factor) + riv_keys = [key for key in imod5_keys if key[0:3] == "riv"] + for riv_key in riv_keys: + riv_pkg, riv_drn_pkg = River.from_imod5_data( + riv_key, + imod5_data, + period_data, + dis_pkg, + npf_pkg, + times[0], + times[-1], + allocation_options.riv, + distributing_options.riv, + cast(RiverRegridMethod, regridder_types.get(riv_key)), + regrid_cache, + ) + if riv_pkg is not None: + result[riv_key + "riv"] = riv_pkg + if riv_drn_pkg is not None: + result[riv_key + "drn"] = riv_drn_pkg + + # import hfb + hfb_keys = [key for key in imod5_keys if key[0:3] == "hfb"] + if len(hfb_keys) != 0: + for hfb_key in hfb_keys: + result[hfb_key] = ( + SingleLayerHorizontalFlowBarrierResistance.from_imod5_dataset( + hfb_key, imod5_data + ) + ) + + # import chd + chd_keys = [key for key in imod5_keys if key[0:3] == "chd"] + if len(chd_keys) == 0: + result["chd_from_shd"] = ConstantHead.from_imod5_shd_data( + imod5_data, + dis_pkg, + cast(ConstantHeadRegridMethod, regridder_types.get("chd_from_shd")), + regrid_cache, + ) + else: + chd_packages = {} + for chd_key in chd_keys: + chd_packages[chd_key] = ConstantHead.from_imod5_data( + chd_key, + imod5_data, + dis_pkg, + cast(ConstantHeadRegridMethod, regridder_types.get(chd_key)), + regrid_cache, + ) + merged_chd = concat_layered_chd_packages( + "chd", chd_packages, remove_merged_packages=True + ) + if merged_chd is not None: + result["chd_merged"] = merged_chd + for key, chd_package in chd_packages.items(): + result[key] = chd_package + + if "sto" not in result.keys(): + zeros = zeros_like(grid, dtype=float) + result["sto"] = StorageCoefficient( + storage_coefficient=zeros, + specific_yield=zeros, + transient=False, + convertible=zeros.astype(int), + ) + + return result diff --git a/imod/mf6/npf.py b/imod/mf6/npf.py index 1ed4d0131..102518ee3 100644 --- a/imod/mf6/npf.py +++ b/imod/mf6/npf.py @@ -1,11 +1,21 @@ import warnings +from copy import deepcopy +from typing import Optional import numpy as np +import xarray as xr from imod.logging import init_log_decorator from imod.mf6.interfaces.iregridpackage import IRegridPackage from imod.mf6.package import Package -from imod.mf6.regrid.regrid_schemes import NodePropertyFlowRegridMethod +from imod.mf6.regrid.regrid_schemes 
import ( + NodePropertyFlowRegridMethod, +) +from imod.mf6.utilities.imod5_converter import fill_missing_layers +from imod.mf6.utilities.regrid import ( + RegridderWeightsCache, + _regrid_package_data, +) from imod.mf6.validation import PKG_DIMS_SCHEMA from imod.schemata import ( AllValueSchema, @@ -16,6 +26,7 @@ IndexesSchema, ) from imod.typing import GridDataArray +from imod.typing.grid import zeros_like def _dataarray_to_bool(griddataarray: GridDataArray) -> bool: @@ -441,3 +452,79 @@ def _validate(self, schemata, **kwargs): errors = super()._validate(schemata, **kwargs) return errors + + @classmethod + def from_imod5_data( + cls, + imod5_data: dict[str, dict[str, GridDataArray]], + target_grid: GridDataArray, + regridder_types: Optional[NodePropertyFlowRegridMethod] = None, + regrid_cache: RegridderWeightsCache = RegridderWeightsCache(), + ) -> "NodePropertyFlow": + """ + Construct an npf-package from iMOD5 data, loaded with the + :func:`imod.formats.prj.open_projectfile_data` function. + + .. note:: + + The method expects the iMOD5 model to be fully 3D, not quasi-3D. + + Parameters + ---------- + imod5_data: dict + Dictionary with iMOD5 data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. + target_grid: GridDataArray + The grid that should be used for the new package. Does not + need to be identical to one of the input grids. + regridder_types: RegridMethodType, optional + Optional dataclass with regridder types for a specific variable. + Use this to override default regridding methods. + regrid_cache:Optional RegridderWeightsCache + stores regridder weights for different regridders. Can be used to speed up regridding, + if the same regridders are used several times for regridding different arrays. + + Returns + ------- + Modflow 6 npf package. 
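The hunk that follows converts iMOD5 anisotropy angles to MODFLOW 6 ``angle1``. A small numeric sketch of that conversion, with made-up angles:

import xarray as xr

# Hypothetical iMOD5 anisotropy angles in degrees.
imod5_angle = xr.DataArray([0.0, 45.0, 180.0, 270.0])

# angle1 = 90 - angle, wrapped into [0, 360), as done in from_imod5_data below.
angle1 = 90.0 - imod5_angle
angle1 = xr.where(angle1 < 0, 360.0 + angle1, angle1)
print(angle1.values)  # [ 90.  45. 270. 180.]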
+ + """ + + data = { + "k": imod5_data["khv"]["kh"], + } + has_vertical_anisotropy = ( + "kva" in imod5_data.keys() + and "vertical_anisotropy" in imod5_data["kva"].keys() + ) + has_horizontal_anisotropy = "ani" in imod5_data.keys() + + if has_vertical_anisotropy: + data["k33"] = data["k"] * imod5_data["kva"]["vertical_anisotropy"] + if has_horizontal_anisotropy: + if not np.all(np.isnan(imod5_data["ani"]["factor"].values)): + factor = imod5_data["ani"]["factor"] + factor = fill_missing_layers(factor, target_grid, 1) + data["k22"] = data["k"] * factor + if not np.all(np.isnan(imod5_data["ani"]["angle"].values)): + angle1 = imod5_data["ani"]["angle"] + angle1 = 90.0 - angle1 + angle1 = xr.where(angle1 < 0, 360.0 + angle1, angle1) + angle1 = fill_missing_layers(angle1, target_grid, 0) + data["angle1"] = angle1 + + icelltype = zeros_like(target_grid, dtype=int) + + if regridder_types is None: + regridder_types = NodePropertyFlow.get_regrid_methods() + + new_package_data = _regrid_package_data( + data, target_grid, regridder_types, regrid_cache, {} + ) + new_package_data["icelltype"] = icelltype + + return NodePropertyFlow(**new_package_data, validate=True) + + @classmethod + def get_regrid_methods(cls) -> NodePropertyFlowRegridMethod: + return deepcopy(cls._regrid_method) diff --git a/imod/mf6/package.py b/imod/mf6/package.py index bc74c26e7..b3b3b914f 100644 --- a/imod/mf6/package.py +++ b/imod/mf6/package.py @@ -4,7 +4,7 @@ import pathlib from collections import defaultdict from copy import deepcopy -from typing import Any, Dict, List, Mapping, Optional, Tuple, Union +from typing import Any, Callable, Dict, List, Mapping, Optional, Tuple, Union import cftime import jinja2 @@ -25,6 +25,7 @@ ) from imod.mf6.regrid.regrid_schemes import EmptyRegridMethod, RegridMethodType from imod.mf6.utilities.mask import mask_package +from imod.mf6.utilities.package import _is_valid from imod.mf6.utilities.regrid import ( RegridderWeightsCache, _regrid_like, @@ -80,25 +81,12 @@ def sel(self): f"{type(self).__name__}(**{self._pkg_id}.dataset.sel(**selection))" ) - def _valid(self, value): - """ - Filters values that are None, False, or a numpy.bool_ False. - Needs to be this specific, since 0.0 and 0 are valid values, but are - equal to a boolean False. - """ - # Test singletons - if value is False or value is None: - return False - # Test numpy bool (not singleton) - elif isinstance(value, np.bool_) and not value: - return False - # When dumping to netCDF and reading back, None will have been - # converted into a NaN. Only check NaN if it's a floating type to avoid - # TypeErrors. 
- elif np.issubdtype(type(value), np.floating) and np.isnan(value): - return False - else: - return True + def cleanup(self, dis: Any): + raise NotImplementedError("Method not implemented for this package.") + + @staticmethod + def _valid(value: Any) -> bool: + return _is_valid(value) @staticmethod def _number_format(dtype: type): @@ -531,9 +519,7 @@ def clip_box( selection = selection.sel(x=x_slice, y=y_slice) cls = type(self) - new = cls.__new__(cls) - new.dataset = selection - return new + return cls._from_dataset(selection) def mask(self, mask: GridDataArray) -> Any: """ @@ -560,7 +546,7 @@ def mask(self, mask: GridDataArray) -> Any: def regrid_like( self, target_grid: GridDataArray, - regrid_context: RegridderWeightsCache, + regrid_cache: RegridderWeightsCache, regridder_types: Optional[RegridMethodType] = None, ) -> "Package": """ @@ -587,7 +573,7 @@ def regrid_like( ---------- target_grid: xr.DataArray or xu.UgridDataArray a grid defined over the same discretization as the one we want to regrid the package to. - regrid_context: RegridderWeightsCache, optional + regrid_cache: RegridderWeightsCache, optional stores regridder weights for different regridders. Can be used to speed up regridding, if the same regridders are used several times for regridding different arrays. regridder_types: RegridMethodType, optional @@ -600,7 +586,7 @@ def regrid_like( similar to the one used in input argument "target_grid" """ try: - result = _regrid_like(self, target_grid, regrid_context, regridder_types) + result = _regrid_like(self, target_grid, regrid_cache, regridder_types) except ValueError as e: raise e except Exception: @@ -647,6 +633,28 @@ def get_non_grid_data(self, grid_names: list[str]) -> dict[str, Any]: result[name] = self.dataset[name].values[()] return result + def _call_func_on_grids( + self, func: Callable, dis: dict + ) -> dict[str, GridDataArray]: + """ + Call function on dictionary of grids and merge settings back into + dictionary. + + Parameters + ---------- + func: Callable + Function to call on all grids + """ + grid_varnames = list(self._write_schemata.keys()) + grids = { + varname: self.dataset[varname] + for varname in grid_varnames + if varname in self.dataset.keys() + } + cleaned_grids = func(**dis, **grids) + settings = self.get_non_grid_data(grid_varnames) + return cleaned_grids | settings + def is_splitting_supported(self) -> bool: return True @@ -656,5 +664,6 @@ def is_regridding_supported(self) -> bool: def is_clipping_supported(self) -> bool: return True - def get_regrid_methods(self) -> RegridMethodType: - return deepcopy(self._regrid_method) + @classmethod + def get_regrid_methods(cls) -> RegridMethodType: + return deepcopy(cls._regrid_method) diff --git a/imod/mf6/pkgbase.py b/imod/mf6/pkgbase.py index de1051893..477840d34 100644 --- a/imod/mf6/pkgbase.py +++ b/imod/mf6/pkgbase.py @@ -67,6 +67,16 @@ def _netcdf_encoding(self): """ return {} + @classmethod + def _from_dataset(cls, ds: GridDataset): + """ + Create package from dataset. Note that no initialization validation is + done. 
+ """ + instance = cls.__new__(cls) + instance.dataset = ds + return instance + @classmethod def from_file(cls, path, **kwargs): """ @@ -120,6 +130,4 @@ def from_file(cls, path, **kwargs): if isinstance(stripped_value, numbers.Real) and np.isnan(stripped_value): # type: ignore[call-overload] dataset[key] = None - instance = cls.__new__(cls) - instance.dataset = dataset - return instance + return cls._from_dataset(dataset) diff --git a/imod/mf6/rch.py b/imod/mf6/rch.py index 65d2cfa48..afdd7129e 100644 --- a/imod/mf6/rch.py +++ b/imod/mf6/rch.py @@ -1,10 +1,17 @@ +from copy import deepcopy +from typing import Optional + import numpy as np from imod.logging import init_log_decorator from imod.mf6.boundary_condition import BoundaryCondition +from imod.mf6.dis import StructuredDiscretization from imod.mf6.interfaces.iregridpackage import IRegridPackage from imod.mf6.regrid.regrid_schemes import RechargeRegridMethod +from imod.mf6.utilities.imod5_converter import convert_unit_rch_rate +from imod.mf6.utilities.regrid import RegridderWeightsCache, _regrid_package_data from imod.mf6.validation import BOUNDARY_DIMS_SCHEMA, CONC_DIMS_SCHEMA +from imod.prepare.topsystem.allocation import ALLOCATION_OPTION, allocate_rch_cells from imod.schemata import ( AllInsideNoDataSchema, AllNoDataSchema, @@ -16,6 +23,11 @@ IndexesSchema, OtherCoordsSchema, ) +from imod.typing import GridDataArray +from imod.typing.grid import ( + enforce_dim_order, + is_planar_grid, +) class Recharge(BoundaryCondition, IRegridPackage): @@ -143,3 +155,80 @@ def _validate(self, schemata, **kwargs): errors = super()._validate(schemata, **kwargs) return errors + + @classmethod + def from_imod5_data( + cls, + imod5_data: dict[str, dict[str, GridDataArray]], + dis_pkg: StructuredDiscretization, + regridder_types: Optional[RechargeRegridMethod] = None, + regrid_cache: RegridderWeightsCache = RegridderWeightsCache(), + ) -> "Recharge": + """ + Construct an rch-package from iMOD5 data, loaded with the + :func:`imod.formats.prj.open_projectfile_data` function. + + .. note:: + + The method expects the iMOD5 model to be fully 3D, not quasi-3D. + + Parameters + ---------- + imod5_data: dict + Dictionary with iMOD5 data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. + dis_pkg: GridDataArray + The discretization package for the simulation. Its grid does not + need to be identical to one of the input grids. + regridder_types: RechargeRegridMethod, optional + Optional dataclass with regridder types for a specific variable. + Use this to override default regridding methods. + regrid_cache:Optional RegridderWeightsCache + stores regridder weights for different regridders. Can be used to speed up regridding, + if the same regridders are used several times for regridding different arrays. + + Returns + ------- + Modflow 6 rch package. + + """ + new_idomain = dis_pkg.dataset["idomain"] + data = { + "rate": convert_unit_rch_rate(imod5_data["rch"]["rate"]), + } + new_package_data = {} + + # first regrid the inputs to the target grid. + if regridder_types is None: + regridder_settings = Recharge.get_regrid_methods() + + new_package_data = _regrid_package_data( + data, new_idomain, regridder_settings, regrid_cache, {} + ) + + # if rate has only layer 0, then it is planar. 
+ if is_planar_grid(new_package_data["rate"]): + if "layer" in new_package_data["rate"].dims: + planar_rate_regridded = new_package_data["rate"].isel( + layer=0, drop=True + ) + else: + planar_rate_regridded = new_package_data["rate"] + + # create an array indicating in which cells rch is active + is_rch_cell = allocate_rch_cells( + ALLOCATION_OPTION.at_first_active, + new_idomain == 1, + planar_rate_regridded, + ) + + # remove rch from cells where it is not allocated and broadcast over layers. + rch_rate = planar_rate_regridded.where(is_rch_cell) + rch_rate = enforce_dim_order(rch_rate) + new_package_data["rate"] = rch_rate + + return Recharge(**new_package_data, validate=True, fixed_cell=False) + + @classmethod + def get_regrid_methods(cls) -> RechargeRegridMethod: + return deepcopy(cls._regrid_method) diff --git a/imod/mf6/regrid/regrid_schemes.py b/imod/mf6/regrid/regrid_schemes.py index 7d6c6e69f..33d496e39 100644 --- a/imod/mf6/regrid/regrid_schemes.py +++ b/imod/mf6/regrid/regrid_schemes.py @@ -20,6 +20,9 @@ class RegridMethodType(Protocol): __dataclass_fields__: ClassVar[dict] + def asdict(self) -> dict: + return vars(self) + @dataclass(config=_CONFIG) class ConstantHeadRegridMethod(RegridMethodType): @@ -51,6 +54,7 @@ class ConstantHeadRegridMethod(RegridMethodType): "mean", ) # TODO: should be set to barycentric once supported concentration: _RegridVarType = (RegridderType.OVERLAP, "mean") + ibound: _RegridVarType = (RegridderType.OVERLAP, "mode") @dataclass(config=_CONFIG) @@ -404,6 +408,7 @@ class RiverRegridMethod(RegridMethodType): conductance: _RegridVarType = (RegridderType.RELATIVEOVERLAP, "conductance") bottom_elevation: _RegridVarType = (RegridderType.OVERLAP, "mean") concentration: _RegridVarType = (RegridderType.OVERLAP, "mean") + infiltration_factor: _RegridVarType = (RegridderType.OVERLAP, "mean") @dataclass(config=_CONFIG) diff --git a/imod/mf6/riv.py b/imod/mf6/riv.py index 8c4c0358c..b0ae22d3d 100644 --- a/imod/mf6/riv.py +++ b/imod/mf6/riv.py @@ -1,10 +1,31 @@ +from copy import deepcopy +from datetime import datetime +from typing import Optional, Tuple + import numpy as np +import xarray as xr -from imod.logging import init_log_decorator +from imod import logging +from imod.logging import init_log_decorator, standard_log_decorator +from imod.logging.loglevel import LogLevel from imod.mf6.boundary_condition import BoundaryCondition +from imod.mf6.dis import StructuredDiscretization +from imod.mf6.disv import VerticesDiscretization +from imod.mf6.drn import Drainage from imod.mf6.interfaces.iregridpackage import IRegridPackage +from imod.mf6.npf import NodePropertyFlow from imod.mf6.regrid.regrid_schemes import RiverRegridMethod +from imod.mf6.utilities.regrid import ( + RegridderWeightsCache, + _regrid_package_data, +) from imod.mf6.validation import BOUNDARY_DIMS_SCHEMA, CONC_DIMS_SCHEMA +from imod.prepare.cleanup import cleanup_riv +from imod.prepare.topsystem.allocation import ALLOCATION_OPTION, allocate_riv_cells +from imod.prepare.topsystem.conductance import ( + DISTRIBUTING_OPTION, + distribute_riv_conductance, +) from imod.schemata import ( AllInsideNoDataSchema, AllNoDataSchema, @@ -16,6 +37,9 @@ IndexesSchema, OtherCoordsSchema, ) +from imod.typing import GridDataArray +from imod.typing.grid import enforce_dim_order, is_planar_grid +from imod.util.expand_repetitions import expand_repetitions class River(BoundaryCondition, IRegridPackage): @@ -162,3 +186,287 @@ def _validate(self, schemata, **kwargs): errors = super()._validate(schemata, **kwargs) return 
errors + + @standard_log_decorator() + def cleanup(self, dis: StructuredDiscretization | VerticesDiscretization) -> None: + """ + Clean up package in place. This method calls + :func:`imod.prepare.cleanup.cleanup_riv`, see documentation of that + function for details on cleanup. + + Parameters + ---------- + dis: imod.mf6.StructuredDiscretization | imod.mf6.VerticesDiscretization + Model discretization package. + """ + dis_dict = {"idomain": dis.dataset["idomain"], "bottom": dis.dataset["bottom"]} + cleaned_dict = self._call_func_on_grids(cleanup_riv, dis_dict) + super().__init__(cleaned_dict) + + @classmethod + def from_imod5_data( + cls, + key: str, + imod5_data: dict[str, dict[str, GridDataArray]], + period_data: dict[str, list[datetime]], + target_discretization: StructuredDiscretization, + target_npf: NodePropertyFlow, + time_min: datetime, + time_max: datetime, + allocation_option_riv: ALLOCATION_OPTION, + distributing_option_riv: DISTRIBUTING_OPTION, + regridder_types: Optional[RiverRegridMethod] = None, + regrid_cache: RegridderWeightsCache = RegridderWeightsCache(), + ) -> Tuple[Optional["River"], Optional[Drainage]]: + """ + Construct a river-package from iMOD5 data, loaded with the + :func:`imod.formats.prj.open_projectfile_data` function. + + .. note:: + + The method expects the iMOD5 model to be fully 3D, not quasi-3D. + + Parameters + ---------- + key: str + Package name of the package that needs to be converted to a river + package. + imod5_data: dict + Dictionary with iMOD5 data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. + period_data: dict + Dictionary with iMOD5 period data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. + target_discretization: StructuredDiscretization + The discretization package defining the grid of the new package. Its grid does not + need to be identical to one of the input grids. + target_npf: NodePropertyFlow + The node property flow package of the target model; its hydraulic + conductivity is used when distributing conductance over layers. + time_min: datetime + Begin-time of the simulation. Used for expanding period data. + time_max: datetime + End-time of the simulation. Used for expanding period data. + allocation_option_riv: ALLOCATION_OPTION + Allocation option for the river cells. + distributing_option_riv: DISTRIBUTING_OPTION + Distributing option for the river conductance. + regridder_types: RiverRegridMethod, optional + Optional dataclass with regridder types for a specific variable. + Use this to override default regridding methods. + regrid_cache: RegridderWeightsCache, optional + Stores regridder weights for different regridders. Can be used to speed up regridding, + if the same regridders are used several times for regridding different arrays. + + Returns + ------- + A MODFLOW 6 River package, and a Drainage package to account + for the infiltration factor which exists in iMOD5 but not in MODFLOW 6. + Both the river package and the drainage package can be None; + this can happen if the infiltration factor is 0 or 1 everywhere. 
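An illustrative usage sketch; ``imod5_data``, ``period_data``, ``dis_pkg``, ``npf_pkg``, ``time_min``, ``time_max``, ``allocation_options`` and ``distributing_options`` are placeholders (the latter two of type ``SimulationAllocationOptions`` / ``SimulationDistributingOptions``), and "riv-1" is a placeholder package key:

>>> riv_pkg, infiltration_drn_pkg = imod.mf6.River.from_imod5_data(
...     "riv-1", imod5_data, period_data, dis_pkg, npf_pkg,
...     time_min, time_max, allocation_options.riv, distributing_options.riv,
... )
>>> # Either returned package may be None, depending on the infiltration factor.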
+ """ + + logger = logging.logger + # gather discretrizations + target_top = target_discretization.dataset["top"] + target_bottom = target_discretization.dataset["bottom"] + target_idomain = target_discretization.dataset["idomain"] + target_k = target_npf.dataset["k"] + + # gather input data + data = { + "conductance": imod5_data[key]["conductance"].copy(deep=True), + "stage": imod5_data[key]["stage"].copy(deep=True), + "bottom_elevation": imod5_data[key]["bottom_elevation"].copy(deep=True), + "infiltration_factor": imod5_data[key]["infiltration_factor"].copy( + deep=True + ), + } + is_planar_conductance = is_planar_grid(data["conductance"]) + + # set up regridder methods + if regridder_types is None: + regridder_types = River.get_regrid_methods() + # regrid the input data + regridded_package_data = _regrid_package_data( + data, target_idomain, regridder_types, regrid_cache, {} + ) + + conductance = regridded_package_data["conductance"] + infiltration_factor = regridded_package_data["infiltration_factor"] + + if is_planar_conductance: + riv_allocation = allocate_riv_cells( + allocation_option_riv, + target_idomain == 1, + target_top, + target_bottom, + regridded_package_data["stage"], + regridded_package_data["bottom_elevation"], + ) + + regridded_package_data["conductance"] = distribute_riv_conductance( + distributing_option_riv, + riv_allocation[0], + conductance, + target_top, + target_bottom, + target_k, + regridded_package_data["stage"], + regridded_package_data["bottom_elevation"], + ) + + # create layered arrays of stage and bottom elevation + layered_stage = regridded_package_data["stage"].where(riv_allocation[0]) + layered_stage = enforce_dim_order(layered_stage) + regridded_package_data["stage"] = layered_stage + + layered_bottom_elevation = regridded_package_data["bottom_elevation"].where( + riv_allocation[0] + ) + layered_bottom_elevation = enforce_dim_order(layered_bottom_elevation) + + # due to regridding, the layered_bottom_elevation could be smaller than the + # bottom, so here we overwrite it with bottom if that's + # the case. + + if np.any((target_bottom > layered_bottom_elevation).values[()]): + logger.log( + loglevel=LogLevel.WARNING, + message="Note: riv bottom was detected below model bottom. 
Updated the riv's bottom.", + additional_depth=0, + ) + layered_bottom_elevation = xr.where( + target_bottom > layered_bottom_elevation, + target_bottom, + layered_bottom_elevation, + ) + + regridded_package_data["bottom_elevation"] = layered_bottom_elevation + + # update the conductance of the river package to account for the infiltration + # factor + drain_conductance, river_conductance = cls.split_conductance( + regridded_package_data["conductance"], infiltration_factor + ) + regridded_package_data["conductance"] = river_conductance + regridded_package_data.pop("infiltration_factor") + regridded_package_data["bottom_elevation"] = enforce_dim_order( + regridded_package_data["bottom_elevation"] + ) + + river_package = River(**regridded_package_data, validate=True) + optional_river_package: Optional[River] = None + optional_drainage_package: Optional[Drainage] = None + # create a drainage package with the conductance we computed from the infiltration factor + drainage_arrays = { + "stage": regridded_package_data["stage"], + "conductance": drain_conductance, + } + + drainage_package = cls.create_infiltration_factor_drain( + drainage_arrays["stage"], + drainage_arrays["conductance"], + ) + # remove River package if its mask is False everywhere + mask = ~np.isnan(river_conductance) + if np.any(mask): + optional_river_package = river_package.mask(mask) + else: + optional_river_package = None + + # remove Drainage package if its mask is False everywhere + mask = ~np.isnan(drain_conductance) + if np.any(mask): + optional_drainage_package = drainage_package.mask(mask) + else: + optional_drainage_package = None + + repeat = period_data.get(key) + if repeat is not None: + if optional_river_package is not None: + optional_river_package.set_repeat_stress( + expand_repetitions(repeat, time_min, time_max) + ) + if optional_drainage_package is not None: + optional_drainage_package.set_repeat_stress( + expand_repetitions(repeat, time_min, time_max) + ) + + return (optional_river_package, optional_drainage_package) + + @classmethod + def create_infiltration_factor_drain( + cls, + drain_elevation: GridDataArray, + drain_conductance: GridDataArray, + ): + """ + Create a drainage package from the river package, to account for the infiltration factor. + This factor is optional in imod5, but it does not exist in MF6, so we mimic its effect + with a Drainage boundary. + """ + + mask = ~np.isnan(drain_conductance) + drainage = Drainage(drain_elevation, drain_conductance) + drainage.mask(mask) + return drainage + + @classmethod + def split_conductance(cls, conductance, infiltration_factor): + """ + Seperates (exfiltration) conductance with an infiltration factor (iMODFLOW) into + a drainage conductance and a river conductance following methods explained in Zaadnoordijk (2009). + + Parameters + ---------- + conductance : xr.DataArray or float + Exfiltration conductance. Is the default conductance provided to the iMODFLOW river package + infiltration_factor : xr.DataArray or float + Infiltration factor. The exfiltration conductance is multiplied with this factor to compute + the infiltration conductance. 
If 0, no infiltration takes place; if 1, infiltration is equal to exfiltration + + Returns + ------- + drainage_conductance : xr.DataArray + conductance for the drainage package + river_conductance : xr.DataArray + conductance for the river package + + Derivation + ---------- + From Zaadnoordijk (2009): + [1] cond_RIV = A/ci + [2] cond_DRN = A * (ci-cd) / (ci*cd) + Where cond_RIV and cond_DRN repsectively are the River and Drainage conductance [L^2/T], + A is the cell area [L^2] and ci and cd respectively are the infiltration and exfiltration resistance [T] + + Taking f as the infiltration factor and cond_d as the exfiltration conductance, we can write (iMOD manual): + [3] ci = cd * (1/f) + [4] cond_d = A/cd + + We can then rewrite equations 1 and 2 to: + [5] cond_RIV = f * cond_d + [6] cond_DRN = (1-f) * cond_d + + References + ---------- + Zaadnoordijk, W. (2009). + Simulating Piecewise-Linear Surface Water and Ground Water Interactions with MODFLOW. + Ground Water. + https://ngwa.onlinelibrary.wiley.com/doi/10.1111/j.1745-6584.2009.00582.x + + iMOD manual v5.2 (2020) + https://oss.deltares.nl/web/imod/ + + """ + if np.any(infiltration_factor > 1): + raise ValueError("The infiltration factor should not exceed 1") + + drainage_conductance = conductance * (1 - infiltration_factor) + + river_conductance = conductance * infiltration_factor + + # clean up the packages + drainage_conductance = drainage_conductance.where(drainage_conductance > 0) + river_conductance = river_conductance.where(river_conductance > 0) + return drainage_conductance, river_conductance + + @classmethod + def get_regrid_methods(cls) -> RiverRegridMethod: + return deepcopy(cls._regrid_method) diff --git a/imod/mf6/simulation.py b/imod/mf6/simulation.py index d41505aa1..2a4af57e5 100644 --- a/imod/mf6/simulation.py +++ b/imod/mf6/simulation.py @@ -5,6 +5,7 @@ import subprocess import warnings from copy import deepcopy +from datetime import datetime from pathlib import Path from typing import Any, Callable, DefaultDict, Iterable, Optional, Union, cast @@ -24,7 +25,7 @@ from imod.mf6.gwfgwf import GWFGWF from imod.mf6.gwfgwt import GWFGWT from imod.mf6.gwtgwt import GWTGWT -from imod.mf6.ims import Solution +from imod.mf6.ims import Solution, SolutionPresetModerate from imod.mf6.interfaces.imodel import IModel from imod.mf6.interfaces.isimulation import ISimulation from imod.mf6.model import Modflow6Model @@ -37,11 +38,17 @@ from imod.mf6.multimodel.modelsplitter import create_partition_info, slice_model from imod.mf6.out import open_cbc, open_conc, open_hds from imod.mf6.package import Package +from imod.mf6.regrid.regrid_schemes import RegridMethodType from imod.mf6.ssm import SourceSinkMixing from imod.mf6.statusinfo import NestedStatusInfo from imod.mf6.utilities.mask import _mask_all_models from imod.mf6.utilities.regrid import _regrid_like +from imod.mf6.validation_context import ValidationContext from imod.mf6.write_context import WriteContext +from imod.prepare.topsystem.default_allocation_methods import ( + SimulationAllocationOptions, + SimulationDistributingOptions, +) from imod.schemata import ValidationError from imod.typing import GridDataArray, GridDataset from imod.typing.grid import ( @@ -91,6 +98,7 @@ def __init__(self, name): self.name = name self.directory = None self._initialize_template() + self._validation_context = ValidationContext() def __setitem__(self, key, value): super().__setitem__(key, value) @@ -247,8 +255,9 @@ def write( """ # create write context write_context = WriteContext(directory, 
binary, use_absolute_paths) + self._validation_context.validate = validate if self.is_split(): - write_context.is_partitioned = True + self._validation_context.strict_well_validation = False # Check models for required content for key, model in self.items(): @@ -290,8 +299,8 @@ def write( value.write( modelname=key, globaltimes=globaltimes, - validate=validate, write_context=model_write_context, + validate_context=self._validation_context, ) ) elif isinstance(value, Package): @@ -1316,3 +1325,67 @@ def mask_all_models( -1 sets cells to vertical passthrough """ _mask_all_models(self, mask) + + @classmethod + @standard_log_decorator() + def from_imod5_data( + cls, + imod5_data: dict[str, dict[str, GridDataArray]], + period_data: dict[str, dict[str, GridDataArray]], + allocation_options: SimulationAllocationOptions, + distributing_options: SimulationDistributingOptions, + times: list[datetime], + regridder_types: dict[str, RegridMethodType] = {}, + ) -> "Modflow6Simulation": + """ + Imports a GroundwaterFlowModel (GWF) from the data in an IMOD5 project file. + It adds the packages for which import from imod5 is supported. + Some packages (like OC) must be added manually later. + + + Parameters + ---------- + imod5_data: dict[str, dict[str, GridDataArray]] + dictionary containing the arrays mentioned in the project file as xarray datasets, + under the key of the package type to which it belongs + allocation_options: SimulationAllocationOptions + object containing the allocation options per package type. + If you want a package to have a different allocation option, + then it should be imported separately + distributing_options: SimulationDistributingOptions + object containing the conductivity distribution options per package type. + If you want a package to have a different allocation option, + then it should be imported separately + times: list[datetime] + time discretization of the model to be imported. + regridder_types: dict[str, RegridMethodType] + the key is the package name. 
The value is the RegridMethodType + object containing the settings for regridding the package with the + specified key + + Returns + ------- + """ + simulation = Modflow6Simulation("imported_simulation") + simulation._validation_context.strict_well_validation = False + + # import GWF model, + groundwaterFlowModel = GroundwaterFlowModel.from_imod5_data( + imod5_data, + period_data, + allocation_options, + distributing_options, + times, + regridder_types, + ) + simulation["imported_model"] = groundwaterFlowModel + + # generate ims package + solution = SolutionPresetModerate( + ["imported_model"], + print_option="all", + ) + simulation["ims"] = solution + + simulation.create_time_discretization(additional_times=times) + return simulation diff --git a/imod/mf6/ssm.py b/imod/mf6/ssm.py index bc64ce224..d9e8f6f22 100644 --- a/imod/mf6/ssm.py +++ b/imod/mf6/ssm.py @@ -4,6 +4,7 @@ from imod.mf6 import GroundwaterFlowModel from imod.mf6.boundary_condition import BoundaryCondition from imod.mf6.interfaces.iregridpackage import IRegridPackage +from imod.mf6.regrid.regrid_schemes import EmptyRegridMethod, RegridMethodType from imod.schemata import DTypeSchema @@ -165,3 +166,7 @@ def from_flow_model( save_flows=save_flows, validate=validate, ) + + @classmethod + def get_regrid_methods(cls) -> RegridMethodType: + return EmptyRegridMethod() diff --git a/imod/mf6/sto.py b/imod/mf6/sto.py index b3849f67b..b39e805ad 100644 --- a/imod/mf6/sto.py +++ b/imod/mf6/sto.py @@ -1,5 +1,6 @@ import abc -from typing import Any, Dict +from copy import deepcopy +from typing import Any, Dict, Optional import numpy as np @@ -10,6 +11,7 @@ SpecificStorageRegridMethod, StorageCoefficientRegridMethod, ) +from imod.mf6.utilities.regrid import RegridderWeightsCache, _regrid_package_data from imod.mf6.validation import PKG_DIMS_SCHEMA from imod.schemata import ( AllValueSchema, @@ -18,6 +20,8 @@ IdentityNoDataSchema, IndexesSchema, ) +from imod.typing import GridDataArray +from imod.typing.grid import zeros_like class Storage(Package): @@ -191,6 +195,10 @@ def render(self, directory, pkgname, globaltimes, binary): d = self._render_dict(directory, pkgname, globaltimes, binary) return self._template.render(d) + @classmethod + def get_regrid_methods(cls) -> SpecificStorageRegridMethod: + return deepcopy(cls._regrid_method) + class StorageCoefficient(StorageBase): """ @@ -288,12 +296,7 @@ class StorageCoefficient(StorageBase): AllValueSchema(">=", 0.0), IdentityNoDataSchema(other="idomain", is_other_notnull=(">", 0)), ), - "convertible": ( - IdentityNoDataSchema(other="idomain", is_other_notnull=(">", 0)), - # No need to check coords: dataset ensures they align with idomain. - ), } - _template = Package._initialize_template(_pkg_id) _regrid_method = StorageCoefficientRegridMethod() @@ -321,3 +324,63 @@ def render(self, directory, pkgname, globaltimes, binary): d = self._render_dict(directory, pkgname, globaltimes, binary) d["storagecoefficient"] = True return self._template.render(d) + + @classmethod + def from_imod5_data( + cls, + imod5_data: dict[str, dict[str, GridDataArray]], + target_grid: GridDataArray, + regridder_types: Optional[StorageCoefficientRegridMethod] = None, + regrid_cache: RegridderWeightsCache = RegridderWeightsCache(), + ) -> "StorageCoefficient": + """ + Construct a StorageCoefficient-package from iMOD5 data, loaded with the + :func:`imod.formats.prj.open_projectfile_data` function. + + .. note:: + + The method expects the iMOD5 model to be fully 3D, not quasi-3D. 
+ + Parameters + ---------- + imod5_data: dict + Dictionary with iMOD5 data. This can be constructed from the + :func:`imod.formats.prj.open_projectfile_data` method. + target_grid: GridDataArray + The grid that should be used for the new package. Does not + need to be identical to one of the input grids. + regridder_types: StorageCoefficientRegridMethod, optional + Optional dataclass with regridder types for a specific variable. + Use this to override default regridding methods. + regrid_cache: RegridderWeightsCache, optional + Stores regridder weights for different regridders. Can be used to speed up regridding, + if the same regridders are used several times for regridding different arrays. + + Returns + ------- + Modflow 6 StorageCoefficient package. Its specific yield is left unset and it is transient if any storage_coefficient + is larger than 0. All cells are set to inconvertible (they stay confined throughout the simulation). + """ + + data = { + "storage_coefficient": imod5_data["sto"]["storage_coefficient"], + } + + if regridder_types is None: + regridder_types = StorageCoefficient.get_regrid_methods() + + new_package_data = _regrid_package_data( + data, target_grid, regridder_types, regrid_cache, {} + ) + + new_package_data["convertible"] = zeros_like(target_grid, dtype=int) + new_package_data["transient"] = np.any( + new_package_data["storage_coefficient"].values > 0 + ) + new_package_data["specific_yield"] = None + + return cls(**new_package_data, validate=True, save_flows=False) + + @classmethod + def get_regrid_methods(cls) -> StorageCoefficientRegridMethod: + return deepcopy(cls._regrid_method) diff --git a/imod/mf6/utilities/chd_concat.py b/imod/mf6/utilities/chd_concat.py new file mode 100644 index 000000000..8aae8ab40 --- /dev/null +++ b/imod/mf6/utilities/chd_concat.py @@ -0,0 +1,48 @@ +from typing import Optional + +import xarray as xr + +from imod.mf6.chd import ConstantHead + + +def concat_layered_chd_packages( + name: str, + dict_packages: dict[str, ConstantHead], + remove_merged_packages: bool = True, +) -> Optional[ConstantHead]: + """ + Merge chd packages whose name starts with "name" into a single chd + package. This is aimed at chd packages that are split over layers, so we + would have "chd-1", "chd-2" and so on, each defining a chd package for + layer 1, 2, etc. If remove_merged_packages is True, the packages that are + concatenated are removed from the input dictionary, so that on output it + only contains the packages that were not merged. + + Parameters + ---------- + name: str + The name of the package that was split over layers. + If they are called "chd-1" and so on, then set name to "chd". + dict_packages: dict[str, ConstantHead] + Dictionary with package names as keys and the packages as values. + remove_merged_packages: bool = True + Set to True to remove merged packages from dict_packages. 
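An illustrative usage sketch; ``chd_packages`` is a placeholder dictionary of per-layer ConstantHead packages named "chd-1", "chd-2", and so on:

>>> from imod.mf6.utilities.chd_concat import concat_layered_chd_packages
>>> merged = concat_layered_chd_packages("chd", chd_packages, remove_merged_packages=True)
>>> # merged is None if no package name started with "chd"; otherwise the
>>> # concatenated packages have been popped from chd_packages.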
+ """ + + candidate_keys = [k for k in dict_packages.keys() if name in k[0 : len(name)]] + if len(candidate_keys) == 0: + return None + + dataset_list = [] + for key in candidate_keys: + pack = dict_packages[key] + dataset_list.append(pack.dataset) + if remove_merged_packages: + dict_packages.pop(key) + + concat_dataset = xr.concat( + dataset_list, dim="layer", compat="equals", data_vars="different" + ) + return ConstantHead._from_dataset(concat_dataset) diff --git a/imod/mf6/utilities/clip.py b/imod/mf6/utilities/clip.py index 45bbe5b31..31ef95fb3 100644 --- a/imod/mf6/utilities/clip.py +++ b/imod/mf6/utilities/clip.py @@ -11,7 +11,11 @@ from imod.mf6.interfaces.ipackagebase import IPackageBase from imod.mf6.interfaces.ipointdatapackage import IPointDataPackage from imod.mf6.utilities.grid import get_active_domain_slice -from imod.typing import GridDataArray +from imod.mf6.utilities.hfb import ( + clipped_hfb_zlinestrings_to_zpolygons, + hfb_zpolygons_to_zlinestrings, +) +from imod.typing import GeoDataFrameType, GridDataArray from imod.typing.grid import bounding_polygon, is_spatial_grid from imod.util.imports import MissingOptionalModule @@ -50,9 +54,7 @@ def clip_by_grid(package: IPackageBase, active: xu.UgridDataArray) -> IPackageBa clipped_dataset = package.dataset.isel(domain_slice, missing_dims="ignore") cls = type(package) - new = cls.__new__(cls) - new.dataset = clipped_dataset - return new + return cls._from_dataset(clipped_dataset) @typedispatch # type: ignore[no-redef] @@ -70,9 +72,7 @@ def clip_by_grid( # noqa: F811 selection = package.dataset.loc[{"index": is_inside_exterior}] cls = type(package) - new = cls.__new__(cls) - new.dataset = selection - return new + return cls._from_dataset(selection) def _filter_inactive_cells(package, active): @@ -97,10 +97,18 @@ def _filter_inactive_cells(package, active): @typedispatch # type: ignore[no-redef, misc] def clip_by_grid(package: ILineDataPackage, active: GridDataArray) -> ILineDataPackage: # noqa: F811 """Clip LineDataPackage outside unstructured/structured grid.""" + clipped_line_data = clip_line_gdf_by_grid(package.line_data, active) - # Clip line with polygon - bounding_gdf = bounding_polygon(active) - clipped_line_data = package.line_data.clip(bounding_gdf) + # Create new instance + clipped_package = deepcopy(package) + clipped_package.line_data = clipped_line_data + return clipped_package + + +def _clip_linestring( + gdf_linestrings: GeoDataFrameType, bounding_gdf: GeoDataFrameType +) -> GeoDataFrameType: + clipped_line_data = gdf_linestrings.clip(bounding_gdf) # Catch edge case: when line crosses only vertex of polygon, a point # or multipoint is returned. Drop these. 
@@ -110,10 +118,35 @@ def clip_by_grid(package: ILineDataPackage, active: GridDataArray) -> ILineDataP ) clipped_line_data = clipped_line_data[~is_points] - # Convert MultiLineStrings to LineStrings - clipped_line_data = clipped_line_data.explode("geometry", ignore_index=True) + if clipped_line_data.index.shape[0] == 0: + # Shortcut if GeoDataFrame is empty + return clipped_line_data - # Create new instance - clipped_package = deepcopy(package) - clipped_package.line_data = clipped_line_data - return clipped_package + # Convert MultiLineStrings to LineStrings, index parts of MultiLineStrings + clipped_line_data = clipped_line_data.explode( + "geometry", ignore_index=False, index_parts=True + ) + if clipped_line_data.index.nlevels == 3: + index_names = ["bound", "index", "parts"] + else: + index_names = ["index", "parts"] + clipped_line_data.index = clipped_line_data.index.set_names(index_names) + return clipped_line_data + + +def clip_line_gdf_by_grid( + gdf: GeoDataFrameType, active: GridDataArray +) -> GeoDataFrameType: + """Clip GeoDataFrame by bounding polygon of grid""" + # Clip line with polygon + bounding_gdf = bounding_polygon(active) + + if (shapely.get_type_id(gdf.geometry) == shapely.GeometryType.POLYGON).any(): + # Shapely returns z linestrings when clipping our vertical z polygons. + # To work around this convert polygons to zlinestrings to clip. + # Consequently construct polygons from these clipped linestrings. + gdf_linestrings = hfb_zpolygons_to_zlinestrings(gdf) + clipped_linestrings = _clip_linestring(gdf_linestrings, bounding_gdf) + return clipped_hfb_zlinestrings_to_zpolygons(clipped_linestrings) + else: + return _clip_linestring(gdf, bounding_gdf) diff --git a/imod/mf6/utilities/grid.py b/imod/mf6/utilities/grid.py index d5d5ac8be..c643fd94c 100644 --- a/imod/mf6/utilities/grid.py +++ b/imod/mf6/utilities/grid.py @@ -80,3 +80,22 @@ def create_geometric_grid_info(active: xr.DataArray) -> pd.DataFrame: "dy": dy.flatten(), } ) + + +def create_smallest_target_grid(*grids: xr.DataArray) -> xr.DataArray: + """ + Create smallest target grid from multiple structured grids. This is the grid + with smallest extent and finest resolution amongst all provided grids. 
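An illustrative doctest-style sketch; ``grid_a`` and ``grid_b`` are placeholder structured ``xr.DataArray`` grids with x/y coordinates:

>>> from imod.mf6.utilities.grid import create_smallest_target_grid
>>> target = create_smallest_target_grid(grid_a, grid_b)
>>> # target covers only the overlapping extent and uses the finest cell size of both grids.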
+ """ + dx_ls, xmin_ls, xmax_ls, dy_ls, ymin_ls, ymax_ls = zip( + *[imod.util.spatial.spatial_reference(grid) for grid in grids] + ) + + dx = min(dx_ls) + xmin = max(xmin_ls) + xmax = min(xmax_ls) + dy = max(dy_ls) + ymax = min(ymax_ls) + ymin = max(ymin_ls) + + return imod.util.spatial.empty_2d(dx, xmin, xmax, dy, ymin, ymax) diff --git a/imod/mf6/utilities/hfb.py b/imod/mf6/utilities/hfb.py new file mode 100644 index 000000000..1b43763fe --- /dev/null +++ b/imod/mf6/utilities/hfb.py @@ -0,0 +1,191 @@ +from typing import TYPE_CHECKING, Tuple + +import pandas as pd + +from imod.typing import GeoDataFrameType, GeoSeriesType +from imod.util.imports import MissingOptionalModule + +if TYPE_CHECKING: + import geopandas as gpd +else: + try: + import geopandas as gpd + except ImportError: + gpd = MissingOptionalModule("geopandas") + +try: + import shapely +except ImportError: + shapely = MissingOptionalModule("shapely") + + +def _create_zlinestring_from_bound_df(bound: pd.DataFrame) -> GeoDataFrameType: + """Create geodataframe with linestring geometry from dataframe with bounds.""" + bound = _prepare_index_names(bound) + # Make sure only x, y, z or x, y in columns + columns = sorted({"x", "y", "z"} & set(bound.columns)) + index_names = list(bound.index.names) + # Prevent multiindex to be created in groupby by avoiding list + if bound.index.name: + index_to_group = bound.index.name + else: + index_to_group = index_names + # Each linestring has its own index, therefore groupby index. + mapping_linestrings = [ + (g[0], shapely.LineString(g[1][columns].values)) + for g in bound.groupby(index_to_group) + ] + index, linestrings = zip(*mapping_linestrings) + + gdf = gpd.GeoDataFrame( + linestrings, index=index, columns=["geometry"], geometry="geometry" + ) + gdf.index = gdf.index.set_names(index_names) + return gdf + + +def _create_zpolygon_from_polygon_df(polygon_df: pd.DataFrame) -> GeoDataFrameType: + """ + Create geodataframe with polygon geometry from dataframe with polygon + nodes. The dataframe with nodes must have a multi-index with name ["index", + "parts"] + """ + index_names = ["index", "parts"] + polygons = [ + (g[0], shapely.Polygon(g[1].values)) for g in polygon_df.groupby(index_names) + ] + index_tuples, polygons_data = list(zip(*polygons)) + multi_index = pd.MultiIndex.from_tuples(index_tuples, names=index_names) + return gpd.GeoDataFrame( + polygons_data, columns=["geometry"], index=multi_index, geometry="geometry" + ) + + +def _prepare_index_names( + dataframe: GeoDataFrameType, +) -> GeoDataFrameType: + """ + Prepare index names, if single index, index should be named 'index', if + multi-index, it should be '(index, parts)'; where 'index' refers to the line + index of the original linestrings provided by user and 'parts' to segment of + this linestring after clipping. If the line index was not named 'index', but + is None, this function sets it to 'index'. This is aligned with how pandas + names an unnamed index when calling df.reset_index(). + """ + index_names = dataframe.index.names + + match index_names: + case ["index"] | ["index", "parts"]: + return dataframe + case [None]: # Unnamed line index + new_index_names = ["index"] + case [None, "parts"]: # Unnamed line index + new_index_names = ["index", "parts"] + case _: + raise IndexError( + f"Index names should be ['index'] or ['index', 'parts']. 
Got {index_names}" + ) + + dataframe.index = dataframe.index.set_names(new_index_names) + return dataframe + + +def _extract_hfb_bounds_from_zpolygons( + dataframe: GeoDataFrameType, +) -> Tuple[pd.DataFrame, pd.DataFrame]: + """ + Extract hfb bounds from dataframe. Requires dataframe geometry to be of type + shapely "Z Polygon". + """ + dataframe = _prepare_index_names(dataframe) + + if not dataframe.geometry.has_z.all(): + raise TypeError("GeoDataFrame geometry has no z, which is required.") + + coordinates = dataframe.geometry.get_coordinates(include_z=True) + + groupby_names = list(dataframe.index.names) + ["x", "y"] + grouped = coordinates.reset_index().groupby(groupby_names) + + lower = grouped.min().reset_index(["x", "y"]) + upper = grouped.max().reset_index(["x", "y"]) + + return lower, upper + + +def hfb_zpolygons_to_zlinestrings(dataframe: GeoDataFrameType) -> GeoDataFrameType: + """ + Convert GeoDataFrame with zpolygons to zlinestrings. + + Paramaters + ---------- + dataframe: GeoDataFrame + GeoDataFrame with a Z Polygons as datatype. + + Returns + ------- + GeoDataFrame with upper and lower bound as linestrings. + The multi-index denotes whether linestring designates "upper" or "lower" + bound. + """ + lower, upper = _extract_hfb_bounds_from_zpolygons(dataframe) + + lower_gdf = _create_zlinestring_from_bound_df(lower) + upper_gdf = _create_zlinestring_from_bound_df(upper) + + bounds_gdf = pd.concat( + [lower_gdf, upper_gdf], + keys=[ + "lower", + "upper", + ], + ) + + return bounds_gdf + + +def _flip_linestrings(df: pd.DataFrame) -> pd.DataFrame: + """ + Flip linestrings, preserve linestring order. + """ + # Add extra index ascending to sort with, with reset. + # This new index denotes unique nodes + df_reset = df.reset_index() + # Set to multi-index to prepare sort + df_multi = df_reset.set_index(["index", "parts", df_reset.index]) + # Sort, only reverse newly added index. + df_sorted = df_multi.sort_index( + level=[0, 1, 2], ascending=[True, True, False], axis=0 + ) + # Drop index added for sorting + return df_sorted.reset_index(level=2, drop=True) + + +def clipped_hfb_zlinestrings_to_zpolygons( + bounds_gdf: GeoSeriesType, +) -> GeoDataFrameType: + """ + Convert clipped zlinestrings provided with bounds_gdf to zpolygons + + Parameters + ---------- + bounds_gdf: Dataframe + Dataframe with for each polygon an upper and lower shapely.LINESTRING, + indicated by index "upper" and "lower". + """ + # Empty Dataframe + if bounds_gdf.shape[0] == 0: + return bounds_gdf + + coordinates = bounds_gdf.get_coordinates(include_z=True) + # Sort index to ascending everywhere to be able to assign flip upper bound + # linestrings without errors. + coordinates = coordinates.sort_index( + level=[0, 1, 2], ascending=[True, True, True], axis=0 + ) + # Reverse upper bound to prevent bowtie polygon from being made. + coordinates.loc["upper"] = _flip_linestrings(coordinates.loc["upper"]).values + # Drop index with "upper" and "lower" in it. 
+ coordinates = coordinates.reset_index(level=0, drop=True) + + return _create_zpolygon_from_polygon_df(coordinates) diff --git a/imod/mf6/utilities/imod5_converter.py b/imod/mf6/utilities/imod5_converter.py new file mode 100644 index 000000000..e599c8522 --- /dev/null +++ b/imod/mf6/utilities/imod5_converter.py @@ -0,0 +1,50 @@ +from typing import Union + +import numpy as np +import xarray as xr + +from imod.typing.grid import full_like + + +def convert_ibound_to_idomain( + ibound: xr.DataArray, thickness: xr.DataArray +) -> xr.DataArray: + # Convert IBOUND to IDOMAIN + # -1 to 1, these will have to be filled with + # CHD cells. + idomain = np.abs(ibound) + + # Thickness <= 0 -> IDOMAIN = -1 + active_and_zero_thickness = (thickness <= 0) & (idomain == 1) + # Don't make cells at top or bottom vpt, these should be inactive. + # First, set all potential vpts to nan to be able to utilize ffill and bfill + idomain_float = idomain.where(~active_and_zero_thickness) # type: ignore[attr-defined] + passthrough = (idomain_float.ffill("layer") == 1) & ( + idomain_float.bfill("layer") == 1 + ) + # Then fill nans where passthrough with -1 + idomain_float = idomain_float.combine_first( + full_like(idomain_float, -1.0, dtype=float).where(passthrough) + ) + # Fill the remaining nans at tops and bottoms with 0 + return idomain_float.fillna(0).astype(int) + + +def convert_unit_rch_rate(rate: xr.DataArray) -> xr.DataArray: + """Convert recharge from iMOD5's mm/d to m/d""" + mm_to_m_conversion = 1e-3 + return rate * mm_to_m_conversion + + +def fill_missing_layers( + source: xr.DataArray, full: xr.DataArray, fillvalue: Union[float | int] +) -> xr.DataArray: + """ + This function takes a source grid in which the layer dimension is + incomplete. It creates a result-grid which has the same layers as the "full" + grid, which is assumed to have all layers. The result has the values in the + source for the layers that are in the source. For the other layers, the + fillvalue is assigned. + """ + layer = full.coords["layer"] + return source.reindex(layer=layer, fill_value=fillvalue) diff --git a/imod/mf6/utilities/mask.py b/imod/mf6/utilities/mask.py index 884f97648..e02b6442a 100644 --- a/imod/mf6/utilities/mask.py +++ b/imod/mf6/utilities/mask.py @@ -1,6 +1,7 @@ import numbers import numpy as np +import xarray as xr from fastcore.dispatch import typedispatch from xarray.core.utils import is_scalar @@ -129,3 +130,17 @@ def _adjust_mask_for_unlayered_data( array_mask = mask.isel(layer=0) return array_mask + + +def mask_arrays(arrays: dict[str, xr.DataArray]) -> dict[str, xr.DataArray]: + """ + This function takes a dictionary of xr.DataArrays. The arrays are assumed to have the same + coordinates. When a np.nan value is found in any array, the other arrays are also + set to np.nan at the same coordinates. 
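An illustrative doctest-style sketch of the behaviour described above:

>>> import numpy as np
>>> import xarray as xr
>>> from imod.mf6.utilities.mask import mask_arrays
>>> arrays = {
...     "conductance": xr.DataArray([1.0, np.nan, 3.0]),
...     "stage": xr.DataArray([0.5, 0.6, np.nan]),
... }
>>> masked = mask_arrays(arrays)
>>> # Both outputs are NaN at positions 1 and 2: a NaN in any input masks all arrays.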
+ """ + masks = [xr.DataArray(~np.isnan(array)) for array in arrays.values()] + # Get total mask across all arrays + total_mask = xr.concat(masks[:], dim="arrays").all("arrays") + # Mask arrays with total mask + arrays_masked = {key: array.where(total_mask) for key, array in arrays.items()} + return arrays_masked diff --git a/imod/mf6/utilities/mf6hfb.py b/imod/mf6/utilities/mf6hfb.py new file mode 100644 index 000000000..82a70f4d1 --- /dev/null +++ b/imod/mf6/utilities/mf6hfb.py @@ -0,0 +1,67 @@ +from typing import List + +import xarray as xr + +from imod.mf6.hfb import ( + HorizontalFlowBarrierBase, + _prepare_barrier_dataset_for_mf6_adapter, +) +from imod.mf6.mf6_hfb_adapter import Mf6HorizontalFlowBarrier +from imod.typing import GridDataArray + + +def inverse_sum(a: xr.Dataset) -> xr.Dataset: + """Sum of the inverse""" + return (1 / a).sum() + + +def merge_hfb_packages( + hfb_ls: List[HorizontalFlowBarrierBase], + idomain: GridDataArray, + top: GridDataArray, + bottom: GridDataArray, + k: GridDataArray, +) -> Mf6HorizontalFlowBarrier: + """ + Merges HorizontalFlowBarrier packages into single package as MODFLOW 6 + doesn't support multiple HFB packages. + + Parameters + ---------- + hfb_ls: list + List of HorizontalFlowBarrier packages. These will be merged into one. + Function takes settings like "print_input" from the first object in the + list. + idomain: GridDataArray + Grid with active cells. + top: GridDataArray + Grid with top of model layers. + bottom: GridDataArray + Grid with bottom of model layers. + k: GridDataArray + Grid with hydraulic conductivities. + """ + + barrier_ls = [ + hfb._to_connected_cells_dataset(idomain, top, bottom, k) for hfb in hfb_ls + ] + barrier_dataset = xr.concat(barrier_ls, dim="cell_id") + + # xarray GroupbyDataset doesn't allow reducing with different methods per variable. + # Therefore groupby twice: once for cell_id, once for hydraulic_characteristic. + cell_id_merged = ( + barrier_dataset[["cell_id1", "cell_id2"]].groupby("cell_id").first() + ) + hc_merged = 1 / barrier_dataset[["hydraulic_characteristic"]].groupby( + "cell_id" + ).map(inverse_sum) + # Force correct dim order + cell_id_merged = cell_id_merged.transpose("cell_dims1", "cell_dims2", "cell_id") + # Merge datasets into one + barrier_dataset_merged = xr.merge([cell_id_merged, hc_merged], join="exact") + # Set leftover options + barrier_dataset_merged["print_input"] = hfb_ls[0].dataset["print_input"] + barrier_dataset_merged = _prepare_barrier_dataset_for_mf6_adapter( + barrier_dataset_merged + ) + return Mf6HorizontalFlowBarrier(**barrier_dataset_merged.data_vars) diff --git a/imod/mf6/utilities/package.py b/imod/mf6/utilities/package.py index 2a9f2aada..6d0c84239 100644 --- a/imod/mf6/utilities/package.py +++ b/imod/mf6/utilities/package.py @@ -1,3 +1,5 @@ +from typing import Any + import numpy as np import xarray as xr @@ -25,3 +27,24 @@ def get_repeat_stress(times) -> xr.DataArray: data=np.column_stack((keys, values)), dims=("repeat", "repeat_items"), ) + + +def _is_valid(value: Any) -> bool: + """ + Filters values that are None, False, or a numpy.bool_ False. + Needs to be this specific, since 0.0 and 0 are valid values, but are + equal to a boolean False. + """ + # Test singletons + if value is False or value is None: + return False + # Test numpy bool (not singleton) + elif isinstance(value, np.bool_) and not value: + return False + # When dumping to netCDF and reading back, None will have been + # converted into a NaN. 
Only check NaN if it's a floating type to avoid + # TypeErrors. + elif np.issubdtype(type(value), np.floating) and np.isnan(value): + return False + else: + return True diff --git a/imod/mf6/utilities/regrid.py b/imod/mf6/utilities/regrid.py index 86dae8622..356e10760 100644 --- a/imod/mf6/utilities/regrid.py +++ b/imod/mf6/utilities/regrid.py @@ -22,9 +22,15 @@ from imod.mf6.regrid.regrid_schemes import EmptyRegridMethod, RegridMethodType from imod.mf6.statusinfo import NestedStatusInfo from imod.mf6.utilities.clip import clip_by_grid +from imod.mf6.utilities.package import _is_valid from imod.mf6.utilities.regridding_types import RegridderType from imod.schemata import ValidationError -from imod.typing.grid import GridDataArray, get_grid_geometry_hash, ones_like +from imod.typing.grid import ( + GridDataArray, + GridDataset, + get_grid_geometry_hash, + ones_like, +) HashRegridderMapping = Tuple[int, int, BaseRegridder] @@ -135,60 +141,103 @@ def assign_coord_if_present( def _regrid_array( - package: IRegridPackage, - varname: str, + da: GridDataArray, regridder_collection: RegridderWeightsCache, - regridder_name: str, - regridder_function: str, + regridder_name: Union[RegridderType, BaseRegridder], + regridder_function: Optional[str], target_grid: GridDataArray, ) -> Optional[GridDataArray]: """ - Regrids a data_array. The array is specified by its key in the dataset. - Each data-array can represent: - -a scalar value, valid for the whole grid - -an array of a different scalar per layer - -an array with a value per grid block - -None + Regrids a GridDataArray. Each DataArray can represent: + - a scalar value, valid for the whole grid + - an array of a different scalar per layer + - an array with a value per grid block + - None """ # skip regridding for arrays with no valid values (such as "None") - if not package._valid(package.dataset[varname].values[()]): + if not _is_valid(da.values[()]): return None # the dataarray might be a scalar. If it is, then it does not need regridding. 
- if is_scalar(package.dataset[varname]): - return package.dataset[varname].values[()] + if is_scalar(da): + return da.values[()] # type: ignore [attr-defined] - if isinstance(package.dataset[varname], xr.DataArray): - coords = package.dataset[varname].coords + if isinstance(da, xr.DataArray): + coords = da.coords # if it is an xr.DataArray it may be layer-based; then no regridding is needed if not ("x" in coords and "y" in coords): - return package.dataset[varname] + return da # if it is an xr.DataArray it needs the dx, dy coordinates for regridding, which are otherwise not mandatory if not ("dx" in coords and "dy" in coords): raise ValueError( - f"DataArray {varname} does not have both a dx and dy coordinates" + f"GridDataArray {da.name} does not have both a dx and dy coordinates" ) # obtain an instance of a regridder for the chosen method regridder = regridder_collection.get_regridder( - package.dataset[varname], + da, target_grid, regridder_name, regridder_function, ) # store original dtype of data - original_dtype = package.dataset[varname].dtype + original_dtype = da.dtype # regrid data array - regridded_array = regridder.regrid(package.dataset[varname]) + regridded_array = regridder.regrid(da) # reconvert the result to the same dtype as the original return regridded_array.astype(original_dtype) +def _regrid_package_data( + package_data: dict[str, GridDataArray] | GridDataset, + target_grid: GridDataArray, + regridder_settings: RegridMethodType, + regrid_cache: RegridderWeightsCache, + new_package_data: dict[str, GridDataArray] = {}, +) -> dict[str, GridDataArray]: + """ + Regrid package data. Loops over regridder settings to regrid variables one + by one. Variables not existent in the package data are skipped. Regridded + package data is added to a dictionary, which can optionally be provided as + argument to extend. + """ + settings_dict = RegridMethodType.asdict(regridder_settings) + for ( + varname, + regridder_type_and_function, + ) in settings_dict.items(): + regridder_function: Optional[str] = None + regridder_name = regridder_type_and_function[0] + if len(regridder_type_and_function) > 1: + regridder_function = regridder_type_and_function[1] + + # skip variables that are not in this dataset + if varname not in package_data.keys(): + continue + + # regrid the variable + new_package_data[varname] = _regrid_array( + package_data[varname], + regrid_cache, + regridder_name, + regridder_function, + target_grid, + ) + # set dx and dy if present in target_grid + new_package_data[varname] = assign_coord_if_present( + "dx", target_grid, new_package_data[varname] + ) + new_package_data[varname] = assign_coord_if_present( + "dy", target_grid, new_package_data[varname] + ) + return new_package_data + + def _get_unique_regridder_types(model: IModel) -> defaultdict[RegridderType, list[str]]: """ This function loops over the packages and collects all regridder-types that are in use. 
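An illustrative sketch of the new ``_regrid_package_data`` helper shown above; ``rate`` and ``target_grid`` are placeholder DataArrays:

    from imod.mf6.regrid.regrid_schemes import RechargeRegridMethod
    from imod.mf6.utilities.regrid import RegridderWeightsCache, _regrid_package_data

    regrid_cache = RegridderWeightsCache()
    regridded = _regrid_package_data(
        {"rate": rate}, target_grid, RechargeRegridMethod(), regrid_cache, {}
    )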
@@ -217,7 +266,7 @@ def _get_unique_regridder_types(model: IModel) -> defaultdict[RegridderType, lis def _regrid_like( package: IRegridPackage, target_grid: GridDataArray, - regrid_context: RegridderWeightsCache, + regrid_cache: RegridderWeightsCache, regridder_types: Optional[RegridMethodType] = None, ) -> IPackage: """ @@ -244,7 +293,7 @@ def _regrid_like( package to regrid target_grid: xr.DataArray or xu.UgridDataArray a grid defined over the same discretization as the one we want to regrid the package to - regrid_context: RegridderWeightsCache + regrid_cache: RegridderWeightsCache stores regridder weights for different regridders. Can be used to speed up regridding, if the same regridders are used several times for regridding different arrays. regridder_types: RegridMethodType, optional @@ -265,41 +314,17 @@ def _regrid_like( remove_expanded_auxiliary_variables_from_dataset(package) if regridder_types is None: - regridder_settings = asdict(package.get_regrid_methods(), dict_factory=dict) - else: - regridder_settings = asdict(regridder_types, dict_factory=dict) - - new_package_data = package.get_non_grid_data(list(regridder_settings.keys())) - - for ( - varname, - regridder_type_and_function, - ) in regridder_settings.items(): - regridder_function = None - regridder_name = regridder_type_and_function[0] - if len(regridder_type_and_function) > 1: - regridder_function = regridder_type_and_function[1] + regridder_types = package._regrid_method - # skip variables that are not in this dataset - if varname not in package.dataset.keys(): - continue + new_package_data = package.get_non_grid_data(regridder_types.asdict().keys()) + new_package_data = _regrid_package_data( + package.dataset, + target_grid, + regridder_types, + regrid_cache, + new_package_data=new_package_data, + ) - # regrid the variable - new_package_data[varname] = _regrid_array( - package, - varname, - regrid_context, - regridder_name, - regridder_function, - target_grid, - ) - # set dx and dy if present in target_grid - new_package_data[varname] = assign_coord_if_present( - "dx", target_grid, new_package_data[varname] - ) - new_package_data[varname] = assign_coord_if_present( - "dy", target_grid, new_package_data[varname] - ) if hasattr(package, "auxiliary_data_fields"): expand_transient_auxiliary_variables(package) @@ -311,7 +336,7 @@ def _regrid_like( model: IModel, target_grid: GridDataArray, validate: bool = True, - regrid_context: Optional[RegridderWeightsCache] = None, + regrid_cache: Optional[RegridderWeightsCache] = None, ) -> IModel: """ Creates a model by regridding the packages of this model to another discretization. @@ -325,7 +350,7 @@ def _regrid_like( a grid defined over the same discretization as the one we want to regrid the package to validate: bool set to true to validate the regridded packages - regrid_context: Optional RegridderWeightsCache + regrid_cache: Optional RegridderWeightsCache stores regridder weights for different regridders. Can be used to speed up regridding, if the same regridders are used several times for regridding different arrays. 
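An illustrative sketch of the renamed keyword (previously ``regrid_context``); ``npf`` and ``target_grid`` are placeholders:

    from imod.mf6.utilities.regrid import RegridderWeightsCache

    regrid_cache = RegridderWeightsCache()
    regridded_npf = npf.regrid_like(target_grid, regrid_cache=regrid_cache)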
@@ -340,18 +365,18 @@ def _regrid_like( f"regridding this model cannot be done due to the presence of package {error_with_object_name}" ) new_model = model.__class__() - if regrid_context is None: - regrid_context = RegridderWeightsCache() + if regrid_cache is None: + regrid_cache = RegridderWeightsCache() for pkg_name, pkg in model.items(): if isinstance(pkg, (IRegridPackage, ILineDataPackage, IPointDataPackage)): - new_model[pkg_name] = pkg.regrid_like(target_grid, regrid_context) + new_model[pkg_name] = pkg.regrid_like(target_grid, regrid_cache) else: raise NotImplementedError( f"regridding is not implemented for package {pkg_name} of type {type(pkg)}" ) methods = _get_unique_regridder_types(model) - output_domain = _get_regridding_domain(model, target_grid, regrid_context, methods) + output_domain = _get_regridding_domain(model, target_grid, regrid_cache, methods) new_model.mask_all_packages(output_domain) new_model.purge_empty_packages() if validate: @@ -397,7 +422,7 @@ def _regrid_like( raise ValueError( "Unable to regrid simulation. Regridding can only be done on simulations that have a single flow model." ) - regrid_context = RegridderWeightsCache() + regrid_cache = RegridderWeightsCache() models = simulation.get_models() for model_name, model in models.items(): @@ -410,7 +435,7 @@ def _regrid_like( result = simulation.__class__(regridded_simulation_name) for key, item in simulation.items(): if isinstance(item, IModel): - result[key] = item.regrid_like(target_grid, validate, regrid_context) + result[key] = item.regrid_like(target_grid, validate, regrid_cache) elif key == "gwtgwf_exchanges": pass elif isinstance(item, IPackage) and not isinstance(item, IRegridPackage): @@ -455,7 +480,7 @@ def _regrid_like(package: object, target_grid: GridDataArray, *_) -> None: def _get_regridding_domain( model: IModel, target_grid: GridDataArray, - regrid_context: RegridderWeightsCache, + regrid_cache: RegridderWeightsCache, methods: defaultdict[RegridderType, list[str]], ) -> GridDataArray: """ @@ -466,7 +491,7 @@ def _get_regridding_domain( idomain = model.domain included_in_all = ones_like(target_grid) regridders = [ - regrid_context.get_regridder(idomain, target_grid, regriddertype, function) + regrid_cache.get_regridder(idomain, target_grid, regriddertype, function) for regriddertype, functionlist in methods.items() for function in functionlist ] diff --git a/imod/mf6/validation_context.py b/imod/mf6/validation_context.py new file mode 100644 index 000000000..8b364ca53 --- /dev/null +++ b/imod/mf6/validation_context.py @@ -0,0 +1,7 @@ +from dataclasses import dataclass + + +@dataclass +class ValidationContext: + validate: bool = True + strict_well_validation: bool = True diff --git a/imod/mf6/wel.py b/imod/mf6/wel.py index 080046c99..6d567cd84 100644 --- a/imod/mf6/wel.py +++ b/imod/mf6/wel.py @@ -1,6 +1,10 @@ from __future__ import annotations +import abc +import itertools +import textwrap import warnings +from datetime import datetime from typing import Any, Optional, Tuple, Union import cftime @@ -11,7 +15,11 @@ import xugrid as xu import imod -from imod.logging import init_log_decorator +import imod.mf6.utilities +from imod.logging import init_log_decorator, logger +from imod.logging.logging_decorators import standard_log_decorator +from imod.logging.loglevel import LogLevel +from imod.mf6 import StructuredDiscretization, VerticesDiscretization from imod.mf6.boundary_condition import ( BoundaryCondition, DisStructuredBoundaryCondition, @@ -21,11 +29,15 @@ from imod.mf6.mf6_wel_adapter 
import Mf6Wel from imod.mf6.package import Package from imod.mf6.utilities.dataset import remove_inactive +from imod.mf6.utilities.grid import broadcast_to_full_domain from imod.mf6.validation import validation_pkg_error_message +from imod.mf6.validation_context import ValidationContext from imod.mf6.write_context import WriteContext from imod.prepare import assign_wells +from imod.prepare.cleanup import cleanup_wel from imod.prepare.layer import create_layered_top from imod.schemata import ( + AllValueSchema, AnyNoDataSchema, DTypeSchema, EmptyIndexesSchema, @@ -34,6 +46,7 @@ from imod.select.points import points_indices, points_values from imod.typing import GridDataArray from imod.typing.grid import is_spatial_grid, ones_like +from imod.util.expand_repetitions import resample_timeseries from imod.util.structured import values_within_range @@ -54,108 +67,163 @@ def _assign_dims(arg: Any) -> Tuple | xr.DataArray: return "index", arg -def mask_2D(package: Well, domain_2d: GridDataArray) -> Well: +def mask_2D(package: GridAgnosticWell, domain_2d: GridDataArray) -> GridAgnosticWell: point_active = points_values(domain_2d, x=package.x, y=package.y) is_inside_exterior = point_active == 1 selection = package.dataset.loc[{"index": is_inside_exterior}] cls = type(package) - new = cls.__new__(cls) - new.dataset = selection - return new + return cls._from_dataset(selection) -class Well(BoundaryCondition, IPointDataPackage): +def _df_groups_to_da_rates( + unique_well_groups: pd.api.typing.DataFrameGroupBy, +) -> xr.DataArray: + # Convert dataframes all groups to DataArrays + is_steady_state = "time" not in unique_well_groups[0].columns + if is_steady_state: + da_groups = [ + xr.DataArray(df_group["rate"].iloc[0]) for df_group in unique_well_groups + ] + else: + da_groups = [ + xr.DataArray( + df_group["rate"], dims=("time"), coords={"time": df_group["time"]} + ) + for df_group in unique_well_groups + ] + # Assign index coordinates + da_groups = [ + da_group.expand_dims(dim="index").assign_coords(index=[i]) + for i, da_group in enumerate(da_groups) + ] + # Concatenate datarrays along index dimension + return xr.concat(da_groups, dim="index") + + +def _prepare_well_rates_from_groups( + pkg_data: dict, + unique_well_groups: pd.api.typing.DataFrameGroupBy, + times: list[datetime], +) -> xr.DataArray: """ - Agnostic WEL package, which accepts x, y and a top and bottom of the well screens. - - This package can be written to any provided model grid. - Any number of WEL Packages can be specified for a single groundwater flow model. - https://water.usgs.gov/water-resources/software/MODFLOW-6/mf6io_6.0.4.pdf#page=63 - - Parameters - ---------- - - y: float or list of floats - is the y location of the well. - x: float or list of floats - is the x location of the well. - screen_top: float or list of floats - is the top of the well screen. - screen_bottom: float or list of floats - is the bottom of the well screen. - rate: float, list of floats or xr.DataArray - is the volumetric well rate. A positive value indicates well - (injection) and a negative value indicates discharge (extraction) (q). - If provided as DataArray, an ``"index"`` dimension is required and an - optional ``"time"`` dimension and coordinate specify transient input. 
- In the latter case, it is important that dimensions are in the order: - ``("time", "index")`` - concentration: array of floats (xr.DataArray, optional) - if this flow package is used in simulations also involving transport, then this array is used - as the concentration for inflow over this boundary. - concentration_boundary_type: ({"AUX", "AUXMIXED"}, optional) - if this flow package is used in simulations also involving transport, then this keyword specifies - how outflow over this boundary is computed. - id: list of Any, optional - assign an identifier code to each well. if not provided, one will be generated - Must be convertible to string, and unique entries. - minimum_k: float, optional - on creating point wells, no point wells will be placed in cells with a lower horizontal conductivity than this - minimum_thickness: float, optional - on creating point wells, no point wells will be placed in cells with a lower thickness than this - print_input: ({True, False}, optional) - keyword to indicate that the list of well information will be written to - the listing file immediately after it is read. - Default is False. - print_flows: ({True, False}, optional) - Indicates that the list of well flow rates will be printed to the - listing file for every stress period time step in which "BUDGET PRINT" - is specified in Output Control. If there is no Output Control option - and PRINT FLOWS is specified, then flow rates are printed for the last - time step of each stress period. - Default is False. - save_flows: ({True, False}, optional) - Indicates that well flow terms will be written to the file specified - with "BUDGET FILEOUT" in Output Control. - Default is False. - observations: [Not yet supported.] - Default is None. - validate: {True, False} - Flag to indicate whether the package should be validated upon - initialization. This raises a ValidationError if package input is - provided in the wrong manner. Defaults to True. - repeat_stress: Optional[xr.DataArray] of datetimes - Used to repeat data for e.g. repeating stress periods such as - seasonality without duplicating the values. The DataArray should have - dimensions ``("repeat", "repeat_items")``. The ``repeat_items`` - dimension should have size 2: the first value is the "key", the second - value is the "value". For the "key" datetime, the data of the "value" - datetime will be used. Can also be set with a dictionary using the - ``set_repeat_stress`` method. - - Examples - --------- - - >>> screen_top = [0.0, 0.0] - >>> screen_bottom = [-2.0, -2.0] - >>> y = [83.0, 77.0] - >>> x = [81.0, 82.0] - >>> rate = [1.0, 1.0] - - >>> imod.mf6.Well(x, y, screen_top, screen_bottom, rate) + Prepare well rates from dataframe groups, grouped by unique well locations. + Resample timeseries if ipf with associated text files. + """ + has_associated = pkg_data["has_associated"] + start_times = times[:-1] # Starts stress periods. + if has_associated: + # Resample times per group + unique_well_groups = [ + resample_timeseries(df_group, start_times) + for df_group in unique_well_groups + ] + return _df_groups_to_da_rates(unique_well_groups) + + +def _prepare_df_ipf_associated( + pkg_data: dict, start_times: list[datetime], all_well_times: list[datetime] +) -> pd.DataFrame: + """Prepare dataframe for an ipf with associated timeseries in a textfile.""" + # Validate if associated wells are assigned multiple layers, factors, + # and additions. 
+ for entry in ["layer", "factor", "addition"]: + uniques = set(pkg_data[entry]) + if len(uniques) > 1: + raise ValueError( + f"IPF with associated textfiles assigned multiple {entry}s: {uniques}" + ) + # Validate if associated wells are defined only on first timestep or all + # timesteps + is_defined_all = len(set(all_well_times) - set(pkg_data["time"])) == 0 + is_defined_first = (len(pkg_data["time"]) == 1) & ( + pkg_data["time"][0] == all_well_times[0] + ) + if not is_defined_all and not is_defined_first: + raise ValueError( + "IPF with associated textfiles assigned to wrong times. " + "Should be assigned to all times or only first time. " + f"PRJ times: {all_well_times}, package times: {pkg_data['time']}" + ) + df = pkg_data["dataframe"][0] + df["layer"] = pkg_data["layer"][0] + return df + + +def _prepare_df_ipf_unassociated( + pkg_data: dict, start_times: list[datetime] +) -> pd.DataFrame: + """Prepare dataframe for an ipf with no associated timeseries.""" + is_steady_state = any(t is None for t in pkg_data["time"]) + if is_steady_state: + index_dicts = [{"layer": lay} for lay in pkg_data["layer"]] + else: + index_dicts = [ + {"time": t, "layer": lay} + for t, lay in zip(pkg_data["time"], pkg_data["layer"]) + ] + # Concatenate dataframes, assign layer and times + iter_dfs_dims = zip(pkg_data["dataframe"], index_dicts) + df = pd.concat([df.assign(**index_dict) for df, index_dict in iter_dfs_dims]) + # Prepare multi-index dataframe to convert to a multi-dimensional DataArray + # later. + dimnames = list(index_dicts[0].keys()) + df_multi = df.set_index(dimnames + [df.index]) + df_multi.index = df_multi.index.set_names(dimnames + ["ipf_row"]) + # Temporarily convert to DataArray with 2 dimensions, as it allows for + # multi-dimensional ffilling, instead pandas' ffilling the last value in a + # column of the flattened table. + ipf_row_index = pkg_data["dataframe"][0].index + # Forward fill location columns, only reindex layer, filt_top and filt_bot + # if present. + cols_ffill_if_present = {"x", "y", "filt_top", "filt_bot"} + cols_ffill = cols_ffill_if_present & set(df.columns) + da_multi = df_multi.to_xarray() + indexers = {"ipf_row": ipf_row_index} + if not is_steady_state: + indexers["time"] = start_times + # Multi-dimensional reindex, forward fill well locations, fill well rates + # with 0.0. + df_ffilled = da_multi[cols_ffill].reindex(indexers, method="ffill").to_dataframe() + df_fill_zero = da_multi["rate"].reindex(indexers, fill_value=0.0).to_dataframe() + # Combine columns and reset dataframe back into a simple long table with + # single index. + df_out = pd.concat([df_ffilled, df_fill_zero], axis="columns") + return df_out.reset_index().drop(columns="ipf_row") + + +def _unpack_package_data( + pkg_data: dict, times: list[datetime], all_well_times: list[datetime] +) -> pd.DataFrame: + """Unpack package data to dataframe""" + start_times = times[:-1] # Starts stress periods. 
+ has_associated = pkg_data["has_associated"] + if has_associated: + return _prepare_df_ipf_associated(pkg_data, start_times, all_well_times) + else: + return _prepare_df_ipf_unassociated(pkg_data, start_times) - For a transient well: - >>> weltimes = pd.date_range("2000-01-01", "2000-01-03") +def get_all_imod5_prj_well_times(imod5_data: dict) -> list[datetime]: + """Get all times a well data is defined on in a prj file""" + wel_keys = [key for key in imod5_data.keys() if key.startswith("wel")] + wel_times_per_pkg = [imod5_data[wel_key]["time"] for wel_key in wel_keys] + # Flatten list + wel_times_flat = itertools.chain.from_iterable(wel_times_per_pkg) + # Get unique times by converting to set and sorting. ``sorted`` also + # transforms set to a list again. + return sorted(set(wel_times_flat)) - >>> rate_factor_time = xr.DataArray([0.5, 1.0], coords={"time": weltimes}, dims=("time",)) - >>> rate_transient = rate_factor_time * xr.DataArray(rate, dims=("index",)) - >>> imod.mf6.Well(x, y, screen_top, screen_bottom, rate_transient) +class GridAgnosticWell(BoundaryCondition, IPointDataPackage, abc.ABC): + """ + Abstract base class for grid agnostic wells """ + _imod5_depth_colnames: list[str] = [] + _depth_colnames: list[tuple[str, type]] = [] + @property def x(self) -> npt.NDArray[np.float64]: return self.dataset["x"].values @@ -164,310 +232,83 @@ def x(self) -> npt.NDArray[np.float64]: def y(self) -> npt.NDArray[np.float64]: return self.dataset["y"].values - _pkg_id = "wel" - - _auxiliary_data = {"concentration": "species"} - _init_schemata = { - "screen_top": [DTypeSchema(np.floating)], - "screen_bottom": [DTypeSchema(np.floating)], - "y": [DTypeSchema(np.floating)], - "x": [DTypeSchema(np.floating)], - "rate": [DTypeSchema(np.floating)], - "concentration": [DTypeSchema(np.floating)], - } - _write_schemata = { - "screen_top": [AnyNoDataSchema(), EmptyIndexesSchema()], - "screen_bottom": [AnyNoDataSchema(), EmptyIndexesSchema()], - "y": [AnyNoDataSchema(), EmptyIndexesSchema()], - "x": [AnyNoDataSchema(), EmptyIndexesSchema()], - "rate": [AnyNoDataSchema(), EmptyIndexesSchema()], - "concentration": [AnyNoDataSchema(), EmptyIndexesSchema()], - } - - @init_log_decorator() - def __init__( - self, - x: list[float], - y: list[float], - screen_top: list[float], - screen_bottom: list[float], - rate: list[float] | xr.DataArray, - concentration: Optional[list[float] | xr.DataArray] = None, - concentration_boundary_type="aux", - id: Optional[list[Any]] = None, - minimum_k: float = 0.1, - minimum_thickness: float = 1.0, - print_input: bool = False, - print_flows: bool = False, - save_flows: bool = False, - observations=None, - validate: bool = True, - repeat_stress: Optional[xr.DataArray] = None, - ): - if id is None: - id = [str(i) for i in range(len(x))] - else: - set_id = set(id) - if len(id) != len(set_id): - raise ValueError("id's must be unique") - id = [str(i) for i in id] - dict_dataset = { - "screen_top": _assign_dims(screen_top), - "screen_bottom": _assign_dims(screen_bottom), - "y": _assign_dims(y), - "x": _assign_dims(x), - "rate": _assign_dims(rate), - "id": _assign_dims(id), - "minimum_k": minimum_k, - "minimum_thickness": minimum_thickness, - "print_input": print_input, - "print_flows": print_flows, - "save_flows": save_flows, - "observations": observations, - "repeat_stress": repeat_stress, - "concentration": concentration, - "concentration_boundary_type": concentration_boundary_type, - } - super().__init__(dict_dataset) - # Set index as coordinate - index_coord = 
np.arange(self.dataset.dims["index"]) - self.dataset = self.dataset.assign_coords(index=index_coord) - self._validate_init_schemata(validate) - @classmethod def is_grid_agnostic_package(cls) -> bool: return True - def clip_box( - self, - time_min: Optional[cftime.datetime | np.datetime64 | str] = None, - time_max: Optional[cftime.datetime | np.datetime64 | str] = None, - layer_min: Optional[int] = None, - layer_max: Optional[int] = None, - x_min: Optional[float] = None, - x_max: Optional[float] = None, - y_min: Optional[float] = None, - y_max: Optional[float] = None, - top: Optional[GridDataArray] = None, - bottom: Optional[GridDataArray] = None, - ) -> Package: + def _create_cellid( + self, assigned_wells: pd.DataFrame, active: xr.DataArray + ) -> GridDataArray: + like = ones_like(active) + + # Groupby index and select first, to unset any duplicate records + # introduced by the multi-indexed "time" dimension. + df_for_cellid = assigned_wells.groupby("index").first() + d_for_cellid = df_for_cellid[["x", "y", "layer"]].to_dict("list") + + return self._derive_cellid_from_points(like, **d_for_cellid) + + def _create_dataset_vars( + self, assigned_wells: pd.DataFrame, cellid: xr.DataArray + ) -> xr.Dataset: """ - Clip a package by a bounding box (time, layer, y, x). + Create dataset with all variables (rate, concentration), with a similar shape as the cellids. + """ + data_vars = ["id", "rate"] + if "concentration" in assigned_wells.columns: + data_vars.append("concentration") - The well package doesn't use the layer attribute to describe its depth and length. - Instead, it uses the screen_top and screen_bottom parameters which corresponds with - the z-coordinates of the top and bottom of the well. To go from a layer_min and - layer_max to z-values used for clipping the well a top and bottom array have to be - provided as well. + ds_vars = assigned_wells[data_vars].to_xarray() + # "rate" variable in conversion from multi-indexed DataFrame to xarray + # DataArray results in duplicated values for "rate" along dimension + # "species". Select first species to reduce this again. + index_names = assigned_wells.index.names + if "species" in index_names: + ds_vars["rate"] = ds_vars["rate"].isel(species=0) - Slicing intervals may be half-bounded, by providing None: + # Carefully rename the dimension and set coordinates + d_rename = {"index": "ncellid"} + ds_vars = ds_vars.rename_dims(**d_rename).rename_vars(**d_rename) + ds_vars = ds_vars.assign_coords(**{"ncellid": cellid.coords["ncellid"].values}) - * To select 500.0 <= x <= 1000.0: - ``clip_box(x_min=500.0, x_max=1000.0)``. - * To select x <= 1000.0: ``clip_box(x_min=None, x_max=1000.0)`` - or ``clip_box(x_max=1000.0)``. - * To select x >= 500.0: ``clip_box(x_min = 500.0, x_max=None.0)`` - or ``clip_box(x_min=1000.0)``. + return ds_vars + + @staticmethod + def _derive_cellid_from_points( + dst_grid: GridDataArray, + x: list, + y: list, + layer: list, + ) -> GridDataArray: + """ + Create DataArray with Modflow6 cell identifiers based on x, y coordinates + in a dataframe. For structured grid this DataArray contains 3 columns: + ``layer, row, column``. For unstructured grids, this contains 2 columns: + ``layer, cell2d``. + See also: https://water.usgs.gov/water-resources/software/MODFLOW-6/mf6io_6.4.0.pdf#page=35 + + Note + ---- + The "layer" coordinate should already be provided in the dataframe. + To determine the layer coordinate based on screen depts, look at + :func:`imod.prepare.wells.assign_wells`. 
Parameters ---------- - time_min: optional - time_max: optional - layer_min: optional, int - layer_max: optional, int - x_min: optional, float - x_max: optional, float - y_min: optional, float - y_max: optional, float - top: optional, GridDataArray - bottom: optional, GridDataArray - state_for_boundary: optional, GridDataArray + dst_grid: {xr.DataArray, xu.UgridDataArray} + Destination grid to map the points to based on their x and y coordinates. + x: {list, np.array} + array-like with x-coordinates + y: {list, np.array} + array-like with y-coordinates + layer: {list, np.array} + array-like with layer-coordinates Returns ------- - sliced : Package - """ - if (layer_max or layer_min) and (top is None or bottom is None): - raise ValueError( - "When clipping by layer both the top and bottom should be defined" - ) - - if top is not None: - # Bug in mypy when using unions in isInstance - if not isinstance(top, GridDataArray) or "layer" not in top.coords: # type: ignore - top = create_layered_top(bottom, top) - - # The super method will select in the time dimension without issues. - new = super().clip_box(time_min=time_min, time_max=time_max) - - ds = new.dataset - - z_max = self._find_well_value_at_layer(ds, top, layer_max) - z_min = self._find_well_value_at_layer(ds, bottom, layer_min) - - if z_max is not None: - ds["screen_top"] = ds["screen_top"].clip(None, z_max) - if z_min is not None: - ds["screen_bottom"] = ds["screen_bottom"].clip(z_min, None) - - # Initiate array of True with right shape to deal with case no spatial - # selection needs to be done. - in_bounds = np.full(ds.dims["index"], True) - # Select all variables along "index" dimension - in_bounds &= values_within_range(ds["x"], x_min, x_max) - in_bounds &= values_within_range(ds["y"], y_min, y_max) - in_bounds &= values_within_range(ds["screen_top"], z_min, z_max) - in_bounds &= values_within_range(ds["screen_bottom"], z_min, z_max) - # remove wells where the screen bottom and top are the same - in_bounds &= abs(ds["screen_bottom"] - ds["screen_top"]) > 1e-5 - # Replace dataset with reduced dataset based on booleans - new.dataset = ds.loc[{"index": in_bounds}] - - return new - - @staticmethod - def _find_well_value_at_layer( - well_dataset: xr.Dataset, grid: GridDataArray, layer: Optional[int] - ): - value = None if layer is None else grid.isel(layer=layer) - - # if value is a grid select the values at the well locations and drop the dimensions - if (value is not None) and is_spatial_grid(value): - value = imod.select.points_values( - value, - x=well_dataset["x"].values, - y=well_dataset["y"].values, - out_of_bounds="ignore", - ).drop_vars(lambda x: x.coords) - - return value - - def write( - self, - pkgname: str, - globaltimes: Union[list[np.datetime64], np.ndarray], - write_context: WriteContext, - ): - raise NotImplementedError( - "To write a wel package first convert it to a MF6 well using to_mf6_pkg." 
- ) - - def __create_wells_df(self) -> pd.DataFrame: - wells_df = self.dataset.to_dataframe() - wells_df = wells_df.rename( - columns={ - "screen_top": "top", - "screen_bottom": "bottom", - } - ) - - return wells_df - - def __create_assigned_wells( - self, - wells_df: pd.DataFrame, - active: GridDataArray, - top: GridDataArray, - bottom: GridDataArray, - k: GridDataArray, - minimum_k: float, - minimum_thickness: float, - ): - # Ensure top, bottom & k - # are broadcasted to 3d grid - like = ones_like(active) - bottom = like * bottom - top_2d = (like * top).sel(layer=1) - top_3d = bottom.shift(layer=1).fillna(top_2d) - - k = like * k - - index_names = wells_df.index.names - - # Unset multi-index, because assign_wells cannot deal with - # multi-indices which is returned by self.dataset.to_dataframe() in - # case of a "time" and "species" coordinate. - wells_df = wells_df.reset_index() - - wells_assigned = assign_wells( - wells_df, top_3d, bottom, k, minimum_thickness, minimum_k, True - ) - # Set multi-index again - wells_assigned = wells_assigned.set_index(index_names).sort_index() - - return wells_assigned - - def __create_dataset_vars( - self, wells_assigned: pd.DataFrame, wells_df: pd.DataFrame, cellid: xr.DataArray - ) -> xr.Dataset: - """ - Create dataset with all variables (rate, concentration), with a similar shape as the cellids. - """ - data_vars = ["rate"] - if "concentration" in wells_assigned.columns: - data_vars.append("concentration") - - ds_vars = wells_assigned[data_vars].to_xarray() - # "rate" variable in conversion from multi-indexed DataFrame to xarray - # DataArray results in duplicated values for "rate" along dimension - # "species". Select first species to reduce this again. - index_names = wells_df.index.names - if "species" in index_names: - ds_vars["rate"] = ds_vars["rate"].isel(species=0) - - # Carefully rename the dimension and set coordinates - d_rename = {"index": "ncellid"} - ds_vars = ds_vars.rename_dims(**d_rename).rename_vars(**d_rename) - ds_vars = ds_vars.assign_coords(**{"ncellid": cellid.coords["ncellid"].values}) - - return ds_vars - - def __create_cellid(self, wells_assigned: pd.DataFrame, active: xr.DataArray): - like = ones_like(active) - - # Groupby index and select first, to unset any duplicate records - # introduced by the multi-indexed "time" dimension. - df_for_cellid = wells_assigned.groupby("index").first() - d_for_cellid = df_for_cellid[["x", "y", "layer"]].to_dict("list") - - return self.__derive_cellid_from_points(like, **d_for_cellid) - - @staticmethod - def __derive_cellid_from_points( - dst_grid: GridDataArray, - x: list, - y: list, - layer: list, - ) -> GridDataArray: - """ - Create DataArray with Modflow6 cell identifiers based on x, y coordinates - in a dataframe. For structured grid this DataArray contains 3 columns: - ``layer, row, column``. For unstructured grids, this contains 2 columns: - ``layer, cell2d``. - See also: https://water.usgs.gov/water-resources/software/MODFLOW-6/mf6io_6.4.0.pdf#page=35 - - Note - ---- - The "layer" coordinate should already be provided in the dataframe. - To determine the layer coordinate based on screen depts, look at - :func:`imod.prepare.wells.assign_wells`. - - Parameters - ---------- - dst_grid: {xr.DataArray, xu.UgridDataArray} - Destination grid to map the points to based on their x and y coordinates. 
- x: {list, np.array} - array-like with x-coordinates - y: {list, np.array} - array-like with y-coordinates - layer: {list, np.array} - array-like with layer-coordinates - - Returns - ------- - cellid : xr.DataArray - 2D DataArray with a ``ncellid`` rows and 3 to 2 columns, depending - on whether on a structured or unstructured grid.""" + cellid : xr.DataArray + 2D DataArray with a ``ncellid`` rows and 3 to 2 columns, depending + on whether on a structured or unstructured grid.""" # Find indices belonging to x, y coordinates indices_cell2d = points_indices(dst_grid, out_of_bounds="ignore", x=x, y=y) @@ -502,14 +343,44 @@ def __derive_cellid_from_points( "x": ("ncellid", x), "y": ("ncellid", y), } - cellid = cellid.assign_coords(**coords) + cellid = cellid.assign_coords(coords=coords) return cellid def render(self, directory, pkgname, globaltimes, binary): raise NotImplementedError( - f"{self.__class__.__name__} is a grid-agnostic package and does not have a render method. To render the package, first convert to a Modflow6 package by calling pkg.to_mf6_pkg()" + textwrap.dedent( + f"""{self.__class__.__name__} is a grid-agnostic package and does not + have a render method. To render the package, first convert to a + Modflow6 package by calling pkg.to_mf6_pkg()""" + ) + ) + + def write( + self, + pkgname: str, + globaltimes: Union[list[np.datetime64], np.ndarray], + write_context: WriteContext, + ): + raise NotImplementedError( + "To write a wel package first convert it to a MF6 well using to_mf6_pkg." + ) + + def mask(self, domain: GridDataArray) -> GridAgnosticWell: + """ + Mask wells based on two-dimensional domain. For three-dimensional + masking: Wells falling in inactive cells are automatically removed in + the call to write to Modflow 6 package. You can verify this by calling + the ``to_mf6_pkg`` method. + """ + + # Drop layer coordinate if present, otherwise a layer coordinate is assigned + # which causes conflicts downstream when assigning wells and deriving + # cellids. + domain_2d = domain.isel(layer=0, drop=True, missing_dims="ignore").drop_vars( + "layer", errors="ignore" ) + return mask_2D(self, domain_2d) def to_mf6_pkg( self, @@ -518,7 +389,7 @@ def to_mf6_pkg( bottom: GridDataArray, k: GridDataArray, validate: bool = False, - is_partitioned: bool = False, + strict_well_validation: bool = True, ) -> Mf6Wel: """ Write package to Modflow 6 package. @@ -537,9 +408,6 @@ def to_mf6_pkg( Parameters ---------- - is_partitioned: bool - validate: bool - Run validation before converting active: {xarry.DataArray, xugrid.UgridDataArray} Grid with active cells. top: {xarry.DataArray, xugrid.UgridDataArray} @@ -548,42 +416,60 @@ def to_mf6_pkg( Grid with bottom of model layers. k: {xarry.DataArray, xugrid.UgridDataArray} Grid with hydraulic conductivities. + validate: bool, default True + Run validation before converting + strict_well_validation: bool, default True + Set well validation strict: + Throw error if well is removed entirely during its assignment to + layers. + Returns ------- Mf6Wel Object with wells as list based input. 
""" - if validate: + validation_context = ValidationContext( + validate=validate, strict_well_validation=strict_well_validation + ) + return self._to_mf6_pkg(active, top, bottom, k, validation_context) + + def _to_mf6_pkg( + self, + active: GridDataArray, + top: GridDataArray, + bottom: GridDataArray, + k: GridDataArray, + validation_context: ValidationContext, + ) -> Mf6Wel: + if validation_context.validate: errors = self._validate(self._write_schemata) if len(errors) > 0: message = validation_pkg_error_message(errors) raise ValidationError(message) - minimum_k = self.dataset["minimum_k"].item() - minimum_thickness = self.dataset["minimum_thickness"].item() - - wells_df = self.__create_wells_df() - wells_assigned = self.__create_assigned_wells( - wells_df, active, top, bottom, k, minimum_k, minimum_thickness - ) - + wells_df = self._create_wells_df() nwells_df = len(wells_df["id"].unique()) - nwells_assigned = ( - 0 if wells_assigned.empty else len(wells_assigned["id"].unique()) - ) - if nwells_df == 0: - raise ValueError("No wells were assigned in package. None were present.") - - if not is_partitioned and nwells_df != nwells_assigned: - raise ValueError( - "One or more well(s) are completely invalid due to minimum conductivity and thickness constraints." + raise ValidationError( + "No wells were assigned in package. None were present." ) + assigned_wells = self._assign_wells_to_layers(wells_df, active, top, bottom, k) + filtered_assigned_well_ids = self.gather_filtered_well_ids( + assigned_wells, wells_df + ) + message_assign = self.to_mf6_package_information( + filtered_assigned_well_ids, reason_text="permeability/thickness constraints" + ) + error_on_well_removal = validation_context.strict_well_validation + if error_on_well_removal and len(filtered_assigned_well_ids) > 0: + logger.log(loglevel=LogLevel.ERROR, message=message_assign) + raise ValidationError(message_assign) + ds = xr.Dataset() - ds["cellid"] = self.__create_cellid(wells_assigned, active) + ds["cellid"] = self._create_cellid(assigned_wells, active) - ds_vars = self.__create_dataset_vars(wells_assigned, wells_df, ds["cellid"]) + ds_vars = self._create_dataset_vars(assigned_wells, ds["cellid"]) ds = ds.assign(**ds_vars.data_vars) ds = remove_inactive(ds, active) @@ -591,23 +477,830 @@ def to_mf6_pkg( ds["print_flows"] = self["print_flows"].values[()] ds["print_input"] = self["print_input"].values[()] + filtered_final_well_ids = self.gather_filtered_well_ids(ds, wells_df) + if len(filtered_final_well_ids) > 0: + reason_text = "inactive cells or permeability/thickness constraints" + message_end = self.to_mf6_package_information( + filtered_final_well_ids, reason_text=reason_text + ) + logger.log(loglevel=LogLevel.WARNING, message=message_end) + + ds = ds.drop_vars("id") + return Mf6Wel(**ds.data_vars) - def mask(self, domain: GridDataArray) -> Well: - """ - Mask wells based on two-dimensional domain. For three-dimensional - masking: Wells falling in inactive cells are automatically removed in - the call to write to Modflow 6 package. You can verify this by calling - the ``to_mf6_pkg`` method. 
+ def gather_filtered_well_ids( + self, well_data_filtered: pd.DataFrame | xr.Dataset, well_data: pd.DataFrame + ) -> list[str]: + filtered_well_ids = [ + id + for id in well_data["id"].unique() + if id not in well_data_filtered["id"].values + ] + return filtered_well_ids + + def to_mf6_package_information( + self, filtered_wells: list[str], reason_text: str + ) -> str: + message = textwrap.dedent( + f"""Some wells were not placed in the MF6 well package. This can be + due to {reason_text}.\n""" + ) + if len(filtered_wells) < 10: + message += "The filtered wells are: \n" + else: + message += " The first 10 unplaced wells are: \n" + + for i in range(min(10, len(filtered_wells))): + ids = filtered_wells[i] + x = self.dataset["x"][int(filtered_wells[i])].values[()] + y = self.dataset["y"][int(filtered_wells[i])].values[()] + message += f" id = {ids} x = {x} y = {y} \n" + return message + + def _create_wells_df(self) -> pd.DataFrame: + raise NotImplementedError("Method in abstract base class called") + + def _assign_wells_to_layers( + self, + wells_df: pd.DataFrame, + active: GridDataArray, + top: GridDataArray, + bottom: GridDataArray, + k: GridDataArray, + ) -> pd.DataFrame: + raise NotImplementedError("Method in abstract base class called") + + @classmethod + def _validate_imod5_depth_information( + cls, key: str, pkg_data: dict, df: pd.DataFrame + ) -> None: + raise NotImplementedError("Method in abstract base class called") + + @classmethod + def from_imod5_data( + cls, + key: str, + imod5_data: dict[str, dict[str, GridDataArray]], + times: list[datetime], + minimum_k: float = 0.1, + minimum_thickness: float = 0.05, + ) -> "GridAgnosticWell": """ + Convert wells to imod5 data, loaded with + :func:`imod.formats.prj.open_projectfile_data`, to a Well object. As + iMOD5 handles wells differently than iMOD Python normally does, some + data transformations are made, which are outlined further. + + iMOD5 stores well information in IPF files and it supports two ways to + specify injection/extraction rates: + + 1. A timeseries of well rates, in an associated text file. We will + call these "associated wells" further in this text. + 2. Constant rates in an IPF file, without an associated text file. + We will call these "unassociated wells" further in this text. + + Depending on this, iMOD5 does different things, which we need to mimic + in this method. + + *Associated wells* + + Wells with timeseries in an associated textfile are processed as + follows: + + - Wells are validated if the following requirements are met + * Associated well entries in projectfile are defined on either all + timestamps or just the first + * Multiplication and addition factors need to remain constant through time + * Same associated well cannot be assigned to multiple layers + - The dataframe of the first projectfile timestamp is selected + - Rate timeseries are resampled with a time weighted mean to the + simulation times. + - When simulation times fall outside well timeseries range, the last + rate is forward filled. + - Projectfile timestamps are not used. Even if assigned to a + "steady-state" timestamp, the resulting dataset still uses simulation + times. + + *Unassociated wells* + + Wells without associated textfiles are processed as follows: + + - When a unassociated well disappears from the next time entry in the + projectfile, the well is deactivated by setting its rate to 0.0. 
This + is to prevent the well being activated again in case of any potential + forward filling at a later stage by + :meth:`imod.mf6.Modflow6Simulation.create_time_discretization` + - Wells assigned to a "steady-state" entry in the projectfile will have + no "time" dimension in the resulting dataset. + - Times beyond the year 2261 are out of bounds for pandas. In associated + timeseries these are ignored, instead the last stage is forward + filled. + + .. note:: + In case you are wondering why is this so complicated? There are two + main reasons: + + - iMOD5 is inconsistent in how it treats timeseries for grid data, + compared to point data. Whereas grids are forward filled when + there is no entry specified for a time entry, unassociated wells + are deactivated. Associated wells, however, are forward filled. + - Normally there are two levels in which times are defined: The + simulation times, which are the requested times for the + simulation, and projectfile times, on which data is defined. With + associated ipfs, times are defined in three + levels: There are simulation times (in iMOD5 in the ini + file), there are projectfile times, and there are times + defined in the associated textfiles on which data is defined. - # Drop layer coordinate if present, otherwise a layer coordinate is assigned - # which causes conflicts downstream when assigning wells and deriving - # cellids. - domain_2d = domain.isel(layer=0, drop=True, missing_dims="ignore").drop_vars( - "layer", errors="ignore" + Parameters + ---------- + + key: str + Name of the well system in the imod5 data + imod5_data: dict + iMOD5 data loaded from a projectfile with + :func:`imod.formats.prj.open_projectfile_data` + times: list + Simulation times + minimum_k: float, optional + On creating point wells, no point wells will be placed in cells with + a lower horizontal conductivity than this. Wells are placed when + ``to_mf6_pkg`` is called. + minimum_thickness: float, optional + On creating point wells, no point wells will be placed in cells with + a lower thickness than this. Wells are placed when ``to_mf6_pkg`` is + called. + """ + pkg_data = imod5_data[key] + all_well_times = get_all_imod5_prj_well_times(imod5_data) + + df = _unpack_package_data(pkg_data, times, all_well_times) + cls._validate_imod5_depth_information(key, pkg_data, df) + + # Groupby unique wells, to get dataframes per time. + colnames_group = ["x", "y"] + cls._imod5_depth_colnames + # Associated wells need additional grouping by id + if pkg_data["has_associated"]: + colnames_group.append("id") + wel_index, unique_well_groups = zip(*df.groupby(colnames_group)) + + # Unpack wel indices by zipping + varnames = [("x", float), ("y", float)] + cls._depth_colnames + index_values = zip(*wel_index) + cls_input: dict[str, Any] = { + var: np.array(value, dtype=dtype) + for (var, dtype), value in zip(varnames, index_values) + } + cls_input["rate"] = _prepare_well_rates_from_groups( + pkg_data, unique_well_groups, times ) - return mask_2D(self, domain_2d) + cls_input["minimum_k"] = minimum_k + cls_input["minimum_thickness"] = minimum_thickness + + return cls(**cls_input) + + +class Well(GridAgnosticWell): + """ + Agnostic WEL package, which accepts x, y and a top and bottom of the well screens. + + This package can be written to any provided model grid. + Any number of WEL Packages can be specified for a single groundwater flow model. 
+ https://water.usgs.gov/water-resources/software/MODFLOW-6/mf6io_6.0.4.pdf#page=63 + + Parameters + ---------- + + y: list of floats or np.array of floats + is the y location of the well. + x: list of floats or np.array of floats + is the x location of the well. + screen_top: list of floats or np.array of floats + is the top of the well screen. + screen_bottom: list of floats or np.array of floats + is the bottom of the well screen. + rate: list of floats or xr.DataArray + is the volumetric well rate. A positive value indicates well + (injection) and a negative value indicates discharge (extraction) (q). + If provided as DataArray, an ``"index"`` dimension is required and an + optional ``"time"`` dimension and coordinate specify transient input. + In the latter case, it is important that dimensions are in the order: + ``("time", "index")`` + concentration: array of floats (xr.DataArray, optional) + if this flow package is used in simulations also involving transport, then this array is used + as the concentration for inflow over this boundary. + concentration_boundary_type: ({"AUX", "AUXMIXED"}, optional) + if this flow package is used in simulations also involving transport, then this keyword specifies + how outflow over this boundary is computed. + id: list of Any, optional + assign an identifier code to each well. if not provided, one will be generated + Must be convertible to string, and unique entries. + minimum_k: float, optional + on creating point wells, no point wells will be placed in cells with a lower horizontal conductivity than this + minimum_thickness: float, optional + on creating point wells, no point wells will be placed in cells with a lower thickness than this + print_input: ({True, False}, optional) + keyword to indicate that the list of well information will be written to + the listing file immediately after it is read. + Default is False. + print_flows: ({True, False}, optional) + Indicates that the list of well flow rates will be printed to the + listing file for every stress period time step in which "BUDGET PRINT" + is specified in Output Control. If there is no Output Control option + and PRINT FLOWS is specified, then flow rates are printed for the last + time step of each stress period. + Default is False. + save_flows: ({True, False}, optional) + Indicates that well flow terms will be written to the file specified + with "BUDGET FILEOUT" in Output Control. + Default is False. + observations: [Not yet supported.] + Default is None. + validate: {True, False} + Flag to indicate whether the package should be validated upon + initialization. This raises a ValidationError if package input is + provided in the wrong manner. Defaults to True. + repeat_stress: Optional[xr.DataArray] of datetimes + Used to repeat data for e.g. repeating stress periods such as + seasonality without duplicating the values. The DataArray should have + dimensions ``("repeat", "repeat_items")``. The ``repeat_items`` + dimension should have size 2: the first value is the "key", the second + value is the "value". For the "key" datetime, the data of the "value" + datetime will be used. Can also be set with a dictionary using the + ``set_repeat_stress`` method. 
+ + Examples + --------- + + >>> screen_top = [0.0, 0.0] + >>> screen_bottom = [-2.0, -2.0] + >>> y = [83.0, 77.0] + >>> x = [81.0, 82.0] + >>> rate = [1.0, 1.0] + + >>> imod.mf6.Well(x, y, screen_top, screen_bottom, rate) + + For a transient well: + + >>> weltimes = pd.date_range("2000-01-01", "2000-01-03") + + >>> rate_factor_time = xr.DataArray([0.5, 1.0], coords={"time": weltimes}, dims=("time",)) + >>> rate_transient = rate_factor_time * xr.DataArray(rate, dims=("index",)) + + >>> imod.mf6.Well(x, y, screen_top, screen_bottom, rate_transient) + """ + + _pkg_id = "wel" + + _auxiliary_data = {"concentration": "species"} + _init_schemata = { + "screen_top": [DTypeSchema(np.floating)], + "screen_bottom": [DTypeSchema(np.floating)], + "y": [DTypeSchema(np.floating)], + "x": [DTypeSchema(np.floating)], + "rate": [DTypeSchema(np.floating)], + "concentration": [DTypeSchema(np.floating)], + } + _write_schemata = { + "screen_top": [AnyNoDataSchema(), EmptyIndexesSchema()], + "screen_bottom": [ + AnyNoDataSchema(), + EmptyIndexesSchema(), + AllValueSchema("<=", "screen_top"), + ], + "y": [AnyNoDataSchema(), EmptyIndexesSchema()], + "x": [AnyNoDataSchema(), EmptyIndexesSchema()], + "rate": [AnyNoDataSchema(), EmptyIndexesSchema()], + "concentration": [AnyNoDataSchema(), EmptyIndexesSchema()], + } + + _imod5_depth_colnames: list[str] = ["filt_top", "filt_bot"] + _depth_colnames: list[tuple[str, type]] = [ + ("screen_top", float), + ("screen_bottom", float), + ] + + @init_log_decorator() + def __init__( + self, + x: np.ndarray | list[float], + y: np.ndarray | list[float], + screen_top: np.ndarray | list[float], + screen_bottom: np.ndarray | list[float], + rate: list[float] | xr.DataArray, + concentration: Optional[list[float] | xr.DataArray] = None, + concentration_boundary_type="aux", + id: Optional[list[Any]] = None, + minimum_k: float = 0.1, + minimum_thickness: float = 0.05, + print_input: bool = False, + print_flows: bool = False, + save_flows: bool = False, + observations=None, + validate: bool = True, + repeat_stress: Optional[xr.DataArray] = None, + ): + if id is None: + id = [str(i) for i in range(len(x))] + else: + set_id = set(id) + if len(id) != len(set_id): + raise ValueError("id's must be unique") + id = [str(i) for i in id] + dict_dataset = { + "screen_top": _assign_dims(screen_top), + "screen_bottom": _assign_dims(screen_bottom), + "y": _assign_dims(y), + "x": _assign_dims(x), + "rate": _assign_dims(rate), + "id": _assign_dims(id), + "minimum_k": minimum_k, + "minimum_thickness": minimum_thickness, + "print_input": print_input, + "print_flows": print_flows, + "save_flows": save_flows, + "observations": observations, + "repeat_stress": repeat_stress, + "concentration": concentration, + "concentration_boundary_type": concentration_boundary_type, + } + super().__init__(dict_dataset) + # Set index as coordinate + index_coord = np.arange(self.dataset.dims["index"]) + self.dataset = self.dataset.assign_coords(index=index_coord) + self._validate_init_schemata(validate) + + def clip_box( + self, + time_min: Optional[cftime.datetime | np.datetime64 | str] = None, + time_max: Optional[cftime.datetime | np.datetime64 | str] = None, + layer_min: Optional[int] = None, + layer_max: Optional[int] = None, + x_min: Optional[float] = None, + x_max: Optional[float] = None, + y_min: Optional[float] = None, + y_max: Optional[float] = None, + top: Optional[GridDataArray] = None, + bottom: Optional[GridDataArray] = None, + ) -> Package: + """ + Clip a package by a bounding box (time, layer, y, x). 
+ + The well package doesn't use the layer attribute to describe its depth and length. + Instead, it uses the screen_top and screen_bottom parameters which corresponds with + the z-coordinates of the top and bottom of the well. To go from a layer_min and + layer_max to z-values used for clipping the well a top and bottom array have to be + provided as well. + + Slicing intervals may be half-bounded, by providing None: + + * To select 500.0 <= x <= 1000.0: + ``clip_box(x_min=500.0, x_max=1000.0)``. + * To select x <= 1000.0: ``clip_box(x_min=None, x_max=1000.0)`` + or ``clip_box(x_max=1000.0)``. + * To select x >= 500.0: ``clip_box(x_min = 500.0, x_max=None.0)`` + or ``clip_box(x_min=1000.0)``. + + Parameters + ---------- + time_min: optional + time_max: optional + layer_min: optional, int + layer_max: optional, int + x_min: optional, float + x_max: optional, float + y_min: optional, float + y_max: optional, float + top: optional, GridDataArray + bottom: optional, GridDataArray + + Returns + ------- + sliced : Package + """ + if (layer_max or layer_min) and (top is None or bottom is None): + raise ValueError( + "When clipping by layer both the top and bottom should be defined" + ) + + if top is not None: + # Bug in mypy when using unions in isInstance + if not isinstance(top, GridDataArray) or "layer" not in top.coords: # type: ignore + top = create_layered_top(bottom, top) + + # The super method will select in the time dimension without issues. + new = super().clip_box(time_min=time_min, time_max=time_max) + + ds = new.dataset + + z_max = self._find_well_value_at_layer(ds, top, layer_max) + z_min = self._find_well_value_at_layer(ds, bottom, layer_min) + + if z_max is not None: + ds["screen_top"] = ds["screen_top"].clip(None, z_max) + if z_min is not None: + ds["screen_bottom"] = ds["screen_bottom"].clip(z_min, None) + + # Initiate array of True with right shape to deal with case no spatial + # selection needs to be done. 
+ in_bounds = np.full(ds.dims["index"], True) + # Select all variables along "index" dimension + in_bounds &= values_within_range(ds["x"], x_min, x_max) + in_bounds &= values_within_range(ds["y"], y_min, y_max) + in_bounds &= values_within_range(ds["screen_top"], z_min, z_max) + in_bounds &= values_within_range(ds["screen_bottom"], z_min, z_max) + # remove wells where the screen bottom and top are the same + in_bounds &= abs(ds["screen_bottom"] - ds["screen_top"]) > 1e-5 + # Replace dataset with reduced dataset based on booleans + new.dataset = ds.loc[{"index": in_bounds}] + + return new + + @staticmethod + def _find_well_value_at_layer( + well_dataset: xr.Dataset, grid: GridDataArray, layer: Optional[int] + ): + value = None if layer is None else grid.isel(layer=layer) + + # if value is a grid select the values at the well locations and drop the dimensions + if (value is not None) and is_spatial_grid(value): + value = imod.select.points_values( + value, + x=well_dataset["x"].values, + y=well_dataset["y"].values, + out_of_bounds="ignore", + ).drop_vars(lambda x: x.coords) + + return value + + def _create_wells_df(self) -> pd.DataFrame: + wells_df = self.dataset.to_dataframe() + wells_df = wells_df.rename( + columns={ + "screen_top": "top", + "screen_bottom": "bottom", + } + ) + + return wells_df + + @standard_log_decorator() + def _validate(self, schemata: dict, **kwargs) -> dict[str, list[ValidationError]]: + kwargs["screen_top"] = self.dataset["screen_top"] + return Package._validate(self, schemata, **kwargs) + + def _assign_wells_to_layers( + self, + wells_df: pd.DataFrame, + active: GridDataArray, + top: GridDataArray, + bottom: GridDataArray, + k: GridDataArray, + ) -> pd.DataFrame: + # Ensure top, bottom & k + # are broadcasted to 3d grid + like = ones_like(active) + bottom = like * bottom + top_2d = (like * top).sel(layer=1) + top_3d = bottom.shift(layer=1).fillna(top_2d) + k = like * k + + index_names = wells_df.index.names + + minimum_k = self.dataset["minimum_k"].item() + minimum_thickness = self.dataset["minimum_thickness"].item() + + # Unset multi-index, because assign_wells cannot deal with + # multi-indices which is returned by self.dataset.to_dataframe() in + # case of a "time" and "species" coordinate. + wells_df = wells_df.reset_index() + + assigned_wells = assign_wells( + wells_df, top_3d, bottom, k, minimum_thickness, minimum_k, True + ) + # Set multi-index again + assigned_wells = assigned_wells.set_index(index_names).sort_index() + + return assigned_wells + + @classmethod + def _validate_imod5_depth_information( + cls, key: str, pkg_data: dict, df: pd.DataFrame + ) -> None: + if "layer" in pkg_data.keys() and (np.any(np.array(pkg_data["layer"]) != 0)): + log_msg = textwrap.dedent( + f""" + In well {key} a layer was assigned, but this is not + supported for imod.mf6.Well. Assignment will be done based on + filter_top and filter_bottom, and the chosen layer + ({pkg_data["layer"]}) will be ignored. To specify by layer, use + imod.mf6.LayeredWell. + """ + ) + logger.log(loglevel=LogLevel.WARNING, message=log_msg, additional_depth=2) + + if "filt_top" not in df.columns or "filt_bot" not in df.columns: + log_msg = textwrap.dedent( + f""" + In well {key} the 'filt_top' and 'filt_bot' columns were + not both found; this is not supported for import. To specify by + layer, use imod.mf6.LayeredWell. 
+ """ + ) + logger.log(loglevel=LogLevel.ERROR, message=log_msg, additional_depth=2) + raise ValueError(log_msg) + + @standard_log_decorator() + def cleanup(self, dis: StructuredDiscretization | VerticesDiscretization): + """ + Clean up package inplace. This method calls + :func:`imod.prepare.cleanup.cleanup_wel`, see documentation of that + function for details on cleanup. + + dis: imod.mf6.StructuredDiscretization | imod.mf6.VerticesDiscretization + Model discretization package. + """ + # Top and bottom should be forced to grids with a x, y coordinates + top, bottom = broadcast_to_full_domain(**dict(dis.dataset.data_vars)) + # Collect point variable datanames + point_varnames = list(self._write_schemata.keys()) + if "concentration" not in self.dataset.keys(): + point_varnames.remove("concentration") + point_varnames.append("id") + # Create dataset with purely point locations + point_ds = self.dataset[point_varnames] + # Take first item of irrelevant dimensions + point_ds = point_ds.isel(time=0, species=0, drop=True, missing_dims="ignore") + # Cleanup well dataframe + wells = point_ds.to_dataframe() + minimum_thickness = float(self.dataset["minimum_thickness"]) + cleaned_wells = cleanup_wel(wells, top.isel(layer=0), bottom, minimum_thickness) + # Select with ids in cleaned dataframe to drop points outside grid. + well_ids = cleaned_wells.index + dataset_cleaned = self.dataset.swap_dims({"index": "id"}).sel(id=well_ids) + # Assign adjusted screen top and bottom + dataset_cleaned["screen_top"] = cleaned_wells["screen_top"] + dataset_cleaned["screen_bottom"] = cleaned_wells["screen_bottom"] + # Ensure dtype of id is preserved + id_type = self.dataset["id"].dtype + dataset_cleaned = dataset_cleaned.swap_dims({"id": "index"}).reset_coords("id") + dataset_cleaned["id"] = dataset_cleaned["id"].astype(id_type) + # Override dataset + self.dataset = dataset_cleaned + + +class LayeredWell(GridAgnosticWell): + """ + Agnostic WEL package, which accepts x, y and layers. + + This package can be written to any provided model grid, given that it has + enough layers. Any number of WEL Packages can be specified for a single + groundwater flow model. + https://water.usgs.gov/water-resources/software/MODFLOW-6/mf6io_6.0.4.pdf#page=63 + + Parameters + ---------- + + y: list of floats or np.array of floats + is the y location of the well. + x: list of floats or np.array of floats + is the x location of the well. + layer: list of ints or np.array of ints + is the layer of the well. + rate: list of floats or xr.DataArray + is the volumetric well rate. A positive value indicates well + (injection) and a negative value indicates discharge (extraction) (q). + If provided as DataArray, an ``"index"`` dimension is required and an + optional ``"time"`` dimension and coordinate specify transient input. + In the latter case, it is important that dimensions are in the order: + ``("time", "index")`` + concentration: array of floats (xr.DataArray, optional) + if this flow package is used in simulations also involving transport, then this array is used + as the concentration for inflow over this boundary. + concentration_boundary_type: ({"AUX", "AUXMIXED"}, optional) + if this flow package is used in simulations also involving transport, then this keyword specifies + how outflow over this boundary is computed. + id: list of Any, optional + assign an identifier code to each well. if not provided, one will be generated + Must be convertible to string, and unique entries. 
+ minimum_k: float, optional + on creating point wells, no point wells will be placed in cells with a lower horizontal conductivity than this + minimum_thickness: float, optional + on creating point wells, no point wells will be placed in cells with a lower thickness than this + print_input: ({True, False}, optional) + keyword to indicate that the list of well information will be written to + the listing file immediately after it is read. + Default is False. + print_flows: ({True, False}, optional) + Indicates that the list of well flow rates will be printed to the + listing file for every stress period time step in which "BUDGET PRINT" + is specified in Output Control. If there is no Output Control option + and PRINT FLOWS is specified, then flow rates are printed for the last + time step of each stress period. + Default is False. + save_flows: ({True, False}, optional) + Indicates that well flow terms will be written to the file specified + with "BUDGET FILEOUT" in Output Control. + Default is False. + observations: [Not yet supported.] + Default is None. + validate: {True, False} + Flag to indicate whether the package should be validated upon + initialization. This raises a ValidationError if package input is + provided in the wrong manner. Defaults to True. + repeat_stress: Optional[xr.DataArray] of datetimes + Used to repeat data for e.g. repeating stress periods such as + seasonality without duplicating the values. The DataArray should have + dimensions ``("repeat", "repeat_items")``. The ``repeat_items`` + dimension should have size 2: the first value is the "key", the second + value is the "value". For the "key" datetime, the data of the "value" + datetime will be used. Can also be set with a dictionary using the + ``set_repeat_stress`` method. 
+ + Examples + --------- + + >>> layer = [1, 2] + >>> y = [83.0, 77.0] + >>> x = [81.0, 82.0] + >>> rate = [1.0, 1.0] + + >>> imod.mf6.LayeredWell(x, y, layer, rate) + + For a transient well: + + >>> weltimes = pd.date_range("2000-01-01", "2000-01-03") + + >>> rate_factor_time = xr.DataArray([0.5, 1.0], coords={"time": weltimes}, dims=("time",)) + >>> rate_transient = rate_factor_time * xr.DataArray(rate, dims=("index",)) + + >>> imod.mf6.LayeredWell(x, y, layer, rate_transient) + """ + + _pkg_id = "wel" + + _auxiliary_data = {"concentration": "species"} + _init_schemata = { + "layer": [DTypeSchema(np.integer)], + "y": [DTypeSchema(np.floating)], + "x": [DTypeSchema(np.floating)], + "rate": [DTypeSchema(np.floating)], + "concentration": [DTypeSchema(np.floating)], + } + _write_schemata = { + "layer": [AnyNoDataSchema(), EmptyIndexesSchema()], + "y": [AnyNoDataSchema(), EmptyIndexesSchema()], + "x": [AnyNoDataSchema(), EmptyIndexesSchema()], + "rate": [AnyNoDataSchema(), EmptyIndexesSchema()], + "concentration": [AnyNoDataSchema(), EmptyIndexesSchema()], + } + _imod5_depth_colnames: list[str] = ["layer"] + _depth_colnames: list[tuple[str, type]] = [("layer", int)] + + @init_log_decorator() + def __init__( + self, + x: np.ndarray | list[float], + y: np.ndarray | list[float], + layer: np.ndarray | list[int], + rate: list[float] | xr.DataArray, + concentration: Optional[list[float] | xr.DataArray] = None, + concentration_boundary_type="aux", + id: Optional[list[Any]] = None, + minimum_k: float = 0.1, + minimum_thickness: float = 1.0, + print_input: bool = False, + print_flows: bool = False, + save_flows: bool = False, + observations=None, + validate: bool = True, + repeat_stress: Optional[xr.DataArray] = None, + ): + if id is None: + id = [str(i) for i in range(len(x))] + else: + set_id = set(id) + if len(id) != len(set_id): + raise ValueError("id's must be unique") + id = [str(i) for i in id] + dict_dataset = { + "layer": _assign_dims(layer), + "y": _assign_dims(y), + "x": _assign_dims(x), + "rate": _assign_dims(rate), + "id": _assign_dims(id), + "minimum_k": minimum_k, + "minimum_thickness": minimum_thickness, + "print_input": print_input, + "print_flows": print_flows, + "save_flows": save_flows, + "observations": observations, + "repeat_stress": repeat_stress, + "concentration": concentration, + "concentration_boundary_type": concentration_boundary_type, + } + super().__init__(dict_dataset) + # Set index as coordinate + index_coord = np.arange(self.dataset.dims["index"]) + self.dataset = self.dataset.assign_coords(index=index_coord) + self._validate_init_schemata(validate) + + def clip_box( + self, + time_min: Optional[cftime.datetime | np.datetime64 | str] = None, + time_max: Optional[cftime.datetime | np.datetime64 | str] = None, + layer_min: Optional[int] = None, + layer_max: Optional[int] = None, + x_min: Optional[float] = None, + x_max: Optional[float] = None, + y_min: Optional[float] = None, + y_max: Optional[float] = None, + top: Optional[GridDataArray] = None, + bottom: Optional[GridDataArray] = None, + ) -> Package: + """ + Clip a package by a bounding box (time, layer, y, x). + + Slicing intervals may be half-bounded, by providing None: + + * To select 500.0 <= x <= 1000.0: + ``clip_box(x_min=500.0, x_max=1000.0)``. + * To select x <= 1000.0: ``clip_box(x_min=None, x_max=1000.0)`` + or ``clip_box(x_max=1000.0)``. + * To select x >= 500.0: ``clip_box(x_min = 500.0, x_max=None.0)`` + or ``clip_box(x_min=1000.0)``. 
+ + Parameters + ---------- + time_min: optional + time_max: optional + layer_min: optional, int + layer_max: optional, int + x_min: optional, float + x_max: optional, float + y_min: optional, float + y_max: optional, float + top: optional, GridDataArray + bottom: optional, GridDataArray + + Returns + ------- + sliced : Package + """ + # The super method will select in the time dimension without issues. + new = super().clip_box(time_min=time_min, time_max=time_max) + + ds = new.dataset + + # Initiate array of True with right shape to deal with case no spatial + # selection needs to be done. + in_bounds = np.full(ds.dims["index"], True) + # Select all variables along "index" dimension + in_bounds &= values_within_range(ds["x"], x_min, x_max) + in_bounds &= values_within_range(ds["y"], y_min, y_max) + in_bounds &= values_within_range(ds["layer"], layer_min, layer_max) + # Replace dataset with reduced dataset based on booleans + new.dataset = ds.loc[{"index": in_bounds}] + + return new + + def _create_wells_df(self) -> pd.DataFrame: + return self.dataset.to_dataframe() + + def _assign_wells_to_layers( + self, + wells_df: pd.DataFrame, + active: GridDataArray, + top: GridDataArray, + bottom: GridDataArray, + k: GridDataArray, + ) -> pd.DataFrame: + return wells_df + + @classmethod + def _validate_imod5_depth_information( + cls, key: str, pkg_data: dict, df: pd.DataFrame + ) -> None: + if np.any(np.array(pkg_data["layer"]) == 0): + log_msg = textwrap.dedent( + f""" + Well {key} in projectfile is assigned to layer 0, but should be > + 0 for LayeredWell + """ + ) + logger.log(loglevel=LogLevel.ERROR, message=log_msg, additional_depth=2) + raise ValueError(log_msg) + + if "layer" not in df.columns: + log_msg = textwrap.dedent( + f""" + IPF file {key} has no layer assigned, but this is required + for LayeredWell. + """ + ) + logger.log(loglevel=LogLevel.ERROR, message=log_msg, additional_depth=2) + raise ValueError(log_msg) class WellDisStructured(DisStructuredBoundaryCondition): diff --git a/imod/mf6/write_context.py b/imod/mf6/write_context.py index a36fe7827..9679b2d3e 100644 --- a/imod/mf6/write_context.py +++ b/imod/mf6/write_context.py @@ -4,7 +4,6 @@ from dataclasses import dataclass from os.path import relpath from pathlib import Path -from typing import Optional, Union @dataclass @@ -28,22 +27,18 @@ class WriteContext: it will be set to the simulation_directrory. """ - def __init__( - self, - simulation_directory: Path = Path("."), - use_binary: bool = False, - use_absolute_paths: bool = False, - write_directory: Optional[Union[str, Path]] = None, - ): - self.__simulation_directory = Path(simulation_directory) - self.__use_binary = use_binary - self.__use_absolute_paths = use_absolute_paths - self.__write_directory = ( - Path(write_directory) - if write_directory is not None - else self.__simulation_directory + simulation_directory: Path = Path(".") + use_binary: bool = False + use_absolute_paths: bool = False + write_directory: Path = None # type: ignore + + def __post_init__(self): + self.simulation_directory = Path(self.simulation_directory) + self.write_directory = ( + Path(self.write_directory) + if self.write_directory is not None + else self.simulation_directory ) - self.__is_partitioned = False def get_formatted_write_directory(self) -> Path: """ @@ -52,34 +47,14 @@ def get_formatted_write_directory(self) -> Path: be relative to the simulation directory, which makes it usable by MF6. 
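A usage sketch for ``LayeredWell.clip_box`` defined above, reusing the hypothetical coordinates from the class docstring; only the well inside the half-bounded box is kept:

>>> import imod
>>> well = imod.mf6.LayeredWell(
...     x=[81.0, 82.0], y=[83.0, 77.0], layer=[1, 2], rate=[1.0, 1.0]
... )
>>> clipped = well.clip_box(x_min=81.5, layer_max=2)
>>> clipped.dataset.sizes["index"]
1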
""" if self.use_absolute_paths: - return self.__write_directory - return Path(relpath(self.write_directory, self.__simulation_directory)) + return self.write_directory + return Path(relpath(self.write_directory, self.simulation_directory)) def copy_with_new_write_directory(self, new_write_directory: Path) -> WriteContext: new_context = deepcopy(self) - new_context.__write_directory = Path(new_write_directory) + new_context.write_directory = Path(new_write_directory) return new_context - @property - def simulation_directory(self) -> Path: - return self.__simulation_directory - - @property - def use_binary(self) -> bool: - return self.__use_binary - - @use_binary.setter - def use_binary(self, value) -> None: - self.__use_binary = value - - @property - def use_absolute_paths(self) -> bool: - return self.__use_absolute_paths - - @property - def write_directory(self) -> Path: - return self.__write_directory - @property def root_directory(self) -> Path: """ @@ -87,14 +62,6 @@ def root_directory(self) -> Path: that are in agreement with the use_absolute_paths setting. """ if self.use_absolute_paths: - return self.__simulation_directory + return self.simulation_directory else: return Path("") - - @property - def is_partitioned(self) -> bool: - return self.__is_partitioned - - @is_partitioned.setter - def is_partitioned(self, value: bool) -> None: - self.__is_partitioned = value diff --git a/imod/msw/__init__.py b/imod/msw/__init__.py index bcbe174aa..81ec7a41f 100644 --- a/imod/msw/__init__.py +++ b/imod/msw/__init__.py @@ -15,5 +15,5 @@ from imod.msw.output_control import TimeOutputControl, VariableOutputControl from imod.msw.ponding import Ponding from imod.msw.scaling_factors import ScalingFactors -from imod.msw.sprinkling import Sprinkling +from imod.msw.sprinkling import Sprinkling, SprinklingMultipleSources from imod.msw.vegetation import AnnualCropFactors diff --git a/imod/msw/coupler_mapping.py b/imod/msw/coupler_mapping.py index b7294813a..0b0662e04 100644 --- a/imod/msw/coupler_mapping.py +++ b/imod/msw/coupler_mapping.py @@ -9,7 +9,6 @@ from imod.msw.fixed_format import VariableMetaData from imod.msw.pkgbase import MetaSwapPackage - class CouplerMapping(MetaSwapPackage): """ This contains the data to connect MODFLOW 6 cells to MetaSWAP svats. 
@@ -108,10 +107,9 @@ def _create_well_id(self, svat):
         well_column = self.well["column"] - 1
         well_layer = self.well["layer"] - 1
 
-        n_mod = self.idomain_active.sum()
-        mod_id = xr.full_like(self.idomain_active, 0, dtype=np.int64)
-        mod_id.values[self.idomain_active.values] = np.arange(1, n_mod + 1)
-
+        n_mod = self.idomain_active.sum().compute()
+        mod_id = xr.full_like(self.idomain_active, 0, dtype=np.int64).compute()
+        mod_id.values[self.idomain_active.values] = np.arange(1, n_mod + 1)
         well_mod_id = mod_id[well_layer, well_row, well_column]
         well_mod_id = np.tile(well_mod_id, (n_subunit, 1))
 
diff --git a/imod/msw/model.py b/imod/msw/model.py
index b8b965991..18cf7db4d 100644
--- a/imod/msw/model.py
+++ b/imod/msw/model.py
@@ -212,9 +212,9 @@ def write(self, directory: Union[str, Path]):
         """
 
         # Model checks
-        self._check_required_packages()
-        self._check_vegetation_indices_in_annual_crop_factors()
-        self._check_landuse_indices_in_lookup_options()
+        # self._check_required_packages()
+        # self._check_vegetation_indices_in_annual_crop_factors()
+        # self._check_landuse_indices_in_lookup_options()
 
         # Force to Path
         directory = Path(directory)
diff --git a/imod/msw/sprinkling.py b/imod/msw/sprinkling.py
index fa043aee7..70d38f07d 100644
--- a/imod/msw/sprinkling.py
+++ b/imod/msw/sprinkling.py
@@ -22,8 +22,11 @@ class Sprinkling(MetaSwapPackage):
     max_abstraction_surfacewater: array of floats (xr.DataArray)
        Describes the maximum abstraction of surfacewater to SVAT units in m3
        per day. This array must not have a subunit coordinate.
+    well_id: array of int (xr.DataArray)
+        Describes, per svat, the corresponding id of the well package
+        from which water is extracted.
     well: WellDisStructured
-        Describes the sprinkling of SVAT units coming groundwater.
+        Describes the source location for groundwater intake.
""" _file_name = "scap_svat.inp" @@ -33,21 +36,19 @@ class Sprinkling(MetaSwapPackage): "max_abstraction_surfacewater_mm_d": VariableMetaData(8, None, None, str), "max_abstraction_groundwater_m3_d": VariableMetaData(8, 0.0, 1e9, float), "max_abstraction_surfacewater_m3_d": VariableMetaData(8, 0.0, 1e9, float), - "svat_groundwater": VariableMetaData(10, None, None, str), + "svat_groundwater": VariableMetaData(10, 1, 99999999, int), "layer": VariableMetaData(6, 1, 9999, int), "trajectory": VariableMetaData(10, None, None, str), } - _with_subunit = () - _without_subunit = ( + _with_subunit = ( "max_abstraction_groundwater_m3_d", - "max_abstraction_surfacewater_m3_d", - ) + "max_abstraction_surfacewater_m3_d",) + _without_subunit = () _to_fill = ( "max_abstraction_groundwater_mm_d", "max_abstraction_surfacewater_mm_d", - "svat_groundwater", "trajectory", ) @@ -62,27 +63,91 @@ def __init__( self.dataset["max_abstraction_surfacewater_m3_d"] = max_abstraction_surfacewater self.well = well - self._pkgcheck() + self._pkgcheck() + def _render(self, file, index, svat): + def ravel_per_subunit(array: xr.DataArray) -> np.ndarray: + # per defined well element, all subunits + array_out = array.to_numpy()[:, well_row, well_column].ravel() + # per defined well element, per defined subunits + return array_out[np.isfinite(array_out)] + well_row = self.well["row"] - 1 well_column = self.well["column"] - 1 well_layer = self.well["layer"] + max_rate_per_svat = self.dataset["max_abstraction_groundwater_m3_d"].where( + svat > 0 + ) + layer_per_svat = xr.full_like(max_rate_per_svat, np.nan) + layer_per_svat[:, well_row, well_column] = well_layer + + layer_source = ravel_per_subunit( + layer_per_svat.where(max_rate_per_svat > 0) + ).astype(dtype=np.int32) + svat_source_target = ravel_per_subunit( + svat.where(max_rate_per_svat > 0) + ).astype(dtype=np.int32) + + data_dict = { + "svat": svat_source_target, + "layer": layer_source, + "svat_groundwater": svat_source_target, + } + + + for var in self._with_subunit: + array = self.dataset[var].where(max_rate_per_svat > 0).to_numpy() + array = array[np.isfinite(array)] + data_dict[var] = array - n_subunit = svat["subunit"].size + for var in self._to_fill: + data_dict[var] = "" - well_svat = svat.values[:, well_row, well_column] - well_active = well_svat != 0 + dataframe = pd.DataFrame( + data=data_dict, columns=list(self._metadata_dict.keys()) + ) + + self._check_range(dataframe) - # Tile well_layers for each subunit - layer = np.tile(well_layer, (n_subunit, 1)) + return self.write_dataframe_fixed_width(file, dataframe) - data_dict = {"svat": well_svat[well_active], "layer": layer[well_active]} - for var in self._without_subunit: - well_arr = self.dataset[var].values[well_row, well_column] - well_arr = np.tile(well_arr, (n_subunit, 1)) - data_dict[var] = well_arr[well_active] +class SprinklingMultipleSources(Sprinkling): + + def __init__( + self, + max_abstraction_groundwater: xr.DataArray, + max_abstraction_surfacewater: xr.DataArray, + well: WellDisStructured, + well_id: xr.DataArray + ): + super().__init__(max_abstraction_groundwater, max_abstraction_surfacewater, well) + self.well_id = well_id + + + def _render(self, file, index, svat): + well_row = self.well["row"] - 1 + well_column = self.well["column"] - 1 + layer_source = self.well["layer"] + max_rate = self.dataset["max_abstraction_groundwater_m3_d"] + + svat_target = svat.where(max_rate > 0).to_numpy() + svat_target = svat_target[np.isfinite(svat_target)].astype(dtype=np.int32) + + well_id = 
self.well_id.where(max_rate > 0).to_numpy() + well_id = well_id[np.isfinite(well_id)].astype(dtype=np.int32) + + # always use the first svat as source + svat_source = svat.to_numpy()[0, well_row[well_id], well_column[well_id]] + layer_source = layer_source[well_id] + + data_dict = {"svat": svat_target, "layer": layer_source, "svat_groundwater": svat_source} + + for var in self._with_subunit: + array = self.dataset[var].where(max_rate > 0).to_numpy() + array = array[np.isfinite(array)] + data_dict[var] = array for var in self._to_fill: data_dict[var] = "" @@ -93,4 +158,4 @@ def _render(self, file, index, svat): self._check_range(dataframe) - return self.write_dataframe_fixed_width(file, dataframe) + return self.write_dataframe_fixed_width(file, dataframe) \ No newline at end of file diff --git a/imod/prepare/__init__.py b/imod/prepare/__init__.py index 13032246d..03342c87e 100644 --- a/imod/prepare/__init__.py +++ b/imod/prepare/__init__.py @@ -13,7 +13,12 @@ speed by making use of the Numba compiler, to be able to regrid large datasets. """ -from imod.prepare import spatial, subsoil, surface_water +from imod.prepare import hfb, spatial, subsoil, surface_water +from imod.prepare.cleanup import cleanup_drn, cleanup_ghb, cleanup_riv, cleanup_wel +from imod.prepare.hfb import ( + linestring_to_square_zpolygons, + linestring_to_trapezoid_zpolygons, +) from imod.prepare.layer import ( create_layered_top, get_lower_active_grid_cells, diff --git a/imod/prepare/cleanup.py b/imod/prepare/cleanup.py new file mode 100644 index 000000000..39a36b66b --- /dev/null +++ b/imod/prepare/cleanup.py @@ -0,0 +1,338 @@ +"""Cleanup utilities""" + +from enum import Enum +from typing import Optional + +import pandas as pd +import xarray as xr + +from imod.mf6.utilities.mask import mask_arrays +from imod.prepare.wells import locate_wells, validate_well_columnnames +from imod.schemata import scalar_None +from imod.typing import GridDataArray + + +class AlignLevelsMode(Enum): + TOPDOWN = 0 + BOTTOMUP = 1 + + +def align_nodata(grids: dict[str, xr.DataArray]) -> dict[str, xr.DataArray]: + return mask_arrays(grids) + + +def align_interface_levels( + top: GridDataArray, + bottom: GridDataArray, + method: AlignLevelsMode = AlignLevelsMode.TOPDOWN, +) -> tuple[GridDataArray, GridDataArray]: + to_align = top < bottom + + match method: + case AlignLevelsMode.BOTTOMUP: + return top.where(~to_align, bottom), bottom + case AlignLevelsMode.TOPDOWN: + return top, bottom.where(~to_align, top) + case _: + raise TypeError(f"Unmatched case for method, got {method}") + + +def _cleanup_robin_boundary( + idomain: GridDataArray, grids: dict[str, GridDataArray] +) -> dict[str, GridDataArray]: + """Cleanup robin boundary condition (i.e. 
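An illustration of the alignment helpers introduced in ``imod/prepare/cleanup.py``, using made-up levels: where top < bottom, ``TOPDOWN`` lowers the bottom to the top and ``BOTTOMUP`` raises the top to the bottom.

>>> import xarray as xr
>>> from imod.prepare.cleanup import AlignLevelsMode, align_interface_levels
>>> top = xr.DataArray([1.0, 0.0, 3.0])
>>> bottom = xr.DataArray([0.0, 2.0, 1.0])
>>> _, new_bottom = align_interface_levels(top, bottom, AlignLevelsMode.TOPDOWN)
>>> new_bottom.values.tolist()
[0.0, 0.0, 1.0]
>>> new_top, _ = align_interface_levels(top, bottom, AlignLevelsMode.BOTTOMUP)
>>> new_top.values.tolist()
[1.0, 2.0, 3.0]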
bc with conductance)""" + active = idomain == 1 + # Deactivate conductance cells outside active domain; this nodata + # inconsistency will be aligned in the final call to align_nodata + conductance = grids["conductance"].where(active) + concentration = grids["concentration"] + # Make conductance cells with erronous values inactive + grids["conductance"] = conductance.where(conductance > 0.0) + # Clip negative concentration cells to 0.0 + if (concentration is not None) and not scalar_None(concentration): + grids["concentration"] = concentration.clip(min=0.0) + else: + grids.pop("concentration") + + # Align nodata + return align_nodata(grids) + + +def cleanup_riv( + idomain: GridDataArray, + bottom: GridDataArray, + stage: GridDataArray, + conductance: GridDataArray, + bottom_elevation: GridDataArray, + concentration: Optional[GridDataArray] = None, +) -> dict[str, GridDataArray]: + """ + Clean up river data, fixes some common mistakes causing ValidationErrors by + doing the following: + + - Cells where conductance <= 0 are deactivated. + - Cells where concentration < 0 are set to 0.0. + - Cells outside active domain (idomain==1) are removed. + - Align NoData: If one variable has an inactive cell in one cell, ensure + this cell is deactivated for all variables. + - River bottom elevations below model bottom of a layer are set to model + bottom of that layer. + - River bottom elevations which exceed river stage are lowered to river + stage. + + Parameters + ---------- + idomain: xarray.DataArray | xugrid.UgridDataArray + MODFLOW 6 model domain. idomain==1 is considered active domain. + bottom: xarray.DataArray | xugrid.UgridDataArray + Grid with model bottoms + stage: xarray.DataArray | xugrid.UgridDataArray + Grid with river stages + conductance: xarray.DataArray | xugrid.UgridDataArray + Grid with conductances + bottom_elevation: xarray.DataArray | xugrid.UgridDataArray + Grid with river bottom elevations + concentration: xarray.DataArray | xugrid.UgridDataArray, optional + Optional grid with concentrations + + Returns + ------- + dict[str, xarray.DataArray | xugrid.UgridDataArray] + Dict of cleaned up grids. Has keys: "stage", "conductance", + "bottom_elevation", "concentration". + """ + # Output dict + output_dict = { + "stage": stage, + "conductance": conductance, + "bottom_elevation": bottom_elevation, + "concentration": concentration, + } + output_dict = _cleanup_robin_boundary(idomain, output_dict) + if (output_dict["stage"] < bottom).any(): + raise ValueError( + "River stage below bottom of model layer, cannot fix this. " + "Probably rivers are assigned to the wrong layer, you can reallocate " + "river data to model layers with: " + "``imod.prepare.topsystem.allocate_riv_cells``." + ) + # Ensure bottom elevation above model bottom + output_dict["bottom_elevation"], _ = align_interface_levels( + output_dict["bottom_elevation"], bottom, AlignLevelsMode.BOTTOMUP + ) + # Ensure stage above bottom_elevation + output_dict["stage"], output_dict["bottom_elevation"] = align_interface_levels( + output_dict["stage"], output_dict["bottom_elevation"], AlignLevelsMode.TOPDOWN + ) + return output_dict + + +def cleanup_drn( + idomain: GridDataArray, + elevation: GridDataArray, + conductance: GridDataArray, + concentration: Optional[GridDataArray] = None, +) -> dict[str, GridDataArray]: + """ + Clean up drain data, fixes some common mistakes causing ValidationErrors by + doing the following: + + - Cells where conductance <= 0 are deactivated. + - Cells where concentration < 0 are set to 0.0. 
+ - Cells outside active domain (idomain==1) are removed. + - Align NoData: If one variable has an inactive cell in one cell, ensure + this cell is deactivated for all variables. + + Parameters + ---------- + idomain: xarray.DataArray | xugrid.UgridDataArray + MODFLOW 6 model domain. idomain==1 is considered active domain. + elevation: xarray.DataArray | xugrid.UgridDataArray + Grid with drain elevations + conductance: xarray.DataArray | xugrid.UgridDataArray + Grid with conductances + concentration: xarray.DataArray | xugrid.UgridDataArray, optional + Optional grid with concentrations + + Returns + ------- + dict[str, xarray.DataArray | xugrid.UgridDataArray] + Dict of cleaned up grids. Has keys: "elevation", "conductance", + "concentration". + """ + # Output dict + output_dict = { + "elevation": elevation, + "conductance": conductance, + "concentration": concentration, + } + return _cleanup_robin_boundary(idomain, output_dict) + + +def cleanup_ghb( + idomain: GridDataArray, + head: GridDataArray, + conductance: GridDataArray, + concentration: Optional[GridDataArray] = None, +) -> dict[str, GridDataArray]: + """ + Clean up general head boundary data, fixes some common mistakes causing + ValidationErrors by doing the following: + + - Cells where conductance <= 0 are deactivated. + - Cells where concentration < 0 are set to 0.0. + - Cells outside active domain (idomain==1) are removed. + - Align NoData: If one variable has an inactive cell in one cell, ensure + this cell is deactivated for all variables. + + Parameters + ---------- + idomain: xarray.DataArray | xugrid.UgridDataArray + MODFLOW 6 model domain. idomain==1 is considered active domain. + head: xarray.DataArray | xugrid.UgridDataArray + Grid with heads + conductance: xarray.DataArray | xugrid.UgridDataArray + Grid with conductances + concentration: xarray.DataArray | xugrid.UgridDataArray, optional + Optional grid with concentrations + + Returns + ------- + dict[str, xarray.DataArray | xugrid.UgridDataArray] + Dict of cleaned up grids. Has keys: "head", "conductance", + "concentration". + """ + # Output dict + output_dict = { + "head": head, + "conductance": conductance, + "concentration": concentration, + } + return _cleanup_robin_boundary(idomain, output_dict) + + +def _locate_wells_in_bounds( + wells: pd.DataFrame, top: GridDataArray, bottom: GridDataArray +) -> tuple[pd.DataFrame, pd.Series, pd.Series]: + """ + Locate wells in model bounds, wells outside bounds are dropped. Returned + dataframes and series have well "id" as index. + + Returns + ------- + wells_in_bounds: pd.DataFrame + wells in model boundaries. Has "id" as index. + xy_top_series: pd.Series + model top at well xy location. Has "id" as index. + xy_base_series: pd.Series + model base at well xy location. Has "id" as index. + """ + id_in_bounds, xy_top, xy_bottom, _ = locate_wells( + wells, top, bottom, validate=False + ) + xy_base_model = xy_bottom.isel(layer=-1, drop=True) + + # Assign id as coordinates + xy_top = xy_top.assign_coords(id=("index", id_in_bounds)) + xy_base_model = xy_base_model.assign_coords(id=("index", id_in_bounds)) + # Create pandas dataframes/series with "id" as index. 
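A small, hypothetical example of the shared robin-boundary cleanup used by ``cleanup_drn``, ``cleanup_ghb`` and ``cleanup_riv``: cells with non-positive or missing conductance are dropped and the remaining variables get the same NoData pattern.

>>> import numpy as np
>>> import xarray as xr
>>> import imod
>>> dims = ("layer", "y", "x")
>>> coords = {"layer": [1], "y": [1.5, 0.5], "x": [0.5, 1.5]}
>>> idomain = xr.DataArray(np.ones((1, 2, 2), dtype=int), coords=coords, dims=dims)
>>> conductance = xr.DataArray([[[10.0, -1.0], [5.0, np.nan]]], coords=coords, dims=dims)
>>> elevation = xr.full_like(conductance, 1.0)
>>> cleaned = imod.prepare.cleanup_drn(idomain, elevation, conductance)
>>> sorted(cleaned.keys())
['conductance', 'elevation']
>>> int(np.isnan(cleaned["conductance"]).sum())  # conductance <= 0 and NoData dropped
2
>>> bool(np.isnan(cleaned["elevation"]).equals(np.isnan(cleaned["conductance"])))
True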
+ xy_top_series = xy_top.to_dataframe(name="top").set_index("id")["top"] + xy_base_series = xy_base_model.to_dataframe(name="bottom").set_index("id")["bottom"] + wells_in_bounds = wells.set_index("id").loc[id_in_bounds] + return wells_in_bounds, xy_top_series, xy_base_series + + +def _clip_filter_screen_to_surface_level( + cleaned_wells: pd.DataFrame, xy_top_series: pd.Series +) -> pd.DataFrame: + cleaned_wells["screen_top"] = cleaned_wells["screen_top"].clip(upper=xy_top_series) + return cleaned_wells + + +def _drop_wells_below_model_base( + cleaned_wells: pd.DataFrame, xy_base_series: pd.Series +) -> pd.DataFrame: + is_below_base = cleaned_wells["screen_top"] >= xy_base_series + return cleaned_wells.loc[is_below_base] + + +def _clip_filter_bottom_to_model_base( + cleaned_wells: pd.DataFrame, xy_base_series: pd.Series +) -> pd.DataFrame: + cleaned_wells["screen_bottom"] = cleaned_wells["screen_bottom"].clip( + lower=xy_base_series + ) + return cleaned_wells + + +def _set_inverted_filters_to_point_filters(cleaned_wells: pd.DataFrame) -> pd.DataFrame: + # Convert all filters where screen bottom exceeds screen top to + # point filters + cleaned_wells["screen_bottom"] = cleaned_wells["screen_bottom"].clip( + upper=cleaned_wells["screen_top"] + ) + return cleaned_wells + + +def _set_ultrathin_filters_to_point_filters( + cleaned_wells: pd.DataFrame, minimum_thickness: float +) -> pd.DataFrame: + not_ultrathin_layer = ( + cleaned_wells["screen_top"] - cleaned_wells["screen_bottom"] + ) > minimum_thickness + cleaned_wells["screen_bottom"] = cleaned_wells["screen_bottom"].where( + not_ultrathin_layer, cleaned_wells["screen_top"] + ) + return cleaned_wells + + +def cleanup_wel( + wells: pd.DataFrame, + top: GridDataArray, + bottom: GridDataArray, + minimum_thickness: float = 0.05, +) -> pd.DataFrame: + """ + Clean up dataframe with wells, fixes some common mistakes in the following + order: + + 1. Wells outside grid bounds are dropped + 2. Filters above surface level are set to surface level + 3. Drop wells with filters entirely below base + 4. Clip filter screen_bottom to model base + 5. Clip filter screen_bottom to screen_top + 6. Well filters thinner than minimum thickness are made point filters + + Parameters + ---------- + wells: pandas.Dataframe + Dataframe with wells to be cleaned up. Requires columns ``"x", "y", + "id", "screen_top", "screen_bottom"`` + top: xarray.DataArray | xugrid.UgridDataArray + Grid with model top + bottom: xarray.DataArray | xugrid.UgridDataArray + Grid with model bottoms + minimum_thickness: float + Minimum thickness, filter thinner than this thickness are set to point + filters + + Returns + ------- + pandas.DataFrame + Cleaned well dataframe. 
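A usage sketch for the ``cleanup_wel`` pipeline above (the ``Well.cleanup`` method shown earlier delegates to it), with made-up grids; as in ``Well.cleanup``, the ``top`` argument is the 2D model top. The filter is clipped to the surface level and to the model base:

>>> import numpy as np
>>> import pandas as pd
>>> import xarray as xr
>>> import imod
>>> like = xr.DataArray(
...     np.ones((2, 2)), coords={"y": [1.5, 0.5], "x": [0.5, 1.5]}, dims=("y", "x")
... )
>>> top = xr.DataArray([10.0, 0.0], coords={"layer": [1, 2]}, dims=("layer",)) * like
>>> bottom = xr.DataArray([0.0, -10.0], coords={"layer": [1, 2]}, dims=("layer",)) * like
>>> wells = pd.DataFrame(
...     {
...         "id": ["w1"],
...         "x": [0.5],
...         "y": [1.5],
...         "screen_top": [15.0],      # above the model top of 10.0
...         "screen_bottom": [-20.0],  # below the model base of -10.0
...     }
... )
>>> cleaned = imod.prepare.cleanup_wel(wells, top.isel(layer=0), bottom)
>>> float(cleaned.loc["w1", "screen_top"]), float(cleaned.loc["w1", "screen_bottom"])
(10.0, -10.0)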
+ """ + validate_well_columnnames( + wells, names={"x", "y", "id", "screen_top", "screen_bottom"} + ) + + cleaned_wells, xy_top_series, xy_base_series = _locate_wells_in_bounds( + wells, top, bottom + ) + cleaned_wells = _clip_filter_screen_to_surface_level(cleaned_wells, xy_top_series) + cleaned_wells = _drop_wells_below_model_base(cleaned_wells, xy_base_series) + cleaned_wells = _clip_filter_bottom_to_model_base(cleaned_wells, xy_base_series) + cleaned_wells = _set_inverted_filters_to_point_filters(cleaned_wells) + cleaned_wells = _set_ultrathin_filters_to_point_filters( + cleaned_wells, minimum_thickness + ) + return cleaned_wells diff --git a/imod/prepare/hfb.py b/imod/prepare/hfb.py new file mode 100644 index 000000000..c56af9826 --- /dev/null +++ b/imod/prepare/hfb.py @@ -0,0 +1,219 @@ +from itertools import pairwise +from typing import TYPE_CHECKING, List, Tuple + +from imod.typing import PolygonType +from imod.util.imports import MissingOptionalModule + +if TYPE_CHECKING: + import shapely +else: + try: + import shapely + except ImportError: + shapely = MissingOptionalModule("shapely") + + +def _line_to_square_zpolygon( + x: Tuple[float, float], y: Tuple[float, float], z: Tuple[float, float] +) -> PolygonType: + """ + Creates polygon as follows:: + + xy0,z0 -- xy1,z0 + | | + | | + | | + xy0,z1 -- xy1,z1 + """ + return shapely.Polygon( + ( + (x[0], y[0], z[0]), + (x[0], y[0], z[1]), + (x[1], y[1], z[1]), + (x[1], y[1], z[0]), + ), + ) + + +def linestring_to_square_zpolygons( + barrier_x: List[float], + barrier_y: List[float], + barrier_ztop: List[float], + barrier_zbottom: List[float], +) -> List[PolygonType]: + """ + Create square vertical polygons from linestrings, with a varying ztop and + zbottom over the line. Note: If the lists of x and y values of length N, the + list of z values need to have length N-1. These are shaped as follows:: + + xy0,zt0 -- xy1,zt0 + | | + | xy1,zt1 ---- xy2,zt1 + | | | + xy0,zb0 -- xy1,zb0 | + | | + | | + xy1,zb1 ---- xy2,zb1 + + Parameters + ---------- + barrier_x: list of floats + x-locations of barrier, length N + barrier_y: list of floats + y-locations of barrier, length N + barrier_ztop: list of floats + top of barrier, length N-1 + barrier_zbot: list of floats + bottom of barrier, length N-1 + + Returns + ------- + List of polygons with z dimension. + + Examples + -------- + + >>> x = [-10.0, 0.0, 10.0] + >>> y = [10.0, 0.0, -10.0] + >>> ztop = [10.0, 20.0] + >>> zbot = [-10.0, -20.0] + >>> polygons = linestring_to_square_zpolygons(x, y, ztop, zbot) + + You can use these polygons to construct horizontal flow barriers: + + >>> geometry = gpd.GeoDataFrame(geometry=polygons, data={ + >>> "resistance": [1e3, 1e3], + >>> }, + >>> ) + >>> hfb = imod.mf6.HorizontalFlowBarrierResistance(geometry, print_input) + """ + n = len(barrier_x) + expected_lengths = (n, n, n - 1, n - 1) + actual_lengths = ( + len(barrier_x), + len(barrier_y), + len(barrier_ztop), + len(barrier_zbottom), + ) + if expected_lengths != actual_lengths: + raise ValueError( + "Lengths of barrier data, not properly made. For lengths: (x, y," + f" ztop, zbottom). 
Expected lengths: {expected_lengths}, received" + f" lengths: {actual_lengths}" + ) + + x_pairs = pairwise(barrier_x) + y_pairs = pairwise(barrier_y) + z_pairs = zip(barrier_ztop, barrier_zbottom) + return [ + _line_to_square_zpolygon(x, y, z) for x, y, z in zip(x_pairs, y_pairs, z_pairs) + ] + + +def _line_to_trapezoid_zpolygon( + x: Tuple[float, float], + y: Tuple[float, float], + zt: Tuple[float, float], + zb: Tuple[float, float], +) -> PolygonType: + """ + Creates polygon as follows:: + + xy0,zt0 + | \ + | \ + | xy1,zt1 + | | + | | + | xy1,zb1 + | / + | / + xy0,zb0 + """ + return shapely.Polygon( + ( + (x[0], y[0], zt[0]), + (x[0], y[0], zb[1]), + (x[1], y[1], zt[1]), + (x[1], y[1], zb[0]), + ), + ) + + +def linestring_to_trapezoid_zpolygons( + barrier_x: List[float], + barrier_y: List[float], + barrier_ztop: List[float], + barrier_zbottom: List[float], +) -> List[PolygonType]: + """ + Create trapezoid vertical polygons from linestrings, with a varying ztop and + zbottom over the line. These are shaped as follows:: + + xy0,zt0 xy2,zt2 + | \ / | + | \ / | + | xy1,zt1 | + | | | + | | | + | xy1,zb1 -- xy2,zb2 + | / + | / + xy0,zb0 + + Parameters + ---------- + barrier_x: list of floats + x-locations of barrier, length N + barrier_y: list of floats + y-locations of barrier, length N + barrier_ztop: list of floats + top of barrier, length N + barrier_zbot: list of floats + bottom of barrier, length N + + Returns + ------- + List of polygons with z dimension. + + Examples + -------- + + >>> x = [-10.0, 0.0, 10.0] + >>> y = [10.0, 0.0, -10.0] + >>> ztop = [10.0, 20.0, 15.0] + >>> zbot = [-10.0, -20.0, 0.0] + >>> polygons = linestring_to_trapezoid_zpolygons(x, y, ztop, zbot) + + You can use these polygons to construct horizontal flow barriers: + + >>> geometry = gpd.GeoDataFrame(geometry=polygons, data={ + >>> "resistance": [1e3, 1e3], + >>> }, + >>> ) + >>> hfb = imod.mf6.HorizontalFlowBarrierResistance(geometry, print_input) + """ + + n = len(barrier_x) + expected_lengths = (n, n, n, n) + actual_lengths = ( + len(barrier_x), + len(barrier_y), + len(barrier_ztop), + len(barrier_zbottom), + ) + if expected_lengths != actual_lengths: + raise ValueError( + "Lengths of barrier data, not properly made. For lengths: (x, y," + f" ztop, zbottom). 
Expected lengths: {expected_lengths}, received" + f" lengths: {actual_lengths}" + ) + + x_pairs = pairwise(barrier_x) + y_pairs = pairwise(barrier_y) + zt_pairs = pairwise(barrier_ztop) + zb_pairs = pairwise(barrier_zbottom) + return [ + _line_to_trapezoid_zpolygon(x, y, zt, zb) + for x, y, zt, zb in zip(x_pairs, y_pairs, zt_pairs, zb_pairs) + ] diff --git a/imod/prepare/spatial.py b/imod/prepare/spatial.py index 993f7779d..0982a1ac2 100644 --- a/imod/prepare/spatial.py +++ b/imod/prepare/spatial.py @@ -233,7 +233,7 @@ def rasterize(geodataframe, like, column=None, fill=np.nan, **kwargs): """ if column is not None: - shapes = list(zip(geodataframe.geometry, geodataframe[column])) + shapes = list(zip(geodataframe.geometry, geodataframe[column].astype(dtype=np.float64))) else: shapes = list(geodataframe.geometry) diff --git a/imod/prepare/topsystem/allocation.py b/imod/prepare/topsystem/allocation.py index c3c894116..aa2f570c4 100644 --- a/imod/prepare/topsystem/allocation.py +++ b/imod/prepare/topsystem/allocation.py @@ -314,6 +314,20 @@ def _enforce_layered_top(top: GridDataArray, bottom: GridDataArray): return create_layered_top(bottom, top) +def get_above_lower_bound(bottom_elevation: GridDataArray, top_layered: GridDataArray): + """ + Returns boolean array that indicates cells are above the lower vertical + limit of the topsystem. These are the cells located above the + bottom_elevation grid or in the first layer. + """ + top_layer_label = {"layer": min(top_layered.coords["layer"])} + is_above_lower_bound = bottom_elevation <= top_layered + # Bottom elevation above top surface is allowed, so these are set to True + # regardless. + is_above_lower_bound.loc[top_layer_label] = ~bottom_elevation.isnull() + return is_above_lower_bound + + @enforced_dim_order def _allocate_cells__stage_to_riv_bot( top: GridDataArray, @@ -351,7 +365,9 @@ def _allocate_cells__stage_to_riv_bot( top_layered = _enforce_layered_top(top, bottom) - riv_cells = (stage > bottom) & (bottom_elevation < top_layered) + is_above_lower_bound = get_above_lower_bound(bottom_elevation, top_layered) + is_below_upper_bound = stage > bottom + riv_cells = is_below_upper_bound & is_above_lower_bound return riv_cells, None @@ -394,7 +410,9 @@ def _allocate_cells__first_active_to_elevation( top_layered = _enforce_layered_top(top, bottom) - riv_cells = (upper_active_layer <= layer) & (bottom_elevation < top_layered) + is_above_lower_bound = get_above_lower_bound(bottom_elevation, top_layered) + is_below_upper_bound = upper_active_layer <= layer + riv_cells = is_below_upper_bound & is_above_lower_bound & active return riv_cells, None @@ -442,13 +460,13 @@ def _allocate_cells__stage_to_riv_bot_drn_above( PLANAR_GRID.validate(bottom_elevation) top_layered = _enforce_layered_top(top, bottom) - + is_above_lower_bound = get_above_lower_bound(bottom_elevation, top_layered) upper_active_layer = get_upper_active_layer_number(active) layer = active.coords["layer"] - drn_cells = (upper_active_layer <= layer) & (bottom >= stage) - riv_cells = ( - (upper_active_layer <= layer) & (bottom_elevation < top_layered) - ) != drn_cells + is_below_upper_bound = upper_active_layer <= layer + is_below_upper_bound_and_active = is_below_upper_bound & active + drn_cells = is_below_upper_bound_and_active & (bottom >= stage) + riv_cells = (is_below_upper_bound_and_active & is_above_lower_bound) != drn_cells return riv_cells, drn_cells @@ -482,8 +500,9 @@ def _allocate_cells__at_elevation( PLANAR_GRID.validate(elevation) top_layered = _enforce_layered_top(top, 
bottom) - - riv_cells = (elevation < top_layered) & (elevation >= bottom) + is_above_lower_bound = get_above_lower_bound(elevation, top_layered) + is_below_upper_bound = elevation >= bottom + riv_cells = is_below_upper_bound & is_above_lower_bound return riv_cells, None diff --git a/imod/prepare/topsystem/conductance.py b/imod/prepare/topsystem/conductance.py index f8d5ad92a..788397e06 100644 --- a/imod/prepare/topsystem/conductance.py +++ b/imod/prepare/topsystem/conductance.py @@ -3,7 +3,10 @@ import numpy as np -from imod.prepare.topsystem.allocation import _enforce_layered_top +from imod.prepare.topsystem.allocation import ( + _enforce_layered_top, + get_above_lower_bound, +) from imod.schemata import DimsSchema from imod.typing import GridDataArray from imod.typing.grid import ones_like, preserve_gridtype, zeros_like @@ -348,14 +351,23 @@ def _compute_crosscut_thickness( outside = zeros_like(allocated).astype(bool) if bc_top is not None: - upper_layer_bc = (bc_top < top_layered) & (bc_top > bottom) + top_is_above_lower_bound = get_above_lower_bound(bc_top, top_layered) + upper_layer_bc = top_is_above_lower_bound & (bc_top > bottom) outside = outside | (bc_top < bottom) thickness = thickness.where(~upper_layer_bc, thickness - (top_layered - bc_top)) if bc_bottom is not None: - lower_layer_bc = (bc_bottom < top_layered) & (bc_bottom > bottom) - outside = outside | (bc_bottom > top_layered) - thickness = thickness.where(~lower_layer_bc, thickness - (bc_bottom - bottom)) + bot_is_above_lower_bound = get_above_lower_bound(bc_bottom, top_layered) + lower_layer_bc = bot_is_above_lower_bound & (bc_bottom > bottom) + outside = outside | ~bot_is_above_lower_bound + corrected_thickness = thickness - (bc_bottom - bottom) + # Set top layer to 1.0, where top exceeds bc_bottom + top_layer_label = {"layer": min(top_layered.coords["layer"])} + is_above_surface = top_layered.loc[top_layer_label] < bc_bottom + corrected_thickness.loc[top_layer_label] = corrected_thickness.loc[ + top_layer_label + ].where(is_above_surface, 1.0) + thickness = thickness.where(~lower_layer_bc, corrected_thickness) thickness = thickness.where(~outside, 0.0) @@ -389,19 +401,26 @@ def _distribute_weights__by_corrected_transmissivity( if bc_top is not None: PLANAR_GRID.validate(bc_top) - upper_layer_bc = (bc_top < top_layered) & (bc_top > bottom) + top_is_above_lower_bound = get_above_lower_bound(bc_top, top_layered) + upper_layer_bc = top_is_above_lower_bound & (bc_top > bottom) # Computing vertical midpoint of river crosscutting layers. Fc = Fc.where(~upper_layer_bc, (bottom + bc_top) / 2) if bc_bottom is not None: PLANAR_GRID.validate(bc_bottom) - lower_layer_bc = (bc_bottom < top_layered) & (bc_bottom > bottom) + bot_is_above_lower_bound = get_above_lower_bound(bc_bottom, top_layered) + lower_layer_bc = bot_is_above_lower_bound & (bc_bottom > bottom) # Computing vertical midpoint of river crosscutting layers. Fc = Fc.where(~lower_layer_bc, (top_layered + bc_bottom) / 2) # Correction factor for mismatch between midpoints of crosscut layers and # layer midpoints. F = 1.0 - np.abs(midpoints - Fc) / (layer_thickness * 0.5) + # Negative values can be introduced when elevation above surface level, set + # these to 1.0. 
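A minimal illustration of the helper ``get_above_lower_bound`` introduced in ``allocation.py``, with a hypothetical layered top and a scalar boundary elevation: an elevation above the model top no longer disqualifies the cell but is allocated to the first layer.

>>> import xarray as xr
>>> from imod.prepare.topsystem.allocation import get_above_lower_bound
>>> top_layered = xr.DataArray([10.0, 0.0], coords={"layer": [1, 2]}, dims=("layer",))
>>> bottom_elevation = xr.DataArray(12.0)  # above the model top
>>> get_above_lower_bound(bottom_elevation, top_layered).values.tolist()
[True, False]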
+    top_layer_index = {"layer": min(top_layered.coords["layer"])}
+    F_top_layer = F.loc[top_layer_index]
+    F.loc[top_layer_index] = F_top_layer.where(F_top_layer >= 0.0, 1.0)
     transmissivity_corrected = transmissivity * F
 
     return transmissivity_corrected / transmissivity_corrected.sum(dim="layer")
diff --git a/imod/prepare/topsystem/default_allocation_methods.py b/imod/prepare/topsystem/default_allocation_methods.py
new file mode 100644
index 000000000..227188fc2
--- /dev/null
+++ b/imod/prepare/topsystem/default_allocation_methods.py
@@ -0,0 +1,42 @@
+from dataclasses import dataclass
+
+from imod.prepare.topsystem.allocation import ALLOCATION_OPTION
+from imod.prepare.topsystem.conductance import DISTRIBUTING_OPTION
+
+
+@dataclass()
+class SimulationAllocationOptions:
+    """
+    Object containing allocation options, specified per package type when
+    importing from imod5. Can be used to set defaults when importing a
+    simulation or a GroundwaterFlowModel from imod5.
+
+    Parameters
+    ----------
+    drn: allocation option to be used for drainage packages
+    riv: allocation option to be used for river packages
+
+    """
+
+    drn: ALLOCATION_OPTION = ALLOCATION_OPTION.first_active_to_elevation
+    riv: ALLOCATION_OPTION = ALLOCATION_OPTION.stage_to_riv_bot
+    ghb: ALLOCATION_OPTION = ALLOCATION_OPTION.at_elevation
+
+
+@dataclass()
+class SimulationDistributingOptions:
+    """
+    Object containing conductivity distribution methods, specified per package
+    type. Can be used to set defaults when importing a simulation or a
+    GroundwaterFlowModel from imod5.
+
+    Parameters
+    ----------
+    drn: distribution option to be used for drainage packages
+    riv: distribution option to be used for river packages
+
+    """
+
+    drn: DISTRIBUTING_OPTION = DISTRIBUTING_OPTION.by_corrected_transmissivity
+    riv: DISTRIBUTING_OPTION = DISTRIBUTING_OPTION.by_corrected_transmissivity
+    ghb: DISTRIBUTING_OPTION = DISTRIBUTING_OPTION.by_layer_transmissivity
diff --git a/imod/prepare/wells.py b/imod/prepare/wells.py
index 7c69e2cb4..bb1f00eaf 100644
--- a/imod/prepare/wells.py
+++ b/imod/prepare/wells.py
@@ -2,17 +2,21 @@
 Assign wells to layers.
 """
 
-from typing import Optional, Union
+from typing import Optional
 
 import numpy as np
+import numpy.typing as npt
 import pandas as pd
 import xarray as xr
 import xugrid as xu
 
 import imod
+from imod.typing import GridDataArray
 
 
-def vectorized_overlap(bounds_a, bounds_b):
+def compute_vectorized_overlap(
+    bounds_a: npt.NDArray[np.float64], bounds_b: npt.NDArray[np.float64]
+) -> npt.NDArray[np.float64]:
     """
     Vectorized overlap computation.
 
     Compare with:
@@ -25,30 +29,66 @@ def vectorized_overlap(bounds_a, bounds_b):
     )
 
 
-def compute_overlap(wells, top, bottom):
-    # layer bounds shape of (n_well, n_layer, 2)
-    layer_bounds = np.stack((bottom, top), axis=-1)
+def compute_point_filter_overlap(
+    bounds_wells: npt.NDArray[np.float64], bounds_layers: npt.NDArray[np.float64]
+) -> npt.NDArray[np.float64]:
+    """
+    Special case for filters with zero filter length: these are set to the
+    layer thickness. Filters which are not in a layer or have a nonzero filter
+    length are set to zero overlap.
+    """
+    # Unwrap for readability
+    wells_top = bounds_wells[:, 1]
+    wells_bottom = bounds_wells[:, 0]
+    layers_top = bounds_layers[:, 1]
+    layers_bottom = bounds_layers[:, 0]
+
+    has_zero_filter_length = wells_top == wells_bottom
+    in_layer = (layers_top >= wells_top) & (layers_bottom < wells_bottom)
+    layer_thickness = layers_top - layers_bottom
+    # Multiplication to set any elements not meeting the criteria to zero.
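A short sketch of how the default-option dataclasses above can be used (import paths as in the new module); overriding a single field keeps the other package-type defaults.

>>> from imod.prepare.topsystem.allocation import ALLOCATION_OPTION
>>> from imod.prepare.topsystem.default_allocation_methods import (
...     SimulationAllocationOptions,
... )
>>> options = SimulationAllocationOptions(riv=ALLOCATION_OPTION.at_elevation)
>>> options.drn == ALLOCATION_OPTION.first_active_to_elevation
True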
+ point_filter_overlap = ( + has_zero_filter_length.astype(float) * in_layer.astype(float) * layer_thickness + ) + return point_filter_overlap + + +def compute_overlap( + wells: pd.DataFrame, top: GridDataArray, bottom: GridDataArray +) -> npt.NDArray[np.float64]: + # layer bounds stack shape of (n_well, n_layer, 2) + layer_bounds_stack = np.stack((bottom, top), axis=-1) well_bounds = np.broadcast_to( np.stack( (wells["bottom"].to_numpy(), wells["top"].to_numpy()), axis=-1, )[np.newaxis, :, :], - layer_bounds.shape, + layer_bounds_stack.shape, + ).reshape(-1, 2) + layer_bounds = layer_bounds_stack.reshape(-1, 2) + + # Deal with filters with a nonzero length + interval_filter_overlap = compute_vectorized_overlap( + well_bounds, + layer_bounds, ) - overlap = vectorized_overlap( - well_bounds.reshape((-1, 2)), - layer_bounds.reshape((-1, 2)), + # Deal with filters with zero length + point_filter_overlap = compute_point_filter_overlap( + well_bounds, + layer_bounds, ) - return overlap + return np.maximum(interval_filter_overlap, point_filter_overlap) def locate_wells( wells: pd.DataFrame, - top: Union[xr.DataArray, xu.UgridDataArray], - bottom: Union[xr.DataArray, xu.UgridDataArray], - k: Optional[Union[xr.DataArray, xu.UgridDataArray]], + top: GridDataArray, + bottom: GridDataArray, + k: Optional[GridDataArray] = None, validate: bool = True, -): +) -> tuple[ + npt.NDArray[np.object_], GridDataArray, GridDataArray, float | GridDataArray +]: if not isinstance(top, (xu.UgridDataArray, xr.DataArray)): raise TypeError( "top and bottom should be DataArray or UgridDataArray, received: " @@ -56,7 +96,7 @@ def locate_wells( ) # Default to a xy_k value of 1.0: weigh every layer equally. - xy_k = 1.0 + xy_k: float | GridDataArray = 1.0 first = wells.groupby("id").first() x = first["x"].to_numpy() y = first["y"].to_numpy() @@ -86,11 +126,27 @@ def locate_wells( return id_in_bounds, xy_top, xy_bottom, xy_k +def validate_well_columnnames( + wells: pd.DataFrame, names: set = {"x", "y", "id"} +) -> None: + missing = names.difference(wells.columns) + if missing: + raise ValueError(f"Columns are missing in wells dataframe: {missing}") + + +def validate_arg_types_equal(**kwargs): + types = [type(arg) for arg in (kwargs.values()) if arg is not None] + if len(set(types)) != 1: + members = ", ".join([t.__name__ for t in types]) + names = ", ".join(kwargs.keys()) + raise TypeError(f"{names} should be of the same type, received: {members}") + + def assign_wells( wells: pd.DataFrame, - top: Union[xr.DataArray, xu.UgridDataArray], - bottom: Union[xr.DataArray, xu.UgridDataArray], - k: Optional[Union[xr.DataArray, xu.UgridDataArray]] = None, + top: GridDataArray, + bottom: GridDataArray, + k: Optional[GridDataArray] = None, minimum_thickness: Optional[float] = 0.05, minimum_k: Optional[float] = 1.0, validate: bool = True, @@ -101,17 +157,19 @@ def assign_wells( thickness and minimum k should be set to avoid placing wells in clay layers. - Wells located outside of the grid are removed. + Wells where well screen_top equals screen_bottom are assigned to the layer + they are located in, without any subdivision. Wells located outside of the + grid are removed. Parameters ---------- - wells: pd.DataFrame + wells: pandas.DataFrame Should contain columns x, y, id, top, bottom, rate. - top: xr.DataArray or xu.UgridDataArray + top: xarray.DataArray or xugrid.UgridDataArray Top of the model layers. - bottom: xr.DataArray or xu.UgridDataArray + bottom: xarray.DataArray or xugrid.UgridDataArray Bottom of the model layers. 
- k: xr.DataArray or xu.UgridDataArray, optional + k: xarray.DataArray or xugrid.UgridDataArray, optional Horizontal conductivity of the model layers. minimum_thickness: float, optional, default: 0.01 minimum_k: float, optional, default: 1.0 @@ -124,19 +182,9 @@ def assign_wells( Wells with rate subdivided per layer. Contains the original columns of ``wells``, as well as layer, overlap, transmissivity. """ - - names = {"x", "y", "id", "top", "bottom", "rate"} - missing = names.difference(wells.columns) - if missing: - raise ValueError(f"Columns are missing in wells dataframe: {missing}") - - types = [type(arg) for arg in (top, bottom, k) if arg is not None] - if len(set(types)) != 1: - members = ",".join([t.__name__ for t in types]) - raise TypeError( - "top, bottom, and optionally k should be of the same type, " - f"received: {members}" - ) + columnnames = {"x", "y", "id", "top", "bottom", "rate"} + validate_well_columnnames(wells, columnnames) + validate_arg_types_equal(top=top, bottom=bottom, k=k) id_in_bounds, xy_top, xy_bottom, xy_k = locate_wells( wells, top, bottom, k, validate @@ -145,36 +193,40 @@ def assign_wells( first = wells_in_bounds.groupby("id").first() overlap = compute_overlap(first, xy_top, xy_bottom) - if k is None: - k = 1.0 + if isinstance(xy_k, (xr.DataArray, xu.UgridDataArray)): + k_for_df = xy_k.values.ravel() else: - k = xy_k.values.ravel() + k_for_df = xy_k # Distribute rate according to transmissivity. n_layer, n_well = xy_top.shape - df = pd.DataFrame( + df_factor = pd.DataFrame( index=pd.Index(np.tile(first.index, n_layer), name="id"), data={ "layer": np.repeat(top["layer"], n_well), "overlap": overlap, - "k": k, - "transmissivity": overlap * k, + "k": k_for_df, + "transmissivity": overlap * k_for_df, }, ) # remove entries # -in very thin layers or when the wellbore penetrates the layer very little # -in low conductivity layers - df = df.loc[(df["overlap"] >= minimum_thickness) & (df["k"] >= minimum_k)] - df["rate"] = df["transmissivity"] / df.groupby("id")["transmissivity"].transform( - "sum" - ) + df_factor = df_factor.loc[ + (df_factor["overlap"] >= minimum_thickness) & (df_factor["k"] >= minimum_k) + ] + df_factor["rate"] = df_factor["transmissivity"] / df_factor.groupby("id")[ + "transmissivity" + ].transform("sum") # Create a unique index for every id-layer combination. - df["index"] = np.arange(len(df)) - df = df.reset_index() + df_factor["index"] = np.arange(len(df_factor)) + df_factor = df_factor.reset_index() # Get rid of those that are removed because of minimum thickness or # transmissivity. - wells_in_bounds = wells_in_bounds.loc[wells_in_bounds["id"].isin(df["id"].unique())] + wells_in_bounds = wells_in_bounds.loc[ + wells_in_bounds["id"].isin(df_factor["id"].unique()) + ] # Use pandas multi-index broadcasting. # Maintain all other columns as-is. @@ -182,7 +234,7 @@ def assign_wells( wells_in_bounds["overlap"] = 1.0 wells_in_bounds["k"] = 1.0 wells_in_bounds["transmissivity"] = 1.0 - columns = list(set(wells_in_bounds.columns).difference(df.columns)) + columns = list(set(wells_in_bounds.columns).difference(df_factor.columns)) indexes = ["id"] for dim in ["species", "time"]: @@ -190,9 +242,9 @@ def assign_wells( indexes.append(dim) columns.remove(dim) - df[columns] = 1 # N.B. integer! + df_factor[columns] = 1 # N.B. integer! 
assigned = ( - wells_in_bounds.set_index(indexes) * df.set_index(["id", "layer"]) + wells_in_bounds.set_index(indexes) * df_factor.set_index(["id", "layer"]) ).reset_index() return assigned diff --git a/imod/schemata.py b/imod/schemata.py index 73c598ad3..fd9d8d737 100644 --- a/imod/schemata.py +++ b/imod/schemata.py @@ -74,7 +74,7 @@ def scalar_None(obj): if not isinstance(obj, (xr.DataArray, xu.UgridDataArray)): return False else: - return (len(obj.shape) == 0) & (~obj.notnull()).all() + return (len(obj.shape) == 0) and (obj.isnull()).all() def align_other_obj_with_coords( @@ -503,6 +503,36 @@ def validate(self, obj: GridDataArray, **kwargs) -> None: ) +class MaxNUniqueValuesSchema(BaseSchema): + """ + Fails if amount of unique values exceeds a limit. + """ + + def __init__(self, max_unique_values: int): + self.max_unique_values = max_unique_values + + def validate(self, obj: GridDataArray, **kwargs) -> None: + if len(np.unique(obj)) > self.max_unique_values: + raise ValidationError( + f"Amount of unique values exceeds limit of {self.max_unique_values}" + ) + + +class UniqueValuesSchema(BaseSchema): + def __init__(self, other_value: list) -> None: + """ + Validate if unique values in other values list + """ + self.other_value = other_value + + def validate(self, obj: GridDataArray, **kwargs) -> None: + unique_values = np.unique(obj) + if not np.all(np.isin(unique_values, self.other_value)): + raise ValidationError( + f"Unique values not matching: {self.other_value}, got unique values: {unique_values}" + ) + + def _notnull(obj): """ Helper function; does the same as xr.DataArray.notnull. This function is to diff --git a/imod/select/points.py b/imod/select/points.py index 033d54abd..dba479a84 100644 --- a/imod/select/points.py +++ b/imod/select/points.py @@ -1,13 +1,16 @@ import warnings +from typing import Any import numpy as np +import numpy.typing as npt import xarray as xr import xugrid as xu import imod +from imod.typing import GridDataArray -def get_unstructured_cell2d_from_xy(uda, **points): +def get_unstructured_cell2d_from_xy(uda: xu.UgridDataArray, **points) -> npt.NDArray: # Unstructured grids always require to be tested both on x and y coordinates # to see if points are within bounds. for coord in ["x", "y"]: @@ -21,7 +24,7 @@ def get_unstructured_cell2d_from_xy(uda, **points): return uda.ugrid.grid.locate_points(xy) -def __check_and_get_points_shape(points) -> dict: +def __check_and_get_points_shape(points: dict) -> dict: """Check whether points have the right shape""" shapes = {} for coord, value in points.items(): @@ -36,13 +39,13 @@ def __check_and_get_points_shape(points) -> dict: return shapes -def __check_point_shapes_consistency(shapes): +def __check_point_shapes_consistency(shapes: dict): if not len(set(shapes.values())) == 1: msg = "\n".join([f"{coord}: {shape}" for coord, shape in shapes.items()]) raise ValueError(f"Shapes of coordinates do match each other:\n{msg}") -def _check_points(points): +def _check_points(points: dict): """ Check whether the array with points has the right and consistent shape. """ @@ -51,7 +54,7 @@ def _check_points(points): __check_point_shapes_consistency(shapes) -def __arr_like_points(points, fill_value): +def __arr_like_points(points: dict, fill_value: Any) -> npt.NDArray: """ Return array with the same shape as the first array provided in points. 
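An illustrative call to the refactored ``assign_wells`` with made-up grids and wells: a filter that spans two layers is split by transmissivity, while a zero-length filter (screen top equal to screen bottom) is assigned entirely to the layer that contains it.

>>> import numpy as np
>>> import pandas as pd
>>> import xarray as xr
>>> import imod
>>> like = xr.DataArray(
...     np.ones((2, 2)), coords={"y": [1.5, 0.5], "x": [0.5, 1.5]}, dims=("y", "x")
... )
>>> top = xr.DataArray([10.0, 0.0], coords={"layer": [1, 2]}, dims=("layer",)) * like
>>> bottom = xr.DataArray([0.0, -10.0], coords={"layer": [1, 2]}, dims=("layer",)) * like
>>> wells = pd.DataFrame(
...     {
...         "id": ["interval", "point"],
...         "x": [0.5, 1.5],
...         "y": [1.5, 0.5],
...         "top": [5.0, -5.0],
...         "bottom": [-5.0, -5.0],  # second filter has zero length
...         "rate": [1.0, 1.0],
...     }
... )
>>> assigned = imod.prepare.assign_wells(wells, top, bottom)
>>> sorted(assigned.loc[assigned["id"] == "interval", "layer"].astype(int).tolist())
[1, 2]
>>> assigned.loc[assigned["id"] == "point", "layer"].astype(int).tolist()
[2]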
""" @@ -61,7 +64,7 @@ def __arr_like_points(points, fill_value): return np.full(shape, fill_value) -def points_in_bounds(da, **points): +def points_in_bounds(da: GridDataArray, **points) -> npt.NDArray[np.bool]: """ Returns whether points specified by keyword arguments fall within the bounds of ``da``. @@ -117,7 +120,7 @@ def points_in_bounds(da, **points): return in_bounds -def check_points_in_bounds(da, points, out_of_bounds): +def check_points_in_bounds(da: GridDataArray, points: dict, out_of_bounds: str): inside = points_in_bounds(da, **points) # Error handling msg = "Not all points are located within the bounds of the DataArray" @@ -137,7 +140,9 @@ def check_points_in_bounds(da, points, out_of_bounds): return points, inside -def _get_indices_1d(da, coordname, x): +def _get_indices_1d( + da: xr.DataArray, coordname: str, x: npt.NDArray[np.floating] +) -> npt.NDArray[np.intp]: x = np.atleast_1d(x) x_decreasing = da.indexes[coordname].is_monotonic_decreasing dx, xmin, _ = imod.util.spatial.coord_reference(da.coords[coordname]) @@ -172,7 +177,9 @@ def _get_indices_1d(da, coordname, x): return ixs -def points_indices(da, out_of_bounds="raise", **points): +def points_indices( + da: GridDataArray, out_of_bounds: str = "raise", **points +) -> dict[str, xr.DataArray]: """ Get the indices for points as defined by the arrays x and y. @@ -184,7 +191,7 @@ def points_indices(da, out_of_bounds="raise", **points): Parameters ---------- - da : xr.DataArray + da : xarray.DataArray | xu.UgridDataArray out_of_bounds : {"raise", "warn", "ignore"}, default: "raise" What to do if the points are not located in the bounds of the DataArray: @@ -246,7 +253,7 @@ def points_indices(da, out_of_bounds="raise", **points): return indices -def points_values(da, out_of_bounds="raise", **points): +def points_values(da: GridDataArray, out_of_bounds="raise", **points) -> GridDataArray: """ Get values from specified points. @@ -287,12 +294,17 @@ def points_values(da, out_of_bounds="raise", **points): indices = imod.select.points.points_indices( da, out_of_bounds=out_of_bounds, **iterable_points ) - selection = da.isel(**indices) + selection = da.isel(indexers=indices) return selection -def points_set_values(da, values, out_of_bounds="raise", **points): +def points_set_values( + da: GridDataArray, + values: int | float | npt.NDArray[np.number], + out_of_bounds: str = "raise", + **points, +): """ Set values at specified points. 
diff --git a/imod/tests/conftest.py b/imod/tests/conftest.py index 80abb3889..595741876 100644 --- a/imod/tests/conftest.py +++ b/imod/tests/conftest.py @@ -1,5 +1,9 @@ import pytest +from .fixtures.backward_compatibility_fixture import ( + imod5_dataset, + imod5_dataset_periods, +) from .fixtures.flow_basic_fixture import ( basic_dis, basic_dis__topsystem, @@ -18,6 +22,12 @@ ) from .fixtures.flow_example_fixture import imodflow_model from .fixtures.flow_transport_simulation_fixture import flow_transport_simulation +from .fixtures.imod5_well_data import ( + well_duplication_import_prj, + well_mixed_ipfs, + well_out_of_bounds_ipfs, + well_regular_import_prj, +) from .fixtures.mf6_circle_fixture import ( circle_model, circle_model_evt, diff --git a/imod/tests/fixtures/backward_compatibility_fixture.py b/imod/tests/fixtures/backward_compatibility_fixture.py new file mode 100644 index 000000000..b62b9193a --- /dev/null +++ b/imod/tests/fixtures/backward_compatibility_fixture.py @@ -0,0 +1,522 @@ +from datetime import datetime +from zipfile import ZipFile + +import pytest +import xarray as xr + +import imod +from imod.data.sample_data import REGISTRY +from imod.formats.prj.prj import open_projectfile_data + + +@pytest.fixture(scope="module") +def imod5_dataset(): + tmp_path = imod.util.temporary_directory() + data = imod.data.imod5_projectfile_data(tmp_path) + + pd = data[1] + data = data[0] + + _load_imod5_data_in_memory(data) + + # Fix data for ibound as it contains floating values like 0.34, 0.25 etc. + ibound = data["bnd"]["ibound"] + ibound = ibound.where(ibound <= 0, 1) + data["bnd"]["ibound"] = ibound + return data, pd + + +def _load_imod5_data_in_memory(imod5_data): + """For debugging purposes, load everything in memory""" + for pkg in imod5_data.values(): + for vardata in pkg.values(): + if isinstance(vardata, xr.DataArray): + vardata.load() + + +@pytest.fixture(scope="module") +def imod5_dataset_periods() -> tuple[dict[str, any], dict[str, list[datetime]]]: + tmp_path = imod.util.temporary_directory() + fname_model = REGISTRY.fetch("iMOD5_model.zip") + + with ZipFile(fname_model) as archive: + archive.extractall(tmp_path) + + with open(tmp_path / "iMOD5_model_pooch" / "iMOD5_model.prj", "w") as f: + f.write(period_prj) + + data = open_projectfile_data(tmp_path / "iMOD5_model_pooch" / "iMOD5_model.prj") + + grid_data = data[0] + period_data = data[1] + + _load_imod5_data_in_memory(grid_data) + + # Fix data for ibound as it contains floating values like 0.34, 0.25 etc. 
+ ibound = grid_data["bnd"]["ibound"] + ibound = ibound.where(ibound <= 0, 1) + grid_data["bnd"]["ibound"] = ibound + return grid_data, period_data + + +period_prj = """\ +0001,(BND),1, Boundary Condition +001,37 +1,2,1,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L1.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,2,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L2.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,3,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L3.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,4,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L4.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,5,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L5.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,6,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L6.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,7,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L7.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,8,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L8.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,9,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L9.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,10,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L10.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,11,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L11.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,12,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L12.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,13,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L13.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,14,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L14.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,15,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L15.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,16,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L16.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,17,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L17.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,18,1.0,0.0,-999.99, '.\Database\BND\VERSION_1\IBOUND_L18.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,19,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L19.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,20,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L20.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,21,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L21.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,22,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L22.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,23,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L23.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,24,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L24.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,25,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L25.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,26,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L26.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,27,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L27.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,28,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L28.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,29,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L29.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,30,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L30.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,31,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L31.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,32,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L32.IDF' >>> (BND) Boundary Settings (IDF) <<< 
+1,2,33,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L33.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,34,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L34.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,35,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L35.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,36,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L36.IDF' >>> (BND) Boundary Settings (IDF) <<< +1,2,37,1.0,0.0,-999.99,'.\Database\BND\VERSION_1\IBOUND_L37.IDF' >>> (BND) Boundary Settings (IDF) <<< + + +0001,(TOP),1, Top Elevation +001,37 +1,2,1,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L1.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,2,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L2.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,3,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L3.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,4,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L4.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,5,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L5.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,6,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L6.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,7,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L7.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,8,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L8.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,9,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L9.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,10,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L10.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,11,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L11.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,12,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L12.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,13,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L13.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,14,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L14.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,15,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L15.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,16,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L16.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,17,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L17.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,18,1.0,0.0,-999.99, '.\Database\TOP\VERSION_1\TOP_L18.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,19,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L19.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,20,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L20.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,21,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L21.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,22,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L22.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,23,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L23.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,24,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L24.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,25,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L25.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,26,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L26.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,27,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L27.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,28,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L28.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,29,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L29.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,30,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L30.IDF' 
>>> (TOP) Top of Modellayer (IDF) <<< +1,2,31,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L31.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,32,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L32.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,33,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L33.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,34,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L34.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,35,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L35.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,36,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L36.IDF' >>> (TOP) Top of Modellayer (IDF) <<< +1,2,37,1.0,0.0,-999.99,'.\Database\TOP\VERSION_1\TOP_L37.IDF' >>> (TOP) Top of Modellayer (IDF) <<< + + +0001,(BOT),1, Bottom Elevation +001,37 +1,2,1,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L1.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,2,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L2.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,3,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L3.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,4,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L4.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,5,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L5.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,6,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L6.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,7,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L7.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,8,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L8.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,9,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L9.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,10,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L10.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,11,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L11.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,12,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L12.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,13,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L13.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,14,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L14.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,15,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L15.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,16,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L16.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,17,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L17.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,18,1.0,0.0,-999.99, '.\Database\BOT\VERSION_1\BOT_L18.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,19,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L19.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,20,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L20.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,21,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L21.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,22,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L22.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,23,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L23.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,24,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L24.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,25,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L25.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,26,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L26.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< 
+1,2,27,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L27.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,28,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L28.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,29,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L29.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,30,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L30.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,31,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L31.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,32,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L32.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,33,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L33.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,34,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L34.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,35,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L35.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,36,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L36.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< +1,2,37,1.0,0.0,-999.99,'.\Database\BOT\VERSION_1\BOT_L37.IDF' >>> (BOT) Bottom of Modellayer (IDF) <<< + + +0001,(KHV),1, Horizontal Permeability +001,37 +1,1,1,1.0,0.0,1.0 >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,2,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L2.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,3,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L3.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,4,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L4.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,5,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L5.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,6,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L6.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,7,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L7.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,8,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L8.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,9,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L9.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,10,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L10.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,11,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L11.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,12,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L12.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,13,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L13.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,14,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L14.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,15,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L15.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,16,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L16.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,17,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L17.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,18,1.0,0.0,-999.99, '.\Database\KHV\VERSION_1\IPEST_KHV_L18.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,19,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L19.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,20,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L20.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,21,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L21.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< 
+1,2,22,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L22.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,23,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L23.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,24,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L24.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,25,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L25.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,26,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L26.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,27,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L27.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,28,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L28.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,29,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L29.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,30,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L30.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,1,31,1.0,0.0,1.0 >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,32,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L32.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,33,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L33.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,34,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L34.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,35,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L35.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,36,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L36.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< +1,2,37,1.0,0.0,-999.99,'.\Database\KHV\VERSION_1\IPEST_KHV_L37.IDF' >>> (KHV) Horizontal Permeability (IDF) <<< + + +0001,(KVA),1, Vertical Anisotropy +001,37 +1,1,1,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,2,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,3,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,4,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,5,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,6,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,7,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,8,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,9,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,10,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,11,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,12,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,13,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,14,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,15,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,16,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,17,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,18,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,19,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,20,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,21,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,22,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,23,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,24,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,25,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,26,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,27,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,28,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy 
(IDF) <<< +1,1,29,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,30,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,31,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,32,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,33,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,34,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,35,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,36,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,37,1.0,0.0,0.3,'' >>> (KVA) Vertical Anisotropy (IDF) <<< + +0001,(SHD),1, Starting Heads +001,37 +1,2,1,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L1.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,2,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L2.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,3,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L3.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,4,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L4.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,5,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L5.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,6,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L6.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,7,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L7.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,8,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L8.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,9,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L9.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,10,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L10.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,11,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L11.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,12,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L12.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,13,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L13.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,14,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L14.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,15,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L15.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,16,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L16.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,17,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L17.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,18,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L18.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,19,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L19.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,20,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L20.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,21,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L21.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,22,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L22.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,23,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L23.IDF' >>> (SHD) Starting Heads (IDF) <<< 
+1,1,24,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,25,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,26,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,27,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,28,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,29,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,30,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,31,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,32,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,33,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,34,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,35,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,36,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< +1,1,37,1.0,0.0,20.5, >>> (SHD) Starting Heads (IDF) <<< + + +0001,(ANI),0, Anisotropy +002,04 +1,2,2,1.0,0.0,-999.99,'.\Database\ANI\VERSION_1\ANI_FACTOR.IDF' >>> (FCT) Factor (IDF) <<< +1,2,4,1.0,0.0,-999.99,'.\Database\ANI\VERSION_1\ANI_FACTOR.IDF' >>> (FCT) Factor (IDF) <<< +1,2,6,1.0,0.0,-999.99,'.\Database\ANI\VERSION_1\ANI_FACTOR.IDF' >>> (FCT) Factor (IDF) <<< +1,2,8,1.0,0.0,-999.99,'.\Database\ANI\VERSION_1\ANI_FACTOR.IDF' >>> (FCT) Factor (IDF) <<< +1,2,2,1.0,0.0,-999.99,'.\Database\ANI\VERSION_1\ANI_HOEK.IDF' >>> (FCT) Factor (IDF) <<< +1,2,4,1.0,0.0,-999.99,'.\Database\ANI\VERSION_1\ANI_HOEK.IDF' >>> (FCT) Factor (IDF) <<< +1,2,6,1.0,0.0,-999.99,'.\Database\ANI\VERSION_1\ANI_HOEK.IDF' >>> (FCT) Factor (IDF) <<< +1,2,8,1.0,0.0,-999.99,'.\Database\ANI\VERSION_1\ANI_HOEK.IDF' >>> (ANG) Angle (IDF) <<< + + +0001,(STO),1, Storage +001,37 +1,1,1,1.0,0.0,0.15,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,2,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,3,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,4,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,5,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,6,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,7,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,8,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,9,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,10,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,11,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,12,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,13,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,14,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,15,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,16,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,17,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,18,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,19,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,20,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,21,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,22,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,23,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,24,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,25,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,26,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,27,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,28,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,29,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,30,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical 
Anisotropy (IDF) <<< +1,1,31,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,32,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,33,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,34,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,35,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,36,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< +1,1,37,1.0,0.0,1.0e-5,'' >>> (KVA) Vertical Anisotropy (IDF) <<< + +0001,(HFB),1, Horizontal Flow Barrier +001,26 + 1,2, 003, 1.000000 , 10.00000 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_BX.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 005, 1.000000 , 1000.000 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_SY.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 007, 1.000000 , 1000.000 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_SY.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 009, 1.000000 , 1000.000 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_SY.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 011, 1.000000 , 1000.000 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_SY.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 013, 1.000000 , 1000.000 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_SY.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 015, 1.000000 , 101000.0 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 021, 1.000000 , 400.0000 , -999.9900 , '.\Database\HFB\VERSION_1\IBV2_BREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 023, 1.000000 , 101000.0 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 023, 1.000000 , 400.0000 , -999.9900 , '.\Database\HFB\VERSION_1\IBV2_BREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 025, 1.000000 , 101000.0 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 025, 1.000000 , 400.0000 , -999.9900 , '.\Database\HFB\VERSION_1\IBV2_BREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 027, 1.000000 , 101000.0 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 027, 1.000000 , 400.0000 , -999.9900 , '.\Database\HFB\VERSION_1\IBV2_BREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 029, 1.000000 , 101000.0 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 031, 1.000000 , 400.0000 , -999.9900 , '.\Database\HFB\VERSION_1\IBV2_BREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 031, 1.000000 , 101000.0 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 031, 1.000000 , 400.0000 , -999.9900 , '.\Database\HFB\VERSION_1\IBV2_BREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 033, 1.000000 , 101000.0 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 033, 1.000000 , 101000.0 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 033, 1.000000 , 400.0000 , -999.9900 , '.\Database\HFB\VERSION_1\IBV2_BREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 033, 1.000000 , 400.0000 , -999.9900 , 
'.\Database\HFB\VERSION_1\IBV2_BREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 035, 1.000000 , 101000.0 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 035, 1.000000 , 400.0000 , -999.9900 , '.\Database\HFB\VERSION_1\IBV2_BREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 037, 1.000000 , 101000.0 , -999.9900 ,'.\Database\HFB\VERSION_1\IBV2_HOOFDBREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + 1,2, 037, 1.000000 , 400.0000 , -999.9900 , '.\Database\HFB\VERSION_1\IBV2_BREUKEN_BR.GEN' >>> (HFB) Horizontal Barrier Flow (GEN) <<< + +0002,(RIV),1, Rivers +winter +004,003 +1,2,0,1.0,0.0,-999.99,'.\Database\RIV\VERSION_1\RIVER_PRIMAIR\IPEST_RIVER_PRIMAIR_COND_GEMIDDELD.IDF' >>> (CON) Conductance (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\MAAS\IPEST_COND19912011.IDF' >>> (CON) Conductance (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\BELGIE\COND_CAT012.IDF' >>> (CON) Conductance (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\RIVER_PRIMAIR\RIVER_PRIMAIR_STAGE_GEMIDDELD.IDF' >>> (RST) River Stage (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\MAAS\STAGE19912011.IDF' >>> (RST) River Stage (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\BELGIE\STAGE_CAT012.IDF' >>> (RST) River Stage (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\RIVER_PRIMAIR\RIVER_PRIMAIR_BOTTOM_GEMIDDELD.IDF' >>> (RBT) River Bottom (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\MAAS\BOTTOM19912011.IDF' >>> (RBT) River Bottom (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\BELGIE\BOT_CAT012.IDF' >>> (RBT) River Bottom (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\RIVER_PRIMAIR\RIVER_PRIMAIR_INFFCT_GEMIDDELD.IDF' >>> (RIF) Infiltration Factor (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\MAAS\INFFCT19912011.IDF' >>> (RIF) Infiltration Factor (IDF) <<< +1,1,0,1.0,0.0,1.0, '' >>> (RIF) Infiltration Factor (IDF) <<< +summer +004,003 +1,2,0,1.0,0.0,-999.99,'.\Database\RIV\VERSION_1\RIVER_PRIMAIR\IPEST_RIVER_PRIMAIR_COND_GEMIDDELD.IDF' >>> (CON) Conductance (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\MAAS\IPEST_COND19912011.IDF' >>> (CON) Conductance (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\BELGIE\COND_CAT012.IDF' >>> (CON) Conductance (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\RIVER_PRIMAIR\RIVER_PRIMAIR_STAGE_GEMIDDELD.IDF' >>> (RST) River Stage (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\MAAS\STAGE19912011.IDF' >>> (RST) River Stage (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\BELGIE\STAGE_CAT012.IDF' >>> (RST) River Stage (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\RIVER_PRIMAIR\RIVER_PRIMAIR_BOTTOM_GEMIDDELD.IDF' >>> (RBT) River Bottom (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\MAAS\BOTTOM19912011.IDF' >>> (RBT) River Bottom (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\BELGIE\BOT_CAT012.IDF' >>> (RBT) River Bottom (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\RIVER_PRIMAIR\RIVER_PRIMAIR_INFFCT_GEMIDDELD.IDF' >>> (RIF) Infiltration Factor (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\RIV\VERSION_1\MAAS\INFFCT19912011.IDF' >>> (RIF) Infiltration Factor (IDF) <<< +1,1,0,1.0,0.0,1.0, '' >>> (RIF) Infiltration Factor (IDF) <<< + + +0001,(RCH),1, Recharge +STEADY-STATE +001,001 
+1,2,1,1.0,0.0,-999.99,'.\Database\RCH\VERSION_1\GWAANVULLING_MEAN_19940114-20111231.IDF' >>> (RCH) Recharge Rate (IDF) <<< + +0001,(WEL),1, Wells +STEADY-STATE +001,003 +1,2,5,1.0,0.0,-999.99, '.\Database\WEL\VERSION_1\WELLS_L3.IPF' >>> (WRA) Well Rate (IPF) <<< +1,2,7,1.0,0.0,-999.99, '.\Database\WEL\VERSION_1\WELLS_L4.IPF' >>> (WRA) Well Rate (IPF) <<< +1,2,9,1.0,0.0,-999.99, '.\Database\WEL\VERSION_1\WELLS_L5.IPF' >>> (WRA) Well Rate (IPF) <<< + +0002,(DRN),1, Drainage +winter +002,002 +1,2,1,1.0,0.0,-999.99, '.\Database\DRN\VERSION_1\IPEST_DRAINAGE_CONDUCTANCE.IDF' >>> (CON) Conductance (IDF) <<< +1,2,0,1.0,0.0,-999.99,'.\Database\DRN\VERSION_1\RIVER_SECUNDAIR\IPEST_RIVER_SECUNDAIR_COND_WINTER.IDF' >>> (CON) Conductance (IDF) <<< +1,2,1,1.0,0.0,-999.99, '.\Database\DRN\VERSION_1\DRAINAGE_STAGE.IDF' >>> (DEL) Drainage Level (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\DRN\VERSION_1\RIVER_SECUNDAIR\RIVER_SECUNDAIR_BOTTOM_WINTER.IDF' >>> (DEL) Drainage Level (IDF) <<< +summer +002,002 +1,2,1,1.0,0.0,-999.99, '.\Database\DRN\VERSION_1\IPEST_DRAINAGE_CONDUCTANCE.IDF' >>> (CON) Conductance (IDF) <<< +1,2,0,1.0,0.0,-999.99,'.\Database\DRN\VERSION_1\RIVER_SECUNDAIR\IPEST_RIVER_SECUNDAIR_COND_WINTER.IDF' >>> (CON) Conductance (IDF) <<< +1,2,1,1.0,0.0,-999.99, '.\Database\DRN\VERSION_1\DRAINAGE_STAGE.IDF' >>> (DEL) Drainage Level (IDF) <<< +1,2,0,1.0,0.0,-999.99, '.\Database\DRN\VERSION_1\RIVER_SECUNDAIR\RIVER_SECUNDAIR_BOTTOM_WINTER.IDF' >>> (DEL) Drainage Level (IDF) <<< + + +0002,(GHB),1, general +winter +002,001 +1,2,1,1.0,0.0,-999.99, '.\Database\DRN\VERSION_1\IPEST_DRAINAGE_CONDUCTANCE.IDF' >>> (CON) Conductance (IDF) <<< +1,2,1,1.0,0.0,-999.99, '.\Database\DRN\VERSION_1\DRAINAGE_STAGE.IDF' >>> (DEL) Drainage Level (IDF) <<< +summer +002,001 +1,2,1,1.0,0.0,-999.99, '.\Database\DRN\VERSION_1\IPEST_DRAINAGE_CONDUCTANCE.IDF' >>> (CON) Conductance (IDF) <<< +1,2,1,1.0,0.0,-999.99, '.\Database\DRN\VERSION_1\DRAINAGE_STAGE.IDF' >>> (DEL) Drainage Level (IDF) <<< + +0001,(CHD),1, Constant Head +STEADY-STATE +001,37 +1,2,1,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L1.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,2,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L2.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,3,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L3.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,4,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L4.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,5,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L5.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,6,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L6.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,7,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L7.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,8,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L8.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,9,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L9.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,10,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L10.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,11,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L11.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,12,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L12.IDF' >>> (SHD) Starting Heads 
(IDF) <<< +1,2,13,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L13.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,14,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L14.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,15,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L15.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,16,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L16.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,17,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L17.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,18,1.0,0.0,-999.99, '.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L18.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,19,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L19.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,20,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L20.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,21,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L21.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,22,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L22.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,23,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L23.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,24,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L24.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,25,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L25.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,26,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L26.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,27,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L27.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,28,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L28.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,29,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L29.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,30,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L30.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,31,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L31.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,32,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L32.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,33,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L33.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,34,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L34.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,35,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L35.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,36,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L36.IDF' >>> (SHD) Starting Heads (IDF) <<< +1,2,37,1.0,0.0,-999.99,'.\Database\SHD\VERSION_1\STATIONAIR\\25\HEAD_STEADY-STATE_L37.IDF' >>> (SHD) Starting Heads (IDF) <<< + + +0001,(PCG),1, Precondition Conjugate-Gradient + MXITER= 5000 + ITER1= 20 + HCLOSE= 0.1000000E-02 + RCLOSE= 0.1000000 + RELAX= 0.9800000 + NPCOND= 1 + IPRPCG= 1 + MUTPCG= 0 + DAMPPCG= 1.000000 + DAMPPCGT=1.000000 + IQERROR= 0 + QERROR= 0.1000000 + +Periods +summer +01-04-1990 00:00:00 +winter +01-10-1990 00:00:00 + +Species +"benzene",1 + +""" diff --git 
a/imod/tests/fixtures/imod5_well_data.py b/imod/tests/fixtures/imod5_well_data.py new file mode 100644 index 000000000..a489588a3 --- /dev/null +++ b/imod/tests/fixtures/imod5_well_data.py @@ -0,0 +1,289 @@ +import os +import textwrap +from pathlib import Path + +import pytest + +import imod + +ipf_associated_header = textwrap.dedent( + """\ + 3 + 18 + x + y + q_m3 + FilterTopLevel + FilterBottomLevel + putcode + FilterNo + ALIAS + StartDateTime + SurfaceLevel + WellTopLevel + WellBottomLevel + Status + Type + Comment + CommentBy + Site + Organisation + 3,txt """ +) + +ipf_simple_header = textwrap.dedent( + """\ + 13 + 9 + xcoord + ycoord + q_assigned + top + bot + col + row + lay + whichmodel + 0,txt + """ +) + + +def projectfile_string(tmp_path): + return textwrap.dedent( + f"""\ + 0001,(WEL),1, Wells,[WRA] + 2000-01-01 00:00:00 + 001,002 + 1,2, 000, 1.000000 , 0.000000 , -999.9900 ,'{tmp_path}\ipf1.ipf' + 1,2, 000, 1.000000 , 0.000000 , -999.9900 ,'{tmp_path}\ipf2.ipf' + """ + ) + + +def ipf1_string_no_duplication(): + return textwrap.dedent( + f"""\ + {ipf_associated_header} + 191231.52,406381.47,timeseries_wel1,4.11,-1.69,01-PP001,0,B46D0517,"30-11-1981 00:00",13.41,13.41,nan,Inactive,Vertical,"Oxmeer","User1 - Zeeland Water","Oxmeer","Zeeland Water" + 191171.96,406420.89,timeseries_wel1,3.78,-2.02,01-PP002,0,B46D0518,"30-11-1981 00:00",13.18,13.18,nan,Inactive,Vertical,"Oxmeer","User1 - Zeeland Water","Oxmeer","Zeeland Water" + 191112.11,406460.02,timeseries_wel1,3.81,-1.99,01-PP003,0,B46D0519,"30-11-1981 00:00",13.21,13.21,nan,Inactive,Vertical,"Oxmeer","User1 - Zeeland Water","Oxmeer","Zeeland Water" + """ + ) + + +def ipf1_string_duplication(): + return textwrap.dedent( + f"""\ + {ipf_associated_header} + 191231.52,406381.47,timeseries_wel1,4.11,-1.69,01-PP001,0,B46D0517,"30-11-1981 00:00",13.41,13.41,nan,Inactive,Vertical,"Oxmeer","User1 - Zeeland Water","Oxmeer","Zeeland Water" + 191171.96,406420.89,timeseries_wel1,3.78,-2.02,01-PP002,0,B46D0518,"30-11-1981 00:00",13.18,13.18,nan,Inactive,Vertical,"Oxmeer","User1 - Zeeland Water","Oxmeer","Zeeland Water" + 191231.52,406381.47,other_timeseries_wel1,4.11,-1.69,01-PP001,0,B46D0517,"30-11-1981 00:00",13.41,13.41,nan,Inactive,Vertical,"Oxmeer","User1 - Zeeland Water","Oxmeer","Zeeland Water" + """ + ) + + +def ipf2_string(): + return textwrap.dedent( + f"""\ + {ipf_associated_header} + 191231.52,406381.47,timeseries_wel1,4.11,-1.69,01-PP001,0,B46D0517,"30-11-1981 00:00",13.41,13.41,nan,Inactive,Vertical,"Oxmeer","User1 - Zeeland Water","Oxmeer","Zeeland Water" + 191171.96,406420.89,timeseries_wel1,3.78,-2.02,01-PP002,0,B46D0518,"30-11-1981 00:00",13.18,13.18,nan,Inactive,Vertical,"Oxmeer","User1 - Zeeland Water","Oxmeer","Zeeland Water" + 191112.11,406460.02,timeseries_wel1,3.81,-1.99,01-PP003,0,B46D0519,"30-11-1981 00:00",13.21,13.21,nan,Inactive,Vertical,"Oxmeer","User1 - Zeeland Water","Oxmeer","Zeeland Water" + """ + ) + + +def ipf_simple_string(): + return textwrap.dedent( + f"""\ + {ipf_simple_header} + 276875.0,584375.0,-0.2,1.7,1.7,1107,162,1,4.0 + 276875.0,584125.0,-0.1,4.7,4.7,1107,163,1,4.0 + 276625.0,583875.0,-0.0,0.07,0.07,1106,164,1,4.0 + 276875.0,583625.0,-0.1,-0.26,-0.26,1107,165,1,4.0 + 276875.0,583375.0,-0.4,-2.65,-2.65,1107,166,1,4.0 + 276875.0,583125.0,-0.1,3.1,3.1,1107,167,1,4.0 + 277125.0,582875.0,-0.9,-0.45,-0.45,1108,168,1,4.0 + 277125.0,582625.0,-0.6,-2.65,-2.65,1108,169,1,4.0 + 277125.0,582375.0,-0.3,-2.65,-2.65,1108,170,1,4.0 + 277125.0,582125.0,-0.2,-2.65,-2.65,1108,171,1,4.0 + 
277125.0,581875.0,-0.2,-2.65,-2.65,1108,172,1,4.0 + 277125.0,581625.0,-0.3,2.54,2.54,1108,173,1,4.0 + 277125.0,581375.0,0.2,1.6,1.6,1108,174,1,4.0 + 277125.0,581125.0,0.5,0.6,0.6,1108,175,1,4.0 + """ + ) + + +def timeseries_string(): + return textwrap.dedent( + """\ + 6 + 2 + DATE,-9999.0 + MEASUREMENT,-9999.0 + 19811130,-676.1507971461288 + 19811231,-766.7777419354838 + 19820131,-847.6367741935485 + 19820228,-927.3857142857142 + 19820331,-859.2109677419355 + 19820430,-882.7713333333334 + """ + ) + + +def other_timeseries_string(): + return textwrap.dedent( + """\ + 6 + 2 + DATE,-9999.0 + MEASUREMENT,-9999.0 + 19811130,-174.1507971461288 + 19811231,-166.7777419354838 + 19820131,-147.6367741935485 + 19820228,-127.3857142857142 + 19820331,-159.2109677419355 + 19820430,-182.7713333333334 + """ + ) + + +def out_of_bounds_timeseries_string(): + return textwrap.dedent( + """\ + 6 + 2 + DATE,-9999.0 + MEASUREMENT,-9999.0 + 19811130,-174.1507971461288 + 19811231,-166.7777419354838 + 19820131,-147.6367741935485 + 19820228,-127.3857142857142 + 19820331,-159.2109677419355 + 29991112,-182.7713333333334 + """ + ) + + +def write_ipf_assoc_files( + projectfile_str, + ipf1_str, + ipf2_str, + timeseries_wel1_str, + tmp_path, + other_timeseries_string=None, +): + with open(Path(tmp_path) / "projectfile.prj", "w") as f: + f.write(projectfile_str) + + with open(Path(tmp_path) / "ipf1.ipf", "w") as f: + f.write(ipf1_str) + + with open(Path(tmp_path) / "ipf2.ipf", "w") as f: + f.write(ipf2_str) + + with open(Path(tmp_path) / "timeseries_wel1.txt", "w") as f: + f.write(timeseries_wel1_str) + + if other_timeseries_string is not None: + with open(Path(tmp_path) / "other_timeseries_wel1.txt", "w") as f: + f.write(other_timeseries_string) + + return Path(tmp_path) / "projectfile.prj" + + +def write_ipf_mixed_files( + ipf_assoc_str, ipf_simple_str, timeseries_wel1_str, other_timeseries_str, tmp_path +): + file_dict = { + "associated.ipf": ipf_assoc_str, + "simple1.ipf": ipf_simple_str, + "simple2.ipf": ipf_simple_str, + "simple3.ipf": ipf_simple_str, + "timeseries_wel1.txt": timeseries_wel1_str, + "other_timeseries_wel1.txt": other_timeseries_str, + } + + paths = [] + for file, string in file_dict.items(): + path = Path(tmp_path) / file + paths.append(path) + with open(path, "w") as f: + f.write(string) + + return paths + + +@pytest.fixture(scope="session") +def well_regular_import_prj(): + tmp_path = imod.util.temporary_directory() + os.makedirs(tmp_path) + + projectfile_str = projectfile_string(tmp_path) + ipf1_str = ipf1_string_no_duplication() + ipf2_str = ipf2_string() + timeseries_well_str = timeseries_string() + + return write_ipf_assoc_files( + projectfile_str, ipf1_str, ipf2_str, timeseries_well_str, tmp_path + ) + + +@pytest.fixture(scope="session") +def well_duplication_import_prj(): + tmp_path = imod.util.temporary_directory() + os.makedirs(tmp_path) + + projectfile_str = projectfile_string(tmp_path) + ipf1_str = ipf1_string_duplication() + ipf2_str = ipf2_string() + timeseries_well_str = timeseries_string() + other_timeseries_well_str = other_timeseries_string() + return write_ipf_assoc_files( + projectfile_str, + ipf1_str, + ipf2_str, + timeseries_well_str, + tmp_path, + other_timeseries_well_str, + ) + + +@pytest.fixture(scope="session") +def well_mixed_ipfs(): + tmp_path = imod.util.temporary_directory() + os.makedirs(tmp_path) + + ipf_assoc_str = ipf1_string_duplication() + ipf_simple_str = ipf_simple_string() + timeseries_well_str = timeseries_string() + other_timeseries_well_str = 
other_timeseries_string() + + return write_ipf_mixed_files( + ipf_assoc_str, + ipf_simple_str, + timeseries_well_str, + other_timeseries_well_str, + tmp_path, + ) + + +@pytest.fixture(scope="session") +def well_out_of_bounds_ipfs(): + tmp_path = imod.util.temporary_directory() + os.makedirs(tmp_path) + + ipf_assoc_str = ipf1_string_duplication() + ipf_simple_str = ipf_simple_string() + timeseries_well_str = timeseries_string() + other_timeseries_well_str = out_of_bounds_timeseries_string() + + return write_ipf_mixed_files( + ipf_assoc_str, + ipf_simple_str, + timeseries_well_str, + other_timeseries_well_str, + tmp_path, + ) diff --git a/imod/tests/fixtures/mf6_small_models_fixture.py b/imod/tests/fixtures/mf6_small_models_fixture.py index f5354767f..cbea30bfe 100644 --- a/imod/tests/fixtures/mf6_small_models_fixture.py +++ b/imod/tests/fixtures/mf6_small_models_fixture.py @@ -25,7 +25,7 @@ def grid_data_structured( dims = ("layer", "y", "x") layer = np.arange(1, nlayer + 1) - coords = {"layer": layer, "y": y, "x": x, "dx": cellsize, "dy": cellsize} + coords = {"layer": layer, "y": y, "x": x, "dx": cellsize, "dy": -cellsize} structured_grid_data = xr.DataArray( np.ones(shape, dtype=dtype) * value, coords=coords, dims=dims diff --git a/imod/tests/test_formats/test_disv_conversion.py b/imod/tests/test_formats/test_disv_conversion.py deleted file mode 100644 index 48e11fabc..000000000 --- a/imod/tests/test_formats/test_disv_conversion.py +++ /dev/null @@ -1,70 +0,0 @@ -import numpy as np -import pytest -import xugrid as xu - -import imod - -# xugrid 0.5.0 introduced a bug which prevents disv_converter from functioning -# properly. Skip if this one occurs. -xu_version_to_skip = (xu.__version__ == "0.5.0") | (xu.__version__ < "0.4.0") - - -def create_quadgrid(ibound): - return xu.Ugrid2d.from_structured(ibound) - - -def create_trigrid(ibound): - import matplotlib - - dx, xmin, xmax, dy, ymin, ymax = imod.util.spatial.spatial_reference(ibound) - x = np.arange(xmin, xmax + dx, dx) - y = np.arange(ymin, ymax + abs(dy), abs(dy)) - node_y, node_x = [a.ravel() for a in np.meshgrid(x, y, indexing="ij")] - triangulation = matplotlib.tri.Triangulation(node_x, node_y) - grid = xu.Ugrid2d(node_x, node_y, -1, triangulation.triangles) - return grid - - -@pytest.mark.skipif(xu_version_to_skip, reason="xugrid == 0.5.0 | xugrid < 0.4.0") -@pytest.mark.usefixtures("imodflow_model") -@pytest.mark.parametrize("create_grid", [create_quadgrid, create_trigrid]) -def test_convert_to_disv(imodflow_model, tmp_path, create_grid): - imodflow_model.write(tmp_path / "imodflow") - - data, repeats = imod.prj.open_projectfile_data(tmp_path / "imodflow/imodflow.prj") - tim_data = imod.prj.read_timfile(tmp_path / "imodflow/time_discretization.tim") - times = sorted([d["time"] for d in tim_data]) - target = create_grid(data["bnd"]["ibound"]) - - disv_model = imod.prj.convert_to_disv( - projectfile_data=data, - target=target, - time_min=times[0], - time_max=times[-1], - repeat_stress=repeats, - ) - - simulation = imod.mf6.Modflow6Simulation(name="disv") - simulation["gwf"] = disv_model - simulation["solver"] = imod.mf6.Solution( - modelnames=["gwf"], - print_option="summary", - outer_dvclose=1.0e-4, - outer_maximum=500, - under_relaxation=None, - inner_dvclose=1.0e-4, - inner_rclose=0.001, - inner_maximum=100, - linear_acceleration="cg", - scaling_method=None, - reordering_method=None, - relaxation_factor=0.97, - ) - simulation.create_time_discretization(times) - - modeldir = tmp_path / "disv" - simulation.write(modeldir) - 
simulation.run() - - head = imod.mf6.open_hds(modeldir / "gwf/gwf.hds", modeldir / "gwf/disv.disv.grb") - assert isinstance(head, xu.UgridDataArray) diff --git a/imod/tests/test_formats/test_prj.py b/imod/tests/test_formats/test_prj.py index 9aebf1a5a..296baef79 100644 --- a/imod/tests/test_formats/test_prj.py +++ b/imod/tests/test_formats/test_prj.py @@ -527,8 +527,8 @@ def test_open_projectfile_data(self): assert isinstance(content["ghb"]["head"], xr.DataArray) assert isinstance(content["rch"]["rate"], xr.DataArray) assert isinstance(content["cap"]["landuse"], xr.DataArray) - assert isinstance(content["wel-1"]["dataframe"], pd.DataFrame) - assert isinstance(content["wel-2"]["dataframe"], pd.DataFrame) + assert isinstance(content["wel-wells_l1"]["dataframe"][0], pd.DataFrame) + assert isinstance(content["wel-wells_l2"]["dataframe"][0], pd.DataFrame) assert isinstance(content["hfb-1"]["geodataframe"], gpd.GeoDataFrame) assert isinstance(content["hfb-2"]["geodataframe"], gpd.GeoDataFrame) assert isinstance(content["pcg"], dict) diff --git a/imod/tests/test_formats/test_prj_wel.py b/imod/tests/test_formats/test_prj_wel.py new file mode 100644 index 000000000..6656653ef --- /dev/null +++ b/imod/tests/test_formats/test_prj_wel.py @@ -0,0 +1,944 @@ +from datetime import datetime +from shutil import copyfile +from textwrap import dedent +from typing import Union + +import numpy as np +import pandas as pd +import pytest +from pytest_cases import ( + get_all_cases, + get_parametrize_args, + parametrize, + parametrize_with_cases, +) + +from imod.formats.prj import open_projectfile_data +from imod.mf6 import LayeredWell, Well + + +class WellPrjCases: + """Cases for projectfile well records""" + + def case_simple__steady_state(self): + return dedent( + """ + 0001,(WEL),1 + steady-state + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + """ + ) + + def case_associated__steady_state(self): + return dedent( + """ + 0001,(WEL),1 + steady-state + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + """ + ) + + def case_mixed__steady_state(self): + return dedent( + """ + 0001,(WEL),1 + steady-state + 001,002 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + """ + ) + + def case_simple__first(self): + return dedent( + """ + 0001,(WEL),1 + 1982-01-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + """ + ) + + def case_simple__first_multi_layer1(self): + return dedent( + """ + 0001,(WEL),1 + 1982-01-01 + 001,002 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1,2, 002, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + """ + ) + + def case_simple__first_multi_layer2(self): + return dedent( + """ + 0001,(WEL),1 + 1982-01-01 + 001,002 + 1,2, 000, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + """ + ) + + def case_simple__all_same(self): + return dedent( + """ + 0003,(WEL),1 + 1982-01-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1982-02-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1982-03-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + """ + ) + + def case_simple__all_same_multi_layer1(self): + return dedent( + """ + 0003,(WEL),1 + 1982-01-01 + 001,002 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1,2, 002, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1982-02-01 + 001,002 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1,2, 002, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1982-03-01 + 001,002 + 1,2, 001, 1.0, 0.0, 
-999.9900 ,"ipf/simple1.ipf" + 1,2, 002, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + """ + ) + + def case_simple__all_same_multi_layer2(self): + return dedent( + """ + 0003,(WEL),1 + 1982-01-01 + 001,002 + 1,2, 000, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1982-02-01 + 001,002 + 1,2, 000, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1982-03-01 + 001,002 + 1,2, 000, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + """ + ) + + def case_simple__all_different1(self): + return dedent( + """ + 0003,(WEL),1 + 1982-01-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1982-02-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple2.ipf" + 1982-03-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple3.ipf" + """ + ) + + def case_simple__all_different2(self): + return dedent( + """ + 0003,(WEL),1 + 1982-01-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1982-02-01 + 001,002 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple2.ipf" + 1982-03-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple3.ipf" + """ + ) + + def case_simple__all_different3(self): + return dedent( + """ + 0003,(WEL),1 + 1982-01-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1982-02-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple2.ipf" + 1982-03-01 + 001,002 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple3.ipf" + """ + ) + + def case_associated__first(self): + return dedent( + """ + 0001,(WEL),1 + 1982-01-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + """ + ) + + def case_associated__all(self): + return dedent( + """ + 0003,(WEL),1 + 1982-01-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + 1982-02-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + 1982-03-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + """ + ) + + def case_associated__all_varying_factors(self): + return dedent( + """ + 0003,(WEL),1 + 1982-01-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + 1982-02-01 + 001,001 + 1,2, 001, 0.5, 0.0, -999.9900 ,"ipf/associated.ipf" + 1982-03-01 + 001,001 + 1,2, 001, 0.2, 0.0, -999.9900 ,"ipf/associated.ipf" + """ + ) + + def case_associated__multiple_layers_different_factors(self): + return dedent( + """ + 0001,(WEL),1 + 1982-01-01 + 001,002 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + 1,2, 002, 0.75, 0.0, -999.9900 ,"ipf/associated.ipf" + """ + ) + + def case_mixed__first(self): + return dedent( + """ + 0001,(WEL),1 + 1982-01-01 + 001,002 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + """ + ) + + def case_mixed__all(self): + return dedent( + """ + 0003,(WEL),1 + 1982-01-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + 1982-02-01 + 001,002 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1982-03-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + """ + ) + + def case_mixed__associated_second(self): + return dedent( + """ + 0002,(WEL),1 + 1982-01-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/simple1.ipf" + 1982-02-01 + 001,001 + 1,2, 001, 1.0, 0.0, -999.9900 ,"ipf/associated.ipf" + """ + ) + + +class WellReadCases: + """Expected cases as interpreted by 
``imod.formats.prj.open_projectfile_data``""" + + def case_simple__steady_state(self): + return { + "wel-simple1": { + "has_associated": False, + "time": [None], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + } + + def case_associated__steady_state(self): + return { + "wel-associated": { + "has_associated": True, + "time": [None], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + } + + def case_mixed__steady_state(self): + return { + "wel-simple1": { + "has_associated": False, + "time": [None], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + "wel-associated": { + "has_associated": True, + "time": [None], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + } + + def case_simple__first(self): + return { + "wel-simple1": { + "has_associated": False, + "time": [datetime(1982, 1, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + } + + def case_simple__first_multi_layer1(self): + return { + "wel-simple1": { + "has_associated": False, + "time": [datetime(1982, 1, 1), datetime(1982, 1, 1)], + "layer": [1, 2], + "factor": [1.0, 1.0], + "addition": [0.0, 0.0], + }, + } + + def case_simple__first_multi_layer2(self): + return { + "wel-simple1": { + "has_associated": False, + "time": [datetime(1982, 1, 1), datetime(1982, 1, 1)], + "layer": [0, 1], + "factor": [1.0, 1.0], + "addition": [0.0, 0.0], + }, + } + + def case_simple__all_same(self): + return { + "wel-simple1": { + "has_associated": False, + "time": [ + datetime(1982, 1, 1), + datetime(1982, 2, 1), + datetime(1982, 3, 1), + ], + "layer": [1, 1, 1], + "factor": [1.0, 1.0, 1.0], + "addition": [0.0, 0.0, 0.0], + }, + } + + def case_simple__all_same_multi_layer1(self): + return { + "wel-simple1": { + "has_associated": False, + "time": [ + datetime(1982, 1, 1), + datetime(1982, 1, 1), + datetime(1982, 2, 1), + datetime(1982, 2, 1), + datetime(1982, 3, 1), + datetime(1982, 3, 1), + ], + "layer": [1, 2, 1, 2, 1, 2], + "factor": [1.0, 1.0, 1.0, 1.0, 1.0, 1.0], + "addition": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0], + }, + } + + def case_simple__all_same_multi_layer2(self): + return { + "wel-simple1": { + "has_associated": False, + "time": [ + datetime(1982, 1, 1), + datetime(1982, 1, 1), + datetime(1982, 2, 1), + datetime(1982, 2, 1), + datetime(1982, 3, 1), + datetime(1982, 3, 1), + ], + "layer": [0, 1, 0, 1, 0, 1], + "factor": [1.0, 1.0, 1.0, 1.0, 1.0, 1.0], + "addition": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0], + }, + } + + def case_simple__all_different1(self): + return { + "wel-simple1": { + "has_associated": False, + "time": [datetime(1982, 1, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + "wel-simple2": { + "has_associated": False, + "time": [datetime(1982, 2, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + "wel-simple3": { + "has_associated": False, + "time": [datetime(1982, 3, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + } + + def case_simple__all_different2(self): + return { + "wel-simple1": { + "has_associated": False, + "time": [datetime(1982, 1, 1), datetime(1982, 2, 1)], + "layer": [1, 1], + "factor": [1.0, 1.0], + "addition": [0.0, 0.0], + }, + "wel-simple2": { + "has_associated": False, + "time": [datetime(1982, 2, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + "wel-simple3": { + "has_associated": False, + "time": [datetime(1982, 3, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + } + + def case_simple__all_different3(self): + return { + "wel-simple1": { + "has_associated": False, 
+ "time": [datetime(1982, 1, 1), datetime(1982, 3, 1)], + "layer": [1, 1], + "factor": [1.0, 1.0], + "addition": [0.0, 0.0], + }, + "wel-simple2": { + "has_associated": False, + "time": [datetime(1982, 2, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + "wel-simple3": { + "has_associated": False, + "time": [datetime(1982, 3, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + } + + def case_associated__first(self): + return { + "wel-associated": { + "has_associated": True, + "time": [datetime(1982, 1, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + } + } + + def case_associated__all(self): + return { + "wel-associated": { + "has_associated": True, + "time": [ + datetime(1982, 1, 1), + datetime(1982, 2, 1), + datetime(1982, 3, 1), + ], + "layer": [1, 1, 1], + "factor": [1.0, 1.0, 1.0], + "addition": [0.0, 0.0, 0.0], + }, + } + + def case_associated__all_varying_factors(self): + return { + "wel-associated": { + "has_associated": True, + "time": [ + datetime(1982, 1, 1), + datetime(1982, 2, 1), + datetime(1982, 3, 1), + ], + "layer": [1, 1, 1], + "factor": [1.0, 0.5, 0.2], + "addition": [0.0, 0.0, 0.0], + }, + } + + def case_associated__multiple_layers_different_factors(self): + return { + "wel-associated": { + "has_associated": True, + "time": [ + datetime(1982, 1, 1), + datetime(1982, 1, 1), + ], + "layer": [1, 2], + "factor": [1.0, 0.75], + "addition": [0.0, 0.0], + }, + } + + def case_mixed__first(self): + return { + "wel-simple1": { + "has_associated": False, + "time": [datetime(1982, 1, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + "wel-associated": { + "has_associated": True, + "time": [datetime(1982, 1, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + } + + def case_mixed__all(self): + return { + "wel-associated": { + "has_associated": True, + "time": [ + datetime(1982, 1, 1), + datetime(1982, 2, 1), + datetime(1982, 3, 1), + ], + "layer": [1, 1, 1], + "factor": [1.0, 1.0, 1.0], + "addition": [0.0, 0.0, 0.0], + }, + "wel-simple1": { + "has_associated": False, + "time": [datetime(1982, 2, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + } + + def case_mixed__associated_second(self): + return { + "wel-associated": { + "has_associated": True, + "time": [datetime(1982, 2, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + "wel-simple1": { + "has_associated": False, + "time": [datetime(1982, 1, 1)], + "layer": [1], + "factor": [1.0], + "addition": [0.0], + }, + } + + +class WellPackageCases: + """ + Expected cases as loaded with from_imod5_data. + Returns a tuple with as first element a bool whether the import is expected to fail + The second element specifies in which timesteps the rates are set to zero. 
+ + Returns + ------- + {wellname: (fails, has_time, datetimes_set_to_zero)} + """ + + def case_simple__steady_state(self): + return { + "wel-simple1": (False, False, []), + } + + def case_associated__steady_state(self): + return { + "wel-associated": (False, True, []), + } + + def case_mixed__steady_state(self): + return { + "wel-associated": (False, True, []), + "wel-simple1": (False, False, []), + } + + def case_simple__first(self): + return { + "wel-simple1": (False, True, [datetime(1982, 2, 1), datetime(1982, 3, 1)]), + } + + def case_simple__first_multi_layer1(self): + return { + "wel-simple1": (False, True, [datetime(1982, 2, 1), datetime(1982, 3, 1)]), + } + + def case_simple__first_multi_layer2(self): + return { + "wel-simple1": (True, False, []), + } + + def case_simple__all_same(self): + return { + "wel-simple1": (False, True, []), + } + + def case_simple__all_same_multi_layer1(self): + return { + "wel-simple1": (False, True, []), + } + + def case_simple__all_same_multi_layer2(self): + return { + "wel-simple1": (True, False, []), + } + + def case_simple__all_different1(self): + return { + "wel-simple1": (False, True, [datetime(1982, 2, 1), datetime(1982, 3, 1)]), + "wel-simple2": (False, True, [datetime(1982, 3, 1)]), + "wel-simple3": (False, True, []), + } + + def case_simple__all_different2(self): + return { + "wel-simple1": (False, True, [datetime(1982, 3, 1)]), + "wel-simple2": (False, True, [datetime(1982, 3, 1)]), + "wel-simple3": (False, True, []), + } + + def case_simple__all_different3(self): + return { + "wel-simple1": (False, True, [datetime(1982, 2, 1)]), + "wel-simple2": (False, True, [datetime(1982, 3, 1)]), + "wel-simple3": (False, True, []), + } + + def case_associated__first(self): + return {"wel-associated": (False, True, [])} + + def case_associated__all(self): + return {"wel-associated": (False, True, [])} + + def case_associated__all_varying_factors(self): + return {"wel-associated": (True, False, [])} + + def case_associated__multiple_layers_different_factors(self): + return {"wel-associated": (True, False, [])} + + def case_mixed__first(self): + return { + "wel-simple1": (False, True, [datetime(1982, 2, 1), datetime(1982, 3, 1)]), + "wel-associated": (False, True, []), + } + + def case_mixed__all(self): + return { + "wel-simple1": (False, True, [datetime(1982, 3, 1)]), + "wel-associated": (False, True, []), + } + + def case_mixed__associated_second(self): + return { + "wel-simple1": (False, True, [datetime(1982, 2, 1), datetime(1982, 3, 1)]), + "wel-associated": (True, False, []), + } + + +# pytest_cases doesn't support any "zipped test cases", instead it takes the +# outer product of cases, when providing multiple case sets. +# https://github.com/smarie/python-pytest-cases/issues/284 +# To support this, we would like to retrieve all function arguments from the +# case classes and to zip them together, something like +# zip(input_args,expected). +def case_args_to_parametrize(cases, prefix): + """Manually retrieve all case args of a set in cases.""" + + # Decorate some dummy function to be able to call ``get_all_cases``. For some + # reason, pytest_cases requires a decorated function (despite telling us + # differently in the docs.) 
+ @parametrize_with_cases("case", cases=cases) + def f(case): + return case + + all_cases = get_all_cases(f, cases=cases) + return get_parametrize_args(f, all_cases, prefix) + + +PRJ_ARGS = case_args_to_parametrize(WellPrjCases, "case_") +READ_ARGS = case_args_to_parametrize(WellReadCases, "case_") +PKG_ARGS = case_args_to_parametrize(WellPackageCases, "case_") + + +def setup_test_files(wel_case, wel_file, well_mixed_ipfs, tmp_path): + """ + Write string to projectfile, and copy ipf files to directory. + """ + with open(wel_file, "w") as f: + f.write(wel_case) + + ipf_dir = tmp_path / "ipf" + ipf_dir.mkdir(exist_ok=True) + + # copy files to test folder + for p in well_mixed_ipfs: + copyfile(p, ipf_dir / p.name) + + +def get_case_name(request): + id_name = request.node.callspec.id + # Verify right cases are matched. This can go wrong when case names are not + # inserted in the right order in Case class. + cases = id_name.split("-") + # First entry refers to wel obj, we can skip this. + assert cases[1] == cases[-1] + + return cases[1] + + +@parametrize("wel_case, expected", argvalues=list(zip(PRJ_ARGS, READ_ARGS))) +def test_open_projectfile_data_wells( + wel_case, expected, well_mixed_ipfs, tmp_path, request +): + # Arrange + case_name = get_case_name(request) + wel_file = tmp_path / f"{case_name}.prj" + setup_test_files(wel_case, wel_file, well_mixed_ipfs, tmp_path) + + # Act + data, _ = open_projectfile_data(wel_file) + assert len(set(expected.keys()) ^ set(data.keys())) == 0 + fields = ["time", "layer", "addition", "factor", "has_associated"] + for wel_name, wel_expected in expected.items(): + actual = data[wel_name] + for field in fields: + assert field in actual + assert actual[field] == wel_expected[field] + + +@parametrize("wel_case, expected", argvalues=list(zip(PRJ_ARGS, READ_ARGS))) +def test_open_projectfile_data_out_of_bounds_wells( + wel_case, expected, well_out_of_bounds_ipfs, tmp_path, request +): + # Arrange + case_name = get_case_name(request) + wel_file = tmp_path / f"{case_name}.prj" + setup_test_files(wel_case, wel_file, well_out_of_bounds_ipfs, tmp_path) + + # Act + data, _ = open_projectfile_data(wel_file) + assert len(set(expected.keys()) ^ set(data.keys())) == 0 + fields = ["time", "layer", "addition", "factor", "has_associated"] + for wel_name, wel_expected in expected.items(): + actual = data[wel_name] + for field in fields: + assert field in actual + assert actual[field] == wel_expected[field] + if actual["has_associated"]: + timeseries = data["wel-associated"]["dataframe"][0]["time"] + # Test if last element NaT + assert timeseries.iloc[-1] is pd.NaT + + +@parametrize("wel_case, expected_dict", argvalues=list(zip(PRJ_ARGS, PKG_ARGS))) +@parametrize("wel_cls", argvalues=[LayeredWell, Well]) +def test_from_imod5_data_wells( + wel_cls: Union[LayeredWell, Well], + wel_case, + expected_dict, + well_mixed_ipfs, + tmp_path, + request, +): + # Arrange + # Replace layer number to zero if non-layered well. + if wel_cls == Well: + wel_case = wel_case.replace("1,2, 001", "1,2, 000") + # Write prj and copy ipfs to right folder. 
+ case_name = get_case_name(request) + wel_file = tmp_path / f"{case_name}.prj" + setup_test_files(wel_case, wel_file, well_mixed_ipfs, tmp_path) + + times = [datetime(1982, i + 1, 1) for i in range(4)] + + # Act + data, _ = open_projectfile_data(wel_file) + for wellname in data.keys(): + assert wellname in expected_dict.keys() + fails, has_time, expected_set_to_zero = expected_dict[wellname] + if fails: + with pytest.raises(ValueError): + wel_cls.from_imod5_data(wellname, data, times=times) + else: + well = wel_cls.from_imod5_data(wellname, data, times=times) + rate = well.dataset["rate"] + if has_time: + actual_set_to_zero = [ + t.values + for t in rate.coords["time"] + if (rate.sel(time=t) == 0.0).all() + ] + expected_set_to_zero = [ + np.datetime64(t, "ns") for t in expected_set_to_zero + ] + diff = set(actual_set_to_zero) ^ set(expected_set_to_zero) + assert len(diff) == 0 + else: + assert "time" not in rate.dims + assert "time" not in rate.coords + + +@parametrize("wel_case, expected_dict", argvalues=list(zip(PRJ_ARGS, PKG_ARGS))) +@parametrize("wel_cls", argvalues=[LayeredWell, Well]) +def test_from_imod5_data_wells__outside_range( + wel_cls: Union[LayeredWell, Well], + wel_case, + expected_dict, + well_mixed_ipfs, + tmp_path, + request, +): + """ + Test when values are retrieved outside time domain of wells, should be all + set to zero for unassociated ipfs, and be forward filled with the last entry + for associated ipfs. + """ + # Arrange + # Replace layer number to zero if non-layered well. + if wel_cls == Well: + wel_case = wel_case.replace("1,2, 001", "1,2, 000") + # Write prj and copy ipfs to right folder. + case_name = get_case_name(request) + wel_file = tmp_path / f"{case_name}.prj" + setup_test_files(wel_case, wel_file, well_mixed_ipfs, tmp_path) + + times = [datetime(1985, i + 1, 1) for i in range(4)] + + # Act + data, _ = open_projectfile_data(wel_file) + for wellname in data.keys(): + assert wellname in expected_dict.keys() + fails, has_time, _ = expected_dict[wellname] + if fails: + with pytest.raises(ValueError): + wel_cls.from_imod5_data(wellname, data, times=times) + else: + well = wel_cls.from_imod5_data(wellname, data, times=times) + rate = well.dataset["rate"] + if has_time: + actual_set_to_zero = [ + t.values + for t in rate.coords["time"] + if (rate.sel(time=t) == 0.0).all() + ] + if data[wellname]["has_associated"]: + expected_set_to_zero = [] + else: + expected_set_to_zero = [np.datetime64(t, "ns") for t in times[:-1]] + diff = set(actual_set_to_zero) ^ set(expected_set_to_zero) + assert len(diff) == 0 + else: + assert "time" not in rate.dims + assert "time" not in rate.coords + + +@parametrize("wel_case, expected_dict", argvalues=list(zip(PRJ_ARGS, PKG_ARGS))) +@parametrize("wel_cls", argvalues=[LayeredWell, Well]) +def test_from_imod5_data_wells__wells_out_of_bounds( + wel_cls: Union[LayeredWell, Well], + wel_case, + expected_dict, + well_out_of_bounds_ipfs, + tmp_path, + request, +): + # Arrange + # Replace layer number to zero if non-layered well. + if wel_cls == Well: + wel_case = wel_case.replace("1,2, 001", "1,2, 000") + # Write prj and copy ipfs to right folder. 
+ case_name = get_case_name(request) + wel_file = tmp_path / f"{case_name}.prj" + setup_test_files(wel_case, wel_file, well_out_of_bounds_ipfs, tmp_path) + + times = [datetime(1982, i + 3, 1) for i in range(4)] + + # Act + data, _ = open_projectfile_data(wel_file) + for wellname in data.keys(): + assert wellname in expected_dict.keys() + fails, _, _ = expected_dict[wellname] + if fails: + with pytest.raises(ValueError): + wel_cls.from_imod5_data(wellname, data, times=times) + else: + well = wel_cls.from_imod5_data(wellname, data, times=times) + if data[wellname]["has_associated"]: + # Last value in dataframe returned by open_projectfile is time + # NaT (out of bounds), so expect second last rate in dataframe + # as final rate in well package. + expected_last_rate = data[wellname]["dataframe"][0]["rate"].iloc[-2] + actual_last_rate = well.dataset["rate"].isel(index=1, time=-1).item() + assert actual_last_rate == expected_last_rate diff --git a/imod/tests/test_mf6/test_circle.py b/imod/tests/test_mf6/test_circle.py index 19d40e8d9..063b1c81c 100644 --- a/imod/tests/test_mf6/test_circle.py +++ b/imod/tests/test_mf6/test_circle.py @@ -9,6 +9,7 @@ import imod from imod.logging import LoggerType, LogLevel +from imod.mf6.validation_context import ValidationContext from imod.mf6.write_context import WriteContext @@ -57,8 +58,8 @@ def test_gwfmodel_render(circle_model, tmp_path): simulation = circle_model globaltimes = simulation["time_discretization"]["time"].values gwfmodel = simulation["GWF_1"] - write_context = WriteContext() - actual = gwfmodel.render("GWF_1", write_context) + write_context1 = WriteContext() + actual = gwfmodel.render("GWF_1", write_context1) path = "GWF_1" expected = textwrap.dedent( f"""\ @@ -77,8 +78,9 @@ def test_gwfmodel_render(circle_model, tmp_path): """ ) assert actual == expected - context = WriteContext(tmp_path) - gwfmodel.write("GWF_1", globaltimes, True, context) + validation_context = ValidationContext(True) + write_context2 = WriteContext(tmp_path) + gwfmodel.write("GWF_1", globaltimes, write_context2, validation_context) assert (tmp_path / "GWF_1" / "GWF_1.nam").is_file() assert (tmp_path / "GWF_1").is_dir() @@ -110,8 +112,8 @@ def test_gwfmodel_render_evt(circle_model_evt, tmp_path): simulation = circle_model_evt globaltimes = simulation["time_discretization"]["time"].values gwfmodel = simulation["GWF_1"] - write_context = WriteContext() - actual = gwfmodel.render("GWF_1", write_context) + write_context1 = WriteContext() + actual = gwfmodel.render("GWF_1", write_context1) path = "GWF_1" expected = textwrap.dedent( f"""\ @@ -131,7 +133,8 @@ def test_gwfmodel_render_evt(circle_model_evt, tmp_path): """ ) assert actual == expected - context = WriteContext(tmp_path) - gwfmodel.write("GWF_1", globaltimes, True, context) + validation_context = ValidationContext(True) + write_context2 = WriteContext(tmp_path) + gwfmodel.write("GWF_1", globaltimes, write_context2, validation_context) assert (tmp_path / "GWF_1" / "GWF_1.nam").is_file() assert (tmp_path / "GWF_1").is_dir() diff --git a/imod/tests/test_mf6/test_ex01_twri.py b/imod/tests/test_mf6/test_ex01_twri.py index 40ead6a72..c6763c097 100644 --- a/imod/tests/test_mf6/test_ex01_twri.py +++ b/imod/tests/test_mf6/test_ex01_twri.py @@ -8,6 +8,7 @@ import xarray as xr import imod +from imod.mf6.validation_context import ValidationContext from imod.mf6.write_context import WriteContext from imod.schemata import ValidationError from imod.typing.grid import ones_like @@ -351,6 +352,7 @@ def test_gwfmodel_render(twri_model, 
tmp_path): globaltimes = simulation["time_discretization"]["time"].values gwfmodel = simulation["GWF_1"] path = Path(tmp_path.stem).as_posix() + validation_context = ValidationContext(tmp_path) write_context = WriteContext(tmp_path) actual = gwfmodel.render(path, write_context) expected = textwrap.dedent( @@ -373,7 +375,7 @@ def test_gwfmodel_render(twri_model, tmp_path): """ ) assert actual == expected - gwfmodel.write("GWF_1", globaltimes, True, write_context) + gwfmodel.write("GWF_1", globaltimes, write_context, validation_context) assert (tmp_path / "GWF_1" / "GWF_1.nam").is_file() assert (tmp_path / "GWF_1").is_dir() diff --git a/imod/tests/test_mf6/test_import_prj.py b/imod/tests/test_mf6/test_import_prj.py index 43c2516b4..c1f33a1eb 100644 --- a/imod/tests/test_mf6/test_import_prj.py +++ b/imod/tests/test_mf6/test_import_prj.py @@ -1,11 +1,15 @@ +import sys from textwrap import dedent from zipfile import ZipFile import numpy as np from numpy.testing import assert_allclose +import imod from imod.data.sample_data import create_pooch_registry, load_pooch_registry from imod.formats.prj import open_projectfile_data +from imod.logging.config import LoggerType +from imod.logging.loglevel import LogLevel registry = create_pooch_registry() registry = load_pooch_registry(registry) @@ -181,17 +185,90 @@ def test_import_ipf(tmp_path): result_snippet_1 = open_projectfile_data(projects_file) assert np.all( - result_snippet_1[0]["wel-1"]["dataframe"]["rate"] - == 2 * result_snippet_0[0]["wel-1"]["dataframe"]["rate"] + 1.3 + result_snippet_1[0]["wel-WELLS_L3"]["dataframe"][0]["rate"] + == 2 * result_snippet_0[0]["wel-WELLS_L3"]["dataframe"][0]["rate"] + 1.3 ) assert np.all( - result_snippet_1[0]["wel-2"]["dataframe"]["rate"] - == -1 * result_snippet_0[0]["wel-2"]["dataframe"]["rate"] + 0 + result_snippet_1[0]["wel-WELLS_L4"]["dataframe"][0]["rate"] + == -1 * result_snippet_0[0]["wel-WELLS_L4"]["dataframe"][0]["rate"] + 0 ) assert np.all( - result_snippet_1[0]["wel-3"]["dataframe"]["rate"] - == 2 * result_snippet_0[0]["wel-2"]["dataframe"]["rate"] + 1.3 + result_snippet_1[0]["wel-WELLS_L5"]["dataframe"][0]["rate"] + == 2 * result_snippet_0[0]["wel-WELLS_L4"]["dataframe"][0]["rate"] + 1.3 ) + assert np.all( + result_snippet_1[0]["wel-WELLS_L3"]["dataframe"][0]["filt_top"] == 11.0 + ) + assert np.all( + result_snippet_1[0]["wel-WELLS_L3"]["dataframe"][0]["filt_bot"] == 6.0 + ) + assert np.all( + result_snippet_1[0]["wel-WELLS_L4"]["dataframe"][0]["filt_top"] == 11.0 + ) + assert np.all( + result_snippet_1[0]["wel-WELLS_L4"]["dataframe"][0]["filt_bot"] == 6.0 + ) + assert np.all( + result_snippet_1[0]["wel-WELLS_L5"]["dataframe"][0]["filt_top"] == 11.0 + ) + assert np.all( + result_snippet_1[0]["wel-WELLS_L5"]["dataframe"][0]["filt_bot"] == 6.0 + ) + + +def test_import_ipf_unique_id_and_logging(tmp_path): + with ZipFile(fname_model) as archive: + archive.extractall(tmp_path) + + logfile_path = tmp_path / "logfile.txt" + + try: + with open(logfile_path, "w") as sys.stdout: + # start logging + imod.logging.configure( + LoggerType.PYTHON, + log_level=LogLevel.WARNING, + add_default_file_handler=False, + add_default_stream_handler=True, + ) + projects_file = tmp_path / "iMOD5_model_pooch" / "iMOD5_model.prj" + + file1 = open(projects_file, "w") + file1.write( + snippet_gen_import_ipf( + factor1=2.0, addition1=1.3, factor2=-1.0, addition2=0.0 + ) + ) + file1.close() + + # Act + result_snippet_1 = open_projectfile_data(projects_file) + finally: + # turn the logger off again + imod.logging.configure( + 
LoggerType.NULL, + log_level=LogLevel.WARNING, + add_default_file_handler=False, + add_default_stream_handler=False, + ) + + # test that id's were made unique + # Assert + assert np.all( + result_snippet_1[0]["wel-WELLS_L3"]["dataframe"][0]["id"] == "extractions" + ) + assert np.all( + result_snippet_1[0]["wel-WELLS_L4"]["dataframe"][0]["id"] == "extractions_1" + ) + assert np.all( + result_snippet_1[0]["wel-WELLS_L5"]["dataframe"][0]["id"] == "extractions_2" + ) + + with open(logfile_path, "r") as log_file: + log = log_file.read() + assert "This happened at x = 197910, y = 362860, id = extractions" in log + assert "appended with the suffix _1" in log + assert "appended with the suffix _2" in log def snippet_boundary_condition(factor: float, addition: float): diff --git a/imod/tests/test_mf6/test_mf6_LHM.py b/imod/tests/test_mf6/test_mf6_LHM.py new file mode 100644 index 000000000..a2b5cacea --- /dev/null +++ b/imod/tests/test_mf6/test_mf6_LHM.py @@ -0,0 +1,94 @@ +""" +LHM tests; these are pytest-marked with 'user_acceptance'. + +These require the LHM model to be available on the local drive. The path to the +projectfile must be provided via the environment variable "LHM_PRJ", for +example set in a .env file. +""" + +import os +import sys + +import pandas as pd +import pytest + +import imod +from imod.formats.prj.prj import open_projectfile_data +from imod.logging.config import LoggerType +from imod.logging.loglevel import LogLevel +from imod.mf6.oc import OutputControl +from imod.mf6.regrid.regrid_schemes import ( + DiscretizationRegridMethod, + NodePropertyFlowRegridMethod, + StorageCoefficientRegridMethod, +) +from imod.mf6.simulation import Modflow6Simulation +from imod.mf6.utilities.mf6hfb import merge_hfb_packages +from imod.mf6.write_context import WriteContext +from imod.prepare.topsystem.default_allocation_methods import ( + SimulationAllocationOptions, + SimulationDistributingOptions, +) + + +# In function, not a fixture, to allow logging of the import.
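+#
+# A minimal sketch of the expected .env entry (hypothetical path; how the file
+# is loaded, e.g. via python-dotenv or a pytest dotenv plugin, is an
+# assumption, not part of this test module):
+#
+#     LHM_PRJ=/path/to/LHM/LHM.prj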
+def LHM_imod5_data(): + lhm_prjfile = os.environ["LHM_PRJ"] + data = open_projectfile_data(lhm_prjfile) + + imod5_data = data[0] + period_data = data[1] + default_simulation_allocation_options = SimulationAllocationOptions + default_simulation_distributing_options = SimulationDistributingOptions + + regridding_option = {} + regridding_option["npf"] = NodePropertyFlowRegridMethod() + regridding_option["dis"] = DiscretizationRegridMethod() + regridding_option["sto"] = StorageCoefficientRegridMethod() + times = pd.date_range(start="1/1/2018", end="12/1/2018", freq="ME") + + simulation = Modflow6Simulation.from_imod5_data( + imod5_data, + period_data, + default_simulation_allocation_options, + default_simulation_distributing_options, + times, + regridding_option, + ) + simulation["imported_model"]["oc"] = OutputControl( + save_head="last", save_budget="last" + ) + return simulation + + +@pytest.mark.user_acceptance +def test_mf6_LHM_write_HFB(tmp_path): + logfile_path = tmp_path / "logfile.txt" + with open(logfile_path, "w") as sys.stdout: + imod.logging.configure( + LoggerType.PYTHON, + log_level=LogLevel.DEBUG, + add_default_file_handler=False, + add_default_stream_handler=True, + ) + simulation = LHM_imod5_data() + model = simulation["imported_model"] + + mf6_hfb_ls = [] + for key, pkg in model.items(): + if issubclass(type(pkg), imod.mf6.HorizontalFlowBarrierBase): + mf6_hfb_ls.append(pkg) + pkg.dataset.load() + + top, bottom, idomain = model._Modflow6Model__get_domain_geometry() + k = model._Modflow6Model__get_k() + + mf6_hfb = merge_hfb_packages(mf6_hfb_ls, idomain, top, bottom, k) + + times = pd.date_range(start="1/1/2018", end="12/1/2018", freq="ME") + + out_dir = tmp_path / "LHM" + out_dir.mkdir(parents=True, exist_ok=True) + write_context = WriteContext(out_dir, use_binary=True, use_absolute_paths=False) + + mf6_hfb.write("hfb", times, write_context) diff --git a/imod/tests/test_mf6/test_mf6_array_masking.py b/imod/tests/test_mf6/test_mf6_array_masking.py new file mode 100644 index 000000000..f0b30dbeb --- /dev/null +++ b/imod/tests/test_mf6/test_mf6_array_masking.py @@ -0,0 +1,29 @@ +import numpy as np +import xarray as xr + +from imod.mf6.utilities.mask import mask_arrays + + +def test_array_masking(): + x = [1] + y = [1, 2, 3] + layer = [1, 2] + coords = {"layer": layer, "y": y, "x": x} + dims = ("layer", "y", "x") + + array1 = xr.DataArray([[[1], [1], [1]], [[1], [1], [1]]], coords=coords, dims=dims) + array2 = xr.DataArray( + [[[np.nan], [1], [1]], [[1], [1], [1]]], coords=coords, dims=dims + ) + + masked_arrays = mask_arrays({"array1": array1, "array2": array2}) + + # element 0,0,0 should be nan in both arrays + assert np.isnan(masked_arrays["array1"].values[0, 0, 0]) + assert np.isnan(masked_arrays["array2"].values[0, 0, 0]) + + # there should be only 1 nan in both arrays + masked_arrays["array1"].values[0, 0, 0] = 1 + masked_arrays["array2"].values[0, 0, 0] = 1 + assert np.all(~np.isnan(masked_arrays["array1"].values)) + assert np.all(~np.isnan(masked_arrays["array2"].values)) diff --git a/imod/tests/test_mf6/test_mf6_chd.py b/imod/tests/test_mf6/test_mf6_chd.py index 4b33dd57e..c3b96b982 100644 --- a/imod/tests/test_mf6/test_mf6_chd.py +++ b/imod/tests/test_mf6/test_mf6_chd.py @@ -7,6 +7,9 @@ import xarray as xr import imod +from imod.mf6.chd import ConstantHead +from imod.mf6.dis import StructuredDiscretization +from imod.mf6.utilities.chd_concat import concat_layered_chd_packages from imod.mf6.write_context import WriteContext from imod.schemata import ValidationError @@ 
-192,3 +195,90 @@ def test_write_concentration_period_data(head_fc, concentration_fc): assert ( data.count("2") == 1755 ) # the number 2 is in the concentration data, and in the cell indices. + + +@pytest.mark.usefixtures("imod5_dataset") +def test_from_imod5(imod5_dataset, tmp_path): + imod5_data = imod5_dataset[0] + + target_dis = StructuredDiscretization.from_imod5_data(imod5_data) + + chd3 = imod.mf6.ConstantHead.from_imod5_data( + "chd-3", + imod5_data, + target_dis, + regridder_types=None, + ) + + assert isinstance(chd3, imod.mf6.ConstantHead) + assert np.count_nonzero(~np.isnan(chd3.dataset["head"].values)) == 589 + assert len(chd3.dataset["layer"].values) == 1 + + # write the packages for write validation + write_context = WriteContext(simulation_directory=tmp_path, use_binary=False) + chd3.write("chd3", [1], write_context) + + +@pytest.mark.usefixtures("imod5_dataset") +def test_from_imod5_shd(imod5_dataset, tmp_path): + imod5_data = imod5_dataset[0] + + target_dis = StructuredDiscretization.from_imod5_data(imod5_data) + + chd_shd = imod.mf6.ConstantHead.from_imod5_shd_data( + imod5_data, + target_dis, + regridder_types=None, + ) + + assert isinstance(chd_shd, imod.mf6.ConstantHead) + assert len(chd_shd.dataset["layer"].values) == 37 + # write the packages for write validation + write_context = WriteContext(simulation_directory=tmp_path, use_binary=False) + chd_shd.write("chd_shd", [1], write_context) + + +@pytest.mark.unittest_jit +@pytest.mark.parametrize("remove_merged_packages", [True, False]) +@pytest.mark.usefixtures("imod5_dataset") +def test_concatenate_chd(imod5_dataset, tmp_path, remove_merged_packages): + # Arrange + imod5_data = imod5_dataset[0] + + target_dis = StructuredDiscretization.from_imod5_data(imod5_data) + chd_packages = {} + + # import a few chd packages per layer + for layer in range(1, 7): + key = f"chd-{layer}" + chd_packages[key] = imod.mf6.ConstantHead.from_imod5_data( + key, + imod5_data, + target_dis, + ) + + # import a few chd packages per layer but store them under another key + for layer in range(8, 16): + key = f"chd-{layer}" + other_key = f"other_chd-{layer}" + chd_packages[other_key] = imod.mf6.ConstantHead.from_imod5_data( + key, + imod5_data, + target_dis, + ) + + # Act + merged_package = concat_layered_chd_packages( + "chd", chd_packages, remove_merged_packages + ) + + # Assert + assert isinstance(merged_package, ConstantHead) + assert len(merged_package["layer"]) == 6 + if remove_merged_packages: + assert len(chd_packages) == 8 + else: + assert len(chd_packages) == 14 + # write the packages for write validation + write_context = WriteContext(simulation_directory=tmp_path, use_binary=False) + merged_package.write("merged_chd", [1], write_context) diff --git a/imod/tests/test_mf6/test_mf6_dis.py b/imod/tests/test_mf6/test_mf6_dis.py index ec12f0838..be502d8e5 100644 --- a/imod/tests/test_mf6/test_mf6_dis.py +++ b/imod/tests/test_mf6/test_mf6_dis.py @@ -8,6 +8,9 @@ import imod from imod.mf6.write_context import WriteContext from imod.schemata import ValidationError +from imod.tests.fixtures.backward_compatibility_fixture import ( + _load_imod5_data_in_memory, +) @pytest.fixture(scope="function") @@ -189,3 +192,65 @@ def test_write_ascii_griddata_2d_3d(idomain_and_bottom, tmp_path): with open(directory / "dis/botm.dat") as f: bottom_content = f.readlines() assert len(bottom_content) == 1 + + +@pytest.mark.usefixtures("imod5_dataset") +def test_from_imod5_data__idomain_values(imod5_dataset): + imod5_data = imod5_dataset[0] + + dis = 
imod.mf6.StructuredDiscretization.from_imod5_data(imod5_data) + + # Test if idomain has appropriate count + assert (dis["idomain"] == -1).sum() == 371824 + assert (dis["idomain"] == 0).sum() == 176912 + assert (dis["idomain"] == 1).sum() == 703936 + + +@pytest.mark.usefixtures("imod5_dataset") +def test_from_imod5_data__grid_extent(imod5_dataset): + imod5_data = imod5_dataset[0] + + dis = imod.mf6.StructuredDiscretization.from_imod5_data(imod5_data) + + # Test if regridded to smallest grid resolution + assert dis["top"].dx == 25.0 + assert dis["top"].dy == -25.0 + assert (dis.dataset.coords["x"][1] - dis.dataset.coords["x"][0]) == 25.0 + assert (dis.dataset.coords["y"][1] - dis.dataset.coords["y"][0]) == -25.0 + + # Test extent + assert dis.dataset.coords["y"].min() == 360712.5 + assert dis.dataset.coords["y"].max() == 365287.5 + assert dis.dataset.coords["x"].min() == 194712.5 + assert dis.dataset.coords["x"].max() == 199287.5 + + +@pytest.mark.usefixtures("imod5_dataset") +def test_from_imod5_data__write(imod5_dataset, tmp_path): + directory = tmp_path / "dis_griddata" + directory.mkdir() + write_context = WriteContext(simulation_directory=directory) + imod5_data = imod5_dataset[0] + + dis = imod.mf6.StructuredDiscretization.from_imod5_data(imod5_data) + + # Test if package written without ValidationError + dis.write(pkgname="dis", globaltimes=[], write_context=write_context) + + # Assert if files written + assert (directory / "dis/top.dat").exists() + assert (directory / "dis/botm.dat").exists() + + +def test_from_imod5_data__validation_error(tmp_path): + # don't use the fixture "imod5_dataset" for this test, because we don't want the + # ibound cleanup. Without this cleanup we get a validation error, + # which is what we want to test here. + + tmp_path = imod.util.temporary_directory() + data = imod.data.imod5_projectfile_data(tmp_path) + data = data[0] + + _load_imod5_data_in_memory(data) + with pytest.raises(ValidationError): + imod.mf6.StructuredDiscretization.from_imod5_data(data) diff --git a/imod/tests/test_mf6/test_mf6_drn.py b/imod/tests/test_mf6/test_mf6_drn.py index 0887a4532..a2163ce5d 100644 --- a/imod/tests/test_mf6/test_mf6_drn.py +++ b/imod/tests/test_mf6/test_mf6_drn.py @@ -1,15 +1,26 @@ import pathlib import textwrap +from datetime import datetime import numpy as np import pandas as pd import pytest import xarray as xr +from pytest_cases import parametrize_with_cases import imod +import imod.mf6.drn from imod.logging import LoggerType, LogLevel +from imod.mf6.dis import StructuredDiscretization +from imod.mf6.npf import NodePropertyFlow from imod.mf6.utilities.package import get_repeat_stress from imod.mf6.write_context import WriteContext +from imod.prepare.topsystem.allocation import ALLOCATION_OPTION +from imod.prepare.topsystem.conductance import DISTRIBUTING_OPTION +from imod.prepare.topsystem.default_allocation_methods import ( + SimulationAllocationOptions, + SimulationDistributingOptions, +) from imod.schemata import ValidationError @@ -465,3 +476,51 @@ def test_html_repr(drainage): html_string = imod.mf6.Drainage(**drainage)._repr_html_() assert isinstance(html_string, str) assert html_string.split("")[0] == "
Drainage" + + +class AllocationSettings: + def case_default(self): + return SimulationAllocationOptions.drn, SimulationDistributingOptions.drn + + def case_custom(self): + return ALLOCATION_OPTION.at_elevation, DISTRIBUTING_OPTION.by_crosscut_thickness + + +@parametrize_with_cases( + ["allocation_setting", "distribution_setting"], cases=AllocationSettings +) +def test_from_imod5( + imod5_dataset_periods, tmp_path, allocation_setting, distribution_setting +): + period_data = imod5_dataset_periods[1] + imod5_dataset = imod5_dataset_periods[0] + target_dis = StructuredDiscretization.from_imod5_data(imod5_dataset, validate=False) + target_npf = NodePropertyFlow.from_imod5_data( + imod5_dataset, target_dis.dataset["idomain"] + ) + + drn_2 = imod.mf6.Drainage.from_imod5_data( + "drn-2", + imod5_dataset, + period_data, + target_dis, + target_npf, + allocation_option=allocation_setting, + distributing_option=distribution_setting, + time_min=datetime(2002, 2, 2), + time_max=datetime(2022, 2, 2), + regridder_types=None, + ) + + assert isinstance(drn_2, imod.mf6.Drainage) + + pkg_errors = drn_2._validate( + schemata=drn_2._write_schemata, + idomain=target_dis["idomain"], + bottom=target_dis["bottom"], + ) + assert len(pkg_errors) == 0 + + # write the packages for write validation + write_context = WriteContext(simulation_directory=tmp_path, use_binary=False) + drn_2.write("mydrn", [1], write_context) diff --git a/imod/tests/test_mf6/test_mf6_generalheadboundary.py b/imod/tests/test_mf6/test_mf6_generalheadboundary.py new file mode 100644 index 000000000..bd9ca4648 --- /dev/null +++ b/imod/tests/test_mf6/test_mf6_generalheadboundary.py @@ -0,0 +1,66 @@ +from datetime import datetime + +import imod +from imod.mf6.dis import StructuredDiscretization +from imod.mf6.npf import NodePropertyFlow +from imod.mf6.write_context import WriteContext +from imod.prepare.topsystem.allocation import ALLOCATION_OPTION +from imod.prepare.topsystem.conductance import DISTRIBUTING_OPTION + + +def test_from_imod5_non_planar(imod5_dataset_periods, tmp_path): + period_data = imod5_dataset_periods[1] + imod5_dataset = imod5_dataset_periods[0] + target_dis = StructuredDiscretization.from_imod5_data(imod5_dataset, validate=False) + target_npf = NodePropertyFlow.from_imod5_data( + imod5_dataset, target_dis.dataset["idomain"] + ) + + ghb = imod.mf6.GeneralHeadBoundary.from_imod5_data( + "ghb", + imod5_dataset, + period_data, + target_dis, + target_npf, + time_min=datetime(2002, 2, 2), + time_max=datetime(2022, 2, 2), + allocation_option=ALLOCATION_OPTION.at_elevation, + distributing_option=DISTRIBUTING_OPTION.by_crosscut_thickness, + ) + + assert isinstance(ghb, imod.mf6.GeneralHeadBoundary) + + # write the packages for write validation + write_context = WriteContext(simulation_directory=tmp_path, use_binary=False) + ghb.write("ghb", [1], write_context) + + +def test_from_imod5_planar(imod5_dataset_periods, tmp_path): + period_data = imod5_dataset_periods[1] + imod5_dataset = imod5_dataset_periods[0] + target_dis = StructuredDiscretization.from_imod5_data(imod5_dataset, validate=False) + target_npf = NodePropertyFlow.from_imod5_data( + imod5_dataset, target_dis.dataset["idomain"] + ) + imod5_dataset["ghb"]["conductance"] = imod5_dataset["ghb"][ + "conductance" + ].assign_coords({"layer": [0]}) + imod5_dataset["ghb"]["head"] = imod5_dataset["ghb"]["head"].isel({"layer": 0}) + + ghb = imod.mf6.GeneralHeadBoundary.from_imod5_data( + "ghb", + imod5_dataset, + period_data, + target_dis, + target_npf, + time_min=datetime(2002, 2, 2), 
+ time_max=datetime(2022, 2, 2), + allocation_option=ALLOCATION_OPTION.at_elevation, + distributing_option=DISTRIBUTING_OPTION.by_layer_thickness, + ) + + assert isinstance(ghb, imod.mf6.GeneralHeadBoundary) + + # write the packages for write validation + write_context = WriteContext(simulation_directory=tmp_path, use_binary=False) + ghb.write("ghb", [1], write_context) diff --git a/imod/tests/test_mf6/test_mf6_hfb.py b/imod/tests/test_mf6/test_mf6_hfb.py index 3e7c61109..b4892b302 100644 --- a/imod/tests/test_mf6/test_mf6_hfb.py +++ b/imod/tests/test_mf6/test_mf6_hfb.py @@ -1,25 +1,40 @@ +from copy import deepcopy from unittest.mock import patch import geopandas as gpd import numpy as np import pytest -import shapely import xarray as xr import xugrid as xu from numpy.testing import assert_array_equal +from shapely import Polygon, get_coordinates, linestrings from imod.mf6 import ( HorizontalFlowBarrierHydraulicCharacteristic, HorizontalFlowBarrierMultiplier, HorizontalFlowBarrierResistance, - LayeredHorizontalFlowBarrierHydraulicCharacteristic, - LayeredHorizontalFlowBarrierMultiplier, - LayeredHorizontalFlowBarrierResistance, + SingleLayerHorizontalFlowBarrierHydraulicCharacteristic, + SingleLayerHorizontalFlowBarrierMultiplier, + SingleLayerHorizontalFlowBarrierResistance, ) -from imod.mf6.hfb import to_connected_cells_dataset +from imod.mf6.dis import StructuredDiscretization +from imod.mf6.hfb import ( + _extract_mean_hfb_bounds_from_dataframe, + _make_linestring_from_polygon, + _prepare_barrier_dataset_for_mf6_adapter, + _snap_to_grid_and_aggregate, + to_connected_cells_dataset, +) +from imod.mf6.ims import SolutionPresetSimple +from imod.mf6.npf import NodePropertyFlow +from imod.mf6.simulation import Modflow6Simulation from imod.mf6.utilities.regrid import RegridderWeightsCache +from imod.prepare.hfb import ( + linestring_to_square_zpolygons, + linestring_to_trapezoid_zpolygons, +) from imod.tests.fixtures.flow_basic_fixture import BasicDisSettings -from imod.typing.grid import ones_like +from imod.typing.grid import nan_like, ones_like @pytest.mark.parametrize("dis", ["basic_unstructured_dis", "basic_dis"]) @@ -52,15 +67,19 @@ def test_to_mf6_creates_mf6_adapter_init( print_input = False + barrier_ztop = [0.0, 0.0] + barrier_zbottom = [min(bottom.values), min(bottom.values)] barrier_y = [5.5, 5.5, 5.5] barrier_x = [82.0, 40.0, 0.0] + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) + geometry = gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ - barrier_value_name: [barrier_value], - "ztop": [0.0], - "zbottom": [min(bottom.values)], + barrier_value_name: [barrier_value, barrier_value], }, ) @@ -70,7 +89,10 @@ def test_to_mf6_creates_mf6_adapter_init( _ = hfb.to_mf6_pkg(idomain, top, bottom, k) # Assert. 
- snapped, _ = xu.snap_to_grid(geometry, grid=idomain, max_snap_distance=0.5) + lines = _make_linestring_from_polygon(geometry) + gdf_line = deepcopy(geometry) + gdf_line["geometry"] = lines + snapped, _ = xu.snap_to_grid(gdf_line, grid=idomain, max_snap_distance=0.5) edge_index = np.argwhere(snapped[barrier_value_name].notnull().values).ravel() grid = ( @@ -88,6 +110,7 @@ def test_to_mf6_creates_mf6_adapter_init( expected_values = to_connected_cells_dataset( idomain, grid, edge_index, {barrier_value_name: expected_barrier_values} ) + expected_values = _prepare_barrier_dataset_for_mf6_adapter(expected_values) mf6_flow_barrier_mock.assert_called_once() @@ -102,34 +125,32 @@ def test_to_mf6_creates_mf6_adapter_init( @pytest.mark.parametrize("dis", ["basic_unstructured_dis", "basic_dis"]) -def test_to_mf6_creates_mf6_adapter( +def test_hfb_regrid( dis, request, ): - barrier_class, barrier_value_name, barrier_value = ( - HorizontalFlowBarrierResistance, - "resistance", - 1e3, - ) - # Arrange idomain, _, _ = request.getfixturevalue(dis) print_input = False + barrier_ztop = [0.0, 0.0] + barrier_zbottom = [-100.0, -100.0] barrier_y = [5.5, 5.5, 5.5] barrier_x = [82.0, 40.0, 0.0] + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) + geometry = gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ - barrier_value_name: [barrier_value], - "ztop": [0.0], - "zbottom": [-100.0], + "resistance": [1e3, 1e3], }, ) - hfb = barrier_class(geometry, print_input) + hfb = HorizontalFlowBarrierResistance(geometry, print_input) # Act if isinstance(idomain, xu.UgridDataArray): @@ -137,24 +158,28 @@ def test_to_mf6_creates_mf6_adapter( else: idomain_clipped = idomain.sel(x=slice(None, 54.0)) - regrid_context = RegridderWeightsCache() + regrid_cache = RegridderWeightsCache() - hfb_clipped = hfb.regrid_like(idomain_clipped.sel(layer=1), regrid_context) + hfb_clipped = hfb.regrid_like(idomain_clipped.sel(layer=1), regrid_cache) # Assert - x, y = hfb_clipped.dataset["geometry"].values[0].xy - np.testing.assert_allclose(x, [50.0, 40.0, 0.0]) - np.testing.assert_allclose(y, [5.5, 5.5, 5.5]) + geometries = hfb_clipped.dataset["geometry"].values.ravel() + + xyz_arr = get_coordinates(geometries[0], include_z=True) # 2nd polygon is clipped + + np.testing.assert_allclose(xyz_arr[:, 0], [40.0, 50.0, 50.0, 40.0, 40.0]) + np.testing.assert_allclose(xyz_arr[:, 1], [5.5, 5.5, 5.5, 5.5, 5.5]) + np.testing.assert_allclose(xyz_arr[:, 2], [-100.0, -100.0, 0.0, 0.0, -100.0]) @pytest.mark.parametrize("dis", ["basic_unstructured_dis", "basic_dis"]) @pytest.mark.parametrize( "barrier_class, barrier_value_name, barrier_value, expected_hydraulic_characteristic", [ - (LayeredHorizontalFlowBarrierResistance, "resistance", 1e3, 1e-3), - (LayeredHorizontalFlowBarrierMultiplier, "multiplier", 1.5, -1.5), + (SingleLayerHorizontalFlowBarrierResistance, "resistance", 1e3, 1e-3), + (SingleLayerHorizontalFlowBarrierMultiplier, "multiplier", 1.5, -1.5), ( - LayeredHorizontalFlowBarrierHydraulicCharacteristic, + SingleLayerHorizontalFlowBarrierHydraulicCharacteristic, "hydraulic_characteristic", 1e-3, 1e-3, @@ -182,7 +207,7 @@ def test_to_mf6_creates_mf6_adapter_layered( geometry = gpd.GeoDataFrame( geometry=[ - shapely.linestrings(barrier_x, barrier_y), + linestrings(barrier_x, barrier_y), ], data={ barrier_value_name: [barrier_value], @@ -193,7 +218,7 @@ def test_to_mf6_creates_mf6_adapter_layered( hfb = barrier_class(geometry, print_input) # Act. 
- _ = hfb.to_mf6_pkg(idomain, top, bottom, k, False) + _ = hfb.to_mf6_pkg(idomain, top, bottom, k) # Assert. snapped, _ = xu.snap_to_grid(geometry, grid=idomain, max_snap_distance=0.5) @@ -216,6 +241,7 @@ def test_to_mf6_creates_mf6_adapter_layered( expected_values = to_connected_cells_dataset( idomain, grid, edge_index, {barrier_value_name: expected_barrier_values} ) + expected_values = _prepare_barrier_dataset_for_mf6_adapter(expected_values) mf6_flow_barrier_mock.assert_called_once() @@ -235,7 +261,6 @@ def test_to_mf6_creates_mf6_adapter_layered( (-5.0, -35.0, np.array([1, 1e3, 1])), # 2nd layer (0.0, -35.0, np.array([1e3, 1e3, 1])), # 1st and 2nd layer (-5.0, -135.0, np.array([1, 1e3, 1e3])), # 2nd and 3th layer - (-5.0, -135.0, np.array([1, 1e3, 1e3])), # 2nd and 3th layer (100.0, -135.0, np.array([1e3, 1e3, 1e3])), # ztop out of bounds (0.0, -200.0, np.array([1e3, 1e3, 1e3])), # zbottom out of bounds (100.0, 50.0, np.array([1, 1, 1])), # z-range has no overlap with the domain @@ -247,7 +272,7 @@ def test_to_mf6_creates_mf6_adapter_layered( ], ) @patch("imod.mf6.mf6_hfb_adapter.Mf6HorizontalFlowBarrier.__new__", autospec=True) -def test_to_mf6_different_z_boundaries( +def test_to_mf6_different_constant_z_boundaries( mf6_flow_barrier_mock, basic_dis, ztop, zbottom, expected_values ): # Arrange. @@ -256,15 +281,19 @@ def test_to_mf6_different_z_boundaries( print_input = False + barrier_ztop = [ztop, ztop] + barrier_zbottom = [zbottom, zbottom] barrier_y = [5.5, 5.5, 5.5] barrier_x = [82.0, 40.0, 0.0] + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) + geometry = gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ - "resistance": [1e3], - "ztop": [ztop], - "zbottom": [zbottom], + "resistance": [1e3, 1e3], }, ) @@ -280,6 +309,193 @@ def test_to_mf6_different_z_boundaries( assert_array_equal(max_values_per_layer, 1.0 / expected_values) +@pytest.mark.parametrize( + "barrier_ztop, barrier_zbottom, expected_values", + [ + ( + [0.0, -5.0], + [ + -35.0, + -35.0, + ], + np.array([[1e3, 1e3, 1], [1, 3e3, 1]]), + ), # 1st and 2nd layer, 2nd layer + ( + [100.0, 0.0], + [-135.0, -35.0], + np.array([[1e3, 1e3, 1e3], [3e3, 3e3, 1]]), + ), # ztop out of bounds, 1st and 2nd layer, + ( + [0.0, 100.0], + [-200.0, 50.0], + np.array([[1e3, 1e3, 1e3], [1, 1, 1]]), + ), # zbottom out of bounds, z-range has no overlap with the domain + ], +) +@patch("imod.mf6.mf6_hfb_adapter.Mf6HorizontalFlowBarrier.__new__", autospec=True) +def test_to_mf6_different_varying_square_z_boundaries( + mf6_flow_barrier_mock, basic_dis, barrier_ztop, barrier_zbottom, expected_values +): + """ + Test with square zpolygons with varying bounds. The second barrier is a + barrier that is so short it should be ignored. + """ + # Arrange. + idomain, top, bottom = basic_dis + k = ones_like(top) + + print_input = False + + # Insert second barrier values, which need to be ignored + barrier_ztop.insert(1, min(barrier_ztop)) + barrier_zbottom.insert(1, max(barrier_zbottom)) + barrier_y = [5.5, 5.5, 5.5, 5.5] + barrier_x = [0.0, 40.0, 41.0, 82.0] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) + + geometry = gpd.GeoDataFrame( + geometry=polygons, + data={ + "resistance": [1e3, 2e3, 3e3], + }, + ) + + hfb = HorizontalFlowBarrierResistance(geometry, print_input) + + # Act. + _ = hfb.to_mf6_pkg(idomain, top, bottom, k) + + # Assert. 
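+    # The patched Mf6HorizontalFlowBarrier captures the arguments it was called
+    # with; hydraulic characteristic is the reciprocal of resistance, hence the
+    # comparison against ``1.0 / expected_values``.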
+ _, args = mf6_flow_barrier_mock.call_args + barrier_values = args["hydraulic_characteristic"].values.reshape(3, 8) + assert_array_equal(barrier_values[:, 3:5], 1.0 / expected_values.T) + + +@pytest.mark.parametrize( + "barrier_ztop, barrier_zbottom, expected_values", + [ + ( + [2.5, -2.5, -7.5], + [ + -35.0, + -35.0, + -35.0, + ], + np.array([[1e3, 1e3, 1], [1, 3e3, 1]]), + ), # 1st and 2nd layer, 2nd layer + ( + [200.0, 0.0, 0.0], + [-270.0, -35.0, -35.0], + np.array([[1e3, 1e3, 1e3], [3e3, 3e3, 1]]), + ), # ztop out of bounds, 1st and 2nd layer, + ( + [0.0, 200.0, 200.0], + [-400.0, 50.0, 50.0], + np.array([[1e3, 1e3, 1e3], [1, 1, 1]]), + ), # zbottom out of bounds, z-range has no overlap with the domain + ], +) +@patch("imod.mf6.mf6_hfb_adapter.Mf6HorizontalFlowBarrier.__new__", autospec=True) +def test_to_mf6_different_trapezoid_z_boundaries( + mf6_flow_barrier_mock, basic_dis, barrier_ztop, barrier_zbottom, expected_values +): + # Arrange. + idomain, top, bottom = basic_dis + k = ones_like(top) + + print_input = False + + barrier_y = [5.5, 5.5, 5.5] + barrier_x = [0.0, 40.0, 82.0] + + polygons = linestring_to_trapezoid_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) + + geometry = gpd.GeoDataFrame( + geometry=polygons, + data={ + "resistance": [1e3, 3e3], + }, + ) + + hfb = HorizontalFlowBarrierResistance(geometry, print_input) + + # Act. + _ = hfb.to_mf6_pkg(idomain, top, bottom, k) + + # Assert. + _, args = mf6_flow_barrier_mock.call_args + barrier_values = args["hydraulic_characteristic"].values.reshape(3, 8) + assert_array_equal(barrier_values[:, 3:5], 1.0 / expected_values.T) + + +@pytest.mark.parametrize( + "layer, expected_values", + [ + (2, np.array([1e3, 1e3, 1e3, 1e3, 1e3, 1e3, 1e3, 1e3])), # 2nd layer + ], +) +@patch("imod.mf6.mf6_hfb_adapter.Mf6HorizontalFlowBarrier.__new__", autospec=True) +def test_to_mf6_layered_hfb(mf6_flow_barrier_mock, basic_dis, layer, expected_values): + # Arrange. + idomain, top, bottom = basic_dis + k = ones_like(top) + + print_input = False + + barrier_y = [5.5, 5.5, 5.5] + barrier_x = [82.0, 40.0, 0.0] + + geometry = gpd.GeoDataFrame( + geometry=[linestrings(barrier_x, barrier_y)], + data={ + "resistance": [1e3], + "layer": [layer], + }, + ) + + hfb = SingleLayerHorizontalFlowBarrierResistance(geometry, print_input) + + # Act. + _ = hfb.to_mf6_pkg(idomain, top, bottom, k) + + # Assert. + _, args = mf6_flow_barrier_mock.call_args + barrier_values = args["hydraulic_characteristic"].values + assert_array_equal(barrier_values, 1.0 / expected_values) + expected_layer = np.full((8,), layer) + barrier_layer = args["layer"].values + assert_array_equal(barrier_layer, expected_layer) + + +def test_to_mf6_layered_hfb__error(): + """Throws error because multiple layers attached to one object.""" + # Arrange. 
+ print_input = False + + barrier_y = [5.5, 5.5, 5.5] + barrier_x = [82.0, 40.0, 0.0] + + linestring = linestrings(barrier_x, barrier_y) + + geometry = gpd.GeoDataFrame( + geometry=[linestring, linestring], + data={ + "resistance": [1e3, 1e3], + "layer": [1, 2], + }, + ) + + hfb = SingleLayerHorizontalFlowBarrierResistance(geometry, print_input) + errors = hfb._validate(hfb._write_schemata) + + assert len(errors) > 0 + + @pytest.mark.parametrize( "barrier_x_loc, expected_number_barriers", [ @@ -310,15 +526,19 @@ def test_to_mf6_remove_invalid_edges( ) k = ones_like(top) + barrier_ztop = [0.0] + barrier_zbottom = [-5.0] barrier_y = [0.0, 2.0] barrier_x = [barrier_x_loc, barrier_x_loc] + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) + geometry = gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ "resistance": [1e3], - "ztop": [0], - "zbottom": [-5], }, ) @@ -367,15 +587,23 @@ def test_to_mf6_remove_barrier_parts_adjacent_to_inactive_cells( ] = inactivity_marker # make cell inactive k = ones_like(top) + barrier_ztop = [ + 0.0, + ] + barrier_zbottom = [ + -5.0, + ] barrier_y = [0.0, 2.0] barrier_x = [barrier_x_loc, barrier_x_loc] + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) + geometry = gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ "resistance": [1e3], - "ztop": [0], - "zbottom": [-5], }, ) @@ -393,22 +621,27 @@ def test_to_mf6_remove_barrier_parts_adjacent_to_inactive_cells( def test_is_empty(): geometry = gpd.GeoDataFrame( - geometry=[shapely.linestrings([], [])], + geometry=[Polygon()], data={ "resistance": [], - "ztop": [], - "zbottom": [], }, ) hfb = HorizontalFlowBarrierResistance(geometry) assert hfb.is_empty() + barrier_ztop = [0.0, 0.0] + barrier_zbottom = [-5.0, -5.0] + barrier_y = [0.0, 2.0, 3.0] + barrier_x = [0.0, 0.0, 0.0] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) + geometry = gpd.GeoDataFrame( - geometry=[shapely.linestrings([0, 0], [1, 1])], + geometry=polygons, data={ - "resistance": [1.0], - "ztop": [1.0], - "zbottom": [1.0], + "resistance": [1.0, 1.0], }, ) @@ -424,15 +657,21 @@ def test_is_empty(): @pytest.mark.parametrize("print_input", [True, False]) def test_set_options(print_input, parameterizable_basic_dis): idomain, top, bottom = parameterizable_basic_dis + + barrier_x = [-1000.0, 1000.0] + barrier_y = [0.3, 0.3] + barrier_ztop = [top.values[0]] + barrier_zbottom = [bottom.values[-1]] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) + hfb = HorizontalFlowBarrierResistance( geometry=gpd.GeoDataFrame( - geometry=[ - shapely.linestrings([-1000.0, 1000.0], [0.3, 0.3]), - ], + geometry=polygons, data={ "resistance": [1e3], - "ztop": [10.0], - "zbottom": [0.0], }, ), print_input=print_input, @@ -440,3 +679,221 @@ def test_set_options(print_input, parameterizable_basic_dis): k = ones_like(top) mf6_package = hfb.to_mf6_pkg(idomain, top, bottom, k) assert mf6_package.dataset["print_input"].values[()] == print_input + + +@pytest.mark.usefixtures("imod5_dataset") +def test_hfb_from_imod5(imod5_dataset, tmp_path): + imod5_data = imod5_dataset[0] + target_dis = StructuredDiscretization.from_imod5_data(imod5_data) + target_npf = NodePropertyFlow.from_imod5_data( + imod5_data, target_dis.dataset["idomain"] + ) + + hfb = 
SingleLayerHorizontalFlowBarrierResistance.from_imod5_dataset( + "hfb-3", imod5_data + ) + hfb_package = hfb.to_mf6_pkg( + target_dis["idomain"], target_dis["top"], target_dis["bottom"], target_npf["k"] + ) + assert list(np.unique(hfb_package["layer"].values)) == [7] + + +@pytest.mark.usefixtures("structured_flow_model") +def test_snap_to_grid_and_aggregate(structured_flow_model): + idomain = structured_flow_model["dis"]["idomain"] + grid2d = xu.Ugrid2d.from_structured(idomain) + + barrier_y = [11.0, 5.0, -1.0] + barrier_x = [5.0, 5.0, 5.0] + line = linestrings(barrier_x, barrier_y) + layer = [1, 1, 1] + + geometry_triple = gpd.GeoDataFrame( + geometry=[line, line, line], + data={ + "resistance": [400.0, 400.0, 400.0], + "layer": layer, + "line_index": [0, 1, 2], + }, + ) + geometry_triple = geometry_triple.set_index("line_index") + + vardict_agg = {"resistance": "sum", "layer": "first"} + + snapped_dataset, edge_index = _snap_to_grid_and_aggregate( + geometry_triple, grid2d, vardict_agg + ) + + argwhere_summed_expected = np.array([7, 20, 33, 46, 59, 72], dtype=np.int64) + argwhere_summed_actual = np.nonzero((snapped_dataset["resistance"] == 1200).values)[ + 0 + ] + + np.testing.assert_array_equal(argwhere_summed_actual, argwhere_summed_expected) + np.testing.assert_array_equal(edge_index, argwhere_summed_expected) + + +@pytest.mark.usefixtures("structured_flow_model") +def test_combine_linestrings(structured_flow_model): + dis = structured_flow_model["dis"] + top, bottom, idomain = ( + dis["top"], + dis["bottom"], + dis["idomain"], + ) + k = xr.ones_like(idomain) + + barrier_y = [11.0, 5.0, -1.0] + barrier_x = [5.0, 5.0, 5.0] + line = linestrings(barrier_x, barrier_y) + + geometry_single = gpd.GeoDataFrame( + geometry=[line], + data={ + "resistance": [1200.0], + "layer": [1], + }, + ) + geometry_triple = gpd.GeoDataFrame( + geometry=[line, line, line], + data={ + "resistance": [400.0, 400.0, 400.0], + "layer": [1, 1, 1], + }, + ) + hfb_single = SingleLayerHorizontalFlowBarrierResistance(geometry_single) + hfb_triple = SingleLayerHorizontalFlowBarrierResistance(geometry_triple) + mf6_hfb_single = hfb_single.to_mf6_pkg(idomain, top, bottom, k) + mf6_hfb_triple = hfb_triple.to_mf6_pkg(idomain, top, bottom, k) + + xr.testing.assert_equal(mf6_hfb_single.dataset, mf6_hfb_triple.dataset) + + +@pytest.mark.usefixtures("structured_flow_model") +def test_run_multiple_hfbs(tmp_path, structured_flow_model): + # Single layered model + structured_flow_model = structured_flow_model.clip_box(layer_max=1) + structured_flow_model["dis"]["bottom"] = structured_flow_model["dis"][ + "bottom" + ].isel(x=0, y=0, drop=True) + # Arrange boundary conditions into something simple: + # A linear decline from left to right, forced by chd + structured_flow_model.pop("rch") + chd_head = nan_like(structured_flow_model["chd"].dataset["head"]) + chd_head[:, :, 0] = 10.0 + chd_head[:, :, -1] = 0.0 + structured_flow_model["chd"].dataset["head"] = chd_head + + barrier_y = [11.0, 5.0, -1.0] + barrier_x = [5.0, 5.0, 5.0] + + geometry = gpd.GeoDataFrame( + geometry=[linestrings(barrier_x, barrier_y)], + data={ + "resistance": [1200.0], + "layer": [1], + }, + ) + + simulation_single = Modflow6Simulation("single_hfb") + structured_flow_model["hfb"] = SingleLayerHorizontalFlowBarrierResistance(geometry) + simulation_single["GWF"] = structured_flow_model + simulation_single["solver"] = SolutionPresetSimple(["GWF"]) + simulation_single.create_time_discretization(["2000-01-01", "2000-01-02"]) + simulation_single.write(tmp_path / 
"single") + simulation_single.run() + head_single = simulation_single.open_head() + + geometry = gpd.GeoDataFrame( + geometry=[linestrings(barrier_x, barrier_y)], + data={ + "resistance": [400.0], + "layer": [1], + }, + ) + + simulation_triple = Modflow6Simulation("triple_hfb") + structured_flow_model.pop("hfb") # Remove high resistance HFB package now. + structured_flow_model["hfb-1"] = SingleLayerHorizontalFlowBarrierResistance( + geometry + ) + structured_flow_model["hfb-2"] = SingleLayerHorizontalFlowBarrierResistance( + geometry + ) + structured_flow_model["hfb-3"] = SingleLayerHorizontalFlowBarrierResistance( + geometry + ) + simulation_triple["GWF"] = structured_flow_model + simulation_triple["solver"] = SolutionPresetSimple(["GWF"]) + simulation_triple.create_time_discretization(["2000-01-01", "2000-01-02"]) + simulation_triple.write(tmp_path / "triple") + simulation_triple.run() + head_triple = simulation_triple.open_head() + + xr.testing.assert_equal(head_single, head_triple) + + +def test_make_linestring_from_polygon(): + barrier_x = [-1000.0, 1000.0] + barrier_y = [0.3, 0.3] + barrier_ztop = [10.0] + barrier_zbottom = [-10.0] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) + + gdf_polygons = gpd.GeoDataFrame( + geometry=polygons, + data={ + "resistance": [1e3], + }, + ) + + linestrings = _make_linestring_from_polygon(gdf_polygons) + + coordinates = get_coordinates(linestrings) + + np.testing.assert_allclose(barrier_x, coordinates[:, 0]) + np.testing.assert_allclose(barrier_y, coordinates[:, 1]) + + +def test_extract_hfb_bounds_from_dataframe(): + barrier_x = [-1000.0, 0.0, 1000.0] + barrier_y = [0.3, 0.3, 0.3] + barrier_ztop = [10.0, 20.0] + barrier_zbottom = [-10.0, -30.0] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) + + gdf_polygons = gpd.GeoDataFrame( + geometry=polygons, + data={ + "resistance": [1e3, 1e3], + }, + ) + + zmin, zmax = _extract_mean_hfb_bounds_from_dataframe(gdf_polygons) + + np.testing.assert_equal(zmin.values, barrier_zbottom) + np.testing.assert_equal(zmax.values, barrier_ztop) + + +def test_extract_hfb_bounds_from_dataframe__fails(): + """Test if function throws error when providing a line.""" + barrier_x = [-1000.0, 0.0, 1000.0] + barrier_y = [0.3, 0.3, 0.3] + + line_data = linestrings(barrier_x, barrier_y) + + gdf_polygons = gpd.GeoDataFrame( + geometry=[line_data, line_data], + data={ + "resistance": [1e3, 1e3], + }, + ) + + with pytest.raises(TypeError): + _extract_mean_hfb_bounds_from_dataframe(gdf_polygons) diff --git a/imod/tests/test_mf6/test_mf6_ic.py b/imod/tests/test_mf6/test_mf6_ic.py index ef3ebecbf..d1bc23623 100644 --- a/imod/tests/test_mf6/test_mf6_ic.py +++ b/imod/tests/test_mf6/test_mf6_ic.py @@ -1,5 +1,6 @@ import pathlib import textwrap +from copy import deepcopy import pytest @@ -40,3 +41,17 @@ def test_validate_false(): def test_wrong_arguments(): with pytest.raises(ValueError): imod.mf6.InitialConditions(head=0.0, start=1.0) + + +@pytest.mark.usefixtures("imod5_dataset") +def test_from_imod5(imod5_dataset, tmp_path): + data = deepcopy(imod5_dataset[0]) + + target_grid = data["khv"]["kh"] + + ic = imod.mf6.InitialConditions.from_imod5_data(data, target_grid) + + ic._validate_init_schemata(True) + + rendered_ic = ic.render(tmp_path, "ic", None, False) + assert "strt" in rendered_ic diff --git a/imod/tests/test_mf6/test_mf6_logging.py b/imod/tests/test_mf6/test_mf6_logging.py index 0b02740f9..16611de17 100644 --- 
a/imod/tests/test_mf6/test_mf6_logging.py +++ b/imod/tests/test_mf6/test_mf6_logging.py @@ -10,6 +10,7 @@ import imod from imod.logging import LoggerType, LogLevel, standard_log_decorator +from imod.mf6.validation_context import ValidationContext from imod.mf6.write_context import WriteContext out = StringIO() @@ -80,6 +81,7 @@ def test_write_model_is_logged( # arrange logfile_path = tmp_path / "logfile.txt" transport_model = flow_transport_simulation["tpt_c"] + validation_context = ValidationContext() write_context = WriteContext(simulation_directory=tmp_path, use_binary=True) globaltimes = np.array( [ @@ -95,7 +97,9 @@ def test_write_model_is_logged( add_default_file_handler=False, add_default_stream_handler=True, ) - transport_model.write("model.txt", globaltimes, True, write_context) + transport_model.write( + "model.txt", globaltimes, write_context, validation_context + ) # assert with open(logfile_path, "r") as log_file: diff --git a/imod/tests/test_mf6/test_mf6_model.py b/imod/tests/test_mf6/test_mf6_model.py index fc0c30084..a5597c5bb 100644 --- a/imod/tests/test_mf6/test_mf6_model.py +++ b/imod/tests/test_mf6/test_mf6_model.py @@ -16,6 +16,7 @@ from imod.mf6.model import Modflow6Model from imod.mf6.model_gwf import GroundwaterFlowModel from imod.mf6.package import Package +from imod.mf6.validation_context import ValidationContext from imod.mf6.write_context import WriteContext from imod.schemata import ValidationError @@ -105,6 +106,7 @@ def test_write_valid_model_without_error(self, tmpdir_factory): model_name = "Test model" model = Modflow6Model() # create write context + validation_context = ValidationContext() write_context = WriteContext(tmp_path) discretization_mock = MagicMock(spec_set=Package) @@ -119,7 +121,9 @@ def test_write_valid_model_without_error(self, tmpdir_factory): global_times_mock = MagicMock(spec_set=imod.mf6.TimeDiscretization) # Act. - status = model.write(model_name, global_times_mock, True, write_context) + status = model.write( + model_name, global_times_mock, write_context, validation_context + ) # Assert. assert not status.has_errors() @@ -130,6 +134,7 @@ def test_write_without_dis_pkg_return_error(self, tmpdir_factory): model_name = "Test model" model = Modflow6Model() # create write context + validation_context = ValidationContext() write_context = WriteContext(tmp_path) template_mock = MagicMock(spec_set=Template) @@ -139,7 +144,9 @@ def test_write_without_dis_pkg_return_error(self, tmpdir_factory): global_times_mock = MagicMock(spec_set=imod.mf6.TimeDiscretization) # Act. - status = model.write(model_name, global_times_mock, True, write_context) + status = model.write( + model_name, global_times_mock, write_context, validation_context + ) # Assert. assert status.has_errors() @@ -150,6 +157,7 @@ def test_write_with_invalid_pkg_returns_error(self, tmpdir_factory): model_name = "Test model" model = Modflow6Model() # create write context + validation_context = ValidationContext() write_context = WriteContext(tmp_path) discretization_mock = MagicMock(spec_set=Package) @@ -168,7 +176,7 @@ def test_write_with_invalid_pkg_returns_error(self, tmpdir_factory): # Act. status = model.write( - model_name, global_times_mock, True, write_context=write_context + model_name, global_times_mock, write_context, validation_context ) # Assert. @@ -178,6 +186,7 @@ def test_write_with_two_invalid_pkg_returns_two_errors(self, tmpdir_factory): # Arrange. 
tmp_path = tmpdir_factory.mktemp("TestSimulation") model_name = "Test model" + validation_context = ValidationContext() write_context = WriteContext(simulation_directory=tmp_path) model = Modflow6Model() @@ -205,7 +214,9 @@ def test_write_with_two_invalid_pkg_returns_two_errors(self, tmpdir_factory): # Act. write_context = WriteContext(tmp_path) - status = model.write(model_name, global_times_mock, True, write_context) + status = model.write( + model_name, global_times_mock, write_context, validation_context + ) # Assert. assert len(status.errors) == 2 diff --git a/imod/tests/test_mf6/test_mf6_npf.py b/imod/tests/test_mf6/test_mf6_npf.py index da2c68779..a73cb64f4 100644 --- a/imod/tests/test_mf6/test_mf6_npf.py +++ b/imod/tests/test_mf6/test_mf6_npf.py @@ -1,12 +1,14 @@ import pathlib import re import textwrap +from copy import deepcopy import numpy as np import pytest import xarray as xr import imod +from imod.mf6.utilities.regridding_types import RegridderType from imod.schemata import ValidationError @@ -208,3 +210,120 @@ def test_configure_xt3d(tmp_path): assert "xt3d" not in rendered assert "rhs" not in rendered assert not npf.get_xt3d_option() + + +@pytest.mark.usefixtures("imod5_dataset") +def test_npf_from_imod5_isotropic(imod5_dataset, tmp_path): + data = deepcopy(imod5_dataset[0]) + # throw out kva (=vertical anisotropy array) and ani (=horizontal anisotropy array) + data.pop("kva") + data.pop("ani") + + target_grid = data["khv"]["kh"] + npf = imod.mf6.NodePropertyFlow.from_imod5_data(data, target_grid) + + # Test array values are the same for k ( disregarding the locations where k == np.nan) + k_nan_removed = xr.where(np.isnan(npf.dataset["k"]), 0, npf.dataset["k"]) + np.testing.assert_allclose(k_nan_removed, data["khv"]["kh"].values) + + rendered_npf = npf.render(tmp_path, "npf", None, None) + assert "k22" not in rendered_npf + assert "k33" not in rendered_npf + assert "angle1" not in rendered_npf + assert "angle2" not in rendered_npf + assert "angle3" not in rendered_npf + + +@pytest.mark.usefixtures("imod5_dataset") +def test_npf_from_imod5_horizontal_anisotropy(imod5_dataset, tmp_path): + data = deepcopy(imod5_dataset[0]) + # throw out kva (=vertical anisotropy array) + data.pop("kva") + + target_grid = data["khv"]["kh"] + data["ani"]["angle"].values[:, :, :] = 135.0 + data["ani"]["factor"].values[:, :, :] = 0.1 + npf = imod.mf6.NodePropertyFlow.from_imod5_data(data, target_grid) + + # Test array values for k22 and angle1 + for layer in npf.dataset["k"].coords["layer"].values: + ds_layer = npf.dataset.sel({"layer": layer}) + + ds_layer = ds_layer.fillna(0.0) + + if layer in data["ani"]["factor"].coords["layer"].values: + np.testing.assert_allclose( + ds_layer["k"].values * 0.1, ds_layer["k22"].values, atol=1e-10 + ) + assert np.all(ds_layer["angle1"].values == 315.0) + else: + assert np.all(ds_layer["k"].values == ds_layer["k22"].values) + assert np.all(ds_layer["angle1"].values == 0.0) + + rendered_npf = npf.render(tmp_path, "npf", None, None) + assert "k22" in rendered_npf + assert "k33" not in rendered_npf + assert "angle1" in rendered_npf + assert "angle2" not in rendered_npf + assert "angle3" not in rendered_npf + + +@pytest.mark.usefixtures("imod5_dataset") +def test_npf_from_imod5_vertical_anisotropy(imod5_dataset, tmp_path): + data = deepcopy(imod5_dataset[0]) + # throw out ani (=horizontal anisotropy array) + data.pop("ani") + + data["kva"]["vertical_anisotropy"].values[:] = 0.1 + target_grid = data["khv"]["kh"] + + npf = 
imod.mf6.NodePropertyFlow.from_imod5_data(data, target_grid) + + # Test array values for k33 + for layer in npf.dataset["k"].coords["layer"].values: + k_layer = npf.dataset["k"].sel({"layer": layer}) + k33_layer = npf.dataset["k33"].sel({"layer": layer}) + + k_layer = xr.where(np.isnan(k_layer), 0.0, k_layer) + k33_layer = xr.where(np.isnan(k33_layer), 0.0, k33_layer) + np.testing.assert_allclose(k_layer.values * 0.1, k33_layer.values, atol=1e-10) + + rendered_npf = npf.render(tmp_path, "npf", None, None) + assert "k22" not in rendered_npf + assert "k33" in rendered_npf + assert "angle1" not in rendered_npf + assert "angle2" not in rendered_npf + assert "angle3" not in rendered_npf + + +@pytest.mark.usefixtures("imod5_dataset") +def test_npf_from_imod5_settings(imod5_dataset, tmp_path): + data = deepcopy(imod5_dataset[0]) + + # move the coordinates a bit so that it doesn't match the grid of k (and the regridding settings will matter) + target_grid = data["khv"]["kh"] + x = target_grid["x"].values + x += 50 + y = target_grid["y"].values + y += 50 + target_grid = target_grid.assign_coords({"x": x, "y": y}) + + settings = imod.mf6.NodePropertyFlow.get_regrid_methods() + settings_1 = deepcopy(settings) + settings_1.k = ( + RegridderType.OVERLAP, + "harmonic_mean", + ) + npf_1 = imod.mf6.NodePropertyFlow.from_imod5_data(data, target_grid, settings_1) + + settings_2 = deepcopy(settings) + settings_2.k = ( + RegridderType.OVERLAP, + "mode", + ) + npf_2 = imod.mf6.NodePropertyFlow.from_imod5_data(data, target_grid, settings_2) + + # assert that different settings lead to different results. + diff = npf_1.dataset["k"] - npf_2.dataset["k"] + diff = xr.where(np.isnan(diff), 0, diff) + assert diff.values.max() > 0.1 diff --git a/imod/tests/test_mf6/test_mf6_rch.py b/imod/tests/test_mf6/test_mf6_rch.py index 43567bd80..5119d9d11 100644 --- a/imod/tests/test_mf6/test_mf6_rch.py +++ b/imod/tests/test_mf6/test_mf6_rch.py @@ -2,14 +2,17 @@ import re import tempfile import textwrap +from copy import deepcopy import numpy as np import pytest import xarray as xr import imod +from imod.mf6.dis import StructuredDiscretization from imod.mf6.write_context import WriteContext from imod.schemata import ValidationError +from imod.typing.grid import is_planar_grid, is_transient_data_grid, nan_like @pytest.fixture(scope="function") @@ -324,3 +327,131 @@ def test_clip_box(rch_dict): selection = rch.clip_box(x_min=10.0, x_max=20.0, y_min=10.0, y_max=20.0) assert selection["rate"].dims == ("y", "x") assert selection["rate"].shape == (1, 1) + + +@pytest.mark.usefixtures("imod5_dataset") +def test_planar_rch_from_imod5_constant(imod5_dataset, tmp_path): + data = deepcopy(imod5_dataset[0]) + target_discretization = StructuredDiscretization.from_imod5_data(data) + + # create a planar grid with time-independent recharge + data["rch"]["rate"]["layer"].values[0] = 0 + assert not is_transient_data_grid(data["rch"]["rate"]) + assert is_planar_grid(data["rch"]["rate"]) + + # Act + rch = imod.mf6.Recharge.from_imod5_data(data, target_discretization) + rendered_rch = rch.render(tmp_path, "rch", None, None) + + # Assert + np.testing.assert_allclose( + data["rch"]["rate"].mean().values / 1e3, + rch.dataset["rate"].mean().values, + atol=1e-5, + ) + assert "maxbound 33856" in rendered_rch + assert rendered_rch.count("begin period") == 1 + # teardown + data["rch"]["rate"]["layer"].values[0] = 1 + + +@pytest.mark.usefixtures("imod5_dataset") +def test_planar_rch_from_imod5_transient(imod5_dataset, tmp_path): + data = 
deepcopy(imod5_dataset[0]) + target_discretization = StructuredDiscretization.from_imod5_data(data) + + # create a grid with recharge for 3 timesteps + input_recharge = data["rch"]["rate"].copy(deep=True) + input_recharge = input_recharge.expand_dims({"time": [0, 1, 2]}) + + # make it planar by setting the layer coordinate to 0 + input_recharge = input_recharge.assign_coords({"layer": [0]}) + + # update the data set + data["rch"]["rate"] = input_recharge + assert is_transient_data_grid(data["rch"]["rate"]) + assert is_planar_grid(data["rch"]["rate"]) + + # act + rch = imod.mf6.Recharge.from_imod5_data(data, target_discretization) + rendered_rch = rch.render(tmp_path, "rch", [0, 1, 2], None) + + # assert + np.testing.assert_allclose( + data["rch"]["rate"].mean().values / 1e3, + rch.dataset["rate"].mean().values, + atol=1e-5, + ) + assert rendered_rch.count("begin period") == 3 + assert "maxbound 33856" in rendered_rch + + +@pytest.mark.usefixtures("imod5_dataset") +def test_non_planar_rch_from_imod5_constant(imod5_dataset, tmp_path): + data = deepcopy(imod5_dataset[0]) + target_discretization = StructuredDiscretization.from_imod5_data(data) + + # make the first layer of the target grid inactive + target_grid = target_discretization.dataset["idomain"] + target_grid.loc[{"layer": 1}] = 0 + + # the input for recharge is on the second layer of the targetgrid + original_rch = data["rch"]["rate"].copy(deep=True) + data["rch"]["rate"] = data["rch"]["rate"].assign_coords({"layer": [0]}) + input_recharge = nan_like(data["khv"]["kh"]) + input_recharge.loc[{"layer": 2}] = data["rch"]["rate"].isel(layer=0) + + # update the data set + + data["rch"]["rate"] = input_recharge + assert not is_planar_grid(data["rch"]["rate"]) + assert not is_transient_data_grid(data["rch"]["rate"]) + + # act + rch = imod.mf6.Recharge.from_imod5_data(data, target_discretization) + rendered_rch = rch.render(tmp_path, "rch", None, None) + + # assert + np.testing.assert_allclose( + data["rch"]["rate"].mean().values / 1e3, + rch.dataset["rate"].mean().values, + atol=1e-5, + ) + assert rendered_rch.count("begin period") == 1 + assert "maxbound 33856" in rendered_rch + + # teardown + data["rch"]["rate"] = original_rch + + +@pytest.mark.usefixtures("imod5_dataset") +def test_non_planar_rch_from_imod5_transient(imod5_dataset, tmp_path): + data = deepcopy(imod5_dataset[0]) + target_discretization = StructuredDiscretization.from_imod5_data(data) + # make the first layer of the target grid inactive + target_grid = target_discretization.dataset["idomain"] + target_grid.loc[{"layer": 1}] = 0 + + # the input for recharge is on the second layer of the targetgrid + input_recharge = nan_like(data["rch"]["rate"]) + input_recharge = input_recharge.assign_coords({"layer": [2]}) + input_recharge.loc[{"layer": 2}] = data["rch"]["rate"].sel(layer=1) + input_recharge = input_recharge.expand_dims({"time": [0, 1, 2]}) + + # update the data set + data["rch"]["rate"] = input_recharge + assert not is_planar_grid(data["rch"]["rate"]) + assert is_transient_data_grid(data["rch"]["rate"]) + + # act + rch = imod.mf6.Recharge.from_imod5_data(data, target_discretization) + rendered_rch = rch.render(tmp_path, "rch", [0, 1, 2], None) + + # assert + np.testing.assert_allclose( + data["rch"]["rate"].mean().values / 1e3, + rch.dataset["rate"].mean().values, + atol=1e-5, + ) + assert rendered_rch.count("begin period") == 3 + assert "maxbound 33856" in rendered_rch diff --git a/imod/tests/test_mf6/test_mf6_regrid_package.py 
b/imod/tests/test_mf6/test_mf6_regrid_package.py index b472f4d7d..79a190e71 100644 --- a/imod/tests/test_mf6/test_mf6_regrid_package.py +++ b/imod/tests/test_mf6/test_mf6_regrid_package.py @@ -117,10 +117,10 @@ def test_regrid_structured(): structured_grid_packages = create_package_instances(is_structured=True) new_grid = grid_data_structured(np.float64, 12, 2.5) - regrid_context = RegridderWeightsCache() + regrid_cache = RegridderWeightsCache() new_packages = [] for package in structured_grid_packages: - new_packages.append(package.regrid_like(new_grid, regrid_context)) + new_packages.append(package.regrid_like(new_grid, regrid_cache)) new_idomain = new_packages[0].dataset["icelltype"] @@ -137,11 +137,11 @@ def test_regrid_unstructured(): """ unstructured_grid_packages = create_package_instances(is_structured=False) new_grid = grid_data_unstructured(np.float64, 12, 2.5) - regrid_context = RegridderWeightsCache() + regrid_cache = RegridderWeightsCache() new_packages = [] for package in unstructured_grid_packages: - new_packages.append(package.regrid_like(new_grid, regrid_context)) + new_packages.append(package.regrid_like(new_grid, regrid_cache)) new_idomain = new_packages[0].dataset["icelltype"] for new_package in new_packages: @@ -170,12 +170,12 @@ def test_regrid_structured_missing_dx_and_dy(): ) new_grid = grid_data_structured(np.float64, 12, 0.25) - regrid_context = RegridderWeightsCache() + regrid_cache = RegridderWeightsCache() with pytest.raises( ValueError, match="DataArray icelltype does not have both a dx and dy coordinates", ): - _ = package.regrid_like(new_grid, regrid_context) + _ = package.regrid_like(new_grid, regrid_cache) def test_regrid(tmp_path: Path): @@ -208,8 +208,8 @@ def test_regrid(tmp_path: Path): save_flows=True, alternative_cell_averaging="AMT-HMK", ) - regrid_context = RegridderWeightsCache() - new_npf = npf.regrid_like(k, regrid_context) + regrid_cache = RegridderWeightsCache() + new_npf = npf.regrid_like(k, regrid_cache) # check the rendered versions are the same, they contain the options new_rendered = new_npf.render(tmp_path, "regridded", None, False) @@ -247,8 +247,8 @@ def test_regridding_can_skip_validation(): # Regrid the package to a finer domain new_grid = grid_data_structured(np.float64, 1.0, 0.025) - regrid_context = RegridderWeightsCache() - regridded_package = sto_package.regrid_like(new_grid, regrid_context) + regrid_cache = RegridderWeightsCache() + regridded_package = sto_package.regrid_like(new_grid, regrid_cache) # Check that write validation still fails for the regridded package new_bottom = deepcopy(new_grid) @@ -293,8 +293,8 @@ def test_regridding_layer_based_array(): validate=False, ) new_grid = grid_data_structured(np.float64, 1.0, 0.025) - regrid_context = RegridderWeightsCache() - regridded_package = sto_package.regrid_like(new_grid, regrid_context) + regrid_cache = RegridderWeightsCache() + regridded_package = sto_package.regrid_like(new_grid, regrid_cache) assert ( regridded_package.dataset.coords["dx"].values[()] diff --git a/imod/tests/test_mf6/test_mf6_regrid_simulation.py b/imod/tests/test_mf6/test_mf6_regrid_simulation.py index 5d5e0a817..098356727 100644 --- a/imod/tests/test_mf6/test_mf6_regrid_simulation.py +++ b/imod/tests/test_mf6/test_mf6_regrid_simulation.py @@ -67,7 +67,7 @@ def test_regrid_with_custom_method(circle_model): regrid_method = ConstantHeadRegridMethod( head=(RegridderType.BARYCENTRIC,), concentration=(RegridderType.BARYCENTRIC,) ) - regrid_context = RegridderWeightsCache() + regrid_cache = 
RegridderWeightsCache() simulation_regridded["GWF_1"]["chd"] = chd_pkg.regrid_like( - idomain, regrid_context=regrid_context, regridder_types=regrid_method + idomain, regrid_cache=regrid_cache, regridder_types=regrid_method ) diff --git a/imod/tests/test_mf6/test_mf6_riv.py b/imod/tests/test_mf6/test_mf6_riv.py index 84a83062b..7e57abcde 100644 --- a/imod/tests/test_mf6/test_mf6_riv.py +++ b/imod/tests/test_mf6/test_mf6_riv.py @@ -2,6 +2,7 @@ import re import tempfile import textwrap +from datetime import datetime import numpy as np import pytest @@ -10,11 +11,21 @@ from pytest_cases import parametrize_with_cases import imod +from imod.mf6.dis import StructuredDiscretization +from imod.mf6.disv import VerticesDiscretization +from imod.mf6.npf import NodePropertyFlow from imod.mf6.write_context import WriteContext +from imod.prepare.topsystem.allocation import ALLOCATION_OPTION +from imod.prepare.topsystem.conductance import DISTRIBUTING_OPTION from imod.schemata import ValidationError +from imod.typing.grid import ones_like, zeros_like + +TYPE_DIS_PKG = { + xu.UgridDataArray: VerticesDiscretization, + xr.DataArray: StructuredDiscretization, +} -@pytest.fixture(scope="function") def make_da(): x = [5.0, 15.0, 25.0] y = [25.0, 15.0, 5.0] @@ -29,9 +40,8 @@ def make_da(): ) -@pytest.fixture(scope="function") -def dis_dict(make_da): - da = make_da +def dis_dict(): + da = make_da() bottom = da - xr.DataArray( data=[1.5, 2.5], dims=("layer",), coords={"layer": [2, 3]} ) @@ -39,16 +49,15 @@ def dis_dict(make_da): return {"idomain": da.astype(int), "top": da.sel(layer=2), "bottom": bottom} -@pytest.fixture(scope="function") -def riv_dict(make_da): - da = make_da +def riv_dict(): + da = make_da() da[:, 1, 1] = np.nan bottom = da - xr.DataArray( data=[1.0, 2.0], dims=("layer",), coords={"layer": [2, 3]} ) - return {"stage": da, "conductance": da, "bottom_elevation": bottom} + return {"stage": da, "conductance": da.copy(), "bottom_elevation": bottom} def make_dict_unstructured(d): @@ -56,19 +65,27 @@ def make_dict_unstructured(d): class RivCases: - def case_structured(self, riv_dict): - return riv_dict + def case_structured(self): + return riv_dict() + + def case_unstructured(self): + return make_dict_unstructured(riv_dict()) + - def case_unstructured(self, riv_dict): - return make_dict_unstructured(riv_dict) +class DisCases: + def case_structured(self): + return dis_dict() + + def case_unstructured(self): + return make_dict_unstructured(dis_dict()) class RivDisCases: - def case_structured(self, riv_dict, dis_dict): - return riv_dict, dis_dict + def case_structured(self): + return riv_dict(), dis_dict() - def case_unstructured(self, riv_dict, dis_dict): - return make_dict_unstructured(riv_dict), make_dict_unstructured(dis_dict) + def case_unstructured(self): + return make_dict_unstructured(riv_dict()), make_dict_unstructured(dis_dict()) @parametrize_with_cases("riv_data", cases=RivCases) @@ -113,19 +130,32 @@ def test_all_nan(riv_data, dis_data): errors = river._validate(river._write_schemata, **dis_data) assert len(errors) == 1 + assert "stage" in errors.keys() - for var, var_errors in errors.items(): - assert var == "stage" + +@parametrize_with_cases("riv_data,dis_data", cases=RivDisCases) +def test_validate_inconsistent_nan(riv_data, dis_data): + riv_data["stage"][..., 2] = np.nan + river = imod.mf6.River(**riv_data) + + errors = river._validate(river._write_schemata, **dis_data) + + assert len(errors) == 2 + assert "bottom_elevation" in errors.keys() + assert "conductance" in errors.keys() 
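# Illustrative sketch: the cleanup tests below exercise the new
# ``River.cleanup(dis)`` convenience method. This is a hedged outline of how it
# is meant to be used outside the test suite; the small grid mirrors the
# riv_dict()/dis_dict() fixtures above and all values (including the dx/dy cell
# sizes) are illustrative assumptions, not documented defaults.
import numpy as np
import xarray as xr

import imod

layer = [2, 3]
y = [25.0, 15.0, 5.0]
x = [5.0, 15.0, 25.0]
coords = {"layer": layer, "y": y, "x": x, "dx": 10.0, "dy": -10.0}
da = xr.DataArray(np.ones((2, 3, 3)), coords=coords, dims=("layer", "y", "x"))

dis = imod.mf6.StructuredDiscretization(
    top=da.sel(layer=2),
    bottom=da - xr.DataArray([1.5, 2.5], dims=("layer",), coords={"layer": layer}),
    idomain=da.astype(int),
)

conductance = da.copy()
conductance[..., 2] = 0.0  # zero conductance: invalid input for MODFLOW 6
river = imod.mf6.River(
    stage=da,
    conductance=conductance,
    bottom_elevation=da
    - xr.DataArray([1.0, 2.0], dims=("layer",), coords={"layer": layer}),
)

# cleanup() drops or repairs cells that would otherwise fail write validation,
# after which the package validates cleanly, as test_cleanup_zero_conductance
# below asserts.
river.cleanup(dis)
errors = river._validate(
    river._write_schemata,
    idomain=dis.dataset["idomain"],
    top=dis.dataset["top"],
    bottom=dis.dataset["bottom"],
)
assert len(errors) == 0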
@parametrize_with_cases("riv_data,dis_data", cases=RivDisCases) -def test_inconsistent_nan(riv_data, dis_data): +def test_cleanup_inconsistent_nan(riv_data, dis_data): riv_data["stage"][..., 2] = np.nan river = imod.mf6.River(**riv_data) + type_grid = type(riv_data["stage"]) + dis_pkg = TYPE_DIS_PKG[type_grid](**dis_data) + river.cleanup(dis_pkg) errors = river._validate(river._write_schemata, **dis_data) - assert len(errors) == 1 + assert len(errors) == 0 @parametrize_with_cases("riv_data,dis_data", cases=RivDisCases) @@ -205,11 +235,11 @@ def test_check_dimsize_zero(): @parametrize_with_cases("riv_data,dis_data", cases=RivDisCases) -def test_check_zero_conductance(riv_data, dis_data): +def test_validate_zero_conductance(riv_data, dis_data): """ - Test for zero conductance + Test for validation zero conductance """ - riv_data["conductance"] = riv_data["conductance"] * 0.0 + riv_data["conductance"][..., 2] = 0.0 river = imod.mf6.River(**riv_data) @@ -221,9 +251,25 @@ def test_check_zero_conductance(riv_data, dis_data): @parametrize_with_cases("riv_data,dis_data", cases=RivDisCases) -def test_check_bottom_above_stage(riv_data, dis_data): +def test_cleanup_zero_conductance(riv_data, dis_data): """ - Check that river bottom is not above stage. + Cleanup zero conductance + """ + riv_data["conductance"][..., 2] = 0.0 + type_grid = type(riv_data["stage"]) + dis_pkg = TYPE_DIS_PKG[type_grid](**dis_data) + + river = imod.mf6.River(**riv_data) + river.cleanup(dis_pkg) + + errors = river._validate(river._write_schemata, **dis_data) + assert len(errors) == 0 + + +@parametrize_with_cases("riv_data,dis_data", cases=RivDisCases) +def test_validate_bottom_above_stage(riv_data, dis_data): + """ + Validate that river bottom is not above stage. """ riv_data["bottom_elevation"] = riv_data["bottom_elevation"] + 10.0 @@ -233,8 +279,26 @@ def test_check_bottom_above_stage(riv_data, dis_data): errors = river._validate(river._write_schemata, **dis_data) assert len(errors) == 1 - for var, var_errors in errors.items(): - assert var == "stage" + assert "stage" in errors.keys() + + +@parametrize_with_cases("riv_data,dis_data", cases=RivDisCases) +def test_cleanup_bottom_above_stage(riv_data, dis_data): + """ + Cleanup river bottom above stage. 
+ """ + + riv_data["bottom_elevation"] = riv_data["bottom_elevation"] + 10.0 + type_grid = type(riv_data["stage"]) + dis_pkg = TYPE_DIS_PKG[type_grid](**dis_data) + + river = imod.mf6.River(**riv_data) + river.cleanup(dis_pkg) + + errors = river._validate(river._write_schemata, **dis_data) + + assert len(errors) == 0 + assert river.dataset["bottom_elevation"].equals(river.dataset["stage"]) @parametrize_with_cases("riv_data,dis_data", cases=RivDisCases) @@ -275,12 +339,12 @@ def test_check_boundary_outside_active_domain(riv_data, dis_data): assert len(errors) == 1 -def test_check_dim_monotonicity(riv_dict): +def test_check_dim_monotonicity(): """ Test if dimensions are monotonically increasing or, in case of the y coord, decreasing """ - riv_ds = xr.merge([riv_dict]) + riv_ds = xr.merge([riv_dict()]) message = textwrap.dedent( """ @@ -322,19 +386,19 @@ def test_check_dim_monotonicity(riv_dict): imod.mf6.River(**riv_ds.sel(layer=slice(None, None, -1))) -def test_validate_false(riv_dict): +def test_validate_false(): """ Test turning off validation """ - riv_ds = xr.merge([riv_dict]) + riv_ds = xr.merge([riv_dict()]) imod.mf6.River(validate=False, **riv_ds.sel(layer=slice(None, None, -1))) @pytest.mark.usefixtures("concentration_fc") -def test_render_concentration(riv_dict, concentration_fc): - riv_ds = xr.merge([riv_dict]) +def test_render_concentration(concentration_fc): + riv_ds = xr.merge([riv_dict()]) concentration = concentration_fc.sel( layer=[2, 3], time=np.datetime64("2000-01-01"), drop=True @@ -387,3 +451,145 @@ def test_write_concentration_period_data(concentration_fc): assert ( data.count("2") == 1755 ) # the number 2 is in the concentration data, and in the cell indices. + + +@pytest.mark.usefixtures("imod5_dataset") +def test_import_river_from_imod5(imod5_dataset, tmp_path): + imod5_data = imod5_dataset[0] + period_data = imod5_dataset[1] + globaltimes = [np.datetime64("2000-01-01")] + target_dis = StructuredDiscretization.from_imod5_data(imod5_data) + grid = target_dis.dataset["idomain"] + target_npf = NodePropertyFlow.from_imod5_data(imod5_data, grid) + + (riv, drn) = imod.mf6.River.from_imod5_data( + "riv-1", + imod5_data, + period_data, + target_dis, + target_npf, + time_min=datetime(2000, 1, 1), + time_max=datetime(2002, 1, 1), + allocation_option_riv=ALLOCATION_OPTION.at_elevation, + distributing_option_riv=DISTRIBUTING_OPTION.by_crosscut_thickness, + regridder_types=None, + ) + + write_context = WriteContext(simulation_directory=tmp_path) + riv.write("riv", globaltimes, write_context) + drn.write("drn", globaltimes, write_context) + + errors = riv._validate( + imod.mf6.River._write_schemata, + idomain=target_dis.dataset["idomain"], + bottom=target_dis.dataset["bottom"], + ) + assert len(errors) == 0 + + errors = drn._validate( + imod.mf6.Drainage._write_schemata, + idomain=target_dis.dataset["idomain"], + bottom=target_dis.dataset["bottom"], + ) + assert len(errors) == 0 + + +@pytest.mark.usefixtures("imod5_dataset") +def test_import_river_from_imod5_infiltration_factors(imod5_dataset): + imod5_data = imod5_dataset[0] + period_data = imod5_dataset[1] + target_dis = StructuredDiscretization.from_imod5_data(imod5_data) + grid = target_dis.dataset["idomain"] + target_npf = NodePropertyFlow.from_imod5_data(imod5_data, grid) + + original_infiltration_factor = imod5_data["riv-1"]["infiltration_factor"] + imod5_data["riv-1"]["infiltration_factor"] = ones_like(original_infiltration_factor) + + (riv, drn) = imod.mf6.River.from_imod5_data( + "riv-1", + imod5_data, + period_data, + 
target_dis, + target_npf, + time_min=datetime(2000, 1, 1), + time_max=datetime(2002, 1, 1), + allocation_option_riv=ALLOCATION_OPTION.at_elevation, + distributing_option_riv=DISTRIBUTING_OPTION.by_crosscut_thickness, + regridder_types=None, + ) + + assert riv is not None + assert drn is None + + imod5_data["riv-1"]["infiltration_factor"] = zeros_like( + original_infiltration_factor + ) + (riv, drn) = imod.mf6.River.from_imod5_data( + "riv-1", + imod5_data, + period_data, + target_dis, + target_npf, + time_min=datetime(2000, 1, 1), + time_max=datetime(2002, 1, 1), + allocation_option_riv=ALLOCATION_OPTION.at_elevation, + distributing_option_riv=DISTRIBUTING_OPTION.by_crosscut_thickness, + regridder_types=None, + ) + + assert riv is None + assert drn is not None + + # teardown + imod5_data["riv-1"]["infiltration_factor"] = original_infiltration_factor + + +def test_import_river_from_imod5_period_data(imod5_dataset_periods): + imod5_data = imod5_dataset_periods[0] + imod5_periods = imod5_dataset_periods[1] + target_dis = StructuredDiscretization.from_imod5_data(imod5_data, validate=False) + grid = target_dis.dataset["idomain"] + target_npf = NodePropertyFlow.from_imod5_data(imod5_data, grid) + + original_infiltration_factor = imod5_data["riv-1"]["infiltration_factor"] + imod5_data["riv-1"]["infiltration_factor"] = ones_like(original_infiltration_factor) + + (riv, drn) = imod.mf6.River.from_imod5_data( + "riv-1", + imod5_data, + imod5_periods, + target_dis, + target_npf, + datetime(2002, 2, 2), + datetime(2022, 2, 2), + ALLOCATION_OPTION.at_elevation, + DISTRIBUTING_OPTION.by_crosscut_thickness, + regridder_types=None, + ) + + assert riv is not None + assert drn is None + + imod5_data["riv-1"]["infiltration_factor"] = zeros_like( + original_infiltration_factor + ) + (riv, drn) = imod.mf6.River.from_imod5_data( + "riv-1", + imod5_data, + imod5_periods, + target_dis, + target_npf, + datetime(2002, 2, 2), + datetime(2022, 2, 2), + ALLOCATION_OPTION.at_elevation, + DISTRIBUTING_OPTION.by_crosscut_thickness, + regridder_types=None, + ) + + assert riv is None + assert drn is not None + + # teardown + imod5_dataset_periods[0]["riv-1"]["infiltration_factor"] = ( + original_infiltration_factor + ) diff --git a/imod/tests/test_mf6/test_mf6_simulation.py b/imod/tests/test_mf6/test_mf6_simulation.py index 3959a1ad3..b2f757ea1 100644 --- a/imod/tests/test_mf6/test_mf6_simulation.py +++ b/imod/tests/test_mf6/test_mf6_simulation.py @@ -10,15 +10,28 @@ from unittest.mock import MagicMock import numpy as np +import pandas as pd import pytest import rasterio import xarray as xr import xugrid as xu import imod +from imod.mf6 import LayeredWell, Well from imod.mf6.model import Modflow6Model from imod.mf6.multimodel.modelsplitter import PartitionInfo +from imod.mf6.oc import OutputControl +from imod.mf6.regrid.regrid_schemes import ( + DiscretizationRegridMethod, + NodePropertyFlowRegridMethod, + StorageCoefficientRegridMethod, +) +from imod.mf6.simulation import Modflow6Simulation from imod.mf6.statusinfo import NestedStatusInfo, StatusInfo +from imod.prepare.topsystem.default_allocation_methods import ( + SimulationAllocationOptions, + SimulationDistributingOptions, +) from imod.schemata import ValidationError from imod.tests.fixtures.mf6_small_models_fixture import ( grid_data_structured, @@ -463,3 +476,171 @@ def compare_submodel_partition_info(first: PartitionInfo, second: PartitionInfo) return (first.id == second.id) and np.array_equal( first.active_domain, second.active_domain ) + + 
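# Illustrative sketch of the imod5 import workflow that the new
# simulation-level tests below exercise. This is a hedged outline, not the
# canonical recipe: the project-file path is hypothetical (the tests obtain the
# same (imod5_data, period_data) pair from the imod5_dataset fixture), the
# output directory name is arbitrary, and the tests additionally drop a few
# HFB packages that fall outside the domain before masking.
import pandas as pd

from imod.formats.prj.prj import open_projectfile_data
from imod.mf6.oc import OutputControl
from imod.mf6.simulation import Modflow6Simulation
from imod.prepare.topsystem.default_allocation_methods import (
    SimulationAllocationOptions,
    SimulationDistributingOptions,
)

imod5_data, period_data = open_projectfile_data("path/to/iMOD5_model.prj")  # hypothetical path
times = pd.date_range(start="1/1/1989", end="1/1/2013", freq="W")

simulation = Modflow6Simulation.from_imod5_data(
    imod5_data,
    period_data,
    SimulationAllocationOptions,
    SimulationDistributingOptions,
    times,
)
simulation["imported_model"]["oc"] = OutputControl(save_head="last", save_budget="last")
simulation.create_time_discretization(["01-01-2003", "02-01-2003"])
# Mask all models to the imported domain so NoData cells line up with idomain.
idomain = simulation["imported_model"].domain
simulation.mask_all_models(idomain)
simulation.write("imported_simulation", binary=False, validate=True)  # arbitrary output directory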
+@pytest.mark.unittest_jit +@pytest.mark.usefixtures("imod5_dataset") +def test_import_from_imod5(imod5_dataset, tmp_path): + imod5_data = imod5_dataset[0] + period_data = imod5_dataset[1] + default_simulation_allocation_options = SimulationAllocationOptions + default_simulation_distributing_options = SimulationDistributingOptions + + datelist = pd.date_range(start="1/1/1989", end="1/1/2013", freq="W") + + simulation = Modflow6Simulation.from_imod5_data( + imod5_data, + period_data, + default_simulation_allocation_options, + default_simulation_distributing_options, + datelist, + ) + simulation["imported_model"]["oc"] = OutputControl( + save_head="last", save_budget="last" + ) + simulation.create_time_discretization(["01-01-2003", "02-01-2003"]) + # Cleanup + # Remove HFB packages outside domain + # TODO: Build in support for hfb packages outside domain + for hfb_outside in ["hfb-24", "hfb-26"]: + simulation["imported_model"].pop(hfb_outside) + # Align NoData to domain + idomain = simulation["imported_model"].domain + simulation.mask_all_models(idomain) + # write and validate the simulation. + simulation.write(tmp_path, binary=False, validate=True) + + # Test if simulation attribute appropiately set + assert simulation._validation_context.strict_well_validation is False + + +@pytest.mark.unittest_jit +@pytest.mark.usefixtures("imod5_dataset") +def test_from_imod5__strict_well_validation_set(imod5_dataset): + imod5_data = imod5_dataset[0] + period_data = imod5_dataset[1] + default_simulation_allocation_options = SimulationAllocationOptions + default_simulation_distributing_options = SimulationDistributingOptions + + datelist = pd.date_range(start="1/1/1989", end="1/1/1990", freq="W") + + simulation = Modflow6Simulation.from_imod5_data( + imod5_data, + period_data, + default_simulation_allocation_options, + default_simulation_distributing_options, + datelist, + ) + assert simulation._validation_context.strict_well_validation is False + assert Modflow6Simulation("test")._validation_context.strict_well_validation is True + + +@pytest.mark.unittest_jit +@pytest.mark.usefixtures("imod5_dataset") +def test_import_from_imod5__correct_well_type(imod5_dataset): + # Unpack + imod5_data = imod5_dataset[0] + period_data = imod5_dataset[1] + # Temporarily change layer number to 0, to force Well object instead of + # LayeredWell + original_wel_layer = imod5_data["wel-WELLS_L3"]["layer"] + imod5_data["wel-WELLS_L3"]["layer"] = [0] * len(original_wel_layer) + # Other arrangement + default_simulation_allocation_options = SimulationAllocationOptions + default_simulation_distributing_options = SimulationDistributingOptions + datelist = pd.date_range(start="1/1/1989", end="1/1/2013", freq="W") + + # Act + simulation = Modflow6Simulation.from_imod5_data( + imod5_data, + period_data, + default_simulation_allocation_options, + default_simulation_distributing_options, + datelist, + ) + # Set layer back to right value (before AssertionError might be thrown) + imod5_data["wel-WELLS_L3"]["layer"] = original_wel_layer + # Assert + assert isinstance(simulation["imported_model"]["wel-WELLS_L3"], Well) + assert isinstance(simulation["imported_model"]["wel-WELLS_L4"], LayeredWell) + assert isinstance(simulation["imported_model"]["wel-WELLS_L5"], LayeredWell) + + +@pytest.mark.unittest_jit +@pytest.mark.usefixtures("imod5_dataset") +def test_import_from_imod5__nonstandard_regridding(imod5_dataset, tmp_path): + imod5_data = imod5_dataset[0] + period_data = imod5_dataset[1] + default_simulation_allocation_options = 
SimulationAllocationOptions + default_simulation_distributing_options = SimulationDistributingOptions + + regridding_option = {} + regridding_option["npf"] = NodePropertyFlowRegridMethod() + regridding_option["dis"] = DiscretizationRegridMethod() + regridding_option["sto"] = StorageCoefficientRegridMethod() + times = pd.date_range(start="1/1/2018", end="12/1/2018", freq="ME") + + simulation = Modflow6Simulation.from_imod5_data( + imod5_data, + period_data, + default_simulation_allocation_options, + default_simulation_distributing_options, + times, + regridding_option, + ) + simulation["imported_model"]["oc"] = OutputControl( + save_head="last", save_budget="last" + ) + simulation.create_time_discretization(["01-01-2003", "02-01-2003"]) + # Cleanup + # Remove HFB packages outside domain + # TODO: Build in support for hfb packages outside domain + for hfb_outside in ["hfb-24", "hfb-26"]: + simulation["imported_model"].pop(hfb_outside) + # Align NoData to domain + idomain = simulation["imported_model"].domain + simulation.mask_all_models(idomain) + # write and validate the simulation. + simulation.write(tmp_path, binary=False, validate=True) + + +@pytest.mark.unittest_jit +@pytest.mark.usefixtures("imod5_dataset") +def test_import_from_imod5_no_storage_no_recharge(imod5_dataset, tmp_path): + # this test imports an imod5 simulation, but it has no recharge and no storage package. + imod5_data = imod5_dataset[0] + imod5_data.pop("sto") + imod5_data.pop("rch") + period_data = imod5_dataset[1] + + default_simulation_allocation_options = SimulationAllocationOptions + default_simulation_distributing_options = SimulationDistributingOptions + + times = pd.date_range(start="1/1/2018", end="12/1/2018", freq="ME") + + simulation = Modflow6Simulation.from_imod5_data( + imod5_data, + period_data, + default_simulation_allocation_options, + default_simulation_distributing_options, + times, + ) + simulation["imported_model"]["oc"] = OutputControl( + save_head="last", save_budget="last" + ) + simulation.create_time_discretization(["01-01-2003", "02-01-2003"]) + # Cleanup + # Remove HFB packages outside domain + # TODO: Build in support for hfb packages outside domain + for hfb_outside in ["hfb-24", "hfb-26"]: + simulation["imported_model"].pop(hfb_outside) + # check storage is present and rch is absent + assert not simulation["imported_model"]["sto"].dataset["transient"].values[()] + package_keys = simulation["imported_model"].keys() + for key in package_keys: + assert key[0:3] != "rch" + # Align NoData to domain + idomain = simulation["imported_model"].domain + simulation.mask_all_models(idomain) + # write and validate the simulation. 
+ simulation.write(tmp_path, binary=False, validate=True) diff --git a/imod/tests/test_mf6/test_mf6_sto.py b/imod/tests/test_mf6/test_mf6_sto.py index 185f67482..e748f97e5 100644 --- a/imod/tests/test_mf6/test_mf6_sto.py +++ b/imod/tests/test_mf6/test_mf6_sto.py @@ -428,3 +428,33 @@ def test_check_nan_in_active_cell(sy_layered, convertible, dis): for var, error in errors.items(): assert var == "storage_coefficient" + + +@pytest.mark.usefixtures("imod5_dataset") +def test_from_imod5(imod5_dataset, tmp_path): + data = imod5_dataset[0] + + target_grid = data["khv"]["kh"] + + sto = imod.mf6.StorageCoefficient.from_imod5_data(data, target_grid) + + assert not sto.dataset["save_flows"] + assert sto.dataset["transient"] + assert sto.dataset["storage_coefficient"].values[0] == 0.15 + assert np.all(sto.dataset["storage_coefficient"].values[1:] == 1e-5) + assert sto.dataset["specific_yield"].values[()] is None + + rendered_sto = sto.render(tmp_path, "sto", None, False) + assert "ss" in rendered_sto + + +@pytest.mark.usefixtures("imod5_dataset") +def test_from_imod5_steady_state(imod5_dataset): + data = imod5_dataset[0] + + data["sto"]["storage_coefficient"].values[:] = 0 + target_grid = data["khv"]["kh"] + + sto = imod.mf6.StorageCoefficient.from_imod5_data(data, target_grid) + + assert not sto.dataset["transient"] diff --git a/imod/tests/test_mf6/test_mf6_unsupported_grid_operations.py b/imod/tests/test_mf6/test_mf6_unsupported_grid_operations.py index 7a2f807f5..eff33b966 100644 --- a/imod/tests/test_mf6/test_mf6_unsupported_grid_operations.py +++ b/imod/tests/test_mf6/test_mf6_unsupported_grid_operations.py @@ -57,9 +57,9 @@ def test_mf6_package_regrid_with_lakes(rectangle_with_lakes, tmp_path): simulation = rectangle_with_lakes package = simulation["GWF_1"]["lake"] new_grid = finer_grid(simulation["GWF_1"].domain) - regrid_context = RegridderWeightsCache() + regrid_cache = RegridderWeightsCache() with pytest.raises(ValueError, match="package(.+)not be regridded"): - _ = package.regrid_like(new_grid, regrid_context) + _ = package.regrid_like(new_grid, regrid_cache) @pytest.mark.usefixtures("rectangle_with_lakes") diff --git a/imod/tests/test_mf6/test_mf6_wel.py b/imod/tests/test_mf6/test_mf6_wel.py index e91714826..6048524f6 100644 --- a/imod/tests/test_mf6/test_mf6_wel.py +++ b/imod/tests/test_mf6/test_mf6_wel.py @@ -1,54 +1,214 @@ import pathlib +import sys import tempfile import textwrap from contextlib import nullcontext as does_not_raise +from datetime import datetime import numpy as np +import pandas as pd import pytest import xarray as xr import xugrid as xu -from pytest_cases import parametrize_with_cases +from pytest_cases import parametrize, parametrize_with_cases import imod +from imod.formats.prj.prj import open_projectfile_data +from imod.logging.config import LoggerType +from imod.logging.loglevel import LogLevel +from imod.mf6.dis import StructuredDiscretization +from imod.mf6.npf import NodePropertyFlow from imod.mf6.utilities.grid import broadcast_to_full_domain +from imod.mf6.wel import LayeredWell, Well from imod.mf6.write_context import WriteContext from imod.schemata import ValidationError from imod.tests.fixtures.flow_basic_fixture import BasicDisSettings +times = [ + datetime(1981, 11, 30), + datetime(1981, 12, 31), + datetime(1982, 1, 31), + datetime(1982, 2, 28), + datetime(1982, 3, 31), + datetime(1982, 4, 30), +] + + +class GridAgnosticWellCases: + def case_well_stationary(self, well_high_lvl_test_data_stationary): + obj = 
imod.mf6.Well(*well_high_lvl_test_data_stationary) + dims_expected = { + "ncellid": 8, + "nmax_cellid": 3, + "species": 2, + } + cellid_expected = np.array( + [ + [1, 1, 9], + [1, 2, 9], + [1, 1, 8], + [1, 2, 8], + [2, 3, 7], + [2, 4, 7], + [2, 3, 6], + [2, 4, 6], + ], + dtype=np.int64, + ) + rate_expected = np.array(np.ones((8,), dtype=np.float32)) + return obj, dims_expected, cellid_expected, rate_expected + + def case_well_stationary_multilevel(self, well_high_lvl_test_data_stationary): + x, y, screen_top, _, rate_wel, concentration = ( + well_high_lvl_test_data_stationary + ) + screen_bottom = [-20.0] * 8 + obj = imod.mf6.Well(x, y, screen_top, screen_bottom, rate_wel, concentration) + dims_expected = { + "ncellid": 12, + "nmax_cellid": 3, + "species": 2, + } + cellid_expected = np.array( + [ + [1, 1, 9], + [1, 2, 9], + [1, 1, 8], + [1, 2, 8], + [2, 1, 9], + [2, 2, 9], + [2, 1, 8], + [2, 2, 8], + [2, 3, 7], + [2, 4, 7], + [2, 3, 6], + [2, 4, 6], + ], + dtype=np.int64, + ) + rate_expected = np.array([0.25] * 4 + [0.75] * 4 + [1.0] * 4) + return obj, dims_expected, cellid_expected, rate_expected -def test_to_mf6_pkg__high_lvl_stationary(basic_dis, well_high_lvl_test_data_stationary): + def case_well_point_filter(self, well_high_lvl_test_data_stationary): + x, y, screen_point, _, rate_wel, concentration = ( + well_high_lvl_test_data_stationary + ) + obj = imod.mf6.Well(x, y, screen_point, screen_point, rate_wel, concentration) + dims_expected = { + "ncellid": 8, + "nmax_cellid": 3, + "species": 2, + } + cellid_expected = np.array( + [ + [1, 1, 9], + [1, 2, 9], + [1, 1, 8], + [1, 2, 8], + [2, 3, 7], + [2, 4, 7], + [2, 3, 6], + [2, 4, 6], + ], + dtype=np.int64, + ) + rate_expected = np.array(np.ones((8,), dtype=np.float32)) + return obj, dims_expected, cellid_expected, rate_expected + + def case_well_transient(self, well_high_lvl_test_data_transient): + obj = imod.mf6.Well(*well_high_lvl_test_data_transient) + dims_expected = { + "ncellid": 8, + "time": 5, + "nmax_cellid": 3, + "species": 2, + } + cellid_expected = np.array( + [ + [1, 1, 9], + [1, 2, 9], + [1, 1, 8], + [1, 2, 8], + [2, 3, 7], + [2, 4, 7], + [2, 3, 6], + [2, 4, 6], + ], + dtype=np.int64, + ) + rate_expected = np.outer(np.ones((8,), dtype=np.float32), np.arange(5) + 1) + return obj, dims_expected, cellid_expected, rate_expected + + def case_layered_well_stationary(self, well_high_lvl_test_data_stationary): + x, y, _, _, rate_wel, concentration = well_high_lvl_test_data_stationary + layer = [1, 1, 1, 1, 2, 2, 2, 2] + obj = imod.mf6.LayeredWell(x, y, layer, rate_wel, concentration) + dims_expected = { + "ncellid": 8, + "nmax_cellid": 3, + "species": 2, + } + cellid_expected = np.array( + [ + [1, 1, 9], + [1, 2, 9], + [1, 1, 8], + [1, 2, 8], + [2, 3, 7], + [2, 4, 7], + [2, 3, 6], + [2, 4, 6], + ], + dtype=np.int64, + ) + rate_expected = np.array(np.ones((8,), dtype=np.float32)) + return obj, dims_expected, cellid_expected, rate_expected + + def case_layered_well_transient(self, well_high_lvl_test_data_transient): + x, y, _, _, rate_wel, concentration = well_high_lvl_test_data_transient + layer = [1, 1, 1, 1, 2, 2, 2, 2] + obj = imod.mf6.LayeredWell(x, y, layer, rate_wel, concentration) + dims_expected = { + "ncellid": 8, + "time": 5, + "nmax_cellid": 3, + "species": 2, + } + cellid_expected = np.array( + [ + [1, 1, 9], + [1, 2, 9], + [1, 1, 8], + [1, 2, 8], + [2, 3, 7], + [2, 4, 7], + [2, 3, 6], + [2, 4, 6], + ], + dtype=np.int64, + ) + rate_expected = np.outer(np.ones((8,), dtype=np.float32), np.arange(5) + 1) + return obj, 
dims_expected, cellid_expected, rate_expected + + +@parametrize_with_cases( + ["wel", "dims_expected", "cellid_expected", "rate_expected"], + cases=GridAgnosticWellCases, +) +def test_to_mf6_pkg(basic_dis, wel, dims_expected, cellid_expected, rate_expected): # Arrange idomain, top, bottom = basic_dis - wel = imod.mf6.Well(*well_high_lvl_test_data_stationary) active = idomain == 1 k = xr.ones_like(idomain) nmax_cellid_expected = np.array(["layer", "row", "column"]) - cellid_expected = np.array( - [ - [1, 1, 9], - [1, 2, 9], - [1, 1, 8], - [1, 2, 8], - [2, 3, 7], - [2, 4, 7], - [2, 3, 6], - [2, 4, 6], - ], - dtype=np.int64, - ) - rate_expected = np.array(np.ones((8,), dtype=np.float32)) # Act mf6_wel = wel.to_mf6_pkg(active, top, bottom, k) mf6_ds = mf6_wel.dataset # Assert - assert dict(mf6_ds.dims) == { - "ncellid": 8, - "nmax_cellid": 3, - "species": 2, - } + assert dict(mf6_ds.dims) == dims_expected np.testing.assert_equal(mf6_ds.coords["nmax_cellid"].values, nmax_cellid_expected) np.testing.assert_equal(mf6_ds["cellid"].values, cellid_expected) np.testing.assert_equal(mf6_ds["rate"].values, rate_expected) @@ -68,92 +228,165 @@ def test_to_mf6_pkg__validate(well_high_lvl_test_data_stationary): assert len(errors) == 1 -def test_to_mf6_pkg__high_lvl_multilevel(basic_dis, well_high_lvl_test_data_stationary): - """ - Test with stationary wells where the first 4 well screens extend over 2 layers. - Rates are distributed based on the fraction of the screen length in each layer. - In this case: The first layer should get 0.25, the second 0.75. - """ +def test_to_mf6_pkg__validate_filter_top(well_high_lvl_test_data_stationary): # Arrange - idomain, top, bottom = basic_dis - x, y, screen_top, _, rate_wel, concentration = well_high_lvl_test_data_stationary - screen_bottom = [-20.0] * 8 - wel = imod.mf6.Well(x, y, screen_top, screen_bottom, rate_wel, concentration) - active = idomain == 1 - k = xr.ones_like(idomain) - - nmax_cellid_expected = np.array(["layer", "row", "column"]) - cellid_expected = np.array( - [ - [1, 1, 9], - [1, 2, 9], - [1, 1, 8], - [1, 2, 8], - [2, 1, 9], - [2, 2, 9], - [2, 1, 8], - [2, 2, 8], - [2, 3, 7], - [2, 4, 7], - [2, 3, 6], - [2, 4, 6], - ], - dtype=np.int64, + x, y, screen_top, screen_bottom, rate_wel, concentration = ( + well_high_lvl_test_data_stationary ) - rate_expected = np.array([0.25] * 4 + [0.75] * 4 + [1.0] * 4) + screen_top = [-2.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0] + screen_bottom = [0.0, -2.0, -2.0, -2.0, -2.0, -2.0, -2.0, -2.0] + well_test_data = (x, y, screen_top, screen_bottom, rate_wel, concentration) + + wel = imod.mf6.Well(*well_test_data) # Act - mf6_wel = wel.to_mf6_pkg(active, top, bottom, k) - mf6_ds = mf6_wel.dataset + kwargs = {"screen_top": wel.dataset["screen_top"]} + errors = wel._validate(wel._write_schemata, **kwargs) # Assert - assert dict(mf6_ds.dims) == { - "ncellid": 12, - "nmax_cellid": 3, - "species": 2, - } - np.testing.assert_equal(mf6_ds.coords["nmax_cellid"].values, nmax_cellid_expected) - np.testing.assert_equal(mf6_ds["cellid"].values, cellid_expected) - np.testing.assert_equal(mf6_ds["rate"].values, rate_expected) + assert len(errors) == 1 + assert ( + str(errors["screen_bottom"][0]) + == "not all values comply with criterion: <= screen_top" + ) -def test_to_mf6_pkg__high_lvl_transient(basic_dis, well_high_lvl_test_data_transient): +def test_to_mf6_pkg__logging_with_message( + tmp_path, basic_dis, well_high_lvl_test_data_transient +): # Arrange + logfile_path = tmp_path / "logfile.txt" idomain, top, bottom = basic_dis + 
modified_well_fixture = list(well_high_lvl_test_data_transient) + + # create an idomain where layer 1 is active and layer 2 and 3 are inactive. + # layer 1 has a bottom at -6, layer 2 at -35 and layer 3 at -120 + # so only wells that have a filter top above -6 will end up in the simulation + active = idomain == 1 + + active.loc[1, :, :] = True + active.loc[2:, :, :] = False + + # modify the well filter top and filter bottoms so that + # well 0 is not placed + # well 1 is partially placed + # well 2 is fully placed + # well 3 is partially placed + # wells 4 to 7 are not placed + modified_well_fixture[2] = [ + -6.0, + -3.0, + 0.0, + 0.0, + -6.0, + -6.0, + -6.0, + -6.0, + ] + modified_well_fixture[3] = [ + -102.0, + -102.0, + -6, + -102.0, + -1020.0, + -1020.0, + -1020.0, + -1020.0, + ] + + with open(logfile_path, "w") as sys.stdout: + imod.logging.configure( + LoggerType.PYTHON, + log_level=LogLevel.DEBUG, + add_default_file_handler=False, + add_default_stream_handler=True, + ) + + wel = imod.mf6.Well(*modified_well_fixture) + + k = xr.ones_like(idomain) + _ = wel.to_mf6_pkg(active, top, bottom, k) + + # the wells that were fully or partially placed should not appear in the log message + # but all the wells that are completely left out should be listed + with open(logfile_path, "r") as log_file: + log = log_file.read() + assert "Some wells were not placed" in log + assert "id = 1" not in log + assert "id = 2" not in log + assert "id = 3" not in log + assert "id = 0" in log + assert "id = 4" in log + assert "id = 5" in log + assert "id = 6" in log + assert "id = 7" in log + + +def test_to_mf6_pkg__logging_without_message( + tmp_path, basic_dis, well_high_lvl_test_data_transient +): + # This test activates logging, and then converts a high level well package to + # an MF6 package, in such a way that all the wells can be placed. 
+ # Logging is active, and the log file should not include the "Some wells were not placed" + # message + logfile_path = tmp_path / "logfile.txt" + idomain, top, bottom = basic_dis + + with open(logfile_path, "w") as sys.stdout: + imod.logging.configure( + LoggerType.PYTHON, + log_level=LogLevel.DEBUG, + add_default_file_handler=False, + add_default_stream_handler=True, + ) + + wel = imod.mf6.Well(*well_high_lvl_test_data_transient) + active = idomain == 1 + k = xr.ones_like(idomain) + + active.loc[1, :, :] = True + active.loc[2:, :, :] = True + + # Act + _ = wel.to_mf6_pkg(active, top, bottom, k) + + with open(logfile_path, "r") as log_file: + log = log_file.read() + assert "Some wells were not placed" not in log + + +def test_to_mf6_pkg__error_on_all_wells_removed( + basic_dis, well_high_lvl_test_data_transient +): + """Drop all wells, then run to_mf6_pkg""" + idomain, top, bottom = basic_dis + wel = imod.mf6.Well(*well_high_lvl_test_data_transient) + wel.dataset = wel.dataset.drop_sel(index=wel.dataset["index"]) active = idomain == 1 k = xr.ones_like(idomain) - nmax_cellid_expected = np.array(["layer", "row", "column"]) - cellid_expected = np.array( - [ - [1, 1, 9], - [1, 2, 9], - [1, 1, 8], - [1, 2, 8], - [2, 3, 7], - [2, 4, 7], - [2, 3, 6], - [2, 4, 6], - ], - dtype=np.int64, - ) - rate_expected = np.outer(np.ones((8,), dtype=np.float32), np.arange(5) + 1) + with pytest.raises(ValidationError, match="No wells were assigned in package"): + wel.to_mf6_pkg(active, top, bottom, k) - # Act - mf6_wel = wel.to_mf6_pkg(active, top, bottom, k) - mf6_ds = mf6_wel.dataset - # Assert - assert dict(mf6_wel.dataset.dims) == { - "ncellid": 8, - "time": 5, - "nmax_cellid": 3, - "species": 2, - } - np.testing.assert_equal(mf6_ds.coords["nmax_cellid"].values, nmax_cellid_expected) - np.testing.assert_equal(mf6_ds["cellid"].values, cellid_expected) - np.testing.assert_equal(mf6_ds["rate"].values, rate_expected) +def test_to_mf6_pkg__error_on_well_removal( + basic_dis, well_high_lvl_test_data_transient +): + """Set k values at well location x=81 to 1e-3, causing it to be dropped. 
+ Should throw error if error_on_well_removal = True""" + idomain, top, bottom = basic_dis + + wel = imod.mf6.Well(minimum_k=1.0, *well_high_lvl_test_data_transient) + active = idomain == 1 + k = xr.ones_like(idomain) + k.loc[{"x": 85.0}] = 1e-3 + + with pytest.raises(ValidationError, match="x = 81"): + wel.to_mf6_pkg(active, top, bottom, k, strict_well_validation=True) + + mf6_wel = wel.to_mf6_pkg(active, top, bottom, k, strict_well_validation=False) + assert mf6_wel.dataset.sizes["ncellid"] < wel.dataset.sizes["index"] @pytest.mark.parametrize("save_flows", [True, False]) @@ -195,6 +428,32 @@ def test_is_empty(well_high_lvl_test_data_transient): assert wel_empty.is_empty() +def test_cleanup(basic_dis, well_high_lvl_test_data_transient): + # Arrange + wel = imod.mf6.Well(*well_high_lvl_test_data_transient) + ds_original = wel.dataset.copy() + idomain, top, bottom = basic_dis + top = top.isel(layer=0, drop=True) + deep_offset = 100.0 + dis_normal = imod.mf6.StructuredDiscretization(top, bottom, idomain) + dis_deep = imod.mf6.StructuredDiscretization( + top - deep_offset, bottom - deep_offset, idomain + ) + + # Nothing to be cleaned up with default discretization + wel.cleanup(dis_normal) + xr.testing.assert_identical(ds_original, wel.dataset) + + # Cleanup + wel.cleanup(dis_deep) + assert not ds_original.identical(wel.dataset) + # Wells filters should be placed downwards at surface level as point filters + np.testing.assert_array_almost_equal( + wel.dataset["screen_top"], wel.dataset["screen_bottom"] + ) + np.testing.assert_array_almost_equal(wel.dataset["screen_top"], top - deep_offset) + + class ClipBoxCases: @staticmethod def case_clip_xy(parameterizable_basic_dis): @@ -205,7 +464,7 @@ def case_clip_xy(parameterizable_basic_dis): } expected_dims = {"index": 3, "species": 2} - return clip_arguments, expected_dims, does_not_raise() + return clip_arguments, expected_dims, does_not_raise(), does_not_raise() @staticmethod def case_clip_layer_max(parameterizable_basic_dis): @@ -213,7 +472,7 @@ def case_clip_layer_max(parameterizable_basic_dis): clip_arguments = {"layer_max": 2, "bottom": bottom, "top": top} expected_dims = {"index": 4, "species": 2} - return clip_arguments, expected_dims, does_not_raise() + return clip_arguments, expected_dims, does_not_raise(), does_not_raise() @staticmethod def case_clip_layer_min(parameterizable_basic_dis): @@ -221,7 +480,7 @@ def case_clip_layer_min(parameterizable_basic_dis): clip_arguments = {"layer_min": 5, "bottom": bottom, "top": top} expected_dims = {"index": 4, "species": 2} - return clip_arguments, expected_dims, does_not_raise() + return clip_arguments, expected_dims, does_not_raise(), does_not_raise() @staticmethod def case_clip_layer_min_layer_max(parameterizable_basic_dis): @@ -229,7 +488,7 @@ def case_clip_layer_min_layer_max(parameterizable_basic_dis): clip_arguments = {"layer_min": 1, "layer_max": 1, "bottom": bottom, "top": top} expected_dims = {"index": 4, "species": 2} - return clip_arguments, expected_dims, does_not_raise() + return clip_arguments, expected_dims, does_not_raise(), does_not_raise() @staticmethod def case_clip_top_is_scalar(parameterizable_basic_dis): @@ -238,7 +497,7 @@ def case_clip_top_is_scalar(parameterizable_basic_dis): clip_arguments = {"layer_max": 2, "bottom": bottom, "top": top} expected_dims = {"index": 4, "species": 2} - return clip_arguments, expected_dims, does_not_raise() + return clip_arguments, expected_dims, does_not_raise(), does_not_raise() @staticmethod def 
case_clip_top_is_non_layered_structuredgrid(parameterizable_basic_dis): @@ -249,7 +508,7 @@ def case_clip_top_is_non_layered_structuredgrid(parameterizable_basic_dis): clip_arguments = {"layer_max": 2, "bottom": bottom, "top": top} expected_dims = {"index": 4, "species": 2} - return clip_arguments, expected_dims, does_not_raise() + return clip_arguments, expected_dims, does_not_raise(), does_not_raise() @staticmethod def case_clip_top_is_layered_structuredgrid(parameterizable_basic_dis): @@ -259,7 +518,7 @@ def case_clip_top_is_layered_structuredgrid(parameterizable_basic_dis): clip_arguments = {"layer_max": 2, "bottom": bottom, "top": top} expected_dims = {"index": 4, "species": 2} - return clip_arguments, expected_dims, does_not_raise() + return clip_arguments, expected_dims, does_not_raise(), does_not_raise() @staticmethod def case_clip_top_is_non_layered_unstructuredgrid(parameterizable_basic_dis): @@ -273,7 +532,7 @@ def case_clip_top_is_non_layered_unstructuredgrid(parameterizable_basic_dis): clip_arguments = {"layer_max": 2, "bottom": bottom, "top": top} expected_dims = {"index": 4, "species": 2} - return clip_arguments, expected_dims, does_not_raise() + return clip_arguments, expected_dims, does_not_raise(), does_not_raise() @staticmethod def case_clip_top_is_layered_unstructuredgrid(parameterizable_basic_dis): @@ -285,23 +544,33 @@ def case_clip_top_is_layered_unstructuredgrid(parameterizable_basic_dis): clip_arguments = {"layer_max": 2, "bottom": bottom, "top": top} expected_dims = {"index": 4, "species": 2} - return clip_arguments, expected_dims, does_not_raise() + return clip_arguments, expected_dims, does_not_raise(), does_not_raise() @staticmethod def case_clip_missing_top(parameterizable_basic_dis): _, _, bottom = parameterizable_basic_dis clip_arguments = {"layer_max": 2, "bottom": bottom} - expected_dims = {} - return clip_arguments, expected_dims, pytest.raises(ValueError) + expected_dims = {"index": 4, "species": 2} + return ( + clip_arguments, + expected_dims, + pytest.raises(ValueError), + does_not_raise(), + ) @staticmethod def case_clip_missing_bottom(parameterizable_basic_dis): _, top, _ = parameterizable_basic_dis clip_arguments = {"layer_max": 2, "top": top} - expected_dims = {} - return clip_arguments, expected_dims, pytest.raises(ValueError) + expected_dims = {"index": 4, "species": 2} + return ( + clip_arguments, + expected_dims, + pytest.raises(ValueError), + does_not_raise(), + ) @pytest.mark.parametrize( @@ -321,10 +590,10 @@ def case_clip_missing_bottom(parameterizable_basic_dis): indirect=True, ) @parametrize_with_cases( - ("clip_box_args", "expected_dims", "expectation"), cases=ClipBoxCases + ("clip_box_args", "expected_dims", "expectation", "_"), cases=ClipBoxCases ) -def test_clip_box__high_lvl_stationary( - well_high_lvl_test_data_stationary, clip_box_args, expected_dims, expectation +def test_clip_box__well_stationary( + well_high_lvl_test_data_stationary, clip_box_args, expected_dims, expectation, _ ): # Arrange wel = imod.mf6.Well(*well_high_lvl_test_data_stationary) @@ -337,6 +606,40 @@ def test_clip_box__high_lvl_stationary( assert dict(ds.dims) == expected_dims +@pytest.mark.parametrize( + "parameterizable_basic_dis", + [ + BasicDisSettings( + nlay=10, + zstop=-10.0, + xstart=50.0, + xstop=100.0, + ystart=50.0, + ystop=100.0, + nrow=10, + ncol=10, + ) + ], + indirect=True, +) +@parametrize_with_cases( + ("clip_box_args", "expected_dims", "_", "expectation"), cases=ClipBoxCases +) +def test_clip_box__layered_well_stationary( + 
well_high_lvl_test_data_stationary, clip_box_args, expected_dims, _, expectation +): + x, y, _, _, rate_wel, concentration = well_high_lvl_test_data_stationary + layer = [1, 1, 1, 1, 9, 9, 9, 9] + wel = imod.mf6.LayeredWell(x, y, layer, rate_wel, concentration) + + with expectation: + # Act + ds = wel.clip_box(**clip_box_args).dataset + + # Assert + assert dict(ds.dims) == expected_dims + + @pytest.mark.parametrize( "parameterizable_basic_dis", [BasicDisSettings(nlay=10, zstop=-10.0)], @@ -405,7 +708,7 @@ def test_derive_cellid_from_points(basic_dis, well_high_lvl_test_data_stationary ) # Act - cellid = imod.mf6.wel.Well._Well__derive_cellid_from_points(idomain, x, y, layer) + cellid = imod.mf6.wel.Well._derive_cellid_from_points(idomain, x, y, layer) # Assert np.testing.assert_array_equal(cellid, cellid_expected) @@ -682,3 +985,194 @@ def test_render__concentration_dis_vertices_transient(well_test_data_transient): assert ( data.count(" 246 135") == 15 ) # check salinity and temperature was written to period data + + +@parametrize("wel_class", [Well, LayeredWell]) +@pytest.mark.usefixtures("imod5_dataset") +def test_import_and_convert_to_mf6(imod5_dataset, tmp_path, wel_class): + data = imod5_dataset[0] + target_dis = StructuredDiscretization.from_imod5_data(data) + target_npf = NodePropertyFlow.from_imod5_data(data, target_dis.dataset["idomain"]) + + times = list(pd.date_range(datetime(1989, 1, 1), datetime(2013, 1, 1), 8400)) + + # import grid-agnostic well from imod5 data (it contains 1 well) + wel = wel_class.from_imod5_data("wel-WELLS_L3", data, times, minimum_thickness=1.0) + assert wel.dataset["x"].values[0] == 197910.0 + assert wel.dataset["y"].values[0] == 362860.0 + assert np.mean(wel.dataset["rate"].values) == -317.2059091946156 + # convert to a gridded well + top = target_dis.dataset["top"] + bottom = target_dis.dataset["bottom"] + active = target_dis.dataset["idomain"] + k = target_npf.dataset["k"] + mf6_well = wel.to_mf6_pkg(active, top, bottom, k, True) + + # assert mf6 well properties + assert len(mf6_well.dataset["x"].values) == 1 + assert mf6_well.dataset["x"].values[0] == 197910.0 + assert mf6_well.dataset["y"].values[0] == 362860.0 + assert np.mean(mf6_well.dataset["rate"].values) == -317.2059091946156 + + # write the package for validation + write_context = WriteContext(simulation_directory=tmp_path) + mf6_well.write("wel", [], write_context) + + +@parametrize("wel_class", [Well]) +@pytest.mark.usefixtures("imod5_dataset") +def test_import_and_cleanup(imod5_dataset, wel_class: Well): + data = imod5_dataset[0] + target_dis = StructuredDiscretization.from_imod5_data(data) + + ntimes = 8399 + times = list(pd.date_range(datetime(1989, 1, 1), datetime(2013, 1, 1), ntimes + 1)) + + # Import grid-agnostic well from imod5 data (it contains 1 well) + wel = wel_class.from_imod5_data("wel-WELLS_L3", data, times) + assert len(wel.dataset.coords["time"]) == ntimes + # Cleanup + wel.cleanup(target_dis) + # Nothing to be cleaned, single well point is located properly, test that + # time coordinate has not been dropped. 
+ assert "time" in wel.dataset.coords + assert len(wel.dataset.coords["time"]) == ntimes + + +@parametrize("wel_class", [Well, LayeredWell]) +@pytest.mark.usefixtures("well_regular_import_prj") +def test_import_multiple_wells(well_regular_import_prj, wel_class): + imod5dict, _ = open_projectfile_data(well_regular_import_prj) + times = [ + datetime(1981, 11, 30), + datetime(1981, 12, 31), + datetime(1982, 1, 31), + datetime(1982, 2, 28), + datetime(1982, 3, 31), + datetime(1982, 4, 30), + ] + # Set layer to 1, to avoid validation error. + if wel_class is LayeredWell: + imod5dict["wel-ipf1"]["layer"] = [1] + imod5dict["wel-ipf2"]["layer"] = [1] + # import grid-agnostic well from imod5 data (it contains 2 packages with 3 wells each) + wel1 = wel_class.from_imod5_data("wel-ipf1", imod5dict, times) + wel2 = wel_class.from_imod5_data("wel-ipf2", imod5dict, times) + + assert np.all(wel1.x == np.array([191112.11, 191171.96, 191231.52])) + assert np.all(wel2.x == np.array([191112.11, 191171.96, 191231.52])) + assert wel1.dataset["rate"].shape == (5, 3) + assert wel2.dataset["rate"].shape == (5, 3) + + +@parametrize("wel_class", [Well, LayeredWell]) +@pytest.mark.usefixtures("well_duplication_import_prj") +def test_import_from_imod5_with_duplication(well_duplication_import_prj, wel_class): + imod5dict, _ = open_projectfile_data(well_duplication_import_prj) + times = [ + datetime(1981, 11, 30), + datetime(1981, 12, 31), + datetime(1982, 1, 31), + datetime(1982, 2, 28), + datetime(1982, 3, 31), + datetime(1982, 4, 30), + ] + # Set layer to 1, to avoid validation error. + if wel_class is LayeredWell: + imod5dict["wel-ipf1"]["layer"] = [1] + imod5dict["wel-ipf2"]["layer"] = [1] + # import grid-agnostic well from imod5 data (it contains 2 packages with 3 wells each) + wel1 = wel_class.from_imod5_data("wel-ipf1", imod5dict, times) + wel2 = wel_class.from_imod5_data("wel-ipf2", imod5dict, times) + + assert np.all(wel1.x == np.array([191171.96, 191231.52, 191231.52])) + assert np.all(wel2.x == np.array([191112.11, 191171.96, 191231.52])) + assert wel1.dataset["rate"].shape == (5, 3) + assert wel2.dataset["rate"].shape == (5, 3) + + +@pytest.mark.parametrize("layer", [0, 1]) +@pytest.mark.usefixtures("well_regular_import_prj") +def test_logmessage_for_layer_assignment_import_imod5( + tmp_path, well_regular_import_prj, layer +): + imod5dict = open_projectfile_data(well_regular_import_prj) + + logfile_path = tmp_path / "logfile.txt" + imod5dict[0]["wel-ipf1"]["layer"] = [layer] * len(imod5dict[0]["wel-ipf1"]["layer"]) + + try: + with open(logfile_path, "w") as sys.stdout: + # start logging + imod.logging.configure( + LoggerType.PYTHON, + log_level=LogLevel.WARNING, + add_default_file_handler=False, + add_default_stream_handler=True, + ) + + _ = imod.mf6.Well.from_imod5_data("wel-ipf1", imod5dict[0], times) + + finally: + # turn the logger off again + imod.logging.configure( + LoggerType.NULL, + log_level=LogLevel.WARNING, + add_default_file_handler=False, + add_default_stream_handler=False, + ) + + # import grid-agnostic well from imod5 data (it contains 2 packages with 3 wells each) + with open(logfile_path, "r") as log_file: + log = log_file.read() + message_required = layer != 0 + message_present = ( + "In well wel-ipf1 a layer was assigned, but this is not\nsupported" in log + ) + assert message_required == message_present + + +@pytest.mark.parametrize("remove", ["filt_top", "filt_bot", None]) +@pytest.mark.usefixtures("well_regular_import_prj") +def test_logmessage_for_missing_filter_settings( + tmp_path, 
well_regular_import_prj, remove +): + imod5dict = open_projectfile_data(well_regular_import_prj) + logfile_path = tmp_path / "logfile.txt" + if remove is not None: + imod5dict[0]["wel-ipf1"]["dataframe"][0] = imod5dict[0]["wel-ipf1"][ + "dataframe" + ][0].drop(remove, axis=1) + + try: + with open(logfile_path, "w") as sys.stdout: + # start logging + imod.logging.configure( + LoggerType.PYTHON, + log_level=LogLevel.WARNING, + add_default_file_handler=False, + add_default_stream_handler=True, + ) + + _ = imod.mf6.Well.from_imod5_data("wel-ipf1", imod5dict[0], times) + except Exception: + assert remove is not None + + finally: + # turn the logger off again + imod.logging.configure( + LoggerType.NULL, + log_level=LogLevel.WARNING, + add_default_file_handler=False, + add_default_stream_handler=False, + ) + + # import grid-agnostic well from imod5 data (it contains 2 packages with 3 wells each) + with open(logfile_path, "r") as log_file: + log = log_file.read() + message_required = remove is not None + message_present = ( + "In well wel-ipf1 the 'filt_top' and 'filt_bot' columns were\nnot both found;" + in log + ) + assert message_required == message_present diff --git a/imod/tests/test_mf6/test_multimodel/test_mf6_partitioning_structured.py b/imod/tests/test_mf6/test_multimodel/test_mf6_partitioning_structured.py index 2dc495849..2c609912e 100644 --- a/imod/tests/test_mf6/test_multimodel/test_mf6_partitioning_structured.py +++ b/imod/tests/test_mf6/test_multimodel/test_mf6_partitioning_structured.py @@ -6,13 +6,15 @@ import geopandas as gpd import numpy as np import pytest -import shapely import xarray as xr from pytest_cases import case, parametrize_with_cases import imod from imod.mf6 import Modflow6Simulation from imod.mf6.wel import Well +from imod.prepare.hfb import ( + linestring_to_square_zpolygons, +) from imod.typing.grid import zeros_like @@ -132,13 +134,17 @@ def case_hfb_vertical(self): # Vertical line at x = 52500.0 barrier_y = [0.0, 52500.0] barrier_x = [52500.0, 52500.0] + barrier_ztop = [10.0] + barrier_zbottom = [0.0] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) return gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ "resistance": [10.0], - "ztop": [10.0], - "zbottom": [0.0], }, ) @@ -146,13 +152,17 @@ def case_hfb_horizontal(self): # Horizontal line at y = 52500.0 barrier_x = [0.0, 52500.0] barrier_y = [52500.0, 52500.0] + barrier_ztop = [10.0] + barrier_zbottom = [0.0] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) return gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ "resistance": [10.0], - "ztop": [10.0], - "zbottom": [0.0], }, ) @@ -160,13 +170,17 @@ def case_hfb_horizontal_outside_domain(self): # Horizontal line at y = -100.0 running outside domain barrier_x = [0.0, 1_000_000.0] barrier_y = [52500.0, 52500.0] + barrier_ztop = [10.0] + barrier_zbottom = [0.0] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) return gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ "resistance": [10.0], - "ztop": [10.0], - "zbottom": [0.0], }, ) @@ -174,13 +188,17 @@ def case_hfb_diagonal(self): # Diagonal line barrier_y = [0.0, 52500.0] barrier_x = [0.0, 52500.0] + barrier_ztop = [10.0] + barrier_zbottom = [0.0] + + polygons = linestring_to_square_zpolygons( + 
barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) return gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ "resistance": [10.0], - "ztop": [10.0], - "zbottom": [0.0], }, ) diff --git a/imod/tests/test_mf6/test_multimodel/test_mf6_partitioning_unstructured.py b/imod/tests/test_mf6/test_multimodel/test_mf6_partitioning_unstructured.py index 0ec8a78e8..1eba1227f 100644 --- a/imod/tests/test_mf6/test_multimodel/test_mf6_partitioning_unstructured.py +++ b/imod/tests/test_mf6/test_multimodel/test_mf6_partitioning_unstructured.py @@ -4,13 +4,13 @@ import geopandas as gpd import numpy as np import pytest -import shapely import xarray as xr import xugrid as xu from pytest_cases import parametrize_with_cases import imod from imod.mf6 import Modflow6Simulation +from imod.prepare.hfb import linestring_to_square_zpolygons from imod.typing.grid import ones_like, zeros_like @@ -73,13 +73,17 @@ def case_hfb_vertical(self): # Vertical line at x = -100 barrier_y = [-990.0, 990.0] barrier_x = [-100.0, -100.0] + barrier_ztop = [10.0] + barrier_zbottom = [0.0] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) return gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ "resistance": [10.0], - "ztop": [10.0], - "zbottom": [0.0], }, ) @@ -87,13 +91,17 @@ def case_hfb_horizontal(self): # Horizontal line at y = -100.0 barrier_x = [-990.0, 990.0] barrier_y = [-100.0, -100.0] + barrier_ztop = [10.0] + barrier_zbottom = [0.0] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) return gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ "resistance": [10.0], - "ztop": [10.0], - "zbottom": [0.0], }, ) @@ -101,27 +109,17 @@ def case_hfb_horizontal_outside_domain(self): # Horizontal line at y = -100.0 running outside domain barrier_x = [-990.0, 10_000.0] barrier_y = [-100.0, -100.0] + barrier_ztop = [10.0] + barrier_zbottom = [0.0] - return gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], - data={ - "resistance": [10.0], - "ztop": [10.0], - "zbottom": [0.0], - }, + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom ) - def case_hfb_horizontal_origin(self): - # Horizontal line through origin - barrier_x = [-990.0, 990.0] - barrier_y = [0.0, 0.0] - return gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ "resistance": [10.0], - "ztop": [10.0], - "zbottom": [0.0], }, ) @@ -129,13 +127,17 @@ def case_hfb_diagonal(self): # Diagonal line barrier_y = [-480.0, 480.0] barrier_x = [-480.0, 480.0] + barrier_ztop = [10.0] + barrier_zbottom = [0.0] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) return gpd.GeoDataFrame( - geometry=[shapely.linestrings(barrier_x, barrier_y)], + geometry=polygons, data={ "resistance": [10.0], - "ztop": [10.0], - "zbottom": [0.0], }, ) diff --git a/imod/tests/test_mf6/test_package_sanity.py b/imod/tests/test_mf6/test_package_sanity.py index 95ded8f19..cfd2c0851 100644 --- a/imod/tests/test_mf6/test_package_sanity.py +++ b/imod/tests/test_mf6/test_package_sanity.py @@ -42,6 +42,7 @@ imod.mf6.HorizontalFlowBarrierHydraulicCharacteristic, imod.mf6.HorizontalFlowBarrierMultiplier, imod.mf6.HorizontalFlowBarrierResistance, + imod.mf6.LayeredWell, ] @@ -108,3 +109,12 @@ def 
test_save_and_load(instance, tmp_path): def test_repr(instance): assert isinstance(instance.__repr__(), str) assert isinstance(instance._repr_html_(), str) + + +@pytest.mark.parametrize("instance", ALL_PACKAGE_INSTANCES) +def test_from_dataset(instance): + pkg_class = type(instance) + ds = instance.dataset + new_instance = pkg_class._from_dataset(ds) + assert isinstance(new_instance, pkg_class) + assert instance.dataset.equals(new_instance.dataset) diff --git a/imod/tests/test_mf6/test_utilities/test_grid.py b/imod/tests/test_mf6/test_utilities/test_grid.py new file mode 100644 index 000000000..9e444146a --- /dev/null +++ b/imod/tests/test_mf6/test_utilities/test_grid.py @@ -0,0 +1,29 @@ +import numpy as np + +from imod.mf6.utilities.grid import create_smallest_target_grid +from imod.util.spatial import empty_2d + + +def test_create_smallest_target_grid(): + # Three grids with aligned cell edges at each 100m + grid1 = empty_2d(dx=25.0, xmin=100.0, xmax=300.0, dy=-25.0, ymin=100.0, ymax=300.0) + grid2 = empty_2d(dx=50.0, xmin=0.0, xmax=200.0, dy=-50.0, ymin=0.0, ymax=200.0) + grid3 = empty_2d(dx=20.0, xmin=0.0, xmax=400.0, dy=-20.0, ymin=0.0, ymax=400.0) + + actual = create_smallest_target_grid(grid1, grid2, grid3) + + assert actual.coords["dx"] == 20.0 + assert actual.coords["dy"] == -20.0 + assert np.all(actual.coords["x"].values == [110.0, 130.0, 150.0, 170.0, 190.0]) + assert np.all(actual.coords["y"].values == [190.0, 170.0, 150.0, 130.0, 110.0]) + + # Two grids with barely aligned cell edges. + grid1 = empty_2d(dx=50.0, xmin=110.0, xmax=220.0, dy=-50.0, ymin=110.0, ymax=220.0) + grid2 = empty_2d(dx=20.0, xmin=0.0, xmax=400.0, dy=-20.0, ymin=0.0, ymax=400.0) + + actual = create_smallest_target_grid(grid1, grid2) + + assert actual.coords["dx"] == 20.0 + assert actual.coords["dy"] == -20.0 + assert np.all(actual.coords["x"].values == [120.0, 140.0, 160.0, 180.0, 200.0]) + assert np.all(actual.coords["y"].values == [210.0, 190.0, 170.0, 150.0, 130.0]) diff --git a/imod/tests/test_mf6/test_utilities/test_hfb.py b/imod/tests/test_mf6/test_utilities/test_hfb.py new file mode 100644 index 000000000..e55aedd34 --- /dev/null +++ b/imod/tests/test_mf6/test_utilities/test_hfb.py @@ -0,0 +1,61 @@ +import pandas as pd +import pytest + +from imod.mf6.utilities.hfb import _prepare_index_names + + +def test_pandas_behavior_index_naming_expected(): + """ + Monitor whether pandas behaviour changes. Quite some logic in + mf6/utilities/hfb depends on specific index naming and how pandas behaves. + This tests if this behaviour goes unchanged. 
+ """ + + df = pd.DataFrame(data={"col1": [1, 2], "col2": [1, 2]}) + df_reset = df.reset_index(drop=False) + + assert df.index.names == [None] + assert df.index.name is None + assert "index" in df_reset.columns + + df_roundtrip = df_reset.set_index("index") + + assert df_roundtrip.index.names == ["index"] + assert df_roundtrip.index.name == "index" + + +def test_prepare_index_names(): + # Case 1: Single index, unnamed + df = pd.DataFrame(data={"col1": [1, 2], "col2": [1, 2]}) + df_prepared = _prepare_index_names(df.copy()) + + assert df_prepared.index.names == ["index"] + + # Case 2: Single index, named + df_index_named = df.copy() + df_index_named.index = df.index.set_names(["index"]) + df_prepared = _prepare_index_names(df_index_named) + + assert df_prepared.index.names == ["index"] + + # Case 3: Multi index, unnamed + df_renamed = df.rename(columns={"col1": "parts"}) + df_multi_unnamed = df_renamed.set_index([df.index, "parts"]) + assert df_multi_unnamed.index.names == [None, "parts"] + df_prepared = _prepare_index_names(df_multi_unnamed) + + assert df_prepared.index.names == ["index", "parts"] + + # Case 4: Multi index, named + df_renamed = df_index_named.rename(columns={"col1": "parts"}) + df_multi_unnamed = df_renamed.set_index([df_renamed.index, "parts"]) + assert df_multi_unnamed.index.names == ["index", "parts"] + df_prepared = _prepare_index_names(df_multi_unnamed) + + assert df_prepared.index.names == ["index", "parts"] + + # Case 5: Wrong index name + df_index_wrongname = df.copy() + df_index_wrongname.index = df.index.set_names(["wrong_name"]) + with pytest.raises(IndexError): + _prepare_index_names(df_index_wrongname) diff --git a/imod/tests/test_mf6/test_utilities/test_imod5_converter.py b/imod/tests/test_mf6/test_utilities/test_imod5_converter.py new file mode 100644 index 000000000..826c1b440 --- /dev/null +++ b/imod/tests/test_mf6/test_utilities/test_imod5_converter.py @@ -0,0 +1,75 @@ +import pytest +import xarray as xr +from pytest_cases import parametrize_with_cases + +from imod.mf6.utilities.imod5_converter import convert_ibound_to_idomain +from imod.util import empty_3d + + +@pytest.fixture(scope="function") +def template_grid(): + dx = 50.0 + dy = -50.0 + xmin = 0.0 + xmax = 100.0 + ymin = 0.0 + ymax = 100.0 + layer = [1, 2, 3, 4, 5] + + return empty_3d(dx, xmin, xmax, dy, ymin, ymax, layer).fillna(1.0) + + +class IboundCases: + def case_active(self): + thickness = [1.0, 1.0, 1.0, 1.0, 1.0] + ibound = [1, 1, 1, 1, 1] + idomain = [1, 1, 1, 1, 1] + return thickness, ibound, idomain + + def case_inactive(self): + thickness = [1.0, 1.0, 1.0, 1.0, 1.0] + ibound = [0, 1, 1, 1, 0] + idomain = [0, 1, 1, 1, 0] + return thickness, ibound, idomain + + def case_min1(self): + thickness = [1.0, 1.0, 1.0, 1.0, 1.0] + ibound = [1, -1, 1, -1, 0] + idomain = [1, 1, 1, 1, 0] + return thickness, ibound, idomain + + def case_all_inactive(self): + thickness = [1.0, 0.0, 0.0, 0.0, 1.0] + ibound = [0, 0, 0, 0, 0] + idomain = [0, 0, 0, 0, 0] + return thickness, ibound, idomain + + def case_vpt(self): + thickness = [1.0, 0.0, 1.0, 0.0, 1.0] + ibound = [1, 1, 1, 1, 1] + idomain = [1, -1, 1, -1, 1] + return thickness, ibound, idomain + + def case_vpt_zero_thickness_at_edge(self): + thickness = [1.0, 0.0, 1.0, 0.0, 1.0] + ibound = [0, 1, 1, 1, 0] + idomain = [0, 0, 1, 0, 0] + return thickness, ibound, idomain + + def case_mixed(self): + thickness = [1.0, 0.0, 1.0, 0.0, 1.0] + ibound = [1, -1, 1, 1, 0] + idomain = [1, -1, 1, 0, 0] + return thickness, ibound, idomain + + 
+@parametrize_with_cases(argnames="thickness,ibound,expected", cases=IboundCases) +def test_convert_ibound_to_idomain(template_grid, thickness, ibound, expected): + layer = template_grid.coords["layer"] + thickness = xr.ones_like(layer) * thickness * template_grid + ibound = xr.ones_like(layer) * ibound * template_grid + expected = xr.ones_like(layer) * expected * template_grid + + actual = convert_ibound_to_idomain(ibound, thickness) + + assert actual.equals(expected) diff --git a/imod/tests/test_mf6/test_utilities/test_mf6hfb.py b/imod/tests/test_mf6/test_utilities/test_mf6hfb.py new file mode 100644 index 000000000..59027ae9a --- /dev/null +++ b/imod/tests/test_mf6/test_utilities/test_mf6hfb.py @@ -0,0 +1,230 @@ +import geopandas as gpd +import numpy as np +import pytest +import shapely +import xarray as xr + +from imod.mf6.hfb import ( + HorizontalFlowBarrierResistance, + SingleLayerHorizontalFlowBarrierResistance, +) +from imod.mf6.utilities.mf6hfb import merge_hfb_packages +from imod.prepare.hfb import linestring_to_square_zpolygons + + +@pytest.mark.usefixtures("structured_flow_model") +@pytest.fixture(scope="function") +def modellayers_single_layer(structured_flow_model): + model = structured_flow_model.clip_box(layer_max=1) + dis = model["dis"] + dis["bottom"] = dis["bottom"].isel(x=0, y=0, drop=True) + npf = model["npf"] + + return { + "idomain": dis["idomain"], + "top": dis["top"], + "bottom": dis["bottom"], + "k": npf["k"], + } + + +@pytest.mark.usefixtures("structured_flow_model") +@pytest.fixture(scope="function") +def modellayers(structured_flow_model): + model = structured_flow_model + dis = model["dis"] + dis["bottom"] = dis["bottom"].isel(x=0, y=0, drop=True) + npf = model["npf"] + + return { + "idomain": dis["idomain"], + "top": dis["top"], + "bottom": dis["bottom"], + "k": npf["k"], + } + + +def make_layer_geometry(resistance, layer): + barrier_y = [11.0, 5.0, -1.0] + barrier_x = [5.0, 5.0, 5.0] + + geometry = gpd.GeoDataFrame( + geometry=[shapely.linestrings(barrier_x, barrier_y)], + data={ + "resistance": [resistance], + "layer": [layer], + }, + ) + return geometry + + +def make_depth_geometry(resistance, top, bot): + barrier_y = [11.0, 5.0, -1.0] + barrier_x = [5.0, 5.0, 5.0] + barrier_ztop = [top, top] + barrier_zbottom = [bot, bot] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbottom + ) + + geometry = gpd.GeoDataFrame( + geometry=polygons, + data={ + "resistance": [resistance, resistance], + }, + ) + return geometry + + +def test_merge_three_hfbs__single_layer(modellayers_single_layer): + """Merge three single layer hfbs, test for lenght""" + # Arrange + n_barriers = 3 + single_resistance = 400.0 + + geometry = make_layer_geometry(single_resistance, 1) + hfb_ls = [ + SingleLayerHorizontalFlowBarrierResistance(geometry) for _ in range(n_barriers) + ] + + # Act + mf6_hfb = merge_hfb_packages(hfb_ls, **modellayers_single_layer) + + # Assert + assert mf6_hfb["cell_id"].shape == (6,) + assert (mf6_hfb["layer"] == 1).all() + expected_resistance = n_barriers * single_resistance + assert (expected_resistance == 1 / mf6_hfb["hydraulic_characteristic"]).all() + + +def test_merge_three_hfbs__compare_single_hfb(modellayers_single_layer): + """ + Merge three single layer hfbs, compare with one hfb with tripled + resistance as created with a call to merge_hfb_packages. 
+ """ + # Arrange + n_barriers = 3 + single_resistance = 400.0 + + geometry = make_layer_geometry(single_resistance, 1) + geometry_tripled = make_layer_geometry(n_barriers * single_resistance, 1) + + hfb_ls_triple = [ + SingleLayerHorizontalFlowBarrierResistance(geometry) for _ in range(n_barriers) + ] + hfb_ls_single = [SingleLayerHorizontalFlowBarrierResistance(geometry_tripled)] + + # Act + mf6_hfb_three = merge_hfb_packages(hfb_ls_triple, **modellayers_single_layer) + mf6_hfb_single = merge_hfb_packages(hfb_ls_single, **modellayers_single_layer) + + # Assert + xr.testing.assert_equal(mf6_hfb_single.dataset, mf6_hfb_three.dataset) + + +def test_merge_three_hfbs__to_mf6_pkg_single_layer(modellayers_single_layer): + """ + Merge three single layer hfbs, compare with one hfb with tripled + resistance as with a call to to_mf6_pkg. + """ + # Arrange + n_barriers = 3 + single_resistance = 400.0 + + geometry = make_layer_geometry(single_resistance, 1) + geometry_tripled = make_layer_geometry(n_barriers * single_resistance, 1) + + hfb_ls_triple = [ + SingleLayerHorizontalFlowBarrierResistance(geometry) for _ in range(n_barriers) + ] + hfb_ls_single = [SingleLayerHorizontalFlowBarrierResistance(geometry_tripled)] + + # Act + mf6_hfb_three = merge_hfb_packages(hfb_ls_triple, **modellayers_single_layer) + mf6_hfb_single = hfb_ls_single[0].to_mf6_pkg(**modellayers_single_layer) + + # Assert + xr.testing.assert_equal(mf6_hfb_single.dataset, mf6_hfb_three.dataset) + + +def test_merge_mixed_hfbs__single_layer(modellayers_single_layer): + """Merge mix of layer hfb and depth hfb.""" + # Arrange + n_barriers = 3 + single_resistance = 400.0 + + top = float(modellayers_single_layer["top"].values) + bot = modellayers_single_layer["bottom"].values[0] + + geometry = make_layer_geometry(single_resistance, 1) + geometry_depth = make_depth_geometry(single_resistance, top, bot) + + hfb_ls_triple = [ + SingleLayerHorizontalFlowBarrierResistance(geometry), + HorizontalFlowBarrierResistance(geometry_depth), + HorizontalFlowBarrierResistance(geometry_depth), + ] + + # Act + mf6_hfb = merge_hfb_packages(hfb_ls_triple, **modellayers_single_layer) + + # Assert + assert mf6_hfb["cell_id"].shape == (6,) + assert (mf6_hfb["layer"] == 1).all() + expected_resistance = n_barriers * single_resistance + assert (expected_resistance == 1 / mf6_hfb["hydraulic_characteristic"]).all() + + +def test_merge_three_hfbs__multiple_single_layers(modellayers): + """Merge three single layer hfbs at different layers""" + # Arrange + n_barriers = 3 + single_resistance = 400.0 + + hfb_ls = [ + SingleLayerHorizontalFlowBarrierResistance( + make_layer_geometry(single_resistance, i) + ) + for i in range(1, n_barriers + 1) + ] + + # Act + mf6_hfb = merge_hfb_packages(hfb_ls, **modellayers) + + # Assert + assert mf6_hfb["cell_id"].shape == (18,) + assert np.all(np.unique(mf6_hfb["layer"]) == np.array([1, 2, 3])) + expected_resistance = single_resistance + assert (expected_resistance == 1 / mf6_hfb["hydraulic_characteristic"]).all() + + +def test_merge_mixed_hfbs__multiple_layer(modellayers): + """ + Merge three single layer hfbs at different layers plus one hfb spread across + the complete depth. 
+ """ + # Arrange + n_barriers = 3 + single_resistance = 400.0 + + hfb_ls = [ + SingleLayerHorizontalFlowBarrierResistance( + make_layer_geometry(single_resistance, i) + ) + for i in range(1, n_barriers + 1) + ] + hfb_ls.append( + HorizontalFlowBarrierResistance( + make_depth_geometry(single_resistance, 10.0, -3.0) + ) + ) + + # Act + mf6_hfb = merge_hfb_packages(hfb_ls, **modellayers) + + # Assert + assert mf6_hfb["cell_id"].shape == (18,) + assert np.all(np.unique(mf6_hfb["layer"]) == np.array([1, 2, 3])) + expected_resistance = 2 * single_resistance + assert (expected_resistance == 1 / mf6_hfb["hydraulic_characteristic"]).all() diff --git a/imod/tests/test_mf6/test_utilities/test_resampling.py b/imod/tests/test_mf6/test_utilities/test_resampling.py new file mode 100644 index 000000000..e72549bb0 --- /dev/null +++ b/imod/tests/test_mf6/test_utilities/test_resampling.py @@ -0,0 +1,140 @@ +from datetime import datetime + +import pandas as pd + +from imod.util.expand_repetitions import resample_timeseries + + +def initialize_timeseries(times: list[datetime], rates: list[float]) -> pd.DataFrame: + timeseries = pd.DataFrame( + {}, columns=["time", "rate", "x", "y", "id", "filt_top", "filt_bot"] + ) + timeseries["time"] = times + timeseries["rate"] = rates + timeseries["x"] = 0 + timeseries["y"] = 0 + timeseries["id"] = "ID" + timeseries["filt_top"] = 20 + timeseries["filt_bot"] = 10 + + return timeseries + + +def test_timeseries_resampling(): + # In this test, we resample a timeseries for a coarser output discretization. + # The output times are a subset of the input times. + times = [datetime(1989, 1, i) for i in [1, 3, 4, 5, 6]] + rates = [i * 100 for i in range(1, 6)] + timeseries = initialize_timeseries(times, rates) + + new_dates = [datetime(1989, 1, 1), datetime(1989, 1, 5), datetime(1989, 1, 6)] + new_timeseries = resample_timeseries(timeseries, new_dates) + + expected_times = [datetime(1989, 1, i) for i in [1, 5, 6]] + expected_rates = [175.0, 400.0, 500.0] + expected_timeseries = initialize_timeseries(expected_times, expected_rates) + + assert new_timeseries.equals(expected_timeseries) + + +def test_timeseries_resampling_2(): + # In this test, we resample a timeseries for a coarser output discretization. + # The output times are a not a subset of the input times, and they begin earlier. + times = [datetime(1989, 1, 1, 14, 0, 0)] + times += [datetime(1989, 1, i) for i in [3, 4, 5, 6]] + rates = [i * 100 for i in range(1, 6)] + timeseries = initialize_timeseries(times, rates) + + # initialize data of lists. + + new_dates = [datetime(1989, 1, 1), datetime(1989, 1, 5), datetime(1989, 1, 6)] + new_timeseries = resample_timeseries(timeseries, new_dates) + + expected_times = [datetime(1989, 1, i) for i in [1, 5, 6]] + expected_rates = [160.416667, 400.0, 500.0] + expected_timeseries = initialize_timeseries(expected_times, expected_rates) + + pd.testing.assert_frame_equal( + new_timeseries, expected_timeseries, check_dtype=False + ) + + +def test_timeseries_resampling_3(): + # In this test, we resample a timeseries for a coarser output discretization. + # The output times are a subset of the input times. + + times = [datetime(1989, 1, i) for i in [1, 3, 4, 5]] + times += [datetime(1999, 1, 6)] # Ten years after last entry. 
+ rates = [i * 100 for i in range(1, 6)] + timeseries = initialize_timeseries(times, rates) + + new_dates = [datetime(1989, 1, 1), datetime(1989, 1, 5), datetime(1989, 1, 6)] + new_timeseries = resample_timeseries(timeseries, new_dates) + + expected_times = [datetime(1989, 1, i) for i in [1, 5, 6]] + expected_rates = [175.0, 400.0, 400.0] + expected_timeseries = initialize_timeseries(expected_times, expected_rates) + + pd.testing.assert_frame_equal( + new_timeseries, expected_timeseries, check_dtype=False + ) + + +def test_timeseries_resampling_4(): + # In this test, we resample a timeseries for a coarser output discretization. + # The output times are a not a subset of the input times, and they end later. + # initialize data of lists. + + times = [datetime(1989, 1, i, 11, 0, 0) for i in [1, 2, 3, 4, 5]] + rates = [100, 100, 200, 300, 300] + timeseries = initialize_timeseries(times, rates) + + new_dates = [datetime(1989, 1, 1), datetime(1989, 1, 3), datetime(1999, 1, 4)] + new_timeseries = resample_timeseries(timeseries, new_dates) + + expected_times = [datetime(1989, 1, 1), datetime(1989, 1, 3), datetime(1999, 1, 4)] + expected_rates = [77.083333, 299.947532, 300] + expected_timeseries = initialize_timeseries(expected_times, expected_rates) + + pd.testing.assert_frame_equal( + new_timeseries, expected_timeseries, check_dtype=False + ) + + +def test_timeseries_resampling_coarsen_and_refine(): + # In this test, we resample a timeseries for a coarser output discretization. + # Then we refine it again to the original discretization. + # The coarsening was chosen so that after this the original timeseries should be obtained + + original_times = [datetime(1989, 1, i) for i in [1, 2, 3, 4, 5, 6]] + original_rates = [100, 100, 200, 200, 300, 300] + original_timeseries = initialize_timeseries(original_times, original_rates) + + coarse_times = [datetime(1989, 1, 1), datetime(1989, 1, 3), datetime(1989, 1, 5)] + coarse_timeseries = resample_timeseries(original_timeseries, coarse_times) + + re_refined_timeseries = resample_timeseries(coarse_timeseries, original_times) + + pd.testing.assert_frame_equal( + original_timeseries, re_refined_timeseries, check_dtype=False + ) + + +def test_timeseries_resampling_refine_and_coarsen(): + # In this test, we resample a timeseries for a finer output discretization. + # Then we coarsen it again to the original discretization. 
+ # The refinement was chosen so that after this the original timeseries should be obtained + original_times = [datetime(1989, 1, i) for i in [1, 2, 3, 4, 5, 6]] + original_rates = [100, 100, 200, 200, 300, 300] + original_timeseries = initialize_timeseries(original_times, original_rates) + + refined_times = pd.date_range( + datetime(1989, 1, 1), datetime(1989, 1, 6), periods=121 + ) + refined_timeseries = resample_timeseries(original_timeseries, refined_times) + + re_coarsened_timeseries = resample_timeseries(refined_timeseries, original_times) + + pd.testing.assert_frame_equal( + original_timeseries, re_coarsened_timeseries, check_dtype=False + ) diff --git a/imod/tests/test_mf6/test_well_highlvl.py b/imod/tests/test_mf6/test_well_highlvl.py index ffa0935f9..98479c99c 100644 --- a/imod/tests/test_mf6/test_well_highlvl.py +++ b/imod/tests/test_mf6/test_well_highlvl.py @@ -12,7 +12,9 @@ import imod.tests.fixtures import imod.tests.fixtures.mf6_circle_fixture import imod.tests.fixtures.mf6_twri_fixture +from imod.mf6.validation_context import ValidationContext from imod.mf6.write_context import WriteContext +from imod.schemata import ValidationError from imod.tests.fixtures.mf6_small_models_fixture import ( grid_data_structured, grid_data_structured_layered, @@ -79,10 +81,9 @@ def test_write_well(tmp_path: Path, grid_data, grid_data_layered, reference_outp k = 100.0 top = ones_like(active.sel(layer=1), dtype=np.float64) bottom = grid_data_layered(np.float64, -2.0, 10) + validation_context = ValidationContext(False) write_context = WriteContext(tmp_path) - mf6_pkg = well.to_mf6_pkg( - active, top, bottom, k, False, write_context.is_partitioned - ) + mf6_pkg = well._to_mf6_pkg(active, top, bottom, k, validation_context) mf6_pkg.write("packagename", globaltimes, write_context) assert pathlib.Path.exists(tmp_path / "packagename.wel") assert pathlib.Path.exists(tmp_path / "packagename" / "wel.dat") @@ -149,7 +150,8 @@ def test_write_well_from_model_transient_rate( def test_write_all_wells_filtered_out( tmp_path: Path, twri_simulation: imod.mf6.Modflow6Simulation ): - # for this test, we leave the low conductivity of the twri model as is, so all wells get filtered out + # for this test, we leave the low conductivity of the twri model as is, so + # all wells get filtered out twri_simulation["GWF_1"]["well"] = imod.mf6.Well( x=[1.0, 6002.0], y=[3.0, 5004.0], @@ -160,15 +162,16 @@ def test_write_all_wells_filtered_out( validate=True, ) - with pytest.raises(ValueError): + with pytest.raises(ValidationError): twri_simulation.write(tmp_path, binary=False) def test_write_one_well_filtered_out( tmp_path: Path, twri_simulation: imod.mf6.Modflow6Simulation ): - # for this test, we increase the low conductivity of the twri model . But one of the wells violates the thickness constraint (the second one) - # and gets filtered out alltogether + # for this test, we increase the low conductivity of the twri model . 
But + # one of the wells violates the thickness constraint (the second one) and + # gets filtered out alltogether twri_simulation["GWF_1"]["npf"]["k"] *= 20000 twri_simulation["GWF_1"]["well"] = imod.mf6.Well( x=[1.0, 6002.0], @@ -180,7 +183,7 @@ def test_write_one_well_filtered_out( validate=True, ) - with pytest.raises(ValueError): + with pytest.raises(ValidationError): twri_simulation.write(tmp_path, binary=False) @@ -233,7 +236,7 @@ def test_constraints_are_configurable( twri_simulation["GWF_1"]["well"]["minimum_k"] = 1.0 twri_simulation["GWF_1"]["well"]["minimum_thickness"] = 1.0 - with pytest.raises(ValueError): + with pytest.raises(ValidationError): twri_simulation.write(tmp_path, binary=False) # reset the constraints again so that all constraints are met @@ -246,7 +249,7 @@ def test_constraints_are_configurable( twri_simulation["GWF_1"]["well"]["minimum_k"] = 1e-9 twri_simulation["GWF_1"]["well"]["minimum_thickness"] = 700.0 - with pytest.raises(ValueError): + with pytest.raises(ValidationError): twri_simulation.write(tmp_path, binary=False) diff --git a/imod/tests/test_assign_wells.py b/imod/tests/test_prepare/test_assign_wells.py similarity index 76% rename from imod/tests/test_assign_wells.py rename to imod/tests/test_prepare/test_assign_wells.py index 38f2669ce..f72928d43 100644 --- a/imod/tests/test_assign_wells.py +++ b/imod/tests/test_prepare/test_assign_wells.py @@ -8,7 +8,7 @@ from imod.testing import assert_frame_equal -def test_vectorized_overlap(): +def test_compute_vectorized_overlap(): bounds_a = np.array( [ [0.0, 3.0], @@ -21,34 +21,57 @@ def test_vectorized_overlap(): [1.0, 2.0], ] ) - actual = prepwel.vectorized_overlap(bounds_a, bounds_b) + actual = prepwel.compute_vectorized_overlap(bounds_a, bounds_b) assert np.array_equal(actual, np.array([1.0, 1.0])) +def test_compute_point_filter_overlap(): + bounds_well = np.array( + [ + [0.0, 0.0], + [0.0, 0.0], + [0.0, 1.0], + [0.0, 0.0], + [0.0, 0.0], + ] + ) + bounds_layer = np.array( + [ + [1.0, 2.0], + [-1.0, 2.0], + [-1.0, 2.0], + [0.0, 0.0], + [-5.0, -4.0], + ] + ) + actual = prepwel.compute_point_filter_overlap(bounds_well, bounds_layer) + assert np.array_equal(actual, np.array([0.0, 3.0, 0.0, 0.0, 0.0])) + + def test_compute_overlap(): # Three wells wells = pd.DataFrame( { - "top": [5.0, 4.0, 3.0], - "bottom": [4.0, 2.0, -1.0], + "top": [5.0, 4.0, 3.0, 0.0], + "bottom": [4.0, 2.0, -1.0, 0.0], } ) top = xr.DataArray( data=[ - [10.0, 10.0, 10.0], - [0.0, 0.0, 0.0], + [10.0, 10.0, 10.0, 10.0], + [0.0, 0.0, 0.0, 0.0], ], dims=["layer", "index"], ) bottom = xr.DataArray( data=[ - [0.0, 0.0, 0.0], - [-10.0, -10.0, -10.0], + [0.0, 0.0, 0.0, 0.0], + [-10.0, -10.0, -10.0, -10.0], ], dims=["layer", "index"], ) actual = prepwel.compute_overlap(wells, top, bottom) - expected = np.array([1.0, 2.0, 3.0, 0.0, 0.0, 1.0]) + expected = np.array([1.0, 2.0, 3.0, 0.0, 0.0, 0.0, 1.0, 10.0]) assert np.allclose(actual, expected) @@ -79,7 +102,7 @@ def case_mix_wells(self): "id": [1, 2, 3, 4], "top": [5.0, 4.0, 3.0, 0.0], "bottom": [4.0, 2.0, -1.0, 0.0], - "rate": [1.0, 10.0, 100.0, 0.0], + "rate": [1.0, 10.0, 100.0, 0.1], } ) return wells, top, bottom, k @@ -109,7 +132,7 @@ def case_all_in_domain(self): "id": [1, 2, 3, 4], "top": [5.0, 4.0, 3.0, 0.0], "bottom": [4.0, 2.0, -1.0, 0.0], - "rate": [1.0, 10.0, 100.0, 0.0], + "rate": [1.0, 10.0, 100.0, 0.1], } ) @@ -173,9 +196,9 @@ def test_assign_wells_errors(self, wells, top, bottom, k): with pytest.raises(ValueError, match="Columns are missing"): faulty_wells = pd.DataFrame({"id": [1], "x": 
[1.0], "y": [1.0]}) prepwel.assign_wells(faulty_wells, top, bottom, k) - with pytest.raises(TypeError, match="top, bottom, and optionally"): + with pytest.raises(TypeError, match="top, bottom, "): prepwel.assign_wells(wells, top, bottom.values) - with pytest.raises(TypeError, match="top, bottom, and optionally"): + with pytest.raises(TypeError, match="top, bottom, k"): prepwel.assign_wells(wells, top.values, bottom, k) @parametrize_with_cases( @@ -190,17 +213,17 @@ def test_assign_wells__no_kh(self, wells, top, bottom, k): assert isinstance(actual, pd.DataFrame) expected = pd.DataFrame( { - "index": [0, 1, 2, 3], - "id": [1, 2, 3, 3], - "layer": [1, 1, 1, 2], - "bottom": [4.0, 2.0, -1.0, -1.0], - "overlap": [1.0, 2.0, 3.0, 1.0], - "rate": [1.0, 10.0, 75.0, 25.0], - "top": [5.0, 4.0, 3.0, 3.0], - "k": [1.0, 1.0, 1.0, 1.0], - "transmissivity": [1.0, 2.0, 3.0, 1.0], - "x": [0.6, 1.1, 2.3, 2.3], - "y": [0.6, 1.1, 2.3, 2.3], + "index": [0, 1, 2, 3, 4], + "id": [1, 2, 3, 3, 4], + "layer": [1, 1, 1, 2, 2], + "bottom": [4.0, 2.0, -1.0, -1.0, 0.0], + "overlap": [1.0, 2.0, 3.0, 1.0, 10.0], + "rate": [1.0, 10.0, 75.0, 25.0, 0.1], + "top": [5.0, 4.0, 3.0, 3.0, 0.0], + "k": [1.0, 1.0, 1.0, 1.0, 1.0], + "transmissivity": [1.0, 2.0, 3.0, 1.0, 10.0], + "x": [0.6, 1.1, 2.3, 2.3, 2.6], + "y": [0.6, 1.1, 2.3, 2.3, 2.6], } ) assert_frame_equal(actual, expected, check_like=True) @@ -218,17 +241,17 @@ def test_assign_wells(self, wells, top, bottom, k): assert isinstance(actual, pd.DataFrame) expected = pd.DataFrame( { - "index": [0, 1, 2, 3], - "id": [1, 2, 3, 3], - "layer": [1, 1, 1, 2], - "bottom": [4.0, 2.0, -1.0, -1.0], - "overlap": [1.0, 2.0, 3.0, 1.0], - "rate": [1.0, 10.0, 60.0, 40.0], - "top": [5.0, 4.0, 3.0, 3.0], - "k": [10.0, 10.0, 10.0, 20.0], - "transmissivity": [10.0, 20.0, 30.0, 20.0], - "x": [0.6, 1.1, 2.3, 2.3], - "y": [0.6, 1.1, 2.3, 2.3], + "index": [0, 1, 2, 3, 4], + "id": [1, 2, 3, 3, 4], + "layer": [1, 1, 1, 2, 2], + "bottom": [4.0, 2.0, -1.0, -1.0, 0.0], + "overlap": [1.0, 2.0, 3.0, 1.0, 10.0], + "rate": [1.0, 10.0, 60.0, 40.0, 0.1], + "top": [5.0, 4.0, 3.0, 3.0, 0.0], + "k": [10.0, 10.0, 10.0, 20.0, 20.0], + "transmissivity": [10.0, 20.0, 30.0, 20.0, 200.0], + "x": [0.6, 1.1, 2.3, 2.3, 2.6], + "y": [0.6, 1.1, 2.3, 2.3, 2.6], } ) assert_frame_equal(actual, expected, check_like=True) @@ -247,17 +270,17 @@ def test_assign_wells_minimum_thickness(self, wells, top, bottom, k): assert isinstance(actual, pd.DataFrame) expected = pd.DataFrame( { - "index": [0, 1], - "id": [2, 3], - "layer": [1, 1], - "bottom": [2.0, -1.0], - "overlap": [2.0, 3.0], - "rate": [10.0, 100.0], - "top": [4.0, 3.0], - "k": [10.0, 10.0], - "transmissivity": [20.0, 30.0], - "x": [1.1, 2.3], - "y": [1.1, 2.3], + "index": [0, 1, 2], + "id": [2, 3, 4], + "layer": [1, 1, 2], + "bottom": [2.0, -1.0, 0.0], + "overlap": [2.0, 3.0, 10.0], + "rate": [10.0, 100.0, 0.1], + "top": [4.0, 3.0, 0.0], + "k": [10.0, 10.0, 20.0], + "transmissivity": [20.0, 30.0, 200.0], + "x": [1.1, 2.3, 2.6], + "y": [1.1, 2.3, 2.6], } ) assert_frame_equal(actual, expected, check_like=True) @@ -281,7 +304,7 @@ def test_assign_wells_transient_rate(self, wells, top, bottom, k): bottom=bottom, k=k, ) - assert np.array_equal(actual["id"], np.repeat([1, 2, 3, 3], 5)) + assert np.array_equal(actual["id"], np.repeat([1, 2, 3, 3, 4], 5)) actual = prepwel.assign_wells( wells=transient_wells, @@ -290,7 +313,7 @@ def test_assign_wells_transient_rate(self, wells, top, bottom, k): k=k, minimum_thickness=1.01, ) - assert np.array_equal(actual["id"], np.repeat([2, 3], 
5)) + assert np.array_equal(actual["id"], np.repeat([2, 3, 4], 5)) @parametrize_with_cases( "wells, top, bottom, k", cases=AssignWellCases.case_mix_wells diff --git a/imod/tests/test_prepare/test_cleanup.py b/imod/tests/test_prepare/test_cleanup.py new file mode 100644 index 000000000..b6b7861c4 --- /dev/null +++ b/imod/tests/test_prepare/test_cleanup.py @@ -0,0 +1,248 @@ +from typing import Callable + +import numpy as np +import pandas as pd +import pytest +import xugrid as xu +from pytest_cases import parametrize, parametrize_with_cases + +from imod.prepare.cleanup import cleanup_drn, cleanup_ghb, cleanup_riv, cleanup_wel +from imod.tests.test_mf6.test_mf6_riv import DisCases, RivDisCases +from imod.typing import GridDataArray + + +def _first(grid: GridDataArray): + """ + helper function to get first value, regardless of unstructured or + structured grid.""" + return grid.values.ravel()[0] + + +def _first_index(grid: GridDataArray) -> tuple: + if isinstance(grid, xu.UgridDataArray): + return (0, 0) + else: + return (0, 0, 0) + + +def _rename_data_dict(data: dict, func: Callable): + renamed = data.copy() + to_rename = _RENAME_DICT[func] + for src, dst in to_rename.items(): + mv_data = renamed.pop(src) + if dst is not None: + renamed[dst] = mv_data + return renamed + + +def _prepare_dis_dict(dis_dict: dict, func: Callable): + """Keep required dis args for specific cleanup functions""" + keep_vars = _KEEP_FROM_DIS_DICT[func] + return {var: dis_dict[var] for var in keep_vars} + + +_RENAME_DICT = { + cleanup_riv: {}, + cleanup_drn: {"stage": "elevation", "bottom_elevation": None}, + cleanup_ghb: {"stage": "head", "bottom_elevation": None}, +} + +_KEEP_FROM_DIS_DICT = { + cleanup_riv: ["idomain", "bottom"], + cleanup_drn: ["idomain"], + cleanup_ghb: ["idomain"], + cleanup_wel: ["top", "bottom"], +} + + +@parametrize_with_cases("riv_data, dis_data", cases=RivDisCases) +@parametrize("cleanup_func", [cleanup_drn, cleanup_ghb, cleanup_riv]) +def test_cleanup__align_nodata(riv_data: dict, dis_data: dict, cleanup_func: Callable): + dis_dict = _prepare_dis_dict(dis_data, cleanup_func) + data_dict = _rename_data_dict(riv_data, cleanup_func) + # Assure conductance not modified by previous tests. + np.testing.assert_equal(_first(data_dict["conductance"]), 1.0) + idx = _first_index(data_dict["conductance"]) + # Arrange: Deactivate one cell + first_key = next(iter(data_dict.keys())) + data_dict[first_key][idx] = np.nan + # Act + data_cleaned = cleanup_func(**dis_dict, **data_dict) + # Assert + for key in data_cleaned.keys(): + np.testing.assert_equal(_first(data_cleaned[key][idx]), np.nan) + + +@parametrize_with_cases("riv_data, dis_data", cases=RivDisCases) +@parametrize("cleanup_func", [cleanup_drn, cleanup_ghb, cleanup_riv]) +def test_cleanup__zero_conductance( + riv_data: dict, dis_data: dict, cleanup_func: Callable +): + dis_dict = _prepare_dis_dict(dis_data, cleanup_func) + data_dict = _rename_data_dict(riv_data, cleanup_func) + # Assure conductance not modified by previous tests. 
+ np.testing.assert_equal(_first(data_dict["conductance"]), 1.0) + idx = _first_index(data_dict["conductance"]) + # Arrange: Set conductance in one cell to zero + data_dict["conductance"][idx] = 0.0 + # Act + data_cleaned = cleanup_func(**dis_dict, **data_dict) + # Assert + for key in data_cleaned.keys(): + np.testing.assert_equal(_first(data_cleaned[key][idx]), np.nan) + + +@parametrize_with_cases("riv_data, dis_data", cases=RivDisCases) +@parametrize("cleanup_func", [cleanup_drn, cleanup_ghb, cleanup_riv]) +def test_cleanup__negative_concentration( + riv_data: dict, dis_data: dict, cleanup_func: Callable +): + dis_dict = _prepare_dis_dict(dis_data, cleanup_func) + data_dict = _rename_data_dict(riv_data, cleanup_func) + first_key = next(iter(data_dict.keys())) + # Create concentration data + data_dict["concentration"] = data_dict[first_key].copy() + # Assure conductance not modified by previous tests. + np.testing.assert_equal(_first(data_dict["conductance"]), 1.0) + idx = _first_index(data_dict["conductance"]) + # Arrange: Set a negative concentration in one cell + data_dict["concentration"][idx] = -10.0 + # Act + data_cleaned = cleanup_func(**dis_dict, **data_dict) + # Assert + np.testing.assert_equal(_first(data_cleaned["concentration"]), 0.0) + + +@parametrize_with_cases("riv_data, dis_data", cases=RivDisCases) +@parametrize("cleanup_func", [cleanup_drn, cleanup_ghb, cleanup_riv]) +def test_cleanup__outside_active_domain( + riv_data: dict, dis_data: dict, cleanup_func: Callable +): + dis_dict = _prepare_dis_dict(dis_data, cleanup_func) + data_dict = _rename_data_dict(riv_data, cleanup_func) + # Assure conductance not modified by previous tests. + np.testing.assert_equal(_first(data_dict["conductance"]), 1.0) + idx = _first_index(data_dict["conductance"]) + # Arrange: Deactivate one cell + dis_dict["idomain"][idx] = 0.0 + # Act + data_cleaned = cleanup_func(**dis_dict, **data_dict) + # Assert + for key in data_cleaned.keys(): + np.testing.assert_equal(_first(data_cleaned[key][idx]), np.nan) + + +@parametrize_with_cases("riv_data, dis_data", cases=RivDisCases) +def test_cleanup_riv__fix_bottom_elevation_to_bottom(riv_data: dict, dis_data: dict): + dis_dict = _prepare_dis_dict(dis_data, cleanup_riv) + # Arrange: Set bottom elevation below model layer bottom + riv_data["bottom_elevation"] -= 3.0 + # Assure conductance not modified by previous tests. + np.testing.assert_equal(_first(riv_data["conductance"]), 1.0) + # Act + riv_data_cleaned = cleanup_riv(**dis_dict, **riv_data) + # Assert + # Account for inactive river cells. + riv_active = riv_data_cleaned["stage"].notnull() + expected = dis_dict["bottom"].where(riv_active) + + np.testing.assert_equal( + riv_data_cleaned["bottom_elevation"].values, expected.values + ) + + +@parametrize_with_cases("riv_data, dis_data", cases=RivDisCases) +def test_cleanup_riv__fix_bottom_elevation_to_stage(riv_data: dict, dis_data: dict): + dis_dict = _prepare_dis_dict(dis_data, cleanup_riv) + # Arrange: Set bottom elevation above stage + riv_data["bottom_elevation"] += 3.0 + # Assure conductance not modified by previous tests. 
+ np.testing.assert_equal(_first(riv_data["conductance"]), 1.0) + # Act + riv_data_cleaned = cleanup_riv(**dis_dict, **riv_data) + # Assert + np.testing.assert_equal( + riv_data_cleaned["bottom_elevation"].values, riv_data_cleaned["stage"].values + ) + + +@parametrize_with_cases("riv_data, dis_data", cases=RivDisCases) +def test_cleanup_riv__raise_error(riv_data: dict, dis_data: dict): + """ + Test that an error is raised when the stage is below the model layer bottom, + and that the user is pointed to the right prepare function. + """ + dis_dict = _prepare_dis_dict(dis_data, cleanup_riv) + # Arrange: Set stage below model layer bottom + riv_data["stage"] -= 10.0 + # Act + with pytest.raises(ValueError, match="imod.prepare.topsystem.allocate_riv_cells"): + cleanup_riv(**dis_dict, **riv_data) + + +@parametrize_with_cases("dis_data", cases=DisCases) +def test_cleanup_wel(dis_data: dict): + """ + Cleanup wells. + + Cases by id (on purpose not in order, to see if pandas' + sorting results in any issues): + + a: filter completely above surface level -> point filter in top layer + c: filter partly above surface level -> filter top set to surface level + b: filter completely below model base -> well should be removed + d: filter partly below model base -> filter bottom set to model base + f: well outside grid bounds -> well should be removed + e: ultrathin filter -> filter should be forced to point filter + g: filter screen_bottom above screen_top -> filter should be forced to point filter + """ + # Arrange + dis_dict = _prepare_dis_dict(dis_data, cleanup_wel) + wel_dict = { + "id": ["a", "c", "b", "d", "f", "e", "g"], + "x": [17.0, 17.0, 17.0, 17.0, 40.0, 17.0, 17.0], + "y": [15.0, 15.0, 15.0, 15.0, 15.0, 15.0, 15.0], + "screen_top": [ + 2.0, + 2.0, + -7.0, + -1.0, + -1.0, + 1e-3, + 0.0, + ], + "screen_bottom": [ + 1.5, + 0.0, + -8.0, + -8.0, + -1.0, + 0.0, + 0.5, + ], + } + well_df = pd.DataFrame(wel_dict) + wel_expected = { + "id": ["a", "c", "d", "e", "g"], + "x": [17.0, 17.0, 17.0, 17.0, 17.0], + "y": [15.0, 15.0, 15.0, 15.0, 15.0], + "screen_top": [ + 1.0, + 1.0, + -1.0, + 1e-3, + 0.0, + ], + "screen_bottom": [ + 1.0, + 0.0, + -1.5, + 1e-3, + 0.0, + ], + } + well_expected_df = pd.DataFrame(wel_expected).set_index("id") + # Act + well_cleaned = cleanup_wel(well_df, **dis_dict) + # Assert + pd.testing.assert_frame_equal(well_cleaned, well_expected_df) diff --git a/imod/tests/test_prepare/test_prepare_hfb.py b/imod/tests/test_prepare/test_prepare_hfb.py new file mode 100644 index 000000000..a8e7a908d --- /dev/null +++ b/imod/tests/test_prepare/test_prepare_hfb.py @@ -0,0 +1,106 @@ +import numpy as np +import pytest +import shapely + +from imod.prepare.hfb import ( + linestring_to_square_zpolygons, + linestring_to_trapezoid_zpolygons, +) + + +def test_linestring_to_square_zpolygons(): + barrier_x = [-10.0, 0.0, 10.0] + barrier_y = [10.0, 0.0, -10.0] + barrier_ztop = [10.0, 20.0] + barrier_zbot = [-10.0, -20.0] + + polygons = linestring_to_square_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbot + ) + + assert len(polygons) == 2 + + coordinates_0 = shapely.get_coordinates(polygons[0], include_z=True) + expected_0 = np.array( + [ + [-10.0, 10.0, 10.0], + [-10.0, 10.0, -10.0], + [0.0, 0.0, -10.0], + [0.0, 0.0, 10.0], + [-10.0, 10.0, 10.0], + ] + ) + + coordinates_1 = shapely.get_coordinates(polygons[1], include_z=True) + expected_1 = np.array( + [ + [0.0, 0.0, 20.0], + [0.0, 0.0, -20.0], + [10.0, -10.0, -20.0], + [10.0, -10.0, 20.0], + [0.0, 0.0, 20.0], + ] + ) + + np.testing.assert_equal(coordinates_0, 
expected_0) + np.testing.assert_equal(coordinates_1, expected_1) + + +def test_linestring_to_trapezoid_zpolygons(): + barrier_x = [-10.0, 0.0, 10.0] + barrier_y = [10.0, 0.0, -10.0] + barrier_ztop = [10.0, 20.0, 15.0] + barrier_zbot = [-10.0, -20.0, 0.0] + + polygons = linestring_to_trapezoid_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbot + ) + + assert len(polygons) == 2 + + coordinates_0 = shapely.get_coordinates(polygons[0], include_z=True) + expected_0 = np.array( + [ + [-10.0, 10.0, 10.0], + [-10.0, 10.0, -20.0], + [0.0, 0.0, 20.0], + [0.0, 0.0, -10.0], + [-10.0, 10.0, 10.0], + ] + ) + + coordinates_1 = shapely.get_coordinates(polygons[1], include_z=True) + expected_1 = np.array( + [ + [0.0, 0.0, 20.0], + [0.0, 0.0, 0.0], + [10.0, -10.0, 15.0], + [10.0, -10.0, -20.0], + [0.0, 0.0, 20.0], + ] + ) + + np.testing.assert_equal(coordinates_0, expected_0) + np.testing.assert_equal(coordinates_1, expected_1) + + +def test_linestring_to_trapezoid_zpolygons__fails(): + barrier_x = [-10.0, 0.0, 10.0] + barrier_y = [10.0, 0.0, -10.0] + barrier_ztop = [10.0, 20.0] + barrier_zbot = [-10.0, -20.0] + + with pytest.raises(ValueError): + linestring_to_trapezoid_zpolygons( + barrier_x, barrier_y, barrier_ztop, barrier_zbot + ) + + +def test_linestring_to_square_zpolygons__fails(): + barrier_x = [-10.0, 0.0, 10.0] + barrier_y = [10.0, 0.0, -10.0] + barrier_ztop = [10.0, 20.0, 15.0] + barrier_zbot = [-10.0, -20.0, 0.0] + + with pytest.raises(ValueError): + linestring_to_square_zpolygons(barrier_x, barrier_y, barrier_ztop, barrier_zbot) diff --git a/imod/tests/test_prepare/test_topsystem.py b/imod/tests/test_prepare/test_topsystem.py index f967cff0c..1ce825006 100644 --- a/imod/tests/test_prepare/test_topsystem.py +++ b/imod/tests/test_prepare/test_topsystem.py @@ -195,3 +195,193 @@ def test_distribute_ghb_conductance( actual = take_nth_layer_column(actual_da, 0) np.testing.assert_equal(actual, expected) + + +@parametrize_with_cases( + argnames="active,top,bottom,stage,bottom_elevation", + prefix="riv_", +) +@parametrize_with_cases( + argnames="option,expected_riv,expected_drn", prefix="allocation_", has_tag="riv" +) +def test_riv_allocation__elevation_above_surface_level( + active, top, bottom, stage, bottom_elevation, option, expected_riv, expected_drn +): + # Put elevations a lot above surface level. Need to be allocated to first + # layer. + actual_riv_da, actual_drn_da = allocate_riv_cells( + option, active, top, bottom, stage + 100.0, bottom_elevation + 100.0 + ) + + # Override expected values + expected_riv = [True, False, False, False] + if expected_drn: + expected_drn = [False, False, False, False] + + actual_riv = take_nth_layer_column(actual_riv_da, 0) + empty_riv = take_nth_layer_column(actual_riv_da, 1) + + if actual_drn_da is None: + actual_drn = None + empty_drn = None + else: + actual_drn = take_nth_layer_column(actual_drn_da, 0) + empty_drn = take_nth_layer_column(actual_drn_da, 1) + + np.testing.assert_equal(actual_riv, expected_riv) + np.testing.assert_equal(actual_drn, expected_drn) + assert np.all(~empty_riv) + if empty_drn is not None: + assert np.all(~empty_drn) + + +@parametrize_with_cases( + argnames="active,top,bottom,elevation", + prefix="drn_", +) +@parametrize_with_cases( + argnames="option,expected,_", prefix="allocation_", has_tag="drn" +) +def test_drn_allocation__elevation_above_surface_level( + active, top, bottom, elevation, option, expected, _ +): + # Put elevations a lot above surface level. Need to be allocated to first + # layer. 
+ actual_da = allocate_drn_cells( + option, + active, + top, + bottom, + elevation + 100.0, + ) + + # Override expected + expected = [True, False, False, False] + + actual = take_nth_layer_column(actual_da, 0) + empty = take_nth_layer_column(actual_da, 1) + + np.testing.assert_equal(actual, expected) + assert np.all(~empty) + if empty is not None: + assert np.all(~empty) + + +@parametrize_with_cases( + argnames="active,top,bottom,head", + prefix="ghb_", +) +@parametrize_with_cases( + argnames="option,expected,_", prefix="allocation_", has_tag="ghb" +) +def test_ghb_allocation__elevation_above_surface_level( + active, top, bottom, head, option, expected, _ +): + # Put elevations a lot above surface level. Need to be allocated to first + # layer. + actual_da = allocate_ghb_cells( + option, + active, + top, + bottom, + head + 100.0, + ) + + # Override expected + expected = [True, False, False, False] + + actual = take_nth_layer_column(actual_da, 0) + empty = take_nth_layer_column(actual_da, 1) + + np.testing.assert_equal(actual, expected) + assert np.all(~empty) + if empty is not None: + assert np.all(~empty) + + +@parametrize_with_cases( + argnames="active,top,bottom,elevation", + prefix="drn_", +) +@parametrize_with_cases( + argnames="option,allocated_layer,_", prefix="distribution_", has_tag="drn" +) +def test_distribute_drn_conductance__above_surface_level( + active, top, bottom, elevation, option, allocated_layer, _ +): + allocated_layer.data = np.array([True, False, False, False]) + expected = [1.0, np.nan, np.nan, np.nan] + allocated = enforce_dim_order(active & allocated_layer) + k = xr.DataArray( + [2.0, 2.0, 1.0, 1.0], coords={"layer": [1, 2, 3, 4]}, dims=("layer",) + ) + + conductance = zeros_like(elevation) + 1.0 + + actual_da = distribute_drn_conductance( + option, allocated, conductance, top, bottom, k, elevation + 100.0 + ) + actual = take_nth_layer_column(actual_da, 0) + + np.testing.assert_equal(actual, expected) + + +@parametrize_with_cases( + argnames="active,top,bottom,stage,bottom_elevation", + prefix="riv_", +) +@parametrize_with_cases( + argnames="option,allocated_layer,_", prefix="distribution_", has_tag="riv" +) +def test_distribute_riv_conductance__above_surface_level( + active, top, bottom, stage, bottom_elevation, option, allocated_layer, _ +): + allocated_layer.data = np.array([True, False, False, False]) + expected = [1.0, np.nan, np.nan, np.nan] + allocated = enforce_dim_order(active & allocated_layer) + k = xr.DataArray( + [2.0, 2.0, 1.0, 1.0], coords={"layer": [1, 2, 3, 4]}, dims=("layer",) + ) + + conductance = zeros_like(bottom_elevation) + 1.0 + + actual_da = distribute_riv_conductance( + option, + allocated, + conductance, + top, + bottom, + k, + stage + 100.0, + bottom_elevation + 100.0, + ) + actual = take_nth_layer_column(actual_da, 0) + + np.testing.assert_equal(actual, expected) + + +@parametrize_with_cases( + argnames="active,top,bottom,elevation", + prefix="ghb_", +) +@parametrize_with_cases( + argnames="option,allocated_layer,_", prefix="distribution_", has_tag="ghb" +) +def test_distribute_ghb_conductance__above_surface_level( + active, top, bottom, elevation, option, allocated_layer, _ +): + allocated_layer.data = np.array([True, False, False, False]) + expected = [1.0, np.nan, np.nan, np.nan] + allocated = enforce_dim_order(active & allocated_layer) + k = xr.DataArray( + [2.0, 2.0, 1.0, 1.0], coords={"layer": [1, 2, 3, 4]}, dims=("layer",) + ) + + conductance = zeros_like(elevation) + 1.0 + + actual_da = distribute_ghb_conductance( + option, 
allocated, conductance, top, bottom, k + ) + actual = take_nth_layer_column(actual_da, 0) + + np.testing.assert_equal(actual, expected) diff --git a/imod/tests/test_typing/test_typing_grid.py b/imod/tests/test_typing/test_typing_grid.py index 9e5dbddb1..ff7f398b0 100644 --- a/imod/tests/test_typing/test_typing_grid.py +++ b/imod/tests/test_typing/test_typing_grid.py @@ -1,9 +1,15 @@ +import numpy as np import xarray as xr import xugrid as xu from imod.typing.grid import ( + UGRID2D_FROM_STRUCTURED_CACHE, + GridCache, + as_ugrid_dataarray, enforce_dim_order, + is_planar_grid, is_spatial_grid, + is_transient_data_grid, merge_with_dictionary, preserve_gridtype, ) @@ -71,6 +77,27 @@ def test_enforce_dim_order__unstructured(basic_unstructured_dis): assert isinstance(actual, type(ibound)) +def test_is_planar_grid(basic_dis, basic_unstructured_dis): + discretizations = [basic_dis, basic_unstructured_dis] + for discr in discretizations: + ibound, _, _ = discr + + # layer coordinates is present + assert not is_planar_grid(ibound) + + # set layer coordinates as present but empty + bottom_layer = ibound.sel(layer=3) + assert is_planar_grid(bottom_layer) + + # set layer coordinates as present and not empty or 0 + bottom_layer = bottom_layer.expand_dims({"layer": [9]}) + assert not is_planar_grid(bottom_layer) + + # set layer coordinates as present and 0 + bottom_layer.coords["layer"].values[0] = 0 + assert is_planar_grid(bottom_layer) + + def test_is_spatial_grid__structured(basic_dis): ibound, _, bottom = basic_dis ds = xr.Dataset() @@ -82,6 +109,25 @@ def test_is_spatial_grid__structured(basic_dis): assert is_spatial_grid(ds) +def test_is_transient_data_grid(basic_dis, basic_unstructured_dis): + discretizations = [basic_dis, basic_unstructured_dis] + + for discr in discretizations: + ibound, _, _ = discr + + # no time coordinate + assert not is_transient_data_grid(ibound) + + # time coordinate but with single value + ibound = ibound.expand_dims({"time": [1]}) + assert not is_transient_data_grid(ibound) + + # time coordinate but with several values + ibound, _, _ = discr + ibound = ibound.expand_dims({"time": [1, 2]}) + assert is_transient_data_grid(ibound) + + def test_is_spatial_grid__unstructured(basic_unstructured_dis): ibound, _, bottom = basic_unstructured_dis grid = ibound.ugrid.grid @@ -90,23 +136,7 @@ def test_is_spatial_grid__unstructured(basic_unstructured_dis): # to dataset. 
ds["ibound"] = (("layer", "mesh2d_nFaces"), ibound) ds["bottom"] = bottom - uds = xu.UgridDataset(ds, grid) - - assert is_spatial_grid(ibound) - assert not is_spatial_grid(bottom) - assert is_spatial_grid(uds) - - -def test_merge_dictionary__structured(basic_dis): - ibound, _, bottom = basic_dis - - ds = merge_with_dictionary({"ibound": ibound, "bottom": bottom}) - - assert isinstance(ds, xr.Dataset) - assert isinstance(ds["ibound"], xr.DataArray) - assert isinstance(ds["bottom"], xr.DataArray) - assert ds["ibound"].dims == ("layer", "y", "x") - assert ds["bottom"].dims == ("layer",) + _ = xu.UgridDataset(ds, grid) def test_merge_dictionary__unstructured(basic_unstructured_dis): @@ -119,3 +149,73 @@ def test_merge_dictionary__unstructured(basic_unstructured_dis): assert isinstance(uds["bottom"], xr.DataArray) assert uds["ibound"].dims == ("layer", "mesh2d_nFaces") assert uds["bottom"].dims == ("layer",) + + +def test_as_ugrid_dataarray__structured(basic_dis): + # Arrange + ibound, top, bottom = basic_dis + top_3d = top * ibound + bottom_3d = bottom * ibound + # Clear cache + UGRID2D_FROM_STRUCTURED_CACHE.clear() + # Act + ibound_disv = as_ugrid_dataarray(ibound) + top_disv = as_ugrid_dataarray(top_3d) + bottom_disv = as_ugrid_dataarray(bottom_3d) + # Assert + # Test types + assert isinstance(ibound_disv, xu.UgridDataArray) + assert isinstance(top_disv, xu.UgridDataArray) + assert isinstance(bottom_disv, xu.UgridDataArray) + # Test cache proper size + assert len(UGRID2D_FROM_STRUCTURED_CACHE.grid_cache) == 1 + # Test that data is different + assert np.all(ibound_disv != top_disv) + assert np.all(top_disv != bottom_disv) + # Test that grid is equal + assert np.all(ibound_disv.grid == top_disv.grid) + assert np.all(top_disv.grid == bottom_disv.grid) + + +def test_as_ugrid_dataarray__unstructured(basic_unstructured_dis): + # Arrange + ibound, top, bottom = basic_unstructured_dis + top_3d = enforce_dim_order(ibound * top) + bottom_3d = enforce_dim_order(ibound * bottom) + # Clear cache + UGRID2D_FROM_STRUCTURED_CACHE.clear() + # Act + ibound_disv = as_ugrid_dataarray(ibound) + top_disv = as_ugrid_dataarray(top_3d) + bottom_disv = as_ugrid_dataarray(bottom_3d) + # Assert + # Test types + assert isinstance(ibound_disv, xu.UgridDataArray) + assert isinstance(top_disv, xu.UgridDataArray) + assert isinstance(bottom_disv, xu.UgridDataArray) + assert len(UGRID2D_FROM_STRUCTURED_CACHE.grid_cache) == 0 + + +def test_ugrid2d_cache(basic_dis): + # Arrange + ibound, _, _ = basic_dis + # Act + cache = GridCache(xu.Ugrid2d.from_structured, max_cache_size=3) + for i in range(5): + ugrid2d = cache.get_grid(ibound[:, i:, :]) + # Assert + # Test types + assert isinstance(ugrid2d, xu.Ugrid2d) + # Test cache proper size + assert cache.max_cache_size == 3 + assert len(cache.grid_cache) == 3 + # Check if smallest grid in last cache list by checking if amount of faces + # correct + expected_size = ibound[0, i:, :].size + keys = list(cache.grid_cache.keys()) + last_ugrid = cache.grid_cache[keys[-1]] + actual_size = last_ugrid.n_face + assert expected_size == actual_size + # Test clear cache + cache.clear() + assert len(cache.grid_cache) == 0 diff --git a/imod/typing/__init__.py b/imod/typing/__init__.py index b67837e43..a5ca2f69e 100644 --- a/imod/typing/__init__.py +++ b/imod/typing/__init__.py @@ -2,7 +2,7 @@ Module to define type aliases. 
""" -from typing import TypeAlias, Union +from typing import TYPE_CHECKING, TypeAlias, TypeVar, Union import numpy as np import xarray as xr @@ -15,3 +15,19 @@ UnstructuredData: TypeAlias = Union[xu.UgridDataset, xu.UgridDataArray] FloatArray: TypeAlias = np.ndarray IntArray: TypeAlias = np.ndarray + + +# Types for optional dependencies. +if TYPE_CHECKING: + import geopandas as gpd + import shapely + + GeoDataFrameType: TypeAlias = gpd.GeoDataFrame + GeoSeriesType: TypeAlias = gpd.GeoSeries + PolygonType: TypeAlias = shapely.Polygon + LineStringType: TypeAlias = shapely.LineString +else: + GeoDataFrameType = TypeVar("GeoDataFrameType") + GeoSeriesType = TypeVar("GeoSeriesType") + PolygonType = TypeVar("PolygonType") + LineStringType = TypeVar("LineStringType") diff --git a/imod/typing/grid.py b/imod/typing/grid.py index 1d76e59bd..509da6aef 100644 --- a/imod/typing/grid.py +++ b/imod/typing/grid.py @@ -8,7 +8,7 @@ import xugrid as xu from fastcore.dispatch import typedispatch -from imod.typing import GridDataArray, GridDataset, structured +from imod.typing import GeoDataFrameType, GridDataArray, GridDataset, structured from imod.util.spatial import _polygonize T = TypeVar("T") @@ -45,6 +45,16 @@ def nan_like(grid: xu.UgridDataArray, dtype=np.float32, *args, **kwargs): # noq return xu.full_like(grid, fill_value=np.nan, dtype=dtype, *args, **kwargs) +@typedispatch +def full_like(grid: xr.DataArray, fill_value, *args, **kwargs): + return xr.full_like(grid, fill_value, *args, **kwargs) + + +@typedispatch # type: ignore [no-redef] +def full_like(grid: xu.UgridDataArray, fill_value, *args, **kwargs): # noqa: F811 + return xu.full_like(grid, fill_value, *args, **kwargs) + + @typedispatch def is_unstructured(grid: xu.UgridDataArray | xu.UgridDataset) -> bool: return True @@ -227,7 +237,7 @@ def merge_with_dictionary( @typedispatch -def bounding_polygon(active: xr.DataArray): +def bounding_polygon(active: xr.DataArray) -> GeoDataFrameType: """Return bounding polygon of active cells""" to_polygonize = active.where(active, other=np.nan) polygons_gdf = _polygonize(to_polygonize) @@ -237,7 +247,7 @@ def bounding_polygon(active: xr.DataArray): @typedispatch # type: ignore[no-redef] -def bounding_polygon(active: xu.UgridDataArray): # noqa: F811 +def bounding_polygon(active: xu.UgridDataArray) -> GeoDataFrameType: # noqa: F811 """Return bounding polygon of active cells""" active_indices = np.where(active > 0)[0] domain_slice = {f"{active.ugrid.grid.face_dimension}": active_indices} @@ -355,9 +365,11 @@ def enforce_dim_order(grid: xu.UgridDataArray) -> xu.UgridDataArray: # noqa: F8 ) -def _enforce_unstructured(obj: GridDataArray, ugrid2d=xu.Ugrid2d) -> xu.UgridDataArray: - """Force obj to unstructured""" - return xu.UgridDataArray(xr.DataArray(obj), ugrid2d) +def _as_ugrid_dataarray_with_topology( + obj: GridDataArray, topology: xu.Ugrid2d +) -> xu.UgridDataArray: + """Force obj and topology to ugrid dataarray""" + return xu.UgridDataArray(xr.DataArray(obj), topology) def preserve_gridtype(func: Callable[P, T]) -> Callable[P, T]: @@ -389,8 +401,100 @@ def decorator(*args: P.args, **kwargs: P.kwargs): if unstructured: # Multiple grids returned if isinstance(x, tuple): - return tuple(_enforce_unstructured(i, grid) for i in x) - return _enforce_unstructured(x, grid) + return tuple(_as_ugrid_dataarray_with_topology(i, grid) for i in x) + return _as_ugrid_dataarray_with_topology(x, grid) return x return decorator + + +def is_planar_grid( + grid: xr.DataArray | xr.Dataset | xu.UgridDataArray | xu.UgridDataset, 
+) -> bool:
+    # Returns True if the grid is planar: a spatial grid (with x, y coordinates
+    # or cell face/edge coordinates) that has no layer coordinate, a scalar
+    # layer coordinate, or a layer coordinate of length 1 with value 0.
+    if not is_spatial_grid(grid):
+        return False
+    if "layer" not in grid.coords:
+        return True
+    if grid["layer"].shape == ():
+        return True
+    if grid["layer"][0] == 0 and len(grid["layer"]) == 1:
+        return True
+    return False
+
+
+def is_transient_data_grid(
+    grid: xr.DataArray | xr.Dataset | xu.UgridDataArray | xu.UgridDataset,
+):
+    # Returns True if there is a time coordinate on the object with more than one value.
+    if "time" in grid.coords:
+        if len(grid["time"]) > 1:
+            return True
+    return False
+
+
+class GridCache:
+    """
+    Cache grids computed by a specific function; cached grids are looked up by
+    a hash of their geometry.
+    """
+
+    def __init__(self, func: Callable, max_cache_size=5):
+        self.max_cache_size = max_cache_size
+        self.grid_cache: dict[int, GridDataArray] = {}
+        self.func = func
+
+    def get_grid(self, grid: GridDataArray):
+        geom_hash = get_grid_geometry_hash(grid)
+        if geom_hash not in self.grid_cache.keys():
+            if len(self.grid_cache.keys()) >= self.max_cache_size:
+                self.remove_first()
+            self.grid_cache[geom_hash] = self.func(grid)
+        return self.grid_cache[geom_hash]
+
+    def remove_first(self):
+        keys = list(self.grid_cache.keys())
+        self.grid_cache.pop(keys[0])
+
+    def clear(self):
+        self.grid_cache = {}
+
+
+UGRID2D_FROM_STRUCTURED_CACHE = GridCache(xu.Ugrid2d.from_structured)
+
+
+@typedispatch
+def as_ugrid_dataarray(grid: xr.DataArray) -> xu.UgridDataArray:
+    """
+    Enforce GridDataArray to UgridDataArray. Constructing the Ugrid2d topology
+    (xu.Ugrid2d.from_structured) is a costly operation, therefore the results
+    are cached.
+    """
+
+    topology = UGRID2D_FROM_STRUCTURED_CACHE.get_grid(grid)
+
+    # Copied from:
+    # https://github.com/Deltares/xugrid/blob/3dee693763da1c4c0859a4f53ac38d4b99613a33/xugrid/core/wrap.py#L236
+    # Note that "da" is renamed to "grid" and "grid" to "topology"
+    dims = grid.dims[:-2]
+    coords = {k: grid.coords[k] for k in dims}
+    face_da = xr.DataArray(
+        grid.data.reshape(*grid.shape[:-2], -1),
+        coords=coords,
+        dims=[*dims, topology.face_dimension],
+        name=grid.name,
+    )
+    return xu.UgridDataArray(face_da, topology)
+
+
+@typedispatch  # type: ignore[no-redef]
+def as_ugrid_dataarray(grid: xu.UgridDataArray) -> xu.UgridDataArray:  # noqa: F811
+    """Enforce GridDataArray to UgridDataArray"""
+    return grid
+
+
+@typedispatch  # type: ignore[no-redef]
+def as_ugrid_dataarray(grid: object) -> xu.UgridDataArray:  # noqa: F811
+    raise TypeError(f"Function doesn't support type {type(grid)}")
diff --git a/imod/util/expand_repetitions.py b/imod/util/expand_repetitions.py
new file mode 100644
index 000000000..53d911bef
--- /dev/null
+++ b/imod/util/expand_repetitions.py
@@ -0,0 +1,127 @@
+import itertools
+from datetime import datetime
+from typing import Dict, List
+
+import numpy as np
+import pandas as pd
+
+
+def expand_repetitions(
+    repeat_stress: List[datetime], time_min: datetime, time_max: datetime
+) -> Dict[np.datetime64, np.datetime64]:
+    """
+    Given a list of repeat stresses and the start and end time of the simulation,
+    this function returns a dictionary indicating which repeat stress should be
+    used at what time in the simulation.
+
+    Parameters
+    ----------
+    repeat_stress: list of datetime
+        The datetimes to be repeated for every year within the simulation
+        period.
+    time_min: datetime
+        Starting time of the simulation.
+    time_max: datetime
+        Ending time of the simulation.
+
+    Returns
+    -------
+    dict of np.datetime64 to np.datetime64
+        A dictionary mapping a "key" datetime to a "value" datetime: for the
+        "key" datetime, the data of the "value" datetime will be used. This
+        can be used to repeat data for e.g. recurring stress periods, such as
+        seasonality, without duplicating the values in the time-dependent
+        dataset.
+    """
+    expanded = {}
+    for year, date in itertools.product(
+        range(time_min.year, time_max.year + 1),
+        repeat_stress,
+    ):
+        newdate = np.datetime64(date.replace(year=year))
+        if newdate < time_max:
+            expanded[newdate] = np.datetime64(date)
+    return expanded
+
+
+def resample_timeseries(well_rate: pd.DataFrame, times: list[datetime]) -> pd.DataFrame:
+    """
+    On input, well_rate is a dataframe containing a rate timeseries for one well,
+    while "times" is a list of datetimes.
+    This function creates a new dataframe containing the same timeseries,
+    resampled to the times in "times".
+
+    Parameters
+    ----------
+    well_rate: pd.DataFrame
+        Input rate timeseries for a single well.
+    times: list of datetime
+        Times at which the output dataframe should have entries.
+
+    Returns
+    -------
+    A new dataframe containing the timeseries resampled to the times in "times".
+    """
+    is_steady_state = len(times) == 0
+
+    output_frame = pd.DataFrame(times)
+    output_frame = output_frame.rename(columns={0: "time"})
+    intermediate_df = pd.merge(
+        output_frame,
+        well_rate,
+        how="outer",
+        on="time",
+    ).fillna(method="ffill")
+
+    # The entries before the start of the well timeseries do not have data yet,
+    # so we fill them in here. Set the rate to zero and pad the location columns
+    # with the first entry.
+    location_columns = ["x", "y", "id", "filt_top", "filt_bot"]
+    time_before_start_input = (
+        intermediate_df["time"].values < well_rate["time"].values[0]
+    )
+    if time_before_start_input[0]:
+        intermediate_df.loc[time_before_start_input, "rate"] = 0.0
+        intermediate_df.loc[time_before_start_input, location_columns] = (
+            well_rate.loc[0, location_columns],
+        )
+
+    # Compute the time difference from the previous to the current row
+    time_diff_col = intermediate_df["time"].diff()
+    intermediate_df.insert(7, "time_to_next", time_diff_col.values)
+
+    # Shift the new column one row, so that each entry becomes the time to the next row
+    intermediate_df["time_to_next"] = intermediate_df["time_to_next"].shift(-1)
+
+    # Integrate by grouping by the period number
+    intermediate_df["duration_sec"] = intermediate_df["time_to_next"].dt.total_seconds()
+    intermediate_df["volume"] = (
+        intermediate_df["rate"] * intermediate_df["duration_sec"]
+    )
+    intermediate_df["period_nr"] = intermediate_df["time"].isin(times).cumsum()
+    gb = intermediate_df.groupby("period_nr")
+
+    output_frame["rate"] = (gb["volume"].sum() / gb["duration_sec"].sum()).reset_index(
+        drop=True
+    )
+    # If the last value is nan (fell outside the range), pad with the last well rate.
+    if np.isnan(output_frame["rate"].values[-1]):
+        output_frame["rate"].values[-1] = well_rate["rate"].values[-1]
+
+    if is_steady_state:
+        # Take the first element; the slice forces pandas to return a
+        # dataframe instead of a series.
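+        # For a steady-state simulation there is no time axis to resample to,
+        # so the averaged rate is combined with the location columns of the
+        # first entry and the "time" column is dropped below.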
+ location_dataframe = intermediate_df[location_columns].iloc[slice(0, 1), :] + # Concat along columns and drop time column + return pd.concat([output_frame, location_dataframe], axis=1).drop( + columns="time" + ) + else: + columns_to_merge = ["time"] + location_columns + return pd.merge( + output_frame, + intermediate_df[columns_to_merge], + on="time", + how="left", + validate="1:m", + ) diff --git a/imod/util/time.py b/imod/util/time.py index fcf5a8dda..75186b517 100644 --- a/imod/util/time.py +++ b/imod/util/time.py @@ -15,6 +15,18 @@ } +def to_pandas_datetime_series(series: pd.Series): + """ + Convert series to pandas datetime, uses length of first string to find the + appropriate format. This takes nanosecond as base. This only supports going + up to the year 2261; the function sets dates beyond this year silently to + pd.NaT. + """ + len_date = len(series.iloc[0]) + dt_format = DATETIME_FORMATS[len_date] + return pd.to_datetime(series, format=dt_format, errors="coerce") + + def to_datetime(s: str) -> datetime.datetime: """ Convert string to datetime. Part of the public API for backwards diff --git a/pixi.lock b/pixi.lock index d8dbf3c17..a9133b583 100644 --- a/pixi.lock +++ b/pixi.lock @@ -21,18 +21,18 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/asciitree-0.3.3-py_2.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/linux-64/atk-1.0-2.38.0-h04ea711_2.conda - conda: https://conda.anaconda.org/conda-forge/noarch/attrs-24.2.0-pyh71513ae_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-auth-0.7.29-h03582ad_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-auth-0.7.30-hec5e740_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-cal-0.7.4-hfd43aa1_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-common-0.9.28-hb9d3cd8_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-compression-0.2.19-h756ea98_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-event-stream-0.4.3-h235a6dd_1.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-http-0.8.8-h5e77a74_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-http-0.8.9-h5e77a74_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-io-0.14.18-hc2627b9_9.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-mqtt-0.10.4-h01636a3_19.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-s3-0.6.5-h191b246_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-mqtt-0.10.5-h0009854_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-s3-0.6.5-hbaf354b_4.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-c-sdkutils-0.1.19-h756ea98_3.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-checksums-0.1.18-h756ea98_11.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-crt-cpp-0.28.2-h29c84ef_4.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-crt-cpp-0.28.2-h6c0439f_6.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/aws-sdk-cpp-1.11.379-h5a9005d_9.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/azure-core-cpp-1.13.0-h935415a_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/azure-identity-cpp-1.8.0-hd126650_2.conda @@ -73,13 +73,13 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/cryptography-43.0.1-py311hafd3f86_0.conda - conda: 
https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/cytoolz-0.12.3-py311h459d7ec_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-2024.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-core-2024.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-expr-1.1.13-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/dask-2024.9.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/dask-core-2024.9.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/dask-expr-1.1.14-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/dav1d-1.2.1-hd590300_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/dbus-1.13.6-h5008d03_3.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/decopatch-1.4.10-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/distributed-2024.8.2-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/distributed-2024.9.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/docutils-0.21.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/double-conversion-3.3.0-h59595ed_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/editables-0.5-pyhd8ed1ab_0.conda @@ -87,7 +87,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/execnet-2.1.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/expat-2.6.3-h5888daf_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/fastcore-1.7.5-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/fastcore-1.7.8-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/fasteners-0.17.3-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/linux-64/ffmpeg-7.0.2-gpl_h226ea3b_102.conda - conda: https://conda.anaconda.org/conda-forge/noarch/filelock-3.16.0-pyhd8ed1ab_0.conda @@ -141,7 +141,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/hyperframe-6.0.1-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/hypothesis-6.112.1-pyha770c72_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/icu-75.1-he02047a_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/idna-3.8-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/idna-3.10-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/imagesize-1.4.1-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/linux-64/imath-3.1.12-h7955e40_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-8.5.0-pyha770c72_0.conda @@ -169,10 +169,10 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/libabseil-20240116.2-cxx17_he02047a_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libaec-1.1.3-h59595ed_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libarchive-3.7.4-hfca40fe_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/libarrow-17.0.0-h8d2e343_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/libarrow-acero-17.0.0-h5888daf_13_cpu.conda - - conda: 
https://conda.anaconda.org/conda-forge/linux-64/libarrow-dataset-17.0.0-h5888daf_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/libarrow-substrait-17.0.0-hf54134d_13_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libarrow-17.0.0-hc80a628_14_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libarrow-acero-17.0.0-h5888daf_14_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libarrow-dataset-17.0.0-h5888daf_14_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libarrow-substrait-17.0.0-hf54134d_14_cpu.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libasprintf-0.22.5-he8f35ee_3.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libasprintf-devel-0.22.5-he8f35ee_3.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libass-0.17.3-h1dc1e6a_0.conda @@ -222,8 +222,8 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/libglvnd-1.7.0-ha4b6fd6_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libglx-1.7.0-ha4b6fd6_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libgomp-14.1.0-h77fa898_1.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/libgoogle-cloud-2.28.0-h26d7fe4_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/libgoogle-cloud-storage-2.28.0-ha262f82_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libgoogle-cloud-2.29.0-h435de7b_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libgoogle-cloud-storage-2.29.0-h0121fbd_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libgrpc-1.62.2-h15f2491_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libhwloc-2.11.1-default_hecaa2ac_1000.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libiconv-1.17-hd590300_2.conda @@ -252,9 +252,9 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/libopenvino-tensorflow-frontend-2024.3.0-h39126c6_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libopenvino-tensorflow-lite-frontend-2024.3.0-he02047a_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libopus-1.3.1-h7f98852_1.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/linux-64/libparquet-17.0.0-h39682fd_13_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libparquet-17.0.0-h39682fd_14_cpu.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libpciaccess-0.18-hd590300_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.43-h2797004_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.44-hadc24fc_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libpq-16.4-h2d7952a_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libprotobuf-4.25.3-h08a7969_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libraw-0.21.1-h2a13503_2.conda @@ -278,6 +278,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/libvpx-1.14.1-hac33072_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.4.0-hd590300_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libxcb-1.16-hb9d3cd8_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libxkbcommon-1.7.0-h2c5496b_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.12.7-he7c6b58_4.conda - 
conda: https://conda.anaconda.org/conda-forge/linux-64/libxslt-1.1.39-h76b75d6_0.conda @@ -338,7 +339,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/pip-24.2-pyh8b19718_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/pixman-0.43.2-h59595ed_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pkginfo-1.10.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.3.2-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.3.3-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pluggy-1.5.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pooch-1.8.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/poppler-24.08.0-h47131b8_1.conda @@ -369,10 +370,12 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-benchmark-4.0.0-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-cases-3.8.5-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-cov-5.0.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-dotenv-0.5.2-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.6.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/python-3.11.0-he550d4f_1_cpython.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/python-3.11.10-hc5c86c4_0_cpython.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-build-1.2.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.9.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/python-dotenv-1.0.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-gmsh-4.12.2-h57928b3_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-graphviz-0.20.3-pyh717bed2_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2024.1-pyhd8ed1ab_0.conda @@ -392,7 +395,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/rfc3986-2.0.0-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/rich-13.8.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/rioxarray-0.17.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/ruff-0.6.4-py311hef32070_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/ruff-0.6.5-py311hef32070_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/s2n-1.5.2-h7b32b05_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/scikit-learn-1.5.2-py311h57cc02b_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/scipy-1.14.1-py311he1f765f_0.conda @@ -423,7 +426,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/tbb-devel-2021.13.0-h94b29a5_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/tblib-3.0.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.5.0-pyhc1e730c_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/tiledb-2.26.0-h86fa3b2_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/tiledb-2.26.0-h93dd694_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/tk-8.6.13-noxft_h4845f30_101.conda - conda: https://conda.anaconda.org/conda-forge/noarch/toml-0.10.2-pyhd8ed1ab_0.tar.bz2 - conda: 
https://conda.anaconda.org/conda-forge/noarch/tomli-2.0.1-pyhd8ed1ab_0.tar.bz2 @@ -438,7 +441,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/tzcode-2024b-hb9d3cd8_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h8827d51_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/uriparser-0.9.8-hac33072_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.2-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.3-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/utfcpp-4.0.5-ha770c72_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/vtk-9.3.1-qt_py311hadc0db7_205.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/vtk-base-9.3.1-qt_py311h680aef5_205.conda @@ -485,451 +488,11 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/yarl-1.9.4-py311h9ecbd09_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/zarr-2.18.3-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/zict-3.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/zipp-3.20.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/zipp-3.20.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/zlib-1.3.1-h4ab18f5_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/zstandard-0.23.0-py311hbc35293_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda - pypi: https://files.pythonhosted.org/packages/50/fa/a2561d6837cd45a3971c514222e94d3ded3f105993ddcf4983ed68ce3da3/mypy2junit-1.9.0-py3-none-any.whl - osx-64: - - conda: https://conda.anaconda.org/conda-forge/noarch/accessible-pygments-0.0.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/affine-2.4.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/aiohappyeyeballs-2.4.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aiohttp-3.10.5-py311h3336109_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/aiosignal-1.3.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/alabaster-1.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/annotated-types-0.7.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aom-3.9.1-hf036a51_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/asciitree-0.3.3-py_2.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/atk-1.0-2.38.0-h4bec284_2.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/attrs-24.2.0-pyh71513ae_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aws-c-auth-0.7.29-h2dfa2de_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aws-c-cal-0.7.4-h8128ea2_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aws-c-common-0.9.28-h00291cd_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aws-c-compression-0.2.19-h8128ea2_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aws-c-event-stream-0.4.3-hf6f7cdd_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aws-c-http-0.8.8-h2f86973_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aws-c-io-0.14.18-hf9a0f1c_9.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aws-c-mqtt-0.10.4-he4b61a0_19.conda - - conda: 
https://conda.anaconda.org/conda-forge/osx-64/aws-c-s3-0.6.5-h915d0f8_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aws-c-sdkutils-0.1.19-h8128ea2_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aws-checksums-0.1.18-h8128ea2_11.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aws-crt-cpp-0.28.2-h27d4fa7_4.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/aws-sdk-cpp-1.11.379-h7a58a96_9.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/azure-core-cpp-1.13.0-hf8dbe3c_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/azure-identity-cpp-1.8.0-h60298e3_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/azure-storage-blobs-cpp-12.12.0-h646f05d_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/azure-storage-common-cpp-12.7.0-hf91904f_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/azure-storage-files-datalake-cpp-12.11.0-h14965f0_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/babel-2.14.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/backports-1.0-pyhd8ed1ab_4.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/backports.tarfile-1.0.0-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/backports.zoneinfo-0.2.1-py311h6eed73b_9.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/beautifulsoup4-4.12.3-pyha770c72_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/black-24.8.0-py311h6eed73b_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/blosc-1.21.6-h7d75f6d_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/bokeh-3.5.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/bottleneck-1.4.0-py311h0034819_2.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/branca-0.7.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/brotli-1.1.0-h00291cd_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/brotli-bin-1.1.0-h00291cd_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/brotli-python-1.1.0-py311hd89902b_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/c-ares-1.33.1-h44e7173_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/ca-certificates-2024.8.30-h8857fd0_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/cairo-1.18.0-h37bd5c4_3.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/certifi-2024.8.30-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/cffi-1.17.1-py311h137bacd_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/cfitsio-4.4.1-ha105788_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/cftime-1.6.4-py311h0034819_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/charset-normalizer-3.3.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/click-8.1.7-unix_pyh707e725_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/click-plugins-1.1.1-py_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/cligj-0.7.2-pyhd8ed1ab_1.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/cloudpickle-3.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/cmarkgfm-0.8.0-py311h2725bcf_3.conda - - conda: 
https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/contextily-1.6.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/contourpy-1.3.0-py311hf2f7c97_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/coverage-7.6.1-py311h3336109_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/cytoolz-0.12.3-py311he705e18_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-2024.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-core-2024.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-expr-1.1.13-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/dav1d-1.2.1-h0dc2134_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/decopatch-1.4.10-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/distributed-2024.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/docutils-0.21.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/double-conversion-3.3.0-he965462_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/editables-0.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/eigen-3.4.0-h1c7c39f_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/execnet-2.1.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/expat-2.6.3-hac325c4_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/fastcore-1.7.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/fasteners-0.17.3-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/ffmpeg-6.1.2-gpl_h5256a10_102.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/filelock-3.16.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/flopy-3.8.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/fltk-1.3.9-ha50d76c_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/fmt-11.0.2-h3c5361c_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/folium-0.17.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-dejavu-sans-mono-2.37-hab24e00_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-inconsolata-3.000-h77eed37_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-source-code-pro-2.038-h77eed37_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-ubuntu-0.83-h77eed37_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/fontconfig-2.14.2-h5bb23bf_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/fonts-conda-ecosystem-1-0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/fonttools-4.53.1-py311h3336109_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/freeimage-3.18.0-h55e5cf8_21.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/freetype-2.12.1-h60636b9_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/freexl-2.0.0-h3ec172f_0.conda - - conda: 
https://conda.anaconda.org/conda-forge/osx-64/fribidi-1.0.10-hbcb3906_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/frozenlist-1.4.1-py311h3336109_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/fsspec-2024.9.0-pyhff2d567_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gdal-3.9.2-py311ha943c4b_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gdk-pixbuf-2.42.12-ha587570_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/geographiclib-2.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/geopandas-1.0.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/geopandas-base-1.0.1-pyha770c72_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/geopy-2.4.1-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/geos-3.12.2-hf036a51_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/geotiff-1.7.3-h4bbec01_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gettext-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gettext-tools-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gflags-2.2.2-hb1e8313_1004.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/gh-2.56.0-he13f2d6_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/giflib-5.2.2-h10d778d_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gl2ps-1.4.2-hd82a5f3_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/glew-2.1.0-h046ec9c_2.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/glog-0.7.1-h2790a97_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gmp-6.3.0-hf036a51_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gmsh-4.12.2-h48a2193_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gnutls-3.8.7-hfad6214_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/graphite2-1.3.13-h73e2aa4_1003.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/graphviz-12.0.0-he14ced1_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gtk2-2.24.33-h2c15c3c_5.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gts-0.7.6-h53e17e3_4.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/h2-4.1.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/harfbuzz-9.0.0-h098a298_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/hatchling-1.25.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/hdf4-4.2.15-h8138101_7.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/hdf5-1.14.3-nompi_h687a608_105.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/hpack-4.0.0-pyh9f0ad1d_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/hyperframe-6.0.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/hypothesis-6.112.1-pyha770c72_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/icu-75.1-h120a0e1_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/idna-3.8-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/imagesize-1.4.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/imath-3.1.12-h2016aa1_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-8.5.0-pyha770c72_0.conda - - conda: 
https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-8.5.0-hd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.4.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jaraco.classes-3.4.0-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jaraco.context-5.3.0-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jaraco.functools-4.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.4-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/joblib-1.4.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/json-c-0.17-h6253ea5_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/jsoncpp-1.9.5-h940c156_1.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/jxrlib-1.1-h10d778d_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/kealib-1.5.3-he475af8_2.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/keyring-25.3.0-pyh534df25_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/kiwisolver-1.4.7-py311hf2f7c97_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/krb5-1.21.3-h37d8d59_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/lame-3.100-hb7f2c08_1003.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/lcms2-2.16-ha2f27b4_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/lerc-4.0.0-hb486fe8_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libabseil-20240116.2-cxx17_hf036a51_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libaec-1.1.3-h73e2aa4_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libarchive-3.7.4-h20e244c_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libarrow-17.0.0-ha60c65e_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libarrow-acero-17.0.0-hac325c4_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libarrow-dataset-17.0.0-hac325c4_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libarrow-substrait-17.0.0-hba007a9_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libasprintf-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libasprintf-devel-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libass-0.17.3-h5386a9e_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libblas-3.9.0-22_osx64_openblas.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libbrotlicommon-1.1.0-h00291cd_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libbrotlidec-1.1.0-h00291cd_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libbrotlienc-1.1.0-h00291cd_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libcblas-3.9.0-22_osx64_openblas.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libclang-cpp16-16.0.6-default_h0c94c6a_13.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libclang13-18.1.8-default_h9ff962c_4.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libcrc32c-1.1.2-he49afe7_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libcurl-8.10.0-h58e7537_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libcxx-18.1.8-hd876a4e_7.conda - - 
conda: https://conda.anaconda.org/conda-forge/osx-64/libdeflate-1.21-hfdf4475_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libedit-3.1.20191231-h0678c8f_2.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libev-4.33-h10d778d_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libevent-2.1.12-ha90c15b_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libexpat-2.6.3-hac325c4_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libffi-3.4.2-h0d85af4_5.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgd-2.3.3-h2e77e4f_10.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-3.9.2-h694c41f_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-core-3.9.2-h26ecb72_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-fits-3.9.2-h2000d26_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-grib-3.9.2-h9237131_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-hdf4-3.9.2-hbfba102_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-hdf5-3.9.2-hc0c3446_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-jp2openjpeg-3.9.2-hd77bb1f_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-kea-3.9.2-he223473_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-netcdf-3.9.2-he83ae23_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-pdf-3.9.2-h85e1e31_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-pg-3.9.2-h7ffd8cf_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-postgisraster-3.9.2-h7ffd8cf_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-tiledb-3.9.2-h6b11327_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-xls-3.9.2-hc33d192_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgettextpo-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgettextpo-devel-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgfortran-5.0.0-13_2_0_h97931a8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgfortran5-13.2.0-h2873a65_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libglib-2.80.3-h736d271_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgoogle-cloud-2.28.0-h721cda5_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgoogle-cloud-storage-2.28.0-h9e84e37_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgrpc-1.62.2-h384b2fc_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libhwloc-2.11.1-default_h456cccd_1000.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libiconv-1.17-hd75f5a5_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libidn2-2.3.7-h10d778d_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libintl-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libintl-devel-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libjpeg-turbo-3.0.0-h0dc2134_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libkml-1.3.0-h9ee1731_1021.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/liblapack-3.9.0-22_osx64_openblas.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libllvm14-14.0.6-hc8e404f_4.conda - - conda: 
https://conda.anaconda.org/conda-forge/noarch/contextily-1.6.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/contourpy-1.3.0-py311hf2f7c97_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/coverage-7.6.1-py311h3336109_1.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/contourpy-1.3.0-py311h2c37856_1.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/coverage-7.6.1-py311h460d6c5_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/cytoolz-0.12.3-py311he705e18_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-2024.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-core-2024.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-expr-1.1.13-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/dav1d-1.2.1-h0dc2134_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/debugpy-1.8.5-py311hd89902b_1.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/cytoolz-0.12.3-py311h05b510d_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/dask-2024.9.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/dask-core-2024.9.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/dask-expr-1.1.14-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/dav1d-1.2.1-hb547adb_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/debugpy-1.8.5-py311h3f08180_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/decopatch-1.4.10-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/decorator-5.1.1-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/defusedxml-0.7.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/distributed-2024.8.2-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/distributed-2024.9.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/docutils-0.21.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/double-conversion-3.3.0-he965462_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/double-conversion-3.3.0-h13dd4ca_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/editables-0.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/eigen-3.4.0-h1c7c39f_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/eigen-3.4.0-h1995070_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/entrypoints-0.4-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/execnet-2.1.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/executing-2.1.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/expat-2.6.3-hac325c4_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/fastcore-1.7.5-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/expat-2.6.3-hf9b8971_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/fastcore-1.7.8-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/fasteners-0.17.3-pyhd8ed1ab_0.tar.bz2 - - conda: 
https://conda.anaconda.org/conda-forge/osx-64/ffmpeg-6.1.2-gpl_h5256a10_102.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/ffmpeg-6.1.2-gpl_h3ef3969_102.conda - conda: https://conda.anaconda.org/conda-forge/noarch/filelock-3.16.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/flopy-3.8.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/fltk-1.3.9-ha50d76c_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/fmt-11.0.2-h3c5361c_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/fltk-1.3.9-h5164b75_1.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/fmt-11.0.2-h420ef59_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/folium-0.17.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-dejavu-sans-mono-2.37-hab24e00_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-inconsolata-3.000-h77eed37_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-source-code-pro-2.038-h77eed37_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-ubuntu-0.83-h77eed37_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/fontconfig-2.14.2-h5bb23bf_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/fontconfig-2.14.2-h82840c6_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/fonts-conda-ecosystem-1-0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/fonttools-4.53.1-py311h3336109_1.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/fonttools-4.53.1-py311h460d6c5_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/fqdn-1.5.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/freeimage-3.18.0-h55e5cf8_21.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/freetype-2.12.1-h60636b9_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/freexl-2.0.0-h3ec172f_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/fribidi-1.0.10-hbcb3906_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/frozenlist-1.4.1-py311h3336109_1.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/freeimage-3.18.0-hf268909_21.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/freetype-2.12.1-hadb7bae_2.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/freexl-2.0.0-hfbad9fb_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/fribidi-1.0.10-h27ca646_0.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/frozenlist-1.4.1-py311h460d6c5_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/fsspec-2024.9.0-pyhff2d567_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gdal-3.9.2-py311ha943c4b_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gdk-pixbuf-2.42.12-ha587570_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gdal-3.9.2-py311h40baa13_2.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gdk-pixbuf-2.42.12-h7ddc832_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/geographiclib-2.0-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/geopandas-1.0.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/geopandas-base-1.0.1-pyha770c72_0.conda - conda: 
https://conda.anaconda.org/conda-forge/noarch/geopy-2.4.1-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/geos-3.12.2-hf036a51_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/geotiff-1.7.3-h4bbec01_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gettext-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gettext-tools-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gflags-2.2.2-hb1e8313_1004.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/gh-2.56.0-he13f2d6_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/giflib-5.2.2-h10d778d_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gl2ps-1.4.2-hd82a5f3_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/glew-2.1.0-h046ec9c_2.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/glog-0.7.1-h2790a97_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gmp-6.3.0-hf036a51_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gmsh-4.12.2-h48a2193_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gnutls-3.8.7-hfad6214_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/graphite2-1.3.13-h73e2aa4_1003.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/graphviz-12.0.0-he14ced1_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gtk2-2.24.33-h2c15c3c_5.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/gts-0.7.6-h53e17e3_4.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/geos-3.12.2-h00cdb27_1.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/geotiff-1.7.3-h7e5fb84_2.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gettext-0.22.5-h8414b35_3.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gettext-tools-0.22.5-h8414b35_3.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gflags-2.2.2-hc88da5d_1004.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gh-2.56.0-h163aea0_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/giflib-5.2.2-h93a5062_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gl2ps-1.4.2-hc97c1ff_1.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/glew-2.1.0-h9f76cd9_2.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/glog-0.7.1-heb240a5_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gmp-6.3.0-h7bae524_2.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gmsh-4.12.2-hd427cfb_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gnutls-3.8.7-h9df781c_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/graphite2-1.3.13-hebf3989_1003.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/graphviz-12.0.0-hbf8cc41_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gtk2-2.24.33-h91d5085_5.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gts-0.7.6-he42f4ea_4.conda - conda: https://conda.anaconda.org/conda-forge/noarch/h11-0.14.0-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/h2-4.1.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/harfbuzz-9.0.0-h098a298_1.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/harfbuzz-9.0.0-h997cde5_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/hatchling-1.25.0-pyhd8ed1ab_0.conda 
- - conda: https://conda.anaconda.org/conda-forge/osx-64/hdf4-4.2.15-h8138101_7.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/hdf5-1.14.3-nompi_h687a608_105.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/hdf4-4.2.15-h2ee6834_7.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/hdf5-1.14.3-nompi_hec07895_105.conda - conda: https://conda.anaconda.org/conda-forge/noarch/hpack-4.0.0-pyh9f0ad1d_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/httpcore-1.0.5-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/httpx-0.27.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/hyperframe-6.0.1-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/hypothesis-6.112.1-pyha770c72_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/icu-75.1-h120a0e1_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/idna-3.8-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/icu-75.1-hfee45f7_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/idna-3.10-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/imagesize-1.4.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/imath-3.1.12-h2016aa1_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/imath-3.1.12-h025cafa_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-8.5.0-pyha770c72_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-8.5.0-hd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.4.5-pyhd8ed1ab_0.conda @@ -2513,10 +2083,10 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/jedi-0.19.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.4-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/joblib-1.4.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/json-c-0.17-h6253ea5_1.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/json-c-0.17-he54c16a_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/json5-0.9.25-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/jsoncpp-1.9.5-h940c156_1.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/jsonpointer-3.0.0-py311h6eed73b_1.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/jsoncpp-1.9.5-hc021e02_1.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/jsonpointer-3.0.0-py311h267d04e_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/jsonschema-4.23.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/jsonschema-specifications-2023.12.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/jsonschema-with-format-nongpl-4.23.0-hd8ed1ab_0.conda @@ -2524,7 +2094,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter-lsp-2.2.5-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter_client-8.6.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter_console-6.6.3-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/jupyter_core-5.7.2-py311h6eed73b_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/jupyter_core-5.7.2-py311h267d04e_0.conda - conda: 
https://conda.anaconda.org/conda-forge/noarch/jupyter_events-0.10.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter_server-2.14.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter_server_terminals-0.5.3-pyhd8ed1ab_0.conda @@ -2532,534 +2102,8 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/jupyterlab_pygments-0.3.0-pyhd8ed1ab_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/jupyterlab_server-2.27.3-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/jupyterlab_widgets-3.0.13-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/jxrlib-1.1-h10d778d_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/kealib-1.5.3-he475af8_2.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/keyring-25.3.0-pyh534df25_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/kiwisolver-1.4.7-py311hf2f7c97_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/krb5-1.21.3-h37d8d59_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/lame-3.100-hb7f2c08_1003.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/lcms2-2.16-ha2f27b4_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/lerc-4.0.0-hb486fe8_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libabseil-20240116.2-cxx17_hf036a51_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libaec-1.1.3-h73e2aa4_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libarchive-3.7.4-h20e244c_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libarrow-17.0.0-ha60c65e_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libarrow-acero-17.0.0-hac325c4_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libarrow-dataset-17.0.0-hac325c4_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libarrow-substrait-17.0.0-hba007a9_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libasprintf-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libasprintf-devel-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libass-0.17.3-h5386a9e_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libblas-3.9.0-22_osx64_openblas.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libbrotlicommon-1.1.0-h00291cd_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libbrotlidec-1.1.0-h00291cd_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libbrotlienc-1.1.0-h00291cd_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libcblas-3.9.0-22_osx64_openblas.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libclang-cpp16-16.0.6-default_h0c94c6a_13.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libclang13-18.1.8-default_h9ff962c_4.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libcrc32c-1.1.2-he49afe7_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libcurl-8.10.0-h58e7537_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libcxx-18.1.8-hd876a4e_7.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libdeflate-1.21-hfdf4475_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libedit-3.1.20191231-h0678c8f_2.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libev-4.33-h10d778d_2.conda - - conda: 
https://conda.anaconda.org/conda-forge/osx-64/libevent-2.1.12-ha90c15b_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libexpat-2.6.3-hac325c4_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libffi-3.4.2-h0d85af4_5.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgd-2.3.3-h2e77e4f_10.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-3.9.2-h694c41f_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-core-3.9.2-h26ecb72_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-fits-3.9.2-h2000d26_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-grib-3.9.2-h9237131_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-hdf4-3.9.2-hbfba102_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-hdf5-3.9.2-hc0c3446_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-jp2openjpeg-3.9.2-hd77bb1f_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-kea-3.9.2-he223473_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-netcdf-3.9.2-he83ae23_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-pdf-3.9.2-h85e1e31_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-pg-3.9.2-h7ffd8cf_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-postgisraster-3.9.2-h7ffd8cf_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-tiledb-3.9.2-h6b11327_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgdal-xls-3.9.2-hc33d192_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgettextpo-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgettextpo-devel-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgfortran-5.0.0-13_2_0_h97931a8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgfortran5-13.2.0-h2873a65_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libglib-2.80.3-h736d271_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgoogle-cloud-2.28.0-h721cda5_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgoogle-cloud-storage-2.28.0-h9e84e37_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libgrpc-1.62.2-h384b2fc_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libhwloc-2.11.1-default_h456cccd_1000.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libiconv-1.17-hd75f5a5_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libidn2-2.3.7-h10d778d_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libintl-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libintl-devel-0.22.5-hdfe23c8_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libjpeg-turbo-3.0.0-h0dc2134_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libkml-1.3.0-h9ee1731_1021.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/liblapack-3.9.0-22_osx64_openblas.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libllvm14-14.0.6-hc8e404f_4.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libllvm16-16.0.6-hbedff68_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libllvm18-18.1.8-h9ce406d_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libnetcdf-4.9.2-nompi_h7334405_114.conda - - conda: 
https://conda.anaconda.org/conda-forge/osx-64/libnghttp2-1.58.0-h64cf6d3_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libogg-1.3.5-hfdf4475_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopenblas-0.3.27-openmp_h8869122_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-2024.3.0-h3d2f4b3_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-auto-batch-plugin-2024.3.0-h7b87a6e_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-auto-plugin-2024.3.0-h7b87a6e_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-hetero-plugin-2024.3.0-h280e65d_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-intel-cpu-plugin-2024.3.0-h3d2f4b3_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-ir-frontend-2024.3.0-h280e65d_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-onnx-frontend-2024.3.0-he1e86a1_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-paddle-frontend-2024.3.0-he1e86a1_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-pytorch-frontend-2024.3.0-hf036a51_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-tensorflow-frontend-2024.3.0-haca2b7f_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-tensorflow-lite-frontend-2024.3.0-hf036a51_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libopus-1.3.1-hc929b4f_1.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libparquet-17.0.0-hf1b0f52_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libpng-1.6.43-h92b6c6a_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libpq-16.4-h75a757a_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libprotobuf-4.25.3-h4e4d658_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libraw-0.21.1-h8138101_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libre2-11-2023.09.01-h81f5012_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/librsvg-2.58.4-h2682814_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/librttopo-1.1.0-he2ba7a0_16.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libsodium-1.0.20-hfdf4475_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libspatialite-5.1.0-hdc25a2c_9.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libsqlite-3.46.1-h4b8f8c9_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libssh2-1.11.0-hd019ec5_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libtasn1-4.19.0-hb7f2c08_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libtheora-1.1.1-hfdf4475_1006.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libthrift-0.20.0-h75589b3_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libtiff-4.6.0-h603087a_4.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libunistring-0.9.10-h0d85af4_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libutf8proc-2.8.0-hb7f2c08_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libvorbis-1.3.7-h046ec9c_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libvpx-1.14.1-hf036a51_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libwebp-base-1.4.0-h10d778d_0.conda - - conda: 
https://conda.anaconda.org/conda-forge/osx-64/libxcb-1.16-h00291cd_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libxml2-2.12.7-heaf3512_4.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libzip-1.10.1-hc158999_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libzlib-1.3.1-h87427d6_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/llvm-openmp-18.1.8-h15ab845_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/llvmlite-0.43.0-py311h25b8078_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/locket-1.0.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/loguru-0.7.2-py311h6eed73b_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/lz4-4.3.3-py311h12b7ed1_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/lz4-c-1.9.4-hf0c8a7f_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/lzo-2.10-h10d778d_1001.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/makefun-1.15.4-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/mapclassify-2.8.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/markdown-it-py-3.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/markupsafe-2.1.5-py311h3336109_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/matplotlib-3.9.2-py311h6eed73b_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/matplotlib-base-3.9.2-py311h8b21175_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/matplotlib-inline-0.1.7-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/mdurl-0.1.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/mercantile-1.2.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/metis-5.1.1-h73e2aa4_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/minizip-4.0.7-h62b0c8d_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/mistune-3.0.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/more-itertools-10.5.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/msgpack-python-1.1.0-py311hf2f7c97_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/multidict-6.1.0-py311h3e662af_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/munkres-1.1.4-pyh9f0ad1d_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/mypy-1.11.2-py311h3336109_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/mypy_extensions-1.0.0-pyha770c72_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/mysql-common-9.0.1-h3829a10_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/mysql-libs-9.0.1-h01befea_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/nbclient-0.10.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/nbconvert-core-7.16.4-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/nbformat-5.10.4-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/ncurses-6.5-hf036a51_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/nest-asyncio-1.6.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/netcdf4-1.7.1-nompi_py311h79bb2b8_102.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/nettle-3.9.1-h8e11ae5_0.conda - - conda: 
https://conda.anaconda.org/conda-forge/noarch/networkx-3.3-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/nh3-0.2.18-py311h95688db_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/nlohmann_json-3.11.3-hf036a51_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/notebook-7.2.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/notebook-shim-0.2.4-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/nspr-4.35-hea0b92c_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/nss-3.104-h3135457_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/numba-0.60.0-py311h0e5bd6a_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/numba_celltree-0.2.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/numcodecs-0.13.0-py311hfdcbad3_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/numpy-2.0.2-py311h394b0bb_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/occt-7.7.2-novtk_h0a0d97a_101.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/openexr-3.2.2-h2627bef_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/openh264-2.4.1-h73e2aa4_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/openjpeg-2.5.2-h7310d3a_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/openssl-3.3.2-hd23fc13_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/orc-2.0.2-h22b2039_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/overrides-7.7.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/p11-kit-0.24.1-h65f8906_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/packaging-24.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pandamesh-0.2.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pandas-2.2.2-py311hfdcbad3_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pandocfilters-1.5.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/pango-1.54.0-h115fe74_2.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/parso-0.8.4-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/partd-1.4.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pathspec-0.12.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pcre2-10.44-h7634a1b_2.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pexpect-4.9.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pickleshare-0.7.5-py_1003.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/pillow-10.4.0-py311h17ad1af_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pip-24.2-pyh8b19718_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pixman-0.43.4-h73e2aa4_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pkginfo-1.10.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pkgutil-resolve-name-1.3.10-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.3.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pluggy-1.5.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pooch-1.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/poppler-24.08.0-h65860a0_1.conda - - conda: 
https://conda.anaconda.org/conda-forge/noarch/poppler-data-0.4.12-hd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/postgresql-16.4-h4b98a8f_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/proj-9.4.1-hf92c781_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/prometheus_client-0.20.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/prompt-toolkit-3.0.47-pyha770c72_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/prompt_toolkit-3.0.47-hd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/psutil-6.0.0-py311h3336109_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pthread-stubs-0.4-hc929b4f_1001.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/ptyprocess-0.7.0-pyhd3deb0d_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/pugixml-1.14-he965462_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pure_eval-0.2.3-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/py-cpuinfo-9.0.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/py-triangle-20230923-py311he705e18_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pyarrow-17.0.0-py311he764780_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pyarrow-core-17.0.0-py311h073f6b9_1_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pyarrow-hotfix-0.6-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pycparser-2.22-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pydantic-2.9.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pydantic-core-2.23.3-py311h95688db_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pydata-sphinx-theme-0.15.4-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pygments-2.18.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pymetis-2023.1.1-py311h5fe6d0d_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pyobjc-core-10.3.1-py311hd6939f8_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pyobjc-framework-cocoa-10.3.1-py311hd6939f8_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pyogrio-0.9.0-py311hdb57d13_2.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.1.4-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pyproj-3.6.1-py311h48d2620_9.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pyproject_hooks-1.1.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pysocks-1.7.1-pyha2e5f31_6.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-7.4.4-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-benchmark-4.0.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-cases-3.8.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-cov-5.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.6.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/python-3.11.0-he7542f4_1_cpython.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/python-build-1.2.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.9.0-pyhd8ed1ab_0.conda - - conda: 
https://conda.anaconda.org/conda-forge/noarch/python-fastjsonschema-2.20.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/python-gmsh-4.12.2-h57928b3_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/python-graphviz-0.20.3-pyh717bed2_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/python-json-logger-2.0.7-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2024.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/python_abi-3.11-5_cp311.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pytz-2024.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pyvista-0.44.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pyyaml-6.0.2-py311h3336109_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pyzmq-26.2.0-py311h95f92fe_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/qhull-2020.2-h3c5361c_5.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/qt6-main-6.7.2-h03f778c_5.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/rapidjson-1.1.0.post20240409-hf036a51_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/rasterio-1.3.11-py311h57fe283_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/re2-2023.09.01-hb168e87_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/readline-8.2-h9e318b2_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/readme_renderer-44.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/referencing-0.35.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/requests-2.32.3-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/requests-toolbelt-1.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/rfc3339-validator-0.1.4-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/rfc3986-2.0.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/rfc3986-validator-0.1.1-pyh9f0ad1d_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/rich-13.8.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/rioxarray-0.17.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/rpds-py-0.20.0-py311h95688db_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/ruff-0.6.4-py311h8c6096b_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/scikit-learn-1.5.2-py311ha1d5734_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/scipy-1.14.1-py311hb3ed397_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/scooby-0.10.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/send2trash-1.8.3-pyh31c8845_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/setuptools-73.0.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/setuptools-scm-8.1.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/setuptools_scm-8.1.0-hd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/shapely-2.0.6-py311hbb437d5_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/snappy-1.2.1-he1e6707_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/sniffio-1.3.1-pyhd8ed1ab_0.conda - - 
conda: https://conda.anaconda.org/conda-forge/noarch/snowballstemmer-2.2.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/snuggs-1.4.7-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/sortedcontainers-2.4.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/soupsieve-2.5-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/spdlog-1.14.1-h325aa07_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/sphinx-8.0.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/sphinx-gallery-0.17.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-applehelp-2.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-devhelp-2.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-htmlhelp-2.1.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-jsmath-1.0.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-qthelp-2.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-serializinghtml-1.1.10-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/sqlite-3.46.1-he26b093_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/stack_data-0.6.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/svt-av1-2.2.1-hac325c4_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/tbb-2021.13.0-h37c8870_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/tbb-devel-2021.13.0-hf74753b_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/tblib-3.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/terminado-0.18.1-pyh31c8845_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.5.0-pyhc1e730c_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/tiledb-2.26.0-h313d0e2_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/tinycss2-1.3.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/tk-8.6.13-h1abcd95_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/toml-0.10.2-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/tomli-2.0.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/tomli-w-1.0.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/toolz-0.12.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/tornado-6.4.1-py311h3336109_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/tqdm-4.66.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/traitlets-5.14.3-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/trove-classifiers-2024.9.12-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/twine-5.1.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/types-python-dateutil-2.9.0.20240906-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/typing-extensions-4.12.2-hd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/typing_extensions-4.12.2-pyha770c72_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/typing_utils-0.1.0-pyhd8ed1ab_0.tar.bz2 - - conda: 
https://conda.anaconda.org/conda-forge/osx-64/tzcode-2024b-h00291cd_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h8827d51_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/uri-template-1.3.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/uriparser-0.9.8-h6aefe2f_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.2-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/utfcpp-4.0.5-h694c41f_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/vtk-9.3.1-qt_py311hccf493d_205.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/vtk-base-9.3.1-qt_py311h579de60_205.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/vtk-io-ffmpeg-9.3.1-qt_py311h98fac4b_205.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/wcwidth-0.2.13-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/webcolors-24.8.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/webencodings-0.5.1-pyhd8ed1ab_2.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/websocket-client-1.8.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/wheel-0.44.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/widgetsnbextension-4.0.13-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/wslink-2.1.3-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/x264-1!164.3095-h775f41a_2.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/x265-3.5-hbb4e6a2_3.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/xarray-2024.9.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xerces-c-3.2.5-hfb503d4_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-fixesproto-5.0-h0d85af4_1002.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-kbproto-1.0.7-h35c211d_1002.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-libice-1.1.1-h0dc2134_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-libsm-1.2.4-h0dc2134_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-libx11-1.8.9-h7022169_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-libxau-1.0.11-h0dc2134_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-libxdmcp-1.1.3-h35c211d_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-libxext-1.3.4-hb7f2c08_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-libxfixes-5.0.3-h0d85af4_1004.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-libxrender-0.9.11-h0dc2134_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-renderproto-0.11.1-h0d85af4_1002.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-xextproto-7.3.0-hb7f2c08_1003.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xorg-xproto-7.0.31-h35c211d_1007.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/xugrid-0.12.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/xyzservices-2024.9.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xz-5.2.6-h775f41a_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/yaml-0.2.5-h0d85af4_2.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/yarl-1.9.4-py311h3336109_1.conda - - conda: 
https://conda.anaconda.org/conda-forge/noarch/zarr-2.18.3-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/zeromq-4.3.5-hb33e954_5.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/zict-3.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/zipp-3.20.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/zlib-1.3.1-h87427d6_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/zstandard-0.23.0-py311hdf6fcd6_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda - - pypi: https://files.pythonhosted.org/packages/50/fa/a2561d6837cd45a3971c514222e94d3ded3f105993ddcf4983ed68ce3da3/mypy2junit-1.9.0-py3-none-any.whl - osx-arm64: - - conda: https://conda.anaconda.org/conda-forge/noarch/accessible-pygments-0.0.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/affine-2.4.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/aiohappyeyeballs-2.4.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aiohttp-3.10.5-py311h460d6c5_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/aiosignal-1.3.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/alabaster-1.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/annotated-types-0.7.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/anyio-4.4.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aom-3.9.1-h7bae524_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/appnope-0.1.4-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/argon2-cffi-23.1.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/argon2-cffi-bindings-21.2.0-py311h460d6c5_5.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/arrow-1.3.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/asciitree-0.3.3-py_2.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/asttokens-2.4.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/async-lru-2.0.4-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/atk-1.0-2.38.0-hd03087b_2.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/attrs-24.2.0-pyh71513ae_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-auth-0.7.29-hd3c7522_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-cal-0.7.4-h41dd001_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-common-0.9.28-hd74edd7_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-compression-0.2.19-h41dd001_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-event-stream-0.4.3-hb2a355e_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-http-0.8.8-hf5a2c8c_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-io-0.14.18-hc3cb426_9.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-mqtt-0.10.4-hb9beb3e_19.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-s3-0.6.5-h439c227_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-sdkutils-0.1.19-h41dd001_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aws-checksums-0.1.18-h41dd001_11.conda - - conda: 
https://conda.anaconda.org/conda-forge/osx-arm64/aws-crt-cpp-0.28.2-h4756f83_4.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/aws-sdk-cpp-1.11.379-h67f4a54_9.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/azure-core-cpp-1.13.0-hd01fc5c_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/azure-identity-cpp-1.8.0-h13ea094_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/azure-storage-blobs-cpp-12.12.0-hfde595f_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/azure-storage-common-cpp-12.7.0-hcf3b6fd_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/azure-storage-files-datalake-cpp-12.11.0-h082e32e_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/babel-2.14.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/backports-1.0-pyhd8ed1ab_4.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/backports.tarfile-1.0.0-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/backports.zoneinfo-0.2.1-py311h267d04e_9.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/beautifulsoup4-4.12.3-pyha770c72_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/black-24.8.0-py311h267d04e_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/bleach-6.1.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/blosc-1.21.6-h5499902_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/bokeh-3.5.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/bottleneck-1.4.0-py311h0f07fe1_2.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/branca-0.7.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/brotli-1.1.0-hd74edd7_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/brotli-bin-1.1.0-hd74edd7_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/brotli-python-1.1.0-py311h3f08180_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/bzip2-1.0.8-h99b78c6_7.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/c-ares-1.33.1-hd74edd7_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/ca-certificates-2024.8.30-hf0a4a13_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/cached-property-1.5.2-hd8ed1ab_1.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/cached_property-1.5.2-pyha770c72_1.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/cairo-1.18.0-hb4a6bf7_3.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/certifi-2024.8.30-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/cffi-1.17.1-py311h3a79f62_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/cfitsio-4.4.1-h793ed5c_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/cftime-1.6.4-py311h0f07fe1_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/charset-normalizer-3.3.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/click-8.1.7-unix_pyh707e725_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/click-plugins-1.1.1-py_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/cligj-0.7.2-pyhd8ed1ab_1.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/cloudpickle-3.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/cmarkgfm-0.8.0-py311heffc1b2_3.conda - - 
conda: https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/comm-0.2.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/contextily-1.6.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/contourpy-1.3.0-py311h2c37856_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/coverage-7.6.1-py311h460d6c5_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/cytoolz-0.12.3-py311h05b510d_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-2024.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-core-2024.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-expr-1.1.13-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/dav1d-1.2.1-hb547adb_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/debugpy-1.8.5-py311h3f08180_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/decopatch-1.4.10-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/decorator-5.1.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/defusedxml-0.7.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/distributed-2024.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/docutils-0.21.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/double-conversion-3.3.0-h13dd4ca_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/editables-0.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/eigen-3.4.0-h1995070_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/entrypoints-0.4-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.2.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/execnet-2.1.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/executing-2.1.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/expat-2.6.3-hf9b8971_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/fastcore-1.7.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/fasteners-0.17.3-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/ffmpeg-6.1.2-gpl_h3ef3969_102.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/filelock-3.16.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/flopy-3.8.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/fltk-1.3.9-h5164b75_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/fmt-11.0.2-h420ef59_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/folium-0.17.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-dejavu-sans-mono-2.37-hab24e00_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-inconsolata-3.000-h77eed37_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-source-code-pro-2.038-h77eed37_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-ubuntu-0.83-h77eed37_2.conda - - conda: 
https://conda.anaconda.org/conda-forge/osx-arm64/fontconfig-2.14.2-h82840c6_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/fonts-conda-ecosystem-1-0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/fonttools-4.53.1-py311h460d6c5_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/fqdn-1.5.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/freeimage-3.18.0-hf268909_21.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/freetype-2.12.1-hadb7bae_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/freexl-2.0.0-hfbad9fb_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/fribidi-1.0.10-h27ca646_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/frozenlist-1.4.1-py311h460d6c5_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/fsspec-2024.9.0-pyhff2d567_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gdal-3.9.2-py311h40baa13_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gdk-pixbuf-2.42.12-h7ddc832_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/geographiclib-2.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/geopandas-1.0.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/geopandas-base-1.0.1-pyha770c72_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/geopy-2.4.1-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/geos-3.12.2-h00cdb27_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/geotiff-1.7.3-h7e5fb84_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gettext-0.22.5-h8414b35_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gettext-tools-0.22.5-h8414b35_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gflags-2.2.2-hc88da5d_1004.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gh-2.56.0-h163aea0_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/giflib-5.2.2-h93a5062_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gl2ps-1.4.2-hc97c1ff_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/glew-2.1.0-h9f76cd9_2.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/glog-0.7.1-heb240a5_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gmp-6.3.0-h7bae524_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gmsh-4.12.2-hd427cfb_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gnutls-3.8.7-h9df781c_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/graphite2-1.3.13-hebf3989_1003.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/graphviz-12.0.0-hbf8cc41_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gtk2-2.24.33-h91d5085_5.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gts-0.7.6-he42f4ea_4.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/h11-0.14.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/h2-4.1.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/harfbuzz-9.0.0-h997cde5_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/hatchling-1.25.0-pyhd8ed1ab_0.conda - - conda: 
https://conda.anaconda.org/conda-forge/osx-arm64/hdf4-4.2.15-h2ee6834_7.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/hdf5-1.14.3-nompi_hec07895_105.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/hpack-4.0.0-pyh9f0ad1d_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/httpcore-1.0.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/httpx-0.27.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/hyperframe-6.0.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/hypothesis-6.112.1-pyha770c72_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/icu-75.1-hfee45f7_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/idna-3.8-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/imagesize-1.4.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/imath-3.1.12-h025cafa_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-8.5.0-pyha770c72_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-8.5.0-hd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/importlib_resources-6.4.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/ipykernel-6.29.5-pyh57ce528_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/ipython-8.27.0-pyh707e725_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/ipywidgets-8.1.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/isoduration-20.11.0-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/jaraco.classes-3.4.0-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jaraco.context-5.3.0-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jaraco.functools-4.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jedi-0.19.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.4-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/joblib-1.4.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/json-c-0.17-he54c16a_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/json5-0.9.25-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/jsoncpp-1.9.5-hc021e02_1.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/jsonpointer-3.0.0-py311h267d04e_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jsonschema-4.23.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jsonschema-specifications-2023.12.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jsonschema-with-format-nongpl-4.23.0-hd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter-1.1.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter-lsp-2.2.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter_client-8.6.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter_console-6.6.3-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/jupyter_core-5.7.2-py311h267d04e_0.conda - - conda: 
https://conda.anaconda.org/conda-forge/noarch/jupyter_events-0.10.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter_server-2.14.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter_server_terminals-0.5.3-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jupyterlab-4.2.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jupyterlab_pygments-0.3.0-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jupyterlab_server-2.27.3-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/jupyterlab_widgets-3.0.13-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/jxrlib-1.1-h93a5062_3.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/kealib-1.5.3-h8edbb62_2.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/jxrlib-1.1-h93a5062_3.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/kealib-1.5.3-h8edbb62_2.conda - conda: https://conda.anaconda.org/conda-forge/noarch/keyring-25.3.0-pyh534df25_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/kiwisolver-1.4.7-py311h2c37856_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/krb5-1.21.3-h237132a_0.conda @@ -3069,10 +2113,10 @@ environments: - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libabseil-20240116.2-cxx17_h00cdb27_1.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libaec-1.1.3-hebf3989_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libarchive-3.7.4-h83d404f_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-17.0.0-h20538ec_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-acero-17.0.0-hf9b8971_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-dataset-17.0.0-hf9b8971_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-substrait-17.0.0-hbf8b706_13_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-17.0.0-h77c2f02_14_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-acero-17.0.0-hf9b8971_14_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-dataset-17.0.0-hf9b8971_14_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-substrait-17.0.0-hbf8b706_14_cpu.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libasprintf-0.22.5-h8414b35_3.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libasprintf-devel-0.22.5-h8414b35_3.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libass-0.17.3-hf20b609_0.conda @@ -3112,8 +2156,8 @@ environments: - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libgfortran-5.0.0-13_2_0_hd922786_3.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libgfortran5-13.2.0-hf226fd6_3.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libglib-2.80.3-h59d46d9_2.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libgoogle-cloud-2.28.0-hfe08963_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libgoogle-cloud-storage-2.28.0-h1466eeb_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libgoogle-cloud-2.29.0-hfa33a2f_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libgoogle-cloud-storage-2.29.0-h90fd6fa_0.conda - conda: 
https://conda.anaconda.org/conda-forge/osx-arm64/libgrpc-1.62.2-h9c18a4f_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libhwloc-2.11.1-default_h7685b71_1000.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libiconv-1.17-h0d3ecfb_2.conda @@ -3142,8 +2186,8 @@ environments: - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libopenvino-tensorflow-frontend-2024.3.0-h2741c3b_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libopenvino-tensorflow-lite-frontend-2024.3.0-h00cdb27_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libopus-1.3.1-h27ca646_1.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libparquet-17.0.0-hf0ba9ef_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libpng-1.6.43-h091b4b1_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libparquet-17.0.0-hf0ba9ef_14_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libpng-1.6.44-hc14010f_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libpq-16.4-h671472c_1.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libprotobuf-4.25.3-hbfab5d5_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libraw-0.21.1-h2ee6834_2.conda @@ -3236,7 +2280,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/osx-arm64/pixman-0.43.4-hebf3989_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pkginfo-1.10.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pkgutil-resolve-name-1.3.10-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.3.2-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.3.3-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pluggy-1.5.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pooch-1.8.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/poppler-24.08.0-h37b219d_1.conda @@ -3273,10 +2317,12 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-benchmark-4.0.0-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-cases-3.8.5-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-cov-5.0.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-dotenv-0.5.2-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.6.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/python-3.11.0-h3ba56d0_1_cpython.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/python-3.11.10-h739c21a_0_cpython.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-build-1.2.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.9.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/python-dotenv-1.0.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-fastjsonschema-2.20.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-gmsh-4.12.2-h57928b3_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-graphviz-0.20.3-pyh717bed2_0.conda @@ -3303,7 +2349,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/rich-13.8.1-pyhd8ed1ab_0.conda - conda: 
https://conda.anaconda.org/conda-forge/noarch/rioxarray-0.17.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/rpds-py-0.20.0-py311h481aa64_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/ruff-0.6.4-py311h2cf8269_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/ruff-0.6.5-py311h2cf8269_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/scikit-learn-1.5.2-py311h9e23f0f_1.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/scipy-1.14.1-py311h2929bc6_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/scooby-0.10.0-pyhd8ed1ab_0.conda @@ -3336,7 +2382,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/tblib-3.0.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/terminado-0.18.1-pyh31c8845_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.5.0-pyhc1e730c_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-arm64/tiledb-2.26.0-h3c94177_0.conda + - conda: https://conda.anaconda.org/conda-forge/osx-arm64/tiledb-2.26.0-hfe5b9dc_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/tinycss2-1.3.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/tk-8.6.13-h5083fa2_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/toml-0.10.2-pyhd8ed1ab_0.tar.bz2 @@ -3356,7 +2402,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h8827d51_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/uri-template-1.3.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/uriparser-0.9.8-h00cdb27_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.2-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.3-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/utfcpp-4.0.5-hce30654_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/vtk-9.3.1-qt_py311h07c347a_205.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/vtk-base-9.3.1-qt_py311h14e0e01_205.conda @@ -3393,7 +2439,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/zarr-2.18.3-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/zeromq-4.3.5-h64debc3_5.conda - conda: https://conda.anaconda.org/conda-forge/noarch/zict-3.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/zipp-3.20.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/zipp-3.20.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/zlib-1.3.1-hfb2fe0b_1.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstandard-0.23.0-py311ha60cc69_1.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstd-1.5.6-hb46c0d2_0.conda @@ -3415,18 +2461,18 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/asttokens-2.4.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/async-lru-2.0.4-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/attrs-24.2.0-pyh71513ae_0.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-auth-0.7.29-hf1f9119_1.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-auth-0.7.30-h4ab18a5_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-cal-0.7.4-hf1fc857_1.conda - conda: 
https://conda.anaconda.org/conda-forge/win-64/aws-c-common-0.9.28-h2466b09_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-compression-0.2.19-hf1fc857_1.conda - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-event-stream-0.4.3-hb6a8f00_1.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-http-0.8.8-heca9ddf_2.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-http-0.8.9-heca9ddf_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-io-0.14.18-h3831a8d_9.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-mqtt-0.10.4-h4d6445f_19.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-s3-0.6.5-h184cd82_2.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-mqtt-0.10.5-h8fec231_0.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-s3-0.6.5-h6945fc3_4.conda - conda: https://conda.anaconda.org/conda-forge/win-64/aws-c-sdkutils-0.1.19-hf1fc857_3.conda - conda: https://conda.anaconda.org/conda-forge/win-64/aws-checksums-0.1.18-hf1fc857_11.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/aws-crt-cpp-0.28.2-hcae1b89_4.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/aws-crt-cpp-0.28.2-h2ae5ca2_6.conda - conda: https://conda.anaconda.org/conda-forge/win-64/aws-sdk-cpp-1.11.379-h76bae87_9.conda - conda: https://conda.anaconda.org/conda-forge/win-64/azure-core-cpp-1.13.0-haf5610f_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/azure-identity-cpp-1.8.0-h148e6f0_2.conda @@ -3469,15 +2515,15 @@ environments: - conda: https://conda.anaconda.org/conda-forge/win-64/coverage-7.6.1-py311he736701_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/cytoolz-0.12.3-py311ha68e1ae_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-2024.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-core-2024.8.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/dask-expr-1.1.13-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/dask-2024.9.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/dask-core-2024.9.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/dask-expr-1.1.14-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/dav1d-1.2.1-hcfcfb64_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/debugpy-1.8.5-py311hda3d55a_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/decopatch-1.4.10-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/decorator-5.1.1-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/defusedxml-0.7.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/distributed-2024.8.2-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/distributed-2024.9.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/docutils-0.21.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/double-conversion-3.3.0-h63175ca_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/editables-0.5-pyhd8ed1ab_0.conda @@ -3487,7 +2533,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/execnet-2.1.1-pyhd8ed1ab_0.conda - conda: 
https://conda.anaconda.org/conda-forge/noarch/executing-2.1.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/expat-2.6.3-he0c23c2_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/fastcore-1.7.5-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/fastcore-1.7.8-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/fasteners-0.17.3-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/win-64/ffmpeg-6.1.2-gpl_h9cf63cc_102.conda - conda: https://conda.anaconda.org/conda-forge/noarch/filelock-3.16.0-pyhd8ed1ab_0.conda @@ -3537,7 +2583,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/hyperframe-6.0.1-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/hypothesis-6.112.1-pyha770c72_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/icu-75.1-he0c23c2_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/idna-3.8-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/idna-3.10-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/imagesize-1.4.1-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/win-64/imath-3.1.12-hbb528cf_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-8.5.0-pyha770c72_0.conda @@ -3583,10 +2629,10 @@ environments: - conda: https://conda.anaconda.org/conda-forge/win-64/libabseil-20240116.2-cxx17_he0c23c2_1.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libaec-1.1.3-h63175ca_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libarchive-3.7.4-haf234dc_0.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/libarrow-17.0.0-h29daf90_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/libarrow-acero-17.0.0-he0c23c2_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/libarrow-dataset-17.0.0-he0c23c2_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/libarrow-substrait-17.0.0-h1f0e801_13_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/libarrow-17.0.0-he3462ed_14_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/libarrow-acero-17.0.0-he0c23c2_14_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/libarrow-dataset-17.0.0-he0c23c2_14_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/libarrow-substrait-17.0.0-h1f0e801_14_cpu.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libblas-3.9.0-23_win64_mkl.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libbrotlicommon-1.1.0-h2466b09_2.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libbrotlidec-1.1.0-h2466b09_2.conda @@ -3615,8 +2661,8 @@ environments: - conda: https://conda.anaconda.org/conda-forge/win-64/libgdal-tiledb-3.9.2-hb8b5d01_2.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libgdal-xls-3.9.2-hd0e23a6_2.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libglib-2.80.3-h7025463_2.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/libgoogle-cloud-2.28.0-h5e7cea3_0.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/libgoogle-cloud-storage-2.28.0-he5eb982_0.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/libgoogle-cloud-2.29.0-h5e7cea3_0.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/libgoogle-cloud-storage-2.29.0-he5eb982_0.conda - conda: 
https://conda.anaconda.org/conda-forge/win-64/libgrpc-1.62.2-h5273850_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libhwloc-2.11.1-default_h8125262_1000.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libiconv-1.17-hcfcfb64_2.conda @@ -3627,8 +2673,8 @@ environments: - conda: https://conda.anaconda.org/conda-forge/win-64/libnetcdf-4.9.2-nompi_h92078aa_114.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libogg-1.3.5-h2466b09_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libopus-1.3.1-h8ffe710_1.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/win-64/libparquet-17.0.0-ha915800_13_cpu.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/libpng-1.6.43-h19919ed_0.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/libparquet-17.0.0-ha915800_14_cpu.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/libpng-1.6.44-h3ca93ac_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libpq-16.4-hab9416b_1.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libprotobuf-4.25.3-h503648d_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/libraw-0.21.1-h5557f11_2.conda @@ -3715,7 +2761,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/win-64/pixman-0.43.4-h63175ca_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pkginfo-1.10.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pkgutil-resolve-name-1.3.10-pyhd8ed1ab_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.3.2-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.3.3-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pluggy-1.5.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pooch-1.8.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/poppler-24.08.0-h9415970_1.conda @@ -3751,10 +2797,12 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-benchmark-4.0.0-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-cases-3.8.5-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-cov-5.0.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-dotenv-0.5.2-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.6.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/python-3.11.0-hcf16a7b_0_cpython.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/win-64/python-3.11.10-hce54a09_0_cpython.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-build-1.2.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.9.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/python-dotenv-1.0.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-fastjsonschema-2.20.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-gmsh-4.12.2-h57928b3_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/python-graphviz-0.20.3-pyh717bed2_0.conda @@ -3783,7 +2831,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/rich-13.8.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/rioxarray-0.17.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/rpds-py-0.20.0-py311h533ab2d_1.conda - - 
conda: https://conda.anaconda.org/conda-forge/win-64/ruff-0.6.4-py311heeab51b_0.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/ruff-0.6.5-py311heeab51b_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/scikit-learn-1.5.2-py311hdcb8d17_1.conda - conda: https://conda.anaconda.org/conda-forge/win-64/scipy-1.14.1-py311hd4686c6_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/scooby-0.10.0-pyhd8ed1ab_0.conda @@ -3816,7 +2864,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/tblib-3.0.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/terminado-0.18.1-pyh5737063_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/threadpoolctl-3.5.0-pyhc1e730c_0.conda - - conda: https://conda.anaconda.org/conda-forge/win-64/tiledb-2.26.0-h98a567f_0.conda + - conda: https://conda.anaconda.org/conda-forge/win-64/tiledb-2.26.0-hefd1f8f_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/tinycss2-1.3.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/tk-8.6.13-h5226925_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/toml-0.10.2-pyhd8ed1ab_0.tar.bz2 @@ -3836,7 +2884,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/win-64/ucrt-10.0.22621.0-h57928b3_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/uri-template-1.3.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/uriparser-0.9.8-h5a68840_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.2-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.3-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/utfcpp-4.0.5-h57928b3_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/vc-14.3-h8a93ad2_21.conda - conda: https://conda.anaconda.org/conda-forge/win-64/vc14_runtime-14.40.33810-ha82c5b3_21.conda @@ -3880,7 +2928,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/zarr-2.18.3-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/zeromq-4.3.5-he1f189c_5.conda - conda: https://conda.anaconda.org/conda-forge/noarch/zict-3.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/zipp-3.20.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/zipp-3.20.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/win-64/zlib-1.3.1-h2466b09_1.conda - conda: https://conda.anaconda.org/conda-forge/win-64/zstandard-0.23.0-py311h53056dc_1.conda - conda: https://conda.anaconda.org/conda-forge/win-64/zstd-1.5.6-h0ea2cb4_0.conda @@ -3936,44 +2984,6 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h8827d51_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/wheel-0.44.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/xz-5.2.6-h166bdaf_0.tar.bz2 - osx-64: - - conda: https://conda.anaconda.org/conda-forge/noarch/annotated-types-0.7.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/ca-certificates-2024.8.30-h8857fd0_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/click-8.1.7-unix_pyh707e725_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libexpat-2.6.3-hac325c4_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libffi-3.4.2-h0d85af4_5.tar.bz2 - - conda: 
https://conda.anaconda.org/conda-forge/osx-64/libsqlite-3.46.1-h4b8f8c9_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libzlib-1.3.1-h87427d6_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/markdown-it-py-3.0.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/mdurl-0.1.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/more-itertools-10.5.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/ncurses-6.5-hf036a51_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/openssl-3.3.2-hd23fc13_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/ordered_enum-0.0.9-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pip-24.2-pyh8b19718_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pixi-diff-to-markdown-0.2.3-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/py-rattler-0.5.0-py312he8fc997_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pydantic-2.9.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/pydantic-core-2.23.3-py312h669792a_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pydantic-settings-2.5.2-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pygments-2.18.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/python-3.12.5-h37a9e06_0_cpython.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/python-dotenv-1.0.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/python_abi-3.12-5_cp312.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/readline-8.2-h9e318b2_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/rich-13.8.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/setuptools-73.0.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/shellingham-1.5.4-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/tk-8.6.13-h1abcd95_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/typer-0.12.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/typer-slim-0.12.5-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/typer-slim-standard-0.12.5-hd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/typing-extensions-4.12.2-hd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/typing_extensions-4.12.2-pyha770c72_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h8827d51_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/wheel-0.44.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xz-5.2.6-h775f41a_0.tar.bz2 osx-arm64: - conda: https://conda.anaconda.org/conda-forge/noarch/annotated-types-0.7.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/bzip2-1.0.8-h99b78c6_7.conda @@ -4081,22 +3091,6 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h8827d51_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/wheel-0.44.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/xz-5.2.6-h166bdaf_0.tar.bz2 - osx-64: - - conda: https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/ca-certificates-2024.8.30-h8857fd0_0.conda - - conda: 
https://conda.anaconda.org/conda-forge/osx-64/libffi-3.4.2-h0d85af4_5.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libsqlite-3.46.1-h4b8f8c9_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libzlib-1.3.1-h87427d6_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/ncurses-6.5-hf036a51_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/openssl-3.3.2-hd23fc13_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pip-24.2-pyh8b19718_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/python-3.10.12-had23ca6_0_cpython.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/readline-8.2-h9e318b2_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/setuptools-73.0.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/tk-8.6.13-h1abcd95_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h8827d51_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/wheel-0.44.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xz-5.2.6-h775f41a_0.tar.bz2 osx-arm64: - conda: https://conda.anaconda.org/conda-forge/osx-arm64/bzip2-1.0.8-h99b78c6_7.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/ca-certificates-2024.8.30-hf0a4a13_0.conda @@ -4159,22 +3153,6 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h8827d51_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/wheel-0.44.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/xz-5.2.6-h166bdaf_0.tar.bz2 - osx-64: - - conda: https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/ca-certificates-2024.8.30-h8857fd0_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libffi-3.4.2-h0d85af4_5.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libsqlite-3.46.1-h4b8f8c9_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libzlib-1.3.1-h87427d6_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/ncurses-6.5-hf036a51_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/openssl-3.3.2-hd23fc13_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pip-24.2-pyh8b19718_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/python-3.11.0-he7542f4_1_cpython.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/readline-8.2-h9e318b2_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/setuptools-73.0.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/tk-8.6.13-h1abcd95_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h8827d51_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/wheel-0.44.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xz-5.2.6-h775f41a_0.tar.bz2 osx-arm64: - conda: https://conda.anaconda.org/conda-forge/osx-arm64/bzip2-1.0.8-h99b78c6_7.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/ca-certificates-2024.8.30-hf0a4a13_0.conda @@ -4238,23 +3216,6 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h8827d51_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/wheel-0.44.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/xz-5.2.6-h166bdaf_0.tar.bz2 - osx-64: - - conda: 
https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/ca-certificates-2024.8.30-h8857fd0_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libexpat-2.6.3-hac325c4_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libffi-3.4.2-h0d85af4_5.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/osx-64/libsqlite-3.46.1-h4b8f8c9_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/libzlib-1.3.1-h87427d6_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/ncurses-6.5-hf036a51_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/openssl-3.3.2-hd23fc13_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/pip-24.2-pyh8b19718_1.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/python-3.12.0-h30d4d87_0_cpython.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/readline-8.2-h9e318b2_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/setuptools-73.0.1-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/tk-8.6.13-h1abcd95_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2024a-h8827d51_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/wheel-0.44.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/osx-64/xz-5.2.6-h775f41a_0.tar.bz2 osx-arm64: - conda: https://conda.anaconda.org/conda-forge/osx-arm64/bzip2-1.0.8-h99b78c6_7.conda - conda: https://conda.anaconda.org/conda-forge/osx-arm64/ca-certificates-2024.8.30-hf0a4a13_0.conda @@ -4404,31 +3365,6 @@ packages: - pkg:pypi/aiohappyeyeballs?source=hash-mapping size: 17032 timestamp: 1724167966661 -- kind: conda - name: aiohttp - version: 3.10.5 - build: py311h3336109_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aiohttp-3.10.5-py311h3336109_1.conda - sha256: 500cdf57db7380996f92d47499a6e37f641cc46b6d13c8d0da4fda4270c10aa2 - md5: ff15a465b64cca09e28d472778945ebe - depends: - - __osx >=10.13 - - aiohappyeyeballs >=2.3.0 - - aiosignal >=1.1.2 - - attrs >=17.3.0 - - frozenlist >=1.1.1 - - multidict >=4.5,<7.0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - yarl >=1.0,<1.9.5 - license: MIT AND Apache-2.0 - license_family: Apache - purls: - - pkg:pypi/aiohttp?source=hash-mapping - size: 797218 - timestamp: 1726062958620 - kind: conda name: aiohttp version: 3.10.5 @@ -4665,22 +3601,6 @@ packages: purls: [] size: 1958151 timestamp: 1718551737234 -- kind: conda - name: aom - version: 3.9.1 - build: hf036a51_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aom-3.9.1-hf036a51_0.conda - sha256: 3032f2f55d6eceb10d53217c2a7f43e1eac83603d91e21ce502e8179e63a75f5 - md5: 3f17bc32cb7fcb2b4bf3d8d37f656eb8 - depends: - - __osx >=10.13 - - libcxx >=16 - license: BSD-2-Clause - license_family: BSD - purls: [] - size: 2749186 - timestamp: 1718551450314 - kind: conda name: appnope version: 0.1.4 @@ -4719,26 +3639,6 @@ packages: - pkg:pypi/argon2-cffi?source=hash-mapping size: 18602 timestamp: 1692818472638 -- kind: conda - name: argon2-cffi-bindings - version: 21.2.0 - build: py311h3336109_5 - build_number: 5 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/argon2-cffi-bindings-21.2.0-py311h3336109_5.conda - sha256: fa5eb633b320e10fc2138f3d842d8a8ca72815f106acbab49a68ec9783e4d70d - md5: 29b46bd410067f668c4cef7fdc78fe25 - depends: - - __osx >=10.13 - - cffi >=1.0.1 - - python >=3.11,<3.12.0a0 - - python_abi 
3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - pkg:pypi/argon2-cffi-bindings?source=hash-mapping - size: 32275 - timestamp: 1725356815696 - kind: conda name: argon2-cffi-bindings version: 21.2.0 @@ -4896,27 +3796,6 @@ packages: purls: [] size: 355900 timestamp: 1713896169874 -- kind: conda - name: atk-1.0 - version: 2.38.0 - build: h4bec284_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/atk-1.0-2.38.0-h4bec284_2.conda - sha256: a5972a943764e46478c966b26be61de70dcd7d0cfda4bd0b0c46916ae32e0492 - md5: d9684247c943d492d9aac8687bc5db77 - depends: - - __osx >=10.9 - - libcxx >=16 - - libglib >=2.80.0,<3.0a0 - - libintl >=0.22.5,<1.0a0 - constrains: - - atk-1.0 2.38.0 - license: LGPL-2.0-or-later - license_family: LGPL - purls: [] - size: 349989 - timestamp: 1713896423623 - kind: conda name: atk-1.0 version: 2.38.0 @@ -4957,91 +3836,67 @@ packages: timestamp: 1722977241383 - kind: conda name: aws-c-auth - version: 0.7.29 - build: h03582ad_1 - build_number: 1 - subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/aws-c-auth-0.7.29-h03582ad_1.conda - sha256: 97379dd69b78e5b07a4a776bccb5835aa71f170912385e71ddba5cc93d9085dc - md5: 6d23dd1c1742112d5fe9f529da7afea9 - depends: - - __glibc >=2.17,<3.0.a0 - - aws-c-cal >=0.7.4,<0.7.5.0a0 - - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 - - aws-c-io >=0.14.18,<0.14.19.0a0 - - aws-c-sdkutils >=0.1.19,<0.1.20.0a0 - - libgcc >=13 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 107282 - timestamp: 1725868193209 -- kind: conda - name: aws-c-auth - version: 0.7.29 - build: h2dfa2de_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-c-auth-0.7.29-h2dfa2de_1.conda - sha256: a75f56a0d258a837f555c63a5d621e10497e6026c667b919a218038b9ad18647 - md5: e297a166392146d9e3fe3118550b9ff3 + version: 0.7.30 + build: h338687b_0 + subdir: osx-arm64 + url: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-auth-0.7.30-h338687b_0.conda + sha256: 2d650815e881ac24237a98ab33d6f6d7431ad52edafc18c748399e1d85ca8641 + md5: 385fc8994159e570d0d45223f6b48aa9 depends: - - __osx >=10.13 + - __osx >=11.0 - aws-c-cal >=0.7.4,<0.7.5.0a0 - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 + - aws-c-http >=0.8.9,<0.8.10.0a0 - aws-c-io >=0.14.18,<0.14.19.0a0 - aws-c-sdkutils >=0.1.19,<0.1.20.0a0 license: Apache-2.0 license_family: Apache purls: [] - size: 94284 - timestamp: 1725868368256 + size: 92830 + timestamp: 1726208174150 - kind: conda name: aws-c-auth - version: 0.7.29 - build: hd3c7522_1 - build_number: 1 - subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-auth-0.7.29-hd3c7522_1.conda - sha256: a75545e58f83ce27bffc9d1fdb9218d34aa86ec9be364de207de56ff57c552ff - md5: c48c6fa5c0e9894235f8a883d81dea05 + version: 0.7.30 + build: h4ab18a5_0 + subdir: win-64 + url: https://conda.anaconda.org/conda-forge/win-64/aws-c-auth-0.7.30-h4ab18a5_0.conda + sha256: 7f80bf6b3a276cbef67c8ad2f7d1868fd20c789ca02bffb5aa1eb07e7c5e3351 + md5: d48ee1d9eec32116fe1a0a5c92a4cdf7 depends: - - __osx >=11.0 - aws-c-cal >=0.7.4,<0.7.5.0a0 - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 + - aws-c-http >=0.8.9,<0.8.10.0a0 - aws-c-io >=0.14.18,<0.14.19.0a0 - aws-c-sdkutils >=0.1.19,<0.1.20.0a0 + - ucrt >=10.0.20348.0 + - vc >=14.2,<15 + - vc14_runtime >=14.29.30139 license: Apache-2.0 license_family: Apache purls: [] - size: 92432 - timestamp: 1725868225655 + size: 102448 + timestamp: 
1726208446187 - kind: conda name: aws-c-auth - version: 0.7.29 - build: hf1f9119_1 - build_number: 1 - subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/aws-c-auth-0.7.29-hf1f9119_1.conda - sha256: 617b3aa9cea4d1107a0809e0bc85ed60a7c6095a4992af9c08e97492cc65fa56 - md5: 8f3aa5632a78884b7f788e9d0fee03f3 + version: 0.7.30 + build: hec5e740_0 + subdir: linux-64 + url: https://conda.anaconda.org/conda-forge/linux-64/aws-c-auth-0.7.30-hec5e740_0.conda + sha256: 5735b8ae76580e81662137c4dafca62cd9da8083b5b7bebe8ea7e9a806f1053f + md5: bc1b9f70ea7fa533aefa6a8b6fbe8da7 depends: + - __glibc >=2.17,<3.0.a0 - aws-c-cal >=0.7.4,<0.7.5.0a0 - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 + - aws-c-http >=0.8.9,<0.8.10.0a0 - aws-c-io >=0.14.18,<0.14.19.0a0 - aws-c-sdkutils >=0.1.19,<0.1.20.0a0 - - ucrt >=10.0.20348.0 - - vc >=14.2,<15 - - vc14_runtime >=14.29.30139 + - libgcc >=13 license: Apache-2.0 license_family: Apache purls: [] - size: 102681 - timestamp: 1725868656049 + size: 107190 + timestamp: 1726208110918 - kind: conda name: aws-c-cal version: 0.7.4 @@ -5060,24 +3915,6 @@ packages: purls: [] size: 39881 timestamp: 1725829996108 -- kind: conda - name: aws-c-cal - version: 0.7.4 - build: h8128ea2_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-c-cal-0.7.4-h8128ea2_1.conda - sha256: 6ffa143181fa40bbbe1b5dfad149b68e4c3fcb6e5d38a4f5a4490c8c3b4402df - md5: 195ef3e2d7dadb02a4b1f874a1e5e1e6 - depends: - - __osx >=10.13 - - aws-c-common >=0.9.28,<0.9.29.0a0 - - openssl >=3.3.1,<4.0a0 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 39204 - timestamp: 1725829973 - kind: conda name: aws-c-cal version: 0.7.4 @@ -5117,21 +3954,6 @@ packages: purls: [] size: 47532 timestamp: 1725829965837 -- kind: conda - name: aws-c-common - version: 0.9.28 - build: h00291cd_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-c-common-0.9.28-h00291cd_0.conda - sha256: 9af8c4514526829de390bc5f5c103487dff1cd025463ea90b7f8dbb8f1d0ff16 - md5: ffe8898e6d97ecb791df1350ce273508 - depends: - - __osx >=10.13 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 225877 - timestamp: 1725670122224 - kind: conda name: aws-c-common version: 0.9.28 @@ -5215,23 +4037,6 @@ packages: purls: [] size: 19116 timestamp: 1725829968483 -- kind: conda - name: aws-c-compression - version: 0.2.19 - build: h8128ea2_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-c-compression-0.2.19-h8128ea2_1.conda - sha256: f60f8bec5eddd1974367aac03a646996374d8f290bb4463dfbf1e7620462e7be - md5: 43be0637437461d48ff524c04459ee46 - depends: - - __osx >=10.13 - - aws-c-common >=0.9.28,<0.9.29.0a0 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 17936 - timestamp: 1725829971987 - kind: conda name: aws-c-compression version: 0.2.19 @@ -5313,55 +4118,14 @@ packages: purls: [] size: 54527 timestamp: 1725857386993 -- kind: conda - name: aws-c-event-stream - version: 0.4.3 - build: hf6f7cdd_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-c-event-stream-0.4.3-hf6f7cdd_1.conda - sha256: 3a86d81ece111acc080cab42df6afc5c272c4ee7495d8cda22c90fc54bb0f27e - md5: 6f1d1e8b410d31a11db29d802f21cb64 - depends: - - __osx >=10.13 - - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-io >=0.14.18,<0.14.19.0a0 - - aws-checksums >=0.1.18,<0.1.19.0a0 - - libcxx >=17 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 46628 - timestamp: 
1725856844781 -- kind: conda - name: aws-c-http - version: 0.8.8 - build: h2f86973_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-c-http-0.8.8-h2f86973_2.conda - sha256: ed4350ada258ea8127a1d6af681e109956c3258aeb3e7e81f9e3d03881e91c5e - md5: a4fa477bc4b23b11f5a8f6b0e3a9ca97 - depends: - - __osx >=10.13 - - aws-c-cal >=0.7.4,<0.7.5.0a0 - - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-compression >=0.2.19,<0.2.20.0a0 - - aws-c-io >=0.14.18,<0.14.19.0a0 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 164110 - timestamp: 1725856460863 - kind: conda name: aws-c-http - version: 0.8.8 - build: h5e77a74_2 - build_number: 2 + version: 0.8.9 + build: h5e77a74_0 subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/aws-c-http-0.8.8-h5e77a74_2.conda - sha256: cef335beb17cd299024fae300653ae491c866f7c93287bdf44a9e9b4762b1a54 - md5: b75afaaf2a4ea0e1137ecb35262b8ed4 + url: https://conda.anaconda.org/conda-forge/linux-64/aws-c-http-0.8.9-h5e77a74_0.conda + sha256: 9eac64d76ba4a799689392eb03bbbc93fd169ccbcb7719bcccbe2e9100ff2075 + md5: d7714013c40363f45850a25113e2cb05 depends: - __glibc >=2.17,<3.0.a0 - aws-c-cal >=0.7.4,<0.7.5.0a0 @@ -5372,17 +4136,16 @@ packages: license: Apache-2.0 license_family: Apache purls: [] - size: 197416 - timestamp: 1725856481663 + size: 197695 + timestamp: 1726016818491 - kind: conda name: aws-c-http - version: 0.8.8 - build: heca9ddf_2 - build_number: 2 + version: 0.8.9 + build: heca9ddf_0 subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/aws-c-http-0.8.8-heca9ddf_2.conda - sha256: 9f151008584d7eb58b1184ac83015a5f8bc8e82cc4fa1e69d660e6260f79f4bc - md5: fcfd389b611656e45860e8e91ac70088 + url: https://conda.anaconda.org/conda-forge/win-64/aws-c-http-0.8.9-heca9ddf_0.conda + sha256: b347d4d86850eb7f5089bb99cff00071997697d451835c4b976f2a245a93986c + md5: c66174f469df56f4e2d6dbdcac7033e0 depends: - aws-c-cal >=0.7.4,<0.7.5.0a0 - aws-c-common >=0.9.28,<0.9.29.0a0 @@ -5394,17 +4157,16 @@ packages: license: Apache-2.0 license_family: Apache purls: [] - size: 182376 - timestamp: 1725857088696 + size: 182415 + timestamp: 1726017296936 - kind: conda name: aws-c-http - version: 0.8.8 - build: hf5a2c8c_2 - build_number: 2 + version: 0.8.9 + build: hf5a2c8c_0 subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-http-0.8.8-hf5a2c8c_2.conda - sha256: 0af8e69c5b36210971298fe8de512974596f26317c244487b9906beeef12ef61 - md5: 12d315734dafda8e8af2f6d73c631c8b + url: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-http-0.8.9-hf5a2c8c_0.conda + sha256: c2781a06bdd8a695296429c3bbe8dc528a93f426e45ebeb83f3a41de43213dd6 + md5: 59efa3b4dc632c4fef6911be61ed1779 depends: - __osx >=11.0 - aws-c-cal >=0.7.4,<0.7.5.0a0 @@ -5414,8 +4176,8 @@ packages: license: Apache-2.0 license_family: Apache purls: [] - size: 152422 - timestamp: 1725856488039 + size: 152487 + timestamp: 1726017067618 - kind: conda name: aws-c-io version: 0.14.18 @@ -5474,56 +4236,36 @@ packages: purls: [] size: 138080 timestamp: 1725843155224 -- kind: conda - name: aws-c-io - version: 0.14.18 - build: hf9a0f1c_9 - build_number: 9 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-c-io-0.14.18-hf9a0f1c_9.conda - sha256: a089493c67ec9e000061920f5a2ef233f59911d474bc77dcec0f4fb9738750ab - md5: c67eee7b35a3fa7a186d65a604a4a01f - depends: - - __osx >=10.13 - - aws-c-cal >=0.7.4,<0.7.5.0a0 - - aws-c-common >=0.9.28,<0.9.29.0a0 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 
138424 - timestamp: 1725843066014 - kind: conda name: aws-c-mqtt - version: 0.10.4 - build: h01636a3_19 - build_number: 19 + version: 0.10.5 + build: h0009854_0 subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/aws-c-mqtt-0.10.4-h01636a3_19.conda - sha256: f188f9127e12b2f90d68c5887f9742838528d8ea64c11e25c90e135cc1465326 - md5: 8ec16206ccaaf74ee5830ffeba436ebc + url: https://conda.anaconda.org/conda-forge/linux-64/aws-c-mqtt-0.10.5-h0009854_0.conda + sha256: 2345d19078aa7c40aaa6b95a7d0e019ddbf3518ff47dc09fadd69f4e1a7073a5 + md5: d393d0a6c9b993771fbc67a998fccf6c depends: - __glibc >=2.17,<3.0.a0 - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 + - aws-c-http >=0.8.9,<0.8.10.0a0 - aws-c-io >=0.14.18,<0.14.19.0a0 - libgcc >=13 license: Apache-2.0 license_family: Apache purls: [] - size: 163865 - timestamp: 1725892070997 + size: 194112 + timestamp: 1726205708559 - kind: conda name: aws-c-mqtt - version: 0.10.4 - build: h4d6445f_19 - build_number: 19 + version: 0.10.5 + build: h8fec231_0 subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/aws-c-mqtt-0.10.4-h4d6445f_19.conda - sha256: 0dc65ecddda8d26390b2d1cb5db074739c74d47c94f5e0a3927f8431bd0912b5 - md5: edf26447a744762aa7ac8fe678e046ca + url: https://conda.anaconda.org/conda-forge/win-64/aws-c-mqtt-0.10.5-h8fec231_0.conda + sha256: 2f851d2895d77b41c7ef0f8ce956fb14e85f307e1dfaa227d49de67af283e82a + md5: ac525109aeb0571fe434e4e4d6ecbfef depends: - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 + - aws-c-http >=0.8.9,<0.8.10.0a0 - aws-c-io >=0.14.18,<0.14.19.0a0 - ucrt >=10.0.20348.0 - vc >=14.2,<15 @@ -5531,60 +4273,62 @@ packages: license: Apache-2.0 license_family: Apache purls: [] - size: 157732 - timestamp: 1725892612990 + size: 185093 + timestamp: 1726206008091 - kind: conda name: aws-c-mqtt - version: 0.10.4 - build: hb9beb3e_19 - build_number: 19 + version: 0.10.5 + build: h9658b26_0 subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-mqtt-0.10.4-hb9beb3e_19.conda - sha256: f10ffa7d46f89cedb37c9d2035fc411b194c725e71091c84782c998af844aa46 - md5: 588c40cf7234526f87e28a990f16367e + url: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-mqtt-0.10.5-h9658b26_0.conda + sha256: a43c24179ddeb980a9cd920354f866d10d1f6a70c00ca7fe18840cb79678ecb1 + md5: 3736531fdfe90d4513b633d907aab907 depends: - __osx >=11.0 - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 + - aws-c-http >=0.8.9,<0.8.10.0a0 - aws-c-io >=0.14.18,<0.14.19.0a0 license: Apache-2.0 license_family: Apache purls: [] - size: 117800 - timestamp: 1725891739734 + size: 133346 + timestamp: 1726205300093 - kind: conda - name: aws-c-mqtt - version: 0.10.4 - build: he4b61a0_19 - build_number: 19 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-c-mqtt-0.10.4-he4b61a0_19.conda - sha256: 5d38c7493b28100b954ae1f7420e0876ad0209b99a84600de6d691a220f03e6e - md5: 3cacaf9254c92818cd32de10b3a7bafe - depends: - - __osx >=10.13 + name: aws-c-s3 + version: 0.6.5 + build: h663ac5c_4 + build_number: 4 + subdir: osx-arm64 + url: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-s3-0.6.5-h663ac5c_4.conda + sha256: 693463e8dbc86441f9a612c0556f200089aaa72e6e83479205072444d7e1fe34 + md5: 1714f0dbabb1431a351c8babe6b75bd9 + depends: + - __osx >=11.0 + - aws-c-auth >=0.7.30,<0.7.31.0a0 + - aws-c-cal >=0.7.4,<0.7.5.0a0 - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 + - aws-c-http >=0.8.9,<0.8.10.0a0 - aws-c-io >=0.14.18,<0.14.19.0a0 + - 
aws-checksums >=0.1.18,<0.1.19.0a0 license: Apache-2.0 license_family: Apache purls: [] - size: 138915 - timestamp: 1725892131190 + size: 96372 + timestamp: 1726237156052 - kind: conda name: aws-c-s3 version: 0.6.5 - build: h184cd82_2 - build_number: 2 + build: h6945fc3_4 + build_number: 4 subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/aws-c-s3-0.6.5-h184cd82_2.conda - sha256: 21ffdc5473041b92a5e581a775988cb59d5b1cbda707b63dc6fc28cefd3b8f25 - md5: df345266c40ab1a2ac3b79be8aa421a2 + url: https://conda.anaconda.org/conda-forge/win-64/aws-c-s3-0.6.5-h6945fc3_4.conda + sha256: 5b260a765f82de496d516f71337de126fd2ca1b55f0415f397af64a959cfad59 + md5: bfd3ce054536b89fdc0aa4e448cde7fe depends: - - aws-c-auth >=0.7.29,<0.7.30.0a0 + - aws-c-auth >=0.7.30,<0.7.31.0a0 - aws-c-cal >=0.7.4,<0.7.5.0a0 - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 + - aws-c-http >=0.8.9,<0.8.10.0a0 - aws-c-io >=0.14.18,<0.14.19.0a0 - aws-checksums >=0.1.18,<0.1.19.0a0 - ucrt >=10.0.20348.0 @@ -5593,23 +4337,23 @@ packages: license: Apache-2.0 license_family: Apache purls: [] - size: 108320 - timestamp: 1725882801691 + size: 108386 + timestamp: 1726237498088 - kind: conda name: aws-c-s3 version: 0.6.5 - build: h191b246_2 - build_number: 2 + build: hbaf354b_4 + build_number: 4 subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/aws-c-s3-0.6.5-h191b246_2.conda - sha256: f43e6a308ae388e4a3968690ae8789e5cfb4d51c96d36a00c832a9067685b1d3 - md5: f8f40355dac7a75313d9c10de91330e7 + url: https://conda.anaconda.org/conda-forge/linux-64/aws-c-s3-0.6.5-hbaf354b_4.conda + sha256: 03d8bd937ed49c857255483fa5d252064121adfbccab46aa0b3716de75022747 + md5: 2cefeb144de7712995d1b52cc6a3864c depends: - __glibc >=2.17,<3.0.a0 - - aws-c-auth >=0.7.29,<0.7.30.0a0 + - aws-c-auth >=0.7.30,<0.7.31.0a0 - aws-c-cal >=0.7.4,<0.7.5.0a0 - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 + - aws-c-http >=0.8.9,<0.8.10.0a0 - aws-c-io >=0.14.18,<0.14.19.0a0 - aws-checksums >=0.1.18,<0.1.19.0a0 - libgcc >=13 @@ -5617,52 +4361,8 @@ packages: license: Apache-2.0 license_family: Apache purls: [] - size: 112780 - timestamp: 1725882305631 -- kind: conda - name: aws-c-s3 - version: 0.6.5 - build: h439c227_2 - build_number: 2 - subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/aws-c-s3-0.6.5-h439c227_2.conda - sha256: 2b7d29e53d36745761977e9ff50e6733eb2d6a4b4fb8e5dc7af30de662ea1857 - md5: 981599eb3154f388e08278d8fba67bf2 - depends: - - __osx >=11.0 - - aws-c-auth >=0.7.29,<0.7.30.0a0 - - aws-c-cal >=0.7.4,<0.7.5.0a0 - - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 - - aws-c-io >=0.14.18,<0.14.19.0a0 - - aws-checksums >=0.1.18,<0.1.19.0a0 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 96196 - timestamp: 1725882392338 -- kind: conda - name: aws-c-s3 - version: 0.6.5 - build: h915d0f8_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-c-s3-0.6.5-h915d0f8_2.conda - sha256: b12778ac3bfa5574420472faee2944952c07067f1dc8cca832013edea1982b48 - md5: eb182c006b6eb87d523d51295c2e8050 - depends: - - __osx >=10.13 - - aws-c-auth >=0.7.29,<0.7.30.0a0 - - aws-c-cal >=0.7.4,<0.7.5.0a0 - - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 - - aws-c-io >=0.14.18,<0.14.19.0a0 - - aws-checksums >=0.1.18,<0.1.19.0a0 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 97417 - timestamp: 1725882369510 + size: 112846 + timestamp: 1726237070601 - kind: conda name: 
aws-c-sdkutils version: 0.1.19 @@ -5688,33 +4388,16 @@ packages: subdir: linux-64 url: https://conda.anaconda.org/conda-forge/linux-64/aws-c-sdkutils-0.1.19-h756ea98_3.conda sha256: 4e6f79f3fee5ebb4fb12b6258d91315ed0f7a2ac16c75611cffdbaa0d54badb2 - md5: bfe6623096906d2502c78ccdbfc3bc7a - depends: - - __glibc >=2.17,<3.0.a0 - - aws-c-common >=0.9.28,<0.9.29.0a0 - - libgcc >=13 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 55799 - timestamp: 1725836731034 -- kind: conda - name: aws-c-sdkutils - version: 0.1.19 - build: h8128ea2_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-c-sdkutils-0.1.19-h8128ea2_3.conda - sha256: 50912641279d00a6ce12b1d72e74ea5d30078e91a0557a48a9e9fe285c2f6b2c - md5: 8d93b3603363214303737f74b6efb5da + md5: bfe6623096906d2502c78ccdbfc3bc7a depends: - - __osx >=10.13 + - __glibc >=2.17,<3.0.a0 - aws-c-common >=0.9.28,<0.9.29.0a0 + - libgcc >=13 license: Apache-2.0 license_family: Apache purls: [] - size: 50686 - timestamp: 1725836776385 + size: 55799 + timestamp: 1725836731034 - kind: conda name: aws-c-sdkutils version: 0.1.19 @@ -5769,23 +4452,6 @@ packages: purls: [] size: 49962 timestamp: 1725836852149 -- kind: conda - name: aws-checksums - version: 0.1.18 - build: h8128ea2_11 - build_number: 11 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-checksums-0.1.18-h8128ea2_11.conda - sha256: 37965af8d420d114a5d603d149b7e4ce353b119dffe90ec67c53895cb0e5c402 - md5: 45959482adbad4397bfedcdf262bbb32 - depends: - - __osx >=10.13 - - aws-c-common >=0.9.28,<0.9.29.0a0 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 48884 - timestamp: 1725836961245 - kind: conda name: aws-checksums version: 0.1.18 @@ -5808,47 +4474,48 @@ packages: - kind: conda name: aws-crt-cpp version: 0.28.2 - build: h27d4fa7_4 - build_number: 4 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-crt-cpp-0.28.2-h27d4fa7_4.conda - sha256: ccdf92124ea1b0909164226b22932ad39ac80838d537ec960ed26f50f0680c7e - md5: 760a535c189a995ee99474027a87d1bb + build: h2ae5ca2_6 + build_number: 6 + subdir: win-64 + url: https://conda.anaconda.org/conda-forge/win-64/aws-crt-cpp-0.28.2-h2ae5ca2_6.conda + sha256: 8246682d553c794efcce331609f99515fd7c739b852a3c73cb7b482304f8daba + md5: 61f5d43cfbabdbace9203a6fd0bd2267 depends: - - __osx >=10.13 - - aws-c-auth >=0.7.29,<0.7.30.0a0 + - aws-c-auth >=0.7.30,<0.7.31.0a0 - aws-c-cal >=0.7.4,<0.7.5.0a0 - aws-c-common >=0.9.28,<0.9.29.0a0 - aws-c-event-stream >=0.4.3,<0.4.4.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 + - aws-c-http >=0.8.9,<0.8.10.0a0 - aws-c-io >=0.14.18,<0.14.19.0a0 - - aws-c-mqtt >=0.10.4,<0.10.5.0a0 + - aws-c-mqtt >=0.10.5,<0.10.6.0a0 - aws-c-s3 >=0.6.5,<0.6.6.0a0 - aws-c-sdkutils >=0.1.19,<0.1.20.0a0 - - libcxx >=17 + - ucrt >=10.0.20348.0 + - vc >=14.2,<15 + - vc14_runtime >=14.29.30139 license: Apache-2.0 license_family: Apache purls: [] - size: 294389 - timestamp: 1725905017625 + size: 254015 + timestamp: 1726483967569 - kind: conda name: aws-crt-cpp version: 0.28.2 - build: h29c84ef_4 - build_number: 4 + build: h6c0439f_6 + build_number: 6 subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/aws-crt-cpp-0.28.2-h29c84ef_4.conda - sha256: 1404b6fd34e6e0e6587b771d4d63800123e0712792982bc2bbb0d78eeca26a94 - md5: 81674a3f6a59966a9ffaaaf063c8c331 + url: https://conda.anaconda.org/conda-forge/linux-64/aws-crt-cpp-0.28.2-h6c0439f_6.conda + sha256: 02b1494f6be3d8e5f40afea1ff2c24cb7f0b42d5d18761a99b43f0faa7c39d29 + md5: 
4e472c316d08af60faeb71f86d7563e1 depends: - __glibc >=2.17,<3.0.a0 - - aws-c-auth >=0.7.29,<0.7.30.0a0 + - aws-c-auth >=0.7.30,<0.7.31.0a0 - aws-c-cal >=0.7.4,<0.7.5.0a0 - aws-c-common >=0.9.28,<0.9.29.0a0 - aws-c-event-stream >=0.4.3,<0.4.4.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 + - aws-c-http >=0.8.9,<0.8.10.0a0 - aws-c-io >=0.14.18,<0.14.19.0a0 - - aws-c-mqtt >=0.10.4,<0.10.5.0a0 + - aws-c-mqtt >=0.10.5,<0.10.6.0a0 - aws-c-s3 >=0.6.5,<0.6.6.0a0 - aws-c-sdkutils >=0.1.19,<0.1.20.0a0 - libgcc >=13 @@ -5856,61 +4523,34 @@ packages: license: Apache-2.0 license_family: Apache purls: [] - size: 349192 - timestamp: 1725904799209 + size: 350331 + timestamp: 1726483498115 - kind: conda name: aws-crt-cpp version: 0.28.2 - build: h4756f83_4 - build_number: 4 + build: h8f7a527_6 + build_number: 6 subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/aws-crt-cpp-0.28.2-h4756f83_4.conda - sha256: 12ab40d95bf4f27ef65bc4b0c6071284b5e163a770551bb8aea4ee98a0297a9f - md5: e23e171abfdaee648353c7a11700b409 + url: https://conda.anaconda.org/conda-forge/osx-arm64/aws-crt-cpp-0.28.2-h8f7a527_6.conda + sha256: 47c3a71486f5889b46706e45158e1bc3c375aec9eca99b1edf8d719cdda5cf46 + md5: 119a410e9b4fe4e31f91a32d7cb4a764 depends: - __osx >=11.0 - - aws-c-auth >=0.7.29,<0.7.30.0a0 + - aws-c-auth >=0.7.30,<0.7.31.0a0 - aws-c-cal >=0.7.4,<0.7.5.0a0 - aws-c-common >=0.9.28,<0.9.29.0a0 - aws-c-event-stream >=0.4.3,<0.4.4.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 + - aws-c-http >=0.8.9,<0.8.10.0a0 - aws-c-io >=0.14.18,<0.14.19.0a0 - - aws-c-mqtt >=0.10.4,<0.10.5.0a0 + - aws-c-mqtt >=0.10.5,<0.10.6.0a0 - aws-c-s3 >=0.6.5,<0.6.6.0a0 - aws-c-sdkutils >=0.1.19,<0.1.20.0a0 - libcxx >=17 license: Apache-2.0 license_family: Apache purls: [] - size: 230467 - timestamp: 1725904893581 -- kind: conda - name: aws-crt-cpp - version: 0.28.2 - build: hcae1b89_4 - build_number: 4 - subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/aws-crt-cpp-0.28.2-hcae1b89_4.conda - sha256: 0e15511fb4fc3afa1ad5b08f75a508ea1a5ba85f68e0a7e621666104cda60673 - md5: 83ab71884fd2e42b68d0fae48fbcc2b0 - depends: - - aws-c-auth >=0.7.29,<0.7.30.0a0 - - aws-c-cal >=0.7.4,<0.7.5.0a0 - - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-event-stream >=0.4.3,<0.4.4.0a0 - - aws-c-http >=0.8.8,<0.8.9.0a0 - - aws-c-io >=0.14.18,<0.14.19.0a0 - - aws-c-mqtt >=0.10.4,<0.10.5.0a0 - - aws-c-s3 >=0.6.5,<0.6.6.0a0 - - aws-c-sdkutils >=0.1.19,<0.1.20.0a0 - - ucrt >=10.0.20348.0 - - vc >=14.2,<15 - - vc14_runtime >=14.29.30139 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 254847 - timestamp: 1725905205654 + size: 229739 + timestamp: 1726483567299 - kind: conda name: aws-sdk-cpp version: 1.11.379 @@ -5983,30 +4623,6 @@ packages: purls: [] size: 2765550 timestamp: 1725945456988 -- kind: conda - name: aws-sdk-cpp - version: 1.11.379 - build: h7a58a96_9 - build_number: 9 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/aws-sdk-cpp-1.11.379-h7a58a96_9.conda - sha256: 65ceb0bfddbeaf3f242ad737e1ed4dba77ba1ebc4ce74a02d7fc276aa2df544d - md5: 9d700e1fee39399bf96abf6e66cdd92d - depends: - - __osx >=10.13 - - aws-c-common >=0.9.28,<0.9.29.0a0 - - aws-c-event-stream >=0.4.3,<0.4.4.0a0 - - aws-checksums >=0.1.18,<0.1.19.0a0 - - aws-crt-cpp >=0.28.2,<0.28.3.0a0 - - libcurl >=8.9.1,<9.0a0 - - libcxx >=17 - - libzlib >=1.3.1,<2.0a0 - - openssl >=3.3.2,<4.0a0 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 2754435 - timestamp: 1725944826345 - kind: conda name: azure-core-cpp version: 1.13.0 @@ -6062,24 +4678,6 
@@ packages: purls: [] size: 287922 timestamp: 1720853302106 -- kind: conda - name: azure-core-cpp - version: 1.13.0 - build: hf8dbe3c_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/azure-core-cpp-1.13.0-hf8dbe3c_0.conda - sha256: 1976259d75ef68431039522d7105777ac0621ef8a0f8a31140fa8926b1fe1280 - md5: 514d3cbb527a88930e816370e34caa19 - depends: - - __osx >=10.13 - - libcurl >=8.8.0,<9.0a0 - - libcxx >=16 - - openssl >=3.3.1,<4.0a0 - license: MIT - license_family: MIT - purls: [] - size: 296234 - timestamp: 1720853326346 - kind: conda name: azure-identity-cpp version: 1.8.0 @@ -6118,25 +4716,6 @@ packages: purls: [] size: 383395 timestamp: 1721777916149 -- kind: conda - name: azure-identity-cpp - version: 1.8.0 - build: h60298e3_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/azure-identity-cpp-1.8.0-h60298e3_2.conda - sha256: 7bc11d77aab926aff437b6afc089fe937ab03b9f09d679520d4d4a91717b5337 - md5: 29dc05d3b825fd7e2efe0263621c2fdb - depends: - - __osx >=10.13 - - azure-core-cpp >=1.13.0,<1.13.1.0a0 - - libcxx >=16 - - openssl >=3.3.1,<4.0a0 - license: MIT - license_family: MIT - purls: [] - size: 148019 - timestamp: 1721777648770 - kind: conda name: azure-identity-cpp version: 1.8.0 @@ -6157,24 +4736,6 @@ packages: purls: [] size: 199516 timestamp: 1721777604325 -- kind: conda - name: azure-storage-blobs-cpp - version: 12.12.0 - build: h646f05d_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/azure-storage-blobs-cpp-12.12.0-h646f05d_0.conda - sha256: 7153e4ba0112246fc93b2b6631c17b1c2c4f7878f2c4a25426e38a78a0b4cd4c - md5: d3f572c8ebf9ad5cdc07558b3b2c27ce - depends: - - __osx >=10.13 - - azure-core-cpp >=1.13.0,<1.13.1.0a0 - - azure-storage-common-cpp >=12.7.0,<12.7.1.0a0 - - libcxx >=16 - license: MIT - license_family: MIT - purls: [] - size: 423224 - timestamp: 1721865021128 - kind: conda name: azure-storage-blobs-cpp version: 12.12.0 @@ -6291,26 +4852,6 @@ packages: purls: [] size: 119821 timestamp: 1721832870493 -- kind: conda - name: azure-storage-common-cpp - version: 12.7.0 - build: hf91904f_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/azure-storage-common-cpp-12.7.0-hf91904f_1.conda - sha256: 333599899b25ef22e2a2e1c09bab75203da9f47612e1ff2a40fddae76feb08eb - md5: 99146c62f4b2a74c3026f128f42e35bf - depends: - - __osx >=10.13 - - azure-core-cpp >=1.13.0,<1.13.1.0a0 - - libcxx >=16 - - libxml2 >=2.12.7,<3.0a0 - - openssl >=3.3.1,<4.0a0 - license: MIT - license_family: MIT - purls: [] - size: 124472 - timestamp: 1721832914540 - kind: conda name: azure-storage-files-datalake-cpp version: 12.11.0 @@ -6331,26 +4872,6 @@ packages: purls: [] size: 189065 timestamp: 1721925275724 -- kind: conda - name: azure-storage-files-datalake-cpp - version: 12.11.0 - build: h14965f0_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/azure-storage-files-datalake-cpp-12.11.0-h14965f0_1.conda - sha256: 73ada329714a4893238737d77be147b1e1412f80fa94191c3f686eae0bee459c - md5: d99c3c0c72b11340028cac4689835c0c - depends: - - __osx >=10.13 - - azure-core-cpp >=1.13.0,<1.13.1.0a0 - - azure-storage-blobs-cpp >=12.12.0,<12.12.1.0a0 - - azure-storage-common-cpp >=12.7.0,<12.7.1.0a0 - - libcxx >=16 - license: MIT - license_family: MIT - purls: [] - size: 192115 - timestamp: 1721925157499 - kind: conda name: azure-storage-files-datalake-cpp version: 12.11.0 @@ -6478,23 +4999,6 @@ packages: purls: [] size: 6586 timestamp: 1724954134732 -- kind: conda - 
name: backports.zoneinfo - version: 0.2.1 - build: py311h6eed73b_9 - build_number: 9 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/backports.zoneinfo-0.2.1-py311h6eed73b_9.conda - sha256: d503a07b251d4ce7bc372ec9ce7617c349919d96222e6c8127c09c7bf5939d04 - md5: bad08123d75515ac39d82dbca34ca8fc - depends: - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: Apache-2.0 - license_family: APACHE - purls: [] - size: 6652 - timestamp: 1724954260065 - kind: conda name: beautifulsoup4 version: 4.12.3 @@ -6583,29 +5087,6 @@ packages: - pkg:pypi/black?source=hash-mapping size: 397532 timestamp: 1726154954524 -- kind: conda - name: black - version: 24.8.0 - build: py311h6eed73b_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/black-24.8.0-py311h6eed73b_1.conda - sha256: 5a3cd26fe7fdb23efbd319da2f6ad213f6c1fe0440064c5d90e659d8d69472e4 - md5: 8b68d2691deb5a3319a6b1bc5c8ec148 - depends: - - click >=8.0.0 - - mypy_extensions >=0.4.3 - - packaging >=22.0 - - pathspec >=0.9 - - platformdirs >=2 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - pkg:pypi/black?source=hash-mapping - size: 397529 - timestamp: 1726155022924 - kind: conda name: bleach version: 6.1.0 @@ -6647,26 +5128,6 @@ packages: purls: [] size: 33201 timestamp: 1719266149627 -- kind: conda - name: blosc - version: 1.21.6 - build: h7d75f6d_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/blosc-1.21.6-h7d75f6d_0.conda - sha256: 65e5f5dd3d68ed0d9d35e79d64f8141283cad2b55dcd9a04480ceea0e436aca8 - md5: 3e5669e51737d04f4806dd3e8c424663 - depends: - - __osx >=10.13 - - libcxx >=16 - - libzlib >=1.3.1,<2.0a0 - - lz4-c >=1.9.3,<1.10.0a0 - - snappy >=1.2.0,<1.3.0a0 - - zstd >=1.5.6,<1.6.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 47051 - timestamp: 1719266142315 - kind: conda name: blosc version: 1.21.6 @@ -6734,26 +5195,6 @@ packages: - pkg:pypi/bokeh?source=hash-mapping size: 4798991 timestamp: 1724417639170 -- kind: conda - name: bottleneck - version: 1.4.0 - build: py311h0034819_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/bottleneck-1.4.0-py311h0034819_2.conda - sha256: 9cdd9ee006de1bdbe86a3a927026c775f4294bd570116a3787b364098d08fabe - md5: c36eb3227c7d4cc32934a0b9b416f36f - depends: - - __osx >=10.13 - - numpy >=1.19,<3 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: BSD-2-Clause - license_family: BSD - purls: - - pkg:pypi/bottleneck?source=hash-mapping - size: 142336 - timestamp: 1725351486741 - kind: conda name: bottleneck version: 1.4.0 @@ -6836,25 +5277,6 @@ packages: - pkg:pypi/branca?source=hash-mapping size: 28923 timestamp: 1714071906758 -- kind: conda - name: brotli - version: 1.1.0 - build: h00291cd_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/brotli-1.1.0-h00291cd_2.conda - sha256: 624954bc08b3d7885a58c7d547282cfb9a201ce79b748b358f801de53e20f523 - md5: 2db0c38a7f2321c5bdaf32b181e832c7 - depends: - - __osx >=10.13 - - brotli-bin 1.1.0 h00291cd_2 - - libbrotlidec 1.1.0 h00291cd_2 - - libbrotlienc 1.1.0 h00291cd_2 - license: MIT - license_family: MIT - purls: [] - size: 19450 - timestamp: 1725267851605 - kind: conda name: brotli version: 1.1.0 @@ -6915,24 +5337,6 @@ packages: purls: [] size: 19588 timestamp: 1725268044856 -- kind: conda - name: brotli-bin - version: 1.1.0 - build: h00291cd_2 - build_number: 2 - subdir: osx-64 - url: 
https://conda.anaconda.org/conda-forge/osx-64/brotli-bin-1.1.0-h00291cd_2.conda - sha256: 642a8492491109fd8270c1e2c33b18126712df0cedb94aaa2b1c6b02505a4bfa - md5: 049933ecbf552479a12c7917f0a4ce59 - depends: - - __osx >=10.13 - - libbrotlidec 1.1.0 h00291cd_2 - - libbrotlienc 1.1.0 h00291cd_2 - license: MIT - license_family: MIT - purls: [] - size: 16643 - timestamp: 1725267837325 - kind: conda name: brotli-bin version: 1.1.0 @@ -7013,28 +5417,6 @@ packages: - pkg:pypi/brotli?source=hash-mapping size: 339584 timestamp: 1725268241628 -- kind: conda - name: brotli-python - version: 1.1.0 - build: py311hd89902b_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/brotli-python-1.1.0-py311hd89902b_2.conda - sha256: 004cefbd18f581636a8dcb1964fb73478f15d496769226ec896c1d4a0161b7d8 - md5: d75f06ee06001794aa83a05e885f1520 - depends: - - __osx >=10.13 - - libcxx >=17 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - constrains: - - libbrotlicommon 1.1.0 h00291cd_2 - license: MIT - license_family: MIT - purls: - - pkg:pypi/brotli?source=hash-mapping - size: 363793 - timestamp: 1725267947069 - kind: conda name: brotli-python version: 1.1.0 @@ -7078,7 +5460,7 @@ packages: license: MIT license_family: MIT purls: - - pkg:pypi/brotli?source=compressed-mapping + - pkg:pypi/brotli?source=hash-mapping size: 350367 timestamp: 1725267768486 - kind: conda @@ -7180,37 +5562,6 @@ packages: license_family: BSD size: 122909 timestamp: 1720974522888 -- kind: conda - name: bzip2 - version: 1.0.8 - build: hfdf4475_7 - build_number: 7 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda - sha256: cad153608b81fb24fc8c509357daa9ae4e49dfc535b2cb49b91e23dbd68fc3c5 - md5: 7ed4301d437b59045be7e051a0308211 - depends: - - __osx >=10.13 - license: bzip2-1.0.6 - license_family: BSD - purls: [] - size: 134188 - timestamp: 1720974491916 -- kind: conda - name: bzip2 - version: 1.0.8 - build: hfdf4475_7 - build_number: 7 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda - sha256: cad153608b81fb24fc8c509357daa9ae4e49dfc535b2cb49b91e23dbd68fc3c5 - md5: 7ed4301d437b59045be7e051a0308211 - depends: - - __osx >=10.13 - license: bzip2-1.0.6 - license_family: BSD - size: 134188 - timestamp: 1720974491916 - kind: conda name: c-ares version: 1.33.1 @@ -7228,21 +5579,6 @@ packages: purls: [] size: 166630 timestamp: 1724438651925 -- kind: conda - name: c-ares - version: 1.33.1 - build: h44e7173_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/c-ares-1.33.1-h44e7173_0.conda - sha256: 98b0ac09472e6737fc4685147d1755028cc650d428369cbe3cb74ab38b327095 - md5: b31a2de5edfddb308dda802eab2956dc - depends: - - __osx >=10.13 - license: MIT - license_family: MIT - purls: [] - size: 163203 - timestamp: 1724438157472 - kind: conda name: c-ares version: 1.33.1 @@ -7297,29 +5633,6 @@ packages: license: ISC size: 158773 timestamp: 1725019107649 -- kind: conda - name: ca-certificates - version: 2024.8.30 - build: h8857fd0_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/ca-certificates-2024.8.30-h8857fd0_0.conda - sha256: 593f302d0f44c2c771e1614ee6d56fffdc7d616e6f187669c8b0e34ffce3e1ae - md5: b7e5424e7f06547a903d28e4651dbb21 - license: ISC - purls: [] - size: 158665 - timestamp: 1725019059295 -- kind: conda - name: ca-certificates - version: 2024.8.30 - build: h8857fd0_0 - subdir: osx-64 - url: 
https://conda.anaconda.org/conda-forge/osx-64/ca-certificates-2024.8.30-h8857fd0_0.conda - sha256: 593f302d0f44c2c771e1614ee6d56fffdc7d616e6f187669c8b0e34ffce3e1ae - md5: b7e5424e7f06547a903d28e4651dbb21 - license: ISC - size: 158665 - timestamp: 1725019059295 - kind: conda name: ca-certificates version: 2024.8.30 @@ -7427,31 +5740,6 @@ packages: purls: [] size: 1516680 timestamp: 1721139332360 -- kind: conda - name: cairo - version: 1.18.0 - build: h37bd5c4_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/cairo-1.18.0-h37bd5c4_3.conda - sha256: 8d70fbca4887b9b580de0f3715026e05f9e74fad8a652364aa0bccd795b1fa87 - md5: 448aad56614db52338dc4fd4c758cfb6 - depends: - - __osx >=10.13 - - fontconfig >=2.14.2,<3.0a0 - - fonts-conda-ecosystem - - freetype >=2.12.1,<3.0a0 - - icu >=75.1,<76.0a0 - - libcxx >=16 - - libglib >=2.80.3,<3.0a0 - - libpng >=1.6.43,<1.7.0a0 - - libzlib >=1.3.1,<2.0a0 - - pixman >=0.43.4,<1.0a0 - - zlib - license: LGPL-2.1-only or MPL-1.1 - purls: [] - size: 892544 - timestamp: 1721139116538 - kind: conda name: cairo version: 1.18.0 @@ -7522,29 +5810,9 @@ packages: - python >=3.7 license: ISC purls: - - pkg:pypi/certifi?source=hash-mapping - size: 163752 - timestamp: 1725278204397 -- kind: conda - name: cffi - version: 1.17.1 - build: py311h137bacd_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/cffi-1.17.1-py311h137bacd_0.conda - sha256: 012ee7b1ed4f9b0490d6e90c72decf148d7575173c7eaf851cd87fd434d2cacc - md5: a4b0f531064fa3dd5e3afbb782ea2cd5 - depends: - - __osx >=10.13 - - libffi >=3.4,<4.0a0 - - pycparser - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - pkg:pypi/cffi?source=hash-mapping - size: 288762 - timestamp: 1725560945833 + - pkg:pypi/certifi?source=hash-mapping + size: 163752 + timestamp: 1725278204397 - kind: conda name: cffi version: 1.17.1 @@ -7628,26 +5896,6 @@ packages: purls: [] size: 802060 timestamp: 1718906517515 -- kind: conda - name: cfitsio - version: 4.4.1 - build: ha105788_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/cfitsio-4.4.1-ha105788_0.conda - sha256: 6b54b24abd3122d33d80a59a901cd51b26b6d47fbb9f84c2bf1f87606e9899c6 - md5: 99445be39aaea44a05046c479f8c6dc9 - depends: - - __osx >=10.13 - - bzip2 >=1.0.8,<2.0a0 - - libcurl >=8.8.0,<9.0a0 - - libgfortran 5.* - - libgfortran5 >=12.3.0 - - libgfortran5 >=13.2.0 - - libzlib >=1.3.1,<2.0a0 - license: LicenseRef-fitsio - purls: [] - size: 849075 - timestamp: 1718906514228 - kind: conda name: cfitsio version: 4.4.1 @@ -7685,26 +5933,6 @@ packages: purls: [] size: 924198 timestamp: 1718906379286 -- kind: conda - name: cftime - version: 1.6.4 - build: py311h0034819_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/cftime-1.6.4-py311h0034819_1.conda - sha256: 025d33877c4e3558bd3d8c8d11bcd11760d61634d1b1be819203c5b00b770132 - md5: cda3610885501f18aec020da7f92cf25 - depends: - - __osx >=10.13 - - numpy >=1.19,<3 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - pkg:pypi/cftime?source=hash-mapping - size: 211019 - timestamp: 1725400678395 - kind: conda name: cftime version: 1.6.4 @@ -7910,25 +6138,6 @@ packages: - pkg:pypi/cloudpickle?source=hash-mapping size: 24746 timestamp: 1697464875382 -- kind: conda - name: cmarkgfm - version: 0.8.0 - build: py311h2725bcf_3 - build_number: 3 - subdir: osx-64 - url: 
https://conda.anaconda.org/conda-forge/osx-64/cmarkgfm-0.8.0-py311h2725bcf_3.conda - sha256: a8036546261cc57f5383f9fcacaedd3c8aed76ca03c05fa5955fcd0a0707ff45 - md5: 3a4ef0858a3fae7e61ae9cdf72adefd1 - depends: - - cffi >=1.0.0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - pkg:pypi/cmarkgfm?source=hash-mapping - size: 113116 - timestamp: 1695670250339 - kind: conda name: cmarkgfm version: 0.8.0 @@ -8132,47 +6341,6 @@ packages: - pkg:pypi/contourpy?source=hash-mapping size: 275152 timestamp: 1725378492908 -- kind: conda - name: contourpy - version: 1.3.0 - build: py311hf2f7c97_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/contourpy-1.3.0-py311hf2f7c97_1.conda - sha256: 9fe4cee91fedc93ec5621b55c7f97b2bc262ba80e91f3ce10835ce471e2c6aee - md5: 8193dfff44172ea446c706827a0d7fa0 - depends: - - __osx >=10.13 - - libcxx >=17 - - numpy >=1.23 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/contourpy?source=hash-mapping - size: 258382 - timestamp: 1725378615519 -- kind: conda - name: coverage - version: 7.6.1 - build: py311h3336109_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/coverage-7.6.1-py311h3336109_1.conda - sha256: 9311ac3d0ff222ddef574bd817ff1efbe76e7c15ccde221b567af612d322119c - md5: 2709c5f016b1e6ffc0b0803109c02d04 - depends: - - __osx >=10.13 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - tomli - license: Apache-2.0 - license_family: APACHE - purls: - - pkg:pypi/coverage?source=hash-mapping - size: 372985 - timestamp: 1724953967565 - kind: conda name: coverage version: 7.6.1 @@ -8336,42 +6504,24 @@ packages: - pkg:pypi/cytoolz?source=hash-mapping size: 322488 timestamp: 1706897983176 -- kind: conda - name: cytoolz - version: 0.12.3 - build: py311he705e18_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/cytoolz-0.12.3-py311he705e18_0.conda - sha256: ef0ce284ca238ab279d7c0ffc7710e0e797855d07f1be3d5ae6cf17389311f15 - md5: e5d928a48ce4a515ac69d5f65fda1a60 - depends: - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - toolz >=0.10.0 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/cytoolz?source=hash-mapping - size: 343617 - timestamp: 1706897348938 - kind: conda name: dask - version: 2024.8.2 + version: 2024.9.0 build: pyhd8ed1ab_0 subdir: noarch noarch: python - url: https://conda.anaconda.org/conda-forge/noarch/dask-2024.8.2-pyhd8ed1ab_0.conda - sha256: 6afd548c338bb418d9645081cbe49b93ffa70f0fb74d9c3c4ed7defd910178ea - md5: 3adbad9b363bd0163ef2ac59f095cc13 + url: https://conda.anaconda.org/conda-forge/noarch/dask-2024.9.0-pyhd8ed1ab_0.conda + sha256: cdffbc06b0bf0994e014133d2f2cc93541cf177d8646ea16d9843cd4fd78dc76 + md5: 43e08d885b7669b7605ede5bb9aa861f depends: - - bokeh >=2.4.2,!=3.0.* + - bokeh >=3.1.0 - cytoolz >=0.11.0 - - dask-core >=2024.8.2,<2024.8.3.0a0 + - dask-core >=2024.9.0,<2024.9.1.0a0 - dask-expr >=1.1,<1.2 - - distributed >=2024.8.2,<2024.8.3.0a0 + - distributed >=2024.9.0,<2024.9.1.0a0 - jinja2 >=2.10.3 - lz4 >=4.3.2 - - numpy >=1.21 + - numpy >=1.24 - pandas >=2.0 - pyarrow >=7.0 - pyarrow-hotfix @@ -8379,19 +6529,18 @@ packages: constrains: - openssl !=1.1.1e license: BSD-3-Clause - license_family: BSD purls: [] - size: 7417 - timestamp: 1725064395582 + size: 7439 + timestamp: 1726274070376 - kind: conda name: dask-core - version: 2024.8.2 + version: 2024.9.0 build: pyhd8ed1ab_0 
subdir: noarch noarch: python - url: https://conda.anaconda.org/conda-forge/noarch/dask-core-2024.8.2-pyhd8ed1ab_0.conda - sha256: 1c1b86b719262a7d557327f5c1e363e7039a4078c42270a19dcd9af42fe1404f - md5: 8e7524a2fb561506260db789806c7ee9 + url: https://conda.anaconda.org/conda-forge/noarch/dask-core-2024.9.0-pyhd8ed1ab_0.conda + sha256: 5664f6a63a482e19dfbaedd3022c8d4f8885f5e702f4bd5a0fb43b3a9d53e576 + md5: 8e6585b996dfa6fff92d7ccd0f18bb99 depends: - click >=8.1 - cloudpickle >=3.0.0 @@ -8403,22 +6552,21 @@ packages: - pyyaml >=5.3.1 - toolz >=0.10.0 license: BSD-3-Clause - license_family: BSD purls: - pkg:pypi/dask?source=hash-mapping - size: 888258 - timestamp: 1725051212771 + size: 896527 + timestamp: 1726263677521 - kind: conda name: dask-expr - version: 1.1.13 + version: 1.1.14 build: pyhd8ed1ab_0 subdir: noarch noarch: python - url: https://conda.anaconda.org/conda-forge/noarch/dask-expr-1.1.13-pyhd8ed1ab_0.conda - sha256: e1b570064d24e85278c53c87e4e361e60fb01a156ce026eac310ff9dcbd85111 - md5: b77166a6032a2b8e52b3fee90d62ea4d + url: https://conda.anaconda.org/conda-forge/noarch/dask-expr-1.1.14-pyhd8ed1ab_0.conda + sha256: ea32799d1a29a2e271c9a4f05fdb7470e98fca1358be5c3c68c37a6f02674782 + md5: 6644c676dce50d7355e5e1c7e90e999c depends: - - dask-core 2024.8.2 + - dask-core 2024.9.0 - pandas >=2 - pyarrow - python >=3.10 @@ -8426,21 +6574,8 @@ packages: license_family: BSD purls: - pkg:pypi/dask-expr?source=hash-mapping - size: 185183 - timestamp: 1725321008333 -- kind: conda - name: dav1d - version: 1.2.1 - build: h0dc2134_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/dav1d-1.2.1-h0dc2134_0.conda - sha256: ec71a835866b42e946cd2039a5f7a6458851a21890d315476f5e66790ac11c96 - md5: 9d88733c715300a39f8ca2e936b7808d - license: BSD-2-Clause - license_family: BSD - purls: [] - size: 668439 - timestamp: 1685696184631 + size: 185135 + timestamp: 1726268147604 - kind: conda name: dav1d version: 1.2.1 @@ -8525,26 +6660,6 @@ packages: - pkg:pypi/debugpy?source=hash-mapping size: 2257948 timestamp: 1725269516494 -- kind: conda - name: debugpy - version: 1.8.5 - build: py311hd89902b_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/debugpy-1.8.5-py311hd89902b_1.conda - sha256: e12eb0e8c203188eabbf366aa01ca41d5e113b3508992e88c4969da32128dcc7 - md5: b54f06cbb64d0d8794f0784071d53f53 - depends: - - __osx >=10.13 - - libcxx >=17 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - pkg:pypi/debugpy?source=hash-mapping - size: 2272632 - timestamp: 1725269348240 - kind: conda name: debugpy version: 1.8.5 @@ -8641,18 +6756,18 @@ packages: timestamp: 1615232388757 - kind: conda name: distributed - version: 2024.8.2 + version: 2024.9.0 build: pyhd8ed1ab_0 subdir: noarch noarch: python - url: https://conda.anaconda.org/conda-forge/noarch/distributed-2024.8.2-pyhd8ed1ab_0.conda - sha256: b0eb013bc9fa6d88424ec7bf2a9fb82448d2457edacccc798dea5ef760a6ef01 - md5: 44d22b5d98a219a4c35cafe9bf3b9ce2 + url: https://conda.anaconda.org/conda-forge/noarch/distributed-2024.9.0-pyhd8ed1ab_0.conda + sha256: e318c904cfb81dce68bce2c095aa267901a055b670441ad8cb7bea588aead532 + md5: 2e4adbc7926d91412fec7076f14d554d depends: - click >=8.0 - cloudpickle >=3.0.0 - cytoolz >=0.11.2 - - dask-core >=2024.8.2,<2024.8.3.0a0 + - dask-core >=2024.9.0,<2024.9.1.0a0 - jinja2 >=2.10.3 - locket >=1.0.0 - msgpack-python >=1.0.2 @@ -8672,8 +6787,8 @@ packages: license_family: BSD purls: - pkg:pypi/distributed?source=hash-mapping - 
size: 798375 - timestamp: 1725058359740 + size: 799773 + timestamp: 1726268229066 - kind: conda name: docutils version: 0.21.2 @@ -8738,21 +6853,6 @@ packages: purls: [] size: 70425 timestamp: 1686490368655 -- kind: conda - name: double-conversion - version: 3.3.0 - build: he965462_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/double-conversion-3.3.0-he965462_0.conda - sha256: 74b7e151887e2c79de5dfc2079bc4621a1bd85b8bed4595be3e0b7313808a498 - md5: a3de9d9550078b51db74fde63b1ccae6 - depends: - - libcxx >=15.0.7 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 67397 - timestamp: 1686490152080 - kind: conda name: editables version: '0.5' @@ -8801,21 +6901,6 @@ packages: purls: [] size: 1087751 timestamp: 1690275869049 -- kind: conda - name: eigen - version: 3.4.0 - build: h1c7c39f_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/eigen-3.4.0-h1c7c39f_0.conda - sha256: 187c0677e0cdcdc39aed716687a6290dd5b7f52b49eedaef2ed76be6cd0a5a3d - md5: 5b2cfc277e3d42d84a2a648825761156 - depends: - - libcxx >=15.0.7 - license: MPL-2.0 - license_family: MOZILLA - purls: [] - size: 1090184 - timestamp: 1690272503232 - kind: conda name: eigen version: 3.4.0 @@ -8917,22 +7002,6 @@ packages: purls: [] size: 137891 timestamp: 1725568750673 -- kind: conda - name: expat - version: 2.6.3 - build: hac325c4_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/expat-2.6.3-hac325c4_0.conda - sha256: 79b0da6ca997f7a939bfb9631356afbc519343944fc81cc4261c6b3a85f6db32 - md5: 474cd8746e9f896fc5ae84af3c951796 - depends: - - __osx >=10.13 - - libexpat 2.6.3 hac325c4_0 - license: MIT - license_family: MIT - purls: [] - size: 128253 - timestamp: 1725568880679 - kind: conda name: expat version: 2.6.3 @@ -8969,22 +7038,21 @@ packages: timestamp: 1725568799108 - kind: conda name: fastcore - version: 1.7.5 + version: 1.7.8 build: pyhd8ed1ab_0 subdir: noarch noarch: python - url: https://conda.anaconda.org/conda-forge/noarch/fastcore-1.7.5-pyhd8ed1ab_0.conda - sha256: ab0081f7277adbd7d765404e7a9d6a1647dd3d4c8f33f0e57731a12ca98a71af - md5: c318e98648516f5e6d0d81aca96f8bdf + url: https://conda.anaconda.org/conda-forge/noarch/fastcore-1.7.8-pyhd8ed1ab_0.conda + sha256: d28cd99a99d26564968e2f9c84f4cfd833522cb7deaa350fbd1f38ad51846194 + md5: 473d8c430670e48862f33bf3b18e12a3 depends: - packaging - python >=3.8 license: Apache-2.0 - license_family: APACHE purls: - pkg:pypi/fastcore?source=hash-mapping - size: 72511 - timestamp: 1725881332151 + size: 72390 + timestamp: 1726319541057 - kind: conda name: fasteners version: 0.17.3 @@ -9051,55 +7119,6 @@ packages: purls: [] size: 8643313 timestamp: 1724646269620 -- kind: conda - name: ffmpeg - version: 6.1.2 - build: gpl_h5256a10_102 - build_number: 102 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/ffmpeg-6.1.2-gpl_h5256a10_102.conda - sha256: 2c2588f4e1f04e78ade580531f3840d3aa2a5e7dd6476f06f7d60554652e2674 - md5: 97b3872f2b835538be01d5b669725ac7 - depends: - - __osx >=10.13 - - aom >=3.9.1,<3.10.0a0 - - bzip2 >=1.0.8,<2.0a0 - - dav1d >=1.2.1,<1.2.2.0a0 - - fontconfig >=2.14.2,<3.0a0 - - fonts-conda-ecosystem - - freetype >=2.12.1,<3.0a0 - - gmp >=6.3.0,<7.0a0 - - gnutls >=3.8.7,<3.9.0a0 - - harfbuzz >=9.0.0,<10.0a0 - - lame >=3.100,<3.101.0a0 - - libass >=0.17.3,<0.17.4.0a0 - - libcxx >=17 - - libiconv >=1.17,<2.0a0 - - libopenvino >=2024.3.0,<2024.3.1.0a0 - - libopenvino-auto-batch-plugin >=2024.3.0,<2024.3.1.0a0 - - libopenvino-auto-plugin >=2024.3.0,<2024.3.1.0a0 - - 
libopenvino-hetero-plugin >=2024.3.0,<2024.3.1.0a0 - - libopenvino-intel-cpu-plugin >=2024.3.0,<2024.3.1.0a0 - - libopenvino-ir-frontend >=2024.3.0,<2024.3.1.0a0 - - libopenvino-onnx-frontend >=2024.3.0,<2024.3.1.0a0 - - libopenvino-paddle-frontend >=2024.3.0,<2024.3.1.0a0 - - libopenvino-pytorch-frontend >=2024.3.0,<2024.3.1.0a0 - - libopenvino-tensorflow-frontend >=2024.3.0,<2024.3.1.0a0 - - libopenvino-tensorflow-lite-frontend >=2024.3.0,<2024.3.1.0a0 - - libopus >=1.3.1,<2.0a0 - - libvpx >=1.14.1,<1.15.0a0 - - libxml2 >=2.12.7,<3.0a0 - - libzlib >=1.3.1,<2.0a0 - - openh264 >=2.4.1,<2.4.2.0a0 - - svt-av1 >=2.2.1,<2.2.2.0a0 - - x264 >=1!164.3095,<1!165 - - x265 >=3.5,<3.6.0a0 - - xz >=5.2.6,<6.0a0 - license: GPL-2.0-or-later - license_family: GPL - purls: [] - size: 9714082 - timestamp: 1724646469200 - kind: conda name: ffmpeg version: 6.1.2 @@ -9313,51 +7332,6 @@ packages: purls: [] size: 1507188 timestamp: 1720534018967 -- kind: conda - name: fltk - version: 1.3.9 - build: ha50d76c_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/fltk-1.3.9-ha50d76c_1.conda - sha256: 0904ed13fe6590e89b74ace95fbbcc3f8155f6e2287318383d7813ba4bb04207 - md5: a958392eb3607deaab9de701d24cf159 - depends: - - __osx >=10.13 - - libcxx >=16 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libpng >=1.6.43,<1.7.0a0 - - libxcb >=1.16,<1.17.0a0 - - libzlib >=1.3.1,<2.0a0 - - xorg-libice >=1.1.1,<2.0a0 - - xorg-libsm >=1.2.4,<2.0a0 - - xorg-libx11 >=1.8.9,<2.0a0 - - xorg-libxau >=1.0.11,<2.0a0 - - xorg-libxdmcp - - xorg-libxext >=1.3.4,<2.0a0 - - xorg-libxfixes - - xorg-libxrender >=0.9.11,<0.10.0a0 - license: LGPL-2.0-or-later - license_family: LGPL - purls: [] - size: 1268490 - timestamp: 1720534170056 -- kind: conda - name: fmt - version: 11.0.2 - build: h3c5361c_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/fmt-11.0.2-h3c5361c_0.conda - sha256: 4502053d2556431caa3a606b527eb1e45967109d6c6ffe094f18c3134cf77db1 - md5: e8070546e8739040383f6774e0cd4033 - depends: - - __osx >=10.13 - - libcxx >=16 - license: MIT - license_family: MIT - purls: [] - size: 184400 - timestamp: 1723046749457 - kind: conda name: fmt version: 11.0.2 @@ -9506,23 +7480,6 @@ packages: purls: [] size: 272010 timestamp: 1674828850194 -- kind: conda - name: fontconfig - version: 2.14.2 - build: h5bb23bf_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/fontconfig-2.14.2-h5bb23bf_0.conda - sha256: f63e6d1d6aef8ba6de4fc54d3d7898a153479888d40ffdf2e4cfad6f92679d34 - md5: 86cc5867dfbee4178118392bae4a3c89 - depends: - - expat >=2.5.0,<3.0a0 - - freetype >=2.12.1,<3.0a0 - - libzlib >=1.2.13,<2.0.0a0 - license: MIT - license_family: MIT - purls: [] - size: 237068 - timestamp: 1674829100063 - kind: conda name: fontconfig version: 2.14.2 @@ -9596,27 +7553,6 @@ packages: purls: [] size: 4102 timestamp: 1566932280397 -- kind: conda - name: fonttools - version: 4.53.1 - build: py311h3336109_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/fonttools-4.53.1-py311h3336109_1.conda - sha256: bee1d395de62b29f95b7dbf0810dcd2e4c12c134e16f97f90a4a0fa61c278dd2 - md5: eef33a0170f8f29c988177c50adbe8e4 - depends: - - __osx >=10.13 - - brotli - - munkres - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - pkg:pypi/fonttools?source=hash-mapping - size: 2797302 - timestamp: 1725391503677 - kind: conda name: fonttools version: 4.53.1 @@ -9729,32 +7665,6 @@ packages: purls: [] size: 467484 timestamp: 
1726031370488 -- kind: conda - name: freeimage - version: 3.18.0 - build: h55e5cf8_21 - build_number: 21 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/freeimage-3.18.0-h55e5cf8_21.conda - sha256: 9975f1be1a7fe178cecfaca6bbb433d46f2abbc8d8fff7bbdc9d0c128525b299 - md5: aac33d7112ac5642fce95d91b12a7faa - depends: - - __osx >=10.13 - - imath >=3.1.12,<3.1.13.0a0 - - jxrlib >=1.1,<1.2.0a0 - - libcxx >=17 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libpng >=1.6.43,<1.7.0a0 - - libraw >=0.21.1,<0.22.0a0 - - libtiff >=4.6.0,<4.7.0a0 - - libwebp-base >=1.4.0,<2.0a0 - - libzlib >=1.3.1,<2.0a0 - - openexr >=3.2.2,<3.3.0a0 - - openjpeg >=2.5.2,<3.0a0 - license: GPL-2.0-or-later OR GPL-3.0-or-later OR FreeImage - purls: [] - size: 410929 - timestamp: 1726031551016 - kind: conda name: freeimage version: 3.18.0 @@ -9825,22 +7735,6 @@ packages: purls: [] size: 634972 timestamp: 1694615932610 -- kind: conda - name: freetype - version: 2.12.1 - build: h60636b9_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/freetype-2.12.1-h60636b9_2.conda - sha256: b292cf5a25f094eeb4b66e37d99a97894aafd04a5683980852a8cbddccdc8e4e - md5: 25152fce119320c980e5470e64834b50 - depends: - - libpng >=1.6.39,<1.7.0a0 - - libzlib >=1.2.13,<2.0.0a0 - license: GPL-2.0-only OR FTL - purls: [] - size: 599300 - timestamp: 1694616137838 - kind: conda name: freetype version: 2.12.1 @@ -9874,25 +7768,8 @@ packages: - vc14_runtime >=14.29.30139 license: GPL-2.0-only OR FTL purls: [] - size: 510306 - timestamp: 1694616398888 -- kind: conda - name: freexl - version: 2.0.0 - build: h3ec172f_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/freexl-2.0.0-h3ec172f_0.conda - sha256: 9d59f1894c3b526e6806e376e979b81d0df23a836415122b86458aef72cda24a - md5: 640c34a8084e2a812bcee5b804597fc9 - depends: - - libexpat >=2.5.0,<3.0a0 - - libiconv >=1.17,<2.0a0 - - minizip >=4.0.1,<5.0a0 - license: MPL-1.1 - license_family: MOZILLA - purls: [] - size: 54007 - timestamp: 1694952882265 + size: 510306 + timestamp: 1694616398888 - kind: conda name: freexl version: 2.0.0 @@ -9989,37 +7866,6 @@ packages: purls: [] size: 64567 timestamp: 1604417122064 -- kind: conda - name: fribidi - version: 1.0.10 - build: hbcb3906_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/fribidi-1.0.10-hbcb3906_0.tar.bz2 - sha256: 4f6db86ecc4984cd4ac88ca52030726c3cfd11a64dfb15c8602025ee3001a2b5 - md5: f1c6b41e0f56998ecd9a3e210faa1dc0 - license: LGPL-2.1 - purls: [] - size: 65388 - timestamp: 1604417213 -- kind: conda - name: frozenlist - version: 1.4.1 - build: py311h3336109_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/frozenlist-1.4.1-py311h3336109_1.conda - sha256: a0e874185da4b85250b5416f0c63d40de72f1a7c4f7ebe864eeb298b691d46a5 - md5: 76713e20ff1f712ab6c6ef122fd4e2d9 - depends: - - __osx >=10.13 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: Apache-2.0 - license_family: APACHE - purls: - - pkg:pypi/frozenlist?source=hash-mapping - size: 52909 - timestamp: 1725395958538 - kind: conda name: frozenlist version: 1.4.1 @@ -10173,30 +8019,6 @@ packages: - pkg:pypi/gdal?source=hash-mapping size: 1648279 timestamp: 1726094444948 -- kind: conda - name: gdal - version: 3.9.2 - build: py311ha943c4b_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/gdal-3.9.2-py311ha943c4b_2.conda - sha256: c4e2f68e62f1e0a0de7de3e3517b702f26acaef79a35872d494d19388167f11d - md5: 
a10ca9664f2fd723a2e7a117ee6087f2 - depends: - - __osx >=10.13 - - libcxx >=17 - - libgdal-core 3.9.2.* - - libkml >=1.3.0,<1.4.0a0 - - libxml2 >=2.12.7,<3.0a0 - - numpy >=1.19,<3 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - pkg:pypi/gdal?source=hash-mapping - size: 1695166 - timestamp: 1726092958444 - kind: conda name: gdk-pixbuf version: 2.42.12 @@ -10217,26 +8039,6 @@ packages: purls: [] size: 509570 timestamp: 1715783199780 -- kind: conda - name: gdk-pixbuf - version: 2.42.12 - build: ha587570_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/gdk-pixbuf-2.42.12-ha587570_0.conda - sha256: 92cb602ef86feb35252ee909e19536fa043bd85b8507450ad8264cfa518a5881 - md5: ee186d2e8db4605030753dc05025d4a0 - depends: - - __osx >=10.13 - - libglib >=2.80.2,<3.0a0 - - libintl >=0.22.5,<1.0a0 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libpng >=1.6.43,<1.7.0a0 - - libtiff >=4.6.0,<4.7.0a0 - license: LGPL-2.1-or-later - license_family: LGPL - purls: [] - size: 516815 - timestamp: 1715783154558 - kind: conda name: gdk-pixbuf version: 2.42.12 @@ -10386,22 +8188,6 @@ packages: purls: [] size: 1737633 timestamp: 1721746525671 -- kind: conda - name: geos - version: 3.12.2 - build: hf036a51_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/geos-3.12.2-hf036a51_1.conda - sha256: 1d5ec9da8a543885228aa7ca9fabfcacd653b0f14e8d175bb83de60afcffc166 - md5: fbb2688b537dafd5fb554d0b7ef27397 - depends: - - __osx >=10.13 - - libcxx >=16 - license: LGPL-2.1-only - purls: [] - size: 1482492 - timestamp: 1721747118528 - kind: conda name: geotiff version: 1.7.3 @@ -10425,28 +8211,6 @@ packages: purls: [] size: 123406 timestamp: 1722335928788 -- kind: conda - name: geotiff - version: 1.7.3 - build: h4bbec01_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/geotiff-1.7.3-h4bbec01_2.conda - sha256: a56e2154bfd21588ffde48ae14f906ea6b7e0eb49f71b2e3fb320cd066c22503 - md5: d83428f874b4fc2d204613ad7ad42b6d - depends: - - __osx >=10.13 - - libcxx >=16 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libtiff >=4.6.0,<4.7.0a0 - - libzlib >=1.3.1,<2.0a0 - - proj >=9.4.1,<9.5.0a0 - - zlib - license: MIT - license_family: MIT - purls: [] - size: 115552 - timestamp: 1722335565552 - kind: conda name: geotiff version: 1.7.3 @@ -10534,30 +8298,6 @@ packages: purls: [] size: 483255 timestamp: 1723627203687 -- kind: conda - name: gettext - version: 0.22.5 - build: hdfe23c8_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/gettext-0.22.5-hdfe23c8_3.conda - sha256: f68cd35c98394dc322f2695a720b31b77a9cdfe7d5c08ce53bc68c9e3fe4c6ec - md5: 4e53e0f241c09fcdf674e4a37c0c70e6 - depends: - - __osx >=10.13 - - gettext-tools 0.22.5 hdfe23c8_3 - - libasprintf 0.22.5 hdfe23c8_3 - - libasprintf-devel 0.22.5 hdfe23c8_3 - - libcxx >=16 - - libgettextpo 0.22.5 hdfe23c8_3 - - libgettextpo-devel 0.22.5 hdfe23c8_3 - - libiconv >=1.17,<2.0a0 - - libintl 0.22.5 hdfe23c8_3 - - libintl-devel 0.22.5 hdfe23c8_3 - license: LGPL-2.1-or-later AND GPL-3.0-or-later - purls: [] - size: 480155 - timestamp: 1723627002489 - kind: conda name: gettext version: 0.22.5 @@ -10598,24 +8338,6 @@ packages: purls: [] size: 2467439 timestamp: 1723627140130 -- kind: conda - name: gettext-tools - version: 0.22.5 - build: hdfe23c8_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/gettext-tools-0.22.5-hdfe23c8_3.conda - sha256: 
7fe97828eae5e067b68dd012811e614e057854ed51116bbd2fd2e8d05439ad63 - md5: 70a5bb1505016ebdba1214ba10de0503 - depends: - - __osx >=10.13 - - libiconv >=1.17,<2.0a0 - - libintl 0.22.5 hdfe23c8_3 - license: GPL-3.0-or-later - license_family: GPL - purls: [] - size: 2513986 - timestamp: 1723626957941 - kind: conda name: gettext-tools version: 0.22.5 @@ -10633,22 +8355,6 @@ packages: purls: [] size: 2750908 timestamp: 1723626056740 -- kind: conda - name: gflags - version: 2.2.2 - build: hb1e8313_1004 - build_number: 1004 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/gflags-2.2.2-hb1e8313_1004.tar.bz2 - sha256: 39540f879057ae529cad131644af111a8c3c48b384ec6212de6a5381e0863948 - md5: 3f59cc77a929537e42120faf104e0d16 - depends: - - libcxx >=10.0.1 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 94612 - timestamp: 1599590973213 - kind: conda name: gflags version: 2.2.2 @@ -10729,36 +8435,6 @@ packages: purls: [] size: 21548063 timestamp: 1725905601441 -- kind: conda - name: gh - version: 2.56.0 - build: he13f2d6_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/gh-2.56.0-he13f2d6_0.conda - sha256: 142d91a83a8c749240dde7d83b9ac106a4787d99a6bf5fb82cf1cd31cf752767 - md5: d485f4102ee8ebf5b5e24939b766365b - depends: - - __osx >=10.13 - constrains: - - __osx>=10.12 - license: Apache-2.0 - license_family: APACHE - purls: [] - size: 21242563 - timestamp: 1725905532557 -- kind: conda - name: giflib - version: 5.2.2 - build: h10d778d_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/giflib-5.2.2-h10d778d_0.conda - sha256: 2c825df829097536314a195ae5cacaa8695209da6b4400135a65d8e23c008ff8 - md5: 03e8c9b4d3da5f3d6eabdd020c2d63ac - license: MIT - license_family: MIT - purls: [] - size: 74516 - timestamp: 1712692686914 - kind: conda name: giflib version: 5.2.2 @@ -10843,40 +8519,6 @@ packages: purls: [] size: 63049 timestamp: 1718543005831 -- kind: conda - name: gl2ps - version: 1.4.2 - build: hd82a5f3_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/gl2ps-1.4.2-hd82a5f3_1.conda - sha256: 2da5a699a75a9366996d469e05bbf2014f62102b2da70607a2230f9031ca7f52 - md5: 707318c6171d4d8b07b51e0de03c7595 - depends: - - __osx >=10.13 - - libpng >=1.6.43,<1.7.0a0 - - libzlib >=1.3.1,<2.0a0 - license: LGPL-2.0-or-later - license_family: LGPL - purls: [] - size: 67880 - timestamp: 1718542959037 -- kind: conda - name: glew - version: 2.1.0 - build: h046ec9c_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/glew-2.1.0-h046ec9c_2.tar.bz2 - sha256: 1d114d93fd4bf043aa6fccc550379c0ac0a48461633cd1e1e49abe55be8562df - md5: 6b753c8c7e4c46a8eb17b6f1781f958a - depends: - - libcxx >=11.0.0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 708867 - timestamp: 1607113212595 - kind: conda name: glew version: 2.1.0 @@ -10930,23 +8572,6 @@ packages: purls: [] size: 783742 timestamp: 1607113139225 -- kind: conda - name: glog - version: 0.7.1 - build: h2790a97_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/glog-0.7.1-h2790a97_0.conda - sha256: dd56547db8625eb5c91bb0a9fbe8bd6f5c7fbf5b6059d46365e94472c46b24f9 - md5: 06cf91665775b0da395229cd4331b27d - depends: - - __osx >=10.13 - - gflags >=2.2.2,<2.3.0a0 - - libcxx >=16 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 117017 - timestamp: 1718284325443 - kind: conda name: glog version: 0.7.1 @@ -11013,48 +8638,6 @@ packages: purls: [] size: 460055 timestamp: 1718980856608 -- kind: 
conda - name: gmp - version: 6.3.0 - build: hf036a51_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/gmp-6.3.0-hf036a51_2.conda - sha256: 75aa5e7a875afdcf4903b7dc98577672a3dc17b528ac217b915f9528f93c85fc - md5: 427101d13f19c4974552a4e5b072eef1 - depends: - - __osx >=10.13 - - libcxx >=16 - license: GPL-2.0-or-later OR LGPL-3.0-or-later - purls: [] - size: 428919 - timestamp: 1718981041839 -- kind: conda - name: gmsh - version: 4.12.2 - build: h48a2193_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/gmsh-4.12.2-h48a2193_0.conda - sha256: cd772b30ec119c445ebcda76bfcebeedb4ce3de7f70d893bbfab5bba836b42aa - md5: bbf50024b7f60b94ce01f0ec1135ad3d - depends: - - cairo >=1.18.0,<2.0a0 - - fltk >=1.3.9,<1.4.0a0 - - gmp >=6.3.0,<7.0a0 - - libblas >=3.9.0,<4.0a0 - - libcxx >=15 - - libjpeg-turbo >=3.0.0,<4.0a0 - - liblapack >=3.9.0,<4.0a0 - - libpng >=1.6.39,<1.7.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - occt >=7.7.2,<7.8.0a0 - - zlib - license: GPL-2.0-or-later - license_family: GPL - purls: - - pkg:pypi/gmsh?source=hash-mapping - size: 11568361 - timestamp: 1705884796464 - kind: conda name: gmsh version: 4.12.2 @@ -11185,29 +8768,6 @@ packages: purls: [] size: 1738547 timestamp: 1723812228427 -- kind: conda - name: gnutls - version: 3.8.7 - build: hfad6214_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/gnutls-3.8.7-hfad6214_0.conda - sha256: a1103c2b724bc62641ae9202f86f674958a62ade09a7c04c36c2c842a74743ae - md5: bf601d33ba3cf249b2096b080b59fc37 - depends: - - __osx >=10.13 - - libasprintf >=0.22.5,<1.0a0 - - libcxx >=16 - - libgettextpo >=0.22.5,<1.0a0 - - libidn2 >=2,<3.0a0 - - libintl >=0.22.5,<1.0a0 - - libtasn1 >=4.19.0,<5.0a0 - - nettle >=3.9.1,<3.10.0a0 - - p11-kit >=0.24.1,<0.25.0a0 - license: LGPL-2.1-or-later - license_family: LGPL - purls: [] - size: 1812102 - timestamp: 1723813110695 - kind: conda name: graphite2 version: 1.3.13 @@ -11243,22 +8803,6 @@ packages: purls: [] size: 95406 timestamp: 1711634622644 -- kind: conda - name: graphite2 - version: 1.3.13 - build: h73e2aa4_1003 - build_number: 1003 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/graphite2-1.3.13-h73e2aa4_1003.conda - sha256: b71db966e47cd83b16bfcc2099b8fa87c07286f24a0742078fede4c84314f91a - md5: fc7124f86e1d359fc5d878accd9e814c - depends: - - libcxx >=16 - license: LGPL-2.0-or-later - license_family: LGPL - purls: [] - size: 84384 - timestamp: 1711634311095 - kind: conda name: graphite2 version: 1.3.13 @@ -11358,55 +8902,6 @@ packages: purls: [] size: 5082874 timestamp: 1722673934247 -- kind: conda - name: graphviz - version: 12.0.0 - build: he14ced1_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/graphviz-12.0.0-he14ced1_0.conda - sha256: 91fbeecf3aaa4032c6f01c4242cfe2ee1bee21e70d085bafb3958ce7d6ab7c3c - md5: ef49aa1e3614bfc6fb5369675129c09b - depends: - - __osx >=10.13 - - cairo >=1.18.0,<2.0a0 - - fonts-conda-ecosystem - - gdk-pixbuf >=2.42.12,<3.0a0 - - gtk2 - - gts >=0.7.6,<0.8.0a0 - - libcxx >=16 - - libexpat >=2.6.2,<3.0a0 - - libgd >=2.3.3,<2.4.0a0 - - libglib >=2.80.3,<3.0a0 - - librsvg >=2.58.2,<3.0a0 - - libwebp-base >=1.4.0,<2.0a0 - - libzlib >=1.3.1,<2.0a0 - - pango >=1.50.14,<2.0a0 - license: EPL-1.0 - license_family: Other - purls: [] - size: 4984341 - timestamp: 1722673941539 -- kind: conda - name: gtk2 - version: 2.24.33 - build: h2c15c3c_5 - build_number: 5 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/gtk2-2.24.33-h2c15c3c_5.conda - sha256: 
9d7a50dae4aef357473b16c5121c1803a0c9ee1b8f93c4d90dc0196ae5007208 - md5: 308376a1154bc0ab3bbeeccf6ff986be - depends: - - __osx >=10.13 - - atk-1.0 >=2.38.0 - - cairo >=1.18.0,<2.0a0 - - gdk-pixbuf >=2.42.12,<3.0a0 - - libglib >=2.80.3,<3.0a0 - - libintl >=0.22.5,<1.0a0 - - pango >=1.54.0,<2.0a0 - license: LGPL-2.1-or-later - purls: [] - size: 6162947 - timestamp: 1721286459536 - kind: conda name: gtk2 version: 2.24.33 @@ -11456,23 +8951,6 @@ packages: purls: [] size: 6152068 timestamp: 1721286930050 -- kind: conda - name: gts - version: 0.7.6 - build: h53e17e3_4 - build_number: 4 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/gts-0.7.6-h53e17e3_4.conda - sha256: d5b82a36f7e9d7636b854e56d1b4fe01c4d895128a7b73e2ec6945b691ff3314 - md5: 848cc963fcfbd063c7a023024aa3bec0 - depends: - - libcxx >=15.0.7 - - libglib >=2.76.3,<3.0a0 - license: LGPL-2.0-or-later - license_family: LGPL - purls: [] - size: 280972 - timestamp: 1686545425074 - kind: conda name: gts version: 0.7.6 @@ -11564,28 +9042,6 @@ packages: - pkg:pypi/h2?source=hash-mapping size: 46754 timestamp: 1634280590080 -- kind: conda - name: harfbuzz - version: 9.0.0 - build: h098a298_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/harfbuzz-9.0.0-h098a298_1.conda - sha256: dbc7783ea89faaf3a810d0e55979be02031551be8edad00de915807b3b148ff1 - md5: 8dd3c790d5ce9f3bc94c46e5b218e5f8 - depends: - - __osx >=10.13 - - cairo >=1.18.0,<2.0a0 - - freetype >=2.12.1,<3.0a0 - - graphite2 - - icu >=75.1,<76.0a0 - - libcxx >=16 - - libglib >=2.80.3,<3.0a0 - license: MIT - license_family: MIT - purls: [] - size: 1372588 - timestamp: 1721186294497 - kind: conda name: harfbuzz version: 9.0.0 @@ -11733,26 +9189,8 @@ packages: license: BSD-3-Clause license_family: BSD purls: [] - size: 779637 - timestamp: 1695662145568 -- kind: conda - name: hdf4 - version: 4.2.15 - build: h8138101_7 - build_number: 7 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/hdf4-4.2.15-h8138101_7.conda - sha256: 8c767cc71226e9eb62649c903c68ba73c5f5e7e3696ec0319d1f90586cebec7d - md5: 7ce543bf38dbfae0de9af112ee178af2 - depends: - - libcxx >=15.0.7 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libzlib >=1.2.13,<2.0.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 724103 - timestamp: 1695661907511 + size: 779637 + timestamp: 1695662145568 - kind: conda name: hdf5 version: 1.14.3 @@ -11775,30 +9213,6 @@ packages: purls: [] size: 2039111 timestamp: 1717587493910 -- kind: conda - name: hdf5 - version: 1.14.3 - build: nompi_h687a608_105 - build_number: 105 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/hdf5-1.14.3-nompi_h687a608_105.conda - sha256: 98f8350730d09e8ad7b62ca6d6be38ee2324b11bbcd1a5fe2cac619b12cd68d7 - md5: 98544299f6bb2ef4d7362506a3dde886 - depends: - - __osx >=10.13 - - libaec >=1.1.3,<2.0a0 - - libcurl >=8.8.0,<9.0a0 - - libcxx >=16 - - libgfortran 5.* - - libgfortran5 >=12.3.0 - - libgfortran5 >=13.2.0 - - libzlib >=1.2.13,<2.0a0 - - openssl >=3.3.1,<4.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 3733954 - timestamp: 1717588360008 - kind: conda name: hdf5 version: 1.14.3 @@ -11942,25 +9356,11 @@ packages: - setuptools - sortedcontainers >=2.1.0,<3.0.0 license: MPL-2.0 + license_family: MOZILLA purls: - pkg:pypi/hypothesis?source=hash-mapping size: 334066 timestamp: 1726228987330 -- kind: conda - name: icu - version: '75.1' - build: h120a0e1_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/icu-75.1-h120a0e1_0.conda 
- sha256: 2e64307532f482a0929412976c8450c719d558ba20c0962832132fd0d07ba7a7 - md5: d68d48a3060eb5abdc1cdc8e2a3a5966 - depends: - - __osx >=10.13 - license: MIT - license_family: MIT - purls: [] - size: 11761697 - timestamp: 1720853679409 - kind: conda name: icu version: '75.1' @@ -12012,21 +9412,20 @@ packages: timestamp: 1720853997952 - kind: conda name: idna - version: '3.8' + version: '3.10' build: pyhd8ed1ab_0 subdir: noarch noarch: python - url: https://conda.anaconda.org/conda-forge/noarch/idna-3.8-pyhd8ed1ab_0.conda - sha256: 8660d38b272d3713ec8ac5ae918bc3bc80e1b81e1a7d61df554bded71ada6110 - md5: 99e164522f6bdf23c177c8d9ae63f975 + url: https://conda.anaconda.org/conda-forge/noarch/idna-3.10-pyhd8ed1ab_0.conda + sha256: 8c57fd68e6be5eecba4462e983aed7e85761a519aab80e834bbd7794d4b545b2 + md5: 7ba2ede0e7c795ff95088daf0dc59753 depends: - python >=3.6 license: BSD-3-Clause - license_family: BSD purls: - pkg:pypi/idna?source=hash-mapping - size: 49275 - timestamp: 1724450633325 + size: 49837 + timestamp: 1726459583613 - kind: conda name: imagesize version: 1.4.1 @@ -12061,23 +9460,6 @@ packages: purls: [] size: 153017 timestamp: 1725971790238 -- kind: conda - name: imath - version: 3.1.12 - build: h2016aa1_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/imath-3.1.12-h2016aa1_0.conda - sha256: 5bf9c041b97b1af21808938fcaa64acafe0d853de5478fa08005176664ee4552 - md5: 326b3d68ab3f43396e7d7e0e9a496f73 - depends: - - __osx >=10.13 - - libcxx >=17 - - libzlib >=1.3.1,<2.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 155534 - timestamp: 1725971674035 - kind: conda name: imath version: 3.1.12 @@ -12165,7 +9547,7 @@ packages: license: Apache-2.0 license_family: APACHE purls: - - pkg:pypi/importlib-resources?source=compressed-mapping + - pkg:pypi/importlib-resources?source=hash-mapping size: 32725 timestamp: 1725921462405 - kind: conda @@ -12532,22 +9914,6 @@ packages: purls: [] size: 83682 timestamp: 1720812978049 -- kind: conda - name: json-c - version: '0.17' - build: h6253ea5_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/json-c-0.17-h6253ea5_1.conda - sha256: 66ddd1a4d643c7c800a1bb8e61f5f4198ec102be37db9a6d2e037004442eff8d - md5: fb72a2ef514c2df4ba035187945a6dcf - depends: - - __osx >=10.13 - license: MIT - license_family: MIT - purls: [] - size: 72163 - timestamp: 1720813111542 - kind: conda name: json-c version: '0.17' @@ -12613,21 +9979,6 @@ packages: purls: [] size: 194553 timestamp: 1640883128046 -- kind: conda - name: jsoncpp - version: 1.9.5 - build: h940c156_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/jsoncpp-1.9.5-h940c156_1.tar.bz2 - sha256: 291696d97a252af1a50adf245f832ebf6c9f566effdae18fa11467833eb6947f - md5: 45824afbfd843fe0584ae8df22f1d99a - depends: - - libcxx >=11.1.0 - license: LicenseRef-Public-Domain OR MIT - purls: [] - size: 173394 - timestamp: 1640883229294 - kind: conda name: jsoncpp version: 1.9.5 @@ -12698,24 +10049,6 @@ packages: - pkg:pypi/jsonpointer?source=hash-mapping size: 17645 timestamp: 1725303065473 -- kind: conda - name: jsonpointer - version: 3.0.0 - build: py311h6eed73b_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/jsonpointer-3.0.0-py311h6eed73b_1.conda - sha256: 2499e5ebb3efa4186d6922122224d16bac791a5c0adad5b48b2bcd1e1e2afc8d - md5: b6c1710105dad14d47001a339cd14da6 - depends: - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: BSD-3-Clause - license_family: BSD - 
purls: - - pkg:pypi/jsonpointer?source=hash-mapping - size: 17727 - timestamp: 1725302991176 - kind: conda name: jsonschema version: 4.23.0 @@ -12931,25 +10264,6 @@ packages: - pkg:pypi/jupyter-core?source=hash-mapping size: 95226 timestamp: 1710257482063 -- kind: conda - name: jupyter_core - version: 5.7.2 - build: py311h6eed73b_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/jupyter_core-5.7.2-py311h6eed73b_0.conda - sha256: 3078f27009ce1f3cdd46dc97bd4f3f51277aa5957f6a90e300c613bd848767b7 - md5: 582fe977a5a6b9f37a72ff34a753381e - depends: - - platformdirs >=2.5 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - traitlets >=5.3 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/jupyter-core?source=hash-mapping - size: 95661 - timestamp: 1710257750738 - kind: conda name: jupyter_events version: 0.10.0 @@ -13127,20 +10441,6 @@ packages: - pkg:pypi/jupyterlab-widgets?source=hash-mapping size: 186024 timestamp: 1724331451102 -- kind: conda - name: jxrlib - version: '1.1' - build: h10d778d_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/jxrlib-1.1-h10d778d_3.conda - sha256: a548a4be14a4c76d6d992a5c1feffcbb08062f5c57abc6e4278d40c2c9a7185b - md5: cfaf81d843a80812fe16a68bdae60562 - license: BSD-2-Clause - license_family: BSD - purls: [] - size: 220376 - timestamp: 1703334073774 - kind: conda name: jxrlib version: '1.1' @@ -13226,24 +10526,6 @@ packages: purls: [] size: 142261 timestamp: 1725399546359 -- kind: conda - name: kealib - version: 1.5.3 - build: he475af8_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/kealib-1.5.3-he475af8_2.conda - sha256: 12badb5e2f8bd38bee33a3c3ec0108a37f106f286e2caad97d8c660936d59249 - md5: 1d0f27a93940d512f681fe3f4f7439f0 - depends: - - __osx >=10.13 - - hdf5 >=1.14.3,<1.14.4.0a0 - - libcxx >=17 - license: MIT - license_family: MIT - purls: [] - size: 150151 - timestamp: 1725399603970 - kind: conda name: kealib version: 1.5.3 @@ -13409,25 +10691,6 @@ packages: - pkg:pypi/kiwisolver?source=hash-mapping size: 72393 timestamp: 1725459421768 -- kind: conda - name: kiwisolver - version: 1.4.7 - build: py311hf2f7c97_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/kiwisolver-1.4.7-py311hf2f7c97_0.conda - sha256: 00b477bff9138ca51edd94f7b31ce9fe2cd13a1dc8768abcf037a22eccf26940 - md5: 24b0e3e3444be9fabcc8457c409e297f - depends: - - __osx >=10.13 - - libcxx >=17 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/kiwisolver?source=hash-mapping - size: 60398 - timestamp: 1725459431458 - kind: conda name: krb5 version: 1.21.3 @@ -13447,25 +10710,6 @@ packages: purls: [] size: 1155530 timestamp: 1719463474401 -- kind: conda - name: krb5 - version: 1.21.3 - build: h37d8d59_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/krb5-1.21.3-h37d8d59_0.conda - sha256: 83b52685a4ce542772f0892a0f05764ac69d57187975579a0835ff255ae3ef9c - md5: d4765c524b1d91567886bde656fb514b - depends: - - __osx >=10.13 - - libcxx >=16 - - libedit >=3.1.20191231,<3.2.0a0 - - libedit >=3.1.20191231,<4.0a0 - - openssl >=3.3.1,<4.0a0 - license: MIT - license_family: MIT - purls: [] - size: 1185323 - timestamp: 1719463492984 - kind: conda name: krb5 version: 1.21.3 @@ -13534,20 +10778,6 @@ packages: purls: [] size: 528805 timestamp: 1664996399305 -- kind: conda - name: lame - version: '3.100' - build: hb7f2c08_1003 - build_number: 1003 - subdir: osx-64 - url: 
https://conda.anaconda.org/conda-forge/osx-64/lame-3.100-hb7f2c08_1003.tar.bz2 - sha256: 0f943b08abb4c748d73207594321b53bad47eea3e7d06b6078e0f6c59ce6771e - md5: 3342b33c9a0921b22b767ed68ee25861 - license: LGPL-2.0-only - license_family: LGPL - purls: [] - size: 542681 - timestamp: 1664996421531 - kind: conda name: lcms2 version: '2.16' @@ -13583,22 +10813,6 @@ packages: purls: [] size: 211959 timestamp: 1701647962657 -- kind: conda - name: lcms2 - version: '2.16' - build: ha2f27b4_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/lcms2-2.16-ha2f27b4_0.conda - sha256: 222ebc0a55544b9922f61e75015d02861e65b48f12113af41d48ba0814e14e4e - md5: 1442db8f03517834843666c422238c9b - depends: - - libjpeg-turbo >=3.0.0,<4.0a0 - - libtiff >=4.6.0,<4.7.0a0 - license: MIT - license_family: MIT - purls: [] - size: 224432 - timestamp: 1701648089496 - kind: conda name: lcms2 version: '2.16' @@ -13694,21 +10908,6 @@ packages: purls: [] size: 215721 timestamp: 1657977558796 -- kind: conda - name: lerc - version: 4.0.0 - build: hb486fe8_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/lerc-4.0.0-hb486fe8_0.tar.bz2 - sha256: e41790fc0f4089726369b3c7f813117bbc14b533e0ed8b94cf75aba252e82497 - md5: f9d6a4c82889d5ecedec1d90eb673c55 - depends: - - libcxx >=13.0.1 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 290319 - timestamp: 1657977526749 - kind: conda name: libabseil version: '20240116.2' @@ -13771,26 +10970,6 @@ packages: purls: [] size: 1802886 timestamp: 1720857653184 -- kind: conda - name: libabseil - version: '20240116.2' - build: cxx17_hf036a51_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libabseil-20240116.2-cxx17_hf036a51_1.conda - sha256: 396d18f39d5207ecae06fddcbc6e5f20865718939bc4e0ea9729e13952833aac - md5: d6c78ca84abed3fea5f308ac83b8f54e - depends: - - __osx >=10.13 - - libcxx >=16 - constrains: - - abseil-cpp =20240116.2 - - libabseil-static =20240116.2=cxx17* - license: Apache-2.0 - license_family: Apache - purls: [] - size: 1124364 - timestamp: 1720857589333 - kind: conda name: libaec version: 1.1.3 @@ -13824,21 +11003,6 @@ packages: purls: [] size: 32567 timestamp: 1711021603471 -- kind: conda - name: libaec - version: 1.1.3 - build: h73e2aa4_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libaec-1.1.3-h73e2aa4_0.conda - sha256: dae5921339c5d89f4bf58a95fd4e9c76270dbf7f6a94f3c5081b574905fcccf8 - md5: 66d3c1f6dd4636216b4fca7a748d50eb - depends: - - libcxx >=16 - license: BSD-2-Clause - license_family: BSD - purls: [] - size: 28602 - timestamp: 1711021419744 - kind: conda name: libaec version: 1.1.3 @@ -13854,30 +11018,6 @@ packages: purls: [] size: 28451 timestamp: 1711021498493 -- kind: conda - name: libarchive - version: 3.7.4 - build: h20e244c_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libarchive-3.7.4-h20e244c_0.conda - sha256: 9e46db25e976630e6738b351d76d9b79047ae232638b82f9f45eba774caaef8a - md5: 82a85fa38e83366009b7f4b2cef4deb8 - depends: - - __osx >=10.13 - - bzip2 >=1.0.8,<2.0a0 - - libiconv >=1.17,<2.0a0 - - libxml2 >=2.12.7,<3.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - lz4-c >=1.9.3,<1.10.0a0 - - lzo >=2.10,<3.0a0 - - openssl >=3.3.0,<4.0a0 - - xz >=5.2.6,<6.0a0 - - zstd >=1.5.6,<1.6.0a0 - license: BSD-2-Clause - license_family: BSD - purls: [] - size: 742682 - timestamp: 1716394747351 - kind: conda name: libarchive version: 3.7.4 @@ -13953,12 +11093,12 @@ packages: - kind: conda name: libarrow version: 17.0.0 - build: 
h20538ec_13_cpu - build_number: 13 + build: h77c2f02_14_cpu + build_number: 14 subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-17.0.0-h20538ec_13_cpu.conda - sha256: fb77709a184e934b8662388f91bf9bd51a96eb3b11c53d0453e9bc43b01b4c27 - md5: c6813f605a0fd1947e768b5f6138e0a7 + url: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-17.0.0-h77c2f02_14_cpu.conda + sha256: 51b8903b706466b48a65fcbb54665bbaffe134bb7df387d276acbf1812c38554 + md5: 3e0d761c4cd976f6c3db33d659548743 depends: - __osx >=11.0 - aws-crt-cpp >=0.28.2,<0.28.3.0a0 @@ -13974,46 +11114,8 @@ packages: - libbrotlidec >=1.1.0,<1.2.0a0 - libbrotlienc >=1.1.0,<1.2.0a0 - libcxx >=17 - - libgoogle-cloud >=2.28.0,<2.29.0a0 - - libgoogle-cloud-storage >=2.28.0,<2.29.0a0 - - libre2-11 >=2023.9.1,<2024.0a0 - - libutf8proc >=2.8.0,<3.0a0 - - libzlib >=1.3.1,<2.0a0 - - lz4-c >=1.9.3,<1.10.0a0 - - orc >=2.0.2,<2.0.3.0a0 - - re2 - - snappy >=1.2.1,<1.3.0a0 - - zstd >=1.5.6,<1.6.0a0 - constrains: - - apache-arrow-proc =*=cpu - - parquet-cpp <0.0a0 - - arrow-cpp <0.0a0 - license: Apache-2.0 - license_family: APACHE - purls: [] - size: 5253411 - timestamp: 1725214512114 -- kind: conda - name: libarrow - version: 17.0.0 - build: h29daf90_13_cpu - build_number: 13 - subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/libarrow-17.0.0-h29daf90_13_cpu.conda - sha256: 1a0f66e822f4cde398b15fe7ac94cb4197635798da9feebcb88c900637e05f77 - md5: d0ea8c4474c45aae86eff71a0f293013 - depends: - - aws-crt-cpp >=0.28.2,<0.28.3.0a0 - - aws-sdk-cpp >=1.11.379,<1.11.380.0a0 - - bzip2 >=1.0.8,<2.0a0 - - libabseil * cxx17* - - libabseil >=20240116.2,<20240117.0a0 - - libbrotlidec >=1.1.0,<1.2.0a0 - - libbrotlienc >=1.1.0,<1.2.0a0 - - libcrc32c >=1.1.2,<1.2.0a0 - - libcurl >=8.9.1,<9.0a0 - - libgoogle-cloud >=2.28.0,<2.29.0a0 - - libgoogle-cloud-storage >=2.28.0,<2.29.0a0 + - libgoogle-cloud >=2.29.0,<2.30.0a0 + - libgoogle-cloud-storage >=2.29.0,<2.30.0a0 - libre2-11 >=2023.9.1,<2024.0a0 - libutf8proc >=2.8.0,<3.0a0 - libzlib >=1.3.1,<2.0a0 @@ -14021,28 +11123,24 @@ packages: - orc >=2.0.2,<2.0.3.0a0 - re2 - snappy >=1.2.1,<1.3.0a0 - - ucrt >=10.0.20348.0 - - vc >=14.2,<15 - - vc14_runtime >=14.29.30139 - zstd >=1.5.6,<1.6.0a0 constrains: - arrow-cpp <0.0a0 - parquet-cpp <0.0a0 - apache-arrow-proc =*=cpu license: Apache-2.0 - license_family: APACHE purls: [] - size: 5128979 - timestamp: 1725215183038 + size: 5344222 + timestamp: 1726334126114 - kind: conda name: libarrow version: 17.0.0 - build: h8d2e343_13_cpu - build_number: 13 + build: hc80a628_14_cpu + build_number: 14 subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/libarrow-17.0.0-h8d2e343_13_cpu.conda - sha256: 91e639761f29ee1ca144e92110d47c8e68038f26201eef25585a48826e037fb2 - md5: dc379f362829d5df5ce6722565110029 + url: https://conda.anaconda.org/conda-forge/linux-64/libarrow-17.0.0-hc80a628_14_cpu.conda + sha256: 05cad1a288c40ea9e8c0712d879a62b22458421e759823e9d1ff096c26e3880b + md5: 00ae1eabd338f01d8b6810d6944d1440 depends: - __glibc >=2.17,<3.0.a0 - aws-crt-cpp >=0.28.2,<0.28.3.0a0 @@ -14059,8 +11157,8 @@ packages: - libbrotlidec >=1.1.0,<1.2.0a0 - libbrotlienc >=1.1.0,<1.2.0a0 - libgcc >=13 - - libgoogle-cloud >=2.28.0,<2.29.0a0 - - libgoogle-cloud-storage >=2.28.0,<2.29.0a0 + - libgoogle-cloud >=2.29.0,<2.30.0a0 + - libgoogle-cloud-storage >=2.29.0,<2.30.0a0 - libre2-11 >=2023.9.1,<2024.0a0 - libstdcxx >=13 - libutf8proc >=2.8.0,<3.0a0 @@ -14071,40 +11169,34 @@ packages: - snappy >=1.2.1,<1.3.0a0 - zstd >=1.5.6,<1.6.0a0 
constrains: - - arrow-cpp <0.0a0 - parquet-cpp <0.0a0 - apache-arrow-proc =*=cpu + - arrow-cpp <0.0a0 license: Apache-2.0 - license_family: APACHE purls: [] - size: 8512685 - timestamp: 1725214716301 + size: 8521011 + timestamp: 1726334764839 - kind: conda name: libarrow version: 17.0.0 - build: ha60c65e_13_cpu - build_number: 13 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libarrow-17.0.0-ha60c65e_13_cpu.conda - sha256: d8096066ce779a82cbb2045030f8095ed5689cac2ac1ee0c58251e7f448f1a87 - md5: 4cdf43459510697d824c377428a120b1 + build: he3462ed_14_cpu + build_number: 14 + subdir: win-64 + url: https://conda.anaconda.org/conda-forge/win-64/libarrow-17.0.0-he3462ed_14_cpu.conda + sha256: 7756b21f94ed41901b9fb7adb4ccf8d0574da9b4dd13728755eb13bfa8955d48 + md5: cf3d72b24681348b84cfb89e12af23ab depends: - - __osx >=10.13 - aws-crt-cpp >=0.28.2,<0.28.3.0a0 - aws-sdk-cpp >=1.11.379,<1.11.380.0a0 - - azure-core-cpp >=1.13.0,<1.13.1.0a0 - - azure-identity-cpp >=1.8.0,<1.8.1.0a0 - - azure-storage-blobs-cpp >=12.12.0,<12.12.1.0a0 - - azure-storage-files-datalake-cpp >=12.11.0,<12.11.1.0a0 - bzip2 >=1.0.8,<2.0a0 - - glog >=0.7.1,<0.8.0a0 - libabseil * cxx17* - libabseil >=20240116.2,<20240117.0a0 - libbrotlidec >=1.1.0,<1.2.0a0 - libbrotlienc >=1.1.0,<1.2.0a0 - - libcxx >=17 - - libgoogle-cloud >=2.28.0,<2.29.0a0 - - libgoogle-cloud-storage >=2.28.0,<2.29.0a0 + - libcrc32c >=1.1.2,<1.2.0a0 + - libcurl >=8.10.0,<9.0a0 + - libgoogle-cloud >=2.29.0,<2.30.0a0 + - libgoogle-cloud-storage >=2.29.0,<2.30.0a0 - libre2-11 >=2023.9.1,<2024.0a0 - libutf8proc >=2.8.0,<3.0a0 - libzlib >=1.3.1,<2.0a0 @@ -14112,266 +11204,198 @@ packages: - orc >=2.0.2,<2.0.3.0a0 - re2 - snappy >=1.2.1,<1.3.0a0 + - ucrt >=10.0.20348.0 + - vc >=14.2,<15 + - vc14_runtime >=14.29.30139 - zstd >=1.5.6,<1.6.0a0 constrains: - - arrow-cpp <0.0a0 - parquet-cpp <0.0a0 + - arrow-cpp <0.0a0 - apache-arrow-proc =*=cpu license: Apache-2.0 - license_family: APACHE purls: [] - size: 5899274 - timestamp: 1725214352592 + size: 5145979 + timestamp: 1726335377495 - kind: conda name: libarrow-acero version: 17.0.0 - build: h5888daf_13_cpu - build_number: 13 + build: h5888daf_14_cpu + build_number: 14 subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/libarrow-acero-17.0.0-h5888daf_13_cpu.conda - sha256: cda9e38ad7af7ba72416031b089de5048f8526ae586149ff9f6506366689d699 - md5: b654d072b8d5da807495e49b28a0b884 + url: https://conda.anaconda.org/conda-forge/linux-64/libarrow-acero-17.0.0-h5888daf_14_cpu.conda + sha256: bb2f935f92608ec76de2a7da3c587e4d57101016c9019d4f45cc345cf86d18bd + md5: d031fb9d87d0196563b3d47b986fbae7 depends: - __glibc >=2.17,<3.0.a0 - - libarrow 17.0.0 h8d2e343_13_cpu + - libarrow 17.0.0 hc80a628_14_cpu - libgcc >=13 - libstdcxx >=13 license: Apache-2.0 - license_family: APACHE - purls: [] - size: 609649 - timestamp: 1725214754397 -- kind: conda - name: libarrow-acero - version: 17.0.0 - build: hac325c4_13_cpu - build_number: 13 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libarrow-acero-17.0.0-hac325c4_13_cpu.conda - sha256: c6195a789edb257746ca9f8648419c9efdb67e0ef62d2ba818eaa921f94e90af - md5: 218079f1d0ba0a46246db86a9e96c417 - depends: - - __osx >=10.13 - - libarrow 17.0.0 ha60c65e_13_cpu - - libcxx >=17 - license: Apache-2.0 - license_family: APACHE purls: [] - size: 515115 - timestamp: 1725214443841 + size: 608316 + timestamp: 1726334811486 - kind: conda name: libarrow-acero version: 17.0.0 - build: he0c23c2_13_cpu - build_number: 13 + build: he0c23c2_14_cpu + 
build_number: 14 subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/libarrow-acero-17.0.0-he0c23c2_13_cpu.conda - sha256: 850b28abba3e40302cb5425ffb96f085d2089decafb2e80d85b4f8b44c2c777d - md5: 1a38e993ef119557596ae20cd68a1207 + url: https://conda.anaconda.org/conda-forge/win-64/libarrow-acero-17.0.0-he0c23c2_14_cpu.conda + sha256: 83adaa301ab3c045eeaa00f751248afa62a1b9397ec0b633bead1c310416d237 + md5: b07608814b2d2136daac0d898825bcb7 depends: - - libarrow 17.0.0 h29daf90_13_cpu + - libarrow 17.0.0 he3462ed_14_cpu - ucrt >=10.0.20348.0 - vc >=14.2,<15 - vc14_runtime >=14.29.30139 license: Apache-2.0 - license_family: APACHE purls: [] - size: 445286 - timestamp: 1725215254997 + size: 444588 + timestamp: 1726335464668 - kind: conda name: libarrow-acero version: 17.0.0 - build: hf9b8971_13_cpu - build_number: 13 + build: hf9b8971_14_cpu + build_number: 14 subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-acero-17.0.0-hf9b8971_13_cpu.conda - sha256: ea958d1947670dc913f1a0ee631c70d8ac8d6db5f08039002b233bf896815cb2 - md5: 5d52b70e8174f52934d646bbaf0a928b + url: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-acero-17.0.0-hf9b8971_14_cpu.conda + sha256: 112e0795480e108a1d159a34b430b4e138bc5e2709c1899685f6e769f5190aae + md5: 5c9256947f5152bc2044a725a3e4994b depends: - __osx >=11.0 - - libarrow 17.0.0 h20538ec_13_cpu + - libarrow 17.0.0 h77c2f02_14_cpu - libcxx >=17 license: Apache-2.0 - license_family: APACHE purls: [] - size: 477391 - timestamp: 1725214612356 + size: 483609 + timestamp: 1726334200324 - kind: conda name: libarrow-dataset version: 17.0.0 - build: h5888daf_13_cpu - build_number: 13 + build: h5888daf_14_cpu + build_number: 14 subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/libarrow-dataset-17.0.0-h5888daf_13_cpu.conda - sha256: b3fac9bc9a399670d6993738d018324d6e1b0a85755b484204405bb72efabc4e - md5: cd2c36e8865b158b82f61c6aac28b7e1 + url: https://conda.anaconda.org/conda-forge/linux-64/libarrow-dataset-17.0.0-h5888daf_14_cpu.conda + sha256: 3c461f8c4f17100215e001046b111e14b896bdf5e4cbed597c8b4ad9210df22e + md5: e419802e159434b071afeb393e14bc8c depends: - - __glibc >=2.17,<3.0.a0 - - libarrow 17.0.0 h8d2e343_13_cpu - - libarrow-acero 17.0.0 h5888daf_13_cpu - - libgcc >=13 - - libparquet 17.0.0 h39682fd_13_cpu - - libstdcxx >=13 - license: Apache-2.0 - license_family: APACHE - purls: [] - size: 582848 - timestamp: 1725214820464 -- kind: conda - name: libarrow-dataset - version: 17.0.0 - build: hac325c4_13_cpu - build_number: 13 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libarrow-dataset-17.0.0-hac325c4_13_cpu.conda - sha256: de66e86133af737ecafd67a043f2756afb78fe77503bcf8e1dc2b73a706f55b5 - md5: d7609f5867208b278655602ac636363b - depends: - - __osx >=10.13 - - libarrow 17.0.0 ha60c65e_13_cpu - - libarrow-acero 17.0.0 hac325c4_13_cpu - - libcxx >=17 - - libparquet 17.0.0 hf1b0f52_13_cpu + - __glibc >=2.17,<3.0.a0 + - libarrow 17.0.0 hc80a628_14_cpu + - libarrow-acero 17.0.0 h5888daf_14_cpu + - libgcc >=13 + - libparquet 17.0.0 h39682fd_14_cpu + - libstdcxx >=13 license: Apache-2.0 - license_family: APACHE purls: [] - size: 506575 - timestamp: 1725215307580 + size: 585120 + timestamp: 1726334892529 - kind: conda name: libarrow-dataset version: 17.0.0 - build: he0c23c2_13_cpu - build_number: 13 + build: he0c23c2_14_cpu + build_number: 14 subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/libarrow-dataset-17.0.0-he0c23c2_13_cpu.conda - sha256: 
12b0395dc22a2c3fb03e8b8ab32bcf4ff08947b8611b2a1e9c49644d8391893c - md5: dd78096e1335abc3c7bf6915d0ac7c34 + url: https://conda.anaconda.org/conda-forge/win-64/libarrow-dataset-17.0.0-he0c23c2_14_cpu.conda + sha256: 478f65420cdd2c0232611ca7e504a120dcaae4e699f58267853a31c26f06a14f + md5: aedfb97ed0c3762a2786a4be1675f643 depends: - - libarrow 17.0.0 h29daf90_13_cpu - - libarrow-acero 17.0.0 he0c23c2_13_cpu - - libparquet 17.0.0 ha915800_13_cpu + - libarrow 17.0.0 he3462ed_14_cpu + - libarrow-acero 17.0.0 he0c23c2_14_cpu + - libparquet 17.0.0 ha915800_14_cpu - ucrt >=10.0.20348.0 - vc >=14.2,<15 - vc14_runtime >=14.29.30139 license: Apache-2.0 - license_family: APACHE purls: [] - size: 427535 - timestamp: 1725215469376 + size: 427069 + timestamp: 1726335734124 - kind: conda name: libarrow-dataset version: 17.0.0 - build: hf9b8971_13_cpu - build_number: 13 + build: hf9b8971_14_cpu + build_number: 14 subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-dataset-17.0.0-hf9b8971_13_cpu.conda - sha256: 3dac99549e4f9307bb4d35e94003f6e7c9052a957de122ec78c5d9750fd46096 - md5: 35df832aa0371c7bbe9fec5ea1c3139c + url: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-dataset-17.0.0-hf9b8971_14_cpu.conda + sha256: 60cc15ee46e0cdb809c6a822f55192c418c7e807cd4ab9dea51b770fa339af5a + md5: a9475d8085f99d2211621ab0f245c993 depends: - __osx >=11.0 - - libarrow 17.0.0 h20538ec_13_cpu - - libarrow-acero 17.0.0 hf9b8971_13_cpu + - libarrow 17.0.0 h77c2f02_14_cpu + - libarrow-acero 17.0.0 hf9b8971_14_cpu - libcxx >=17 - - libparquet 17.0.0 hf0ba9ef_13_cpu + - libparquet 17.0.0 hf0ba9ef_14_cpu license: Apache-2.0 - license_family: APACHE purls: [] - size: 482335 - timestamp: 1725215502662 + size: 491698 + timestamp: 1726335017443 - kind: conda name: libarrow-substrait version: 17.0.0 - build: h1f0e801_13_cpu - build_number: 13 + build: h1f0e801_14_cpu + build_number: 14 subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/libarrow-substrait-17.0.0-h1f0e801_13_cpu.conda - sha256: 637c2652cfe676d6949f7953de7d51e90bc35863c3a114c29795b5b0e119699c - md5: b618c36e7eff7a28a53bde4d9aa017e0 + url: https://conda.anaconda.org/conda-forge/win-64/libarrow-substrait-17.0.0-h1f0e801_14_cpu.conda + sha256: 732c3225859448a5a3ecab1c6ad0bc1e8f17acf5a65165c9e4e68bd5f2ef5a35 + md5: 95efdbfd06c2eb63f5ca695a6431ff28 depends: - libabseil * cxx17* - libabseil >=20240116.2,<20240117.0a0 - - libarrow 17.0.0 h29daf90_13_cpu - - libarrow-acero 17.0.0 he0c23c2_13_cpu - - libarrow-dataset 17.0.0 he0c23c2_13_cpu + - libarrow 17.0.0 he3462ed_14_cpu + - libarrow-acero 17.0.0 he0c23c2_14_cpu + - libarrow-dataset 17.0.0 he0c23c2_14_cpu - libprotobuf >=4.25.3,<4.25.4.0a0 - ucrt >=10.0.20348.0 - vc >=14.2,<15 - vc14_runtime >=14.29.30139 license: Apache-2.0 - license_family: APACHE - purls: [] - size: 382757 - timestamp: 1725215569161 -- kind: conda - name: libarrow-substrait - version: 17.0.0 - build: hba007a9_13_cpu - build_number: 13 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libarrow-substrait-17.0.0-hba007a9_13_cpu.conda - sha256: 729523ec54db45127b1e644454d3612ce48196c27426ae5c2ace022b6791bf53 - md5: 883ffa72318b7952df9a21243ab2f281 - depends: - - __osx >=10.13 - - libabseil * cxx17* - - libabseil >=20240116.2,<20240117.0a0 - - libarrow 17.0.0 ha60c65e_13_cpu - - libarrow-acero 17.0.0 hac325c4_13_cpu - - libarrow-dataset 17.0.0 hac325c4_13_cpu - - libcxx >=17 - - libprotobuf >=4.25.3,<4.25.4.0a0 - license: Apache-2.0 - license_family: APACHE purls: [] - size: 478730 - 
timestamp: 1725215444041 + size: 382299 + timestamp: 1726335853385 - kind: conda name: libarrow-substrait version: 17.0.0 - build: hbf8b706_13_cpu - build_number: 13 + build: hbf8b706_14_cpu + build_number: 14 subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-substrait-17.0.0-hbf8b706_13_cpu.conda - sha256: 5fd22a610e14669f0f392aaf7b61511d9a7a5f99a23a3ce4bdf5b2880ddbd244 - md5: e079f9b14e75bb3f571b1345ce8dad78 + url: https://conda.anaconda.org/conda-forge/osx-arm64/libarrow-substrait-17.0.0-hbf8b706_14_cpu.conda + sha256: f43b8ca4126abd5e95cb492b735070e3b1ae3b2eb1a6903dbc08af97c196f578 + md5: 21a043ce8a8f295f2a7d0d1365574ab6 depends: - __osx >=11.0 - libabseil * cxx17* - libabseil >=20240116.2,<20240117.0a0 - - libarrow 17.0.0 h20538ec_13_cpu - - libarrow-acero 17.0.0 hf9b8971_13_cpu - - libarrow-dataset 17.0.0 hf9b8971_13_cpu + - libarrow 17.0.0 h77c2f02_14_cpu + - libarrow-acero 17.0.0 hf9b8971_14_cpu + - libarrow-dataset 17.0.0 hf9b8971_14_cpu - libcxx >=17 - libprotobuf >=4.25.3,<4.25.4.0a0 license: Apache-2.0 - license_family: APACHE purls: [] - size: 466501 - timestamp: 1725215667169 + size: 473450 + timestamp: 1726335147692 - kind: conda name: libarrow-substrait version: 17.0.0 - build: hf54134d_13_cpu - build_number: 13 + build: hf54134d_14_cpu + build_number: 14 subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/libarrow-substrait-17.0.0-hf54134d_13_cpu.conda - sha256: 01ff52d5b866f3174018c81dee808fbef1101f2cff05cc5f29c80ff68cc8796c - md5: 46f41533959eee8826c09e55976b8c06 + url: https://conda.anaconda.org/conda-forge/linux-64/libarrow-substrait-17.0.0-hf54134d_14_cpu.conda + sha256: 95bd98ea8a020223de4ee2d0bedbca11194fbf75ec1097d4f0edf13df7a3531f + md5: aceb64084d1e5124aa9fad22de48ccf7 depends: - __glibc >=2.17,<3.0.a0 - libabseil * cxx17* - libabseil >=20240116.2,<20240117.0a0 - - libarrow 17.0.0 h8d2e343_13_cpu - - libarrow-acero 17.0.0 h5888daf_13_cpu - - libarrow-dataset 17.0.0 h5888daf_13_cpu + - libarrow 17.0.0 hc80a628_14_cpu + - libarrow-acero 17.0.0 h5888daf_14_cpu + - libarrow-dataset 17.0.0 h5888daf_14_cpu - libgcc >=13 - libprotobuf >=4.25.3,<4.25.4.0a0 - libstdcxx >=13 license: Apache-2.0 - license_family: APACHE purls: [] - size: 550883 - timestamp: 1725214851656 + size: 550312 + timestamp: 1726334930205 - kind: conda name: libasprintf version: 0.22.5 @@ -14388,22 +11412,6 @@ packages: purls: [] size: 40657 timestamp: 1723626937704 -- kind: conda - name: libasprintf - version: 0.22.5 - build: hdfe23c8_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libasprintf-0.22.5-hdfe23c8_3.conda - sha256: 9c6f3e2558e098dbbc63c9884b4af368ea6cc4185ea027563ac4f5ee8571b143 - md5: 55363e1d53635b3497cdf753ab0690c1 - depends: - - __osx >=10.13 - - libcxx >=16 - license: LGPL-2.1-or-later - purls: [] - size: 40442 - timestamp: 1723626787726 - kind: conda name: libasprintf version: 0.22.5 @@ -14437,22 +11445,6 @@ packages: purls: [] size: 34648 timestamp: 1723626983419 -- kind: conda - name: libasprintf-devel - version: 0.22.5 - build: hdfe23c8_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libasprintf-devel-0.22.5-hdfe23c8_3.conda - sha256: 408e59cc215b654b292f503d37552d319e71180d33798867975377c28fd3c6b3 - md5: e2ae0568825e62d439a921fdc7f6db64 - depends: - - __osx >=10.13 - - libasprintf 0.22.5 hdfe23c8_3 - license: LGPL-2.1-or-later - purls: [] - size: 34522 - timestamp: 1723626838677 - kind: conda name: libasprintf-devel version: 0.22.5 @@ 
-14493,28 +11485,6 @@ packages: purls: [] size: 133110 timestamp: 1719985879751 -- kind: conda - name: libass - version: 0.17.3 - build: h5386a9e_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libass-0.17.3-h5386a9e_0.conda - sha256: 2a19c0230f0d6d707a2f0d3fdfe50fb41fbf05e88fb4a79e8e2b5a29f66c4c55 - md5: b6b8a0a32d77060c4431933a0ba11d3b - depends: - - __osx >=10.13 - - fontconfig >=2.14.2,<3.0a0 - - fonts-conda-ecosystem - - freetype >=2.12.1,<3.0a0 - - fribidi >=1.0.10,<2.0a0 - - harfbuzz >=9.0.0,<10.0a0 - - libexpat >=2.6.2,<3.0a0 - - libzlib >=1.3.1,<2.0a0 - license: ISC - license_family: OTHER - purls: [] - size: 133998 - timestamp: 1719986071273 - kind: conda name: libass version: 0.17.3 @@ -14537,28 +11507,6 @@ packages: purls: [] size: 116755 timestamp: 1719986027249 -- kind: conda - name: libblas - version: 3.9.0 - build: 22_osx64_openblas - build_number: 22 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libblas-3.9.0-22_osx64_openblas.conda - sha256: d72060239f904b3a81d2329efcf84dc62c2dfd66dbc4efc8dcae1afdf8f02b59 - md5: b80966a8c8dd0b531f8e65f709d732e8 - depends: - - libopenblas >=0.3.27,<0.3.28.0a0 - - libopenblas >=0.3.27,<1.0a0 - constrains: - - liblapacke 3.9.0 22_osx64_openblas - - blas * openblas - - libcblas 3.9.0 22_osx64_openblas - - liblapack 3.9.0 22_osx64_openblas - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 14749 - timestamp: 1712542279018 - kind: conda name: libblas version: 3.9.0 @@ -14624,22 +11572,6 @@ packages: purls: [] size: 5192100 timestamp: 1721689573083 -- kind: conda - name: libbrotlicommon - version: 1.1.0 - build: h00291cd_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libbrotlicommon-1.1.0-h00291cd_2.conda - sha256: b377056470a9fb4a100aa3c51b3581aab6496ba84d21cd99bcc1d5ef0359b1b6 - md5: 58f2c4bdd56c46cc7451596e4ae68e0b - depends: - - __osx >=10.13 - license: MIT - license_family: MIT - purls: [] - size: 67267 - timestamp: 1725267768667 - kind: conda name: libbrotlicommon version: 1.1.0 @@ -14691,23 +11623,6 @@ packages: purls: [] size: 68426 timestamp: 1725267943211 -- kind: conda - name: libbrotlidec - version: 1.1.0 - build: h00291cd_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libbrotlidec-1.1.0-h00291cd_2.conda - sha256: 4d49ea72e2f44d2d7a8be5472e4bd0bc2c6b89c55569de2c43576363a0685c0c - md5: 34709a1f5df44e054c4a12ab536c5459 - depends: - - __osx >=10.13 - - libbrotlicommon 1.1.0 h00291cd_2 - license: MIT - license_family: MIT - purls: [] - size: 29872 - timestamp: 1725267807289 - kind: conda name: libbrotlidec version: 1.1.0 @@ -14762,23 +11677,6 @@ packages: purls: [] size: 28378 timestamp: 1725267980316 -- kind: conda - name: libbrotlienc - version: 1.1.0 - build: h00291cd_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libbrotlienc-1.1.0-h00291cd_2.conda - sha256: 477d236d389473413a1ccd2bec1b66b2f1d2d7d1b4a57bb56421b7b611a56cd1 - md5: 691f0dcb36f1ae67f5c489f20ae987ea - depends: - - __osx >=10.13 - - libbrotlicommon 1.1.0 h00291cd_2 - license: MIT - license_family: MIT - purls: [] - size: 296353 - timestamp: 1725267822076 - kind: conda name: libbrotlienc version: 1.1.0 @@ -14833,26 +11731,6 @@ packages: purls: [] size: 279644 timestamp: 1725268003553 -- kind: conda - name: libcblas - version: 3.9.0 - build: 22_osx64_openblas - build_number: 22 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libcblas-3.9.0-22_osx64_openblas.conda - 
sha256: 6a2ba9198e2320c3e22fe3d121310cf8a8ac663e94100c5693b34523fcb3cc04 - md5: b9fef82772330f61b2b0201c72d2c29b - depends: - - libblas 3.9.0 22_osx64_openblas - constrains: - - liblapacke 3.9.0 22_osx64_openblas - - blas * openblas - - liblapack 3.9.0 22_osx64_openblas - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 14636 - timestamp: 1712542311437 - kind: conda name: libcblas version: 3.9.0 @@ -14913,24 +11791,6 @@ packages: purls: [] size: 5191981 timestamp: 1721689628480 -- kind: conda - name: libclang-cpp16 - version: 16.0.6 - build: default_h0c94c6a_13 - build_number: 13 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libclang-cpp16-16.0.6-default_h0c94c6a_13.conda - sha256: bc064c078a58ce81d26f2fc9b8414c8a7f6d8317caebbe86fe48b5ba2fbbf777 - md5: 04ad673e08f4ba5d434b0c96a2e90e3d - depends: - - __osx >=10.13 - - libcxx >=16.0.6 - - libllvm16 >=16.0.6,<16.1.0a0 - license: Apache-2.0 WITH LLVM-exception - license_family: Apache - purls: [] - size: 12823030 - timestamp: 1725061894194 - kind: conda name: libclang-cpp16 version: 16.0.6 @@ -14987,24 +11847,6 @@ packages: purls: [] size: 11017079 timestamp: 1725430212320 -- kind: conda - name: libclang13 - version: 18.1.8 - build: default_h9ff962c_4 - build_number: 4 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libclang13-18.1.8-default_h9ff962c_4.conda - sha256: 69784e2f221b926da336d9b935f018d921082ae427b157e9672d622e2794db46 - md5: ad31a668ef3526b95525337ab3c41d95 - depends: - - __osx >=10.13 - - libcxx >=18.1.8 - - libllvm18 >=18.1.8,<18.2.0a0 - license: Apache-2.0 WITH LLVM-exception - license_family: Apache - purls: [] - size: 8033230 - timestamp: 1725429887940 - kind: conda name: libclang13 version: 18.1.8 @@ -15090,21 +11932,6 @@ packages: purls: [] size: 18765 timestamp: 1633683992603 -- kind: conda - name: libcrc32c - version: 1.1.2 - build: he49afe7_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libcrc32c-1.1.2-he49afe7_0.tar.bz2 - sha256: 3043869ac1ee84554f177695e92f2f3c2c507b260edad38a0bf3981fce1632ff - md5: 23d6d5a69918a438355d7cbc4c3d54c9 - depends: - - libcxx >=11.1.0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 20128 - timestamp: 1633683906221 - kind: conda name: libcups version: 2.3.3 @@ -15165,27 +11992,6 @@ packages: purls: [] size: 342210 timestamp: 1726064608464 -- kind: conda - name: libcurl - version: 8.10.0 - build: h58e7537_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libcurl-8.10.0-h58e7537_0.conda - sha256: 6b28ed898562ee9e351bbb209fea25c9cd4078f2010223f23dbccc3be0c3d361 - md5: 732abd8f88ee1749239335c2328e5fc3 - depends: - - __osx >=10.13 - - krb5 >=1.21.3,<1.22.0a0 - - libnghttp2 >=1.58.0,<2.0a0 - - libssh2 >=1.11.0,<2.0a0 - - libzlib >=1.3.1,<2.0a0 - - openssl >=3.3.2,<4.0a0 - - zstd >=1.5.6,<1.6.0a0 - license: curl - license_family: MIT - purls: [] - size: 402216 - timestamp: 1726064094965 - kind: conda name: libcurl version: 8.10.0 @@ -15224,22 +12030,6 @@ packages: purls: [] size: 436921 timestamp: 1725403628507 -- kind: conda - name: libcxx - version: 18.1.8 - build: hd876a4e_7 - build_number: 7 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libcxx-18.1.8-hd876a4e_7.conda - sha256: ca43fcc18bff98cbf456ccc76fe113b2afe01d4156c2899b638fd1bc0323d239 - md5: c346ae5c96382a12563e3b0c403c8c4a - depends: - - __osx >=10.13 - license: Apache-2.0 WITH LLVM-exception - license_family: Apache - purls: [] - size: 439306 - timestamp: 1725403678987 - kind: conda name: 
libdeflate version: '1.21' @@ -15288,21 +12078,6 @@ packages: purls: [] size: 54533 timestamp: 1722820240854 -- kind: conda - name: libdeflate - version: '1.21' - build: hfdf4475_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libdeflate-1.21-hfdf4475_0.conda - sha256: 1defb3e5243a74a9ef64de2a47812f524664e46ca9dbecb8d7c746cb1779038e - md5: 88409b23a5585c15d52de0073f3c9c61 - depends: - - __osx >=10.13 - license: MIT - license_family: MIT - purls: [] - size: 70570 - timestamp: 1722820232914 - kind: conda name: libdrm version: 2.4.123 @@ -15320,22 +12095,6 @@ packages: purls: [] size: 303108 timestamp: 1724719521496 -- kind: conda - name: libedit - version: 3.1.20191231 - build: h0678c8f_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libedit-3.1.20191231-h0678c8f_2.tar.bz2 - sha256: dbd3c3f2eca1d21c52e4c03b21930bbce414c4592f8ce805801575b9e9256095 - md5: 6016a8a1d0e63cac3de2c352cd40208b - depends: - - ncurses >=6.2,<7.0.0a0 - license: BSD-2-Clause - license_family: BSD - purls: [] - size: 105382 - timestamp: 1597616576726 - kind: conda name: libedit version: 3.1.20191231 @@ -15384,20 +12143,6 @@ packages: purls: [] size: 44492 timestamp: 1723473193819 -- kind: conda - name: libev - version: '4.33' - build: h10d778d_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libev-4.33-h10d778d_2.conda - sha256: 0d238488564a7992942aa165ff994eca540f687753b4f0998b29b4e4d030ff43 - md5: 899db79329439820b7e8f8de41bca902 - license: BSD-2-Clause - license_family: BSD - purls: [] - size: 106663 - timestamp: 1702146352558 - kind: conda name: libev version: '4.33' @@ -15463,22 +12208,6 @@ packages: purls: [] size: 410555 timestamp: 1685726568668 -- kind: conda - name: libevent - version: 2.1.12 - build: ha90c15b_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libevent-2.1.12-ha90c15b_1.conda - sha256: e0bd9af2a29f8dd74309c0ae4f17a7c2b8c4b89f875ff1d6540c941eefbd07fb - md5: e38e467e577bd193a7d5de7c2c540b04 - depends: - - openssl >=3.1.1,<4.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 372661 - timestamp: 1685726378869 - kind: conda name: libevent version: 2.1.12 @@ -15531,39 +12260,6 @@ packages: license_family: MIT size: 73616 timestamp: 1725568742634 -- kind: conda - name: libexpat - version: 2.6.3 - build: hac325c4_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libexpat-2.6.3-hac325c4_0.conda - sha256: dd22dffad6731c352f4c14603868c9cce4d3b50ff5ff1e50f416a82dcb491947 - md5: c1db99b0a94a2f23bd6ce39e2d314e07 - depends: - - __osx >=10.13 - constrains: - - expat 2.6.3.* - license: MIT - license_family: MIT - purls: [] - size: 70517 - timestamp: 1725568864316 -- kind: conda - name: libexpat - version: 2.6.3 - build: hac325c4_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libexpat-2.6.3-hac325c4_0.conda - sha256: dd22dffad6731c352f4c14603868c9cce4d3b50ff5ff1e50f416a82dcb491947 - md5: c1db99b0a94a2f23bd6ce39e2d314e07 - depends: - - __osx >=10.13 - constrains: - - expat 2.6.3.* - license: MIT - license_family: MIT - size: 70517 - timestamp: 1725568864316 - kind: conda name: libexpat version: 2.6.3 @@ -15634,33 +12330,6 @@ packages: license_family: MIT size: 63895 timestamp: 1725568783033 -- kind: conda - name: libffi - version: 3.4.2 - build: h0d85af4_5 - build_number: 5 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libffi-3.4.2-h0d85af4_5.tar.bz2 - sha256: 
7a2d27a936ceee6942ea4d397f9c7d136f12549d86f7617e8b6bad51e01a941f - md5: ccb34fb14960ad8b125962d3d79b31a9 - license: MIT - license_family: MIT - purls: [] - size: 51348 - timestamp: 1636488394370 -- kind: conda - name: libffi - version: 3.4.2 - build: h0d85af4_5 - build_number: 5 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libffi-3.4.2-h0d85af4_5.tar.bz2 - sha256: 7a2d27a936ceee6942ea4d397f9c7d136f12549d86f7617e8b6bad51e01a941f - md5: ccb34fb14960ad8b125962d3d79b31a9 - license: MIT - license_family: MIT - size: 51348 - timestamp: 1636488394370 - kind: conda name: libffi version: 3.4.2 @@ -15851,33 +12520,6 @@ packages: purls: [] size: 344264 timestamp: 1722928697150 -- kind: conda - name: libgd - version: 2.3.3 - build: h2e77e4f_10 - build_number: 10 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgd-2.3.3-h2e77e4f_10.conda - sha256: b5ae19078f96912058d0f96120bf56dae11a417178cfcf220219486778ef868d - md5: a87f68ea91c66e1a9fb515f6aeba6ba2 - depends: - - __osx >=10.13 - - fontconfig >=2.14.2,<3.0a0 - - fonts-conda-ecosystem - - freetype >=2.12.1,<3.0a0 - - icu >=75.1,<76.0a0 - - libexpat >=2.6.2,<3.0a0 - - libiconv >=1.17,<2.0a0 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libpng >=1.6.43,<1.7.0a0 - - libtiff >=4.6.0,<4.7.0a0 - - libwebp-base >=1.4.0,<2.0a0 - - libzlib >=1.3.1,<2.0a0 - license: GD - license_family: BSD - purls: [] - size: 200456 - timestamp: 1722928713359 - kind: conda name: libgd version: 2.3.3 @@ -15935,40 +12577,12 @@ packages: - kind: conda name: libgdal version: 3.9.2 - build: h57928b3_2 - build_number: 2 - subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/libgdal-3.9.2-h57928b3_2.conda - sha256: d4f9528a8e256b84e3d0a901a9b77d8a0fa654adda07e270e7fae0164652305e - md5: a38e3c87e1ce87145464716aec93fefc - depends: - - libgdal-core 3.9.2.* - - libgdal-fits 3.9.2.* - - libgdal-grib 3.9.2.* - - libgdal-hdf4 3.9.2.* - - libgdal-hdf5 3.9.2.* - - libgdal-jp2openjpeg 3.9.2.* - - libgdal-kea 3.9.2.* - - libgdal-netcdf 3.9.2.* - - libgdal-pdf 3.9.2.* - - libgdal-pg 3.9.2.* - - libgdal-postgisraster 3.9.2.* - - libgdal-tiledb 3.9.2.* - - libgdal-xls 3.9.2.* - license: MIT - license_family: MIT - purls: [] - size: 423328 - timestamp: 1726098743251 -- kind: conda - name: libgdal - version: 3.9.2 - build: h694c41f_2 + build: h57928b3_2 build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-3.9.2-h694c41f_2.conda - sha256: 13a3b2dcf7ce090fe778a736cc7bc1034b0609ed6e19b91291b1958767978d64 - md5: abb256d462df471d514b7535eeb211a0 + subdir: win-64 + url: https://conda.anaconda.org/conda-forge/win-64/libgdal-3.9.2-h57928b3_2.conda + sha256: d4f9528a8e256b84e3d0a901a9b77d8a0fa654adda07e270e7fae0164652305e + md5: a38e3c87e1ce87145464716aec93fefc depends: - libgdal-core 3.9.2.* - libgdal-fits 3.9.2.* @@ -15986,8 +12600,8 @@ packages: license: MIT license_family: MIT purls: [] - size: 422986 - timestamp: 1726095491845 + size: 423328 + timestamp: 1726098743251 - kind: conda name: libgdal version: 3.9.2 @@ -16044,52 +12658,6 @@ packages: purls: [] size: 423109 timestamp: 1726097409464 -- kind: conda - name: libgdal-core - version: 3.9.2 - build: h26ecb72_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-core-3.9.2-h26ecb72_2.conda - sha256: d7d0d7f15409286a3ab10bd8740189c659899433b33fdbcad4d62d317dbee908 - md5: 019cc81c6b62de83b4b3e7cd3487fd5d - depends: - - __osx >=10.13 - - blosc >=1.21.6,<2.0a0 - - geos >=3.12.2,<3.12.3.0a0 - - geotiff >=1.7.3,<1.8.0a0 
- - giflib >=5.2.2,<5.3.0a0 - - json-c >=0.17,<0.18.0a0 - - lerc >=4.0.0,<5.0a0 - - libarchive >=3.7.4,<3.8.0a0 - - libcurl >=8.10.0,<9.0a0 - - libcxx >=17 - - libdeflate >=1.21,<1.22.0a0 - - libexpat >=2.6.3,<3.0a0 - - libiconv >=1.17,<2.0a0 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libkml >=1.3.0,<1.4.0a0 - - libpng >=1.6.43,<1.7.0a0 - - libspatialite >=5.1.0,<5.2.0a0 - - libsqlite >=3.46.1,<4.0a0 - - libtiff >=4.6.0,<4.7.0a0 - - libwebp-base >=1.4.0,<2.0a0 - - libxml2 >=2.12.7,<3.0a0 - - libzlib >=1.3.1,<2.0a0 - - lz4-c >=1.9.3,<1.10.0a0 - - openssl >=3.3.2,<4.0a0 - - pcre2 >=10.44,<10.45.0a0 - - proj >=9.4.1,<9.5.0a0 - - xerces-c >=3.2.5,<3.3.0a0 - - xz >=5.2.6,<6.0a0 - - zstd >=1.5.6,<1.6.0a0 - constrains: - - libgdal 3.9.2.* - license: MIT - license_family: MIT - purls: [] - size: 8978330 - timestamp: 1726092675670 - kind: conda name: libgdal-core version: 3.9.2 @@ -16250,26 +12818,6 @@ packages: purls: [] size: 496865 timestamp: 1726096554431 -- kind: conda - name: libgdal-fits - version: 3.9.2 - build: h2000d26_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-fits-3.9.2-h2000d26_2.conda - sha256: 99e50299f4fc4bd56004d46bc687e2911951af1eb1d789a2575f41ecc27cf466 - md5: 9c3aba4aca7b18a4bf164e140150c257 - depends: - - __osx >=10.13 - - cfitsio >=4.4.1,<4.4.2.0a0 - - libcxx >=17 - - libgdal-core >=3.9 - - libkml >=1.3.0,<1.4.0a0 - license: MIT - license_family: MIT - purls: [] - size: 469488 - timestamp: 1726094128863 - kind: conda name: libgdal-fits version: 3.9.2 @@ -16331,26 +12879,6 @@ packages: purls: [] size: 652199 timestamp: 1726095792225 -- kind: conda - name: libgdal-grib - version: 3.9.2 - build: h9237131_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-grib-3.9.2-h9237131_2.conda - sha256: 8bf83845c8f7c6114fb57e3d63d73e66f4e1457997e7b2a804c66170764e45b9 - md5: 00555b58b0bcca46e7b9e1459be4ccf2 - depends: - - __osx >=10.13 - - libaec >=1.1.3,<2.0a0 - - libcxx >=17 - - libgdal-core >=3.9 - - libkml >=1.3.0,<1.4.0a0 - license: MIT - license_family: MIT - purls: [] - size: 663497 - timestamp: 1726094251462 - kind: conda name: libgdal-grib version: 3.9.2 @@ -16436,27 +12964,6 @@ packages: purls: [] size: 562233 timestamp: 1726096927503 -- kind: conda - name: libgdal-hdf4 - version: 3.9.2 - build: hbfba102_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-hdf4-3.9.2-hbfba102_2.conda - sha256: c38e3bc65dc35de96c9f8cfee7ae6606f72214714c53db4581f7f5baf2516007 - md5: 2d081b1f5acdcd3dcf7ed52c1d775a6a - depends: - - __osx >=10.13 - - hdf4 >=4.2.15,<4.2.16.0a0 - - libaec >=1.1.3,<2.0a0 - - libcxx >=17 - - libgdal-core >=3.9 - - libkml >=1.3.0,<1.4.0a0 - license: MIT - license_family: MIT - purls: [] - size: 591694 - timestamp: 1726094364531 - kind: conda name: libgdal-hdf4 version: 3.9.2 @@ -16541,26 +13048,6 @@ packages: purls: [] size: 613268 timestamp: 1726097135142 -- kind: conda - name: libgdal-hdf5 - version: 3.9.2 - build: hc0c3446_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-hdf5-3.9.2-hc0c3446_2.conda - sha256: 7ac2b60f99639a662234d39403c9ff3360b6f2ac85f909b25ca86d10ff44b244 - md5: ab9f93b55a1d47ac6d87d9e00f836633 - depends: - - __osx >=10.13 - - hdf5 >=1.14.3,<1.14.4.0a0 - - libcxx >=17 - - libgdal-core >=3.9 - - libkml >=1.3.0,<1.4.0a0 - license: MIT - license_family: MIT - purls: [] - size: 601423 - timestamp: 1726094496440 - kind: conda name: libgdal-jp2openjpeg version: 3.9.2 @@ 
-16602,26 +13089,6 @@ packages: purls: [] size: 463272 timestamp: 1726096265879 -- kind: conda - name: libgdal-jp2openjpeg - version: 3.9.2 - build: hd77bb1f_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-jp2openjpeg-3.9.2-hd77bb1f_2.conda - sha256: 10102b96411dfea446deb6235ecea536d34b59b81cad311648b87d4249c3dc08 - md5: 45031c24274b3035b4877732e192f392 - depends: - - __osx >=10.13 - - libcxx >=17 - - libgdal-core >=3.9 - - libkml >=1.3.0,<1.4.0a0 - - openjpeg >=2.5.2,<3.0a0 - license: MIT - license_family: MIT - purls: [] - size: 464295 - timestamp: 1726094606697 - kind: conda name: libgdal-jp2openjpeg version: 3.9.2 @@ -16711,28 +13178,6 @@ packages: purls: [] size: 518345 timestamp: 1726098538348 -- kind: conda - name: libgdal-kea - version: 3.9.2 - build: he223473_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-kea-3.9.2-he223473_2.conda - sha256: 8687adb1ab399a08f9901dfd6ba22cf38fac91b3b56cafc4efbd846cfaaacc87 - md5: 4f4b68b06d7e4b3fcf5a5999bddb1298 - depends: - - __osx >=10.13 - - hdf5 >=1.14.3,<1.14.4.0a0 - - kealib >=1.5.3,<1.6.0a0 - - libcxx >=17 - - libgdal-core >=3.9 - - libgdal-hdf5 3.9.2.* - - libkml >=1.3.0,<1.4.0a0 - license: MIT - license_family: MIT - purls: [] - size: 475299 - timestamp: 1726095352836 - kind: conda name: libgdal-netcdf version: 3.9.2 @@ -16782,30 +13227,6 @@ packages: purls: [] size: 668901 timestamp: 1726097400854 -- kind: conda - name: libgdal-netcdf - version: 3.9.2 - build: he83ae23_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-netcdf-3.9.2-he83ae23_2.conda - sha256: ae605bc31e3b9503b1e6ae733530fdd1cc21721bc6fead69c3dd2066e6202db1 - md5: 049cd27768fd0735bde2237c2f436e88 - depends: - - __osx >=10.13 - - hdf4 >=4.2.15,<4.2.16.0a0 - - hdf5 >=1.14.3,<1.14.4.0a0 - - libcxx >=17 - - libgdal-core >=3.9 - - libgdal-hdf4 3.9.2.* - - libgdal-hdf5 3.9.2.* - - libkml >=1.3.0,<1.4.0a0 - - libnetcdf >=4.9.2,<4.9.3.0a0 - license: MIT - license_family: MIT - purls: [] - size: 692336 - timestamp: 1726095485705 - kind: conda name: libgdal-netcdf version: 3.9.2 @@ -16872,26 +13293,6 @@ packages: purls: [] size: 667813 timestamp: 1726093333981 -- kind: conda - name: libgdal-pdf - version: 3.9.2 - build: h85e1e31_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-pdf-3.9.2-h85e1e31_2.conda - sha256: c9e9a11af7fe7dc2eb306300d7972e2d03e5d3abc0945407deb93026d1749c91 - md5: 95b05a267dc00e4f5d3efc2cb56feea7 - depends: - - __osx >=10.13 - - libcxx >=17 - - libgdal-core >=3.9 - - libkml >=1.3.0,<1.4.0a0 - - poppler - license: MIT - license_family: MIT - purls: [] - size: 610106 - timestamp: 1726094743209 - kind: conda name: libgdal-pdf version: 3.9.2 @@ -16956,27 +13357,6 @@ packages: purls: [] size: 526700 timestamp: 1726093379300 -- kind: conda - name: libgdal-pg - version: 3.9.2 - build: h7ffd8cf_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-pg-3.9.2-h7ffd8cf_2.conda - sha256: de450f862f4595949b7c0e4f9594a50d400a049a43a8c6a4abec2b208e906f30 - md5: 5cd82b1f469ec92d3000f537dd9c9c70 - depends: - - __osx >=10.13 - - libcxx >=17 - - libgdal-core >=3.9 - - libkml >=1.3.0,<1.4.0a0 - - libpq >=16.4,<17.0a0 - - postgresql - license: MIT - license_family: MIT - purls: [] - size: 507951 - timestamp: 1726094861491 - kind: conda name: libgdal-pg version: 3.9.2 @@ -17042,27 +13422,6 @@ packages: purls: [] size: 480156 timestamp: 1726093423575 
-- kind: conda - name: libgdal-postgisraster - version: 3.9.2 - build: h7ffd8cf_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-postgisraster-3.9.2-h7ffd8cf_2.conda - sha256: bbabb16d210723591ef6a20aa743deefd7e704813a459b2959203d967efb084c - md5: 477603447d6359fc22119bd95b49e98e - depends: - - __osx >=10.13 - - libcxx >=17 - - libgdal-core >=3.9 - - libkml >=1.3.0,<1.4.0a0 - - libpq >=16.4,<17.0a0 - - postgresql - license: MIT - license_family: MIT - purls: [] - size: 470037 - timestamp: 1726094979700 - kind: conda name: libgdal-postgisraster version: 3.9.2 @@ -17126,26 +13485,6 @@ packages: purls: [] size: 681765 timestamp: 1726093490312 -- kind: conda - name: libgdal-tiledb - version: 3.9.2 - build: h6b11327_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-tiledb-3.9.2-h6b11327_2.conda - sha256: dda569758f13d5ec8d485397abce0a4b6fceedd1e06d10f031f4c5f644dd2709 - md5: 82799fcd51f47381b7398e9521c1ad95 - depends: - - __osx >=10.13 - - libcxx >=17 - - libgdal-core >=3.9 - - libkml >=1.3.0,<1.4.0a0 - - tiledb >=2.26.0,<2.27.0a0 - license: MIT - license_family: MIT - purls: [] - size: 630738 - timestamp: 1726095119594 - kind: conda name: libgdal-tiledb version: 3.9.2 @@ -17208,26 +13547,6 @@ packages: purls: [] size: 432762 timestamp: 1726097071724 -- kind: conda - name: libgdal-xls - version: 3.9.2 - build: hc33d192_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgdal-xls-3.9.2-hc33d192_2.conda - sha256: af48ecd38ed1b1e0a8a55c7bae5a646f164275f8aba93cc3aaa7939c99b30dcb - md5: 19731e92fa7d594f556519d4b4c40b36 - depends: - - __osx >=10.13 - - freexl >=2.0.0,<3.0a0 - - libcxx >=17 - - libgdal-core >=3.9 - - libkml >=1.3.0,<1.4.0a0 - license: MIT - license_family: MIT - purls: [] - size: 431816 - timestamp: 1726095230562 - kind: conda name: libgdal-xls version: 3.9.2 @@ -17267,24 +13586,6 @@ packages: purls: [] size: 159800 timestamp: 1723627007035 -- kind: conda - name: libgettextpo - version: 0.22.5 - build: hdfe23c8_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgettextpo-0.22.5-hdfe23c8_3.conda - sha256: 8f7631d03a093272a5a8423181ac2c66514503e082e5494a2e942737af8a34ad - md5: ba6eeccaee150e24a544be8ae71aeca1 - depends: - - __osx >=10.13 - - libiconv >=1.17,<2.0a0 - - libintl 0.22.5 hdfe23c8_3 - license: GPL-3.0-or-later - license_family: GPL - purls: [] - size: 172305 - timestamp: 1723626852373 - kind: conda name: libgettextpo version: 0.22.5 @@ -17321,25 +13622,6 @@ packages: purls: [] size: 37153 timestamp: 1723627048279 -- kind: conda - name: libgettextpo-devel - version: 0.22.5 - build: hdfe23c8_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgettextpo-devel-0.22.5-hdfe23c8_3.conda - sha256: 8ea6bcba8c002f547edfd51e27e1e81465c8838033877c56439d20bcbc8f32a3 - md5: efbba22e1657ef214c9ce9105b2ca562 - depends: - - __osx >=10.13 - - libgettextpo 0.22.5 hdfe23c8_3 - - libiconv >=1.17,<2.0a0 - - libintl 0.22.5 hdfe23c8_3 - license: GPL-3.0-or-later - license_family: GPL - purls: [] - size: 36977 - timestamp: 1723626874373 - kind: conda name: libgettextpo-devel version: 0.22.5 @@ -17358,22 +13640,6 @@ packages: purls: [] size: 36790 timestamp: 1723626032786 -- kind: conda - name: libgfortran - version: 5.0.0 - build: 13_2_0_h97931a8_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgfortran-5.0.0-13_2_0_h97931a8_3.conda - 
sha256: 4874422e567b68334705c135c17e5acdca1404de8255673ce30ad3510e00be0d - md5: 0b6e23a012ee7a9a5f6b244f5a92c1d5 - depends: - - libgfortran5 13.2.0 h2873a65_3 - license: GPL-3.0-only WITH GCC-exception-3.1 - license_family: GPL - purls: [] - size: 110106 - timestamp: 1707328956438 - kind: conda name: libgfortran version: 5.0.0 @@ -17424,24 +13690,6 @@ packages: purls: [] size: 52212 timestamp: 1724802086021 -- kind: conda - name: libgfortran5 - version: 13.2.0 - build: h2873a65_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgfortran5-13.2.0-h2873a65_3.conda - sha256: da3db4b947e30aec7596a3ef92200d17e774cccbbf7efc47802529a4ca5ca31b - md5: e4fb4d23ec2870ff3c40d10afe305aec - depends: - - llvm-openmp >=8.0.0 - constrains: - - libgfortran 5.0.0 13_2_0_*_3 - license: GPL-3.0-only WITH GCC-exception-3.1 - license_family: GPL - purls: [] - size: 1571379 - timestamp: 1707328880361 - kind: conda name: libgfortran5 version: 13.2.0 @@ -17545,45 +13793,23 @@ packages: build_number: 2 subdir: win-64 url: https://conda.anaconda.org/conda-forge/win-64/libglib-2.80.3-h7025463_2.conda - sha256: 1461eb3b10814630acd1f3a11fc47dbb81c46a4f1f32ed389e3ae050a09c4903 - md5: b60894793e7e4a555027bfb4e4ed1d54 - depends: - - libffi >=3.4,<4.0a0 - - libiconv >=1.17,<2.0a0 - - libintl >=0.22.5,<1.0a0 - - libzlib >=1.3.1,<2.0a0 - - pcre2 >=10.44,<10.45.0a0 - - ucrt >=10.0.20348.0 - - vc >=14.2,<15 - - vc14_runtime >=14.29.30139 - constrains: - - glib 2.80.3 *_2 - license: LGPL-2.1-or-later - purls: [] - size: 3726738 - timestamp: 1723209368854 -- kind: conda - name: libglib - version: 2.80.3 - build: h736d271_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libglib-2.80.3-h736d271_2.conda - sha256: 5543fbb3b1487ffd3a4acbb0b5322ab74ef48c68748fa2907fb47fb825a90bf8 - md5: 975e416ffec75b06cbf8532f5fc1a55e + sha256: 1461eb3b10814630acd1f3a11fc47dbb81c46a4f1f32ed389e3ae050a09c4903 + md5: b60894793e7e4a555027bfb4e4ed1d54 depends: - - __osx >=10.13 - libffi >=3.4,<4.0a0 - libiconv >=1.17,<2.0a0 - libintl >=0.22.5,<1.0a0 - libzlib >=1.3.1,<2.0a0 - pcre2 >=10.44,<10.45.0a0 + - ucrt >=10.0.20348.0 + - vc >=14.2,<15 + - vc14_runtime >=14.29.30139 constrains: - glib 2.80.3 *_2 license: LGPL-2.1-or-later purls: [] - size: 3674504 - timestamp: 1723209150363 + size: 3726738 + timestamp: 1723209368854 - kind: conda name: libglu version: 9.0.0 @@ -17667,37 +13893,37 @@ packages: timestamp: 1724801743478 - kind: conda name: libgoogle-cloud - version: 2.28.0 - build: h26d7fe4_0 + version: 2.29.0 + build: h435de7b_0 subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/libgoogle-cloud-2.28.0-h26d7fe4_0.conda - sha256: d87b83d91a9fed749b80dea915452320598035949804db3be616b8c3d694c743 - md5: 2c51703b4d775f8943c08a361788131b + url: https://conda.anaconda.org/conda-forge/linux-64/libgoogle-cloud-2.29.0-h435de7b_0.conda + sha256: c8ee42a4acce5227d220ec6500f6872d52d82e478c76648b9ff57dd2d86429bd + md5: 5d95d9040c4319997644f68e9aefbe70 depends: - __glibc >=2.17,<3.0.a0 - libabseil * cxx17* - libabseil >=20240116.2,<20240117.0a0 - libcurl >=8.9.1,<9.0a0 - - libgcc-ng >=12 + - libgcc >=13 - libgrpc >=1.62.2,<1.63.0a0 - libprotobuf >=4.25.3,<4.25.4.0a0 - - libstdcxx-ng >=12 - - openssl >=3.3.1,<4.0a0 + - libstdcxx >=13 + - openssl >=3.3.2,<4.0a0 constrains: - - libgoogle-cloud 2.28.0 *_0 + - libgoogle-cloud 2.29.0 *_0 license: Apache-2.0 license_family: Apache purls: [] - size: 1226849 - timestamp: 1723370075980 + size: 1241649 + timestamp: 
1725640926284 - kind: conda name: libgoogle-cloud - version: 2.28.0 + version: 2.29.0 build: h5e7cea3_0 subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/libgoogle-cloud-2.28.0-h5e7cea3_0.conda - sha256: 30c5eb3509d0a4b5418e58da7cda7cfee7d06b8759efaec1f544f7fcb54bcac0 - md5: 78a31d951ca2e524c6c223d865edd7ae + url: https://conda.anaconda.org/conda-forge/win-64/libgoogle-cloud-2.29.0-h5e7cea3_0.conda + sha256: db703d1045d8b30c2cc2beab5baf9f46d325e4ad55c7aeccfff1ad4d9bad1bea + md5: c6aa97588124153b52bd15f18e899b61 depends: - libabseil * cxx17* - libabseil >=20240116.2,<20240117.0a0 @@ -17708,140 +13934,94 @@ packages: - vc >=14.2,<15 - vc14_runtime >=14.29.30139 constrains: - - libgoogle-cloud 2.28.0 *_0 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 14358 - timestamp: 1723371187491 -- kind: conda - name: libgoogle-cloud - version: 2.28.0 - build: h721cda5_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgoogle-cloud-2.28.0-h721cda5_0.conda - sha256: bf45c8d96cb69476814a674f59640178a6b7868d644351bd84e85e37a045795b - md5: c06aee3922ccde627583a5480a0c8445 - depends: - - __osx >=10.13 - - libabseil * cxx17* - - libabseil >=20240116.2,<20240117.0a0 - - libcurl >=8.9.1,<9.0a0 - - libcxx >=16 - - libgrpc >=1.62.2,<1.63.0a0 - - libprotobuf >=4.25.3,<4.25.4.0a0 - - openssl >=3.3.1,<4.0a0 - constrains: - - libgoogle-cloud 2.28.0 *_0 + - libgoogle-cloud 2.29.0 *_0 license: Apache-2.0 license_family: Apache purls: [] - size: 863685 - timestamp: 1723369321726 + size: 14372 + timestamp: 1725641443174 - kind: conda name: libgoogle-cloud - version: 2.28.0 - build: hfe08963_0 + version: 2.29.0 + build: hfa33a2f_0 subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/libgoogle-cloud-2.28.0-hfe08963_0.conda - sha256: 8ac585e360937aaf9f323e7414c710bf00eec6cf742c15b521fd502e6e3abf2b - md5: 68fb9b247b79e8ac3be37c2923a0cf8a + url: https://conda.anaconda.org/conda-forge/osx-arm64/libgoogle-cloud-2.29.0-hfa33a2f_0.conda + sha256: 1f42048702d773a355d276d24313ac63781a331959fc3662c6be36e979d7845c + md5: f78c7bd435ee45f4661daae9e81ddf13 depends: - __osx >=11.0 - libabseil * cxx17* - libabseil >=20240116.2,<20240117.0a0 - libcurl >=8.9.1,<9.0a0 - - libcxx >=16 + - libcxx >=17 - libgrpc >=1.62.2,<1.63.0a0 - libprotobuf >=4.25.3,<4.25.4.0a0 - - openssl >=3.3.1,<4.0a0 + - openssl >=3.3.2,<4.0a0 constrains: - - libgoogle-cloud 2.28.0 *_0 + - libgoogle-cloud 2.29.0 *_0 license: Apache-2.0 license_family: Apache purls: [] - size: 848880 - timestamp: 1723369224404 + size: 866727 + timestamp: 1725640714587 - kind: conda name: libgoogle-cloud-storage - version: 2.28.0 - build: h1466eeb_0 - subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/libgoogle-cloud-storage-2.28.0-h1466eeb_0.conda - sha256: c62d08339e98fd56d65390df1184d8c2929de2713d431a910c3bb59750daccac - md5: 16874ac519f64d2199fab04fd9bd821d + version: 2.29.0 + build: h0121fbd_0 + subdir: linux-64 + url: https://conda.anaconda.org/conda-forge/linux-64/libgoogle-cloud-storage-2.29.0-h0121fbd_0.conda + sha256: 2847c9e940b742275a7068e0a742bdabf211bf0b2bbb1453592d6afb47c7e17e + md5: 06dfd5208170b56eee943d9ac674a533 depends: - - __osx >=11.0 - - libabseil - - libcrc32c >=1.1.2,<1.2.0a0 - - libcurl - - libcxx >=16 - - libgoogle-cloud 2.28.0 hfe08963_0 - - libzlib >=1.3.1,<2.0a0 - - openssl - license: Apache-2.0 - license_family: Apache - purls: [] - size: 522700 - timestamp: 1723370053755 -- kind: conda - name: libgoogle-cloud-storage - version: 2.28.0 - build: h9e84e37_0 
- subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgoogle-cloud-storage-2.28.0-h9e84e37_0.conda - sha256: c55dfdd25ecc40383ba9829ae23cca95a0c48280794edc1280fdca2bc0342ef4 - md5: 6f55d1a6c280ffaddb741dc770cb817c - depends: - - __osx >=10.13 + - __glibc >=2.17,<3.0.a0 - libabseil - libcrc32c >=1.1.2,<1.2.0a0 - libcurl - - libcxx >=16 - - libgoogle-cloud 2.28.0 h721cda5_0 + - libgcc >=13 + - libgoogle-cloud 2.29.0 h435de7b_0 + - libstdcxx >=13 - libzlib >=1.3.1,<2.0a0 - openssl license: Apache-2.0 license_family: Apache purls: [] - size: 542383 - timestamp: 1723370234408 + size: 781655 + timestamp: 1725641060970 - kind: conda name: libgoogle-cloud-storage - version: 2.28.0 - build: ha262f82_0 - subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/libgoogle-cloud-storage-2.28.0-ha262f82_0.conda - sha256: 3237bc1ee88dab8d8fea0a1886e12a0262ff5e471944a234c314aa1da411588e - md5: 9e7960f0b9ab3895ef73d92477c47dae + version: 2.29.0 + build: h90fd6fa_0 + subdir: osx-arm64 + url: https://conda.anaconda.org/conda-forge/osx-arm64/libgoogle-cloud-storage-2.29.0-h90fd6fa_0.conda + sha256: ec80383fbb6fae95d2ff7d04ba46b282ab48219b7ce85b3cd5ee7d0d8bae74e1 + md5: baee0b9cb1c5319f370a534ca5a16267 depends: - - __glibc >=2.17,<3.0.a0 + - __osx >=11.0 - libabseil - libcrc32c >=1.1.2,<1.2.0a0 - libcurl - - libgcc-ng >=12 - - libgoogle-cloud 2.28.0 h26d7fe4_0 - - libstdcxx-ng >=12 + - libcxx >=17 + - libgoogle-cloud 2.29.0 hfa33a2f_0 - libzlib >=1.3.1,<2.0a0 - openssl license: Apache-2.0 license_family: Apache purls: [] - size: 769298 - timestamp: 1723370220027 + size: 535346 + timestamp: 1725641618955 - kind: conda name: libgoogle-cloud-storage - version: 2.28.0 + version: 2.29.0 build: he5eb982_0 subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/libgoogle-cloud-storage-2.28.0-he5eb982_0.conda - sha256: 6318a81a6ef2a72b70c2ddfdadaa5ac79fce431ffa1125e7ca0f9286fa9d9342 - md5: c60153238c7fcdda236b51248220c4bb + url: https://conda.anaconda.org/conda-forge/win-64/libgoogle-cloud-storage-2.29.0-he5eb982_0.conda + sha256: 06f017a06c84ebd0e558651cc07e4919b769a250afbc5f43222fedd3646301c3 + md5: 9371e3773ba476f33b462937406f6c6c depends: - libabseil - libcrc32c >=1.1.2,<1.2.0a0 - libcurl - - libgoogle-cloud 2.28.0 h5e7cea3_0 + - libgoogle-cloud 2.29.0 h5e7cea3_0 - libzlib >=1.3.1,<2.0a0 - ucrt >=10.0.20348.0 - vc >=14.2,<15 @@ -17849,8 +14029,8 @@ packages: license: Apache-2.0 license_family: Apache purls: [] - size: 14259 - timestamp: 1723371607596 + size: 14266 + timestamp: 1725641747676 - kind: conda name: libgrpc version: 1.62.2 @@ -17877,32 +14057,6 @@ packages: purls: [] size: 7316832 timestamp: 1713390645548 -- kind: conda - name: libgrpc - version: 1.62.2 - build: h384b2fc_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libgrpc-1.62.2-h384b2fc_0.conda - sha256: 7c228040e7dac4e5e7e6935a4decf6bc2155cc05fcfb0811d25ccb242d0036ba - md5: 9421f67cf8b4bc976fe5d0c3ab42de18 - depends: - - __osx >=10.13 - - c-ares >=1.28.1,<2.0a0 - - libabseil * cxx17* - - libabseil >=20240116.1,<20240117.0a0 - - libcxx >=16 - - libprotobuf >=4.25.3,<4.25.4.0a0 - - libre2-11 >=2023.9.1,<2024.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - openssl >=3.2.1,<4.0a0 - - re2 - constrains: - - grpc-cpp =1.62.2 - license: Apache-2.0 - license_family: APACHE - purls: [] - size: 5189573 - timestamp: 1713392887258 - kind: conda name: libgrpc version: 1.62.2 @@ -17955,24 +14109,6 @@ packages: purls: [] size: 5016525 timestamp: 1713392846329 -- kind: conda - name: libhwloc - version: 
2.11.1 - build: default_h456cccd_1000 - build_number: 1000 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libhwloc-2.11.1-default_h456cccd_1000.conda - sha256: 0b5294c8e8fc5f9faab7e54786c5f3c4395ee64b5577a1f2ae687a960e089624 - md5: a14989f6bbea46e6ec4521a403f63ff2 - depends: - - __osx >=10.13 - - libcxx >=16 - - libxml2 >=2.12.7,<3.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 2350774 - timestamp: 1720460664713 - kind: conda name: libhwloc version: 2.11.1 @@ -18075,34 +14211,6 @@ packages: purls: [] size: 705775 timestamp: 1702682170569 -- kind: conda - name: libiconv - version: '1.17' - build: hd75f5a5_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libiconv-1.17-hd75f5a5_2.conda - sha256: 23d4923baeca359423a7347c2ed7aaf48c68603df0cf8b87cc94a10b0d4e9a23 - md5: 6c3628d047e151efba7cf08c5e54d1ca - license: LGPL-2.1-only - purls: [] - size: 666538 - timestamp: 1702682713201 -- kind: conda - name: libidn2 - version: 2.3.7 - build: h10d778d_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libidn2-2.3.7-h10d778d_0.conda - sha256: 54430e45dffa8cbe3cbf12a3f4376947e7e2d50c67db90a90e91c3350510823e - md5: a985867eae03167666bba45c2a297da1 - depends: - - gettext >=0.21.1,<1.0a0 - - libunistring >=0,<1.0a0 - license: LGPLv2 - purls: [] - size: 133237 - timestamp: 1706368325339 - kind: conda name: libidn2 version: 2.3.7 @@ -18165,22 +14273,6 @@ packages: purls: [] size: 81171 timestamp: 1723626968270 -- kind: conda - name: libintl - version: 0.22.5 - build: hdfe23c8_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libintl-0.22.5-hdfe23c8_3.conda - sha256: 0dbb662440a73e20742f12d88e51785a5a5117b8b150783a032b8818a8c043af - md5: 52d4d643ed26c07599736326c46bf12f - depends: - - __osx >=10.13 - - libiconv >=1.17,<2.0a0 - license: LGPL-2.1-or-later - purls: [] - size: 88086 - timestamp: 1723626826235 - kind: conda name: libintl-devel version: 0.22.5 @@ -18198,38 +14290,6 @@ packages: purls: [] size: 38584 timestamp: 1723627022409 -- kind: conda - name: libintl-devel - version: 0.22.5 - build: hdfe23c8_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libintl-devel-0.22.5-hdfe23c8_3.conda - sha256: 4913a20244520d6fae14452910613b652752982193a401482b7d699ee70bb13a - md5: aeb045f400ec2b068c6c142b16f87c7e - depends: - - __osx >=10.13 - - libiconv >=1.17,<2.0a0 - - libintl 0.22.5 hdfe23c8_3 - license: LGPL-2.1-or-later - purls: [] - size: 38249 - timestamp: 1723626863306 -- kind: conda - name: libjpeg-turbo - version: 3.0.0 - build: h0dc2134_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libjpeg-turbo-3.0.0-h0dc2134_1.conda - sha256: d9572fd1024adc374aae7c247d0f29fdf4b122f1e3586fe62acc18067f40d02f - md5: 72507f8e3961bc968af17435060b6dd6 - constrains: - - jpeg <0.0.0a - license: IJG AND BSD-3-Clause AND Zlib - purls: [] - size: 579748 - timestamp: 1694475265912 - kind: conda name: libjpeg-turbo version: 3.0.0 @@ -18302,26 +14362,6 @@ packages: purls: [] size: 1651104 timestamp: 1724667610262 -- kind: conda - name: libkml - version: 1.3.0 - build: h9ee1731_1021 - build_number: 1021 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libkml-1.3.0-h9ee1731_1021.conda - sha256: dba3732e9a3b204e5af01c5ddba8630f4a337693b1c5375c2981a88b580116bd - md5: b098eeacf7e78f09c8771f5088b97bbb - depends: - - __osx >=10.13 - - libcxx >=17 - - libexpat >=2.6.2,<3.0a0 - - libzlib 
>=1.3.1,<2.0a0 - - uriparser >=0.9.8,<1.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 286877 - timestamp: 1724667518323 - kind: conda name: libkml version: 1.3.0 @@ -18363,26 +14403,6 @@ packages: purls: [] size: 402219 timestamp: 1724667059411 -- kind: conda - name: liblapack - version: 3.9.0 - build: 22_osx64_openblas - build_number: 22 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/liblapack-3.9.0-22_osx64_openblas.conda - sha256: e36744f3e780564d6748b5dd05e15ad6a1af9184cf32ab9d1304c13a6bc3e16b - md5: f21b282ff7ba14df6134a0fe6ab42b1b - depends: - - libblas 3.9.0 22_osx64_openblas - constrains: - - liblapacke 3.9.0 22_osx64_openblas - - blas * openblas - - libcblas 3.9.0 22_osx64_openblas - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 14657 - timestamp: 1712542322711 - kind: conda name: liblapack version: 3.9.0 @@ -18443,23 +14463,6 @@ packages: purls: [] size: 5191980 timestamp: 1721689666180 -- kind: conda - name: libllvm14 - version: 14.0.6 - build: hc8e404f_4 - build_number: 4 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libllvm14-14.0.6-hc8e404f_4.conda - sha256: 0df3902a300cfe092425f86144d5e00ef67be3cd1cc89fd63084d45262a772ad - md5: ed06753e2ba7c66ed0ca7f19578fcb68 - depends: - - libcxx >=15 - - libzlib >=1.2.13,<2.0.0a0 - license: Apache-2.0 WITH LLVM-exception - license_family: Apache - purls: [] - size: 22467131 - timestamp: 1690563140552 - kind: conda name: libllvm14 version: 14.0.6 @@ -18514,25 +14517,6 @@ packages: purls: [] size: 23347663 timestamp: 1701374993634 -- kind: conda - name: libllvm16 - version: 16.0.6 - build: hbedff68_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libllvm16-16.0.6-hbedff68_3.conda - sha256: ad848dc0bb02b1dbe54324ee5700b050a2e5f63c095f5229b2de58249a3e268e - md5: 8fd56c0adc07a37f93bd44aa61a97c90 - depends: - - libcxx >=16 - - libxml2 >=2.12.1,<3.0.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - zstd >=1.5.5,<1.6.0a0 - license: Apache-2.0 WITH LLVM-exception - license_family: Apache - purls: [] - size: 25196932 - timestamp: 1701379796962 - kind: conda name: libllvm18 version: 18.1.8 @@ -18574,26 +14558,6 @@ packages: purls: [] size: 38233031 timestamp: 1723208627477 -- kind: conda - name: libllvm18 - version: 18.1.8 - build: h9ce406d_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libllvm18-18.1.8-h9ce406d_2.conda - sha256: 1c7002a0373f1b5c2e65c0d5cb2339ed96714d1ecb63bfd2c229e5010a119519 - md5: 26d0c419fa96d703f9728c39e2727196 - depends: - - __osx >=10.13 - - libcxx >=16 - - libxml2 >=2.12.7,<3.0a0 - - libzlib >=1.3.1,<2.0a0 - - zstd >=1.5.6,<1.6.0a0 - license: Apache-2.0 WITH LLVM-exception - license_family: Apache - purls: [] - size: 27603249 - timestamp: 1723202047662 - kind: conda name: libnetcdf version: 4.9.2 @@ -18623,35 +14587,6 @@ packages: purls: [] size: 849172 timestamp: 1717671645362 -- kind: conda - name: libnetcdf - version: 4.9.2 - build: nompi_h7334405_114 - build_number: 114 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libnetcdf-4.9.2-nompi_h7334405_114.conda - sha256: a4af96274a6c72d97e84dfc728ecc765af300de805d962a835c0841bb6a8f331 - md5: 32ffbe5b0b0134e49f6347f4de8c5dcc - depends: - - __osx >=10.13 - - blosc >=1.21.5,<2.0a0 - - bzip2 >=1.0.8,<2.0a0 - - hdf4 >=4.2.15,<4.2.16.0a0 - - hdf5 >=1.14.3,<1.14.4.0a0 - - libaec >=1.1.3,<2.0a0 - - libcurl >=8.8.0,<9.0a0 - - libcxx >=16 - - libxml2 >=2.12.7,<3.0a0 - - libzip >=1.10.1,<2.0a0 - - libzlib 
>=1.2.13,<2.0a0 - - openssl >=3.3.1,<4.0a0 - - zlib - - zstd >=1.5.6,<1.6.0a0 - license: MIT - license_family: MIT - purls: [] - size: 726205 - timestamp: 1717671847032 - kind: conda name: libnetcdf version: 4.9.2 @@ -18732,28 +14667,6 @@ packages: purls: [] size: 631936 timestamp: 1702130036271 -- kind: conda - name: libnghttp2 - version: 1.58.0 - build: h64cf6d3_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libnghttp2-1.58.0-h64cf6d3_1.conda - sha256: 412fd768e787e586602f8e9ea52bf089f3460fc630f6987f0cbd89b70e9a4380 - md5: faecc55c2a8155d9ff1c0ff9a0fef64f - depends: - - __osx >=10.9 - - c-ares >=1.23.0,<2.0a0 - - libcxx >=16.0.6 - - libev >=4.33,<4.34.0a0 - - libev >=4.33,<5.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - openssl >=3.2.0,<4.0a0 - license: MIT - license_family: MIT - purls: [] - size: 599736 - timestamp: 1702130398536 - kind: conda name: libnghttp2 version: 1.58.0 @@ -18852,21 +14765,6 @@ packages: purls: [] size: 205451 timestamp: 1719301708541 -- kind: conda - name: libogg - version: 1.3.5 - build: hfdf4475_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libogg-1.3.5-hfdf4475_0.conda - sha256: bebf5797e2a278fd2094f2b0c29ccdfc51d400f4736701108a7e544a49705c64 - md5: 7497372c91a31d3e8d64ce3f1a9632e8 - depends: - - __osx >=10.13 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 203604 - timestamp: 1719301669662 - kind: conda name: libopenblas version: 0.3.27 @@ -18888,27 +14786,6 @@ packages: purls: [] size: 2925328 timestamp: 1720425811743 -- kind: conda - name: libopenblas - version: 0.3.27 - build: openmp_h8869122_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopenblas-0.3.27-openmp_h8869122_1.conda - sha256: 83b0b9d3d09889b3648a81d2c18a2d78c405b03b115107941f0496a8b358ce6d - md5: c0798ad76ddd730dade6ff4dff66e0b5 - depends: - - __osx >=10.13 - - libgfortran 5.* - - libgfortran5 >=12.3.0 - - llvm-openmp >=16.0.6 - constrains: - - openblas >=0.3.27,<0.3.28.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 6047513 - timestamp: 1720426759731 - kind: conda name: libopenblas version: 0.3.27 @@ -18946,22 +14823,6 @@ packages: purls: [] size: 5329666 timestamp: 1722425597194 -- kind: conda - name: libopenvino - version: 2024.3.0 - build: h3d2f4b3_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-2024.3.0-h3d2f4b3_0.conda - sha256: 6a869c8ee37cefccd8f2f7471d155f096d10df273a80b1ad9e4a33154962b3de - md5: 99900219f254fe27415b5a234fd0ca33 - depends: - - __osx >=10.13 - - libcxx >=16 - - pugixml >=1.14,<1.15.0a0 - - tbb >=2021.12.0 - purls: [] - size: 4193612 - timestamp: 1722423625545 - kind: conda name: libopenvino version: 2024.3.0 @@ -18995,22 +14856,6 @@ packages: purls: [] size: 6746180 timestamp: 1722423313285 -- kind: conda - name: libopenvino-auto-batch-plugin - version: 2024.3.0 - build: h7b87a6e_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-auto-batch-plugin-2024.3.0-h7b87a6e_0.conda - sha256: fddefecc6b8b02dc24d7608139e6fecec4dc26529e041c232a1a089ee6dfb892 - md5: b84405d3acf6ca16bc14fabe4b4212c4 - depends: - - __osx >=10.13 - - libcxx >=16 - - libopenvino 2024.3.0 h3d2f4b3_0 - - tbb >=2021.12.0 - purls: [] - size: 104164 - timestamp: 1722423675685 - kind: conda name: libopenvino-auto-batch-plugin version: 2024.3.0 @@ -19042,24 +14887,8 @@ packages: - libopenvino 2024.3.0 h5c9529b_0 - tbb >=2021.12.0 purls: [] - size: 102795 - timestamp: 1722423370742 -- kind: conda - 
name: libopenvino-auto-plugin - version: 2024.3.0 - build: h7b87a6e_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-auto-plugin-2024.3.0-h7b87a6e_0.conda - sha256: 12e2d475b752716a36b02f05094a052ec41ed3b1f7c4d5decf1907efb890d005 - md5: 516c187dcf24b3678fbe08dc2fa9fe25 - depends: - - __osx >=10.13 - - libcxx >=16 - - libopenvino 2024.3.0 h3d2f4b3_0 - - tbb >=2021.12.0 - purls: [] - size: 214991 - timestamp: 1722423696573 + size: 102795 + timestamp: 1722423370742 - kind: conda name: libopenvino-auto-plugin version: 2024.3.0 @@ -19093,22 +14922,6 @@ packages: purls: [] size: 208182 timestamp: 1722423393060 -- kind: conda - name: libopenvino-hetero-plugin - version: 2024.3.0 - build: h280e65d_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-hetero-plugin-2024.3.0-h280e65d_0.conda - sha256: 2c9cc4e2f5c781ae52cbee02c5d95f2e65146a1b07ff8eedc633eea73d8f94b6 - md5: c1f5964272bf79cd1d6387cd5cc422da - depends: - - __osx >=10.13 - - libcxx >=16 - - libopenvino 2024.3.0 h3d2f4b3_0 - - pugixml >=1.14,<1.15.0a0 - purls: [] - size: 180677 - timestamp: 1722423718264 - kind: conda name: libopenvino-hetero-plugin version: 2024.3.0 @@ -19160,23 +14973,6 @@ packages: purls: [] size: 11113833 timestamp: 1722425655998 -- kind: conda - name: libopenvino-intel-cpu-plugin - version: 2024.3.0 - build: h3d2f4b3_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-intel-cpu-plugin-2024.3.0-h3d2f4b3_0.conda - sha256: 9485e50e1f26e038f777f375c1c81efc11316ec9a47b614ff72697812742caa8 - md5: e36263ec6cadebd71a4af3f4a4600371 - depends: - - __osx >=10.13 - - libcxx >=16 - - libopenvino 2024.3.0 h3d2f4b3_0 - - pugixml >=1.14,<1.15.0a0 - - tbb >=2021.12.0 - purls: [] - size: 10316736 - timestamp: 1722423753834 - kind: conda name: libopenvino-intel-gpu-plugin version: 2024.3.0 @@ -19214,22 +15010,6 @@ packages: purls: [] size: 712050 timestamp: 1722425726425 -- kind: conda - name: libopenvino-ir-frontend - version: 2024.3.0 - build: h280e65d_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-ir-frontend-2024.3.0-h280e65d_0.conda - sha256: ef308a3def4abf0de0379d55c9d2aa7d02725f1692c502fc39ee2e8336b0725e - md5: 4e09b0e2b0abab731a5e92e7014bba4b - depends: - - __osx >=10.13 - - libcxx >=16 - - libopenvino 2024.3.0 h3d2f4b3_0 - - pugixml >=1.14,<1.15.0a0 - purls: [] - size: 181236 - timestamp: 1722423812998 - kind: conda name: libopenvino-ir-frontend version: 2024.3.0 @@ -19296,22 +15076,6 @@ packages: purls: [] size: 1208478 timestamp: 1722423474069 -- kind: conda - name: libopenvino-onnx-frontend - version: 2024.3.0 - build: he1e86a1_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-onnx-frontend-2024.3.0-he1e86a1_0.conda - sha256: 5acdd80b86c25cbad665f879487250fd8383c56af66f29c36ded22a30e8e43e1 - md5: 40482daa20287d72b29206f1b1ee053c - depends: - - __osx >=10.13 - - libcxx >=16 - - libopenvino 2024.3.0 h3d2f4b3_0 - - libprotobuf >=4.25.3,<4.25.4.0a0 - purls: [] - size: 1278017 - timestamp: 1722423852063 - kind: conda name: libopenvino-paddle-frontend version: 2024.3.0 @@ -19345,22 +15109,6 @@ packages: purls: [] size: 417515 timestamp: 1722423502265 -- kind: conda - name: libopenvino-paddle-frontend - version: 2024.3.0 - build: he1e86a1_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-paddle-frontend-2024.3.0-he1e86a1_0.conda - sha256: 2c275348efce49cbe565ae8dc0848bd3df9b3f068c5b932335ada73116d04def - md5: 
3616d00a2dad571942ea916e3a10a77b - depends: - - __osx >=10.13 - - libcxx >=16 - - libopenvino 2024.3.0 h3d2f4b3_0 - - libprotobuf >=4.25.3,<4.25.4.0a0 - purls: [] - size: 431188 - timestamp: 1722423880270 - kind: conda name: libopenvino-pytorch-frontend version: 2024.3.0 @@ -19392,21 +15140,6 @@ packages: purls: [] size: 1110380 timestamp: 1722425778193 -- kind: conda - name: libopenvino-pytorch-frontend - version: 2024.3.0 - build: hf036a51_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-pytorch-frontend-2024.3.0-hf036a51_0.conda - sha256: df9bfcb8b62fd61267603c1afec854f4e6af11d5808b1f42820a81047f4ea3c2 - md5: 980ce44ff9d557478c9a0407bd01e4ae - depends: - - __osx >=10.13 - - libcxx >=16 - - libopenvino 2024.3.0 h3d2f4b3_0 - purls: [] - size: 784154 - timestamp: 1722423909319 - kind: conda name: libopenvino-tensorflow-frontend version: 2024.3.0 @@ -19446,25 +15179,6 @@ packages: purls: [] size: 1305937 timestamp: 1722425793747 -- kind: conda - name: libopenvino-tensorflow-frontend - version: 2024.3.0 - build: haca2b7f_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-tensorflow-frontend-2024.3.0-haca2b7f_0.conda - sha256: 63a8a686618c3f80e9ebeebe1b06f2137cf07c1c1a8db6edce8cc85d8d9c831e - md5: e62aa7346424d946c88cdcf6602ea240 - depends: - - __osx >=10.13 - - libabseil * cxx17* - - libabseil >=20240116.2,<20240117.0a0 - - libcxx >=16 - - libopenvino 2024.3.0 h3d2f4b3_0 - - libprotobuf >=4.25.3,<4.25.4.0a0 - - snappy >=1.2.1,<1.3.0a0 - purls: [] - size: 975656 - timestamp: 1722423958377 - kind: conda name: libopenvino-tensorflow-lite-frontend version: 2024.3.0 @@ -19496,21 +15210,6 @@ packages: purls: [] size: 471539 timestamp: 1722425807123 -- kind: conda - name: libopenvino-tensorflow-lite-frontend - version: 2024.3.0 - build: hf036a51_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopenvino-tensorflow-lite-frontend-2024.3.0-hf036a51_0.conda - sha256: 5182f24da6d1ccf70487a49241578febadc163e620349f152bcfbbd314c76055 - md5: 9fe69b4f984f7b0ffd960482dc8fe70f - depends: - - __osx >=10.13 - - libcxx >=16 - - libopenvino 2024.3.0 h3d2f4b3_0 - purls: [] - size: 370750 - timestamp: 1722423984972 - kind: conda name: libopus version: 1.3.1 @@ -19558,102 +15257,65 @@ packages: purls: [] size: 260615 timestamp: 1606824019288 -- kind: conda - name: libopus - version: 1.3.1 - build: hc929b4f_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libopus-1.3.1-hc929b4f_1.tar.bz2 - sha256: c126fc225bece591a8f010e95ca7d010ea2d02df9251830bec24a19bf823fc31 - md5: 380b9ea5f6a7a277e6c1ac27d034369b - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 279983 - timestamp: 1606823633642 - kind: conda name: libparquet version: 17.0.0 - build: h39682fd_13_cpu - build_number: 13 + build: h39682fd_14_cpu + build_number: 14 subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/libparquet-17.0.0-h39682fd_13_cpu.conda - sha256: 3c63b7391275cf6cf2a18d2dba3c30c16dd9d210373d206675e342b084cccdf4 - md5: 49c60a8dc089d8127b9368e9eb6c1a77 + url: https://conda.anaconda.org/conda-forge/linux-64/libparquet-17.0.0-h39682fd_14_cpu.conda + sha256: 3131ed308c172a448e349ad8309ace25080fd06b7badadeb38fcf7584a5a6ca2 + md5: 276506dfebdbe017d90de67581724357 depends: - __glibc >=2.17,<3.0.a0 - - libarrow 17.0.0 h8d2e343_13_cpu + - libarrow 17.0.0 hc80a628_14_cpu - libgcc >=13 - libstdcxx >=13 - libthrift >=0.20.0,<0.20.1.0a0 - - openssl >=3.3.1,<4.0a0 + - openssl >=3.3.2,<4.0a0 
license: Apache-2.0 - license_family: APACHE purls: [] - size: 1189824 - timestamp: 1725214804075 + size: 1188696 + timestamp: 1726334873011 - kind: conda name: libparquet version: 17.0.0 - build: ha915800_13_cpu - build_number: 13 + build: ha915800_14_cpu + build_number: 14 subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/libparquet-17.0.0-ha915800_13_cpu.conda - sha256: 8cf6d193600b4dd6cb1a8fbdea168ef6bddbf8ca1ee57d08ce6992df71a62670 - md5: 30b08e672c5dcd827ce7b44f01f4821e + url: https://conda.anaconda.org/conda-forge/win-64/libparquet-17.0.0-ha915800_14_cpu.conda + sha256: a2981c53f016af22c51cae6fa8c41a2537dd1c1cbc29370ea574d1523f26351a + md5: a1529fe4b1a24864999c5de4bc54b9c8 depends: - - libarrow 17.0.0 h29daf90_13_cpu + - libarrow 17.0.0 he3462ed_14_cpu - libthrift >=0.20.0,<0.20.1.0a0 - - openssl >=3.3.1,<4.0a0 + - openssl >=3.3.2,<4.0a0 - ucrt >=10.0.20348.0 - vc >=14.2,<15 - vc14_runtime >=14.29.30139 license: Apache-2.0 - license_family: APACHE purls: [] - size: 805417 - timestamp: 1725215420059 + size: 806654 + timestamp: 1726335672411 - kind: conda name: libparquet version: 17.0.0 - build: hf0ba9ef_13_cpu - build_number: 13 + build: hf0ba9ef_14_cpu + build_number: 14 subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/libparquet-17.0.0-hf0ba9ef_13_cpu.conda - sha256: fb7ee16e8f9bf60a1f136170231615c1a6f200087f12de4220e0c73f45edcdc6 - md5: 8d415217e6cc74179b5d00a61238b6a9 + url: https://conda.anaconda.org/conda-forge/osx-arm64/libparquet-17.0.0-hf0ba9ef_14_cpu.conda + sha256: 3781a186f35dc9110111cdad1660c6ccd86214bf6388aaf2015c8c086b669e6e + md5: 584ca15fd7e26547d94d28135be60fb1 depends: - __osx >=11.0 - - libarrow 17.0.0 h20538ec_13_cpu - - libcxx >=17 - - libthrift >=0.20.0,<0.20.1.0a0 - - openssl >=3.3.1,<4.0a0 - license: Apache-2.0 - license_family: APACHE - purls: [] - size: 859328 - timestamp: 1725215440648 -- kind: conda - name: libparquet - version: 17.0.0 - build: hf1b0f52_13_cpu - build_number: 13 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libparquet-17.0.0-hf1b0f52_13_cpu.conda - sha256: f4b5b4e32cc6ffed205594b8db0764b34b896d4080473f271ff893ca44b872e9 - md5: 303a154bbc5ce01673f6b83cf20da30a - depends: - - __osx >=10.13 - - libarrow 17.0.0 ha60c65e_13_cpu + - libarrow 17.0.0 h77c2f02_14_cpu - libcxx >=17 - libthrift >=0.20.0,<0.20.1.0a0 - - openssl >=3.3.1,<4.0a0 + - openssl >=3.3.2,<4.0a0 license: Apache-2.0 - license_family: APACHE purls: [] - size: 925660 - timestamp: 1725215237883 + size: 871630 + timestamp: 1726334967200 - kind: conda name: libpciaccess version: '0.18' @@ -19671,64 +15333,52 @@ packages: timestamp: 1707101388552 - kind: conda name: libpng - version: 1.6.43 - build: h091b4b1_0 - subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/libpng-1.6.43-h091b4b1_0.conda - sha256: 66c4713b07408398f2221229a1c1d5df57d65dc0902258113f2d9ecac4772495 - md5: 77e684ca58d82cae9deebafb95b1a2b8 - depends: - - libzlib >=1.2.13,<2.0.0a0 - license: zlib-acknowledgement - purls: [] - size: 264177 - timestamp: 1708780447187 -- kind: conda - name: libpng - version: 1.6.43 - build: h19919ed_0 + version: 1.6.44 + build: h3ca93ac_0 subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/libpng-1.6.43-h19919ed_0.conda - sha256: 6ad31bf262a114de5bbe0c6ba73b29ed25239d0f46f9d59700310d2ea0b3c142 - md5: 77e398acc32617a0384553aea29e866b + url: https://conda.anaconda.org/conda-forge/win-64/libpng-1.6.44-h3ca93ac_0.conda + sha256: 
0d3d6ff9225f6918ac225e3839c0d91e5af1da08a4ebf59cac1bfd86018db945 + md5: 639ac6b55a40aa5de7b8c1b4d78f9e81 depends: - - libzlib >=1.2.13,<2.0.0a0 + - libzlib >=1.3.1,<2.0a0 - ucrt >=10.0.20348.0 - vc >=14.2,<15 - vc14_runtime >=14.29.30139 license: zlib-acknowledgement purls: [] - size: 347514 - timestamp: 1708780763195 + size: 348933 + timestamp: 1726235196095 - kind: conda name: libpng - version: 1.6.43 - build: h2797004_0 + version: 1.6.44 + build: hadc24fc_0 subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.43-h2797004_0.conda - sha256: 502f6ff148ac2777cc55ae4ade01a8fc3543b4ffab25c4e0eaa15f94e90dd997 - md5: 009981dd9cfcaa4dbfa25ffaed86bcae + url: https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.44-hadc24fc_0.conda + sha256: e5b14f7a01c2db4362d8591f42f82f336ed48d5e4079e4d1f65d0c2a3637ea78 + md5: f4cc49d7aa68316213e4b12be35308d1 depends: - - libgcc-ng >=12 - - libzlib >=1.2.13,<2.0.0a0 + - __glibc >=2.17,<3.0.a0 + - libgcc >=13 + - libzlib >=1.3.1,<2.0a0 license: zlib-acknowledgement purls: [] - size: 288221 - timestamp: 1708780443939 + size: 290661 + timestamp: 1726234747153 - kind: conda name: libpng - version: 1.6.43 - build: h92b6c6a_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libpng-1.6.43-h92b6c6a_0.conda - sha256: 13e646d24b5179e6b0a5ece4451a587d759f55d9a360b7015f8f96eff4524b8f - md5: 65dcddb15965c9de2c0365cb14910532 + version: 1.6.44 + build: hc14010f_0 + subdir: osx-arm64 + url: https://conda.anaconda.org/conda-forge/osx-arm64/libpng-1.6.44-hc14010f_0.conda + sha256: 38f8759a3eb8060deabd4db41f0f023514d853e46ddcbd0ba21768fc4e563bb1 + md5: fb36e93f0ea6a6f5d2b99984f34b049e depends: - - libzlib >=1.2.13,<2.0.0a0 + - __osx >=11.0 + - libzlib >=1.3.1,<2.0a0 license: zlib-acknowledgement purls: [] - size: 268524 - timestamp: 1708780496420 + size: 263385 + timestamp: 1726234714421 - kind: conda name: libpq version: '16.4' @@ -19764,23 +15414,6 @@ packages: purls: [] size: 2398238 timestamp: 1724948760153 -- kind: conda - name: libpq - version: '16.4' - build: h75a757a_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libpq-16.4-h75a757a_1.conda - sha256: 161d92de944fefc60414b44f1672d2917dac1e5996f9363635301589b5ee0a94 - md5: 3316ac3fbb20afd3e2a18d6c4264885f - depends: - - __osx >=10.13 - - krb5 >=1.21.3,<1.22.0a0 - - openssl >=3.3.1,<4.0a0 - license: PostgreSQL - purls: [] - size: 2340921 - timestamp: 1724948593326 - kind: conda name: libpq version: '16.4' @@ -19819,25 +15452,6 @@ packages: purls: [] size: 2811207 timestamp: 1709514552541 -- kind: conda - name: libprotobuf - version: 4.25.3 - build: h4e4d658_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libprotobuf-4.25.3-h4e4d658_0.conda - sha256: 3f126769fb5820387d436370ad48600e05d038a28689fdf9988b64e1059947a8 - md5: 57b7ee4f1fd8573781cfdabaec4a7782 - depends: - - __osx >=10.13 - - libabseil * cxx17* - - libabseil >=20240116.1,<20240117.0a0 - - libcxx >=16 - - libzlib >=1.2.13,<2.0.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 2216001 - timestamp: 1709514908146 - kind: conda name: libprotobuf version: 4.25.3 @@ -19937,25 +15551,6 @@ packages: purls: [] size: 489009 timestamp: 1695984134380 -- kind: conda - name: libraw - version: 0.21.1 - build: h8138101_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libraw-0.21.1-h8138101_2.conda - sha256: e69f7fa11709b96be6d99c3c55768996a61c4c0a675bb57f433b3b712126410e - md5: 
099b1112ffc520a8d40b16d3ca9d47d0 - depends: - - lcms2 >=2.15,<3.0a0 - - libcxx >=15.0.7 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libzlib >=1.2.13,<2.0.0a0 - license: LGPL-2.1-only - license_family: LGPL - purls: [] - size: 594346 - timestamp: 1695984111953 - kind: conda name: libre2-11 version: 2023.09.01 @@ -19997,27 +15592,6 @@ packages: purls: [] size: 171443 timestamp: 1708947163461 -- kind: conda - name: libre2-11 - version: 2023.09.01 - build: h81f5012_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libre2-11-2023.09.01-h81f5012_2.conda - sha256: 384b72a09bd4bb29c1aa085110b2f940dba431587ffb4e2c1a28f605887a1867 - md5: c5c36ec64e3c86504728c38b79011d08 - depends: - - __osx >=10.13 - - libabseil * cxx17* - - libabseil >=20240116.1,<20240117.0a0 - - libcxx >=16 - constrains: - - re2 2023.09.01.* - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 184017 - timestamp: 1708947106275 - kind: conda name: libre2-11 version: 2023.09.01 @@ -20040,27 +15614,6 @@ packages: purls: [] size: 256561 timestamp: 1708947458481 -- kind: conda - name: librsvg - version: 2.58.4 - build: h2682814_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/librsvg-2.58.4-h2682814_0.conda - sha256: ed2d08ef3647d1c10fa51a0480f215ddae04f73a2bd9bbd135d3f37d313d84a6 - md5: 0022c69263e9bb8c530feff2dfc431f9 - depends: - - __osx >=10.13 - - cairo >=1.18.0,<2.0a0 - - gdk-pixbuf >=2.42.12,<3.0a0 - - libglib >=2.80.3,<3.0a0 - - libxml2 >=2.12.7,<3.0a0 - - pango >=1.54.0,<2.0a0 - constrains: - - __osx >=10.13 - license: LGPL-2.1-or-later - purls: [] - size: 4919155 - timestamp: 1726227702081 - kind: conda name: librsvg version: 2.58.4 @@ -20163,24 +15716,6 @@ packages: purls: [] size: 231637 timestamp: 1720347750456 -- kind: conda - name: librttopo - version: 1.1.0 - build: he2ba7a0_16 - build_number: 16 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/librttopo-1.1.0-he2ba7a0_16.conda - sha256: 907f602ad39172a98e3062c0d6616535075f5227435753fe2c843eb10891403c - md5: 80cc407788999eb3cd5a3651981e55fd - depends: - - __osx >=10.13 - - geos >=3.12.2,<3.12.3.0a0 - - libcxx >=16 - license: GPL-2.0-or-later - license_family: GPL - purls: [] - size: 213675 - timestamp: 1720347819147 - kind: conda name: libsodium version: 1.0.20 @@ -20225,20 +15760,6 @@ packages: purls: [] size: 202344 timestamp: 1716828757533 -- kind: conda - name: libsodium - version: 1.0.20 - build: hfdf4475_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libsodium-1.0.20-hfdf4475_0.conda - sha256: d3975cfe60e81072666da8c76b993af018cf2e73fe55acba2b5ba0928efaccf5 - md5: 6af4b059e26492da6013e79cbcb4d069 - depends: - - __osx >=10.13 - license: ISC - purls: [] - size: 210249 - timestamp: 1716828641383 - kind: conda name: libspatialite version: 5.1.0 @@ -20295,34 +15816,6 @@ packages: purls: [] size: 8283487 timestamp: 1722338203533 -- kind: conda - name: libspatialite - version: 5.1.0 - build: hdc25a2c_9 - build_number: 9 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libspatialite-5.1.0-hdc25a2c_9.conda - sha256: 97f2915388c7b216202aff37bb5163681e96eef0991a7366ccdd3e228d2b6aa6 - md5: 230006cfdaf8e653d16e91e6a9a57c98 - depends: - - __osx >=10.13 - - freexl >=2 - - freexl >=2.0.0,<3.0a0 - - geos >=3.12.2,<3.12.3.0a0 - - libcxx >=16 - - libiconv >=1.17,<2.0a0 - - librttopo >=1.1.0,<1.2.0a0 - - libsqlite >=3.46.0,<4.0a0 - - libxml2 >=2.12.7,<3.0a0 - - libzlib >=1.3.1,<2.0a0 - - proj >=9.4.1,<9.5.0a0 - - sqlite - - zlib - license: MPL-1.1 - 
license_family: MOZILLA - purls: [] - size: 3148395 - timestamp: 1722338108366 - kind: conda name: libspatialite version: 5.1.0 @@ -20382,35 +15875,6 @@ packages: license: Unlicense size: 876666 timestamp: 1725354171439 -- kind: conda - name: libsqlite - version: 3.46.1 - build: h4b8f8c9_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libsqlite-3.46.1-h4b8f8c9_0.conda - sha256: 1d075cb823f0cad7e196871b7c57961d669cbbb6cd0e798bf50cbf520dda65fb - md5: 84de0078b58f899fc164303b0603ff0e - depends: - - __osx >=10.13 - - libzlib >=1.3.1,<2.0a0 - license: Unlicense - purls: [] - size: 908317 - timestamp: 1725353652135 -- kind: conda - name: libsqlite - version: 3.46.1 - build: h4b8f8c9_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libsqlite-3.46.1-h4b8f8c9_0.conda - sha256: 1d075cb823f0cad7e196871b7c57961d669cbbb6cd0e798bf50cbf520dda65fb - md5: 84de0078b58f899fc164303b0603ff0e - depends: - - __osx >=10.13 - - libzlib >=1.3.1,<2.0a0 - license: Unlicense - size: 908317 - timestamp: 1725353652135 - kind: conda name: libsqlite version: 3.46.1 @@ -20523,22 +15987,6 @@ packages: purls: [] size: 266806 timestamp: 1685838242099 -- kind: conda - name: libssh2 - version: 1.11.0 - build: hd019ec5_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libssh2-1.11.0-hd019ec5_0.conda - sha256: f3886763b88f4b24265db6036535ef77b7b77ce91b1cbe588c0fbdd861eec515 - md5: ca3a72efba692c59a90d4b9fc0dfe774 - depends: - - libzlib >=1.2.13,<2.0.0a0 - - openssl >=3.1.1,<4.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 259556 - timestamp: 1685837820566 - kind: conda name: libstdcxx version: 14.1.0 @@ -20599,19 +16047,6 @@ packages: purls: [] size: 116745 timestamp: 1661325945767 -- kind: conda - name: libtasn1 - version: 4.19.0 - build: hb7f2c08_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libtasn1-4.19.0-hb7f2c08_0.tar.bz2 - sha256: 4197c155fb460fae65288c6c098c39f22495a53838356d29b79b31b8e33486dc - md5: 73f67fb011b4477b101a95a082c74f0a - license: GPL-3.0-or-later - license_family: GPL - purls: [] - size: 118785 - timestamp: 1661325967954 - kind: conda name: libtheora version: 1.1.1 @@ -20672,26 +16107,6 @@ packages: purls: [] size: 160440 timestamp: 1719668116346 -- kind: conda - name: libtheora - version: 1.1.1 - build: hfdf4475_1006 - build_number: 1006 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libtheora-1.1.1-hfdf4475_1006.conda - sha256: 72421637a05c2e99120d29a00951190644a4439c8155df9e8a8340983934db13 - md5: fc8c11f9f4edda643302e28aa0999b90 - depends: - - __osx >=10.13 - - libogg 1.3.* - - libogg >=1.3.5,<1.4.0a0 - - libvorbis 1.3.* - - libvorbis >=1.3.7,<1.4.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 289472 - timestamp: 1719667988764 - kind: conda name: libthrift version: 0.20.0 @@ -20732,27 +16147,7 @@ packages: license_family: APACHE purls: [] size: 315041 - timestamp: 1724657608736 -- kind: conda - name: libthrift - version: 0.20.0 - build: h75589b3_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libthrift-0.20.0-h75589b3_1.conda - sha256: a1f40fcb9970fbfd6d0b825841b4127cf7dd7c54199d0b49bdbcd838b66f3b7a - md5: c20b01aa07ece86a237c580f7ba56923 - depends: - - __osx >=10.13 - - libcxx >=17 - - libevent >=2.1.12,<2.1.13.0a0 - - libzlib >=1.3.1,<2.0a0 - - openssl >=3.3.1,<4.0a0 - license: Apache-2.0 - license_family: APACHE - purls: [] - size: 324391 - timestamp: 1724657549149 + timestamp: 1724657608736 - kind: 
conda name: libthrift version: 0.20.0 @@ -20798,29 +16193,6 @@ packages: purls: [] size: 282236 timestamp: 1722871642189 -- kind: conda - name: libtiff - version: 4.6.0 - build: h603087a_4 - build_number: 4 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libtiff-4.6.0-h603087a_4.conda - sha256: 3b853901835167406f1c576207ec0294da4aade69c170a6e29206d454f42c259 - md5: 362626a2aacb976ec89c91b99bfab30b - depends: - - __osx >=10.13 - - lerc >=4.0.0,<5.0a0 - - libcxx >=16 - - libdeflate >=1.21,<1.22.0a0 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libwebp-base >=1.4.0,<2.0a0 - - libzlib >=1.3.1,<2.0a0 - - xz >=5.2.6,<6.0a0 - - zstd >=1.5.6,<1.6.0a0 - license: HPND - purls: [] - size: 257905 - timestamp: 1722871821174 - kind: conda name: libtiff version: 4.6.0 @@ -20867,18 +16239,6 @@ packages: purls: [] size: 238731 timestamp: 1722871853823 -- kind: conda - name: libunistring - version: 0.9.10 - build: h0d85af4_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libunistring-0.9.10-h0d85af4_0.tar.bz2 - sha256: c5805a58cd2b211bffdc8b7cdeba9af3cee456196ab52ab9a30e0353bc95beb7 - md5: 40f27dc16f73256d7b93e53c4f03d92f - license: GPL-3.0-only OR LGPL-3.0-only - purls: [] - size: 1392865 - timestamp: 1626955817826 - kind: conda name: libunistring version: 0.9.10 @@ -20950,19 +16310,6 @@ packages: purls: [] size: 104389 timestamp: 1667316359211 -- kind: conda - name: libutf8proc - version: 2.8.0 - build: hb7f2c08_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libutf8proc-2.8.0-hb7f2c08_0.tar.bz2 - sha256: 55a7f96b2802e94def207fdfe92bc52c24d705d139bb6cdb3d936cbe85e1c505 - md5: db98dc3e58cbc11583180609c429c17d - license: MIT - license_family: MIT - purls: [] - size: 98942 - timestamp: 1667316472080 - kind: conda name: libuuid version: 2.38.1 @@ -21014,22 +16361,6 @@ packages: purls: [] size: 209586 timestamp: 1718886769974 -- kind: conda - name: libvorbis - version: 1.3.7 - build: h046ec9c_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libvorbis-1.3.7-h046ec9c_0.tar.bz2 - sha256: fbcce1005efcd616e452dea07fe34893d8dd13c65628e74920eeb68ac549faf7 - md5: fbbda1fede0aadaa252f6919148c4ce1 - depends: - - libcxx >=11.0.0 - - libogg >=1.3.4,<1.4.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 254208 - timestamp: 1610609857389 - kind: conda name: libvorbis version: 1.3.7 @@ -21095,37 +16426,6 @@ packages: purls: [] size: 1022466 timestamp: 1717859935011 -- kind: conda - name: libvpx - version: 1.14.1 - build: hf036a51_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libvpx-1.14.1-hf036a51_0.conda - sha256: 47e70e76988c11de97d539794fd4b03db69b75289ac02cdc35ae5a595ffcd973 - md5: 9b8744a702ffb1738191e094e6eb67dc - depends: - - __osx >=10.13 - - libcxx >=16 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 1297054 - timestamp: 1717860051058 -- kind: conda - name: libwebp-base - version: 1.4.0 - build: h10d778d_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libwebp-base-1.4.0-h10d778d_0.conda - sha256: 7bafd8f4c637778cd0aa390bf3a894feef0e1fcf6ea6000c7ffc25c4c5a65538 - md5: b2c0047ea73819d992484faacbbe1c24 - constrains: - - libwebp 1.4.0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 355099 - timestamp: 1713200298965 - kind: conda name: libwebp-base version: 1.4.0 @@ -21177,25 +16477,6 @@ packages: purls: [] size: 438953 timestamp: 1713199854503 -- kind: conda - name: libxcb - version: '1.16' - build: h00291cd_1 - build_number: 1 - 
subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libxcb-1.16-h00291cd_1.conda - sha256: 2cd6b74fa4b3ef9a3fe7f92271eb34346af673509aa86739e9f04bf72015f841 - md5: c989b18131ab79fdc67e42473d53d545 - depends: - - __osx >=10.13 - - pthread-stubs - - xorg-libxau >=1.0.11,<2.0a0 - - xorg-libxdmcp - license: MIT - license_family: MIT - purls: [] - size: 323886 - timestamp: 1724419422116 - kind: conda name: libxcb version: '1.16' @@ -21255,6 +16536,21 @@ packages: purls: [] size: 325266 timestamp: 1724419525819 +- kind: conda + name: libxcrypt + version: 4.4.36 + build: hd590300_1 + build_number: 1 + subdir: linux-64 + url: https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.conda + sha256: 6ae68e0b86423ef188196fff6207ed0c8195dd84273cb5623b85aa08033a410c + md5: 5aa797f8787fe7a17d1b0821485b5adc + depends: + - libgcc-ng >=12 + license: LGPL-2.1-or-later + purls: [] + size: 100393 + timestamp: 1702724383534 - kind: conda name: libxcrypt version: 4.4.36 @@ -21351,26 +16647,6 @@ packages: purls: [] size: 707169 timestamp: 1721031016143 -- kind: conda - name: libxml2 - version: 2.12.7 - build: heaf3512_4 - build_number: 4 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libxml2-2.12.7-heaf3512_4.conda - sha256: ed18a2d8d428c0b88d47751ebcc7cc4e6202f99c3948fffd776cba83c4f0dad3 - md5: ea1be6ecfe814da889e882c8b6ead79d - depends: - - __osx >=10.13 - - icu >=75.1,<76.0a0 - - libiconv >=1.17,<2.0a0 - - libzlib >=1.3.1,<2.0a0 - - xz >=5.2.6,<6.0a0 - license: MIT - license_family: MIT - purls: [] - size: 619901 - timestamp: 1721031175411 - kind: conda name: libxslt version: 1.1.39 @@ -21463,24 +16739,6 @@ packages: purls: [] size: 128244 timestamp: 1694416824668 -- kind: conda - name: libzip - version: 1.10.1 - build: hc158999_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libzip-1.10.1-hc158999_3.conda - sha256: 0689e4a6e67e80027e43eefb8a365273405a01f5ab2ece97319155b8be5d64f6 - md5: 6112b3173f3aa2f12a8f40d07a77cc35 - depends: - - bzip2 >=1.0.8,<2.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - openssl >=3.1.2,<4.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 127599 - timestamp: 1694416738467 - kind: conda name: libzlib version: 1.3.1 @@ -21555,41 +16813,6 @@ packages: license_family: Other size: 61574 timestamp: 1716874187109 -- kind: conda - name: libzlib - version: 1.3.1 - build: h87427d6_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libzlib-1.3.1-h87427d6_1.conda - sha256: 80a62db652b1da0ccc100812a1d86e94f75028968991bfb17f9536f3aa72d91d - md5: b7575b5aa92108dcc9aaab0f05f2dbce - depends: - - __osx >=10.13 - constrains: - - zlib 1.3.1 *_1 - license: Zlib - license_family: Other - purls: [] - size: 57372 - timestamp: 1716874211519 -- kind: conda - name: libzlib - version: 1.3.1 - build: h87427d6_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/libzlib-1.3.1-h87427d6_1.conda - sha256: 80a62db652b1da0ccc100812a1d86e94f75028968991bfb17f9536f3aa72d91d - md5: b7575b5aa92108dcc9aaab0f05f2dbce - depends: - - __osx >=10.13 - constrains: - - zlib 1.3.1 *_1 - license: Zlib - license_family: Other - size: 57372 - timestamp: 1716874211519 - kind: conda name: libzlib version: 1.3.1 @@ -21625,24 +16848,6 @@ packages: license_family: Other size: 46921 timestamp: 1716874262512 -- kind: conda - name: llvm-openmp - version: 18.1.8 - build: h15ab845_1 - build_number: 1 - subdir: osx-64 - url: 
https://conda.anaconda.org/conda-forge/osx-64/llvm-openmp-18.1.8-h15ab845_1.conda - sha256: 06a245abb6e6d8d6662a35ad162eacb39f431349edf7cea9b1ff73b2da213c58 - md5: ad0afa524866cc1c08b436865d0ae484 - depends: - - __osx >=10.13 - constrains: - - openmp 18.1.8|18.1.8.* - license: Apache-2.0 WITH LLVM-exception - license_family: APACHE - purls: [] - size: 300358 - timestamp: 1723605369115 - kind: conda name: llvm-openmp version: 18.1.8 @@ -21661,28 +16866,6 @@ packages: purls: [] size: 276263 timestamp: 1723605341828 -- kind: conda - name: llvmlite - version: 0.43.0 - build: py311h25b8078_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/llvmlite-0.43.0-py311h25b8078_1.conda - sha256: 8f47684beb89f6c03d10a06929365218cdf4454aee52f0bf83a97da4597c429c - md5: 19d1706a45751962b116123dcbc578f0 - depends: - - __osx >=10.13 - - libcxx >=17 - - libllvm14 >=14.0.6,<14.1.0a0 - - libzlib >=1.3.1,<2.0a0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: BSD-2-Clause - license_family: BSD - purls: - - pkg:pypi/llvmlite?source=hash-mapping - size: 379249 - timestamp: 1725305363038 - kind: conda name: llvmlite version: 0.43.0 @@ -21826,44 +17009,6 @@ packages: - pkg:pypi/loguru?source=hash-mapping size: 126239 timestamp: 1725349863378 -- kind: conda - name: loguru - version: 0.7.2 - build: py311h6eed73b_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/loguru-0.7.2-py311h6eed73b_2.conda - sha256: dcd47a089a6d096ee221126efce9f4b67663e522c05e84e37d3e8e7312e31bdd - md5: 7f1619b30b39af5c0a7386577ff77d1f - depends: - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - pkg:pypi/loguru?source=hash-mapping - size: 126346 - timestamp: 1725349845053 -- kind: conda - name: lz4 - version: 4.3.3 - build: py311h12b7ed1_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/lz4-4.3.3-py311h12b7ed1_1.conda - sha256: 47515c300a34c47be42b3196f6f5d400fd3f80f0ae25c73eb10ad367cef450cb - md5: ab30eba2d9fe565ff640369016e74fe5 - depends: - - __osx >=10.13 - - lz4-c >=1.9.3,<1.10.0a0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/lz4?source=hash-mapping - size: 36652 - timestamp: 1725089602246 - kind: conda name: lz4 version: 4.3.3 @@ -21976,35 +17121,6 @@ packages: purls: [] size: 134235 timestamp: 1674728465431 -- kind: conda - name: lz4-c - version: 1.9.4 - build: hf0c8a7f_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/lz4-c-1.9.4-hf0c8a7f_0.conda - sha256: 39aa0c01696e4e202bf5e337413de09dfeec061d89acd5f28e9968b4e93c3f48 - md5: aa04f7143228308662696ac24023f991 - depends: - - libcxx >=14.0.6 - license: BSD-2-Clause - license_family: BSD - purls: [] - size: 156415 - timestamp: 1674727335352 -- kind: conda - name: lzo - version: '2.10' - build: h10d778d_1001 - build_number: 1001 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/lzo-2.10-h10d778d_1001.conda - sha256: 4006c57f805ca6aec72ee0eb7166b2fd648dd1bf3721b9de4b909cd374196643 - md5: bfecd73e4a2dc18ffd5288acf8a212ab - license: GPL-2.0-or-later - license_family: GPL2 - purls: [] - size: 146405 - timestamp: 1713516112292 - kind: conda name: lzo version: '2.10' @@ -22208,27 +17324,6 @@ packages: license_family: MIT size: 64356 timestamp: 1686175179621 -- kind: conda - name: markupsafe - version: 2.1.5 - build: py311h3336109_1 - build_number: 1 - subdir: osx-64 - url: 
https://conda.anaconda.org/conda-forge/osx-64/markupsafe-2.1.5-py311h3336109_1.conda - sha256: 8e8bc3e75c8c4a8b3de7a8e79ecd7888ef44418d6236ec7bffa64fd6d70f5be0 - md5: a9fe56bf4730111131ae9f137df97593 - depends: - - __osx >=10.13 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - constrains: - - jinja2 >=3.0.0 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/markupsafe?source=hash-mapping - size: 26060 - timestamp: 1724959631776 - kind: conda name: markupsafe version: 2.1.5 @@ -22336,25 +17431,6 @@ packages: purls: [] size: 8795 timestamp: 1726165022112 -- kind: conda - name: matplotlib - version: 3.9.2 - build: py311h6eed73b_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/matplotlib-3.9.2-py311h6eed73b_1.conda - sha256: df996ed4d3f5104730afbdfe5711f25894e2225238ed90d376adda371cf38cde - md5: d9da018e823b719894115e23adf59169 - depends: - - matplotlib-base >=3.9.2,<3.9.3.0a0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - tornado >=5 - license: PSF-2.0 - license_family: PSF - purls: [] - size: 8842 - timestamp: 1726165082526 - kind: conda name: matplotlib version: 3.9.2 @@ -22409,39 +17485,6 @@ packages: - pkg:pypi/matplotlib?source=hash-mapping size: 8026168 timestamp: 1726164999361 -- kind: conda - name: matplotlib-base - version: 3.9.2 - build: py311h8b21175_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/matplotlib-base-3.9.2-py311h8b21175_1.conda - sha256: 95dcf204b8beae65a2c592b29dda05840c7e84dcfe07f9106e040b5890c223ec - md5: a3f7858b3bf24733069e6a93e3240de0 - depends: - - __osx >=10.13 - - certifi >=2020.06.20 - - contourpy >=1.0.1 - - cycler >=0.10 - - fonttools >=4.22.0 - - freetype >=2.12.1,<3.0a0 - - kiwisolver >=1.3.1 - - libcxx >=17 - - numpy >=1.19,<3 - - numpy >=1.23 - - packaging >=20.0 - - pillow >=8 - - pyparsing >=2.3.1 - - python >=3.11,<3.12.0a0 - - python-dateutil >=2.7 - - python_abi 3.11.* *_cp311 - - qhull >=2020.2,<2020.3.0a0 - license: PSF-2.0 - license_family: PSF - purls: - - pkg:pypi/matplotlib?source=hash-mapping - size: 7851098 - timestamp: 1726165050863 - kind: conda name: matplotlib-base version: 3.9.2 @@ -22613,20 +17656,6 @@ packages: purls: [] size: 4096907 timestamp: 1698848042467 -- kind: conda - name: metis - version: 5.1.1 - build: h73e2aa4_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/metis-5.1.1-h73e2aa4_2.conda - sha256: 522bab6afc78cf858b19e89f8f01c092a1559ebdc4e41f5909684b4ad6375ee0 - md5: f3a60f050e50631cc20a51c47d70d1b1 - license: Apache-2.0 - license_family: APACHE - purls: [] - size: 3895456 - timestamp: 1705681589772 - kind: conda name: metis version: 5.1.1 @@ -22696,29 +17725,7 @@ packages: - bzip2 >=1.0.8,<2.0a0 - libgcc-ng >=12 - libiconv >=1.17,<2.0a0 - - libstdcxx-ng >=12 - - libzlib >=1.3.1,<2.0a0 - - openssl >=3.3.1,<4.0a0 - - xz >=5.2.6,<6.0a0 - - zstd >=1.5.6,<1.6.0a0 - license: Zlib - license_family: Other - purls: [] - size: 91409 - timestamp: 1718483022284 -- kind: conda - name: minizip - version: 4.0.7 - build: h62b0c8d_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/minizip-4.0.7-h62b0c8d_0.conda - sha256: e02a6e1a43b0ff44bb9460d46d3f7687a1876d435fb3c2c6cf9e19bab60901f6 - md5: 9cb19284d7d835918241acf8180099db - depends: - - __osx >=10.13 - - bzip2 >=1.0.8,<2.0a0 - - libcxx >=16 - - libiconv >=1.17,<2.0a0 + - libstdcxx-ng >=12 - libzlib >=1.3.1,<2.0a0 - openssl >=3.3.1,<4.0a0 - xz >=5.2.6,<6.0a0 @@ -22726,8 +17733,8 @@ packages: license: 
Zlib license_family: Other purls: [] - size: 78595 - timestamp: 1718483214061 + size: 91409 + timestamp: 1718483022284 - kind: conda name: mistune version: 3.0.2 @@ -22854,25 +17861,6 @@ packages: - pkg:pypi/msgpack?source=hash-mapping size: 104809 timestamp: 1725975116412 -- kind: conda - name: msgpack-python - version: 1.1.0 - build: py311hf2f7c97_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/msgpack-python-1.1.0-py311hf2f7c97_0.conda - sha256: b56b1e7d156b88cc0c62734acf56d4ee809723614f659e4203028e7eeac16a78 - md5: 6804cd42195bf94efd1b892688c96412 - depends: - - __osx >=10.13 - - libcxx >=17 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: Apache-2.0 - license_family: Apache - purls: - - pkg:pypi/msgpack?source=hash-mapping - size: 90868 - timestamp: 1725975178961 - kind: conda name: msys2-conda-epoch version: '20160418' @@ -22885,24 +17873,6 @@ packages: purls: [] size: 3227 timestamp: 1608166968312 -- kind: conda - name: multidict - version: 6.1.0 - build: py311h3e662af_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/multidict-6.1.0-py311h3e662af_0.conda - sha256: b2c5b80e3727973a0d2f6868946118d8349be6295c1f495d8d0db2a47969ed62 - md5: c3ce7ac59348588e335101a78a13594d - depends: - - __osx >=10.13 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: Apache-2.0 - license_family: APACHE - purls: - - pkg:pypi/multidict?source=hash-mapping - size: 55954 - timestamp: 1725953843481 - kind: conda name: multidict version: 6.1.0 @@ -22978,27 +17948,6 @@ packages: - pkg:pypi/munkres?source=hash-mapping size: 12452 timestamp: 1600387789153 -- kind: conda - name: mypy - version: 1.11.2 - build: py311h3336109_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/mypy-1.11.2-py311h3336109_0.conda - sha256: 0550ddaf5e7d8ffff133ad2beb91ed65478188334d4ab95833561e4064cd2cc1 - md5: 4d8c373615565fdc89e5518e4add4bf7 - depends: - - __osx >=10.13 - - mypy_extensions >=1.0.0 - - psutil >=4.0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - typing_extensions >=4.1.0 - license: MIT - license_family: MIT - purls: - - pkg:pypi/mypy?source=hash-mapping - size: 12360403 - timestamp: 1724601912385 - kind: conda name: mypy version: 1.11.2 @@ -23108,23 +18057,6 @@ packages: purls: [] size: 630728 timestamp: 1723208368623 -- kind: conda - name: mysql-common - version: 9.0.1 - build: h3829a10_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/mysql-common-9.0.1-h3829a10_0.conda - sha256: aeff60083b9f78c16a44dddbe95fe8255fbf542cf4cfb587347dd1782ca0f354 - md5: fc7b39f4b0f6525c3df3bf5c975ac57a - depends: - - __osx >=10.13 - - libcxx >=16 - - openssl >=3.3.1,<4.0a0 - license: GPL-2.0-or-later - license_family: GPL - purls: [] - size: 653259 - timestamp: 1723207043869 - kind: conda name: mysql-common version: 9.0.1 @@ -23143,26 +18075,6 @@ packages: purls: [] size: 612947 timestamp: 1723209940114 -- kind: conda - name: mysql-libs - version: 9.0.1 - build: h01befea_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/mysql-libs-9.0.1-h01befea_0.conda - sha256: abd6f964114efdd4eee14885f66b1219df6c6455fb3117b5e3c6f67d814bade8 - md5: b2a4eca57a7fd941ec5d47a220c42535 - depends: - - __osx >=10.13 - - libcxx >=16 - - libzlib >=1.3.1,<2.0a0 - - mysql-common 9.0.1 h3829a10_0 - - openssl >=3.3.1,<4.0a0 - - zstd >=1.5.6,<1.6.0a0 - license: GPL-2.0-or-later - license_family: GPL - purls: [] - size: 1326999 - timestamp: 1723207284576 - kind: conda name: mysql-libs version: 9.0.1 @@ 
-23343,35 +18255,6 @@ packages: license: X11 AND BSD-3-Clause size: 889086 timestamp: 1724658547447 -- kind: conda - name: ncurses - version: '6.5' - build: hf036a51_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/ncurses-6.5-hf036a51_1.conda - sha256: b0b3180039ef19502525a2abd5833c00f9624af830fd391f851934d57bffb9af - md5: e102bbf8a6ceeaf429deab8032fc8977 - depends: - - __osx >=10.13 - license: X11 AND BSD-3-Clause - purls: [] - size: 822066 - timestamp: 1724658603042 -- kind: conda - name: ncurses - version: '6.5' - build: hf036a51_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/ncurses-6.5-hf036a51_1.conda - sha256: b0b3180039ef19502525a2abd5833c00f9624af830fd391f851934d57bffb9af - md5: e102bbf8a6ceeaf429deab8032fc8977 - depends: - - __osx >=10.13 - license: X11 AND BSD-3-Clause - size: 822066 - timestamp: 1724658603042 - kind: conda name: nest-asyncio version: 1.6.0 @@ -23389,32 +18272,6 @@ packages: - pkg:pypi/nest-asyncio?source=hash-mapping size: 11638 timestamp: 1705850780510 -- kind: conda - name: netcdf4 - version: 1.7.1 - build: nompi_py311h79bb2b8_102 - build_number: 102 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/netcdf4-1.7.1-nompi_py311h79bb2b8_102.conda - sha256: 7a57148bf5db973936ef528c996819119c746c2860db7bbc11ea777523b70925 - md5: 40240d978512ab04ce12c232881849b6 - depends: - - __osx >=10.13 - - certifi - - cftime - - hdf5 >=1.14.3,<1.14.4.0a0 - - libnetcdf >=4.9.2,<4.9.3.0a0 - - libzlib >=1.3.1,<2.0a0 - - numpy >=1.19,<3 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - setuptools - license: MIT - license_family: MIT - purls: - - pkg:pypi/netcdf4?source=hash-mapping - size: 1041756 - timestamp: 1725450182812 - kind: conda name: netcdf4 version: 1.7.1 @@ -23525,19 +18382,6 @@ packages: purls: [] size: 1011638 timestamp: 1686309814836 -- kind: conda - name: nettle - version: 3.9.1 - build: h8e11ae5_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/nettle-3.9.1-h8e11ae5_0.conda - sha256: 62de51fc44f1595a06c5b24bb717b949b4b9fb4c4acaf127b92ce99ddb546ca7 - md5: 400dffe5d2fbb9813b51948d3e9e9ab1 - license: GPL 2 and LGPL3 - license_family: GPL - purls: [] - size: 509519 - timestamp: 1686310097670 - kind: conda name: networkx version: '3.3' @@ -23604,27 +18448,6 @@ packages: - pkg:pypi/nh3?source=hash-mapping size: 496149 timestamp: 1725342334862 -- kind: conda - name: nh3 - version: 0.2.18 - build: py311h95688db_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/nh3-0.2.18-py311h95688db_1.conda - sha256: 3699a826aa564e9263672c3cc25e930a5a8258c08c583e663bb2166c287073e6 - md5: b3b01f63815fa71fb58c37056af92bf7 - depends: - - __osx >=10.13 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - constrains: - - __osx >=10.13 - license: MIT - license_family: MIT - purls: - - pkg:pypi/nh3?source=hash-mapping - size: 545351 - timestamp: 1725341849687 - kind: conda name: nh3 version: 0.2.18 @@ -23700,23 +18523,6 @@ packages: purls: [] size: 124255 timestamp: 1723652081336 -- kind: conda - name: nlohmann_json - version: 3.11.3 - build: hf036a51_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/nlohmann_json-3.11.3-hf036a51_1.conda - sha256: 41b1aa2a67654917c9c32a5f0111970b11cfce49ed57cf44bba4aefdcd59e54b - md5: 00c3efa95b3a010ee85bc36aac6ab2f6 - depends: - - __osx >=10.13 - - libcxx >=16 - license: MIT - license_family: MIT - purls: [] - size: 122773 - timestamp: 
1723652497933 - kind: conda name: notebook version: 7.2.2 @@ -23788,40 +18594,6 @@ packages: purls: [] size: 220745 timestamp: 1669785182058 -- kind: conda - name: nspr - version: '4.35' - build: hea0b92c_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/nspr-4.35-hea0b92c_0.conda - sha256: da6e19bd0ff31e219760e647cfe1cc499a8cdfaff305f06c56d495ca062b86de - md5: a9e56c98d13d8b7ce72bf4357317c29b - depends: - - libcxx >=14.0.6 - license: MPL-2.0 - license_family: MOZILLA - purls: [] - size: 230071 - timestamp: 1669785313586 -- kind: conda - name: nss - version: '3.104' - build: h3135457_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/nss-3.104-h3135457_0.conda - sha256: a5b3fe0367a39edfac92e2cd69426123049257cb6aedf9bba002ea45c70fcdfc - md5: 8cf0f6f72197a4fb10ccb897b30f1731 - depends: - - __osx >=10.13 - - libcxx >=17 - - libsqlite >=3.46.0,<4.0a0 - - libzlib >=1.3.1,<2.0a0 - - nspr >=4.35,<5.0a0 - license: MPL-2.0 - license_family: MOZILLA - purls: [] - size: 1859778 - timestamp: 1725079369298 - kind: conda name: nss version: '3.104' @@ -23891,37 +18663,6 @@ packages: - pkg:pypi/numba?source=hash-mapping size: 5807308 timestamp: 1718888792863 -- kind: conda - name: numba - version: 0.60.0 - build: py311h0e5bd6a_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/numba-0.60.0-py311h0e5bd6a_0.conda - sha256: 47fbc7925d5ee5ba9d841e542752288fc7059f44c0b95c34e11c609f4754e517 - md5: 8bd1ff28924ea52b539528d85f70a1ac - depends: - - __osx >=10.13 - - libcxx >=16 - - llvm-openmp >=16.0.6 - - llvm-openmp >=18.1.8 - - llvmlite >=0.43.0,<0.44.0a0 - - numpy >=1.19,<3 - - numpy >=1.22.3,<2.1 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - constrains: - - libopenblas !=0.3.6 - - cuda-python >=11.6 - - scipy >=1.0 - - tbb >=2021.6.0 - - cudatoolkit >=11.2 - - cuda-version >=11.2 - license: BSD-2-Clause - license_family: BSD - purls: - - pkg:pypi/numba?source=hash-mapping - size: 5773011 - timestamp: 1718888442395 - kind: conda name: numba version: 0.60.0 @@ -24072,28 +18813,6 @@ packages: - pkg:pypi/numcodecs?source=hash-mapping size: 534256 timestamp: 1724107601089 -- kind: conda - name: numcodecs - version: 0.13.0 - build: py311hfdcbad3_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/numcodecs-0.13.0-py311hfdcbad3_0.conda - sha256: cea1d50dd81851bd74beb5b4e732d7d43492a9aafb65e0199c0f3d3b1d8cf762 - md5: b6dc924435262ef5780a408e8f354806 - depends: - - __osx >=10.13 - - libcxx >=16 - - msgpack-python - - numpy >=1.19,<3 - - numpy >=1.7 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - pkg:pypi/numcodecs?source=hash-mapping - size: 759249 - timestamp: 1724107098021 - kind: conda name: numpy version: 2.0.2 @@ -24119,30 +18838,6 @@ packages: - pkg:pypi/numpy?source=hash-mapping size: 7507136 timestamp: 1724749843736 -- kind: conda - name: numpy - version: 2.0.2 - build: py311h394b0bb_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/numpy-2.0.2-py311h394b0bb_0.conda - sha256: 9c244959047306c7551a94e71f9227e46ee59782e5f5fe81be713d7a4ce3b26f - md5: 2711f0de1d44b82d1e023b73b612e901 - depends: - - __osx >=10.13 - - libblas >=3.9.0,<4.0a0 - - libcblas >=3.9.0,<4.0a0 - - libcxx >=17 - - liblapack >=3.9.0,<4.0a0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - constrains: - - numpy-base <0a0 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/numpy?source=hash-mapping - size: 8106762 - timestamp: 1724749068929 - 
kind: conda name: numpy version: 2.0.2 @@ -24193,27 +18888,6 @@ packages: - pkg:pypi/numpy?source=hash-mapping size: 8961918 timestamp: 1724749067277 -- kind: conda - name: occt - version: 7.7.2 - build: novtk_h0a0d97a_101 - build_number: 101 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/occt-7.7.2-novtk_h0a0d97a_101.conda - sha256: a662e87fbd14a3dcbc54e7ac1f8ca3e32ab1c97cb4e6f202b85b7eb21a8540bc - md5: 8a337666ac2b42fff1325f55f4f210a2 - depends: - - fontconfig >=2.14.2,<3.0a0 - - fonts-conda-ecosystem - - freeimage >=3.18.0,<3.19.0a0 - - freetype >=2.12.1,<3.0a0 - - libcxx >=15.0.7 - - rapidjson - license: LGPL-2.1-only - license_family: LGPL - purls: [] - size: 25519137 - timestamp: 1696403173440 - kind: conda name: occt version: 7.7.2 @@ -24317,25 +18991,6 @@ packages: purls: [] size: 1457889 timestamp: 1726024792651 -- kind: conda - name: openexr - version: 3.2.2 - build: h2627bef_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/openexr-3.2.2-h2627bef_2.conda - sha256: a0eef2672e778c22416dbbedd0be47fb32d745c402f8a250f3015e0cacbe1d9a - md5: e54052079776261c205927064e54b921 - depends: - - __osx >=10.13 - - imath >=3.1.12,<3.1.13.0a0 - - libcxx >=17 - - libzlib >=1.3.1,<2.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 1287176 - timestamp: 1726024906174 - kind: conda name: openexr version: 3.2.2 @@ -24408,21 +19063,6 @@ packages: purls: [] size: 409185 timestamp: 1706874444698 -- kind: conda - name: openh264 - version: 2.4.1 - build: h73e2aa4_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/openh264-2.4.1-h73e2aa4_0.conda - sha256: 4e660e62225815dd996788ed08dc50870e387c159f31d65cd8b677988dfb387b - md5: 877f116d9a4f8b826b0e1d427ac00871 - depends: - - libcxx >=16 - license: BSD-2-Clause - license_family: BSD - purls: [] - size: 660428 - timestamp: 1706874091051 - kind: conda name: openh264 version: 2.4.1 @@ -24477,24 +19117,6 @@ packages: purls: [] size: 341592 timestamp: 1709159244431 -- kind: conda - name: openjpeg - version: 2.5.2 - build: h7310d3a_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/openjpeg-2.5.2-h7310d3a_0.conda - sha256: dc9c405119b9b54f8ca5984da27ba498bd848ab4f0f580da6f293009ca5adc13 - md5: 05a14cc9d725dd74995927968d6547e3 - depends: - - libcxx >=16 - - libpng >=1.6.43,<1.7.0a0 - - libtiff >=4.6.0,<4.7.0a0 - - libzlib >=1.2.13,<2.0.0a0 - license: BSD-2-Clause - license_family: BSD - purls: [] - size: 331273 - timestamp: 1709159538792 - kind: conda name: openjpeg version: 2.5.2 @@ -24594,77 +19216,24 @@ packages: license: Apache-2.0 license_family: Apache purls: [] - size: 2891789 - timestamp: 1725410790053 -- kind: conda - name: openssl - version: 3.3.2 - build: hb9d3cd8_0 - subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/openssl-3.3.2-hb9d3cd8_0.conda - sha256: cee91036686419f6dd6086902acf7142b4916e1c4ba042e9ca23e151da012b6d - md5: 4d638782050ab6faa27275bed57e9b4e - depends: - - __glibc >=2.17,<3.0.a0 - - ca-certificates - - libgcc >=13 - license: Apache-2.0 - license_family: Apache - size: 2891789 - timestamp: 1725410790053 -- kind: conda - name: openssl - version: 3.3.2 - build: hd23fc13_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/openssl-3.3.2-hd23fc13_0.conda - sha256: 2b75d4b56e45992adf172b158143742daeb316c35274b36f385ccb6644e93268 - md5: 2ff47134c8e292868a4609519b1ea3b6 - depends: - - __osx >=10.13 - - ca-certificates - license: Apache-2.0 - license_family: Apache - purls: [] - 
size: 2544654 - timestamp: 1725410973572 + size: 2891789 + timestamp: 1725410790053 - kind: conda name: openssl version: 3.3.2 - build: hd23fc13_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/openssl-3.3.2-hd23fc13_0.conda - sha256: 2b75d4b56e45992adf172b158143742daeb316c35274b36f385ccb6644e93268 - md5: 2ff47134c8e292868a4609519b1ea3b6 + build: hb9d3cd8_0 + subdir: linux-64 + url: https://conda.anaconda.org/conda-forge/linux-64/openssl-3.3.2-hb9d3cd8_0.conda + sha256: cee91036686419f6dd6086902acf7142b4916e1c4ba042e9ca23e151da012b6d + md5: 4d638782050ab6faa27275bed57e9b4e depends: - - __osx >=10.13 + - __glibc >=2.17,<3.0.a0 - ca-certificates + - libgcc >=13 license: Apache-2.0 license_family: Apache - size: 2544654 - timestamp: 1725410973572 -- kind: conda - name: orc - version: 2.0.2 - build: h22b2039_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/orc-2.0.2-h22b2039_0.conda - sha256: b5a0667937d9d2d8d50e624e67fdc54c898a33013cd3a6fada343f3c4e69ae6e - md5: f7c6463d97edb79a39df8e5e90c53b1b - depends: - - __osx >=10.13 - - libcxx >=16 - - libprotobuf >=4.25.3,<4.25.4.0a0 - - libzlib >=1.3.1,<2.0a0 - - lz4-c >=1.9.3,<1.10.0a0 - - snappy >=1.2.1,<1.3.0a0 - - tzdata - - zstd >=1.5.6,<1.6.0a0 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 466353 - timestamp: 1723760915178 + size: 2891789 + timestamp: 1725410790053 - kind: conda name: orc version: 2.0.2 @@ -24782,22 +19351,6 @@ packages: purls: [] size: 890711 timestamp: 1654869118646 -- kind: conda - name: p11-kit - version: 0.24.1 - build: h65f8906_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/p11-kit-0.24.1-h65f8906_0.tar.bz2 - sha256: e16fbaadb2714c0965cb76de32fe7d13a21874cec02c97efef8ac51f4fda86fc - md5: e936a0ee28be948846108582f00e2d61 - depends: - - libffi >=3.4.2,<3.5.0a0 - - libtasn1 >=4.18.0,<5.0a0 - license: MIT - license_family: MIT - purls: [] - size: 834487 - timestamp: 1654869241699 - kind: conda name: p11-kit version: 0.24.1 @@ -24929,30 +19482,6 @@ packages: - pkg:pypi/pandas?source=hash-mapping size: 14600290 timestamp: 1715898888392 -- kind: conda - name: pandas - version: 2.2.2 - build: py311hfdcbad3_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pandas-2.2.2-py311hfdcbad3_1.conda - sha256: 070c97918f2ea3384120a87ca3681803242b48875d9269ed73542bacfa14fd03 - md5: 8dbecc860148500512e768571c59fbe0 - depends: - - __osx >=10.13 - - libcxx >=16 - - numpy >=1.19,<3 - - python >=3.11,<3.12.0a0 - - python-dateutil >=2.8.1 - - python-tzdata >=2022a - - python_abi 3.11.* *_cp311 - - pytz >=2020.1 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/pandas?source=hash-mapping - size: 14887900 - timestamp: 1715898095186 - kind: conda name: pandocfilters version: 1.5.0 @@ -24970,29 +19499,6 @@ packages: - pkg:pypi/pandocfilters?source=hash-mapping size: 11627 timestamp: 1631603397334 -- kind: conda - name: pango - version: 1.54.0 - build: h115fe74_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pango-1.54.0-h115fe74_2.conda - sha256: ed400571a75027563b91bc48054a6599f22c8c2a7ee94a9c3d4e9932c02581ac - md5: 9bfd18e7d9292154b2b79ddb7145f9cf - depends: - - __osx >=10.13 - - cairo >=1.18.0,<2.0a0 - - fontconfig >=2.14.2,<3.0a0 - - fonts-conda-ecosystem - - freetype >=2.12.1,<3.0a0 - - fribidi >=1.0.10,<2.0a0 - - harfbuzz >=9.0.0,<10.0a0 - - libglib >=2.80.3,<3.0a0 - - libpng >=1.6.43,<1.7.0a0 - license: LGPL-2.1-or-later - purls: [] - size: 423324 - 
timestamp: 1723832327771 - kind: conda name: pango version: 1.54.0 @@ -25155,24 +19661,6 @@ packages: purls: [] size: 820831 timestamp: 1723489427046 -- kind: conda - name: pcre2 - version: '10.44' - build: h7634a1b_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pcre2-10.44-h7634a1b_2.conda - sha256: 336057fce69d45e1059f138beb38d60eb87ba858c3ad729ed49d9ecafd23669f - md5: 58cde0663f487778bcd7a0c8daf50293 - depends: - - __osx >=10.13 - - bzip2 >=1.0.8,<2.0a0 - - libzlib >=1.3.1,<2.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 854306 - timestamp: 1723488807216 - kind: conda name: pcre2 version: '10.44' @@ -25227,33 +19715,6 @@ packages: - pkg:pypi/pickleshare?source=hash-mapping size: 9332 timestamp: 1602536313357 -- kind: conda - name: pillow - version: 10.4.0 - build: py311h17ad1af_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pillow-10.4.0-py311h17ad1af_1.conda - sha256: b7a8d8cb5e32bb5786e9c2061a7a8331dc475b49f975b318e66d7235ea5e4fca - md5: 0f285390d41394d7ea77acb17a69f952 - depends: - - __osx >=10.13 - - freetype >=2.12.1,<3.0a0 - - lcms2 >=2.16,<3.0a0 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libtiff >=4.6.0,<4.7.0a0 - - libwebp-base >=1.4.0,<2.0a0 - - libxcb >=1.16,<1.17.0a0 - - libzlib >=1.3.1,<2.0a0 - - openjpeg >=2.5.2,<3.0a0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - tk >=8.6.13,<8.7.0a0 - license: HPND - purls: - - pkg:pypi/pillow?source=hash-mapping - size: 42013162 - timestamp: 1726075338857 - kind: conda name: pillow version: 10.4.0 @@ -25431,21 +19892,6 @@ packages: purls: [] size: 461854 timestamp: 1709239971654 -- kind: conda - name: pixman - version: 0.43.4 - build: h73e2aa4_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pixman-0.43.4-h73e2aa4_0.conda - sha256: 3ab44e12e566c67a6e9fd831f557ab195456aa996b8dd9af19787ca80caa5cd1 - md5: cb134c1e03fd32f4e6bea3f6de2614fd - depends: - - libcxx >=16 - license: MIT - license_family: MIT - purls: [] - size: 323904 - timestamp: 1709239931160 - kind: conda name: pixman version: 0.43.4 @@ -25497,21 +19943,20 @@ packages: timestamp: 1694617398467 - kind: conda name: platformdirs - version: 4.3.2 + version: 4.3.3 build: pyhd8ed1ab_0 subdir: noarch noarch: python - url: https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.3.2-pyhd8ed1ab_0.conda - sha256: 3aef5bb863a2db94e47272fd5ec5a5e4b240eafba79ebb9df7a162797cf035a3 - md5: e1a2dfcd5695f0744f1bcd3bbfe02523 + url: https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.3.3-pyhd8ed1ab_0.conda + sha256: 30d9448d38392cc6fcf0c1d515c85c75ecf6b4eaed0895efc1cac9e10cb57c51 + md5: 32ecde72bc26b834382b93d454c9a68d depends: - python >=3.8 license: MIT - license_family: MIT purls: - pkg:pypi/platformdirs?source=hash-mapping - size: 20623 - timestamp: 1725821846879 + size: 20578 + timestamp: 1726315538191 - kind: conda name: pluggy version: 1.5.0 @@ -25617,40 +20062,6 @@ packages: purls: [] size: 1907007 timestamp: 1724659640508 -- kind: conda - name: poppler - version: 24.08.0 - build: h65860a0_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/poppler-24.08.0-h65860a0_1.conda - sha256: 2a526b86471a539eafe6ad49c5e380fe47a3c8b8b6fbd82125d08e3861028055 - md5: 3fd516e90f0b36d6d47b5a91cf6dd90c - depends: - - __osx >=10.13 - - cairo >=1.18.0,<2.0a0 - - fontconfig >=2.14.2,<3.0a0 - - fonts-conda-ecosystem - - freetype >=2.12.1,<3.0a0 - - lcms2 >=2.16,<3.0a0 - - libcurl >=8.9.1,<9.0a0 - - libcxx >=17 - - 
libglib >=2.80.3,<3.0a0 - - libiconv >=1.17,<2.0a0 - - libintl >=0.22.5,<1.0a0 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libpng >=1.6.43,<1.7.0a0 - - libtiff >=4.6.0,<4.7.0a0 - - libzlib >=1.3.1,<2.0a0 - - nspr >=4.35,<5.0a0 - - nss >=3.103,<4.0a0 - - openjpeg >=2.5.2,<3.0a0 - - poppler-data - license: GPL-2.0-only - license_family: GPL - purls: [] - size: 1591573 - timestamp: 1724659773322 - kind: conda name: poppler version: 24.08.0 @@ -25696,29 +20107,6 @@ packages: purls: [] size: 2348171 timestamp: 1675353652214 -- kind: conda - name: postgresql - version: '16.4' - build: h4b98a8f_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/postgresql-16.4-h4b98a8f_1.conda - sha256: 2399f6b2eea2af0bd37a6c71fe9055a83248fbbd438cde14d3057dabff39a279 - md5: 1286c495eb0b5817270acdf5b4144b03 - depends: - - __osx >=10.13 - - krb5 >=1.21.3,<1.22.0a0 - - libpq 16.4 h75a757a_1 - - libxml2 >=2.12.7,<3.0a0 - - libzlib >=1.3.1,<2.0a0 - - openssl >=3.3.1,<4.0a0 - - readline >=8.2,<9.0a0 - - tzcode - - tzdata - license: PostgreSQL - purls: [] - size: 4593109 - timestamp: 1724948725869 - kind: conda name: postgresql version: '16.4' @@ -25836,29 +20224,6 @@ packages: purls: [] size: 2726576 timestamp: 1722328352769 -- kind: conda - name: proj - version: 9.4.1 - build: hf92c781_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/proj-9.4.1-hf92c781_1.conda - sha256: 826e1fcd191d17a6f16c745779254265e4cf1cdbd1761e627e3cdf0b9d6ed487 - md5: edf9f0581ffc0f50a1159943be5d0729 - depends: - - __osx >=10.13 - - libcurl >=8.9.0,<9.0a0 - - libcxx >=16 - - libsqlite >=3.46.0,<4.0a0 - - libtiff >=4.6.0,<4.7.0a0 - - sqlite - constrains: - - proj4 ==999999999999 - license: MIT - license_family: MIT - purls: [] - size: 2831538 - timestamp: 1722327962605 - kind: conda name: proj version: 9.4.1 @@ -25935,25 +20300,6 @@ packages: purls: [] size: 6784 timestamp: 1718048101184 -- kind: conda - name: psutil - version: 6.0.0 - build: py311h3336109_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/psutil-6.0.0-py311h3336109_1.conda - sha256: f10f181173610dbd3459907b6ee99f581030372401d400e656fc6f1efce23582 - md5: dd6bc68808f33dad6a22bd7c66a14ef0 - depends: - - __osx >=10.13 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/psutil?source=hash-mapping - size: 514910 - timestamp: 1725738001143 - kind: conda name: psutil version: 6.0.0 @@ -26045,20 +20391,6 @@ packages: purls: [] size: 5625 timestamp: 1606147468727 -- kind: conda - name: pthread-stubs - version: '0.4' - build: hc929b4f_1001 - build_number: 1001 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pthread-stubs-0.4-hc929b4f_1001.tar.bz2 - sha256: 6e3900bb241bcdec513d4e7180fe9a19186c1a38f0b4080ed619d26014222c53 - md5: addd19059de62181cd11ae8f4ef26084 - license: MIT - license_family: MIT - purls: [] - size: 5653 - timestamp: 1606147699844 - kind: conda name: pthread-stubs version: '0.4' @@ -26154,21 +20486,6 @@ packages: purls: [] size: 111324 timestamp: 1696182979614 -- kind: conda - name: pugixml - version: '1.14' - build: he965462_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pugixml-1.14-he965462_0.conda - sha256: 8ba30eb9ead058a19a472bb8e795ab408c629b0b84fc5bb7b6899e7429d5e625 - md5: 92f9416f48c010bf04c34c9841c84b09 - depends: - - libcxx >=15.0.7 - license: MIT - license_family: MIT - purls: [] - size: 94175 - timestamp: 1696182807580 - kind: 
conda name: pure_eval version: 0.2.3 @@ -26258,25 +20575,6 @@ packages: license_family: BSD size: 5306628 timestamp: 1714140922537 -- kind: conda - name: py-rattler - version: 0.5.0 - build: py312he8fc997_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/py-rattler-0.5.0-py312he8fc997_0.conda - sha256: d520760ffe2c84c7e7410df9f121ac782dab87c607d50833954831c1534c3e06 - md5: e928de523a5635c78f4b3ec8f6834c97 - depends: - - __osx >=10.12 - - openssl >=3.3.0,<4.0a0 - - python >=3.12,<3.13.0a0 - - python_abi 3.12.* *_cp312 - constrains: - - __osx >=10.12 - license: BSD-3-Clause - license_family: BSD - size: 3499897 - timestamp: 1716891127118 - kind: conda name: py-triangle version: '20230923' @@ -26336,24 +20634,6 @@ packages: - pkg:pypi/triangle?source=hash-mapping size: 1206218 timestamp: 1695717015618 -- kind: conda - name: py-triangle - version: '20230923' - build: py311he705e18_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/py-triangle-20230923-py311he705e18_2.conda - sha256: d50caf5a02f56ec52deda3d2ce93badbbc94d23c8f61c5df023fa6a89eda1b9e - md5: 7749cf2ad8a781b4a3fb2d4bf40de569 - depends: - - numpy - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: LGPL and Triangle - purls: - - pkg:pypi/triangle?source=hash-mapping - size: 1203946 - timestamp: 1697728514082 - kind: conda name: pyarrow version: 17.0.0 @@ -26423,54 +20703,6 @@ packages: purls: [] size: 25719 timestamp: 1722487909182 -- kind: conda - name: pyarrow - version: 17.0.0 - build: py311he764780_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pyarrow-17.0.0-py311he764780_1.conda - sha256: 04227d9772eb1a116fff6992586f95f2fc38f60cba45222b1b04751cebe9923c - md5: 355083896d9cd52daf9e6cad2fe69368 - depends: - - libarrow-acero 17.0.0.* - - libarrow-dataset 17.0.0.* - - libarrow-substrait 17.0.0.* - - libparquet 17.0.0.* - - numpy >=1.19,<3 - - pyarrow-core 17.0.0 *_1_* - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: Apache-2.0 - license_family: APACHE - purls: [] - size: 25901 - timestamp: 1722487480901 -- kind: conda - name: pyarrow-core - version: 17.0.0 - build: py311h073f6b9_1_cpu - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pyarrow-core-17.0.0-py311h073f6b9_1_cpu.conda - sha256: d32ad71b9408a0c9cefc10783ae2e854345cb3527ff4b36d6ac69a59f252b240 - md5: b7a65f6c5653931e20663fb54d12776f - depends: - - __osx >=10.13 - - libarrow 17.0.0.* *cpu - - libcxx >=17 - - libzlib >=1.3.1,<2.0a0 - - numpy >=1.19,<3 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - constrains: - - apache-arrow-proc =*=cpu - license: Apache-2.0 - license_family: APACHE - purls: - - pkg:pypi/pyarrow?source=hash-mapping - size: 4139975 - timestamp: 1722487425649 - kind: conda name: pyarrow-core version: 17.0.0 @@ -26651,41 +20883,20 @@ packages: subdir: win-64 url: https://conda.anaconda.org/conda-forge/win-64/pydantic-core-2.23.3-py311h533ab2d_0.conda sha256: 47dc8bf5b2b3192b60fff750b8906555ead348d98d674aa973b4fd75da67e74e - md5: 61cdae48cc53c8e5b930e77dd58b7c2e - depends: - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - typing-extensions >=4.6.0,!=4.7.0 - - ucrt >=10.0.20348.0 - - vc >=14.2,<15 - - vc14_runtime >=14.29.30139 - license: MIT - license_family: MIT - purls: - - pkg:pypi/pydantic-core?source=hash-mapping - size: 1565195 - timestamp: 1725736815330 -- kind: conda - name: pydantic-core - version: 2.23.3 - build: py311h95688db_0 - subdir: osx-64 - url: 
https://conda.anaconda.org/conda-forge/osx-64/pydantic-core-2.23.3-py311h95688db_0.conda - sha256: e367a1b2d753dde157054bf3639d9069a41a89a7f84811fd1e21e598063cda0e - md5: 6f4686084db797df2f855c25de9c9b8b + md5: 61cdae48cc53c8e5b930e77dd58b7c2e depends: - - __osx >=10.13 - python >=3.11,<3.12.0a0 - python_abi 3.11.* *_cp311 - typing-extensions >=4.6.0,!=4.7.0 - constrains: - - __osx >=10.13 + - ucrt >=10.0.20348.0 + - vc >=14.2,<15 + - vc14_runtime >=14.29.30139 license: MIT license_family: MIT purls: - pkg:pypi/pydantic-core?source=hash-mapping - size: 1533152 - timestamp: 1725736010649 + size: 1565195 + timestamp: 1725736815330 - kind: conda name: pydantic-core version: 2.23.3 @@ -26747,25 +20958,6 @@ packages: license_family: MIT size: 1569350 timestamp: 1725736732484 -- kind: conda - name: pydantic-core - version: 2.23.3 - build: py312h669792a_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pydantic-core-2.23.3-py312h669792a_0.conda - sha256: 38f7ef2eb082a75cafbcc37d05e285780858dfff64004d80afdd03a04448a88b - md5: 6599b550ea3dae7abbeda4f44e78750c - depends: - - __osx >=10.13 - - python >=3.12,<3.13.0a0 - - python_abi 3.12.* *_cp312 - - typing-extensions >=4.6.0,!=4.7.0 - constrains: - - __osx >=10.13 - license: MIT - license_family: MIT - size: 1535653 - timestamp: 1725736002889 - kind: conda name: pydantic-core version: 2.23.3 @@ -26882,27 +21074,6 @@ packages: - pkg:pypi/pymetis?source=hash-mapping size: 103657 timestamp: 1699633597728 -- kind: conda - name: pymetis - version: 2023.1.1 - build: py311h5fe6d0d_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pymetis-2023.1.1-py311h5fe6d0d_2.conda - sha256: 39526578383952b3d7b895dc5ba4af93be8517fad01283f61ee68241bf34d41f - md5: 125c8e8855a02131b6a979bcda865b3d - depends: - - __osx >=10.9 - - libcxx >=16.0.6 - - metis >=5.1.1,<5.1.2.0a0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: Apache-2.0 - license_family: APACHE - purls: - - pkg:pypi/pymetis?source=hash-mapping - size: 107524 - timestamp: 1699633297476 - kind: conda name: pymetis version: 2023.1.1 @@ -26968,27 +21139,6 @@ packages: - pkg:pypi/pyobjc-core?source=hash-mapping size: 485377 timestamp: 1725739643057 -- kind: conda - name: pyobjc-core - version: 10.3.1 - build: py311hd6939f8_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pyobjc-core-10.3.1-py311hd6939f8_1.conda - sha256: 48de2a78d71e6c1a2681c1fbcf1f1503a29c58cc42cfc0fafa5c1b59a10eda94 - md5: c8e529b8f6a408dfc6a2bc0c607e2338 - depends: - - __osx >=10.13 - - libffi >=3.4,<4.0a0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - setuptools - license: MIT - license_family: MIT - purls: - - pkg:pypi/pyobjc-core?source=hash-mapping - size: 491149 - timestamp: 1725739585987 - kind: conda name: pyobjc-framework-cocoa version: 10.3.1 @@ -27011,27 +21161,6 @@ packages: - pkg:pypi/pyobjc-framework-cocoa?source=hash-mapping size: 384333 timestamp: 1725875205492 -- kind: conda - name: pyobjc-framework-cocoa - version: 10.3.1 - build: py311hd6939f8_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pyobjc-framework-cocoa-10.3.1-py311hd6939f8_1.conda - sha256: bf6179d71edb920cedf7ce4395f4447d5ae96a9deb5a44dcc1a6abffea0de4aa - md5: f3f565f99289de1cd140bdbea51b94eb - depends: - - __osx >=10.13 - - libffi >=3.4,<4.0a0 - - pyobjc-core 10.3.1.* - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - 
pkg:pypi/pyobjc-framework-cocoa?source=hash-mapping - size: 381020 - timestamp: 1725875173947 - kind: conda name: pyogrio version: 0.9.0 @@ -27110,31 +21239,6 @@ packages: - pkg:pypi/pyogrio?source=hash-mapping size: 742341 timestamp: 1725519877180 -- kind: conda - name: pyogrio - version: 0.9.0 - build: py311hdb57d13_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pyogrio-0.9.0-py311hdb57d13_2.conda - sha256: 0f8a08b59ba47f2a2b32cc7401dd2387b577ffc1df2e5aed25d85b1e48ff10c9 - md5: 3ff351e34ac564e1215573fc8150796c - depends: - - __osx >=10.13 - - gdal - - libcxx >=17 - - libgdal >=3.9.2,<3.10.0a0 - - libgdal-core >=3.9.2,<3.10.0a0 - - numpy - - packaging - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - pkg:pypi/pyogrio?source=hash-mapping - size: 671828 - timestamp: 1725520040341 - kind: conda name: pyparsing version: 3.1.4 @@ -27152,27 +21256,6 @@ packages: - pkg:pypi/pyparsing?source=hash-mapping size: 90129 timestamp: 1724616224956 -- kind: conda - name: pyproj - version: 3.6.1 - build: py311h48d2620_9 - build_number: 9 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pyproj-3.6.1-py311h48d2620_9.conda - sha256: a1d5ab74d2d309fc903639424202b2d5508672a15aba48825881b7e5816c011a - md5: 49a319ac9fde15792463c8cbbef4851e - depends: - - __osx >=10.13 - - certifi - - proj >=9.4.1,<9.5.0a0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: MIT - license_family: MIT - purls: - - pkg:pypi/pyproj?source=hash-mapping - size: 488567 - timestamp: 1725436192422 - kind: conda name: pyproj version: 3.6.1 @@ -27435,6 +21518,25 @@ packages: - pkg:pypi/pytest-cov?source=hash-mapping size: 25507 timestamp: 1711411153367 +- kind: conda + name: pytest-dotenv + version: 0.5.2 + build: pyhd8ed1ab_0 + subdir: noarch + noarch: python + url: https://conda.anaconda.org/conda-forge/noarch/pytest-dotenv-0.5.2-pyhd8ed1ab_0.tar.bz2 + sha256: 43ab7de6af7b298a9199aea2bf6fa481a3059ba1068dd0967fe3a040ff6e9303 + md5: 11b16b526f60cc18748c3fe45d10315a + depends: + - pytest >=5.0.0 + - python >=3.6 + - python-dotenv >=0.9.1 + license: MIT + license_family: MIT + purls: + - pkg:pypi/pytest-dotenv?source=hash-mapping + size: 7383 + timestamp: 1606859705188 - kind: conda name: pytest-xdist version: 3.6.1 @@ -27504,30 +21606,6 @@ packages: license: Python-2.0 size: 16002560 timestamp: 1687560007019 -- kind: conda - name: python - version: 3.10.12 - build: had23ca6_0_cpython - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/python-3.10.12-had23ca6_0_cpython.conda - sha256: cbf1b9cf9bdba639675a1431a053f3f2babb73ca6b4329cf72dcf9cd45a29cc8 - md5: 351b8aa0687f3510620cf06ad11229f4 - depends: - - bzip2 >=1.0.8,<2.0a0 - - libffi >=3.4,<4.0a0 - - libsqlite >=3.42.0,<4.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - ncurses >=6.4,<7.0a0 - - openssl >=3.1.1,<4.0a0 - - readline >=8.2,<9.0a0 - - tk >=8.6.12,<8.7.0a0 - - tzdata - - xz >=5.2.6,<6.0a0 - constrains: - - python_abi 3.10.* *_cp310 - license: Python-2.0 - size: 13065974 - timestamp: 1687560536470 - kind: conda name: python version: 3.10.12 @@ -27556,32 +21634,6 @@ packages: license: Python-2.0 size: 25543395 timestamp: 1687561173886 -- kind: conda - name: python - version: 3.11.0 - build: h3ba56d0_1_cpython - build_number: 1 - subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/python-3.11.0-h3ba56d0_1_cpython.conda - sha256: 28a54d78cd2624a12bd2ceb0f1816b0cba9b4fd97df846b5843b3c1d51642ab2 - md5: 
2aa7ca3702d9afd323ca34a9d98879d1 - depends: - - bzip2 >=1.0.8,<2.0a0 - - libffi >=3.4,<4.0a0 - - libsqlite >=3.40.0,<4.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - ncurses >=6.3,<7.0a0 - - openssl >=3.0.7,<4.0a0 - - readline >=8.1.2,<9.0a0 - - tk >=8.6.12,<8.7.0a0 - - tzdata - - xz >=5.2.6,<6.0a0 - constrains: - - python_abi 3.11.* *_cp311 - license: Python-2.0 - purls: [] - size: 14492975 - timestamp: 1673699560906 - kind: conda name: python version: 3.11.0 @@ -27607,31 +21659,6 @@ packages: license: Python-2.0 size: 14492975 timestamp: 1673699560906 -- kind: conda - name: python - version: 3.11.0 - build: hcf16a7b_0_cpython - subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/python-3.11.0-hcf16a7b_0_cpython.tar.bz2 - sha256: 20d1f1b5dc620b745c325844545fd5c0cdbfdb2385a0e27ef1507399844c8c6d - md5: 13ee3577afc291dabd2d9edc59736688 - depends: - - bzip2 >=1.0.8,<2.0a0 - - libffi >=3.4.2,<3.5.0a0 - - libsqlite >=3.39.4,<4.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - openssl >=3.0.5,<4.0a0 - - tk >=8.6.12,<8.7.0a0 - - tzdata - - vc >=14.1,<15 - - vs2015_runtime >=14.16.27033 - - xz >=5.2.6,<5.3.0a0 - constrains: - - python_abi 3.11.* *_cp311 - license: Python-2.0 - purls: [] - size: 19819816 - timestamp: 1666678800085 - kind: conda name: python version: 3.11.0 @@ -27683,89 +21710,94 @@ packages: constrains: - python_abi 3.11.* *_cp311 license: Python-2.0 - purls: [] size: 31476523 timestamp: 1673700777998 - kind: conda name: python - version: 3.11.0 - build: he550d4f_1_cpython - build_number: 1 - subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/python-3.11.0-he550d4f_1_cpython.conda - sha256: 464f998e406b645ba34771bb53a0a7c2734e855ee78dd021aa4dedfdb65659b7 - md5: 8d14fc2aa12db370a443753c8230be1e + version: 3.11.10 + build: h739c21a_0_cpython + subdir: osx-arm64 + url: https://conda.anaconda.org/conda-forge/osx-arm64/python-3.11.10-h739c21a_0_cpython.conda + sha256: c7a8698fff5e8b451c3168c14f2f3bf340d523cb8b197aacad9e890e4851df4d + md5: ec064c104aa080f5e5e4c159d8e8fed0 depends: + - __osx >=11.0 - bzip2 >=1.0.8,<2.0a0 - - ld_impl_linux-64 >=2.36.1 + - libexpat >=2.6.3,<3.0a0 - libffi >=3.4,<4.0a0 - - libgcc-ng >=12 - - libnsl >=2.0.0,<2.1.0a0 - - libsqlite >=3.40.0,<4.0a0 - - libuuid >=2.32.1,<3.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - ncurses >=6.3,<7.0a0 - - openssl >=3.0.7,<4.0a0 - - readline >=8.1.2,<9.0a0 - - tk >=8.6.12,<8.7.0a0 + - libsqlite >=3.46.1,<4.0a0 + - libzlib >=1.3.1,<2.0a0 + - ncurses >=6.5,<7.0a0 + - openssl >=3.3.2,<4.0a0 + - readline >=8.2,<9.0a0 + - tk >=8.6.13,<8.7.0a0 - tzdata - xz >=5.2.6,<6.0a0 constrains: - python_abi 3.11.* *_cp311 license: Python-2.0 - size: 31476523 - timestamp: 1673700777998 + purls: [] + size: 14524339 + timestamp: 1725966089506 - kind: conda name: python - version: 3.11.0 - build: he7542f4_1_cpython - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/python-3.11.0-he7542f4_1_cpython.conda - sha256: 5c069c9908e48a4490a56d3752c0bc93c2fc93ab8d8328efc869fdc707618e9f - md5: 9ecfa530b33aefd0d22e0272336f638a + version: 3.11.10 + build: hc5c86c4_0_cpython + subdir: linux-64 + url: https://conda.anaconda.org/conda-forge/linux-64/python-3.11.10-hc5c86c4_0_cpython.conda + sha256: 844bb9cefdfe93969fd9a9b593f6eb1ecbe6c53ab8d1a5d441bd7c93b31d0fef + md5: 43a02ff0a2dafe8a8a1b6a9eacdbd2cc depends: + - __glibc >=2.17,<3.0.a0 - bzip2 >=1.0.8,<2.0a0 + - ld_impl_linux-64 >=2.36.1 + - libexpat >=2.6.3,<3.0a0 - libffi >=3.4,<4.0a0 - - libsqlite >=3.40.0,<4.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - ncurses 
>=6.3,<7.0a0 - - openssl >=3.0.7,<4.0a0 - - readline >=8.1.2,<9.0a0 - - tk >=8.6.12,<8.7.0a0 + - libgcc >=13 + - libnsl >=2.0.1,<2.1.0a0 + - libsqlite >=3.46.1,<4.0a0 + - libuuid >=2.38.1,<3.0a0 + - libxcrypt >=4.4.36 + - libzlib >=1.3.1,<2.0a0 + - ncurses >=6.5,<7.0a0 + - openssl >=3.3.2,<4.0a0 + - readline >=8.2,<9.0a0 + - tk >=8.6.13,<8.7.0a0 - tzdata - xz >=5.2.6,<6.0a0 constrains: - python_abi 3.11.* *_cp311 license: Python-2.0 purls: [] - size: 15410083 - timestamp: 1673762717308 + size: 30607461 + timestamp: 1725967457875 - kind: conda name: python - version: 3.11.0 - build: he7542f4_1_cpython - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/python-3.11.0-he7542f4_1_cpython.conda - sha256: 5c069c9908e48a4490a56d3752c0bc93c2fc93ab8d8328efc869fdc707618e9f - md5: 9ecfa530b33aefd0d22e0272336f638a + version: 3.11.10 + build: hce54a09_0_cpython + subdir: win-64 + url: https://conda.anaconda.org/conda-forge/win-64/python-3.11.10-hce54a09_0_cpython.conda + sha256: ecc919108615142bc9281344151bee78158e0d93e07562e5dfe0c166848c092b + md5: d187a4d8bd52cc55e34cd92379a77b30 depends: - bzip2 >=1.0.8,<2.0a0 + - libexpat >=2.6.3,<3.0a0 - libffi >=3.4,<4.0a0 - - libsqlite >=3.40.0,<4.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - ncurses >=6.3,<7.0a0 - - openssl >=3.0.7,<4.0a0 - - readline >=8.1.2,<9.0a0 - - tk >=8.6.12,<8.7.0a0 + - libsqlite >=3.46.1,<4.0a0 + - libzlib >=1.3.1,<2.0a0 + - openssl >=3.3.2,<4.0a0 + - tk >=8.6.13,<8.7.0a0 - tzdata + - ucrt >=10.0.20348.0 + - vc >=14.2,<15 + - vc14_runtime >=14.29.30139 - xz >=5.2.6,<6.0a0 constrains: - python_abi 3.11.* *_cp311 license: Python-2.0 - size: 15410083 - timestamp: 1673762717308 + purls: [] + size: 18230109 + timestamp: 1725966041845 - kind: conda name: python version: 3.12.0 @@ -27785,38 +21817,13 @@ packages: - tzdata - ucrt >=10.0.20348.0 - vc >=14.2,<15 - - vc14_runtime >=14.29.30139 - - xz >=5.2.6,<6.0a0 - constrains: - - python_abi 3.12.* *_cp312 - license: Python-2.0 - size: 16140836 - timestamp: 1696321871976 -- kind: conda - name: python - version: 3.12.0 - build: h30d4d87_0_cpython - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/python-3.12.0-h30d4d87_0_cpython.conda - sha256: 0a1ed3983acbd0528bef5216179e46170f024f4409032875b27865568fef46a1 - md5: d11dc8f4551011fb6baa2865f1ead48f - depends: - - bzip2 >=1.0.8,<2.0a0 - - libexpat >=2.5.0,<3.0a0 - - libffi >=3.4,<4.0a0 - - libsqlite >=3.43.0,<4.0a0 - - libzlib >=1.2.13,<2.0.0a0 - - ncurses >=6.4,<7.0a0 - - openssl >=3.1.3,<4.0a0 - - readline >=8.2,<9.0a0 - - tk >=8.6.13,<8.7.0a0 - - tzdata + - vc14_runtime >=14.29.30139 - xz >=5.2.6,<6.0a0 constrains: - python_abi 3.12.* *_cp312 license: Python-2.0 - size: 14529683 - timestamp: 1696323482650 + size: 16140836 + timestamp: 1696321871976 - kind: conda name: python version: 3.12.0 @@ -27902,32 +21909,6 @@ packages: license: Python-2.0 size: 31663253 timestamp: 1723143721353 -- kind: conda - name: python - version: 3.12.5 - build: h37a9e06_0_cpython - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/python-3.12.5-h37a9e06_0_cpython.conda - sha256: c0f39e625b2fd65f70a9cc086fe4b25cc72228453dbbcd92cd5d140d080e38c5 - md5: 517cb4e16466f8d96ba2a72897d14c48 - depends: - - __osx >=10.13 - - bzip2 >=1.0.8,<2.0a0 - - libexpat >=2.6.2,<3.0a0 - - libffi >=3.4,<4.0a0 - - libsqlite >=3.46.0,<4.0a0 - - libzlib >=1.3.1,<2.0a0 - - ncurses >=6.5,<7.0a0 - - openssl >=3.3.1,<4.0a0 - - readline >=8.2,<9.0a0 - - tk >=8.6.13,<8.7.0a0 - - tzdata - - xz >=5.2.6,<6.0a0 - constrains: - - 
python_abi 3.12.* *_cp312 - license: Python-2.0 - size: 12173272 - timestamp: 1723142761765 - kind: conda name: python version: 3.12.5 @@ -28022,6 +22003,23 @@ packages: - pkg:pypi/python-dateutil?source=hash-mapping size: 222742 timestamp: 1709299922152 +- kind: conda + name: python-dotenv + version: 1.0.1 + build: pyhd8ed1ab_0 + subdir: noarch + noarch: python + url: https://conda.anaconda.org/conda-forge/noarch/python-dotenv-1.0.1-pyhd8ed1ab_0.conda + sha256: 2d4c80364f03315d606a50eddd493dbacc078e21412c2462c0f781eec49b572c + md5: c2997ea9360ac4e015658804a7a84f94 + depends: + - python >=3.8 + license: BSD-3-Clause + license_family: BSD + purls: + - pkg:pypi/python-dotenv?source=hash-mapping + size: 24278 + timestamp: 1706018281544 - kind: conda name: python-dotenv version: 1.0.1 @@ -28140,22 +22138,6 @@ packages: purls: [] size: 6211 timestamp: 1723823324668 -- kind: conda - name: python_abi - version: '3.11' - build: 5_cp311 - build_number: 5 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/python_abi-3.11-5_cp311.conda - sha256: 9b092850a268aca99600b724bae849f51209ecd5628e609b4699debc59ff1945 - md5: e6d62858c06df0be0e6255c753d74787 - constrains: - - python 3.11.* *_cpython - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 6303 - timestamp: 1723823062672 - kind: conda name: python_abi version: '3.11' @@ -28203,21 +22185,6 @@ packages: license_family: BSD size: 6238 timestamp: 1723823388266 -- kind: conda - name: python_abi - version: '3.12' - build: 5_cp312 - build_number: 5 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/python_abi-3.12-5_cp312.conda - sha256: 4da26c7508d5bc5d8621e84dc510284402239df56aab3587a7d217de9d3c806d - md5: c34dd4920e0addf7cfcc725809f25d8e - constrains: - - python 3.12.* *_cpython - license: BSD-3-Clause - license_family: BSD - size: 6312 - timestamp: 1723823137004 - kind: conda name: python_abi version: '3.12' @@ -28349,26 +22316,6 @@ packages: - pkg:pypi/pywinpty?source=hash-mapping size: 213169 timestamp: 1724951443134 -- kind: conda - name: pyyaml - version: 6.0.2 - build: py311h3336109_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pyyaml-6.0.2-py311h3336109_1.conda - sha256: d8f4513c53a7c0be9f1cdb9d1af31ac85cf8a6f0e4194715e36e915c03104662 - md5: b0132bec7165a53403dcc393ff761a9e - depends: - - __osx >=10.13 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - yaml >=0.2.5,<0.3.0a0 - license: MIT - license_family: MIT - purls: - - pkg:pypi/pyyaml?source=hash-mapping - size: 193941 - timestamp: 1725456465818 - kind: conda name: pyyaml version: 6.0.2 @@ -28502,44 +22449,6 @@ packages: - pkg:pypi/pyzmq?source=hash-mapping size: 387556 timestamp: 1725449077083 -- kind: conda - name: pyzmq - version: 26.2.0 - build: py311h95f92fe_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/pyzmq-26.2.0-py311h95f92fe_2.conda - sha256: 39a2a81e0d7f51862d8724291bef0f58849db5a9fb7274c460a74df13d64acd4 - md5: a70baba4bb42449282b53e07ead41ddd - depends: - - __osx >=10.13 - - libcxx >=17 - - libsodium >=1.0.20,<1.0.21.0a0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - zeromq >=4.3.5,<4.4.0a0 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/pyzmq?source=hash-mapping - size: 366949 - timestamp: 1725449221321 -- kind: conda - name: qhull - version: '2020.2' - build: h3c5361c_5 - build_number: 5 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/qhull-2020.2-h3c5361c_5.conda - sha256: 
79d804fa6af9c750e8b09482559814ae18cd8df549ecb80a4873537a5a31e06e - md5: dd1ea9ff27c93db7c01a7b7656bd4ad4 - depends: - - __osx >=10.13 - - libcxx >=16 - license: LicenseRef-Qhull - purls: [] - size: 528122 - timestamp: 1720814002588 - kind: conda name: qhull version: '2020.2' @@ -28590,44 +22499,6 @@ packages: purls: [] size: 1377020 timestamp: 1720814433486 -- kind: conda - name: qt6-main - version: 6.7.2 - build: h03f778c_5 - build_number: 5 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/qt6-main-6.7.2-h03f778c_5.conda - sha256: b5b87e6812f2814189e2360b877dfe5eb3b7d7ddab079974f2df72c838712bb6 - md5: 8ac7658fec2ca4849ebbcc610e5d5e49 - depends: - - __osx >=11.0 - - double-conversion >=3.3.0,<3.4.0a0 - - harfbuzz >=9.0.0,<10.0a0 - - icu >=75.1,<76.0a0 - - krb5 >=1.21.3,<1.22.0a0 - - libclang-cpp16 >=16.0.6,<16.1.0a0 - - libclang13 >=16.0.6 - - libcxx >=16 - - libglib >=2.80.3,<3.0a0 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libllvm16 >=16.0.6,<16.1.0a0 - - libpng >=1.6.43,<1.7.0a0 - - libpq >=16.4,<17.0a0 - - libsqlite >=3.46.0,<4.0a0 - - libtiff >=4.6.0,<4.7.0a0 - - libwebp-base >=1.4.0,<2.0a0 - - libzlib >=1.3.1,<2.0a0 - - mysql-libs >=9.0.1,<9.1.0a0 - - openssl >=3.3.1,<4.0a0 - - pcre2 >=10.44,<10.45.0a0 - - zstd >=1.5.6,<1.6.0a0 - constrains: - - qt 6.7.2 - license: LGPL-3.0-only - license_family: LGPL - purls: [] - size: 41498470 - timestamp: 1724535421868 - kind: conda name: qt6-main version: 6.7.2 @@ -28817,23 +22688,6 @@ packages: purls: [] size: 145494 timestamp: 1715007138921 -- kind: conda - name: rapidjson - version: 1.1.0.post20240409 - build: hf036a51_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/rapidjson-1.1.0.post20240409-hf036a51_1.conda - sha256: 07f88271bc5a73fc5a910895bf83bb6046b2b84a3935015c448667aef41abf9e - md5: 7b32c6b26b7c3a0d97ad484ab6f207c9 - depends: - - __osx >=10.13 - - libcxx >=16 - license: MIT - license_family: MIT - purls: [] - size: 145520 - timestamp: 1715007151117 - kind: conda name: rasterio version: 1.3.11 @@ -28867,38 +22721,6 @@ packages: - pkg:pypi/rasterio?source=hash-mapping size: 7606327 timestamp: 1726176768648 -- kind: conda - name: rasterio - version: 1.3.11 - build: py311h57fe283_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/rasterio-1.3.11-py311h57fe283_1.conda - sha256: db7fc93a62a6ea4ea7302ec8f0b3a7e2f358c33d4c80cb46e5fdf7f68c7c0cb3 - md5: 1f69b04debd92de77811173a2081165b - depends: - - __osx >=10.13 - - affine - - attrs - - certifi - - click >=4 - - click-plugins - - cligj >=0.5 - - libcxx >=17 - - libgdal >=3.9.2,<3.10.0a0 - - libgdal-core >=3.9.2,<3.10.0a0 - - numpy >=1.19,<3 - - proj >=9.4.1,<9.5.0a0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - setuptools >=0.9.8 - - snuggs >=1.4.1 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/rasterio?source=hash-mapping - size: 7743057 - timestamp: 1726176646431 - kind: conda name: rasterio version: 1.3.11 @@ -28997,22 +22819,6 @@ packages: purls: [] size: 26617 timestamp: 1708946796423 -- kind: conda - name: re2 - version: 2023.09.01 - build: hb168e87_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/re2-2023.09.01-hb168e87_2.conda - sha256: 5739ed2cfa62ed7f828eb4b9e6e69ff1df56cb9a9aacdc296451a3cb647034eb - md5: 266f8ca8528fc7e0fa31066c309ad864 - depends: - - libre2-11 2023.09.01 h81f5012_2 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 26814 - timestamp: 1708947195067 - kind: conda name: re2 
version: 2023.09.01 @@ -29093,37 +22899,6 @@ packages: license_family: GPL size: 250351 timestamp: 1679532511311 -- kind: conda - name: readline - version: '8.2' - build: h9e318b2_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/readline-8.2-h9e318b2_1.conda - sha256: 41e7d30a097d9b060037f0c6a2b1d4c4ae7e942c06c943d23f9d481548478568 - md5: f17f77f2acf4d344734bda76829ce14e - depends: - - ncurses >=6.3,<7.0a0 - license: GPL-3.0-only - license_family: GPL - purls: [] - size: 255870 - timestamp: 1679532707590 -- kind: conda - name: readline - version: '8.2' - build: h9e318b2_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/readline-8.2-h9e318b2_1.conda - sha256: 41e7d30a097d9b060037f0c6a2b1d4c4ae7e942c06c943d23f9d481548478568 - md5: f17f77f2acf4d344734bda76829ce14e - depends: - - ncurses >=6.3,<7.0a0 - license: GPL-3.0-only - license_family: GPL - size: 255870 - timestamp: 1679532707590 - kind: conda name: readme_renderer version: '44.0' @@ -29361,27 +23136,6 @@ packages: - pkg:pypi/rpds-py?source=hash-mapping size: 208679 timestamp: 1725327961461 -- kind: conda - name: rpds-py - version: 0.20.0 - build: py311h95688db_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/rpds-py-0.20.0-py311h95688db_1.conda - sha256: 8cd75a394aea88873df33fce27865bd8a40c9ebb13e08ceb15a77f720a0b7664 - md5: 725a2cae824df9c489c72dc9b02bf86d - depends: - - __osx >=10.13 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - constrains: - - __osx >=10.13 - license: MIT - license_family: MIT - purls: - - pkg:pypi/rpds-py?source=hash-mapping - size: 297046 - timestamp: 1725327351207 - kind: conda name: rpds-py version: 0.20.0 @@ -29406,12 +23160,12 @@ packages: timestamp: 1725327207078 - kind: conda name: ruff - version: 0.6.4 + version: 0.6.5 build: py311h2cf8269_0 subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/ruff-0.6.4-py311h2cf8269_0.conda - sha256: d1e34c233e453aed60f424202dd938472c7e05f2d7bdbe838b9ce5c72ffcb482 - md5: aae92f44f2d3b3a39d52bf8e43457d89 + url: https://conda.anaconda.org/conda-forge/osx-arm64/ruff-0.6.5-py311h2cf8269_0.conda + sha256: bbe7935570930ef51d6533256b4e2af0e140ba1f229d4eb8fcafc47edad58f23 + md5: 94c9ea7da4b652fd377137bc729a7d9a depends: - __osx >=11.0 - libcxx >=17 @@ -29424,37 +23178,16 @@ packages: license_family: MIT purls: - pkg:pypi/ruff?source=hash-mapping - size: 6010594 - timestamp: 1725618632541 + size: 6004241 + timestamp: 1726264305778 - kind: conda name: ruff - version: 0.6.4 - build: py311h8c6096b_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/ruff-0.6.4-py311h8c6096b_0.conda - sha256: 81b19e28e9b8799e057856a2632dba9341abfbf58b3170d7652e942ed93f78ab - md5: e98ab5307a64ffcc5606286616aee960 - depends: - - __osx >=10.13 - - libcxx >=17 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - constrains: - - __osx >=10.12 - license: MIT - license_family: MIT - purls: - - pkg:pypi/ruff?source=hash-mapping - size: 6300562 - timestamp: 1725618442272 -- kind: conda - name: ruff - version: 0.6.4 + version: 0.6.5 build: py311heeab51b_0 subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/ruff-0.6.4-py311heeab51b_0.conda - sha256: 9aae8777f57cec453a359e814a9b9ef5db96f4cc8b5c70d89d2050269aae45b7 - md5: 34f3dc523009e4710b4fc21e55d39ff9 + url: https://conda.anaconda.org/conda-forge/win-64/ruff-0.6.5-py311heeab51b_0.conda + sha256: 0abe4ac55ea8b8f4e7b752f583ba7c14475ddb1f52eeba7f4d6d16e1dff4d7ca + md5: 
2cbd2d6ddee11b1722d0f7c6d6bf2033 depends: - python >=3.11,<3.12.0a0 - python_abi 3.11.* *_cp311 @@ -29465,16 +23198,16 @@ packages: license_family: MIT purls: - pkg:pypi/ruff?source=hash-mapping - size: 6457753 - timestamp: 1725619339023 + size: 6470052 + timestamp: 1726265748106 - kind: conda name: ruff - version: 0.6.4 + version: 0.6.5 build: py311hef32070_0 subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/ruff-0.6.4-py311hef32070_0.conda - sha256: 70ab079ed46f55affa8991753560dcac7a3564c49f22dd62316c3a97e1f8b243 - md5: cda8b625beddd4edca000d1d6299973c + url: https://conda.anaconda.org/conda-forge/linux-64/ruff-0.6.5-py311hef32070_0.conda + sha256: 171eb0b11cadc910f92f8a20f479c1c6ed665c9068ae1d4415030b1439c3ac27 + md5: d77981869cb07544a753b8d2cc1814ec depends: - __glibc >=2.17,<3.0.a0 - libgcc >=13 @@ -29485,8 +23218,8 @@ packages: license_family: MIT purls: - pkg:pypi/ruff?source=hash-mapping - size: 6536890 - timestamp: 1725618269280 + size: 6556551 + timestamp: 1726264072019 - kind: conda name: s2n version: 1.5.2 @@ -29556,31 +23289,6 @@ packages: - pkg:pypi/scikit-learn?source=hash-mapping size: 9729967 timestamp: 1726083178729 -- kind: conda - name: scikit-learn - version: 1.5.2 - build: py311ha1d5734_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/scikit-learn-1.5.2-py311ha1d5734_1.conda - sha256: 2bec7d70f518550b2d573f2cbe083db95dd65e5b4f0fb417a0c94300bfd146e1 - md5: 2797c34b77d64a38c092dd9b21ed908b - depends: - - __osx >=10.13 - - joblib >=1.2.0 - - libcxx >=17 - - llvm-openmp >=17.0.6 - - numpy >=1.19,<3 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - scipy - - threadpoolctl >=3.1.0 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/scikit-learn?source=hash-mapping - size: 9712928 - timestamp: 1726083255913 - kind: conda name: scikit-learn version: 1.5.2 @@ -29634,33 +23342,6 @@ packages: - pkg:pypi/scipy?source=hash-mapping size: 15374158 timestamp: 1724328343933 -- kind: conda - name: scipy - version: 1.14.1 - build: py311hb3ed397_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/scipy-1.14.1-py311hb3ed397_0.conda - sha256: b295c8c7984da0bf910d6e55ec5def15ba21d287a3606ed4310ad5f6639de8c7 - md5: ad59f76d9b7b02fbcdddf741bb4d531a - depends: - - __osx >=10.13 - - libblas >=3.9.0,<4.0a0 - - libcblas >=3.9.0,<4.0a0 - - libcxx >=17 - - libgfortran 5.* - - libgfortran5 >=13.2.0 - - liblapack >=3.9.0,<4.0a0 - - numpy <2.3 - - numpy >=1.19,<3 - - numpy >=1.23.5 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/scipy?source=hash-mapping - size: 16255436 - timestamp: 1724327987128 - kind: conda name: scipy version: 1.14.1 @@ -29823,7 +23504,7 @@ packages: license: MIT license_family: MIT purls: - - pkg:pypi/setuptools?source=compressed-mapping + - pkg:pypi/setuptools?source=hash-mapping size: 1460460 timestamp: 1725348602179 - kind: conda @@ -29890,28 +23571,7 @@ packages: depends: - __glibc >=2.17,<3.0.a0 - geos >=3.12.2,<3.12.3.0a0 - - libgcc >=13 - - numpy >=1.19,<3 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/shapely?source=hash-mapping - size: 582017 - timestamp: 1725394121924 -- kind: conda - name: shapely - version: 2.0.6 - build: py311hbb437d5_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/shapely-2.0.6-py311hbb437d5_1.conda - sha256: 
07be6f482a92ef60158270b4dff118596e96e314d59f147de9d8861242570da8 - md5: 6debc457cfee51432c6cc3967757af76 - depends: - - __osx >=10.13 - - geos >=3.12.2,<3.12.3.0a0 + - libgcc >=13 - numpy >=1.19,<3 - python >=3.11,<3.12.0a0 - python_abi 3.11.* *_cp311 @@ -29919,8 +23579,8 @@ packages: license_family: BSD purls: - pkg:pypi/shapely?source=hash-mapping - size: 546750 - timestamp: 1725394094819 + size: 582017 + timestamp: 1725394121924 - kind: conda name: shapely version: 2.0.6 @@ -30047,22 +23707,6 @@ packages: purls: [] size: 35708 timestamp: 1720003794374 -- kind: conda - name: snappy - version: 1.2.1 - build: he1e6707_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/snappy-1.2.1-he1e6707_0.conda - sha256: a979319cd4916f0e7450aa92bb3cf4c2518afa80be50de99f31d075e693a6dd9 - md5: ddceef5df973c8ff7d6b32353c0cb358 - depends: - - __osx >=10.13 - - libcxx >=16 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 37036 - timestamp: 1720003862906 - kind: conda name: sniffio version: 1.3.1 @@ -30152,24 +23796,6 @@ packages: - pkg:pypi/soupsieve?source=hash-mapping size: 36754 timestamp: 1693929424267 -- kind: conda - name: spdlog - version: 1.14.1 - build: h325aa07_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/spdlog-1.14.1-h325aa07_1.conda - sha256: ec594f80f82f69472cf518795303a222a03460cc4102c4758b33eab833640024 - md5: 4aa13d84a5c71b5df6642761a6c35ce9 - depends: - - __osx >=10.13 - - fmt >=11.0.1,<12.0a0 - - libcxx >=16 - license: MIT - license_family: MIT - purls: [] - size: 171455 - timestamp: 1722238446029 - kind: conda name: spdlog version: 1.14.1 @@ -30440,24 +24066,6 @@ packages: purls: [] size: 859188 timestamp: 1725353670478 -- kind: conda - name: sqlite - version: 3.46.1 - build: he26b093_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/sqlite-3.46.1-he26b093_0.conda - sha256: 668dcc8c38aabf42de440f682de4afe11f390b1dc5b49e09b34501bbf19571c8 - md5: 56a8cc349cf8e2310ee0e52f90247dab - depends: - - __osx >=10.13 - - libsqlite 3.46.1 h4b8f8c9_0 - - libzlib >=1.3.1,<2.0a0 - - ncurses >=6.5,<7.0a0 - - readline >=8.2,<9.0a0 - license: Unlicense - purls: [] - size: 912164 - timestamp: 1725353686354 - kind: conda name: stack_data version: 0.6.2 @@ -30511,22 +24119,6 @@ packages: purls: [] size: 1326484 timestamp: 1724459521607 -- kind: conda - name: svt-av1 - version: 2.2.1 - build: hac325c4_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/svt-av1-2.2.1-hac325c4_0.conda - sha256: 9e229a7e34d0526c9e52bac85e3aa4c3d8c25df4f8618274bc135f9c19140a5d - md5: 07799aecfd86318fa5b4c5202b7acdce - depends: - - __osx >=10.13 - - libcxx >=17 - license: BSD-2-Clause - license_family: BSD - purls: [] - size: 2153628 - timestamp: 1724459465920 - kind: conda name: svt-av1 version: 2.2.1 @@ -30544,23 +24136,6 @@ packages: purls: [] size: 1704957 timestamp: 1724459941490 -- kind: conda - name: tbb - version: 2021.13.0 - build: h37c8870_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/tbb-2021.13.0-h37c8870_0.conda - sha256: 9a20a60ebf743f99e38a7be049f8ca90f264851c13dc8cb41eb09d854a631e31 - md5: 89742f5ac7aeb5c44ec2b4c3c6692c3c - depends: - - __osx >=10.13 - - libcxx >=17 - - libhwloc >=2.11.1,<2.11.2.0a0 - license: Apache-2.0 - license_family: APACHE - purls: [] - size: 159453 - timestamp: 1725532728568 - kind: conda name: tbb version: 2021.13.0 @@ -30661,21 +24236,6 @@ packages: purls: [] size: 1054129 timestamp: 1725532588872 -- kind: conda - name: tbb-devel - version: 
2021.13.0 - build: hf74753b_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/tbb-devel-2021.13.0-hf74753b_0.conda - sha256: 87211cb5e0ac89bae332a9d709400adb0a89c2cd0de4832bdac9f5df4e31d411 - md5: 0b573dab0c70a7db11f734c7a711c126 - depends: - - __osx >=10.13 - - libcxx >=17 - - tbb 2021.13.0 h37c8870_0 - purls: [] - size: 1055091 - timestamp: 1725532749871 - kind: conda name: tblib version: 3.0.0 @@ -30773,48 +24333,14 @@ packages: - kind: conda name: tiledb version: 2.26.0 - build: h313d0e2_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/tiledb-2.26.0-h313d0e2_0.conda - sha256: 56f5f42acb0c46f8a0871c8f092850fc58fb8a542adaeb9f3f412ffccacfecde - md5: 7034c5fe1336d6f1c86299ce8e545de0 - depends: - - __osx >=10.13 - - aws-crt-cpp >=0.28.2,<0.28.3.0a0 - - aws-sdk-cpp >=1.11.379,<1.11.380.0a0 - - azure-core-cpp >=1.13.0,<1.13.1.0a0 - - azure-identity-cpp >=1.8.0,<1.8.1.0a0 - - azure-storage-blobs-cpp >=12.12.0,<12.12.1.0a0 - - azure-storage-common-cpp >=12.7.0,<12.7.1.0a0 - - bzip2 >=1.0.8,<2.0a0 - - fmt >=11.0.2,<12.0a0 - - libabseil * cxx17* - - libabseil >=20240116.2,<20240117.0a0 - - libcurl >=8.9.1,<9.0a0 - - libcxx >=17 - - libgoogle-cloud >=2.28.0,<2.29.0a0 - - libgoogle-cloud-storage >=2.28.0,<2.29.0a0 - - libwebp-base >=1.4.0,<2.0a0 - - libzlib >=1.3.1,<2.0a0 - - lz4-c >=1.9.3,<1.10.0a0 - - openssl >=3.3.2,<4.0a0 - - spdlog >=1.14.1,<1.15.0a0 - - zstd >=1.5.6,<1.6.0a0 - license: MIT - license_family: MIT - purls: [] - size: 3960161 - timestamp: 1726059270643 -- kind: conda - name: tiledb - version: 2.26.0 - build: h3c94177_0 - subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/tiledb-2.26.0-h3c94177_0.conda - sha256: 02c930cfec6a9b4a9a3837a1e5e723d1dc163cb9c60f186f6817c87ee028ad62 - md5: ad1a0bc095a4e0d1d15eafd888c119b8 + build: h93dd694_1 + build_number: 1 + subdir: linux-64 + url: https://conda.anaconda.org/conda-forge/linux-64/tiledb-2.26.0-h93dd694_1.conda + sha256: e74495d56c0b3ef0cacd932cb960b5464c75000e04931f4f2c97071f3f862837 + md5: 01e8e2e49932006efe6516c2fb9d675b depends: - - __osx >=11.0 + - __glibc >=2.17,<3.0.a0 - aws-crt-cpp >=0.28.2,<0.28.3.0a0 - aws-sdk-cpp >=1.11.379,<1.11.380.0a0 - azure-core-cpp >=1.13.0,<1.13.1.0a0 @@ -30825,10 +24351,11 @@ packages: - fmt >=11.0.2,<12.0a0 - libabseil * cxx17* - libabseil >=20240116.2,<20240117.0a0 - - libcurl >=8.9.1,<9.0a0 - - libcxx >=17 - - libgoogle-cloud >=2.28.0,<2.29.0a0 - - libgoogle-cloud-storage >=2.28.0,<2.29.0a0 + - libcurl >=8.10.0,<9.0a0 + - libgcc >=13 + - libgoogle-cloud >=2.29.0,<2.30.0a0 + - libgoogle-cloud-storage >=2.29.0,<2.30.0a0 + - libstdcxx >=13 - libwebp-base >=1.4.0,<2.0a0 - libzlib >=1.3.1,<2.0a0 - lz4-c >=1.9.3,<1.10.0a0 @@ -30838,18 +24365,18 @@ packages: license: MIT license_family: MIT purls: [] - size: 3588995 - timestamp: 1726059138173 + size: 4542050 + timestamp: 1726339354403 - kind: conda name: tiledb version: 2.26.0 - build: h86fa3b2_0 - subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/tiledb-2.26.0-h86fa3b2_0.conda - sha256: 3e92cec15daed5e03d7fc676a021500fc92ac80716495504537d6e4bdb80138f - md5: 061175d9d4c046a1cf8bffe95a359fab + build: hefd1f8f_1 + build_number: 1 + subdir: win-64 + url: https://conda.anaconda.org/conda-forge/win-64/tiledb-2.26.0-hefd1f8f_1.conda + sha256: fef3c523acb956f5c1e5dec0cbf3cc5a23a81487ac29231c1816cd321fabc096 + md5: 75ca9c4dfbf11e51c676aab32a21cde6 depends: - - __glibc >=2.17,<3.0.a0 - aws-crt-cpp >=0.28.2,<0.28.3.0a0 - aws-sdk-cpp >=1.11.379,<1.11.380.0a0 - azure-core-cpp 
>=1.13.0,<1.13.1.0a0 @@ -30860,31 +24387,35 @@ packages: - fmt >=11.0.2,<12.0a0 - libabseil * cxx17* - libabseil >=20240116.2,<20240117.0a0 - - libcurl >=8.9.1,<9.0a0 - - libgcc >=13 - - libgoogle-cloud >=2.28.0,<2.29.0a0 - - libgoogle-cloud-storage >=2.28.0,<2.29.0a0 - - libstdcxx >=13 + - libcrc32c >=1.1.2,<1.2.0a0 + - libcurl >=8.10.0,<9.0a0 + - libgoogle-cloud >=2.29.0,<2.30.0a0 + - libgoogle-cloud-storage >=2.29.0,<2.30.0a0 - libwebp-base >=1.4.0,<2.0a0 - libzlib >=1.3.1,<2.0a0 - lz4-c >=1.9.3,<1.10.0a0 - openssl >=3.3.2,<4.0a0 - spdlog >=1.14.1,<1.15.0a0 + - ucrt >=10.0.20348.0 + - vc >=14.3,<15 + - vc14_runtime >=14.40.33810 - zstd >=1.5.6,<1.6.0a0 license: MIT license_family: MIT purls: [] - size: 4537477 - timestamp: 1726059097900 + size: 3103557 + timestamp: 1726339623315 - kind: conda name: tiledb version: 2.26.0 - build: h98a567f_0 - subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/tiledb-2.26.0-h98a567f_0.conda - sha256: 823d6d5c172cd90b105553d5dd93e07e0860c8e5751deb3cd076b684366797d7 - md5: 451f161732757b5124fc3a320401c587 + build: hfe5b9dc_1 + build_number: 1 + subdir: osx-arm64 + url: https://conda.anaconda.org/conda-forge/osx-arm64/tiledb-2.26.0-hfe5b9dc_1.conda + sha256: 2c9e708aa413b8662afe84d4b4a7d181fe1713dd7ea8e4798228a213acd579ab + md5: b74bc83b11eaaf3d0245fd3177eb6f9a depends: + - __osx >=11.0 - aws-crt-cpp >=0.28.2,<0.28.3.0a0 - aws-sdk-cpp >=1.11.379,<1.11.380.0a0 - azure-core-cpp >=1.13.0,<1.13.1.0a0 @@ -30895,24 +24426,21 @@ packages: - fmt >=11.0.2,<12.0a0 - libabseil * cxx17* - libabseil >=20240116.2,<20240117.0a0 - - libcrc32c >=1.1.2,<1.2.0a0 - - libcurl >=8.9.1,<9.0a0 - - libgoogle-cloud >=2.28.0,<2.29.0a0 - - libgoogle-cloud-storage >=2.28.0,<2.29.0a0 + - libcurl >=8.10.0,<9.0a0 + - libcxx >=17 + - libgoogle-cloud >=2.29.0,<2.30.0a0 + - libgoogle-cloud-storage >=2.29.0,<2.30.0a0 - libwebp-base >=1.4.0,<2.0a0 - libzlib >=1.3.1,<2.0a0 - lz4-c >=1.9.3,<1.10.0a0 - openssl >=3.3.2,<4.0a0 - spdlog >=1.14.1,<1.15.0a0 - - ucrt >=10.0.20348.0 - - vc >=14.3,<15 - - vc14_runtime >=14.40.33810 - zstd >=1.5.6,<1.6.0a0 license: MIT license_family: MIT purls: [] - size: 3093646 - timestamp: 1726059615242 + size: 3582635 + timestamp: 1726338867693 - kind: conda name: tinycss2 version: 1.3.0 @@ -30931,37 +24459,6 @@ packages: - pkg:pypi/tinycss2?source=hash-mapping size: 25405 timestamp: 1713975078735 -- kind: conda - name: tk - version: 8.6.13 - build: h1abcd95_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/tk-8.6.13-h1abcd95_1.conda - sha256: 30412b2e9de4ff82d8c2a7e5d06a15f4f4fef1809a72138b6ccb53a33b26faf5 - md5: bf830ba5afc507c6232d4ef0fb1a882d - depends: - - libzlib >=1.2.13,<2.0.0a0 - license: TCL - license_family: BSD - purls: [] - size: 3270220 - timestamp: 1699202389792 -- kind: conda - name: tk - version: 8.6.13 - build: h1abcd95_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/tk-8.6.13-h1abcd95_1.conda - sha256: 30412b2e9de4ff82d8c2a7e5d06a15f4f4fef1809a72138b6ccb53a33b26faf5 - md5: bf830ba5afc507c6232d4ef0fb1a882d - depends: - - libzlib >=1.2.13,<2.0.0a0 - license: TCL - license_family: BSD - size: 3270220 - timestamp: 1699202389792 - kind: conda name: tk version: 8.6.13 @@ -31129,25 +24626,6 @@ packages: - pkg:pypi/toolz?source=hash-mapping size: 52358 timestamp: 1706112720607 -- kind: conda - name: tornado - version: 6.4.1 - build: py311h3336109_1 - build_number: 1 - subdir: osx-64 - url: 
https://conda.anaconda.org/conda-forge/osx-64/tornado-6.4.1-py311h3336109_1.conda - sha256: 2e54c0d478b8d0793f89b855749aa74acaa185d08d353d8e5aa95f8e89eb6123 - md5: 5e051c4c2b80c381173b2c1719265617 - depends: - - __osx >=10.13 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: Apache-2.0 - license_family: Apache - purls: - - pkg:pypi/tornado?source=hash-mapping - size: 856251 - timestamp: 1724956238423 - kind: conda name: tornado version: 6.4.1 @@ -31436,21 +24914,6 @@ packages: - pkg:pypi/typing-utils?source=hash-mapping size: 13829 timestamp: 1622899345711 -- kind: conda - name: tzcode - version: 2024b - build: h00291cd_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/tzcode-2024b-h00291cd_0.conda - sha256: 3cce425fc4b1ab8ccd57e1591334b1ef37c11af108620c283d09902bfb78ada8 - md5: 146c172e6c1e704f8ba8a57a693da033 - depends: - - __osx >=10.13 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 62685 - timestamp: 1725600484536 - kind: conda name: tzcode version: 2024b @@ -31588,22 +25051,6 @@ packages: purls: [] size: 49181 timestamp: 1715010467661 -- kind: conda - name: uriparser - version: 0.9.8 - build: h6aefe2f_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/uriparser-0.9.8-h6aefe2f_0.conda - sha256: fec8e52955fc314580a93dee665349bf430ce6df19019cea3fae7ec60f732bdd - md5: 649890a63cc818b24fbbf0572db221a5 - depends: - - __osx >=10.9 - - libcxx >=16 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 43396 - timestamp: 1715010079800 - kind: conda name: uriparser version: 0.9.8 @@ -31622,14 +25069,13 @@ packages: timestamp: 1715010035325 - kind: conda name: urllib3 - version: 2.2.2 - build: pyhd8ed1ab_1 - build_number: 1 + version: 2.2.3 + build: pyhd8ed1ab_0 subdir: noarch noarch: python - url: https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.2-pyhd8ed1ab_1.conda - sha256: 00c47c602c03137e7396f904eccede8cc64cc6bad63ce1fc355125df8882a748 - md5: e804c43f58255e977093a2298e442bb8 + url: https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.3-pyhd8ed1ab_0.conda + sha256: b6bb34ce41cd93956ad6eeee275ed52390fb3788d6c75e753172ea7ac60b66e5 + md5: 6b55867f385dd762ed99ea687af32a69 depends: - brotli-python >=1.0.9 - h2 >=4,<5 @@ -31640,8 +25086,8 @@ packages: license_family: MIT purls: - pkg:pypi/urllib3?source=hash-mapping - size: 95048 - timestamp: 1719391384778 + size: 98076 + timestamp: 1726496531769 - kind: conda name: utfcpp version: 4.0.5 @@ -31654,18 +25100,6 @@ packages: purls: [] size: 14042 timestamp: 1704191209163 -- kind: conda - name: utfcpp - version: 4.0.5 - build: h694c41f_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/utfcpp-4.0.5-h694c41f_0.conda - sha256: a480e0a0e563d3915221d52bb11e0acf1a1bb58aa913349eefe8dca6ce02d4f4 - md5: f59ae41dec5f4035713eb00b552c6eb9 - license: BSL-1.0 - purls: [] - size: 13826 - timestamp: 1704191204619 - kind: conda name: utfcpp version: 4.0.5 @@ -31806,124 +25240,52 @@ packages: license: BSD-3-Clause license_family: BSD purls: [] - size: 20411 - timestamp: 1724915503517 -- kind: conda - name: vtk - version: 9.3.1 - build: qt_py311h31e1f40_205 - build_number: 205 - subdir: win-64 - url: https://conda.anaconda.org/conda-forge/win-64/vtk-9.3.1-qt_py311h31e1f40_205.conda - sha256: e985c2f027cddb9f9d958f251ceceecfeb0266d312ead59a2ee5fa6cc8cb354f - md5: 428e2669a1e334e4ade6ecfd3d36eab5 - depends: - - vtk-base 9.3.1 qt_py311h865aaba_205 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 20750 - timestamp: 1724917536745 -- 
kind: conda - name: vtk - version: 9.3.1 - build: qt_py311hadc0db7_205 - build_number: 205 - subdir: linux-64 - url: https://conda.anaconda.org/conda-forge/linux-64/vtk-9.3.1-qt_py311hadc0db7_205.conda - sha256: 094fead997358052b811aee398684d7b1312312f0457436c99fb2b1e236e41c2 - md5: d7d58c215111f04c7af805e67795c646 - depends: - - vtk-base 9.3.1 qt_py311h680aef5_205 - - vtk-io-ffmpeg 9.3.1 qt_py311he1e5eab_205 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 20423 - timestamp: 1724915280481 -- kind: conda - name: vtk - version: 9.3.1 - build: qt_py311hccf493d_205 - build_number: 205 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/vtk-9.3.1-qt_py311hccf493d_205.conda - sha256: 6b2867d3b5b3c0d3110a43309d6399e85062120288ddeb8db80fb4e4baecdf86 - md5: 91976e71f6b3602341569cfbad297d4e - depends: - - vtk-base 9.3.1 qt_py311h579de60_205 - - vtk-io-ffmpeg 9.3.1 qt_py311h98fac4b_205 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 20443 - timestamp: 1724913888506 -- kind: conda - name: vtk-base - version: 9.3.1 - build: qt_py311h14e0e01_205 - build_number: 205 - subdir: osx-arm64 - url: https://conda.anaconda.org/conda-forge/osx-arm64/vtk-base-9.3.1-qt_py311h14e0e01_205.conda - sha256: 4df146e81c185e8154dc0e5da58c278c7a9191c28ba3fc45cf225a69f6e58309 - md5: aa9485229dc5e267133459af5664b13a - depends: - - __osx >=11.0 - - double-conversion >=3.3.0,<3.4.0a0 - - eigen - - expat - - freetype >=2.12.1,<3.0a0 - - gl2ps >=1.4.2,<1.4.3.0a0 - - glew >=2.1.0,<2.2.0a0 - - hdf5 >=1.14.3,<1.14.4.0a0 - - jsoncpp >=1.9.5,<1.9.6.0a0 - - libcxx >=17 - - libexpat >=2.6.2,<3.0a0 - - libjpeg-turbo >=3.0.0,<4.0a0 - - libnetcdf >=4.9.2,<4.9.3.0a0 - - libogg >=1.3.5,<1.4.0a0 - - libpng >=1.6.43,<1.7.0a0 - - libsqlite >=3.46.0,<4.0a0 - - libtheora >=1.1.1,<1.2.0a0 - - libtiff >=4.6.0,<4.7.0a0 - - libxml2 >=2.12.7,<3.0a0 - - libzlib >=1.3.1,<2.0a0 - - loguru - - lz4-c >=1.9.3,<1.10.0a0 - - nlohmann_json - - numpy - - proj >=9.4.1,<9.5.0a0 - - pugixml >=1.14,<1.15.0a0 - - python >=3.11,<3.12.0a0 - - python >=3.11,<3.12.0a0 *_cpython - - python_abi 3.11.* *_cp311 - - qt6-main >=6.7.2,<6.8.0a0 - - sqlite - - tbb >=2021.12.0 - - tbb-devel - - tk >=8.6.13,<8.7.0a0 - - utfcpp - - wslink - - zlib - constrains: - - paraview ==9999999999 - - libboost_headers + size: 20411 + timestamp: 1724915503517 +- kind: conda + name: vtk + version: 9.3.1 + build: qt_py311h31e1f40_205 + build_number: 205 + subdir: win-64 + url: https://conda.anaconda.org/conda-forge/win-64/vtk-9.3.1-qt_py311h31e1f40_205.conda + sha256: e985c2f027cddb9f9d958f251ceceecfeb0266d312ead59a2ee5fa6cc8cb354f + md5: 428e2669a1e334e4ade6ecfd3d36eab5 + depends: + - vtk-base 9.3.1 qt_py311h865aaba_205 license: BSD-3-Clause license_family: BSD purls: [] - size: 34715664 - timestamp: 1724915285262 + size: 20750 + timestamp: 1724917536745 +- kind: conda + name: vtk + version: 9.3.1 + build: qt_py311hadc0db7_205 + build_number: 205 + subdir: linux-64 + url: https://conda.anaconda.org/conda-forge/linux-64/vtk-9.3.1-qt_py311hadc0db7_205.conda + sha256: 094fead997358052b811aee398684d7b1312312f0457436c99fb2b1e236e41c2 + md5: d7d58c215111f04c7af805e67795c646 + depends: + - vtk-base 9.3.1 qt_py311h680aef5_205 + - vtk-io-ffmpeg 9.3.1 qt_py311he1e5eab_205 + license: BSD-3-Clause + license_family: BSD + purls: [] + size: 20423 + timestamp: 1724915280481 - kind: conda name: vtk-base version: 9.3.1 - build: qt_py311h579de60_205 + build: qt_py311h14e0e01_205 build_number: 205 - subdir: osx-64 - url: 
https://conda.anaconda.org/conda-forge/osx-64/vtk-base-9.3.1-qt_py311h579de60_205.conda - sha256: 794e3cad477948495eaf717332bed5e729b5c874e64c473aa4e57c125678bdb0 - md5: 5bc09f7df1004502290c4d7c7c121fbe + subdir: osx-arm64 + url: https://conda.anaconda.org/conda-forge/osx-arm64/vtk-base-9.3.1-qt_py311h14e0e01_205.conda + sha256: 4df146e81c185e8154dc0e5da58c278c7a9191c28ba3fc45cf225a69f6e58309 + md5: aa9485229dc5e267133459af5664b13a depends: - - __osx >=10.13 + - __osx >=11.0 - double-conversion >=3.3.0,<3.4.0a0 - eigen - expat @@ -31950,6 +25312,7 @@ packages: - proj >=9.4.1,<9.5.0a0 - pugixml >=1.14,<1.15.0a0 - python >=3.11,<3.12.0a0 + - python >=3.11,<3.12.0a0 *_cpython - python_abi 3.11.* *_cp311 - qt6-main >=6.7.2,<6.8.0a0 - sqlite @@ -31960,13 +25323,13 @@ packages: - wslink - zlib constrains: - - libboost_headers - paraview ==9999999999 + - libboost_headers license: BSD-3-Clause license_family: BSD purls: [] - size: 36709259 - timestamp: 1724913766358 + size: 34715664 + timestamp: 1724915285262 - kind: conda name: vtk-base version: 9.3.1 @@ -32105,23 +25468,6 @@ packages: purls: [] size: 69629 timestamp: 1724915499444 -- kind: conda - name: vtk-io-ffmpeg - version: 9.3.1 - build: qt_py311h98fac4b_205 - build_number: 205 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/vtk-io-ffmpeg-9.3.1-qt_py311h98fac4b_205.conda - sha256: 6f3005288615e7462bac396ab92fe0ed2bfb0e98cb2800613fa4904cd6808a43 - md5: 73ee9fc830fe508cda753b847cfe763b - depends: - - ffmpeg >=6.1.2,<7.0a0 - - vtk-base 9.3.1 qt_py311h579de60_205 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 69130 - timestamp: 1724913885102 - kind: conda name: vtk-io-ffmpeg version: 9.3.1 @@ -32391,20 +25737,6 @@ packages: purls: [] size: 717038 timestamp: 1660323292329 -- kind: conda - name: x264 - version: 1!164.3095 - build: h775f41a_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/x264-1!164.3095-h775f41a_2.tar.bz2 - sha256: de611da29f4ed0733a330402e163f9260218e6ba6eae593a5f945827d0ee1069 - md5: 23e9c3180e2c0f9449bb042914ec2200 - license: GPL-2.0-or-later - license_family: GPL - purls: [] - size: 937077 - timestamp: 1660323305349 - kind: conda name: x264 version: 1!164.3095 @@ -32456,22 +25788,6 @@ packages: purls: [] size: 3357188 timestamp: 1646609687141 -- kind: conda - name: x265 - version: '3.5' - build: hbb4e6a2_3 - build_number: 3 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/x265-3.5-hbb4e6a2_3.tar.bz2 - sha256: 6b6a57710192764d0538f72ea1ccecf2c6174a092e0bc76d790f8ca36bbe90e4 - md5: a3bf3e95b7795871a6734a784400fcea - depends: - - libcxx >=12.0.1 - license: GPL-2.0-or-later - license_family: GPL - purls: [] - size: 3433205 - timestamp: 1646610148268 - kind: conda name: x265 version: '3.5' @@ -32690,25 +26006,6 @@ packages: purls: [] size: 3547000 timestamp: 1721032032254 -- kind: conda - name: xerces-c - version: 3.2.5 - build: hfb503d4_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xerces-c-3.2.5-hfb503d4_1.conda - sha256: 58c07f66e7a9b6853bc25663ce83098ae0ef2dc8f8ac383b9e708d9cd1349813 - md5: 0a0c50f248ec412e3225e2683b49d6cb - depends: - - __osx >=10.13 - - icu >=75.1,<76.0a0 - - libcurl >=8.8.0,<9.0a0 - - libcxx >=16 - license: Apache-2.0 - license_family: Apache - purls: [] - size: 1348901 - timestamp: 1721031740491 - kind: conda name: xkeyboard-config version: '2.42' @@ -32725,22 +26022,6 @@ packages: purls: [] size: 388998 timestamp: 1717817668629 -- kind: conda - name: 
xorg-fixesproto - version: '5.0' - build: h0d85af4_1002 - build_number: 1002 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-fixesproto-5.0-h0d85af4_1002.tar.bz2 - sha256: 10bfab0f8c129b16454a1fc86f88b26e8f5d3728eb8aaa251f7d4b24b33d0f9b - md5: a4dde7ba6e898f8cc4a33cc97943d2d7 - depends: - - xorg-xextproto - license: MIT - license_family: MIT - purls: [] - size: 9143 - timestamp: 1617479756296 - kind: conda name: xorg-fixesproto version: '5.0' @@ -32821,20 +26102,6 @@ packages: purls: [] size: 27417 timestamp: 1610027770456 -- kind: conda - name: xorg-kbproto - version: 1.0.7 - build: h35c211d_1002 - build_number: 1002 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-kbproto-1.0.7-h35c211d_1002.tar.bz2 - sha256: ea4e792e48f28023668ce3e716ebee9b7d04e2d397d678f8f3aef4c7a66f4449 - md5: 41302c2bc60a15ca4a018775fd20b442 - license: MIT - license_family: MIT - purls: [] - size: 27396 - timestamp: 1610027854580 - kind: conda name: xorg-kbproto version: 1.0.7 @@ -32867,19 +26134,6 @@ packages: purls: [] size: 28166 timestamp: 1610028297505 -- kind: conda - name: xorg-libice - version: 1.1.1 - build: h0dc2134_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-libice-1.1.1-h0dc2134_0.conda - sha256: ddd5c7354d7c52fd0849d10ca846ab9b11610519ee423ba6117a5146b234ee71 - md5: 39743dd8d95b672aca2fec556bf83176 - license: MIT - license_family: MIT - purls: [] - size: 49588 - timestamp: 1685307846593 - kind: conda name: xorg-libice version: 1.1.1 @@ -32925,21 +26179,6 @@ packages: purls: [] size: 58469 timestamp: 1685307573114 -- kind: conda - name: xorg-libsm - version: 1.2.4 - build: h0dc2134_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-libsm-1.2.4-h0dc2134_0.conda - sha256: 9215066151eb8efd2da53a02c5e44d2c4d37bdcb0af2b23f4f9ba1a1e7a9dd73 - md5: 0c0762c224b062747efb59eaef586541 - depends: - - xorg-libice >=1.1.1,<2.0a0 - license: MIT - license_family: MIT - purls: [] - size: 24364 - timestamp: 1685453929828 - kind: conda name: xorg-libsm version: 1.2.4 @@ -33010,26 +26249,6 @@ packages: purls: [] size: 814589 timestamp: 1718847832308 -- kind: conda - name: xorg-libx11 - version: 1.8.9 - build: h7022169_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-libx11-1.8.9-h7022169_1.conda - sha256: eab0b85b6bf75724979bbf13f8b060efb7b159b5b5ce03c649f1ffad9f282bf1 - md5: f09e69ee27a9aaf14a1698600f8e7f55 - depends: - - __osx >=10.13 - - libxcb >=1.16,<1.17.0a0 - - xorg-kbproto - - xorg-xextproto >=7.3.0,<8.0a0 - - xorg-xproto - license: MIT - license_family: MIT - purls: [] - size: 771911 - timestamp: 1718846897439 - kind: conda name: xorg-libx11 version: 1.8.9 @@ -33070,19 +26289,6 @@ packages: purls: [] size: 747568 timestamp: 1718847013634 -- kind: conda - name: xorg-libxau - version: 1.0.11 - build: h0dc2134_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-libxau-1.0.11-h0dc2134_0.conda - sha256: 8a2e398c4f06f10c64e69f56bcf3ddfa30b432201446a0893505e735b346619a - md5: 9566b4c29274125b0266d0177b5eb97b - license: MIT - license_family: MIT - purls: [] - size: 13071 - timestamp: 1684638167647 - kind: conda name: xorg-libxau version: 1.0.11 @@ -33140,19 +26346,6 @@ packages: purls: [] size: 18164 timestamp: 1610071737668 -- kind: conda - name: xorg-libxdmcp - version: 1.1.3 - build: h35c211d_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-libxdmcp-1.1.3-h35c211d_0.tar.bz2 - sha256: 
485421c16f03a01b8ed09984e0b2ababdbb3527e1abf354ff7646f8329be905f - md5: 86ac76d6bf1cbb9621943eb3bd9ae36e - license: MIT - license_family: MIT - purls: [] - size: 17225 - timestamp: 1610071995461 - kind: conda name: xorg-libxdmcp version: 1.1.3 @@ -33218,23 +26411,6 @@ packages: purls: [] size: 41541 timestamp: 1677037316516 -- kind: conda - name: xorg-libxext - version: 1.3.4 - build: hb7f2c08_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-libxext-1.3.4-hb7f2c08_2.conda - sha256: 56ca81c5e6d493e7a991f2beac1c38dec36d732c83495ef08f57a34c260a5aaa - md5: 0f98aff18e0455f0bdc4326c04f98883 - depends: - - xorg-libx11 >=1.7.2,<2.0a0 - - xorg-xextproto - license: MIT - license_family: MIT - purls: [] - size: 43076 - timestamp: 1677037100444 - kind: conda name: xorg-libxext version: 1.3.4 @@ -33253,23 +26429,6 @@ packages: purls: [] size: 221821 timestamp: 1677038179908 -- kind: conda - name: xorg-libxfixes - version: 5.0.3 - build: h0d85af4_1004 - build_number: 1004 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-libxfixes-5.0.3-h0d85af4_1004.tar.bz2 - sha256: 51550eca0f260a7f1b9bc9e177ba3e78ebcf3d77511363470219b710a3ef31b1 - md5: 6ecbef6618bb15e389a8bb5618e579f6 - depends: - - xorg-fixesproto - - xorg-libx11 >=1.7.0,<2.0a0 - license: MIT - license_family: MIT - purls: [] - size: 15510 - timestamp: 1617718115961 - kind: conda name: xorg-libxfixes version: 5.0.3 @@ -33387,22 +26546,6 @@ packages: purls: [] size: 195881 timestamp: 1696449889560 -- kind: conda - name: xorg-libxrender - version: 0.9.11 - build: h0dc2134_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-libxrender-0.9.11-h0dc2134_0.conda - sha256: f78a453b8339d77c515d76982c003d073fddcf62527b638558205cd25ac06705 - md5: 70c152c8a61b91a62d3ab0ade5fd9e2b - depends: - - xorg-libx11 >=1.8.6,<2.0a0 - - xorg-renderproto - license: MIT - license_family: MIT - purls: [] - size: 28581 - timestamp: 1688301169348 - kind: conda name: xorg-libxrender version: 0.9.11 @@ -33556,20 +26699,6 @@ packages: purls: [] size: 8014 timestamp: 1621340029114 -- kind: conda - name: xorg-renderproto - version: 0.11.1 - build: h0d85af4_1002 - build_number: 1002 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-renderproto-0.11.1-h0d85af4_1002.tar.bz2 - sha256: ac633b59ebf10da5d00040655e2ca5746d0e6813b4d20cf2c30adef753d3d082 - md5: e1cff95f235c6ad73199735685186693 - license: MIT - license_family: MIT - purls: [] - size: 9632 - timestamp: 1614866616392 - kind: conda name: xorg-renderproto version: 0.11.1 @@ -33646,20 +26775,6 @@ packages: purls: [] size: 30550 timestamp: 1677037030945 -- kind: conda - name: xorg-xextproto - version: 7.3.0 - build: hb7f2c08_1003 - build_number: 1003 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-xextproto-7.3.0-hb7f2c08_1003.conda - sha256: 53f1690e46c31c93f9899c6e6524bd1ddd4c8928caff5570b1d30e4ed89858f6 - md5: e4db268e1dc61ab3dcbbb302f6519f66 - license: MIT - license_family: MIT - purls: [] - size: 30477 - timestamp: 1677037035675 - kind: conda name: xorg-xextproto version: 7.3.0 @@ -33690,20 +26805,6 @@ packages: purls: [] size: 74988 timestamp: 1607291556181 -- kind: conda - name: xorg-xproto - version: 7.0.31 - build: h35c211d_1007 - build_number: 1007 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xorg-xproto-7.0.31-h35c211d_1007.tar.bz2 - sha256: 433fa2cf3282e0e6f13cf5e73280cd1add4d3be76f19f2674cbd127c9ec70dd4 - md5: e1b279e3b8c03f88a90e81480a8f319a - 
license: MIT - license_family: MIT - purls: [] - size: 74832 - timestamp: 1607291623383 - kind: conda name: xorg-xproto version: 7.0.31 @@ -33830,29 +26931,6 @@ packages: license: LGPL-2.1 and GPL-2.0 size: 235693 timestamp: 1660346961024 -- kind: conda - name: xz - version: 5.2.6 - build: h775f41a_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xz-5.2.6-h775f41a_0.tar.bz2 - sha256: eb09823f34cc2dd663c0ec4ab13f246f45dcd52e5b8c47b9864361de5204a1c8 - md5: a72f9d4ea13d55d745ff1ed594747f10 - license: LGPL-2.1 and GPL-2.0 - purls: [] - size: 238119 - timestamp: 1660346964847 -- kind: conda - name: xz - version: 5.2.6 - build: h775f41a_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/xz-5.2.6-h775f41a_0.tar.bz2 - sha256: eb09823f34cc2dd663c0ec4ab13f246f45dcd52e5b8c47b9864361de5204a1c8 - md5: a72f9d4ea13d55d745ff1ed594747f10 - license: LGPL-2.1 and GPL-2.0 - size: 238119 - timestamp: 1660346964847 - kind: conda name: xz version: 5.2.6 @@ -33882,20 +26960,6 @@ packages: license: LGPL-2.1 and GPL-2.0 size: 217804 timestamp: 1660346976440 -- kind: conda - name: yaml - version: 0.2.5 - build: h0d85af4_2 - build_number: 2 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/yaml-0.2.5-h0d85af4_2.tar.bz2 - sha256: 5301417e2c8dea45b401ffee8df3957d2447d4ce80c83c5ff151fc6bfe1c4148 - md5: d7e08fcf8259d742156188e8762b4d20 - license: MIT - license_family: MIT - purls: [] - size: 84237 - timestamp: 1641347062780 - kind: conda name: yaml version: 0.2.5 @@ -33943,27 +27007,6 @@ packages: purls: [] size: 63274 timestamp: 1641347623319 -- kind: conda - name: yarl - version: 1.9.4 - build: py311h3336109_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/yarl-1.9.4-py311h3336109_1.conda - sha256: 13290fed4301203c29141f864b0c16c42cf72c128201cec5b6c1430e4f7b1f98 - md5: a2b7ea15fa4af4005614d47f68375d1c - depends: - - __osx >=10.13 - - idna >=2.0 - - multidict >=4.0 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - license: Apache-2.0 - license_family: Apache - purls: - - pkg:pypi/yarl?source=hash-mapping - size: 112715 - timestamp: 1726055280200 - kind: conda name: yarl version: 1.9.4 @@ -34095,25 +27138,6 @@ packages: purls: [] size: 353159 timestamp: 1725429777124 -- kind: conda - name: zeromq - version: 4.3.5 - build: hb33e954_5 - build_number: 5 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/zeromq-4.3.5-hb33e954_5.conda - sha256: 7e63a9ec19660666095ea9332a5b226329ff4f499018e8a281d0d160cbb60ca4 - md5: a9735eb372d515c78f8211785406e36f - depends: - - __osx >=10.13 - - krb5 >=1.21.3,<1.22.0a0 - - libcxx >=17 - - libsodium >=1.0.20,<1.0.21.0a0 - license: MPL-2.0 - license_family: MOZILLA - purls: [] - size: 303596 - timestamp: 1725430161260 - kind: conda name: zeromq version: 4.3.5 @@ -34153,21 +27177,21 @@ packages: timestamp: 1681770298596 - kind: conda name: zipp - version: 3.20.1 + version: 3.20.2 build: pyhd8ed1ab_0 subdir: noarch noarch: python - url: https://conda.anaconda.org/conda-forge/noarch/zipp-3.20.1-pyhd8ed1ab_0.conda - sha256: 30762bd25b6fc8714d5520a223ccf20ad4a6792dc439c54b59bf44b60bf51e72 - md5: 74a4befb4b38897e19a107693e49da20 + url: https://conda.anaconda.org/conda-forge/noarch/zipp-3.20.2-pyhd8ed1ab_0.conda + sha256: 1e84fcfa41e0afdd87ff41e6fbb719c96a0e098c1f79be342293ab0bd8dea322 + md5: 4daaed111c05672ae669f7036ee5bba3 depends: - python >=3.8 license: MIT license_family: MIT purls: - pkg:pypi/zipp?source=hash-mapping - size: 21110 - timestamp: 1724731063145 + 
size: 21409 + timestamp: 1726248679175 - kind: conda name: zlib version: 1.3.1 @@ -34204,23 +27228,6 @@ packages: purls: [] size: 93004 timestamp: 1716874213487 -- kind: conda - name: zlib - version: 1.3.1 - build: h87427d6_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/zlib-1.3.1-h87427d6_1.conda - sha256: 41bd5fef28b2755d637e3a8ea5c84010628392fbcf80c7e3d7370aaced7ee4fe - md5: 3ac9ef8975965f9698dbedd2a4cc5894 - depends: - - __osx >=10.13 - - libzlib 1.3.1 h87427d6_1 - license: Zlib - license_family: Other - purls: [] - size: 88782 - timestamp: 1716874245467 - kind: conda name: zlib version: 1.3.1 @@ -34308,28 +27315,6 @@ packages: - pkg:pypi/zstandard?source=hash-mapping size: 417923 timestamp: 1725305669690 -- kind: conda - name: zstandard - version: 0.23.0 - build: py311hdf6fcd6_1 - build_number: 1 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/zstandard-0.23.0-py311hdf6fcd6_1.conda - sha256: d9bf977b620750049eb60fffca299a701342a2df59bcc2586a79b2f7c5783fa1 - md5: 4fc42d6f85a21b09ee6477f456554df3 - depends: - - __osx >=10.13 - - cffi >=1.11 - - python >=3.11,<3.12.0a0 - - python_abi 3.11.* *_cp311 - - zstd >=1.5.6,<1.5.7.0a0 - - zstd >=1.5.6,<1.6.0a0 - license: BSD-3-Clause - license_family: BSD - purls: - - pkg:pypi/zstandard?source=hash-mapping - size: 411350 - timestamp: 1725305723486 - kind: conda name: zstd version: 1.5.6 @@ -34348,22 +27333,6 @@ packages: purls: [] size: 349143 timestamp: 1714723445995 -- kind: conda - name: zstd - version: 1.5.6 - build: h915ae27_0 - subdir: osx-64 - url: https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda - sha256: efa04a98cb149643fa54c4dad5a0179e36a5fbc88427ea0eec88ceed87fd0f96 - md5: 4cb2cd56f039b129bb0e491c1164167e - depends: - - __osx >=10.9 - - libzlib >=1.2.13,<2.0.0a0 - license: BSD-3-Clause - license_family: BSD - purls: [] - size: 498900 - timestamp: 1714723303098 - kind: conda name: zstd version: 1.5.6 diff --git a/pixi.toml b/pixi.toml index 4d65ee10e..0f15bfa1f 100644 --- a/pixi.toml +++ b/pixi.toml @@ -4,7 +4,7 @@ version = "0.17.2" description = "Make massive MODFLOW models" authors = ["Deltares ", ] channels = ["conda-forge", ] -platforms = ["win-64", "linux-64", "osx-arm64", "osx-64"] +platforms = ["win-64", "linux-64", "osx-arm64"] license = "MIT" license-file = "LICENSE" readme = "README.rst" @@ -15,14 +15,16 @@ repository = "https://github.com/Deltares/imod-python.git" [tasks] docs = { cmd = "make html", depends_on = ["install"], cwd = "docs" } install = "python -m pip install --no-deps --editable ." +install_with_deps = "python -m pip install --editable ." format = "ruff check --fix .; ruff format ." lint = "ruff check . ; ruff format --check ." 
tests = { depends_on = ["unittests", "examples"] } -unittests = { cmd = [ +unittests = { depends_on = ["unittests_njit", "unittests_jit"] } +unittests_njit = { cmd = [ "NUMBA_DISABLE_JIT=1", "pytest", "-n", "auto", - "-m", "not example", + "-m", "not example and not user_acceptance and not unittest_jit", "--cache-clear", "--verbose", "--junitxml=unittest_report.xml", @@ -31,8 +33,15 @@ unittests = { cmd = [ "--cov-report=html:coverage", "--cov-config=.coveragerc" ], depends_on = ["install"], cwd = "imod/tests" } +unittests_jit = { cmd = [ + "pytest", + "-n", "auto", + "-m", "unittest_jit", + "--cache-clear", + "--verbose", + "--junitxml=unittest_jit_report.xml", +], depends_on = ["install"], cwd = "imod/tests" } examples = { cmd = [ - "NUMBA_DISABLE_JIT=1", "pytest", "-n", "auto", "-m", "example", @@ -40,6 +49,20 @@ examples = { cmd = [ "--verbose", "--junitxml=examples_report.xml", ], depends_on = ["install"], cwd = "imod/tests", env = { IMOD_DATA_DIR = ".imod_data" } } +# User acceptance tests, only works when paths to models are located on local +# drive and are specified in a .env file. +user_acceptance = { cmd = [ + "pytest", + "-m", "user_acceptance", + "--cache-clear", + "--verbose", + "--junitxml=user_acceptance_report.xml", +], depends_on = ["install"], cwd = "imod/tests", env = { IMOD_DATA_DIR = ".imod_data" } } +test_import = { cmd = [ + "python", + "-c", + "import imod" +], depends_on = ["install_with_deps"]} pypi-publish = { cmd = "rm --recursive --force dist && python -m build && twine check dist/* && twine upload dist/*" } mypy_lint = { cmd ="mypy", depends_on = ["install"]} @@ -81,8 +104,9 @@ pytest = "<8" # Newer version incompatible with pytest-cases pytest-benchmark = "*" pytest-cases = "*" pytest-cov = "*" +pytest-dotenv = "*" pytest-xdist = "*" -python = "3.11" +python = "3.11.*" python-graphviz = "*" pyvista = "*" rasterio = ">=1.0" @@ -101,7 +125,7 @@ tqdm = "*" twine = "*" vtk = { version = ">=9.0", build = "*qt*", channel = "conda-forge" } xarray = ">=2023.08.0" -xugrid = ">=0.10.0" +xugrid = ">=0.11.0" zarr = "*" python-build = "*" diff --git a/pyproject.toml b/pyproject.toml index d2c0646dd..b4a74959b 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -158,6 +158,8 @@ ignore_missing_imports = true [tool.pytest.ini_options] markers = [ "example: marks test as example (deselect with '-m \"not example\"')", + "user_acceptance: marks user acceptance tests (deselect with '-m \"not user_acceptance\"')", + "unittest_jit: marks unit tests that should be jitted (deselect with '-m \"not unittest_jit\"')" ] [tool.hatch.version]
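Editor's note (not part of the patch): the pixi.toml and pyproject.toml hunks above register two new pytest markers, ``user_acceptance`` and ``unittest_jit``, add ``pytest-dotenv``, and split the old ``unittests`` task into ``unittests_njit`` (NUMBA_DISABLE_JIT=1) and ``unittests_jit``. The sketch below is a rough, non-authoritative illustration of how a test module might opt into these markers. Only the marker names and the marker expressions come from the diff; the module name, test names, and the USER_ACCEPTANCE_MODEL_DIR environment variable are assumptions made for the example.

# hypothetical_test_markers.py -- illustrative sketch only, not part of the patch.
import os

import pytest

# pytest-dotenv (added in the diff) can load variables from a local .env file into
# the environment; the variable name used here is an assumption, not taken from
# the repository.
MODEL_DIR = os.environ.get("USER_ACCEPTANCE_MODEL_DIR")


@pytest.mark.user_acceptance
@pytest.mark.skipif(MODEL_DIR is None, reason="model path not configured in .env")
def test_regional_model_is_available():
    # Collected by the new `user_acceptance` task (pytest -m user_acceptance);
    # the `unittests_njit` task excludes it via
    # "-m not example and not user_acceptance and not unittest_jit".
    assert os.path.isdir(MODEL_DIR)


@pytest.mark.unittest_jit
def test_jitted_code_path():
    # Selected by the `unittests_jit` task, which does not set NUMBA_DISABLE_JIT=1,
    # so numba-compiled routines would run with JIT compilation enabled.
    assert sum(range(4)) == 6

With this task split, ``pixi run unittests`` would run the njit and jit suites in turn, while ``pixi run user_acceptance`` only collects tests carrying the ``user_acceptance`` marker; that behaviour follows from the marker expressions in the diff above, not from this sketch.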